Larry Brister
asked on
MS SQL Server 2008, C# and User Defined Tables
I have a service that pulls in member data from a customer.
There are about 20 columns and usually about 5,000 records.
Is it more efficient for the C# service to read the data into a DataTable
and then pass it all to SQL Server at once as a Structured parameter
(the stored procedure takes the parameter and treats it as a user-defined table type)?
OR...
Should it just read one line at a time from the customer and pass records into SQL Server one at a time?
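For reference, the "all at once" option looks roughly like this on the C# side. This is a minimal sketch: the type name dbo.MemberTableType and procedure dbo.usp_ImportMembers are illustrative, not from the post, and the DataTable's columns must match the table type's columns in order and type.

```csharp
using System.Data;
using System.Data.SqlClient;

class MemberImporter
{
    // Hypothetical names: dbo.usp_ImportMembers and dbo.MemberTableType
    // must match a procedure and table type defined on the server.
    static void ImportMembers(string connectionString, DataTable members)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.usp_ImportMembers", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            // Pass the whole DataTable in one round trip as a
            // table-valued parameter (TVP).
            SqlParameter p = cmd.Parameters.AddWithValue("@Members", members);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.MemberTableType"; // the user-defined table type

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

At ~5,000 rows this is one network round trip instead of 5,000, which is the main reason the TVP route usually wins over row-at-a-time calls.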
ASKER
OK guys... great comments. So... here is the process.
I have a Web Service that gets the data from the customer
I execute against this service from a simple assembly on SQL Server that I call in a SQL Job
SELECT [dbo].[ctcMember]()
Generally... there are 3-7k records
I am getting those records into a DataTable in my WCF service and then passing them to SQL Server as a Structured parameter, which is received as a user-defined table type.
That stored procedure loops through the user-defined table with a CURSOR/FETCH
and executes each data row against an "Insert_Update_Individual" stored procedure.
Based on what I am reading from you guys...
This is fine... and if I have problems, either "clean up" the fetch/insert process or increase the job interval a little bit?
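The server-side shape described above might look like the sketch below. All names (dbo.MemberTableType, dbo.usp_ImportMembers, the three columns) are illustrative stand-ins; the real type has about 20 columns.

```sql
-- Illustrative names and columns; the real definitions are not in the post.
CREATE TYPE dbo.MemberTableType AS TABLE
(
    MemberId  INT          NOT NULL,
    FirstName NVARCHAR(50) NULL,
    LastName  NVARCHAR(50) NULL
    -- ... remaining columns, about 20 in total
);
GO

CREATE PROCEDURE dbo.usp_ImportMembers
    @Members dbo.MemberTableType READONLY
AS
BEGIN
    DECLARE @MemberId INT, @FirstName NVARCHAR(50), @LastName NVARCHAR(50);

    -- Row-by-row processing as described: one
    -- Insert_Update_Individual call per row.
    DECLARE member_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT MemberId, FirstName, LastName FROM @Members;

    OPEN member_cursor;
    FETCH NEXT FROM member_cursor INTO @MemberId, @FirstName, @LastName;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC dbo.Insert_Update_Individual @MemberId, @FirstName, @LastName;
        FETCH NEXT FROM member_cursor INTO @MemberId, @FirstName, @LastName;
    END

    CLOSE member_cursor;
    DEALLOCATE member_cursor;
END
```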
I think so. From what you say, the CURSOR is the weakest link. If it's possible to treat the data manipulation as a "set" operation (as opposed to one-by-one with the FETCH), that would certainly be better and more efficient - but whether that can be done or not also depends on the logic inside Insert_Update_Individual. Otherwise, if it's working, don't break it :)
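If the per-row logic in Insert_Update_Individual is essentially an upsert, the FETCH loop can be replaced with a single set-based statement. A sketch, assuming a hypothetical table type dbo.MemberTableType with columns MemberId/FirstName/LastName and a target table dbo.Individual (none of these names are from the post); MERGE is available on SQL Server 2008:

```sql
-- Illustrative names; adapt to the real table type, target table, and columns.
CREATE PROCEDURE dbo.usp_ImportMembers_SetBased
    @Members dbo.MemberTableType READONLY
AS
BEGIN
    -- One set-based upsert replaces the cursor loop:
    MERGE dbo.Individual AS target
    USING @Members AS source
        ON target.MemberId = source.MemberId
    WHEN MATCHED THEN
        UPDATE SET target.FirstName = source.FirstName,
                   target.LastName  = source.LastName
    WHEN NOT MATCHED THEN
        INSERT (MemberId, FirstName, LastName)
        VALUES (source.MemberId, source.FirstName, source.LastName);
END
```

Whether this substitution is safe depends on what else Insert_Update_Individual does per row (logging, triggers on other tables, branching logic); if it is more than an upsert, the cursor may be the simpler correct choice.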
ASKER
Hey guys,
Looks like MlandaT had the most (personally) useful answer.
Any problems with 350 to him and a 150 nod to James Burger?
We have absolutely nothing to say about how you assign the points. You are the one that can best see how each answer helps you and thus decide how the points should be assigned.
ASKER
Thanks folks
But why are you concerned about efficiency? How many times a day do you plan to import this data?