Larry Brister

asked on

MS SQL Server 2008, C# and User Defined Tables

I have a service that pulls in member data from a customer.

There are about 20 columns and usually about 5,000 records.

Is it more efficient in the C# service to read the data into a DataTable, then pass it all in at one time to SQL Server as a Structured data type? The stored procedure takes the parameter and parses it as a User Defined Table type.

OR...

Just read one line at a time from the customer and pass it into SQL Server one record at a time?
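For reference, the first option (one call with a Structured parameter) can be sketched like this in C#. The type name, procedure name, and columns below are hypothetical; the matching user-defined table type and stored procedure are assumed to already exist on the server.

```csharp
// Assumed (hypothetical) server-side objects:
//   CREATE TYPE dbo.MemberTableType AS TABLE (MemberId INT, Name NVARCHAR(100) /* ... ~20 columns */);
//   CREATE PROCEDURE dbo.Insert_Members @Members dbo.MemberTableType READONLY AS ...

using System.Data;
using System.Data.SqlClient;

public static class MemberLoader
{
    public static void SendAllAtOnce(DataTable members, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.Insert_Members", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            // One round trip for all ~5,000 rows instead of 5,000 separate calls.
            SqlParameter p = cmd.Parameters.AddWithValue("@Members", members);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.MemberTableType";

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The DataTable's column order and types must match the table type's definition, or SQL Server will reject the parameter.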
SOLUTION
Jacques Bourgeois (James Burger)
[This solution is only available to members of Experts Exchange.]
ste5an
Also: 5,000 rows is not that much. Depending on what the service actually returns (JSON or XML), even XML parsing in SQL Server is an option.

But why are you worried about efficiency? How often per day do you plan to import this data?
ASKER CERTIFIED SOLUTION
Larry Brister (ASKER)
OK guys... great comments. So... here is the process.

I have a Web Service that gets the data from the customer.
I execute against this service from a simple assembly on SQL Server that I call in a SQL Job:
SELECT [dbo].[ctcMember]()

Generally... there are 3-7k records.

I am getting those records into a DataTable in my WCF service and then passing it to SQL Server as a Structured-format parameter, which is received as a User Defined Table type.

That stored procedure loops through the User Defined Table with a CURSOR/FETCH and executes each data row against an "Insert_Update_Individual" stored procedure.

Based on what I am reading from you guys...
This is fine... and if I have problems, either "clean up" the FETCH/insert process or increase the job interval a little bit?
I think so. From what you say, the CURSOR is the weakest link. If it's possible to treat the data manipulation as a "set" operation (as opposed to one-by-one with the FETCH), that would certainly be better and more efficient, but whether that can be done also depends on the logic inside Insert_Update_Individual. Otherwise, if it's working, don't break it :)
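To illustrate the "set" operation idea: if Insert_Update_Individual does nothing more than insert-or-update by key, the whole cursor loop can collapse into a single MERGE (available since SQL Server 2008). The table, type, and column names below are hypothetical sketches, not the asker's actual schema.

```sql
-- Set-based replacement for the CURSOR/FETCH loop (assumed schema).
CREATE PROCEDURE dbo.Insert_Update_Members
    @Members dbo.MemberTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dbo.Member AS target
    USING @Members AS source
        ON target.MemberId = source.MemberId
    WHEN MATCHED THEN
        UPDATE SET target.Name  = source.Name,
                   target.Email = source.Email
    WHEN NOT MATCHED THEN
        INSERT (MemberId, Name, Email)
        VALUES (source.MemberId, source.Name, source.Email);
END
```

If Insert_Update_Individual contains per-row logic that genuinely cannot be expressed as a set operation, the existing cursor remains a reasonable fallback at 3-7k rows.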
Hey guys,
Looks like MlandaT had the most (personally) useful answer.

Any problems with 350 to him and a 150 nod to James Burger?
We have absolutely nothing to say about how you assign the points. You are the one who can best see how each answer helps you, and thus can decide how the points should be assigned.
Thanks folks