
MS SQL Server 2008, C# and User Defined Tables

Last Modified: 2016-04-21
I have a service that pulls in member data from a customer.

There are about 20 columns and usually around 5,000 records.

Is it more efficient in the C# service to read the data into a DataTable and then pass it all in at one time to SQL Server as a Structured parameter?
The stored procedure takes the parameter and processes it as a user-defined table type.


OR...
Just read one line at a time from the customer and pass it into SQL Server one record at a time?
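
For reference, the table-valued-parameter route looks roughly like the sketch below on the SQL Server side. The type name, procedure name, and columns are invented for illustration (the real type would carry all ~20 member columns); on the C# side the DataTable is passed as a SqlParameter with SqlDbType.Structured whose TypeName matches the table type.

-- Hypothetical table type; the real one would carry all ~20 member columns.
CREATE TYPE dbo.MemberTableType AS TABLE
(
    MemberId   INT           NOT NULL,
    FirstName  NVARCHAR(50)  NULL,
    LastName   NVARCHAR(50)  NULL,
    Email      NVARCHAR(256) NULL
);
GO

-- Hypothetical procedure: the whole batch arrives as one READONLY parameter
-- and can be queried like an ordinary table.
CREATE PROCEDURE dbo.Import_Members
    @Members dbo.MemberTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;
    SELECT COUNT(*) AS RowsReceived FROM @Members;
END
GO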

ste5an, Senior Developer
CERTIFIED EXPERT

Commented:
5,000 rows doesn't sound like much. Depending on what the actual service returns (JSON or XML), even XML parsing in SQL Server is an option.

But why are you concerned about efficiency? How many times a day do you plan to import this data?
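
As a rough sketch of that XML option, assuming a made-up <Members><Member .../></Members> payload, the document can be shredded into rows with nodes()/value():

-- Hypothetical payload shape; the real service would define the element names.
DECLARE @MemberXml XML = N'
<Members>
  <Member MemberId="1" FirstName="Ada"  LastName="Lovelace" />
  <Member MemberId="2" FirstName="Alan" LastName="Turing" />
</Members>';

-- Shred the XML into a rowset that can be inserted or merged like a table.
SELECT
    m.value('@MemberId',  'INT')          AS MemberId,
    m.value('@FirstName', 'NVARCHAR(50)') AS FirstName,
    m.value('@LastName',  'NVARCHAR(50)') AS LastName
FROM @MemberXml.nodes('/Members/Member') AS x(m);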
Larry Brister, Sr. Developer

Author

Commented:
OK guys... great comments.   So... here is the process.

I have a Web Service that gets the data from the customer
I execute against this service from a simple assembly on SQL Server that I call in a SQL Job
SELECT [dbo].[ctcMember]()

Generally... there are 3-7k records

I am getting those records into a DataTable in my WCF service and then passing it to SQL Server as a Structured parameter, which is received as a user-defined table type.

That stored procedure loops through the user-defined table with a CURSOR/FETCH and
executes each data row against an "Insert_Update_Individual" stored procedure.

Based on what I am reading from you guys...
This is fine, and if I have problems I should either "clean up" the FETCH / insert process or increase the job interval a little bit?
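
For readers following along, the per-row pattern described above would look something like the sketch below; the type, column, and parameter names are invented, since the real Insert_Update_Individual signature isn't shown.

-- Hypothetical sketch of the CURSOR/FETCH loop over the table-valued parameter.
CREATE PROCEDURE dbo.Process_Members
    @Members dbo.MemberTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @MemberId INT, @FirstName NVARCHAR(50), @LastName NVARCHAR(50);

    DECLARE member_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT MemberId, FirstName, LastName FROM @Members;

    OPEN member_cursor;
    FETCH NEXT FROM member_cursor INTO @MemberId, @FirstName, @LastName;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- One procedure call (and one round of its logic) per row.
        EXEC dbo.Insert_Update_Individual
            @MemberId  = @MemberId,
            @FirstName = @FirstName,
            @LastName  = @LastName;

        FETCH NEXT FROM member_cursor INTO @MemberId, @FirstName, @LastName;
    END

    CLOSE member_cursor;
    DEALLOCATE member_cursor;
END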
CERTIFIED EXPERT

Commented:
I think so. From what you say, the CURSOR is the weakest link. If it's possible to treat the data manipulation as a "set" operation (as opposed to one-by-one with the FETCH), that would certainly be better and more efficient - but whether that can be done or not also depends on the logic inside Insert_Update_Individual. Otherwise, if it's working, don't break it :)
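
As a sketch of that set-based alternative, and only if Insert_Update_Individual really amounts to a plain upsert, the cursor loop could collapse into a single MERGE (target table and columns are hypothetical):

-- Hypothetical set-based upsert: one statement instead of 3-7k per-row calls.
CREATE PROCEDURE dbo.Process_Members_SetBased
    @Members dbo.MemberTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dbo.Individual AS target
    USING @Members AS source
        ON target.MemberId = source.MemberId
    WHEN MATCHED THEN
        UPDATE SET target.FirstName = source.FirstName,
                   target.LastName  = source.LastName
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (MemberId, FirstName, LastName)
        VALUES (source.MemberId, source.FirstName, source.LastName);
END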
Larry Brister, Sr. Developer

Author

Commented:
Hey guys,
 Looks like MlandaT had the most (personally) useful answer.

Any problems with 350 to him and a 150 nod to James Burger?
CERTIFIED EXPERT
Top Expert 2015

Commented:
We have absolutely nothing to say about how you assign the points. You are the one that can best see how each answer helps you and thus decide how the points should be assigned.
Larry Brister, Sr. Developer

Author

Commented:
Thanks folks