Table variable: sending data in small chunks

I am reading from a CSV file and sending the data as a table-valued parameter to a stored procedure. From what I have tested so far, I am able to process 300k records in 3 minutes 30 seconds. The file may contain up to millions of records as we go. I wanted to know if it's a good idea to send all these records to the stored procedure in one go, or should I send them in batches of, say, 500k?
Asked by: mkrch
 
mkrch (Author) commented:
I created an extension method to break the list into chunks and it works. Thanks.
http://stackoverflow.com/questions/769373/whats-a-clean-way-to-break-up-a-datatable-into-chunks-of-a-fixed-size-with-linq
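For reference, a minimal sketch of such an extension method (illustrative names, not the exact code from the linked answer): it walks the source `DataTable` and yields schema-identical tables of at most `chunkSize` rows.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

public static class DataTableExtensions
{
    // Splits a DataTable into successive chunks of at most chunkSize rows.
    public static IEnumerable<DataTable> ToChunks(this DataTable source, int chunkSize)
    {
        if (chunkSize <= 0) throw new ArgumentOutOfRangeException(nameof(chunkSize));

        for (int offset = 0; offset < source.Rows.Count; offset += chunkSize)
        {
            // Clone() copies the schema (columns) but none of the data.
            DataTable chunk = source.Clone();
            foreach (DataRow row in source.Rows.Cast<DataRow>().Skip(offset).Take(chunkSize))
            {
                chunk.ImportRow(row);
            }
            yield return chunk;
        }
    }
}
```

Each yielded chunk can then be passed on its own as the TVP value, so memory use stays bounded by the chunk size rather than the whole file.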
 
vasto commented:
Check the operation with SQL Server Profiler and you will see that there are inserts for each record behind the scenes. I guess it doesn't matter what the size of the chunk is; the data is transferred record by record anyway.
 
Saurabh Bhadauria commented:
Remember that a table-valued parameter is like a table variable, so inserting the whole data set into this table variable will not be a good idea. Microsoft itself suggests using table-valued parameters for fewer than 1,000 records, and in my view that number will also depend on the number of columns defined in the table variable.

Read the Microsoft article below for more details.
http://msdn.microsoft.com/en-us/library/bb510489.aspx

Also, set your chunk size somewhere around 500 to 800, and check the performance with Profiler for each chunk size.
 
mkrch (Author) commented:
I have 7 columns. I am not using any ID columns at this time.
 
Saurabh Bhadauria commented:
Then you can test your code with chunk sizes of 500-800 through Profiler and choose the better chunk value.
 
mkrch (Author) commented:
I did; there was no significant difference. Basically, management wants me to send the data in batches, and I want to know how I can do that. I have a DataTable, and I am inserting the DataTable into a TVP that is passed to a stored procedure. Is there a way I can send the data in batches, like rows 1-10, then 11-20, etc.? Thanks.
 
Saurabh Bhadauria commented:
If you want to read your file at the front end, then this is the better way to process it in chunks.

Otherwise, you can have a look at the link below to get some idea about bulk file uploads.

http://www.codeproject.com/Articles/439843/Handling-BULK-Data-insert-from-CSV-to-SQL-Server
 
mkrch (Author) commented:
Thank you. Basically, I am trying to break my DataTable into chunks. For instance:
DataTable = [1,2,3,4,5,6,7,8]
chunks = [1,2,3], [4,5,6], [7,8] ... or something like that
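That splitting, plus sending each chunk as a TVP, can be sketched as follows. This assumes ADO.NET (`System.Data.SqlClient`); the stored procedure `dbo.usp_InsertRecords`, its parameter `@Records`, and the table type `dbo.RecordTableType` are placeholder names to substitute with your own.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class BatchSender
{
    // Sends the rows of 'data' to a TVP-accepting stored procedure in
    // fixed-size batches. Procedure, parameter, and type names below are
    // placeholders, not the asker's actual objects.
    public static void SendInBatches(DataTable data, string connectionString, int batchSize)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            for (int offset = 0; offset < data.Rows.Count; offset += batchSize)
            {
                // Build the next slice: same schema, at most batchSize rows.
                DataTable chunk = data.Clone();
                int end = Math.Min(offset + batchSize, data.Rows.Count);
                for (int i = offset; i < end; i++)
                    chunk.ImportRow(data.Rows[i]);

                using (var command = new SqlCommand("dbo.usp_InsertRecords", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    SqlParameter tvp = command.Parameters.AddWithValue("@Records", chunk);
                    tvp.SqlDbType = SqlDbType.Structured;
                    tvp.TypeName = "dbo.RecordTableType"; // the user-defined table type
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}
```

Each iteration passes one slice of the DataTable as the structured parameter, so a batch size of 3 over 8 rows yields calls with 3, 3, and 2 rows.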
 
Saurabh Bhadauria commented:
Can you please describe in detail what you are trying to achieve?
 
mkrch (Author) commented:
An efficient method.
Question has a verified solution.
