Solved

How to use an ADO.NET DataTable to bulk-insert many records into a database table?

Posted on 2009-05-07
3,252 Views
Last Modified: 2012-05-06
Is it possible to create a DataTable object, define a few columns, add a few thousand rows, connect it to a database table, then "sync" up and have it insert those rows into a SQL Server database table?

Is this reasonably efficient, or is it still doing multiple inserts "under the covers"? (In which case there would be no need to go through all the hassle of the DataTable approach.)

Has anyone tried this?

If so, what are the concrete steps of how to create these objects and assign properties to pull this off?
0
Comment
Question by:ZuZuPetals
11 Comments
 
LVL 9

Expert Comment

by:tculler
ID: 24325212
This is possible, I believe, but I won't go into the "how" because it would be mercilessly inefficient. The purpose of the DataTable and DataSet objects is to load data into memory. If you're planning on bulk inserting thousands of rows into a database, there's no reason to load them into memory first and have them sit there, eating up all of your RAM. Anything to do with a database comes down to SQL statements under the hood--that's the point of a database engine.

I recommend just using an "INSERT INTO" SQL statement (or a series of them) to do the trick. If you need help with that, let me know--just post a few more details on what exactly you're trying to achieve.
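As a rough illustration (not from the original post), a single parameterized INSERT from ADO.NET might look like the sketch below; the connection string, table name (dbo.Readings) and columns are invented for the example:

using System.Data;
using System.Data.SqlClient;

// Hypothetical table and columns (dbo.Readings, SensorId, Value) used only for illustration.
static void InsertReading(string connectionString, int sensorId, double value)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO dbo.Readings (SensorId, Value) VALUES (@SensorId, @Value)", conn))
    {
        cmd.Parameters.Add("@SensorId", SqlDbType.Int).Value = sensorId;
        cmd.Parameters.Add("@Value", SqlDbType.Float).Value = value;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}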
0
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 24325358
>>Is it possible to create a DataTable object, define a few columns, add a few thousand rows, connect it to a database table, then "sync" up and have it insert those rows into a SQL Server database table?<<
This is only possible in SQL Server 2008.  See this article from Tim Chapman (chapmandew):
Passing table valued parameters in SQL Server 2008
http://blogs.techrepublic.com.com/datacenter/?p=168

With SQL Server 2000, your options are down to:
1. Inserting one row at a time.
2. Passing all the rows as one using Xml or some other mechanism.
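For the SQL Server 2008 route mentioned above, a rough sketch of passing an in-memory DataTable as a table-valued parameter is below. It assumes a user-defined table type dbo.ReadingType and a stored procedure dbo.InsertReadings with a @rows dbo.ReadingType READONLY parameter already exist on the server; both names are invented for this example.

using System.Data;
using System.Data.SqlClient;

// Sketch only: pass a whole DataTable to the server in one call (SQL Server 2008 and later).
static void BulkInsert(string connectionString, DataTable rows)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.InsertReadings", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        SqlParameter p = cmd.Parameters.AddWithValue("@rows", rows);
        p.SqlDbType = SqlDbType.Structured;    // table-valued parameter
        p.TypeName = "dbo.ReadingType";        // matches the server-side table type
        conn.Open();
        cmd.ExecuteNonQuery();                 // all rows arrive in a single round trip
    }
}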

0
 
LVL 2

Author Comment

by:ZuZuPetals
ID: 24325647
I was hoping this would be an optimal solution for my earlier post:

    "Hourly Import 20K Rows... What is Best Practice? SSIS?"

I already have data in memory and have the task of getting it into the database efficiently somehow.
0
 
LVL 9

Expert Comment

by:tculler
ID: 24326172
Well, if it must be in memory, I would use one massive update. How is the data formatted in memory? That is, what are the exact types of the objects, what collection are you using, etc.?
0
 
LVL 2

Author Comment

by:ZuZuPetals
ID: 24326563
Let me rephrase the question:  The problem is this...

We're capturing data in variables in code and need to write the information as records in the database. But we need to do this many thousands of times per day, so we're looking for a way to avoid a constant stream of single-record inserts to the database, because we believe this will be less efficient than sort of "caching" them up and then periodically bulk inserting them.

The data we're collecting does not come from a structured format (it does not come from a file or table); it is captured as variable data in code and needs to be inserted as records.

Are we correct in assuming periodic bulk inserts will be more efficient than an endless string of single-record inserts? If so, does anyone have any ideas on how best to "cache" this data and then bulk insert it periodically?
0
 
LVL 2

Author Comment

by:ZuZuPetals
ID: 24326881
They are numbers stored in a C# String object, delimited by row, easily machine readable/parseable/transformable into whatever form is necessary.
0
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 24328072
In case you missed my comment, here it is again:

With SQL Server 2000, your options are down to:
1. Inserting one row at a time.
2. Passing all the rows as one using Xml or some other mechanism.
0
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 24328095
You are using SQL Server 2000, right?  If not what version are you using?
0
 
LVL 2

Author Comment

by:ZuZuPetals
ID: 24328833
Using SQL Server 2005.

I can easily construct a simple XML document from my data. How do I call the ADO.NET objects so that the data is inserted into the target database table? That would be swell!
0
 
LVL 75

Accepted Solution

by:
Anthony Perkins earned 500 total points
ID: 24331388
Using ADO.NET, you pass the XML to a stored procedure, and then you have a couple of options for shredding the XML into table(s):
1. Use OPENXML() (also supported in SQL Server 2000)
2. Use the XML data methods (new with SQL Server 2005)
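A rough sketch of the SQL Server 2005 route, with invented names (the dbo.Readings table and the <rows>/<r> element layout are assumptions); the answer above suggests a stored procedure, but the same statement is inlined here for brevity:

using System.Data;
using System.Data.SqlClient;

// Sketch only: build one XML document on the client, send it as a single parameter,
// and shred it server-side with the SQL Server 2005 XML data methods (nodes()/value()).
static void InsertViaXml(string connectionString, string xml)
{
    const string sql = @"
        INSERT INTO dbo.Readings (SensorId, Value)
        SELECT r.value('@id', 'int'),
               r.value('@val', 'float')
        FROM @xml.nodes('/rows/r') AS t(r);";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.Add("@xml", SqlDbType.Xml).Value = xml;
        conn.Open();
        cmd.ExecuteNonQuery();   // all rows are inserted in one round trip
    }
}

// Example input: "<rows><r id='1' val='3.14'/><r id='2' val='2.72'/></rows>"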
0
 
LVL 2

Expert Comment

by:muskad202
ID: 25217705
0
