Solved

LINQ-to-SQL Bulk insert optimisation

Posted on 2010-08-20
15
1,360 Views
Last Modified: 2013-11-11
Hi,

I'm working with a relatively large data export where I read data from COM objects, store it in classes generated by LINQ to SQL, and then run inserts.

For example, I have the following data structure.

Customer
CustomerId
FName
LName

Order
OrderId
OrderDateTime

Item
ItemId
ItemDescription

Customer can have 1-to-many orders.
Order must have at least 1 item.

I have a list of 100 customers, where each customer has 500 orders and each order contains 10 items.

The current implementation is as follows:
Pseudo code
foreach(Customer c in COMObject.Customers)
{

    ExportCustomer(c); // This will generate customer Id that can be used in the next foreach loop

    foreach(Order o in c.Orders)
    {
        ExportOrder(o); // This will generate order id that can be used in the next foreach loop
        foreach(Item i in o.Items)
        {
               ExportItem(i);       
        }
    }
}



Each Export method opens a SQL connection through a LINQ DataContext and submits changes to the underlying database.

First of all, the problem is that this export runs as three nested loops, so the number of inserts grows with customers × orders × items. I don't think there is a way to overcome this, since I need the generated primary key at each level before I can insert the children.

This adds up to half a million SQL calls, which is nasty, especially when GetTable&lt;DataType&gt; does a lot of work through generics and reflection.

What I'm considering is storing the entire export in memory and then running a bulk insert, so that GetTable&lt;DataType&gt; is called a few times at most.
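A minimal sketch of that in-memory idea, assuming the LINQ to SQL designer has generated a DataContext (called ExportDataContext here, an illustrative name) with Customer.Orders and Order.Items association properties. Because the associations carry the foreign keys, the identity values never need to be fetched inside the loops:

```csharp
// Sketch only: ExportDataContext and comObject are assumed names;
// entity/member names come from the question's data structure.
using (var db = new ExportDataContext())
{
    var customers = new List<Customer>();

    foreach (var c in comObject.Customers)
    {
        var customer = new Customer { FName = c.FName, LName = c.LName };

        foreach (var o in c.Orders)
        {
            var order = new Order { OrderDateTime = o.OrderDateTime };
            customer.Orders.Add(order); // association wires up CustomerId for us

            foreach (var i in o.Items)
                order.Items.Add(new Item { ItemDescription = i.ItemDescription });
        }

        customers.Add(customer);
    }

    db.Customers.InsertAllOnSubmit(customers); // GetTable<T> cost paid once
    db.SubmitChanges(); // one connection, one transaction; L2S orders the inserts
}
```

Note that SubmitChanges still issues one INSERT statement per entity, but over a single open connection in a single transaction, with the table-mapping reflection paid once per table instead of once per row.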

My question is whether there are any alternatives to this. I can't avoid using LINQ, and I can't use stored procedures either.

I'd also like to know what a reasonable amount of data to hold in memory is, and how much data is stored behind the scenes. For example, the Customer class contains only 3 members, but how much extra state is kept to map it to the SQL table?

The example I have provided is very basic, but hopefully it explains my concerns. To relate it to the system we are working on, in the worst-case scenario we'd have:
1500 customers
5 orders per customer
10 items per order

this would still add up to around 84,000 SQL calls (1,500 customer + 7,500 order + 75,000 item inserts).

Thank you

Edit
As I was reading the question again, I realised that the data in the COM object is already in memory, so my guess is that I wouldn't be storing that much extra data.

The second idea I had is to leave the DataContext open, to avoid calling GetTable&lt;DataType&gt; on each loop iteration.
Question by:capcap
15 Comments
 
LVL 19

Expert Comment

by:Bhavesh Shah
ID: 33486220
Hi,

Since you are working with large data, I would suggest inserting the data at the stored-procedure level.

It will be much easier to handle in the back end.
 
LVL 19

Expert Comment

by:Bhavesh Shah
ID: 33486278
Sorry, I forgot to ask: why can't you use stored procedures?
 

Author Comment

by:capcap
ID: 33486304
At present, time constraints don't allow us to rewrite the whole export.

Stored procedures would work in a similar way: I wouldn't be calling GetTable&lt;DataType&gt; each time, and I'd get the advantage of SQL execution-plan caching, but apart from that I don't see much benefit. Am I missing something?
 
LVL 19

Expert Comment

by:Bhavesh Shah
ID: 33486395

Please correct me if I'm wrong: I'm assuming you are trying to insert data?

In that case, I don't think you should run into any big problems.

Sorry if I'm wrong.
 

Author Comment

by:capcap
ID: 33486499
Mainly it's inserts. About 30% of the time it's a conditional insert, e.g. insert only if the object's ID is 0.

There are a few updates, but I think I can work around those.
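That conditional insert fits the in-memory batching idea from the question; a hedged sketch, where exportedCustomers and db are illustrative names for the in-memory list and an open DataContext, and CustomerId == 0 marks an object that was never persisted:

```csharp
// Queue only the rows that don't exist yet; persisted rows are left untouched.
var toInsert = exportedCustomers.Where(c => c.CustomerId == 0).ToList();

db.Customers.InsertAllOnSubmit(toInsert); // one GetTable<Customer> lookup
db.SubmitChanges();                       // one round of inserts for the batch
```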
 
LVL 19

Expert Comment

by:Bhavesh Shah
ID: 33487068
Good going, sir. That's what I'd suggest. All the best.
 
LVL 19

Expert Comment

by:Bhavesh Shah
ID: 33692695
Hi,

Is your issue resolved? Did you get an answer?
 

Author Comment

by:capcap
ID: 33714626
Hi,

None of the answers above address the issue of running bulk exports through LINQ-to-SQL.

Thank you
 
LVL 19

Expert Comment

by:Bhavesh Shah
ID: 33714738
OK.

If you need more assistance from higher-level experts, please click on the Request Attention link.
 
LVL 40

Expert Comment

by:Vadim Rapp
ID: 33720530
I think the practical alternative is to export first into a file (XML, for example) and then import the file into the database.
 

Author Comment

by:capcap
ID: 33723048
Hi Vadim,

This means the data will be flattened into XML, and then I'll have to read that XML back and build bulk SQL inserts. Isn't that twice as much work?
 
LVL 40

Accepted Solution

by:
Vadim Rapp earned 500 total points
ID: 33758929
(Sorry for the delay; for some reason apparently not all notifications work.)

I think you will still get a big performance advantage, because the SQL bulk-import utility loads the file in one bulk operation, as opposed to opening a separate connection for each row. It's probably impossible to say for certain how it will behave without trying, but that's the first thing I'd try, primarily because it's a known fact that the SQL bulk-import utility performs very well.
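A related option that stays inside .NET is System.Data.SqlClient.SqlBulkCopy, which performs the same kind of single bulk operation without writing an intermediate file. A sketch, where customers, connectionString, and the table/column names are illustrative:

```csharp
// Assumes: using System.Data; using System.Data.SqlClient;
// Stage the Customer rows in a DataTable matching the target columns.
var table = new DataTable();
table.Columns.Add("FName", typeof(string));
table.Columns.Add("LName", typeof(string));

foreach (var c in customers)
    table.Rows.Add(c.FName, c.LName);

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Customer";
    bulk.WriteToServer(table); // one bulk-insert operation instead of row-by-row calls
}
```

One caveat: SqlBulkCopy does not report the identity values it generates, so child tables (Order, Item) would need their parent keys resolved another way, e.g. bulk-load the parents first and re-query them, or carry a natural key.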
