Solved

Very long process (DataAdapter.Update), how to avoid ContextSwitchDeadlock (without simply disabling the exception)? (Urgent!)

Posted on 2006-06-25
6
663 Views
Last Modified: 2011-09-20
I have 16 commands like this:

OleDbDataAdapter.Update(DataTable);

and each one is trying to insert about 10,000 rows to a Microsoft Access 2000 table.

So, sure enough, I get the ContextSwitchDeadlock exception while debugging.

My question is how to make my application as optimized as possible so that it avoids throwing this exception, rather than simply disabling it from Debug --> Exceptions --> Managed...

And by the way, will the Release build of the application throw the same exception while running?

Thanks in advance
0
Comment
Question by:salan_alani
6 Comments
 
LVL 41

Expert Comment

by:graye
ID: 16980920
This error typically occurs while running with the debugger...   It's a new "feature" in Visual Studio 2005 that is universally hated.  OK, perhaps that's just my opinion :)

To turn off this rather annoying error, simply change a setting from the Debug menu.  Click on the Exceptions menu item to launch a dialog, drill into "Managed Debugging Assistants", and uncheck "ContextSwitchDeadlock".
0
 
LVL 2

Author Comment

by:salan_alani
ID: 16981719
I know how to turn this feature off, and I already mentioned that in my post. What I want is to make my application as optimized as possible so that it avoids throwing this exception in the first place. In other words, how do I meet the rules that keep the deadlock detection from firing? For example, how do I keep the window pumping messages, or how do I use threading in my application, where I have 16 commands like this:

OleDbDataAdapter.Update(DataTable);

and each one is trying to insert about 10,000 rows to a Microsoft Access 2000 table.
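One way to satisfy what the MDA is checking for (the UI thread must keep pumping messages) is to move the long-running Update calls onto a worker thread. Below is a minimal sketch, assuming a WinForms app; the class name and the `BuildAdapter` factory are hypothetical stand-ins for however you construct your 16 adapters:

```csharp
using System;
using System.ComponentModel;
using System.Data;
using System.Data.OleDb;

// Sketch only: BuildAdapter() is a hypothetical factory that returns the
// OleDbDataAdapter (with its InsertCommand configured) for a given table.
public class BulkUpdater
{
    private readonly BackgroundWorker worker = new BackgroundWorker();

    public BulkUpdater()
    {
        worker.WorkerReportsProgress = true;
        worker.DoWork += OnDoWork;            // runs on a thread-pool thread
        worker.ProgressChanged += OnProgress; // raised back on the UI thread
    }

    public void Start(DataTable[] tables)
    {
        worker.RunWorkerAsync(tables);        // returns immediately; UI keeps pumping
    }

    private void OnDoWork(object sender, DoWorkEventArgs e)
    {
        DataTable[] tables = (DataTable[])e.Argument;
        for (int i = 0; i < tables.Length; i++)
        {
            using (OleDbDataAdapter da = BuildAdapter(tables[i].TableName))
            {
                da.Update(tables[i]);         // the long-running call, off the UI thread
            }
            worker.ReportProgress((i + 1) * 100 / tables.Length);
        }
    }

    private void OnProgress(object sender, ProgressChangedEventArgs e)
    {
        // Safe to touch UI controls here, e.g. progressBar.Value = e.ProgressPercentage
    }

    private OleDbDataAdapter BuildAdapter(string tableName)
    {
        throw new NotImplementedException("supply your own adapter construction here");
    }
}
```

Because the UI thread never blocks, the ContextSwitchDeadlock MDA has nothing to report, in Debug or Release.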
0
 
LVL 41

Expert Comment

by:graye
ID: 16983963
So, you're comfortable with the fact that you're not really dealing with an error...  you're really just trying to optimize your code.

Aside from taking a lot of time, the process is probably also consuming a lot of memory.  So breaking up the Updates into more manageable chunks will solve the "so-called" error and also reduce the memory footprint.

One of the features that's new to the .NET Framework 2.0 is the ability to perform a Fill() incrementally.  This allows you to process the updates in sizable chunks (rather than all rows at once).  Here is a short (partial) code snippet to demonstrate the technique that I use:

                    ' Since some tables can be quite large, we break them up
                    ' into smaller chunks.
                    Dim i As Integer
                    Dim dr As DataRow
                    For i = 0 To row_count - 1 Step max_rows
                        dts(0).Clear()
                        da.Fill(i, max_rows, dts)
                        ' mark the rows as "Added" so Update() issues INSERTs
                        For Each dr In dts(0).Rows
                            dr.SetAdded()
                        Next
                        da.Update(dts(0))
                    Next

If you need more speed than that, then I'd consider using the "classic" ADO drivers for Microsoft Access.  They are connection-based, rather than the disconnected model of ADO.Net.  In extreme cases, with over half a million rows in a single table, I've used both techniques... the incremental fill and "classic" ADO.
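For comparison, here is a connected-style sketch in plain ADO.Net (not the COM ADODB library graye mentions, but it gets similar connection-based behavior): one parameterized INSERT command reused for every row, committed in a single transaction instead of once per row. The connection string, table, and column names are illustrative:

```csharp
using System;
using System.Data;
using System.Data.OleDb;

public static class FastInsert
{
    // Builds the parameterized INSERT text for the given columns.
    public static string BuildInsertSql(string table, string[] cols)
    {
        string qs = string.Join(", ", Array.ConvertAll(cols, delegate(string c) { return "?"; }));
        return "INSERT INTO [" + table + "] (" + string.Join(", ", cols) + ") VALUES (" + qs + ")";
    }

    public static void InsertRows(DataTable table, string[] cols, string connectionString)
    {
        using (OleDbConnection cn = new OleDbConnection(connectionString))
        {
            cn.Open();
            using (OleDbTransaction tx = cn.BeginTransaction())
            {
                OleDbCommand cmd = new OleDbCommand(
                    BuildInsertSql(table.TableName, cols), cn, tx);
                foreach (string col in cols)
                    cmd.Parameters.Add(col, OleDbType.Variant);

                foreach (DataRow row in table.Rows)
                {
                    for (int i = 0; i < cols.Length; i++)
                        cmd.Parameters[i].Value = row[cols[i]];
                    cmd.ExecuteNonQuery();      // command object reused per row
                }
                tx.Commit();                    // one commit instead of one per row
            }
        }
    }
}
```

Wrapping the inserts in a transaction is usually the single biggest speed-up with Jet, since it avoids a disk flush per row.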

0

 
LVL 2

Author Comment

by:salan_alani
ID: 16984582
Thanks graye, that's a great idea. But could you tell me approximately what max_rows value I should use for a normal-speed PC (Intel P4 1.7 GHz, 256 KB cache, 512 MB RAM, Windows 2000 Server Edition, ...)? Also, each of my tables has approximately 30 columns. Or, if there is a special way to calculate this value, or if you could tell me the value that you used in your projects, then I would have an idea of how big/small the value should be.

Regarding the classic ADO drivers for MS Access, I really don't have any idea how to use them. Do you have any article links on this subject, so I can learn it and see if it will help increase the speed of my application?

The thing is that my application is taking almost all of the CPU and memory, and I am trying to optimize it as much as possible. So I am waiting for your feedback, please.

Thanks for your help..
0
 
LVL 41

Accepted Solution

by:
graye earned 500 total points
ID: 16985570
Actually, I just experiment by watching the memory values in Task Manager to pick a max_rows value.   For my example above, where I have ~600,000 rows in a single table, I use 10,000 as the max_rows value.

I've just recently posted a beta version of a program that uses this technique, if you'd like to see the whole thing in action.  It uses both techniques described above (incremental fills and classic ADO).

The VB.Net source code is for an entire suite of programs (only one of which, called BackupSOSOS, will be of interest to you... the other 95% of the suite you can disregard).   You're welcome to take a look and copy/paste bits and pieces (or the whole thing).

http://home.hot.rr.com/graye/Temp/SOSOSv3_Beta
0
 
LVL 2

Author Comment

by:salan_alani
ID: 16995995
It was really great help, thank you very much
0
