
Very long process (DataAdapter.Update), how to avoid ContextSwitchDeadlock (without simply disabling the exception)? (Urgent!)

I have 16 commands like this:

OleDbDataAdapter.Update(DataTable);

and each one is trying to insert about 10,000 rows into a Microsoft Access 2000 table.
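
For context, the pattern in question looks roughly like this minimal sketch (a VB.NET illustration; the connection string, table name, and use of OleDbCommandBuilder are assumptions, not the actual code):

    ' Minimal sketch of the setup described above; the connection string
    ' and table name are illustrative assumptions.
    Imports System.Data
    Imports System.Data.OleDb

    Module UpdateSketch
        Sub Main()
            Using conn As New OleDbConnection( _
                    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb")
                conn.Open()
                Dim da As New OleDbDataAdapter("SELECT * FROM MyTable", conn)
                Dim cb As New OleDbCommandBuilder(da)  ' generates the INSERT command
                Dim dt As New DataTable()
                da.FillSchema(dt, SchemaType.Source)
                ' ... roughly 10,000 rows are added to dt here ...
                ' This single call then pushes every pending row to Access and
                ' can block the calling thread for a long time.
                da.Update(dt)
            End Using
        End Sub
    End Module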

So, for sure, I will get the ContextSwitchDeadlock exception while debugging.

My question is: how can I make my application as optimized as possible, so that it avoids throwing this exception, without simply disabling it under Debug --> Exceptions --> Managed...

And by the way, will the Release build of the application throw the same exception while running?

Thanks in advance
 
grayeCommented:
This error typically occurs while running with the debugger...   It's a new "feature" in Visual Studio 2005 that is universally hated.  OK, perhaps that's just my opinion :)

To turn off this rather annoying error, simply change a setting from the Debug menu.  Click on the Exceptions menu item to launch a dialog, drill into "Managed Debugging Assistants", and uncheck "ContextSwitchDeadlock".
 
salan_alaniAuthor Commented:
I know how to turn this feature off, and I already mentioned that in my post. What I want is to make my application as optimized as possible so that it avoids throwing this exception. In other words, how do I meet all the conditions that keep the deadlock from being detected? For example, how do I keep the window pumping messages, or how do I use threading in my application, where I have 16 commands like this:

OleDbDataAdapter.Update(DataTable);

and each one is trying to insert about 10,000 rows into a Microsoft Access 2000 table.
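
For background on the threading half of that question: the ContextSwitchDeadlock MDA fires when an STA thread (typically the UI thread) goes about 60 seconds without pumping messages. So one standard approach, sketched below under assumed names in a Windows Forms setup, is to run the long Update calls on a worker thread so the UI thread never stops pumping:

    ' A minimal sketch, assuming a Windows Forms app; the list names and
    ' wiring are hypothetical, not taken from the actual project.
    Imports System.Collections.Generic
    Imports System.ComponentModel
    Imports System.Data
    Imports System.Data.OleDb

    Public Class MainForm
        Inherits System.Windows.Forms.Form

        ' The 16 adapter/table pairs, prepared elsewhere (assumption)
        Private adapters As New List(Of OleDbDataAdapter)
        Private tables As New List(Of DataTable)
        Private WithEvents worker As New BackgroundWorker()

        Private Sub StartUpdates()
            worker.RunWorkerAsync()   ' returns immediately; the UI keeps pumping
        End Sub

        Private Sub worker_DoWork(ByVal sender As Object, _
                                  ByVal e As DoWorkEventArgs) Handles worker.DoWork
            ' Runs on a thread-pool thread, so the STA UI thread is never
            ' blocked and the deadlock detector has nothing to report.
            For i As Integer = 0 To adapters.Count - 1
                adapters(i).Update(tables(i))
            Next
        End Sub
    End Class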
 
grayeCommented:
So, you're comfortable with the fact that you're not really dealing with an error...  and are just trying to optimize your code.

Aside from the fact that the process takes a lot of time, it's also probably consuming a lot of memory.  So breaking up the Updates into more manageable chunks will avoid the "so-called" error and also reduce the memory footprint.

One of the features that's new to the .NET Framework 2.0 is the ability to perform a Fill() incrementally.  This allows you to process the updates in manageable chunks (rather than all rows at once).  Here is a short (partial) code snippet to demonstrate the technique that I use:

                    ' Since some tables can be quite large, we break them up
                    ' into smaller chunks.
                    For i = 0 To row_count Step max_rows
                        dts(0).Clear()
                        da.Fill(i, max_rows, dts)
                        ' mark the rows as "Added" so Update() issues INSERTs
                        For Each dr As DataRow In dts(0).Rows
                            dr.SetAdded()
                        Next
                        da.Update(dts(0))
                    Next
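
For readers filling in the blanks, here is one way the surrounding declarations might look, as a self-contained sketch. The connection strings, table name, and use of OleDbCommandBuilder are assumptions; the idea is copying rows from a source table to a destination table in chunks of max_rows:

    ' Fleshed-out version of the snippet above; connection strings, the
    ' query, and the max_rows value are illustrative assumptions.
    Imports System.Data
    Imports System.Data.OleDb

    Module ChunkedCopy
        Sub Main()
            Dim max_rows As Integer = 10000

            Using srcConn As New OleDbConnection( _
                    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\source.mdb")
                Using dstConn As New OleDbConnection( _
                        "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\dest.mdb")
                    srcConn.Open()
                    dstConn.Open()

                    ' Total number of rows to copy
                    Dim countCmd As New OleDbCommand( _
                        "SELECT COUNT(*) FROM MyTable", srcConn)
                    Dim row_count As Integer = CInt(countCmd.ExecuteScalar())

                    ' Source adapter reads in chunks; destination adapter writes them
                    Dim da As New OleDbDataAdapter("SELECT * FROM MyTable", srcConn)
                    Dim da_dst As New OleDbDataAdapter("SELECT * FROM MyTable", dstConn)
                    Dim cb As New OleDbCommandBuilder(da_dst)  ' generates the INSERTs

                    Dim dts(0) As DataTable
                    dts(0) = New DataTable()

                    ' Only max_rows rows are ever held in memory at once
                    For i As Integer = 0 To row_count - 1 Step max_rows
                        dts(0).Clear()
                        da.Fill(i, max_rows, dts)
                        ' Freshly-filled rows are "Unchanged"; flip them to
                        ' "Added" so Update() inserts them into the destination
                        For Each dr As DataRow In dts(0).Rows
                            dr.SetAdded()
                        Next
                        da_dst.Update(dts(0))
                    Next
                End Using
            End Using
        End Sub
    End Module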

If you need more speed than that, I'd consider using the "classic" ADO drivers for Microsoft Access.  They are connection-based, rather than following the disconnected model of ADO.Net.  In extreme cases, with over half a million rows in a single table, I've used both techniques... the incremental fill and "classic" ADO.
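
For reference, a minimal sketch of what the "classic" ADO route can look like from .NET, via COM interop. It assumes a project reference to the "Microsoft ActiveX Data Objects 2.x Library"; the connection string and table/field names are illustrative:

    ' Hypothetical sketch of connected ("classic") ADO inserts via COM
    ' interop; connection string and table/field names are assumptions.
    Imports ADODB

    Module ClassicAdoInsert
        Sub Main()
            Dim conn As New ADODB.Connection()
            conn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb")

            Dim rs As New ADODB.Recordset()
            rs.Open("MyTable", conn, CursorTypeEnum.adOpenKeyset, _
                    LockTypeEnum.adLockOptimistic, CommandTypeEnum.adCmdTableDirect)

            ' Connected inserts: each AddNew/Update pair writes one row
            ' directly, without building a disconnected DataTable first.
            For i As Integer = 1 To 10000
                rs.AddNew()
                rs.Fields("Name").Value = "Row " & i.ToString()
                rs.Update()
            Next

            rs.Close()
            conn.Close()
        End Sub
    End Module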

 
salan_alaniAuthor Commented:
Thanks graye, it's a great idea. But could you tell me approximately what max_rows value I should use for a normal-speed PC (Intel P4 1.7 GHz, 256 KB cache, 512 MB RAM, Windows 2000 Server Edition, ...)? Each of my tables has approximately 30 columns. Is there a special way to calculate this value? Or, if you could tell me the value you used in your projects, that would give me an idea of how big or small the value should be.

Regarding the classic ADO drivers for MS Access, I really don't have any idea how to use them. Do you have any article links on this subject, so I can learn it and see if it will help me increase the speed of my application?

The thing is that my application is taking almost all of the CPU and memory, and I am trying to optimize it as much as possible. So I am waiting for your feedback, please.

Thanks for your help..
 
grayeCommented:
Actually, I just experiment by watching the memory values in the Task Manager to pick a max_rows value.   For my example above, where I have ~600,000 rows in a single table, I use 10,000 as the max_rows value.
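
If you'd rather log the numbers than watch Task Manager, a small probe like the sketch below can be called after each da.Update() while you tune max_rows. GC.GetTotalMemory and Process.WorkingSet64 are standard .NET 2.0 calls; the module and method names are made up:

    ' Optional instrumentation sketch: log managed and process memory after
    ' each chunk to help pick a max_rows value.
    Imports System.Diagnostics

    Module MemoryProbe
        Sub ReportMemory(ByVal chunkIndex As Integer)
            Dim managedBytes As Long = GC.GetTotalMemory(False)
            Dim workingSet As Long = Process.GetCurrentProcess().WorkingSet64
            Console.WriteLine("Chunk {0}: managed = {1:N0} bytes, working set = {2:N0} bytes", _
                              chunkIndex, managedBytes, workingSet)
        End Sub
    End Module

The idea is to raise max_rows until the working set approaches whatever limit you're comfortable with on the target machine.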

I've just recently posted a beta version of a program that uses this technique, if you'd like to see the whole thing in action.  It uses both techniques described above (incremental fills and classic ADO).

The VB.Net source code is for an entire suite of programs (only one of which, called BackupSOSOS, will be of interest to you... you can disregard the other 95% of the suite).   You're welcome to take a look and copy/paste bits and pieces (or the whole thing).

http://home.hot.rr.com/graye/Temp/SOSOSv3_Beta
 
salan_alaniAuthor Commented:
That was really great help, thank you very much.