ORA-01000: maximum open cursors exceeded

I have an application that iterates through a flat file, parses out some fields, and inserts those fields into a table.  I have previously run this in a test environment and inserted over 8K rows without a problem.

The Oracle installation on the server was updated, and now I can only insert 444 rows before hitting this error.  I make a point of closing my connection and command objects, but I still have this problem.

Please advise.
Howard Bash (Senior Software Engineer) asked:
Howard Bash (Author, Senior Software Engineer) commented:
I have checked every connection object and command object that I create and associate with the database connection.  I have tried both a single global connection object and closing and reopening one connection object for each access, and either way I still fail with the "maximum open cursors exceeded" error message.

The only thing that seemed to "fix" this problem was calling GC.Collect().

käµfm³d 👽 commented:
Are you reusing the same connection object for the insert or are you creating a new object for each insert? Can you post an excerpt of how you interact with the DB?
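A minimal sketch of the pattern being asked about: one connection and one parameterized INSERT reused for every row, with the cursor closed explicitly at the end. Python's stdlib sqlite3 stands in here for the Oracle data provider in the asker's VB.NET app (an assumption purely for illustration; the provider names differ, the shape does not):

```python
import sqlite3

rows = [("alice", 30), ("bob", 25), ("carol", 41)]

with sqlite3.connect(":memory:") as conn:
    conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
    cur = conn.cursor()
    for name, age in rows:
        # Bind parameters: the statement text never changes, so the
        # database can reuse one parsed cursor for every row.
        cur.execute("INSERT INTO people (name, age) VALUES (?, ?)", (name, age))
    cur.close()  # release the cursor explicitly when the loop is done
    count = conn.execute("SELECT COUNT(*) FROM people").fetchone()[0]

print(count)  # 3
```

The point of contrast with the failing code: nothing in the loop creates a new connection or leaves a statement handle dangling per row.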

slightwv (䄆 Netminder) commented:
Did you know Oracle comes with a program to do exactly this?  It's called SQL*Loader.

 
Akenathon commented:
You might just need more cursors:

select * from v$parameter where name='open_cursors';



To raise the parameter:

ALTER SYSTEM SET open_cursors = 1000;



I agree with the previous comment that SQL*Loader or (preferably) external tables are the way to go for what you are trying to accomplish.

slightwv (䄆 Netminder) commented:
You should not need to increase the open_cursors parameter for 444 rows.  It's likely a logic bug that kaufmed is hinting at.

Akenathon commented:
It depends on how many cursors are left when the code runs, and on how the code performs the inserts.

For instance: If he used bind variables, he'd use only one cursor. So chances are he's using dynamic SQL to concatenate the values and execute an INSERT. That's bad practice for a number of reasons, including increased parse times, more shared memory used, and opening the door to SQL injection.

The best solution would be to rewrite the whole thing using external tables or else SQL*Loader. However, a quick fix is to increase the parameter, and it is not a bad one: if his code was only 444 cursors away from exhausting the limit, he may hit it sooner or later in other portions of code anyway.
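The bind-variable point above can be sketched without a database at all: concatenating values produces a different SQL text per row, each of which must be parsed separately (and, in Oracle, can hold its own cursor), while bind variables leave one shared statement. A toy illustration in Python; note that the embedded quote in "O'Brien" would also break the concatenated version outright, which is the SQL-injection door mentioned above:

```python
rows = [("O'Brien", 1), ("Smith", 2), ("Jones", 3)]

# One distinct SQL text per row when values are concatenated in...
concatenated = {f"INSERT INTO t VALUES ('{name}', {n})" for name, n in rows}

# ...versus a single shared text when placeholders are used.
bound = {"INSERT INTO t VALUES (?, ?)" for _ in rows}

print(len(concatenated), len(bound))  # 3 1
```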

Adjusting CURSOR_SHARING is also a possibility, but changing it instance-wide would require exhaustive testing first. At the session level, though, it is a reasonable thing to run before the INSERT loop:

alter session set cursor_sharing=force;



But again, being only 444 cursors from the limit leaves very little headroom, IMHO.

johnsone (Senior Oracle DBA) commented:
Even if you are not using bind variables, which I agree is bad practice, the cursor should not remain open after the insert statement has run.

Also, OPEN_CURSORS is a per-session limit.  Why would a single session ever need 1000 open cursors?

Howard Bash (Author, Senior Software Engineer) commented:
I close and dispose of all connection and command objects for each row (added to see if it corrects this).  Also, in my googling I see that Oracle apparently has some issues with freeing up connections even after the underlying code requests it.  There are many matches for this error, with most responses alluding to a problem with Oracle in this regard.

Regarding modifying the server settings: it's a production server and I cannot do that.

In sniffing around more for answers, what seems to work (I will do more testing) is calling garbage collection from the vb.net application every 200 inserts or so, and that somehow does the trick.  It's not clear what is happening to make Oracle release these cursors, but with that addition I have successfully inserted 9K rows.
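The asker's workaround can be sketched as a simple counter in the insert loop: force a garbage collection every N rows so the runtime finalizes any provider objects that were dropped without being disposed. Python's gc module stands in here for the GC.Collect() call in the VB.NET app, and the insert itself is elided:

```python
import gc

N = 200        # collect every N inserts, per the workaround above
inserted = 0
collections = 0

for row in range(900):   # pretend each iteration inserts one row
    # ... insert_row(row) would go here ...
    inserted += 1
    if inserted % N == 0:
        gc.collect()     # force finalization of leaked handles
        collections += 1

print(inserted, collections)  # 900 4
```

As the later comments note, this treats the symptom; the underlying fix is to dispose every command and connection object deterministically.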

slightwv (䄆 Netminder) commented:
I've not seen this problem when Dispose is manually called after Close.  I would verify that you are disposing all objects and not missing one.

I've also run into this: for some reason Oracle's cleanup still sees .NET connections as 'live' even after .NET has closed them, and it never releases them.  To date, as long as I make sure I dispose the objects, I've not run into the problem again.

johnsone (Senior Oracle DBA) commented:
I don't know a lot about the application side.  Are you using connection pooling?  I have run into issues with various connection pooling applications where connections are not correctly cleaned up.  That is not an issue with the database; it is an issue with the connection pool.  You request the connection to be closed, but the pool leaves it open hoping to reuse it; it never does, and that causes this problem.

slightwv (䄆 Netminder) commented:
It might be a bug of some type.

To assist you further, we would need the version numbers of the database and client, the version/type of data access provider you are using, and some code.

Howard Bash (Author, Senior Software Engineer) commented:
All good suggestions.  However, the GC.Collect() call worked.
Question has a verified solution.
