Solved

ORA-01000: maximum open cursors exceeded

Posted on 2010-11-29
Medium Priority
2,111 Views
Last Modified: 2012-06-27
I have an application that iterates through a flat file and parses out some fields and inserts those fields into a table.  I have previously run this in a test environment and inserted over 8K rows without a problem.

The Oracle installation on the server was updated, and now I can only insert 444 rows before hitting this error. I make it a point to close connections and command objects, but I still have this problem.

Please advise.
Question by:Howard Bash
12 Comments
 
LVL 75

Assisted Solution

by:käµfm³d 👽
käµfm³d   👽 earned 252 total points
ID: 34232466
Are you reusing the same connection object for the insert or are you creating a new object for each insert? Can you post an excerpt of how you interact with the DB?
 
LVL 77

Assisted Solution

by:slightwv (䄆 Netminder)
slightwv (䄆 Netminder) earned 1000 total points
ID: 34232561
Did you know Oracle comes with a program to do this? It's called SQL*Loader.
 
LVL 11

Expert Comment

by:Akenathon
ID: 34232657
You might just need more cursors:

select * from v$parameter where name='open_cursors';

To raise the parameter:

ALTER SYSTEM SET open_cursors = 1000;

I agree with the previous comment that SQL*Loader or (preferably) external tables are the way to go for what you are trying to accomplish.
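
As a rough sketch of the external table route (the directory, file, table and column names below are just placeholders), the whole load collapses into one INSERT ... SELECT:

-- One-time setup: point the database at the directory holding the flat file
CREATE DIRECTORY flat_file_dir AS '/data/loads';

-- Describe the flat file as an external table
CREATE TABLE staging_ext (
  field1  VARCHAR2(50),
  field2  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY flat_file_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('myfile.txt')
);

-- Load everything in a single statement: one cursor, no client-side loop
INSERT INTO target_table (col1, col2)
SELECT field1, field2
FROM   staging_ext;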
 
LVL 77

Assisted Solution

by:slightwv (䄆 Netminder)
slightwv (䄆 Netminder) earned 1000 total points
ID: 34232690
You should not need to increase the open_cursors parameter for 444 rows.  It's likely a logic bug that kaufmed is hinting at.
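
If you want to check how close the loading session actually gets, this (assuming you can query the v$ views) shows how many cursors each session currently holds open; run it while the load is going and see whether the count keeps climbing:

SELECT s.sid, s.value AS open_cursors_now
FROM   v$sesstat s, v$statname n
WHERE  s.statistic# = n.statistic#
AND    n.name = 'opened cursors current'
ORDER  BY s.value DESC;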
 
LVL 11

Assisted Solution

by:Akenathon
Akenathon earned 252 total points
ID: 34233088
It depends on how many cursors he has left when executing the code, and on how the code does it.

For instance: If he used bind variables, he'd use only one cursor. So chances are he's using dynamic SQL to concatenate the values and execute an INSERT. That's bad practice for a number of reasons, including increased parse times, more shared memory used, and opening the door to SQL injection.
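
As a sketch (the table and column names here are made up), the difference looks like this:

-- Concatenated literals: every row produces a distinct statement,
-- each of which is hard parsed and takes its own spot in the shared pool
INSERT INTO target_table (col1, col2) VALUES ('Smith', 42);
INSERT INTO target_table (col1, col2) VALUES ('Jones', 43);

-- Bind variables: the statement text never changes, so one cursor
-- is parsed once and simply re-executed for every row
INSERT INTO target_table (col1, col2) VALUES (:c1, :c2);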

The best solution would be to rewrite the whole thing using external tables or else SQL*Loader. However, a quick fix is to increase the parameter... and it's not a bad one: if he's only 444 cursors away from exhausting the limit, he may hit it sooner or later in other portions of the code anyway.

Adjusting CURSOR_SHARING is also a possibility, but it would take some expertise and exhaustive testing before changing it instance-wide. It is, however, a reasonable thing to do at the session level before the INSERT loop:

alter session set cursor_sharing=force;

But again... only 444 cursors of headroom before hitting the limit can be too little, IMHO.
 
LVL 35

Assisted Solution

by:johnsone
johnsone earned 496 total points
ID: 34233181
Even if you are not using bind variables, which I agree is bad practice, the cursor should not remain open once the insert statement has finished running.

Also, OPEN_CURSORS is a per-session limit. Why would a single session ever need 1,000 open cursors?
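
If the application really is leaving cursors open, they should show up in V$OPEN_CURSOR for that session (again, assuming access to the v$ views; replace the user name below with the schema the application connects as). Hundreds of near-identical INSERT statements there would confirm the leak:

SELECT sid, user_name, sql_text
FROM   v$open_cursor
WHERE  user_name = 'YOUR_APP_USER'
ORDER  BY sid;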
 
LVL 1

Author Comment

by:Howard Bash
ID: 34233964
I close and dispose of all connection and command objects per row (this was added to see if it corrects the problem). Also, in my googling I see that Oracle "apparently" has some issues with freeing up connections even after the underlying code requests it. There are many matches for this error, with most responses alluding to a problem with Oracle in this regard.

Regarding modifying the server settings: it's a production server and I cannot do that.

In sniffing around more for answers, what I found that seems to work (I will do more testing) is that every 200 inserts or so, I call garbage collection in the VB.NET application, and that somehow does the trick. It's not clear what is happening to make Oracle release these connections, but I have successfully inserted 9K rows with this added call to garbage collection.
 
LVL 77

Assisted Solution

by:slightwv (䄆 Netminder)
slightwv (䄆 Netminder) earned 1000 total points
ID: 34234000
I've not seen this problem when you manually call dispose after close.  I would verify that you are disposing all objects and not missing one.

I've also run into this. For some reason Oracle's cleanup still sees .NET connections as 'live' even though .NET has closed them, and it never releases them. To date, as long as I make sure I dispose of the objects, I've not run into the problem again.
 
LVL 35

Assisted Solution

by:johnsone
johnsone earned 496 total points
ID: 34234016
I don't know a lot about the application side. Are you using connection pooling? I have run into issues with various connection pooling implementations where connections are not correctly cleaned up. That is not an issue with the database; that is an issue with the connection pool. You request that the connection be closed, but the pool leaves it open hoping to reuse it; it never does, and that causes this problem.
 
LVL 1

Accepted Solution

by:Howard Bash
Howard Bash earned 0 total points
ID: 34236202
I have searched/checked every connection object and command object that I create and associate with the database connection. I have tried having one global connection object for the database, and also closing and reopening the one connection object for each access, and I always blow up with a maximum open cursors exceeded error message.

The only thing that seemed to "fix" this problem was calling GC.Collect().
 
LVL 77

Assisted Solution

by:slightwv (䄆 Netminder)
slightwv (䄆 Netminder) earned 1000 total points
ID: 34238495
It might be a bug of some type.  

To further assist you, we would need the version numbers of the database and client, the version/type of data access provider you are using, and some code.
 
LVL 1

Author Closing Comment

by:Howard Bash
ID: 34289824
All good suggestions. However, the GC.Collect() call worked.
