Error "Resource limit exceeded..."

I get the following error on running a VFP6 (sp3) executable.
The program reads a DB2/400 database on an IBM AS/400 (a CISC machine running V3R2M0 of OS/400) using IBM Client Access (version 3.1.2). I use an ADO connection in a VB6 ActiveX DLL. The AS/400 in question is very heavily loaded, and I feel this could be causing the error below, but I have no ammunition with which to prove it to the customer. Initially I thought the problem was to do with the number of licenses the customer had for query/SQL/Client Access, but their license limit is set to *NOMAX, so that is not the issue.
The query runs OK for fewer than 1,000 records, but the error below occurs when the number is greater than 1,000.

Error: OLE IDispatch exception code 0 from dbQuery.Execute: [IBM][Client Access ODBC Driver (32-bit) ][DB2/400 SQL]SQL0904 - Resource limit exceeded…
What causes this?
Thank you all in advance

P_S_Price Commented:
I am inclined to disagree with deighton; I believe it is your SQL statement. The error you have posted clearly identifies which resource is breached, and an IN clause of 12,122 characters could quite likely come within that remit!

You need to find the message on the AS/400 to determine which of the following caused the error.

Find the job that services the SQL call and look at its joblog.

From the AS/400 message files

Command: WRKMSGF QSQLMSG

Message ID . . . . . . . . . :   SQL0904                                      
Message file . . . . . . . . :   QSQLMSG                                      
  Library  . . . . . . . . . :     QSYS                                        
Message . . . . :   Resource limit exceeded.                                  
Cause . . . . . :   Resource limit type &1 exceeded.  A list of the limit      
  types follows:                                                              
    -- Type 1 indicates that the user profile storage limit or the machine    
  storage limit was exceeded.                                                  
    -- Type 2 indicates that the machine lock limit was exceeded.              
    -- Type 3 indicates that the query resource limit was exceeded.  For more  
  information see the previously listed message CPD4365.                      
    -- Type 4 indicates that a journal error has occurred.                    
    -- Type 5 indicates that the commit lock limit was exceeded.              
    -- Type 6 indicates that the maximum size of the table has been reached.  
    -- Type 7 indicates that the maximum size of the prepared statement area  
  has been reached.                                

 Recovery  . . . :   Do one of the following: If this is error type 1, contact  
   the security officer to increase the user profile storage limit, or delete  
   some objects to free up storage and then try the request again.              
     -- If this is error type 2, then try the operation when the number of      
   machine locks held has decreased.                                            
     -- If this is error types 3, 4, or 5, see previously listed messages in    
   the job log for recovery information.                                        
     -- If this is error type 6, some of the rows from this table must be moved
   to another table.                                                            
     -- If this is error type 7, issue a COMMIT or ROLLBACK without the HOLD    
   clause before issuing anymore PREPARE statements.                            
I suspect that you do have a commit problem. If memory serves me correctly, on operating systems that old the commitment-control threshold was a maximum of 1,000 records (i.e. you could only have up to 1,000 records in a commit block before issuing a COMMIT or ROLLBACK).

93MH1351Author Commented:
The info given is not detailed enough to solve the problem
Have you got the message data up on the AS/400?  

You need to look at this to determine exactly which error scenario you have.

Please let us know!
OK, more info. I suspect that you are accessing the records for update. The AS/400 uses journalling to support commitment control, which, on your version of the operating system, supports a maximum of 1,000 records in a commit block. That is, you can read up to 1,000 records for update without issuing a COMMIT or ROLLBACK, but when you read record 1,001 you get the resource error. You should ensure that your VB ActiveX DLL (or whatever you are using to access the AS/400) either does not use a committed transaction set, or issues a COMMIT after each retrieval operation. The dead giveaway here is the 1,000-record business. I would suggest that you post details (code) of how you are accessing the data.
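To make the "commit after each block" idea concrete, here is a minimal sketch in Python (language chosen for illustration only; the thread's actual code is VB6/ADO). The `update_row` and `commit` callables are placeholders for whatever your data layer provides; the 1,000-record limit is the V3R2 commitment-control ceiling discussed above.

```python
# Sketch: batch updates so no commit block exceeds the old OS/400
# limit of 1,000 records under commitment control.
COMMIT_BLOCK_LIMIT = 1000  # V3R2 commitment-control ceiling cited above

def process_with_commits(rows, update_row, commit, block_size=COMMIT_BLOCK_LIMIT):
    """Apply update_row to each row, committing before any commit
    block grows past block_size records."""
    pending = 0
    for row in rows:
        update_row(row)
        pending += 1
        if pending >= block_size:
            commit()  # close the commit block before record 1,001
            pending = 0
    if pending:
        commit()      # flush the final partial block
```

In ADO terms this corresponds to calling the Connection object's CommitTrans between blocks rather than holding one transaction open across the whole result set.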
93MH1351Author Commented:
The message below is the CPD4365 message that is coming back on the AS/400 when the PC program bombs out with the SQL0904 message. I have checked the user's joblog in subsystem QZDASOINIT, and it throws up the following message with a reason code 1.
I have no idea what this means.
Does this indicate whether the problem is coming from the AS400 or the PC?

The message below does not give any more detail for reason code 1.
What causes this?
How do I work around it?

Message ID . . . . .. :   CPD4365       Severity . . . . . . . :   40        
Message type . . . . . :   Diagnostic                                          
Date sent  . . . . . . :   08/05/00      Time sent  . . . . . . :   09:28:02  
Message . . . . :   Query resource limit exceeded. Reason code 1.              
Cause . . . . . :   The type of resource limit exceeded is indicated by reason  
  code 1.  The reason codes and their reasons follow:                          
    1 -- More than 1,024 literals.                                            
    2 -- More than 32,768 fields in an internal buffer.                        
    3 -- More than 65,504 bytes in an internal buffer.                        
    4 -- More than 4,194,304 bytes in an internal work space.                  
    5 -- An attempt to create an internal query object exceeded the maximum size for such an object
    6 -- More than 8,000 fields in a record format are required for creation  
  of a temporary result.                                                      
    7 -- More than 32,766 bytes in a record format required for the creation  
Ah, this may be your SQL request. Please post your code and the SQL request.
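Reason code 1 above is a hard ceiling of 1,024 literals per statement. One workaround is to split a long key list into several IN clauses, each under that limit. A minimal sketch (Python for illustration; `build_in_clauses` is a hypothetical helper, not part of any driver):

```python
# Sketch: split a long list of keys into IN clauses that each stay
# under the 1,024-literal ceiling from CPD4365 reason code 1.
MAX_LITERALS = 1024

def build_in_clauses(column, values, limit=MAX_LITERALS):
    """Yield WHERE fragments, each containing at most `limit` literals."""
    for i in range(0, len(values), limit):
        chunk = values[i:i + limit]
        literals = ", ".join(str(v) for v in chunk)
        yield f"{column} IN ({literals})"
```

Each fragment would then be run as its own query (or combined with OR, provided the total statement stays within the other limits listed above).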
93MH1351Author Commented:
Adjusted points from 30 to 50
93MH1351Author Commented:
I'll post all the SQL soon, but so far I have discovered that an IN clause in the SQL statement has a string 12,122 characters in length!
I am debugging somebody else's work with no comments in the code, so please bear with me. You are my only hope!
Since your problem occurs once you have 1,000+ records, I believe the file is being processed to produce SQL based on all the records in it.

Once the file hits a certain size, the SQL is too large to run.

I suspect the SQL statement is being generated in some sort of dynamic way, based on the file he has; hence the error being hit at 1,000 records. I agree that the SQL statement ultimately in use is at fault.

Obviously further details are required.
Appendix A ("SQL Limits") of the SQL Reference manual lists only one item that is limited to 1024:

"Most host variables in an SQL statement:  approximately 1024 (The limit is based on the number of host variables that can fit within the longest SQL statement of 32767 bytes."

At first it seems odd that your error is based upon the number of rows of the query. Perhaps the program is doing a "Multiple-Row FETCH Using a Host Structure Array". Look for a statement in the program like "EXEC SQL FETCH D11 FOR nnnn ROWS INTO :DEPT". You may have to reduce the "nnnn ROWS" to a smaller number and run the FETCH statement multiple times in order to process all of your data.
93MH1351Author Commented:
The answer is not the solution, but it is very close.
The problem was that an IN clause in the SQL statement was too large.
93MH1351Author Commented:
Comment accepted as answer
93MH1351Author Commented:
I changed the IN clause to read from a table instead of passing a long string of 12,122 characters. It works fine.
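The work-table pattern the author settled on can be sketched as follows: load the key list into a table, then join instead of building a giant IN string. Demonstrated here with Python's stdlib sqlite3 module purely for illustration; the table and column names (ORDERS, KEYLIST, ORDNO) are made up, and on the AS/400 the same shape would be done in DB2/400 SQL.

```python
# Sketch: replace a huge IN (...) literal list with a key table + join.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ORDERS (ORDNO INTEGER, AMOUNT INTEGER)")
con.executemany("INSERT INTO ORDERS VALUES (?, ?)",
                [(n, n * 10) for n in range(5)])

keys = [1, 3]  # the values that used to live in the IN (...) string
con.execute("CREATE TABLE KEYLIST (ORDNO INTEGER)")
con.executemany("INSERT INTO KEYLIST VALUES (?)", [(k,) for k in keys])

rows = con.execute(
    "SELECT O.ORDNO, O.AMOUNT FROM ORDERS O "
    "JOIN KEYLIST K ON K.ORDNO = O.ORDNO ORDER BY O.ORDNO"
).fetchall()
```

This sidesteps the literal-count limit entirely, since the key values travel as rows rather than as text inside the statement.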
Question has a verified solution.
