Melodi Roberts

asked on

Passing list of object to Oracle Database Procedure

As part of learning Java, I have written code that loops through a list of objects and calls an Oracle database procedure for each object, passing individual values from the object rather than the entire object. It runs fine, does what it needs to do, and seems fairly efficient, but I may redesign it for improved efficiency, modularity, and error handling.
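For reference, here is a minimal sketch of that per-row approach, assuming a hypothetical process_order procedure and an illustrative OrderRecord class (both names are placeholders, not the real schema):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;

// Illustrative value object; field names are placeholders.
class OrderRecord {
    long id;
    String status;
    OrderRecord(long id, String status) { this.id = id; this.status = status; }
}

public class PerRowProcedureCall {
    // Calls the stored procedure once per list element, passing scalar values.
    // Transaction control (commit/rollback) is left to the caller.
    static void processOneByOne(Connection conn, List<OrderRecord> records) throws SQLException {
        try (CallableStatement cs = conn.prepareCall("{call process_order(?, ?)}")) {
            for (OrderRecord r : records) {
                cs.setLong(1, r.id);
                cs.setString(2, r.status);
                cs.execute();   // one database round trip per element
            }
        }
    }
}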

I could redesign it to use an Oracle type and pass the list of objects to the database, store all the values in a global temp table, and then finish all other processing from that point based on the data in the global temp table.
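If I went the Oracle type route, I believe the call would look roughly like this. This is a sketch only, assuming a SQL collection type such as CREATE TYPE id_tab AS TABLE OF NUMBER, a hypothetical process_orders procedure, and a 12c-or-later JDBC driver (older drivers use ArrayDescriptor/ARRAY instead):

import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import oracle.jdbc.OracleConnection;

public class ArrayProcedureCall {
    // Passes the whole list to the database in one call as a SQL collection.
    static void processAsArray(Connection conn, Object[] ids) throws SQLException {
        OracleConnection oconn = conn.unwrap(OracleConnection.class);
        Array arr = oconn.createOracleArray("ID_TAB", ids);   // type name is an assumption
        try (CallableStatement cs = conn.prepareCall("{call process_orders(?)}")) {
            cs.setArray(1, arr);
            cs.execute();   // single round trip for the entire list
        }
    }
}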

The list could contain anywhere from one object to thousands. I will not know the size in advance, and it may grow in the future.

Should I be concerned with the size of the list I am passing to the database? Should I instead pass it in batches using the update batching functionality? I know there is also the option to insert into the table directly from the Java side. I am exploring the options and the benefits and drawbacks of each right now.
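For the batching option, I am picturing standard JDBC batching along these lines; a sketch assuming a hypothetical gtt_orders table and reusing the illustrative OrderRecord class from the sketch above:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertExample {
    // Queues rows with addBatch() and flushes them to the database in chunks.
    static void batchInsert(Connection conn, List<OrderRecord> records) throws SQLException {
        conn.setAutoCommit(false);
        int batchSize = 500;   // illustrative tuning value, not a recommendation
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO gtt_orders (id, status) VALUES (?, ?)")) {
            int count = 0;
            for (OrderRecord r : records) {
                ps.setLong(1, r.id);
                ps.setString(2, r.status);
                ps.addBatch();
                if (++count % batchSize == 0) {
                    ps.executeBatch();   // send a full chunk to the database
                }
            }
            ps.executeBatch();           // send the remainder
        }
        conn.commit();
    }
}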

I am reading through many websites, and these two in particular, to learn more: http://betteratoracle.com/posts/26-passing-arrays-between-java-and-oracle-procedures
http://docs.oracle.com/cd/B28359_01/java.111/b31224/oraperf.htm#CHDCCEHD

Thank you for any feedback or direction on what I should be considering with the design.
johnsone

What I would do is have Java load the temporary table and then call the procedure. Don't try to pass all of that data and then parse it; "pass" it by loading it into the table.
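A minimal sketch of that load-then-call flow, assuming a hypothetical gtt_orders global temporary table and process_gtt_orders procedure, and reusing the illustrative OrderRecord class from the question. If the GTT is defined ON COMMIT DELETE ROWS, the load and the procedure call have to share one transaction, as shown here:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class LoadThenCall {
    // Loads the temporary table with a batched insert, then runs the
    // procedure against it, committing once at the end.
    static void loadAndProcess(Connection conn, List<OrderRecord> records) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO gtt_orders (id, status) VALUES (?, ?)")) {
            for (OrderRecord r : records) {
                ps.setLong(1, r.id);
                ps.setString(2, r.status);
                ps.addBatch();
            }
            ps.executeBatch();
        }
        try (CallableStatement cs = conn.prepareCall("{call process_gtt_orders}")) {
            cs.execute();   // business logic reads gtt_orders inside the database
        }
        conn.commit();      // single commit covering load + processing
    }
}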
Melodi Roberts

ASKER

Thank you for the feedback.

So it's more efficient to handle the inserts on the Java side and then execute the business logic against the global temp table in the database. I was a bit worried about passing all that data in one fell swoop, since I have no control over, or advance knowledge of, the amount of data; I'm guessing that could cause efficiency issues. I can't find any information on the limits to the size/amount of data that can be passed in one call, but...

Doing it within Java gives the advantage of more control because, for example, I can loop through the prepared List<Object> and then use the update batching functionality to commit sets of records.

The thing I have to figure out now is where to provide the information on the status of each object processed. I can either store it in the global temp table and then reference the table from the calling Java, looping through it and doing something with the status on the Java side (e.g., outputting to a log file), or handle it on the database side, outputting to a log file there. I guess that is driven more by the functionality required.
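If I go the table route, reading the statuses back on the Java side would look something like this; a sketch with assumed column names (process_status, error_message), run before the final commit if the GTT deletes its rows on commit:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class StatusReadBack {
    // Reads a per-row status back from the temporary table and logs it.
    static void logStatuses(Connection conn) throws SQLException {
        String sql = "SELECT id, process_status, error_message FROM gtt_orders";
        try (PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.printf("id=%d status=%s message=%s%n",
                        rs.getLong("id"),
                        rs.getString("process_status"),
                        rs.getString("error_message"));
            }
        }
    }
}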

By the way, is it common practice, when downloading data from a third-party server, to first load it into a global temp table and then execute the remaining logic in the database?
ASKER CERTIFIED SOLUTION
johnsone