• Status: Solved
  • Priority: Medium
  • Security: Public

Java/JDBC utilizing memory properly w/o thrashing

I've been given a very simple task: write a process that transfers data from one server/database to another server/database.  Seems easy.  The hard part is the amount of data I have to transfer (anywhere from one million to 50 million records).  I'm trying to design the process so that it takes memory resources into account while running, but I'm still getting a large amount of thrashing.  So my question is: how can I take this simple task and make it run optimally?

Here's a snippet of what I've got so far:
//grab 1st (source) connection
Properties srcProps = new java.util.Properties();
srcProps.put("user", userid);
srcProps.put("password", password);
srcProps.put("block size", "512");
Connection srcConn = DriverManager.getConnection(srcUrl, srcProps);

srcConn.setTransactionIsolation(Connection.TRANSACTION_NONE);
Statement st = srcConn.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
ResultSet rs = st.executeQuery("...");
...

//grab 2nd (destination) connection
Properties destProps = new java.util.Properties();
destProps.put("user", userid);
destProps.put("password", password);
destProps.put("block size", "512");
Connection destConn = DriverManager.getConnection(destUrl, destProps);

//populate
destConn.setAutoCommit(false);
PreparedStatement ps = destConn.prepareStatement("...");
while (rs.next())
{
 ...
 ps.addBatch();
}
ps.executeBatch();
destConn.commit();
destConn.setAutoCommit(true); //restore auto-commit default

//close connections and done!
Asked by: nixj14
1 Solution
 
heyhey_Commented:
Generally speaking, memory consumption depends on the JDBC driver implementation, and you can't do much to reduce it (except with specially crafted SQL statements).
 
Venci75Commented:
Are you using BLOB fields?
 
nixj14Author Commented:
The SQL statements are too simple to optimize, and there are no BLOB fields.

 
nixj14Author Commented:
Oh, and I have no control over the JDBC driver.
 
udaykumar22Commented:
Why don't you select specific chunks of the table and pass them along? (You can lock database activity for that period if you want to.)
The chunks could be based on the data in the tables, and you can then schedule the transfer.
If possible, use a thin JDBC driver.

Regards,
Uday.
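Uday's chunking suggestion can be sketched as key-range pagination: select bounded ranges on an indexed key so neither the driver nor the JVM ever holds the whole table. This is only an illustrative sketch — the table and column names (`source_table`, `id`, `payload`) and the helper method are assumptions, not anything from the thread.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class ChunkedCopy {

    // Split the inclusive key range [minId, maxId] into half-open
    // ranges [lo, hi) covering at most chunkSize keys each.
    public static List<long[]> chunkRanges(long minId, long maxId, int chunkSize) {
        List<long[]> ranges = new ArrayList<>();
        for (long lo = minId; lo <= maxId; lo += chunkSize) {
            long hi = Math.min(lo + chunkSize, maxId + 1); // exclusive upper bound
            ranges.add(new long[] { lo, hi });
        }
        return ranges;
    }

    // Copy one bounded key range from the source into a prepared insert.
    // Each chunk is a small, self-contained SELECT, so the result set
    // stays small regardless of total table size.
    static void copyRange(Connection src, PreparedStatement insert,
                          long lo, long hi) throws SQLException {
        try (PreparedStatement sel = src.prepareStatement(
                "SELECT id, payload FROM source_table WHERE id >= ? AND id < ?")) {
            sel.setLong(1, lo);
            sel.setLong(2, hi);
            try (ResultSet rs = sel.executeQuery()) {
                while (rs.next()) {
                    insert.setLong(1, rs.getLong(1));
                    insert.setString(2, rs.getString(2));
                    insert.addBatch();
                }
                insert.executeBatch(); // one bounded batch per chunk
            }
        }
    }
}
```

Each chunk can also be scheduled independently, which fits the "schedule this transfer" idea above.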
 
nixj14Author Commented:
Basically, I was looking for JDBC specifics: what the block sizes should be, when (or whether) I should call executeBatch after a certain number of inserts have been batched up, and what that number should be.  I'm looking for performance-tuning options.
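Since the question is specifically when to call executeBatch: a common pattern is to flush and commit every N rows rather than batching the entire result set, and to set a fetch size on the source statement so the driver streams rows in bounded blocks instead of buffering everything. The sketch below is an illustration under assumptions — the table/column names (`source_table`, `dest_table`, `id`, `payload`) are hypothetical, and a batch size around 1,000 is only a common starting point; the right value is driver- and row-width-dependent and has to be measured.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class BatchTransfer {

    // Number of executeBatch() calls a transfer of totalRows makes when
    // flushing every batchSize rows (including the final partial flush).
    public static long countBatches(long totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }

    // Stream rows from the source and flush/commit every batchSize rows,
    // so neither the pending batch nor the transaction grows without bound.
    static void transfer(Connection src, Connection dest, int batchSize)
            throws SQLException {
        dest.setAutoCommit(false);
        try (Statement st = src.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
             PreparedStatement ps = dest.prepareStatement(
                     "INSERT INTO dest_table (id, payload) VALUES (?, ?)")) {
            st.setFetchSize(batchSize); // hint: fetch rows in bounded blocks
            try (ResultSet rs = st.executeQuery(
                     "SELECT id, payload FROM source_table")) {
                int pending = 0;
                while (rs.next()) {
                    ps.setLong(1, rs.getLong(1));
                    ps.setString(2, rs.getString(2));
                    ps.addBatch();
                    if (++pending == batchSize) {
                        ps.executeBatch(); // bounded memory for the batch
                        dest.commit();     // bounded transaction/log growth
                        pending = 0;
                    }
                }
                if (pending > 0) {         // flush the final partial batch
                    ps.executeBatch();
                    dest.commit();
                }
            }
        } finally {
            dest.setAutoCommit(true);
        }
    }
}
```

Compared to the original snippet, which calls executeBatch once after queueing up to 50 million inserts, the periodic flush is what actually bounds memory; fetch size is only a hint, and some drivers need extra settings to honor it.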
Question has a verified solution.