Java/JDBC: using memory properly without thrashing

I've been given a very simple task: write a process that transfers data from one server/database to another server/database. Seems easy. The hard part is the amount of data I have to transfer (anywhere from one million to 50 million records). I'm trying to design the process so that it takes memory constraints into account while running, but I'm still getting a large amount of thrashing. So my question is: how can I take this simple task and make it run optimally?

Here's a snippet of what I have so far.
..............
//grab 1st connection
Properties props = new java.util.Properties();
props.put("user", userid);
props.put("password", password);
props.put("block size", "512");
conn = DriverManager.getConnection(url, props);

// JDBC does not allow TRANSACTION_NONE to be passed to setTransactionIsolation,
// so use the lowest legal isolation level instead
conn.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED);
Statement st = conn.createStatement(ResultSet.TYPE_FORWARD_ONLY,ResultSet.CONCUR_READ_ONLY);
ResultSet rs = st.executeQuery("...");
...

//grab 2nd connection (the destination server -- use separate variables so the
//first connection and its open ResultSet are not clobbered; url2 is the destination URL)
Properties props2 = new java.util.Properties();
props2.put("user", userid);
props2.put("password", password);
props2.put("block size", "512");
Connection conn2 = DriverManager.getConnection(url2, props2);

//populate the destination
conn2.setAutoCommit(false);
PreparedStatement ps = conn2.prepareStatement("...");
while (rs.next())
{
 ...
 ps.addBatch();
}
ps.executeBatch();   // note: one huge batch -- every row is held in memory until this point
conn2.commit();
conn2.setAutoCommit(true);   // restore auto-commit

//close the ResultSet, statements, and both connections; done!
Asked by nixj14
 
udaykumar22 commented:
Why not select specific chunks of the table and transfer them one chunk at a time? (You can lock out other activity on the table for that window if you want to.)
The chunks could be based on the data in the table, for example key ranges, and you can then schedule each chunk's transfer.
If possible, use a thin JDBC driver.
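A minimal sketch of that kind of chunking, reusing conn (source) and conn2 (destination) from the question; the "id" key column, the table/column names, and the CHUNK size are just placeholders:

// transfer the table one key-range chunk at a time instead of in a single pass,
// assuming the source table has a numeric primary key called "id"
final int CHUNK = 10000;
long lastId = 0;
PreparedStatement select = conn.prepareStatement(
    "SELECT id, col1, col2 FROM source_table WHERE id > ? ORDER BY id");
select.setMaxRows(CHUNK);                    // cap each query at one chunk of rows
PreparedStatement insert = conn2.prepareStatement(
    "INSERT INTO dest_table (id, col1, col2) VALUES (?, ?, ?)");

boolean more = true;
while (more)
{
    select.setLong(1, lastId);
    ResultSet chunk = select.executeQuery();
    int n = 0;
    while (chunk.next())
    {
        lastId = chunk.getLong(1);
        insert.setLong(1, chunk.getLong(1));
        insert.setString(2, chunk.getString(2));
        insert.setString(3, chunk.getString(3));
        insert.addBatch();
        n++;
    }
    chunk.close();
    if (n > 0)
    {
        insert.executeBatch();               // write one chunk, then release it
        conn2.commit();                      // assumes auto-commit is off on the destination
    }
    more = (n == CHUNK);                     // a short chunk means we reached the end
}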

Regards,
Uday.
 
heyhey_ commented:
Generally speaking, memory consumption depends on the JDBC driver implementation, and there is not much you can do to reduce it beyond specially crafted SQL statements and driver hints.
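One such hint is the statement fetch size. It asks the driver to buffer only a limited number of rows per round trip, but it is only a hint: some drivers ignore it or need extra connection settings to honor it. A minimal sketch (the value 500 is just an illustration):

// ask the driver to stream the result set instead of buffering everything at once
Statement st = conn.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                    ResultSet.CONCUR_READ_ONLY);
st.setFetchSize(500);   // hint: fetch roughly 500 rows per round trip
ResultSet rs = st.executeQuery("...");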
 
Venci75 commented:
Are you using BLOB fields?
 
nixj14 (author) commented:
The SQL statements are too simple to optimize, and there are no BLOB fields.
 
nixj14 (author) commented:
Oh, and I have no control over the JDBC driver.
 
nixj14 (author) commented:
Basically, I was looking for JDBC specifics: what the block/fetch sizes should be, whether I should call executeBatch after a certain number of inserts have been batched up, and what that number should be. I'm looking for performance tuning options.
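For example, the pattern I'm asking about would be calling executeBatch (and committing) every N rows instead of once at the end, using the ps and conn2 from my snippet above. The BATCH_SIZE of 1000 below is just a guess to start tuning from:

// flush the batch periodically instead of accumulating millions of rows in one batch
final int BATCH_SIZE = 1000;   // starting point -- tune by watching memory and throughput
int count = 0;
while (rs.next())
{
 // ... set this row's parameters on ps, as above ...
 ps.addBatch();
 if (++count % BATCH_SIZE == 0)
 {
  ps.executeBatch();   // send this batch to the server and release it
  conn2.commit();      // keep each transaction small
 }
}
ps.executeBatch();      // flush whatever is left over
conn2.commit();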