ronan_40060
asked on
JDBC performance
I'm doing a batch insert of 1 million records using the following code. To improve performance I'd like to use a StatelessSession from
session.getSessionFactory().openStatelessSession(). How can I use openStatelessSession effectively to improve batch-insert performance in the code below?
Please advise.
public void insertLogs(final List<HistoryLog> historyLogsList) throws SQLException
{
    val conObj = DataSourceUtils.getConnection(dataSource);
    PreparedStatement batchUpdate = null;
    try
    {
        conObj.setAutoCommit(false);
        batchUpdate = conObj.prepareStatement(HistoryLogDao.SQL_INSERT_HISTORYLOG);
        for (val historyLogObj : historyLogsList)
        {
            int i = 0;
            batchUpdate.setString(i++, historyLogObj.getLogMessage());
            batchUpdate.setDate(i++, new java.sql.Date(historyLogObj.getStartDateTime().toDate().getTime()));
            batchUpdate.setDate(i++, new java.sql.Date(historyLogObj.getEndDateTime().toDate().getTime()));
            batchUpdate.addBatch();
        }
        int[] countHistoryLogsRecords = batchUpdate.executeBatch();
        conObj.commit();
    }
    catch (BatchUpdateException batchEx)
    {
        int[] updateCount = batchEx.getUpdateCounts();
        for (int element : updateCount)
        {
            if (element == Statement.EXECUTE_FAILED)
            {
                log.error("HistoryLog Insert failed" + element);
            }
            else
            {
                log.info("HistoryLog Inserted OK" + element);
            }
        }
    }
    catch (SQLException sqlEx)
    {
        log.error("An error occurred with batch inserts", sqlEx);
        throw sqlEx;
    }
    finally
    {
        DataSourceUtils.releaseConnection(conObj, dataSource);
        JdbcUtils.close(batchUpdate);
    }
}
ASKER
It is an on-demand task. We already have the import functionality done; I'm just calling the above method to insert specific log statements into a table that can be used for various purposes. I know it's challenging to improve the performance using a stateless session.
I'm guessing that when you are talking about "sessions" that you are using Hibernate or some other ORM. However, the above code is pure JDBC and hence would not be affected at all by any use of the ORM, stateless or otherwise. And the code that you have looks fine from a performance point of view. How long is it taking to perform the insert of 1 million rows?
However, I don't believe that this code would work! In the loop body you are initialising i to 0, and on the line below you are using it as the parameter index when setting the String parameter. Yes, you are incrementing it, but it is the post-increment operator, so that first line would still be trying to set an index of 0. However, the PreparedStatement methods all use indexes starting from 1.
Are you sure this is the actual code that you are using?
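The indexing problem can be seen in isolation, without any JDBC involved. This small sketch (class and method names are illustrative, not from the question) shows why the first setString call would receive index 0, and why starting the counter at 1 produces the 1-based sequence that PreparedStatement expects:

```java
public class ParameterIndexDemo {
    // Reproduces the counter pattern from the loop above: post-increment yields
    // the current value first, then bumps the counter.
    public static int[] indexSequence(int start) {
        int i = start;
        return new int[] { i++, i++, i++ };
    }

    public static void main(String[] args) {
        // Starting at 0 (as in the question) produces 0, 1, 2 -- but JDBC
        // PreparedStatement parameter indexes are 1-based, so index 0 is invalid.
        System.out.println(java.util.Arrays.toString(indexSequence(0))); // [0, 1, 2]
        // Starting at 1 produces the valid sequence 1, 2, 3.
        System.out.println(java.util.Arrays.toString(indexSequence(1))); // [1, 2, 3]
    }
}
```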
ASKER
Thanks mccarl for your comments. Yes, I was refactoring the code; the actual code is below.
public void insertLogs(final List<HistoryLog> historyLogsList) throws SQLException
{
    val conObj = DataSourceUtils.getConnection(dataSource);
    PreparedStatement batchUpdate = null;
    try
    {
        conObj.setAutoCommit(false);
        batchUpdate = conObj.prepareStatement(HistoryLogDao.SQL_INSERT_HISTORYLOG);
        for (val historyLogObj : historyLogsList)
        {
            setImportHistoryParameterValues(batchUpdate, historyLogObj);
        }
        int[] countHistoryLogsRecords = batchUpdate.executeBatch();
        conObj.commit();
    }
    catch (BatchUpdateException batchEx)
    {
        int[] updateCount = batchEx.getUpdateCounts();
        for (int element : updateCount)
        {
            if (element == Statement.EXECUTE_FAILED)
            {
                log.error("HistoryLog Insert failed" + element);
            }
            else
            {
                log.info("HistoryLog Inserted OK" + element);
            }
        }
    }
    catch (SQLException sqlEx)
    {
        log.error("An error occurred with batch inserts", sqlEx);
        throw sqlEx;
    }
    finally
    {
        DataSourceUtils.releaseConnection(conObj, dataSource);
        JdbcUtils.close(batchUpdate);
    }
}
The above code makes a method call to setImportHistoryParameterValues, which is as follows:
private static void setImportHistoryParameterValues(final PreparedStatement ps, final HistoryLog historyLogObj) throws SQLException
{
    int i = 1;
    ps.setString(i++, historyLogObj.getLogMessage());
    ps.setDate(i++, new java.sql.Date(historyLogObj.getStartDateTime().toDate().getTime()));
    ps.setDate(i++, new java.sql.Date(historyLogObj.getEndDateTime().toDate().getTime()));
    ps.addBatch();
}
Now the issue is that this code didn't work for the 1 million-row import: it timed out while saving to the table. I suspect the logic in the setImportHistoryParameterValues method.
Please share your views on whether the above code can be corrected for better performance.
I'm using Hibernate and Spring and making JDBC calls; I've heard that Hibernate's StatelessSession can achieve better performance.
Please suggest.
I'm using Hibernate and Spring and making JDBC calls; I heard that Hibernate StatelessSession can achieve better performance
Firstly, you may be using Hibernate elsewhere, but the code above is pure JDBC, so anything to do with Hibernate, such as StatelessSession, doesn't apply.
There is nothing wrong with your code as it is; the setImportHistoryParameterValues method is not the problem.
What I would probably suggest is finding a middle ground between doing each insert separately and doing everything as one batch. So maybe you can try inserting batches of 200 records at a time? Also, what DB are you using for this?
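The "middle ground" can be sketched as a chunked batch: add rows to the batch as before, but call executeBatch (and commit) every N rows, so neither the driver nor the transaction has to hold a million rows at once. This is a minimal illustration, not the actual DAO from the question; the SQL string, row type, and connection handling are placeholders:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class ChunkedBatchInsert {

    private static final int BATCH_SIZE = 200; // middle-ground chunk size suggested above

    // Pure helper: flush after every BATCH_SIZE rows (the remainder is flushed after the loop).
    public static boolean shouldFlush(int rowsAdded, int batchSize) {
        return rowsAdded % batchSize == 0;
    }

    // Hypothetical row type (String[] columns) standing in for HistoryLog; the real
    // SQL constant and parameter-setting logic come from the question's code.
    public void insertLogs(Connection con, String insertSql, List<String[]> rows) throws SQLException {
        con.setAutoCommit(false);
        try (PreparedStatement ps = con.prepareStatement(insertSql)) {
            int count = 0;
            for (String[] row : rows) {
                for (int i = 0; i < row.length; i++) {
                    ps.setString(i + 1, row[i]); // JDBC parameter indexes are 1-based
                }
                ps.addBatch();
                if (shouldFlush(++count, BATCH_SIZE)) {
                    ps.executeBatch(); // send this chunk to the DB
                    con.commit();      // keep each transaction small
                }
            }
            ps.executeBatch(); // flush the final partial chunk
            con.commit();
        }
    }
}
```

The best chunk size depends on row width and network round-trip cost, so it is worth measuring a few values (100, 200, 1000) against the actual table rather than assuming 200 is optimal.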
ASKER
Thank you. I'm using both SQL Server and Oracle. A batch size of 200 seems OK. Please elaborate on finding a middle ground between doing each insert separately and doing it as a batch; could you please show sample code to help me understand?
Thanks again for your time.
ASKER CERTIFIED SOLUTION
ASKER
Excellent, thank you for your time.
Is this a one-off activity or a routine task? In either case, I strongly suggest seeking alternative ways to achieve this and using the above code only as a last resort.
Incidentally, which is the database? Oracle or SQL Server?