I'm using Perl to extract a large amount of data from an Oracle database and print it to a file. I'm using the DBI module to connect to Oracle.
The SQL in my prepare statement is a join across 4 tables (all access is via indexes, with index hints where appropriate) and returns approx. 45K rows (about 1/3 of the total data). I have run this SQL in SQL*Plus, where it returns the result in a reasonable time - approx. 2 minutes.
If I implement this in Perl, I get the following results:
The execute() statement takes about 10 minutes to finish, which is already slower than SQL*Plus.
But my biggest problem is that when I loop through the results, it starts to hang. I started printing a timestamp every 1K rows and realised it never got that far. I then printed every row: it fetched about 25-30 rows more or less instantly, and then it stalls.
I have tried both fetchrow_hashref() and fetchrow_array() (is there a performance difference between the two?), and I have tried both manipulating the data on the fly and storing it locally (in a hash and in an array) before proceeding, but with the same results.
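For reference, here is a stripped-down sketch of the pattern I'm using (the connection details, SQL, and output filename are placeholders, not my real ones). I've swapped in fetchrow_arrayref(), which the DBI docs describe as the fastest of the fetchrow_* variants, and set the RowCacheSize hint that DBD::Oracle honours for prefetching:

```perl
use strict;
use warnings;
use DBI;

# Placeholder connection details - substitute your own DSN/credentials.
my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
                        { RaiseError => 1, AutoCommit => 0 } );

# Hint to the driver how many rows to prefetch per round trip;
# DBD::Oracle supports this DBI attribute.
$dbh->{RowCacheSize} = 500;

my $sth = $dbh->prepare('SELECT ... FROM ...');   # the 4-table join goes here
$sth->execute();

open my $out, '>', 'extract.txt' or die "open: $!";

# fetchrow_arrayref reuses the same array reference on each call,
# so it avoids the per-row copying of fetchrow_array/fetchrow_hashref.
while ( my $row = $sth->fetchrow_arrayref ) {
    print {$out} join( '|', @$row ), "\n";
}

close $out;
$dbh->disconnect;
```

(This sketch obviously won't run without a live Oracle instance and the real SQL filled in; it's just the shape of the loop.)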
I have used the same approach before with large amounts of data (though, mind you, only with single-table access, or at most a join between 2 tables).
Does anyone know why this is happening and, preferably, have a solution?