kapes13
asked on
Where is the bottleneck?
Hi, I hope I can explain my issue and get some ideas:
I basically load QueryResults into a dataset, loop through those results, and perform calculations on each row.
This all works fine when my QueryResults are, let's say, under 500 records. Once the QueryResults grow to a set of 1500 rows, for example, the load on my database server goes way up and the operations take a minute instead of 5 seconds or less.
Where do I need to split the operations? Is 1500 rows just too big a result set to process?
Thanks.
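One hypothetical cause of load growing much faster than the row count (the actual query and loop aren't shown, so this is only a sketch) is issuing an extra query per row inside the loop: 1500 rows means 1500 additional round trips to the database. The table names, columns, and Python/sqlite3 stack below are all assumptions for illustration; the point is the shape of the two approaches.

```python
# Hypothetical "per-row query in a loop" vs. set-based rewrite, using
# sqlite3 only so the sketch is self-contained and runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, discount REAL);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(i, 0.1) for i in range(100)])
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, 50.0) for i in range(1500)])

def net_totals_per_row(conn):
    """Slow shape: one extra lookup query per row of the result set."""
    rows = conn.execute(
        "SELECT customer_id, total FROM orders ORDER BY id").fetchall()
    out = []
    for customer_id, total in rows:
        # This lookup runs once per row -- 1500 hidden round trips.
        (discount,) = conn.execute(
            "SELECT discount FROM customers WHERE id = ?",
            (customer_id,)).fetchone()
        out.append(total * (1 - discount))
    return out

def net_totals_set_based(conn):
    """Faster shape: one join, the database returns finished values."""
    return [r[0] for r in conn.execute("""
        SELECT o.total * (1 - c.discount)
        FROM orders o JOIN customers c ON c.id = o.customer_id
        ORDER BY o.id
    """)]

# Both shapes produce identical results; only the work distribution differs.
assert net_totals_per_row(conn) == net_totals_set_based(conn)
```

If the loop's calculations can be expressed in SQL, pushing them into the query usually scales far better than per-row processing on the client.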
ASKER
Hopefully the CPU times are accurate and I haven't missed some memory hole that keeps the record set from being processed in the proper area. We are swapping out some processing routines and letting the smaller record sets do the bulk of the processing, and we'll see how that pans out. Thanks, all.
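The "smaller record sets" approach described above can be sketched as fetching the result set in fixed-size chunks instead of materializing all rows at once. The chunk size, table, and sqlite3 usage here are assumptions for illustration; the original stack isn't stated.

```python
# Hypothetical chunked-processing sketch: process 1500 rows while holding
# at most chunk_size rows in memory at a time, via DB-API fetchmany().
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO readings (value) VALUES (?)",
                 [(float(i),) for i in range(1500)])

def process_in_chunks(conn, chunk_size=500):
    """Accumulate a calculation over the rows, one small batch at a time."""
    cur = conn.execute("SELECT value FROM readings ORDER BY id")
    total = 0.0
    while True:
        chunk = cur.fetchmany(chunk_size)  # at most chunk_size rows resident
        if not chunk:
            break
        for (value,) in chunk:
            total += value
    return total

# Same answer as a full fetchall() pass, but with bounded memory use.
assert process_in_chunks(conn) == sum(range(1500))
```

This keeps memory bounded regardless of result-set size; whether it also reduces database load depends on where the per-row work actually happens.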