• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 315

Efficient way to handle lots of database queries

I've got a script that has several queries in a loop.  The end result is that a single page execution may run 2500+ simple queries.  Needless to say, it can be pretty slow sometimes.  I'm thinking about replacing all those queries with one larger query, and then using arrays to access the data.

So right now I have:

While id < something
    Set rs = Database.Execute("SELECT * FROM table WHERE id=" & id)
    ' yadda yadda yadda
    id = id + 1
Wend

And I am thinking of changing to:

Set rs = Database.Execute("SELECT * FROM table")
dataArray = rs.GetRows()

For i = 0 To UBound(dataArray, 2)
    ' yadda yadda yadda
Next

Is this going to give me better performance, or should I just leave it the way it is?  And yes, I know I should probably move to COM objects or something compiled, but for now it's ASP/SQL, whether I like it or not.

5 Solutions
The single query will be substantially more efficient.  With the multiple queries, the index has to be read for each one.  Plus, with a single query the database can use internal buffering to return blocks containing multiple rows.  You also eliminate the overhead of having to manipulate the ID value.

I agree, but if your table contains a lot of data, you could end up with a memory problem instead.

I'm thinking, if you face 2500+ queries from a single page execution, there must be a database view that can solve your problem, or perhaps a query of the form

"Select * from table where id in (select ids from sometable where criteria = " & pagecriteria & ")"

puts a lid on disk/network IO and memory consumption and is more efficient.

Agreed.  If you can find a way to move some of the code to the SQL server (stored procedures, views, etc.), your performance will improve greatly.
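As a sketch of that idea, the per-page lookup could be pushed into a stored procedure so the page makes one round trip.  The table, column, and procedure names here are hypothetical, made up to illustrate the pattern:

```sql
-- Hypothetical example: replace 2500 per-id lookups with one server-side call.
CREATE PROCEDURE GetPageRows
    @PageCriteria varchar(50)
AS
    SELECT t.*
    FROM   MyTable t
    WHERE  t.id IN (SELECT ids FROM SomeTable WHERE criteria = @PageCriteria)
GO
```

From ASP the page would then issue a single `Database.Execute("EXEC GetPageRows '...'")` instead of looping over ids, and the filtering work stays on the SQL server.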

Such decisions are always based on the following:
[1] the application in use
[2] anticipated/known volume of data
[3] anticipated/known volume of data transfer from server to client
[4] anticipated/known frequency of data transfer
[5] average change rate of the data
[6] what do you want to do with the data you get?
[7] shall it be read-only or writable?
[8] is more than one person/program accessing this data at the same time, or likely to do so?
[9] are the data server and client on the same system?

An example to explain:

blood bank data,
input once and rarely changed, calls for one approach, while

share market data,
which is changing all the time, calls for an entirely different solution.

So if you leave a description of your application and the specific work pertaining to this question, I would be more than glad to provide a solution, and I guess other experts will be in a better position to help you too.

--------Don't Use an Array--------
By the way, large arrays can consume a great deal of memory, so don't use an array for more than a few hundred elements.  Consider caching to disk instead: you can serialize the data and rebuild the cache, depending on what you want to do with the data you get.
there is NOT ONE SINGLE CASE where looking up a database 2500 times, even through a numeric index, can ever be faster than one single query. the memory issue should be controlled by choosing buffered or unbuffered queries.

consider unbuffered queries only if your DBMS knows how to release its locks when stuck (otherwise you might simply block the entire app if a user closes the page while it is loading) and the processing of each row is short, OR the DBMS provides row-by-row read locks.

with buffered queries, the memory can be allocated on the SQL server and the rows forwarded one by one to the web server.
try to use odbc_do and odbc_fetch_row (i'm not sure they are available under these names in asp)

you may well use arrays as long as you don't try to store the complete result set at once: you could slightly modify your own code to retrieve the results 50 by 50, for example.
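A batched version of that idea might look like the following.  This is only a sketch, assuming ADO and SQL Server's TOP syntax; `Database` and the table/column names follow the question's own placeholders:

```
' Sketch: fetch rows in batches of 50, keyed on the last id seen.
' Assumes an open ADO connection named Database and a numeric id column.
lastId = 0
Do
    Set rs = Database.Execute( _
        "SELECT TOP 50 * FROM table WHERE id > " & lastId & " ORDER BY id")
    If rs.EOF Then Exit Do
    While Not rs.EOF
        ' yadda yadda yadda (process the row here)
        lastId = rs("id")
        rs.MoveNext
    Wend
    rs.Close
Loop
```

Each batch holds at most 50 rows in memory at a time, which keeps the web server's footprint flat no matter how big the table gets.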
don't rely on testing the speed of one page alone, since the memory use will be very different with several concurrent queries.
create a page with n framesets pointing to your page, ask a friend to load it over the internet, and time it!
consider that n = users/20 will probably give you a good average loading time
(if anybody knows 'standard' figures for different kinds of sites, perhaps considering more parameters such as the average time a user spends on a page, i'd be greatly interested!)

good luck :) and remember that only you know how you want your own site developed
IgiwwaAuthor Commented:
Sorry for not responding in a while.  Thank you for all your suggestions.  I'm going to read them all in more detail and run some experiments.  I'll keep you posted.

IgiwwaAuthor Commented:
Thank you all for your comments.  For the time being, I used a single query to store all the data in arrays, using the id number from my database as the array index for quick access.

Some of your other suggestions have given me ideas for future improvements, but the array solution was quickest and easiest to implement.
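For reference, the approach described could be sketched like this in classic ASP.  It's only a sketch, assuming ADO; the connection, table, and column names are hypothetical, and `maxId` is assumed to be known or computed beforehand:

```
' One query instead of 2500: pull everything, then index by id.
Set rs = Database.Execute("SELECT id, name FROM table")
rows = rs.GetRows()      ' 2-D variant array: rows(field, record)
rs.Close

' Build an array indexed by id for direct lookup.
' Assumes ids are small, non-negative integers.
ReDim data(maxId)
For i = 0 To UBound(rows, 2)
    data(rows(0, i)) = rows(1, i)   ' data(id) = name
Next

' Later: direct access by id, no further queries needed.
' name = data(someId)
```

One caveat with this design: if the ids are sparse (say, a handful of rows with ids in the millions), indexing by id wastes memory, and a lookup keyed on something denser would be a better fit.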

Thanks again.
Glad we could help.  Thanks for the A. :^)

not sure if we have been more helpful than COBOLdinosaur was in the first place, but thanks all the same.
cool that it worked out.