

Efficient way to handle lots of database queries

Posted on 2004-03-23
Medium Priority
Last Modified: 2010-04-06
I've got a script that has several queries in a loop.  The end result is that a single page execution may run 2500+ simple queries.  Needless to say, it can be pretty slow sometimes.  I'm thinking about replacing all those queries with one larger query, and then using arrays to access the data.

So right now I have:

Do While id < something
      Set rs = Database.Execute("select * from table where id=" & id)
      ' yadda yadda yadda
Loop

And I am thinking of changing to:

Set rs = Database.Execute("select * from table")
arr = rs.GetRows()

Do While row <= UBound(arr, 2)
    ' yadda yadda yadda
Loop

Is this going to give me better performance, or should I just leave it the way it is?  And yes, I know I should probably move to COM objects or something compiled, but for now it's ASP/SQL, whether I like it or not.

Question by:Igiwwa

Accepted Solution

COBOLdinosaur earned 240 total points
ID: 10662637
The single query will be substantially more efficient.  With the multiple queries the index has to be read for each one, while the single query can use internal buffering to return blocks containing multiple rows.  You also eliminate the overhead of having to manipulate the ID value.
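In classic ASP that single-query pattern might look something like the following sketch. It assumes `Database` is the already-open connection object from the question, and the table/column names are placeholders:

```vbscript
' One query instead of 2500+: fetch every row once,
' then walk the recordset in a single pass.
Dim rs
Set rs = Database.Execute("select * from table order by id")

Do While Not rs.EOF
    ' yadda yadda yadda: work with rs("id") and the other fields here
    rs.MoveNext
Loop

rs.Close
Set rs = Nothing
```

The `MoveNext` loop keeps only the current row in play on the page, so it avoids both the per-query overhead and the cost of building one huge array up front.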


Assisted Solution

mikkelp earned 240 total points
ID: 10666728
I agree, but if your table contains a lot of data, you could end up with a memory problem instead.

I'm thinking: if you face 2500+ queries from a single page execution, there must be a database view that can solve your problem, or perhaps a query of the form

"Select * from table where id in (select ids from sometable where criteria = " & pagecriteria & ")"

puts a lid on disk/network IO and memory consumption and is more efficient.


Assisted Solution

pillbug22 earned 240 total points
ID: 10670878
Agreed.  If you can find a way to move some of the code to the SQL server (stored procedures, views, etc.), your performance will improve greatly.
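For example, calling a stored procedure from ASP via `ADODB.Command` might look like this sketch. The procedure name `GetRowsForPage` and its `@pageId` parameter are made up for illustration, and `Database` is assumed to be the open connection from the question:

```vbscript
' Call a stored procedure instead of building SQL strings in the page.
Dim cmd, rs
Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = Database
cmd.CommandText = "GetRowsForPage"   ' hypothetical stored procedure
cmd.CommandType = 4                  ' adCmdStoredProc
cmd.Parameters.Append cmd.CreateParameter("@pageId", 3, 1, , pageId)  ' adInteger, adParamInput

Set rs = cmd.Execute
Do While Not rs.EOF
    ' process one row here
    rs.MoveNext
Loop
rs.Close
```

Besides moving the work to the server, the parameterised call avoids concatenating user input straight into the SQL string.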

Assisted Solution

j_chakraverty earned 240 total points
ID: 10672525
Such decisions are always based on the following:
[1] the application in use
[2] anticipated/known volume of data
[3] anticipated/known volume of data transfer from server to client
[4] anticipated/known frequency of data transfer
[5] average rate of change of the data
[6] what do you want to do with the data you get?
[7] shall it be read-only or writeable?
[8] is more than one person/program accessing this data at the same time, or likely to do so?
[9] are the data server and client on the same system?

An example to explain:

blood bank data, which is input once and rarely changed, calls for one approach, while

share market data, which is changing all the time, calls for an entirely different solution.

So if you leave a description of your application and the specific work pertaining to this question, I would be more than glad to provide a solution, and I guess the other experts shall be in a better position to help you too.

--------Don't Use an Array--------
By the way, an array holds the entire result set in memory, so don't use one for more than a few hundred rows. Consider caching to disk instead: you can serialise the data and rebuild the cache. Again, it depends on what you want to do with the data you get.

Assisted Solution

skullnobrains earned 240 total points
ID: 10688558
there is NOT ONE SINGLE CASE where looking up a database 2500 times, even through a numeric index, can be faster than one single query. the memory issue should be controlled by choosing buffered or unbuffered queries as appropriate.

consider non-buffered queries only if your DBMS knows how to unlock itself when stuck (otherwise you might block the entire app if a user closes the page while it loads) and the processing of each row is short, OR if the DBMS provides row-by-row read locks.

with buffered queries, the memory can be allocated on the sql server and the rows forwarded one by one to the web server.
try odbc_do and odbc_fetch_row (i'm not sure they are available under those names in asp)

you may pretty well use arrays as long as you don't store the complete result set at once: you could slightly modify your own code to retrieve the results 50 at a time, for example.
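Retrieving the rows in batches could look something like this sketch, assuming `id` is a numeric key, `Database` is the open connection from the question, and the batch size of 50 follows the suggestion above:

```vbscript
' Fetch rows 50 at a time instead of all at once,
' keeping at most one batch's worth of rows in play.
Const BATCH_SIZE = 50
Dim maxId, startId, rs

maxId = Database.Execute("SELECT MAX(id) FROM MyTable").Fields(0).Value

For startId = 0 To maxId Step BATCH_SIZE
    Set rs = Database.Execute("SELECT * FROM MyTable WHERE id >= " & startId & _
                              " AND id < " & (startId + BATCH_SIZE))
    Do While Not rs.EOF
        ' process one row here
        rs.MoveNext
    Loop
    rs.Close
Next
```

This trades a handful of extra queries (one per batch) for a bounded memory footprint, a middle ground between 2500+ lookups and one giant result set.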
don't rely on timing a single page alone, since memory use will be very different under several concurrent queries.
you must create a page with n framesets pointing to your page, ask a friend to load it over the internet, and time it!
consider that n = users/20 will probably give you a good average loading time.
(if anybody knows 'standard' figures for different kinds of sites, maybe considering more parameters such as the average time a user spends on a page, i'd be greatly interested!)

good luck :) and remember that only you know how you want your own site developed

Author Comment

ID: 10725796
Sorry for not responding in a while.  Thank you for all your suggestions.  I'm going to read them all in more detail and run some experiments.  I'll keep you posted.


Author Comment

ID: 10981020
Thank you all for your comments.  For the time being, I used a single query to store all the data in arrays, using the id number from my database as the array element for quick access.

Some of your other suggestions have given me ideas for future improvements, but the array solution was quickest and easiest to implement.
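That approach might look roughly like this sketch. It assumes a two-column result (`id`, `name`), and that the ids are non-negative integers small enough to use directly as array indexes, as described above:

```vbscript
' Load everything with one query, then index it by id for quick access.
Dim rs, data, i, maxId
Set rs = Database.Execute("SELECT id, name FROM MyTable")
data = rs.GetRows()          ' 2-D array: data(column, row), column 0 = id
rs.Close

' Find the largest id so the lookup array can be sized to fit.
maxId = 0
For i = 0 To UBound(data, 2)
    If data(0, i) > maxId Then maxId = data(0, i)
Next

Dim byId()
ReDim byId(maxId)
For i = 0 To UBound(data, 2)
    byId(data(0, i)) = data(1, i)   ' byId(someId) now gives that row's name
Next
```

Note that `GetRows` materialises the whole result set, so this suits tables small enough to fit comfortably in memory, as the other comments point out.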

Thanks again.

Expert Comment

ID: 10981867
Glad we could help.  Thanks for the A. :^)


Expert Comment

ID: 10984736
not sure we have been more helpful than COBOLdinosaur was in the first place, but thanks all the same.
cool that it worked out for you.

Question has a verified solution.
