Solved

Efficient way to handle lots of database queries

Posted on 2004-03-23
260 Views
Last Modified: 2010-04-06
I've got a script that has several queries in a loop.  The end result is that a single page execution may run 2500+ simple queries.  Needless to say, it can be pretty slow sometimes.  I'm thinking about replacing all those queries with one larger query, and then using arrays to access the data.

So right now I have:

Do While id < something
      Set rs = Database.Execute("select * from table where id=" & id)
      'yadda yadda yadda
      id = id + 1
Loop

And I am thinking of changing to:

arr = Database.Execute("select * from table").GetRows()   ' 2-D array: (field, row)

For row = 0 To UBound(arr, 2)
    'yadda yadda yadda, reading arr(0, row), arr(1, row), ...
Next

Is this going to give me better performance, or should I just leave it the way it is?  And yes, I know I should probably move to COM objects or something compiled, but for now it's ASP/SQL, whether I like it or not.

Thanks.
Question by:Igiwwa
9 Comments
 
LVL 53

Accepted Solution

by:
COBOLdinosaur earned 60 total points
ID: 10662637
The single query will be substantially more efficient.  With the multiple queries the index has to be read for each one.  With the single query the database can also use internal buffering to return blocks containing multiple rows.  You also eliminate the overhead of having to manipulate the ID value.
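For illustration, a minimal classic-ASP (ADO) sketch of the single-query approach; the connection string, table, and column names are placeholders rather than the asker's actual schema:

' Hypothetical sketch: one query instead of 2500+, iterated once.
' Connection string, table and column names are placeholders.
Dim conn, rs

Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI;"

Set rs = conn.Execute("select id, name from table order by id")

Do While Not rs.EOF
    'yadda yadda yadda, working with rs("id"), rs("name")
    rs.MoveNext
Loop

rs.Close
conn.Close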

Cd&
 
LVL 4

Assisted Solution

by:mikkelp
mikkelp earned 60 total points
ID: 10666728
I agree, but if your table contains a lot of data, you could end up with a memory problem instead.

I'm thinking that if you face 2500+ queries from a single page execution, there is probably a database view that can solve your problem, or perhaps a query of the form

"Select * from table where id in (select ids from sometable where criteria = " & pagecriteria & ")"

which puts a lid on disk/network I/O and memory consumption and is more efficient.
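For example, a hypothetical sketch of that form in ASP, reusing the Database connection object from the question (sometable, ids, criteria, and pagecriteria are placeholders):

' Hypothetical sketch of the IN-subquery idea: let the server pick the
' relevant ids instead of looping over them in ASP.
Dim sql, rs

sql = "select * from table " & _
      "where id in (select ids from sometable where criteria = '" & pagecriteria & "')"

Set rs = Database.Execute(sql)
Do While Not rs.EOF
    'process only the rows this page actually needs
    rs.MoveNext
Loop
rs.Close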

mikkelp
 
LVL 6

Assisted Solution

by:pillbug22
pillbug22 earned 60 total points
ID: 10670878
Agreed.  If you can find a way to move some of the code to the SQL server (stored procedures, views, etc.), your performance will improve greatly.
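As a hypothetical sketch of the stored-procedure route through ADO (the procedure name, parameter, and the open connection conn are all assumptions):

' Hypothetical sketch: push the filtering into a stored procedure and
' call it through an ADODB.Command. Names below are placeholders.
Const adCmdStoredProc = 4
Const adVarChar = 200
Const adParamInput = 1

Dim cmd, rs

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn          ' an already-open ADODB.Connection
cmd.CommandType = adCmdStoredProc
cmd.CommandText = "usp_GetPageRows"
cmd.Parameters.Append cmd.CreateParameter("@criteria", adVarChar, adParamInput, 50, pagecriteria)

Set rs = cmd.Execute()

Do While Not rs.EOF
    'process the pre-filtered rows
    rs.MoveNext
Loop
rs.Close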
 
LVL 1

Assisted Solution

by:j_chakraverty
j_chakraverty earned 60 total points
ID: 10672525
Such decisions are always based on the following:
[1] the application in use
[2] anticipated/known volume of data
[3] anticipated/known volume of data transfer from server to client
[4] anticipated/known frequency of data transfer
[5] average rate of change of the data
[6] what do you want to do with the data you get?
[7] shall it be read-only or writeable?
[8] is more than one person/program accessing this data at the same time, or likely to do so?
[9] are the data server and client on the same system?

An example to explain
--------------------------
Blood bank data,
which is input once and rarely changed, shall take one approach, whereas for

share market data,
which is changing all the time, the solution shall be entirely different.

-------------------------
So if you leave a description of your application and the specific work pertaining to this question, I would be more than glad to provide a solution, and I guess the other experts shall also be in a better position to help you.

-------- Don't Use Arrays --------
By the way, arrays can eat a lot of memory, so don't use an array for more than a few hundred elements; consider caching the data to disk instead. You can serialize it and rebuild the cache; again, it depends on what you want to do with the data you get.
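For instance, a hypothetical disk-cache sketch using FileSystemObject (the file path, delimiter, and column names are all assumptions):

' Hypothetical sketch of the disk-caching idea: run the big query once,
' dump the rows to a text file, and let later requests read the file
' instead of the database. Path, delimiter and columns are placeholders.
Const CACHE_FILE = "C:\inetpub\cache\table-cache.txt"
Const ForReading = 1

Dim fso, ts, rs, fields

Set fso = Server.CreateObject("Scripting.FileSystemObject")

If Not fso.FileExists(CACHE_FILE) Then
    Set rs = Database.Execute("select id, name from table")
    Set ts = fso.CreateTextFile(CACHE_FILE, True)
    Do While Not rs.EOF
        ts.WriteLine rs("id") & vbTab & rs("name")   ' one row per line
        rs.MoveNext
    Loop
    ts.Close
    rs.Close
End If

'later requests read the cached copy instead of hitting the database
Set ts = fso.OpenTextFile(CACHE_FILE, ForReading)
Do While Not ts.AtEndOfStream
    fields = Split(ts.ReadLine, vbTab)               ' fields(0)=id, fields(1)=name
    'yadda yadda yadda
Loop
ts.Close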

 
LVL 26

Assisted Solution

by:skullnobrains
skullnobrains earned 60 total points
ID: 10688558
there is NOT ONE SINGLE CASE where looking up a database 2500 times, even through a numeric index, can ever be faster than one single query. the memory issue should be handled by deciding whether or not to buffer the query.

consider non-buffered queries only if your DBMS knows how to unlock itself when stuck (otherwise you might simply block the entire app if a user closes the page while it is loading) and the processing of each row is short, OR the DBMS provides row-by-row read locks.

with buffered queries, the memory can be allocated on the sql server and the rows forwarded one by one to the web server.
try to use odbc_do and odbc_fetch_row (i'm not sure they are available under these names in asp)

you may very well use arrays as long as you don't try to store the complete set of results at once: you could slightly modify your own code to retrieve the results 50 at a time, for example (see the sketch below).
don't rely on testing the speed of 1 page alone, since the memory use will be very different once several queries run at once.
you should create a page with n framesets pointing to your page, ask a friend to load it over the internet, and time it!
consider that n = users/20 will probably give you a good average loading time.
(if anybody knows about 'standard' figures for different kinds of sites, maybe considering more parameters such as the average time a user spends on a page, i'd be greatly interested!)

good luck :) and remember that only you know how you want your own site developed
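A hypothetical sketch of the batched retrieval using ADO's GetRows (table and column names are placeholders):

' Hypothetical sketch of the "50 by 50" idea: pull the result set in small
' batches with GetRows so the whole thing is never in memory at once.
Const BATCH_SIZE = 50

Dim rs, batch, row

Set rs = Database.Execute("select id, name from table order by id")

Do While Not rs.EOF
    batch = rs.GetRows(BATCH_SIZE)       ' 2-D array: (field, row), at most 50 rows
    For row = 0 To UBound(batch, 2)
        'yadda yadda yadda: batch(0, row) is id, batch(1, row) is name
    Next
Loop

rs.Close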
 

Author Comment

by:Igiwwa
ID: 10725796
Sorry for not responding for a while.  Thank you for all your suggestions.  I'm going to read them all in more detail and run some experiments.  I'll keep you posted.

Thanks!
 

Author Comment

by:Igiwwa
ID: 10981020
Thank you all for your comments.  For the time being, I used a single query to store all the data in arrays, using the id number from my database as the array index for quick access.

Some of your other suggestions have given me ideas for future improvements, but the array solution was the quickest and easiest to implement.
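In case it helps anyone later, a hypothetical sketch of that approach (the MAX(id) sizing query and the column names are assumptions):

' Hypothetical sketch of the approach described above: one query, with the
' results cached in an array indexed by the row's id for instant lookups.
Dim rs, maxId, rowId, names

Set rs = Database.Execute("select max(id) as maxId from table")
maxId = rs("maxId")
rs.Close

ReDim names(maxId)                       ' sized to the largest id

Set rs = Database.Execute("select id, name from table")
Do While Not rs.EOF
    rowId = rs("id")
    names(rowId) = rs("name")            ' later: names(someId) is a direct lookup
    rs.MoveNext
Loop
rs.Close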

Thanks again.
 
LVL 53

Expert Comment

by:COBOLdinosaur
ID: 10981867
Glad we could help.  Thanks for the A. :^)

Cd&
 
LVL 26

Expert Comment

by:skullnobrains
ID: 10984736
not sure we have been more helpful than COBOLdinosaur was in the first place, but thanks all the same.
cool that you got it working.
