Solved

Measuring Execution Time of a Single Command

Posted on 2003-03-31
Medium Priority
446 Views
Last Modified: 2013-11-15
I am trying to run an elementary benchmark of a MySQL database query using the Time::HiRes and DBD::mysql modules.  The idea is to query four tables of different sizes and measure, from Perl, the time each query takes.  However, I suspect my results are invalid: the first table always takes the longest to query, no matter how big it is, and subsequent queries generally take less and less time.  I suspect that Perl is trying to optimize execution and runs commands out of order, which would invalidate the method I'm using to measure elapsed time.  My question is this: how can I ensure that I am really measuring the time of the MySQL SELECT statement?

Below is the code snippet that does the 4 queries.  $ARGV[$i] is the size of the table being queried.  I am using Time::HiRes to take the measurements.
-----------------------------------------------------------------------------------------------------  
$i = 1;
while ($i <= 4) {
        # Find the middle record in the table
        $query = $ARGV[$i] / 2;
        $sth = $dbh->prepare("SELECT * FROM `benchmark".$i."` WHERE `int`=".$query.";");
        # Set timestamp
        my $start_time = [gettimeofday];
        $sth->execute;
        # Measure & record query execution time
        $query_time = tv_interval($start_time);
        $sth->finish;
        print "Query time was: ".$query_time." seconds\n";
        $i++;
}
-----------------------------------------------------------------------------------------------------  
My apologies if I've left anything unclear.  I'm not even 100% sure that Perl is the factor responsible for the confusing results.  

Thanks,
Chris
Question by:kavefish
5 Comments
 
LVL 26

Expert Comment

by:wilcoxon
ID: 8240993
The simplest way is to disconnect from MySQL and undef $dbh inside the while loop.  I'm pretty sure the optimization is in the DBI/DBD layer; if you force that layer to be recreated each time, you should get accurate results.

Another option is to use explain plans (if MySQL supports them), which will time the SQL at the database level instead of the Perl level.
 

Author Comment

by:kavefish
ID: 8243194
I changed the query section given above to the code below, but the query times were the same.
-----------------------------------------------------------
$i = 1;
while ($i <= 4) {
     # Connect to DB
     my $dbh = DBI->connect("DBI:mysql:$mysql_database", $mysql_username, $mysql_password)
          or die "Error connecting to database";

     $query = $ARGV[$i] / 2;
     $sth = $dbh->prepare("SELECT * FROM `benchmark".$i."` WHERE `int`=".$query.";");

     # Set timestamp
     my $start_time = [gettimeofday];

     $sth->execute;

     # Measure & record query execution time
     $query_time = tv_interval($start_time);
     $sth->finish;
     print "\tQuery time of table ".$i." was: ".$query_time." seconds\n";
     $i++;

     # Disconnect from DB
     $rc = $dbh->disconnect;
     # Undef all DB-related variables
     undef($dbh); undef($sth); undef($rc);
}
-----------------------------------------------------------
I thought for sure that would be the answer.  :-(  MySQL does support the EXPLAIN command, but at first glance I'm not sure it provides the query-time information I need. However, I did notice that running the SELECT from the mysql command-line interface produces execution times as I would expect them, so there's definitely something amiss at the Perl level.

Thanks,
Chris
 
LVL 26

Accepted Solution

by:
wilcoxon earned 225 total points
ID: 8246378
Oops.  You're right: explain plans usually don't give execution times.  Most databases have a separate setting for getting the timings, sometimes only in their command-line SQL tool (isql, sql*plus, etc.).

Hmm.  It definitely appears there is something screwy at the Perl level.

The one sure-fire way I can think of is to split your script into 4 scripts (identical except for the query used) and run each one in turn from a shell script.  The Perl interpreter then starts fresh for each one, nothing persists between runs, and the timings should be "clean".
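A minimal sketch of that idea, as a driver that launches a fresh interpreter per table (the bench_table_N.pl names are hypothetical placeholders for single-query versions of the script above; the system() call is left commented out until those scripts exist):
-----------------------------------------------------------
```perl
#!/usr/bin/perl
use strict;
use warnings;

# Launch each single-query benchmark in a fresh Perl process so nothing
# (DBI handles, caches, compiled code) persists between timings.
my $done = 0;
for my $i (1 .. 4) {
    my @cmd = ('perl', "bench_table_$i.pl");   # hypothetical script name
    print "Table $i: @cmd\n";
    # system(@cmd);   # uncomment once the per-table scripts exist
    $done++;
}
print "Launched $done benchmarks\n";
```
-----------------------------------------------------------
Each child process gets its own connection and its own interpreter state, so the first-query penalty is paid identically by every table.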
 

Author Comment

by:kavefish
ID: 8273197
Again I was hoping you were right, but when I broke the queries into separate scripts I still got data that didn't make sense.

I made contact with some other folks who had a number of suggestions.  First, maybe MySQL is doing some caching behind the scenes.  Second, I didn't mention this earlier, but there are 5 tables (there used to be 4) of sizes ranging from 10 to 100000 records.  It's possible that MySQL might be using different algorithms depending on the size of the table being queried, and therefore the timings may not be directly comparable across tables.

They proposed that I make 15 measurements and throw out the first 5.  So in "More Power" fashion I wrote a loop to do 1010 queries and drop the first 10.  
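That measure-many-and-discard-warm-up approach can be sketched as below; the timed_runs helper and the dummy workload are illustrative stand-ins (not from my script), with the workload taking the place of $sth->execute:
-----------------------------------------------------------
```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Time a code block $total times, discard the first $discard runs
# (cache/connection warm-up), and return the mean of the rest.
sub timed_runs {
    my ($code, $total, $discard) = @_;
    my @times;
    for (1 .. $total) {
        my $start = [gettimeofday];
        $code->();
        push @times, tv_interval($start);
    }
    splice @times, 0, $discard;   # drop warm-up runs
    my $sum = 0;
    $sum += $_ for @times;
    return $sum / @times;         # @times in scalar context = run count
}

# Dummy workload standing in for $sth->execute: 1010 runs, drop 10.
my $avg = timed_runs(sub { my $x = 0; $x += $_ for 1 .. 1000 }, 1010, 10);
print "Average over 1000 timed runs: $avg seconds\n";
```
-----------------------------------------------------------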

The results weren't great, but they were the most realistic so far.  I'll go ahead and accept your above answer since I think it's as close as I'm likely to get.  Don't spend any time on it, but if you have any bright ideas feel free to let me know.

Thanks for your help
-Chris
 

Author Comment

by:kavefish
ID: 8273280
See last comment
