rstaveley (United Kingdom) asked:
Timing queries/updates non-intrusively in a web application

I have inherited a fairly complicated web application, which I want to make minimal changes to. It is PHP-based. It isn't as fast as it needs to be, so I'm throwing some new hardware at it and will edit the minimum that I can get away with to improve its performance.

Unfortunately, the application's PHP hasn't been structured in a way that makes it easy to time database queries/updates/appends, so I'd like to see if I can get a sense of where the inefficient operations are by looking at the PostgreSQL logs.

I notice that there is a log file analyser written in Ruby that looks to be good for my purposes: http://pqa.projects.postgresql.org/

Would enabling all of the logging required for this analyser be likely to bring the site to its knees? Or do you reckon I could run the site for a week or so with logging enabled to collect a fair picture of the bottlenecks?
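For reference, log-file analysis along these lines generally needs both statements and their durations written to the log. In postgresql.conf that means something like the sketch below; exact parameter availability varies between PostgreSQL versions, so treat this as an assumption to check against the pqa documentation rather than its stated requirements:

```
# postgresql.conf -- log every statement together with its duration
log_statement = 'all'          # write the text of each statement to the log
log_duration = true            # write how long each statement took
log_line_prefix = '%t [%p] '   # timestamp and PID help an analyser group lines
```

Logging every statement on a busy site does add I/O, which is presumably the overhead the question is asking about.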
ASKER CERTIFIED SOLUTION
earth man2 (United Kingdom)
rstaveley (Asker):
What I'm really trying to get is a list of individual queries / updates which take longer than the rest, so I can focus on them. In MySQL, they have the "slow query log" (see http://dev.mysql.com/doc/refman/5.0/en/slow-query-log.html), which would probably suffice for me, if such a thing existed for PostgreSQL. It would be nice also to see which style of query is executed most often. Armed with this knowledge, I can concentrate on optimising those queries alone.
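As it happens, PostgreSQL does have a close analogue of MySQL's slow query log: the `log_min_duration_statement` setting (available since PostgreSQL 7.4). Set to a threshold in milliseconds, it logs only the statements that run at least that long, which matches the "show me the worst offenders" goal here. A minimal postgresql.conf sketch, with an illustrative threshold of my own choosing:

```
# postgresql.conf -- PostgreSQL's analogue of MySQL's slow query log
# Log any statement that runs for 250 ms or longer.
# -1 disables this logging; 0 would log every statement with its duration.
log_min_duration_statement = 250
```

Because only slow statements are written out, the logging overhead is far lower than logging everything, so it is a reasonable setting to leave on for a week of production traffic.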

I'm not really worried about the overhead of statistics collection so much as messing around with unfamiliar code prematurely. I don't want to sprinkle the code with timing hooks to find out which bits are slow. If I can see some obvious SQL baddies before digging into the PHP code, it could allow me to locate the offending code more quickly.

An alternative approach would be to look at the Apache logs to see which pages are taking the longest to render, but taking timings from the PostgreSQL logs would be much more focused. It *may* be that no optimisation is required in the SQL and all of the optimisation required would be achieved by using an optimiser with the PHP code... but I'd like to be able to find out.
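For the Apache-side alternative, the access log can record per-request service time directly: mod_log_config's `%D` format code (Apache 2.0 and later) logs the time taken to serve each request in microseconds. A sketch of a custom log format (the format nickname `timed` and the log path are my own placeholders):

```
# httpd.conf -- append request service time in microseconds (%D)
LogFormat "%h %l %u %t \"%r\" %>s %b %D" timed
CustomLog logs/access_log timed
```

Sorting that log by the final field would show which pages are slowest to render, though as noted it cannot distinguish SQL time from PHP time.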
rstaveley (Asker):

Thanks for the advice, earthman2. I left this open to see whether my question would get any more responses, but it's time to close it now. That Ruby analyser is the best thing I've found, but it isn't ideal for identifying design weaknesses. There's really no substitute for timing the queries within the application itself, and that means improving its organisation so that all queries go through a common call wrapper.
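The common call wrapper mentioned above might look something like this minimal PHP sketch. The function name `timed_query`, the 100 ms default threshold, and logging via `error_log()` are illustrative choices, not anything from the application; the wrapper takes a callable so the timing logic stays independent of how the query is actually executed (e.g. `pg_query`):

```php
<?php
// Minimal sketch of a common call wrapper that times each query.
// $runner is any callable that executes the SQL and returns its result,
// e.g. function ($sql) use ($conn) { return pg_query($conn, $sql); }
function timed_query($runner, $sql, $threshold_ms = 100) {
    $start = microtime(true);               // high-resolution start time
    $result = $runner($sql);                // run the actual query
    $elapsed_ms = (microtime(true) - $start) * 1000.0;
    if ($elapsed_ms > $threshold_ms) {
        // Log only the slow statements, slow-query-log style.
        error_log(sprintf("SLOW (%.1f ms): %s", $elapsed_ms, $sql));
    }
    return $result;
}
```

Once every query goes through one function like this, per-query timing, counting, and grouping by query style all become one-line additions.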