I have inherited a fairly complicated web application, which I want to make minimal changes to. It is PHP-based. It isn't as fast as it needs to be, so I'm throwing some new hardware at it and will edit the minimum that I can get away with to improve its performance.
Unfortunately, the application's PHP hasn't been structured in a way that makes it easy to time database queries/updates/appends, so I'd like to see if I can get a sense of where the inefficient operations are by looking at the PostgreSQL logs.
I notice that there is a log file analyser written in Ruby that looks to be good for my purposes: http://pqa.projects.postgresql.org/
Would enabling all of the logging required for this analyser be likely to bring the site to its knees? Or do you reckon I could run the site for a week or so with logging enabled to collect a fair picture of the bottlenecks?
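For reference, this is roughly the kind of `postgresql.conf` logging setup I understand such analysers need. It's a sketch based on the standard PostgreSQL settings, not the analyser's documented requirements, so treat the specifics (especially the `log_line_prefix` format) as assumptions to verify against the PQA docs and my server version:

```
# postgresql.conf -- sketch of statement logging for analysis
log_destination = 'stderr'
logging_collector = on          # called redirect_stderr before PostgreSQL 8.3
log_line_prefix = '%t [%p] '    # timestamp + PID so entries can be grouped
log_statement = 'all'           # log every statement (the heavy option)
log_duration = on               # log each statement's execution time

# Possible lighter alternative: log only statements slower than 100 ms,
# with their duration, instead of logging everything:
# log_min_duration_statement = 100
```

If full statement logging is too costly, would `log_min_duration_statement` alone give a usable picture, or do these analysers need the complete log?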