Eduardo Fuerte (Brazil) asked on

Could you clarify if internal queries from PHP code could slow the performance of the site?

Hi Experts

Could you clarify if internal queries from PHP code could slow the performance of the site?

The old legacy project that I recently started working on runs extensive queries from within the PHP code, and users often claim there is a loss of performance.

Whether or not the queries themselves could be optimized, does the data traffic in these cases between MS SQL Server (I guess with MySQL it is identical) and PHP cause this performance loss?

Thanks in advance!

Last comment: Eduardo Fuerte (8/22/2022 - Mon)


There are several things to consider: the amount of data fetched, the queries themselves, the number of users connected at the same time, the server's disk space and hardware specs, Internet access, and the services that are running (memory leaks)...

You can check all the logs to see if there are any errors that could slow down the server...

To find the causes, you have to monitor everything.

About the queries: whether you fetch all the data or only the specific data you need makes a real difference in load. Maybe some listings return thousands of rows that could be reduced - say, the user selects a specific filter and only that data is loaded.

Some data may also need to be archived, which reduces the load...
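To illustrate the point about fetching only the data you need, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for the real database (the table and column names are invented for the example; the same idea applies to a PHP query against MS SQL Server or MySQL):

```python
import sqlite3

# Hypothetical table with many rows, standing in for a real production table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO patients (name, city) VALUES (?, ?)",
    [(f"patient{i}", "Sao Paulo" if i % 2 == 0 else "Rio") for i in range(10_000)],
)

# Heavy: pulls every row and every column across the wire.
all_rows = conn.execute("SELECT * FROM patients").fetchall()

# Light: let the database filter and cap the result set instead.
filtered = conn.execute(
    "SELECT id, name FROM patients WHERE city = ? LIMIT 50", ("Rio",)
).fetchall()

print(len(all_rows), len(filtered))  # 10000 50
```

The second query moves the filtering work to the database server and transfers a tiny fraction of the data, which is exactly the "load only these data" idea above.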
Julian Hansen

Things to consider.

1. When you access a DB server you are moving processing time out to a separate entity - in this case the DB server. While a query is being processed it is outside the scope of the PHP code that initiated it. Factors that play a role are:
- performance of the connection to the server
- network load
- load on the DB server, etc.

Solution: run some performance tests from your PHP server to the MySQL server to determine latency and DB server performance
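One way to run such a test is to time a trivial query repeatedly and average the round trips. A minimal sketch, using Python's sqlite3 as a stand-in connection (against a real MS SQL Server or MySQL instance you would open the connection with a driver such as pyodbc or PDO instead, but the timing pattern is the same):

```python
import sqlite3
import time

# Stand-in connection; replace with your real DB driver to measure the
# actual PHP-server-to-DB-server latency.
conn = sqlite3.connect(":memory:")

def time_query(conn, sql, runs=100):
    """Run a trivial query repeatedly and return the average round-trip time."""
    start = time.perf_counter()
    for _ in range(runs):
        conn.execute(sql).fetchall()
    return (time.perf_counter() - start) / runs

avg = time_query(conn, "SELECT 1")
print(f"average round trip: {avg * 1000:.3f} ms")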

2. If your database is growing in size this could result in a progressive reduction in performance.

Solution: check that your data is properly indexed

3. Your queries may not be optimal - although if everything else remains equal and you are experiencing degradation over time, then it is not the query at fault; one of the other factors might be causing the issue.

Personally, I would check your data size and database optimisation (check your indexes).
David Favor

Post an actual URL which seems slow, for external testing.

Sometimes slowness only appears to be database related.

Better to know than guess.
Eduardo Fuerte

The site is on an intranet.
David Favor

Describe the site: static code? PHP? WordPress? LAMP stack?

Details will suggest tools you can use to determine problem spot.
David Favor

Note: PHP is an unlikely culprit for slow site performance... well, unless you have some sort of tight loop continually reading the entire contents of a large table repeatedly - i.e. a SELECT * with no LIMIT and no indexes being used.

Without knowing your entire codebase, tough to say.

You might try running mysqltuner, and also enable logging of slow queries and queries that use no indexes.
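For reference, enabling those logs on MySQL looks roughly like this (these are standard MySQL system variables; the thresholds shown are examples only, and on MS SQL Server you would use different tooling, such as the Query Store):

```sql
-- Log queries slower than 1 second, and queries that use no index.
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
SET GLOBAL log_queries_not_using_indexes = 'ON';
```

The resulting slow-query log is usually the fastest way to find the handful of queries responsible for most of the load.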

COULD they? Absolutely, 100% they could.

The bigger question is ARE they?

You've mentioned the project "extensively queries within the PHP code" - the keyword here is "extensively" and whether it's extensive ENOUGH to cause performance problems.

Imagine you're sitting in an office full of 100 people, and suddenly it's lunchtime and every single person heads to the kitchen to prepare their lunch. The kitchen can only hold 10 people at a time. So 90 people wait in line, and as one person leaves the kitchen, another person goes in until all 100 are handled.

To make matters worse, there's that one guy who is a 5-star chef and insists on preparing a 5-course meal for himself, so he takes up the kitchen for an entire hour. There are a couple people whose lunches take 10 minutes to microwave, and others who simply grab something from a vending machine and are in and out within a few seconds.

This is sort of how congestion works. Every database connection (usually there's one connection per page load) is like an office worker who goes into the kitchen. And then once they're IN the kitchen, they are in there until their lunch is prepared (the time it takes for the query to be processed by the server). A single page/connection with multiple queries in it is like a person who has reserved his spot in the kitchen and is preparing several lunches.

1. A database server usually has a configuration setting to indicate how many simultaneous connections it can have.

2. Some operating systems, like Windows non-server operating systems (e.g. Windows 10 Pro), enforce a simultaneous network connection limit at the system level, so it doesn't matter if you have sufficient resources - you might only be able to serve 20 concurrent connections at once.

3. When queries take a long time to execute, they have a big potential to hurt performance. This is especially true if every page load runs a similar query that takes a long time to execute. It would be like everyone in the office being a 5-star chef: when the kitchen is occupied for longer, it takes longer for the congestion to be resolved.

4. While a 5-star chef might take an hour to prepare one meal, if you have one person who is preparing 5000 simple sandwiches, they might end up taking MORE time in the kitchen than the 5-star chef simply due to the sheer volume. So when you talk about "extensive" queries, this might be what's being discussed.

Now, whenever I hear of a project making an "extensive" number of queries, usually the problem is that the code has been developed to deal with one-by-one situations. For example, let's say you have a page that lists out a company directory, and you have 200 employees. The efficient way to build that page would be to query the entire employees table and get all the information in one query that takes 1-2 seconds to run. Instead, maybe you have code that gets the 200 IDs for all employees, and then individually queries each one, leading to 200 queries instead of one query.

That might take 5 seconds to run 200 queries instead of 1-2 seconds for the same information in one query.

More often than not, this is the kind of thing that happens. It's code that tries to query information on demand instead of looking ahead to see if it can get more information more efficiently in one query.
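That one-query-per-row pattern (often called the "N+1 query" problem) can be demonstrated with a small sketch, shown here in Python with sqlite3 for the sake of a runnable example (the employees table is invented; the same pattern shows up in PHP code that loops over IDs and queries inside the loop):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO employees (id, name) VALUES (?, ?)",
    [(i, f"employee{i}") for i in range(1, 201)],
)

# N+1 pattern: one query for the IDs, then one more query per employee
# (201 round trips in total).
ids = [row[0] for row in conn.execute("SELECT id FROM employees ORDER BY id")]
one_by_one = [
    conn.execute("SELECT name FROM employees WHERE id = ?", (i,)).fetchone()[0]
    for i in ids
]

# Single-query pattern: the same data in one round trip.
all_at_once = [
    row[1] for row in conn.execute("SELECT id, name FROM employees ORDER BY id")
]

print(one_by_one == all_at_once)  # True: identical data, 1 round trip vs. 201
```

Each round trip pays the connection/network overhead again, which is why collapsing the loop into one query usually gives a large speedup even when each individual query is fast.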

The other situation is the lack of caching. For example, let's say that you have code that looks up an employee's name when given their ID. If you have a page that lists the same employee 5 times, you can either issue the same query 5 times or you can cache that information in memory and reuse it. While databases usually have their own query cache, if the code still issues the query, you're still taking up space in the kitchen.

Implementing some efficient caching can be the best medicine for reducing "extensive" database queries, especially if there's data that doesn't change very often. For example, let's say your employee directory doesn't change that often and it's only a few hundred employees. You could query the basic attributes for everyone (names, phone #s, but nothing sensitive), and then cache that data (encoded in some way, like JSON) into a file on the local hard drive of the server. The next time the page loads, just read that cached data from the local file instead of querying the server for it, and you will almost instantly have all that employee lookup info in memory and the lookups could happen almost instantly instead of dealing with extensive queries.

Then, when the employee directory changes in any way, have THAT script update the cache so you're always dealing with fresh information.
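A minimal sketch of that file-based cache idea, again in Python with sqlite3 and an invented employees table (a PHP version would follow the same shape with json_encode()/file_put_contents() and json_decode()/file_get_contents()):

```python
import json
import sqlite3
import tempfile
from pathlib import Path

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, phone TEXT)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [(i, f"employee{i}", f"555-{i:04d}") for i in range(1, 301)],
)

cache_file = Path(tempfile.gettempdir()) / "employee_cache.json"

def rebuild_cache():
    """Query the non-sensitive directory columns once and write them to disk."""
    rows = conn.execute("SELECT id, name, phone FROM employees").fetchall()
    cache_file.write_text(
        json.dumps({str(i): {"name": n, "phone": p} for i, n, p in rows})
    )

def lookup(employee_id):
    """Serve a lookup from the cached file instead of hitting the database."""
    directory = json.loads(cache_file.read_text())
    return directory[str(employee_id)]["name"]

rebuild_cache()    # call this again whenever the directory changes
print(lookup(42))  # employee42
```

In a real page you would load the cache file once per request (or keep it in shared memory) rather than re-reading it per lookup; the point is that repeated lookups no longer occupy a database connection at all.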

There are a lot of factors that go into performance, but those are a couple of my best pieces of advice for a situation like what you're describing.

And if you want a more thorough look into performance on MySQL, read my article on it:

You mentioned both MySQL and SQL Server, so if you're using SQL Server, then that article won't apply.
Eduardo Fuerte

To make it worse:

This is a service that dispatches ambulances...
Eduardo Fuerte

...as mentioned, we are using MS SQL Server. Everything is under control, and we are implementing better server specs. The entire old system embeds DB queries inside the PHP code.

Eduardo Fuerte


For now, thank you for such elaborate (and didactic) replies.
I'm studying them carefully.
Julian Hansen

It is important to approach this problem in a systematic way. There are many moving parts in the system you described, so it is not recommended that you simply go out and start tweaking things. For one, an internet application is non-deterministic: a user experiencing a slow response today may have a different experience for the exact same function tomorrow. So you need an accurate way of measuring performance, so that you know when you have found the problem or when some action results in an improvement (or degradation).

Hence my original suggestion to profile your connection - in addition, you should profile your HTTP server and the queries in your code.

If you are running on a shared server it is possible that your performance issues have nothing to do with your site at all but are in fact due to another site on your server that is hogging resources - it depends on your hosting environment but these are the sort of variables you need to remove from the equation.

For instance - are the reports of slow performance something new - or has this been a problem from the beginning - are they increasing or remaining static in terms of the number and nature of complaints. This is important because if the system was functioning fine in the past and suddenly there are performance issues - then something external to the code has changed - this could be environment, data, data size etc.

To approach this systematically you need to start by eliminating potential causes and you need to do that by doing individual tests on each of the components and sub-systems in an attempt to establish a benchmark. From there you need to then start making changes (one at a time) and then measure your tests against your benchmarks

You may want to check if the IT department is using any monitoring tool.

A popular one for Unix is https://mmonit.com/monit/
Eduardo Fuerte

Something we've mapped out is a series of scheduled database jobs that rotate throughout the day.

Practically the entire team of professionals who worked on the development of this site is gone. The site serves three companies: dispatch of ambulances according to demand, a homecare service (complete, including materials, medications and equipment), and a follow-up call center - all running on this same site.

This system grew without documentation for about 10 years.

From the old team, only a junior developer with knowledge of one of the companies remains, as well as the infrastructure professionals. The site needs to update an ERP, and the jobs do this - all types of updates: financial and fiscal, accounting, inventories, etc...

In addition to the many DB queries that run inside the PHP code, whose efficiency is very doubtful, we have to understand what each of the scheduled jobs really does.

Today we ended up identifying a very time-consuming job that did not generate an error log.

It ends up being somewhat inhospitable work until we map the processes better, in terms of both the databases (50+) and the PHP code (no framework in use, but still reasonably organized into classes and modules). And we will need to evolve the inventory-control processes for the dispensing of drugs / materials / ... which are not correct.

We already thought about eliminating some jobs and waiting for someone to "complain", but we found that very risky.
The information I am collecting through the answers you are providing is helping us to organize everything.

The IT infrastructure team is using a tool to monitor the jobs; I'm going to check which one it is.
Eduardo Fuerte

Thanks for the overall information!