Ali Kayahan
asked on
Heavily loaded Php Application - Expert Optimization Techniques
Hi experts, I have a little trouble with optimization:
I have developed a gold stock-exchange web-based application that shows live currency prices, built with PHP, MySQL and JS, and it worked fine for a while. It is now hosted on a Windows 2003 server with 512 MB RAM and a 2.2 GHz CPU. A page stays open on the server, connects to a web service, takes the currency data from it and creates data.xml once every 12 seconds; the whole application works from that file. The same page also connects to MySQL, reads the profit margins and adds them to my customer's currency values before creating data.xml. Let's call this part update.php.
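Roughly, one pass of update.php does something like the following (the web-service URL, table and column names here are only illustrative, not my real code):
// update.php -- one pass, repeated every 12 seconds while the page stays open on the server
$raw   = file_get_contents('http://example-webservice/rates');   // illustrative web-service URL
$rates = json_decode($raw, true);                                 // assuming the service returns JSON
$db     = new mysqli('localhost', 'user', 'pass', 'gold');        // placeholder credentials
$row    = $db->query("SELECT margin FROM settings WHERE id = 1")->fetch_assoc();  // illustrative table
$margin = $row['margin'];
$xml = new SimpleXMLElement('<currencies/>');
foreach ($rates as $name => $value) {
    $c = $xml->addChild('currency');
    $c->addChild('name', $name);
    $c->addChild('sell', $value + $margin);    // customer price = raw value + profit margin
}
$xml->asXML('data.xml');                        // everything else in the application reads this file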
While update.php stays open once on the server and refreshes itself every 12 seconds, a file called main.php parses data.xml and saves all of the current buy/sell values in sessions, so they can be compared with the next values in order to display an up or down arrow next to each updated currency and play a sound when my customer's currency changes.
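And main.php does essentially this (simplified; the real arrow markup and sound are omitted):
// main.php -- loaded into the main div by program.php every 12 seconds
session_start();
$xml = simplexml_load_file('data.xml');
foreach ($xml->currency as $c) {
    $name = (string) $c->name;
    $sell = (float)  $c->sell;
    $prev = isset($_SESSION['prev'][$name]) ? $_SESSION['prev'][$name] : $sell;
    // compare with the value stored on the previous poll to choose the arrow
    $arrow = ($sell > $prev) ? 'up' : (($sell < $prev) ? 'down' : 'same');
    echo $name . ' ' . $sell . ' ' . $arrow . '<br>';
    $_SESSION['prev'][$name] = $sell;           // remember for the next 12-second refresh
}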
In a file called program.php I set up a JavaScript interval that uses jQuery's load() to pull main.php into the main div every 12 seconds. program.php is the page the clients access, and it stays open in their browsers the whole time.
With all of the above in place everything worked perfectly, but lately the simultaneous usage of the application has risen: there are now at least 600 simultaneous connections, which makes the web server fail and get stuck.
What I am asking is: is there a better way to build this application, logically? Would caching work here, since the data is renewed only every 12 seconds?
Here are my own ideas so far:
1) Moving my hosting to a VPS from vps.net, which is Linux + Apache based (much better than the Windows + IIS build) and cloud hosted.
2) Creating another page that stays open on the server, converts program.php's output to plain HTML and saves it as output.html every 12 seconds, then serving that as a static program2.html. That way only one session is created and all the PHP work runs once per cycle (a rough sketch follows below).
Any other solutions?
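For the second idea, the server-side job would be something like this (paths are illustrative), so the web server serves a static file and PHP runs only once per cycle instead of once per client:
// runs on the server once every 12 seconds, right after data.xml is rebuilt
ob_start();
include 'main.php';                              // produce the same markup the clients ask for
$html = ob_get_clean();
file_put_contents('output.tmp', $html);          // write to a temp file first ...
rename('output.tmp', 'output.html');             // ... and swap atomically so clients never see a half-written file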
Grab a copy of Firefox and install Firebug + YSlow. Those will help you with the WWW part of the application.
Since we cannot see your data base, web service or programming, we probably cannot help you with anything more than theoretical guesses. So here are some of the things that can help improve the speed of web applications.
Be sure you have indexes on every column used by your SQL statements' JOIN, WHERE, ORDER BY and GROUP BY clauses. If you are adding a lot of rows to the tables, consider dropping the indexes and recreating them afterwards instead of having them in place during the additions.
Use EXPLAIN on every complex query to see what MySQL is doing.
Use LIMIT whenever you can. Avoid SELECT * and choose the columns you need instead.
Add script timers around any calls to external services -- see if the services are responding quickly enough.
It sounds like "data.xml" might be a design element that could be removed - you might just put this information directly into the data base?
Whether a currency is rising or falling really only requires two rows from the data base - the current one and the most recent prior one. Your SELECT statement would ORDER BY the most recent and LIMIT 2 (see the sketch after this list).
Determine whether there are enough data pipes into the server. The word "connections" might mean HTTP connections or MySQL connections - either could be causing you trouble.
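To illustrate the up/down comparison, something like this would do - assuming $mysqli is an open connection, and the table and column names are just examples:
// the two most recent quotes are enough to decide up / down
$sql = "SELECT gold_sell
          FROM currency_history
      ORDER BY created_at DESC
         LIMIT 2";
$res     = $mysqli->query($sql);
$current = $res->fetch_assoc();   // newest row
$prior   = $res->fetch_assoc();   // the one before it
$arrow   = ($current['gold_sell'] > $prior['gold_sell']) ? 'up' : 'down';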
Should you move to Linux? I would. But I cannot tell you that it will necessarily help your performance. Before I did that, I would hire a DBA to examine the application in detail. For a few hundred dollars, you can get a world-class computer science expert to share a day of experience with you. Sounds like it might be worth it.
HTH, ~Ray
ASKER
@haijerome: Thanks a lot for that great MySQL optimization article, I already clicked "yes" for it :). But in my case I only use one row and one column, and all of the optimization tips are already applied to it.
@Ray_Paseur: Ray, as I wrote, MySQL really has very little to do here: a single query is triggered by update.php every 12 seconds. update.php runs on the server, is closed to client access, and compiles data.xml from the data taken from the web service.
As far as I know, YSlow only looks at image/CSS/DOM speed and the like, which is all client-side; anyway, I checked it again and it looks excellent.
I think there is no need for a DBA, since the database in question is just MySQL. Anyway, would it be smarter to let MySQL handle all of this instead of the XML and the sessions? In that case we would get rid of all the session data, which really makes up a serious part of the load, although it creates a serious headache for MySQL instead.
But keep in mind that 600 clients ask for new data every 12 seconds while update.php writes new data every 12 seconds - that means a rain of queries. In this case, should we use memcached?
ASKER
Ray, I just thought that handling 50 queries per second in MySQL would take more time than parsing the XML. I can handle all the MySQL work myself, which is why we don't need a DBA :). So I will try to rebuild the application, using MySQL to store the data received from the web service and memcached to serve it to the clients.
ASKER
A table built like this:
create table currency(
cId int not null auto_increment ,
primary key(cId),
prevGld int,
currentGld int,
prevDollar int,
currentDollar int
);
and update.php, running on the server side once every 12 seconds, does:
update currency set prevGld = currentGld, prevDollar = currentDollar, currentGld = '$currentGld', currentDollar = '$currentDollar' where cId = '1';
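(MySQL evaluates single-table SET assignments from left to right, so prevGld picks up the old currentGld before currentGld is overwritten with the new value.)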
Are you suggesting that there is only one row in the data base, and that this is the only change every 12 seconds? Are there any other tables?
ASKER
Yes, only one row in the database and no other tables. The row gets updated by update.php every 12 seconds. Since we have 5 currency values that we retrieve from the web service, saving their previous values for the up/down arrow comparison should be enough, shouldn't it?
Sure, unless you want to keep some kind of historical record. You could keep another table of similar structure and just do INSERT ... DELAYED.
http://dev.mysql.com/doc/refman/5.1/en/insert-delayed.html
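Something along these lines, assuming $mysqli is an open connection (table and column names are illustrative; note that INSERT DELAYED is not supported by InnoDB, so the history table would be MyISAM):
// one-time setup: a history table of similar structure, plus a timestamp
$mysqli->query("CREATE TABLE currency_history (
                  hId INT NOT NULL AUTO_INCREMENT,
                  PRIMARY KEY (hId),
                  recorded_at DATETIME,
                  gld INT,
                  dollar INT
                ) ENGINE=MyISAM");
// then, in update.php, right after the UPDATE of the live row
$mysqli->query("INSERT DELAYED INTO currency_history (recorded_at, gld, dollar)
                VALUES (NOW(), '$currentGld', '$currentDollar')");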
ASKER
So, from the client side, which refreshes every 12 seconds:
$memcache = new Memcache();
$memcache->connect('localhost', 11211) or die("everything will be ok , really");
$currencies = $memcache->get('values:' . $updatedValues); // I set this from update.php, which runs on the server
I don't know how to answer you, since this appears to be only a tiny fragment of the code. It looks right, I guess.
What is the question at this point?
ASKER
I was just trying to keep you busy with foolish questions to gain time for answering the other ones (: OK, I was kidding. I just wanted to be sure about my new build. In update.php (the file that runs on the server side once every 12 seconds and is forbidden to client access):
1) I get the data from the web service,
2) insert it into MySQL,
3) populate the memcached key 'values' with the query result that holds the previous and current currency values (roughly as sketched below).
From the client side, where 600 clients connect and refresh every 12 seconds:
1) I run the code above to read the memcached key,
2) fetch the data from it and parse it,
3) then I pray for memcached not to commit suicide.
Is this build correct, or is there any logical pitfall?
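For reference, here is the rough sketch of the server side I have in mind (the web-service call, credentials and the 'gold'/'dollar' keys are placeholders):
// update.php -- server side, once every 12 seconds, closed to clients
$rates = json_decode(file_get_contents('http://example-webservice/rates'), true);  // 1) get data from the web service
$db = new mysqli('localhost', 'user', 'pass', 'gold');                              // 2) store it in MySQL
$db->query("UPDATE currency
               SET prevGld = currentGld, prevDollar = currentDollar,
                   currentGld = '{$rates['gold']}', currentDollar = '{$rates['dollar']}'
             WHERE cId = 1");
$row = $db->query("SELECT * FROM currency WHERE cId = 1")->fetch_assoc();           // previous + current values
// 3) put the row into memcached so the 600 clients never touch MySQL directly
$memcache = new Memcache();
$memcache->connect('localhost', 11211);
$memcache->set('values', $row, 0, 30);   // key must match what the clients get(); expires a little after one cycle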
Nothing could possibly go wrong.
ASKER
Thanks a lot for your help and patience :). I'll rebuild the project and let you know the result.
Glad to help you :-)
Even before you cache your files, you have to optimize your MySQL server to get the better results you are looking for. Kindly go through the link given below:
http://www.experts-exchang
If it helps then it's cool, else let me know!!!
Regards,
Jerome Dennis D