Solved

PHP / MySQL - highly transactional programming & thousands of concurrent users?

Posted on 2009-04-07
510 Views
Last Modified: 2012-06-27
# Strong understanding of highly transactional programming
# Code optimization to support thousands of concurrent users

Hi,

I've just seen these listed in the spec for a PHP / MySQL Web Developer job advertisement. Could you give me some ideas / examples of the usual solutions / patterns for solving these problems?

What is highly transactional programming? Is this just a fancy term for a web app / site that will have to do a lot of DB updates?

What are the normal methods for accommodating high numbers of concurrent users in web apps? What can I do in PHP / MySQL, or is it really down to the Apache / MySQL server and the hardware to deal with this?

Thanks

Question by:damoth1

Accepted Solution

by:
Beverley Portlock earned 250 total points
ID: 24087777
There are too many factors to fit a realistic answer in, but....

Transactional database updates usually refer to the practice of using journalled databases. These allow you to apply updates to several tables and then, when a consistent point is reached, "commit" all the updates in one go. It means that if a user backs out of a transaction, any uncommitted changes can simply be dropped. In MySQL this is usually done via the InnoDB engine.
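
As a rough illustration, a transaction in PHP using PDO against InnoDB tables looks something like the sketch below. The connection details and the table names ("orders", "order_items") are made up for the example.

<?php
// Minimal transaction sketch - assumes the tables use the InnoDB engine.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $pdo->beginTransaction();

    // Apply updates to several tables as one unit of work...
    $pdo->exec("INSERT INTO orders (customer_id, total) VALUES (42, 99.95)");
    $orderId = $pdo->lastInsertId();

    $stmt = $pdo->prepare(
        "INSERT INTO order_items (order_id, sku, qty) VALUES (?, ?, ?)"
    );
    $stmt->execute(array($orderId, 'ABC-123', 2));

    // ...and only make the changes visible once everything has succeeded.
    $pdo->commit();

} catch (Exception $e) {
    // If anything fails (or the user backs out), the uncommitted
    // changes are simply dropped.
    $pdo->rollBack();
    throw $e;
}
?>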

To be honest, if you have to ask these sorts of questions then I'm not sure you should be going for the job! (sorry!)

Dealing with thousands of users takes a combination of high-speed connections, fail-over setups, large, fast servers, connection pooling, optimised database design... it needs a book, not an EE answer.
 

Author Comment

by:damoth1
ID: 24087872
Thanks bportlock,

These two are listed as preferred skills - I have all the required skills on the job spec. I just want to understand what the two terms mean.

Anyway, now I know what is meant by transactional database updates. Could I get a little more information about concurrent users, just in terms of PHP / MySQL? I just want a very high-level overview like the one given for transactional databases above.

Assisted Solution

by:
Beverley Portlock earned 250 total points
ID: 24088533
A lot of what has to be done will depend on the precise configuration being used. For instance, there may be multiple Apache servers being fed on a load-balanced basis. The speed of each Apache server would depend on the memory available, the number of children the server process can spawn, how many are created initially, and how many are held in reserve.
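
For illustration only, those child / reserve settings live in the Apache prefork configuration. The numbers below are placeholders rather than recommendations - the right values depend on how much memory each child uses and the traffic profile:

# httpd.conf (prefork MPM) - illustrative values only
<IfModule mpm_prefork_module>
    StartServers          10     # children created when Apache starts
    MinSpareServers        5     # idle children held in reserve
    MaxSpareServers       20
    MaxClients           150     # ceiling on concurrent children
    MaxRequestsPerChild 5000     # recycle children to cap memory creep
</IfModule>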

It may be possible for the databases to be held separately, with remote processes consolidating the data in an off-line manner. Probably the most important consideration would be the number of updates, as this can drastically affect the database response. It may be that queries are queued and processed by a "dispatcher engine", an approach I have used in fast processing environments on multi-processor computers (normally IBM AS/400s).

One way to increase responsiveness is to open 20 or 30 connections to the database and make them persist so that they exist in a pool to be allocated. This saves the overhead of re-establishing a server connection every time a transaction starts, but you need an efficient connection pool manager or else the overhead is worse than making a fresh connection.
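
In plain PHP the nearest equivalent is a persistent connection, which is re-used between requests rather than torn down each time. A minimal sketch, assuming PDO (a proper pool manager sitting in front of MySQL would be a separate piece of software):

<?php
// Persistent connection sketch - PDO::ATTR_PERSISTENT keeps the
// connection open for re-use by later requests on the same process.
$pdo = new PDO(
    'mysql:host=localhost;dbname=app',
    'user',
    'pass',
    array(PDO::ATTR_PERSISTENT => true)
);
?>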

There are in-memory tables for fast lookup, query optimisation, data denormalisation.... there are dozens of techniques.
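
As one small example of the first technique, a frequently-read lookup table can be copied into a MEMORY (HEAP) table at start-up so reads avoid disc I/O. The table and column names here are invented for the sketch:

<?php
// Copy a small lookup table into an in-memory table - illustrative only.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$pdo->exec("DROP TABLE IF EXISTS country_cache");
$pdo->exec("CREATE TABLE country_cache (
                iso  CHAR(2) PRIMARY KEY,
                name VARCHAR(64) NOT NULL
            ) ENGINE=MEMORY");
$pdo->exec("INSERT INTO country_cache (iso, name)
            SELECT iso, name FROM countries");

// Later lookups hit the in-memory copy instead of the disc-based table.
$stmt = $pdo->prepare("SELECT name FROM country_cache WHERE iso = ?");
$stmt->execute(array('IE'));
echo $stmt->fetchColumn();
?>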

At a guess, your first headache is getting the users to the server at a reasonable speed - that means lots of bandwidth on the networks into the server room. Next, the servers will need a good chunk of memory - say 128MB per Apache process, and if you are running 30 Apache processes then you are looking at nearly 4GB of memory, so you want a 64-bit server, probably with a minimum of 2 cores per chip.

I would imagine some form of RAID - RAID 5 is resilient but slow if it loses a disc; RAID 0 is faster and may suffice if you have the right controllers. The discs will be your biggest bottleneck since, being mechanical, they are the slowest component. I would want a multi-disc setup, with the machine partitioned onto different discs to separate data / caches / OS / programs.

I'm skimming the surface here, but you get the idea.


