Can MySQL handle over 10 million rows?

Dear Experts,

I wish to know if MySQL can handle more than 10 million rows. I have a query coded in PHP that simply copies 10 million rows of data from MySQL DB1 to MySQL DB2. DB1 and DB2 are in different locations (two different servers). The table has 10 million rows, and it is the only table I cannot copy.

I wish to know if MySQL/PHP can copy all 10 million rows over a single connection. (If I put a LIMIT of 100,000 on the query and loop, my code is able to copy all those rows over.)
Kinderly Wade (programmer) asked:

Gary commented:
Why not dump the db, copy it over and reimport?
There is no limit to how much you can copy; the only problem would be the way you are doing it.
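
For reference, a dump-and-reload can even be driven from a PHP cron script along these lines. This is only a sketch: the hostnames, credentials, database and table names are placeholders, and it assumes the host allows exec() and provides the mysqldump/mysql client binaries.

<?php
// Sketch: dump big_table from server 1 and reload it into server 2.
// Hostnames, credentials, and names below are placeholders.
$dump = 'mysqldump --host=db1.example.com --user=user1 --password=pass1 '
      . '--single-transaction db1 big_table > /tmp/big_table.sql';
$load = 'mysql --host=db2.example.com --user=user2 --password=pass2 '
      . 'db2 < /tmp/big_table.sql';

exec($dump, $out, $rc);
if ($rc !== 0) die("dump failed (exit code $rc)\n");

exec($load, $out, $rc);
if ($rc !== 0) die("import failed (exit code $rc)\n");

echo "copy complete\n";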
Kinderly Wade (programmer, Author) commented:
I've created a script that executes the backup every hour. Since the code is based on PHP, I am trying a PHP solution first.

I am just wondering if there is a cap on the number of rows the database can copy over a single connection, because if I place a LIMIT and loop through the chunks, I am able to copy everything successfully.

If there is a cap, is it imposed by my web hosting provider, or is it a limit within MySQL itself on the number of rows that can be copied over a single connection?
Gary commented:
Is this run from the command line?
Give me some idea of the SQL - you're not trying to load 10 million records in one go, are you?
Ray Paseur commented:
The answer about 10,000,000 rows is "no problem" on the MySQL side.  But the process may take a long time, and there may be server-to-server communication issues we don't know about.  I would think you want to come up with a restartable process.  Usually these are based on a UNIQUE column value.  When an attempt to INSERT a duplicate value in a UNIQUE index occurs, MySQL throws error #1062.  Your script can trap this error and ignore it.  Another way to do this is to run several steps to copy the data, perhaps using LIMIT clauses to reduce the size in any one operational step.
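
A restartable, chunked copy along those lines might look like the sketch below. The connection details, table, and column names are placeholders, and it uses INSERT IGNORE to skip duplicate-key error #1062 rather than trapping it by hand; keyset pagination (WHERE id > ...) is used instead of OFFSET because it stays fast on large tables.

<?php
// Sketch: restartable chunked copy keyed on a UNIQUE column (`id`).
// Hostnames, credentials, table and column names are placeholders.
$src = new mysqli('db1.example.com', 'user1', 'pass1', 'db1');
$dst = new mysqli('db2.example.com', 'user2', 'pass2', 'db2');

$chunk  = 10000;
$lastId = 0;   // resume point; persist this to restart after a failure

do {
    // Keyset pagination: WHERE id > ... is cheaper than OFFSET here.
    $res = $src->query(
        "SELECT id, col1, col2 FROM big_table
         WHERE id > $lastId ORDER BY id LIMIT $chunk"
    );
    $rows = $res->num_rows;

    while ($row = $res->fetch_assoc()) {
        $id   = (int) $row['id'];
        $col1 = $dst->real_escape_string($row['col1']);
        $col2 = $dst->real_escape_string($row['col2']);

        // INSERT IGNORE silently skips rows that would raise
        // duplicate-key error #1062, so re-running is safe.
        $dst->query(
            "INSERT IGNORE INTO big_table (id, col1, col2)
             VALUES ($id, '$col1', '$col2')"
        );
        $lastId = $id;
    }
    $res->free();
} while ($rows === $chunk);

echo "copied up to id $lastId\n";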

Brian Tao (Senior Business Solutions Consultant) commented:
MySQL has no problem handling multi-million-row tables. I believe the problem is on the PHP side: your script is probably hitting either the execution time limit or the memory limit.
So when you said you could not copy, what was the error message you got from your PHP script?
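
If those limits are the culprit, they can sometimes be raised from the script itself, and an unbuffered query stops PHP from holding the whole result set in memory. A sketch, assuming the host permits these overrides (shared hosts often do not); the connection details and table name are placeholders:

<?php
// Sketch: lift PHP's time/memory ceilings for a long-running copy.
set_time_limit(0);                // remove the execution time limit
ini_set('memory_limit', '512M');  // raise the memory ceiling

$src = new mysqli('db1.example.com', 'user1', 'pass1', 'db1');

// MYSQLI_USE_RESULT streams rows one at a time instead of buffering
// the entire 10-million-row result set in PHP memory.
$res = $src->query('SELECT id FROM big_table', MYSQLI_USE_RESULT);
$count = 0;
while ($row = $res->fetch_assoc()) {
    $count++;                     // process each row as it arrives
}
$res->free();
echo "$count rows streamed\n";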
Kinderly Wade (programmer, Author) commented:
I think it is a limitation of the web hosting site. I was able to copy 10 million rows locally by setting up two MySQL databases on separate machines. When I connect to the web host's database, I hit some sort of cap around 100K rows. Thanks, all, for the great feedback.