• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 886

Can MySQL handle over 10 million rows?

Dear Experts,

I wish to know if MySQL can handle more than 10 million rows. I have a query coded in PHP that simply copies 10 million rows of data from MySQL DB1 to MySQL DB2. DB1 and DB2 are in different locations (two different servers). This is a table with 10 million rows, and it is the only table that I cannot copy.

I wish to know if MySQL/PHP can handle over 10 million rows, as in copying all 10 million rows over a single connection. (If I put a LIMIT of 100,000 in my code, it is able to copy all those rows over.)
Asked by: Kinderly Wade
2 Solutions
 
Gary commented:
Why not dump the database, copy the dump over, and reimport it?
There is no limit to how much you can copy; the only problem would be the way you are doing it.
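A minimal sketch of that approach from PHP, assuming the host allows shelling out to mysqldump/mysql; the host names, credentials, database and table names below are placeholders:

<?php
// Sketch only: dump the table from server 1, then reimport it on server 2.
// Hosts, users, passwords, database and table names are all hypothetical.
$dump   = "mysqldump -h db1.example.com -u user1 -p'secret1' db1 big_table > big_table.sql";
$import = "mysql -h db2.example.com -u user2 -p'secret2' db2 < big_table.sql";

exec($dump, $output, $rc);
if ($rc !== 0) {
    die("mysqldump failed with exit code $rc\n");
}
exec($import, $output, $rc);
if ($rc !== 0) {
    die("reimport failed with exit code $rc\n");
}
echo "Table copied.\n";

(On many shared hosts exec() is disabled, in which case the dump and reimport would have to be run from a shell or the hosting control panel instead.)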
 
Kinderly Wade (programmer, Author) commented:
I've created a script that will execute the backup every hour. Since the code is based on PHP, I am trying a PHP solution first.

I am just wondering if there is a cap on the number of rows the database can copy over a single connection, because if I place a LIMIT and loop through the batches, I am able to copy everything over successfully.

If there is a cap, is it imposed by my web hosting provider, or does MySQL itself cap the number of rows that can be copied over a single connection?
 
Gary commented:
Is this run from the command line?
Give me some idea of the SQL - you're not trying to load 10 million records in one go, are you?
 
Ray Paseur commented:
The answer about 10,000,000 rows is "no problem" on the MySQL side.  But the process may take a long time, and there may be server-to-server communication issues we don't know about.  I would think you want to come up with a restartable process.  Usually these are based on a UNIQUE column value.  When an attempt is made to INSERT a duplicate value into a UNIQUE index, MySQL throws error #1062.  Your script can trap this error and ignore it.  Another way to do this is to run several steps to copy the data, perhaps using LIMIT clauses to reduce the size of any one operational step.
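A rough sketch of that kind of restartable, batched copy in PHP, assuming the table has a UNIQUE (or PRIMARY KEY) column named id; the connection details, table and column names are placeholders:

<?php
// Sketch of a restartable batched copy. Hosts, credentials, table and column
// names are hypothetical; `id` is assumed to be a UNIQUE/PRIMARY KEY column.
$src = new mysqli('db1.example.com', 'user1', 'secret1', 'db1');
$dst = new mysqli('db2.example.com', 'user2', 'secret2', 'db2');

$batch  = 10000;   // rows per step
$lastId = 0;       // persist this somewhere to make the job restartable

do {
    $res = $src->query(
        "SELECT id, col1, col2 FROM big_table
         WHERE id > $lastId ORDER BY id LIMIT $batch"
    );
    $rows = 0;
    while ($row = $res->fetch_assoc()) {
        $rows++;
        $lastId = (int)$row['id'];
        // INSERT IGNORE skips duplicate-key error #1062, so re-running the
        // script after a failure simply picks up where it left off.
        $dst->query(sprintf(
            "INSERT IGNORE INTO big_table (id, col1, col2) VALUES (%d, '%s', '%s')",
            $lastId,
            $dst->real_escape_string($row['col1']),
            $dst->real_escape_string($row['col2'])
        ));
    }
    $res->free();
} while ($rows === $batch);

Filtering on WHERE id > $lastId rather than using OFFSET keeps each step cheap even deep into the 10 million rows, and storing $lastId between runs makes the whole copy restartable.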
 
Brian Tao (Senior Business Solutions Consultant) commented:
MySQL has no problem handling multi-million-row tables; I believe the problem is on the PHP side. Your script is most likely hitting either the execution time limit or the memory limit.
So when you say you could not copy, what error message did you get from your PHP script?
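If those limits are the issue, a sketch like the following (assuming the host allows these overrides, which many shared hosts do not) relaxes them and streams the result set instead of buffering all 10 million rows in memory:

<?php
// Sketch: lift PHP's runtime limits and use an unbuffered query.
// Connection details and table/column names are placeholders.
set_time_limit(0);                 // remove the max_execution_time cap
ini_set('memory_limit', '512M');   // raise the memory ceiling

$src = new mysqli('db1.example.com', 'user1', 'secret1', 'db1');

// MYSQLI_USE_RESULT streams rows from the server one at a time instead of
// loading the entire result set into PHP's memory.
$res = $src->query('SELECT id, col1, col2 FROM big_table', MYSQLI_USE_RESULT);
while ($row = $res->fetch_assoc()) {
    // ... insert the row into the destination server here ...
}
$res->free();

Even with these settings, a web request on shared hosting can still be killed by the provider, which is one more reason to favour a restartable, batched approach.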
 
Kinderly Wade (programmer, Author) commented:
I think it is a limitation on the web hosting side. I was able to copy the 10 million rows locally by setting up two MySQL databases on separate machines. If I connect to the web host's database instead, I hit some sort of cap around 100K rows. Thanks all for the great feedback.