Solved

Best way to chunk data from MySQL for import into shared-hosting MySQL database

Posted on 2011-04-23
642 Views
Last Modified: 2013-12-14
I have a MySQL table with c.1,850 rows and two columns - `ID` (int - not auto-incrementing) and `data` (mediumblob). The table is c.400MiB, with many individual entries exceeding 1MiB and some as large as 4MiB. I must upload it to a typical Linux shared-hosting installation.

So far, I have run into a variety of size restrictions. Bigdump, which effortlessly imported the rest of the database, cannot handle this table: it stops at a different point with every method I have tried (various attempts using SQL and CSV exports). Direct import using phpMyAdmin has also failed.

I now accept that I will have to split the table's contents in some way if the import is ever to succeed. But since, for example, the last CSV export showed 1.6 million lines in GVim (when the table only has 1,850 rows), I don't even know where to start.

What is the best method? And what settings must I use at export to make the method work?
Question by:hnmcc
4 Comments
 
Expert Comment by: Aaron Tomosky
ID: 35453842
I'd start by using FTP to get the file to the server, either in chunks or all at once.
Then you could write a tiny PHP script that uses fopen() and fgets() to read a line at a time and insert it into the table. Make sure to echo something every hundred rows or so, so that PHP won't time out.
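A minimal sketch of that idea, assuming the dump was uploaded by FTP as `table_dump.sql` with one complete SQL statement per line (dumping with mysqldump's --skip-extended-insert keeps one row per statement); the file name and credentials are placeholders:

```php
<?php
// Stream an uploaded dump file line by line and run each statement,
// echoing progress so the request is not killed for producing no output.
// Assumptions: file name "table_dump.sql", placeholder credentials.

$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}

set_time_limit(0);                      // may be ignored on some shared hosts
$fh = fopen('table_dump.sql', 'rb');
if (!$fh) {
    die('Cannot open dump file');
}

$count = 0;
while (($line = fgets($fh)) !== false) {
    $line = trim($line);
    if ($line === '' || strpos($line, '--') === 0) {
        continue;                       // skip blank lines and comments
    }
    if (!$mysqli->query($line)) {
        echo "Error after $count statements: " . $mysqli->error . "\n";
        break;
    }
    if (++$count % 100 === 0) {
        echo "Imported $count statements\n";  // periodic output avoids timeouts
        flush();
    }
}

fclose($fh);
$mysqli->close();
echo "Done: $count statements executed\n";
```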
 
Expert Comment by: johanntagle
ID: 35455398
Hmmm, I guess your shared-hosting subscription doesn't allow shell access, which is why you are trying these different methods? In that case I would upload the output of mysqldump to the server, then just ask/tell/demand that the hosting provider import it for me. If that's not possible, I would try it as Aaron said, or maybe I would just switch hosts =).

 
Accepted Solution by: Ray Paseur (earned 2000 total points)
ID: 35467086
This is an interesting question, and it might be best to redesign the application.  First the "why", then the "how"...

Databases with large blobs of data are terribly inefficient and slow, sometimes to the point of failure (as you have observed).  Consider what happens when someone writes a query that says "SELECT *" -- you get a table scan.  In database terms, a table scan is sort of like dog poop - it's out of place no matter where you find it.  When that table scan involves transferring many megabytes of unnecessary information... well, you get the picture.

Here is what I might try.  Use ALTER TABLE to add a URL column to the existing table.  Then run a query that retrieves each row, write each blob out to the file system under a filename based on the key, and store that URL in the new column.  Then use ALTER TABLE to remove the blob column.  Once you have done that, you have a clean-and-lean table with two narrow columns, and you have the large files in the file system, where they belong.
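A rough sketch of that migration, run once against the original (local) database before exporting. The table name `mytable` and the `blobs/` directory are assumptions; `ID` and `data` are the columns named in the question:

```php
<?php
// One-time restructuring: move each blob out of the table into its own file,
// keep only the file location in the table. Placeholder credentials below.

$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}

// 1. Add the new column that will hold the file location.
$mysqli->query("ALTER TABLE mytable ADD COLUMN url VARCHAR(255)");

// 2. Walk the keys, fetching one blob at a time to keep memory use low,
//    and write each blob to its own file named after the key.
$ids = $mysqli->query("SELECT ID FROM mytable");
while ($row = $ids->fetch_assoc()) {
    $id   = (int)$row['ID'];
    $blob = $mysqli->query("SELECT data FROM mytable WHERE ID = $id")->fetch_assoc();
    $path = "blobs/$id.bin";
    file_put_contents($path, $blob['data']);

    // 3. Record the file location in the new column.
    $mysqli->query("UPDATE mytable SET url = '$path' WHERE ID = $id");
    echo "Wrote $path\n";
}

// 4. Drop the blob column, leaving a small two-column table to import.
$mysqli->query("ALTER TABLE mytable DROP COLUMN data");

$mysqli->close();
```

Fetching one blob per iteration keeps the script's memory footprint at roughly one blob (4 MiB at worst) rather than the whole 400 MiB table.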

Use FTP to upload your 1,850 files.  Use phpMyAdmin to import the new, lean table.  HTH, ~Ray
 
Expert Comment by: skullnobrains
ID: 35497071
Forget CSV if you transfer blobs; that is definitely asking for trouble.

- mysqldump has options that control how rows are grouped into the generated INSERT statements. Just cat the resulting file into a mysql client if you can; if you can only use phpMyAdmin's import, copy and paste the statements, breaking the file up every 40 MB or so.

- You may also write a PHP script that remotely performs an SQL query on your source server and imports the received data into the target MySQL. The script can do this chunk by chunk easily (roughly 10-20 lines of code; see the sketch after this list).

- You may also send your dump to the target web server and have a PHP script perform each import query one by one.

- If your web and MySQL servers are on the same host, you may also consider using "SELECT ... INTO OUTFILE", moving the file to the remote host, and using "LOAD DATA INFILE" to reimport. This is the least painful and most efficient method if it applies.
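For the second bullet, a rough sketch of such a chunk-by-chunk copy, assuming the source database accepts connections from wherever the script runs; host names, credentials, and the table name `mytable` are placeholders:

```php
<?php
// Copy rows from the source table to the target in small batches.

$batch  = 20;   // rows per chunk; keep rows x blob size under the target's
                // max_allowed_packet and PHP's memory limit
$source = new mysqli('source.example.com', 'src_user', 'src_pass', 'src_db');
$target = new mysqli('localhost',          'dst_user', 'dst_pass', 'dst_db');

// Prepared statement on the target; 's' is binary-safe for the blob column.
$stmt = $target->prepare("INSERT INTO mytable (ID, data) VALUES (?, ?)");
$id   = 0;
$blob = '';
$stmt->bind_param('is', $id, $blob);

$offset = 0;
while (true) {
    $res = $source->query("SELECT ID, data FROM mytable ORDER BY ID LIMIT $offset, $batch");
    if ($res->num_rows === 0) {
        break;                           // nothing left to copy
    }
    while ($row = $res->fetch_assoc()) {
        $id   = (int)$row['ID'];
        $blob = $row['data'];
        $stmt->execute();
    }
    $res->free();
    $offset += $batch;
    echo "Copied $offset rows\n";        // progress output also avoids timeouts
    flush();
}

$stmt->close();
$source->close();
$target->close();
```

The prepared statement avoids any escaping issues with the binary data; the batch size is the only knob that matters, and it mainly has to respect the target's max_allowed_packet.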
