Best way to chunk data from MySQL for import into shared-hosting MySQL database
Posted on 2011-04-23
I have a MySQL table with c. 1,850 rows and two columns - `ID` (int - not auto-incrementing) and `data` (mediumblob). The table is c. 400 MiB, with many individual entries exceeding 1 MiB and some as large as 4 MiB. I must upload it to a typical Linux shared-hosting installation.
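For reference, the structure is essentially this (`mytable` and the primary key are my shorthand; only the two columns and their types are the real ones):

```sql
CREATE TABLE `mytable` (
  `ID`   INT NOT NULL,
  `data` MEDIUMBLOB,
  PRIMARY KEY (`ID`)
);
```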
So far I have run into a variety of size restrictions. Bigdump, which imported the rest of the database without trouble, cannot handle this table: it stops at a different point on each attempt, whichever format I feed it (I have tried both SQL and CSV exports). A direct import through phpMyAdmin has also failed.
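My guess is that a per-statement size cap is at least part of the problem: a single 4 MiB row makes for an INSERT well above the 1 MiB `max_allowed_packet` that many hosts still run with. If it helps, this is what I would check on the host (I can only see the value there, not change it):

```sql
-- The host's per-packet ceiling; one 4 MiB row can easily exceed a 1 MiB default.
SHOW VARIABLES LIKE 'max_allowed_packet';
```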
I now accept that I will have to split the table's content in some way if the import is ever to succeed. But I am not sure how: the last CSV export, for example, opened as some 1.6 million lines in GVIM for a table of only 1,850 rows - presumably because the binary blob data is full of bytes that look like line breaks - so I don't even know where to start.
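The only concrete idea I have so far is to split the dump into ID ranges using mysqldump's `--where` option, along these lines (a rough sketch: database name, table name, user and chunk size are placeholders, and I'm assuming the IDs run from 1 to 1,850 - gaps would just mean smaller chunk files):

```sh
#!/bin/sh
# Dump the table in ID-range chunks so no single file (or statement) gets huge.
# --hex-blob              write blobs as 0x... hex literals so binary data
#                         survives being shipped around as text
# --skip-extended-insert  one INSERT per row, so statements stay small
# --no-create-info        assumes the empty table is created on the host first
DBUSER=me
DB=mydb
TABLE=mytable
STEP=50      # rows per chunk - a guess; shrink it if the files are still too big
MAX=1850
i=1
while [ "$i" -le "$MAX" ]; do
  j=$((i + STEP - 1))
  mysqldump -u "$DBUSER" -p --hex-blob --skip-extended-insert --no-create-info \
    --where="ID BETWEEN $i AND $j" "$DB" "$TABLE" > "${TABLE}_${i}_${j}.sql"
  i=$((j + 1))
done
```

(The `-p` prompt fires once per chunk; a `[client]` section in `~/.my.cnf` avoids that. Also, `--hex-blob` roughly doubles the size of the blob data on disk, so the chunk size needs tuning against whatever upload limit the host imposes.)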
Is something along those lines the right approach, or is there a better method? And what settings do I need at export time to make whichever method work?