I need to import a large amount of data into MySQL from a CSV file: 50,000 or more line items, which need to be written across several tables in my database for proper normalization.
I have class structures in the form of parents that house collections of line items. They are well designed and offer what I need when the user is editing a few of the items or otherwise manipulating the data structure(s).
The problem is that during an import, performance is very poor if I use my collections in PHP to write to the database:
foreach ($oProject->lineItemCollection as $oLineItem) {
    // ... one INSERT (one database round trip) per line item ...
}
Without resorting to stored procedures in MySQL, is there a 'proper' way I could compile an .SQL script in PHP and then submit it to the MySQL server, so that the 12,000 queries execute in ONE database access rather than PHP looping 10,000 times over individual queries?
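To make the idea concrete, here is a rough, untested sketch of what I have in mind: building one multi-row INSERT string in PHP and sending it in a single query. The table name (`line_items`), column names, and the helper function are made up for illustration, not my real schema.

```php
<?php
// Sketch only: compile the whole import into one multi-row INSERT
// statement, so MySQL receives one statement instead of thousands.
// In real code the string values would be escaped with
// $mysqli->real_escape_string() or built with prepared statements.

function buildBatchInsert(array $items, int $projectId): string
{
    $rows = [];
    foreach ($items as $item) {
        $rows[] = sprintf("(%d, '%s', %d)", $projectId, $item['sku'], $item['qty']);
    }
    return 'INSERT INTO line_items (project_id, sku, qty) VALUES '
         . implode(', ', $rows) . ';';
}

// One statement, one round trip (subject to MySQL's max_allowed_packet;
// a very large import could be chunked into batches of ~1000 rows each).
$sql = buildBatchInsert(
    [['sku' => 'A-100', 'qty' => 2], ['sku' => 'A-101', 'qty' => 5]],
    42
);
// $mysqli->query($sql);
```

Would something like this be the right direction, or is there a better-supported mechanism for submitting a whole batch at once?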