I have a large XML file I'm importing into MySQL. It works, but once I get to a few thousand records I hit a FastCGI timeout. According to my host I can't change that limit, so the import has to finish in under 60 seconds.
So I'm wondering if I can split the XML file into batches of 100 records and then run each batch independently until the whole file has been processed. My problem is how best to split the XML file up.
My current idea is to read the XML file one line at a time until I find a `</record>` tag, count to 100, save those records to a file, then carry on with the next 100 `</record>`s. Reading the file one line at a time should also keep memory usage down, since some of the XML files are massive.
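A minimal sketch of that line-by-line batching idea in Python. It assumes each record ends with a `</record>` close tag on its line, and that the document wrapper is an XML prolog plus a `<records>` root element (both hypothetical; adjust the names to your actual schema):

```python
def split_records(path, batch_size=100, prefix="chunk"):
    """Stream `path` one line at a time, writing every `batch_size`
    </record> closes out to a new numbered XML file."""
    batch, count, part = [], 0, 0
    with open(path, encoding="utf-8") as src:
        for line in src:
            stripped = line.strip()
            # Skip the prolog and root tags; they are re-added per chunk.
            if stripped.startswith("<?xml") or stripped in ("<records>", "</records>"):
                continue
            batch.append(line)
            if "</record>" in line:
                count += 1
                if count == batch_size:
                    _flush(batch, prefix, part)
                    batch, count, part = [], 0, part + 1
    if batch:  # leftover records that didn't fill a full batch
        _flush(batch, prefix, part)

def _flush(lines, prefix, part):
    """Write one batch as a standalone, well-formed XML document."""
    with open(f"{prefix}{part:04d}.xml", "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0"?>\n<records>\n')
        out.writelines(lines)
        out.write("</records>\n")
```

Because only one batch is held in memory at a time, this stays cheap even for very large files; each chunk file can then be imported by a separate request that finishes well inside the timeout.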
Can anyone suggest another way of doing this, or is this going to be the best approach?