I have a large XML file I'm trying to read into MySQL. It works, but once I get to a few thousand records I get a FastCGI timeout error. According to my host I cannot change that limit, so I have to import the XML document in under 60 seconds.
So I'm wondering if it's possible to split the XML file into batches of 100 records, and then call each 100-record batch independently until the end. My problem is how to split the XML file up easily.
My current idea is to read the XML file one line at a time until I find a `</record>` tag, count to 100, save those 100 records into a file, and then carry on with the next 100 `</record>` tags. I'm thinking that reading the file one line at a time might also reduce memory usage, since some of these XML files are massive.
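To show what I mean, here's a minimal sketch of that line-by-line batching idea in Python (the tag name `</record>`, the `batch` filename prefix, and the batch size are just placeholders; it assumes each `</record>` ends on its own line and ignores any surrounding root element):

```python
# Split a large XML file into batches of BATCH_SIZE <record> elements
# by scanning one line at a time, so the whole file is never in memory.
BATCH_SIZE = 100

def split_records(src_path, out_prefix="batch"):
    batch_num = 0
    record_count = 0
    out = open(f"{out_prefix}_{batch_num}.xml", "w")
    with open(src_path) as src:
        for line in src:
            out.write(line)
            if "</record>" in line:
                record_count += 1
                # After every BATCH_SIZE records, start a new output file.
                if record_count % BATCH_SIZE == 0:
                    out.close()
                    batch_num += 1
                    out = open(f"{out_prefix}_{batch_num}.xml", "w")
    out.close()
    return batch_num + 1  # number of batch files written
```

If records can span line breaks in unpredictable ways, a streaming parser such as `xml.etree.ElementTree.iterparse` would be more robust than matching the closing tag by string, while still keeping memory use low.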
Can anyone suggest another way of doing this, or is this going to be the best way?