I need to load data from a URL and dump it into a MySQL database using PHP. I will be running this process via cron, so it needs to happen with no intervention. The data is in .csv format and the file is large, approximately 75 MB.
I tried a test with this code, which I know works with smaller files.
$file = file_get_contents('http://www.xxxxxx.com/download_inventory/somefile.csv');
// some PHP to process $file
Even with set_time_limit() it appears to time out. I assume my variable $file can't hold 75 MB, so I need some way to write the file to my server's drive without holding the entire thing in a variable.
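One approach I'm considering is streaming the download to disk in fixed-size chunks, so memory use stays constant no matter how big the file is. A rough sketch (the URL is the placeholder from above, the destination path is an assumption, and this relies on PHP's HTTP stream wrapper, i.e. allow_url_fopen being enabled):

```php
<?php
// Copy a remote file to local disk in chunks, so only one chunk
// (here 1 MB) is ever held in memory at a time.
function streamDownload(string $url, string $dest): bool
{
    $in = fopen($url, 'rb');   // HTTP stream wrapper, or a local path
    if ($in === false) {
        return false;
    }
    $out = fopen($dest, 'wb');
    if ($out === false) {
        fclose($in);
        return false;
    }
    while (!feof($in)) {
        fwrite($out, fread($in, 1048576)); // 1 MB per chunk
    }
    fclose($in);
    fclose($out);
    return true;
}

// Usage (URL is the placeholder from the question, path is an assumption):
// streamDownload('http://www.xxxxxx.com/download_inventory/somefile.csv',
//                '/path/to/somefile.csv');
```

This way $file never exists as one giant string, so neither memory_limit nor the download size should be the bottleneck.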
My intention was then to use PHP to process the data and dump it into a database, but there may be a better way. Should I write it to the drive and then use LOAD DATA LOCAL INFILE to dump it into the database? If so, what is the code to write the data from the remote URL to my server's drive? (Note: I have a dedicated server, so I have full access.) Or is there a way to have MySQL pull it directly into the database?
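For the LOAD DATA LOCAL INFILE route, here is a sketch of what I mean. The table name `inventory`, the "skip one header row" option, and the connection details are all assumptions; the delimiters would need to match the actual CSV:

```php
<?php
// Build a LOAD DATA LOCAL INFILE statement for a CSV file.
// Table name and CSV layout here are assumptions -- adjust to the schema.
function buildLoadDataSql(string $path, string $table): string
{
    $path = addslashes($path);
    return "LOAD DATA LOCAL INFILE '$path'
            INTO TABLE `$table`
            FIELDS TERMINATED BY ','
            OPTIONALLY ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
            IGNORE 1 LINES"; // skip a header row, if the CSV has one
}

// Executing it with PDO (connection details are placeholders; LOCAL INFILE
// must be allowed on both the client and the server side):
// $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass',
//                [PDO::MYSQL_ATTR_LOCAL_INFILE => true]);
// $pdo->exec(buildLoadDataSql('/path/to/somefile.csv', 'inventory'));
```

My understanding is that letting MySQL bulk-load the file like this is much faster than inserting row by row from PHP, which matters at 75 MB.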
In short, I am looking for the best way to automate fetching the data from the URL daily and updating my database with it.
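For the daily part, I'd wire the script into cron; a sketch of the crontab entry (the time, the php binary path, the script name, and the log path are all placeholders):

```shell
# m h dom mon dow  command  -- run the import every day at 03:30,
# appending output and errors to a log so unattended failures are visible
30 3 * * * /usr/bin/php /path/to/import_inventory.php >> /var/log/inventory_import.log 2>&1
```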
I'd appreciate your help.