I have never messed with Perl, but I understand it is probably what I should be using for these imports given the file sizes I am dealing with.
Environment: Linux MariaDB server.
I have 7 data files of roughly 24 million rows each.
Normally I import these over SSH with the mysql client after moving the ".csv" files to the server, running the following command for each file:
LOAD DATA LOCAL INFILE '/home/APR17/20161003_FULL FILE_01.csv' INTO TABLE FD_MASTER FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;
Then I run this query to tag the batch that was just loaded:
UPDATE FD_MASTER
SET LoadInputFile = '01'
WHERE LoadInputFile IS NULL;
So my question is: how do I create a Perl script that will load each of the 7 files one at a time, running the UPDATE query between loads?
It would also be nice if, at the end, it produced a simple report of what it did, including how long each segment took to load.
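For what I have in mind, something like the following sketch seems close. It assumes the DBI and DBD::mysql modules are installed, and the database name, host, user, and password are placeholders to fill in; the file-name pattern is guessed from the one example above, so adjust it if the other six files are named differently. The `mysql_local_infile=1` connection option is needed for LOAD DATA LOCAL to work through DBI:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Time::HiRes qw(time);

# Placeholder connection details -- replace database/user/password.
# mysql_local_infile=1 enables LOAD DATA LOCAL on the client side.
my $dbh = DBI->connect(
    'DBI:mysql:database=mydb;host=localhost;mysql_local_infile=1',
    'user', 'password',
    { RaiseError => 1 },
);

# Perl's string auto-increment makes '01' .. '07' produce 01, 02, ... 07.
for my $n ('01' .. '07') {
    my $file = "/home/APR17/20161003_FULL FILE_$n.csv";

    # Load one file and time it.
    my $start = time;
    $dbh->do(qq{
        LOAD DATA LOCAL INFILE '$file'
        INTO TABLE FD_MASTER
        FIELDS TERMINATED BY ',' ENCLOSED BY '"'
        LINES TERMINATED BY '\\r\\n'
        IGNORE 1 LINES
    });
    my $load_secs = time - $start;

    # Tag the rows that were just loaded with this batch number.
    $start = time;
    my $rows = $dbh->do(
        q{UPDATE FD_MASTER SET LoadInputFile = ? WHERE LoadInputFile IS NULL},
        undef, $n,
    );
    my $tag_secs = time - $start;

    # Minimal end-of-segment report line.
    printf "File %s: loaded in %.1fs, tagged %d rows in %.1fs\n",
        $n, $load_secs, $rows, $tag_secs;
}

$dbh->disconnect;
```

Because RaiseError is set, the script dies on the first failed load rather than silently tagging the wrong batch, which seems like the safer behavior here.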
After this there is a series of queries I also run on that master table which I would like to add to this process as well, but those can wait.