I have a Python script that processes data files, which I currently run from the Linux command line:
$ /path/process.py archive1.csv archive1 2> error.log
The arguments are the name of the file to process and the subdirectory for the output file; 2> error.log redirects any error messages to a log file.
Each of the 24 archives takes at least 60 seconds to process; I also have 24 data files that take 15 to 30 seconds each. I've tried getting the Python script to run iteratively from my crontab, once for each archive file, but nothing happens. I've also tried putting the command-line calls into a bash script, with similar results, probably because I don't know enough about bash scripting.
The alternative to a bash script would be to change the Python script so that it runs as-is from the crontab, but that looks more difficult, so I thought I'd try this approach first.