First off, I am in no way a Unix/Linux expert, so please bear with me. Also, I sure hope I've selected the proper topics for this question...
I need to run a few thousand curl commands, but I'm having difficulty figuring out how to go about it.
1. Each curl command is a POST (-XPOST) with a JSON body.
For example:
curl -XPOST -H 'name1: value1' -H 'name2: value2' -H "Content-type: application/json" -d '{ "name3": [{"blah1": "blah1","aaa1": "blah2"}]}' 'http://domain.com'
curl -XPOST -H 'name1: value1' -H 'name2: value2' -H "Content-type: application/json" -d '{ "name3": [{"blah2": "blah2","aaa2": "blah2"}]}' 'http://domain.com'
curl -XPOST -H 'name1: value1' -H 'name2: value2' -H "Content-type: application/json" -d '{ "name3": [{"blah3": "blah3","aaa3": "blah2"}]}' 'http://domain.com'
The thing is, I need to run these from a script so that there is a delay between each one. Also, each curl returns a result that I'd like to save.
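(As a side note, I know curl has a -o option to write the response body to a file, so if appending everything to one file is a bad idea, saving each response to its own file would also work for me, e.g. something like this, where response1.json is just a made-up name:)

curl -XPOST -H 'name1: value1' -H 'name2: value2' -H "Content-type: application/json" -d '{ "name3": [{"blah1": "blah1","aaa1": "blah2"}]}' -o response1.json 'http://domain.com'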
I thought I could just load all of the curl commands into a text file and run something like:
while read line; do $line; sleep 5; done < input_curl.txt >> output_curl.txt
The resulting curl response would be something like:
{"name2":"value2","resultn
ame1":resu
ltvalue1,"
resultvalu
e2":"resul
tname2","r
esultname3
": etc etc...
My thought was that I'd be able to run each curl command, line by line, sleep for 5 seconds, and then run the next one, with each curl result written to output_curl.txt. Of course, that doesn't work.
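I suspect the problem is that a bare $line doesn't get re-parsed by the shell, so the single and double quotes in each command get passed to curl as literal characters. Maybe eval is what's needed? A rough sketch of what I mean (assuming every line of input_curl.txt is one complete curl command on a single line):

# rough sketch: assumes each line of input_curl.txt is one complete curl command
while IFS= read -r line; do
    eval "$line"   # eval re-parses the quotes that a bare $line leaves literal
    sleep 5
done < input_curl.txt >> output_curl.txt

But I have no idea whether eval is the right (or safe) way to do this.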
Any help would be greatly appreciated.
Thanks,
Larry