Solved

curl in unix

Posted on 2013-05-24
512 Views
Last Modified: 2013-06-03
I have pasted partial code here; you may notice below that curl is being used to download the file. I am wondering whether I can use curl to download the latest file by date. Please help.

#!/bin/ksh -xve
date

HOME_DIR=$tradehome/bmc_jobs/pace_adjustments
CUR_DATE=`date '+%Y_%m_%d'`
LOG_FILE=$HOME_DIR/logs/pace_adjustments.log
#DOWNLOAD_FILE_NAME=icecleared_gas_$CUR_DATE.dat
DOWNLOAD_FILE_NAME=icecleared_gas_2013_03_14.dat
USER_NAME=Caxton_settles
PASSWORD=icedata
HTTP_URL=https://downloads.theice.com/Settlement_Reports_CSV/Gas/$DOWNLOAD_FILE_NAME
DATA_FILE_DIR=$HOME_DIR/data
TRIES=0
MAIL_LIST=smaddirala@caxton.com
CTL_FILE=$HOME_DIR/ctl/ice_prices.ctl
SQL_LOADER_LOG_FILE=$HOME_DIR/sql_loader_log/ice_prices.log
DATA_FILE=$DATA_FILE_DIR/icecleared_gas.dat
badfname=$HOME_DIR/sql_loader_log/ice_prices.bad
dscfname=$HOME_DIR/sql_loader_log/ice_prices.dsc
logfname=$HOME_DIR/sql_loader_log/ice_prices.log


echo "Down Load File = $DOWNLOAD_FILE_NAME"
echo "URL = $HTTP_URL"


cd $HOME_DIR

export USER_NAME PASSWORD HTTP_URL

#
# Get the file from ICE
if [ -e "$DATA_FILE_DIR/$DOWNLOAD_FILE_NAME" ]
then
   rm "$DATA_FILE_DIR/$DOWNLOAD_FILE_NAME"
fi
sleep 30


until
   [ -f "$DATA_FILE_DIR/$DOWNLOAD_FILE_NAME" ]
do
   sleep 10
   date
   echo "Attempt to download $DOWNLOAD_FILE_NAME from ICE" `date`
        curl -k -u "$USER_NAME:$PASSWORD" "$HTTP_URL" > "$DATA_FILE_DIR/$DOWNLOAD_FILE_NAME"

        CNT=`grep -ic "<title>404 Not Found</title>" "$DATA_FILE_DIR/$DOWNLOAD_FILE_NAME"`

        echo "Not Found Count = $CNT"

        if [ $CNT -gt 0 ]
        then
                echo "Delete the bad file and try again"
                rm "$DATA_FILE_DIR/$DOWNLOAD_FILE_NAME"
                sleep 10

        fi

        TRIES=`expr $TRIES + 1`

        echo "Num of attempts = $TRIES"

        if [ $TRIES -gt 3 ]
        then
            echo "$DOWNLOAD_FILE_NAME download failed after $TRIES attempts. Can't book pace adjustments for Chuck Ames. Pls look into it .." | mail ${MAIL_LIST}
            break
        fi

done
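On the "latest file by date" part: since the file name embeds the date (the commented-out CUR_DATE line above builds exactly that), one sketch is to probe today's name first and step back a day at a time until the server answers 200. This is untested against the real ICE site, assumes GNU date's `-d` for the "N days ago" arithmetic, and uses curl's `-w '%{http_code}'` to check availability without downloading the body:

```shell
#!/bin/sh
# Sketch (untested against the real site): the report name embeds the
# date, so probe today's name and fall back one day at a time.
# curl -s -o /dev/null -w '%{http_code}' fetches only the HTTP status;
# "200" means the dated file exists on the server. GNU date's -d is
# assumed for the date arithmetic (not portable to every UNIX).
name_for() {                      # $1 = date as YYYY_MM_DD
    echo "icecleared_gas_$1.dat"
}

find_latest() {
    for days_back in 0 1 2 3 4 5 6
    do
        d=$(date -d "$days_back days ago" '+%Y_%m_%d')
        f=$(name_for "$d")
        code=$(curl -k -s -o /dev/null -w '%{http_code}' \
               -u "$USER_NAME:$PASSWORD" \
               "https://downloads.theice.com/Settlement_Reports_CSV/Gas/$f")
        [ "$code" = 200 ] && { echo "$f"; return 0; }
    done
    return 1
}

name_for 2013_03_14               # prints icecleared_gas_2013_03_14.dat
```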

Question by:d27m11y
6 Comments
 

Author Comment

by:d27m11y
ID: 39194699
Does this work?

curl -z "Jun 30 2011" http://yoursite.com/file.html
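For the record, `-z` doesn't pick the newest file in a directory; it makes the transfer conditional. curl sends an If-Modified-Since header, and the server returns the body only if the resource changed after the given date, or after the mtime of a local file if you pass a filename instead of a date. A minimal sketch (the DRY_RUN switch and the placeholder password are mine, added so the command can be inspected without hitting the network):

```shell
#!/bin/sh
# Sketch: re-download a file only when the server copy is newer.
# -z <file> sends If-Modified-Since based on <file>'s mtime;
# -R stamps the saved copy with the server's Last-Modified time.
# DRY_RUN=1 prints the command instead of running it.
fetch_if_newer() {
    url=$1 out=$2
    if [ -f "$out" ]
    then
        set -- curl -k -R -u "$USER_NAME:$PASSWORD" -z "$out" -o "$out" "$url"
    else
        set -- curl -k -R -u "$USER_NAME:$PASSWORD" -o "$out" "$url"
    fi
    if [ "${DRY_RUN:-0}" = 1 ]; then echo "$@"; else "$@"; fi
}

USER_NAME=Caxton_settles PASSWORD=secret DRY_RUN=1   # placeholder password
fetch_if_newer "https://downloads.theice.com/Settlement_Reports_CSV/Gas/icecleared_gas.dat" icecleared_gas.dat
```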

LVL 47

Assisted Solution

by:David
David earned 500 total points
ID: 39196039
curl?  rsync is a MUCH better tool for the job. It can not only restart an interrupted transfer, but can also compress, sync only changed files, and use wildcards, all from the command line. rsync is on every O/S that supports curl, even Windows. (Yes, you can pass logon credentials as part of the command, too.)
LVL 22

Expert Comment

by:blu
ID: 39196562
Rsync is great unless you don't have access to the server except via http. Is that the case here?

I am not exactly sure what you are asking. Could you elaborate? Are you asking to have curl figure out the latest file in a directory? If so, can you give a typical directory listing and what you want to happen?

Author Comment

by:d27m11y
ID: 39207962
Let us say I have a directory

export/home/opdu/ods/data/csv

In this directory, I have files listed as the following

missing_trades_0523.csv
missing_trades_0524.csv
missing_trades_0525.csv

How do I get the latest file based on its name?

I would like to pass the latest .csv file as a parameter to my shell script. How do I do that? Please suggest.

Quick response is appreciated.
LVL 47

Accepted Solution

by:David
David earned 500 total points
ID: 39208340
You only have one version of any particular file name, as UNIX doesn't have revisioning.  Seems to me the best and easiest solution is to modify the program that generates these files so that, after it creates the new missing_trades_$NNNN.csv file, it runs
cp -f missing_trades_$NNNN.csv missing_trades_current.csv  

Then you just use whatever tool you want to always copy missing_trades_current.csv

Think outside of the box and make life easy for yourself and redo the logic to get rid of the problem altogether.

Otherwise you'd have to write a script that does ls -1 $directory/missing_trades* | sort -g -r | head -1

(Not all flavors of UNIX use -g -r to get a numeric sort (-g) and a reverse sort (-r).)
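That ls | sort | head pipeline, filled out into a complete snippet: because the MMDD part of the names is zero-padded, a plain reverse lexical sort already orders them newest-first, so the numeric -g isn't strictly needed here. The temp directory stands in for the real csv directory, and my_loader.sh is a hypothetical consumer:

```shell
#!/bin/sh
# Pick the newest missing_trades_*.csv by name. Zero-padded MMDD means
# lexical order == date order, so 'sort -r | head -1' finds the latest.
DIR=$(mktemp -d)      # stand-in for the real csv directory
touch "$DIR/missing_trades_0523.csv" \
      "$DIR/missing_trades_0524.csv" \
      "$DIR/missing_trades_0525.csv"

LATEST=$(ls "$DIR"/missing_trades_*.csv | sort -r | head -1)
echo "$LATEST"        # path ending in missing_trades_0525.csv

# Hand it to another script as a positional parameter:
# my_loader.sh "$LATEST"
```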

Author Closing Comment

by:d27m11y
ID: 39216188
Useful!
