Solved

Get files between two dates, Linux RedHat

Posted on 2014-11-26
338 Views
Last Modified: 2014-11-27
Hi Experts,
I would like to write a script to copy (scp) all Transfer_data.* files from a path that were generated between the current time (date) and one minute back.
Example:
hostname:/path/> date
Wed Nov 26 16:53:00 ART 2014
hostname:/path/> ls -t1 Transfer_data.*
Transfer_data.20141126165232
Transfer_data.20141126165203
Transfer_data.20141119165103
Transfer_data.20141119165003
Transfer_data.20141119164909
Transfer_data.20141119164801


I wish to copy only:
Transfer_data.20141126165232
Transfer_data.20141126165203


Then, one minute later, I would rerun the script and copy only the files generated during that minute.
Example2:
hostname:/path/> date
Wed Nov 26 16:54:00 ART 2014
hostname:/path/> ls -t1 Transfer_data.*
Transfer_data.20141126165332
Transfer_data.20141126165303
Transfer_data.20141126165232
Transfer_data.20141126165203


I wish to copy only:
Transfer_data.20141126165332
Transfer_data.20141126165303


I know how to use cron, but I have trouble generating the time ranges in the script.
I tried to start with:
#!/bin/ksh
touch -t date --date "1 minute ago" start
touch -t date end
find . \( -newer start -a \! -newer end \) -exec ls -l {} \;


Do you have any examples I could use? I would appreciate any ideas.
Thank you very much,
Regards
Question by:carlino70
7 Comments
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 40467681
Wouldn't it be easier to remove the already scp'ed files from the relevant directory so you can transfer all new files at each run?
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 40467695
To list the files whose names correspond to the previous minute you can use this:

ls -t1 Transfer_data.$(date -d "1 minute ago" "+%Y%m%d%H%M")*

The above works only with GNU date (standard on Linux).

Please be aware that percent signs (%) used in a crontab entry (not in scripts called by cron!) must be escaped with a backslash (\).
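For instance, a crontab entry using this pattern might look like the following sketch (the working directory and log file are hypothetical; note the escaped percent signs):

```shell
# hypothetical crontab entry: every minute, list the previous minute's files
* * * * * cd /path && ls -t1 Transfer_data.$(date -d "1 minute ago" "+\%Y\%m\%d\%H\%M")* >> /tmp/transfer.log 2>&1
```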
 
LVL 68

Accepted Solution

by:
woolmilkporc earned 250 total points
ID: 40467733
Again, your own script is not wrong at all; it just needs some corrections:

#!/bin/ksh
touch -t $(date --date "1 minute ago" "+%Y%m%d%H%M.00") start
# 1 minute backward only:
touch -t $(date --date "1 minute ago" "+%Y%m%d%H%M.59") end
# Current minute and 1 minute backward:
# touch -t $(date "+%Y%m%d%H%M.59") end
find . -name "Transfer_data*" -type f \( -newer start -a \! -newer end \) -print |sort -r

"ls -l" will give way too much info, so I used "-print" and "sort -r" to get the (by name) newest file on top.
 
LVL 40

Assisted Solution

by:omarfarid
omarfarid earned 250 total points
ID: 40468313
I would run something like:

mystart=start$$
myend=end$$
touch $mystart
sleep 60
touch $myend
find . -name "Transfer_data*" -type f \( -newer $mystart -a \! -newer $myend \) -print |sort -r

This should give the files that were created during the last minute.
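As a variation on this marker-file idea (my own assumption, not part of the comment above): instead of sleeping for 60 seconds, a persistent marker file can record where the previous run stopped, so each cron invocation picks up exactly the files that appeared since the last one. The marker path is a hypothetical choice:

```shell
#!/bin/ksh
# Sketch: persistent last-run marker instead of sleep 60.
marker=/var/tmp/transfer.lastrun
[ -f "$marker" ] || touch -t 197001010000 "$marker"   # first run: take everything

now=now.$$
touch "$now"
find . -name "Transfer_data*" -type f \( -newer "$marker" -a \! -newer "$now" \) -print | sort -r
mv "$now" "$marker"   # the next run starts where this one ended
```

This avoids blocking for a minute per run and loses nothing if a run is delayed.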
 

Author Comment

by:carlino70
ID: 40468712
It would be easier, but I cannot delete files from the original directory. Moving them to a temporary folder and then transferring them would be an option, but it would add time to the process.
 

Author Comment

by:carlino70
ID: 40468852
omarfarid, woolmilkporc:

I'll test both ideas.

Thank you
 

Author Closing Comment

by:carlino70
ID: 40469260
Thanks, both solutions work
