• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 409
  • Last Modified:

Possible to Error Check 'Find -exec cp' command?

Is it possible to error check the results of the -exec option when used with the find command?

For example, running:
find -mmin -10 -exec cp {} /somefolder \;

in a folder that has multiple files modified within the last 10 minutes, where '/somefolder' does not exist, errors out for each file found with "No such file or directory".

If I error check with echo $? I receive 0, which is due to the find command itself being successful. The -exec command, however, is not, and that is what I would like to error check for.

Any way of doing this without offloading to another script?
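As a side note: with the \; form of -exec, a failing command does not change find's exit status, but GNU find's + form does propagate a non-zero exit status from the command it runs. A rough sketch, assuming GNU find and GNU cp's -t option:

# with '-exec ... +', a failing cp makes find itself exit non-zero
find . -mmin -10 -exec cp -t /somefolder {} +
if [ "$?" != "0" ]
then
    echo "copy failed, not all files were copied" >&2
fi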
0
jwmcf1 Asked:
1 Solution
 
farzanjCommented:
Why don't you do something like

[[ -d /somefolder ]] && find -mmin -10 -exec cp {} /somefolder \;

This would make sure that find is only executed when the folder exists.
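If you also want a message when the folder is missing, the same guard can be written out as an if/else (a minimal sketch using the same paths):

if [[ -d /somefolder ]]
then
    find -mmin -10 -exec cp {} /somefolder \;
else
    echo "/somefolder does not exist" >&2
fi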
0
 
jwmcf1Author Commented:
Thank you for the recommendation.  However, this line of code is just a small part of a script.

Really what I am attempting to do is prevent the deletion of a file if the copy of a file errors out for any reason.

Basically the script is something like:

#first find the newest files and then copy them
find -mtime -2 -exec cp {} /somefolder \;
if [ "$?" = "0" ]
then 
#delete the current oldest file which in this case is +30 days
find -mtime +30 -exec rm {} \;
else
echo "error during copy of file.  stop script and prevent any deletions until fixed blah bl.."
fi

The ultimate goal is to maintain the most recent 30 days' worth of files, copy these files to another location for backup, and after the backup delete the oldest file. But before I delete anything I want to make sure the copy has actually happened.
0
 
farzanjCommented:
From your script, your find command looks wrong because you need to mention a path as well. I suppose the path is the current directory, so I will put a dot; you can put any path instead of the dot.
Try this:

files="$(find . -mtime -2)"
for file in "$files"
do
    cp $file /somefolder
    if [[ $? == 0 ]]
    then
           ... other find command
    fi
done
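A rough alternative, in case any filenames contain spaces, would be a null-delimited loop (assuming GNU find and bash, with /somefolder as above):

# copy files modified in the last 2 days; skip the cleanup if any copy fails
copy_ok=1
while IFS= read -r -d '' file
do
    cp "$file" /somefolder || copy_ok=0
done < <(find . -mtime -2 -type f -print0)

if [[ $copy_ok -eq 1 ]]
then
    find . -mtime +30 -type f -exec rm {} \;
else
    echo "error during copy; skipping deletions" >&2
fi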
0
 
farzanjCommented:
Your logic is still a little fuzzy.  Which file to check and which one to delete?  Anything in the name, or how would I tell?
0
 
jwmcf1Author Commented:
I will try testing using a for loop as you have shown.

As far as the logic goes: I copy the most recent file, the cp command is checked for errors, and if the copy succeeds, the file older than 30 days is deleted.

One file copied, one file deleted.  The idea is to always have the last 30 days' worth of files in the local path while also having an archive (the daily copy) at another location (a remote server).
0
 
farzanjCommented:
I understand that, but is the criterion only the timestamp?  Are there categories of files, like abc_old and abc_new_date and xyz_old and xyz_new_date, where you have to get the latest in each category?  That would have been more realistic.

If this is the only criterion, then you can use:

ls -ltr | tail -1 | awk '{print $NF}'

This gets you the latest file, and if you remove the r option from ls, you get the oldest.
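Parsing ls output can be fragile; an alternative sketch that avoids it, assuming GNU find (use head -1 instead of tail -1 for the oldest):

# newest regular file in the current directory, by modification time
find . -maxdepth 1 -type f -printf '%T@ %p\n' | sort -n | tail -1 | cut -d' ' -f2-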
0
 
jwmcf1Author Commented:
The for loop works to accomplish this task.  Thank you for the help.

This code does what I am looking for:
#!/bin/bash

logfile=/usr/local/bin/exportlog.txt
addresses=somebody@email.com
todaysdate=$(date +"%Y-%m")

echo $todaysdate
date > $logfile
echo " " >> $logfile 2>&1

umount /mnt/backup >> $logfile 2>&1
mount -t cifs -o username=someuser,password=somepass //backupserver/E /mnt/backup >> $logfile 2>&1
mkdir /mnt/backup/exporttest/$(hostname)/$todaysdate >> $logfile 2>&1

echo $(date)
echo "Starting Export Backup" >> $logfile 2>&1
echo " " >> $logfile 2>&1

echo "Copying..." >> $logfile 2>&1
cd /data/
files="$(find -mtime -1)"
for file in "$files"
do
   cp --parents --preserve -a $file /mnt/backup/exporttest/$(hostname)/$todaysdate >> $logfile 2>&1
   if [[ $? == 0 ]]
   then
    find . -mtime +30 -exec rm {} \; >> $logfile 2>&1
   else
    cat /usr/local/bin/exportlog.txt | mail -s "EXPORT COPY ERROR" $addresses
   fi
done

cd /
sleep 1m
umount /mnt/backup

echo "Finished Exports Backup."
echo $(date)

0
