kenfcamp
asked on
Shell script to delete files older than 4 days
I've got a cron job that runs 4 times a day and uses a simple shell script to perform a mysqldump, then tars the file (DB_BACKUP-2017-11-18.55.25.tgz) and places it in a directory.
I'm trying to figure out a way to have the script delete backups which are older than 4 days.
Ideally, I'd like to be left with 16 files (4 are created per day) at any given time.
Any suggestions?
Thank you for looking :)
Ken
#!/bin/bash
backup_file="/share/CACHEDEV1_DATA/Web/SQL/vcal.sql"
backup_dir="/share/CACHEDEV1_DATA/Web/SQL/SQL_Backup"
tar_name=$(date +%Y-%m-%d.%M.%S)
# Dump the MySQL database to the backup file
/usr/local/mysql/bin/mysqldump --user="user" --opt vcal > "$backup_file"
# Compress the dump into a date/time-stamped tarball
/bin/tar -czPf "$backup_dir/DB_BACKUP-$tar_name.tgz" "$backup_file"
exit 0
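A retention step could be appended to the script itself. The sketch below is one way to do it, assuming a find that supports the -delete primary (GNU find does); the function name is made up for illustration:

```shell
# Sketch only: prune_backups DIR DAYS deletes DB_BACKUP-*.tgz files in DIR
# whose modification time is older than DAYS days.
# Assumes a find with the -delete primary (GNU find).
prune_backups() {
  find "$1" -name 'DB_BACKUP-*.tgz' -mtime +"$2" -delete
}
```

Appending something like `prune_backups "$backup_dir" 4` after the tar line would keep roughly four days of archives.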
ASKER
rm `ls -td /path/to/test/folder/*.tgz | awk 'NR>15'` seems to do what I want, though I'm not sure I'm willing to trust it won't go sideways on me :\
ASKER
Though rm `ls -td /path/to/test/folder/DB_BACKUP-*.tgz | awk 'NR>15'` does make me feel a little better about it
Thoughts?
Try
find . -mtime +4
in the example you were provided.
Double-check that -delete is an option of your find; otherwise you would need to use -exec rm {} \; when ready to implement the removal.
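For a find that lacks -delete, the -exec form does the same job. A hedged sketch (the function name and glob are illustrative, not from the thread):

```shell
# prune_exec DIR: remove *.tgz files older than 4 days using the portable
# -exec rm form, for finds that lack the -delete primary.
prune_exec() {
  find "$1" -name '*.tgz' -mtime +4 -exec rm -f {} \;
}
```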
find . -mtime +4
In the example you were provided.
Double check that -delete is an option of your find, or you would need to use -exec rm()\; when ready to implement the removal.
ASKER
@arnold
Yeah, I thought of that; no difference, and to be honest I gave up on using find.
I had instead replaced it with rm `ls -td /path/to/test/folder/DB_BACKUP-*.tgz | awk 'NR>15'`, which seems to work flawlessly.
[-- STARTED WITH --]
DB_Backup-2017-11-10-10.00.02.tgz
DB_Backup-2017-11-10-15.00.01.tgz
DB_Backup-2017-11-10-20.00.01.tgz
DB_Backup-2017-11-11-10.00.02.tgz
DB_Backup-2017-11-11-15.00.02.tgz
DB_Backup-2017-11-11-20.00.01.tgz
DB_Backup-2017-11-12-10.00.01.tgz
DB_Backup-2017-11-12-15.00.02.tgz
DB_Backup-2017-11-12-20.00.01.tgz
DB_Backup-2017-11-13-10.00.02.tgz
DB_Backup-2017-11-13-15.00.01.tgz
DB_Backup-2017-11-13-20.00.01.tgz
DB_Backup-2017-11-14-10.00.01.tgz
DB_Backup-2017-11-14-15.00.01.tgz
DB_Backup-2017-11-14-20.00.01.tgz
DB_Backup-2017-11-15-10.00.01.tgz
DB_Backup-2017-11-15-15.00.01.tgz
DB_Backup-2017-11-15-20.00.01.tgz
DB_Backup-2017-11-16-10.00.01.tgz
DB_Backup-2017-11-16-15.00.01.tgz
DB_Backup-2017-11-16-20.00.01.tgz
DB_Backup-2017-11-17-10.00.02.tgz
DB_Backup-2017-11-17-15.00.02.tgz
DB_Backup-2017-11-17-20.00.02.tgz
DB_Backup-2017-11-18-10.00.01.tgz
[-- CURRENT --]
DB_Backup-2017-11-14-15.00.01.tgz
DB_Backup-2017-11-14-20.00.01.tgz
DB_Backup-2017-11-15-10.00.01.tgz
DB_Backup-2017-11-15-15.00.01.tgz
DB_Backup-2017-11-15-20.00.01.tgz
DB_Backup-2017-11-16-10.00.01.tgz
DB_Backup-2017-11-16-15.00.01.tgz
DB_Backup-2017-11-16-20.00.01.tgz
DB_Backup-2017-11-17-10.00.02.tgz
DB_Backup-2017-11-17-15.00.01.tgz
DB_Backup-2017-11-17-20.00.01.tgz
DB_Backup-2017-11-18-10.00.01.tgz
DB_Backup-2017-11-18-15.00.01.tgz
DB_Backup-2017-11-18-20.00.01.tgz
DB_Backup-2017-11-19-10.00.02.tgz
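As a sketch, the keep-newest-N idea in that one-liner can be wrapped so the count is adjustable (the function name is made up; relying on `ls -t` output is safe here only because these backup names contain no whitespace or newlines):

```shell
# keep_newest DIR N: delete all but the N newest DB_Backup-*.tgz files in DIR.
# ls -t sorts newest first; awk passes through everything after the Nth line.
keep_newest() {
  dir=$1 n=$2
  ls -t "$dir"/DB_Backup-*.tgz 2>/dev/null |
    awk -v n="$n" 'NR > n' |
    while IFS= read -r f; do rm -f -- "$f"; done
}
```

With 3 backups a day, `keep_newest /path/to/archives 15` keeps 5 days' worth.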
You are relying on timestamps created by a process.
I might be misunderstanding what you are looking for; could you list the directory where these files are?
ls -l
Tgz points to a tar-compressed archive; might your process be adding files to an existing archive?
ASKER
Arnold,
Though I'm still testing my current solution to the issue, here's what you asked for.
The script in question is run via cron (currently 3x a day) and is located in a directory called "backups".
The script generates multiple dumps of various MySQL databases into the directory, then creates a date/time-stamped tarball in a directory named "archives" within the same directory.
Obviously the archives accumulate over time, and though cleaning them out manually is more of a nuisance than anything else, the added line of code is meant to maintain a specific number of archives without my needing to intervene.
Let me know if you have any other questions
Ken
One option could be to have your script delete the oldest backup before creating the new one.
Though it would not go by date.
Your script does not test whether there were issues; it assumes all went well.
echo $(ls -t | tail -1)
This will remove the oldest once echo is replaced with rm.
Every time the script runs, it will add one file while removing the oldest.
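That suggestion, with echo swapped for rm and quoting added, might look like this (the function name is hypothetical):

```shell
# delete_oldest DIR: remove the single oldest file in DIR.
# ls -t sorts newest first, so tail -1 yields the oldest name.
delete_oldest() {
  oldest=$(ls -t "$1" | tail -1)
  [ -n "$oldest" ] && rm -f -- "$1/$oldest"
}
```

Changing tail -1 to tail -n 2 would drop the two oldest files per run instead.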
ASKER
@arnold,
That doesn't come close to what I need to do, which is more than deleting the one oldest file.
Currently 3 backups (tgz files) are created per day, and with the method currently being tested I'm keeping 5 days' worth, meaning at any given time there are 15 archives (tgz files) in the archive directory.
I also need the flexibility to adjust the number of files being saved, which again the current method being tested allows for.
Since the line of code I'm testing is working, I think I'm going to close this out.
ASKER
Thanks for the help David!
While your solution didn't work exactly the way I needed, it did point me in the direction I'm currently testing.
Since I hadn't (and likely wouldn't have) thought of this method until your post, you get the points :)
Thanks again,
Ken
The example deletes one for every new one created.
You can alter the tail option to delete more in one shot: change the -1 to -2 to grab the two oldest files every time your backup script runs.
ASKER
Using a test folder with copied backups, it finds three files:
DB_Backup-2017-11-13-15.00
DB_Backup-2017-11-13-20.00
DB_Backup-2017-11-14-10.00
But ignores:
DB_Backup-2017-11-10-10.00
DB_Backup-2017-11-10-15.00
DB_Backup-2017-11-10-20.00
DB_Backup-2017-11-11-10.00
DB_Backup-2017-11-11-15.00
DB_Backup-2017-11-11-20.00
DB_Backup-2017-11-12-10.00
DB_Backup-2017-11-12-15.00
DB_Backup-2017-11-12-20.00
DB_Backup-2017-11-13-10.00
DB_Backup-2017-11-14-15.00
DB_Backup-2017-11-14-20.00
-- 4 day stop --
DB_Backup-2017-11-15-10.00
DB_Backup-2017-11-15-15.00
DB_Backup-2017-11-15-20.00
DB_Backup-2017-11-16-10.00
DB_Backup-2017-11-16-15.00
DB_Backup-2017-11-16-20.00
DB_Backup-2017-11-17-10.00
DB_Backup-2017-11-17-15.00
DB_Backup-2017-11-17-20.00
DB_Backup-2017-11-18-10.00