Shell script to delete files older than 4 days

I've got a cron job that runs 4 times a day and uses a simple shell script to perform a mysqldump, then tar the dump file (e.g. DB_BACKUP-2017-11-18.55.25.tgz) and place it in a directory.

#!/bin/bash

backup_file="/share/CACHEDEV1_DATA/Web/SQL/vcal.sql"
backup_dir="/share/CACHEDEV1_DATA/Web/SQL/SQL_Backup"
tar_name=$(date +%Y-%m-%d.%M.%S)

# Dump the MySQL database to the backup file
/usr/local/mysql/bin/mysqldump --user="user" --opt vcal > "$backup_file"

# Compress the dump into a date-stamped tarball
/bin/tar -czPf "$backup_dir/DB_BACKUP-$tar_name.tgz" "$backup_file"

exit 0



I'm trying to figure out a way to have the script delete backups that are older than 4 days.
Ideally, I'd like to be left with 16 files (4 are created per day) at any given time.

Any suggestions?

Thank you for looking :)

Ken
kenfcamp Asked:

David Favor (Linux/LXD/WordPress/Hosting Savant) Commented:
Try this command...

find  $backup_dir -name "*.tgz" -mtime 4



Once you're sure you have the correct files, you can add the -delete operator, as in...

find  $backup_dir -name "*.tgz" -mtime 4 -delete



You may think it's overkill to include a -name option. I'm fairly paranoid... or cautious. My preference is to ensure I'm only matching the correct file names, as a mistake with find $dir -delete can be disastrous.
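A minimal sketch of how this could be appended to the end of the backup script above, using the variables the script already defines (note the +4: as discussed further down the thread, -mtime 4 matches only files around exactly 4 days old, while -mtime +4 matches everything older):

# Dry run: list backups matching the name pattern that are older than 4 days
find "$backup_dir" -name "DB_BACKUP-*.tgz" -mtime +4 -print

# Once the listed files look right, switch -print to -delete:
# find "$backup_dir" -name "DB_BACKUP-*.tgz" -mtime +4 -delete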

kenfcamp (Author) Commented:
It works (sort of).

Using a test folder with copied backups, it finds three files:
DB_Backup-2017-11-13-15.00.01.tgz
DB_Backup-2017-11-13-20.00.01.tgz
DB_Backup-2017-11-14-10.00.01.tgz

But ignores:
DB_Backup-2017-11-10-10.00.02.tgz  
DB_Backup-2017-11-10-15.00.01.tgz  
DB_Backup-2017-11-10-20.00.01.tgz  
DB_Backup-2017-11-11-10.00.02.tgz  
DB_Backup-2017-11-11-15.00.02.tgz  
DB_Backup-2017-11-11-20.00.01.tgz  
DB_Backup-2017-11-12-10.00.01.tgz  
DB_Backup-2017-11-12-15.00.02.tgz  
DB_Backup-2017-11-12-20.00.01.tgz  

DB_Backup-2017-11-13-10.00.02.tgz  

DB_Backup-2017-11-14-15.00.01.tgz  
DB_Backup-2017-11-14-20.00.01.tgz  

-- 4 day stop --
DB_Backup-2017-11-15-10.00.01.tgz  
DB_Backup-2017-11-15-15.00.01.tgz  
DB_Backup-2017-11-15-20.00.01.tgz  
DB_Backup-2017-11-16-10.00.01.tgz  
DB_Backup-2017-11-16-15.00.01.tgz  
DB_Backup-2017-11-16-20.00.01.tgz  
DB_Backup-2017-11-17-10.00.02.tgz  
DB_Backup-2017-11-17-15.00.02.tgz  
DB_Backup-2017-11-17-20.00.02.tgz  
DB_Backup-2017-11-18-10.00.01.tgz
kenfcamp (Author) Commented:
rm `ls -td /path/to/test/folder/*.tgz | awk 'NR>15'` seems to do what I want, though I'm not sure I'm willing to trust it won't go sideways on me :\

kenfcamp (Author) Commented:
Though rm `ls -td /path/to/test/folder/DB_BACKUP-*.tgz | awk 'NR>15'` does make me feel a little better about it

Thoughts?
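A slightly more defensive sketch of the same ls/awk idea (paths are hypothetical; NR>15 keeps the 15 newest files, and globs are case-sensitive, so the pattern must match the actual file-name case):

# Keep the 15 newest DB_BACKUP tarballs, remove the rest
# (assumes the date-stamped names contain no spaces or newlines)
ls -td /path/to/test/folder/DB_BACKUP-*.tgz | awk 'NR>15' | while read -r f; do
    rm -- "$f"    # -- guards against names beginning with a dash
done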
arnold Commented:
Try

find . -mtime +4

(note the +) in the example you were provided.

Double-check that -delete is supported by your version of find; otherwise you will need to use -exec rm {} \; when you're ready to implement the removal.
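A sketch of the corrected command with the -exec fallback spelled out (same assumed directory and name pattern as above; -mtime +4 matches files whose age, in whole days, is greater than 4):

# List first; delete only once the output looks right
find "$backup_dir" -name "DB_BACKUP-*.tgz" -mtime +4 -print

# If this build of find lacks -delete, use -exec instead:
find "$backup_dir" -name "DB_BACKUP-*.tgz" -mtime +4 -exec rm {} \;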
kenfcamp (Author) Commented:
@arnold

Yeah, I thought of that; it made no difference, and to be honest I gave up on using find.

I had, on the other hand, replaced it with rm `ls -td /path/to/test/folder/DB_BACKUP-*.tgz | awk 'NR>15'`, which seems to work flawlessly.

[-- STARTED WITH --]
DB_Backup-2017-11-10-10.00.02.tgz  
DB_Backup-2017-11-10-15.00.01.tgz  
DB_Backup-2017-11-10-20.00.01.tgz  
DB_Backup-2017-11-11-10.00.02.tgz  
DB_Backup-2017-11-11-15.00.02.tgz  
DB_Backup-2017-11-11-20.00.01.tgz  
DB_Backup-2017-11-12-10.00.01.tgz  
DB_Backup-2017-11-12-15.00.02.tgz  
DB_Backup-2017-11-12-20.00.01.tgz  
DB_Backup-2017-11-13-10.00.02.tgz  
DB_Backup-2017-11-13-15.00.01.tgz
DB_Backup-2017-11-13-20.00.01.tgz
DB_Backup-2017-11-14-10.00.01.tgz
DB_Backup-2017-11-14-15.00.01.tgz  
DB_Backup-2017-11-14-20.00.01.tgz  
DB_Backup-2017-11-15-10.00.01.tgz  
DB_Backup-2017-11-15-15.00.01.tgz  
DB_Backup-2017-11-15-20.00.01.tgz  
DB_Backup-2017-11-16-10.00.01.tgz  
DB_Backup-2017-11-16-15.00.01.tgz  
DB_Backup-2017-11-16-20.00.01.tgz  
DB_Backup-2017-11-17-10.00.02.tgz  
DB_Backup-2017-11-17-15.00.02.tgz  
DB_Backup-2017-11-17-20.00.02.tgz  
DB_Backup-2017-11-18-10.00.01.tgz

[-- CURRENT --]
DB_Backup-2017-11-14-15.00.01.tgz
DB_Backup-2017-11-14-20.00.01.tgz
DB_Backup-2017-11-15-10.00.01.tgz
DB_Backup-2017-11-15-15.00.01.tgz
DB_Backup-2017-11-15-20.00.01.tgz
DB_Backup-2017-11-16-10.00.01.tgz
DB_Backup-2017-11-16-15.00.01.tgz
DB_Backup-2017-11-16-20.00.01.tgz
DB_Backup-2017-11-17-10.00.02.tgz
DB_Backup-2017-11-17-15.00.01.tgz
DB_Backup-2017-11-17-20.00.01.tgz
DB_Backup-2017-11-18-10.00.01.tgz
DB_Backup-2017-11-18-15.00.01.tgz
DB_Backup-2017-11-18-20.00.01.tgz
DB_Backup-2017-11-19-10.00.02.tgz
arnold Commented:
You are relying on timestamps created by a process.
I might be misunderstanding what you are looking for; could you list the directory where these files are (ls -l)?
A .tgz is a compressed tar archive; might your process be adding files to an existing archive?
kenfcamp (Author) Commented:
Arnold,

Though I'm still testing my current solution to the issue, here's what you asked for.

The script in question is run via cron (currently 3x a day) and is located in a directory called "backups".

The script generates multiple dumps of various MySQL databases into that directory, then creates a date/time-stamped tarball in a directory named "archives" within the same directory.

Obviously the archives accumulate over time, and though cleaning them out manually is more of a nuisance than anything else, the added line of code is meant to maintain a specific number of archives without my needing to intervene.

Let me know if you have any other questions

Ken
arnold Commented:
One option would be to have your script delete the oldest backup before creating the new one, though it would not take the date into account.

Your script also does not test whether anything went wrong; it assumes all went well.

echo $(ls -t | tail -1)

This will remove the oldest file once echo is replaced with rm.
Every time the script runs, it will add one file while removing the oldest.
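A sketch of that rotation idea (assumes the script cd's into the archive directory first, and that echo is swapped for rm only after a dry run):

cd "$backup_dir" || exit 1
# ls -t sorts newest first, so tail -1 is the oldest archive
oldest=$(ls -t DB_BACKUP-*.tgz | tail -1)
echo "$oldest"    # dry run; replace echo with rm -- once verified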
kenfcamp (Author) Commented:
@arnold,

That doesn't come close to what I need to be able to do, which is more than deleting the single oldest file.

Currently 3 backups (tgz files) are created per day, and with the method currently being tested I'm keeping 5 days' worth, meaning at any given time there are 15 archive (tgz) files in the archive directory.

I also need the flexibility to adjust the number of files being saved, which again the current method being tested allows for.

Since the line of code I'm testing is working, I think I'm going to close this out.
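A parameterized sketch of the method being tested (hypothetical path; keep = days retained × backups per day, e.g. 5 × 3 = 15):

keep=15    # number of archives to retain; adjust as needed
ls -td /path/to/archives/DB_BACKUP-*.tgz | awk -v n="$keep" 'NR>n' | while read -r f; do
    rm -- "$f"
done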
kenfcamp (Author) Commented:
Thanks for the help David!

While your solution didn't exactly work the way I needed, it did point me in the direction I'm currently testing.

Since I didn't (and likely wouldn't have) think of this method until your post, you get the points :)

Thanks again,

Ken
arnold Commented:
The example deletes one file for every new one created.
You can alter the tail option to delete more in one shot: change -1 to -2 to grab the two oldest files every time your backup script runs.