Compressing Archived Apache Log Files by Date

I wasn't sure where to ask this question, since it involves basic Solaris compression, but here goes:

I am running Red Hat Stronghold 4 (Apache) on Sun Solaris 9 servers. I use the rotatelogs command inside the CustomLog directive to rotate the httpd access logs daily, and this works as it should.

Now the question:
How can I write a script that will jar, tar, or zip these archived files by date? (It will be set up as a cron job that runs daily.) In other words, how do I take the individual files that are, say, over 10 days old, zip each one separately into its own zip file, and move them to a different archive area?

yuzh commented:
Use gzip; it gives better compression than the stock compress. It is important to specify the file type (-type f) with the find command:

find /path-to/log -type f -mtime +10 -exec gzip {} \;

or, using compress:

find /path-to/log -type f -mtime +10 -exec compress {} \;

To move the files to a different dir:

For gzip
cd /path-to/log
mv *.gz /another-dir

or (for compress):
mv *.Z /another-dir

You can put the above commands in your script.
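
For a complete daily job, the pieces above can be combined into one small Bourne shell script. This is a minimal sketch, assuming the gzip route and reusing the /path-to/log and /another-dir locations from the commands above (adjust both to your layout):

#!/bin/sh
# Compress access logs older than 10 days, then move the
# compressed files to a separate archive area.

LOGDIR=/path-to/log      # directory rotatelogs writes to
ARCHIVE=/another-dir     # archive destination

# gzip each old log file in place (one .gz per file)
find "$LOGDIR" -type f -mtime +10 -exec gzip {} \;

# move the resulting .gz files out of the live log directory
for f in "$LOGDIR"/*.gz
do
    [ -f "$f" ] && mv "$f" "$ARCHIVE"/
done

A crontab entry like the following (the script path here is just an example) would run it once a day at 02:00:

0 2 * * * /usr/local/bin/archive_httpd_logs.sh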
boucherc (Author) commented:
Just a comment: I came across this command:

find /path/to/log -mtime +10 -exec zip {} \;

That finds any files in the /path/to/log directory that haven't been modified for 10 days and zips them.
Will that do?
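
One caveat on the zip version: unlike gzip, zip needs an archive name as its first argument, so "-exec zip {} \;" on its own will not actually create archives. Here is a sketch of a working variant, assuming Info-ZIP's zip is installed (the sh wrapper is needed because Solaris find does not expand {} when it is embedded in a longer word):

# create one .zip per old log; -m deletes the original after archiving
find /path/to/log -type f -mtime +10 -exec sh -c 'zip -m "$1.zip" "$1"' sh {} \;

Still, the gzip approach above is the simpler choice on Solaris.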
boucherc (Author) commented:
Thanks for your help. The answer from yuzh was just what I needed. I completely forgot about executing commands inside a find command. I'm trying to get out of my Microsoft habits; it'll take time.