Script to automatically archive logs to a folder every week

Hi, I am looking for a script to automatically archive specific weekly logs from one folder to another every week.

Williams225 (System Administrator) asked:
Tomunique commented:
Ok, here's the script.
There's a new staging directory you need to create locally.
   If the remote system is down, they'll queue up here.

the local directory is

This directory should normally be empty; if it isn't, the next run of the script will push anything that didn't make a previous run.

Remember to make the changes to the script per my previous post to set the remote userid appropriately.
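The attached script itself isn't reproduced in this thread; as a rough sketch of the staging idea it describes (the push_staged name, paths, and host below are illustrative placeholders, not the actual script):

```shell
#!/bin/sh
# Sketch of the staging push described above: archives queue up in a
# local staging directory, and each run pushes whatever is waiting.
# A file is only removed from staging after scp reports success, so
# if the remote system is down the files simply wait for the next run.

push_staged() {
    stagedir=$1        # local staging directory (normally empty)
    remote=$2          # e.g. arc_id@arc-host
    remotedir=$3       # target directory on the remote system

    for f in "$stagedir"/*; do
        [ -f "$f" ] || continue
        # scp exits non-zero on failure, so the local copy is kept
        # for retry unless the transfer actually succeeded
        if scp -p "$f" "$remote:$remotedir/"; then
            rm -f "$f"
        fi
    done
}

# example (placeholder names):
# push_staged /opt/fdamo/Archive/staging arc_id@arc-host /opt/fdamo/Archive/LogArchive
```

On a successful run the staging directory empties; after a failed run, the leftovers are retried automatically the next time.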

You can use tar to back up the logs, and this can be scheduled to run weekly via crontab.

1- Create the script with a name, e.g. myscript, which contains:

/usr/bin/tar -cf /path/to/myarcdir/mybackup.`date "+%Y%m%d"`.tar /path/to/mylogs

2- Make the script executable:

chmod +x /path/to/myscript

3- Add the crontab schedule:

EDITOR=vi ; export EDITOR

crontab -e

Add the line below to the temp file, then save it and exit:

0 1 * * 0 /path/to/myscript

For more info on the crontab schedule, please see the crontab(5) man page.

Williams225 (System Administrator, Author) commented:
Please, can you give an example?
The script and steps are already given.

You need to replace the path names with yours.

If you give me the file names and directory names, then I can post the script to use.
Do you want all the files in the directory to be archived?
Do you want only a certain name pattern?
Do you want only files over a certain age?

Here's what we use to age files out of a directory; it could easily be modified to just move them.
(To those familiar with skulker, you'll see some remnants in here.)

This works for multiple directories, but it uses the find command, which covers all directories below the one you've specified in the tree.

Format of the function calls is:
RMGZ  {Starting Path}  {Age to rm}  {Age to compress}
   if you don't want to compress, just specify a number larger than the rm value
Specify as many directories as you want.

To "archive" instead of clean up:
  in the remove_file function, change the rm to an mv, and give it a target directory.

You could then use this script unmodified (except for the directory to keep the logs maybe), to maintain your archive.
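The skulker-derived script referenced here was attached rather than inlined; a rough sketch of the RMGZ shape it describes (function and helper names are illustrative, not the actual code) might be:

```shell
#!/bin/sh
# Sketch of an RMGZ-style age-off: remove files older than one age,
# compress files older than another.  find descends into every
# directory below the starting path.

remove_file() {
    # To archive instead of clean up, change this rm to an mv with a
    # target directory, e.g.:  mv "$1" /path/to/archive/
    rm -f "$1"
}

# RMGZ {starting path} {age to rm, days} {age to compress, days}
RMGZ() {
    dir=$1 ; rm_age=$2 ; gz_age=$3
    # remove anything past the rm age
    find "$dir" -type f -mtime +"$rm_age" | while read -r f; do
        remove_file "$f"
    done
    # compress anything past the gz age that isn't already compressed;
    # to skip compression, pass a gz_age larger than rm_age
    find "$dir" -type f -mtime +"$gz_age" ! -name '*.gz' -exec gzip {} \;
}

# RMGZ /var/log/myapp 30 7    # rm after 30 days, gzip after 7
```

Call it once per directory you want maintained, as the format line above shows.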

Williams225 (System Administrator, Author) commented:
Thanks for your script. I am a newbie, so it will be a little bit difficult to customise the script; I would appreciate it if you can help.

I want a certain name pattern to be moved and archived in a specific directory.
Give me the specifics...

What directory do you want them posted in?
What file-name pattern do you want to follow?
Williams225 (System Administrator, Author) commented:
Thanks a lot.

Here are the specifics

1°) Names of the files to archive (compress and move)

The names of the files to archive will have the patterns below, where YYYY = year; MM = month; DD = day:

•      fdamo_debug.log.YYYY-MM-DD
•      snmpTraps.log.YYYY-MM-DD
•      system.log.YYYY-MM-DD
•      httpUssd.log.YYYY-MM-DD
•      ussd_messaging.log.YYYY-MM-DD
•      fdamo.log.YYYY-MM-DD

2°) The folders where the files will be archived

•      servers: et
•      folders : /opt/fdamo/gateway/core/logs/

3°) Frequency to archive logs: weekly
We can run it daily, and archive anything older than 7 days, so you won't build up two weeks' worth of logs before the archive occurs.

Q1) What's the source directory? You gave destination directory = /opt/fdamo/gateway/core/logs, but what's the source directory?

Q2) Once they've been moved to the archive directory, can they be compressed immediately, or should they remain uncompressed?

Q3) How long should they live in the archive directory before being removed from the system for good?


Williams225 (System Administrator, Author) commented:
@Tomunique, here is the info. Thanks a lot for your help!

Source directory = /opt/fdamo/gateway/core/logs/
Destination directory = /opt/fdamo/Archive/LogArchive/

Once they've been moved to the archive directory, the logs should be compressed immediately

They should stay on the system for good.

The names of the compressed archive folders should be YYYY-MM-DD_Log_Arc

The names of the files to archive have the patterns below, where YYYY = year; MM = month; DD = day:

•      fdamo_debug.log.YYYY-MM-DD
•      snmpTraps.log.YYYY-MM-DD
•      system.log.YYYY-MM-DD
•      httpUssd.log.YYYY-MM-DD
•      ussd_messaging.log.YYYY-MM-DD
•      fdamo.log.YYYY-MM-DD

Frequency to archive logs: weekly

Every time you post something the requirements appear to change slightly.

What do you mean by "The names of the compressed archive folders should be YYYY-MM-DD_Log_Arc"?

Are there folders underneath /opt/fdamo/Archive/LogArchive?

To be clear, as an example --

/opt/fdamo/gateway/core/logs/fdamo.log.2011-09-05

would move to

/opt/fdamo/Archive/LogArchive/2011-09-12_Log_Arc/fdamo.log.2011-09-05.gz

The .gz suffix would be added for a compressed file (not required, but recommended, to clearly note it's compressed).

Is my example accurate?
Williams225 (System Administrator, Author) commented:
Your example is very accurate. It's a weekly script, so it will be good that each month, under 2011-09-12_Log_Arc for example, I can see the 4 compressed files with the archived logs for the month. (for example)
So, the 2011-09-12 in 2011-09-12_Log_Arc is the date the script ran..

  In the directory 2011-09-12_Log_Arc
   there may be 7 fdamo_debug.log.*   files?  one per day that's found?

Say it ran this last Sunday: the directory would contain those 7 files?
Just trying to get specs down before I spend the time to mod the script  
(measure twice, cut once)
Williams225 (System Administrator, Author) commented:
Yes, you got it right!
Tomunique commented:
here ya go.
To run it weekly you'll need some form of scheduler, such as cron.
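The script itself was posted as an attachment and isn't in this thread; a minimal sketch matching the specs agreed above (dated YYYY-MM-DD_Log_Arc folder, move then compress anything older than 7 days; the archive_logs name is made up) could look like:

```shell
#!/bin/sh
# Sketch matching the specs discussed in this thread -- not the
# attached script itself.

archive_logs() {
    srcdir=$1      # e.g. /opt/fdamo/gateway/core/logs
    arcroot=$2     # e.g. /opt/fdamo/Archive/LogArchive
    targetdir=$arcroot/$(date +%Y-%m-%d)_Log_Arc

    mkdir -p "$targetdir"

    # dated logs older than 7 days: move, then compress immediately
    find "$srcdir" -maxdepth 1 -type f \
         \( -name 'fdamo_debug.log.*' -o -name 'snmpTraps.log.*' \
            -o -name 'system.log.*'   -o -name 'httpUssd.log.*' \
            -o -name 'ussd_messaging.log.*' -o -name 'fdamo.log.*' \) \
         -mtime +7 | while read -r f; do
        mv "$f" "$targetdir"/
        gzip -f "$targetdir/$(basename "$f")"
    done
}

# archive_logs /opt/fdamo/gateway/core/logs /opt/fdamo/Archive/LogArchive
```

Scheduled weekly (the 0 1 * * 0 crontab entry from earlier in the thread), each run's _Log_Arc folder collects the previous week's daily files, and nothing is ever deleted, per the "keep for good" requirement.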
Williams225 (System Administrator, Author) commented:
Thanks a lot, Tomunique!!

I have one more question; I hope you will answer it, because your script seems perfect!

If, for example, the target directory /opt/fundamo/Archive/LogArchive/$(date +%Y-%m-%d_Log_Arc) is on another server (ip: ), how will I modify the part of the script below?

TARGETDIR=/opt/fundamo/Archive/LogArchive/$(date +%Y-%m-%d_Log_Arc)


Now you're talking password-less file transfers (or NFS).

Ok, one side has to originate the transfer.  
Normally you'd select the sender, as that system knows what files need to be sent, and when.

If, on the other hand, that system with all the logs, is in a DMZ (exposed to the internet), it may be better to have an internal system reach out to it and pull the files.

So, are these systems "equals" on the same network, or is one of them more secured behind firewalls than the other?

Does the sending system have a lot of users on it that you can't control? Users that may "hack" the system?

Where I'm going is: as stated above, it would be nice to "send" the files, but if this is a system that could be hacked (more easily) than the target system, then I'd rather set you up with a pull.

make sense?
Williams225 (System Administrator, Author) commented:
The server is very secure, and it's not exposed to the internet. The system admins did some security audits a few weeks ago.

Williams225 (System Administrator, Author) commented:
So, it's possible to transfer the files, I think.
Ok.. Here's what I need you to do.

We'll say you have LOG system(source), and ARC system(target)
You have an ID on the LOG system that will run the script I wrote, we'll call it log_id

You have an ID on the ARC system too; we'll call it arc_id.

Some of this may already be set up, so we'll look ahead to see what we can leverage.
If either exist, we have to tread a little more lightly, so we don't dork up existing configurations.

First, let's test...
Log into the LOG system as log_id and issue the following command:
ssh arc_id@<arc-ip>
(if it prompts for yes/no, reply yes)
If it prompts for a password, ^C and get back to the $ prompt.
IF it does NOT prompt for a password, and you end up on the ARC system logged in as arc_id, then skip the rest of this post and just run the script I've posted.

The rest of this post will get you the ability to log into ARC as arc_id, from LOG/log_id, without a password prompt.  This will permit the script to run unattended.

The process is fairly easy, but I've got to walk you through a bunch of questions/conditions, because I don't know what you already have in place.

Now, let's start with the ARC system

Log into the ARC system as arc_id, and look in its home directory for a .ssh directory:
ls -lad $HOME/.ssh

IF .ssh does NOT exist, we'll create one (if it does exist, skip the mkdir/chmod commands):
   mkdir $HOME/.ssh
   chmod 700 $HOME/.ssh

Now, we'll check the LOG system:
Log into the LOG system as log_id, and look in its home directory for a .ssh directory:
ls -lad $HOME/.ssh

IF .ssh exists, see if there are two files in .ssh called id_rsa and id_rsa.pub.
IF they exist, skip the ssh-keygen command !!! It would over-write them; instead, we'll use them.
IF they DON'T exist, run "ssh-keygen -t rsa". This will create the two files id_rsa and id_rsa.pub in $HOME/.ssh.

At this point, we should have a .ssh directory on ARC system, either existing or new.
And on the LOG system, we have either existing id_rsa/id_rsa.pub files in .ssh, or we created them.

On the LOG system, as log_id, run the following
(<arc-ip> is the IP of the ARC system; arc_id is the real id you'll use):

scp -p $HOME/.ssh/id_rsa.pub  arc_id@<arc-ip>:
It should prompt for a password.

Now, log into the ARC system as arc_id and run these two commands:
cd .ssh
cat ../id_rsa.pub >> authorized_keys
# Make sure you use TWO greater-than signs. This will append to (or create) the authorized_keys file.

Back on the LOG system, as log_id, you should be able to issue

ssh arc_id@<arc-ip>

and it will log you in without a password prompt.
(Security concerns? This is more secure than passwords in scripts, as anyone with a password can use it.
Using "ssh keys", ONLY this id ON this system can log in without the password. If someone steals your id_rsa.pub, they can't use it; they need the id_rsa file to go with it.)

id_rsa is called a "private key". Keep it that way; it needs to stay *private*.

Remember, log_id and arc_id are just names I've made up; don't use them literally. You may use root, or any other real id on the systems, that the script will run as.

In the script, I've created some variables at the top.  You need to edit the script and put in your real IDs.

Then, run the attached script on the LOG system, as log_id  and you should be good to go.
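Condensed, the key setup above amounts to the following. It is sketched here against a scratch directory so it can be run harmlessly on one machine; on the real systems, LOG keeps id_rsa/id_rsa.pub in $HOME/.ssh and the .pub file travels to ARC via the scp step shown earlier.

```shell
#!/bin/sh
# Illustration of the ssh-key steps using a scratch directory.
DEMO=$(mktemp -d)

# LOG side: generate the key pair (on a real system, skip this if
# $HOME/.ssh/id_rsa and id_rsa.pub already exist!)
ssh-keygen -q -t rsa -N "" -f "$DEMO/id_rsa"

# ARC side: create .ssh if needed and append the public key --
# note the TWO greater-than signs
mkdir -p "$DEMO/dot_ssh"
chmod 700 "$DEMO/dot_ssh"
cat "$DEMO/id_rsa.pub" >> "$DEMO/dot_ssh/authorized_keys"
chmod 600 "$DEMO/dot_ssh/authorized_keys"

ls -l "$DEMO"
```

The 700/600 permissions matter: sshd refuses to honor an authorized_keys file that is group- or world-writable.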

Sorry.. lost track of time.... I've got scouts tonight.. I'll modify the script and post it when I get back.


Question has a verified solution.
