

Script to automatically archive logs to a folder every week

Posted on 2011-09-09
Medium Priority
Last Modified: 2013-12-27
Hi, I am looking for a script to automatically archive specific weekly logs from one folder to another every week.

Question by:cismoney
LVL 40

Expert Comment

ID: 36510021
you can use tar to back up the logs, and this can be scheduled to run weekly via crontab

1- create the script with the name e.g. myscript which contains:

/usr/bin/tar -cf /path/to/myarcdir/mybackup_`date "+%Y%m%d"`.tar /path/to/mylogs

2- make the script executable

chmod +x /path/to/myscript

3- add crontab schedule

EDITOR=vi ; export EDITOR

crontab -e

add the line below to the temp file, then save it and exit

0 1 * * 0 /path/to/myscript

for more info on the crontab schedule, see the crontab(5) man page
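For reference, the five scheduling fields in a crontab entry read minute, hour, day-of-month, month, day-of-week; the entry above runs the script at 01:00 every Sunday:

```
# 0 1 * * 0 /path/to/myscript
# | | | | |
# | | | | +-- day of week (0 = Sunday)
# | | | +---- month (* = every month)
# | | +------ day of month (* = every day)
# | +-------- hour (01)
# +---------- minute (00)
```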


Author Comment

ID: 36511324
please can you give an example
Concerto's Cloud Advisory Services

Want to avoid the missteps to gaining all the benefits of the cloud? Learn more about the different assessment options from our Cloud Advisory team.

LVL 40

Expert Comment

ID: 36511597
script and steps already given

you need to replace the path names with yours

if you have the file names and directory name, I can post the script to use

Expert Comment

ID: 36516126
Do you want all the files in the directory to be archived?
Do you want only a certain name pattern?
Do you want only files over a certain age?

Here's what we use to age off files out of a directory; it could be easily modified to just move them.
(To those familiar with skulker, you'll see some remnants in here.)

This works for multiple directories, but uses the find command, which descends into all directories below the starting point you've specified in the tree.

Format of the function calls is:
RMGZ  {Starting Path}  {Age to rm}  {Age to compress}
   if you don't want to compress, just specify a number larger than the rm value
Specify as many directories as you want.

To "archive" instead of cleaning up:
  in the remove_file function, change the rm to an mv and give it a target directory.

You could then use this script unmodified (except for the directory to keep the logs maybe), to maintain your archive.
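The skulker-derived script itself was posted as an attachment and isn't visible here. As a rough sketch (the function body is an assumption, not the expert's actual code), an RMGZ-style helper built on find could look like:

```shell
#!/bin/sh
# Hypothetical sketch of an RMGZ-style age-off function (not the expert's
# actual skulker-derived script).
# Usage: RMGZ <starting path> <age in days to rm> <age in days to compress>
RMGZ() {
    startdir=$1; rmage=$2; gzage=$3
    # remove anything older than $rmage days (descends the whole tree)
    find "$startdir" -type f -mtime +"$rmage" -exec rm -f {} \;
    # compress anything older than $gzage days that isn't already compressed
    find "$startdir" -type f -mtime +"$gzage" ! -name '*.gz' -exec gzip -f {} \;
}

# Example: purge after 30 days, compress after 7; to skip compression,
# give a compress age larger than the rm age (e.g. RMGZ /var/log/myapp 30 999)
# RMGZ /var/log/myapp 30 7
```

To archive instead of delete, the rm above would become an mv into a target directory, as the expert notes.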


Author Comment

ID: 36516622
thanx for your script. I am a newbie, so it will be a little difficult for me to customise the script; I would appreciate your help.

I want files matching a certain name pattern to be moved and archived in a specific directory.

Expert Comment

ID: 36519175
Give me the specifics...

What directory do you want them posted in?
What file-name pattern do you want to follow?

Author Comment

ID: 36523983
thanx a lot.

Here are the specifics

1°) Names of the files to archive (compress and move)

The names of the files to archive will have the patterns below, where YYYY = year; MM = month; DD = day

•      fdamo_debug.log.YYYY-MM-DD
•      snmpTraps.log.YYYY-MM-DD
•      system.log.YYYY-MM-DD
•      httpUssd.log.YYYY-MM-DD
•      ussd_messaging.log.YYYY-MM-DD
•      fdamo.log.YYYY-MM-DD

2°) the folders where the files will be archived

•      servers: et
•      folders : /opt/fdamo/gateway/core/logs/

3°) frequency to archive logs: weekly

Expert Comment

ID: 36525257
We can run it daily and archive anything older than 7 days, so you won't build up two weeks' worth of logs before the archive occurs.

Q1) What's the source directory?
(Destination directory = /opt/fdamo/gateway/core/logs)

Q2) Once they've been moved to the archive directory, can they be compressed immediately, or should they remain uncompressed?

Q3) How long should they live in the archive directory before being removed from the system for good?



Author Comment

ID: 36530432
@Tomunique, here is the info, thanx a lot for your help!

source directory = /opt/fdamo/gateway/core/logs/
Destination directory = /opt/fdamo/Archive/LogArchive/

Once they've been moved to the archive directory, the logs should be compressed immediately

They should stay on the system for good.

The names of the compressed archive folders should be YYYY-MM-DD_Log_Arc

The names of the files to archive have the patterns below, where YYYY = year; MM = month; DD = day

•      fdamo_debug.log.YYYY-MM-DD
•      snmpTraps.log.YYYY-MM-DD
•      system.log.YYYY-MM-DD
•      httpUssd.log.YYYY-MM-DD
•      ussd_messaging.log.YYYY-MM-DD
•      fdamo.log.YYYY-MM-DD

frequency to archive logs: weekly


Expert Comment

ID: 36530538
Every time you post something the requirements appear to change slightly.

What do you mean by "The names of the compressed archive folders should be YYYY-MM-DD_Log_Arc"?

are there folders underneath /opt/fdamo/Archive/LogArchive?

To be clear, as an example, a file such as

/opt/fdamo/gateway/core/logs/system.log.2011-09-05

would move to

/opt/fdamo/Archive/LogArchive/2011-09-12_Log_Arc/system.log.2011-09-05.gz

the .gz suffix would be added for a compressed file (not required, but recommended, to clearly note it's compressed)

Is my example accurate?

Author Comment

ID: 36530916
Your example is very accurate. It's a weekly script, so it would be good if each month, under 2011-09-12_Log_Arc for example, I can see the 4 compressed files with the archived logs for the month.

Expert Comment

ID: 36530981
So, 2011-09-12_Log_Arc is the date the script ran..

  In the directory 2011-09-12_Log_Arc
   there may be 7 fdamo_debug.log.*   files?  one per day that's found?

Say it ran this last Sunday; the directory would contain those 7 files?

Expert Comment

ID: 36530984
Just trying to get specs down before I spend the time to mod the script  
(measure twice, cut once)

Author Comment

ID: 36531190
yes . you get it right!

Assisted Solution

Tomunique earned 2000 total points
ID: 36533718
here ya go.
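The attached script itself is not visible in the thread. Based on the requirements agreed above (the paths and the 7-day age come from the discussion; the function name and structure are assumptions), a minimal sketch might be:

```shell
#!/bin/sh
# Hypothetical sketch, not the expert's actual attachment: move matching
# log files older than 7 days into a dated archive directory and gzip
# them immediately, per the spec agreed in the thread.

archive_logs() {
    srcdir=$1
    arcroot=$2
    targetdir="$arcroot/$(date +%Y-%m-%d)_Log_Arc"
    mkdir -p "$targetdir"
    for p in 'fdamo_debug.log.*' 'snmpTraps.log.*' 'system.log.*' \
             'httpUssd.log.*' 'ussd_messaging.log.*' 'fdamo.log.*'; do
        # move matching files older than 7 days into the dated directory
        find "$srcdir" -maxdepth 1 -type f -name "$p" -mtime +7 \
            -exec mv {} "$targetdir"/ \;
    done
    # compress everything that just landed in the archive directory
    find "$targetdir" -type f ! -name '*.gz' -exec gzip -f {} \;
}

# Example with the thread's real paths (uncomment to run):
# archive_logs /opt/fdamo/gateway/core/logs /opt/fdamo/Archive/LogArchive
```

The function form lets you point it at any pair of directories; the commented-out call shows the paths from the discussion.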

Expert Comment

ID: 36533722
To run it weekly you'll need some form of scheduler, such as cron.

Author Comment

ID: 36534472
Thanx a lot Tomunique!!!!

I have one more question; I hope you will answer, because your script seems perfect!

if for example the target directory /opt/fundamo/Archive/LogArchive/$(date +%Y-%m-%d)_Log_Arc is on another server (ip: ), how will I modify the part of the script below

TARGETDIR=/opt/fundamo/Archive/LogArchive/$(date +%Y-%m-%d)_Log_Arc



Expert Comment

ID: 36535763
Now you're talking password-less file transfers (or NFS).

Ok, one side has to originate the transfer.  
Normally you'd select the sender, as that system knows what files need to be sent, and when.

If, on the other hand, the system with all the logs is in a DMZ (exposed to the internet), it may be better to have an internal system reach out to it and pull the files.

So, are these systems "equals" on the same network, or is one of them more secured behind firewalls than the other?

Does the sending system have a lot of users on it that you can't control?  That may "hack" the system?

Where I'm going is: as stated above, it would be nice to "send" the files, but if this is a system that could be hacked more easily than the target system, then I'd rather set you up with a pull.

make sense?

Author Comment

ID: 36539663
the server is very secure, and it's not exposed to the internet. the admin team did some security audits a few weeks ago.


Author Comment

ID: 36539672
so, it's possible to transfer the files, I think

Expert Comment

ID: 36540097
Ok.. Here's what I need you to do.

We'll say you have a LOG system (source) and an ARC system (target).
You have an ID on the LOG system that will run the script I wrote; we'll call it log_id.

You have an ID on the ARC system too; we'll call it arc_id.

Some of this may already be set up, so we'll look ahead to see what we can leverage.
If either exist, we have to tread a little more lightly, so we don't dork up existing configurations.

First, let's test...
log into LOG system as log_id  
issue the following command:
ssh arc_id@
(if it prompts for yes/no, reply yes)
If it prompts for password, ^c and get back to the $ prompt.
IF it does NOT prompt for a password, and you end up on the ARC system, logged in as arc_id, then skip the rest of this post, and just run the script I've posted.

The rest of this post will get you the ability to log into ARC as arc_id, from LOG/log_id, without a password prompt.  This will permit the script to run unattended.

The process is fairly easy, but I've got to walk you through a bunch of questions/conditions, because I don't know what you already have in place.

Now, let's start with the ARC system

Log in as arc_id, and look in its home directory for a .ssh directory
ls -lad $HOME/.ssh

IF .ssh does NOT exist, we'll create one  (if it does exist, skip mkdir/chmod commands)
   mkdir $HOME/.ssh
   chmod 700 $HOME/.ssh

Now, we'll check the LOG system:
Log in as log_id, and look in its home directory for a .ssh directory
ls -lad $HOME/.ssh

IF .ssh exists, see if there are two files in .ssh called id_rsa and id_rsa.pub
IF they exist, skip the ssh-keygen command !!!  It would over-write them; instead, we'll use them.
IF they DON'T exist, run "ssh-keygen -t rsa".  This will create the two files id_rsa and id_rsa.pub in $HOME/.ssh

At this point, we should have a .ssh directory on ARC system, either existing or new.
And on the LOG system, we have either existing id_rsa/id_rsa.pub files in .ssh, or we created them.

On LOG system, as log_id, run the following:
( is the IP of the ARC system, arc_id is the real id you'll use)

scp -p $HOME/.ssh/id_rsa.pub  arc_id@
it should prompt for a password.

Now, log into ARC system as arc_id and run these two commands
cd    .ssh
cat   log_id.pub  >>  authorized_keys      
# Make sure you use TWO greater-than signs.  This will append to (or create) the authorized_keys file.

Back on LOG system, as log_id
you should be able to issue

ssh arc_id@  
and it will log you in, without a password prompt.
(Security concerns?  This is more secure than passwords in scripts, as anyone with a password can use it.
Using ssh keys, ONLY this id ON this system can log in without the password.  If someone steals your id_rsa.pub, they can't use it; they need the id_rsa file to go with it.)

id_rsa is called a "private key": keep it that way, it needs to stay *private*

remember, log_id and arc_id are just names I've made up; don't use them literally.  You may use root, or any other real id on the systems, that the script will run as.

In the script, I've created some variables at the top.  You need to edit the script and put in your real IDs.

Then, run the attached script on the LOG system, as log_id  and you should be good to go.
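The key-exchange steps above can be condensed as follows; <arc_ip>, log_id and arc_id are placeholders to replace with your real host and ids, the -N '' flag (an assumption, not from the thread) creates the key without a passphrase so the transfer can run unattended, and the exact landing path of log_id.pub depends on the scp target you use:

```shell
# On the LOG system, as log_id: create a key pair if one doesn't exist
[ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -t rsa -N '' -f "$HOME/.ssh/id_rsa"

# copy the public key to the ARC system (prompts for a password this once)
scp -p "$HOME/.ssh/id_rsa.pub" "arc_id@<arc_ip>:log_id.pub"

# On the ARC system, as arc_id: append the key to authorized_keys
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
cat "$HOME/log_id.pub" >> "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"

# Back on LOG, as log_id: this should now log in without a password prompt
ssh "arc_id@<arc_ip>"
```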

Sorry.. lost track of time.... I've got scouts tonight.. I'll modify the script and post it when I get back.



Accepted Solution

Tomunique earned 2000 total points
ID: 36540790
Ok, here's the script.
There's a new staging directory you need to create locally.
   If the remote system is down, they'll queue up here.

the local directory is

This directory should normally be empty; if it isn't, the next run of the script will push anything that didn't make it on a previous run.

remember to make the changes to the script per my previous post to set the remote userid appropriately.
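The revised attachment isn't visible in the thread either. The push step it describes (archives wait in the local staging directory and are retried on the next run if the remote system is down) could be sketched like this, where the function and variable names are assumptions:

```shell
#!/bin/sh
# Hypothetical sketch of the push step: send every file in the local
# staging directory to the remote archive host, and delete the local
# copy only after a successful transfer, so failed pushes are retried
# on the next run.

push_archives() {
    stagedir=$1          # local staging directory
    remote=$2            # e.g. arc_id@<arc_ip>
    remotedir=$3         # e.g. /opt/fdamo/Archive/LogArchive
    for f in "$stagedir"/*; do
        [ -e "$f" ] || continue           # staging directory is empty
        # remove the local copy only if the transfer succeeded
        if scp -p "$f" "$remote:$remotedir/"; then
            rm -f "$f"
        fi
    done
}
```

This relies on the password-less ssh setup from the previous post, so scp can run unattended from cron.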

