Solved

Script to automatically archive logs to a folder every week

Posted on 2011-09-09
Medium Priority
834 Views
Last Modified: 2013-12-27
Hi, I am looking for a script to automatically archive specific weekly logs from one folder to another every week.

The OS is Solaris.
Question by:cismoney
23 Comments
 
LVL 16

Expert Comment

by:medvedd
ID: 36509967
 
LVL 40

Expert Comment

by:omarfarid
ID: 36510021
you can use tar to back up the logs, and this can be scheduled to run weekly via crontab

1- create a script, named e.g. myscript, which contains:

/usr/bin/tar -cf /path/to/myarcdir/mybackup`date "+%Y%m%d"` /path/to/mylogs

2- make the script executable

chmod +x /path/to/myscript

3- add crontab schedule

EDITOR=vi ; export EDITOR

crontab -e

add the line below to the temp file, then save and exit (this runs the script at 01:00 every Sunday)

0 1 * * 0 /path/to/myscript

for more info on the crontab schedule please see

http://unixhelp.ed.ac.uk/CGI/man-cgi?crontab+5
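
Putting the steps together, a minimal sketch of what myscript could look like (the paths are placeholders; adjust them to your environment):

#!/bin/sh
# placeholder paths -- replace with your real log and archive directories
LOGDIR=/path/to/mylogs
ARCDIR=/path/to/myarcdir

# create a dated tar archive of the log directory
/usr/bin/tar -cf $ARCDIR/mybackup`date "+%Y%m%d"`.tar $LOGDIR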
 

Author Comment

by:cismoney
ID: 36511324
Please, can you give an example?
 
LVL 40

Expert Comment

by:omarfarid
ID: 36511597
The script and steps are already given; you just need to replace the path names with yours.

If you post your file names and directory names, I can post the exact script to use.
 
LVL 6

Expert Comment

by:Tomunique
ID: 36516126
Do you want all the files in the directory to be archived?
Do you want only a certain name pattern?
Do you want only files over a certain age?


Here's what we use to age off files out of a directory; it could easily be modified to just move them.
(To those familiar with skulker, you'll see some remnants in here.)

This works for multiple directories, but it uses the find command, which descends into all directories below the one you've specified in the tree.

The format of the function calls is:
RMGZ  {Starting Path}  {Age to rm}  {Age to compress}
   If you don't want to compress, just specify a number larger than the rm value.
Specify as many directories as you want.

To "archive" instead of clean up:
  in the remove_file function, change the rm to an mv and give it a target directory.

You could then use this script unmodified (except maybe for the directory where the logs are kept) to maintain your archive.


clearlogs.sh
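
(The attached clearlogs.sh isn't reproduced here, but as a rough sketch, such a function could look like the following; ages are in days, and the rm can be swapped for an mv as described above.)

RMGZ() {
    DIR=$1; RMAGE=$2; GZAGE=$3
    # remove files older than RMAGE days (change rm to mv to archive instead)
    find "$DIR" -type f -mtime +"$RMAGE" -exec rm -f {} \;
    # compress files older than GZAGE days that aren't already compressed
    find "$DIR" -type f -mtime +"$GZAGE" ! -name '*.gz' -exec gzip {} \;
}

# example: remove after 30 days, compress after 7
RMGZ /opt/fdamo/gateway/core/logs 30 7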
 

Author Comment

by:cismoney
ID: 36516622
Thanks for your script. I am a newbie, so it will be a little difficult for me to customise the script; I would appreciate it if you can help.


I want files matching a certain name pattern to be moved and archived in a specific directory.
 
LVL 6

Expert Comment

by:Tomunique
ID: 36519175
Give me the specifics...

What directory do you want them placed in?
What file-name pattern should they match?
 

Author Comment

by:cismoney
ID: 36523983
Thanks a lot.

Here are the specifics.


1°) Names of the files to archive (compress and move)

The names of the files to archive will have the patterns below, where YYYY = year, MM = month, DD = day:

•      fdamo_debug.log.YYYY-MM-DD
•      snmpTraps.log.YYYY-MM-DD
•      system.log.YYYY-MM-DD
•      httpUssd.log.YYYY-MM-DD
•      ussd_messaging.log.YYYY-MM-DD
•      fdamo.log.YYYY-MM-DD

2°) The folders where the files will be archived

•      servers: 10.16.30.66 and 10.16.30.67
•      folders: /opt/fdamo/gateway/core/logs/

3°) Frequency to archive logs: weekly
 
LVL 6

Expert Comment

by:Tomunique
ID: 36525257
We can run it daily and archive anything older than 7 days, so you won't build up two weeks' worth of logs before the archive occurs.
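
For example, selecting the week-old logs could look something like this (the pattern and path are assumptions based on your specs):

# list log files named *.log.YYYY-MM-DD that are more than 7 days old
find /opt/fdamo/gateway/core/logs -type f -name '*.log.????-??-??' -mtime +7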

Questions:
Q1) Destination directory = /opt/fdamo/gateway/core/logs
What's the source directory?

Q2) Once they've been moved to the archive directory, can they be compressed immediately, or should they remain uncompressed?

Q3) How long should they live in the archive directory before being removed from the system for good?

Tom


 

Author Comment

by:cismoney
ID: 36530432
@Tomunique, here is the info, thanks a lot for your help!


Source directory = /opt/fdamo/gateway/core/logs/
Destination directory = /opt/fdamo/Archive/LogArchive/

Once they've been moved to the archive directory, the logs should be compressed immediately.

They should stay on the system for good.


The names of the compressed archive folders should be YYYY-MM-DD_Log_Arc.

The names of the files to archive have the patterns below, where YYYY = year, MM = month, DD = day:

•      fdamo_debug.log.YYYY-MM-DD
•      snmpTraps.log.YYYY-MM-DD
•      system.log.YYYY-MM-DD
•      httpUssd.log.YYYY-MM-DD
•      ussd_messaging.log.YYYY-MM-DD
•      fdamo.log.YYYY-MM-DD

Frequency to archive logs: weekly

 
LVL 6

Expert Comment

by:Tomunique
ID: 36530538
Every time you post something, the requirements appear to change slightly.

What do you mean by "The names of the compressed archive folders should be YYYY-MM-DD_Log_Arc"?

are there folders underneath /opt/fdamo/Archive/LogArchive?

To be clear, as an example --

/opt/fdamo/gateway/core/logs/fdamo_debug.log.2011-09-12
would move to
/opt/fdamo/Archive/LogArchive/2011-09-12_Log_Arc/fdamo_debug.log.2011-09-12.gz

The .gz suffix would be added for a compressed file (not required, but recommended to clearly mark it as compressed).

Is my example accurate?
 

Author Comment

by:cismoney
ID: 36530916
Your example is very accurate. It's a weekly script, so it would be good if, for each month, I could see the four compressed weekly archives with the archived logs, for example under 2011-09-12_Log_Arc.
 
LVL 6

Expert Comment

by:Tomunique
ID: 36530981
So, 2011-09-12_Log_Arc is the date the script ran..

  In the directory 2011-09-12_Log_Arc
   there may be 7 fdamo_debug.log.* files? One per day that's found?

Say it ran this last Sunday; the directory would contain these 7 files?
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-05.gz
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-06.gz
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-07.gz
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-08.gz
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-09.gz
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-10.gz
2011-09-11_Log_Arc/fdamo_debug.log.2011-09-11.gz
 
LVL 6

Expert Comment

by:Tomunique
ID: 36530984
Just trying to get specs down before I spend the time to mod the script  
(measure twice, cut once)
 

Author Comment

by:cismoney
ID: 36531190
Yes, you got it right!
 
LVL 6

Assisted Solution

by:Tomunique
Tomunique earned 2000 total points
ID: 36533718
here ya go.
clearlogs.sh
 
LVL 6

Expert Comment

by:Tomunique
ID: 36533722
To run it weekly you'll need some form of scheduler, such as cron.
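
For example, a crontab entry along these lines (the path and time are placeholders) would run it at 02:00 every Sunday:

0 2 * * 0 /path/to/clearlogs.sh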
 

Author Comment

by:cismoney
ID: 36534472
Thanks a lot Tomunique!!!!

I have one more question, which I hope you will answer, because your script seems perfect!


If, for example, the target directory /opt/fundamo/Archive/LogArchive/$(date +%Y-%m-%d_Log_Arc) is on another server (IP: 10.16.30.66), how will I modify the part of the script below?



TARGETDIR=/opt/fundamo/Archive/LogArchive/$(date +%Y-%m-%d_Log_Arc)


 
LVL 6

Expert Comment

by:Tomunique
ID: 36535763
Now you're talking password-less file transfers (or NFS).

OK, one side has to originate the transfer.
Normally you'd select the sender, as that system knows what files need to be sent, and when.

If, on the other hand, the system with all the logs is in a DMZ (exposed to the internet), it may be better to have an internal system reach out to it and pull the files.

So, are these systems "equals" on the same network, or is one of them more secured behind firewalls than the other?

Does the sending system have a lot of users on it that you can't control, who might "hack" the system?

Where I'm going is, as stated above, it would be nice to "send" the files, but if this is a system that could be hacked (more easily) than the target 10.16.30.66 system, then I'd rather set you up with a pull.

Make sense?
 

Author Comment

by:cismoney
ID: 36539663
The server is very secure, and it's not exposed to the internet. The system admins did some security audits a few weeks ago.
 

Author Comment

by:cismoney
ID: 36539672
So, it's possible to transfer the files, I think.
 
LVL 6

Expert Comment

by:Tomunique
ID: 36540097
Ok.. Here's what I need you to do.

We'll say you have LOG system(source), and ARC system(target)
You have an ID on the LOG system that will run the script I wrote, we'll call it log_id

You have an ID on the ARC system to, we'll call it arc_id

Some of this may already be set up, so we'll look ahead to see what we can leverage.
If either exist, we have to tread a little more lightly, so we don't dork up existing configurations.

First, let's test...
log into LOG system as log_id  
issue the following command:
ssh arc_id@10.16.30.66
(if it prompts for yes/no, reply yes)
If it prompts for password, ^c and get back to the $ prompt.
IF it does NOT prompt for a password, and you end up on the ARC system, logged in as arc_id, then skip the rest of this post, and just run the script I've posted.

The rest of this post will get you the ability to log into ARC as arc_id, from LOG/log_id, without a password prompt.  This will permit the script to run unattended.

The process is fairly easy, but I've got to walk you through a bunch of questions/conditions, because I don't know what you already have in place.

Now, let's start with the ARC system

Log in as arc_id and look in its home directory for a .ssh directory:
ls -lad $HOME/.ssh

IF .ssh does NOT exist, we'll create one (if it does exist, skip the mkdir/chmod commands):
   mkdir $HOME/.ssh
   chmod 700 $HOME/.ssh

Now, we'll check the LOG system:
Log into log_id, and look in it's home directory for a .ssh directory
ls -lad $HOME/.ssh

IF .ssh exists, see if there are two files in .ssh called id_rsa and id_rsa.pub.
IF they exist, skip the ssh-keygen command !!! It would over-write them; instead, we'll use them.
IF they DON'T exist, run "ssh-keygen -t rsa". This will create the two files id_rsa and id_rsa.pub in $HOME/.ssh.


At this point, we should have a .ssh directory on ARC system, either existing or new.
And on the LOG system, we have either existing id_rsa/id_rsa.pub files in .ssh, or we created them.


On LOG system, as log_id, run the following:
( 10.16.30.66 is the IP of the ARC system, arc_id is the real id you'll use)

scp -p $HOME/.ssh/id_rsa.pub  arc_id@10.16.30.66:.ssh/log_id.pub
it should prompt for a password.

Now, log into the ARC system as arc_id and run these two commands:
cd    .ssh
cat   log_id.pub  >>  authorized_keys
# Make sure you use TWO greater-than signs. This will append to (or create) the authorized_keys file.

Back on the LOG system, as log_id,
you should be able to issue

ssh arc_id@10.16.30.66
and it will log you in without a password prompt.
(Security concerns? This is more secure than passwords in scripts, as anyone with a password can use it.
Using "ssh keys", ONLY this id ON this system can log in without the password. If someone steals your id_rsa.pub, they can't use it; they need the id_rsa file to go with it.)

id_rsa is called a "private key"; keep it that way, it needs to stay *private*.

Remember, log_id and arc_id are just names I've made up; don't use them literally. You may use root, or any other real ID on the systems, as the ID the script will run as.

In the script, I've created some variables at the top.  You need to edit the script and put in your real IDs.

Then run the attached script on the LOG system as log_id, and you should be good to go.
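
To recap, here's the whole sequence condensed (the IDs and IP are placeholders; skip the key generation if id_rsa/id_rsa.pub already exist):

# on LOG, as log_id -- create a key pair (only if one doesn't already exist)
ssh-keygen -t rsa

# copy the public key to ARC (prompts for arc_id's password this one time)
scp -p $HOME/.ssh/id_rsa.pub arc_id@10.16.30.66:.ssh/log_id.pub

# on ARC, as arc_id -- append it to the authorized keys
cd .ssh
cat log_id.pub >> authorized_keys
chmod 600 authorized_keys   # sshd may refuse a group/world-writable file

# back on LOG -- this should now log in without a password
ssh arc_id@10.16.30.66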

Sorry.. lost track of time.... I've got scouts tonight.. I'll modify the script and post it when I get back.

Tom
 
LVL 6

Accepted Solution

by:Tomunique
Tomunique earned 2000 total points
ID: 36540790
Ok, here's the script.
There's a new staging directory you need to create locally.
   If the remote system is down, they'll queue up here.

the local directory is
/opt/fdamo/gateway/core/staging_logs

This directory should normally be empty, if not, the next run of the script will push anything that didn't make a previous run.

remember to make the changes to the script per my previous post to set the remote userid appropriately.

Tom
clearlogs.sh
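
(clearlogs.sh itself is an attachment and isn't reproduced here; as a rough sketch, the staging-and-push flow described above could look like this, with the remote ID/IP as placeholders:)

#!/bin/sh
# sketch only -- paths per this thread, remote ID is a placeholder
SRCDIR=/opt/fdamo/gateway/core/logs
STAGEDIR=/opt/fdamo/gateway/core/staging_logs
REMOTE=arc_id@10.16.30.66
TARGETDIR=/opt/fdamo/Archive/LogArchive/`date +%Y-%m-%d`_Log_Arc

# move week-old logs into the staging area and compress them
find $SRCDIR -type f -name '*.log.????-??-??' -mtime +7 -exec mv {} $STAGEDIR \;
gzip $STAGEDIR/*.log.* 2>/dev/null

# push everything staged (including leftovers from failed runs), then clean up
ls $STAGEDIR/*.gz >/dev/null 2>&1 &&
  ssh $REMOTE "mkdir -p $TARGETDIR" &&
  scp $STAGEDIR/*.gz $REMOTE:$TARGETDIR/ &&
  rm -f $STAGEDIR/*.gz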
