Solved

Need help to create the script

Posted on 2013-05-22
525 Views
Last Modified: 2013-05-29
Hi All,

I need to create a shell script that appends the output of all files under the directory below to a single file.

The issue is that these files are created only when an alert comes, which means the script must first check the directory for new alert files and then cat their contents, appending them to that single file.

I also want to purge the file every week.

Please help


tmp# ls

22754_cancelled_2013_05_22_11_54_56.txt
22754_created_2013_05_22_11_39_56.txt
22755_cancelled_2013_05_22_11_54_56.txt
22755_created_2013_05_22_11_39_56.txt
22756_cancelled_2013_05_22_12_24_56.txt
22756_created_2013_05_22_11_44_56.txt
22757_cancelled_2013_05_22_12_24_56.txt
22757_created_2013_05_22_11_49_56.txt
22758_cancelled_2013_05_22_12_19_56.txt
22758_created_2013_05_22_11_54_56.txt
22759_cancelled_2013_05_22_12_19_56.txt
22759_created_2013_05_22_11_54_56.txt
22760_cancelled_2013_05_22_12_24_56.txt
22760_created_2013_05_22_11_59_56.txt
22761_cancelled_2013_05_22_12_39_56.txt
22761_created_2013_05_22_12_24_56.txt
22762_cancelled_2013_05_22_12_39_56.txt
22762_created_2013_05_22_12_29_56.txt
22763_cancelled_2013_05_22_13_19_56.txt
22763_created_2013_05_22_13_04_56.txt
Question by:apunkabollywood
13 Comments
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187188
Assuming that you don't want to keep the alert files after appending their content to the target file:

cd /directory/with/files
if ls  *.txt >/dev/null 2>&1; then cat *.txt >> /path/to/target/file && rm *.txt; fi
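
If you'd rather keep the alert files, a minimal variant that moves them into a subdirectory instead of deleting them (the "processed" directory name is only an example) would be:

cd /directory/with/files
mkdir -p processed
if ls *.txt >/dev/null 2>&1; then cat *.txt >> /path/to/target/file && mv *.txt processed/; fi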
 
LVL 26

Expert Comment

by:skullnobrains
ID: 39187190
Can you use a separate directory to store processed files, just to make things easier?

Do you need the resulting log to be ordered?

Can you assume that a new log file is fully written within some fixed period of time after it is created, or that it won't be written to once it reaches some defined size?

Can you elaborate on those logs and how they rotate?

Do you need the main log to be updated in real time? What delay would be acceptable?
 

Author Comment

by:apunkabollywood
ID: 39187270
Hi, here is a brief:

1. These files are created by alerts, which can come at any time during the day.
2. I want to run my script via crontab every 15 minutes.
3. I just want to append the contents of all files to a single file, and keep doing so for a day or a week, with a date and time stamp.
4. While appending the content, if possible it should include the date and time.
5. I don't want this file after a week, so it should be removed weekly and the same process should continue.


I have no problem copying all these files to another folder to work with for this task.
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187296
Time stamp - one time stamp per appended file or per line? Using the current time, the file creation time, or the time contained in the file name?

Did you post the actual file name format?

Do you want to keep the single alert files, or do you want them removed/moved?
 

Author Comment

by:apunkabollywood
ID: 39187399
- Yes, one time stamp per appended file, using the file creation time.

- Yes, that is the actual file format I have given.

- I want to keep the single alert files - if they need to be removed, we could copy them to a different folder and use them from there.

- The problem is that I don't want to miss any alert, and I want to check for those alert files every 5 minutes. Since a new alert can come every 5 minutes, what can we do so that only the content of new alert files gets appended to that single log file?

- I need one single file for every day in a different folder, and after a week all files should be deleted and the same process restarted.
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187585
>> I want to keep the single alert files  <<

Do they have to stay in their original directory or can we move them to a backup directory after processing?
 
LVL 26

Expert Comment

by:skullnobrains
ID: 39187587
I assume the files are small and get fully written as soon as they are created.

Try something along these lines:

LOGDIR=/var/log/whatever  # EDIT THIS
day=`date '+%Y%m%d'`
test -d "$LOGDIR/processed" || mkdir "$LOGDIR/processed" || exit
cd "$LOGDIR" || exit
for file in *.txt
do
  [ -f "$file" ] || continue                                  # glob matched nothing
  case "$file" in consolidated*) echo "ignoring:$file"; continue;; esac
  echo "" >> "consolidated.$day.log" || break
  echo "FILE:$file" >> "consolidated.$day.log" || break
  cat "$file" >> "consolidated.$day.log" || break
  mv -v "$file" processed/
done
# +7 = older than a week (the weekly purge); -7 would match the newer files instead
find "$LOGDIR/processed" "$LOGDIR"/consolidated.*.log -type f -mtime +7 -delete 2>/dev/null

You can add checks on the file names if the directory may contain other files; maybe use find + xargs if you process LOTS of files and performance is an issue, or perhaps run this in a loop as a daemon (you'd need better error handling in that case).
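
If you do go the find + xargs route, a rough sketch (untested; assumes GNU find/xargs/mv and the same LOGDIR/processed layout as above) could look like:

day=`date '+%Y%m%d'`
cd "$LOGDIR" || exit
find . -maxdepth 1 -name '*.txt' -print0 | xargs -0 -r cat >> "consolidated.$day.log" &&
find . -maxdepth 1 -name '*.txt' -print0 | xargs -0 -r mv -t processed/

Note that a file arriving between the two passes would be moved without having been appended, so the loop version above is safer if alerts can show up at any moment.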
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187627
Taking into account that the summary file should contain a timestamp per appended log and that this timestamp should stem from the file's "creation" date:

#!/bin/bash
IDIR=/var/log/alertlogs ## <- please customize!
CDIR=$IDIR/summary
BDIR=$IDIR/backup
mkdir -p "$CDIR" "$BDIR"
find "$CDIR" "$BDIR" -type f -mtime +7 | xargs -r rm
for file in "$IDIR"/*.txt
 do
  [[ -f $file ]] || continue                                 # no .txt files present
  TS=$(stat -c "%y" "$file" | awk -F"[.]" '{print $1}')      # mtime as "YYYY-mm-dd HH:MM:SS"
  DOW=$(date -d "@$(stat -c '%Y' "$file")" "+%A")            # mtime day of week, e.g. "Sunday"
  [[ ! -d $CDIR/$DOW ]] && mkdir -p "$CDIR/$DOW"
  echo "---" >> "$CDIR/$DOW/collection"
  echo "$TS" >> "$CDIR/$DOW/collection"
  echo "---" >> "$CDIR/$DOW/collection"
  echo $TS
  cat "$file" >> "$CDIR/$DOW/collection"
  [[ ! -d $BDIR/$DOW ]] && mkdir -p "$BDIR/$DOW"
  mv "$file" "$BDIR/$DOW"
 done
 

Author Comment

by:apunkabollywood
ID: 39187630
Thanks Skull - I will check and let you know.



@wool - no, they don't have to stay there; we can move them to a backup directory.
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187649
Run my script regularly via cron - and remove the single "echo $TS" - it's a remnant from testing.
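
For example, a crontab entry along these lines would run it every 15 minutes (the script path below is just a placeholder):

*/15 * * * * /usr/local/bin/collect_alerts.sh

Change */15 to */5 if you want the 5-minute interval mentioned earlier.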
 

Author Comment

by:apunkabollywood
ID: 39189918
@wool - I just need a brief description of what this script will do:

- Will it only create a single file from all the alert files, in a separate directory, with a date stamp?
- Or will it generate a single file on a daily basis for a week, then delete the files after a week and restart the same process?

Please advise! I'm confused.
 
LVL 68

Accepted Solution

by:
woolmilkporc earned 500 total points
ID: 39190057
My script takes all files in (example!) "/var/log/alertlogs/" ending in ".txt" and, by means of "stat", extracts each file's modification time in the form "YYYY-mm-dd HH:MM:SS" and its modification day of the week in the form "Sunday", "Monday" etc.

If not already present, it then creates subdirectories of "/var/log/alertlogs/" like "/var/log/alertlogs/summary/Sunday/", "/var/log/alertlogs/summary/Monday/" etc.

Depending on the file's modification day of the week, the content of the respective file (preceded by a header containing the modification date in the form described above) is appended to a file named "collection" in the matching subdirectory, e.g. "/var/log/alertlogs/summary/Sunday/collection".

Once this processing is complete, the file gets moved to a subdirectory of "/var/log/alertlogs/" named "/var/log/alertlogs/backup/Sunday/" etc., analogous to the "summary" directories above.
These directories also get created if they don't exist.

Before the process described above starts, files older than 7 days get removed from the directories "/var/log/alertlogs/backup/Sunday/", "/var/log/alertlogs/backup/Monday/" etc. and from "/var/log/alertlogs/summary/Sunday/", "/var/log/alertlogs/summary/Monday/" etc.

So once the script has run you'll have an empty directory "/var/log/alertlogs", directories "/var/log/alertlogs/backup/Sunday/", "/var/log/alertlogs/backup/Monday/" etc. containing the processed files, and subdirectories "/var/log/alertlogs/summary/Sunday/", "/var/log/alertlogs/summary/Monday/" etc., each containing a file named "collection" which in turn holds the concatenated files created on that particular day of the week.

The directory "/var/log/alertlogs" is then waiting for new files to arrive and be processed by the script.

To repeat: the "collection" files in the "day of week" subdirectories get populated based on the modification date of the file just processed. Neither the current date and time nor the time stamps contained in the file names have any influence, so you can run the script any time you like and as often as you like. The more often you run it, the more up to date the content of the collection files will be.
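
For illustration - the files from your question were all modified on 2013-05-22, which was a Wednesday, so after one run you'd end up with something like:

/var/log/alertlogs/                      (empty again, waiting for new alerts)
/var/log/alertlogs/summary/Wednesday/collection
/var/log/alertlogs/backup/Wednesday/22754_created_2013_05_22_11_39_56.txt
/var/log/alertlogs/backup/Wednesday/22754_cancelled_2013_05_22_11_54_56.txt
...

and each entry in "collection" would look like:

---
2013-05-22 11:39:56
---
(content of 22754_created_2013_05_22_11_39_56.txt)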
 

Author Closing Comment

by:apunkabollywood
ID: 39204268
Thank you so much
