
Solved

Need help to create a script

Posted on 2013-05-22
Medium Priority
541 Views
Last Modified: 2013-05-29
Hi All,

I need to create a shell script that appends the contents of all files under the directory below to a single file.

The issue is that these files are created only when an alert comes in, so the script must first check the directory for new alert files and then cat their contents onto that single file.

I also want to purge the file every week.

Please help


tmp# ls

22754_cancelled_2013_05_22_11_54_56.txt
22754_created_2013_05_22_11_39_56.txt
22755_cancelled_2013_05_22_11_54_56.txt
22755_created_2013_05_22_11_39_56.txt
22756_cancelled_2013_05_22_12_24_56.txt
22756_created_2013_05_22_11_44_56.txt
22757_cancelled_2013_05_22_12_24_56.txt
22757_created_2013_05_22_11_49_56.txt
22758_cancelled_2013_05_22_12_19_56.txt
22758_created_2013_05_22_11_54_56.txt
22759_cancelled_2013_05_22_12_19_56.txt
22759_created_2013_05_22_11_54_56.txt
22760_cancelled_2013_05_22_12_24_56.txt
22760_created_2013_05_22_11_59_56.txt
22761_cancelled_2013_05_22_12_39_56.txt
22761_created_2013_05_22_12_24_56.txt
22762_cancelled_2013_05_22_12_39_56.txt
22762_created_2013_05_22_12_29_56.txt
22763_cancelled_2013_05_22_13_19_56.txt
22763_created_2013_05_22_13_04_56.txt
Question by:apunkabollywood
13 Comments
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187188
Assuming that you don't want to keep the alert files after appending their content to the target file:

cd /directory/with/files
if ls  *.txt >/dev/null 2>&1; then cat *.txt >> /path/to/target/file && rm *.txt; fi
 
LVL 27

Expert Comment

by:skullnobrains
ID: 39187190
can you use a separate directory to store processed files, just to make it easier?

do you need the resulting log to be ordered?

can you assume that a new log file is fully written within some fixed period of time after it is created, or that it won't be written to any more once it reaches some defined size?

can you elaborate on those logs and how they rotate?

do you need the main log to be updated in real time? what delay would be acceptable?
 

Author Comment

by:apunkabollywood
ID: 39187270
Hi, here is a brief summary:

1. These files are created by alerts, which can come in at any time during the day.
2. I want to run my script via crontab every 15 minutes.
3. I just want to append the contents of all files to a single file and keep doing so for a day or a week, with a date and time stamp.
4. While appending the content, if possible it should be appended with the date and time.
5. I don't need this file after a week, so I want it removed weekly and the same process to continue.


I have no problem copying all these files to another folder to work with them for this task.
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187296
Time stamp - One time stamp per appended file or per line? Using the current time or the file creation time or the time contained in the file name?

Did you post the actual file name format?

Do you want to keep the single alert files, or do you want them removed/moved?
 

Author Comment

by:apunkabollywood
ID: 39187399
- Yes, one time stamp per appended file, using the file creation time.

- Yes, that is the actual file name format.

- I want to keep the single alert files - if they need to be removed, we could copy them to a different folder and use them from there.

- The problem is that I don't want to miss any alert, and I want to check for alert files every 5 minutes. A new alert can also come every 5 minutes, so what can we do so that only the content of new alert files gets appended to that single log file? (a rough sketch of one possible approach is below)

- I need one single file for every day in a different folder, and after a week delete all the files and restart the same process.
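For illustration only, one common way to pick up just the new files (all paths and names below are examples, and GNU find/date are assumed): a marker file remembers when the last run happened, and find -newer selects only files created after it:

#!/bin/sh
ALERTDIR=/tmp/alerts                      # example: directory the alert files are written to
OUT=/var/log/alerts/$(date +%Y%m%d).log   # example: one consolidated file per day
MARKER=$ALERTDIR/.last_run
NEWMARK=$ALERTDIR/.this_run

touch "$NEWMARK"                          # remember when this run started, before scanning
if [ -f "$MARKER" ]; then
    FILES=$(find "$ALERTDIR" -maxdepth 1 -name '*.txt' -newer "$MARKER")
else
    FILES=$(find "$ALERTDIR" -maxdepth 1 -name '*.txt')
fi

for f in $FILES; do
    # prefix each appended file with its modification time
    echo "=== $f  $(date -r "$f" '+%Y-%m-%d %H:%M:%S') ===" >> "$OUT"
    cat "$f" >> "$OUT"
done
mv "$NEWMARK" "$MARKER"                   # next run only looks at files newer than this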
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187585
>> I want to keep the single alert files  <<

Do they have to stay in their original directory or can we move them to a backup directory after processing?
 
LVL 27

Expert Comment

by:skullnobrains
ID: 39187587
i assume the files are small and are fully written as soon as they are created

try something along these lines

LOGDIR=/var/log/whatever  # EDIT THIS
day=`date '+%Y%m%d'`
test -d "$LOGDIR/processed" || mkdir "$LOGDIR/processed" || exit
cd "$LOGDIR" || exit
for file in *.txt
do
  test -f "$file" || continue
  # skip the consolidated logs themselves, should they ever match
  expr "$file" : '^consolidated' >/dev/null && echo "ignoring:$file" && continue
  echo ""           | tee -a "$LOGDIR/consolidated.$day.log" >/dev/null || break
  echo "FILE:$file" | tee -a "$LOGDIR/consolidated.$day.log" >/dev/null || break
  cat "$file"       | tee -a "$LOGDIR/consolidated.$day.log" >/dev/null || break
  mv -v "$file" "$LOGDIR/processed/"
done
# purge processed alert files and consolidated logs older than a week
find "$LOGDIR/processed" "$LOGDIR"/consolidated.*.log -type f -mtime +7 -delete 2>/dev/null

you can add checks for the filenames if the directory may contain other files, maybe use find + xargs if you process LOTS of files and performance is an issue (see the sketch below), perhaps run this in a loop as a daemon (you'd need better error handling in that case)
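for illustration only, a rough sketch of the find + xargs variant (assuming GNU findutils and that all the alert files end in ".txt"); taking the file list as a snapshot first keeps the append and the move working on the same set of files:

#!/bin/sh
LOGDIR=/var/log/whatever        # EDIT THIS
day=`date '+%Y%m%d'`
mkdir -p "$LOGDIR/processed" || exit

# snapshot of the current alert files
LIST=`find "$LOGDIR" -maxdepth 1 -name '*.txt' -type f`
test -n "$LIST" || exit 0

# append the whole batch in one pass, then move it out of the way
echo "$LIST" | xargs cat >> "$LOGDIR/consolidated.$day.log" &&
echo "$LIST" | xargs -I{} mv {} "$LOGDIR/processed/"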
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187627
Taking into account that the summary file should contain a timestamp per appended log and that this timestamp should stem from the file's "creation" date:

#!/bin/bash
IDIR=/var/log/alertlogs ## <- please customize!
CDIR=$IDIR/summary
BDIR=$IDIR/backup
# purge summaries and backups older than 7 days
find $CDIR $BDIR -type f -mtime +7 2>/dev/null | xargs -r rm
for file in $(ls $IDIR/*.txt 2>/dev/null)
 do
  # modification time ("YYYY-mm-dd HH:MM:SS") and day of week of the alert file
  TS=$(stat -c "%y" $file | awk -F"[.]" '{print $1}')
  DOW=$(date -d "01/01/1970 + $(stat -c "%Y" $file) seconds" "+%A")
  [[ ! -d $CDIR/$DOW ]] && mkdir -p $CDIR/$DOW
  echo "---" >> $CDIR/$DOW/collection
  echo "$TS" >> $CDIR/$DOW/collection
  echo "---" >> $CDIR/$DOW/collection
  echo $TS
  cat $file >> $CDIR/$DOW/collection
  # move the processed alert file to the backup directory for its day of week
  [[ ! -d $BDIR/$DOW ]] && mkdir -p $BDIR/$DOW
  mv $file $BDIR/$DOW
 done
 

Author Comment

by:apunkabollywood
ID: 39187630
Thanks Skull - I will check and let you know.



@wool - no, we can move them to a backup directory if we want.
 
LVL 68

Expert Comment

by:woolmilkporc
ID: 39187649
Run my script regularly via cron - and remove the single "echo $TS" - it's a remnant from testing.
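
For illustration, a crontab entry along these lines would run it every 15 minutes as you planned (the script path is only an example - point it at wherever you saved the script):

# run the alert collection script every 15 minutes (example path)
*/15 * * * * /usr/local/bin/collect_alerts.sh >/dev/null 2>&1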
 

Author Comment

by:apunkabollywood
ID: 39189918
@wool - I just need a brief description of what this script will do:

- Will it only create a single file of all the alert files in a separate directory, with a date stamp?
- Or will it generate a single file on a daily basis for a week, then delete the file after a week and restart the same process?

Please advise! I'm confused.
 
LVL 68

Accepted Solution

by:
woolmilkporc earned 2000 total points
ID: 39190057
My script takes all files in (example!) "/var/log/alertlogs/" ending in ".txt" and extracts the modification time of each file in the form "YYYY-mm-dd HH:MM:SS" and the modification day of week of each file in the form "Sunday", "Monday" etc. by means of "stat".

If they are not yet present, it then creates subdirectories of "/var/log/alertlogs/" like "/var/log/alertlogs/summary/Sunday/", "/var/log/alertlogs/summary/Monday/" etc.

Depending on the file's modification day of week the content of the respective file (including a header containing the modification date in the form described above) is appended to a file named "collection" in the matching subdirectory, like "/var/log/alertlogs/summary/Sunday/collection" etc.

Once this processing is complete the file gets moved to a subdirectory of "/var/log/alertlogs/" named "/var/log/alertlogs/backup/Sunday" etc. like the above "summary" directories.
These directories also get created if they don't exist.

Before the process described above starts files older than 7 days get removed from the directories "/var/log/alertlogs/backup/Sunday/", "/var/log/alertlogs/backup/Monday/" etc. and from "/var/log/alertlogs/summary/Sunday/",  "/var/log/alertlogs/summary/Monday/" etc.

So once the script has run you'll have an empty directory "/var/log/alertlogs", directories called "/var/log/alertlogs/backup/Sunday/", "/var/log/alertlogs/backup/Monday/" etc. containing the processed files, and there will be the subdirectories "/var/log/alertlogs/summary/Sunday/", "/var/log/alertlogs/summary/Monday/" etc., each of them containing a file named "collection" which in turn contains the concatenated files created on that particular day of the week.

The directory "/var/log/alertlogs" is now waiting for new files to arrive and to be processed by the script.

To repeat it: The "collection" files in the "day of week" subdirectories get populated depending on the modification date of the file just processed. The current time and date, as well as the time stamps contained in the file names, have no influence at all, so you can run the script any time you like and as often as you like. The more often you run it, the more up-to-date the content of the collection files will be.
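
As an illustration only (the paths and file names are examples built from the sample listing above, not actual output), the layout after a run might look like this:

/var/log/alertlogs/                                    <- empty, waiting for new alert files
/var/log/alertlogs/summary/Wednesday/collection        <- concatenated alerts from that Wednesday
/var/log/alertlogs/backup/Wednesday/22754_created_2013_05_22_11_39_56.txt
/var/log/alertlogs/backup/Wednesday/22754_cancelled_2013_05_22_11_54_56.txt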
 

Author Closing Comment

by:apunkabollywood
ID: 39204268
Thank you so much
