Solved

Unix/Linux Help for the below command

Posted on 2011-09-30
457 Views
Last Modified: 2013-11-22
Hi,

Can anyone please let me know how the below command works.

awk  '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' HD_sync.log.10|sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p'|sort |uniq
Question by:new_perl_user
9 Comments
 
LVL 37

Expert Comment

by:Gerwin Jansen
ID: 36891941
Hi, it is a 'one liner' that processes the file HD_sync.log.10 with awk and sed and then sorts the output, leaving only the unique entries.

Here's a description of what is happening. More details can be given if you can post a sample HD_sync.log file.

awk '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}'
1. Perform this awk command - for lines that match the pattern STARTED or FINISHED - check whether the pattern /IMAGES[/]$/ is on that line; if so, output the 2nd field of that line

HD_sync.log.10
2. on this file (your input file)

|sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p'
3. search the beginning (^) of each line for the pattern [^/]*[/]\(IMD.*\)\/IMAGES\/, replace it with the captured IMD... part (\1) and print the result

|sort
4. sort the output

|uniq
5. keep only unique lines (uniq drops adjacent duplicates, which is why the output is sorted first)

Is this enough explanation for you? The above is really not 'a command' but a series of commands on the input file. You can view the intermediate output by running these steps separately:

awk  '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' HD_sync.log.10
awk  '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' HD_sync.log.10|sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p'
awk  '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' HD_sync.log.10|sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p'|sort
awk  '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' HD_sync.log.10|sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p'|sort |uniq
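
To make the steps concrete, here is a made-up fragment of what HD_sync.log.10 might look like - the real layout may differ, and it is only an assumption that the path is the 2nd field of the IMAGES/ lines:

Mon Sep 26 05:10:05 SYNC STARTED
copying data01/IMD_CUSTA/IMAGES/
copying data01/IMD_CUSTB/IMAGES/
copying data01/IMD_CUSTA/IMAGES/
Mon Sep 26 06:10:05 SYNC FINISHED

On input like that, the awk step prints the 2nd field of each IMAGES/ line (data01/IMD_CUSTA/IMAGES/ and so on), the sed step keeps only the captured IMD part (IMD_CUSTA, IMD_CUSTB, IMD_CUSTA), and sort | uniq reduces that to:

IMD_CUSTA
IMD_CUSTB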
 

Author Comment

by:new_perl_user
ID: 36892020
Hi,
Thank you so much for the above explanation, it is crystal clear. I also need some small help: how can we tweak the above series of commands to grab data based on the date (it should get yesterday's data from the file) and write it to a log file?
 
LVL 38

Expert Comment

by:wesly_chen
ID: 36892663
> awk '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}'
> 1. Perform this awk command - for lines that match the pattern STARTED or FINISHED
No.
In HD_sync.log.10, search the content between "STARTED" and "FINISHED". If any line contains "IMAGES/" at the end of the line, then print the second field of that line.
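
As a minimal illustration of the range pattern (with made-up input), you can try:

printf 'before\nSTARTED\ninside 1\ninside 2\nFINISHED\nafter\n' | awk '/STARTED/,/FINISHED/'

This prints only the lines from the STARTED line through the FINISHED line (both included); the lines before and after that range are ignored.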

> grab data based on the date (it should get yesterday
You need to provide the date format and the location of the date in a line, like
2011/09/29 or
2011-09-29
 

Author Comment

by:new_perl_user
ID: 36892746
I can't hardcode the date, so is there a way to tell it to look for data from sysdate -1?
 
LVL 38

Expert Comment

by:wesly_chen
ID: 36892807
> can't hardcode the date.
You just need to look at HD_sync.log.10 and post the date/time format used in that file (more than one example, for comparison).
 

Author Comment

by:new_perl_user
ID: 36892847
The date format in the file is 'Mon Sep 26 05:10:05', and this log file gets data every hour, 365 days a year. So every day I need to run the above command through a script and grab the data for yesterday. Will that be possible? If so, can you please let me know.

Thanks,
 
LVL 37

Expert Comment

by:Gerwin Jansen
ID: 36893286
@wesly_chen
>> search the content between "STARTED" and "FINISHED"
You are absolutely correct, sorry about that.

About adding the date filter: I would add this:

grep "`date -d 'yesterday' +"%a %b %d"`"

then add the (somewhat modified) original line to this:

grep "`date -d 'yesterday' +"%a %b %d"`" HD_sync.log.10 | awk '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' | sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p' | sort | uniq

The 'grep' line will filter the lines matching 'yesterday' in this format:
Mon Sep 26
 

Author Comment

by:new_perl_user
ID: 36893336
Oh, my bad. By yesterday I meant that it should get the previous day's data, not grep for the literal word 'yesterday'. It has to grep only for the previous day's date.

For example, if I am executing the command tomorrow, it has to read the file and get today's data as output.
 
LVL 37

Accepted Solution

by:Gerwin Jansen (earned 500 total points)
ID: 36893444
The grep command is using a date command that gets you yesterday's day of week, month and day of month.

You can try this separate command if you like:

date -d 'yesterday' +"%a %b %d"

If you run this today (Fri Sep 30), it will give you Thu Sep 29 as output. Just try the above command.
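
If you then want a small daily script that writes the result to a log file (as you asked earlier), a sketch along these lines should work - the output file name is only an example, and it assumes GNU date as used above:

#!/bin/sh
# Sketch only: adjust file names/paths to your environment.
YDAY=`date -d 'yesterday' +"%a %b %d"`    # e.g. "Thu Sep 29"
grep "$YDAY" HD_sync.log.10 | \
  awk '/STARTED/,/FINISHED/{if($0~/IMAGES[/]$/){print $2}}' | \
  sed -ne 's/^[^/]*[/]\(IMD.*\)\/IMAGES\//\1/p' | \
  sort | uniq > images_`date -d 'yesterday' +"%Y%m%d"`.log

Run that from cron once a day; each run creates a file like images_20110929.log with the unique IMD entries from the previous day's lines.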