Finding existing occurrences in a file...

Posted on 2003-02-20
Medium Priority
Last Modified: 2012-05-04
Just some problem I've encountered...
I have a text file that contains entries like the following for multiple servers:

Server name: Longest downtime

Server a: 150
Server b: 20
Server c: 150
Server d: 75

I want to scan the file, and when servers share the same longest downtime, add a new field (average downtime) to those records, then write the edited lines back into the text file.

For example, taking server a and server c out so I can process them into something like:
Server a: 150 (85)
Server c: 150 (70)

The average is calculated via another subprocedure, so that's not the problem; it's the identification of the lines with the same "longest downtime" value that is posing some difficulty.

So can anyone show me how to find occurrences with the same downtime and extract them for editing, so I can append the average onto the end of the line?

Question by:Maldini
LVL 84

Expert Comment

ID: 7987772
while( <> ){
    next unless my ($downtime) = /:\s*(\d+)/;   # grab the downtime number after the colon
    push @{ $down{$downtime} }, $_;             # group each line under its downtime value
}
while( my ($downtime, $lines) = each %down ){
    next unless @{$lines} > 1;                  # only downtimes shared by more than one server
    for ( @{$lines} ){
        print average($_);                      # hand each duplicate line to the averaging routine
    }
}
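To spell out how the two loops cooperate, here is a self-contained version with sample data. The `average` subroutine below is only a stand-in for your own subprocedure, and the sample lines are hypothetical:

```perl
use strict;
use warnings;

# Stand-in for the asker's real averaging subprocedure (hypothetical).
sub average { my ($line) = @_; return 42 }

my @file = ( "Server a: 150\n", "Server b: 20\n",
             "Server c: 150\n", "Server d: 75\n" );

# Loop 1: group every line under its downtime number.
my %down;
for (@file) {
    next unless my ($downtime) = /:\s*(\d+)/;   # skip lines without a number
    push @{ $down{$downtime} }, $_;
}

# Loop 2: each() hands back one (downtime => list-of-lines) pair per
# call until the hash is exhausted, so this visits every group once.
while (my ($downtime, $lines) = each %down) {
    next unless @{$lines} > 1;                  # shared downtimes only
    for my $line (@{$lines}) {
        chomp(my $out = $line);
        print "$out (", average($line), ")\n";  # append the average in parens
    }
}
```

So the second while loop simply iterates over the groups the first loop built, skipping any downtime that only one server has.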

Author Comment

ID: 7990330
Could you please explain the second while loop? Since I'm new to perl and not too sure about how to use it..

Thx :)


Accepted Solution

amitabhrai earned 100 total points
ID: 7994257
$no_of_server = 0 ;
open ( SERVER , "server.txt" ) or die "Cannot read server.txt: $!";
while ( <SERVER> ) {
     $line = $_;
     chomp $line ;                       # chomp, not chop: only strips a trailing newline
     ( $ServerName , $longestDownTime ) = split ( /:/ , $line );
     $count_of_longestDownTime{ $longestDownTime }++;
     $Server_Count{$ServerName} = $longestDownTime ;
     $ServerNames[$no_of_server++] = $ServerName ;
}
close SERVER ;

open ( SERVER , ">server.txt" ) or die "Cannot write server.txt: $!";
for ( $i = 0 ; $i < $no_of_server ; $i++ ) {
     if ( $count_of_longestDownTime{$Server_Count{$ServerNames[$i]}} > 1 ) {
          print SERVER "$ServerNames[$i]:$Server_Count{$ServerNames[$i]}:" . getAverage($ServerNames[$i]) . "\n";
     }
     else {
          print SERVER "$ServerNames[$i]:$Server_Count{$ServerNames[$i]}\n" ;
     }
}
close SERVER ;

sub getAverage {
     ( $name ) = @_ ;
     # Dummy values; replace this with the real average calculation.
     if ( $name eq "server1" ) {
          return "45";
     }
     else {
          return "60";
     }
}

Change the subroutine "getAverage" to your own subroutine and
change the filename from "server.txt" to your file.
Hope this will work for you.
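For anyone reading this later, here is the same read-group-rewrite idea sketched in a slightly more modern style, with strict mode, lexical filehandles, three-arg open, and the "value (average)" output format the question asked for. The `getAverage` stub and the filename are placeholders for your own routine and file:

```perl
use strict;
use warnings;

# Placeholder: substitute your own average-calculation routine.
sub getAverage { my ($name) = @_; return 0 }

# Read the file, count how many servers share each downtime,
# then rewrite it, annotating only the shared downtimes.
sub annotate_duplicates {
    my ($file) = @_;

    open my $in, '<', $file or die "Cannot read $file: $!";
    my ( @names, %downtime_of, %count_of );
    while ( my $line = <$in> ) {
        chomp $line;
        my ( $name, $downtime ) = split /:\s*/, $line;
        push @names, $name;                 # remember original order
        $downtime_of{$name} = $downtime;
        $count_of{$downtime}++;             # how many servers share it
    }
    close $in;

    open my $out, '>', $file or die "Cannot write $file: $!";
    for my $name (@names) {
        my $dt = $downtime_of{$name};
        if ( $count_of{$dt} > 1 ) {
            print {$out} "$name: $dt (", getAverage($name), ")\n";
        }
        else {
            print {$out} "$name: $dt\n";
        }
    }
    close $out;
}
```

Since the whole file is slurped into memory before the rewrite, this is only suitable for files of modest size; for very large files you would write to a temporary file and rename it over the original instead.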
