Solved

Reading from a file - by passing the value

Posted on 2003-12-10
11
243 Views
Last Modified: 2010-03-04
Hi ,
I am writing data into a file. The file is pipe-delimited and each new record starts on a new line. The first value of each line is a unique key. I need to read this file and generate a report. Now my question is -

Once this file is ready, how can I read the data by passing that unique key?

Format of the file is like this ->
12345|Tom|John|Analyst|Software division|$30000|5 Years|contract expires on 2132|end
... (more records in the same format)

Now I will pass 12345 to look up the rest of the data, i.e. Tom, John, Analyst, etc., in the same order, which will be printed in another report.

Thanks in Advance !

Aaku
0
Comment
Question by:akku_batra
11 Comments
 
LVL 2

Expert Comment

by:PurplePerls
ID: 9916347
On Windows, do it like this:

type customer.dat | perl extract.pl 12346

Where the extract.pl is like this:

my $key = shift;
while (<>) {
  print unless !(/^$key/);
}


On Unix the command is similar:
cat customer.dat | perl extract.pl 12346 >> report.dat




0
 
LVL 2

Accepted Solution

by:
PurplePerls earned 25 total points
ID: 9916469
And if you'd like to remove the key and change the pipes to commas, then try this:

my $key = shift;
while (<>) {
  if(/^$key/){
    $_ =~ s/^\w+\|//;
    $_ =~ s/\|/\,/g;
    print;
  };
};



0
 
LVL 84

Assisted Solution

by:ozo
ozo earned 25 total points
ID: 9917396
Better to use
/^$key\|/
0
 
LVL 2

Expert Comment

by:PurplePerls
ID: 9918687
That's very true!
Worst case: if no key is entered, all records are returned.

Here is the corrected version:

my $key = shift;
while (<>) {
  if(/^$key\|/){
    $_ =~ s/^\w+\|//;
    $_ =~ s/\|/\,/g;
    print;
  };
};


Or without normalization:
my $key = shift;
while (<>) {
  print unless !(/^$key\|/);
}

0
 
LVL 18

Expert Comment

by:kandura
ID: 9924888
In that last bit I'd change the double negative from

   print unless !(/^$key\|/);

to

   print if /^$key\|/;

I never didn't like no double negatives ;^)
0
 
LVL 18

Expert Comment

by:kandura
ID: 9924945
Another note: if the file isn't too big (say, under 10MB), and you'd like to report on more than just a single line, then it may be easier to just slurp the whole file into a hash, where the keys are your unique first values, and the value is the rest of the line (possibly you'd even want the rest of the line split into an array as well, which I'll show too):

#!/usr/bin/perl -w

my $filename = 'data.txt';
my %lines = ();

open F, $filename or die $!;
while(<F>)
{
  chomp;                                    # remove newline
  my ($key, $data) = split /\|/, $_, 2;     # split on | into two parts
  $lines{$key} = $data;

  ### alternatively, store an array ref to the split fields
  ### (use this *instead of* the two lines above, not together with them)
  # my ($key, @flds) = split /\|/, $_;
  # $lines{$key} = \@flds;
}
close F;

### now use %lines
my @somekeys = qw( 12345 12346 );   # some random keys ;^)
foreach(@somekeys)
{
    print $lines{$_}, "\n";

    ### alternatively, if you stored array refs
    # print join(', ', @{$lines{$_}}), "\n";
}

0
 
LVL 84

Expert Comment

by:ozo
ID: 9925408
#Another way to use a hash, without using up space on large files, is:
my $filename = 'data.txt';
my @somekeys = qw( 12345 12346 );   # some random keys ;^)
my %hash;
@hash{@somekeys} = (1) x @somekeys;

open F, $filename or die $!;
while( <F> ){
  my ($key, $data) = split /\|/, $_, 2;     # split on | into two parts
  print if $hash{$key};
}
close F;
 
LVL 48

Expert Comment

by:Tintin
ID: 9926629
PurplePerls.

No need to fork additional processes:

cat customer.dat | perl extract.pl 12346 >> report.dat

is better written as

perl extract.pl 12346 <customer.dat >> report.dat

No UUOC (Useless Use Of Cat).
0
 
LVL 2

Expert Comment

by:PurplePerls
ID: 9927221
Thanks.
Now I learned one more jargon acronym :)
0
 
LVL 20

Expert Comment

by:jmcg
ID: 10218803
Nothing has happened on this question in more than 6 weeks. It's time for cleanup!

My recommendation, which I will post in the Cleanup topic area, is to
split points between PurplePerls and ozo.

Please leave any comments here within the next seven days.

PLEASE DO NOT ACCEPT THIS COMMENT AS AN ANSWER!

jmcg
EE Cleanup Volunteer
0