Solved

Command to see a particular column value in a flat file

Posted on 2004-03-29
15
375 Views
Last Modified: 2010-05-18
Hi,

I need to find the 50th column value in a flat file. The flat file has a header, and the fields are separated by ||. Could you please let me know which command can do this?
0
Question by:sumanth_ora
15 Comments
 
LVL 7

Expert Comment

by:fim32
ID: 10706640
you want to find the value on that line?

well, you can:
awk -F| '{print $50}'

where | is the separator
0
 
LVL 48

Accepted Solution

by:
Tintin earned 500 total points
ID: 10707954
fim32

The separator is || not |, and the field separator for awk/cut can only be a single character (AFAIK).  Also note that you need to quote the pipe symbol.

A workaround is:

sed "s/||/|/g" flatfile | awk -F'|' '{print $50}'
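
For illustration, a minimal sketch on a hypothetical three-field line (fifty fields works the same way):

echo 'a||b||c' | sed "s/||/|/g"
# a|b|c
echo 'a||b||c' | sed "s/||/|/g" | awk -F'|' '{print $3}'
# c   -- with the real file you'd print $50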
0
 
LVL 62

Expert Comment

by:gheist
ID: 10708160
cut can kind of work for this:

cut -b 1-10 <onefile > anotherfile

0
 
LVL 7

Expert Comment

by:fim32
ID: 10708180
ah, I wasn't sure if that was a typo or a separate odd character...

technically though, even if || were the separator (and good point on the quoting), you could still just use awk with a single | as the field separator (every other field would simply be empty), but I think the limit on awk fields is 99?
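
A quick sketch of that with a hypothetical three-field line (assuming an awk that allows enough fields):

echo 'a||b||c' | awk -F'|' '{print NF}'
# 5 -- the fields are a, "", b, "", c
echo 'a||b||c' | awk -F'|' '{print $5}'
# c -- the Nth value lands in field 2*N-1, so the 50th would be $99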
0
 
LVL 9

Expert Comment

by:Alf666
ID: 10708273
This might not work. There's a chance that your file contains pipe signs (|) inside the records (hence the double pipe).

This means that sometimes, you will not get the proper field.

The following will work for sure.

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[50] . "\n";'
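
A sketch with a hypothetical record containing a stray single | inside the first field shows the difference:

echo 'a|x||b||c' | sed "s/||/|/g" | awk -F'|' '{print $2}'
# x -- the embedded | shifts the fields
echo 'a|x||b||c' | perl -ne 'chomp; @arr=split(/\|\|/); print $arr[1] . "\n";'
# b -- splitting on the literal || string keeps the record intact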
0
 
LVL 9

Expert Comment

by:Alf666
ID: 10708282
Err. Forgot the file name :

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[50] . "\n";' flatfile
0
 
LVL 48

Expert Comment

by:Tintin
ID: 10709336
If using Perl, we can make that a little shorter :-)

perl -F'\|\|' -lane 'print $F[49]' flatfile
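
Roughly what the switches do (my reading, as a sketch; flatfile is the input file from the question):

# -n        loop over the input lines
# -l        chomp each line and add "\n" back on print
# -a        autosplit each line into the @F array
# -F'\|\|'  split on the literal string "||" (escaped, since -F takes a regex)
# $F[49]    the 50th field, since @F is zero-based
perl -F'\|\|' -lane 'print $F[49]' flatfile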
0
 
LVL 51

Expert Comment

by:ahoffmann
ID: 10713068
d'oh, we see a lot of experts testing on Linux instead of Unix (see the TA :-))

>  but i think the limit on awk fields is 99
no, the limit is 9
only gawk, and probably nawk or mawk, can handle more

KISS, as Tintin did ;-)
0
 
LVL 9

Expert Comment

by:Alf666
ID: 10713647
BTW, thanks Tintin, I had a small typo :

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[49] . "\n";' flatfile

But your solution is simpler.
0
 
LVL 20

Expert Comment

by:Gns
ID: 10721961
Ehm, guys... Testing on an AIX 5.2 awk... The file aa is as above, containing lines like "a||a||a||a||...b||c||" with the b in column 50...:
(first the "non-working" case of a single | delimiter)
>awk -F'|' '{print $100}' aa


>awk -F'|' '{print $101}' aa  
c
c
>awk -F'|' '{print $99}' aa
b
b
(And now, the interesting case, where the "character" is the "||" _string_)
>awk -F'||' '{print $49}' aa
a
a
>awk -F'||' '{print $50}' aa
b
b
>
The first case (single "|" delimiter) was exactly repeatable on my (retired) DG/UX machines, but not the latter ("||" delimiter). On Solaris 2.6, awk complains with "too many fields in line ...", while the 2.6 nawk bugs out over the delimiter (illegal primary in RE || at |)...
... Conclusion: depending on your awk implementation, you might not hit the arbitrary limits that keep the simple awk approach from working.
The nice perl scriptlets above will (of course) do the job.

-- Glenn
0
 
LVL 6

Expert Comment

by:bira
ID: 10723735
"I need to find 50th column value in the flatfile"

Suppose flatfile has:
abcdefghijklnmopkrstuvxabcdedfghijklnmopqrstuvxzxO||abcdefghijk

  Run as below:

 cat flatfile|cut -f1 -d'||'|rev|cut -c1-1
0
 
LVL 6

Expert Comment

by:bira
ID: 10723865
Or this way, which doesn't depend on separators:

 cat flatfile|cut -c50-50|rev|cut -c1-1
0
 
LVL 51

Expert Comment

by:ahoffmann
ID: 10726374
bira, which cut allows a string as the separator for -d?
Also a nice interpretation of the question ;-) which is what happens when people are too lazy to explain ..
0
 
LVL 48

Expert Comment

by:Tintin
ID: 10727775
sumanth_ora.

Could you please provide some feedback and possibly close this question.
0
 
LVL 20

Expert Comment

by:Gns
ID: 10730758
:-) Achim and bira... And CC to Achim's question... When I tried, I of course tried all the OSes at hand with a cut solution too... None seem to work with a string delimiter. I might add that the AIX awk manpage does indeed describe FS values as being either blank, a single char, or an extended regular expression (which explains why it works ;).

-- Glenn
0
