Command to see a particular column value in a flat file

Hi,

I need to find the 50th column value in a flat file. The file has a header line, and the fields are separated by ||. Could you please let me know which command can do this?
sumanth_ora asked:
 
Tintin commented:
fim32

The separator is || not |, and the field separator for awk/cut can only be a single character (AFAIK).  Also note that you need to quote the pipe symbol.

A workaround is:

sed "s/||/|/g" flatfile | awk -F'|' '{print $50}'
 
fim32 commented:
you want to find the value on that line?

well, you can:
awk -F| '{print $50}'

where | is the separator
 
gheist commented:
This is the kind of work cut can do:

cut -b 1-10 <onefile > anotherfile
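(Note that -b selects byte positions rather than ||-separated fields, so this only helps if the records are fixed-width.)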


 
fim32 commented:
ah, I wasn't sure if that was a typo or a separate odd character...

technically tho, even if || were the separator (and good point on the quoting), you could still just use awk (every other field would be empty), but I think the limit on awk fields is 99?
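To spell that out: splitting on a single | makes every second field empty, so field N of the || split becomes field 2N-1, and the 50th column ends up in $99. A sketch, assuming your awk allows that many fields and the records contain no stray single pipes:

awk -F'|' '{print $99}' flatfile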
 
Alf666 commented:
This might not work. There's a chance that your file contains pipe signs (|) inside the records (hence the double pipe).

This means that sometimes, you will not get the proper field.

The following will work for sure.

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[50] . "\n";'
 
Alf666 commented:
Err, I forgot the file name:

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[50] . "\n";' flatfile
 
Tintin commented:
If using Perl, we can make that a little shorter :-)

perl -F'\|\|' -lane 'print $F[49]' flatfile
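(For reference: -n loops over the input lines, -F/-a autosplit each line into @F on the given pattern, and -l strips the input newline and adds one back on print; @F is zero-indexed, hence $F[49] for the 50th field.)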
 
ahoffmann commented:
d'oh, we see a lot of experts testing on Linux instead of Unix (see TA :-))

>  but i think the limit on awk fields is 99
no, the limit is 9
only gawk, and probably nawk or mawk, can have more

KISS as Tintin ;-)
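If gawk is available, it treats a multi-character FS as an extended regular expression, so something along these lines should also work (the bracket expressions just sidestep quoting the pipes):

gawk -F'[|][|]' '{print $50}' flatfile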
 
Alf666 commented:
BTW, thanks Tintin, I had a small typo:

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[49] . "\n";' flatfile

But your solution is simpler.
 
Gns commented:
Ehm, guys... Testing on an AIX 5.2 awk... The file aa is as above, containing lines like "a||a||a||a||...b||c||" with the b in column 50...:
(first, the "non-working" case of a single | delimiter)
>awk -F'|' '{print $100}' aa


>awk -F'|' '{print $101}' aa  
c
c
>awk -F'|' '{print $99}' aa
b
b
(And now, the interesting case, where the "character" is the "||" _string_)
>awk -F'||' '{print $49}' aa
a
a
>awk -F'||' '{print $50}' aa
b
b
>
The first case (single "|" delimiter) was exactly repeatable on my (retired) DG/UX machines, but not the latter ("||" delimiter). On Solaris 2.6, awk complains "too many fields in line ...", while the 2.6 nawk bugs out over the delimiter (illegal primary in RE || at |)...
... Conclusions: depending on your awk implementation, you might not have the arbitrary limits preventing the simple awk from working.
The nice perl scriptlets above will (of course) do the job.

-- Glenn
 
bira commented:
"I need to find 50th column value in the flatfile"

Suppose flatfile has:
abcdefghijklnmopkrstuvxabcdedfghijklnmopqrstuvxzxO||abcdefghijk

  Run as below:

 cat flatfile|cut -f1 -d'||'|rev|cut -c1-1
 
bira commented:
Or this way, which doesn't depend on separators:

 cat flatfile|cut -c50-50|rev|cut -c1-1
 
ahoffmann commented:
bira, which cut allows a string as the separator for -d?
also, nice interpretation of the question ;-) which is what happens when people are too lazy to explain ..
 
Tintin commented:
sumanth_ora.

Could you please provide some feedback and possibly close this question.
 
Gns commented:
:-) Achim and bira... And CC to Achim's question... When I tried, I of course tried all the OSes at hand with a cut solution too... None seem to work with a string delimiter. I might add that the AIX awk manpage does indeed explain FS values as either being blank, a single char, or an extended regular expression (which explains why it works ;).

-- Glenn