Solved

Command to see a particular column value in a flat file

Posted on 2004-03-29
360 Views
Last Modified: 2010-05-18
Hi ,

I need to find the 50th column value in a flat file. The flat file has a header, and the fields are separated by ||. Could you please let me know which command can do this?
Question by:sumanth_ora
15 Comments
 
LVL 7

Expert Comment

by:fim32
ID: 10706640
you want to find the value on that line?

well, you can:
awk -F| '{print $50}'

where | is the separator
 
LVL 48

Accepted Solution

by:
Tintin earned 500 total points
ID: 10707954
fim32

The separator is || not |, and the field separator for awk/cut can only be a single character (AFAIK).  Also note that you need to quote the pipe symbol.

A workaround is:

sed "s/||/|/g" flatfile | awk -F'|' '{print $50}'
 
LVL 61

Expert Comment

by:gheist
ID: 10708160
Kind of works for cut:

cut -b 1-10 < onefile > anotherfile

 
LVL 7

Expert Comment

by:fim32
ID: 10708180
ah, i wasn't sure if that was a typo or a separate odd character...

technically tho, even if || were the separator (and good point on the quoting), you could still just use awk (every other field would be empty), but i think the limit on awk fields is 99?
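
To illustrate on a tiny made-up line: with a single | as FS, the values land in the odd-numbered fields (every even field is empty), so the Nth value is field 2N-1 and the 50th would be $99:

echo 'a||b||c' | awk -F'|' '{print $3}'

prints "b" (the fields are a, "", b, "", c), so, assuming your awk allows that many fields:

awk -F'|' '{print $99}' flatfile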
 
LVL 9

Expert Comment

by:Alf666
ID: 10708273
This might not work. There's a chance that your file contains pipe signs (|) inside the records (hence the double pipe).

This means that sometimes, you will not get the proper field.

The following will work for sure.

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[50] . "\n";'
 
LVL 9

Expert Comment

by:Alf666
ID: 10708282
Err. Forgot the file name:

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[50] . "\n";' flatfile
 
LVL 48

Expert Comment

by:Tintin
ID: 10709336
If using Perl, we can make that a little shorter :-)

perl -F'\|\|' -lane 'print $F[49]' flatfile
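
One way to verify the one-liner on generated sample data (the f01..f60 field names are just placeholders):

perl -e 'print join("||", "f01".."f60"), "\n"' > flatfile
perl -F'\|\|' -lane 'print $F[49]' flatfile

prints "f50": -F'\|\|' escapes the pipes so the split sees a literal ||, -a puts the fields into @F, and -l takes care of the newlines.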
 
LVL 51

Expert Comment

by:ahoffmann
ID: 10713068
d'oh, we see a lot of experts testing on Linux instead of Unix (see TA :-))

>  but i think the limit on awk fields is 99
no, the limit is 9
only gawk, and probably nawk or mawk, can have more

KISS as Tintin ;-)
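
Easy enough to probe a given awk's limit directly (assuming GNU seq is available to build a 200-field test line):

seq -s'|' 200 | gawk -F'|' '{print NF, $200}'

gawk happily prints "200 200"; an awk with a low field limit will complain instead.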
 
LVL 9

Expert Comment

by:Alf666
ID: 10713647
BTW, thanks Tintin, I had a small typo:

perl -n -e 'chomp ; @arr=split(/\|\|/); print $arr[49] . "\n";' flatfile

But your solution is simpler.
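
The off-by-one is just Perl's 0-based array indexing; on a toy line:

perl -e '@a = split /\|\|/, "A||B||C"; print "$a[1]\n"'

prints "B", hence $arr[49] for the 50th field.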
 
LVL 20

Expert Comment

by:Gns
ID: 10721961
Ehm, guys... Testing on an AIX 5.2 awk... The file aa is as above, containing lines like "a||a||a||a||...b||c||" with the b in column 50...:
(first, the "non-working" case of a single | delimiter)
>awk -F'|' '{print $100}' aa


>awk -F'|' '{print $101}' aa  
c
c
>awk -F'|' '{print $99}' aa
b
b
(And now, the interesting case, where the "character" is the "||" _string_)
>awk -F'||' '{print $49}' aa
a
a
>awk -F'||' '{print $50}' aa
b
b
>
The first case (single "|" delimiter) was exactly repeatable on my (retired) DG/UX machines, but not the latter ("||" delimiter). On Solaris 2.6, awk complains with "too many fields in line ...", while the 2.6 nawk bugs out over the delimiter (illegal primary in RE || at |)...
... Conclusion: depending on your awk implementation, you might not have the arbitrary limits preventing the simple awk approach from working.
The nice perl scriptlets above will (of course) do the job.

-- Glenn
 
LVL 6

Expert Comment

by:bira
ID: 10723735
"I need to find 50th column value in the flatfile"

Suppose flatfile has:
abcdefghijklnmopkrstuvxabcdedfghijklnmopqrstuvxzxO||abcdefghijk

  Run as below:

 cat flatfile|cut -f1 -d'||'|rev|cut -c1-1
 
LVL 6

Expert Comment

by:bira
ID: 10723865
Or this way, which doesn't depend on separators:

 cat flatfile|cut -c50-50|rev|cut -c1-1
 
LVL 51

Expert Comment

by:ahoffmann
ID: 10726374
bira, which cut allows a string as separator for -d ?
also, nice interpretation of the question ;-) which happens when people are too lazy to explain ..
 
LVL 48

Expert Comment

by:Tintin
ID: 10727775
sumanth_ora.

Could you please provide some feedback and possibly close this question.
 
LVL 20

Expert Comment

by:Gns
ID: 10730758
:-) Achim and bira... And CC to Achim's question... When I tried, I of course tried all OSes at hand with a cut solution too... None seem to work with a string delimiter. I might add that the AIX awk manpage does indeed explain FS values as either being blank, a single char, or an extended regular expression (which explains why it works ;).

-- Glenn
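
For awks where FS may be an ERE (gawk, and AIX awk per the manpage quoted above), a bracket expression makes the literal two-pipe string explicit and sidesteps the quoting questions:

echo 'a||b||c' | gawk -F'[|][|]' '{print $2}'

prints "b".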