Solved

Multiple Text Files INTO one - awk

Posted on 2013-12-18
303 Views
Last Modified: 2013-12-19
We have two types of files.

1. Type one: the file name starts with T or K, for example T01, K17, K55 or T02. There are always two digits after the T or K, and the name ends with .txt.
Sample of the first type:

00:01:E8:D6:53:37: -> Port 1000
00:02:A5:4F:26:75: -> Port 15
00:04:AC:E3:E8:49: -> Port 8
00:0C:29:12:26:90: -> Port 1000
00:0C:29:4C:58:DE: -> Port 23
00:0C:29:63:C1:D4: -> Port 6
00:0C:29:96:26:FE: -> Port 26
00:0C:29:A9:87:02: -> Port 1000
00:0C:29:C2:4E:E7: -> Port 1000
00:0C:29:DE:4E:E5: -> Port 26
00:0C:29:F2:71:DD: -> Port 26
00:0E:7F:EC:1D:8E: -> Port 23
00:12:A9:C2:E0:A0: -> Port 1000
00:1B:21:2F:65:F4: -> Port 1
00:1B:21:2F:69:31: -> Port 24
00:1B:21:2F:69:3A: -> Port 9
00:1B:21:2F:69:47: -> Port 4
00:1B:21:9C:03:D1: -> Port 17
00:1B:21:C6:58:25: -> Port 28
00:25:61:45:1E:40: -> Port 1000
00:25:90:95:7A:6E: -> Port 1000
00:25:90:A4:3A:74: -> Port 1000
00:25:90:A4:3A:82: -> Port 1000
00:25:90:A4:40:2F: -> Port 1000
00:25:90:A4:41:5E: -> Port 1000
00:25:90:A4:41:6E: -> Port 1000
00:25:90:A4:56:20: -> Port 1000
00:25:90:A8:97:2F: -> Port 1000
00:25:90:A8:97:42: -> Port 1000
00:25:90:A8:97:43: -> Port 1000




We want to combine all of these files into one, appending to each line the name of the file it came from (without the .txt extension).

For example:
00:25:90:A8:97:42: -> Port 1000  -> K17
00:25:90:A8:97:43: -> Port 1000  -> K16
... etc



2. Second type of file:


172.16.1.21 --> 00:18:6E:37:CF:28
172.16.1.22 --> 00:01:E8:D6:53:37
10.1.1.1 --> 00:01:E8:D6:53:37
37.123.96.3 --> 00:50:56:BE:70:D8
37.123.96.5 --> 00:04:AC:E3:E8:49
37.123.96.6 --> 00:04:AC:E3:E8:49
37.123.96.9 --> 02:D0:68:12:4B:CC
37.123.96.18 --> 90:2B:34:9D:53:CB
37.123.96.19 --> 90:2B:34:9D:53:CB
37.123.96.20 --> 90:2B:34:A0:42:F3
37.123.96.21 --> 90:2B:34:A0:42:F3
37.123.96.34 --> E0:69:95:2E:90:A4
37.123.96.35 --> E0:69:95:2E:90:A4
37.123.96.36 --> 90:2B:34:A0:42:F3
37.123.96.39 --> E0:69:95:2E:90:A4
37.123.96.67 --> B8:AC:6F:97:82:6F
37.123.96.116 --> 00:04:AC:E3:E8:49
37.123.96.162 --> 00:50:56:BE:36:C1
37.123.96.178 --> 00:50:56:96:50:FA
37.123.96.179 --> 00:50:56:96:03:8B
37.123.96.180 --> 00:50:56:96:03:C3
37.123.96.181 --> 00:50:56:96:03:8B
37.123.96.182 --> 00:50:56:96:03:8B
37.123.96.183 --> 00:50:56:96:03:8B
37.123.96.184 --> 00:50:56:96:03:C3
37.123.96.185 --> 00:50:56:96:25:99
37.123.96.188 --> 00:50:56:96:03:C3
37.123.96.189 --> 00:50:56:96:03:8B
37.123.96.226 --> 00:50:56:BE:D0:49
37.123.96.227 --> 00:50:56:BE:D0:49
37.123.96.228 --> 00:50:56:BE:D0:49




These lines should be added to the top of the combined file, with " --> Router" appended to the end of each line.


Thanks
Question by:3XLcom
12 Comments
 
LVL 31

Expert Comment

by:farzanj
ID: 39726599
What does the second type of file start with? What would the possible names of the second type be?
 

Author Comment

by:3XLcom
ID: 39726612
router.txt

After combining everything into one file, we should delete

K*.txt
T*.txt
router.txt

Only the combined file should remain.
 
LVL 8

Expert Comment

by:Surrano
ID: 39726623
Dear 3XLcom,

As for type 1, try this:
grep ^ [KT][0-9][0-9].txt /dev/null | gawk '{print substr($0,9)" -> "substr($0,1,3)}'

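An equivalent approach that avoids the grep trick is to use gawk's built-in FILENAME variable (just a sketch, assuming GNU awk and that the files really match the [KT][0-9][0-9].txt pattern):

gawk '{ fn = FILENAME; sub(/\.txt$/, "", fn); print $0 " -> " fn }' [KT][0-9][0-9].txt

Either way the output should look like: 00:25:90:A8:97:42: -> Port 1000 -> K17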


As for type 2, I'm not sure I understand your question properly, but try this:

cat (...) | gawk '{print $0" --> Router"}'



If you mean the two file types are *both* called e.g. K22.txt or even the records are intermixed, with the difference being in record format:
- first record format: mac " -> Port " portno
- second record format: ip " --> " mac
(notice the different arrows!)
then try this:

grep ^ [KT][0-9][0-9].txt /dev/null | gawk '
  $2=="->" {print substr($0,9)" -> "substr($0,1,3)}
  $2=="-->" {print substr($0,9)" --> Router"}'


 

Author Comment

by:3XLcom
ID: 39726652
That worked:
grep ^ [KT][0-9][0-9].txt /dev/null | gawk '{print substr($0,9)" -> "substr($0,1,3)}'  > single.txt

And this also works:

cat routerresult.txt | gawk '{print $0" --> Router"}' > secondsingle.txt


I want to combine both into one: secondsingle.txt should be prepended to single.txt.

I also want all the other .txt files in the directory to be removed.

That is all.
 
LVL 31

Expert Comment

by:farzanj
ID: 39726668
Instead of
cat routerresult.txt | gawk '{print $0" --> Router"}' > secondsingle.txt

Use
cat routerresult.txt | gawk '{print $0" --> Router"}' >> single.txt

AND

ls | grep -v 'single.txt' | xargs rm -f *txt
 
LVL 31

Accepted Solution

by:
farzanj earned 500 total points
ID: 39726685
To prepend, I would run the second command first:

cat routerresult.txt | gawk '{print $0" --> Router"}' > single.txt

Then
grep ^ [KT][0-9][0-9].txt /dev/null | gawk '{print substr($0,9)" -> "substr($0,1,3)}'  >> single.txt
 
LVL 8

Expert Comment

by:Surrano
ID: 39726690
Good catch, but this way it'll delete all txt files, including single.txt. Try this instead:
ls *txt | grep -v 'single.txt' | xargs rm -f



Or how about doing it the simple way:
rm [KT][0-9][0-9]*txt routerresult.txt
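
For completeness, here is the whole workflow in one place, combining the accepted solution with this cleanup (a sketch only; it assumes the router file is called routerresult.txt and that the combined file should be named single.txt):

#!/bin/bash
# 1. Router mappings go first, each line tagged with "--> Router"
gawk '{print $0" --> Router"}' routerresult.txt > single.txt
# 2. Append the K/T files, tagging each line with its file name (without .txt)
grep ^ [KT][0-9][0-9].txt /dev/null | gawk '{print substr($0,9)" -> "substr($0,1,3)}' >> single.txt
# 3. Remove the source files so only single.txt remains
rm -f [KT][0-9][0-9].txt routerresult.txt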


 
LVL 31

Expert Comment

by:farzanj
ID: 39726752
@Surrano,

Da!  Good.

Could you explain the
grep ^ [KT][0-9][0-9].txt /dev/null
expression please?

I mean the ^ and the /dev/null part used to read the file names. The regex part is easy.
 

Author Closing Comment

by:3XLcom
ID: 39726786
Thank you so much
 
LVL 31

Expert Comment

by:farzanj
ID: 39726802
Hi 3XLcom,

Thank you for the points, but Surrano mainly solved this problem. It would have been nicer to split the points between him and me.
 

Author Comment

by:3XLcom
ID: 39726927
Dear farzanj,

I have one more question: http://www.experts-exchange.com/Programming/Languages/Scripting/Shell/Q_28321258.html

If someone helps me there, I will give more points, because in this question you helped me completely. Please check out the other one too. Thanks.
 
LVL 8

Expert Comment

by:Surrano
ID: 39728616
grep ^ is a trick to match all lines (even empty ones).

grep ... /dev/null is a trick to force printing of the file name in front of each line.
If you have only one file listed on the command line, e.g.
grep regex K01.txt
then grep won't prefix the matched lines with "K01.txt:".

If you have more than one file, e.g.
grep regex K01.txt /dev/null
then grep will prefix the lines. Add to this that /dev/null is always zero length, i.e. it contains 0 lines, so no stray output is produced.
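
For illustration, a hypothetical session with one of the sample files:

$ grep ^ K01.txt | head -1
00:01:E8:D6:53:37: -> Port 1000
$ grep ^ K01.txt /dev/null | head -1
K01.txt:00:01:E8:D6:53:37: -> Port 1000

With /dev/null as a second (empty) argument, grep prints the originating file name in front of every matched line, which is exactly what the gawk part of the pipeline relies on.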
