
Solved

create csv using ksh

Posted on 2010-11-16
10
Medium Priority
1,012 Views
Last Modified: 2013-12-26
hi guys, trying to create a csv in ksh.

Using 'awk '{print $1" "$14" "$15" "$16" "$17" "$18" "$19}' >> $TMP_FILE' on another set of files, I've produced a simple log file with hundreds of lines in it, like so:

ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.579 cars amount 29, total cars 70
ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.585 cars amount 34, total cars 43
ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.601 cars amount 22, total cars 44

but from the above log file I'm trying to create 2 csv files which look like so:

date,time,cars amount
15/11/2010,09:54:29,29
15/11/2010,09:54:29,34
15/11/2010,09:54:29,22

and the other one...

date,time,cars amount
15/11/2010,09:54:29.579,29
15/11/2010,09:54:29.585,34
15/11/2010,09:54:29.601,22

could anyone provide an example awk or cut to achieve the above? cheers!
0
Comment
Question by:r_padawan
10 Comments
 
LVL 12

Expert Comment

by:tel2
ID: 34151360
Will you accept a Perl solution that you can run from your shell script or command line, r_padawan?
0
 

Author Comment

by:r_padawan
ID: 34153198
yes Perl is good ;-)
0
 
LVL 12

Expert Comment

by:tel2
ID: 34153482
Is this homework, r_padawan?  Tell us all about it.
0
 

Author Comment

by:r_padawan
ID: 34158947
solved with awk...
0
 
LVL 12

Expert Comment

by:tel2
ID: 34159614
Hi r_padawan,

Sorry for the delay with this.  In case you're still interested, here's a quick & nasty Perl alternative (without the column headings):

perl -ne '($x,$x,$x,$x,$d,$x,$x,$x,$h,$m,$s,$t,$x,$x,$a)=split(/[._:, ]/);$d=~s|(....)(..)(..)|$3/$2/$1|;print "$d,$h:$m:$s,$a\n";warn "$d,$h:$m:$s.$t,$a\n"' $TMP_FILE >file1.csv 2>file2.csv



What does your awk solution look like?

By the way, I'd suggest you consider the option of generating these 2 output files from the original source, rather than having your awk script (the one in your original post) generate an intermediate file (unless you need that intermediate file for something else, of course).  If you need help with doing this, pls post some sample data from the original file, which corresponds to the data you've already provided (which was very useful, by the way!).
0
 

Author Comment

by:r_padawan
ID: 34162308
keep it open! :-)
0
 

Author Comment

by:r_padawan
ID: 34162328
I know that generating from the original source would be best, but the lines in the original files are very, very long. I was attempting to simplify the awk by first chopping the data down into something more manageable. I struggle with awk at the best of times; this was the awk I used (I didn't write this):


awk -F'[ \t,_:.]*' '{print substr($5,7,2)"/"substr($5,5,2)"/"substr($5,1,4)","$9":"$10":"$11"."$12}'

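For reference, the same field-splitting approach can be extended to emit both target CSVs (headers included) in one pass. This is a sketch added for illustration, not the author's script; it uses '+' instead of '*' in the separator regex (safer across awks, same fields), and writes fresh sample data so it runs standalone:

```shell
#!/bin/sh
# Sample lines in the cut-down log format from the question.
cat > sample.log <<'EOF'
ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.579 cars amount 29, total cars 70
ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.585 cars amount 34, total cars 43
ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.601 cars amount 22, total cars 44
EOF

# Splitting on runs of [ \t,_:.] gives, for these lines:
#   $5 = 20101115 (date), $9:$10:$11 = time, $12 = milliseconds, $15 = cars amount
awk -F'[ \t,_:.]+' '
BEGIN {
  print "date,time,cars amount" > "cars.csv"
  print "date,time,cars amount" > "cars_ms.csv"
}
{
  d = substr($5,7,2) "/" substr($5,5,2) "/" substr($5,1,4)   # 15/11/2010
  t = $9 ":" $10 ":" $11                                     # 09:54:29
  print d "," t "," $15          > "cars.csv"     # without milliseconds
  print d "," t "." $12 "," $15  > "cars_ms.csv"  # with milliseconds
}' sample.log
```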

0
 
LVL 12

Expert Comment

by:tel2
ID: 34162703
Hi r_padawan,

I see that the awk one-liner above produces only one of the two files you wanted.  What about the other file?  I also see it has a '\t' in it (that represents a TAB character).  If there are no tabs in your input file, you should be able to remove the '\t' from the awk one-liner.  If there ARE tabs, tell me where, pls.

What would you like me to do for you now?  Some options are, I could:
1. Change my Perl one-liner to put column headings on the output files.
2. Change my Perl one-liner to take input from the original source file.  If you want me to do this, pls ATTACH some sample data from the original file, which corresponds to the data you've already provided.
3. Do something else.  Tell me.
0
 

Accepted Solution

by:
genialdinesh earned 1500 total points
ID: 34443999

A solution with awk and cut.
test1.txt is the log file, containing hundreds of lines.

command
---------
tr ' .' ',,' < test1.txt | cut -d, -f2,3,7 | nawk -f summary.awk

summary.awk contents
---------------------
#!/bin/awk -f
BEGIN {
  FS=",";
}
{
year=substr($1,1,4);
month=substr($1,5,2);
day=substr($1,7,2);
astime=substr($2,5);
# %s (not %d) keeps leading zeros in the day/month.
printf("%s/%s/%s,%s,%s\n", day, month, year, astime, $3);
}
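For example, against one sample line from the question, the pipeline should behave like this (a sketch with the summary.awk body inlined, using plain awk where nawk isn't available):

```shell
# One sample line in place of test1.txt.
printf '%s\n' 'ABC_DEFGHI_16_JKLMNP11.20101115_095412_374.log:09:54:29.579 cars amount 29, total cars 70' > test1.txt

# tr turns spaces and dots into commas; cut keeps the date, time, and amount fields.
tr ' .' ',,' < test1.txt | cut -d, -f2,3,7 | awk -F, '{
  # $1 = 20101115_095412_374, $2 = log:09:54:29, $3 = 29
  print substr($1,7,2) "/" substr($1,5,2) "/" substr($1,1,4) "," substr($2,5) "," $3
}'
# → 15/11/2010,09:54:29,29
```

Note this produces the first CSV (no milliseconds); the milliseconds are dropped when tr converts the dot before "579" into a field separator that cut discards.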
0
 

Author Closing Comment

by:r_padawan
ID: 34853210
Simples
0
