Solved

Replace all spaces by commas

Posted on 2011-03-15 · 437 Views · Last Modified: 2012-05-11
Hi,

Can I get a Perl script that converts the following output into CSV? Basically, every run of spaces needs to be converted to a comma.

My goal is to run a cron job every day and append the output of df -h to a CSV file. I need two scripts.

Script 1 will just convert this output to CSV.

Script 2 will open a file for append every day, run df -h, and store its output in the CSV file, skipping the header line. Can the Perl script also get the machine's hostname, add it as a column in the CSV, and repeat the hostname every day?



thanks
-anshu

Filesystem            Size  Used Avail Use% Mounted on
/dev/sda1              15G  4.0G  9.8G  29% /
/dev/sda6              27G  6.9G   19G  28% /omega
/dev/sda3             4.9G  390M  4.3G   9% /var
/dev/sda5             4.9G  139M  4.5G   3% /tmp
tmpfs                 127G     0  127G   0% /dev/shm
/dev/sdb2              35G   15G   19G  44% /alpha
/dev/sdc1             135G   28G  100G  22% /alpha1
/dev/sdd1             135G  105G   23G  83% /alpha2
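As a minimal sketch of what "Script 2" could look like for a daily cron job (the file path df_history.csv is a placeholder, not anything from the question; Sys::Hostname is a core Perl module):

```shell
# Append today's df -h output as CSV rows, hostname first, header skipped.
# The history file path is a placeholder; point it wherever cron should write.
df -h | perl -MSys::Hostname -lne '
    next if $. == 1;            # skip the "Filesystem ..." header line
    s/\s+/,/g;                  # collapse each run of whitespace to a comma
    print hostname() . ",$_";   # prepend the hostname as the first column
' >> df_history.csv
```

One caveat: df can wrap a long device name onto a second line, which would split one record across two CSV rows; `df -hP` (POSIX portable format) keeps each filesystem on one line.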
Question by:anshuma
6 Comments
 
LVL 84

Expert Comment

by:ozo
ID: 35144571

perl -i.bak -lpe 's/\s+/,/g' file
 

Author Comment

by:anshuma
ID: 35144586
Hi Ozo,

Thank you very much. How can I make it skip the first line which is

Filesystem            Size  Used Avail Use% Mounted on


thanks
-anshu
 
LVL 14

Expert Comment

by:sentner
ID: 35144670
The easiest way to skip the first line is to pipe the df output through tail +2 first:

df -h | tail +2 | perl -lpe 's/\s+/,/g'
 

Author Comment

by:anshuma
ID: 35144956
[root@myserver ~]# df -h | tail +2 | perl -lpe 's/\s+/,/g' > x1.txt
tail: cannot open `+2' for reading: No such file or directory

I get this error
 
LVL 84

Accepted Solution

by:
ozo earned 1400 total points
ID: 35145997
df -h | perl -lpe '$_=<> if 1..1;s/\s+/,/g'
 
LVL 14

Assisted Solution

by:sentner
sentner earned 600 total points
ID: 35150492
Sorry, tail +2 doesn't work for every version of tail.

You should be able to use:

df -h | tail --lines=+2 | perl -lpe 's/\s+/,/g'

Or:

df -h | egrep -v "^Filesystem" | perl -lpe 's/\s+/,/g'


Ozo's last comment seems to address your need, though when I try it, it adds an extra comma to the end of the first line.