Solved

Filtering leading spaces from text string.

Posted on 2000-03-02
Medium Priority
266 Views
Last Modified: 2010-04-21
I need a quick fix and am experiencing brain fade, so I thought I would offer you nice people a chance at some easy points. It's been a while since I posted a question, so I thought it would be fun.

OK, here ya go ... shouldn't be too tough.

Nightly we dump logs of our database and filesystem backups to a text "log" file. For one field entry we tail the "fixed" output file from Informix and pipe it through "head -1" to grab just the first line. So far this has worked fine, but now I am formatting (mostly cleaning up) the output to be placed in a database for query-level reporting.

So... we have one line of text in one file. Unfortunately this text has LEADING spaces that I would like to eliminate; there's your question.

How do I filter a line that looks something like "     143121 - 143856" into something like "143121 - 143856"   ?

A couple of things... First, notice that there MAY be additional spaces embedded in the string before and after the "-". These can be left in place or filtered out, I don't care; what I do care about is that the actual text value (database logical log numbers) stays intact and accurate. ALSO, the solution should be shell independent, as we may use C shell on one system and Korn shell or Bourne shell on others. Thinking maybe AWK or SED or even a GREP statement might do it???
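For reference, stripping just the leading run of spaces can be done entirely inside sed, so it behaves the same no matter which shell invokes it. A minimal sketch with made-up sample values:

```shell
# Strip only the leading run of spaces; embedded spaces are left untouched.
echo "     143121 - 143856" | sed 's/^ *//'
# → 143121 - 143856
```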

Current syntax (if it helps) is:
(filename is an example)

LOGS="`tail -3 /testfile.flg | head -1`"
if  [ -f  "/testfile.log" ]
then
ORIGNUM=`head -1 "/testfile.log`
ORIGNUM1=`expr $ORIGNUM - 1`
ORIG=1

Then this string gets echoed out to sendmail and to a logfile.
It's a little rough, since you don't have all the variable info, but I think you see the drift.
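A minimal, self-contained sketch of where such a filter could slot into the snippet above (the path and sample value are made up for illustration):

```shell
#!/bin/sh
# Hypothetical stand-in for the real log file.
printf '     143121 - 143856\n' > /tmp/testfile.log
# Grab the first line and strip its leading spaces in one pipeline.
ORIGNUM=`head -1 /tmp/testfile.log | sed 's/^ *//'`
echo "$ORIGNUM"
# → 143121 - 143856
```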

I put the value at 100 'cuz I feel like giving away points, sound fair?

Anyone ?

THANX !!
Question by:Randyb
9 Comments
 
LVL 21

Expert Comment

by:tfewster
ID: 2578444
If you don't care about other spaces in the string, how about "sed s/ //g"?
- Shell & Unix variant independent (awk isn't...)

Seems a short answer to your detailed spcification, but I think it meets all your requirements.
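To illustrate what this does to the sample line from the question (note it removes every space, including the ones around the dash):

```shell
# With the expression quoted so the embedded space survives word splitting:
echo "     143121 - 143856" | sed 's/ //g'
# → 143121-143856
```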

 
LVL 21

Expert Comment

by:tfewster
ID: 2578459
For 100 points, maybe I should have said "specification", not "spcification"

 .
 .
ORIGNUM=`head -1 /testfile.log|sed s/ //g`
 .
 .

Should there really have been a " before the /testfile.log in this line or was that a typo?


 
LVL 1

Author Comment

by:Randyb
ID: 2578479
Let me test drive this theory of yours real quick.
Actually, I just figured it out also using sed, but my solution has more arguments... I'll post it for you in a sec. If yours works, and since it's more "condensed", I'll give you the points; if not, I will retract the question, as my solution does work already.

be right back !
 
LVL 21

Expert Comment

by:tfewster
ID: 2578494
Seems fair, if you'll show me yours...

However, I reckon I thought of the answer before you did (see the timestamps on this Q...!)
 
LVL 1

Author Comment

by:Randyb
ID: 2578512
OK OK ... you were CLOSE!!

Let's say the file is named "file1".

The simplest syntax that works is:

head -1 file1 | sed 's/ //g' > newfile

Notice the single quotes in the sed command!
Without them it returns a "sed: command garbled s/" error.

The syntax I used included cases for multiple spaces or TABS. But since I am not using tabs, YOU WIN!

Told ya it was easy!
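The quoting point deserves a quick demonstration: unquoted, the shell splits s/ //g at the space into two words, so sed receives only "s/" and reports a garbled command; quoted, the whole expression arrives as one argument:

```shell
# Quoted: the entire substitute command reaches sed as a single argument.
echo "  a b  " | sed 's/ //g'
# → ab
```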
 
LVL 1

Author Comment

by:Randyb
ID: 2578519
Oh, my solution... sorry!

sed 's/^[       ]*//'

note: between the [    ] is a space and a tab character... the * means zero or more spaces or tabs... basically ALL of them.
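Since a literal tab inside the brackets is easy to lose when copying and pasting, one portable way to express the same character class is to build the tab with printf first (sample input made up):

```shell
# TAB holds a real tab character; double quotes let it expand inside the sed expression.
TAB=`printf '\t'`
printf '\t   143121 - 143856\n' | sed "s/^[ $TAB]*//"
# → 143121 - 143856
```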
 
LVL 21

Accepted Solution

by:
tfewster earned 400 total points
ID: 2578531
Hey, it wasn't bad for 2 minutes past midnight, without a Unix box around to prove it on...

(Ooops, I think I've just blown my credibility)

Anyway, it was fun - See you around
Tim
 
LVL 1

Author Comment

by:Randyb
ID: 2578553
Ooops ... accepted the wrong answer ... sorry for those who pay to see it!

Have a good one.

Randy
