Solved

Trying to get a string from a *.log file and search for the file with that name throughout the folders and subfolders

Posted on 2010-11-18
33
480 Views
Last Modified: 2012-05-10
How to grep a string from a file, then find the file with that name throughout ./ and store it in a separate folder
Question by:mail2vijay1982
33 Comments
 
LVL 12

Accepted Solution

by:
tel2 earned 56 total points
ID: 34162794
Hi mail2vijay1982,

I'm struggling to understand what you mean.
For example:
- "find the string name file".  What does that mean?
- "store it in a separate folder"?  Store what, exactly?
Forgive me, Ingrish is only my first language.

Please explain your problem in detail, and I suggest you provide some sample:
- Input data, and
- Expected results,
so we can clearly understand what you mean.

Thanks.
LVL 14

Expert Comment

by:Monis Monther
ID: 34162797
To find the files that contain a specific string, simply use the -l switch.

ex:  grep -rl "some words with spaces" /

-r : search recursively in the directory
-l : output the name of each file containing the string

Now, searching from / like this would take a lot of time.

LVL 14

Assisted Solution

by:Monis Monther
Monis Monther earned 334 total points
ID: 34162811
Oh sorry, I didn't read the title well enough. Since you are looking under log files, it would simply be

grep -rl "some words with spaces" /var/log

Note: you must have the " " quotes if the search string contains spaces
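As a quick, hedged sketch of the -r/-l behaviour described above (the directory and file names below are invented for the demo):

```shell
# Build a throwaway tree to demonstrate grep -rl (paths are made up)
tmp=$(mktemp -d)
mkdir -p "$tmp/logs/sub"
printf 'some words with spaces\n' > "$tmp/logs/a.log"
printf 'nothing relevant here\n'  > "$tmp/logs/sub/b.log"

# -r recurses into subdirectories; -l prints only the matching file names
matches=$(grep -rl "some words with spaces" "$tmp/logs")
echo "$matches"
```

Only a.log should be listed; b.log contains no match, so -l never prints it.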
LVL 4

Author Comment

by:mail2vijay1982
ID: 34162837
How to grep the string "*.xsl" from a file, then find the file with that name throughout the ./ folders and subfolders, and store those files in a separate folder. It can be a shell script or commands, but commands would be preferable, as they are easy to execute right away without my having to deal with permissions.

The second question is to search for the string "export" in all .log files through all directories and subdirectories, and copy those .log files which contain the string "export" to a separate folder.
LVL 48

Assisted Solution

by:Tintin
Tintin earned 110 total points
ID: 34162852
I'm interpreting the question as grepping out a string from a bunch of logs and then searching for filenames (not content) matching that string.

If so, then


#!/bin/bash
grep string /some/dir/*.log | while read string
do
  find / -name "*$string*"
done

LVL 48

Assisted Solution

by:Tintin
Tintin earned 110 total points
ID: 34162863
Ah, wrote my last post before I saw yours.  I was on the right track though.


#!/bin/bash
cd /to/some/path
grep ".xsl" file | while read string
do
  find . -name "$string" -type f | xargs -i mv {} "/path/to/separate folder"
done

LVL 4

Author Comment

by:mail2vijay1982
ID: 34162864
That string could contain a name like "*.xls"... is that possible?
LVL 4

Author Comment

by:mail2vijay1982
ID: 34162884
I don't want to move the files; I actually want to copy them, so I think cp applies for that, right?


#!/bin/bash
cd /to/some/path
grep ".xsl" file | while read string
do
  find . -name "$string" -type f | xargs -i cp {} "/path/to/separate folder"
done

LVL 48

Expert Comment

by:Tintin
ID: 34162897
Are you saying that the string in the file you are grepping could be

*.xls?

LVL 4

Author Comment

by:mail2vijay1982
ID: 34162933
Yes, so it can be anything .xsl-like: abc.xsl or bcd.xsl, ...
LVL 14

Expert Comment

by:Monis Monther
ID: 34162935
If you have an * in the string, put a \ before it:

grep "\*.xls" filename
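A tiny check of the escaping rule above (the file name is invented for the demo): with the backslash, grep treats the * literally and matches only the line that actually contains one.

```shell
# A list file containing both a literal "*.xls" line and a plain name
tmp=$(mktemp -d)
printf '*.xls\nabc.xls\n' > "$tmp/filename"

# \* matches a literal asterisk, so only the first line matches
hit=$(grep "\*.xls" "$tmp/filename")
echo "$hit"
```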
LVL 4

Author Comment

by:mail2vijay1982
ID: 34162936
...which is contained in those .log files..
LVL 14

Assisted Solution

by:Monis Monther
Monis Monther earned 334 total points
ID: 34162976
Ok, now I understand what you want.

Here you go; I tested this and it works fine:

 for i in $(grep ".xls" /root/file); do find / -name $i -exec cp {} /path/to/dir \;; done


Note there are two semicolons after the \ : the escaped \; terminates find's -exec action, and the plain ; ends the loop body.
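The same loop can be tried end-to-end on a throwaway tree (every path below is invented for the sketch; note that the for loop word-splits grep's output, so it only works for names without spaces):

```shell
# Throwaway fixture mirroring the one-liner above: a list file of .xls names,
# files scattered under Documents, and a destination directory
tmp=$(mktemp -d)
mkdir -p "$tmp/Documents/deep" "$tmp/dest"
printf 'abc.xls\nbcd.xls\n' > "$tmp/file"
: > "$tmp/Documents/abc.xls"
: > "$tmp/Documents/deep/bcd.xls"

# For each name grep finds in the list, locate it under Documents and copy it;
# \; ends find's -exec, and the second ; ends the loop body
for i in $(grep ".xls" "$tmp/file"); do
  find "$tmp/Documents" -name "$i" -exec cp {} "$tmp/dest" \;
done
ls "$tmp/dest"
```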
LVL 14

Assisted Solution

by:Monis Monther
Monis Monther earned 334 total points
ID: 34162984
Now, the second task is done the way I pointed out in my first post, by using grep -rl.
LVL 4

Author Comment

by:mail2vijay1982
ID: 34163037

small_student: I have to use this command in a very big Linux environment. I've heard that using -exec with a large number of files gives lots of errors, so my sources asked me to use xargs, but I'm not sure about that. What do you think?
LVL 4

Author Comment

by:mail2vijay1982
ID: 34163062
Tintin: I have tried a sample of your code, but no go. Can you check this?

BTW, this is for the first question,



#!/bin/bash
cd ./thelogfiles/
grep -il "/*.xls" *.log | while read string
do
  find . -name "$*.log" -type f | xargs -i cp {} ./extractedfiles/
done

LVL 14

Assisted Solution

by:Monis Monther
Monis Monther earned 334 total points
ID: 34163091
Well, instead of -exec you can use -ok; it will prompt you for each copy, and you can choose yes or no. But I think (although not 100% sure) that the exec in my command is not the same as the bash exec; it's a find built-in action with the same name.

Try this, and if you don't like the result you can Ctrl+C and nothing is harmed:

 for i in $(grep ".xls" /path/to/file); do find / -name $i -ok cp {} /path/to/dir \;; done
LVL 4

Author Comment

by:mail2vijay1982
ID: 34163125
small_student: I have tried this command alone; no errors, but no output either... nothing in ./extractedfiles.


for i in $(grep ".xls" ./thelogfiles); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

Any suggestions..
LVL 14

Assisted Solution

by:Monis Monther
Monis Monther earned 334 total points
ID: 34163147
Ok, let's root out the confusion:

./thelogfiles    (this should be a file, not a dir; if it is a dir, then use grep -r ".xls" ...etc)
./Documents  (this is where all the files are located and where you are trying to search; this was / in your first post)

./extractedfiles  (this is the location that the files will finally end up being copied to)

Note: use full paths instead of relative paths for all three of the above.

LVL 4

Author Comment

by:mail2vijay1982
ID: 34163217
small_student: Please check this line; it's giving a warning msg...

for i in $(grep "/*.xls" ./thelogfiles/*); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

the warning msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.

 Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?
LVL 4

Author Comment

by:mail2vijay1982
ID: 34163252

small_student:

./Documents  (it could be inside this folder or in any subfolder of it; the search should cover all directories and subdirectories inside ./Documents)

./thelogfiles/*.log (would this help out?)


for i in $(grep "/*.xls" ./thelogfiles/*.log); do find ./Documents/ -name $i -ok cp {} ./dis \;; done
LVL 4

Author Comment

by:mail2vijay1982
ID: 34163256
sorry not ./dis

for i in $(grep "/*.xls" ./thelogfiles/*.log); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done
LVL 14

Assisted Solution

by:Monis Monther
Monis Monther earned 334 total points
ID: 34163751
1- No, you can't do this; first of all, you can't use / to escape a wildcard like *, you put \ before it
2- Don't use any * in anything here
3- Type the command as follows:

for i in $(grep -r ".xls" ./thelogfiles/); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done
LVL 4

Author Comment

by:mail2vijay1982
ID: 34164271
small_student:

I have noticed one thing: using -exec, I was able to export at least one file to ./extractedfiles.

With -ok, no output... nothing in ./extractedfiles,
but I'm getting the same error msg as mentioned above:


the warning msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.

 Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?

LVL 14

Expert Comment

by:Monis Monther
ID: 34164415
Please follow the tips in my last post and let me know what happens
LVL 4

Author Comment

by:mail2vijay1982
ID: 34164605

small_student: that's what I've been saying already; I tried this

for i in $(grep -r ".xls" ./thelogfiles/); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

and I'm getting this error msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.

 Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?
LVL 14

Expert Comment

by:Monis Monther
ID: 34164884
Aha, OK, I think I know what the problem is: the grep is getting more than a file name, it's getting a full path. We need to filter it down to the file names only.

Let's do a little test.

I want to see the output of the grep part only; please run the following command and post its output here:

grep -rl ".xls" /thelogfiles | head
LVL 4

Author Comment

by:mail2vijay1982
ID: 34171938

small_student:

Tried this,
grep -rl ".xls" ./thelogfiles | head

OUTPUT:

./thelogfiles/oh135.log
./thelogfiles/12the.log
./thelogfiles/233345t4w_456.log
LVL 4

Author Comment

by:mail2vijay1982
ID: 34171953
sorry and this too,

./thelogfiles/something.log
LVL 14

Expert Comment

by:Monis Monther
ID: 34180786
Oh sorry, I made a mistake here: this output shows the files that actually have lines containing .xls in them, but what I wanted is the lines themselves.

Post the output of this

grep -r ".xls" ./thelogfiles | head

LVL 4

Author Comment

by:mail2vijay1982
ID: 34187666

small_student:

Tried this,
grep -r ".xls" ./thelogfiles | head

the output

./thelogfiles/oh135.log:123.xls
./thelogfiles/12the.log:File name           456.xls
./thelogfiles/233345t4w_456.log:File name           789.xls
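The prefix before each ':' above is just the file the match came from. If the goal is only the *.xls tokens themselves, grep's -h (suppress the file-name prefix) and -o (print only the part of the line that matches) flags can pull them out directly; a sketch on invented data resembling the posted output:

```shell
# Log lines shaped like the output above (contents invented for the demo)
tmp=$(mktemp -d)
printf 'File name           456.xls\n' > "$tmp/a.log"
printf '123.xls\n'                     > "$tmp/b.log"

# -r recurse, -h drop the "file:" prefix, -o print only the matched token
names=$(grep -rho '[^ ]*\.xls' "$tmp" | sort)
echo "$names"
```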
LVL 4

Author Closing Comment

by:mail2vijay1982
ID: 34191531
For the second question, I sorted it out by myself with the support of the internet.

For the first question, the answer was still pending.

Code:
for i in $(grep -rh ".xls"  ./thelogfiles/) ; do find ./Documents/ -iname $i -print0 -exec cp {} ./extractedfiles/ \;; done

The -h suppresses the file-name prefix and just shows the matching line.
So now I am getting the output in the "extractedfiles" folder, except for file names with spaces, e.g. "this that.xls". I think I can sort that out.
But if anyone has a solution for those spaces, you are very well appreciated. And if anyone can do this code with xargs instead of exec, do comment below...

I appreciate all the above experts for what they have done so far to help me out. Thanks.
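For the file names with spaces mentioned above, one hedged sketch (all paths invented; it assumes, as in the posted output, that each log line ends with the wanted name after a literal "File name" label) is to read names line-by-line instead of word-splitting, and to hand them to xargs NUL-terminated:

```shell
# Fixture: a log naming a file whose name contains a space
tmp=$(mktemp -d)
mkdir -p "$tmp/Documents" "$tmp/extractedfiles" "$tmp/thelogfiles"
printf 'File name           this that.xls\n' > "$tmp/thelogfiles/a.log"
: > "$tmp/Documents/this that.xls"

# Strip the leading "File name" label, then read whole lines (spaces intact);
# find -print0 plus xargs -0 keeps the name in one piece all the way to cp
grep -rh '\.xls' "$tmp/thelogfiles" | sed 's/^File name *//' |
while IFS= read -r name; do
  find "$tmp/Documents" -name "$name" -print0 |
    xargs -0 -I{} cp {} "$tmp/extractedfiles"
done
```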