Vijay kumar Mohanraj (Malaysia)

asked on

Trying to get a string from a *.log file and search for a file with that name throughout the folders and subfolders

How to grep a string from a file, then find the file with that name throughout ./ and store it in a separate folder
ASKER CERTIFIED SOLUTION
tel2 (New Zealand)
To find the files that contain a specific string, simply use the -l switch.

ex:  grep -rl "some words with spaces" /

-r : search recursively through the directory
-l : output only the names of the files containing the string

Now, starting from /, this would take a lot of time.
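Putting those two flags together, here is a minimal sketch of the copy step (assuming GNU grep for --include, and the example paths ./thelogfiles and ./extractedfiles that come up later in this thread):

```shell
#!/bin/sh
# Copy every .log file under ./thelogfiles that contains "export"
# into ./extractedfiles (both directories are example placeholders).
mkdir -p ./extractedfiles
grep -rl --include="*.log" "export" ./thelogfiles \
  | while IFS= read -r f; do
      cp "$f" ./extractedfiles/
    done
```

Reading the names line by line avoids word splitting; it would still break on file names containing newlines, which GNU grep's -Z plus xargs -0 could handle.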

Vijay kumar Mohanraj (ASKER):
How to grep the string " *.xsl " from a file, then find the file with that name throughout the ./ folders and subfolders, and store those files in a separate folder. It can be a shell script or commands, but commands would be preferable, as they are easy to execute right away without my having to deal with permissions.

The second question is to search for the string "export" in all .log files through all directories and subdirectories, and copy those .log files which contain the string "export" to a separate folder.
That string could contain a name like " *.xls". Is that possible?
I don't want to move the files; I actually want to copy them, so I think cp applies for that, right?


#!/bin/bash
cd /to/some/path
# Each line read from "file" is expected to be a file name
grep ".xsl" file | while read -r string
do
  find . -name "$string" -type f | xargs -I{} cp {} "/path/to/separate folder"
done


Tintin:
Are you saying the string in the file you are grepping could be

*.xsl

Yes, so it can be anything .xsl-like: abc.xsl or bcd.xsl, ...
If you have an * in the string, put a \ before it:

grep "\*.xls" filename
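A related option, assuming GNU or BSD grep: the -F switch treats the pattern as a fixed string, so no escaping is needed at all:

```shell
# sample.txt is a made-up file for illustration
printf '*.xls\nabc.xls\n' > sample.txt
# -F matches the pattern literally, so the "*" needs no backslash;
# this prints only the line that literally contains "*.xls"
grep -F '*.xls' sample.txt
```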
which is contained in those .log files..

small_student: I have to use this command in a very big Linux environment. I heard that using -exec with large numbers of files gives lots of errors, so my sources asked me to use xargs, but I am not sure about that. What do you think?
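For reference, both patterns can be sketched like this (a sketch only; the paths are placeholders, and cp -t plus xargs -r assume GNU coreutils/findutils). With -exec ... {} +, find batches many file names onto one command line much as xargs does, which avoids the per-file overhead of -exec ... \;:

```shell
#!/bin/sh
# Placeholder directories for illustration.
mkdir -p ./src ./dest
# 1) "+" batches many files into a single cp invocation, like xargs:
find ./src -type f -name "*.log" -exec cp -t ./dest {} +
# 2) NUL-delimited pipe, safe for spaces and newlines in file names:
find ./src -type f -name "*.log" -print0 | xargs -0 -r cp -t ./dest
```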
Tintin: I have tried a sample of your code, but no go. Can you check this?

BTW, this is for the first question.



#!/bin/bash
cd ./thelogfiles/
# -il prints (case-insensitively) the names of the *.log files that match
grep -il "/*.xls" *.log | while read -r string
do
  find . -name "$string" -type f | xargs -I{} cp {} ./extractedfiles/
done


small_student: I have tried this command alone: no errors, but no output either; nothing in ./extractedfiles.

for i in $(grep ".xls" ./thelogfiles); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

Any suggestions?
small_student: Please do check this line; it is giving a warning message:

for i in $(grep "/*.xls" ./thelogfiles/*); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

the warning msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?

small_student:

./Documents (let it be inside this folder or any subfolders inside it; it should search all directories and subdirectories inside ./Documents)

./thelogfiles/*.log (would this help out?)


for i in $(grep "/*.xls" ./thelogfiles/*.log); do find ./Documents/ -name $i -ok cp {} ./dis \;; done
sorry, not ./dis:

for i in $(grep "/*.xls" ./thelogfiles/*.log); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done
small_student:

I have noticed one thing: using -exec, I am able to export at least one file to ./extractedfiles.

With -ok there is no output, nothing in ./extractedfiles, but I am getting the same warning message as mentioned above:


the warning msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?
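One note on -ok, which may explain the empty ./extractedfiles: it behaves like -exec but asks for a y/n confirmation on standard input before each command (the `< cp ... > ?` lines above are its prompts), so a run that never receives a "y" copies nothing. A sketch, assuming GNU find:

```shell
# -ok prompts before each cp and proceeds only on an affirmative reply.
# Piping "yes" answers every prompt automatically (illustration only);
# the prompts themselves are written to stderr.
mkdir -p ./dest
touch ./sample.txt
yes | find . -maxdepth 1 -name "sample.txt" -ok cp {} ./dest/ \; 2>/dev/null
```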

Please follow the tips in my last post and let me know what happens

small_student: that's what I have been saying already. I tried this

for i in $(grep -r ".xls" ./thelogfiles/); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

and I am getting this error message:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?
Aha, OK, I think I know what the problem is: the grep is returning more than a file name, it is returning a full path, and we need to filter it down to the file names only.

Let's do a little test.

I want to see the output of the grep part only; please run the following command and post its output here:

grep -rl ".xls" ./thelogfiles | head

small_student:

Tried this,
grep -rl ".xls" ./thelogfiles | head

OUTPUT:

./thelogfiles/oh135.log
./thelogfiles/12the.log
./thelogfiles/233345t4w_456.log
sorry and this too,

./thelogfiles/something.log
Oh sorry, I made a mistake here: this output shows the files that actually contain lines with ".xls" in them; what I wanted was the matching lines themselves.

Post the output of this:

grep -r ".xls" ./thelogfiles | head


small_student:

Tried this,
grep -r ".xls" ./thelogfiles | head

The output:

./thelogfiles/oh135.log:123.xls
./thelogfiles/12the.log:File name           456.xls
./thelogfiles/233345t4w_456.log:File name           789.xls
For the second question, I sorted it out by myself with the support of the internet.

For the first question, here is where I got to:

Code:
for i in $(grep -rh ".xls"  ./thelogfiles/) ; do find ./Documents/ -iname $i -print0 -exec cp {} ./extractedfiles/ \;; done

The "-h" suppresses the file-name prefix on each matching line, so only the line contents are shown.
So now I am getting the output in the "extractedfiles" folder, except for file names containing spaces, e.g. "this that.xls". I think I can sort that out.
But if anyone has a solution for the spaces, it would be very much appreciated. And if anyone can do this code with xargs instead of -exec, please comment below.
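One possible sketch for the spaces problem, with the xargs variant asked for above. It leans on two assumptions about the logs: each relevant line is either a bare name like 123.xls or ends with the name after a run of two or more spaces (as in "File name           456.xls"), and the names themselves never contain double spaces. Reading names line by line with read -r, plus -print0 | xargs -0 (GNU find/xargs), keeps single spaces intact:

```shell
#!/bin/sh
mkdir -p ./extractedfiles
# -r: recurse, -h: drop the "file:" prefix, leaving only the matching lines;
# the sed strips everything up to the last run of 2+ whitespace characters
grep -rh '\.xls' ./thelogfiles/ \
  | sed -E 's/.*[[:space:]][[:space:]]+//' \
  | while IFS= read -r name; do
      # NUL-delimited hand-off keeps spaces in paths intact
      find ./Documents/ -iname "$name" -type f -print0 \
        | xargs -0 -I{} cp {} ./extractedfiles/
    done
```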

I appreciate all the experts above for what they have done so far in helping me out. Thanks.