Solved

Trying to get a string from a *.log file and search for files with that name throughout the folders and subfolders

Posted on 2010-11-18
468 Views
Last Modified: 2012-05-10
How do I grep a string from a file, then find the file with that name throughout ./ and store it in a separate folder?
Question by:mail2vijay1982
33 Comments
 
LVL 11
Accepted Solution by tel2 (earned 56 total points)
Hi mail2vijay1982,

I'm struggling to understand what you mean.
For example:
- "find the string name file".  What does that mean?
- "store it in a separate folder"?  Store what, exactly?
Forgive me, Ingrish is only my first language.

Please explain your problem in detail, AND I suggest you provide some sample:
- Input data, and
- Expected results.
So we can clearly understand what you mean.

Thanks.

LVL 14
Expert Comment by small_student
To find the files that contain a specific string, simply use the -l switch.

e.g.:  grep -rl "some words with spaces" /

-r : search recursively through the directory
-l : output only the names of the files containing the string

Note that searching from / like this would take a lot of time.

LVL 14
Assisted Solution by small_student (earned 334 total points)
Oh, sorry, I didn't read the title carefully enough. Since you are looking under log files, it would simply be:

grep -rl "some words with spaces" /var/log

Note: you must keep the " " if the search string contains spaces.

LVL 4
Author Comment by mail2vijay1982
How do I grep the string "*.xsl" from a file, then find the files with those names throughout ./ and its subfolders, and store those files in a separate folder? It can be a shell script or commands, but commands would be preferable, as they are easy to execute right away without my having to deal with permissions.

The second question: search for the string "export" in all .log files through all directories and subdirectories, and copy the .log files that contain the string "export" to a separate folder.

LVL 48
Assisted Solution by Tintin (earned 110 total points)
I'm interpreting the question as grepping a string out of a bunch of logs and then searching for files whose names (not content) match that string.

If so, then


#!/bin/bash
grep string /some/dir/*.log | while read string
do
  find / -name "*$string*"
done

LVL 48
Assisted Solution by Tintin (earned 110 total points)
Ah, wrote my last post before I saw yours.  I was on the right track though.


#!/bin/bash
cd /to/some/path
grep ".xsl" file | while read string
do
  find . -name "$string" -type f | xargs -I {} mv {} "/path/to/separate folder"
done

LVL 4
Author Comment by mail2vijay1982
That string could contain a name like "*.xls". Is that possible?

LVL 4
Author Comment by mail2vijay1982
I don't want to move the files, I actually want to copy them, so I think cp applies there, right?


#!/bin/bash
cd /to/some/path
grep ".xsl" file | while read string
do
  find . -name "$string" -type f | xargs -I {} cp {} "/path/to/separate folder"
done

LVL 48
Expert Comment by Tintin
Are you saying the string in the file you are grepping could be

*.xsl

LVL 4
Author Comment by mail2vijay1982
Yes, so it can be any .xsl name, like abc.xsl or bcd.xsl, ...

LVL 14
Expert Comment by small_student
If the string contains a literal *, put a \ before it:

grep "\*.xls" filename

LVL 4
Author Comment by mail2vijay1982
And it is contained in those .log files.

LVL 14
Assisted Solution by small_student (earned 334 total points)
Ok now I understand what you want.

Here you go. I tested this and it works fine:

 for i in $(grep ".xls" /root/file); do find / -name $i -exec cp {} /path/to/dir \;; done


Note: there are two semicolons after the backslash. The \; terminates find's -exec, and the second ; ends the loop body before done.
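
For readability, here is the same command spread over several lines (the /root/file and /path/to/dir paths are placeholders):

for i in $(grep ".xls" /root/file); do
  # for each word grep prints, find a file with that name and copy it
  find / -name $i -exec cp {} /path/to/dir \;    # \; terminates -exec
done                                             # the newline replaces the second ;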

LVL 14
Assisted Solution by small_student (earned 334 total points)
Now, the second task is done the way I pointed out in my first post, by using grep -rl (see the sketch below).
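
For instance, a sketch of that second task with placeholder paths (--include is a GNU grep option):

# copy every .log file under /some/search/root that contains "export"
# into /path/to/export_logs/ ; both paths are placeholders
grep -rl --include='*.log' "export" /some/search/root \
  | xargs -I {} cp {} /path/to/export_logs/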

LVL 4
Author Comment by mail2vijay1982

small_student: I have to use this command in a very big Linux environment. I have heard that using -exec over a very large number of files gives lots of errors, so my sources asked me to use xargs instead. But I am not sure about that. What do you think?
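
For reference, an xargs-based equivalent of that command could look something like this (the /root/file and /path/to/dir paths are the same placeholders as above):

# use xargs instead of find's -exec; -print0 and xargs -0 pass the
# file names safely even if they contain spaces
for i in $(grep ".xls" /root/file); do
  find / -name "$i" -type f -print0 | xargs -0 -I {} cp {} /path/to/dir/
done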
 
LVL 4
Author Comment by mail2vijay1982
Tintin: I have tried a sample of your code, but no go. Can you check this?

BTW, this is for the first question,



#!/bin/bash
cd ./thelogfiles/
grep -il "/*.xls" *.log | while read string
do
  find . -name "$*.log" -type f | xargs -i cp {} ./extractedfiles/
done

LVL 14
Assisted Solution by small_student (earned 334 total points)
Well, instead of -exec you can use -ok; it will prompt you for each copy and you can choose yes or no. I think (although not 100% sure) that the exec in my command is not the same as the bash exec; it is a find built-in switch with the same name.

Try this, and if you don't like the result you can Ctrl+C and nothing is harmed:

 for i in $(grep ".xls" /path/to/file); do find / -name $i -ok cp {} /path/to/dir \;; done

LVL 4
Author Comment by mail2vijay1982
small_student: I have tried this command; no errors, but no output either... nothing in ./extractedfiles.


for i in $(grep ".xls" ./thelogfiles); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

Any suggestions?

LVL 14
Assisted Solution by small_student (earned 334 total points)
OK, let's root out the confusion:

./thelogfiles    (this should be a file, not a directory; if it is a directory, then use grep -r ".xls" ... etc.)
./Documents  (this is where all the files are located and what you are trying to search in; this was / in your first post)

./extractedfiles  (this is the location that the files will finally end up being copied to)

Note: use full paths instead of relative paths for all three of the above, as in the sketch below.
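
Spelled out with absolute placeholder paths (replace the /home/vijay/... paths with the real locations), the command would be:

# /home/vijay/thelogfiles    : the file holding the .xls names
# /home/vijay/Documents      : the tree to search for those files
# /home/vijay/extractedfiles : where the copies should end up
for i in $(grep ".xls" /home/vijay/thelogfiles); do
  find /home/vijay/Documents/ -name $i -ok cp {} /home/vijay/extractedfiles/ \;
done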

LVL 4
Author Comment by mail2vijay1982
small_student: Please do check this line, it is giving a warning message...

for i in $(grep "/*.xls" ./thelogfiles/*); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

the warning msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?

LVL 4
Author Comment by mail2vijay1982

small_student:

./Documents  (a match could be inside this folder or any subfolder inside it; the search should cover all directories and subdirectories inside ./Documents)

./thelogfiles/*.log  (would this help?)


for i in $(grep "/*.xls" ./thelogfiles/*.log); do find ./Documents/ -name $i -ok cp {} ./dis \;; done

LVL 4
Author Comment by mail2vijay1982
Sorry, not ./dis; it should be:

for i in $(grep "/*.xls" ./thelogfiles/*.log); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

LVL 14
Assisted Solution by small_student (earned 334 total points)
1- No, you can't do this. First of all, you can't use a / to escape a wildcard like *; you put a \ before it.
2- Don't use any * in anything here.
3- Type the command as follows:

for i in $(grep -r ".xls" ./thelogfiles/); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

LVL 4
Author Comment by mail2vijay1982
small_student:

I have noticed one thing: using -exec, I was able to copy at least one file to ./extractedfiles.

With -ok, there is no output... nothing in ./extractedfiles,
but I am getting the same warning message as mentioned above.


the warning msg:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?

LVL 14
Expert Comment by small_student
Please follow the tips in my last post and let me know what happens

LVL 4
Author Comment by mail2vijay1982

small_student: that's what I have been saying already. I tried this:

for i in $(grep -r ".xls" ./thelogfiles/); do find ./Documents/ -name $i -ok cp {} ./extractedfiles \;; done

and I am getting this warning message:

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.

find: warning: Unix filenames usually don't contain slashes (though pathnames do).  That means that '-name `./thelogfiles/something.log:File'' will probably evaluate to false all the time on this system.  You might find the '-wholename' test more useful, or perhaps '-samefile'.  Alternatively, if you are using GNU grep, you could use 'find ... -print0 | grep -FzZ `./thelogfiles/something.log:File''.
< cp ... ./Documents/see.xls > ?

LVL 14
Expert Comment by small_student
Aha, OK, I think I know what the problem is: the grep is returning more than just a file name, it is returning a full path. We need to filter it down to the file names only.

Let's do a little test.

I want to see the output of the grep part only. Please run the following command and post its output here:

grep -rl ".xls" /thelogfiles | head

LVL 4
Author Comment by mail2vijay1982

small_student:

Tried this,
grep -rl ".xls" ./thelogfiles | head

OUTPUT:

./thelogfiles/oh135.log
./thelogfiles/12the.log
./thelogfiles/233345t4w_456.log

LVL 4
Author Comment by mail2vijay1982
Sorry, and this one too:

./thelogfiles/something.log

LVL 14
Expert Comment by small_student
Oh, sorry, I made a mistake here. This output shows the files that actually have lines containing .xls in them; what I wanted is the line itself.

Post the output of this:

grep -r ".xls" ./thelogfiles | head

LVL 4
Author Comment by mail2vijay1982

small_student:

Tried this,
grep -r ".xls" ./thelogfiles | head

the output

./thelogfiles/oh135.log:123.xls
./thelogfiles/12the.log:File name           456.xls
./thelogfiles/233345t4w_456.log:File name           789.xls

LVL 4
Author Closing Comment by mail2vijay1982
For the second question, I sorted it out by myself with the help of the internet.

For the first question, the answer is still partly pending.

Code:
for i in $(grep -rh ".xls"  ./thelogfiles/) ; do find ./Documents/ -iname $i -print0 -exec cp {} ./extractedfiles/ \;; done

The "-h" suppresses the path prefix, so grep shows just the matching line (with the file name in it).
So now I am getting the output in the "extractedfiles" folder, except for file names containing spaces, e.g. "this that.xls". I think I can sort that out.
But if anyone has a solution for these spaces, it would be very much appreciated. And if anyone can do this code with xargs instead of exec, do comment below...
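
For what it's worth, here is a sketch that handles names with spaces and uses xargs instead of -exec. It assumes the same ./thelogfiles, ./Documents and ./extractedfiles layout, and that each matching log line ends with the file name (possibly after a "File name" label), as in the sample output above:

# pull the .xls name out of each matching log line, then copy every file
# with that name; -print0 / xargs -0 keep names with spaces intact
grep -rh '\.xls' ./thelogfiles/ \
  | sed 's/.*File name[[:space:]]*//' \
  | while IFS= read -r name; do
      find ./Documents/ -iname "$name" -type f -print0 \
        | xargs -0 -I {} cp {} ./extractedfiles/
    done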

I appreciate everything the experts above have done to help me out. Thanks.