Solved

file search

Posted on 2003-03-25
4
Medium Priority
162 Views
Last Modified: 2010-04-21
I need some help with a program which searches a file on the linux server

an argument should be given on the command line for a program to search a file from any line for a word,
you must enter the word you want searched and also from which lines you want it searched in

I am not sure how to start this and would appreciate a starting point.
Question by:PALKTA
4 Comments
 

Expert Comment

by:aleph0x
ID: 8201748
Here is a short list of things to read:
man locate OR man find
man grep

locate would be better IMHO as long as there aren't too many things changing between the updates.

Shouldn't be too difficult to knit a script to do what you want after reading the man pages. Here is a dirty one:

---start---
#!/bin/bash

# local is only valid inside a function, and the $ does not
# belong on the left-hand side of an assignment
top_level=$1
word=$2

for i in `find "$top_level" -type f -print 2>/dev/null`
do
  fgrep -n "$word" "$i"
done
---end---
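One thing the script above skips is the "from which line" part of the question. For a single file, tail plus grep covers that (a sketch; the file name and line number are made up):

```shell
# search for "word" from line 20 onward; tail -n +20 starts
# output at line 20, grep then filters those lines for the word
tail -n +20 somefile.txt | grep word
```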
 
LVL 5

Accepted Solution

by:
ecw earned 750 total points
ID: 8203970
As long as the word does not contain a /,

  #!/bin/sh
  # start line arg 1, pattern arg 2, file arg 3 onwards
  start="$1"
  pattern="$2"
  shift 2
  sed -n "$start,\${/$pattern/p;}" ${1+"$@"}
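For reference, the sed idiom is an address range (start line to $, the last line) followed by a grouped command block that prints only matching lines. Standalone it looks like this (file name hypothetical):

```shell
# from line 3 to the end of the file, print only lines matching "error"
sed -n '3,${/error/p;}' /var/log/messages
```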

To extend it to directory trees,

  #!/bin/sh
  # start line arg 1, pattern arg 2, files/dirs arg 3 onwards
  start="$1"
  pattern="$2"
  shift 2
  for i in `find "$@" -type f -print` ; do
    sed -n "$start,\${/$pattern/p;}" "$i"
  done

Note that -print is probably superfluous; modern versions of find(1) default to -print.
 
LVL 2

Expert Comment

by:bkrahmer
ID: 8243939
Correct me if I'm wrong, but the 1st and 3rd solutions above both create a sub-process for each file.  From what I understand, xargs is more efficient because it passes multiple filenames to each invocation of your command, cutting time and cpu/disk overhead.
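For example (a sketch along those lines, not tested against the scripts above; variable names are made up), the find loop could become:

```shell
#!/bin/sh
# one fgrep per batch of files instead of one fgrep per file;
# the extra /dev/null argument forces fgrep to print the file
# name even when xargs happens to hand it a single file
# (filenames containing whitespace would need find -print0 | xargs -0,
# a GNU extension)
top_level=$1
word=$2
find "$top_level" -type f -print | xargs fgrep -n "$word" /dev/null
```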

brian
 
LVL 5

Expert Comment

by:ecw
ID: 8247867
xargs is far more efficient, if the command one uses does what you want to achieve.  There was no mention in the original post of searching through more than one file, and sed would not be appropriate if asked to process more than one file at a time. Awk would do the job though.
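A sketch of that awk version, taking the same arguments as the sed script above. FNR restarts at 1 for each input file, so the start line applies per file even when many files are passed to one awk process:

```shell
#!/bin/sh
# start line arg 1, pattern arg 2, files arg 3 onwards
start="$1"
pattern="$2"
shift 2
# print file name, line number within that file, and the line itself
awk -v start="$start" -v pat="$pattern" \
    'FNR >= start && $0 ~ pat { print FILENAME ":" FNR ": " $0 }' "$@"
```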
