• Status: Solved
  • Priority: Medium
  • Security: Public

global search and replace?

I need to go through every file in my webserver root directory and its subdirectories (over 600 files) and replace this text:

<!--#include virtual="/header/header.html"-->

with this text:

<!--#include virtual="/cgi/header/header.cgi?sec=header"-->

Can someone give me a good idea of how to do it? I am on Red Hat 8.0.

Thanks!

1 Solution
 
jdfox Commented:
How about this script:

#!/bin/bash
# write a modified copy of each file into ./temp, leaving the originals untouched
mkdir temp
for n in *
do sed 's/original string/new string/g' "$n" > "temp/$n"
done

Perl could probably do it in two short lines, but hey. :-)
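For what it's worth, a rough sketch of the Perl route (untested; it relies on perl's -i in-place editing, keeps a .bak copy of each file, and uses braces as the s{}{} delimiters because the include paths themselves contain slashes):

# edit every .html file under the current directory in place, keeping a .bak copy of each original
find . -name "*.html" -exec perl -pi.bak -e \
  's{\Q<!--#include virtual="/header/header.html"-->\E}{<!--#include virtual="/cgi/header/header.cgi?sec=header"-->}g' {} \;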

If you had more than just HTML files in that directory, you could change the shell script above like this, so it would only work on .html files:

#!/bin/bash
mkdir temp
# only touch .html files; recreate each subdirectory under ./temp
# so the redirect into temp/$n has somewhere to land
for n in $(find . -name "*.html")
do
  mkdir -p "temp/$(dirname "$n")"
  sed 's/original string/new string/g' "$n" > "temp/$n"
done
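One thing to watch: the actual strings in your question contain slashes, so with sed you would either escape each / or pick a different delimiter. The sed line in the loop could then look something like this (a sketch, untested, using | as the delimiter):

sed 's|<!--#include virtual="/header/header.html"-->|<!--#include virtual="/cgi/header/header.cgi?sec=header"-->|g' "$n" > "temp/$n"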


HTH,
--
JF
 
rustycp (Author) Commented:
Cool. I could do it in Perl, but I really wondered about a shell script possibility. Thanks!
 
arn0ld Commented:
The solution above is "safe" in that it leaves the original files unchanged. However, it will rewrite all *.html files even if they do not contain "original string". I like to retain the last-change date on my files, so I would use something like the following.

cd root_dir
# only files that actually contain the string get edited, so everything
# else keeps its original modification time
for n in $(find . -type f -name "*.html" -exec grep -l "original string" {} \;)
do
ex "$n" <<!
g/original string/s//new string/g
w
q
!
done

I would back up my original files before I ran this.
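Something quick along these lines would do for that (a sketch; assumes there is space for a second copy and that root_dir is the web root used above):

# make a straight copy of the tree before editing files in place
cp -a root_dir root_dir.bak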