
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 368

File Search and contents scan with Perl

I have a list of file names in a text file. I need to check whether each file exists in a directory and/or any of its sub-directories (the parent directory is remote, if that makes a difference, so I am using a path like '\\computer\share$\parent'). If a file exists, I want to read it and look for certain string patterns using regular expressions. This has to be done in Perl, but I am a complete novice!

This process will be running over a huge amount of files so needs to be quite efficient and close any unused resources if there are any... Any ideas?
1 Solution
This might help you get started:
use strict;
use warnings;
use File::Find;

# Load all of the file names into a hash so we only have to walk the
# directory tree once, checking each file found against the wanted list.
my %wanted_file;
while ( my $file_to_find = <DATA> ) {
    chomp $file_to_find;
    $wanted_file{$file_to_find} = 1;
}

# Search the directory tree and, if a file's name is one of the names
# loaded into the hash above, call a subroutine so it can be processed.
my $remote_dir = '\\\\computer\share$\parent';
find( sub { process_file($File::Find::name) if $wanted_file{$_} }, $remote_dir );

# Found a wanted file - it can be processed here.
sub process_file {
    my ($file) = @_;
    print "Found a file ... $file\n";
}

__DATA__
yet another file name.txt
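The code above only reports each file it finds; the question also asks about scanning the contents of each found file for string patterns with regular expressions. A minimal sketch of how that could be done from `process_file` (the sub name `scan_file` and the pattern passed to it are placeholders of my own, not from the thread) - the file handle is lexical and closed as soon as the scan finishes, so no resources are left open:

```perl
use strict;
use warnings;

# Hypothetical helper: scan one file for a pattern and return matching lines.
# Substitute whatever regular expression you actually need to look for.
sub scan_file {
    my ($file, $pattern) = @_;
    my @matches;
    open( my $fh, '<', $file ) or do {
        warn "Cannot open $file: $!";
        return;
    };
    while ( my $line = <$fh> ) {
        push @matches, $line if $line =~ /$pattern/;
    }
    close $fh;    # release the handle as soon as we are done with this file
    return @matches;
}
```

From `process_file` you would then call something like `my @hits = scan_file($file, qr/some pattern/);` and act on the results.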


I have used an in-line file (the __DATA__ section) for the file names - these can just as easily be read from an external file.
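For completeness, a sketch of loading the wanted names from an external file instead of the __DATA__ section (the sub name and the list file name 'files_to_find.txt' are made up for illustration):

```perl
use strict;
use warnings;

# Hypothetical helper: build the wanted-files hash from an external list,
# one file name per line, instead of the in-line __DATA__ section.
sub load_wanted_files {
    my ($list_file) = @_;
    my %wanted;
    open( my $list, '<', $list_file ) or die "Cannot open $list_file: $!";
    while ( my $name = <$list> ) {
        chomp $name;
        $wanted{$name} = 1;
    }
    close $list;
    return %wanted;
}
```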

What do you mean by 'a huge amount of files'?  100's, 1000's, ... ?
