Solved

How to keep remote files up to date with master server?

Posted on 2004-09-18
8
231 Views
Last Modified: 2008-02-01
I have remote servers that have 2 files that I need to keep up to date with the 2 files on my master server.

The problem is my remote servers are both Windows and Linux.

What's the best way I should do this? It doesn't have to be PHP, but that would be nice.


Thanks guys!
0
Comment
Question by:Brad_nelson1
8 Comments
 
LVL 4

Expert Comment

by:Skonen
ID: 12094370
It depends on what you consider "up to date". If you've got SSH or PHP console access, you can use the following small script which, depending on the file sizes, could have some undesired bandwidth usage:

<?php

$strServer = "http://www.mysite.com";

$aFiles = array(); $aDest = array();

//These are the files you are grabbing from the server
$aFiles[] = "example1.txt";
$aFiles[] = "files/example2.txt";

//Where to put the corresponding files
$aDest[] = "example1.txt";
$aDest[] = "example2.txt";

$iMinutes = 15;
$iSleep = $iMinutes * 60;

$sizeofFiles = sizeof($aFiles);

while (1) {

    for ($i = 0; $i < $sizeofFiles; $i++) {

        if ($fp = fopen($strServer . '/' . $aFiles[$i], "rb")) {
            $file_contents = "";
            while (!feof($fp)) {
                $file_contents .= fgets($fp, 1024);
            }
            //write to the destination handle, not the source handle
            if ($fp_dest = fopen($aDest[$i], "wb")) {
                fwrite($fp_dest, $file_contents);
                @fclose($fp_dest);
            }

            @fclose($fp);
        }
    }

    sleep($iSleep);
}

?>
0
 
LVL 4

Expert Comment

by:Skonen
ID: 12094380
It will attempt to download the files and write them to the local server every 15 minutes. It does not check whether the files have been updated, but since you referred to it as a master server, that really shouldn't matter. You may also need to add the following line after the opening PHP tag (doubtful):

ini_set("max_execution_time", 0);
0
 
LVL 49

Expert Comment

by:Roonaan
ID: 12094766
Do you have access to cron or the Windows Task Scheduler on your servers? You could then run the following PHP file as a cron job/scheduled task.

<?php
  //notice the url and dir difference with the two variables below!
  $master_server_url = 'http://www.myfirstdomain.com/';
  $second_server_dir = '/wwwhome/';

  //all files that needed to be checked
  $files = array('news/news.txt', 'conf/conf.xml');

  foreach($files as $file)
  {
      //Use some vars just for understandability
      $file1 = $master_server_url.$file;
      $file2 = $second_server_dir.$file;
     
      //retrieve the last modification time of the files.
      $time1 = @filemtime($file1);
      $time2 = @filemtime($file2);
     
      //If file1 is newer than file2, update.
      if($time2 < $time1)
         @copy($file1, $file2);
  }
?>

I added loads of '@'s to suppress warnings, but the idea should be clear. Warnings can occur when files aren't accessible, when your file2s aren't chmodded to be writable, etc.

-r-
0
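For reference, the cron/Task Scheduler setup Roonaan suggests could look something like this. The script name sync.php, the paths, and the PHP binary locations are assumptions for illustration, not from the thread:

```shell
# Linux: run the sync script every 15 minutes via cron
# (add this line with `crontab -e`; the path to the php binary may differ)
*/15 * * * * /usr/bin/php /wwwhome/sync.php

# Windows: schedule the same script with the built-in schtasks command
schtasks /create /tn "FileSync" /tr "C:\php\php.exe C:\wwwhome\sync.php" /sc minute /mo 15
```

With scheduling handled externally, the PHP script itself no longer needs the `while (1) { ... sleep(); }` loop or the `max_execution_time` override.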
 
LVL 4

Expert Comment

by:Skonen
ID: 12096480
I didn't realize the core copy function properly handled remote files, so my example is much more bloated. Roonaan is correct about cron/Task Scheduler; it's preferred as it doesn't hog nearly as many resources. Next time I won't attempt to answer a question when I've gone more than 24 hours without sleep ;)
0
 
LVL 8

Expert Comment

by:hendridm
ID: 12097246
If you have root, you could schedule an rsync task to mirror the files:
http://www.danhendricks.com/articles/My%20Articles/Poor%20Man's%20Guide%20to%20Load%20Balancing/
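As a sketch, assuming SSH access and rsync installed on both ends (the hostname and paths below are placeholders), the mirroring could be a single command run from cron:

```shell
# Pull the shared files from the master over SSH; rsync only transfers
# the changed portions, so this is cheap to run frequently.
# -a preserves permissions/timestamps, -v is verbose, -z compresses.
rsync -avz user@master.example.com:/wwwhome/files/ /wwwhome/files/
```

The Windows boxes would need an rsync port (e.g. cwRsync or Cygwin's rsync) for the same approach to work there.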
0
 

Author Comment

by:Brad_nelson1
ID: 12099267
 //notice the url and dir difference with the two variables below!
  $master_server_url = 'http://www.myfirstdomain.com/';
  $second_server_dir = '/wwwhome/';

I'm not understanding that top part of the code.

My files will be located at: http://domain.com/files/
My remote server would be: http://ipaddress/files/

So would that make the top:
  $master_server_url = 'http://www.domain.com/files/';
  $second_server_dir = 'http://ipaddress/files/';


0
 

Author Comment

by:Brad_nelson1
ID: 12099478
I got it. I had to make it:

$master_server_url = 'http://www.domain.com/files/';
 $second_server_dir = 'C:\\location\\of\\files\\';

I also had to change the < to a > in:

//If file1 is newer than file2, update.
      if($time2 > $time1)
         @copy($file1, $file2);

Thanks for the help guys!
0
 
LVL 49

Accepted Solution

by:
Roonaan earned 250 total points
ID: 12100035
Hmm. This system isn't ideal, as filemtime doesn't support remote files.
I was wondering why you changed the < into a >, because that didn't sound logical at all.
The reason you had to do this is that filemtime(remotefile) returns false/0, which will therefore always be lower than filemtime($file2). Something like the code below will probably be more efficient, because in the current version your files just get copied every time.

I'm in doubt now, because the stat()-family functions like filemtime, filesize, etc. cannot be used over http/https/ftp/etc., only on the local filesystem. It would therefore seem logical to check for a difference in, for example, file contents using md5(file_get_contents()) and compare the resulting md5 hash values. But then again, when you are using file_get_contents() you might just as well use copy, because the whole file has to be pumped over the web anyway.

A solution can be to have a script running on your primary server which reports the filemtime of the file to be checked:
usage: http://www.domain.com/filemtime.php?/news/data.txt

<?php
$file = realpath(dirname(__FILE__).'/'.$_SERVER['QUERY_STRING']);
if(!$file || str_replace(dirname(__FILE__), '', dirname($file)) == dirname($file))
{
 //they tried to check a file you didn't allow to be checked
 echo 0;
}
else
{
 //echo (not return) so the value is sent in the HTTP response body
 echo filemtime($file);
}
?>

On the secondary server you would then have:

  foreach($files as $file)
  {
      //Use some vars just for understandability
      $file1 = $master_server_url.$file;
      $file2 = $second_server_dir.$file;
     
      //retrieve the last modification time of the files.
      $time1 = intval(file_get_contents($master_server_url.'filemtime.php?'.$file));
      $time2 = @filemtime($file2);
     
      //If file1 is newer than file2, update.
      if($time2 < $time1)
         @copy($file1, $file2);
  }

I hope you understand the problem with the last script and the somewhat more difficult new approach.

-r-
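The same check-then-copy protocol can be sketched in shell with curl against the filemtime.php helper above. This is only an illustration: the hostname, local directory, and file list are placeholders, and `stat -c %Y` is the GNU coreutils form (BSD systems use `stat -f %m`):

```shell
#!/bin/sh
# For each file: ask the master for its mtime via filemtime.php,
# then download only when the master's copy is newer than ours.
MASTER="http://www.domain.com/"
LOCALDIR="/wwwhome/"

for file in news/news.txt conf/conf.xml; do
    # filemtime.php prints the Unix mtime (or 0 for disallowed paths)
    remote_mtime=$(curl -s "${MASTER}filemtime.php?${file}")
    # a missing local file counts as mtime 0, so it is always fetched
    local_mtime=$(stat -c %Y "${LOCALDIR}${file}" 2>/dev/null || echo 0)

    if [ "${remote_mtime:-0}" -gt "$local_mtime" ]; then
        curl -s -o "${LOCALDIR}${file}" "${MASTER}${file}"
    fi
done
```

Only the small mtime integer crosses the wire on each check; the full file is transferred just when it has actually changed.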
0


Question has a verified solution.
