
How to keep remote files up to date with master server?

I have remote servers with 2 files that I need to keep up to date with the 2 files on my master server.

The problem is that my remote servers are both Windows and Linux.

What's the best way to do this? It doesn't have to be PHP, but that would be nice.


Thanks guys!
1 Solution
 
SkonenCommented:
It depends on what you consider "up to date". If you've got SSH or PHP console access, you can use the following small script, which, depending on the file sizes, could have some undesired bandwidth usage:

<?php

$strServer = "http://www.mysite.com";

$aFiles = Array(); $aDest = Array();

//These are the files you are grabbing from the server
$aFiles[] = "example1.txt";
$aFiles[] = "files/example2.txt";

//Where to put the corresponding files
$aDest[] = "example1.txt";
$aDest[] = "example2.txt";

$iMinutes = 15;
$iSleep = $iMinutes*60;

$sizeofFiles = sizeof($aFiles);

while (1) {

    for ($i = 0; $i < $sizeofFiles; $i++) {

        //read the remote file into memory
        if ($fp = fopen($strServer . '/' . $aFiles[$i], "rb")) {
            $file_contents = "";
            while (!feof($fp)) {
                $file_contents .= fgets($fp, 1024);
            }
            //write it to the local destination
            if ($fp_dest = fopen($aDest[$i], "wb")) {
                fwrite($fp_dest, $file_contents, strlen($file_contents));
                @fclose($fp_dest);
            }

            @fclose($fp);
        }
    }

    sleep($iSleep);
}

?>
 
SkonenCommented:
It will attempt to download the files and write them to the local server every 15 minutes. However, this does not check whether the files have actually been updated, but since you referred to it as a master server, that really shouldn't matter. You may also need to add the following line after the opening PHP tag (though it's probably not needed):

ini_set("max_execution_time", 0);
 
RoonaanCommented:
Do you have access to cron or the Windows Task Scheduler on your servers? If so, you could run the following PHP file as a cron job or scheduled task (a sample schedule is sketched after the explanation below).

<?php
  //notice the url and dir difference with the two variables below!
  $master_server_url = 'http://www.myfirstdomain.com/';
  $second_server_dir = '/wwwhome/';

  //all files that needed to be checked
  $files = array('news/news.txt', 'conf/conf.xml');

  foreach($files as $file)
  {
      //Use some vars just for understandability
      $file1 = $master_server_url.$file;
      $file2 = $second_server_dir.$file;
     
      //retrieve the last modification time of the files.
      $time1 = @filemtime($file1);
      $time2 = @filemtime($file2);
     
      //If file1 is newer than file2, update.
      if($time2 < $time1)
         @copy($file1, $file2);
  }
?>

I added loads of '@' to suppress warnings, but the idea should be clear. Warnings could occur when files aren't accessible, when the destination files aren't chmodded to be writable, etc.
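
To schedule it, something along these lines would work. This is only a sketch: the script paths (/var/www/sync_files.php and C:\scripts\sync_files.php) and the PHP binary locations are placeholders you'd adjust to your setup.

# Linux: run the sync script every 15 minutes (add via crontab -e)
*/15 * * * * /usr/bin/php /var/www/sync_files.php

REM Windows: equivalent scheduled task, also every 15 minutes
schtasks /create /tn "SyncFiles" /tr "C:\php\php.exe C:\scripts\sync_files.php" /sc minute /mo 15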

-r-
 
SkonenCommented:
I didn't realize the core copy() function properly handled remote files, so my example is much more bloated. Roonaan is correct about cron/Task Scheduler; it's preferred as it doesn't hog nearly as many resources. Next time I won't attempt to answer a question when I've gone more than 24 hours without sleep ;)
 
hendridmCommented:
If you have root, you could schedule an rsync task to mirror the files:
http://www.danhendricks.com/articles/My%20Articles/Poor%20Man's%20Guide%20to%20Load%20Balancing/
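
For example, something along these lines, where the user, host, and paths are placeholders (and the Windows boxes would need an rsync port such as the one bundled with Cygwin):

# pull the shared files from the master and mirror them locally
rsync -avz user@master.example.com:/var/www/files/ /local/files/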
 
Brad_nelson1Author Commented:
 //notice the url and dir difference with the two variables below!
  $master_server_url = 'http://www.myfirstdomain.com/';
  $second_server_dir = '/wwwhome/';

I'm not understanding the top part of the code.

My files will be located at: http://domain.com/files/
My remote server would be: http://ipaddress/files/

So would that make the top:
  $master_server_url = 'http://www.domain.com/files/';
  $second_server_dir = 'http://ipaddress/files/';


 
Brad_nelson1Author Commented:
I got it, I had to make it:

$master_server_url = 'http://www.domain.com/files/';
 $second_server_dir = 'C:\\location\\of\\files\\';

I also had to change the < to a > in:

//If file1 is newer than file2, update.
      if($time2 > $time1)
         @copy($file1, $file2);

Thanks for the help guys!
 
RoonaanCommented:
Hmm. This system isn't ideal, as filemtime doesn't support remote files.
I was wondering why you changed the < into a >, because that didn't sound logical at all.
The reason you had to do this is that filemtime(remotefile) returns false/0, so it will always be lower than filemtime($file2). Something like the code below will probably be more efficient, because in the current version your files just get copied every time.

I'm in doubt now, because the stat() family of functions like filemtime, filesize, etc. cannot be used on http/https/ftp/etc., only on the local filesystem. Therefore it would seem logical to check for a difference in, for example, the file contents using md5(file_get_contents()) and match the resulting MD5 hash values. But then again, when you are using file_get_contents() you might as well just use copy, because the whole file has to be pumped over the web anyway.
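
A minimal sketch of that hash-comparison idea, reusing the $files, $master_server_url and $second_server_dir variables from the earlier script (note it still transfers the whole remote file on every check, so it only saves local disk writes, not bandwidth):

<?php
foreach ($files as $file)
{
    //fetch the remote copy; the local one may not exist yet
    $remote = file_get_contents($master_server_url.$file);
    $local  = @file_get_contents($second_server_dir.$file);

    //only rewrite the local file when the contents actually differ
    if ($remote !== false && md5($remote) !== md5((string)$local))
        file_put_contents($second_server_dir.$file, $remote);
}
?>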

A solution can be to have a script running on your primary server which returns the filemtime of the file to be checked:
usage: http://www.domain.com/filemtime.php?/news/data.txt

<?php
//resolve the requested file relative to this script's directory
$file = realpath(dirname(__FILE__).'/'.$_SERVER['QUERY_STRING']);
if(str_replace(dirname(__FILE__), '', dirname($file)) == dirname($file))
{
 //they tried to check a file you didn't allow to be checked
 echo 0;
}
else
{
 echo filemtime($file);
}
?>

On the secondary server you then would have:

  foreach($files as $file)
  {
      //Use some vars just for understandability
      $file1 = $master_server_url.$file;
      $file2 = $second_server_dir.$file;
     
      //retrieve the last modification time of the files.
      $time1 = intval(file_get_contents($master_server_url.'filemtime.php?'.$file));
      $time2 = @filemtime($file2);
     
      //If file1 is newer than file2, update.
      if($time2 < $time1)
         @copy($file1, $file2);
  }

I hope you understand the problem with the previous script and the somewhat more involved new approach.

-r-
