I'm currently running Gutsy on my home server, so there's no X on it. I have three websites online. I'd like to know if there is a way to get my server to automatically connect to these three websites via FTP and download them for backup. It would also be nice if it compressed them automatically. Better still would be if it checked for new files only and downloaded just those, so I don't download the same large files over and over again.
You just need a script that connects to each site via FTP, mirrors the web directories to wherever you keep your backups, and compresses the result. Then schedule the script to run periodically, for example with cron.
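A minimal sketch of such a script, using lftp's `mirror --only-newer` so already-downloaded files are skipped. The hostnames, credentials, and paths below are placeholders, not anything from the original question:

```shell
#!/bin/sh
# Mirror three sites via FTP, fetching only new/changed files, then
# compress each mirrored tree into a dated archive.
BACKUP_DIR="$HOME/site-backups"
DATE=$(date +%Y%m%d)

mkdir -p "$BACKUP_DIR"
for site in site1.example.com site2.example.com site3.example.com; do
    # --only-newer skips files that are already present and up to date
    lftp -u "user,password" "ftp://$site" \
        -e "mirror --only-newer / $BACKUP_DIR/$site; quit"
    # Pack the mirrored tree into a compressed, dated tarball
    tar czf "$BACKUP_DIR/$site-$DATE.tar.gz" -C "$BACKUP_DIR" "$site"
done
```

To run it nightly, a crontab entry such as `0 3 * * * /home/you/bin/backup-sites.sh` would start it at 3 a.m. (the script path is hypothetical).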
If you use NetScaler, you will want to see these guides. The NetScaler How To Guides show administrators how to get NetScaler up and configured, with instructions for common scenarios and some less common ones.
In this article, I am going to show you how to simulate a multi-site Lab environment on a single Hyper-V host. I use this method successfully in my own lab to simulate three fully routed global AD Sites on a Windows 10 Hyper-V host.
Learn how to find files with the shell using the find and locate commands.
Use locate to find a needle in a haystack. With locate's result in hand, check whether the file still exists. Use find to get the file's actual, current location.
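The difference between the two can be sketched in a few lines; the file and directory names here are made up for illustration:

```shell
# Create a sample file to search for.
mkdir -p /tmp/demo && touch /tmp/demo/notes.txt

# locate searches a prebuilt index, so it is fast but can be stale;
# run updatedb (as root) to refresh the database before trusting it:
#   locate notes.txt

# find walks the filesystem live, so it always reflects the current
# state -- use it to confirm the file really exists right now:
find /tmp/demo -name 'notes.txt'
# prints /tmp/demo/notes.txt
```

The usual workflow matches the steps above: locate for a quick candidate list, then find (or a plain `ls`) to verify the hit is still there.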