Ubuntu Server Backup

Promodel asked:

Hi there,

We have an Ubuntu server running Apache and PostgreSQL, and we would like to back up the whole disk daily.

dd on Unix can be scheduled to do this, but it seems it cannot back up a running system. Alternatively, if we just tar the root directory, which directories should be included so that, when we install a new Ubuntu server on another machine, replacing the corresponding directories makes the new machine serve as the original one?

Are there any other backup solutions for Ubuntu Server, comparable to Acronis, Paragon, or Ghost on Windows, that can be installed and run as a scheduled task every night without shutting down the machine?

Thanks.
a1j replied:

Look up Amanda or Bacula, whichever you're comfortable with.
Promodel (asker) replied:

We do not need network backup. Although Amanda or Bacula can back up the local host, we would prefer a solution that clones the Ubuntu system locally; otherwise we would need to install Amanda before we could recover anything.

Since we have installed and configured a number of applications on this server, we want a backup in case the machine crashes, so that we can spend as little time as possible getting it working again.

We are looking for disk-imaging software along the lines of Acronis, Paragon, or Symantec: something that can run a scheduled backup daily and restore the image using a simple recovery CD and the image file.

If there is no such backup software for Linux, then we will have to back up the root directory every day and install a fresh Ubuntu server on a new machine. The question is: can we replace the root directory on the new server with the backup so that all the software and applications run as they did on the original machine?
a1j replied:

If you just need a simple tool to dump and restore a filesystem, there are tools borrowed from the BSD world for exactly that. They are called dump and restore:
http://manpages.ubuntu.com/manpages/hardy/man8/restore.8.html
http://manpages.ubuntu.com/manpages/hardy/man8/dump.8.html

They can be run on a working system, and backups can be taken incrementally.
By the way, don't attempt to tar a live database; the restore will give you a corrupted database. dump will do better, but still not well enough.
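
As a rough sketch of how dump and restore might be used (the device name /dev/sda1 and the backup paths below are assumptions, not details from this thread):

    # Weekly full (level 0) dump; -u records the dump date in /etc/dumpdates
    dump -0u -f /backup/root.level0.dump /dev/sda1

    # Nightly incremental (level 1): only files changed since the last lower-level dump
    dump -1u -f /backup/root.level1.dump /dev/sda1

    # Recovery: restore into a freshly formatted filesystem mounted at /mnt/newroot
    cd /mnt/newroot
    restore -r -f /backup/root.level0.dump
    restore -r -f /backup/root.level1.dump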

You need to use the database's own tools to back up the database (read the database docs). Those tools take a proper snapshot before the backup runs, so the backup ends up in a consistent state.
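
For PostgreSQL specifically, a consistent dump can be taken while the server is running; the database name mydb below is only a placeholder:

    # Dump every database plus roles and other globals as plain SQL
    pg_dumpall -U postgres > /backup/pg_all.sql

    # Or dump one database in PostgreSQL's compressed custom format
    pg_dump -U postgres -Fc mydb > /backup/mydb.dump

    # Restore the custom-format dump into a freshly created database
    pg_restore -U postgres -d mydb /backup/mydb.dump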

Keep your config files in a source code repository (such as git), and keep the scripts that install and configure the system in that repository as well. Everything else can simply be reinstalled from the Ubuntu distribution.
This way you end up with a very small amount of data that needs to be backed up, your system stays much cleaner, and you can set up an identical second machine in a matter of minutes.
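
As a sketch of that approach (the paths and remote below are illustrative, not from this thread):

    # Put /etc under version control (run once)
    cd /etc
    git init
    git add .
    git commit -m "Initial server configuration"

    # After every configuration change
    git add -A
    git commit -m "Describe the change"

    # Push to a remote so the history survives a disk failure
    git remote add backup user@backuphost:/srv/git/etc.git
    git push backup master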
Promodel (asker) replied:

Thanks, a1j.

Unfortunately, we have not kept the installation and configuration scripts; we just installed things as we ran into problems (e.g. packages from CPAN). We may not know how to reconfigure the server from scratch now that it is working fine, which is why we want to back up the whole system. Also, newer versions of the software will be installed in the future, and I am not sure the applications running on this server are compatible with the updated versions.

Since there is no software that makes a hot backup, could we use Clonezilla to clone the disk once, and then use pg_dumpall to back up the PostgreSQL database every night? Then, after Clonezilla restores the image to a disk, we could import the up-to-date database into the "old" system.
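
A minimal sketch of such a nightly schedule (the time, user, and paths are assumptions):

    # /etc/cron.d/pg-backup: dump all databases at 02:00 as the postgres user
    0 2 * * * postgres pg_dumpall > /backup/pg_all.sql

    # After restoring the disk image, re-import the latest dump:
    #   psql -U postgres -f /backup/pg_all.sql postgres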

By the way, are dump and restore on Ubuntu like tar, in that they just copy the file system? If so, do we still need to worry about GRUB (I am not familiar with GRUB) and software configuration?
crazedsanity replied:

A quick thought: I would personally recommend backing up Apache and PostgreSQL and putting the backups onto a remote server or drive. There is also a way to retrieve a list of installed CPAN modules using "pmall"; see http://www.cpan.org/misc/cpan-faq.html (search for "pmall").
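
If pmall (from the pmtools package) is not installed, a stock-Perl alternative is the ExtUtils::Installed module; this one-liner is a sketch of that approach:

    # List every module distribution Perl knows it has installed
    perl -MExtUtils::Installed -e 'print "$_\n" for ExtUtils::Installed->new->modules'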

Clonezilla does not support online imaging/cloning, and anything with RAID is manual.

I've run into a lot of problems when trying to make a backup system identical to the system it is backing up. For that reason, I just make sure to have a copy of the required config files so they can be copied over as needed. I would also consider keeping a list of all installed RPM or DEB packages so they can be reinstalled as well.
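
On a Debian/Ubuntu system, that package list could be captured and replayed roughly like this (the file path is illustrative):

    # Save the list of installed packages
    dpkg --get-selections > /backup/packages.list

    # On the replacement machine, replay the list and install everything on it
    dpkg --set-selections < /backup/packages.list
    apt-get dselect-upgrade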
[Asker-certified solution posted by a1j; the solution text is available to Experts Exchange members only.]
Promodel (asker) replied:

Hi crazedsanity,

Could you please describe the problems you have run into when using Clonezilla?

Since there is no software for online imaging, we do need at least one usable image, and we want to know whether that image will work on different hardware. Clonezilla seems like an easy option for this.

Looking forward to your reply.
crazedsanity replied:

I've never used Clonezilla itself, but I've used the same imaging technique it (from what I can tell) uses, namely the Linux utility dd, which does a bit-for-bit copy of the drive to a file. That works fine, though it is pretty space-intensive: if the drive being copied is 1 TB, the image will be that size, even when only a few gigabytes are actually in use.
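
As a sketch, a dd image can be piped through gzip so that unused space compresses away (the device and file names are assumptions; this is best done from a live CD with the filesystems unmounted):

    # Image the whole disk, compressing on the fly
    dd if=/dev/sda bs=4M | gzip > /backup/sda.img.gz

    # Write the image back to a disk of the same size or larger
    gunzip -c /backup/sda.img.gz | dd of=/dev/sda bs=4M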

Cloning from one system to another is problematic in that it is nearly impossible to end up with an identical system. I've cloned from one laptop to another of the same model, and the two still contained different hardware.

As I noted before, I would suggest keeping a copy of the configuration files from the main server so they can be restored onto a backup system. Instead of relying on a clone (which takes quite a bit of time), you can copy just those configurations and then run tests on the backup system fairly easily (which I would also recommend).

Getting the list of things that need to be backed up can be a pain and very time-consuming, but it gives you the most flexibility, which is important if an identical backup machine is not available when the main system has a hardware failure.

Any cloning utility run on a working system will produce an inconsistent database backup: the utility copies the disk sequentially, so different parts of the disk are captured at different times, and the Linux filesystem does not support snapshotting.
You have to follow the database's backup guide to produce a working database backup; a simple disk backup will not do it.