Solved

Backing up current RHEL 5 Web Server Disk to USB Drive, Best Way to do this?

Posted on 2013-12-17
58 Views
Last Modified: 2016-07-13
We are using a dedicated web server with Red Hat Enterprise Linux 5 installed, which we use to host a couple of our websites.

We asked our host to attach an external USB hard drive to our server so we could make a backup copy of our current files and DB data.

My question is: what is the best way to do this? How can we copy everything needed on our server to the USB drive without downtime?

1) We're thinking of using rsync to copy the files. Is there a better way to do this? Something faster and more secure (no file corruption, and it should preserve file metadata)?

2) Since the websites are live and there are a lot of visitors, the DB is always changing, so rsync would have a problem with it. What is the best way to copy our DB data without downtime? The DB ranges from 20GB to 50GB, and the DB server is MySQL 5.0.95.

3) Is there anything we shouldn't copy? So far I know that copying:
/proc/
/tmp/
is just a waste of time and HD space, since we won't be able to use them. Is there anything else like this that I should avoid copying?
Question by:ibr33
8 Comments
 
LVL 27

Accepted Solution

by:
skullnobrains earned 250 total points
ID: 39726440
1) if you want to copy files, rsync is a reasonable choice with tons of useful features. in your case, you should probably disable checksum computing since comparing mtimes should be enough.
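
a minimal invocation along those lines (paths are hypothetical; assumes the web root lives in /var/www and the usb drive is mounted at /mnt/usb):

# archive mode preserves permissions, owners, timestamps and symlinks;
# -H keeps hard links; no --checksum, so comparison is mtime+size
rsync -aH --delete /var/www/ /mnt/usb/backup/www/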

unison would be a reasonable alternative

you may also consider mirroring filesystems, or even using an incremental block-level copy if you want to copy more stuff
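
if you do go block-level, a raw image sketch (device names are assumptions; beware that imaging a mounted, live filesystem can yield an inconsistent copy):

# whole-disk image onto the usb drive, assuming the system disk is /dev/sda
# and the usb drive is mounted at /mnt/usb
dd if=/dev/sda of=/mnt/usb/backup/disk.img bs=4M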

2) you'll find tools for each backend, for example:
- InnoDB Hot Backup (not free) for innodb
- xtrabackup for xtradb. it should work with innodb as well but will lock stuff
- myisam files cannot be copied without at least locking the tables during the copy; see the sketch below
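
for myisam, the mysqlhotcopy script bundled with mysql does the flush-lock-copy dance for you. a sketch, with the database name, password and paths as placeholders:

# read-locks and flushes the tables, copies the table files, then unlocks
mysqlhotcopy --user=root --password='secret' mydb /mnt/usb/backup/mysql/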

there are quite a few alternate ways. i'd suggest this:
- set up regular replication to another mysql instance on the same or a remote host
- back up that instance by doing a clean copy while the replica server is stopped
you may also snapshot the filesystem while the replica server is stopped. note that using snapshots without stopping the server will most likely not work unless you lock the whole database and flush the innodb buffers, and even then you'll have to restart it in recovery mode (ie ignore the innodb log files)
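
a sketch of the cold copy on the replica (rhel 5 init scripts; assumes the default datadir /var/lib/mysql and the usb drive on /mnt/usb):

# stop the replica cleanly, copy its datadir, bring it back up;
# the master keeps serving traffic the whole time
service mysqld stop
rsync -a /var/lib/mysql/ /mnt/usb/backup/mysql/
service mysqld start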

3)
many things such as /var/run and /var/tmp

you should probably either mirror the whole disk or copy only what you need. it is easier to reinstall from a default install, a list of packages, possibly a few config files (network setup, hostname and the likes), and obviously your database files, than to try to mirror the needed files and reapply them on a new system. the install should be fairly easy to script; see the sketch below
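
something like this is usually enough to capture what a scripted reinstall needs (file names and paths are examples):

# record the installed package set and a few config files
rpm -qa --qf '%{NAME}\n' | sort > /mnt/usb/backup/packages.txt
tar czf /mnt/usb/backup/etc.tar.gz /etc/sysconfig/network /etc/hosts /etc/httpd
# later, on the freshly installed box:
yum -y install $(cat /mnt/usb/backup/packages.txt)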

hope that helps
 
LVL 16

Assisted Solution

by:Joseph Gan
Joseph Gan earned 250 total points
ID: 39728011
You could use the Linux dump/restore commands to back up filesystems to the USB drive, and restore them back, if the filesystem is ext2, ext3, or ext4. E.g.:

# /sbin/dump -0u -f /dev/usb_drive /filesystem

Always start with a level 0 backup; after that you can use differential backups with levels 2, 3, etc.

Please look at man dump for details.
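
To restore, mount the freshly made target filesystem and run restore from inside it. A sketch, assuming the dump was written to /dev/usb_drive as above and the target is mounted at /mnt/target (an example path):

# cd /mnt/target
# /sbin/restore -rf /dev/usb_drive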
 
LVL 29

Expert Comment

by:serialband
ID: 39728337
You can use rsync as suggested above and use mysqldump to save the database.
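
For example, with the USB drive mounted at /mnt/usb (a hypothetical path):

# dumps every database into one SQL file on the usb drive
mysqldump -u root -p --all-databases > /mnt/usb/backup/all-databases.sql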
 
LVL 27

Expert Comment

by:skullnobrains
ID: 39728642
beware, mysqldump locks the tables while it runs. it is only a reasonable option on small databases
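
for innodb tables you can sidestep the lock with a consistent-snapshot dump; a sketch, reusing the hypothetical /mnt/usb path from above:

# --single-transaction takes a consistent innodb snapshot without read-locking
mysqldump -u root -p --single-transaction --all-databases > /mnt/usb/backup/all-databases.sql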
 
LVL 29

Expert Comment

by:serialband
ID: 39729326
20GB-50GB isn't too big at all and shouldn't take too long.  It might be prudent to just have a 2nd server with everything else duplicated to handle failover if your server is more critical.  You can also replicate the database to the 2nd server.
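
If you go the replication route, the gist on the 2nd server looks like this (hostname, credentials, and binlog coordinates are placeholders; the primary needs log-bin and a server-id set in my.cnf):

# point the replica at the primary and start replicating
mysql -u root -p -e "CHANGE MASTER TO MASTER_HOST='primary.example.com', MASTER_USER='repl', MASTER_PASSWORD='secret', MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=106; START SLAVE;"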
 
LVL 16

Expert Comment

by:Joseph Gan
ID: 41704248
One of the solutions has been provided above: ID: 39728011
