Avatar of Xetroximyn
Xetroximyn

asked on

Simple file level backup for redhat linux (preferably free)

Looking for a very simple, easy-to-set-up file-level backup for Red Hat Linux (backing up to a network share).

I have Acronis, but the latest version won't do file-level backups (even with the agent installed) because my main LVM partition is LUKS encrypted.... it forces me to do sector-by-sector, which means #1 HUGE backups, and #2 no file-level recovery. I already back up at the VM level for DR and just keep a few days.

But I need an easy, low-learning-curve (and preferably free, since I already pay for Acronis) way to back up 2 of my more important Linux servers, just at the file level. I'm looking for something easy to implement - I don't need fancy features, just:

1. can do full and incremental on a schedule
2. can do file level restore
3. can encrypt the backups.

I don't even need a central management console... I'm fine with something standalone.

These would back up to Samba shares on a NAS.

Any suggestions?

Thanks!
Avatar of arnold
arnold

With LUKS, a backup images the entire disk; to back up at the file level, the data has to be decrypted prior to backup.

dump | gzip, password protected.

dump/restore is the free, existing solution. The data can be gzipped/password protected.
tar, ...

Bacula, Zmanda, etc. are open source, but to back up at the file level they need .....
To have an encrypted backup, the destination has to have been set up as an encrypted filesystem.

One option is to password protect the dump file .....
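For example, a sketch of password protecting the dump stream with symmetric GnuPG encryption (this assumes GnuPG 2.1+ and a root-only passphrase file; the device and output paths are placeholders, not your actual layout):

dump -0uf - /dev/sda1 | gzip -c \
  | gpg --batch --yes --pinentry-mode loopback \
        --passphrase-file /root/.backup-pass \
        --symmetric --cipher-algo AES256 \
        -o /mnt/nas/full-backup.bak.gz.gpg

Restoring reverses the pipeline: gpg --decrypt with the same passphrase file, then gunzip, then restore.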
Avatar of Xetroximyn

ASKER

Thanks! - so I know LUKS encrypts the entire partition... It seems most "file level" tools should work fine (i.e., if they are not trying to back up the partition), because as long as the server is booted, the disk contents are automatically decrypted for whatever apps want to use them...

Also - I'm confused why you say the destination filesystem must be encrypted? I want the backup files themselves to be encrypted (this is how Acronis works, for example) - so I don't need to disk-encrypt my NAS - I'd only ever put already-encrypted backup files there.

Can dump/gzip be used to schedule full and incremental backups?

Is Zmanda related to Amanda? Granted, it was a LONG LONG time ago, but I vaguely recall looking at Amanda and it seeming a bit overcomplicated - overkill for what I needed.

I just want something simple that can handle
1. full + incremental on schedule
2. auto cleanup of backups older than 90 days
3. ability to encrypt
Avatar of arnold

The second comment deals with encrypting the backup data.

Presumably you authorize users to decrypt certain files, not all.

man dump
Dump level 0 is a full backup.
Dump levels 1-9 provide a variety of incremental options.
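For the file-level restore part, the companion restore utility can read a dump from stdin; a sketch, assuming the dump was gzipped as above (file names are placeholders):

# Interactive mode: browse the dump with ls/cd, mark files with "add", then run "extract"
gunzip -c full-backup.bak.gz | restore -if -

# Or pull out one specific path non-interactively
gunzip -c full-backup.bak.gz | restore -xf - ./etc/fstab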

For the encryption part, one deals with password protecting/encrypting the dump output ......

If I'm not mistaken, Zmanda was a variation of Amanda.
..
Avatar of Xetroximyn

ASKER

Thanks! So I am looking for something a bit more intuitive than dump... (or perhaps you know of a "getting started" guide for dump that's better than just the man page.)

I suppose I should add a couple of items to my wish list.
I just want something simple that can handle:

1. full + incremental on schedule
2. auto cleanup of backups older than 90 days
3. ability to encrypt
4. ability for me to give it the network share credentials and have it manage them, as opposed to me having to mount the share on the Linux machine myself
5. ability to email me if backups are failing
Avatar of arnold

1. Scheduling is achieved by running a shell script that includes the logic to determine whether the full backup or an incremental runs:

#!/bin/sh

datevalue=$(date +"%Y%m%d%H%M%S")
# The above produces a stamp in the form 20180227101823, as an example. You can back up a partition or a mount point.
cd location_where_the_backup_is_stored
if criteria_is_met; then
        # -u records the dump in /etc/dumpdates so that level 1-9 incrementals know what has changed
        dump -0uf - /dev/sda1 | gzip -c > "$datevalue-backup.bak.gz"
else
        dump -1uf - /dev/sda1 | gzip -c > "incbackup_$datevalue.bak.gz"
fi
# Clean up full backups older than 90 days; it could be done differently as well, depending on your cleanup requirements/backup retention.
find . -name "*backup.bak.gz" -mtime +90 -exec rm -f {} \;
# The below lists the backups in sorted order, e.g. for a count-based cleanup using head/tail. The difficulty with count-based cleanup
# is the frequency at which it runs; in this case, the cleanup would run every time a backup is performed. The cleanup can be
# separated out to run once a week, month, etc.; the logic has to follow.
find . -name "*backup.bak.gz" | sort -n
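To put the script on a schedule, it can simply be run from cron. A sketch, assuming it is saved as /usr/local/sbin/nightly-backup.sh (a hypothetical path) and the full/incremental decision is driven by an argument:

# /etc/crontab - full backup Sundays at 01:00, incrementals Monday-Saturday
0 1 * * 0 root /usr/local/sbin/nightly-backup.sh full
0 1 * * 1-6 root /usr/local/sbin/nightly-backup.sh incr

Inside the script, criteria_is_met would then just be a test such as [ "$1" = "full" ].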


3) What are you encrypting?

4) You could, as part of the shell script, use mount to attach the share and umount when done (see the mount sketch at the end of this post).
5) You can test the exit status of the backup and generate an email, though to detect the status of dump itself, the dump has to be changed to write to a file directly; in a pipeline, the exit status reflects the last command in the line, so you can get a false positive (success) while dump actually failed.

Try:
dump -0f - /dev/sda1 | gzip -c > some_test.bak.gz
echo $?
If it reports anything other than 0, you can use that status as-is; if it reports 0 even though the dump failed, you got a false positive (the status is gzip's, not dump's) - see the sketch below.
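If the script runs under bash rather than plain sh, PIPESTATUS exposes dump's own exit code despite the pipe, which avoids that false positive. A sketch (the recipient address is a placeholder, and mailx assumes a working local MTA):

#!/bin/bash
datevalue=$(date +"%Y%m%d%H%M%S")
dump -0uf - /dev/sda1 | gzip -c > "$datevalue-backup.bak.gz"
# ${PIPESTATUS[0]} is dump's exit status; plain $? would only be gzip's
if [ "${PIPESTATUS[0]}" -ne 0 ]; then
        echo "dump of /dev/sda1 failed on $(hostname)" | mailx -s "Backup FAILED" admin@example.com
fi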
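And for point 4, the share does not have to stay permanently mounted; the script can mount it with stored credentials and unmount when done. A sketch, assuming cifs-utils is installed (the share, mount point, and credentials file are placeholders):

# /root/.nas-credentials (chmod 600), two lines:
#   username=backupuser
#   password=secret

mount -t cifs //nas/backups /mnt/nas -o credentials=/root/.nas-credentials
# ... run the dump pipeline writing into /mnt/nas here ...
umount /mnt/nas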