Backup data

Pierre Ammoun (Asker) asked:

Dear all,

I have a small network of 5 PCs and a server that is mainly used for sharing data between the users.

I am thinking of a way to back up the data and stay safe from malware and ransomware.

Suppose I get 2 NAS units (the data I need to back up is around 1 TB), and both are configured to back up the data at night.
In theory I then have 3 copies of the data (the original plus the 2 NAS copies).
My problem is that in order to be 99.99% safe, once the nightly backup is done I need to disconnect the NAS from the LAN, so that if I get hit in the meantime my backup stays safe.

I need a mechanism whereby, once my backup is done, the NAS disconnects from the LAN.

How can I achieve this?
The NAS units I have are QNAP 4-bay models.

Thanks
SOLUTION (John Tsioumpris) [content available to members only]
Pierre Ammoun (Asker):

Hi John,

The problem is that I have to use folder mapping for specific tasks.

But thanks for the idea! I hadn't thought about the FTP protocol. Is it "safe" from ransomware that scans your whole network?

As long as you have the FTP protected with a username/password and you don't map the FTP folder, the ransomware just can't access it... I guess the QNAP can isolate the volumes... I'm afraid I don't own one, so I don't know its inner workings.

Pierre, ransomware usually runs under the account of the user that executes it. If you use a different account for backups and entitle only that account to write to the backup location, you are safe.

McKnife, thanks for the reply. But if I am backing up a PC, even if I use the backup username and password, if that PC is infected it will infect the backup!

"Even if I use the backup username and password, if that PC is infected it will infect the backup!" - no, what makes you think that? An infection does not mean the attacking program knows all credentials.

AFAIK, if the folder has ever been mapped (and it sounds like it has), then it can be hit by ransomware. I have a QNAP 4-bay at home; the best way I see of doing this is either to physically unplug the network cord or to make sure you are using versioning backup software like CrashPlan or inSync.

"AFAIK, if the folder has ever been mapped (and it sounds like it has), then it can be hit by ransomware." That's not correct. It would only be correct if we save credentials for the drive maps.

The thing is, most users do save credentials for drive maps. And since the majority of users who have access to such a share, like C-level guys, don't know they are saving credentials, the result is the same.
SOLUTION [content available to members only]
I certainly won't argue the point of using different creds for backups. I just don't believe, in practice, that many people do this. Just like too few people actually follow POLP (the principle of least privilege), even if they know what it is.

Pros do this. And pros also use something like AppLocker, so no unknown software that our AV does not recognize will be able to run. If we are not discussing how pros would do this, what's the point? (Sure, there are many wannabe pros. :-)

Yes, pros do do this, but most of us have bosses who are not IT pros and demand access (or at least ask for it) to those backups. My boss is relatively IT-savvy, but he wants, and has, all my passwords. I make sure he doesn't keep them in a text file on his computer, but I still haven't gotten him to use a password manager. He could easily get access to those backups, which is why we use versioning backup software (inSync from Druva) for endpoint backup and Phoenix for data center backup.
Thank you guys for the interesting conversation.
The problem is that if I am using a backup tool (Second Copy, for example), then at some point in time I need to issue the command for mapping the drives, or else it doesn't work. Now, if I am to automate the backup, I have some trouble imagining how I can do so without saving the credentials.
Of course I could use a script to map the drive and another to un-map it. But we all know that such scripts sometimes fail, and in that case either you are left exposed (the mapped drive is still available) or, worse, the drive is not mapped and hence there is no backup!
Even if I create and use a dedicated backup username and password, the problem is still the same.
Thanks
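[Editor's note: the map-run-unmap flow Pierre describes can be made fairly robust by always unmapping in a cleanup step, whatever the backup did. A minimal sketch, assuming a Windows machine; the share path, drive letter, and robocopy job below are placeholders, not details from the thread:]

```python
import subprocess
import sys

# Hypothetical values for illustration: substitute your own share,
# drive letter, source folder, and a dedicated backup-only account.
SHARE = r"\\qnap-nas\backup"
DRIVE = "B:"
BACKUP_CMD = ["robocopy", r"D:\Data", DRIVE + "\\", "/MIR", "/R:2", "/W:5"]

def run(cmd):
    """Run a command and return its exit code without raising."""
    return subprocess.call(cmd)

def main():
    # Map the share only for the duration of the backup. The mapping
    # uses the credentials of the account the scheduler runs this under
    # (ideally a dedicated backup account); /persistent:no keeps Windows
    # from remembering the mapping across reboots.
    rc = run(["net", "use", DRIVE, SHARE, "/persistent:no"])
    if rc != 0:
        print("Could not map drive; aborting so we know no backup ran.")
        sys.exit(1)
    try:
        rc = run(BACKUP_CMD)
        # robocopy exit codes below 8 mean success (possibly with extras).
        if rc >= 8:
            print("Backup reported errors (exit code %d)." % rc)
            sys.exit(2)
    finally:
        # Unmap no matter what happened above, so the share is never
        # left exposed when the backup step fails.
        run(["net", "use", DRIVE, "/delete", "/y"])

if __name__ == "__main__":
    main()
```

The finally block addresses the "left exposed" failure mode Pierre worries about: the unmap runs even when the backup errors out. The "no backup ran" failure mode still needs monitoring, for example by alerting on a non-zero exit code from the scheduler.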
SOLUTION
Link to home
membership
This solution is only available to members.
To access this solution, you must be a member of Experts Exchange.
Start Free Trial
You're asking a decent set of questions, but another thing to take into account is how long you keep your backups. Let's say a user gets infected and a number of files at the original location get encrypted, which of course means you are now backing up affected data. The question becomes how long it will take you to identify this versus how far back your available backups go.

So obviously this is technically a bit outside the scope of your question, since you're asking how to keep the infection out of the backup process. However, I think it's worth paying attention to the entire picture.

I believe that becomes less of an issue if you use versioning backup software, which keeps encrypted snapshots of your system in chronological order. inSync, for example, keeps the last 24 hours in increments you set; for the last 7 days, one snapshot per day; for previous weeks, one per week; and once a snapshot is a month old, one copy per month (I believe 3-4 copies, so 3-4 months). In this way you can always go "back in time". The best way is to have an external drive you can disconnect, but it doesn't sound like that is possible here.
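[Editor's note: that tiered "back in time" schedule is easy to express as a pruning rule. A rough sketch of such a policy, assuming snapshots identified by timestamp; the tiers mirror the description above but are illustrative, not inSync's actual algorithm:]

```python
from datetime import datetime, timedelta

def snapshots_to_keep(snapshots, now):
    """Given snapshot datetimes, pick which to keep under a tiered
    policy: everything from the last 24h, one per day for a week,
    one per week for a month, one per month after that."""
    keep = set()
    daily_seen, weekly_seen, monthly_seen = set(), set(), set()
    for ts in sorted(snapshots, reverse=True):  # newest first
        age = now - ts
        if age <= timedelta(hours=24):
            keep.add(ts)                         # recent tier: keep all
        elif age <= timedelta(days=7):
            day = ts.date()
            if day not in daily_seen:            # daily tier: first per day
                daily_seen.add(day)
                keep.add(ts)
        elif age <= timedelta(days=31):
            week = ts.isocalendar()[:2]          # (year, week number)
            if week not in weekly_seen:
                weekly_seen.add(week)
                keep.add(ts)
        else:
            month = (ts.year, ts.month)
            if month not in monthly_seen:
                monthly_seen.add(month)
                keep.add(ts)
    return keep

if __name__ == "__main__":
    now = datetime(2017, 6, 1)
    snaps = [now - timedelta(hours=h) for h in range(0, 24 * 90, 6)]
    print(len(snapshots_to_keep(snaps, now)), "of", len(snaps), "kept")
```

The point is that old snapshots thin out rather than disappear, so an infection noticed late can still be rolled past.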
To be honest, I did not know that so many parameters are at play.

What I have in mind is the following (please do not laugh at me :-):

I get 3 inexpensive external NAS units that can hook up to the LAN, and I connect each one through its own switch.
Now I configure my backups in 3 rotations: one on Monday, one on Tuesday, and one on Wednesday; then the Monday unit also does Thursday, and the Tuesday unit also does Friday.
So theoretically I have 3 backup sets spanning 3 days, and at any time I can go back up to 3 days, which is good enough. One question was when I would know I was infected; the answer is the same day at the latest.

Now, since each NAS is connected to my LAN through a switch, I connect each switch to an electronic timer (5 USD each).
The timer is set to switch off the power, hence disconnecting the NAS from the LAN, once the backup is done.
How do I know the backup is done? Just calculate: suppose the first run took 3 hours; then I would schedule 4 hours.

I know that doesn't sound like very "IT" stuff, but it is a very simple and safe way, because the problem with automated tasks and scripts is that they always fail at some point, and most of the time when "*&#*" happens!
What do you guys think?
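[Editor's note: Pierre's rotation pins down to a simple weekday table; a tiny sketch, with placeholder NAS names:]

```python
from datetime import date

# Weekday -> which NAS receives that night's backup, following the
# rotation described above. Unit names are placeholders.
ROTATION = {
    "Mon": "NAS-1",
    "Tue": "NAS-2",
    "Wed": "NAS-3",
    "Thu": "NAS-1",  # Monday's unit, reused
    "Fri": "NAS-2",  # Tuesday's unit, reused
}

def tonights_target(day=None):
    """Return tonight's backup target, or None on weekends.
    Assumes an English locale for the %a weekday abbreviation."""
    day = day or date.today()
    return ROTATION.get(day.strftime("%a"))

if __name__ == "__main__":
    print(tonights_target())
```

Walking this table through a week shows the three units always hold the three most recent backup nights, which matches the "go back up to 3 days" claim.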
Sounds reasonable, and it might well work (the low-tech stuff always works better). My one worry would be the power cutting off too soon; you might want to keep a careful eye on that for the first couple of weeks. You might also want to double the time the power stays on. Otherwise, that sounds like it will work.
In my opinion there are some good suggestions here, but the main dangers still have not been addressed.

(1) If the data you want to back up is already (partially) infected, then backing it up would overwrite a good version of an old backup. So even if you always hold 2 versions of the backup, you are finally left with only 1 usable backup, since neither the original nor the latest backup can be used.

(2) While the connection to the backup device is open, the malware could infect more files, and even infect backup files, if you don't take care that old backup versions are not accessible from the current OS. Be aware that malware is also able to map a network device or network share, although not all malware is that clever.

(3) If you need to restore a system from a backup, you definitely should have a clean or empty system first. You have to consider that the malware may have infected the operating system itself, which therefore must be thoroughly checked before being used again.

I think that to meet all those dangers you need programs that can be trusted to verify that a system is not infected. If that is guaranteed, you should make the backup from another OS, ideally by cloning whole disks, at least the system disk. The clone function of most backup programs will shut down and reboot into another OS, since that is the only way to clone the system disk.

So, assuming you could keep two clone versions of each system disk, and two regular (zipped) backup versions of your data disks, on disks or NAS systems that are not accessible from your Windows accounts, you are really safe. I don't see any gap where malware could hook in, other than the programs used to check whether your systems are clean (point 1) not working properly or being infected themselves.

Note that all backup programs which can clone disks are also able to reboot your main OS after the backup.

Sara
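[Editor's note: Sara's point (1), good backups being overwritten by backups of already-encrypted data, can be partly automated away: refuse to rotate out the older backup when an unusually large fraction of files changed since the last run, a common ransomware signature. A rough sketch, assuming a manifest of SHA-256 hashes kept from the previous run; the manifest name and the 30% threshold are illustrative:]

```python
import hashlib
import json
import os
import sys

MANIFEST = "backup_manifest.json"   # hashes from the previous run
CHANGE_THRESHOLD = 0.30             # abort if more than 30% of files changed

def hash_tree(root):
    """Map relative path -> SHA-256 digest of every file under root."""
    hashes = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            hashes[os.path.relpath(path, root)] = h.hexdigest()
    return hashes

def main(root):
    current = hash_tree(root)
    if os.path.exists(MANIFEST):
        with open(MANIFEST) as f:
            previous = json.load(f)
        common = set(current) & set(previous)
        changed = sum(1 for p in common if current[p] != previous[p])
        if common and changed / len(common) > CHANGE_THRESHOLD:
            # Too many files rewritten since the last run: possible mass
            # encryption. Keep the old backup; a human should look first.
            print("Suspicious: %d of %d files changed. Backup skipped."
                  % (changed, len(common)))
            sys.exit(1)
    with open(MANIFEST, "w") as f:
        json.dump(current, f)
    print("Change rate looks normal; safe to rotate backups.")

if __name__ == "__main__":
    main(sys.argv[1])
```

A mass rename or re-encryption then fails this check and leaves the previous backup untouched until someone investigates.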
I agree with Sara on the clone approach. I have one system set up this way:

It clones the drives on the device every night.
It backs up all files every 20 minutes.
The clones are replaced each day with alternates.
Weekly backups are stored on a network share device and on external drives which are detached.
2 monthly backups are stored on detached backup drives.

Backups are tested on a weekly basis (untested backups are the same as not making backups at all).
Clones are tested once a month on a different physical device.

I make the clones using Paragon software (free for qualified members of EE).
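[Editor's note: the "untested backups are the same as no backups" point invites automation too. One cheap approach is a scheduled spot check that compares a random sample of source files byte-for-byte against their backup copies; a minimal sketch, where the sample size and the two path arguments are illustrative:]

```python
import filecmp
import os
import random
import sys

SAMPLE_SIZE = 25  # number of files to spot-check per run (arbitrary)

def list_files(root):
    return [os.path.relpath(os.path.join(d, f), root)
            for d, _, files in os.walk(root) for f in files]

def spot_check(source_root, backup_root):
    """Compare a random sample of source files byte-for-byte against
    their copies in the backup. Returns the list of mismatches."""
    candidates = list_files(source_root)
    sample = random.sample(candidates, min(SAMPLE_SIZE, len(candidates)))
    bad = []
    for rel in sample:
        src = os.path.join(source_root, rel)
        dst = os.path.join(backup_root, rel)
        # shallow=False forces a content comparison, not just size/mtime.
        if not os.path.exists(dst) or not filecmp.cmp(src, dst, shallow=False):
            bad.append(rel)
    return bad

if __name__ == "__main__":
    mismatches = spot_check(sys.argv[1], sys.argv[2])
    if mismatches:
        print("Backup verification FAILED for:", *mismatches, sep="\n  ")
        sys.exit(1)
    print("Sampled files match; backup looks restorable.")
```

This is no substitute for a full restore test onto separate hardware, as described above for the clones, but it catches silently failing jobs between those tests.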
You need to determine whether you really need a daily offsite backup or just a weekly one. How much data is it OK to lose? How often do you expect ransomware to actually get activated? Offsite backups make more sense when you take a batch offsite each week and keep it offsite for a month or more; they are not useful if you overwrite them too soon.

Setting up a backup job from the QNAP involves a lot of configuration and custom scripting. To simplify the backup operations, you can set up a small backup server on the same machine and manage the backup jobs with the QNAP as the storage node.
At the enterprise level, Veeam, Microsoft DPM, and other EMC products can be an option, but it totally depends on the backup requirements. You can use any backup software integrated with a cloud storage provider. I tried CloudBerry Backup, which provides a simple GUI to manage backups and restores, and has features like job scheduling, a CLI, compression, and encryption.
Sara, thank you for the detailed explanation.
In terms of cloning I believe I'm covered, as my whole environment is Hyper-V VMs.
So I take a monthly copy of my VMs (I don't often change the configuration).
So I'm left with the data backup (documents, mainly).
So if I have a workable copy of the VMs, all I would need to take care of is the data files.
Yes. You could even try to separate data that can easily be regained from anywhere from data created by you, which might be lost forever. In my case, the amount of data that cannot be restored from other computers, or is not automatically backed up, is very small. So it is safe for me to save all new data to a USB stick and from there to my notebook, or to send it as a mail attachment to my mail server.

Sara
ASKER CERTIFIED SOLUTION [content available to members only]