Solved

What are the security risks associated with hard copies and backups?

Posted on 2004-08-05
3
211 Views
Last Modified: 2013-12-04
I am putting together a security paper for my company. This question encompasses many different areas, as one can imagine.

1) More and more features have the potential to create more security holes, and fewer of the people managing security actually understand all the complexity involved.

2) Dozens of services running on just as many ports can create a security nightmare
 
3) Many applications are now shipping 'out of the box' secure by default. Even Windows now installs with security policies implemented, especially on Domain Controllers

4) Security issues with hard copies and backups
0
Comment
Question by:plate55
3 Comments
 
LVL 38

Expert Comment

by:Rich Rumble
ID: 11741682
1) Fully understanding the products you implement is key; testing is where you can find most of the initial bugs or undesirable behaviour. Products used must come with complete documentation, or be satisfactorily documented by testers during the testing phase.
2) All unnecessary ports and services should be turned off or blocked (see the sketch after this list for one way to find what is listening).
3) ...
4) Backups should be encrypted during transmission over the network and stored off-site by a trusted party; most companies are happy to guide you through their storage procedures and transport precautions.
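
As a rough illustration of (2), here is a minimal Python sketch that probes a host for listening TCP ports. The host, port range, and timeout are example values only; a real audit would use a proper scanner such as nmap, or check the service list on the machine itself.

# Minimal sketch: probe a host for listening TCP ports so that
# unnecessary services can be identified and shut off or blocked.
# Host, port range, and timeout are illustrative values only.
import socket

def open_ports(host="127.0.0.1", ports=range(1, 1025), timeout=0.2):
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex() returns 0 when the connection succeeds,
            # i.e. when something is listening on that port
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found

if __name__ == "__main__":
    for port in open_ports():
        print(f"port {port} is accepting connections")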

Let's take an example: http://www.connected.com/
A product like Connected TLM offers several advantages over many backup solutions. Backup files are compressed and then encrypted before being sent over the network, and backups are broken up into 5-10 meg chunks, each also encrypted. Duplicate files are not backed up. The files are recoverable as long as you can set up the same username and password for the account that backed them up. So if you had a 500 meg backup job of a few dozen files, those 500 megs of data would actually be stored in 50-100 compressed/encrypted archives, which you can search faster: you do not have to decompress 500 megs to restore 1 or 2 files, you uncompress 5 or 10 megs instead, speeding up the process.

Deltas of backed-up files are also saved, so you can keep multiple versions of a file without having to back up each version in its entirety... just the changed parts. And back to the no-duplicate storage: say the company sent everyone the same PowerPoint presentation. Instead of each and every person's copy being saved, the first person to back up the PPT would send it to the server; the next person who went to back up their PC would get a "flag" saying they also had the same PPT, and the next, and the next... the PPT would only be backed up once, but the backup would be aware of the others that may need that same file. If anyone changed the file slightly and went to back it up, just the delta of the changes would be backed up. The no-dups approach lets you back up one PC, and then everyone in the company could probably do a "complete" backup of their own machines without using very much space at all.
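
To make the chunking and no-duplicates idea concrete, here is a minimal sketch of the general technique. This is not Connected TLM's actual implementation; it assumes the third-party Python 'cryptography' package for the encryption, and the file names are made up.

# Minimal sketch: chunked, compressed, encrypted backup with
# single-instance ("no duplicates") storage keyed by content hash.
# Not any vendor's real implementation; assumes `pip install cryptography`.
import hashlib
import zlib
from pathlib import Path
from cryptography.fernet import Fernet

CHUNK_SIZE = 5 * 1024 * 1024  # 5 megs, in the spirit of the 5-10 meg chunks above

class ChunkStore:
    def __init__(self, store_dir, key):
        self.store_dir = Path(store_dir)
        self.store_dir.mkdir(parents=True, exist_ok=True)
        self.fernet = Fernet(key)

    def backup(self, path):
        """Split a file into chunks; each unique chunk is stored only once."""
        manifest = []  # ordered chunk hashes needed to rebuild this file
        with open(path, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                # Hash the plaintext chunk, so identical content dedups
                # even across different users' backups.
                digest = hashlib.sha256(chunk).hexdigest()
                blob = self.store_dir / digest
                if not blob.exists():  # "no duplicates": store once, flag the rest
                    # compress first, then encrypt (encrypted data won't compress)
                    blob.write_bytes(self.fernet.encrypt(zlib.compress(chunk)))
                manifest.append(digest)
        return manifest

    def restore(self, manifest, out_path):
        """Rebuild a file; only the chunks it needs are decrypted/decompressed."""
        with open(out_path, "wb") as f:
            for digest in manifest:
                blob = (self.store_dir / digest).read_bytes()
                f.write(zlib.decompress(self.fernet.decrypt(blob)))

if __name__ == "__main__":
    store = ChunkStore("backup_store", Fernet.generate_key())
    manifest = store.backup("presentation.ppt")  # hypothetical file
    store.restore(manifest, "restored.ppt")

Note that restoring 1 or 2 files only touches their own chunks, which is exactly the "uncompress 5 or 10 megs instead of 500" point; version deltas would then just be differences between manifests.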
-rich
0
 
LVL 31

Accepted Solution

by:
rid earned 500 total points
ID: 11746443
Different organisations have different problems.

"1)  more and more features  have the potential to create more security holes and fewer people managing the security actually understand all the complexity that is involved."  

Many of the security holes are well known and based on well-known weaknesses in programs used almost everywhere, like Outlook (Express), Internet Explorer, IIS etc. One way to counter the threats is to avoid these products and use e.g. Pegasus, Mozilla and Apache instead; it will not make you "safe", but you'll probably have a lot less to worry about. Good antivirus software is obviously key as well.

"2)  Dozens of services running on just as many ports can create a security nightmares"

Yes, indeed. The remedy is of course to have a good firewall protecting the network AND to keep after the workstations/users, educating them about the risks and performance losses associated with having a lot of things running on a computer. The last part is probably the hardest... Any network admin will probably benefit from looking through the info at sites like www.grc.com, www.spywareinfo.com and www.answersthatwork.com. A written policy for user guidance is a good idea too.

"4)  Security issues with hard copies and backups "

Depends on the organisation, I'd say. What is the worst-case scenario: someone unauthorized getting hold of the data, or the data just vanishing into thin air? Both?

I believe there are a number of "best practice" rules that can be applied in most cases, but I don't believe in generalization as a work method. Each case, each organisation should have its own security philosophy that everyone involved understands. Without that understanding, not much is accomplished, as people tend to take shortcuts around obstacles they don't understand. And you can't implement CIA-grade security in every workplace either...
/RID
0
 
LVL 38

Expert Comment

by:Rich Rumble
ID: 11751800
Agreed, and it's what I was getting at... generalization leaves you open to many other vectors; each issue or program needs a detailed approach and a documented history. Best practices would help put some of your questions in perspective.
Here are some great guides: http://www.sans.org/rr/catindex.php?cat_id=8
-rich
0
