My DLT is not taking 70GB backup in compression mode

snaren2210 asked
Medium Priority
240 Views
Last Modified: 2010-04-03
Hi,

We are using a Sun DLT7000 with a capacity of 35GB native and 70GB compressed. When I use it under SunOS, it only takes about 55GB of data in compressed mode.

Can anybody help me with this?

Naren

saggar maker's framemaker
CERTIFIED EXPERT
Distinguished Expert 2019
Commented:
I second andyalder; compression with tape drives is a common source of confusion, and people usually fail to read the small print "assuming 2:1 compression". At least you're getting 55GB; I've seen people with Novell servers get 40GB and then call me up complaining that their drive is defective, which it isn't, since most Novell volumes are already compressed. You're probably getting roughly 1.6:1 compression (55GB onto a 35GB-native tape); this is normal and will not change unless you start backing up data that isn't already compressed.


Brad
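
For concreteness, here is a quick back-of-the-envelope check using the figures from this thread (a sketch in plain Python arithmetic, nothing drive-specific) showing where the 55GB lands relative to the advertised 2:1:

    # The "70GB" rating on a DLT7000 simply assumes 2:1 compression
    # over the 35GB native capacity.
    native_capacity_gb = 35.0   # DLT7000 native capacity
    data_written_gb = 55.0      # amount that actually fit in compressed mode

    effective_ratio = data_written_gb / native_capacity_gb
    print(f"Effective compression ratio: {effective_ratio:.2f}:1")  # ~1.57:1

    advertised_ratio = 70.0 / native_capacity_gb
    print(f"Advertised capacity assumes: {advertised_ratio:.1f}:1")  # 2.0:1

So roughly 1.6:1 on real-world data, against a spec sheet that assumes exactly 2:1.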

I think this has already been answered but perhaps you are looking for a little more background data.

On a basic level, software and hardware compression do the same thing. Once a file has been compressed, it cannot effectively be compressed again. Examples of compressed files are *.zip, *.jpg, *.mp3, and *.tar.gz, and there are a lot more.

Compression works by taking out repetitive patterns and replacing them with what I will call a key. This brings up two points:
1) The process of compressing a file does add overhead, so if an already-compressed file is compressed again it could actually end up larger than the original.
2) Some files will compress more than others. For example:
    A) Binary files such as *.exe and *.dll do not contain many repetitive patterns and so will not compress much, if at all.
    B) Text files typically contain a lot of repetitive patterns in the form of empty space and so will compress a lot.

Also, each file is processed separately, so lots of small files will equate to more overhead and less total compression.
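
As a rough illustration of both points, here is a minimal sketch using Python's zlib on the host (not the drive's hardware compression, so the exact numbers will differ, but the behaviour is the same):

    import os
    import zlib

    samples = {
        "repetitive text (debug-log style)": b"ERROR: connection timed out\n" * 4000,
        "random bytes (like .zip/.jpg/.mp3 content)": os.urandom(100_000),
    }

    for name, data in samples.items():
        packed = zlib.compress(data)
        repacked = zlib.compress(packed)  # "compressing the compressed"
        print(f"{name}: {len(data)} -> {len(packed)} bytes "
              f"({len(data) / len(packed):.2f}:1); compressed again: {len(repacked)} bytes")

    # Per-file overhead: many small inputs compress worse than one large stream,
    # because every compression unit carries its own startup cost.
    lines = [b"log line %d\n" % i for i in range(1000)]
    separately = sum(len(zlib.compress(line)) for line in lines)
    as_one_stream = len(zlib.compress(b"".join(lines)))
    print(f"1000 tiny files compressed separately: {separately} bytes; "
          f"as one stream: {as_one_stream} bytes")

The repetitive text shrinks to a small fraction of its size, the random bytes come back slightly larger than they went in, and the tiny files compressed one at a time take noticeably more space than the same data compressed as a single stream.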

Let's take a worst-case and a best-case example that I have come across:
1) 100GB of 25-50KB JPEGs: no compression was realized.
2) 100GB of text files in the form of debug logs: they all fit on one tape with plenty of room left.

So my bottom line is that if the data in question fits into the category of "normal" data, and is not like either of the above examples, then andyalder is right on the mark.
snaren2210,

From my perspective your question has been completely answered.
Is there more information you need on this question?

If not, please award andyalder the deserved points.