My DLT is not taking a 70GB backup in compression mode

Hi,

We are using a Sun DLT7000 with a capacity of 35GB native and 70GB compressed. When I use it under SunOS, it only takes 55GB of data in compressed mode.

Can anybody help me with this?

Naren
snaren2210 asked:

Handy Holder (Saggar maker's bottom knocker) commented:
55GB on a native 35GB tape is quite good. You will only get 70GB on it if the data can be compressed 2:1, and that's rarely the case. Executables don't compress, databases are often partially compressed already, and about the only thing you may get 2:1 compression with is text files.
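Just to put numbers on that: the 70GB figure is nothing more than the 35GB native capacity multiplied by an assumed 2:1 ratio, and fitting 55GB already implies roughly 1.6:1. A quick sketch of the arithmetic, just to make the figures explicit:

```python
# Sanity check on the DLT7000 figures: the advertised 70GB simply assumes an
# exact 2:1 compression ratio on the 35GB native capacity.
native_gb = 35        # DLT7000 native capacity
advertised_gb = 70    # marketing figure, assumes 2:1 compression
backed_up_gb = 55     # what actually fit on the tape

print(f"ratio assumed by the 70GB figure: {advertised_gb / native_gb:.1f}:1")  # 2.0:1
print(f"ratio actually achieved:          {backed_up_gb / native_gb:.2f}:1")   # ~1.57:1
```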
 
Mustangbrad commented:
I second Andyalder. Compression with tape drives is a common misconception, and people usually fail to read the small print: "assuming 2:1 compression". At least you're getting 55GB; I've seen people with Novell servers get 40GB and then call me up complaining that their drive is defective. It isn't, since most Novell volumes are compressed already. You're probably getting around 1.5-1.6:1 compression (55GB on a 35GB native tape), which is normal and will not change unless you back up data that isn't already compressed.
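If you want to see the effect for yourself, here is a rough demonstration using Python's zlib. This is not the hardware algorithm the DLT7000 uses, so treat the exact numbers as ballpark only, but the pattern is the same: plain text easily beats 2:1, while data that is already compressed barely moves at all.

```python
# Rough illustration with zlib (a software stand-in for the drive's hardware
# compression): text compresses well, already-compressed data does not.
import os
import zlib

def ratio(data: bytes) -> float:
    """Return original size divided by compressed size."""
    return len(data) / len(zlib.compress(data))

log_text = b"2024-01-01 00:00:01 INFO request handled in 12 ms\n" * 20000
precompressed = os.urandom(1_000_000)  # stand-in for .zip/.jpg data or a compressed Novell volume

print(f"plain text:              {ratio(log_text):.1f}:1")       # well above 2:1
print(f"already-compressed data: {ratio(precompressed):.2f}:1")  # about 1:1 or slightly worse
```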


Brad

 
dovidmichel commented:
I think this has already been answered but perhaps you are looking for a little more background data.

On a basic level, software and hardware compression do the same thing. Once a file has been compressed it cannot usefully be compressed again. Examples of compressed files are *.zip, *.gz, *.jpg, *.mp3, and there are a lot more.

Compression works by taking out repetitive patterns and replacing them with what I will call a key. This brings up two points:
1) The process of compressing a file does add overhead, so if an already-compressed file is compressed again it can actually come out larger than the original (see the sketch after this list).
2) Some files will compress more than others. For example:
    A) Binary files such as *.exe and *.dll do not have much in the way of repetitive patterns, so they will not compress much, if at all.
    B) Text files typically have a lot of repetitive patterns, such as runs of empty space, and so will compress a lot.
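A small sketch of point 1, again using zlib as a software stand-in for the drive's hardware compression: once data has been compressed, running it through the compressor a second time gains nothing, and the extra framing usually makes it slightly larger.

```python
# Point 1: compressing already-compressed data adds overhead instead of saving space.
import zlib

text = b"".join(b"line %d: moderately repetitive log text, value=%d\n"
                % (i, (i * 7) % 100) for i in range(20000))

once = zlib.compress(text)
twice = zlib.compress(once)  # "compressing a compressed file"

print(f"original text:    {len(text):>8} bytes")
print(f"compressed once:  {len(once):>8} bytes")   # a small fraction of the original
print(f"compressed twice: {len(twice):>8} bytes")  # no further gain, often a few bytes larger
```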

Also, each file is processed separately, so lots of small files equate to more overhead and less total compression.
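Here is a sketch of that per-file effect, assuming the data is handed to the compressor one file at a time. zlib again stands in for the real compression, so only the relative sizes matter: each small file pays its own startup cost, and nothing learned from one file helps with the next.

```python
# Per-file overhead: many small files compressed individually total far more
# than the same data compressed as a single stream.
import zlib

small_files = [b"user=%d status=OK elapsed=%d ms\n" % (i, (i * 3) % 50)
               for i in range(5000)]

separate_total = sum(len(zlib.compress(f)) for f in small_files)
combined_total = len(zlib.compress(b"".join(small_files)))

print(f"5000 files compressed one by one:   {separate_total:>7} bytes")
print(f"same data compressed as one stream: {combined_total:>7} bytes")  # much smaller
```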

Let's take a worst-case and a best-case example that I have come across.
1) 100GB of 25-50KB JPEGs: no compression was realized.
2) 100GB of text files in the form of debug logs: they all fit on one tape with plenty of room to spare.

So my bottom line is that if the data in question falls into the category of "normal" data, unlike either of the above examples, then andyalder is right on the mark.
 
dovidmichel commented:
snaren2210,

From my perspective your question has been completely answered.
Is there any more information you need on this question?

If not, please award andyalder the points he deserves.