
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 354

When compressing a file on a Unix server

When compressing a log file with the compress command on Solaris 8, how much free space do I need to leave on the file system? For example, if I want to compress a 1 GB file and only have 100 MB of free space on the drive, can I do it?
1 Solution
A neat trick is to run compress with the -c and -v options, redirecting the output to /dev/null: -c writes the compressed data to standard output (leaving the original untouched), and -v reports how well the file compresses. That way you can work out how much space is required before doing the compress for real.


I want to compress a large file called large.file

-bash-3.00$ ls -l large.file
-rw-r--r--   1 garypen  staff    294625280 Feb 26 14:11 large.file
-bash-3.00$ compress -cv large.file > /dev/null
large.file: Compression: 29.87%

In this case we know that when we compress the file for real it will shrink by 29.87% of 294625280 bytes, i.e. 88004571 bytes. So as long as we have 294625280 - 88004571 = 206620709 bytes of space free for the compressed file, we are in business.
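To make that arithmetic concrete, here is a small shell sketch of the same calculation. The SIZE and PCT values are the figures reported by ls -l and compress -cv in the transcript above; since compress rounds its reported percentage, treat the result as an estimate rather than an exact byte count.

```shell
# Sketch of the space calculation above, using the figures from
# ls -l and "compress -cv" for large.file.
SIZE=294625280   # original file size in bytes (from ls -l)
PCT=29.87        # reduction reported by compress -v

# Bytes saved by compression (truncated to a whole number).
SAVED=$(awk -v s="$SIZE" -v p="$PCT" 'BEGIN { printf "%d", s * p / 100 }')

# Space the compressed file itself will occupy on disk.
NEEDED=$((SIZE - SAVED))

echo "saves $SAVED bytes; compressed file needs $NEEDED bytes"
```

Running this prints the 206620709-byte figure used in the answer above, so you can compare it directly against the free space df reports for the filesystem.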

An alternative solution is to do the following:

1) Identify an alternative filesystem which has at least [logfile size] free space.
2) Use compress to compress the file to standard output and redirect standard output to this other filesystem.


Imagine you have a 1G log file which you wish to compress and you have 2G of space available in /tmp.

You could run the following command:

compress -c log.file > /tmp/log.file.Z

The above will definitely work on Solaris 10. I don't have access to Solaris 8 to test this, but I think the -v option existed back then too. Also, gzip and bzip2 both compress better than compress if you are really keen to save space. Both are widely available as freeware for Solaris 8, and gzip may even have shipped as standard in Solaris 8 (I can't remember); they both ship as standard in Solaris 10.
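If you want to see how much a given tool would actually save before committing to it, you can stream its output through wc -c, the same dry-run idea as the /dev/null trick above. A minimal sketch (the sample file here is just a throwaway stand-in for your real log file):

```shell
# Measure a compressor's output size without writing anything but a
# tiny temp file to disk. The sample file stands in for the real log.
sample=$(mktemp)
yes "Feb 26 14:11:00 host syslogd: sample log line" | head -n 10000 > "$sample"

orig=$(wc -c < "$sample" | tr -d ' ')
gz=$(gzip -c "$sample" | wc -c | tr -d ' ')

echo "original: $orig bytes, gzip -c: $gz bytes"
rm -f "$sample"
```

The same pipeline works with compress -c or bzip2 -c in place of gzip -c, so you can compare candidates on the actual file before picking one.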

When you compress a file using any of the compression utilities (you should be using gzip or bzip2 instead of compress), the amount of space needed is:

original size + compressed size

Say your 1GB file compresses to 100MB; then you will need 1100MB of space in order to compress it, since both the original and the compressed copy exist on disk until the original is removed.

However, as garypen has pointed out, you can simply compress it to another filesystem. I would do it like this:

gzip -c /filesystem/largefile > /tmp/largefile.gz

if [ $? -eq 0 ]; then
    echo "gzip succeeded"
    rm /filesystem/largefile
    mv /tmp/largefile.gz /filesystem/
else
    echo "gzip failed.  Leaving original file intact"
    rm -f /tmp/largefile.gz
fi
