Solved

Compress Large Files in a Fast Manner

Posted on 2008-10-01
8
1,570 Views
Last Modified: 2013-12-01
Hi,

How can we compress a large file more quickly? Right now the file is about 15 GB and will probably grow. WinRAR takes hours. Are there any tools out there that do a better job?

Thanks for your help.
Question by:ebi168
8 Comments
 

Author Comment

by:ebi168
To add: it should compress fast but also achieve a good compression ratio.
 
LVL 7

Expert Comment

by:mchkorg
Look, you can use:
WinZip
WinRAR
WinAce
7-Zip
bzip2 (if Unix/Linux)
gzip (if Unix/Linux)
some others...

But whichever one you choose, it's almost the same algorithm in the end.
You have to pick a point on the compression/speed trade-off and decide what you want: a genuinely smaller file, or a faster job?
Example with WinRAR: in the options, open "Create default profile" and choose something other than "Best"; it should be faster, but produce a bigger file.
It also depends on your file content: text, raw data, something specific? Some content requires more time (or different techniques) to compress well.

You can't just ask these tools to "do it faster".

By the way, one idea would be to cut your file into smaller pieces (1 GB parts). That way you might be able to multi-thread the process, depending on the disk speed/interface and so on (a rough splitting sketch follows this comment).

If you want some help deciding, tell us what this file is, how it is generated, and so on...

Regards
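
To illustrate the splitting idea above, here is a minimal Python sketch (an example under assumptions, not from the thread itself): it cuts one large file into roughly 1 GB parts, with "export.txt" and the part size as placeholders.

import os

CHUNK_SIZE = 1024 * 1024 * 1024   # target part size: 1 GB
BUFFER = 64 * 1024 * 1024         # copy 64 MB at a time to keep memory low

def split_file(path, chunk_size=CHUNK_SIZE):
    """Write path.part000, path.part001, ... each at most chunk_size bytes."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            part_path = "%s.part%03d" % (path, index)
            written = 0
            with open(part_path, "wb") as dst:
                while written < chunk_size:
                    data = src.read(min(BUFFER, chunk_size - written))
                    if not data:
                        break
                    dst.write(data)
                    written += len(data)
            if written == 0:              # nothing left; drop the empty part
                os.remove(part_path)
                break
            parts.append(part_path)
            index += 1
    return parts

if __name__ == "__main__":
    print(split_file("export.txt"))       # placeholder file name

Each part can then be compressed on its own (and several at once, if the disks keep up). On Unix/Linux, the standard split tool does the same job, e.g. split -b 1G.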
 

Author Comment

by:ebi168
Hi, these are just text files with data rows in them; think of an export from a database table...
 

Author Comment

by:ebi168
We want high compression (as the compressed file needs to be sent to an outside network), but the faster the better. Thanks.

 
LVL 7

Accepted Solution

by:
mchkorg earned 250 total points
You can't get both :(

As it's a DB export, you could, for example, extract, let's say, a 100 MB file from it and run some tests with several compression profiles (from low to best); a small benchmarking sketch follows this comment.
Maybe using "best" is pointless (twice the time for only 2% better).
With real numbers from those tests, you'll find the right setting for your needs.

If it's a DB export, you might also be able to generate it as multiple files (1 GB each?) and see if multi-threaded compression (i.e., starting several WinRAR instances) can help. It depends on the hard disk interface, number of disks, number of CPUs/cores, memory, etc.
Example: on an ordinary computer, don't bother, it's useless. On a real server (several SCSI disks in RAID), you might go four times faster by compressing 4 files at once. You "might".
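
To make that test concrete, here is a small Python sketch (an illustration under assumptions, not part of the answer above) that times gzip at a few compression levels on a sample file and prints the resulting size; "sample.txt" and the chosen levels are placeholders.

import gzip
import os
import time

def benchmark(sample_path, levels=(1, 6, 9)):
    """Compress the sample in memory at each level; print time and size."""
    original = os.path.getsize(sample_path)
    with open(sample_path, "rb") as f:
        data = f.read()                   # a ~100 MB sample fits in memory
    for level in levels:
        start = time.time()
        compressed = gzip.compress(data, compresslevel=level)
        elapsed = time.time() - start
        print("level %d: %6.1f s, %7.1f MB (%4.1f%% of original)" % (
            level, elapsed, len(compressed) / (1024.0 * 1024.0),
            100.0 * len(compressed) / original))

if __name__ == "__main__":
    benchmark("sample.txt")

Plain-text database exports usually compress well even at mid-range levels, so the printed numbers should show quickly whether "best" is worth the extra time for this data.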
 

Author Comment

by:ebi168
mchkorg: thanks for the comment. Let me do some research into it.
 

Author Comment

by:ebi168
How about 8 dual-core AMD processors, with SCSI and RAID controllers? So multi-threading appears to be the solution then.
 
LVL 7

Assisted Solution

by:mchkorg
mchkorg earned 250 total points
Clearly,

1) Try to generate smaller files
2) Do some tests to find the best time/compression ratio with your compression tool
3) Script it all :) (one possible sketch follows below)
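
One possible way to script it, assuming the export already comes out as several part files and a gzip level has been picked from the earlier tests (the level, worker count, and file names below are guesses, not values from the thread):

import gzip
import shutil
import sys
from concurrent.futures import ProcessPoolExecutor

LEVEL = 6          # example level picked from the earlier tests

def compress_one(path):
    """gzip one part file to path + '.gz' and return the output name."""
    out_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(out_path, "wb", compresslevel=LEVEL) as dst:
        shutil.copyfileobj(src, dst, 16 * 1024 * 1024)
    return out_path

if __name__ == "__main__":
    parts = sys.argv[1:]      # e.g. export.txt.part000 export.txt.part001 ...
    # One process per file; whether 4 workers actually helps depends on the
    # disks and controller, as discussed above.
    with ProcessPoolExecutor(max_workers=4) as pool:
        for out in pool.map(compress_one, parts):
            print("wrote", out)

The compressed parts can be sent separately and the original rebuilt on the other side after decompression (cat on Unix/Linux, copy /b on Windows).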


