Solved

Compress Large Files Quickly

Posted on 2008-10-01
Last Modified: 2013-12-01
Hi,

How can we compress a large file more quickly? Right now the file is about 15GB and will probably grow. WinRAR takes hours. Are there any tools out there that do a better job?

Thanks for your help
Question by:ebi168
8 Comments
 

Author Comment

by:ebi168
ID: 22620537
To add: it should compress fast but also achieve a good compression ratio.
 
LVL 7

Expert Comment

by:mchkorg
ID: 22623392
Look, you can use:
winzip
winrar
winace
7zip
bzip2 (if unix/linux)
gzip (if unix/linux)
some others...

But no matter which one you choose, it's almost the same algorithm in the end.
You have to choose a compression/speed trade-off and decide what you want: a really smaller file or not?
Example with WinRAR: in the options, under "create default profile", choose something other than "best" and it should be faster, but give a bigger file.
It also depends on your file content: text, raw data, something specific? Some content will need more time (or different techniques) to compress well.

You can't just ask these tools to "do it faster".

By the way, another idea would be to cut your file into smaller pieces (1 GB parts). That way you might be able to compress several parts in parallel (depending on the disk speed/interface and so on).
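
For illustration, here is a rough Python sketch of that split-and-compress-in-parallel idea. The file name "export.txt", the 1 GB chunk size, the gzip level and the worker count are all placeholder assumptions to adjust for your own setup; it's one possible approach, not the only tool for the job.

import gzip
import shutil
from concurrent.futures import ProcessPoolExecutor

CHUNK_SIZE = 1024 ** 3  # roughly 1 GB per part

def split_file(src_path, chunk_size=CHUNK_SIZE):
    # Cut the file into numbered parts; each read pulls one whole chunk into
    # memory, which is crude but fine for a sketch.
    parts = []
    with open(src_path, "rb") as f:
        index = 0
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            part_name = f"{src_path}.part{index:03d}"
            with open(part_name, "wb") as out:
                out.write(data)
            parts.append(part_name)
            index += 1
    return parts

def compress_part(part):
    # gzip one part; compresslevel 1 = fastest, 9 = smallest output
    with open(part, "rb") as src, gzip.open(part + ".gz", "wb", compresslevel=6) as dst:
        shutil.copyfileobj(src, dst)
    return part + ".gz"

if __name__ == "__main__":
    parts = split_file("export.txt")  # "export.txt" stands in for your real file
    with ProcessPoolExecutor(max_workers=4) as pool:  # tune to your cores/disks
        print(list(pool.map(compress_part, parts)))

Each part compresses in its own process, so the actual speed-up depends on how many cores and how much disk bandwidth you really have, as noted above.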

If you want some help deciding, tell us what this file is, how it is generated, and so on...

Regards
 

Author Comment

by:ebi168
ID: 22626382
Hi, these are just text files with data rows in them, think of an export from a database table...

Author Comment

by:ebi168
ID: 22626409
We want high compression (as the compressed file needs to be sent to an outside network), but the faster the better. Thanks.
 
LVL 7

Accepted Solution

by:mchkorg (earned 250 total points)
ID: 22632328
You can't get both :(

As it's a DB export, you should for example extract, let's say, a 100 MB file from it and do some tests with several compression profiles (from low to best).
Maybe using "best" is useless (twice the time for only 2% better).
With real numbers from those tests, you'll find the right setting for your needs.

Since it's a DB export, you might also be able to generate it as multiple files (1 GB files?) and see if compressing them in parallel (i.e., starting several WinRAR instances) can help. It depends on: hard disk interface / multiple disks / number of CPUs/cores / memory, etc.
Example: on an ordinary computer, don't bother, it's useless. On a real server (several SCSI disks using RAID), you might get 4 times faster by compressing 4 files at once. You "might".
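
As an example of that sampling test, here's a quick Python sketch (the file names, the 100 MB figure and the gzip levels are placeholder assumptions; the same idea applies to whatever tool and profiles you actually use):

import gzip
import os
import time

EXPORT = "export.txt"            # placeholder for the real 15 GB export
SAMPLE = "sample.txt"
SAMPLE_SIZE = 100 * 1024 * 1024  # roughly 100 MB

# Carve the first ~100 MB off the export as a test sample
with open(EXPORT, "rb") as src, open(SAMPLE, "wb") as dst:
    dst.write(src.read(SAMPLE_SIZE))

with open(SAMPLE, "rb") as f:
    data = f.read()

for level in (1, 5, 9):          # fast / middle / best
    start = time.time()
    with gzip.open(f"{SAMPLE}.{level}.gz", "wb", compresslevel=level) as out:
        out.write(data)
    elapsed = time.time() - start
    size_mb = os.path.getsize(f"{SAMPLE}.{level}.gz") / (1024 * 1024)
    print(f"level {level}: {elapsed:.1f}s, {size_mb:.1f} MB")

Seeing the timings and sizes side by side makes the "twice the time for 2% better" trade-off obvious.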
 

Author Comment

by:ebi168
ID: 22638351
mchkorg: thanks for the comment. Let me do some research into it.
 

Author Comment

by:ebi168
ID: 22638387
How about 8 dual-core AMD processors, with SCSI and RAID controllers? So multi-threading appears to be the solution then.
 
LVL 7

Assisted Solution

by:mchkorg (earned 250 total points)
ID: 22650643
Clearly,

1) Try to generate smaller files
2) Do some tests to find the best time/compression ratio with your compression tool
3) Script it all :) (rough sketch below)
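
For step 3, a minimal sketch of what such a script could look like, assuming the 7-Zip command-line tool 7z is installed (-mx sets the compression level, -v1g splits the archive into 1 GB volumes); the file name and level are placeholders:

import subprocess

EXPORT = "export.txt"  # placeholder for the real DB export
LEVEL = 5              # whatever level your tests from step 2 point to

# "a" adds to an archive, -mx sets the compression level (1 fastest .. 9 best),
# -v1g splits the output into 1 GB volumes for easier transfer.
subprocess.run(["7z", "a", f"-mx={LEVEL}", "-v1g", f"{EXPORT}.7z", EXPORT], check=True)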


