
Solved

c# decompress a really large file

Posted on 2010-09-13
Medium Priority
4,872 Views
Last Modified: 2012-05-10
Hi,

I am using C# 3.5.
I have a file that is just under 1 GB. I am trying to decompress it (it has been gzipped) and write the bytes to a new file.

I am getting an OutOfMemoryException when I try to do this.

Here is a sample of my code (which works fine on small files):

using System.IO;
using System.IO.Compression;
using System.IO.Packaging;

//##############################################
//step 1
//first I read the bytes out of the zipped file (this bit works fine)
//###############################################

byte[] zippedBytes;
using (var fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read))
{
       using (var binaryReader = new BinaryReader(fileStream))
       {
          var numBytes = new FileInfo(fileName).Length;
          zippedBytes = binaryReader.ReadBytes((int)numBytes);
       }
}

//################################################################
//step 2
//then I try to decompress to a new MemoryStream
//and I get repeated "Exception of type 'System.OutOfMemoryException' was thrown."
//in the following code...
//################################################################

using (var gZipStream = new GZipStream(new MemoryStream(zippedBytes), CompressionMode.Decompress))
{
    const int size = 4096;
    var buffer = new byte[size];
    using (var memoryStream = new MemoryStream())
    {
        var count = 0;
        do
        {
            count = gZipStream.Read(buffer, 0, size);
            if (count > 0)
            {
                memoryStream.Write(buffer, 0, count);
            }
        } while (count > 0);

        return memoryStream.ToArray();
    }
}


Can anyone help please?!

Thank you for your time.
Question by:MrKevorkian
4 Comments
 
LVL 55

Accepted Solution

by: Jaime Olivares (earned 2000 total points)
ID: 33664004
MrKevorkian,
This is not a good approach: you are reading the entire file contents into memory, which is what exhausts it. What will happen when you have a 4 GB file? Also, consider that you are allocating a buffer the size of the entire compressed file just to feed the decompressor; it would be better to work in smaller chunks.

The standard technique is to read in chunks and write to a new file, as in the example shown in MSDN:
http://msdn.microsoft.com/en-us/library/system.io.compression.gzipstream(v=VS.90).aspx
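A condensed sketch of that chunked read-and-write pattern (the source and destination paths here are just placeholders, and it uses the same System.IO / System.IO.Compression namespaces you already import above):

var sourceGzPath = @"C:\temp\big.gz";      // placeholder paths
var destinationPath = @"C:\temp\big.dat";

const int bufferSize = 4096;
var buffer = new byte[bufferSize];

// Stream from the .gz file, through the decompressor, into the output file,
// so only one small buffer is ever held in memory.
using (var input = File.OpenRead(sourceGzPath))
using (var gzip = new GZipStream(input, CompressionMode.Decompress))
using (var output = File.Create(destinationPath))
{
    int bytesRead;
    while ((bytesRead = gzip.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, bytesRead);
    }
}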

If you are working with .NET 4.0, the solution is quite straightforward with the new CopyTo method:


var ms = new MemoryStream();

using (var decompress = new GZipStream(new FileStream(fileName, FileMode.Open, FileAccess.Read), CompressionMode.Decompress))
{
    decompress.CopyTo(ms);   // CopyTo (new in .NET 4.0) streams the decompressed bytes in chunks
    return ms.ToArray();
}

// Notice you will still have at least two buffers here: the 'ms' stream and the array generated before returning.
// To avoid this, you can decompress to a temporary FileStream and read the contents into an array with File.ReadAllBytes.
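A minimal sketch of that temporary-file variant (the temp-path handling is just illustrative, and it still assumes .NET 4.0 for CopyTo):

// Decompress straight to a temporary file, then read it back;
// only the final byte array ends up in memory.
var tempPath = Path.GetTempFileName();

using (var source = new FileStream(fileName, FileMode.Open, FileAccess.Read))
using (var decompress = new GZipStream(source, CompressionMode.Decompress))
using (var tempFile = File.Create(tempPath))
{
    decompress.CopyTo(tempFile);
}

byte[] decompressedBytes = File.ReadAllBytes(tempPath);
File.Delete(tempPath);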

 
LVL 3

Expert Comment

by:vusov
ID: 33664056
Please try the Ionic.Zip (DotNetZip) library. I've attached a GZipHelper sample showing how to use it.
Compress.zip
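The attached Compress.zip isn't reproduced in the thread. Purely as an illustration, a GZipHelper built on DotNetZip might look like the sketch below; the class and member names are hypothetical (not the attachment's contents), and the usage assumes DotNetZip's standard Ionic.Zlib.GZipStream API, which is largely a drop-in replacement in the same chunked loop shown earlier:

using System.IO;
using Ionic.Zlib;   // DotNetZip's GZipStream and CompressionMode live here

public static class GZipHelper
{
    // Decompresses a .gz file to disk in small chunks so large files never sit in memory.
    public static void Decompress(string gzPath, string outputPath)
    {
        var buffer = new byte[4096];

        using (var input = File.OpenRead(gzPath))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))   // Ionic.Zlib.GZipStream
        using (var output = File.Create(outputPath))
        {
            int read;
            while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }
    }
}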
 
LVL 1

Author Comment

by:MrKevorkian
ID: 33669874
Hi, sorry for the delay! I'm just looking at these two answers now. Thanks.
 
LVL 1

Author Closing Comment

by:MrKevorkian
ID: 33670760
Hi, I used the MSDN example (well, actually one from the comments at the bottom of the article).

Here's my final code:

public void DecompressAndWriteToFile(FileDetails fileDetails)
{
    const int bufferSize = 4096;
    var compressedFileInfo = new FileInfo(fileDetails.FullPath);

    using (var compressedFileStream = compressedFileInfo.OpenRead())
    {
        var newDecompressedPath = Path.Combine(directoryProvider.MyDropDirectory, fileDetails.UncompressedName);
        using (var decompressedFileStream = File.Create(newDecompressedPath))
        {
            using (var gZipStream = new GZipStream(compressedFileStream, CompressionMode.Decompress))
            {
                var buffer = new byte[bufferSize];
                int numRead;
                while ((numRead = gZipStream.Read(buffer, 0, buffer.Length)) != 0)
                {
                    decompressedFileStream.Write(buffer, 0, numRead);
                }
            }
        }
    }
}

It works great. Thanks very much.
