• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1824

C#: Reading Data in chunks from DB

Hi,

I have a number of large binary files stored in an MS SQL database and have to retrieve them in order to offer them to users of a website for download. While this works fine using the usual Response.BinaryWrite() for files of up to about 100 MB in size, anything larger than that eventually runs into an OutOfMemoryException on the web server, of course.

The natural solution is to retrieve and serve up the files in chunks instead, methinks. I have used the code below to achieve this (just a snippet, of course), but the files become corrupted during transfer. That is, I can successfully download the files in their full size, but they are no longer openable. I am a novice at serving up such huge files from the database, but due to the architecture of the project I am unable to save them in the file system instead. My suspicion is that I am dropping bytes in the method below, or possibly not catching the last chunk fully.

I have read this article explaining how to read binary data in chunks from the file system, but am unsure whether it is applicable to my problem here: http://www.developerfusion.com/code/4696/reading-binary-data-in-c/
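For reference, chunked reading from a plain file (which is, I assume, what that article boils down to) looks like the sketch below; the same fixed-buffer pattern carries over to SqlDataReader.GetBytes(). The temp file and sizes here are made up for illustration:

```csharp
using System;
using System.IO;
using System.Linq;

class ChunkedFileRead
{
    static void Main()
    {
        // Create a small sample file so the sketch is self-contained.
        string path = Path.GetTempFileName();
        byte[] original = new byte[100000];
        new Random(42).NextBytes(original);
        File.WriteAllBytes(path, original);

        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        using (var output = new MemoryStream())
        {
            byte[] buffer = new byte[32768];
            int count;
            // Read() may return fewer bytes than asked for, so only the
            // first 'count' bytes of the buffer are valid on each pass.
            while ((count = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, count);
            }
            Console.WriteLine(output.Length);
            Console.WriteLine(original.SequenceEqual(output.ToArray()));
        }
        File.Delete(path);
    }
}
```

Note that Response.BinaryWrite() only takes a whole array, so a short final chunk would need either a resized array or Response.OutputStream.Write(buffer, 0, count), which accepts a count.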

At a loss here; any hints would be much appreciated. Feeling a bit stupid for asking what might seem like an obvious question to some experts.

Thanks in advance!

//execute command and get file itself
SqlDataReader drFile = myCommand.ExecuteReader(CommandBehavior.SequentialAccess);
while (drFile.Read())
{
    Byte[] fileBytes = new Byte[32768];
    MemoryStream ms = new MemoryStream();
    try
    {
        int index = 0;
        while (true)
        {
            long count = drFile.GetBytes(drFile.GetOrdinal("File"), index, fileBytes, 0, fileBytes.Length);
            if (count == 0)
            {
                break;
            }
            else
            {
                index = index + (int)count;
                ms.Write(fileBytes, 0, (int)count);
                Response.BinaryWrite(ms.ToArray());
                Response.Flush();
            }
        }
    }
    catch (Exception ex)
    {
        Trace.Warn("Failed to get file:", ex.ToString());
    }
    fileBytes = null;
    ms.Dispose();
}
//close connections
drFile.Close();
drFile.Dispose();
myConnection.Close();
Response.End();


Asked by: mikevoss

3 Solutions
 
vbwizardry Commented:
Try this
const int BUFFER_SIZE = 1024;
long currPos = 0;
int fIndex = reader.GetOrdinal("File");
long blobSize = reader.GetBytes(0, 0, null, 0, 0);
byte[] buffer = new byte[(BUFFER_SIZE>blobSize?blobSize:BUFFER_SIZE)];
while (currPos < blobSize) {
    currPos += reader.GetBytes(fIndex, currPos, buffer, 0, buffer.Length);
    Response.BinaryWrite(ms.ToArray());
    Response.Flush();
}

 
vbwizardry Commented:
Small fix: the blob-size call needs the column ordinal, not a hard-coded 0:

long blobSize = reader.GetBytes(fIndex, 0, null, 0, 0);

 
mikevoss (Author) Commented:
Hi vbwizardry,

Thank you for your quick response! I have taken your code and applied it. It doesn't work, however, because the bytes never get written to the memory stream, from what I can tell. Am I missing a line?

Here is the full code, including the data reader read instruction, stripped of the try/catch/finally code for the database operations.

Thanks!
SqlConnection myConnection = new SqlConnection(ConfigurationManager.ConnectionStrings["CMSConnectionString"].ConnectionString);
myConnection.Open();
SqlCommand myCommand = new SqlCommand("usp_get_GeoDirectory_CustomerFiles", myConnection);
myCommand.CommandType = CommandType.StoredProcedure;
myCommand.CommandTimeout = 240;
 
//execute command and get file
SqlDataReader drFile = myCommand.ExecuteReader(CommandBehavior.SequentialAccess);
while (drFile.Read())
{
    MemoryStream ms = new MemoryStream();
    const int BUFFER_SIZE = 1024;
    long currPos = 0;
    int fIndex = drFile.GetOrdinal("File");
    long blobSize = drFile.GetBytes(fIndex, 0, null, 0, 0);
    byte[] buffer = new byte[(BUFFER_SIZE>blobSize?blobSize:BUFFER_SIZE)];
    while (currPos < blobSize) {
        currPos += drFile.GetBytes(fIndex, currPos, buffer, 0, buffer.Length);
        Response.BinaryWrite(ms.ToArray());
        Response.Flush();
    }
}

//close connections
drFile.Close();
drFile.Dispose();
myConnection.Close();
Response.End();

 
vbwizardry Commented:
That's because I forgot to replace the ms.ToArray() with the buffer:
Response.BinaryWrite(buffer);
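For anyone finding this later, here is the whole pattern in one self-contained sketch (a byte array and a MemoryStream stand in for the blob column and the HTTP response, since SqlDataReader and Response aren't available outside the web app; names are mine, not from the thread). The comments mark the substitutions and the line that caused the original corruption:

```csharp
using System;
using System.IO;
using System.Linq;

class ChunkLoopDemo
{
    const int BUFFER_SIZE = 1024;

    // Copies 'blob' to 'response' chunk by chunk, writing only the
    // bytes just read -- the shape of the final working loop.
    static void CopyChunked(byte[] blob, Stream response)
    {
        long blobSize = blob.Length;   // i.e. drFile.GetBytes(fIndex, 0, null, 0, 0)
        long currPos = 0;
        byte[] buffer = new byte[Math.Min(BUFFER_SIZE, blobSize)];
        while (currPos < blobSize)
        {
            // Stands in for drFile.GetBytes(fIndex, currPos, buffer, 0, buffer.Length).
            int count = (int)Math.Min(buffer.Length, blobSize - currPos);
            Array.Copy(blob, currPos, buffer, 0, count);
            currPos += count;
            // Write the buffer, NOT ms.ToArray(): ToArray() re-sends every
            // byte accumulated so far on each pass and corrupts the download.
            response.Write(buffer, 0, count);
        }
    }

    static void Main()
    {
        byte[] blob = new byte[10000];
        new Random(1).NextBytes(blob);
        using (var response = new MemoryStream())
        {
            CopyChunked(blob, response);
            Console.WriteLine(response.Length);
            Console.WriteLine(blob.SequenceEqual(response.ToArray()));
        }
    }
}
```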

 
mikevoss (Author) Commented:
Great stuff, thanks a lot for your help. I merely graded it as good rather than excellent because it didn't work the first time round; otherwise, very happy with the fast response and working solution.
