I have a number of large binary files stored in an MS SQL database and need to retrieve them so I can offer them to users of a website for download. While this works fine with the usual Response.BinaryWrite() for files up to about 100 MB, anything larger than that eventually causes an OutOfMemoryException on the web server.
The natural solution, methinks, is to retrieve and serve the files in chunks instead. I have used the code below to achieve this (just a snippet, of course), but the files become corrupted during transfer. That is, I can download them at their full size, but they can no longer be opened. I am a novice at serving up such huge files from a database, and due to the architecture of the project I cannot save them to the file system instead. My suspicion is that I am dropping bytes somewhere in the method below, or possibly not reading the last chunk fully.
I have read this article explaining how to read binary data from the file system, but I am unsure whether it applies to my problem here: http://www.developerfusion.com/code/4696/reading-binary-data-in-c/
At a loss here, so any hints would be much appreciated. I feel a bit stupid for asking what might seem like an obvious question to some experts.
Thanks in advance!
// execute the command and read the file column in chunks
SqlDataReader drFile = myCommand.ExecuteReader(CommandBehavior.SequentialAccess);
try
{
    if (drFile.Read())
    {
        byte[] fileBytes = new byte[8040];  // chunk buffer
        MemoryStream ms = new MemoryStream();
        int fileOrdinal = drFile.GetOrdinal("File");
        long index = 0;
        long count = drFile.GetBytes(fileOrdinal, index, fileBytes, 0, fileBytes.Length);
        while (count > 0)
        {
            ms.Write(fileBytes, 0, (int)count);
            index += count;  // advance the data offset by the bytes actually read
            count = drFile.GetBytes(fileOrdinal, index, fileBytes, 0, fileBytes.Length);
        }
        // ms now holds the complete file
    }
}
catch (Exception ex)
{
    Trace.Warn("Failed to get file:", ex.ToString());
}
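For context, here is a minimal sketch of what I understand the chunked-serving idea to look like when each chunk is written straight to the response rather than accumulated in a MemoryStream first (since a MemoryStream would still hold the whole file in server memory). This assumes a column named "File", a classic ASP.NET Response object, and an arbitrary 8040-byte buffer; it is a sketch of the technique, not code I have verified against my schema:

```csharp
// Sketch: stream each chunk of the BLOB column directly to the client,
// so the full file is never held in server memory at once.
using (SqlDataReader drFile =
    myCommand.ExecuteReader(CommandBehavior.SequentialAccess))
{
    if (drFile.Read())
    {
        Response.BufferOutput = false;                  // do not buffer the response server-side
        Response.ContentType = "application/octet-stream";

        byte[] buffer = new byte[8040];                 // chunk buffer (size is an arbitrary choice)
        int ordinal = drFile.GetOrdinal("File");        // assumed column name
        long offset = 0;
        long read;
        // GetBytes returns the number of bytes copied; 0 means end of data
        while ((read = drFile.GetBytes(ordinal, offset, buffer, 0, buffer.Length)) > 0)
        {
            Response.OutputStream.Write(buffer, 0, (int)read);
            offset += read;                             // advance by the bytes actually read
        }
        Response.Flush();
    }
}
```

With CommandBehavior.SequentialAccess the reader streams the column instead of loading it whole, which is presumably what keeps memory usage flat regardless of file size.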