w3wp.exe eats all memory in IIS using Response.BinaryWrite

I have this site, written in ASP, which makes intensive use of Response.BinaryWrite. This is, roughly, the code used (JScript):

  var rs = cn.Execute(sqlQuery);
  var binMem = rs(0).Value; // data is in the first field of the recordset

  var stream = Server.CreateObject("ADODB.Stream");
  stream.Type = 1; // adTypeBinary
  stream.Open();
  stream.Write(binMem); // copy the whole BLOB into the stream

  Response.ContentType = this.getContentType();
  Response.AddHeader("Content-Disposition", "inline;filename=\"" + filepath + "\"");
  Response.Buffer = false;

  stream.Position = 0;
  var buffer = 16 * 1024; // send in 16 KB chunks
  while (!stream.EOS) {
    Response.BinaryWrite(stream.Read(buffer));
  }
  stream.Close();
  try { rs.Close(); rs = null; dbe.Close(); dbe = null; } catch (e) {}
After a few hours w3wp.exe eats all the system memory (about 1.3 GB) and Windows Server 2003 starts paging, which causes IIS to respond very slowly.

I have read there's a fix from Microsoft here:

but I have also read that it doesn't completely solve the problem, because one of the side effects of applying the fix is that IIS occasionally hangs.

It would be nice to hear from someone who streams large files using Response.BinaryWrite without these problems, and to know how he/she does it. I was wondering whether doing small reads from the recordset and small writes to the response could avoid this problem.

AndresM Commented:
I don't know. If you can, you should run some tests with VectorSendThrottleLimit and see what happens. I couldn't find information about real-world experiences, just this link...
The KB says: "If you set the VectorSendThrottleLimit subkey to a value that is less than the amount of data that an ASP page sends to a client, IIS stops responding (hangs). For example, if you set the VectorSendThrottleLimit subkey to 1000, any ASP page that transfers more than 1000 bytes is not displayed in the client browser. Additionally, the IIS thread that handles the request is blocked until you restart the IIS service."

So, you should configure VectorSendThrottleLimit with a value greater than the largest response you could have.
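As a sketch of how that could be applied, assuming the value lives under the ASP parameters key (the exact registry path and the 64 MB figure are assumptions to verify against the Microsoft KB article for the hotfix):

```shell
:: Assumed key path; confirm it against the hotfix's KB article first.
:: 67108864 bytes = 64 MB, i.e. comfortably above the largest file served.
reg add "HKLM\SYSTEM\CurrentControlSet\Services\W3SVC\ASP\Parameters" ^
    /v VectorSendThrottleLimit /t REG_DWORD /d 67108864 /f

:: Restart IIS so the new value is picked up.
iisreset
```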

But I am not sure; I think you can also chunk the response. See if this thread helps:

cipAuthor Commented:
Thanks AndresM.

It's not clear to me what a good value for VectorSendThrottleLimit would be. Specifically, what counts as "the amount of data that an ASP page sends to a client"? Does that mean a single call to Response.BinaryWrite, or everything from the first Response.BinaryWrite to the end of the page?

We stream files that are several tens of MB in size. Does this mean we need to set VectorSendThrottleLimit to something like tens of MB? Wouldn't that bring the problem right back, since tens of MB would be queued and we'd be where we started?

cipAuthor Commented:
Just for the record: buffering the data in a loop and sending it to the client in small chunks seems to have improved performance. During the loop I also check whether the client is still connected, using Response.IsClientConnected.

Here is some sample code about how to read blobs in small chunks:
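(The sample code was not pasted into the thread. Below is a hedged sketch of the idea: the ASP/ADO calls shown in the comments, such as reading the BLOB with ADO's Field.GetChunk, are assumptions about how cip's loop might look, and the chunk arithmetic is written as a plain function so it can run outside ASP.)

```javascript
// Hypothetical helper: compute the byte count for each small read of a BLOB.
// In the ASP page, each size n would feed a pair of calls like
//   var data = rs(0).GetChunk(n);  // ADO: read the next n bytes of the BLOB
//   Response.BinaryWrite(data);    // send them immediately, unbuffered
// inside a loop that also bails out when Response.IsClientConnected() is false.
function chunkSizes(totalBytes, chunkBytes) {
  var sizes = [];
  for (var left = totalBytes; left > 0; left -= chunkBytes) {
    sizes.push(Math.min(chunkBytes, left));
  }
  return sizes;
}

// A 40 KB BLOB read in 16 KB chunks takes three reads: 16 KB, 16 KB, 8 KB.
var reads = chunkSizes(40 * 1024, 16 * 1024); // -> [16384, 16384, 8192]
```

The point of the chunked loop is that only one small buffer is ever held per request, instead of the whole file being queued in w3wp.exe.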

I have accepted AndresM's answer, giving him a B, since he was of some help in suggesting to chunk the response.

cipAuthor Commented:
I just noticed I gave AndresM a C instead of a B, my fault. If an administrator is following this, could they please change it to a B?

Question has a verified solution.
