cip

asked on

w3wp.exe eats all memory in IIS using Response.BinaryWrite

I have this site, written in classic ASP, which makes intensive use of Response.BinaryWrite. This is the code mostly used (JScript):

  var rs = cn.execute(sqlQuery);
  var binMem = rs(0); // data is in first field of recordset

  var stream = Server.CreateObject("ADODB.Stream");
  stream.Type = 1; // adTypeBinary
  stream.Open();

  Response.ContentType = this.getContentType();
  Response.AddHeader("Content-Disposition", "inline;filename=\"" + filepath + "\"");
  Response.Buffer = false;
  stream.Write(binMem.Value); // copies the whole BLOB into the stream at once
  stream.Position = 0;
  var chunkSize = 16 * 1024; // send in 16 KB chunks

  while (!stream.EOS)
  {
    Response.BinaryWrite(stream.Read(chunkSize));
  }

  stream.Close();
  // note: the connection variable is cn (the original snippet closed an
  // undefined "dbe" here)
  try { rs.close(); rs = null; cn.close(); cn = null; } catch (e) {}
             
After a few hours w3wp.exe eats all the system memory (about 1.3 GB) and Windows Server 2003 starts paging, which causes IIS to respond very slowly.

I have read there's a fix from microsoft here:
http://support.microsoft.com/kb/916984

but I have also read that it doesn't completely solve the problem, because one of the side effects of applying the fix is that IIS occasionally hangs.

It would be nice to hear from someone who streams large files using Response.BinaryWrite without these problems, and to learn how he/she does it. I was wondering whether using small reads from the record and small writes to the stream could avoid this problem.

AndresM

The KB says: "If you set the VectorSendThrottleLimit subkey to a value that is less than the amount of data that an ASP page sends to a client, IIS stops responding (hangs). For example, if you set the VectorSendThrottleLimit subkey to 1000, any ASP page that transfers more than 1000 bytes is not displayed in the client browser. Additionally, the IIS thread that handles the request is blocked until you restart the IIS service."

So you should configure VectorSendThrottleLimit with a value greater than the largest response you could send.
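For illustration only, a .reg fragment for that setting might look roughly like the sketch below. The key path is my assumption and must be verified against KB 916984 before applying anything, and the 64 MB value is just an example, not a recommendation:

```
Windows Registry Editor Version 5.00

; ASSUMED key path -- verify against KB 916984 before use.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W3SVC\ASP\Parameters]
; 0x04000000 = 64 MB; per the KB, this must exceed the largest ASP response
"VectorSendThrottleLimit"=dword:04000000
```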

But I am not sure. I think you can chunk the response; see if this thread helps:

http://groups.google.com.ar/group/microsoft.public.inetserver.asp.general/browse_thread/thread/d0b57f3543749c56/30874979435cce0c?lnk=st&q=asp+chunks++Response.BinaryWrite&rnum=4&hl=es

ASKER


Thanks AndresM, but it's not clear to me what a good value for VectorSendThrottleLimit should be. What is unclear is what "the amount of data that an ASP page sends to a client" means. Does it refer to a single call to Response.BinaryWrite, or to all the data from the first Response.BinaryWrite to the end of the page?

We stream files several tens of MB in size. Does this mean we need to set VectorSendThrottleLimit to something like tens of MB? Wouldn't that bring the problem back, since tens of MB would then be queued and we'd be back where we started?


ASKER CERTIFIED SOLUTION
AndresM

ASKER

Just for the record: buffering the data in a loop and sending it to the client in small chunks seems to have improved performance. During the loop I also check whether the client is still connected using Response.IsClientConnected.

Here is some sample code showing how to read BLOBs in small chunks:
http://support.microsoft.com/kb/317034/en-us
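The chunked-streaming loop from that KB can be sketched in plain JavaScript so it runs outside ASP. The `chunkedCopy` helper below is hypothetical: `blob.slice` stands in for `Field.GetChunk(chunkSize)` or `Stream.Read(chunkSize)`, `out.push` stands in for `Response.BinaryWrite`, and the callback stands in for `Response.IsClientConnected`:

```javascript
// Hypothetical sketch of the small-read/small-write loop from the thread.
// In classic ASP the pieces map to:
//   blob.slice(...)      -> field.GetChunk(chunkSize) or stream.Read(chunkSize)
//   out.push(...)        -> Response.BinaryWrite(chunk)
//   isClientConnected()  -> Response.IsClientConnected
function chunkedCopy(blob, chunkSize, isClientConnected) {
  var out = [];
  for (var pos = 0; pos < blob.length; pos += chunkSize) {
    if (!isClientConnected()) {
      break; // stop streaming if the client went away
    }
    var chunk = blob.slice(pos, pos + chunkSize); // small read ...
    out.push(chunk);                              // ... small write
  }
  return out;
}

// Example: a 40-byte "blob" copied in 16-byte chunks -> 16 + 16 + 8.
var blob = new Array(40).fill(0);
var chunks = chunkedCopy(blob, 16, function () { return true; });
console.log(chunks.length);    // 3
console.log(chunks[2].length); // 8 (the remainder)
```

The point of the pattern is that only `chunkSize` bytes are held per iteration, instead of loading the whole BLOB into an ADODB.Stream up front.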

I have accepted AndresM's answer, giving him a B, since he was of some help in suggesting to chunk the response.

thanks
fc

ASKER

I just realized I gave AndresM a C instead of a B, my fault. If an administrator is following, could they please change it to a B?

Thanks
fc