Solved

Is .NET Response.OutputStream.Write asynchronous?

Posted on 2009-04-08 | Medium Priority | 1,566 Views | Last Modified: 2013-11-08
On Microsoft's website (http://support.microsoft.com/?kbid=812406) there is an article showing how to download very large files from a website.

Works great, but...

In their sample code (see the code section) they allocate a 10,000-byte buffer and copy the file stream to the output stream in chunks. Fine. But inside the loop they re-allocate a new 10,000-byte buffer for every chunk, which seems like a waste of memory.

I can only imagine that Response.OutputStream.Write is running on a separate thread and may still be using the original buffer, but I can't find any information on threading for this method.

Any thoughts on why, or whether, the re-allocation is necessary?
System.IO.Stream iStream = null;
 
// Buffer to read 10K bytes in chunk:
byte[] buffer = new Byte[10000];
 
// Length of the file:
int length;
 
// Total bytes to read:
long dataToRead;
 
// Identify the file to download including its path.
string filepath  = "DownloadFileName";
 
// Identify the file name.
string  filename  = System.IO.Path.GetFileName(filepath);
 
try
{
	// Open the file.
	iStream = new System.IO.FileStream(filepath, System.IO.FileMode.Open, 
					System.IO.FileAccess.Read,System.IO.FileShare.Read);
 
 
	// Total bytes to read:
	dataToRead = iStream.Length;
 
	Response.ContentType = "application/octet-stream";
	Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);
 
	// Read the bytes.
	while (dataToRead > 0)
	{
		// Verify that the client is connected.
		if (Response.IsClientConnected) 
		{
			// Read the data in buffer.
			length = iStream.Read(buffer, 0, 10000);
 
			// Write the data to the current output stream.
			Response.OutputStream.Write(buffer, 0, length);
 
			// Flush the data to the HTML output.
			Response.Flush();
 
			buffer = new Byte[10000];
			dataToRead = dataToRead - length;
		}
		else
		{
			//prevent infinite loop if user disconnects
			dataToRead = -1;
		}
	}
}
catch (Exception ex) 
{
	// Trap the error, if any.
	Response.Write("Error : " + ex.Message);
}
finally
{
	if (iStream != null) 
	{
		//Close the file.
		iStream.Close();
	}
	Response.Close();
}

Question by: mduffin06
2 Comments
 
Accepted Solution

by: burningmace (LVL 5, earned 2000 total points)
ID: 24100186
It isn't necessary at all: each call to iStream.Read simply overwrites the old data in the buffer with the new chunk, so you can comment out the buffer = new Byte[10000]; line inside the loop with no problems at all. I think the programmer did this just to make sure that stale data from the previous chunk could not cause complications.

It isn't really a waste of memory either, because the garbage collector recognises that the re-assignment orphans the old buffer and will eventually reclaim it. I'm not sure exactly how GC scheduling is handled in .NET, but the use of new may prompt the GC to check whether an old object is being replaced. If not, this approach probably wastes around 500 KB of memory at peak before the GC notices and clears it; not a huge issue when most systems have 1 GB of memory or more these days. But yes, it is bad practice to do things like that.
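A minimal sketch of the same download loop with the re-allocation removed, assuming the same iStream, buffer, length, and dataToRead variables declared in the sample above; iStream.Read overwrites the buffer's contents on each pass, so a single buffer can be reused for every chunk:

// Read the bytes, reusing the same 10,000-byte buffer for every chunk.
while (dataToRead > 0)
{
	// Verify that the client is connected.
	if (Response.IsClientConnected)
	{
		// Read overwrites the buffer contents, so no new allocation is needed.
		length = iStream.Read(buffer, 0, buffer.Length);

		// Write only the bytes actually read to the current output stream.
		Response.OutputStream.Write(buffer, 0, length);

		// Flush the data to the client.
		Response.Flush();

		dataToRead = dataToRead - length;
	}
	else
	{
		// Prevent an infinite loop if the client disconnects.
		dataToRead = -1;
	}
}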
 

Author Closing Comment

by: mduffin06
ID: 31568143
Per burningmace's advice, I removed the re-allocation. Seems to work fine; no problems to date.