• Status: Solved

Classic ASP Buffer Issue

I have an ASP recordset that contains about 2,300 rows and could climb to 4,000 in the next couple of weeks. When I try to open the page I get "Buffer limit exceeded", even though I have increased that limit to 64MB. I have to believe that pulling 4,000 records should not be a problem if done correctly, right? My code and query are below; please let me know where they can be optimized.

As always, thanks!!!

		' PartArray and i are assumed to be dimensioned/initialized earlier on the page
		Set rs = Server.CreateObject("ADODB.Recordset")
		sql = "SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib, p.Gender, rc.Age, p.City, p.St, p.Phone, p.DOB, p.Email, rg.RaceID FROM "
		sql = sql & "Participant p INNER JOIN PartReg rg ON p.ParticipantID = rg.ParticipantID JOIN PartRace rc "
		sql = sql & "ON rc.ParticipantID = p.ParticipantID WHERE rc.RaceID IN (" & sEventRaces & ") " & sOrderBy
		rs.Open sql, conn, 1, 2    ' 1 = adOpenKeyset, 2 = adLockPessimistic
		Do While Not rs.EOF
			PartArray(0, i) = rs(0).Value
			' Un-escape doubled apostrophes and store the name as "Last, First"
			PartArray(1, i) = Replace(rs(2).Value, "''", "'") & ", " & Replace(rs(1).Value, "''", "'")
			For j = 2 To 10
				PartArray(j, i) = rs(j + 1).Value
			Next
			i = i + 1
			ReDim Preserve PartArray(10, i)    ' grows the array one row per iteration
			rs.MoveNext
		Loop
		rs.Close
		Set rs=Nothing


Asked by Bob Schneider

4 Solutions
 
Carl Tawn (Systems and Integration Developer) commented:
Is there any particular reason you are pulling all of that data into an array rather than writing straight to the output buffer?

ReDim Preserve causes a new array of the specified size to be created, and then copies all the data from the old array into it. So each time you call ReDim Preserve you are temporarily holding double the amount of memory, which makes growing the array by one row per loop iteration a very expensive operation.
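If you do need the array, one fix is to size it once up front rather than growing it inside the loop. A minimal sketch, assuming the keyset cursor from your code (RecordCount returns -1 on a forward-only cursor, so this depends on the cursor type):

		' Size the array once; RecordCount is usable here because the
		' recordset was opened with a keyset cursor (adOpenKeyset = 1)
		If rs.RecordCount > 0 Then ReDim PartArray(10, rs.RecordCount - 1)
		i = 0
		Do While Not rs.EOF
			PartArray(0, i) = rs(0).Value
			' ... fill the remaining columns as in the original loop ...
			i = i + 1
			rs.MoveNext
		Loop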
 
Big Monty (Senior Web Developer / CEO of ExchangeTree.org) commented:
It looks like you're trying to copy your recordset into an array. If that's the case, ditch the loop and use the recordset's built-in GetRows method:

		Set rs = Server.CreateObject("ADODB.Recordset")
		sql = "SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib, p.Gender, rc.Age, p.City, p.St, p.Phone, p.DOB, p.Email, rg.RaceID FROM "
		sql = sql & "Participant p INNER JOIN PartReg rg ON p.ParticipantID = rg.ParticipantID JOIN PartRace rc "
		sql = sql & "ON rc.ParticipantID = p.ParticipantID WHERE rc.RaceID IN (" & sEventRaces & ") " & sOrderBy
		rs.Open sql, conn, 1, 2     '-- I would change this to Set rs = conn.Execute(sql); you should only use rs.Open when modifying a recordset

		Dim PartArray
		PartArray = rs.GetRows()    ' returns the whole recordset as a 2D array indexed (field, record)



Another way you could optimize this page would be to use paging; that way you only grab the records you need for the current page. Here's a great paging tutorial to follow:

http://www.4guysfromrolla.com/webtech/121298-1.shtml
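For what it's worth, ADO recordsets also have built-in paging via the PageSize and AbsolutePage properties. A rough sketch, assuming a client-side cursor and a hypothetical "pg" querystring parameter for the page number:

		Const adUseClient = 3
		Dim pg
		pg = CLng(Request.QueryString("pg"))   ' hypothetical page-number parameter
		If pg < 1 Then pg = 1

		Set rs = Server.CreateObject("ADODB.Recordset")
		rs.CursorLocation = adUseClient        ' client-side cursor supports paging reliably
		rs.Open sql, conn
		rs.PageSize = 25                       ' rows per page
		If Not rs.EOF And pg <= rs.PageCount Then
			rs.AbsolutePage = pg
			Dim n
			n = 0
			Do While Not rs.EOF And n < rs.PageSize
				Response.Write rs("LastName") & ", " & rs("FirstName") & "<br>"
				rs.MoveNext
				n = n + 1
			Loop
		End If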

*edit* fixed typo in code
 
Scott Fell, EE MVE (Developer & EE Moderator) commented:
Increasing the limit is not the answer. Using GetRows as The_Big_Daddy suggests is the way to go, and there should be no reason to pull 4,000 records at once. Pull in a couple hundred records and display about 20 or 30 at a time via js/jQuery. When you get to the last "page", pull in another 100 records. It would probably be best to do all of this via AJAX, where you have a remote page that you post to in order to add records to a js/jQuery array on your main page. Have a quick look at DataTables http://datatables.net/ to help you do this.
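A rough sketch of what that remote page might look like, assuming hypothetical "start" and "count" querystring parameters and the same conn/sql as above (a real DataTables endpoint would return JSON instead):

		' getrecords.asp -- returns one chunk of rows for the AJAX caller (hypothetical page)
		Dim start, count, n
		start = CLng(Request.QueryString("start"))   ' hypothetical offset parameter
		count = CLng(Request.QueryString("count"))   ' hypothetical chunk size

		Set rs = conn.Execute(sql)
		If start > 0 And Not rs.EOF Then rs.Move start   ' skip ahead to the requested chunk

		n = 0
		Do While Not rs.EOF And n < count
			Response.Write rs("ParticipantID") & "|" & rs("LastName") & ", " & rs("FirstName") & vbCrLf
			rs.MoveNext
			n = n + 1
		Loop
		rs.Close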
 
fritz_the_blank commented:
I agree that outputting that many records at a time is bad practice.

As The Big Daddy says, the .getRows() method is much more efficient than what you are doing.

I would use paging to control this--that way you're only grabbing a subset of the records at a time while having the whole set available.

If you want to, however, you can write the records straight to the response and call Response.Flush periodically; Flush sends the current contents of the buffer to the client and empties it, so the buffer never hits its limit.
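A minimal sketch of that approach, flushing every 100 rows (the interval is arbitrary):

		Dim rowCount
		rowCount = 0
		Do While Not rs.EOF
			Response.Write rs("LastName") & ", " & rs("FirstName") & "<br>"
			rowCount = rowCount + 1
			If rowCount Mod 100 = 0 Then Response.Flush   ' send and empty the buffer
			rs.MoveNext
		Loop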
 
Bob Schneider, Co-Owner (Author) commented:
This is the information I needed. Thank you all so much! For background: we time events (5k runs, mud runs, nordic ski, cross-country running, etc.) with proprietary software. We are transitioning to some much bigger events, so this is exactly the kind of process change I need to make.

Once again, very helpful!