Solved

Classic ASP Buffer Issue

Posted on 2013-11-15
618 Views
Last Modified: 2013-11-17
I have an ASP recordset that contains about 2,300 rows and could climb to 4,000 in the next couple of weeks. When I try to open the page I get "Buffer limit exceeded" even though I have increased that limit to 64MB. I have to believe that pulling 4,000 records should not be a problem if done correctly, right? My code and query are below; please let me know where they can be optimized.

As always, thanks!!!

		Set rs = Server.CreateObject("ADODB.Recordset")
		sql="SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib, p.Gender, rc.Age, p.City, p.St, p.Phone, p.DOB, p.Email, rg.RaceID FROM "
		sql = sql & "Participant p INNER JOIN PartReg rg ON p.ParticipantID = rg.ParticipantID JOIN PartRace rc "
		sql = sql & "ON rc.ParticipantID = p.ParticipantID WHERE rc.RaceID IN (" & sEventRaces & ") " & sOrderBy
		rs.Open sql, conn, 1, 2
		Do While Not rs.EOF
			PartArray(0, i) = rs(0).value
			PartArray(1, i) = Replace(rs(2).Value, "''", "'") & ", " & Replace(rs(1).Value, "''", "'")
			For j = 2 to 10
				PartArray(j, i) = rs(j + 1).Value
			Next
			i = i + 1
			ReDim Preserve PartArray(10, i)
			rs.MoveNext
		Loop
		rs.Close
		Set rs=Nothing


Question by: Bob Schneider
5 Comments
 
Assisted Solution by Carl Tawn (LVL 52, earned 100 total points)
ID: 39651112
Is there any particular reason you are pulling all of that data into an array rather than writing straight to the output buffer?

ReDim Preserve causes a new array to be created of the specified size, and then copies all the data from the old array to it. So each time you call ReDim Preserve you are temporarily holding double the amount of memory - hence increasing the array size by 1 in a loop is a very expensive operation.
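
To make the first point concrete, here is a minimal sketch (reusing the conn, sql and rs names from the question; the markup written per row is hypothetical) that streams each row straight to the output buffer instead of growing an array:

		Set rs = conn.Execute(sql)    ' sql built exactly as in the question

		Do While Not rs.EOF
			' Write one row of (hypothetical) markup and move on - nothing accumulates in memory
			Response.Write Server.HTMLEncode(rs("LastName") & ", " & rs("FirstName")) & " (" & rs("Bib") & ")<br />"
			rs.MoveNext
		Loop

		rs.Close
		Set rs = Nothing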
 
Accepted Solution by Big Monty (LVL 33, earned 200 total points)
ID: 39651266
It looks like you're trying to copy your recordset into an array. If that's the case, ditch the loop and use the built-in recordset method GetRows:

		Set rs = Server.CreateObject("ADODB.Recordset")
		sql="SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib, p.Gender, rc.Age, p.City, p.St, p.Phone, p.DOB, p.Email, rg.RaceID FROM "
		sql = sql & "Participant p INNER JOIN PartReg rg ON p.ParticipantID = rg.ParticipantID JOIN PartRace rc "
		sql = sql & "ON rc.ParticipantID = p.ParticipantID WHERE rc.RaceID IN (" & sEventRaces & ") " & sOrderBy
		rs.Open sql, conn, 1, 2     '-- I would change this to Set rs = conn.Execute( sql ); you should only use rs.Open when you need to modify the recordset

		Dim PartArray
		PartArray = rs.GetRows()

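For what it's worth, the array that GetRows returns is dimensioned (column, row) and the column order matches the SELECT list, so looping it looks something like this (a sketch only; note that GetRows errors on an empty recordset, hence the EOF check):

		Dim r
		If Not rs.EOF Then
			PartArray = rs.GetRows()
			For r = 0 To UBound(PartArray, 2)
				' Column 0 = ParticipantID, 1 = FirstName, 2 = LastName, 3 = Bib, ...
				Response.Write PartArray(2, r) & ", " & PartArray(1, r) & " (" & PartArray(3, r) & ")<br />"
			Next
		End If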


Another way you could optimize this page would be to use paging; that way you only grab the records you need for that page. Here's a great paging tutorial to follow:

http://www.4guysfromrolla.com/webtech/121298-1.shtml
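
Very roughly, that kind of ADO paging boils down to something like this sketch (the PAGE_SIZE constant and the "page" querystring parameter are assumptions; a client-side static cursor is used so PageCount and AbsolutePage behave):

		Const PAGE_SIZE = 50
		Dim currentPage, shown
		currentPage = 1
		If IsNumeric(Request.QueryString("page")) Then currentPage = CLng(Request.QueryString("page"))
		If currentPage < 1 Then currentPage = 1

		Set rs = Server.CreateObject("ADODB.Recordset")
		rs.CursorLocation = 3            ' adUseClient
		rs.Open sql, conn, 3, 1          ' adOpenStatic, adLockReadOnly
		rs.PageSize = PAGE_SIZE

		If Not rs.EOF Then
			If currentPage > rs.PageCount Then currentPage = rs.PageCount
			rs.AbsolutePage = currentPage
			shown = 0
			Do While Not rs.EOF And shown < PAGE_SIZE
				Response.Write rs("LastName") & ", " & rs("FirstName") & "<br />"
				shown = shown + 1
				rs.MoveNext
			Loop
		End If
		rs.Close
		Set rs = Nothing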

*edit* fixed typo in code
 
Assisted Solution by Scott Fell, EE MVE (LVL 52, earned 150 total points)
ID: 39652852
Increasing the limit is not the answer. Using GetRows as The_Big_Daddy suggests is the way to go, and there should be no reason to pull 4,000 records at once. Pull in a couple hundred records and display about 20 or 30 at a time via js/jquery. When you get to the last "page", pull in another 100 records. It would probably be best to do all of this via AJAX, with a remote page that you post to in order to add records to a js/jquery array on your main page. Have a quick look at DataTables (http://datatables.net/) to help you do this.
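
If you go the AJAX route, the remote page can stay tiny. A hypothetical getrecords.asp might look like this (the offset/count parameters, the pipe-delimited output, and the OFFSET/FETCH syntax, which needs SQL Server 2012+, are all assumptions, not anything from the thread; conn is assumed to be open as in the question):

		<%
		Dim offset, count, sql, rs, arr, r
		offset = 0 : count = 100
		If IsNumeric(Request.QueryString("offset")) Then offset = CLng(Request.QueryString("offset"))
		If IsNumeric(Request.QueryString("count")) Then count = CLng(Request.QueryString("count"))

		sql = "SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib " & _
		      "FROM Participant p INNER JOIN PartRace rc ON rc.ParticipantID = p.ParticipantID " & _
		      "ORDER BY p.LastName, p.FirstName " & _
		      "OFFSET " & offset & " ROWS FETCH NEXT " & count & " ROWS ONLY"

		Set rs = conn.Execute(sql)
		If Not rs.EOF Then
			arr = rs.GetRows()
			For r = 0 To UBound(arr, 2)
				' One pipe-delimited participant per line: ID|Last, First|Bib
				Response.Write arr(0, r) & "|" & arr(2, r) & ", " & arr(1, r) & "|" & arr(3, r) & vbCrLf
			Next
		End If
		rs.Close : Set rs = Nothing
		%>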
 
Assisted Solution by fritz_the_blank (LVL 46, earned 50 total points)
ID: 39653589
I agree that outputting that many records at a time is bad practice.

As The Big Daddy says, the .getRows() method is much more efficient than what you are doing.

I would use paging to control this--that way you're only grabbing a subset of the records at a time while having the whole set available.

If you do want to output everything at once, however, calling Response.Flush as you print your records will push the buffered output to the client and keep the buffer from filling up.
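
In the original loop that might look like this (a sketch; the 100-row interval is arbitrary, and Response.Buffer must be on, which it is by default):

		Dim iRow
		iRow = 0
		Do While Not rs.EOF
			Response.Write rs("LastName") & ", " & rs("FirstName") & "<br />"
			iRow = iRow + 1
			If iRow Mod 100 = 0 Then Response.Flush   ' send buffered output to the client and free the buffer
			rs.MoveNext
		Loop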
 

Author Comment by Bob Schneider
ID: 39654487
This is the information I needed. Thank you all so much! For background, we time events (5k, mud runs, nordic ski, cross-country running, etc.) with proprietary software. We are transitioning to some much bigger events, so this is exactly the kind of process change I need to make to handle them.

Once again, very helpful!
