Solved

Classic ASP Buffer Issue

Posted on 2013-11-15
5
594 Views
Last Modified: 2013-11-17
I have an ASP recordset that contains about 2,300 rows and could climb to 4,000 in the next couple of weeks.  When I try to open the page I get "Buffer limit exceeded" even though I have increased that limit to 64MB.  I have to believe that pulling 4,000 records should not be a problem if done correctly, right?  My code and query are below; please let me know where they can be optimized.

As always, thanks!!!

		Set rs = Server.CreateObject("ADODB.Recordset")
		sql="SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib, p.Gender, rc.Age, p.City, p.St, p.Phone, p.DOB, p.Email, rg.RaceID FROM "
		sql = sql & "Participant p INNER JOIN PartReg rg ON p.ParticipantID = rg.ParticipantID JOIN PartRace rc "
		sql = sql & "ON rc.ParticipantID = p.ParticipantID WHERE rc.RaceID IN (" & sEventRaces & ") " & sOrderBy
		rs.Open sql, conn, 1, 2
		Do While Not rs.EOF
			PartArray(0, i) = rs(0).value
			PartArray(1, i) = Replace(rs(2).Value, "''", "'") & ", " & Replace(rs(1).Value, "''", "'")
			For j = 2 to 10
				PartArray(j, i) = rs(j + 1).Value
			Next
			i = i + 1
			ReDim Preserve PartArray(10, i)
			rs.MoveNext
		Loop
		rs.Close
		Set rs=Nothing

Question by:Bob Schneider
5 Comments
 
LVL 52

Assisted Solution

by:Carl Tawn
Carl Tawn earned 100 total points
ID: 39651112
Is there any particular reason you are pulling all of that data into an array rather than writing straight to the output buffer?

ReDim Preserve causes a new array of the specified size to be created, and then copies all the data from the old array into it. So each time you call ReDim Preserve you are temporarily holding roughly double the memory, which makes growing the array by 1 on every loop iteration a very expensive operation.
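A minimal sketch of that streaming approach, assuming the same open recordset as in the question (column names taken from the query; the surrounding HTML layout is omitted):

```vbscript
' Stream each row straight to the response instead of buffering it in an array.
' Assumes rs is the open recordset from the question.
Do While Not rs.EOF
	Response.Write Server.HTMLEncode(rs("LastName").Value & ", " & rs("FirstName").Value) & "<br>"
	rs.MoveNext
Loop
rs.Close
Set rs = Nothing
```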
 
LVL 32

Accepted Solution

by:
Big Monty earned 200 total points
ID: 39651266
It looks like you're trying to copy your recordset into an array. If that's the case, ditch the loop and use the recordset's built-in GetRows method:

		Set rs = Server.CreateObject("ADODB.Recordset")
		sql="SELECT p.ParticipantID, p.FirstName, p.LastName, rc.Bib, p.Gender, rc.Age, p.City, p.St, p.Phone, p.DOB, p.Email, rg.RaceID FROM "
		sql = sql & "Participant p INNER JOIN PartReg rg ON p.ParticipantID = rg.ParticipantID JOIN PartRace rc "
		sql = sql & "ON rc.ParticipantID = p.ParticipantID WHERE rc.RaceID IN (" & sEventRaces & ") " & sOrderBy
		rs.Open sql, conn, 1, 2     '-- I would change this to Set rs = conn.Execute( sql ); you should only use rs.Open when modifying an rs

		Dim PartArray
		PartArray = rs.GetRows()

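One thing worth noting about GetRows: it returns a 2-D variant array indexed (field, row), so the old per-row loop becomes a loop over the second dimension. A sketch, using the field positions from the query above:

```vbscript
' GetRows returns a 2-D variant array indexed (field, row).
Dim PartArray, row
PartArray = rs.GetRows()
rs.Close : Set rs = Nothing   ' the array survives after the recordset is closed

For row = 0 To UBound(PartArray, 2)
	' In the query above, field 2 = LastName and field 1 = FirstName.
	Response.Write PartArray(2, row) & ", " & PartArray(1, row) & "<br>"
Next
```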


Another way you could optimize this page would be to use paging; that way you only grab the records you want to show on each page. Here's a great paging tutorial to follow:

http://www.4guysfromrolla.com/webtech/121298-1.shtml
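The tutorial covers the details, but the core of classic ADO paging looks roughly like this (a sketch, reusing the sql and conn variables from the question; a static client-side cursor is needed for PageCount/AbsolutePage to work):

```vbscript
' Sketch of ADO recordset paging: fetch one page of rows at a time.
Const adUseClient = 3, adOpenStatic = 3, adLockReadOnly = 1
Dim page : page = CLng(Request.QueryString("page"))
If page < 1 Then page = 1

Set rs = Server.CreateObject("ADODB.Recordset")
rs.CursorLocation = adUseClient
rs.PageSize = 50
rs.Open sql, conn, adOpenStatic, adLockReadOnly

If Not rs.EOF Then
	If page > rs.PageCount Then page = rs.PageCount
	rs.AbsolutePage = page

	Dim n : n = 0
	Do While Not rs.EOF And n < rs.PageSize
		Response.Write rs("LastName") & ", " & rs("FirstName") & "<br>"
		n = n + 1
		rs.MoveNext
	Loop
End If
rs.Close : Set rs = Nothing
```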

*edit* fixed typo in code
 
LVL 52

Assisted Solution

by:Scott Fell, EE MVE
Scott Fell,  EE MVE earned 150 total points
ID: 39652852
Increasing the limit is not the answer.  Using GetRows as The_Big_Daddy suggests is the way to go, and there should be no reason to pull 4,000 records at once.  Pull in a couple hundred records and display about 20 or 30 at a time via js/jquery.  When you get to the last "page", pull in another 100 records.  It would probably be best to do all of this via ajax, where you have a remote page that you post to in order to add records to a js/jquery array on your main page.   Have a quick look at DataTables http://datatables.net/ to help you do this.
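One possible shape for that remote page (all names here are hypothetical; a sketch assuming the same query and an rs already opened as in the question): an ASP page that takes an offset and returns a small JSON chunk the main page can append to its js array.

```vbscript
' Hypothetical getchunk.asp: returns up to 100 rows as JSON, starting
' at the requested offset, for the main page to fetch via ajax.
Dim offset : offset = CLng(Request.QueryString("offset"))
If offset > 0 Then rs.Move offset

Dim rows, r, json
rows = rs.GetRows(100)            ' grab at most 100 rows from the current position
json = "["
For r = 0 To UBound(rows, 2)
	If r > 0 Then json = json & ","
	json = json & "{""last"":""" & Replace(rows(2, r), """", "\""") & """," & _
	              """first"":""" & Replace(rows(1, r), """", "\""") & """}"
Next
json = json & "]"
Response.ContentType = "application/json"
Response.Write json
```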
 
LVL 46

Assisted Solution

by:fritz_the_blank
fritz_the_blank earned 50 total points
ID: 39653589
I agree that outputting that many records at a time is bad practice.

As The Big Daddy says, the .getRows() method is much more efficient than what you are doing.

I would use paging to control this--that way you're only grabbing a subset of the records at a time while having the whole set available.

If you want to, however, calling Response.Flush periodically will send the buffered output to the client and free the buffer as you print your records.
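A sketch of that flushing approach, assuming the open recordset from the question (the interval of 200 rows is an arbitrary choice):

```vbscript
' Flush the buffer every few hundred rows so it never grows past the limit
' while streaming a large result set.
Dim n : n = 0
Do While Not rs.EOF
	Response.Write rs("LastName") & ", " & rs("FirstName") & "<br>"
	n = n + 1
	If n Mod 200 = 0 Then Response.Flush   ' send what we have so far to the client
	rs.MoveNext
Loop
```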
 

Author Comment

by:Bob Schneider
ID: 39654487
This is the information I needed.  Thank you all so much!  For background, we time events (5k, mud runs, nordic ski, cross-country running, etc.) with proprietary software.  We are transitioning to some much bigger events, so this is exactly the kind of process change I will need to make.

Once again, very helpful!
