TechMommy

asked on

Retrieving Batches of Rows in ADO Recordset - Microsoft Access

I have a large recordset which I want to use as the foundation for an Access form. The problem is that I do not want to bring back all of the rows in the recordset to the form. I am able to use the ADO PageSize property to page through a recordset once it is on the client, but I still experience a performance hit populating the recordset with the 5000 plus rows that I am retrieving from the server. I need a way to page through the recordset on the server, retrieving only 100 rows at a time.

Here is the code I have so far:

        ' Server-side recordset over the full view
        Set rst = New ADODB.Recordset
        rst.ActiveConnection = CurrentProject.AccessConnection
        rst.LockType = adLockOptimistic
        rst.CursorType = adOpenKeyset
        rst.CursorLocation = adUseServer

        ' Local temp recordset that will hold one page of rows for the form
        Set rst2 = New ADODB.Recordset
        rst2.ActiveConnection = CurrentProject.AccessConnection
        rst2.LockType = adLockOptimistic
        rst2.CursorType = adOpenKeyset

        rst.Open "SELECT * FROM vwProjectBuyerReport"
        rst.PageSize = 25

        ' Opened with a criterion that matches nothing, so it starts empty
        rst2.Open "SELECT * FROM tblTempRecordset WHERE ProjectID = 0"

        ' Copy one page of rows from rst into rst2
        Dim intCount As Integer
        Do While Not rst.EOF And intCount < rst.PageSize
            rst2.AddNew
            PopulateRST2   ' copies the current rst row's fields into rst2 (not shown)
            rst2.Update
            intCount = intCount + 1
            rst.MoveNext
        Loop

        Set Me.fsubProjectBuyerReport.Form.Recordset = rst2



The rst.Open line takes two minutes to execute when it returns the 5000 rows. My goal is NOT to return all 5000 rows, but rather to page through them on the server, 100 rows at a time, as the user clicks NEXT and PREVIOUS PAGE buttons.
Rey Obrero (Capricorn1)

Is this for viewing records only?
TechMommy

ASKER

Yes. Absolutely. View only.
ASKER CERTIFIED SOLUTION
Bill Ross
The view that is being used contains the primary key, but the primary keys are not sequential because the view pulls back various rows based on criteria. How can we use your method in this situation?
Hi Paganed,

Results will be slower, but you can create a page table with, say, rows 1-5000 and append the PKs to that table.  You could then join that table to your primary table and bring over 100 records at a time.  That will still be a lot faster than your current method.
 
Essentially, bring over 100 PKs and query the big recordset from that small set of data.  The SQL query engine will optimize that pretty well.

Regards,

Bill
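
For illustration, a minimal VBA sketch of that page-table idea, assuming a local key table named tblPageKeys(RowNum, ProjectID) that has already been filled with the ordered primary keys (those names are illustrative, not from the thread):

    ' Sketch only: join a small, numbered key table to the big view and
    ' pull back one 100-row page at a time.
    Dim rstPage As ADODB.Recordset
    Dim lngPage As Long
    Dim strSQL As String

    lngPage = 3   ' e.g. the third page of 100 rows

    strSQL = "SELECT v.* " & _
             "FROM vwProjectBuyerReport AS v " & _
             "INNER JOIN tblPageKeys AS k ON v.ProjectID = k.ProjectID " & _
             "WHERE k.RowNum BETWEEN " & ((lngPage - 1) * 100 + 1) & _
             " AND " & (lngPage * 100)

    Set rstPage = New ADODB.Recordset
    rstPage.Open strSQL, CurrentProject.Connection, adOpenKeyset, adLockReadOnly

Only the 100 joined rows cross the wire; the numbered key table does the paging arithmetic.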
Hi,

Another thought would be to pass the criteria back to the server but that would not necessarily give you 100 rows at a time...

Anything that you can do to limit the data passed back from the server will speed up your work.

Bill
What I normally do is get everything as a recordset and dump the result to an Excel file. I can get 35 to 40 K records in 30 seconds, plus or minus 5.

Then I just link to the Excel file. The fewer times you connect to the server to get the records, the better.
Capricorn1,
I like the Excel idea, but the problem is the long time it takes to create the recordset to dump in the first place.

BillDenver,

As for the primary key capture, were you talking about something like this?

Example:
0. Keep the view in ascending order of PrimaryKey.

1. Load the TOP 100 of the 5000+ row view as a recordset.

2. Capture the PrimaryKey of the first and last rows as Long Integers
--- something like lngPrimaryKeyFirst and lngPrimaryKeyLast.

3. Then, with two basic buttons, PREV and NEXT:
--- if NEXT is clicked, take the TOP 100 (WHERE PrimaryKey > lngPrimaryKeyLast)
--- if PREV is clicked, take the TOP 100 (WHERE PrimaryKey < lngPrimaryKeyFirst) + (SORT PrimaryKey DESC)
--- because the PKs will not be sequential, you cannot just say -100 or something like that.

I think this may work.
You just use the CopyFromRecordset method, and it is very fast.
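
A rough sketch of that Excel-dump approach (the workbook path, sheet layout, and linked-table name below are assumptions for illustration):

    ' Sketch only: dump the recordset to a workbook with CopyFromRecordset,
    ' then link the workbook back into Access.
    Dim rstDump As ADODB.Recordset
    Dim xlApp As Object, xlBook As Object, xlSheet As Object
    Dim i As Long

    Set rstDump = New ADODB.Recordset
    rstDump.CursorLocation = adUseClient
    rstDump.Open "SELECT * FROM vwProjectBuyerReport", _
                 CurrentProject.Connection, adOpenStatic, adLockReadOnly

    Set xlApp = CreateObject("Excel.Application")
    Set xlBook = xlApp.Workbooks.Add
    Set xlSheet = xlBook.Worksheets(1)

    ' CopyFromRecordset writes data only, so add the field names as headers first
    For i = 0 To rstDump.Fields.Count - 1
        xlSheet.Cells(1, i + 1).Value = rstDump.Fields(i).Name
    Next i
    xlSheet.Range("A2").CopyFromRecordset rstDump

    xlBook.SaveAs "C:\Temp\Export.xlsx", 51   ' 51 = xlOpenXMLWorkbook (.xlsx)
    xlBook.Close False
    xlApp.Quit
    rstDump.Close

    ' Link the workbook into Access as a read-only data source, e.g.:
    ' DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel12Xml, _
    '     "tblLinkedExport", "C:\Temp\Export.xlsx", True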
I'm slightly confused as to how many questions are going on in this thread.
Are PAGANED and TechMommy one and the same? :-s

Anyway - in reference to the end suggestion...
Indeed, the suggested solution for navigation is pretty much what you'd arrive at intuitively.
But (with that in mind) it's a conceptual solution only, with potential performance issues lingering.  What I'm wondering about is...

>> The line of code rst.Open takes two minutes to execute when returning 5000 rows.

That's a long time.  Presumably the view is doing more than just returning rows.  It's aggregating and performing some other calculation.
Are you sure there's no speed improvement to be had in that?

We (developers) avoid forms sourced from large data sets for efficiency reasons - and we justify that because users can't be expected to take in thousands of records, so they should filter first.
Fair enough, this is reporting-type data, but even so - is there really a need to scroll through 5000+ rows?  Is that summarised data?  Is it summarised enough?
Can it not be prefiltered in some way?  (Perhaps before the aggregation in the view if need be... afterwards may be too late to reap any advantage from doing so.)

Anyway - back to my point.
Retrieving the TOP N rows is still quite an expensive hit (the rows still have to be selected and sorted).  Yes, you're pulling fewer rows across the network - but the server is still number crunching.
Two minutes can't be down to the sheer volume of data coming over the wire.

Have you timed a smaller subset of that view and found it to be radically faster?
Both by limiting its results based on aggregated and non-aggregated data?
The purpose of the testing is to measure how long the server takes to process the result - as opposed to just pulling fewer of those results over the network.
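
One quick way to run that kind of timing test, as a sketch (the TOP 100 slice is just an example):

    ' Sketch only: time how long the server takes to return a small slice
    ' of the view versus the full thing.
    Dim rstTest As ADODB.Recordset
    Dim sngStart As Single

    sngStart = Timer
    Set rstTest = New ADODB.Recordset
    rstTest.Open "SELECT TOP 100 * FROM vwProjectBuyerReport", _
                 CurrentProject.Connection, adOpenForwardOnly, adLockReadOnly
    Debug.Print "TOP 100 opened in " & Format(Timer - sngStart, "0.0") & " seconds"
    rstTest.Close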

Cheers.
PAGANED and TECHMOMMY are two different people.
We are working on the same problem together, and decided to ask this question on EE.
Curiosity subsides.
(On that issue anyway.)
Well, we finally solved the problem. The code that follows illustrates how we did it.

Routine for basic select:

Sub SelectStatement()

    ' strSELECT is shared (module-level) with the paging routines below
    strSELECT = "SELECT TOP 25 ProjectID, EnteredOn, ExpiresOn FROM vwBuyerWorksheet "

End Sub



Routine to generate WHERE clause:

Sub SelectStatementWhere(strOrigin As String)

'=======Create Where / Order By=============
    Select Case strOrigin
        Case "LOAD", "NEXT"
            ' Everything after the last key currently shown
            ' (on LOAD, lngProjectIDLast is still 0, so this returns the lowest keys);
            ' the ORDER BY makes TOP 25 pick the 25 keys immediately following it
            strWHERE = " WHERE ProjectID > " & lngProjectIDLast & " ORDER BY ProjectID"
        Case "PREV"
            ' Everything before the first key currently shown, sorted descending
            ' so TOP 25 picks the 25 keys immediately preceding it
            strWHERE = " WHERE ProjectID < " & lngProjectIDFirst & " ORDER BY ProjectID DESC"
    End Select

End Sub




When the user selects NEXT PAGE:

Private Sub cmdNextPage_Click()

    ' Rebuild the recordset for the next page
    rst.Close
    Set rst = Nothing
    Set rst = New ADODB.Recordset
    rst.ActiveConnection = CurrentProject.Connection
    rst.CursorType = adOpenDynamic
    rst.LockType = adLockOptimistic
    rst.CursorLocation = adUseClient

    ' Builds "WHERE ProjectID > lngProjectIDLast ..."
    Call SelectStatementWhere("NEXT")
    rst.Open strSELECT & strWHERE

    ' Count the rows returned for this page
    Dim intRecordCount As Integer
    Do While Not rst.EOF
        intRecordCount = intRecordCount + 1
        rst.MoveNext
    Loop

    ' Only advance if a full page came back; remember its first and last keys
    If intRecordCount >= 25 Then
        rst.MoveFirst
        lngProjectIDFirst = rst("ProjectID")
        rst.MoveLast
        lngProjectIDLast = rst("ProjectID")
        rst.MoveFirst
        Set Me.fsubBuyerWorksheet.Form.Recordset = rst
    End If

End Sub



When the user clicks the PREVIOUS PAGE button, it works basically the same as NEXT PAGE (slightly inverted on first and last), and we pass "PREV" to SelectStatementWhere.
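
For completeness, here is a sketch of what that previous-page handler might look like, reconstructed from the description above rather than copied from the thread; it mirrors cmdNextPage_Click but, for brevity, just checks for a non-empty result instead of counting a full page:

Private Sub cmdPrevPage_Click()

    ' Hypothetical reconstruction - names follow the NEXT handler above
    rst.Close
    Set rst = Nothing
    Set rst = New ADODB.Recordset
    rst.ActiveConnection = CurrentProject.Connection
    rst.CursorType = adOpenDynamic
    rst.LockType = adLockOptimistic
    rst.CursorLocation = adUseClient

    ' Builds "WHERE ProjectID < lngProjectIDFirst ORDER BY ProjectID DESC"
    Call SelectStatementWhere("PREV")
    rst.Open strSELECT & strWHERE

    If Not (rst.BOF And rst.EOF) Then
        ' The rows come back in descending order, so first/last are inverted:
        ' the first row holds the highest key of the page, the last row the lowest
        rst.MoveFirst
        lngProjectIDLast = rst("ProjectID")
        rst.MoveLast
        lngProjectIDFirst = rst("ProjectID")
        rst.MoveFirst
        ' Optional: rst.Sort = "ProjectID" to display the page ascending (works with adUseClient)
        Set Me.fsubBuyerWorksheet.Form.Recordset = rst
    End If

End Sub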

It all works and the performance is excellent.
This suggestion pointed us in the right direction so that we were able to solve the problem.

Thanks.