DataSet vs DataReader, and is there a way to make a DataReader skip to a position within a result set?

I have to create a page that browses through a very large MSSQL database.  The development environment is C# and ASP.NET.

There seem to be two choices: use a DataSet with paging, or use a DataReader.  All the documentation I can find seems to imply that all the records in a DataSet are held "in memory" on the server.  Using a DataSet for a large database seems likely to cause performance problems.

But if I use a DataReader, is there some way to get it to skip reading through the first X records in the result set from the query?  Otherwise, every time the user pages forward, I am going to have to read the whole database up to that point again.

Alternatively, is there an SQL method of asking for "the 100th through 150th rows that meet such and such a where clause"?

The goal is actually very simple.  It is just the list function in an add, change, delete and list application for a large database.  But I am at a loss as to the best way to code it for a real-world, large database.

Thank you.

Kevin
kevinmackenzie asked:
 
razo commented:
This is a query that will help you get what you want using a data reader:

   ' The last page may hold fewer than PageSize rows.
   If DataGrid1.VirtualItemCount - startIndex > DataGrid1.PageSize Then
       i = DataGrid1.PageSize
   Else
       i = DataGrid1.VirtualItemCount - startIndex
   End If

   ' Take the first (startIndex + PageSize) rows, keep the last i of them by
   ' reversing the sort, then flip the page back into ascending order.
   comm.CommandText = "SELECT * FROM (SELECT TOP " & i & " * FROM (SELECT TOP " & (startIndex + DataGrid1.PageSize) & " * FROM tablename ORDER BY field) DERIVEDTBL ORDER BY field DESC) DERIVEDTBL ORDER BY field"

The idea is to take the first 150 rows, order them in reverse, keep the top 50 of those (the page you want), and then put them back into the original order.
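
If it helps, roughly the same thing in C# with a SqlDataReader would look something like the sketch below.  The connection string, table name ("tablename") and sort column ("field") are just placeholders for whatever you actually have:

using System;
using System.Data.SqlClient;

class TopPagingExample
{
    static void Main()
    {
        int startIndex = 100;  // zero-based index of the first row on the page
        int pageSize = 50;     // rows per page

        // First (startIndex + pageSize) rows, reversed to keep the last pageSize,
        // then flipped back into ascending order.
        string sql =
            "SELECT * FROM (SELECT TOP " + pageSize + " * FROM " +
            "(SELECT TOP " + (startIndex + pageSize) + " * FROM tablename ORDER BY field) AS firstRows " +
            "ORDER BY field DESC) AS pageRows ORDER BY field";

        using (SqlConnection conn = new SqlConnection("Server=.;Database=mydb;Integrated Security=SSPI;"))
        using (SqlCommand comm = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (SqlDataReader reader = comm.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader[0]);  // bind or display the row here
                }
            }
        }
    }
}

Only the page itself ever comes back to the client, so the DataReader stays small no matter how big the table is.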
 
AerosSaga commented:
The DataReader is a forward-only stream of data and thus is very fast.  In the modern world, serving up 500 records from an MSSQL server on any reasonable hardware will hardly make a dent in your performance.  I do agree that using the TOP function of T-SQL is the way to go, and I applaud you for thinking about the overhead, but the fact is that horsepower is way cheaper than brainpower nowadays, and less of a consideration.

Regards,

Aeros
 
DotNetLover_Baan commented:
The DataReader has a Read() method that advances it to the next record.
http://msdn.microsoft.com/library/en-us/cpref/html/frlrfsystemdatasqlclientsqldatareaderclassreadtopic.asp

You can use a for loop to skip records.  But, as Aeros said, the reader is forward-only, so you can't go back to the rows you skipped.
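
For example, something like this would skip the first skipCount rows before reading the page you care about.  Just a sketch -- the table and column names are placeholders, and SQL Server still has to send every skipped row over the wire:

using System;
using System.Data.SqlClient;

class SkipWithReader
{
    static void ReadPage(string connectionString, int skipCount, int pageSize)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand comm = new SqlCommand("SELECT * FROM tablename ORDER BY field", conn))
        {
            conn.Open();
            using (SqlDataReader reader = comm.ExecuteReader())
            {
                // Throw away the rows in front of the page we want.
                for (int i = 0; i < skipCount && reader.Read(); i++)
                {
                    // just advancing the reader
                }

                // Read the page itself.
                for (int i = 0; i < pageSize && reader.Read(); i++)
                {
                    Console.WriteLine(reader[0]);  // bind or display the row here
                }
            }
        }
    }
}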
-Baan
 
kevinmackenzie (Author) commented:
Hi folks,

I think I need to update my email address in Experts Exchange, as I did not receive the usual notifications of these responses.  I will take a look at them this morning.

kevin
 
kevinmackenzie (Author) commented:
Razo,

What are the performance implications of the method you suggest for a really large table?  That is, if startIndex were 2 million, would this cause the server to create two temporary tables with 2 million + PageSize rows in them?  Or will the query optimizer figure it out and retrieve only the rows needed?

Aerosaga,

My problem is that I need to dive into a 4-million-record database and page through it a screen at a time, starting anywhere in it.  I also do not have any unique column value in each row, so I need a way to get the Xth through Yth rows.  Reading the whole 4M rows into a DataSet is not practical, even with today's servers.

Baan,

I think it would take forever to make 2 million Read() calls to get to the middle of the database, so that won't work.

Thanks, folks.  Razo, I look forward to your response; sorry for the delay.

Kevin
 
kevinmackenzie (Author) commented:
Razo,

I tried to use your syntax, but if you want the 100th through 150th rows from a table that has X rows, I think you need to know X and write the statement a little differently.  I don't know whether counting the total number of rows with COUNT is expensive in terms of performance or not...
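
For reference, I was planning to get X (the total row count) with something like this -- untested, and the connection string and table name are placeholders:

using System;
using System.Data.SqlClient;

class RowCountExample
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("Server=.;Database=mydb;Integrated Security=SSPI;"))
        using (SqlCommand comm = new SqlCommand("SELECT COUNT(*) FROM tablename", conn))
        {
            conn.Open();
            int totalRows = (int)comm.ExecuteScalar();  // X, the total number of rows
            Console.WriteLine(totalRows);
        }
    }
}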

Kevin
 
kevinmackenzie (Author) commented:
I have solved this problem.  I am awarding the points to razo because his answer was the most useful, even if it did not get me all the way to a solution.

The solution I found is at:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=27297

and is referred to as "Jeff's solution".  Here is the pseudo code:

-- for each column in your sort, you need a variable to hold
-- the "starting values". In our case, we need two:
declare @startingDate datetime;
declare @startingID int;

-- again, we want to return results from row @a to row @b:
declare @a int;
declare @b int;

set @a = 200 -- start at row 200
set @b = 250 -- end at row 250

-- get the starting date and starting ID to return results:
set rowcount @a
select @startingDate = DateCol, @startingID = ID
from yourtable
order by DateCol ASC,ID ASC

-- find out how many rows to return, and set the rowcount:
set @b = @b - @a
set rowcount @b

-- now return the results:
select * from yourtable
where
DateCol > @startingDate OR
(DateCol = @startingDate AND ID > @startingID)
order by DateCol ASC, ID ASC

-- clean up:
set rowcount 0
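
In case it helps anyone else, here is roughly how I am calling it from C#.  Treat it as a sketch: the connection string, table name and column names are placeholders for my real ones:

using System;
using System.Data;
using System.Data.SqlClient;

class PageByKeyExample
{
    static void Main()
    {
        int a = 200;  // skip the first 200 rows
        int b = 250;  // last row to return

        // The whole script goes over as one batch; @a and @rows arrive as parameters.
        const string sql = @"
            declare @startingDate datetime;
            declare @startingID int;

            -- position on row @a and remember its sort-key values
            set rowcount @a
            select @startingDate = DateCol, @startingID = ID
            from yourtable
            order by DateCol ASC, ID ASC

            -- return the next (@b - @a) rows after that position
            set rowcount @rows
            select * from yourtable
            where DateCol > @startingDate
               or (DateCol = @startingDate and ID > @startingID)
            order by DateCol ASC, ID ASC

            set rowcount 0";

        using (SqlConnection conn = new SqlConnection("Server=.;Database=mydb;Integrated Security=SSPI;"))
        using (SqlCommand comm = new SqlCommand(sql, conn))
        {
            comm.Parameters.Add("@a", SqlDbType.Int).Value = a;
            comm.Parameters.Add("@rows", SqlDbType.Int).Value = b - a;

            conn.Open();
            using (SqlDataReader reader = comm.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader[0]);  // rows a+1 through b, in order
                }
            }
        }
    }
}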