C# DataReader vs DataSet


I am writing some code in C# that retrieves data from a Microsoft Access database. The question I have is: when should I use a DataReader versus a DataSet? As I understand it, a DataReader is quicker and forward-only, whereas a DataSet gives you more flexibility with your data.

When retrieving the data I run a single SQL query. It will typically return between 30,000 and 90,000 rows with 4 to 10 fields (depending on the table queried). Would this be too much, in memory terms, for a DataSet to work properly?


Naman Goel (Principal Software Engineer) commented:
30,000 to 90,000 rows will fit comfortably in a DataSet, but if your data will keep growing in future, use a DataReader.
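To make the two options concrete, here is a minimal sketch of each against an Access database via OleDb. The connection string, table name (`Readings`), and file path are hypothetical placeholders; adjust them to your own database.

```csharp
using System;
using System.Data;
using System.Data.OleDb;

// Hypothetical connection string and query -- substitute your own.
string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb";
string sql = "SELECT * FROM Readings";

// Option 1: DataSet -- the whole result set is buffered in memory
// and remains usable after the connection is closed.
DataSet ds = new DataSet();
using (OleDbDataAdapter adapter = new OleDbDataAdapter(sql, connStr))
{
    adapter.Fill(ds, "Readings");
}
Console.WriteLine(ds.Tables["Readings"].Rows.Count);

// Option 2: DataReader -- forward-only, one row in memory at a time,
// and the connection must stay open while you read.
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbCommand cmd = new OleDbCommand(sql, conn))
{
    conn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Process the current row here.
        }
    }
}
```

The practical trade-off: the DataSet costs memory proportional to the result set but lets you re-read, sort, and filter offline; the DataReader costs almost nothing in memory but only gives you each row once.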


My suggestion would be to create an object with properties and then use a DataReader: iterate through the data, creating a new object for each row and adding it to a generic list of that type. See below (note: check the syntax, I did this from memory):

List<TestObject> myTestObjectList = new List<TestObject>();
while (reader.Read())
{
    TestObject myTestObject = new TestObject();
    myTestObject.Property1 = reader["WebUserID"].ToString();
    myTestObject.Property2 = reader["TestColumn2"].ToString();
    myTestObjectList.Add(myTestObject);
}

I am not a fan of DataSets: they are designed to act somewhat like mini in-memory databases, which by nature makes them less efficient. That said, be careful not to bloat your objects too much either, because they can also very easily become inefficient.
mcs26 (Author) commented:
Thanks for the replies. But when it says,

"Use the DataReader in your application if you:

•Are processing a set of results too large to fit into memory."

How do you know, or what's the rule of thumb, for deciding that a result set is "too large to fit into memory"?

Naman Goel (Principal Software Engineer) commented:
If you are expecting, say, 1 million records, it is not advisable to store them in a DataSet for further processing, since the whole set will be held in memory for the duration. A DataReader, on the other hand, reads one row at a time and processes it, which saves memory as well as processing time.
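As a rough back-of-the-envelope check (my own estimate, not from the thread): if each field averages on the order of 50 bytes in memory, 90,000 rows of 10 fields is only around 45 MB, which is why a DataSet is fine at that scale but becomes questionable in the millions of rows. When you do need to stream, the one-row-at-a-time pattern looks like this. The connection string, table, and `ProcessRow` handler are hypothetical placeholders:

```csharp
using System;
using System.Data.OleDb;

// Hypothetical Access connection string -- substitute your own.
string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb";

using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbCommand cmd = new OleDbCommand(
    "SELECT Field1, Field2 FROM Readings", conn))
{
    conn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Only the current row is held in memory; process it and move on.
            // ProcessRow is a hypothetical handler you would write yourself.
            ProcessRow(reader.GetString(0), reader.GetString(1));
        }
    }
}
```

Because the reader never buffers more than one row, memory use stays flat no matter how many rows the query returns; the cost is that the connection must remain open for the whole loop.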