cdfllc

asked on

Datagrids vs multiple selects

Hello, I have a situation where I'm trying to build some reporting functionality.

I have a table of accounts; let's say there are 1,000 accounts.
I have another table of payments; let's say there are 500,000 payments (multiple payments per account).

I have to obtain the payment information for each account, and manipulate it in some way.

I imagine that is why they created DataTables - you just do a "big" select from the database, and keep that table in memory, and then do selects against that in-memory table, right?

Can anyone show me an example of how to do that?

So basically, in this example, I just want to get back the matching payment records for an account. I have an account id in the payment table.

thanks!
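(For reference, the pattern I'm describing looks roughly like this. This is just a sketch: in the real app the table would be filled by one big SELECT through a `SqlDataAdapter`; here a few sample rows stand in for the database, and the table/column names are made up.)

```csharp
using System;
using System.Data;

static class PaymentDemo
{
    // Stand-in for the 500,000-row table that a single SELECT
    // (via SqlDataAdapter.Fill) would load into memory.
    public static DataTable BuildSample()
    {
        var payments = new DataTable("Payments");
        payments.Columns.Add("ACCOUNT_ID", typeof(int));
        payments.Columns.Add("AMOUNT", typeof(decimal));
        payments.Rows.Add(1, 100m);
        payments.Rows.Add(2, 50m);
        payments.Rows.Add(1, 25m);
        return payments;
    }

    // The in-memory "select": filters rows without another
    // round trip to the database.
    public static DataRow[] ForAccount(DataTable payments, int accountId)
    {
        return payments.Select("ACCOUNT_ID = " + accountId);
    }
}
```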
SOLUTION
Bob Learned
MyersA

Hi cdfllc,

With VS 2002 (.NET 1.0) I tried to do something similar (store over 500,000 records in a DataTable), and the performance was horrible. All those records are held in memory; in my case they took up so much physical memory that I couldn't even switch between applications (I had 1 GB of RAM). And if you bind that DataTable/DataSet to a DataGrid, you'll see an even bigger performance hit.

What I ended up doing was implementing paging. I'd display the first X records (50-100 of them) and have 'Previous' and 'Next' buttons to move through the rest. I was doing this with Access, so it was a pain in the ###, but SQL Server 2005 now has a new feature that lets you implement paging easily.
Unless things have improved in VS .NET 2003 or VS 2005, I wouldn't store that much data in a DataTable.
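(The SQL Server 2005 feature referred to above is presumably `ROW_NUMBER()`. A common sketch of that paging query, held here as a C# constant with a small helper for the row bounds; the PAYMENTS table and PAYMENT_ID column are hypothetical names:)

```csharp
using System;

static class Paging
{
    // Typical SQL Server 2005 paging shape: number the rows once,
    // then pull back only the slice for the requested page.
    public const string PageQuery = @"
        SELECT *
        FROM (SELECT p.*,
                     ROW_NUMBER() OVER (ORDER BY PAYMENT_ID) AS RowNum
              FROM PAYMENTS p) AS numbered
        WHERE RowNum BETWEEN @first AND @last;";

    // 1-based row bounds for a 1-based page number.
    public static int FirstRow(int page, int pageSize) => (page - 1) * pageSize + 1;
    public static int LastRow(int page, int pageSize) => page * pageSize;
}
```

Bind `FirstRow`/`LastRow` to the `@first`/`@last` parameters for each 'Previous'/'Next' click, so only one page of rows ever crosses the wire.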

Hope this helps,
Vaughn

ASKER

Well, we might not even mind having a "dedicated" machine to handle this report.
I am just concerned that I am going to slow down everyone else using another application that hits this same database.
I don't want to issue 1000 select statements, when I can just get everything back in a single one - and then work with that result set.
I mean, if it takes this dedicated machine "5 minutes" to perform the math on the results, we don't really care - as long as it's not slowing down the database or the other application users.

As far as you know, are there any limits to the amount of data you can store in a DataTable?
I guess we could possibly bump up the RAM in this dedicated machine if that would help.
ASKER CERTIFIED SOLUTION

ASKER

This is the answer I was looking for - I realized I was searching for the wrong thing.


// Filter the 500,000-record payment table down to one account's rows.
DataTable oTable = PaymentDataSet.Tables[0];
DataRow[] oRows = oTable.Select("ACCOUNT_ID = " + propertyId, "ACCOUNT_ID ASC");

foreach (DataRow oRow in oRows)
{
    string cashBalanceStr = oRow["ACCOUNT_BALANCE"].ToString().Trim();
    double cashBalanceDbl = System.Convert.ToDouble(cashBalanceStr);
    // do the math...
}
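(One note on that pattern: each `DataTable.Select()` call re-scans the table, so 1,000 per-account calls over 500,000 rows can add up. A `DataView` sorted on the account column builds its index once, and `FindRows()` then does an indexed lookup per account. A sketch, using the same hypothetical ACCOUNT_ID column; the table itself is assumed to be filled elsewhere:)

```csharp
using System;
using System.Data;

static class FastLookup
{
    // Build the sorted view once; this is the step that creates the index.
    public static DataView IndexedView(DataTable payments)
    {
        return new DataView(payments, null, "ACCOUNT_ID ASC",
                            DataViewRowState.CurrentRows);
    }

    // Indexed lookup on the view's sort key -- no full-table scan per call.
    public static DataRowView[] PaymentsFor(DataView view, int accountId)
    {
        return view.FindRows(accountId);
    }
}
```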

But you guys gave me some good information, so I'll split the points!