  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 348

DataTable - What is the fastest way to process data from a DataTable?

I am retrieving approximately 5000 records from a MySQL database.

I am doing this hundreds of times per second.

I need to process the data in my dataset VERY quickly.
- "look at each of the values"
- "do some calculations"
- "store a few bits of data elsewhere and then I'm done with the datatable"


I'm familiar with recordsets. I would use the Recordset.GetRows() command to throw the data into a multi-dimensional array.

Do I need to do something similar with DataTables?

Thank you!
oxygen_728
Asked:
7 Solutions
 
NetworkArchitekCommented:
Well, it depends on what you need to do with the rest of the data. The fastest method I am aware of is to use a DataReader directly on the connection and skip the DataSet entirely. As you may know, the DataReader is very, very fast. If your DataReader is called "dr", the loop looks like:

While dr.Read()
    ' Look at values
    ' Do calculations
    ' Store results
End While

If the DataReader alone won't fit your requirements, or if you need more information on using it, let us know. I don't want to go through the whole thing if you already know it anyway. =)
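Fleshing that loop out a little: here is a minimal end-to-end sketch, assuming the MySQL Connector/Net provider (MySqlConnection, MySqlCommand, and MySqlDataReader come from MySql.Data.MySqlClient; the connection string and query are placeholders, not details from this thread):

Imports MySql.Data.MySqlClient

Dim conn As New MySqlConnection("server=localhost;database=test;uid=me;pwd=secret")
conn.Open()
Dim cmd As New MySqlCommand("SELECT field1, field2 FROM myTable", conn)
Dim dr As MySqlDataReader = cmd.ExecuteReader()
While dr.Read()
    ' Look at values; typed getters by ordinal (GetDouble, GetInt32) avoid boxing
    Dim v As Double = dr.GetDouble(0)
    ' Do calcs, store the few results you need elsewhere
End While
dr.Close()
conn.Close()

The reader is forward-only and read-only, which is exactly why it is fast.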
 
oxygen_728Author Commented:
Yes, please; I'm not familiar with the DataReader.

Basically, I would prefer the data to be in an indexable array, or something just as fast.

I will look into the DataReader as soon as I find time!
 
grayeCommented:
I think there is a fundamental flaw in this whole idea....

You're wanting to read 5000 records in a database *hundreds of times* per second !?!?!   WOW!

Instead of concentrating on how to solve this particular problem, tell us more about what you're trying to do... perhaps we can come up with some alternate suggestions on how to reduce the quantity/frequency of the database requirement.

One thing that pops into my head is to build a trigger at the database that detects when a certain condition exists, then pulls a subset of the data (perhaps by timestamp, etc.) and passes only the rows that need to be passed to your application.
 
jordi169Commented:
Graye is right. Wouldn't it be easier for you to create a stored procedure that does all the calculations for you, and just call the stored procedure? This would reduce a lot of processing done by the client, save you a lot of code, and most of the work would be done by the SQL server.
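For what it's worth, a sketch of calling such a stored procedure from VB.Net, assuming a procedure named CalcResults already exists on the server (the procedure name and parameter are made up for illustration, and "conn" is an open MySqlConnection):

Dim cmd As New MySqlCommand("CalcResults", conn)
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.AddWithValue("?startId", 1000)
Dim dr As MySqlDataReader = cmd.ExecuteReader()
' Only the pre-aggregated results cross the wire

This way the 5000-row scan happens inside the server, and the client only reads back the calculation results.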
 
NetworkArchitekCommented:
eek ... Yes, graye is right, we need more info ... I totally skipped the part about "hundreds of times per second." Reading 5,000 records in itself is not necessarily that expensive, but wow, that is a lot per second ... I think we can optimize it better ... =)
 
oxygen_728Author Commented:
This isn't a business application. It's a research application in which I really do need to retrieve this many unique records hundreds of times per second.

You're just going to have to trust that my application logic is designed correctly and efficiently. I know it may sound weird, but there are some situations in which you would want to query so often.

However, I've been looking into the datareader and I think that's going to be the way to go.

-----

here's a question for you guys:

Is it faster to populate a multi-dimensional array from a datareader, or is it faster for a dataset to be filled?

I have zero interest in updates, so there is a lot of functionality included in the dataset which I simply do not need.


 
grayeCommented:
I can't imagine any commercially available database system, on any hardware platform, at any cost that could handle that kind of load.

But to answer your question...

VB.Net is a poor development platform for such a project... but hey, nothing makes sense to me so far, so I'm gonna assume that you're gonna stay with VB.Net :)

The DataReader is by far faster than anything the DataAdapter can match. If your database design will allow it, you can tune the DataReader for sequential read access, which is as fast as VB.Net will go.
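Concretely, sequential access is requested through the CommandBehavior flag passed to ExecuteReader. A sketch (note the constraint this flag imposes: you must read columns strictly in ordinal order, never backwards):

Dim dr As MySqlDataReader = cmd.ExecuteReader(CommandBehavior.SequentialAccess)
While dr.Read()
    Dim a As Integer = dr.GetInt32(0)   ' column 0 first...
    Dim b As Double = dr.GetDouble(1)   ' ...then column 1, in order
End While

CommandBehavior lives in System.Data, so that namespace needs to be imported.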
 
NetworkArchitekCommented:
Well, for this case I probably would not go with filling a DataSet; actually, DataSets *are* filled by a DataReader internally anyway, as far as I am aware, so I think you would just add more overhead. Really it depends, but if you are only interested in consuming data and not updating and so forth, arrays are one possible route. I certainly wouldn't use an ArrayList, though, as you will have some overhead with that.
 
oxygen_728Author Commented:
Do you know if I can use the datareader to "instantly" put all 5000 records returned by my query into an array?

Currently, I'm processing each record... one by one...
 
NetworkArchitekCommented:
Sure. I mean, there is no way around looking at each record, but you can do:

Dim i As Integer = 0
While dr.Read()
    someStringArray(i) = CType(dr("SomeField"), String)
    i = i + 1
End While


You can change the array and the dr("whatever") to suit your needs; basically, "SomeField" is the actual field name in your table. You can do this in different ways, but it all boils down to about the same thing as that loop. It should be pretty fast if you only want to put it in the array. I have written over 17,000 records to XML files using the DataReader and it was very, very fast even with other overhead. Hope this helps.
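If you want something closer to Recordset.GetRows() (every column of every row in one indexable structure), one possible sketch copies a whole row at a time with GetValues; the generic List(Of ...) assumes .NET 2.0 and Imports System.Collections.Generic:

Dim rows As New List(Of Object())
While dr.Read()
    Dim values(dr.FieldCount - 1) As Object
    dr.GetValues(values)    ' Copies all columns of the current row
    rows.Add(values)
End While
' rows(i)(j) is column j of record i, i.e. GetRows output transposed

Preallocating the row buffer per iteration keeps each stored row independent; reusing a single buffer would make every list entry point at the same array.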
 
oxygen_728Author Commented:
Ya that's how I'm reading all of 'em.

It is very fast... but I read data faster from a Microsoft Access database... I guess I was expecting more.

 
oxygen_728Author Commented:
NetworkArchitek, I'm going to accept yours as the answer, with an assist from graye, but I'm going to leave this open for another day just in case somebody comes by with an idea.

Also, you probably would be interested in this question:

http://www.experts-exchange.com/Databases/Mysql/Q_21400860.html

It is quite similar and probably more programming related than database related... though I could be wrong!
