Bruce Gust

asked on

What would be the best approach?

I've got a process that will potentially result in a table with over 91,000,000 rows.

The user will input a range of latitude and longitude values to download a CSV file generated from that table, but here's my concern:

With that much data, I feel like it would be better to break that one table up into 10 different tables, each one housing data specific to a particular state. That way, when those queries run, the system has less data to sort through and the app responds faster.

Is my concern justified, or should I feel comfortable setting everything up in one mammoth table?
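
To illustrate (the table and column names here are made up, not my actual schema), the query behind the CSV download is essentially a bounding-box filter along these lines:

SELECT latitude, longitude, reading
FROM coordinate_data
WHERE latitude  BETWEEN 33.0 AND 35.0   -- user-supplied latitude range
  AND longitude BETWEEN -85.0 AND -82.0; -- user-supplied longitude range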
Member_2_3684445

Hi Brucegust,

Did you work out a data model or are you discovering the best approach while coding?
SOLUTION
Gary
ASKER CERTIFIED SOLUTION
Please post the CREATE TABLE statement and some of your test data.  It only needs to be about 4 rows, but we need to see the complete row, with all of the columns filled.  Armed with that (it's called the SSCCE) we should be able to give you some pretty solid ideas.
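
Something like this is all we need (MySQL syntax assumed here, and the table and column names are only an illustration since we haven't seen your schema):

-- hypothetical table shape for a coordinate/reading dataset
CREATE TABLE coordinate_data (
  id        INT UNSIGNED NOT NULL AUTO_INCREMENT,
  state     CHAR(2)      NOT NULL,
  latitude  DECIMAL(9,6) NOT NULL,
  longitude DECIMAL(9,6) NOT NULL,
  reading   DECIMAL(10,2),
  PRIMARY KEY (id)
);

-- four representative rows, every column filled
INSERT INTO coordinate_data (state, latitude, longitude, reading) VALUES
  ('GA', 33.749000, -84.388000, 12.50),
  ('GA', 34.052200, -84.290900,  8.75),
  ('TN', 35.045600, -85.309700, 22.10),
  ('NC', 35.227100, -80.843100,  5.00);

With something like that in front of us we can test against your actual row shape and give you answers based on numbers rather than guesses.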