I have a process that will potentially result in a table with over 91,000,000 rows.
The user will input a range of latitude and longitude values to download a CSV file of the matching rows from that table, and here's my concern:
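To make the question concrete, here is a minimal sketch of the kind of bounding-box query I mean, using an in-memory SQLite database. The table name (`readings`), column names, and index are all hypothetical placeholders, not my actual schema:

```python
import csv
import io
import sqlite3

# Hypothetical schema -- table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (lat REAL, lng REAL, value REAL)")
# A composite index on (lat, lng) lets the range query avoid a full table scan.
conn.execute("CREATE INDEX idx_lat_lng ON readings (lat, lng)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [(40.7, -74.0, 1.0), (34.0, -118.2, 2.0), (41.9, -87.6, 3.0)],
)

def export_csv(conn, lat_min, lat_max, lng_min, lng_max):
    """Write all rows inside the bounding box to a CSV string."""
    rows = conn.execute(
        "SELECT lat, lng, value FROM readings "
        "WHERE lat BETWEEN ? AND ? AND lng BETWEEN ? AND ?",
        (lat_min, lat_max, lng_min, lng_max),
    )
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["lat", "lng", "value"])
    writer.writerows(rows)
    return buf.getvalue()

print(export_csv(conn, 40.0, 42.0, -90.0, -70.0))
```

The real table would of course hold tens of millions of rows rather than three; the question is whether an indexed query like this scales, or whether the data needs to be split first.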
With that much data, I feel it would be better to break that one table up into, say, 10 separate tables, each housing the data for a particular state. Then each query only has to sort through a fraction of the data, so the app should respond faster.
Is my concern justified, or should I feel comfortable setting things up as one mammoth table?