Solved

What would be the best approach?

Posted on 2014-09-26
5
249 Views
Last Modified: 2014-10-06
I have a process that will potentially result in a table with over 91,000,000 rows.

The user will input a range of latitude and longitude values to download a csv file from that table, but here's my concern:

With that much data, I feel it would be better to break that one table up into 10 different tables, each housing data specific to a particular state. That way, when those queries run, my system has less data to sort through and the app works quicker.

Is my concern justified, or should I feel comfortable setting things up in one mammoth table?
0
Comment
Question by:brucegust
  • 2
  • 2
5 Comments
 
LVL 10

Expert Comment

by:Chris_Gralike
ID: 40346204
Hi Brucegust,

Did you work out a data model or are you discovering the best approach while coding?
0
 
LVL 58

Assisted Solution

by:Gary
Gary earned 300 total points
ID: 40346214
If you were using an indexed spatial points column, the search should be pretty quick:
http://dev.mysql.com/doc/refman/5.1/en/gis-property-functions.html

If you separate them into state tables, how would you know which table to search?

I'll see if I can find a tutorial on it, as it takes a little setting up. What MySQL version are you on?

I use this method on a few million records and it's very quick.
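To sketch what that looks like (this is a minimal example, not your actual schema - the table and column names are placeholders):

```sql
-- Sketch only: assumes MyISAM, which is required for SPATIAL INDEX
-- in MySQL 5.1, and a hypothetical `coords` table.
CREATE TABLE coords (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT,
  pt POINT NOT NULL,            -- stores (longitude, latitude)
  PRIMARY KEY (id),
  SPATIAL INDEX (pt)
) ENGINE=MyISAM;

-- Find every point inside a user-supplied lat/lon bounding box.
-- MBRContains can use the spatial index, so it stays fast at scale.
SELECT id, AsText(pt)
FROM coords
WHERE MBRContains(
  GeomFromText('POLYGON((-90 30, -80 30, -80 40, -90 40, -90 30))'),
  pt
);
```

You'd build the POLYGON string from the user's min/max latitude and longitude values before running the query.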
0
 
LVL 58

Accepted Solution

by:Gary
Gary earned 300 total points
ID: 40346272
0
 
LVL 109

Expert Comment

by:Ray Paseur
ID: 40346283
Please post the CREATE TABLE statement and some of your test data.  It only needs to be about 4 rows, but we need to see the complete row, with all of the columns filled.  Armed with that (it's called the SSCCE) we should be able to give you some pretty solid ideas.
0
 
LVL 109

Assisted Solution

by:Ray Paseur
Ray Paseur earned 200 total points
ID: 40356734
Now that I've looked at your data a little bit, I think the answer is "break up the data into several tables."  By my computations in your related questions it looks like you may get 160,000,000+ rows.  Of course, your query arrival rates and response times will be important considerations (did you ever think you would be using the second derivative again?!) and at some level you may want a DBA to get hands-on with the application and to help you with the design.
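One way to "break up the data" without managing N separate tables is MySQL's native partitioning (available since 5.1): the optimizer prunes partitions for you and the application still queries one logical table. A hedged sketch - the `state_id` column, values, and partition names are hypothetical, not from your schema:

```sql
-- Sketch only: LIST partitioning on a hypothetical state_id column.
-- MySQL requires the partitioning column to be part of every unique
-- key, hence the composite primary key.
CREATE TABLE coords (
  id       INT UNSIGNED NOT NULL AUTO_INCREMENT,
  state_id TINYINT UNSIGNED NOT NULL,
  lat      DECIMAL(9,6) NOT NULL,
  lng      DECIMAL(9,6) NOT NULL,
  PRIMARY KEY (id, state_id)
)
PARTITION BY LIST (state_id) (
  PARTITION p_tn VALUES IN (1),
  PARTITION p_ga VALUES IN (2),
  PARTITION p_al VALUES IN (3)
);

-- A query that names the state touches only one partition:
SELECT lat, lng FROM coords
WHERE state_id = 2
  AND lat BETWEEN 30.0 AND 35.0
  AND lng BETWEEN -85.0 AND -80.0;
```

Whether this beats one indexed table depends on the query arrival rates and response times mentioned above, which is where a hands-on DBA earns their keep.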
0
