Solved

What would be the best approach?

Posted on 2014-09-26
5
260 Views
Last Modified: 2014-10-06
I've got a process that will potentially result in a table with over 91,000,000 rows.

The user will input a range of latitude and longitude values to download a csv file from that table, but here's my concern:

With that much data, I feel like it would be better to break up that one table and instead have 10 different tables, each one housing data specific to a particular state. That way, when those queries run, the system has less data to sort through, so the app should respond faster.

Is my concern justified or should I feel comfortable in setting things up in one mammoth table?
Question by:brucegust
5 Comments
 
LVL 11

Expert Comment

by:Chris Gralike
ID: 40346204
Hi Brucegust,

Did you work out a data model or are you discovering the best approach while coding?
 
LVL 58

Assisted Solution

by:Gary
Gary earned 300 total points
ID: 40346214
If you were using an indexed spatial POINT column, the search should be pretty quick.
http://dev.mysql.com/doc/refman/5.1/en/gis-property-functions.html

If you separate them by state, how would you know which state table to search?

I'll see if I can find a tutorial on it, as it takes a little setting up. What MySQL version are you on?

I use this method on a few million records and it's very quick.
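For reference, the setup looks roughly like this. This is only a sketch, and the table and column names are assumed, not taken from the question; note that in MySQL 5.x a SPATIAL index requires the MyISAM storage engine.

```sql
-- Sketch of an indexed spatial column (names are illustrative).
-- MySQL 5.x requires MyISAM for SPATIAL indexes.
CREATE TABLE coords (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  pt POINT NOT NULL,
  SPATIAL INDEX (pt)
) ENGINE=MyISAM;

-- Store each latitude/longitude pair as a POINT:
INSERT INTO coords (pt)
VALUES (GeomFromText('POINT(-86.78 36.16)'));

-- Search the user's lat/long range as a bounding box;
-- MBRContains can use the spatial index:
SELECT id, AsText(pt)
FROM coords
WHERE MBRContains(
  GeomFromText('POLYGON((-87 36, -86 36, -86 37, -87 37, -87 36))'),
  pt
);
```

The bounding-box polygon is built from the user's input range, so one indexed query covers any region without needing per-state tables.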
 
LVL 58

Accepted Solution

by:Gary
Gary earned 300 total points
ID: 40346272
 
LVL 110

Expert Comment

by:Ray Paseur
ID: 40346283
Please post the CREATE TABLE statement and some of your test data. It only needs to be about 4 rows, but we need to see the complete row, with all of the columns filled. Armed with that (it's called the SSCCE: a Short, Self-Contained, Correct Example) we should be able to give you some pretty solid ideas.
 
LVL 110

Assisted Solution

by:Ray Paseur
Ray Paseur earned 200 total points
ID: 40356734
Now that I've looked at your data a little bit, I think the answer is "break up the data into several tables."  By my computations in your related questions, it looks like you may get 160,000,000+ rows.  Of course, your query arrival rates and response times will be important considerations (did you ever think you would be using the second derivative again?!), and at some level you may want a DBA to get hands-on with the application and to help you with the design.
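One way to get that effect without maintaining ten separate tables is MySQL's native partitioning. The sketch below is hypothetical (table, column, and partition names are assumed, not from the question), and note that partitioned tables in MySQL cannot carry SPATIAL indexes, so it relies on a plain composite index instead:

```sql
-- Hypothetical sketch: one logical table, LIST-partitioned by a
-- state code, so queries filtering on state_id only touch one
-- partition instead of scanning all 160M+ rows.
CREATE TABLE coords (
  state_id  TINYINT UNSIGNED NOT NULL,
  latitude  DECIMAL(9,6) NOT NULL,
  longitude DECIMAL(9,6) NOT NULL,
  KEY (latitude, longitude)
)
PARTITION BY LIST (state_id) (
  PARTITION p_tn VALUES IN (1),
  PARTITION p_ky VALUES IN (2),
  PARTITION p_ga VALUES IN (3)
);
```

The application still queries a single table, and the optimizer prunes partitions automatically when the WHERE clause includes state_id, which sidesteps Gary's question of knowing which state table to search.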
