Solved

What would be the best approach?

Posted on 2014-09-26
Medium Priority
265 Views
Last Modified: 2014-10-06
I've got a process that will potentially result in a table with over 91,000,000 rows.

The user will input a range of latitude and longitude values to download a CSV file from that table, but here's my concern:

With that much data, I feel like it's going to be better to break up that one table and instead have 10 different tables, each one housing data specific to a particular state. That way, when those queries run, I'm asking my system to sort through less data, and the app works quicker.

Is my concern justified, or should I feel comfortable setting things up in one mammoth table?
Question by:brucegust
5 Comments
 
LVL 11

Expert Comment

by:Chris Gralike
ID: 40346204
Hi Brucegust,

Did you work out a data model or are you discovering the best approach while coding?
 
LVL 58

Assisted Solution

by:Gary
Gary earned 1200 total points
ID: 40346214
If you were using an indexed spatial points column, the search should be pretty quick:
http://dev.mysql.com/doc/refman/5.1/en/gis-property-functions.html

And if you separate them, how would you know which state table to search?

I'll see if I can find a tutorial on it, as it takes a little setting up. What MySQL version are you on?

I use this method on a few million records and it's very quick.
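
For reference, a minimal sketch of the kind of setup Gary is describing. The table and column names are assumptions, not from the thread, and note that in MySQL 5.1-5.6 a SPATIAL index requires the MyISAM engine (InnoDB only gained spatial indexes in 5.7):

-- A point-per-row table with a spatial index on the POINT column.
CREATE TABLE coords (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    pt POINT NOT NULL,
    SPATIAL INDEX (pt)
) ENGINE = MyISAM;

-- Store a latitude/longitude pair as a POINT value.
INSERT INTO coords (pt)
VALUES (GeomFromText('POINT(35.2271 -80.8431)'));

-- Bounding-box search over a user-supplied lat/long range.
-- MBRContains can use the spatial index, which keeps this fast
-- even on tables with tens of millions of rows.
SELECT id, AsText(pt)
FROM coords
WHERE MBRContains(
    GeomFromText('POLYGON((35 -81, 36 -81, 36 -80, 35 -80, 35 -81))'),
    pt
);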
 
LVL 58

Accepted Solution

by:Gary
Gary earned 1200 total points
ID: 40346272
 
LVL 111

Expert Comment

by:Ray Paseur
ID: 40346283
Please post the CREATE TABLE statement and some of your test data. It only needs to be about 4 rows, but we need to see complete rows, with all of the columns filled. Armed with that (it's called an SSCCE: a short, self-contained, correct example), we should be able to give you some pretty solid ideas.
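
For instance, a post along these lines would be enough; the schema and values below are purely illustrative assumptions, not brucegust's actual table:

-- Illustrative only: substitute the real table definition.
CREATE TABLE locations (
    id  INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    lat DECIMAL(9,6) NOT NULL,  -- latitude in decimal degrees
    lng DECIMAL(9,6) NOT NULL   -- longitude in decimal degrees
);

-- A few complete sample rows, every column filled in.
INSERT INTO locations (lat, lng) VALUES
    (33.748995, -84.387982),
    (35.227087, -80.843127),
    (30.332184, -81.655651),
    (25.761680, -80.191790);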
 
LVL 111

Assisted Solution

by:Ray Paseur
Ray Paseur earned 800 total points
ID: 40356734
Now that I've looked at your data a little bit, I think the answer is "break up the data into several tables." By my computations in your related questions, it looks like you may get 160,000,000+ rows. Of course, your query arrival rates and response times will be important considerations (did you ever think you would be using the second derivative again?!), and at some level you may want a DBA to get hands-on with the application and help you with the design.
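
If splitting does turn out to be the right call, one possible sketch that avoids juggling separate state tables in application code is MySQL's native partitioning, available since 5.1. The column names and the state_id mapping here are assumptions for illustration:

-- LIST partitioning keeps one logical table but stores each
-- state's rows in its own partition. The partitioning column
-- must be part of every unique key, hence the composite key.
CREATE TABLE readings (
    id       INT UNSIGNED NOT NULL,
    state_id TINYINT UNSIGNED NOT NULL,
    lat      DECIMAL(9,6) NOT NULL,
    lng      DECIMAL(9,6) NOT NULL,
    PRIMARY KEY (id, state_id)
)
PARTITION BY LIST (state_id) (
    PARTITION p_ga VALUES IN (1),
    PARTITION p_fl VALUES IN (2),
    PARTITION p_nc VALUES IN (3)
    -- ...one partition per state
);

-- A query that filters on state_id is pruned to one partition,
-- so it never scans the other states' rows.
SELECT lat, lng
FROM readings
WHERE state_id = 2
  AND lat BETWEEN 27.0 AND 29.0;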
