Solved

MySQL query returns far too many rows (2 million)!

Posted on 2014-02-05
Medium Priority
367 Views
Last Modified: 2014-02-05
Experts,

I have a query that should return at most 25 rows - the results for one location in January 2013.

However, the result set has 2.5 million rows or so!

Obviously I'm missing something, because it looks like every row from every joined table is being included in the results.

Here's the query.  Help!
sql.txt
Question by:Glen Gibb
5 Comments
 
LVL 4

Expert Comment

by:Shane Kahkola
ID: 39836328
Can you try SELECT DISTINCT?

If not, can you post some sample data that gets pulled from that query?
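For example, something along these lines (table and column names here are just placeholders, since the actual query is in sql.txt):

SELECT DISTINCT r.id, r.location_id, r.reading_date, r.amount
FROM readings r
JOIN locations l ON l.id = r.location_id
WHERE r.location_id = 1
  AND r.reading_date >= '2013-01-01'
  AND r.reading_date <  '2013-02-01';

If DISTINCT doesn't bring it down to the expected 25 rows, the duplicates are probably coming from one of the joins.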
 
LVL 1

Expert Comment

by:SarahDaisy8
ID: 39836353
This is probably way too simple a thought to be the solution, but I'll throw it out there anyway. What about GROUP BY?
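Roughly like this (placeholder names again, since I haven't seen the query): grouping by the columns you actually want collapses the duplicate rows the join produces.

SELECT r.id, r.location_id, r.reading_date, r.amount
FROM readings r
JOIN reading_items ri ON ri.reading_id = r.id
WHERE r.location_id = 1
GROUP BY r.id, r.location_id, r.reading_date, r.amount;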
 
LVL 11

Accepted Solution

by:John_Vidmar earned 2000 total points
ID: 39836718
Pull the SQL into a database tool, replace the SELECT list with SELECT COUNT(*), comment out all joins except the first, and run the query. Add the next join back, run again, and repeat until you hit the table that makes the count blow up to 2 million records. At that point, review that join (perhaps a typo).

Also, I would qualify every field reference, especially in the ON clause.
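In sketch form, with placeholder names (the pattern matters, not the exact tables):

-- Step 1: just the first join, counting rows instead of selecting columns
SELECT COUNT(*)
FROM readings r
JOIN locations l ON l.id = r.location_id;

-- Step 2: add the next join back and count again; repeat one join at a time
SELECT COUNT(*)
FROM readings r
JOIN locations l      ON l.id = r.location_id
JOIN reading_items ri ON ri.reading_id = r.id;   -- count jumps here? inspect this ON clause

By "qualify" I mean prefixing every column with its table or alias, e.g. ON ri.reading_id = r.id rather than ON reading_id = id, so you can see exactly which tables each join condition ties together.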
 

Author Comment

by:Glen Gibb
ID: 39836937
Thanks, Experts, for your quick replies.

1. (ccb) Yup, tried the DISTINCT modifier and cut the result from a couple of million records down to about 180 K.

2. (sara) Attempted the GROUP BY, but things got REALLY slow, and the results still contained superfluous rows.

3. (john) Sounds like a plan. I'll get started here. Please tell me, though: how does one "qualify" a field reference? I have made sure there are no ambiguous references, etc. Am I missing something important?

I'll post sample results - no one needs a couple million copies of the same row!

Capt
 

Author Closing Comment

by:Glen Gibb
ID: 39838046
Thanks for the approach to solving this problem.

Turns out it was bad DB design. I merged fields in the "consumption" and "consumption_item" tables to eliminate unnecessary duplication.

Presto! MySQL Workbench helped me analyse the problem.
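For anyone hitting the same thing, the fan-out looked roughly like this (column names are illustrative, not the real schema): each parent row matched several duplicate child rows, so the join returned that many copies of it.

SELECT c.id, COUNT(*) AS copies
FROM consumption c
JOIN consumption_item ci ON ci.consumption_id = c.id
GROUP BY c.id
HAVING COUNT(*) > 1;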
