
Solved

MySQL performance

Posted on 2016-08-03
Medium Priority
92 Views
Last Modified: 2016-08-29
The MySQL slow query log is filled with slow queries. I used Percona Toolkit (pt-query-digest) to analyze them, and found that the most frequent long-running query is just a simple SELECT with a WHERE clause on the primary key.

The table has 2,500,000 rows and the primary key is a varchar(255). When I run the query from the mysql command line, it's very fast.

Any idea where I can look to determine why the query is so slow (at the 95th percentile it takes up to 9 s to run)?

Thanks
Question by:CCVOIP
8 Comments
 
LVL 4

Expert Comment

by:vvk
ID: 41741175
Where is the query being called from when it is slow? That may affect default session parameters, etc. What storage engine are you using? Can you use FORCE INDEX in the query?
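For reference, FORCE INDEX goes right after the table name. A sketch using the devices table described later in the thread (the device_id value is illustrative):

```sql
-- Force the optimizer to use the primary-key lookup
-- rather than letting it choose an access path itself.
SELECT *
FROM devices FORCE INDEX (PRIMARY)
WHERE device_id = '54789wdLLXs6k4jhgjyuyUOIOXIZ70It_FAtMT3F-JHIUOPIm=';
```

With a const lookup on the primary key this should not change the plan, but it rules out the optimizer picking something else under different session settings.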
 

Author Comment

by:CCVOIP
ID: 41741182
It's an InnoDB table with 4 longblob columns and 1 longtext column, in addition to a mix of varchar/datetime/tinyint.

It's slow when called from the application server hosts, which are in the same domain as the DB host.

When I run EXPLAIN EXTENDED it shows 1 in the rows column:

+----+-------------+--------+-------+---------------+-------+---------+-------+------+----------+-------+
| id | select_type | table  | type  | possible_keys | key   | key_len | ref   | rows | filtered | Extra |
+----+-------------+--------+-------+---------------+-------+---------+-------+------+----------+-------+
|  1 | SIMPLE      | tabl1_ | const | pk_ID         | pk_ID | 768     | const |    1 |   100.00 | NULL  |
+----+-------------+--------+-------+---------------+-------+---------+-------+------+----------+-------+
 
LVL 4

Expert Comment

by:vvk
ID: 41741200
Can you run it with SHOW WARNINGS to get additional output about the execution plan, and also post the query itself if possible? A value of 100.00 in the filtered column is sometimes misleading without additional data.
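In MySQL 5.6, SHOW WARNINGS issued right after EXPLAIN EXTENDED prints a Note 1003 containing the query as rewritten by the optimizer. A minimal sketch (table and value assumed from the thread):

```sql
-- Both statements must run in the same session, back to back,
-- for the rewritten query to appear in the warnings.
EXPLAIN EXTENDED
SELECT * FROM devices
WHERE device_id = '54789wdLLXs6k4jhgjyuyUOIOXIZ70It_FAtMT3F-JHIUOPIm=';

SHOW WARNINGS;
```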

 

Author Comment

by:CCVOIP
ID: 41741223
SHOW WARNINGS doesn't return much because of the longblob columns; the message is truncated and partly binary:
mysql> show warnings \G
*************************** 1. row ***************************
  Level: Note
   Code: 1003
Message:*▒H▒▒elect#1 */ select '830wdLLXs6k4vtNVDOqXIZ70It_FAtMT3F-xLxO0JsI=' AS `DEVICE_ID`,'NONE' AS `BLACKLISTED_TYPE`,'infotel' AS `CARRIER`,'r4g' AS `PRODUCT`,'5.0' AS `VER`,'0▒▒0▒ 5▒V@▒50
1 row in set (0.00 sec)
 
LVL 4

Expert Comment

by:vvk
ID: 41741593
It would be really helpful to see the query itself. Just to rule out the network: does the application run on the same server as the database? Does the query retrieve the blobs, and what is their approximate size? Do you use STATS_PERSISTENT, and how often do you run ANALYZE TABLE?
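One way to check the blob sizes is OCTET_LENGTH(), which returns a value's size in bytes. A sketch; the blob column names here are hypothetical, since the actual schema isn't shown in the thread:

```sql
-- Measure how many bytes each blob contributes to the row;
-- large blobs make the network transfer, not the index lookup,
-- the dominant cost of the query.
SELECT OCTET_LENGTH(firmware_blob) AS fw_bytes,
       OCTET_LENGTH(config_blob)   AS cfg_bytes
FROM devices
WHERE device_id = '54789wdLLXs6k4jhgjyuyUOIOXIZ70It_FAtMT3F-JHIUOPIm=';
```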
 

Author Comment

by:CCVOIP
ID: 41743160
here is the exact query:

select * from devices where device_id='54789wdLLXs6k4jhgjyuyUOIOXIZ70It_FAtMT3F-JHIUOPIm='

device_id is the primary key
the query retrieves 4 longblob columns
we use MySQL 5.6.15, and STATS_PERSISTENT=1 by default
we don't run ANALYZE TABLE that often (you could say not at all)
the table has up to 3,000,000 rows
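With persistent statistics enabled, their freshness can be checked directly: MySQL 5.6 stores them in mysql.innodb_table_stats. A sketch (replace the schema name with the actual database):

```sql
-- last_update shows when InnoDB persistent statistics were last
-- recalculated for each table; n_rows is the row-count estimate
-- the optimizer works from.
SELECT table_name, n_rows, last_update
FROM mysql.innodb_table_stats
WHERE database_name = 'mydb';
```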
 
LVL 4

Expert Comment

by:vvk
ID: 41748147
Not running ANALYZE TABLE should not have an effect in this situation. Just to confirm: when initially looking at your question, I assumed that with pt-query-digest you had calculated the ratio Rows_sent/Rows_examined and that it equals 1 for those long-running queries. Am I correct?
 
LVL 25

Accepted Solution

by:
Tomas Helgi Johannsson earned 2000 total points
ID: 41751820
Hi!

With STATS_PERSISTENT=1 on your tables, you should run ANALYZE TABLE periodically, or set STATS_AUTO_RECALC=1 on the tables, for optimal performance of your SQL queries.
With 3 million rows in the table and no fresh statistics on it, your queries will not run with optimal performance. Note that some queries may run fast while others don't. You want all queries to run as optimally as possible, so I recommend doing as mentioned above.

https://dev.mysql.com/doc/refman/5.6/en/innodb-persistent-stats.html

Regards,
      Tomas Helgi
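Both options from the accepted answer can be sketched as follows, using the devices table from the thread:

```sql
-- One-off refresh of the persistent statistics:
ANALYZE TABLE devices;

-- Or let InnoDB recalculate statistics automatically after
-- roughly 10% of the table's rows have changed (MySQL 5.6+):
ALTER TABLE devices STATS_AUTO_RECALC = 1;
```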
