
MySQL performance

My MySQL slow query log is filled with slow queries. I used Percona Toolkit (pt-query-digest) to analyze them, and found that the most frequently run query that takes very long is just a simple SELECT with a WHERE clause on the primary key.

The table has 2 500 000 rows and the primary key is a varchar(255). When I run the query from the mysql command line, it's very fast.

Any idea where I can look to determine why the query is so slow (at the 95th percentile it takes up to 9s to run)?
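
For reference, the slow-log settings in effect can be confirmed from the mysql client roughly like this (the variable names are standard MySQL; the actual values on this system were not posted, so this is only a sketch of the check):

mysql> SHOW GLOBAL VARIABLES LIKE 'slow_query_log%';  -- is the slow log enabled, and which file does it write to
mysql> SHOW GLOBAL VARIABLES LIKE 'long_query_time';  -- threshold in seconds above which a statement is logged as slow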

Thanks
1 Solution
 
vvkCommented:
Where is the query being called from when it is slow? That may affect default session parameters, etc. What DB engine type are you using? Can you use FORCE INDEX in the query?
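
As a sketch of the FORCE INDEX suggestion, assuming the lookup really is on the primary key (the table and column names below are placeholders until the actual query is posted):

mysql> SELECT * FROM your_table FORCE INDEX (PRIMARY) WHERE your_pk_column = 'some-value';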
 
CCVOIPAuthor Commented:
It's an InnoDB table that has 4 longblob columns and 1 longtext column, in addition to a mix of varchar/datetime/tinyint columns.

It slows down when it gets called from the application server hosts, which are in the same domain as the DB host.

When I do EXPLAIN EXTENDED it shows 1 in the rows column:

+----+-------------+--------+-------+---------------+-------+---------+-------+------+----------+-------+
| id | select_type | table  | type  | possible_keys | key   | key_len | ref   | rows | filtered | Extra |
+----+-------------+--------+-------+---------------+-------+---------+-------+------+----------+-------+
|  1 | SIMPLE      | tabl1_ | const | pk_ID         | pk_ID | 768     | const |    1 |   100.00 | NULL  |
+----+-------------+--------+-------+---------------+-------+---------+-------+------+----------+-------+
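
For completeness, the column types and primary-key definition behind that plan could be double-checked with something like the following (the table name is taken from the EXPLAIN output above and may be anonymized):

mysql> SHOW CREATE TABLE tabl1_\G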
 
vvkCommented:
Can you run it with SHOW WARNINGS to get additional output about the execution plan, and also post the query itself if possible? The value of 100.00 in the filtered column is sometimes confusing without additional data.
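
That would be roughly this sequence, run in the same session (placeholder names again, since the query has not been posted yet):

mysql> EXPLAIN EXTENDED SELECT * FROM your_table WHERE your_pk_column = 'some-value';
mysql> SHOW WARNINGS\G  -- prints the query as rewritten by the optimizer, plus any notes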
 
CCVOIPAuthor Commented:
SHOW WARNINGS doesn't return much because of the longblob columns:
mysql> show warnings \G
*************************** 1. row ***************************
  Level: Note
   Code: 1003
Message:*▒H▒▒elect#1 */ select '830wdLLXs6k4vtNVDOqXIZ70It_FAtMT3F-xLxO0JsI=' AS `DEVICE_ID`,'NONE' AS `BLACKLISTED_TYPE`,'infotel' AS `CARRIER`,'r4g' AS `PRODUCT`,'5.0' AS `VER`,'0▒▒0▒ 5▒V@▒50
1 row in set (0.00 sec)
 
vvkCommented:
It would be really helpful to see the query itself. Just to rule out that it may simply be the network: does the application run on the same server as the database? Does it retrieve the blobs? What is their approximate size? Do you use STATS_PERSISTENT, and how often do you run ANALYZE TABLE?
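
One way to see when the persistent statistics were last refreshed is to look at the stats tables themselves (a sketch; substitute the real table name):

mysql> SELECT table_name, n_rows, last_update FROM mysql.innodb_table_stats WHERE table_name = 'your_table';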
 
CCVOIPAuthor Commented:
here is the exact query:

select * from devices where device_id='54789wdLLXs6k4jhgjyuyUOIOXIZ70It_FAtMT3F-JHIUOPIm='  

device_id is the primary key
the query retrieves 4 longblob columns
we use MySQL 5.6.15 and STATS_PERSISTENT=1 by default
we don't run ANALYZE TABLE that often (you could say not at all)
the table has up to 3 000 000 rows
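
If useful, the approximate size of those blobs can be estimated along these lines (the column name is a placeholder since the full schema was not posted; note this scans all rows, so it is best run off-peak):

mysql> SELECT AVG(LENGTH(blob_col1)) AS avg_bytes, MAX(LENGTH(blob_col1)) AS max_bytes FROM devices;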
 
vvkCommented:
Not running ANALYZE TABLE should not have an effect in this situation. Just to confirm: when initially looking at your question, I assumed that with pt-query-digest you had calculated the ratio of rows_sent/rows_examined and that it equals 1 for those long-running queries. Am I correct?
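
If the slow log is also written to a table (log_output includes TABLE), that ratio can be checked directly; otherwise pt-query-digest reports the same rows-sent and rows-examined figures per query. A sketch of the table-based check:

mysql> SELECT start_time, query_time, rows_sent, rows_examined FROM mysql.slow_log ORDER BY query_time DESC LIMIT 10;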
 
Tomas Helgi JohannssonCommented:
Hi!

Having STATS_PERSISTENT=1 on your tables, you should run ANALYZE TABLE periodically or set STATS_AUTO_RECALC=1 on the tables for optimal performance of your SQL queries.
With 3 million rows in your table and no up-to-date statistics on it, your queries will not run with optimal performance. Note that it may be that some queries run fast and some don't. You want all queries to run as optimally as possible, so I recommend doing as I mentioned above.

https://dev.mysql.com/doc/refman/5.6/en/innodb-persistent-stats.html
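
In practice that would look roughly like this for the table in this thread (ANALYZE TABLE on InnoDB samples index pages, so it normally completes quickly, but it is still sensible to run it off-peak):

mysql> ALTER TABLE devices STATS_AUTO_RECALC=1;  -- recalculate persistent stats automatically after a significant portion of rows change
mysql> ANALYZE TABLE devices;                    -- refresh the persistent statistics immediately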

Regards,
      Tomas Helgi
