Hello there. I'm trying to figure out the best way to optimize a very large database so that queries run as fast as possible, and I need some expert advice because I've never dealt with a database this large. OK, so here are the technical details of the MySQL database:
- database size: 92.5 GB
- 1 table called "logs" with 265,321,000 rows! yes... 265.3 million rows
- index length: 34.45 GB (36,987,469,824 bytes)
- data length: 48.53 GB (52,105,838,592 bytes)
Here is the table structure (from a Navicat dump):
-- Navicat MySQL Data Transfer
-- Source Server         : localhost-mydbase
-- Source Server Version : 50623
-- Source Host           : localhost:3307
-- Source Database       : mydbase
-- Target Server Type    : MYSQL
-- Target Server Version : 50623
-- File Encoding         : 65001
-- Date: 2015-02-09 08:45:05
-- Table structure for logs
DROP TABLE IF EXISTS `logs`;
CREATE TABLE `logs` (
`id` bigint(255) unsigned NOT NULL AUTO_INCREMENT,
`username` varchar(255) DEFAULT NULL,
`allowed` int(255) DEFAULT NULL,
`dtime` datetime DEFAULT NULL,
`category` varchar(255) DEFAULT NULL,
`compname` varchar(255) DEFAULT NULL,
`srcip` varchar(100) DEFAULT NULL,
PRIMARY KEY (`id`),
KEY `user` (`username`),
KEY `allow` (`allowed`),
KEY `dtime1` (`dtime`),
KEY `cat1` (`category`),
KEY `compname1` (`compname`),
KEY `srcip1` (`srcip`)
) ENGINE=InnoDB AUTO_INCREMENT=265790858 DEFAULT CHARSET=utf8;
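For context, one change I've been debating (since most of my lookups are "one username within a date range") is replacing the separate single-column `user` and `dtime1` indexes with a composite one. This is just a sketch of what I had in mind, not something I've applied yet:

```sql
-- Hypothetical change: a composite index matching the common
-- "username = X AND dtime BETWEEN A and B" lookup pattern, so one
-- index can satisfy both conditions instead of two separate indexes.
ALTER TABLE `logs`
  DROP INDEX `user`,
  DROP INDEX `dtime1`,
  ADD INDEX `user_dtime` (`username`, `dtime`);
```

No idea if that's the right call on a table this size, which is partly why I'm asking.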
First off, what server specs should I be looking at in order to query a database of this size? Memory, CPU, storage, etc.?
Most queries against this database will return all fields for a specific username within a specified date range. Occasionally there will also be queries against the `url` column using a LIKE '%...%' clause to find URLs that contain a specified domain, e.g. "give me the date and time, username, and IP address for every user that visited espn.com from September of last year to September of this year."
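To make that concrete, the domain search looks something like this (espn.com and the dates are just the example from above; `url` is a column in the live table, though I notice it's missing from the dump I pasted):

```sql
-- Example of the occasional domain search. The leading wildcard
-- means a normal B-tree index on `url` can't be used for this,
-- which is part of what worries me.
SELECT dtime, username, srcip
FROM logs
WHERE url LIKE '%espn.com%'
  AND dtime >= '2014-09-01 00:00:00'
  AND dtime <  '2015-09-01 00:00:00';
```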
This database will continue to grow, and I'm sure there is a limit somewhere that will force me to create another table or restructure this thing. I've just never dealt with this much data before and really need to hear from people who understand MySQL and databases of this size. What can I do to make this quick and responsive? Are my indexes set up properly? Should the database be broken up into separate tables? Thinking toward the future, I just need to know the best way to structure this beast so I can get the fastest possible results from the queries I need to run. Any suggestions or modifications are greatly appreciated; I can destroy this table and redo the entire thing if I need to, so all suggestions are very welcome!
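On the "break it up" idea, what I had in mind was something like MySQL range partitioning by month. This is purely a hypothetical sketch, not something I've tried; as I understand it, MySQL requires the partitioning column to be part of every unique key, so `dtime` would have to join the primary key and become NOT NULL:

```sql
-- Hypothetical restructure: partition by month so date-bounded queries
-- only have to touch the relevant partitions. Column list copied from
-- my current table; dtime is NOT NULL here because it has to be part
-- of the primary key for partitioning to work.
CREATE TABLE `logs_by_month` (
  `id` bigint unsigned NOT NULL AUTO_INCREMENT,
  `username` varchar(255) DEFAULT NULL,
  `allowed` int DEFAULT NULL,
  `dtime` datetime NOT NULL,
  `category` varchar(255) DEFAULT NULL,
  `compname` varchar(255) DEFAULT NULL,
  `srcip` varchar(100) DEFAULT NULL,
  PRIMARY KEY (`id`, `dtime`),
  KEY `user_dtime` (`username`, `dtime`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8
PARTITION BY RANGE (TO_DAYS(`dtime`)) (
  PARTITION p201501 VALUES LESS THAN (TO_DAYS('2015-02-01')),
  PARTITION p201502 VALUES LESS THAN (TO_DAYS('2015-03-01')),
  PARTITION pmax    VALUES LESS THAN MAXVALUE
);
```

Is something like this worth the trouble, or is a single well-indexed table fine at this scale?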
Oh, almost forgot: should I even be using MySQL for this? I can use MS SQL Server if need be, but you know, free is really awesome!
Also, this query took 156 seconds, which is just too long:
SELECT `logs`.id, `logs`.username, `logs`.allowed, `logs`.url, `logs`.dtime,
       `logs`.category, `logs`.compname, `logs`.srcip
FROM `logs`
WHERE `logs`.username = 'myuser'
  AND `logs`.dtime > '2014-12-28 02:13:46'
  AND `logs`.dtime < '2015-01-05 12:06:29';
Thanks very much in advance!