
16GB Table: How Much RAM, and What Optimization?

We have a MySQL 4 server running a single large database. The database has seven tiny tables and one very large 16GB table; the index file for this table is another 1GB.

Queries frequently hit this table and are very slow, generating very heavy disk read traffic. The server currently has only 10GB of RAM. If I bought a new server with 32GB of RAM (and room for 64GB), I'm sure it could be made to work properly, but do I need to worry about settings like key_buffer_size? (Would that need to be set to 2GB?)
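
For reference, the setting in question would look something like this in my.cnf, assuming the big table is MyISAM (key_buffer_size only caches MyISAM index blocks); the 2048M figure is just the 2GB from the question, sized to hold the 1GB index with headroom, not a tested recommendation:

    [mysqld]
    # MyISAM index cache; 2048M would hold the entire 1GB index
    # file with room to spare (illustrative value, not tuned).
    key_buffer_size = 2048M

Note that the key buffer only caches indexes; MyISAM data rows are read through the OS file cache, so RAM beyond the key buffer still helps.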
Asked:
smj001
1 Solution
 
michofreiha commented:
Setting 2GB for the key buffer is a very good idea. Make that upgrade and try the script again; I'm sure it'll perform better.

I had to do a large import (more than 9 million records) for a customer on a shared database machine. It was a 2.4GHz box with 1.5GB of RAM, so close to your specs. The inserts hammered it and slowed down the response on the console, but requests from the web server (a separate machine) still came through perfectly. There are a few very touchy customers who complain if a page takes a full second to load, and we had no problems through the entire insert cycle.
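
One way to check whether a bigger key buffer is actually absorbing the index reads is to watch the key-cache counters (standard MySQL status variables; the threshold below is a common rule of thumb, not something from this thread):

    SHOW VARIABLES LIKE 'key_buffer_size';
    SHOW STATUS LIKE 'Key_read%';
    -- Key_reads / Key_read_requests is the key-cache miss ratio;
    -- once the 1GB index fits in memory it should stay very low
    -- (a common rule of thumb is under 0.01).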
