• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 593

16GB Table: How Much RAM, and What Optimization?

We have a MySQL 4 server running a single large database. The database has seven tiny tables and one very large 16GB table; the index file for that table is 1GB.

Queries frequently hit this table and are currently very slow, producing very heavy disk-read traffic. The server has only 10GB of RAM. If I bought a new server with 32GB of RAM (and room for 64GB), I'm sure it could be made to work properly, but do I need to worry about settings like key_buffer_size? (Would it need to be set to 2GB?)
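Before buying hardware, it is worth confirming that the key cache is actually the bottleneck. A sketch of the check, assuming MyISAM tables (which is what key_buffer_size governs): compare Key_reads (index blocks read from disk) against Key_read_requests (index blocks requested from the cache). A high ratio means index reads are missing the cache and going to disk.

```sql
-- Current key cache size
SHOW VARIABLES LIKE 'key_buffer_size';

-- Key_reads / Key_read_requests well above ~1% suggests the
-- index cache is too small for the 1GB index file.
SHOW STATUS LIKE 'Key_read%';
```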
1 Solution
Yes, 2GB is a very good value here. Upgrade key_buffer_size to 2GB and run the queries again; I'm sure they will perform better.
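A minimal sketch of applying that suggestion, assuming MyISAM tables: with a 1GB index file, a 2GB key buffer holds the entire index in memory with headroom. The variable is dynamic in MySQL, so it can be raised without a restart, though it should also go in my.cnf to survive one.

```sql
-- Raise the key cache to 2GB at runtime (value in bytes).
SET GLOBAL key_buffer_size = 2147483648;

-- To make it permanent, add under [mysqld] in my.cnf:
--   key_buffer_size = 2048M
```

Note that key_buffer_size caches only MyISAM index blocks; data rows still rely on the operating system's file cache, which is another reason more total RAM helps.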

I had to do a large import (more than 9 million records) for a customer on a shared database machine. It's 2.4GHz with 1.5GB of RAM, so close to your specs. The inserts hammered the machine and slowed down response on the console, but requests from the web server (a separate machine) still came through perfectly. We have a few very touchy customers who complain if a page takes a full second to load, and we had no problems through the entire insert cycle.
