

How long should these procedures take, in your estimation?

Posted on 2011-03-07
Medium Priority
Last Modified: 2012-05-11
1. We have determined that the database needs to be VACUUMed and ANALYZEd.  VACUUM is a Postgres routine maintenance command which essentially defragments your database.  It needs to be run at least every 1 billion transactions.  You are running at 5 transactions/sec right now, and have processed about 468,000,000 transactions since the last VACUUM or initdb, so my guess is that VACUUM has never been run on this data.  While a backup of the database consumes 29 MB, the size of the data directory is over 700 MB.  This lends some support to my vacuum hypothesis.  VACUUM is also a prerequisite for ANALYZE.  ANALYZE updates the query planner's statistics to match the actual data layout, so if one does not run ANALYZE, one can expect query performance to drop over time.  From my research, I expect a performance increase once this routine maintenance has been accomplished.
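Before scheduling anything, the transaction-age guess can be checked directly from the system catalogs. A minimal sketch, assuming the 8.x catalogs and a live server reachable via psql:

```sql
-- How many transactions old is each database's oldest unfrozen row?
-- Values approaching ~1 billion mean VACUUM is overdue
-- (transaction-ID wraparound occurs at ~2 billion).
SELECT datname, age(datfrozenxid) AS xid_age
  FROM pg_database
 ORDER BY xid_age DESC;

-- The maintenance itself; VERBOSE reports progress table by table
-- (in the 8.0 grammar, VERBOSE precedes ANALYZE):
VACUUM VERBOSE ANALYZE;
```

Both statements are read-only or non-destructive, so they can be run against production, though the VACUUM will generate I/O.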

The database has not been tuned to the size of the system.  Almost all settings are at the default, which are not optimal for a 4GB system.  With some further research, I can change the tuning parameters to make the database faster.  This may not be necessary if #1 makes performance good enough.  If necessary, it should be possible to tune the database so the entire data set fits in memory.
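For illustration only, the kind of postgresql.conf settings involved might look like the following. The values are placeholders for a 4 GB box, not a recommendation, and note that in 8.0 shared_buffers and effective_cache_size are counted in 8 kB pages rather than memory units:

```
# postgresql.conf -- illustrative values only, tune for your workload
shared_buffers = 32768          # 8 kB pages => 256 MB (8.0 default is far lower)
effective_cache_size = 262144   # 8 kB pages => 2 GB; hints the planner at OS cache
work_mem = 16384                # kB per sort/hash operation
maintenance_work_mem = 131072   # kB; speeds up VACUUM and index builds
```

shared_buffers requires a server restart to change; the others can be applied with a reload.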

Postgres is currently at version 8.0.3.  This version was released 2005-05-09.  I recommend that this be upgraded, at a minimum to 8.0.26, the last bugfix release in the 8.0 stream.  The upgrade may require reindexing of some tables and may be conducted in place; however, it may be less troublesome to dump the database, run initdb, and restore.
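The dump/initdb/restore path could be sketched roughly as below. Paths, the database name, and data-directory locations are placeholders for your layout; this is an outline, not a tested procedure:

```sh
# Sketch only -- all names and paths are placeholders.
pg_dump -Fc mydb > /backup/mydb.dump       # logical backup from the old cluster
pg_ctl -D /var/lib/pgsql/data stop         # stop the old server
initdb -D /var/lib/pgsql/data_new          # fresh cluster with the new binaries
pg_ctl -D /var/lib/pgsql/data_new start
createdb mydb
pg_restore -d mydb /backup/mydb.dump       # reload; indexes are rebuilt as part of this
```

A side benefit of this route is that the restore rebuilds every index from scratch, which addresses any bloat at the same time.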
Question by:zen_68

Expert Comment

ID: 35059557
1. I think you must run a VACUUM VERBOSE ANALYZE; immediately, and for a 700 MB pg_default data folder this should not take more than 15-20 minutes, but of course this depends on how many constraints/indexes you have, when you last VACUUMed, and your I/O. You also need to consider re-indexing, as your indexes are most likely outdated/bloated. You should consider upgrading to 9.0 if possible, where maintenance VACUUM has improved a lot.
--rebuild only outdated indexes, not ALL indexes on each table
select * from pg_stat_user_indexes
      where idx_tup_fetch > (idx_scan * 100) and idx_scan <> 0
        and indexrelname not like '%_pkey' and indexrelname not like '%_uniq%'
        and indexrelname not like '%_ukey' and indexrelname not like 'pk_%'
        and schemaname = 'public';

select * from pg_stat_user_indexes
      where idx_tup_read != idx_tup_fetch
        and indexrelname not like '%_pkey' and indexrelname not like '%_uniq%'
        and indexrelname not like '%_ukey' and indexrelname not like 'pk_%'
        and schemaname = 'public';
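Indexes flagged by queries of this kind can then be rebuilt individually. The names below are placeholders; note that REINDEX takes a lock on the parent table for the duration:

```sql
-- Rebuild one bloated index identified above (placeholder name)
REINDEX INDEX public.my_bloated_idx;

-- Or rebuild every index on a single table at once (placeholder name)
REINDEX TABLE public.my_table;
```

Rebuilding per-index keeps the lock windows short, which matters on a live system.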

2. These are config file changes; some require only a reload, but others require a PG restart, so you may need a short downtime for them.

3. It all depends on your environment and how you plan to do it.

Expert Comment

ID: 35060918
With so much space for not so much data, I'm afraid your autovacuum (if you have one) is tuned too low, so you are likely to need a VACUUM FULL and then a reindexdb.

If there is no autovacuum, then set up a cron task to VACUUM ANALYZE at least every day.
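As a sketch, a nightly crontab entry for the postgres user might look like the following (database name and schedule are placeholders):

```
# vacuum and re-analyze every night at 02:00, as the postgres user
0 2 * * *  vacuumdb --analyze mydb
```

vacuumdb is the command-line wrapper around the SQL VACUUM command, which makes it convenient to drive from cron.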

VACUUM FULL will take an exclusive lock on each table while vacuuming it.

Bumping the version to 9 will be a lot of work.


Author Comment

ID: 35063443
I'm grateful for the advice, but what I'm looking for is an estimate of how long each of the three procedures would take. Approximately.

Expert Comment

ID: 35067148
Hello, can you describe your hardware? How many disks? What speed? fsync or not? RAID? RAID + write-back?

You need to read 700 MB of data and rewrite part of it. If you have 4 or 6 15K SAS disks in a RAID 10 array + BBU + write-back, this should take less than 2 minutes.

Accepted Solution

lcohan earned 1000 total points
ID: 35072806
1. I think a VACUUM VERBOSE ANALYZE; for a 700 MB pg_default data folder should not take more than 15-20 minutes, as mentioned. Remember this can be stopped/restarted at any time if it causes any issues, and I suggest you back up your database beforehand, just in case.

2. Config file changes are quick, depending on how fast you can connect to the server, type the changes into the pg config files, and restart if required: 10-15 minutes, maybe.

3. This can't be predicted unless you provide full details about your upgrade path and process.

Assisted Solution

l4ncel0t earned 1000 total points
ID: 35074670
20 minutes for a simple VACUUM ANALYZE on such a small DB? That would be on very old hardware.

