  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 598

Would PostgreSQL support a 30-million-row table?

I have a question: can PostgreSQL support a table with 30 million rows? We have a .NET web application connecting to an Oracle DB, but because Oracle licenses are very expensive, our IT chief is thinking of switching to a free SQL database. However, we use a read-only table that has 30 million rows, and indexes have to be added on top of that. Is there a technical article on whether PostgreSQL will work for this or not?
Thanks!
Asked by: Petronilo
2 Solutions
 
rjkimbleCommented:
PostgreSQL should work fine with such a table, as long as you have the right indexes on it. I have run it against a table with 3.5 million rows with no problem. However, whenever you update the table, you will have to pay particular attention to vacuuming it and rebuilding its indexes.
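As a rough illustration of that maintenance, here is a minimal SQL sketch; the table and index names are hypothetical, not from the question:

    -- Index the columns your read queries filter on:
    CREATE INDEX readings_device_time_idx ON readings (device_id, reading_at);

    -- After large batches of updates or deletes, reclaim dead row
    -- versions and refresh the planner's statistics:
    VACUUM ANALYZE readings;

    -- If an index has become bloated, rebuild it (this locks the
    -- table while it runs, so schedule it off-peak):
    REINDEX TABLE readings;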
 
earth man2Commented:
Behold, read the PostgreSQL FAQ:

http://www.postgresql.org/docs/faqs/FAQ.html

I plagiarise it below -- all credit to the PostgreSQL documentation team.

4.5) What is the maximum size for a row, a table, and a database?
These are the limits:
    Maximum size for a database?             unlimited (32 TB databases exist)
    Maximum size for a table?                32 TB
    Maximum size for a row?                  1.6TB
    Maximum size for a field?                1 GB
    Maximum number of rows in a table?       unlimited
    Maximum number of columns in a table?    250-1600 depending on column types
    Maximum number of indexes on a table?    unlimited
Of course, these are not actually unlimited, but limited to available disk space and memory/swap space. Performance may suffer when these values get unusually large.
The maximum table size of 32 TB does not require large file support from the operating system. Large tables are stored as multiple 1 GB files so file system size limits are not important.
The maximum table size and maximum number of columns can be quadrupled by increasing the default block size to 32k.
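For a quick sanity check of your own data against those limits, something like this should work on PostgreSQL 8.1 or later (the table name readings is made up for illustration):

    -- On-disk size of the table's heap:
    SELECT pg_size_pretty(pg_relation_size('readings'));

    -- Size of the whole current database:
    SELECT pg_size_pretty(pg_database_size(current_database()));

A 30-million-row table of modest row width typically comes to a few gigabytes, nowhere near the 32 TB table limit.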

AND NOTE
bigserial should be used if you anticipate the use of more than 2^31 identifiers over the lifetime of the table.
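For instance, a minimal table definition using bigserial (the table and columns are hypothetical):

    -- bigserial is a 64-bit, sequence-backed key, so the id will
    -- not wrap at 2^31 the way a plain serial column would:
    CREATE TABLE readings (
        id         bigserial PRIMARY KEY,
        device_id  integer   NOT NULL,
        reading_at timestamp NOT NULL,
        value      numeric
    );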

The proof of the pudding is in the eating. Oracle gives you many nice features such as partitioning, but Postgres may perform just as well, depending on the complexity and the number of concurrent users of your solution.
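To taste the pudding concretely, load a production-sized copy of the table and time a representative read query; the names below are hypothetical:

    -- EXPLAIN ANALYZE runs the query and reports the actual plan
    -- and timings, the most honest benchmark you can get:
    EXPLAIN ANALYZE
    SELECT device_id, avg(value)
    FROM readings
    WHERE reading_at >= '2006-01-01'
    GROUP BY device_id;

If the plan shows an index scan rather than a full sequential scan for your selective queries, a 30-million-row table is well within PostgreSQL's comfort zone.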


