DB2 tablespace management for DPF databases; best practices for backup/recovery of very large (18 TB) databases

Posted on 2012-08-11
Last Modified: 2012-09-22
Hi all,

Can I get well-reasoned advice on best practices for tablespace management of DPF databases, in terms of design for OLTP versus design for a data warehouse (DW)?

What are the best practices for database backup and recovery of very large databases, typically multiple terabytes (18 TB in our case), in a DPF environment?
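For readers landing on this thread: since DB2 9.5 a DPF backup can be issued as a single system view (SSV) command from the catalog partition, while earlier versions drove each partition with db2_all. A minimal sketch, assuming a database named bigdb and a target path /backup/bigdb (both placeholders, not from the thread):

```shell
# Single system view (SSV) backup, DB2 9.5+: one command backs up all
# database partitions; the catalog partition is backed up first.
db2 "BACKUP DATABASE bigdb ON ALL DBPARTITIONNUMS \
     ONLINE TO /backup/bigdb COMPRESS INCLUDE LOGS"

# Pre-9.5 alternative: drive each partition explicitly with db2_all,
# which ships with DPF installations.
db2_all "db2 BACKUP DATABASE bigdb ONLINE TO /backup/bigdb"

# Verify the resulting images (image file naming varies by version).
db2ckbkp /backup/bigdb/BIGDB.*
```

For multi-terabyte databases, incremental and tablespace-level backups against a TSM (or similar) target are the usual refinements of this baseline.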

What are the best practices for monitoring and tuning DPF multi-terabyte databases?
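On the monitoring side, common starting points in that era were db2top for a live cross-partition view and the SQL monitor functions introduced in DB2 9.7. A sketch, with the database name bigdb as a placeholder assumption:

```shell
# Live, curses-based monitoring across all database partitions.
db2top -d bigdb

# SQL monitor functions (DB2 9.7+): -2 means "all members/partitions".
# Logical vs. physical buffer pool reads per partition are a
# first-pass health check for skew and hit ratios.
db2 "SELECT MEMBER, BP_NAME, POOL_DATA_L_READS, POOL_DATA_P_READS
     FROM TABLE(MON_GET_BUFFERPOOL(NULL, -2)) AS t
     ORDER BY MEMBER, BP_NAME"
```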

All comments, suggestions, links to relevant reading resources, real-life examples, or case studies will be highly appreciated.

Thanks in advance.
Question by: Enyinnaya

    Author Comment

DB2 LUW gurus, can I get a response to this post, please? Is everyone on vacation?

    Accepted Solution

    You do realize that most people don't work on weekends, don't you? :-)

    Your questions are so generic that I can offer you very generic answers:

    Assisted Solution

    by:Gary Patterson
IBM publishes a long list of publications, including several that document best practices for database administration, backup/recovery/replication, and performance management. You didn't provide your version, so here are the links to the 8.1 and 8.2 docs:

    - Gary Patterson

    Author Comment

    Hi Gary,
I am looking for specific information about the DB2 DPF flavor. Specifically, how do shops with very large DB2 DPF-based databases manage their backups, regular recovery, reorg/runstats (if any), and high-level monitoring and tuning strategy?

Anyone know where I can find this information? Maybe a best-practices document or a case study.
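For what it's worth, the reorg/runstats side of such a routine typically looks like the sketch below. Table names and sampling choices are illustrative assumptions, not from the thread:

```shell
# Statistics with distribution and sampled index detail; on DPF
# tables, runstats collects on one partition and extrapolates.
db2 "RUNSTATS ON TABLE dba.fact_sales \
     WITH DISTRIBUTION AND SAMPLED DETAILED INDEXES ALL"

# Refresh stats and check which tables actually need a reorg
# before paying for one.
db2 "REORGCHK UPDATE STATISTICS ON TABLE ALL"

# Reorganize a table across every database partition it lives on.
db2 "REORG TABLE dba.fact_sales ON ALL DBPARTITIONNUMS"
```

At 18 TB, most shops restrict full reorgs to the tables REORGCHK flags, and lean on sampled runstats to keep the maintenance window manageable.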


    Author Comment

    I've requested that this question be closed as follows:

    Accepted answer: 250 points for mustaccio's comment #a38288847
    Assisted answer: 250 points for Gary_The_IT_Pro's comment #a38290005
    Assisted answer: 0 points for Enyinnaya's comment #a38298899

    for the following reason:

The guys provided good reading links, but I don't think they understood the question, which was basically: how do you manage a very large database (VLDB), viz. backup, restore, tuning strategies, etc.?

