How to purge historical data from current tables.

I'm using IBM DB2 version 5.2

(Yeah, I know, it's old; I've finally got the bosses convinced that it's time to move.)

(Please see my earlier question for a detailed description of the setup and how things are going.)

When I issue a carefully crafted DELETE statement against a dated dataset to delete data older than a certain date, I get an error indicating that the transaction log is not large enough to accommodate my request.

I end up having to reissue the same SQL statement, incrementing the date by a week or a month at a time, to get a year's worth of data purged.

In my estimation I need to do one of two things:

1) find a way to temporarily (or dynamically) increase the transaction log so that the SQL statement will execute ONCE and be done.

2) find a way to bypass the transaction logging system (and rollbacks and all that) for a one-time-use SQL statement.

I'm open to suggestions on approach as well as method.

Thanks for your help.
ghp7000 commented:
Log full error messages simply mean that you don't have enough log files configured to handle the load.

If you run this from the command line:
db2 get db cfg for <dbname>
and look at the LOGPRIMARY database parameter, you will see that it is set to a certain number of log files.
Increase the number of log files by doing
db2 update db cfg for <dbname> using LOGPRIMARY <at least twice as many log files as were previously defined>
You can increase the size of the log files by doing
db2 update db cfg for <dbname> using LOGFILSIZ 5000 (the size is in 4 KB pages, so 5000 means each log file is about 20 MB)
You can also tell DB2 to create more log files if it runs out of log space during a transaction:
db2 update db cfg for <dbname> using LOGSECOND 100 (meaning create up to 100 additional log files if LOGPRIMARY space runs out. These log files are destroyed automatically by DB2 when the transaction is committed or rolled back)
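Put together, the whole sequence might look like this (a sketch; SAMPLE is a placeholder database name and the values are illustrative, so size them for your own workload):

```shell
# Inspect the current log configuration (SAMPLE is a placeholder db name)
db2 get db cfg for SAMPLE | grep -i log

# Double the primary logs, enlarge each file to 5000 4 KB pages (~20 MB),
# and allow up to 100 secondary logs for the big delete
db2 update db cfg for SAMPLE using LOGPRIMARY 10
db2 update db cfg for SAMPLE using LOGFILSIZ 5000
db2 update db cfg for SAMPLE using LOGSECOND 100

# Disconnect and reconnect so the new values take effect
db2 terminate
db2 connect to SAMPLE
```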

Kent Olsen (Data Warehouse Architect / DBA) commented:

Hi Rance,

I remember the discussion.    :~}

You've got a couple of choices.

-- You can go into the db2 admin utility and increase the number and/or size of the logs.  This may not be sufficient as the volume of data being deleted may require a huge amount of logging.

-- You can spin through the data deleting the oldest data one day, week, month, etc at a time.

And of course, you can do both.  The loop below will delete one day's worth of data at a time.  That should be relatively easy on the logs.

  -- Inside an SQL procedure body: delete one day's worth of rows per pass
  -- and commit, so the log space is freed between iterations.
  DECLARE BaseDate DATE    DEFAULT '2004-01-01';
  DECLARE offset   INTEGER DEFAULT 10000;   -- Assuming nearly 30 years is backing up far enough.    :~}

  WHILE (offset > 0) DO
    DELETE FROM table WHERE posting_date < BaseDate - offset DAYS;
    COMMIT;
    SET offset = offset - 1;
  END WHILE;
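If you'd rather drive the same idea from the command line, a small shell loop can sweep the cutoff forward a week at a time, so each DELETE only touches about a week of rows (a sketch; SAMPLE, history, and posting_date are placeholder names, and the starting offset must predate your oldest row):

```shell
#!/bin/sh
# Purge rows older than roughly one year, one week per committed pass.
# SAMPLE, history, and posting_date are placeholder names.
db2 connect to SAMPLE

days=10000   # start ~27 years back, well before the oldest row
while [ "$days" -ge 365 ]; do
  db2 "DELETE FROM history WHERE posting_date < CURRENT DATE - $days DAYS"
  db2 commit
  days=$((days - 7))   # advance the cutoff one week per pass
done

db2 terminate
```

Note the duration is expressed in DAYS because DB2's labeled durations support YEARS, MONTHS, and DAYS, not WEEKS.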

Good Luck,
Rance_Hall (Author) commented:
Thanks for the input. Your answers have raised another question concerning access control, which I will post later, but these notes give me exactly what I want.

Question has a verified solution.
