Rance_Hall
asked on
How to purge historical data from current tables.
I'm using IBM DB2 version 5.2
(Yeah, I know it's old; I've finally got the bosses convinced that it's time to move.)
(Please see https://www.experts-exchange.com/questions/21766241/strange-changes-to-values-in-a-db2-table.html for a detailed description of the setup and how things are going.)
When I issue a carefully crafted DELETE statement against a dated dataset to delete data older than a certain date, I get an error indicating that the transaction log is not large enough to accommodate my request.
I end up having to reissue the same SQL statement, incrementing the date by a week or a month at a time, to get a year's worth of data purged.
In my estimation I need to do one of two things:
1) Find a way to temporarily (or dynamically) increase the transaction log so that the SQL statement will execute ONCE and be done.
2) Find a way to bypass the transaction logging system, rollbacks, and all that for a one-time SQL statement.
I'm open to suggestions on approach as well as method.
Thanks for your help.
ASKER CERTIFIED SOLUTION
ASKER
Thanks for the input. Your answer has opened another question concerning access control, which I will post later, but these notes give me exactly what I want.
Thanks.
Hi Rance,
I remember the discussion. :~}
You've got a couple of choices.
-- You can go into the DB2 admin utility and increase the number and/or size of the logs. This may not be sufficient, as the volume of data being deleted may require a huge amount of logging.
-- You can spin through the data deleting the oldest data one day, week, month, etc at a time.
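For the first option, the log settings can also be raised from the DB2 command line. A minimal sketch, assuming a database named mydb (a placeholder) and using parameter names from DB2 UDB; the exact names and limits available in 5.2 may differ:

```shell
# Raise transaction log capacity for database "mydb" (placeholder name).
# LOGFILSIZ is the size of each log file in 4 KB pages; LOGPRIMARY and
# LOGSECOND are the counts of primary and secondary log files.
db2 CONNECT TO mydb
db2 UPDATE DATABASE CONFIGURATION FOR mydb USING LOGFILSIZ 10000 LOGPRIMARY 20 LOGSECOND 40
db2 CONNECT RESET
```

Note that most log configuration changes only take effect after all applications disconnect from the database.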
And of course, you can do both. The loop below will delete one day's worth of data at a time. That should be relatively easy on the logs.
DECLARE BaseDate DATE DEFAULT '2004-01-01';
DECLARE offset INTEGER DEFAULT 10000;  -- Assuming nearly 30 years is backing up far enough. :~}

WHILE (offset > 0) DO
   DELETE FROM table WHERE posting_date < BaseDate - offset DAYS;
   COMMIT;                   -- commit each slice so the log stays small
   SET offset = offset - 1;  -- move the cutoff forward one day per pass
END WHILE;
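The same chunked-delete pattern can be sketched in a general-purpose language as well. This illustration uses Python's sqlite3 module purely for demonstration (SQLite has no DAYS arithmetic or DB2-style log limit; the postings table and dates are made up), showing deletes in small batches with a commit after each batch:

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE postings (posting_date TEXT)")

# Seed 120 days of sample rows ending at 2004-01-01 (placeholder dates).
base = date(2004, 1, 1)
rows = [((base - timedelta(days=n)).isoformat(),) for n in range(120)]
conn.executemany("INSERT INTO postings VALUES (?)", rows)
conn.commit()

# Purge everything older than the base date one 7-day slice at a time,
# committing after each slice so no single transaction grows too large.
offset = 120
while offset > 0:
    cutoff = (base - timedelta(days=offset)).isoformat()
    conn.execute("DELETE FROM postings WHERE posting_date < ?", (cutoff,))
    conn.commit()    # keep each unit of work small
    offset -= 7      # step the cutoff forward one week per pass

# Final pass removes anything still older than the base date itself.
conn.execute("DELETE FROM postings WHERE posting_date < ?", (base.isoformat(),))
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM postings").fetchone()[0]
print(remaining)  # only the base-date row itself survives, so this prints 1
```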
Good Luck,
Kent