Solved

Changing Primary key on a big table

Posted on 2006-07-18
Medium Priority
2,005 Views
Last Modified: 2008-01-09
Hi experts, I've been using mostly SQL Server in the last 5 years, so my Oracle skills are a little outdated. I need your advice on the following:

On an Oracle 9i data warehouse DB, a table has been wrongly designed and I've been asked to change the primary key to avoid dupes.

Let's call it STAR12, and assume the current primary key is on
(ID_MONTH int, ID_DWH int) where ID_DWH is a sequence number that resets for each new ID_MONTH

I had to alter the table to add a new field, ID_GRP_RISK, and now I need to (rough DDL sketch below):
- populate the new field (I've done this part)
- delete the records where the new field is null (done it too)
- set a not null constraint on the new field
- drop the current primary key
- delete the duplicate records
- create the new primary key on (ID_MONTH, ID_GRP_RISK)
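
Roughly, this is the DDL I have in mind for the constraint steps (constraint/index names are just placeholders, I still have to look up the name of the existing primary key):

alter table star12 modify (id_grp_risk not null);
alter table star12 drop constraint <old_pk_name>;
-- ... delete the duplicates here ...
alter table star12 add constraint star12_pk primary key (id_month, id_grp_risk);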

The requirements:
I've been told that applying the changes to the historical data is a requirement. Dropping the table and running the loading script to populate a brand new table for each and every month is not an option.

My current concerns:
I was wondering whether it would not be better to keep the current index and just create a new unique index on (ID_MONTH, ID_GRP_RISK), and whether you experts would have better ideas or advice on how to handle this situation.

Also I was wondering if it would not be more effective to
- create a non-unique index first, to speed up the deleting of the dupes (records that have the same ID_MONTH, ID_GRP_RISK)
- delete the dupes
- make it the primary key afterwards (is it even possible?)

The table has about 1 million records per ID_MONTH, currently 20 months loaded.

To avoid filling the temp tablespaces and/or swapping forever, I loop on ID_MONTH with a cursor to put intermediary COMMITs after each month is processed.
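
Something along these lines (just a rough, untested sketch of the looping approach):

begin
  for m in (select distinct id_month from star12 order by id_month) loop
    -- delete the dupes for this month, keeping one row per (id_month, id_grp_risk)
    delete from star12 s
    where  s.id_month = m.id_month
    and    s.rowid not in (select min(rowid)
                           from   star12
                           where  id_month = m.id_month
                           group  by id_month, id_grp_risk);
    commit;  -- intermediary commit after each month
  end loop;
end;
/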

Any ideas/sample code welcome, in particular for more efficient syntax.

I will create another topic for the "delete dupes" part if need be.

Thanks for reading

Hilaire
Question by:Hilaire
7 Comments
 
LVL 14

Expert Comment

by:sathyagiri
ID: 17130335
1. Create a non-unique index, as you mentioned, on ID_MONTH and ID_GRP_RISK.
2. For deleting your duplicates, think of using BULK COLLECT with FORALL.
3. Then drop the non-unique index and create your primary key.

 
LVL 26

Author Comment

by:Hilaire
ID: 17130600
Thanks for your comment, sathyagiri.
Would you mind setting verbose mode on (no offence meant, I really appreciate your comment above ;-) ) and giving me a few tips on how to remove dupes using BULK COLLECT / FORALL, so that I get the general idea?

I'm going through the BULK COLLECT online reference just now, but I'm not familiar with it at all, so I'd be glad to get a short explanation so that I don't miss the point.
Thanks again

Hilaire
 
LVL 14

Accepted Solution

by:sathyagiri (earned 1600 total points)
ID: 17130986
To delete duplicates you could just use
create or replace procedure del_proc
is
  type t is table of rowid;
  t1 t;
  -- rows that share col1/col2 with a row that has a lower rowid, i.e. the dupes to remove
  cursor c is
    select rowid
    from   your_table a
    where  rowid > (select min(rowid)
                    from   your_table b
                    where  a.col1 = b.col1
                    and    a.col2 = b.col2);
begin
  open c;
  loop
    fetch c bulk collect into t1 limit 10000;  -- batch size, adjust to your volumes
    exit when t1.count = 0;
    forall i in t1.first..t1.last
      delete from your_table where rowid = t1(i);
    commit;
  end loop;
  close c;
end;
/
(your_table, col1 and col2 are placeholders for your real table and key columns)

Instead, an easier way would be to create the primary key while the duplicates are still present, using the EXCEPTIONS INTO clause.

Step 1. Create the exceptions table by running utlexcpt.sql, which you will find under your ORACLE_HOME/rdbms/admin folder.
Step 2. alter table your_table add constraint tab_pk primary key (col1, col2) exceptions into exceptions;
Step 3. delete from your_table t
        where t.rowid in (select e.row_id from exceptions e)
        and   t.rowid not in (select min(e.row_id)
                              from   exceptions e, your_table x
                              where  x.rowid = e.row_id
                              group  by x.col1, x.col2);
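Note that the ALTER in step 2 fails when duplicates exist (it only populates the exceptions table), so once the duplicates are deleted you just re-run it to actually create the key:

alter table your_table add constraint tab_pk primary key (col1, col2);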
 
LVL 27

Assisted Solution

by:sujith80 (earned 400 total points)
ID: 17131076
Hi,
If you have the liberty of creating a new table from the existing one and at the end dropping STAR12 and renaming the new table to STAR12, that would be the best thing to do. Think about it.
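
For example, something along these lines (just a sketch; any indexes, grants, etc. on STAR12 would have to be recreated, and the new names are only placeholders):

create table star12_new as
select *
from   star12 s
where  s.rowid in (select min(rowid)
                   from   star12
                   group  by id_month, id_grp_risk);  -- keeps one row per new key

drop table star12;
rename star12_new to star12;
alter table star12 add constraint star12_pk primary key (id_month, id_grp_risk);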

If not, keeping your constraints in mind, what you have to do now is delete the duplicates and set a new primary key (assuming you are not going to do any DML in between).

1. Drop your current constraint and any unwanted indexes.

2. Execute a code similar to the sample code below to delete your duplicates.

---------
SQL> desc a1
 Name                                            Null?    Type
 ----------------------------------------------- -------- -------------
 OBJECT_ID                                                NUMBER
 OBJECT_NAME                                              VARCHAR2(128)
-----------

declare
 type t1 is table of rowid index by binary_integer;
 l_t1 t1;
 -- cursor to find the duplicates
 cursor c1 is
 select rowid from a1
 minus
 select min(rowid) from a1
 group by object_id, object_name; --put your criteria to find duplicates here

 l_rows number := 5; -- batch size, change according to your record count
begin
 open c1;
 loop
   fetch c1 bulk collect into l_t1 limit l_rows;
   exit when l_t1.count = 0;
   forall i in l_t1.first..l_t1.last
    delete from a1
    where rowid = l_t1(i);
   commit;
 end loop;
 close c1;
end;
/

3. set your constraints

I don't understand why you worry about creating a non-unique index first and dropping it later. No need.

Hope this helps.

Rgds,
Sujith
 
LVL 26

Author Comment

by:Hilaire
ID: 17131452
Thanks a million.
I really appreciate your help.

Regards

Hilaire
 
LVL 32

Expert Comment

by:awking00
ID: 17131561
>>And I need to<<
  - populate the new field (I've done this part)
OK
  - delete the records where the new field is null (done it too)
OK
  - delete the duplicate records
Needs to be done first

delete from star12
where rowid not in
(select row_id from
 (select id_month, id_grp_risk, max(rowid) row_id
  from star12
  group by id_month, id_grp_risk));

  - create the new primary key on (ID_MONTH, ID_GRP_RISK)
Should be done next
create unique index <index_name> on star12(id_month, id_grp_risk); -- use the LOCAL keyword if the table is partitioned

  - set a not null constraint on the new field
alter table star12 add constraint <same as index_name>
 primary key(id_month, id_grp_risk) using index;

  - drop the current primary key
alter table star12 drop constraint <old_constraint_name>;
drop index <old_index_name>;


 
LVL 32

Expert Comment

by:awking00
ID: 17131583
Sorry, I shouldn't have gone to lunch leaving the question open. I see you already have the solution. Good job, guys:-)
