ajexpert asked:
Get the Primary Key Given the table name

I have to retrieve the primary key given the table name as input.

The following query takes about 3 seconds to execute, and that is costly for me because it is executed in a loop.

SELECT COLUMN_NAME FROM USER_CONS_COLUMNS UCC, USER_CONSTRAINTS UC
WHERE UC.CONSTRAINT_NAME= UCC.CONSTRAINT_NAME
AND UC.OWNER= UCC.OWNER
AND UC.CONSTRAINT_TYPE='P'
AND UCC.POSITION=1
AND UC.TABLE_NAME = <table_name>

Is there any way to optimize the above query, or is there another way to get the primary key of the table?

Thanks
Sean Stuber:
if this is 10g or 11g, do you have statistics gathered on your dictionary tables?
ajexpert (Asker):
11g. I don't have privs on the dictionary tables. I will ask the DBA to collect stats and update this thread.
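For reference, dictionary statistics are normally gathered with the DBMS_STATS package. A minimal sketch of what the DBA might run (assuming the ANALYZE ANY DICTIONARY privilege or the DBA role; the fixed-objects call is optional):

BEGIN
   DBMS_STATS.GATHER_DICTIONARY_STATS;      -- stats on the SYS/SYSTEM dictionary tables
   DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;   -- stats on the X$ fixed tables (optional)
END;
/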
In the meantime you could also try switching to the RULE optimizer.
It doesn't exist anymore as a system- or session-level parameter, but it is still supported as a hint:

SELECT /*+ RULE */ COLUMN_NAME FROM USER_CONS_COLUMNS UCC, USER_CONSTRAINTS UC
WHERE UC.CONSTRAINT_NAME= UCC.CONSTRAINT_NAME
AND UC.OWNER= UCC.OWNER
AND UC.CONSTRAINT_TYPE='P'
AND UCC.POSITION=1
AND UC.TABLE_NAME = <table_name>
Is this in a database that supports Oracle Apps (EBS) or SAP or a similar application that uses tens of thousands of tables?
ASKER CERTIFIED SOLUTION (schwertner)
Have you considered writing SQL to access the underlying dictionary tables rather than going through the dictionary views (USER_CONS_COLUMNS, USER_CONSTRAINTS)?  I'm not sure this would make a difference to performance, but it may.  Unfortunately, I don't have access to an Oracle 11g database, so I can't try this or provide sample code.

Of course, the downside to doing this is that you may have to change the code each time you upgrade to a newer version of Oracle, and you may have to grant privileges on the underlying tables (or you could create your own views).

Just a thought as this may not give you any performance benefits whatsoever.
I asked the DBA to collect stats on the underlying tables, and he did, but it did not improve the performance.
Sean,
I implemented the RULE hint but did not get any benefit from it.
If your database supports Oracle Applications (EBS) or SAP or any other application that uses tens of thousands of tables, this query *WILL* be slow if you use the default data dictionary views.  They are *NOT* optimized to provide fast response times in databases that have lots of tables.  The only way to make this query faster is to create a materialized view (or a regular table) that contains a copy of the contents of the views USER_CONS_COLUMNS and USER_CONSTRAINTS.  A query of a materialized view or regular table can be *MUCH* faster than a query of the data dictionary views.
Your query is based on views, not on tables.

USER_CONS_COLUMNS and USER_CONSTRAINTS are views, based internally on the Oracle dictionary tables. I think they are not indexed, so there is no lever you can use to speed up the performance.

You can only prepare the desired result in advance and refresh it periodically using a materialized view. I think the list of primary keys does not change often; it should be relatively static.
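As a rough illustration of the "materialized view or regular table" idea, here is a sketch using a plain copy table; the name PK_LOOKUP, the index, and the refresh approach are illustrative assumptions, not something from this thread:

-- Sketch only: a local copy of the primary-key metadata, refreshed periodically.
CREATE TABLE pk_lookup AS
SELECT uc.table_name, ucc.column_name, ucc.position
  FROM user_constraints uc, user_cons_columns ucc
 WHERE uc.constraint_name = ucc.constraint_name
   AND uc.owner           = ucc.owner
   AND uc.constraint_type = 'P';

CREATE INDEX pk_lookup_ix ON pk_lookup (table_name);

-- Refresh on a schedule, or whenever constraints change:
--   TRUNCATE TABLE pk_lookup;
--   INSERT INTO pk_lookup SELECT ... (same query as above);

-- The lookup in the loop then reads the copy instead of the dictionary views:
SELECT column_name
  FROM pk_lookup
 WHERE table_name = :tab
   AND position   = 1;

A materialized view with COMPLETE refresh over the same query would serve the same purpose, if creating one is permitted in your environment.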
"I asked DBA to collect stats on underlying tables, and he did it, but it could not improve te performance."

by "underlying tables" do you mean the tables that have the constraints on them?

or do you mean the data dictionary itself and the SYS.xxxxx tables under the USER_xxxx views?

collecting dictionary statistics is different than collecting table statistics.  And the stats on the constrained tables make no difference at all.

also, as markgeer noted above, if you have a very large number of objects you might have to wait anyway.
by "underlying tables" do you mean the tables that have the constraints on them By this I mean collecting stats on  SYS.xxxxx tables under the USER_xxxx views.  Sorry for not being clear.
We have two versions of Oracle, 10g and 11g, with the same set of tables and constraints.
Interestingly, the said query runs faster on Oracle 10g but slower on 11g.  Our DBA compared the explain plans and found that different paths and objects (sys objects) are being used for 10g and 11g.
A ticket has been opened with Oracle.

Can you tell us if your databases support Oracle Applications or SAP or another system that uses tens of thousands of tables, or is it a much smaller system?  Can you run this query and post the results:
select count(*) from dba_tables;
We do not use Oracle Apps or SAP.  It's a custom product developed and being customized by our company.  The front end is J2EE.  The size of the DB is 100GB.
That could be 100GB in just one (or a few) table(s), or it could be in 20,000+ tables, as with Oracle EBS or SAP.  But the data dictionary tables will have *VERY* different numbers of records and will give *VERY* different response times depending on the numbers of objects they contain.  For example, in our two (very different) systems, we get these results from these two queries:

                                          Custom        EBS
select count(*) from dba_tables;           3,221     27,139
select count(*) from dba_constraints;     11,107    167,043

Are you willing to run those two queries in your system and post the results?  

Without knowing how many records your data dictionary tables contain, we can only guess at the performance you should be able to get from them.
As you are aware, we have raised a ticket with Oracle. As per them, they have incorporated some changes in the Oracle data dictionary to improve overall performance; somehow those changes adversely affected the constraint metadata views.
They have suggested making use of bind variables.  I will try it and keep you posted.
Thanks for the comments, experts.
"they have ... incorporated some changes in oracle data dictionary".  What kind of changes are you talking about?  Do you mean the changes from Oracle10 to Oracle11?  Or, do you mean specific changes (or a patch) for your organization only?  Or, a general patch or update for Oracle11?  If yes, which one?

Also, you have never told us how many objects (tables and constraints) your database has.  Are you willing to give us that information?
Yes. In order to improve the overall performance of Oracle 11g, they (Oracle) have made modifications to the data dictionary.
I have about 500+ tables, with each table having 1 PK and 2 FKs on average.
If the standard data dictionary views are too slow in Oracle11, even with statistics gathered on the data dictionary objects and with only about 500 tables and a couple thousand constraints, then your only option is materialized views (or "copy" tables) that you periodically refresh.  These will be *MUCH* faster than the data dictionary views for your queries.
Oracle suggested making use of bind variables.  Creating an MVIEW requires a lot of approvals.
Will keep you posted
Yes, for best performance, cursors in PL/SQL procedures should always use bind variables, rather than "hard-coded" literal values in combination with "execute immediate".  And yes, creating a materialized view may be more work, but when other options don't give you the performance you want, it's nice to have the option of materialized views in Oracle.
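To illustrate the bind-variable point, here is a sketch of the lookup written as static SQL in PL/SQL (the variable and loop names are made up): the statement is parsed once and the table name is bound on each iteration rather than concatenated into the SQL text.

DECLARE
   v_pk_column user_cons_columns.column_name%TYPE;
BEGIN
   FOR r IN (SELECT table_name FROM user_tables) LOOP
      BEGIN
         SELECT ucc.column_name
           INTO v_pk_column
           FROM user_constraints uc, user_cons_columns ucc
          WHERE uc.constraint_name = ucc.constraint_name
            AND uc.owner           = ucc.owner
            AND uc.constraint_type = 'P'
            AND ucc.position       = 1
            AND uc.table_name      = r.table_name;   -- bound PL/SQL variable, not a literal
      EXCEPTION
         WHEN NO_DATA_FOUND THEN
            v_pk_column := NULL;                     -- table without a primary key
      END;
      -- use v_pk_column here
   END LOOP;
END;
/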
with such a small number of tables and constraints I'm very curious as to why the query takes so long.
without binds you will incur parsing time, but for a simple query as in this question it seems very long indeed.
however, I guess if other queries in your application(s) are similarly written without binds then it might make more sense as they will all be trampling over each other

trace the session running the query (use dbms_monitor with waits=>TRUE)  and send the trace file.  I'm curious what it's doing that takes so long.
Sean,
I do not have rights to trace on the data dictionary tables/views.
it's not tracing the dictionary, it's tracing your session.

if you can't run dbms_monitor yourself, ask your dba to do it for you.
actually, I'm surprised Oracle support didn't already ask for it, they may have asked for an alter session with 10046 event which does the same thing
Thanks Sean,
I will try.  Can you please tell me the exact order of the commands I should execute for the trace?
From the SQL*Plus prompt:

exec dbms_monitor.session_trace_enable

then run your query, then exit.

The procedure does take parameters, but the defaults will capture wait events for your session, and that's what we want.

Go to your user_dump_dest directory to get your trace file.
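Spelled out end to end (the waits parameter shown is the default anyway, and the tkprof step is just an optional way to format the raw trace):

-- from the SQL*Plus session that will run the query
exec dbms_monitor.session_trace_enable(waits => TRUE)

-- ... run the slow dictionary query here ...

exec dbms_monitor.session_trace_disable
exit

-- then pick up the trace file from user_dump_dest on the server
-- (in 11g: select value from v$diag_info where name = 'Default Trace File')
-- and optionally format it:
--   tkprof <tracefile>.trc trace_report.txt sys=no waits=yes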
Hi,
I resolved the issue by creating an array.
We have around 50 tables for which the PRIMARY KEY needs to be queried from user_constraints and user_cons_columns.
I populated the array with the primary keys of the required tables, and now the code gets the data from the array instead of querying the Oracle data dictionary.
This involved a bit of a code fix too.
Even though there are only 50 distinct tables, the loop was iterating more than 200 times, and each time the user_constraints query was being executed.
Now it gets the data from the array and the code is optimized (roughly along the lines of the sketch at the end of this thread).
I leave it to the moderators to award the points.
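A rough reconstruction of the array approach described above, since the asker's actual code is not shown; the type and variable names are assumptions:

DECLARE
   TYPE t_pk_map IS TABLE OF VARCHAR2(30) INDEX BY VARCHAR2(30);
   g_pk t_pk_map;
BEGIN
   -- Populate the array once with a single dictionary query
   FOR r IN (SELECT uc.table_name, ucc.column_name
               FROM user_constraints uc, user_cons_columns ucc
              WHERE uc.constraint_name = ucc.constraint_name
                AND uc.owner           = ucc.owner
                AND uc.constraint_type = 'P'
                AND ucc.position       = 1)
   LOOP
      g_pk(r.table_name) := r.column_name;
   END LOOP;

   -- Inside the 200+ iteration loop, look up the array instead of the dictionary:
   -- IF g_pk.EXISTS(p_table_name) THEN
   --    v_pk_column := g_pk(p_table_name);
   -- END IF;
END;
/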