
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 606

Validate Oracle table and view scripts

I have a number of databases that should all be identical, but I would like a better way to compare and validate the scripts that make up the DBs.

For instance, I have say 500 tables, and at the moment none of the scripts that create the tables have version numbers, so over time I don't know whether I have the correct version of the script installed.

I also have the same issue with views (again, say 500 views). As I have 10+ databases, I don't want to have to check each and every one manually, as that's 10,000 checks.

I thought the best way to tackle the issue would be to export just the tables and views and run an md5sum on each of them; if a checksum differs from the source, I know I have a difference.

How best to export just the scripts that make up the tables and views?

Example below:
500 tables
60 indexes
3,000 constraints
5 functions
10 procedures
5 sequences

Thanks for your comments (in advance)
1 Solution
Use dbms_metadata.get_ddl.

However, the generated scripts aren't guaranteed to match the scripts you already have, due to upper/lower case, whitespace formatting, and inclusion/exclusion of default values.

So your md5 sums won't match.

It could work going forward, though, if all scripts were generated from dbms_metadata calls.
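The case/whitespace mismatch above can be reduced by normalizing the DDL text before hashing. A minimal Python sketch of that idea (the normalization rules here are an assumption; adjust to your scripts' conventions):

```python
import hashlib
import re

def normalized_md5(ddl: str) -> str:
    """md5 over a case- and whitespace-normalized form of a DDL script."""
    text = ddl.lower()                      # ignore upper/lower case differences
    text = re.sub(r"\s+", " ", text).strip()  # collapse all runs of whitespace
    return hashlib.md5(text.encode("utf-8")).hexdigest()

# Two scripts that differ only in case and spacing now hash the same:
a = 'CREATE TABLE emp (\n  id NUMBER\n);'
b = 'create table emp ( id number );'
```

Note this still won't paper over semantic differences such as included/omitted default clauses; those would need a smarter normalizer.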

slightwv (䄆 Netminder) Commented:
To compare the databases themselves you can get something like Oracle's Change Management pack.

You can also write some SQL to compare the objects in question.  For example, tables can be compared by using dba_tab_columns.

How about instead of exporting the db and comparing it against your scripts, build a new 'baseline' database using the scripts then compare the resulting objects?
slightwv (䄆 Netminder) Commented:
I suppose you could use the above suggestion of dbms_metadata from the 'baseline' database. This should help with (though not eliminate) the spacing/case issue mentioned.

That might be quicker than writing your own SQL to check the objects.


hdazAuthor Commented:
I am not quite sure how the above would work. How would I get all 500 tables? And if a table were added that I did not know about, how would/could I get it? A wildcard would need to be added, like:

set heading off;
set echo off;
set pages 999;
set long 90000;
spool ddl_list.sql
select dbms_metadata.get_ddl('*') from dual;
spool off

taken from http://www.dba-oracle.com/oracle_tips_dbms_metadata.htm

And would it not just place every script in ddl_list.sql?
slightwv (䄆 Netminder) Commented:
Look at the SCHEMA_EXPORT option:


For specific object types you can also just select them from the core views, similar to the example below (pasted from: http://pandazen.wordpress.com/2008/02/09/using-the-dbms_metadata-package-to-extract-all-the-schema-ddl-with-one-call/).

Here is what I would do to compare tables:

1. Take some empty schema and create database links to all schemas you want to compare.

create table compare_tbls(
  dblink  varchar2(30),
  tblname varchar2(30)
);

then for each database link:

insert into compare_tbls(dblink,tblname)
select '<DBLINK>', table_name from user_tables@<DBLINK>;


Or, in case that does not work:

insert into compare_tbls(dblink, tblname)
select '<DBLINK>', table_name from all_tables@<DBLINK> where owner = '<DBLINK_USERNAME>';

Now you can query this to see which tables are missing:
select table_name, count(*) from compare_tbls group by table_name
having count(*)< NUMBER_OF_DATABASES;
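Once each link's table list has been collected, the same missing-object check can also be sketched outside the database. A small Python illustration with hypothetical sample data (the database names and tables are made up):

```python
# Table lists pulled from each database link (hypothetical sample data).
tables_by_db = {
    "DB1": {"EMP", "DEPT", "SALGRADE"},
    "DB2": {"EMP", "DEPT"},
    "DB3": {"EMP", "DEPT", "SALGRADE"},
}

# Union of every table seen anywhere, then report what each DB lacks.
all_tables = set().union(*tables_by_db.values())
missing = {
    db: sorted(all_tables - tbls)
    for db, tbls in tables_by_db.items()
    if all_tables - tbls
}
```

This is the same logic as the `group by ... having count(*) <` query, just expressed over in-memory sets.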


Similar with columns

create table compare_cols(
  dblink   varchar2(30),
  tblname  varchar2(30),
  colname  varchar2(30),
  datatype varchar2(30),
  datasize varchar2(30)
);

Use the view user_tab_columns (or all_tab_columns).
For datasize there are several relevant columns in that view; copy them all (I don't have Oracle in front of me at the moment, but I hope you get the picture).
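The column comparison can be sketched the same way. Assuming the metadata has been pulled into simple `{table: {column: datatype}}` dicts (a hypothetical shape, not part of the suggestion above), a diff looks like:

```python
def diff_columns(base: dict, other: dict) -> list:
    """Compare two {table: {column: datatype}} metadata dicts.

    Returns (table, column, reason) tuples for columns missing from
    `other` or differing in datatype.
    """
    diffs = []
    for tbl, cols in base.items():
        other_cols = other.get(tbl, {})
        for col, dtype in cols.items():
            if col not in other_cols:
                diffs.append((tbl, col, "missing"))
            elif other_cols[col] != dtype:
                diffs.append((tbl, col, f"type {dtype} vs {other_cols[col]}"))
    return diffs
```

A real run would populate the dicts from user_tab_columns over each database link; here the inputs are hand-built for illustration.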

slightwv (䄆 Netminder) Commented:
It's a bit of a security risk to link all databases with an account that can see the tables and code.

All an attacker needs to do is compromise your report server, and then they have access to all your databases.
hdazAuthor Commented:
Thanks for the comments; it's getting late here.

htonkov - thanks for the above. Is it comparing the data in the tables, or just the scripts that create the DB (which are held in the DB)? Obviously I will need to try your suggestion out to see if I can get it to work for me.

slightwv - thanks for highlighting the risk(s). Ideally I would remove all the links and the empty schema before going live with the deployment(s).

I also looked at and tried out the link(s) you posted; that created a 35 MB file with all the scripts in one spooled file. I wonder how hard it might be to spool each of the scripts out to its own <name>.sql?
slightwv (䄆 Netminder) Commented:
dbms_metadata.get_ddl has options for each specific object type. You can spool as many as you want.
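Alternatively, splitting the one big spool into per-object files could be scripted outside the database. A rough Python sketch, assuming each object's DDL in the spool begins with CREATE at the start of a line and names appear as "OWNER"."NAME" or bare identifiers (typical of dbms_metadata output, but verify against your file):

```python
import re
from pathlib import Path

# Extracts object type and name from the start of a CREATE statement.
NAME_RE = re.compile(
    r'CREATE\s+(?:OR\s+REPLACE\s+)?(?:FORCE\s+)?'
    r'(TABLE|VIEW|INDEX|SEQUENCE|FUNCTION|PROCEDURE)\s+'
    r'(?:"[^"]+"\.)?"?(\w+)"?',
    re.IGNORECASE,
)

def split_spool(spool_text: str, out_dir: str) -> list:
    """Write each CREATE statement to its own <NAME>.sql file; return the names."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    # Statement boundaries: every line that starts a CREATE.
    starts = [m.start() for m in re.finditer(r'(?mi)^\s*CREATE\s', spool_text)]
    names = []
    for i, s in enumerate(starts):
        end = starts[i + 1] if i + 1 < len(starts) else len(spool_text)
        chunk = spool_text[s:end].strip() + "\n"
        m = NAME_RE.search(chunk)
        name = m.group(2) if m else f"object_{i}"  # fall back if name not parsed
        (out / f"{name}.sql").write_text(chunk)
        names.append(name)
    return names
```

Objects whose names collide across types (a table and a sequence both called X) would overwrite each other here; prefixing the filename with the matched type is an easy refinement.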
