Mohammed Mohiuddin

asked on

Load Data into multiple tables automatically using procedure or dynamic SQL

Hi there! I have two schemas (A & B) with multiple tables (huge numbers of records), and I have to load data into the schema B tables by capturing table names from ALL_TABLES & ALL_TAB_COLUMNS of schema A. The table names are the same in the source and target schemas, but in the target some tables have extra nullable columns. Need help to INSERT the data automatically.

Oracle 11gR2 Release 4.
Source Schema Name: A
Target Schema Name: B
Tables: Same in both schemas; some target tables have extra nullable columns. Datatypes are NUMBER, VARCHAR2, CHAR, DATE & TIMESTAMP.

Need Help.
Geert G

just copy the whole schema with expdp/impdp using remap_schema
https://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#SUTIL927
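For example, a minimal sketch (untested; assumes the default DATA_PUMP_DIR directory object and illustrative file names):

expdp system schemas=A directory=DATA_PUMP_DIR dumpfile=schema_a.dmp logfile=exp_a.log
impdp system remap_schema=A:B directory=DATA_PUMP_DIR dumpfile=schema_a.dmp logfile=imp_b.log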

what do you mean by automatically?
Mohammed Mohiuddin

ASKER

Create an Oracle PL/SQL procedure / dynamic SQL to form the SQL statements, then schedule a cron job to insert data automatically on a regular basis.

Automatically meaning the script that executes the procedure/SQL statement can be scheduled from any job scheduler or crontab.
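For example, once the procedure exists it could be scheduled from inside the database with DBMS_SCHEDULER (a sketch; the job and procedure names are illustrative):

BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name        => 'SYNC_A_TO_B',          -- illustrative name
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'SYNC',                 -- the procedure to run
    repeat_interval => 'FREQ=DAILY; BYHOUR=2', -- every night at 02:00
    enabled         => TRUE);
END;
/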
Is it possible to use expdp/impdp on tables having non-identical columns between the source & target schemas?
>> having non-identical columns between the source & target schemas
no, not directly

but you can export multiple times
1. export all tables with the same structure, excluding the ones that differ (see the sketch below)
2. export via SQL statements per table with a different structure
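For step 1, expdp can exclude specific tables; a sketch using a parameter file to avoid shell quoting issues (table names are placeholders):

# expdp system parfile=exp_same.par
schemas=A
directory=DATA_PUMP_DIR
dumpfile=same_structure.dmp
exclude=TABLE:"IN ('T1','T2')"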

why would you want to copy the data within the same database to another schema ?

i hope you haven't put QAS and PRD in the same database ?
These are two different environments in the database. One is live data and the other is the active load environment.

As an expert Oracle Developer/DBA, I assume it would be easy for you to write code using a PL/SQL block: open/fetch/close cursors on ALL_TABLES & ALL_TAB_COLUMNS (to read table & column names) and form a dynamic INSERT query to insert the data.
>>Huge no. of records

I doubt you want to write code to export ALL the data then remove/load EVERYTHING in the target database.

As you have described things, this is a full unload/load every time.

You don't need PL/SQL to form "insert" statements or CSV files.  Simple SQL and a sqlplus script can do this.

The problem you have is the amount of data you claim to have. To some people, large amounts of data means multiple terabytes or petabytes. You don't want to create a 1 TB CSV file and attempt to load it.

If you have the space on the destination system, I would look at using transportable tablespaces to make an exact copy then write code on the destination system to do a local ETL style process.
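The rough shape of a transportable tablespace copy (untested; tablespace and file names are illustrative, and the full procedure is in the Data Pump docs):

-- on the source: make the tablespace read only, then export its metadata
ALTER TABLESPACE users_a READ ONLY;
-- expdp system transport_tablespaces=USERS_A directory=DATA_PUMP_DIR dumpfile=tts.dmp

-- copy the datafiles to the destination, then import the metadata there
-- impdp system transport_datafiles='/u01/oradata/users_a01.dbf' directory=DATA_PUMP_DIR dumpfile=tts.dmp

-- back on the source
ALTER TABLESPACE users_a READ WRITE;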
Even if we can generate a statement like

Insert into B.b1 (col1,col2,......col10) values col1,col2,......col10)  select col1, ,col2,......col10 from A.b1;

Datatypes for above columns are number, varchar2, char, date and timestamp

Please comment.
We have around 100k records in the major tables, and this might grow. Also, 2-3 tables have around 1 million records each.

To avoid a bottleneck we can use parallel threads.
How are you planning on parallel threads if all you have is a file with 100,000 insert statements?

>>Even if we can generate a statement like

With an insert statement, you either use VALUES or SELECT, not both.

Personally I would forget about trying to do this with ALL_TAB_COLS and just write a SQL script to generate the inserts.

This is untested but gives you the basic idea (col1 is a varchar2 and wrapped in single quotes and col2 is a number):
set pages 0
set lines 10000
set feedback off
spool myinserts.sql
select 'Insert into b1 (col1,col2) values (''' || col1 || ''',' || col2 || ');' from A.b1;
spool off

You'll need TO_DATE and TO_TIMESTAMP for the date and timestamp fields.
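For example, a DATE column could be wrapped like this (untested; the column name dcol is illustrative):

select 'Insert into b1 (dcol) values (to_date(''' ||
       to_char(dcol, 'YYYY-MM-DD HH24:MI:SS') ||
       ''',''YYYY-MM-DD HH24:MI:SS''));'
from A.b1;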

Then use sqlplus on the destination to execute the script.
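Something like this (the connect string is a placeholder):

sqlplus b_user/password@dest_db @myinserts.sql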
If the target schema and the source schema have exactly the same structure, this is not a problem, as the experts already said. You want to synchronize two schemas, apart from some special cases. NULL constraints, or any other constraints, don't change that picture much.

You may simply write insert statements created manually (maybe with the help of some tricky queries):
insert into schema1.a  select * from schema2.a where conditions;
insert into schema1.b select * from schema2.b where conditions;
That way you can also use alternative methods to load huge tables (hints, bulk load, pipelined functions...).
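For example, a direct-path load using the APPEND hint (a sketch; table names follow the statements above):

INSERT /*+ APPEND */ INTO schema1.a
SELECT * FROM schema2.a WHERE conditions;
COMMIT;  -- a direct-path insert must be committed before the table is queried again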


If you want a fully automated method, you need some procedures and functions. If you have huge tables, use alternative load methods.

You need something like this

CREATE OR REPLACE PROCEDURE sync IS
   CURSOR table_name_cursor IS
      SELECT table_name
        FROM all_tables
       WHERE owner = 'SCHEMA2';

   t_name      all_tables.table_name%TYPE;
   l_condition VARCHAR2(1000);
   dynsql      VARCHAR2(4000);
BEGIN
   OPEN table_name_cursor;

   LOOP
      FETCH table_name_cursor INTO t_name;
      EXIT WHEN table_name_cursor%NOTFOUND;

      -- fn_get_conditions builds the WHERE clause from the check constraints
      l_condition := fn_get_conditions(t_name);
      dynsql := 'INSERT INTO SCHEMA1.' || t_name
             || ' SELECT * FROM SCHEMA2.' || t_name
             || ' WHERE ' || l_condition;
      -- bulk collect, pipelined query, hint etc. may be needed

      EXECUTE IMMEDIATE dynsql;
      COMMIT;

      DBMS_OUTPUT.put_line ('SUCCESS: ' || t_name);
   END LOOP;

   CLOSE table_name_cursor;
END;


The WHERE conditions come from ALL_CONSTRAINTS, and unfortunately working with the LONG data type is a bit problematic, so it's better to use an external function to create the WHERE condition.
The WHERE condition will look like:
COLX IS NOT NULL AND COLY IS NOT NULL AND ...

To get the NULL conditions, use this SQL:

SELECT *
  FROM all_constraints
 WHERE owner = '..'
   AND table_name = '..'
   AND constraint_type = 'C'
   AND search_condition IS NOT NULL;



Tricky way: to create the search condition string, use LISTAGG() WITHIN GROUP() on the SEARCH_CONDITION column (you first need to convert it, since SEARCH_CONDITION is a LONG).
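A sketch of such a helper (untested; fn_get_conditions was only named above, not defined — this version fetches the LONG SEARCH_CONDITION into a VARCHAR2 inside PL/SQL, which works for conditions up to 32K):

CREATE OR REPLACE FUNCTION fn_get_conditions (p_table_name IN VARCHAR2)
   RETURN VARCHAR2
IS
   l_result VARCHAR2(32767);
BEGIN
   FOR c IN (SELECT search_condition
               FROM all_constraints
              WHERE owner = 'SCHEMA2'            -- illustrative owner
                AND table_name = p_table_name
                AND constraint_type = 'C'
                AND search_condition IS NOT NULL)
   LOOP
      -- implicit LONG -> VARCHAR2 conversion happens on this assignment
      l_result := l_result
               || CASE WHEN l_result IS NOT NULL THEN ' AND ' END
               || c.search_condition;
   END LOOP;

   RETURN NVL(l_result, '1=1');                  -- no conditions: keep all rows
END;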