Solved

Oracle INSERT WHEN from a collection

Posted on 2014-07-26
608 Views
Last Modified: 2014-08-03
Hi,

I'm stuck.  I am trying to grab data from a remote site, load it into a collection object, then unload the collection into target tables based on predefined conditions.

I thought I would use a schema-level collection of objects instead of a PL/SQL associative array because of the table transformation (the target tables have fewer columns than the source). I would then bulk collect from an EXECUTE IMMEDIATE / OPEN-FOR - the statement has to be dynamic SQL because this procedure has to work for multiple source DBs, i.e. it takes a db link as a parameter.

The basic idea is to grab data from the remote site and load it into multiple tables in a staging db.  After the load is complete, clean-up can begin at the remote site (either updating a flag column to mark records as processed, or deleting records and reclaiming disk space).  The tables I load split the data by location, with an additional condition (not included here, but basically "when column is not null") that loads all data into a history table.  Basically a big move of everything while also fanning it out.

I am having trouble with the INSERT WHEN because it requires a SELECT statement at the end, and I am not sure how to proceed or whether this is even the right way to accomplish my task.
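
The shape I think I am after is roughly this, sitting inside the PL/SQL block in place of the INSERT WHEN I have now (untested sketch; it assumes SQL can read the schema-level collection through TABLE()):

INSERT
WHEN TRIM (LOWER (event_datasource)) = 'us'
THEN
     INTO t1_eq_us (eq_id, event_datasource, event_source, eq_event_id
                   ,eq_time, longitude, latitude, depth, mag_value)
   VALUES (eq_id, event_datasource, event_source, eq_event_id
          ,eq_time, longitude, latitude, depth, mag_value)
ELSE
     INTO t2_eq_non_us (eq_id, event_datasource, event_source, eq_event_id
                       ,eq_time, longitude, latitude, depth, mag_value)
   VALUES (eq_id, event_datasource, event_source, eq_event_id
          ,eq_time, longitude, latitude, depth, mag_value)
   SELECT eq_id, event_datasource, event_source, eq_event_id
         ,eq_time, longitude, latitude, depth, mag_value
     FROM TABLE (xOutRec);  -- the bulk-collected eq_event_ty collection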

Any expert help here would be greatly appreciated.
My DB version is Oracle 11gR2
Here is the code I have so far:
(I am using the dbms_errlog error tables for each target table while developing)

drop table t1_eq_us;
drop table t2_eq_non_us;
drop table t3_eq_event_hist;
drop type eq_event_ty;
drop type eq_event_o;


CREATE TABLE t1_eq_us
(
   EQ_ID              NUMBER (15)
  ,EVENT_DATASOURCE   VARCHAR2 (25 CHAR)
  ,EVENT_SOURCE       VARCHAR2 (25 CHAR)
  ,EQ_EVENT_ID        VARCHAR2 (25 CHAR)
  ,EQ_TIME            VARCHAR2 (25 CHAR)
  ,LONGITUDE          VARCHAR2 (25 CHAR)
  ,LATITUDE           VARCHAR2 (25 CHAR)
  ,DEPTH              VARCHAR2 (25 CHAR)
  ,MAG_VALUE          VARCHAR2 (25 CHAR)
)
TABLESPACE quake_tbs;

CREATE TABLE t2_eq_non_us
(
   EQ_ID              NUMBER (15)
  ,EVENT_DATASOURCE   VARCHAR2 (25 CHAR)
  ,EVENT_SOURCE       VARCHAR2 (25 CHAR)
  ,EQ_EVENT_ID        VARCHAR2 (25 CHAR)
  ,EQ_TIME            VARCHAR2 (25 CHAR)
  ,LONGITUDE          VARCHAR2 (25 CHAR)
  ,LATITUDE           VARCHAR2 (25 CHAR)
  ,DEPTH              VARCHAR2 (25 CHAR)
  ,MAG_VALUE          VARCHAR2 (25 CHAR)
)
TABLESPACE quake_tbs;

CREATE TABLE t3_eq_event_hist
(
   EQ_ID              NUMBER (15)
  ,EVENT_DATASOURCE   VARCHAR2 (25 CHAR)
  ,EVENT_SOURCE       VARCHAR2 (25 CHAR)
  ,EQ_EVENT_ID        VARCHAR2 (25 CHAR)
  ,EQ_TIME            VARCHAR2 (25 CHAR)
  ,LONGITUDE          VARCHAR2 (25 CHAR)
  ,LATITUDE           VARCHAR2 (25 CHAR)
  ,DEPTH              VARCHAR2 (25 CHAR)
  ,MAG_VALUE          VARCHAR2 (25 CHAR)
  ,insert_time        TIMESTAMP (3) DEFAULT SYSTIMESTAMP
  ,insert_osuser      VARCHAR2 (25 CHAR)
                         DEFAULT SYS_CONTEXT ('userenv'
                                             ,'os_user')
)
TABLESPACE quake_tbs;

CREATE TYPE eq_event_o AS OBJECT
(
   EQ_ID NUMBER (15)
  ,EVENT_DATASOURCE VARCHAR2 (25 CHAR)
  ,EVENT_SOURCE VARCHAR2 (25 CHAR)
  ,EQ_EVENT_ID VARCHAR2 (25 CHAR)
  ,EQ_TIME VARCHAR2 (25 CHAR)
  ,LONGITUDE VARCHAR2 (25 CHAR)
  ,LATITUDE VARCHAR2 (25 CHAR)
  ,DEPTH VARCHAR2 (25 CHAR)
  ,MAG_VALUE VARCHAR2 (25 CHAR)
);
/

CREATE OR REPLACE TYPE eq_event_ty AS TABLE OF eq_event_o;
/

DECLARE
   aDbLink   VARCHAR2 (25 CHAR) := 'VM12C';
   xStmt     VARCHAR2 (500 CHAR) := 'SELECT eq_id
      ,event_datasource
      ,event_source
      ,eq_event_id
      ,eq_time
      ,longitude
      ,latitude
      ,DEPTH
      ,mag_value
  FROM eq_event_t'--||'@'||'#link#'
  ;
   src_tab   SYS_REFCURSOR;
   xOutRec   eq_event_ty
                := eq_event_ty (NULL
                               ,NULL
                               ,NULL
                               ,NULL
                               ,NULL
                               ,NULL
                               ,NULL
                               ,NULL);

   TYPE t3 IS TABLE OF t3_eq_event_hist%ROWTYPE;

   CURSOR c1
   IS
      SELECT *
        FROM t3_eq_event_hist;
BEGIN
   BEGIN
      OPEN src_tab FOR xStmt; --REPLACE (xStmt,'#link',aDbLink);

      FETCH src_tab BULK COLLECT INTO xOutRec LIMIT 100;

      CLOSE src_tab;

      DBMS_OUTPUT.put_line ('Dynamic Execute: ' || xOutRec.COUNT);
   EXCEPTION
      WHEN OTHERS
      THEN
         DBMS_OUTPUT.put_line (DBMS_UTILITY.format_error_backtrace);
         DBMS_OUTPUT.put_line (SQLERRM);
   END;

   BEGIN
      FORALL i IN xOutRec.FIRST .. xOutRec.LAST
         --to do: logging - what is happening, timing, source, target, load start etc
         INSERT INTO t3_eq_event_hist (eq_id
                                      ,event_datasource
                                      ,event_source
                                      ,eq_event_id
                                      ,eq_time
                                      ,longitude
                                      ,latitude
                                      ,DEPTH
                                      ,mag_value)
         VALUES (xOutRec (i).eq_id
                ,xOutRec (i).event_datasource
                ,xOutRec (i).event_source
                ,xOutRec (i).eq_event_id
                ,xOutRec (i).eq_time
                ,xOutRec (i).longitude
                ,xOutRec (i).latitude
                ,xOutRec (i).DEPTH
                ,xOutRec (i).mag_value);

      DBMS_OUTPUT.put_line ('History Count: ' || SQL%ROWCOUNT);
      COMMIT;
   EXCEPTION
      WHEN OTHERS
      THEN
         DBMS_OUTPUT.put_line (DBMS_UTILITY.format_error_backtrace);
         DBMS_OUTPUT.put_line (SQLERRM);
   END;

   INSERT
   WHEN TRIM (LOWER (event_datasource)) = 'us'
   THEN
        INTO t1_eq_us (eq_id
                      ,event_datasource
                      ,event_source
                      ,eq_event_id
                      ,eq_time
                      ,longitude
                      ,latitude
                      ,DEPTH
                      ,mag_value)
      VALUES (eq_id
             ,event_datasource
             ,event_source
             ,eq_event_id
             ,eq_time
             ,longitude
             ,latitude
             ,DEPTH
             ,mag_value)
   ELSE
        INTO t2_eq_non_us (eq_id
                          ,event_datasource
                          ,event_source
                          ,eq_event_id
                          ,eq_time
                          ,longitude
                          ,latitude
                          ,DEPTH
                          ,mag_value)
      VALUES (eq_id
             ,event_datasource
             ,event_source
             ,eq_event_id
             ,eq_time
             ,longitude
             ,latitude
             ,DEPTH
             ,mag_value)
      SELECT eq_id
            ,event_datasource
            ,event_source
            ,eq_event_id
            ,eq_time
            ,longitude
            ,latitude
            ,DEPTH
            ,mag_value
        FROM t3_eq_event_hist;

   DBMS_OUTPUT.put_line ('Target Total Count: ' || SQL%ROWCOUNT);
   COMMIT;
EXCEPTION
   WHEN OTHERS
   THEN
      DBMS_OUTPUT.put_line (DBMS_UTILITY.format_error_backtrace);
END;



I am just trying to avoid dynamic SQL where it's possible.  Also, I thought of creating the types within the PL/SQL block itself instead of having them hard-coded at schema level, but I wasn't sure whether there would be memory issues (the tables are large enough to notice the loss in network performance, and I have to balance how big a chunk I grab at a time against how fast it needs to be available at the target).
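
On the chunk-size point, I assume the usual pattern is a LIMIT loop rather than a single fetch (this reuses the declarations from the block above; 500 is just a placeholder chunk size):

LOOP
   FETCH src_tab BULK COLLECT INTO xOutRec LIMIT 500;
   EXIT WHEN xOutRec.COUNT = 0;

   FORALL i IN 1 .. xOutRec.COUNT
      INSERT INTO t3_eq_event_hist (eq_id, event_datasource, event_source, eq_event_id
                                   ,eq_time, longitude, latitude, depth, mag_value)
      VALUES (xOutRec (i).eq_id, xOutRec (i).event_datasource, xOutRec (i).event_source
             ,xOutRec (i).eq_event_id, xOutRec (i).eq_time, xOutRec (i).longitude
             ,xOutRec (i).latitude, xOutRec (i).depth, xOutRec (i).mag_value);

   COMMIT;  -- one commit per chunk, so a dropped connection only loses the current chunk
END LOOP;

CLOSE src_tab;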
Question by:g_currier
8 Comments
 
LVL 76

Assisted Solution

by:slightwv (Netminder) earned 200 total points
Personally, I don't like in-memory tables for things like this.  In-memory tables are good for small data sets, or for data sets where you need to make multiple passes through the data or need direct lookups.

On a quick scan of what you posted, I would just create a cursor for loop and insert each row into the necessary table as it is fetched.
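
Something along these lines (only a sketch; it hard-codes the link name, so you would still need dynamic SQL or a synonym if the link really has to be a parameter):

BEGIN
   FOR r IN (SELECT eq_id, event_datasource, event_source, eq_event_id
                   ,eq_time, longitude, latitude, depth, mag_value
               FROM eq_event_t@VM12C)
   LOOP
      -- history copy of every row
      INSERT INTO t3_eq_event_hist (eq_id, event_datasource, event_source, eq_event_id
                                   ,eq_time, longitude, latitude, depth, mag_value)
      VALUES (r.eq_id, r.event_datasource, r.event_source, r.eq_event_id
             ,r.eq_time, r.longitude, r.latitude, r.depth, r.mag_value);

      -- fan out by location
      IF TRIM (LOWER (r.event_datasource)) = 'us'
      THEN
         INSERT INTO t1_eq_us (eq_id, event_datasource, event_source, eq_event_id
                              ,eq_time, longitude, latitude, depth, mag_value)
         VALUES (r.eq_id, r.event_datasource, r.event_source, r.eq_event_id
                ,r.eq_time, r.longitude, r.latitude, r.depth, r.mag_value);
      ELSE
         INSERT INTO t2_eq_non_us (eq_id, event_datasource, event_source, eq_event_id
                                  ,eq_time, longitude, latitude, depth, mag_value)
         VALUES (r.eq_id, r.event_datasource, r.event_source, r.eq_event_id
                ,r.eq_time, r.longitude, r.latitude, r.depth, r.mag_value);
      END IF;
   END LOOP;

   COMMIT;
END;
/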

>>or deleting records and reclaiming disk space

Just deleting records does not really reclaim disk space.  Once space has been allocated to a table, it remains allocated.  Deleting records may allow new rows to reuse the existing allocated space, but it does not give that space back.
 
LVL 29

Expert Comment

by:MikeOM_DBA
I agree with slightwv.
Here are some modifications to your code that may work:
DECLARE
   Adblink        VARCHAR2 ( 25 CHAR ) := 'VM12C';
   Xstmt          VARCHAR2 ( 500 CHAR ) := '
        SELECT Eq_Id
             , Event_Datasource
             , Event_Source
             , Eq_Event_Id
             , Eq_Time
             , Longitude
             , Latitude
             , DEPTH
             , Mag_Value
          FROM Eq_Event_T'--||'@'||'#link#'
;
   Src_Tab        SYS_REFCURSOR;
   Xoutrec        Eq_Event_Ty
      := Eq_Event_Ty ( NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL );

   TYPE T3 IS TABLE OF T3_Eq_Event_Hist%ROWTYPE;

   CURSOR C1
   IS
      SELECT * FROM T3_Eq_Event_Hist;
BEGIN
   BEGIN
      OPEN Src_Tab FOR Xstmt;               --REPLACE (xStmt,'#link',aDbLink);

      FETCH Src_Tab BULK COLLECT INTO Xoutrec LIMIT 100;

      CLOSE Src_Tab;

      DBMS_OUTPUT.Put_Line ( 'Dynamic Execute: ' || Xoutrec.COUNT );
   EXCEPTION
      WHEN OTHERS
      THEN
         DBMS_OUTPUT.Put_Line ( DBMS_UTILITY.Format_Error_Backtrace );
         DBMS_OUTPUT.Put_Line ( SQLERRM );
   END;

   BEGIN
      FORALL I IN Xoutrec.FIRST .. Xoutrec.LAST
         --to do: logging - what is happening, timing, source, target, load start etc
         INSERT INTO T3_Eq_Event_Hist ( Eq_Id
                                      , Event_Datasource
                                      , Event_Source
                                      , Eq_Event_Id
                                      , Eq_Time
                                      , Longitude
                                      , Latitude
                                      , DEPTH
                                      , Mag_Value )
              VALUES ( Xoutrec ( I ).Eq_Id
                     , Xoutrec ( I ).Event_Datasource
                     , Xoutrec ( I ).Event_Source
                     , Xoutrec ( I ).Eq_Event_Id
                     , Xoutrec ( I ).Eq_Time
                     , Xoutrec ( I ).Longitude
                     , Xoutrec ( I ).Latitude
                     , Xoutrec ( I ).DEPTH
                     , Xoutrec ( I ).Mag_Value );

      DBMS_OUTPUT.Put_Line ( 'History Count: ' || SQL%ROWCOUNT );
      COMMIT;
   EXCEPTION
      WHEN OTHERS
      THEN
         DBMS_OUTPUT.Put_Line ( DBMS_UTILITY.Format_Error_Backtrace );
         DBMS_OUTPUT.Put_Line ( SQLERRM );
   END;

   FORALL I IN Xoutrec.FIRST .. Xoutrec.LAST
      INSERT INTO T1_Eq_Us ( Eq_Id
                           , Event_Datasource
                           , Event_Source
                           , Eq_Event_Id
                           , Eq_Time
                           , Longitude
                           , Latitude
                           , DEPTH
                           , Mag_Value )
         SELECT Xoutrec ( I ).Eq_Id
              , Xoutrec ( I ).Event_Datasource
              , Xoutrec ( I ).Event_Source
              , Xoutrec ( I ).Eq_Event_Id
              , Xoutrec ( I ).Eq_Time
              , Xoutrec ( I ).Longitude
              , Xoutrec ( I ).Latitude
              , Xoutrec ( I ).DEPTH
              , Xoutrec ( I ).Mag_Value
           FROM DUAL
          WHERE TRIM ( LOWER ( Xoutrec ( I ).Event_Datasource ) ) = 'us';

   FORALL I IN Xoutrec.FIRST .. Xoutrec.LAST
      INSERT INTO T2_Eq_Non_Us ( Eq_Id
                               , Event_Datasource
                               , Event_Source
                               , Eq_Event_Id
                               , Eq_Time
                               , Longitude
                               , Latitude
                               , DEPTH
                               , Mag_Value )
         SELECT Xoutrec ( I ).Eq_Id
              , Xoutrec ( I ).Event_Datasource
              , Xoutrec ( I ).Event_Source
              , Xoutrec ( I ).Eq_Event_Id
              , Xoutrec ( I ).Eq_Time
              , Xoutrec ( I ).Longitude
              , Xoutrec ( I ).Latitude
              , Xoutrec ( I ).DEPTH
              , Xoutrec ( I ).Mag_Value
           FROM DUAL
          WHERE TRIM ( LOWER ( Xoutrec ( I ).Event_Datasource ) ) != 'us';

   /*
      And what is this???
      vvvvvvvvvvvvvvvvvvv

         SELECT Eq_Id
              , Event_Datasource
              , Event_Source
              , Eq_Event_Id
              , Eq_Time
              , Longitude
              , Latitude
              , DEPTH
              , Mag_Value
           FROM T3_Eq_Event_Hist;
   */
   DBMS_OUTPUT.Put_Line ( 'Target Total Count: ' || SQL%ROWCOUNT );
   COMMIT;
EXCEPTION
   WHEN OTHERS
   THEN
      DBMS_OUTPUT.Put_Line ( DBMS_UTILITY.Format_Error_Backtrace );
END;
/


:p
 

Author Comment

by:g_currier
Before I give this a try (it looks like it may work), is there no way to do this over a dblink?  The commented portions are my attempt at that.

As for the "And what is this?" part: that's the closing SELECT of the INSERT WHEN statement (the data has to come from somewhere to be inserted, and it follows the syntax straight from the documentation - and it didn't work).  Using the code I posted, all I got was a lot of rows of NULLs when using the DBLINK... which I really need.  I am not a fan of dynamic SQL, but can it really be the only way to retrieve rows from a remote DB?

As for reclaiming space... there's more to it, of course.  It might not be the best way, but shrinking the datafiles after deleting the records is the only way I know to "reclaim space", and it still allows resizing while there are records left in the table.  Otherwise I would simply truncate, drop storage and move on...
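
(For reference, the reclaim steps I had in mind are roughly these; the datafile name and size are placeholders, and SHRINK SPACE assumes an ASSM tablespace:)

ALTER TABLE eq_event_t ENABLE ROW MOVEMENT;
ALTER TABLE eq_event_t SHRINK SPACE;  -- compacts the segment and lowers the high-water mark
-- and only if the freed space happens to sit at the tail end of the file:
-- ALTER DATABASE DATAFILE '/u01/oradata/quake/quake01.dbf' RESIZE 500M;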

Anyway, I hope that either of you can provide some insight on selecting from remote tables...
I will give the code a try and post my results in a bit.

Thanks.
 
LVL 76

Expert Comment

by:slightwv (Netminder)
Before I would go with a PL/SQL table, I would look at creating a global temporary table if pulling directly from the link won't work for some reason.
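
Roughly like this (names are made up; the only dynamic piece left is the one INSERT that concatenates the link, and everything downstream of the GTT stays static SQL):

-- one-time DDL, same 9-column shape as the target tables
CREATE GLOBAL TEMPORARY TABLE gtt_eq_event
   ON COMMIT PRESERVE ROWS
   AS SELECT * FROM t1_eq_us WHERE 1 = 0;

-- inside the load procedure
DECLARE
   aDbLink   VARCHAR2 (25 CHAR) := 'VM12C';  -- the parameter in the real procedure
BEGIN
   EXECUTE IMMEDIATE
      'INSERT INTO gtt_eq_event
       SELECT eq_id, event_datasource, event_source, eq_event_id
             ,eq_time, longitude, latitude, depth, mag_value
         FROM eq_event_t@' || aDbLink;

   -- from here the history insert and the INSERT WHEN fan-out can be
   -- plain static SQL selecting FROM gtt_eq_event
   COMMIT;
END;
/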

Still not seeing the need for the custom type.

 
LVL 29

Accepted Solution

by:MikeOM_DBA earned 300 total points

OK, added the db link:
DECLARE
   Adblink        VARCHAR2 ( 25 CHAR ) := 'VM12C';
   Xstmt          VARCHAR2 ( 500 CHAR ) := '
    SELECT eq_id
          ,event_datasource
          ,event_source
          ,eq_event_id
          ,eq_time
          ,longitude
          ,latitude
          ,DEPTH
          ,mag_value
      FROM eq_event_t'  --||'@'||'#link#'
;
   Src_Tab        SYS_REFCURSOR;
   Xoutrec        Eq_Event_Ty
      := Eq_Event_Ty ( NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL );

   TYPE T3 IS TABLE OF T3_Eq_Event_Hist%ROWTYPE;

   CURSOR C1
   IS
      SELECT * FROM T3_Eq_Event_Hist;
BEGIN
   BEGIN
      OPEN Src_Tab FOR Xstmt || '@' || Adblink;

      FETCH Src_Tab BULK COLLECT INTO Xoutrec LIMIT 100;

      CLOSE Src_Tab;

      DBMS_OUTPUT.Put_Line ( 'Dynamic Execute: ' || Xoutrec.COUNT );
   EXCEPTION
      WHEN OTHERS
      THEN
         DBMS_OUTPUT.Put_Line ( DBMS_UTILITY.Format_Error_Backtrace );
         DBMS_OUTPUT.Put_Line ( SQLERRM );
   END;

   BEGIN
      FORALL I IN Xoutrec.FIRST .. Xoutrec.LAST
         --to do: logging - what is happening, timing, source, target, load start etc
         INSERT INTO T3_Eq_Event_Hist ( Eq_Id
                                      , Event_Datasource
                                      , Event_Source
                                      , Eq_Event_Id
                                      , Eq_Time
                                      , Longitude
                                      , Latitude
                                      , DEPTH
                                      , Mag_Value )
              VALUES ( Xoutrec ( I ).Eq_Id
                     , Xoutrec ( I ).Event_Datasource
                     , Xoutrec ( I ).Event_Source
                     , Xoutrec ( I ).Eq_Event_Id
                     , Xoutrec ( I ).Eq_Time
                     , Xoutrec ( I ).Longitude
                     , Xoutrec ( I ).Latitude
                     , Xoutrec ( I ).DEPTH
                     , Xoutrec ( I ).Mag_Value );

      DBMS_OUTPUT.Put_Line ( 'History Count: ' || SQL%ROWCOUNT );
      COMMIT;
   EXCEPTION
      WHEN OTHERS
      THEN
         DBMS_OUTPUT.Put_Line ( DBMS_UTILITY.Format_Error_Backtrace );
         DBMS_OUTPUT.Put_Line ( SQLERRM );
   END;

   FORALL I IN Xoutrec.FIRST .. Xoutrec.LAST
      INSERT INTO T1_Eq_Us ( Eq_Id
                           , Event_Datasource
                           , Event_Source
                           , Eq_Event_Id
                           , Eq_Time
                           , Longitude
                           , Latitude
                           , DEPTH
                           , Mag_Value )
         SELECT Xoutrec ( I ).Eq_Id
              , Xoutrec ( I ).Event_Datasource
              , Xoutrec ( I ).Event_Source
              , Xoutrec ( I ).Eq_Event_Id
              , Xoutrec ( I ).Eq_Time
              , Xoutrec ( I ).Longitude
              , Xoutrec ( I ).Latitude
              , Xoutrec ( I ).DEPTH
              , Xoutrec ( I ).Mag_Value
           FROM DUAL
          WHERE TRIM ( LOWER ( Xoutrec ( I ).Event_Datasource ) ) = 'us';

   FORALL I IN Xoutrec.FIRST .. Xoutrec.LAST
      INSERT INTO T2_Eq_Non_Us ( Eq_Id
                               , Event_Datasource
                               , Event_Source
                               , Eq_Event_Id
                               , Eq_Time
                               , Longitude
                               , Latitude
                               , DEPTH
                               , Mag_Value )
         SELECT Xoutrec ( I ).Eq_Id
              , Xoutrec ( I ).Event_Datasource
              , Xoutrec ( I ).Event_Source
              , Xoutrec ( I ).Eq_Event_Id
              , Xoutrec ( I ).Eq_Time
              , Xoutrec ( I ).Longitude
              , Xoutrec ( I ).Latitude
              , Xoutrec ( I ).DEPTH
              , Xoutrec ( I ).Mag_Value
           FROM DUAL
          WHERE TRIM ( LOWER ( Xoutrec ( I ).Event_Datasource ) ) != 'us';

   DBMS_OUTPUT.Put_Line ( 'Target Total Count: ' || SQL%ROWCOUNT );
   COMMIT;
EXCEPTION
   WHEN OTHERS
   THEN
      DBMS_OUTPUT.Put_Line ( DBMS_UTILITY.Format_Error_Backtrace );
END;
/


 

Author Comment

by:g_currier
I have reviewed the code and it still poses problems... adding the dblink as a variable in the statement preparation causes the fetch to fail.

I actually do not like using dynamic SQL, though I do recognize its usefulness.  The current system I am working with uses hard-coded dynamic SQL with variable table names and dblinks to move data from one system to another on a schedule.  Data movement is very slow: 2-3 hours, connecting to up to 4 remote systems to move about 200 MB of record data from each, then processing (transforming) the data in the staging location and distributing it out based on certain key values.

Thanks for the attempted help, but it looks like I need to go back to school for this one.  I have awarded points to each of you based on effort.

Thank you.
 
LVL 76

Expert Comment

by:slightwv (Netminder)
2-3 hours to move 200 MB of data?  You have bigger issues...

I would look at replication, transportable tablespaces, materialized views, etc. to move the data.
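
A materialized view over the link would be the simplest of those (just a sketch; FAST refresh would need a materialized view log on the remote table, so COMPLETE is shown here):

CREATE MATERIALIZED VIEW mv_eq_event
   REFRESH COMPLETE ON DEMAND
   AS SELECT eq_id, event_datasource, event_source, eq_event_id
            ,eq_time, longitude, latitude, depth, mag_value
        FROM eq_event_t@VM12C;

-- refreshed on whatever schedule fits:
BEGIN
   DBMS_MVIEW.REFRESH (list => 'MV_EQ_EVENT', method => 'C');
END;
/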

Still not 100% sure of the reason to hard-code anything, but I don't fully understand your situation.
 

Author Comment

by:g_currier
Let's just say that the two DBs reside in separate datacenters separated by thousands of miles of land, a really, really big pond, and then a few thousand kilometers more.  On the miles side of this equation is a datacenter run by top-quality professionals, and on the other side is a datacenter run by newcomers.  Suffice it to say that network issues are not surprising.
Unfortunately, I cannot help this situation, so I am seeking the fastest way to move data while simultaneously providing consistency (for when the network takes a millisecond or two break and cuts the DB connection).

The code I gave is an approximation of the real data, of course...

I cannot do better without a solid and stable supporting infrastructure, so I am seeking alternatives to long-running queries (e.g. using NTILE to bucket the data, moving it bucket by bucket to a stage table and committing each transaction before proceeding to the next bucket, plus whatever other little tricks I or my coworker can think of)... I just can't seem to get around the dblink in the dynamic statement.
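
The NTILE bucketing I mentioned would look roughly like this (the bucket count and ordering key are guesses, and the obvious weakness is that every pass re-reads the remote table):

BEGIN
   FOR b IN 1 .. 10
   LOOP
      INSERT INTO t3_eq_event_hist (eq_id, event_datasource, event_source, eq_event_id
                                   ,eq_time, longitude, latitude, depth, mag_value)
      SELECT eq_id, event_datasource, event_source, eq_event_id
            ,eq_time, longitude, latitude, depth, mag_value
        FROM (SELECT e.*, NTILE (10) OVER (ORDER BY eq_id) AS bkt
                FROM eq_event_t@VM12C e)
       WHERE bkt = b;

      COMMIT;  -- each bucket is its own transaction, so a dropped link only loses one chunk
   END LOOP;
END;
/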

Thanks anyway, I appreciate your insight.
