
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 457

Conditional data transfer of a large table from one machine to another using Oracle

A large table with over two million records must be conditionally transferred from one Oracle user to another Oracle user on a different machine.  We need only the records created after a specified date, with a second field beginning with a particular string, and a third field greater than a particular number.  The date, string, and number are parameters, so they can vary and are not yet known.  It's a one-shot transfer, so this technique will be applied only once.

Please describe your solution in two cases:

a) You can create a DBLink;
b) You aren't allowed to create a DBLink.
Asked by: hc2342uhxx3vw36x96hq

1 Solution
 
hc2342uhxx3vw36x96hq (Author) commented:
UP
 
dbmullen commented:
Assuming the table doesn't have a LOB column, the fastest way to move the data is simply to insert it over a link.

What is the size of the table?
select bytes from dba_segments where owner = 'XXX' and segment_name = 'YYYYY';

Now, some will tell you to use export/import, Data Pump, a PL/SQL procedure with a cursor, or the SQL*Plus COPY command. All of those work fine, but for 2 million rows, one time, just insert the rows over the link.

With a DB link:
-- connect to the target database
CREATE DATABASE LINK source_link CONNECT TO source_user IDENTIFIED BY "pass" USING 'SOURCE';
insert into target_table select * from source_table@source_link where ......;
commit;
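Since the question's date, string, and number are run-time parameters, the elided WHERE clause above could be filled in with SQL*Plus substitution variables. This is only a sketch: the column names and sample values below are made up.

```sql
-- Sketch: hypothetical column names; the three filter values are
-- SQL*Plus substitution variables supplied at run time.
define cutoff_date = '2008-01-01'
define name_prefix = 'ABC'
define min_amount  = 1000

insert into target_table
select *
  from source_table@source_link
 where created_date >= to_date('&cutoff_date', 'YYYY-MM-DD')
   and name_field like '&name_prefix' || '%'
   and amount_field > &min_amount;
commit;
```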
 
Without creating a DB link (this assumes a link named SOURCE already exists, and that the username/password are the same on target and source):
-- connect to the target database
insert into owner.target_table select * from owner.source_table@source where ......;
commit;
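The "PL/SQL procedure with cursor" alternative mentioned above could be sketched as follows, fetching and inserting in batches so undo/rollback stays small. Table, link, and column names are placeholders.

```sql
-- Sketch of the cursor-based alternative, committing every 10,000 rows.
-- Useful only if a plain INSERT ... SELECT hits undo/rollback limits.
declare
  cursor c_src is
    select * from source_table@source_link
     where created_date >= to_date('&cutoff_date', 'YYYY-MM-DD');
  type t_rows is table of c_src%rowtype index by pls_integer;
  l_rows t_rows;
begin
  open c_src;
  loop
    fetch c_src bulk collect into l_rows limit 10000;
    exit when l_rows.count = 0;
    forall i in 1 .. l_rows.count
      insert into target_table values l_rows(i);
    commit;
  end loop;
  close c_src;
end;
/
```

FORALL with a bulk-collected array is usually much faster than a row-by-row cursor loop, though still slower than a single direct INSERT over the link.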


 
hc2342uhxx3vw36x96hq (Author) commented:
The size is about 190,000,000 bytes (I've rounded the value, of course).

The table contains about 2 million records; with the WHERE condition, that drops to about 1.75 million records.

With a simple INSERT statement you cannot do it.  Is it possible with a cursor?

declare cursor cxxx is select * from source_table@source_link where date_field >= to_date (.....)

Isn't there a faster way to copy?
 
dbmullen commented:
190 MB is pretty small...
"With a simple insert statement you cannot do it."
Why not? Is there a CLOB/BLOB/LONG column or something?
The fastest way to move the data is still to just INSERT it over a link. Otherwise, try the SQL*Plus COPY command. The only catch with COPY is that it is a single SQL*Plus command, so if you want to spread it over several lines for readability you must end each line with the continuation character "-".
 

connect username/password@target_db
set arraysize 100
set copycommit 50
set long 4000
truncate table target_table;
COPY FROM username/password@source_db -
       TO username/password@target_db -
       INSERT target_table -
       USING -
       SELECT * -
       FROM source_table -
       WHERE .... -
       ;

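For case b), where no link exists at all, the export/import route mentioned earlier can carry the filter in Data Pump's QUERY clause. A sketch of an export parameter file follows; the directory object, file names, column name, and date are placeholders.

```sql
# expdp parameter file sketch -- run "expdp user/pass parfile=exp_filtered.par"
# on the source machine, copy the dump file across, then load it with impdp
# on the target. Names and the WHERE condition below are placeholders.
TABLES=source_table
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=source_table_filtered.dmp
LOGFILE=source_table_filtered.log
QUERY=source_table:"WHERE created_date >= TO_DATE('2008-01-01','YYYY-MM-DD')"
```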

 
hc2342uhxx3vw36x96hq (Author) commented:
OK, thanks!
