
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 271

Oracle data migration from one environment to another


I have a set of tables whose data needs to be copied to another environment.
Here are the key points:

1. I do not want to copy the entire data set. The copy will be driven by a set of instructions entered on a screen, and the related master and detail tables (around 20 in number) should be migrated to the other environment.

2. We can safely assume it is a plain insert and there will be no constraint violations.

3. This process should run seamlessly, since it is repetitive (initiated from the screen, see point 1), via a stored procedure or something better. Experts, I need your suggestions here.

4. There will be no provision for DB links, because we want to demonstrate that there is no overhead beyond requiring the same table structure in both environments.

5. I don't mind if loading the data takes time, but it should ideally run in the background when the user initiates the action from the screen, perhaps reading config files to find the target DB, without invoking a command prompt or any process called from Java.

6. I do not want to use third-party tools.
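A screen-driven, repeatable copy like this usually hangs off a small driving table. Purely as an illustration of points 1 and 5 (every table and column name here is hypothetical, not from the poster's schema):

```sql
-- Hypothetical driving tables: each MIG_REQUEST row is one screen-initiated
-- migration request; the detail table lists which of the ~20 tables to copy
-- and the row filter to apply.
CREATE TABLE mig_request (
  request_id NUMBER       PRIMARY KEY,
  target_db  VARCHAR2(100),             -- resolved from a config source
  status     VARCHAR2(20) DEFAULT 'NEW',
  created_at DATE         DEFAULT SYSDATE
);

CREATE TABLE mig_request_table (
  request_id   NUMBER REFERENCES mig_request,
  table_name   VARCHAR2(30),
  where_clause VARCHAR2(4000)           -- drives the partial data selection
);
```

A background job (e.g. DBMS_SCHEDULER) could then poll for `status = 'NEW'` rows and run the copy, which keeps the screen responsive.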
2 Solutions
slightwv (䄆 Netminder) Commented:
With those restrictions, the only thing I can think of was mentioned in your tags for the question: Data Pump.

Assuming you have already considered it: have you discounted it, or are you just looking for possible alternatives?

This activity is called integration/interfacing one application with another.
The flow can be like this:

1) In the first environment's application, a group window needs to be available in the form/screen. That group lists those 20 tables, with an individual or group selection option. On clicking it, an export of those tables to .CSV should be performed based on the partial selection of data. In the code, Oracle SQL commands can be used to generate the .CSV output.

2) The .CSV files should be generated in a folder which is NFS-mounted (a network mount on Unix systems that avoids FTP; it has its own security/technical pros and cons, so consult your system admin). With this, there is no need to separately transfer the files to the second environment. Seamless.

3) In the second environment's application screen/form, a group window or individual table window should be available along with an import button. When clicked, it should use SQL*Loader or INSERT commands in the background to load the data. This should be handled in the code.

Note: Formatting of the .CSV files also needs to be handled for the import. They need to be aligned properly so that the import completes successfully.

Scope of work: a developer/SQL programmer, plus an application owner who understands the application and can help the developer complete this interfacing.
Clue: In Oracle ERP environments, this approach is popular as a concurrent request.
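The export step of this flow can be sketched in PL/SQL. This is a minimal illustration only: the ORDERS table, its columns, the batch filter, and the EXP_DIR directory object (which would point at the NFS-mounted folder) are all hypothetical.

```sql
-- Sketch: dump selected rows of a hypothetical ORDERS table to CSV.
-- EXP_DIR is an Oracle DIRECTORY object assumed to point at the NFS mount:
--   CREATE DIRECTORY exp_dir AS '/mnt/shared/migration';
CREATE OR REPLACE PROCEDURE export_orders_csv(p_batch_id IN NUMBER) IS
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  v_file := UTL_FILE.FOPEN('EXP_DIR', 'orders_' || p_batch_id || '.csv', 'w');
  FOR r IN (SELECT order_id, customer_id, order_date
              FROM orders
             WHERE batch_id = p_batch_id) LOOP
    UTL_FILE.PUT_LINE(v_file,
      r.order_id || ',' || r.customer_id || ',' ||
      TO_CHAR(r.order_date, 'YYYY-MM-DD'));
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
EXCEPTION
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(v_file) THEN
      UTL_FILE.FCLOSE(v_file);
    END IF;
    RAISE;
END;
/
```

On the import side, SQL*Loader or an external table would read the same file; delimiters, quoting, and date formats must match on both ends, which is exactly the formatting caveat in the note above.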

ajexpertAuthor Commented:
Can Data Pump be invoked from a PL/SQL procedure or Java?

slightwv (䄆 Netminder) Commented:

Oracle Data Pump is made up of the following distinct parts:

  • The command-line clients expdp and impdp
    These clients make calls to the DBMS_DATAPUMP package to perform Oracle Data Pump operations (see "PL/SQL Packages").
  • The DBMS_DATAPUMP PL/SQL package, also known as the Data Pump API
    This API provides high-speed import and export functionality.
  • The DBMS_METADATA PL/SQL package, also known as the Metadata API
    This API, which stores object definitions in XML, is used by all processes that load and unload metadata.
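Since the question is whether this can be driven from PL/SQL, here is a minimal sketch using the DBMS_DATAPUMP API directly. The job name, table names, directory object, and WHERE clause are all hypothetical; production code would also poll DBMS_DATAPUMP.GET_STATUS for errors rather than just waiting.

```sql
-- Sketch: table-mode export of two hypothetical tables, restricted by a
-- subquery, driven entirely from PL/SQL via the Data Pump API.
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'TABLE',
                          job_name  => 'MIG_EXPORT_JOB');
  DBMS_DATAPUMP.ADD_FILE(h, 'mig_export.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'mig_export.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Limit the job to the tables being migrated.
  DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR',
                                'IN (''ORDERS'', ''ORDER_LINES'')');
  -- Partial data: only rows matching the screen-driven selection.
  DBMS_DATAPUMP.DATA_FILTER(h, 'SUBQUERY',
                            'WHERE batch_id = 42', 'ORDERS');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);  -- blocks until the job completes
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

The dump file then has to reach the target (for example via the NFS mount suggested earlier), where a matching import job loads it, so no DB link is needed.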
ajexpertAuthor Commented:

I have to look into the documentation and the link you provided.  Thanks a lot.

Just a quick question as I have to submit the approach as soon as I can.

I want expdp and impdp to be FULLY controlled from PL/SQL program.  Is it possible?

Thanks a lot
slightwv (䄆 Netminder) Commented:
The docs will confirm this but I think datapump does everything inside the database no matter what initiates it.

The command line actually creates a database job internal to the database.

This means you can do things like close down the window and reconnect to an existing job from another window to check status.

I've never used the PL/SQL APIs but I believe you can do everything you need to do with them.
ajexpertAuthor Commented:
What matters to me is the invoking. If Data Pump can be driven from PL/SQL, I am more than happy.

I do not need to check the status, because the number of records is small (< 10,000), and I do not care about the time it takes to export or import.

I can run SQL statements against the target table in the other environment to see if it is ready for the next operation.

Again, everything should be in PL/SQL, without any calls that invoke a command-line utility.

slightwv (䄆 Netminder) Commented:
Sorry if I confused you with my last post.

I was trying to point out that datapump does everything inside the database no matter how you access it.

>> If datapump can be done in PL/SQL

It can: DBMS_DATAPUMP is the PL/SQL API that the expdp/impdp clients themselves call.
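The import side can also be driven from DBMS_DATAPUMP. Another sketch with hypothetical names; it assumes a dump file is already visible in the target database's DATA_PUMP_DIR (e.g. via an NFS mount).

```sql
-- Sketch: PL/SQL-driven import of a dump file on the target database.
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT',
                          job_mode  => 'TABLE',
                          job_name  => 'MIG_IMPORT_JOB');
  DBMS_DATAPUMP.ADD_FILE(h, 'mig_export.dmp', 'DATA_PUMP_DIR');
  -- Plain inserts into existing tables (point 2 of the question):
  DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'APPEND');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

With `TABLE_EXISTS_ACTION` set to `APPEND`, the existing target tables are kept and the rows are simply inserted, which matches the "plain insert, no constraint violations" assumption in the question.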
ajexpertAuthor Commented:
Well, we have shifted our focus to other important tasks, but I will go with slightwv's approach and try to implement it in PL/SQL.

Thanks, anand_20703, for your comment as well.
Question has a verified solution.
