Solved

Export Data Pump

Posted on 2013-06-18
568 Views
Last Modified: 2013-06-25
Hi,
TNS for Linux: Version 10.2.0.4.0
I have 2 Oracle instances (PROD and DEV), Enterprise Edition Release 10.2.0.4.0 - 64bit.
I need to export the full database from the PROD instance and create the DUMPFILE on the DEV machine.
I can create a database link from PROD to DEV and vice versa.

Can I create, from PROD, a new Oracle directory that points to a mount point on the DEV machine?


Thanks in advance!
Question by:ralph_rea
6 Comments
 
LVL 76

Expert Comment

by:slightwv (Netminder)
If the prod server can mount a disk on dev, then yes.

However, you should NEVER link dev and prod servers.  Not even a database link!  It opens up way too many issues and is a good way to trash production.

Since you can create database links, they are on the same network.  Just run exp from the dev server and point it at the production database through tnsnames.  Still risky, but less so than physically linking the servers.

Or just create the file on production and copy/ftp/??? it over.
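A minimal sketch of that suggestion: run classic exp on the DEV server against PROD through tnsnames. The connect string PROD, the credentials, and the paths below are placeholders, not values from this thread.

```shell
# Sketch only: classic exp is a CLIENT-side tool, so the dump file is
# written on the machine where exp runs (here, DEV), not on the PROD server.
# "PROD" must resolve via DEV's tnsnames.ora; credentials/paths are placeholders.
exp userid=system/manager@PROD full=y \
    file=/u01/dumps/prod_full.dmp log=/u01/dumps/prod_full.log
```

This is exactly why classic exp sidesteps the question of where the file lands: unlike Data Pump, it writes wherever the client runs.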
 

Author Comment

by:ralph_rea
I cannot create the dump file on production because I have very little space on the mount point.

So, if I understand, your solution is:

on DEV:
1) Create an Oracle directory
2) Modify tnsnames.ora to point to the production database
3) Run expdp:
expdp "sys/password@PROD as sysdba" directory=dumpdir dumpfile=prod_exp.dmp full=y

Correct?
I know that expdp is a server process and runs on the server machine. In this case, will the prod_exp.dmp file end up on the PROD or the DEV machine?
 
LVL 76

Assisted Solution

by:slightwv (Netminder)
slightwv (Netminder) earned 100 total points
I've never tried Data Pump exports remotely.  It looks like you can do it, but it does require a database link:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#SUTIL856

If I were the production DBA, I wouldn't let the development database create a link to production.  Way too many things could go wrong.

I know classic exp/imp will work for you.
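For reference, a network-mode Data Pump export along the lines of the linked doc might look like the sketch below, run on the DEV server. The link name PROD_LINK and directory object DUMPDIR are assumed placeholder names, not anything defined in this thread.

```shell
# Sketch, untested here: expdp connects to the DEV database and pulls data
# from PROD over a pre-existing DEV->PROD database link (NETWORK_LINK), so
# the dump file is written into DEV's directory object DUMPDIR.
# PROD_LINK, DUMPDIR, and the credentials are placeholders.
expdp system/password directory=DUMPDIR dumpfile=prod_full.dmp \
      logfile=prod_full.log network_link=PROD_LINK full=y
```

Because the Data Pump worker runs inside the database that expdp connects to (DEV here), this answers the earlier question: with NETWORK_LINK the file lands on DEV.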

 
LVL 23

Assisted Solution

by:David
David earned 50 total points
1.  If you are so tight on production storage, and cannot remove any log or other trash files, you may soon have an even bigger problem.

2.  Are you backing up production, either with RMAN or DataPump?  Consider restoring to DEV from your last backup set. If you are not backing up PROD, see above about the bigger problem.

3.  Moving production data into a development copy, where security is typically less enforced, is also unwise.  Bigger problem, yada yada.

One win-win approach may be to clone the PROD database template using DBCA, use the template on DEV to create the new database, then populate the DEV side with a set of generated baseline data.  Remember that you don't need all the rows from the PROD tables in order to test constraints.  Bring over your stats to simulate the load for the optimizer, and bring over the SQL profiles.
 
LVL 34

Assisted Solution

by:johnsone
johnsone earned 50 total points
To use exp/imp, you don't actually need to store the dump file at all.  You can do it easily with a named pipe on either side.  The basic commands would be:

mknod data.dmp p
exp userid=user/pass@prod file=data.dmp ...
imp userid=user/pass@dev file=data.dmp ...

You would need to run the exp and imp commands in separate OS sessions, but it definitely works, even at large scale (we once did a 1.5TB database this way).  You never need more than a few blocks of disk space to do it.
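Putting the three commands above together, a minimal sketch of the pipe run looks like this. Credentials and paths are placeholders; mkfifo is the portable equivalent of `mknod ... p`.

```shell
# Sketch: export and import through a named pipe, so the dump data streams
# from PROD to DEV without ever being stored on disk.
# Credentials, connect strings, and paths are placeholders.
mkfifo /tmp/data.dmp                 # same effect as: mknod /tmp/data.dmp p

# "Session 1" (backgrounded here): exp blocks until a reader opens the pipe
exp userid=user/pass@prod file=/tmp/data.dmp full=y &

# "Session 2": imp consumes the pipe while exp is still writing it
imp userid=user/pass@dev file=/tmp/data.dmp full=y ignore=y

wait                                 # let the export finish
rm /tmp/data.dmp
```

The pipe holds only a few kilobytes of buffered data at any moment, which is why the disk-space requirement stays at a few blocks regardless of database size.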
 

Accepted Solution

by:mkhandba
mkhandba earned 300 total points
You can do it directly ... a direct import into your DEV from PROD (not recommended), with no need for any export at all.

Create a db link at DEV to your PROD, and then you can do this directly.

UNIX cmd:
nohup impdp system/xxxx parfile=imp_name.par &

imp_name.par:
LOGFILE=imp_name.log
DIRECTORY=DP_DIR
NETWORK_LINK=PROD.WORLD
SCHEMAS=SCOTT
# REMAP_TABLESPACE is needed only if tablespaces differ between source and target
REMAP_TABLESPACE=a:b,c:d,e:f
# Excluding statistics speeds up the import; gather them after it finishes
EXCLUDE=STATISTICS
PARALLEL=12
METRICS=yes
TRACE=480300

Now you can make changes accordingly.  Here I'm doing a schema import, directly into the target from the source, in one step, using the DB link, and also doing a REMAP of tablespaces in case they are different.

HTH
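The DEV-side link that NETWORK_LINK refers to might be created as sketched below. The link name, TNS alias, and credentials are placeholders taken from the parfile above, and the link owner needs the CREATE DATABASE LINK privilege.

```shell
# Sketch: create the DEV->PROD database link named in the parfile's
# NETWORK_LINK parameter. All names and passwords are placeholders.
sqlplus -s system/xxxx <<'SQL'
CREATE DATABASE LINK PROD.WORLD
  CONNECT TO system IDENTIFIED BY xxxx
  USING 'PROD';
SQL
```

As noted earlier in the thread, granting DEV a link into PROD carries real risk, so many production DBAs would refuse this setup.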

