Export Data Pump

Hi,
TNS for Linux: Version 10.2.0.4.0
I have two Oracle instances (PROD and DEV), Enterprise Edition Release 10.2.0.4.0 - 64bit.
I need to export the full database from the PROD instance and create the DUMPFILE on the DEV machine.
I can create a database link from PROD to DEV and vice versa.

Can I create, from PROD, a new Oracle directory that points to a mount point on the DEV machine?


Thanks in advance!
ralph_rea asked:
Mushfique Khan, Director Operations, commented:
You can do it directly: import straight into your DEV from PROD (not recommended), with no export step needed at all.

Create a db link at DEV to your PROD, and then you can do this directly.

UNIX cmd:
nohup impdp system/xxxx parfile=imp_name.par &

imp_name.par:
LOGFILE=imp_name.log
DIRECTORY=DP_DIR
NETWORK_LINK=PROD.WORLD
SCHEMAS=SCOTT
REMAP_TABLESPACE=a:b,c:d,e:f   # if tablespaces differ between source and target
EXCLUDE=STATISTICS             # speeds up the import; gather statistics after
PARALLEL=12
METRICS=yes
TRACE=480300

Adjust as needed. Here I'm doing a schema import, directly into the target from the source in one step, using a DB link, and also remapping tablespaces in case they differ.
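A minimal sketch of the database link that NETWORK_LINK in the parfile relies on. This assumes PROD.WORLD resolves through the DEV server's tnsnames.ora; the user name and password are placeholders:

```sql
-- Run at DEV, as a user with the CREATE DATABASE LINK privilege.
-- The link name must match the NETWORK_LINK value in the parfile.
CREATE DATABASE LINK PROD.WORLD
  CONNECT TO system IDENTIFIED BY xxxx
  USING 'PROD.WORLD';

-- Quick sanity check that the link works before starting impdp:
SELECT * FROM dual@PROD.WORLD;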

HTH
 
slightwv (Netminder) commented:
If the prod server can mount a disk on dev then yes.

However, you should NEVER link dev and prod servers, not even with a database link! It opens up way too many issues and is a good way to trash production.

Since you can create database links, they are on the same network.  Just run exp from the dev server and point it to the production database through tnsnames.  Still risky but less so than physically linking the servers.

Just create the file on production and copy/ftp/??? it over.
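For the run-exp-from-dev route, a sketch of the DEV-side pieces. The host, port, and service names here are placeholders, not values from this thread:

```
# Hypothetical entry in $ORACLE_HOME/network/admin/tnsnames.ora on the DEV server
PROD =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = prod-host)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = PROD))
  )

# Classic exp run on DEV: it connects to PROD over SQL*Net
# but writes the dump file on the DEV machine's local disk.
exp userid=system/xxxx@PROD full=y file=/u01/dumps/prod_full.dmp log=prod_full.log
```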
 
ralph_rea (Author) commented:
I cannot create the dump file on production because I have very little space on the mount point.

So, if I understand it, your solution is:

on DEV:
1) Create an oracle directory
2) modify tnsnames.ora to point to the production database
3) run expdp:
expdp "sys/password@PROD as sysdba" directory=dumpdir dumpfile=prod_exp.dmp full=y

Correct?
I know that expdp is a server process and runs on the server machine. In this case, will the prod_exp.dmp file end up on the PROD or the DEV machine?
 
slightwv (Netminder) commented:
I've never tried Data Pump exports remotely. It looks like you can do it, but it does require a database link:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#SUTIL856

If I were the production DBA, I wouldn't let the development database create a link to production. Way too many things could go wrong.

I know classic exp/imp will work for you.
 
David, Senior Oracle Database Administrator, commented:
1.  If you are so tight on production storage, and cannot remove any log or other trash files, you may soon have an even bigger problem.

2.  Are you backing up production, either with RMAN or DataPump?  Consider restoring to DEV from your last backup set. If you are not backing up PROD, see above about the bigger problem.

3.  Moving production data into a less secure copy, where security is typically less enforced, is also unwise.  Bigger problem, yada yada.

One win-win approach may be to clone the PROD database template using DBCA, use the template on DEV to create the new database, then populate the DEV side with a set of generated baseline data.  Remember that you don't need to have all the rows in PROD tables in order to test constraints.  Bring over your stats to simulate the load for the Optimizer, and bring over the SQL profiles.
 
johnsone, Senior Oracle DBA, commented:
To use the exp/imp, you don't actually need to store the dump file.  You can do it easily with a named pipe on either side.  The basic commands would be this:

mknod data.dmp p
exp userid=user/pass@prod file=data.dmp ...
imp userid=user/pass@dev file=data.dmp ...

You would need to run the exp and imp commands in different OS sessions (the writer blocks until the reader opens the pipe), but it definitely works, even on a large scale (we did a 1.5 TB database once). You never need more than a few blocks of disk space to do it.
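The pipe mechanics can be demonstrated without Oracle at all. This sketch uses printf and cat standing in for exp and imp; the pattern is the same, with the writer backgrounded in place of a second OS session:

```shell
#!/bin/sh
# Create a named pipe (FIFO) - equivalent to: mknod data.dmp p
PIPE=/tmp/data_pipe_$$
mkfifo "$PIPE"

# Writer (stands in for exp): backgrounded, because opening a FIFO
# for writing blocks until a reader opens the other end.
printf 'dump stream contents\n' > "$PIPE" &

# Reader (stands in for imp): consumes the stream as it is written,
# so the full "dump file" never lands on disk.
RESULT=$(cat "$PIPE")

wait
rm -f "$PIPE"
echo "$RESULT"
```

The data only ever exists in the kernel's pipe buffer, which is why the disk footprint stays at a few blocks regardless of database size.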