
Solved

Export Data Pump

Posted on 2013-06-18
Medium Priority
607 Views
Last Modified: 2013-06-25
Hi,
TNS for Linux: Version 10.2.0.4.0
I have two Oracle instances (PROD and DEV), Enterprise Edition Release 10.2.0.4.0 - 64bit.
I need to export the full database from the PROD instance and create the DUMPFILE on the DEV machine.
I can create a database link from PROD to DEV and vice versa.

Can I create, from PROD, a new Oracle directory that points to a mount point on the DEV machine?


Thanks in advance!
Question by:ralph_rea
6 Comments
 
LVL 78

Expert Comment

by:slightwv (Netminder)
ID: 39256521
If the prod server can mount a disk on dev then yes.

However, you should NEVER link dev and prod servers.  Not even a database link!  It opens up way too many issues and is a good way to trash production.

Since you can create database links, the two servers are on the same network.  Just run exp from the dev server and point it at the production database through tnsnames.  Still risky, but less so than physically linking the servers.

Just create the file on production and copy/ftp/??? it over.
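A minimal sketch of the run-it-from-dev approach (the PROD TNS alias, credentials, and paths here are assumptions).  Since classic exp is a client tool, the dump file lands on whichever machine runs it:

```
# Run on the DEV machine: exp pulls the data from PROD over SQL*Net,
# but writes the dump file locally on DEV.
exp system/password@PROD full=y \
    file=/u01/dumps/prod_full.dmp \
    log=/u01/dumps/prod_full.log
```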
 

Author Comment

by:ralph_rea
ID: 39256646
I cannot create the dmp file on production because I have only a little space on the mount point.

So, if I understand it, your solution is:

on DEV:
1) Create an Oracle directory
2) Modify tnsnames.ora to point to the production database
3) Run expdp:
expdp "sys/password@PROD as sysdba" directory=dumpdir dumpfile=prod_exp.dmp full=y

Correct?
I know that expdp is a server process and runs on the server machine. In this case, will the prod_exp.dmp file be on the PROD or the DEV machine?
 
LVL 78

Assisted Solution

by:slightwv (Netminder)
slightwv (Netminder) earned 400 total points
ID: 39256709
I've never tried Data Pump exports remotely.  It looks like you can do it, but it does require a database link:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#SUTIL856

If I were the production DBA, I wouldn't let the development database create a link to production.  Way too many things could go wrong.

I know classic exp/imp will work for you.
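For reference, a network-mode Data Pump export per that doc would be run on the DEV side, roughly like this (the link name, directory object, and credentials are assumptions).  Because expdp is a server process, the dump file is written by the database it connects to, which here is DEV:

```
# Run on the DEV machine; prod_link is a database link from DEV to PROD,
# and dumpdir is an Oracle directory object defined in the DEV database.
expdp system/password full=y network_link=prod_link \
      directory=dumpdir dumpfile=prod_full.dmp logfile=prod_full.log
```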
 
LVL 23

Assisted Solution

by:David
David earned 200 total points
ID: 39256767
1.  If you are so tight on production storage, and cannot remove any log or other trash files, you may soon have an even bigger problem.

2.  Are you backing up production, either with RMAN or DataPump?  Consider restoring to DEV from your last backup set. If you are not backing up PROD, see above about the bigger problem.

3.  Moving production data into a dev environment, where security is typically less enforced, is also unwise.  Bigger problem, yada yada.

One win-win approach may be to clone the PROD database template using DBCA, use the template on DEV to create the new database, then populate the DEV side with a set of generated baseline data.  Remember that you don't need to have all the rows in PROD tables in order to test constraints.  Bring over your stats to simulate the load for the Optimizer, and bring over the SQL profiles.
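The stats transfer mentioned above can be sketched with DBMS_STATS (the schema and staging-table names here are hypothetical):

```
-- On PROD: stage the schema's optimizer stats in a regular table
EXEC DBMS_STATS.CREATE_STAT_TABLE('SCOTT', 'STATTAB');
EXEC DBMS_STATS.EXPORT_SCHEMA_STATS('SCOTT', 'STATTAB');

-- Move SCOTT.STATTAB to DEV (exp/imp of that one table), then on DEV:
EXEC DBMS_STATS.IMPORT_SCHEMA_STATS('SCOTT', 'STATTAB');
```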
 
LVL 35

Assisted Solution

by:johnsone
johnsone earned 200 total points
ID: 39257021
To use exp/imp, you don't actually need to store the dump file at all.  You can do it easily with a named pipe on either side.  The basic commands would be:

mknod data.dmp p
exp userid=user/pass@prod file=data.dmp ...
imp userid=user/pass@dev file=data.dmp ...

You would need to run the exp and imp commands in different OS sessions, but it definitely works, even on a large scale (we did a 1.5TB database once).  You never need more than a few blocks of disk space to do it.
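If juggling two OS sessions is awkward, the exporting side can simply be backgrounded so the whole transfer runs from one script (credentials and options here are placeholders):

```
mknod data.dmp p
# exp blocks until a reader opens the pipe, so background it...
exp userid=user/pass@prod file=data.dmp full=y &
# ...and let imp consume the pipe as exp writes into it.
imp userid=user/pass@dev file=data.dmp full=y ignore=y
wait
rm data.dmp
```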
 

Accepted Solution

by:
Mushfique Khan earned 1200 total points
ID: 39260650
You can do a direct import into your DEV from PROD, with no export dump at all (though linking dev to prod is not recommended, as noted above).

Create a db link at DEV to your PROD, and then you can do the import directly.
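Creating that link on DEV might look like this, matching the PROD.WORLD name used in the parfile (the credentials and TNS alias are assumptions):

```
-- Run on DEV as a user with the CREATE DATABASE LINK privilege
CREATE DATABASE LINK PROD.WORLD
  CONNECT TO system IDENTIFIED BY xxxx
  USING 'PROD';
```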

UNIX cmd:
nohup impdp system/xxxx parfile=imp_name.par &

imp_name.par:
LOGFILE=imp_name.log
DIRECTORY=DP_DIR
NETWORK_LINK=PROD.WORLD
SCHEMAS=SCOTT
REMAP_TABLESPACE=a:b,c:d,e:f   # if tablespaces differ between source & target
EXCLUDE=STATISTICS             # speeds up the import; gather stats afterwards
PARALLEL=12
METRICS=yes
TRACE=480300

Now you can adjust this accordingly.  Here I'm doing a schema import, directly into the target from the source in one step using the DB link, and also remapping tablespaces where they differ.

HTH
