Solved

expdp remote machine

Posted on 2009-07-07
4,802 Views
Last Modified: 2013-12-19
Hi

I am trying to export some tables which are more than 20 GB.
The problem is that the database is on a remote server.
Is it possible to export the dump file directly onto my local machine?
How can I do that?

Can you help me with an example?

Thanks
Question by:basirana
26 Comments
 

Author Comment

by:basirana
ID: 24795777
Remote DB name is rdb1
Local DB name is ldb1
Remote machine is rmachine1
Local machine name is lmachine1
 

Author Comment

by:basirana
ID: 24795797
Tables that need to be exported are ASP_D, ASP_E, ASP_F.
Is it possible to schedule this, since it is 20 GB? How can I do that?

Thanks
 

Author Comment

by:basirana
ID: 24795846
Is it possible to rebuild the indexes later? Because the indexes are more than 40 GB in size.
I just want to import the tables and constraints.

 
LVL 40

Expert Comment

by:mrjoltcola
ID: 24796257
With expdp I don't believe you can do that. Data Pump exports into an Oracle DIRECTORY object, which is local to the database server. You could try creating a network-mapped directory.

Data Pump does have a network mode, but it does not create an interim dump file; it is direct database to database.

So I think you need to try regular old export, or allocate space on the server in a regular directory, or, again, try a network-mounted directory.
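
(For the network-mounted directory idea, a minimal sketch; the mount point /mnt/lmachine1_dump is hypothetical and would have to be a share exported by the local machine and mounted on the server:)

-- on the server (rmachine1), as a DBA:
SQL> CREATE DIRECTORY REMOTE_DUMP AS '/mnt/lmachine1_dump';
SQL> GRANT READ, WRITE ON DIRECTORY REMOTE_DUMP TO scott;

-- then a plain server-side export writes straight onto the mounted share:
expdp scott/password DIRECTORY=REMOTE_DUMP DUMPFILE=asp%U.dmp TABLES=ASP_D,ASP_E,ASP_F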
 
LVL 48

Accepted Solution

by:
schwertner earned 500 total points
ID: 24796696
It is possible:

With the Oracle 10g Import utility, you can use a network connection to import one or more tables directly from one server to another. Here are the steps you need to perform. Assume that you have two servers, Server A and Server B, and you want to import tables such as "EMP" and "DEPT" from Server A to Server B.

1. Go to Server B and create a database link that will access the database in Server A.
Example:
SQL> CREATE DATABASE LINK mylink2a CONNECT TO scott IDENTIFIED BY password USING 'mylink2a';
Note that 'scott' is a user in the database in Server "A" and that its password is 'password'.
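
(For the USING clause to resolve, 'mylink2a' must also exist as a TNS alias in Server B's tnsnames.ora; a sketch, with a hypothetical host name and the service name of the database on Server A:)

mylink2a =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = serverA.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = rdb1))
  )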

2. Then, perform the following import dump command.
Example:
# impdp scott2/password2 TABLES=emp,dept DIRECTORY=dpump1 NETWORK_LINK=mylink2a
Note that 'scott2' is a user in the database in Server B and that its password is 'password2'.
 
LVL 48

Expert Comment

by:schwertner
ID: 24796749
I do not see the need to use files here, but if you decide to do a single-instance export via Data Pump you may hit the 2 GB file-size barrier (not in all cases, but this happens often on 32-bit machines).

The workaround:
1. Use the FILESIZE parameter, for example:
FILESIZE=1900M
2. Use the %U wildcard in DUMPFILE:
DUMPFILE=expdat%U.dmp

So Data Pump will create many dump files.
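
(Put together, a sketch of such an export command; the directory alias and login are placeholders reused from earlier in this thread:)

expdp scott/password DIRECTORY=dpump1 DUMPFILE=expdat%U.dmp FILESIZE=1900M TABLES=ASP_D,ASP_E,ASP_F LOGFILE=expdat.log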
 

Author Comment

by:basirana
ID: 24796781
We have an AIX machine (bceyenetdb1 3 5 00C8AADD4C00), which is 64-bit and has 8 processors.
I need help doing a Data Pump export of the tables without indexes, and I want to run it in parallel.
Any help?
 
LVL 48

Assisted Solution

by:schwertner
schwertner earned 500 total points
ID: 24796831
It is easy.
Use
PARALLEL=8

Also

GRANTS=y
INDEXES=n
ROWS=y
CONSTRAINTS=y
CONSISTENT=n
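
(A caveat, not from the thread: GRANTS, INDEXES, ROWS, and CONSISTENT are classic exp parameters, and 10g expdp will reject them. The rough Data Pump equivalents of "skip indexes, keep data, grants, and constraints" would be:)

PARALLEL=8
# instead of INDEXES=n:
EXCLUDE=INDEX
# data plus metadata; CONTENT=DATA_ONLY would skip metadata (ROWS=y has no direct counterpart)
CONTENT=ALL
# grants and constraints are included by default; EXCLUDE=GRANT or EXCLUDE=CONSTRAINT would drop them
# consistency is controlled with FLASHBACK_TIME or FLASHBACK_SCN rather than CONSISTENT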
 
 
 
LVL 48

Expert Comment

by:schwertner
ID: 24796853
But PARALLEL has no practical effect if there are no dump files.
You are using the network variant, so there is really only one medium: the network.
Normally I use PARALLEL=4, but only when the dump file is a real dump file, and with 2 Xeon processors.
 

Author Comment

by:basirana
ID: 24796871
Below is the expdp command I have written. Does it work? If it works, how can I schedule it to run at 5 PM today?

expdp dhana@SOURCE_DB directory=dpump tables=MYDATA.SL_AN,MYDATA.SL_AT,MYDATA.SL_AN parallel=4 dumpfile=SL%U.dat network_link=source_link
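
(On the scheduling question, an aside not answered in the thread: on a Unix host the usual tool is cron. A sketch, assuming the command above is saved in a hypothetical script /home/dhana/exp_sl.sh:)

# crontab -e, then add a one-off entry for today at 17:00:
0 17 * * * /home/dhana/exp_sl.sh > /home/dhana/exp_sl.out 2>&1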

After the export, how can I do the import from multiple files?

Thanks
 

Author Comment

by:basirana
ID: 24796940
CREATE DIRECTORY DUMP_DIR AS '/d032/oradata/dpump'

expdp dhana@SOURCE_DB
DIRECTORY=dump_dir
TABLES=MYDATA.SL_AN,MYDATA.SL_AT,MYDATA.SL_AN
PARALLEL=4
DUMPFILE=SL%U.dat
NETWORK_LINK=source_link
GRANTS=y
INDEXES=n
ROWS=y
CONSTRAINTS=y
CONSISTENT=n
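
(As with the parameter list above, the last five lines are classic exp parameters that 10g expdp will reject; a Data Pump-only sketch of the same job, with the duplicate SL_AN listed once, might be:)

expdp dhana@SOURCE_DB DIRECTORY=dump_dir TABLES=MYDATA.SL_AN,MYDATA.SL_AT PARALLEL=4 DUMPFILE=SL%U.dat NETWORK_LINK=source_link EXCLUDE=INDEX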
 
LVL 48

Expert Comment

by:schwertner
ID: 24797403
Using impdp.

In contrast to classic Import, the parameters have the same names :)

But possibly you misunderstood me.

The method in ID:24796696 does not need files. It works via the network.
You need files when you make a pure server dump, transfer it to another machine, and do an import. But using a DB link you do not use files.
 

Author Comment

by:basirana
ID: 24797556
How can I do it using a DB link? Can you help me with an example?
 
LVL 48

Expert Comment

by:schwertner
ID: 24797589
It is explained in detail in the post ID:24796696.
 

Author Comment

by:basirana
ID: 24797802
schwertner

The posting explains impdp, but I do not see expdp. Can you help me step by step? I am new to Data Pump.
 
LVL 40

Expert Comment

by:mrjoltcola
ID: 24797834
schwertner, will the network mode create an actual .dmp file? I was under the impression that the network mode for Data Pump does not create an interim file at all.

I will admit I do not use Data Pump much yet; I am still using exp/imp for most databases.
 
LVL 48

Assisted Solution

by:schwertner
schwertner earned 500 total points
ID: 24801023
Data Pump dump files cannot be written directly to a remote machine,
unless you use a common SAN, NAS, remote disk, or some other kind of shared disk drive.
Import is used because the dump files can only be used for imports.
Alternatively, an ftp transfer can be scheduled on the remote server.

If you would like to use Data Pump, then create a local empty DB,
do the Data Pump import via DB link, and after that export the DB locally into files.

If not, use classic export via a client, but set NLS_LANG appropriately.
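
(A sketch of that two-step sequence, reusing the link and directory names from earlier in the thread; the local login is a placeholder:)

# step 1: pull the tables straight into the local DB over the link (no dump file involved)
impdp scott2/password2 TABLES=ASP_D,ASP_E,ASP_F NETWORK_LINK=mylink2a EXCLUDE=INDEX

# step 2: once the data is local, export it into files at leisure
expdp scott2/password2 DIRECTORY=dpump1 DUMPFILE=asp%U.dmp FILESIZE=1900M TABLES=ASP_D,ASP_E,ASP_F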
 

 
LVL 40

Expert Comment

by:mrjoltcola
ID: 24803348
That is what I thought. I checked the docs yesterday, and it is correct: exporting remotely with Data Pump requires an actual Oracle instance plus a DB link, which will not exist for a pure client. So expdp cannot work like the old remote exp method. That is one thing I will miss if they ever remove exp completely.

 
LVL 48

Expert Comment

by:schwertner
ID: 24803758
The Data Pump export is between two instances.
You can use an existing client-side instance and after that export the dump file.
I guess you need a Data Pump dump for some reason. If not, use the old Export,
but you need a client installation.
 

Author Comment

by:basirana
ID: 24803846
I am not running expdp on my local machine; it is on a development machine that has the dev DB running.
I am planning to export from prod to the dev box over a DB link, and then import the data onto development.
All I need are the steps to export the tables and import them using Data Pump.
As we do not have enough disk space, I want to spread the export over more than one disk.

I have disk1 and disk5 free, which are 15 GB each, but the table size is more than 23 GB, so we need to export onto different disks and import.

I am looking for the steps to perform this operation.

Thanks
 
LVL 48

Expert Comment

by:schwertner
ID: 24806951
The steps are described above.
Syntactically they are not long, but they need some practice.
If you do not have enough space, think about compressed tablespaces, but also about the danger of hitting some bugs.
Also please be aware that, with the method from my notes, the data is entered directly into the database tables, so the tablespaces have to exist first. Do not expect to get dmp files; the method works the other way around: it is an import, and it loads the data immediately.
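
(So, before the network import, the target may need something like the following if the source tablespace does not exist there; the tablespace name, path, and size are hypothetical. Alternatively, impdp's REMAP_TABLESPACE parameter can point the tables at an existing tablespace.)

-- run on the target database before the import:
SQL> CREATE TABLESPACE asp_data
     DATAFILE '/d032/oradata/asp_data01.dbf' SIZE 10G AUTOEXTEND ON;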
 

Author Comment

by:basirana
ID: 24925446
I specified a smaller file size and gave two destination directories; the files are distributed equally between the directories, and now it is working fine.
For reference, the example below exports the scott schema from the orcl database and splits the dump file across three different locations. This method is especially useful if you do not have enough space in one file system to perform the complete expdp job. After the export succeeds, the dump files are imported into the ordb database.

Expdp parfile content:

userid=system/password@orcl
logfile=schemaexp_split.log
directory=dumplocation
dumpfile=dump1:schemaexp_%U.dmp,dump2:schemaexp_%U.dmp,dump3:schemaexp_%U.dmp
filesize=50M
schemas=scott
include=table
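
(For the subsequent import into ordb, a mirror-image parfile sketch; the userid and directory aliases assume the same setup as on the export side:)

Impdp parfile content:

userid=system/password@ordb
logfile=schemaimp_split.log
directory=dumplocation
dumpfile=dump1:schemaexp_%U.dmp,dump2:schemaexp_%U.dmp,dump3:schemaexp_%U.dmp
schemas=scott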
 
LVL 48

Assisted Solution

by:schwertner
schwertner earned 500 total points
ID: 24932677
In this case

parallel=n

will help.

Oracle recommends n to be twice the number of cores.
 
LVL 48

Expert Comment

by:schwertner
ID: 24932789
Back up the LISTENER.ORA file first!

In ...\network\admin, open listener.ora.

You will see something like:

SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (SID_NAME = PLSExtProc)
      (ORACLE_HOME = D:\Ora10g)
      (PROGRAM = extproc)
    )
    (SID_DESC =
      (SID_NAME = test)
      (ORACLE_HOME = D:\Ora10g)
    )
  )

Add an entry like
    (SID_DESC =
      (SID_NAME = test)
      (ORACLE_HOME = D:\Ora10g)
    )
adjusted to the particular Oracle home and SID name.


In Oracle 11g you have to create the

SID_LIST_LISTENER =
...

section mentioned above from scratch in the listener.ora file.
Restart the listener:
c:>lsnrctl stop
c:>lsnrctl start
 
LVL 48

Expert Comment

by:schwertner
ID: 25081184
Goooood luck!
