Jay Redd

asked on

Command Line version of creating "Loader" format Export Files Via SQL*Developer

My adventures with moving Oracle data continue!

So I have had lots of success with using the "loader" format on the Tools > Export Database in SQL Developer, but now I want to do the same thing automatically from the command line.

Is there a way to invoke the same exact process from the command line? I am dealing with BLOB/CLOB data which SQL Developer's Export process handles nicely, but I can't figure out how to do the same thing with the Export command line tool.
David VanZandt

This may be simpler than you think, but to begin with the right tool: SQL*Developer is using Data Pump (expdp), not the original Export (exp). I've attached instructions for version 3.1, which mention that to view (and use) the actual code behind the wizard, you click on the PL/SQL tab. IOW, you can take the generated script and execute it (with an appropriately privileged account) from the O/S.

http://www.oracle-base.com/articles/misc/sql-developer-31-data-pump-wizards.php
slightwv (䄆 Netminder)

The export command line requires the corresponding import tool to load it back in.

Classic: exp needs imp
Datapump: expdp needs impdp

The online docs have all the parameters and several examples.  There are also a TON of examples on the Internet.

What issues are you having with the command line versions?

If you can provide more information we can probably post the exact syntax you need.
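To make the pairing concrete, here is a minimal sketch of the two matching export/import pairs. The connect string, table name, and file names are hypothetical, and the commands are only echoed (dry run) rather than executed against a database:

```shell
# Hypothetical connect string -- adjust for your environment.
CONN="scott/tiger@orcl"

# Classic pair: exp writes the dump file on the CLIENT machine; imp reads it back.
EXP_CMD="exp $CONN file=emp.dmp log=emp_exp.log tables=EMP"
IMP_CMD="imp $CONN file=emp.dmp log=emp_imp.log tables=EMP"

# Data Pump pair: expdp/impdp read and write through a DIRECTORY object on the SERVER.
EXPDP_CMD="expdp $CONN directory=DATA_PUMP_DIR dumpfile=emp.dmp logfile=emp_expdp.log tables=EMP"
IMPDP_CMD="impdp $CONN directory=DATA_PUMP_DIR dumpfile=emp.dmp logfile=emp_impdp.log tables=EMP"

# Echo the commands instead of running them (no database in this sketch).
echo "$EXP_CMD"
echo "$IMP_CMD"
echo "$EXPDP_CMD"
echo "$IMPDP_CMD"
```

The key practical difference shows up in the last two lines: a dump file made by exp cannot be loaded with impdp, and vice versa, so pick one pair and stay with it end to end.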
Jay Redd (Asker)

[ignore this comment] found my mistake
Thanks to both of you, I learned a lot of stuff there! It appears I can't use EXP to do it, since it includes the schema and the schema is already built in this workflow.

I am also unclear whether I could use the info on the PL/SQL tab of the DBA export wizard, because I don't think I can run dbms_datapump in sqlplus from the command line, correct?

I'm now stuck on this point:
I'm now trying to see if expdp can work instead.
>I can't use EXP to do it since it include the schema and the schema is already built in this workflow.

I don't understand what you are saying here.  If it's in the database, classic exp will export it.

>>because I don't think I can run dbms_datapump in sqlplus from the command line, correct?

You can.  It's just a PL/SQL call, so you can run it from a PL/SQL block in sqlplus.  That said, it is just the PL/SQL interface to the command-line expdp.  Therefore, just use expdp.
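As a sketch of what "run it from sqlplus" can look like: the script below writes a TABLE-mode DBMS_DATAPUMP block to a file and echoes the sqlplus invocation. Table CDLKUP, user bq, and DATA_PUMP_DIR come from this thread; the password and everything else are assumptions to adapt:

```shell
# Write the PL/SQL block (roughly what the wizard's PL/SQL tab generates).
cat > datapump_export.sql <<'EOF'
SET SERVEROUTPUT ON
DECLARE
  h  NUMBER;
  js VARCHAR2(30);
BEGIN
  -- Open a TABLE-mode export job.
  h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'TABLE');
  -- Dump file and log file, both resolved through the DIRECTORY object.
  DBMS_DATAPUMP.add_file(h, 'CDLKUP.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.add_file(h, 'CDLKUP.log', 'DATA_PUMP_DIR',
      filetype => DBMS_DATAPUMP.ku$_file_type_log_file);
  -- Restrict the job to the one table we care about.
  DBMS_DATAPUMP.metadata_filter(h, 'NAME_EXPR', 'IN (''CDLKUP'')');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.wait_for_job(h, js);
  DBMS_OUTPUT.put_line('Job state: ' || js);
END;
/
EOF

# Echoed rather than executed in this sketch (credentials are hypothetical):
echo "sqlplus -s bq/secret@Exfer @datapump_export.sql"
```

This is exactly the interface expdp wraps, which is why the simpler answer is usually to call expdp directly.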
>I can't use EXP to do it since it include the schema and the schema is already built in this workflow.
>I don't understand what you are saying here.  If it's in the database classic exp will export it

From what I was reading, it seems like I cannot export ONLY the data, to later import into a database that has the schema but no data. It seems like the schema is produced as part of the table export.
>> It seems like the Schema is produced as part of the table export.

Just don't do a FULL export.  In classic exp look at the OWNER parameter:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/original_export.htm#autoId35

You can export as little or as much information from the database as you want.
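For the specific "data only, schema already exists" case, Data Pump has a direct switch. A hedged sketch (connect string hypothetical; commands echoed, not executed):

```shell
CONN="bq/secret@Exfer"   # hypothetical connect string

# Export rows only -- no CREATE TABLE / index / grant DDL in the dump.
EXP_DATA_ONLY="expdp $CONN tables=CDLKUP content=DATA_ONLY directory=DATA_PUMP_DIR dumpfile=CDLKUP.dmp logfile=CDLKUP.log"

# Importing into a schema that already has the table: append (or TRUNCATE) rather than fail.
IMP_DATA_ONLY="impdp $CONN tables=CDLKUP content=DATA_ONLY directory=DATA_PUMP_DIR dumpfile=CDLKUP.dmp table_exists_action=APPEND"

echo "$EXP_DATA_ONLY"
echo "$IMP_DATA_ONLY"
```

With classic exp/imp the rough equivalent is to export normally and import with IGNORE=Y, so the CREATE statements that collide with existing objects are skipped and only the rows go in.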
Thanks Slight.
I'm not sure I understand how to use OWNER to just export data...

expdp bq/****@Exfer tables=CDLKUP  dumpfile=CDLKUP.dmp logfile=CDLKUP.log owner=bq

bq owns the data, would that mean it'd exclude the schema?
>>bq owns the data, would that mean it'd exclude the schema?

I'm not following.

I see you are using Data Pump export, not classic/original export, so I'll provide syntax for that when necessary.

>>I'm not sure I understand how to use OWNER to just export data...

You would use OWNER if you were exporting from a different user than the one you logged in with.  OWNER is a parameter for the original exp not datapump, expdp.

If you just want a single table then what you posted should be correct.
Ok basically, I have gotten way off of my original question now trying to consider all of these alternative ways of doing it.

Basically, I need a way to give my clients an SH / BAT that they can run that loads all the data from flat files. I really don't want to fool around with Oracle directories and all that; I'd just like to have the SH and the flat data files in the same directory and run them. Using INSERT statements in a .SQL script is too slow, so I wanted to use SQL*Loader. It works great, but I need a way to create the .ctl and data (.ldr) files from the command line the way I can from Export Database in SQL Developer, instead of through its wizard.

I am still looking for a way to do that.
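One way around the wizard is to have the wrapper script write the SQL*Loader control file itself. Below is a minimal sketch for a hypothetical table with one BLOB column; the table, columns, and file names are assumptions, and the sqlldr command is echoed rather than run:

```shell
# Control file for table CDLKUP with BLOB column DOC. Each data record carries
# a filename in a FILLER field; LOBFILE loads that file's bytes into the BLOB,
# which is the usual way loader-format exports handle LOB columns.
cat > CDLKUP.ctl <<'EOF'
LOAD DATA
INFILE 'CDLKUP.dat'
APPEND
INTO TABLE CDLKUP
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  ID,
  NAME        CHAR(100),
  DOC_FNAME   FILLER CHAR(255),
  DOC         LOBFILE(DOC_FNAME) TERMINATED BY EOF
)
EOF

# The wrapper SH then just runs sqlldr against files sitting next to it
# (echoed here; credentials are hypothetical):
echo "sqlldr userid=bq/secret@Exfer control=CDLKUP.ctl log=CDLKUP.log"
```

Because sqlldr runs entirely on the client and reads its files from the current directory, this fits the "unzip and run" distribution model with no server-side directory setup.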
Would it meet your need to have an OS script prompt for values, such as the data directory, the schema, the object name(s); then pass those dynamic values into the control script to be executed?
Sorry, but I don't know what SQL Developer does to generate a delimited file for use with SQL*Loader when it comes to BLOBs.
>>Would it meet your need to have an OS script prompt for values, such as the data directory, the schema, the object name(s); then pass those dynamic values into the control script to be executed?

I suppose that would work. I'd just like to be able to put the SH that runs all the IMP commands in a directory with all the needed dmp files and not have to have it be in one of the all_directories path locations.

That's my current hurdle: I need to be able to unzip my dmp files and an SH and run them on a box that just has the Oracle client, and then it imports the data to a remote server. I can do ALL of this except that I need to move the files into the dump folder, and that's a problem...
OK, you have me confused.

Above you mentioned "I wanted to use SQL Loader"  Now you mention IMP and DMP files.  These two things do not go together.

Which path do you want to continue with, SQL Loader or export/import?
Forgive me if this is obvious but:

An executable / application does not have to be defined in the session's PATH variable -- as long as it's referenced by its full path name. Example:  $ORACLE_HOME/bin/expdp ....

Whatever directory you use for the dump files HAS to be defined to Oracle as a DIRECTORY object.  Refer to the "create directory" syntax in http://www.orafaq.com/wiki/Datapump ; and in particular, the sentences on DATA_PUMP_DIR.  Note the directory HAS to have read/write permission as appropriate, in both the Oracle instance and the OS.

To recap, your pseudo-code might resemble:
$ORACLE_HOME/bin/expdp DIRECTORY=DATA_PUMP_DIR ....
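Concretely, the one-time setup plus the export might look like this. The path, names, and grant are assumptions; the SQL is written to a file and both commands are echoed rather than executed:

```shell
# One-time, as a DBA: create the directory object and let user bq use it.
# The server-side path /u01/app/oracle/dumps is hypothetical.
cat > make_dir.sql <<'EOF'
CREATE OR REPLACE DIRECTORY exp_dir AS '/u01/app/oracle/dumps';
GRANT READ, WRITE ON DIRECTORY exp_dir TO bq;
EOF
echo "sqlplus -s system/secret@Exfer @make_dir.sql"

# Every subsequent export can then reference the directory by name:
echo "expdp bq/secret@Exfer directory=EXP_DIR tables=CDLKUP dumpfile=CDLKUP.dmp logfile=CDLKUP.log"
```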
So what if you are only on a local machine that talks to a remote Oracle server? You can't put your directory into the database because it's not on the server. In our setup the user wouldn't even have rights to add the directory to the Oracle all_directories table. How are you supposed to use impdp like this?
ASKER CERTIFIED SOLUTION
slightwv (䄆 Netminder)
I ended up forming a solution using IMP and EXP.
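For reference, the shape that client-side workflow can take: classic imp reads its .dmp files from the client's current directory, so no server-side DIRECTORY object is involved. A sketch (connect string and file names hypothetical; the imp commands are echoed, not executed):

```shell
# Dry-run flag: 1 = only echo the imp commands, 0 = actually run them.
DRY_RUN=1

# Stand-ins for the dump files the client would have unzipped next to the script.
touch CDLKUP.dmp ORDERS.dmp

# Classic imp reads each file from the CLIENT's current directory -- the reason
# exp/imp sidesteps the DATA_PUMP_DIR problem entirely.
for dmp in *.dmp; do
  cmd="imp bq/secret@Exfer file=$dmp log=${dmp%.dmp}_imp.log ignore=y fromuser=bq touser=bq"
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "$cmd"
  else
    $cmd
  fi
done
```

IGNORE=Y makes imp skip the CREATE statements that fail against the already-built schema and load only the rows, matching the "schema exists, data doesn't" workflow described above.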