Solved

Oracle 11g Datapump

Posted on 2011-02-24
13
Medium Priority
1,087 Views
Last Modified: 2012-08-14
Hi,

Can we use the Data Pump utility if the source tables are in 9i and the destination is in 11g?
If so, can you give me the syntax and the steps to do it?

Thanks!
Question by:D-pk
13 Comments
 
LVL 18

Expert Comment

by:sventhan
ID: 34972528
You cannot use Data Pump.
 
LVL 18

Expert Comment

by:sventhan
ID: 34972533
 
LVL 4

Expert Comment

by:MarioAlcaide
ID: 34972534
No, you have to make an export using the 9i exp client.

Then, import it into the new database using the 11g imp command.

Data Pump was introduced in Oracle 10g.

 

Author Comment

by:D-pk
ID: 34972632
So the dump we make in 9i will be readable in 11g? That is, the .dmp file.
What parameters do I need to give for this cross-database transfer?
 
LVL 48

Expert Comment

by:schwertner
ID: 34972951
I would recommend trying a Data Pump import via a DB link to the source DB.
If this fails, you have no chance to use Data Pump.
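A minimal sketch of that attempt, with placeholder names (the link name SRC9I, the connect strings, and the passwords are assumptions, not values from this thread). Note that Data Pump's network mode generally requires both ends to be 10g or later, so this may well fail against a 9i source, as suggested above:

```shell
# Hypothetical: on the 11g side, create a database link pointing at the 9i DB
sqlplus system/password@DB11G <<'SQL'
CREATE DATABASE LINK SRC9I
  CONNECT TO my_schema IDENTIFIED BY my_password
  USING 'DB9I';
SQL

# Attempt a network-mode Data Pump import: no dump file is written,
# rows are pulled straight over the link into the 11g database.
impdp system/password@DB11G \
  NETWORK_LINK=SRC9I \
  SCHEMAS=MY_SCHEMA \
  LOGFILE=net_import.log
```

If impdp rejects the link because of the 9i source version, fall back to the classic exp/imp route.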
 

Author Comment

by:D-pk
ID: 34972994
Can you give me an example? Let's say the source table is src_test and the destination table is table_dest.
Do we need to have the table created in the destination to facilitate the import, or will it automatically create a table based on the name we give?

Thanks!
 
LVL 4

Accepted Solution

by:MarioAlcaide (earned 2000 total points)
ID: 34973250
Let's see. You have two databases, DB9i and DB11g, and you want to migrate a schema named MY_SCHEMA. You will:

1. Connect to the machine where DB9i is located.
2. Execute: exp owner=MY_SCHEMA file=c:\blablabla\your_export_file.dmp log=c:\blablabla\your_export_file.log statistics=none buffer=500000
3. Copy that file to the machine where DB11g is.
4. Create the schema, for example: CREATE USER MY_SCHEMA IDENTIFIED BY MY_PASSWORD DEFAULT TABLESPACE USERS;
5. Execute on the DB11g machine: imp fromuser=MY_SCHEMA touser=MY_SCHEMA file=c:\blablabla\your_export_file.dmp log=c:\blablabla\your_import_file.log statistics=none buffer=5000
6. It's done!

That will import an entire schema. If you only need a table, add the TABLES=your_table_name clause to the exp command, and it will export only that table.

And it's done :)

Remember: it will automatically create the new table, but you have to create the schema (user) first.
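The six steps above can be sketched as two command sequences. All paths, passwords, connect strings, and the scp hostname below are placeholders, not values from the thread:

```shell
# --- On the 9i machine: classic export of the whole schema ---
exp system/password@DB9I \
    owner=MY_SCHEMA \
    file=/tmp/my_schema.dmp \
    log=/tmp/my_schema_exp.log \
    statistics=none buffer=500000

# Copy the dump file to the 11g machine, e.g.:
scp /tmp/my_schema.dmp oracle@db11g-host:/tmp/

# --- On the 11g machine: create the target user first, then import ---
sqlplus system/password@DB11G <<'SQL'
CREATE USER MY_SCHEMA IDENTIFIED BY MY_PASSWORD DEFAULT TABLESPACE USERS;
GRANT CONNECT, RESOURCE TO MY_SCHEMA;
SQL

imp system/password@DB11G \
    fromuser=MY_SCHEMA touser=MY_SCHEMA \
    file=/tmp/my_schema.dmp \
    log=/tmp/my_schema_imp.log \
    statistics=none buffer=5000
```

For a single table, add TABLES=src_test to the exp command instead of exporting the whole owner.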
 

Author Comment

by:D-pk
ID: 34973467
I just want to do a few tables. Can we do it without creating a schema?
How do I dump it on the 11g server and import it later with a different table name in 11g?
Is that possible?

Thanks!
 
LVL 4

Expert Comment

by:MarioAlcaide
ID: 34973502
Nope, you need to have a schema created, or maybe use an existing schema.

You cannot import a table under another name with the imp utility (you could with Data Pump, but not in your case, because you are getting data from a 9i database).

However, you can create another table with the new name, like this:

CREATE TABLE NEW_TABLE AS SELECT * FROM OLD_TABLE;
DROP TABLE OLD_TABLE;

And it will be renamed. Easy ;-)
 

Author Comment

by:D-pk
ID: 34976022
Thanks...
But some of the tables are more than 125 GB, and it's not letting me export the table dump file.
Is there any way around this?
Thanks!
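If the export is failing on file size (the classic exp utility and some filesystems cap a single dump file at 2 GB), one common workaround is the FILESIZE parameter, which splits the dump across multiple files. A sketch under that assumption, with placeholder table and file names:

```shell
# Split the export of one large table into ~2 GB pieces.
# exp fills big_part1.dmp first, then big_part2.dmp, and so on;
# imp must later be given the same file list in the same order.
exp system/password@DB9I \
    tables=MY_SCHEMA.BIG_TABLE \
    file=(big_part1.dmp,big_part2.dmp,big_part3.dmp) \
    filesize=2000M \
    log=big_table_exp.log \
    statistics=none buffer=500000
```

Exporting the tables one by one, as you plan, also keeps each dump smaller; the two approaches can be combined.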
 

Author Comment

by:D-pk
ID: 34976026
I am gonna exp the tables one by one...
 
LVL 4

Expert Comment

by:MarioAlcaide
ID: 34977361
The only way you could rename those tables during export would be to upgrade your 9i database to at least 10g and then use Data Pump. Conventional export/import doesn't have that option.

Hope this helps; if you have further questions, don't hesitate to ask.
 

Author Closing Comment

by:D-pk
ID: 34977401
Ok Thanks!
