  • Status: Solved

Oracle 11g Datapump

Hi,

Can we use the Data Pump command if the source tables are in 9i and the destination is in 11g?
If so, can you give me the syntax and the steps to do it?

Thanks!
Asked by: D-pk
1 Solution
 
sventhanCommented:
You cannot use Data Pump.
 
MarioAlcaideCommented:
No, you have to make the export using the 9i exp client.

Then import it into the new database using the 11g imp command.

Data Pump was introduced in Oracle 10g.
 
D-pkAuthor Commented:
So the dump we make in 9i will be readable in 11g? That is, the .dmp file.
What parameters do I need to give for this cross-database transfer?
 
schwertnerCommented:
I would recommend trying a Data Pump import over a database link to the source DB.
If this fails, you have no chance to use Data Pump.
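
That suggestion can be sketched as follows (the link name, credentials, and TNS aliases are hypothetical; whether 11g's Data Pump network import accepts a 9i source is exactly what this test would reveal):

```shell
# On the 11g side: create a database link to the 9i source
# (names and passwords here are placeholders).
sqlplus system/manager@DB11g <<'EOF'
CREATE DATABASE LINK src9i_link
  CONNECT TO my_schema IDENTIFIED BY my_password
  USING 'DB9i';
EOF

# Pull the schema straight over the link -- no dump file is written.
impdp system/manager@DB11g NETWORK_LINK=src9i_link SCHEMAS=MY_SCHEMA
```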
 
D-pkAuthor Commented:
Can you give me an example? Let's say the source table is src_test and the destination table is table_dest.
Do we need to create the table in the destination to facilitate the import, or will it automatically create a table based on the name we give?

Thanks!
 
MarioAlcaideCommented:
Let's see. You have two databases, DB9i and DB11g. And you want to migrate a schema named MY_SCHEMA. You will:

1. Connect to the machine where DB9i is located.
2. Execute exp owner=MY_SCHEMA file=c:\blablabla\your_export_file.dmp log=c:\blablabla\your_export_file.log statistics=none buffer=500000
3. Copy that file to the machine where DB11g is
4. Create the schema: (example) CREATE USER MY_SCHEMA IDENTIFIED BY MY_PASSWORD DEFAULT TABLESPACE USERS;
5. Execute in the DB11g machine: imp fromuser=MY_SCHEMA touser=MY_SCHEMA file=c:\blablabla\your_export_file.dmp log=c:\blablabla\your_import_file.log statistics=none buffer=5000
6. It's done!

That will import an entire schema. If you only need a table, add the clause TABLES=your_table_name to the exp command, and it will export only that table.

And it's done :)

Remember: it will automatically create the new table, but you have to create the schema (user) first.
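
The single-table variant mentioned above can be sketched like this (paths and credentials are placeholders, reusing the example names from earlier in the thread):

```shell
# On the DB9i machine: export just one table, connecting as the schema owner
exp MY_SCHEMA/MY_PASSWORD tables=SRC_TEST file=c:\blablabla\src_test.dmp log=c:\blablabla\src_test_exp.log statistics=none

# On the DB11g machine: import that table into the pre-created schema
imp system/manager fromuser=MY_SCHEMA touser=MY_SCHEMA tables=SRC_TEST file=c:\blablabla\src_test.dmp log=c:\blablabla\src_test_imp.log
```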
 
D-pkAuthor Commented:
I just want to do a few tables. Can we do it without creating a schema?
How do I dump it on the 11g server and then import it with a different table name in 11g?
Is that possible?

Thanks!
 
MarioAlcaideCommented:
Nope, you need to have a schema created, or maybe use an existing schema.

You cannot import a table under another name with the imp utility (you could with Data Pump, but not in your case, because you are getting the data from a 9i database).

However, you can create another table with the new name, like this:

CREATE TABLE NEW_TABLE AS SELECT * FROM OLD_TABLE;
DROP TABLE OLD_TABLE;

And it will be renamed. Easy ;-)
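
As an aside, Oracle can also rename a table in place without copying its data; a sketch using the same names:

```sql
-- Renames the object itself; indexes, constraints, and grants stay attached
-- (dependent views and synonyms are invalidated and must be fixed up).
ALTER TABLE OLD_TABLE RENAME TO NEW_TABLE;
```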
 
D-pkAuthor Commented:
Thanks...
But some of the tables are more than 125 GB, and it's not letting me export the table dump file.
Is there any way around this?
Thanks!
 
D-pkAuthor Commented:
I am gonna exp the tables one by one...
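
If the export is failing on the classic 2 GB dump-file limit, the original exp utility can also split one dump across several files; a sketch, assuming that limit is what is being hit (table and file names are placeholders):

```shell
# FILESIZE caps each dump file; exp rolls over to the next file in the list.
exp MY_SCHEMA/MY_PASSWORD tables=BIG_TABLE filesize=2GB file=(big_01.dmp,big_02.dmp,big_03.dmp) log=big_exp.log

# imp is handed the same file list (and the same FILESIZE) on the 11g side.
imp system/manager fromuser=MY_SCHEMA touser=MY_SCHEMA filesize=2GB file=(big_01.dmp,big_02.dmp,big_03.dmp) log=big_imp.log
```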
 
MarioAlcaideCommented:
The only way you could rename those tables during the transfer is to upgrade your 9i database to at least 10g and then use Data Pump. Conventional export/import doesn't have that option.

Hope this helps. If you have further questions, don't hesitate to ask.
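
For reference, the Data Pump rename described above would look like this after an upgrade (a sketch; the REMAP_TABLE parameter itself only appeared with 11g impdp, and the directory, dump file, and table names are placeholders):

```shell
# Import a Data Pump export, loading SRC_TEST under the new name TABLE_DEST.
impdp system/manager DIRECTORY=dump_dir DUMPFILE=my_export.dmp REMAP_TABLE=MY_SCHEMA.SRC_TEST:TABLE_DEST
```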
 
D-pkAuthor Commented:
Ok Thanks!