Solved

How do I import the contents of a dmp file after a previous partial import?

Posted on 2011-03-18
Last Modified: 2012-05-11
I backed up a tablespace and dropped it because I had a corrupt datafile. Then, after recreating the tablespace, I started reimporting the data from the backup. Everything was going well until the developers asked me to abort the import so they could do other tasks, so I killed it. Now I am being asked to finish the import. I tried importing the same way I did the first time, and it is taking an extremely long time: it has been two and a half days. All I see are errors where it tries to import individual records that were imported successfully the first time. I expected errors, but I didn't expect it to go through each record in each schema and run this long. Is there something I can do to get this done quicker? Should I kill this import and change the parameters, or use a different command?

here is the command I used:

nohup imp system/<password entered here> file=dataT.dmp log=dat3152011.log full=y statistics=none
Question by:sikyala
9 Comments
 
LVL 2

Expert Comment

by:choukssa
ID: 35166011
Use TABLE_EXISTS_ACTION=APPEND.


nohup imp system/<password entered here> file=dataT.dmp log=dat3152011.log full=y statistics=none TABLE_EXISTS_ACTION=APPEND

- choukssa
 

Author Comment

by:sikyala
ID: 35166056
So if the record is already in the table, will it skip it or keep giving the error? That seems to be all it is doing during the import.

example

IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (S_AD.PK_SP) violated
Column 1 CBA
Column 2 CB-020322
Column 3 /ret-data/cba_single_page_pages/302/CB-020322_00...
Column 4 33
Column 5 423590
Column 6 1079973
Column 7 302
 
LVL 7

Accepted Solution

by:
Piloute earned 2000 total points
ID: 35166229
Hi sikyala,

Please clarify the following:

- Oracle version
- What kind of export is it? Data Pump / classical / RMAN backup

If it is a classical export/import, as I guess, all you can do is:

1) Use a larger BUFFER (parameter).
2) Next time you import, activate the RESUMABLE feature (parameter) and the related name & timeout.
3) Check also the IGNORE & COMMIT parameters:
- the first lets the import skip object-creation errors => no time spent sending you back useless error messages;
- the second is useful to allow restarting a stopped import.
Warning: only if ALL of your tables have at least one unique key can you use these parameters. Otherwise, you will end up with duplicate rows in your tables...
4) Do the import operation on the server itself. You will save the network round-trip time.
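Putting points 1-3 together with the original command from the question, a restart might look like this. The BUFFER size, resumable name, and timeout below are illustrative assumptions, not values from the thread; tune them for your environment:

```shell
# Classic imp restart sketch using the parameters discussed above.
# buffer / resumable_name / resumable_timeout values are assumptions.
nohup imp system/<password entered here> \
    file=dataT.dmp \
    log=dat3152011.log \
    full=y \
    statistics=none \
    buffer=10485760 \
    ignore=y \
    resumable=y \
    resumable_name=tablespace_reimport \
    resumable_timeout=7200 &
```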


Official doc here :
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/exp_imp.htm#sthref2391


Cheers,
P

 
LVL 7

Expert Comment

by:Piloute
ID: 35166261
@choukssa

That parameter doesn't exist in the original import utility... It is a Data Pump feature.
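For reference, TABLE_EXISTS_ACTION belongs to the Data Pump import utility (impdp), where a comparable run would look roughly like this. The directory object and log file name are illustrative, and it would only work on a dump created with expdp, not with original exp:

```shell
# Data Pump import sketch (illustrative only).
# Note: impdp cannot read dump files produced by the original exp utility.
impdp system/<password entered here> \
    directory=DATA_PUMP_DIR \
    dumpfile=dataT.dmp \
    full=y \
    table_exists_action=append \
    logfile=dat3152011_dp.log
```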

Rgds,
P
 

Author Comment

by:sikyala
ID: 35166401
Piloute:

- Oracle version >> Oracle 10g
- What kind of export is it? Data Pump / classical / RMAN backup >> classical

If it is a classical export/import, as I guess, all you can do is:

2) Next time you import, activate the RESUMABLE feature (parameter) and the related name & timeout. >> How do I do this?
3) Check also the IGNORE & COMMIT parameters:
- the first lets the import skip object-creation errors => no time spent sending you back useless error messages;
- the second is useful to allow restarting a stopped import.
Warning: only if ALL of your tables have at least one unique key can you use these parameters. Otherwise, you will end up with duplicate rows in your tables... >> Are you saying ignore=y commit=n?

4) Do the import operation on the server itself. You will save the network round-trip time. >> ok


 

Author Comment

by:sikyala
ID: 35166724
Also, is there a way to check how much of the import is complete?
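One common way to gauge this, not mentioned in the thread: long-running table loads are tracked in the v$session_longops view, and the import log itself records per-table row counts. A sketch, using the log file name from the question:

```sql
-- Sketch: gauge a classic import's progress (a general technique,
-- not something confirmed in this thread).

-- 1) Long-running operations, with a rough percent-complete:
SELECT sid, message, sofar, totalwork,
       ROUND(100 * sofar / totalwork, 1) AS pct_done
  FROM v$session_longops
 WHERE totalwork > 0
   AND sofar < totalwork;

-- 2) Or count the tables already loaded so far in the import log:
--    grep -c "rows imported" dat3152011.log
```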
 
LVL 7

Assisted Solution

by:Piloute
Piloute earned 2000 total points
ID: 35167145
A suggestion: since you are on 10g, next time use Data Pump (it gives you more features). And unfortunately no, there's no way to use Data Pump to import an export that was made with the original exp, in case you were wondering...

---

2) >> How do I do this? -> That's why I gave you the documentation link.
In a word, you add the following to your import command: imp ... RESUMABLE=Y RESUMABLE_NAME=<a string> RESUMABLE_TIMEOUT=<how many seconds>...

3) >> Are you saying ignore=y commit=n?
YES. But you must be aware of the duplicate-rows issue in case you don't have primary keys (or unique keys) on your tables.
Oh, I just realized the default is COMMIT=N. So you can forget it...

---

I really, really encourage you to read the doc. That's why I gave you the link...

P


 

Author Closing Comment

by:sikyala
ID: 35207913
thanks
 
LVL 7

Expert Comment

by:Piloute
ID: 35213086
welcome
