Solved

How do I import the contents of a dmp file after a previous partial import?

Posted on 2011-03-18
425 Views
Last Modified: 2012-05-11
I backed up a tablespace and dropped it because I had a corrupt datafile. After recreating the tablespace, I started reimporting the data from the backup. Everything was going well until the developers asked me to abort the import so they could do other tasks, so I killed it. Now I am being asked to finish the import. I tried importing the same way I did the first time, and it is taking an extremely long time. It has been running for two and a half days. All I see are errors where it is trying to import individual records that were imported without error the first time. I expected there to be errors, but I didn't expect it to go through every record in every schema and run this long. Is there something I can do to get this done quicker? Should I kill this import and change the parameters, or use a different command?

Here is the command I used:

nohup imp system/<password entered here> file=dataT.dmp log=dat3152011.log full=y statistics=none
Question by:sikyala
9 Comments
 
LVL 2

Expert Comment

by:choukssa
ID: 35166011
Use TABLE_EXISTS_ACTION=APPEND.


nohup imp system/<password entered here> file=dataT.dmp log=dat3152011.log full=y statistics=none TABLE_EXISTS_ACTION=APPEND

- choukssa
 

Author Comment

by:sikyala
ID: 35166056
So if the record is already in the table, will it skip it or keep giving the error? That seems to be all the import is doing.

For example:

IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (S_AD.PK_SP) violated
Column 1 CBA
Column 2 CB-020322
Column 3 /ret-data/cba_single_page_pages/302/CB-020322_00...
Column 4 33
Column 5 423590
Column 6 1079973
Column 7 302
 
LVL 7

Accepted Solution

by:Piloute (earned 500 total points)
ID: 35166229
Hi sikyala,

Please clarify the following:

- Oracle version
- What kind of export is it? Data Pump / classical one / RMAN backup

If it is a classical export / import, as I guess, all you can do is:

1) use a larger BUFFER (parameter)
2) next time you import, activate the RESUMABLE feature (parameter) and the related name & timeout.
3) check also the IGNORE & COMMIT parameters (see the example command below):
- the first tells imp to ignore certain errors and keep going => less time wasted sending you back useless error messages.
- the second is useful to allow restarting a stopped import.
Warning: you can use these parameters only if ALL of your tables have at least one unique key. Otherwise, you will end up with duplicate entries in the tables...
4) do your import operation on the server itself. You will save the network transfer time.
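
As a rough sketch only (the log name, buffer size and resumable name/timeout below are placeholder values to adapt, not something tested against your system; keep your own dump file name):

nohup imp system/<password entered here> file=dataT.dmp log=dat_restart.log full=y statistics=none ignore=y commit=y buffer=10000000 resumable=y resumable_name=dat_imp resumable_timeout=7200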


Official doc here :
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/exp_imp.htm#sthref2391


Cheers,
P

 
LVL 7

Expert Comment

by:Piloute
ID: 35166261
@choukssa

That parameter doesn't exist in the original import utility... It is a Data Pump feature.
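
For comparison only, a Data Pump import with that option would look roughly like this (directory, dump file and log file names here are placeholders, and it would only work on a dump taken with expdp, not with the original exp):

impdp system/<password> directory=DATA_PUMP_DIR dumpfile=dataT.dmp logfile=dat_imp.log full=y table_exists_action=append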

Rgds,
P
 

Author Comment

by:sikyala
ID: 35166401
Piloute:

- Oracle version >> Oracle 10g
- What kind of export is it? Data Pump / classical one / RMAN backup >> classical

If it is a classical export / import, as I guess, all you can do is:

2) next time you import, activate the RESUMABLE feature (parameter) and the related name & timeout. >> How do I do this?
3) check also the IGNORE & COMMIT parameters:
- the first tells imp to ignore certain errors and keep going => less time wasted sending you back useless error messages.
- the second is useful to allow restarting a stopped import.
Warning: you can use these parameters only if ALL of your tables have at least one unique key. Otherwise, you will end up with duplicate entries in the tables... >> are you saying ignore=y commit=n?

4) do your import operation on the server itself. You will save the network transfer time. >> ok


 

Author Comment

by:sikyala
ID: 35166724
Also, is there a way to check how much of the import is complete?
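
A rough way to gauge it might be to count the completed-table lines in the current log (assuming the log name from my first post and the usual ". . importing table ... rows imported" format) and compare that with the count from the first run's log:

grep -c "rows imported" dat3152011.log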
 
LVL 7

Assisted Solution

by:Piloute (earned 500 total points)
ID: 35167145
A suggestion: since you are on 10g, next time use Data Pump (it gives you more features). And unfortunately no, in case you were wondering, there's no way to use Data Pump to import a dump that was made with the original exp...

---

2) >> How do I do this? -> That's why I gave you the documentation link.
In a word, you add the following to your import command: imp ... RESUMABLE=Y RESUMABLE_NAME=<a string> RESUMABLE_TIMEOUT=<how many seconds>...

3) >> are you saying ignore=y commit=n?
YES. But you must be aware of the duplicate-rows issue in case you don't have primary keys (or unique keys) on some tables.
Oh, I just realized the default is COMMIT=N, so you can forget about it...
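
If a resumable import does get suspended (on a space problem, for example), a quick way to see it from another session, assuming the standard DBA_RESUMABLE view, would be something like:

sqlplus -s system/<password> <<EOF
select name, status, suspend_time, error_msg from dba_resumable;
EOF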

---

I really really encourage you to read the doc. That's why I gave you the link...

P


 

Author Closing Comment

by:sikyala
ID: 35207913
thanks
 
LVL 7

Expert Comment

by:Piloute
ID: 35213086
welcome