dba_shashi
asked on
Not able to import into Oracle 11.1.0.7
Hi Experts
A few days back I got an export dump from the production DB and transferred it to the test environment to be imported into the test DB, but I get this error message:
impdp szc2525/singh_1011 DIRECTORY=CMR_3120943 dumpfile=POHDS2S_BETA_SE2D.dmp schemas=PROD_BETA REMAP_SCHEMA=PROD_BETA:SHDS01 exclude=GRANT
Import: Release 11.1.0.7.0 - 64bit Production on Wednesday, 26 October, 2011 17:59:50
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options
UDI-31623: operation generated ORACLE error 31623
ORA-31623: a job is not attached to this session via the specified handle
ORA-06512: at "SYS.DBMS_DATAPUMP", line 2862
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4052
ORA-06512: at line 1
I looked at dba_datapump_jobs too and found no job running. No invalid objects either.
Is there any way around this? Any suggestions?
Can we import the same dump into the DB any other way? It's very critical...
Thanks..
Shashi singh
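Since ORA-31623 often points at leftover Data Pump state, one thing worth double-checking (a hedged sketch, not something from the thread; the master-table name below is illustrative) is whether an earlier Data Pump attempt left an orphaned job behind:

```sql
-- List any Data Pump jobs Oracle still knows about
SELECT owner_name, job_name, state, attached_sessions
  FROM dba_datapump_jobs;

-- If an orphaned job shows up as NOT RUNNING with no attached sessions,
-- dropping its master table (named after the job) usually clears it, e.g.:
-- DROP TABLE szc2525.SYS_IMPORT_SCHEMA_01;
```

If the view really is empty, as the poster reports, the problem lies elsewhere.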
Check whether any components in the database are invalid:
SQL> select comp_name, status, version from dba_registry;
Recompile invalid objects, then check whether any specific objects remain invalid:
SQL> @?/rdbms/admin/utlrp.sql
SQL> select object_name, object_type from dba_objects where
status ='INVALID' and owner = 'SYS';
Did you recently do a database flashback?
ASKER
Hi Experts
As per your suggestions:
1.
Yes, both the production and test databases run the same Oracle 11.1.0.7 version.
BANNER
--------------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
PL/SQL Release 11.1.0.7.0 - Production
CORE 11.1.0.7.0 Production
TNS for Linux: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
2.
SQL> select comp_name, status, version from dba_registry
Oracle Ultra Search INVALID 11.1.0.7.0
Spatial INVALID 11.1.0.7.0
Oracle Multimedia INVALID 11.1.0.7.0
3.
SQL> @?/rdbms/admin/utlrp.sql
no rows selected
4.
SQL> select object_name, object_type from dba_objects where status ='INVALID' and owner = 'SYS';
OBJECT_NAME                  OBJECT_TYPE
---------------------------  -------------------
VALIDATE_CONTEXT             PROCEDURE
ALL_XSC_SECURITY_CLASS_DEP   VIEW
ALL_XSC_PRIVILEGE            VIEW
ALL_XSC_AGGREGATE_PRIVILEGE  VIEW
WWV_FLOW_VAL                 PACKAGE BODY
5.
No... I did not do any database flashback recently.
When you transferred the .dmp file, did you make sure it was a binary transfer?
ASKER
Yes it was. It was FTPed from the production server to the test server, the way we usually do it.
I have to ask mainly because I have made that mistake myself: was the FTP in binary mode?
Can you FTP it again?
Maybe redo the export?
Some reports indicate that if any components are invalid (as yours are), Data Pump fails with this error.
Attempt to resolve the outstanding invalid components in your database.
Others indicate that this fixes the problem:
connect / as sysdba
startup restrict
alter system set cluster_database = false scope=spfile;
shutdown immediate
startup upgrade
alter system set cluster_database = true scope=spfile;
shutdown immediate
startup
ASKER
Does STREAMS_POOL_SIZE have something to do with it?
My database is not configured with a streams pool, and when I tried an export from this database it gave the same error I got while importing into it.
But when I tried another database where a streams pool was allocated, the export succeeded; I did not test an import into it.
Any comments/views/suggestions?
Thanks
Shashi singh
Definitely this could be your problem:
From 11G Utilities Manual
Setting the Size Of the Buffer Cache In a Streams Environment
Oracle Data Pump uses Streams functionality to communicate between processes. If the SGA_TARGET initialization parameter is set, then the STREAMS_POOL_SIZE initialization parameter is automatically set to a reasonable value.
If the SGA_TARGET initialization parameter is not set and the STREAMS_POOL_SIZE initialization parameter is not defined, then the size of the streams pool automatically defaults to 10% of the size of the shared pool.
When the streams pool is created, the required SGA memory is taken from memory allocated to the buffer cache, reducing the size of the cache to less than what was specified by the DB_CACHE_SIZE initialization parameter. This means that if the buffer cache was configured with only the minimal required SGA, then Data Pump operations may not work properly. A minimum size of 10M is recommended for STREAMS_POOL_SIZE in order to ensure successful Data Pump operations.
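A minimal sketch of applying that recommendation (the 64M value is an illustrative choice, not from the thread; the documented minimum is 10M):

```sql
-- Give the streams pool an explicit floor so Data Pump has
-- memory to allocate from (takes effect immediately with an spfile)
ALTER SYSTEM SET streams_pool_size = 64M SCOPE = BOTH;

-- Verify the new setting
SHOW PARAMETER streams_pool_size
```

On a RAC database like this one, adding SID='*' would apply the change to all instances.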
ASKER CERTIFIED SOLUTION
ASKER
When the import was initiated, the following error could be seen in the alert log:
Fri Oct 28 05:10:44 2011
Errors in file /usr/app/oracle/diagl/diag/rdbms/iohdssl/IOHDSSL1/trace/IOHDSSL1_ora_1938.trc (incident=100189):
ORA-04031: unable to allocate 56 bytes of shared memory ("streams pool","unknown object","streams pool","fixed allocation callback")
Since the datapump uses streams, it seems the inadequate streams/shared pool size is causing this issue.
This is what is the current status..
NAME                        TYPE         VALUE
--------------------------  -----------  ------
shared_pool_reserved_size   big integer  25M
shared_pool_size            big integer  500M
streams_pool_size           big integer  0
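Those parameter values can be cross-checked against what the SGA has actually carved out for the streams pool (a hedged diagnostic sketch; with STREAMS_POOL_SIZE at 0 and no SGA_TARGET, this may return little or nothing, which matches the ORA-04031):

```sql
-- Show the actual streams pool allocation inside the SGA
SELECT pool, name, ROUND(bytes / 1024 / 1024) AS mb
  FROM v$sgastat
 WHERE pool = 'streams pool';
```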
500M isn't a lot for the shared pool.
What are your other memory parameters set to?
I would strongly suggest automatic memory management:
http://download.oracle.com/docs/cd/E11882_01/server.112/e16638/memory.htm#PFGRF94254
Then, if Oracle needs more memory for specific tasks, it can borrow from other areas.
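A hedged sketch of what switching to automatic memory management could look like on 11g (the 2G target is illustrative only; size it for the server, and note both parameters require a restart):

```sql
-- With MEMORY_TARGET set, Oracle sizes the SGA components
-- (including the streams pool) automatically as demand shifts
ALTER SYSTEM SET memory_max_target = 2G SCOPE = SPFILE;
ALTER SYSTEM SET memory_target     = 2G SCOPE = SPFILE;

-- Restart for the spfile-only parameters to take effect
SHUTDOWN IMMEDIATE
STARTUP
```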
There are a couple of Metalink notes in this thread that might be related:
https://forums.oracle.com/forums/thread.jspa?messageID=4544671