• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1069

Not able to import in Oracle 11.1.0.7

Hi Experts

A few days back I got an export dump from the production db and transferred it to the test environment to be imported into the test db, but I get this error message:

impdp szc2525/singh_1011 DIRECTORY=CMR_3120943 dumpfile=POHDS2S_BETA_SE2D.dmp schemas=PROD_BETA REMAP_SCHEMA=PROD_BETA:SHDS01  exclude=GRANT

Import: Release 11.1.0.7.0 - 64bit Production on Wednesday, 26 October, 2011 17:59:50

Copyright (c) 2003, 2007, Oracle.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options

UDI-31623: operation generated ORACLE error 31623
ORA-31623: a job is not attached to this session via the specified handle
ORA-06512: at "SYS.DBMS_DATAPUMP", line 2862
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4052
ORA-06512: at line 1

I looked at dba_datapump_jobs too and found no job running, and no invalid objects either.
Is there any way around this? Any suggestions?
Can we import the same dump into the db some other way? It's very critical...

Thanks..
Shashi singh
1 Solution
 
slightwv (Netminder) Commented:
That is a pretty generic error.  From what I could find it appears to be a version mismatch.  Are production and test on the same version?

There are a couple of Metalink notes in this thread that might be related:
https://forums.oracle.com/forums/thread.jspa?messageID=4544671
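A quick way to compare the two sides is to run the same version query on both production and test (a sketch):

```sql
-- Shows the exact database version on this instance
SELECT banner FROM v$version;
```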
 
Jacobfw Commented:
Check whether any components are invalid in the database
SQL> select comp_name, status, version from dba_registry;

Recompile any invalid objects
SQL> @?/rdbms/admin/utlrp.sql

Then check whether any specific objects remain invalid
SQL> select object_name, object_type from dba_objects
     where status = 'INVALID' and owner = 'SYS';

Did you recently do a database flashback?
 
dba_shashi (Author) Commented:
Hi Experts
As per your suggestions:

1.
Yes, both the production and test databases run on the same Oracle 11.1.0.7 version.

BANNER
--------------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
PL/SQL Release 11.1.0.7.0 - Production
CORE    11.1.0.7.0      Production
TNS for Linux: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production

2.
SQL> select comp_name, status, version from dba_registry;

COMP_NAME             STATUS      VERSION
--------------------  ----------  ----------
Oracle Ultra Search   INVALID     11.1.0.7.0
Spatial               INVALID     11.1.0.7.0
Oracle Multimedia     INVALID     11.1.0.7.0

3.
SQL> @?/rdbms/admin/utlrp.sql

no rows selected

4.
SQL> select object_name, object_type from dba_objects where status ='INVALID' and owner = 'SYS';

OBJECT_NAME                         OBJECT_TYPE
----------------------------------- -------------------
VALIDATE_CONTEXT                    PROCEDURE
ALL_XSC_SECURITY_CLASS_DEP          VIEW
ALL_XSC_PRIVILEGE                   VIEW
ALL_XSC_AGGREGATE_PRIVILEGE         VIEW
WWV_FLOW_VAL                        PACKAGE BODY

5.
No... I did not do any database flashback recently.

 
slightwv (Netminder) Commented:
When you transferred the dmp file did you make sure it was a binary transfer?
 
dba_shashi (Author) Commented:
Yes it was. It was FTPed from the production server to the test server, the way we usually do it.
 
slightwv (Netminder) Commented:
I have to ask, mainly because I have done it myself: was the FTP in binary mode?

Can you FTP it again?

Maybe redo the export?
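For reference, a minimal command-line FTP session with binary mode forced before the transfer (a sketch, using the dump file name from the original command):

```
ftp> binary
ftp> put POHDS2S_BETA_SE2D.dmp
```

Data Pump dump files are binary, so an ASCII-mode transfer silently corrupts them.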
 
Jacobfw Commented:
Some reports indicate that if any components are invalid (as in your case), Data Pump fails with this error.
Try to resolve the outstanding invalid components in your database.

Others indicate that this fixes the problem:
  connect / as sysdba
  startup restrict
  alter system set cluster_database = false scope=spfile;
  shutdown immediate
  startup upgrade
  alter system set cluster_database = true scope=spfile;
  shutdown immediate
  startup
 
dba_shashi (Author) Commented:
Does STREAMS_POOL_SIZE have something to do with it?
My database is not configured with a streams pool size, and when I tried an export from this database it gave the same error I got while importing into it.
But when I tried another database where the streams pool was allocated, the export succeeded; I did not test an import into it.

Any comments/views/suggestion..

Thanks
Shashi singh
 
Jacobfw Commented:
Definitely this could be your problem:

From the 11g Utilities Manual:
Setting the Size Of the Buffer Cache In a Streams Environment

Oracle Data Pump uses Streams functionality to communicate between processes. If the SGA_TARGET initialization parameter is set, then the STREAMS_POOL_SIZE initialization parameter is automatically set to a reasonable value.

If the SGA_TARGET initialization parameter is not set and the STREAMS_POOL_SIZE initialization parameter is not defined, then the size of the streams pool automatically defaults to 10% of the size of the shared pool.

When the streams pool is created, the required SGA memory is taken from memory allocated to the buffer cache, reducing the size of the cache to less than what was specified by the DB_CACHE_SIZE initialization parameter. This means that if the buffer cache was configured with only the minimal required SGA, then Data Pump operations may not work properly. A minimum size of 10M is recommended for STREAMS_POOL_SIZE in order to ensure successful Data Pump operations.
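Based on that note, one possible fix is to set the streams pool explicitly rather than leave it at its default. A sketch; the 64M figure is an illustrative value (anything at or above the documented 10M minimum should do):

```sql
-- As a privileged user: give the streams pool an explicit floor
ALTER SYSTEM SET streams_pool_size = 64M SCOPE = BOTH;

-- Verify the new setting
SHOW PARAMETER streams_pool_size
```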

 
slightwv (Netminder) Commented:
If you are using Automatic Memory Management then you shouldn't have to set the individual pools.

If an expdp failed from the same database, there is something wrong with it, but I do not think it is the pool size.

Check the alert log to see if errors are being generated.
 
dba_shashi (Author) Commented:
When the import was initiated, the following error appeared in the alert log:

Fri Oct 28 05:10:44 2011
Errors in file /usr/app/oracle/diagl/diag/rdbms/iohdssl/IOHDSSL1/trace/IOHDSSL1_ora_1938.trc  (incident=100189):
ORA-04031: unable to allocate 56 bytes of shared memory ("streams pool","unknown object","streams pool","fixed allocation callback")

Since Data Pump uses Streams, it seems an inadequate streams/shared pool size is causing this issue.



This is the current status:

NAME                                 TYPE                             VALUE
------------------------------------ -------------------------------- ------------------------------
shared_pool_reserved_size            big integer                      25M
shared_pool_size                     big integer                      500M
streams_pool_size                    big integer                      0
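One way to see how much SGA memory the streams pool has actually received at runtime (a sketch; requires access to V$SGASTAT):

```sql
-- Current memory held by each SGA pool, in MB
SELECT pool, ROUND(SUM(bytes)/1024/1024) AS mb
FROM   v$sgastat
WHERE  pool IS NOT NULL
GROUP  BY pool;
```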
 
slightwv (Netminder) Commented:
500M isn't a lot for shared_pool.

What are your other memory parameters set to?

I would strongly suggest automatic memory management:
http://download.oracle.com/docs/cd/E11882_01/server.112/e16638/memory.htm#PFGRF94254

Then, if Oracle needs more memory for specific tasks, it can borrow from other areas.
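A minimal sketch of switching to Automatic Memory Management on 11g (the 2G target is an illustrative value, not a recommendation; size it to your server, and note the change needs a restart since MEMORY_TARGET depends on MEMORY_MAX_TARGET):

```sql
-- As SYSDBA: let Oracle manage SGA and PGA together
ALTER SYSTEM SET memory_max_target = 2G SCOPE = SPFILE;
ALTER SYSTEM SET memory_target     = 2G SCOPE = SPFILE;
-- Zero the manual targets so AMM can size the pools dynamically
ALTER SYSTEM SET sga_target = 0 SCOPE = SPFILE;
ALTER SYSTEM SET pga_aggregate_target = 0 SCOPE = SPFILE;
SHUTDOWN IMMEDIATE
STARTUP
```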
0
