Solved

EXPORT

Posted on 2008-06-12
621 Views
Last Modified: 2008-09-13
I have investigated the cause of the L1009 Data Pump export failure. The
following errors were reported in the export log file:

ORA-31626: job does not exist
ORA-31637: cannot create job exp_OPS$IHESS\SRVORACLEBATCH for user
OPS$IHESS\SRVORACLEBATCH
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "KUPC$C_1_20080611182005" and
"KUPC$S_1_20080611182005" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1580
ORA-04031: unable to allocate 56 bytes of shared memory ("streams
pool","unknown object","streams pool","fixed allocation callback")
--

expdp / directory=data_pump_dir dumpfile=schema_%SCH%.nightly_dmp
logfile=schema_%SCH%.log job_name=exp_%SCH1% schemas=%SCH1%
estimate=statistics flashback_time=\"to_timestamp('%ltime%','dd-mon-yy
hh24:mi:ss')\"

Here %SCH% and %SCH1% are replaced with the schema name (OPS$IHESS\SRVORACLEBATCH,
SYSTEM, VRS_SHR_OWNER). The export of these schemas failed because the jobs
named by the job_name parameter could not be created, most likely because the
required shared memory could not be allocated at the time.
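For reference, a quick way to check whether any of these jobs were ever created (a sketch, assuming DBA access on the instance) is to query the `dba_datapump_jobs` view:

```sql
-- List any Data Pump jobs known to the database, including failed
-- ones left behind in a NOT RUNNING state
SELECT owner_name, job_name, state
FROM   dba_datapump_jobs
ORDER  BY owner_name, job_name;
```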

Please advise me.
Question by:msabuu
6 Comments
 
LVL 3

Accepted Solution

by:dalebetts earned 250 total points
ID: 21773420
Up the limit on the streams pool (the streams_pool_size initialization parameter).
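A minimal sketch of that change (the 64M figure is an assumption, not a tuned value; assumes an spfile and the ALTER SYSTEM privilege):

```sql
-- streams_pool_size is dynamic in 10g; with automatic SGA management
-- (sga_target > 0) this value acts as a minimum size for the pool
SHOW PARAMETER streams_pool_size

-- 64M is an arbitrary starting point; size it to your workload
ALTER SYSTEM SET streams_pool_size = 64M SCOPE=BOTH;
```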
 
LVL 48

Expert Comment

by:schwertner
ID: 21777680
What is the Oracle version?
Data Pump is not stable before Oracle 10g R2 (10.2.0.1).
 

Author Comment

by:msabuu
ID: 21787015
Oracle 10g R2
 
LVL 8

Assisted Solution

by:LindaC earned 250 total points
ID: 21789961
Try these solutions and export again:

1- Remove any object you created that is named with the reserved word SYSTEM. View these by executing the following query:
select object_name, object_type from all_objects where object_name = 'SYSTEM';
2- Grant exp_full_database, imp_full_database, flashback any table, and create session to the user, or grant DBA.
3- If the schema being exported (i.e. not the user that started the Export Data Pump job) has scheduler jobs that need to be exported, and that user does not have enough tablespace quota, then the Export Data Pump job will fail.
4- Recompile invalid objects by executing the following:

SQL> @?/rdbms/admin/utlrp.sql

5- Increase the size of the undo tablespace.

6- Check whether the initialization parameter streams_pool_size is set to 0.

The failure is with the streams pool as in the error message:

ORA-04031: unable to allocate 32 bytes of shared memory ("streams pool", ...
Solution
-- To implement the solution, execute the following step:
Set streams_pool_size = 48M

SQL> alter system set streams_pool_size = 48M;
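To confirm the pool actually grew after the change (a sketch, assuming DBA access), you can check the SGA allocation directly:

```sql
-- Memory currently allocated to the streams pool in the SGA;
-- before the fix this is often zero, which is why the ORA-04031
-- "streams pool" allocation fails
SELECT pool, ROUND(SUM(bytes)/1024/1024) AS mb
FROM   v$sgastat
WHERE  pool = 'streams pool'
GROUP  BY pool;
```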


Good luck!
