
I have investigated the cause of the L1009 Data Pump export failure. The
following errors are reported in the export log file:

ORA-31626: job does not exist
ORA-31637: cannot create job exp_OPS$IHESS\SRVORACLEBATCH for user
OPS$IHESS\SRVORACLEBATCH
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "KUPC$C_1_20080611182005" and
"KUPC$S_1_20080611182005" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1580
ORA-04031: unable to allocate 56 bytes of shared memory ("streams
pool","unknown object","streams pool","fixed allocation callback")
--

expdp / directory=data_pump_dir dumpfile=schema_%SCH%.nightly_dmp
logfile=schema_%SCH%.log job_name=exp_%SCH1% schemas=%SCH1%
estimate=statistics flashback_time=\"to_timestamp('%ltime%','dd-mon-yy
hh24:mi:ss')\"

Here %SCH% is replaced with the schema name (OPS$IHESS\SRVORACLEBATCH,
SYSTEM, VRS_SHR_OWNER). The export of these schemas failed because the jobs
named by the job_name parameter could not be found. These jobs might not
have been created because there was not enough shared memory available to
allocate at that time.
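
For reference, the current streams pool allocation can be checked with something like the following (a sketch only, assuming a SQL*Plus session with DBA privileges on the instance):

SQL> -- current streams pool usage and the configured minimum size
SQL> select pool, name, bytes from v$sgastat where pool = 'streams pool';
SQL> show parameter streams_pool_size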

Please advise me.
msabuuAsked:
2 Solutions
 
dalebettsCommented:
Up the limit on the streams pool.
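
For example (a sketch only; 64M is an arbitrary starting value, and scope=both assumes the instance was started with an spfile):

SQL> -- 64M is only an example value; size it for your workload and SGA
SQL> alter system set streams_pool_size = 64M scope=both;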
 
schwertnerCommented:
What is the Oracle version?
Data Pump is not stable before Oracle 10g R2 (10.2.0.1).
 
msabuuAuthor Commented:
Oracle 10g R2
 
LindaCCommented:
Try these solutions and export again:

1- Remove any object you created that is named with the reserved word SYSTEM. View these by executing the following query:
select object_name, object_type from all_objects where object_name = 'SYSTEM';
2- Grant exp_full_database, imp_full_database, flashback any table, and create session to the user (or grant the DBA role); a sketch of these grants appears after this list.
3- If the schema being exported (i.e. not the user that started the Export Data Pump job) has scheduler jobs that need to be exported, and that user does not have enough tablespace quota, then the Export Data Pump job will fail.
4- Recompile any invalid objects by executing the following:

SQL> @?/rdbms/admin/utlrp.sql

5- Increase the size of the undo tablespace.

6- Check whether the initialization parameter streams_pool_size is set to 0.
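
For point 2, a minimal sketch of the grants (assuming the export is run by the OS-authenticated account shown in the log; the name needs quoting because of the backslash):

SQL> -- account name taken from the error log above; adjust if the export runs as a different user
SQL> grant create session, exp_full_database, imp_full_database to "OPS$IHESS\SRVORACLEBATCH";
SQL> grant flashback any table to "OPS$IHESS\SRVORACLEBATCH";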

The failure is in the streams pool, as shown in the error message:

ORA-04031: unable to allocate 32 bytes of shared memory ("streams pool", ...
Solution
-- To implement the solution, set streams_pool_size = 48M:

SQL> alter system set streams_pool_size = 48M;


Good luck!
