harsha_dp

asked on

ORA-12801: is it a physical space issue?

Hi,
    I am using PROC SQL in SAS to connect to an Oracle database on a server. While fetching data I get the error:

Error fetching from cursor. ORACLE error is ORA-12801: error signaled in parallel query
 server P003   ORA-01652: unable to extend temp segment by 9929 in tablespace USR_TMP.

Is it a physical space issue? What are the possible solutions?
Ora_Techie

Yes, you have less tablespace storage than the query requires. Add a datafile to tablespace USR_TMP to increase its size.
harsha_dp (ASKER)

Is there an upper limit on the available space at any time? This is to decide when to add a datafile. Since we have an estimate of the temporary space the query uses during execution, if we know the limit we can program things dynamically, I believe. I am using SAS, so I can fix the upper bound before query execution.
No, this is not a memory-related issue. Just set the AUTOEXTEND property of the datafile in tablespace USR_TMP to ON, and the datafile will grow as much as it needs to.

You can also set MAXSIZE (the maximum size a datafile can grow to automatically when AUTOEXTEND is ON) to UNLIMITED.

Like:

alter database datafile 'DataFileNameWithFullPath'
autoextend on next 10M
maxsize 1000M;

This autoextends the file by 10M at a time, up to a maximum of 1000M.

Alternatively:

alter database datafile 'DataFileNameWithFullPath'
autoextend on next 10M
maxsize unlimited;

will extend the file by 10M whenever required, with no upper limit on the file size. (Note: if USR_TMP is a locally managed temporary tablespace, its files are tempfiles, so use ALTER DATABASE TEMPFILE instead of DATAFILE.)
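Before altering anything, the current size and autoextend settings of the USR_TMP files can be inspected. A minimal sketch, assuming SELECT access to the DBA views and that USR_TMP is a temporary tablespace (its files then appear in DBA_TEMP_FILES; for a permanent tablespace, query DBA_DATA_FILES instead):

```sql
-- Show each temp file's current size, whether it autoextends,
-- and its configured maximum size.
select file_name,
       bytes    / 1024 / 1024 as size_mb,
       autoextensible,
       maxbytes / 1024 / 1024 as maxsize_mb
from   dba_temp_files
where  tablespace_name = 'USR_TMP';
```

A row with AUTOEXTENSIBLE = 'NO', or a MAXSIZE_MB already reached, would explain the ORA-01652.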
I am just querying the server; I cannot do anything to the database, which is at a remote site. I need a solution that works within that constraint. My query has a GROUP BY clause, and the table holds around 11 million records. The result of the query is also stored at the remote site; finally I download the result set I am interested in.
No, you can't. Think about it: you are grouping an 11-million-record table, which needs a lot of temporary tablespace. You could ask the DBA at the remote site to make the change, or to assign you a larger temp tablespace, or ask them to run the query themselves at their location and let you download the results.
Yes, exactly right. But I am not the only person who uses the database; that's why I am looking for a solution from the experts. Though I take the point that the DBA has to look after concurrency. Right.

Thanks
Another possible solution is to divide your data logically into pieces and:

(1) process the data of one piece
(2) store the results
(3) go to 1

and repeat until you are finished.

Just like:

select col1, count(*), sum(col2)
from
(
  select * from
  (
    select rownum rno, a.*
    from tab a
    where rownum <= YourUpperLimit
  )
  where rno > YourLowerLimit
)
group by col1;


Say you can easily use a range of 1 million records; that gives 11 pieces.
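The piecewise approach above can be sketched end to end. This is only an illustration: the table and column names (tab, col1, col2) follow the earlier example, the slice bounds are hard-coded for one piece, and PARTIAL_AGG is an assumed staging table, not anything from the original thread. Counts and sums can be safely re-aggregated across slices, which is why this decomposition works:

```sql
-- Aggregate one 1M-row slice into the (assumed) staging table.
-- Repeat with the next pair of bounds for each of the 11 slices.
insert into partial_agg (col1, cnt, total)
select col1, count(*), sum(col2)
from (
  select * from (
    select rownum rno, a.*
    from tab a
    where rownum <= 2000000   -- upper bound of this slice
  )
  where rno > 1000000          -- lower bound of this slice
)
group by col1;

-- Once all slices are loaded, roll the partials up into the final result.
select col1,
       sum(cnt)   as total_rows,
       sum(total) as total_col2
from partial_agg
group by col1;
```

Each slice's GROUP BY then touches only ~1M rows, keeping the temp segment well under the USR_TMP limit.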
Yes. Yesterday, after thinking about it for a long time, I wrote code to fetch the data in stages and summarize it at the end. It works fine. But I want to minimize the processing time, and the solution I thought of and implemented has become a problem again.
ASKER CERTIFIED SOLUTION
Ora_Techie