Oracle XE Help tablespace bloated

I am not an Oracle DBA; I have managed to install XE on my home PC so that I can develop reporting tools using Excel and thereby emulate my work DB. To do this I have recently had to truncate and re-populate a number of large(ish) tables.

The problem I now seem to have is that I appear to have hit a limit: not THE limit in terms of the 4 GB DB size, but a limit on what I think to be 2 of the TABLESPACES on the DB.

My question is, how do I "shrink" them? I have tried the compact database function within the XE interface, which reports back:
Space Allocated: 1,590 MB
Available:      3,530 MB
Maximum Allowable: 5,120 MB
Percent Used:      31%
Which seems to suggest I have more capacity; however, when I view the DB through SQL Developer, it shows the following:
SYSTEM   PCT_USED: 97.75   ALLOCATED: 350   USED: 342.13   FREE: 7.88
SYSAUX   PCT_USED: 97.02   ALLOCATED: 300   USED: 291.06   FREE: 8.94
UNDO     PCT_USED: 81.2    ALLOCATED: 500   USED: 406      FREE: 94
USERS    PCT_USED: 47.06   ALLOCATED: 940   USED: 442.38   FREE: 497.63
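(For reference, per-tablespace figures like those above come from Oracle's data dictionary; a query along these lines, run as a privileged user such as SYSTEM, produces roughly the same numbers. The query is a standard dictionary-view sketch, not the exact one SQL Developer uses.)

```sql
-- Per-tablespace allocated/used/free in MB, from the data dictionary.
SELECT df.tablespace_name,
       ROUND(df.bytes / 1024 / 1024, 2)                         AS allocated_mb,
       ROUND((df.bytes - NVL(fs.bytes, 0)) / 1024 / 1024, 2)    AS used_mb,
       ROUND(NVL(fs.bytes, 0) / 1024 / 1024, 2)                 AS free_mb,
       ROUND((df.bytes - NVL(fs.bytes, 0)) * 100 / df.bytes, 2) AS pct_used
FROM   (SELECT tablespace_name, SUM(bytes) AS bytes
        FROM   dba_data_files
        GROUP  BY tablespace_name) df
LEFT JOIN
       (SELECT tablespace_name, SUM(bytes) AS bytes
        FROM   dba_free_space
        GROUP  BY tablespace_name) fs
ON     df.tablespace_name = fs.tablespace_name
ORDER  BY pct_used DESC;
```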

It would seem that both the SYSTEM and SYSAUX tablespaces are the problem; however, I do not have the first idea what to do to overcome this. I have tried dropping a couple of large tables from the DB, but this has only freed up space in the USERS tablespace.

Any help greatly appreciated.

schwertner Commented:
This happens because you should append the rows in small chunks, about 1,000 rows at a time, and COMMIT after each chunk in order to empty the UNDO segment. That alone may not be enough: there is also a parameter, UNDO_RETENTION, that you should set to a very low value.
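A minimal PL/SQL sketch of this chunk-and-commit approach; the table names (big_src, big_dest) are placeholders for illustration only:

```sql
-- Copy rows in chunks of ~1000, committing after each chunk so that
-- each transaction's undo footprint stays small.
DECLARE
  CURSOR c IS SELECT * FROM big_src;
  TYPE t_rows IS TABLE OF big_src%ROWTYPE;
  v_rows t_rows;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO v_rows LIMIT 1000;  -- one chunk
    EXIT WHEN v_rows.COUNT = 0;
    FORALL i IN 1 .. v_rows.COUNT
      INSERT INTO big_dest VALUES v_rows(i);
    COMMIT;                                       -- release undo per chunk
  END LOOP;
  CLOSE c;
END;
/
```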

There are experts who claim that it is possible to insert without affecting the size of the UNDO segment. They suggest using a direct-path insert:
 insert /*+ APPEND */ into ... select ... from [some other table or view]
But the table must be defined (or altered) as NOLOGGING.

Also, to speed up the insert you can disable all indexes on the table(s) and rebuild them after the insert.
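A sketch combining these suggestions (direct-path insert, NOLOGGING, disabled indexes); all object names here are placeholders, not anything from your schema:

```sql
-- Minimize undo/redo for a bulk copy: NOLOGGING table, APPEND hint,
-- indexes taken offline during the load and rebuilt afterwards.
ALTER TABLE big_dest NOLOGGING;

ALTER INDEX big_dest_ix UNUSABLE;            -- skip index maintenance during load
ALTER SESSION SET skip_unusable_indexes = TRUE;

INSERT /*+ APPEND */ INTO big_dest
SELECT * FROM big_src;
COMMIT;                                      -- required before big_dest can be read again

ALTER INDEX big_dest_ix REBUILD NOLOGGING;   -- rebuild once, after the load
```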

What is the exact error message, or what makes you think you hit a limit?
If it is only a suspicion based on the figures above, you can put the tablespaces' datafiles in autoextend mode and they will not cause expansion problems:

alter database datafile '/u01/app/oracle/SYSTEM01.dbf' autoextend on;

Find the correct names and locations of those files first.
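A hypothetical sequence for doing this (the file path below is illustrative; use whatever the dictionary query actually reports on your machine):

```sql
-- List the real datafile names, their tablespaces, and autoextend status.
SELECT file_name, tablespace_name, autoextensible
FROM   dba_data_files;

-- Then enable autoextend on the file(s) reported above (example path only).
ALTER DATABASE DATAFILE '/u01/app/oracle/oradata/XE/system.dbf'
  AUTOEXTEND ON NEXT 10M MAXSIZE UNLIMITED;
```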

Be aware that these percentages are normal for these tablespaces, and in the common case autoextend is set to ON.


MISServices (Author) Commented:
OK, so from what schwertner has said, the size of the 2 tablespaces I refer to is not the issue. As is typical in these cases, I am now attempting to append bite-sized parts of the table, and so far the appends have not failed. However, I still feel there is an issue I am not fully understanding: why it takes so long to append the rows, and I mean a very long time indeed. For example, the table I am trying to append from has over 1.5 million rows; if I were to append the lot into the XE equivalent table it would take over 3 hours, sometimes longer, before Access reports back (usually with a failure error).

So is the approach I am using appropriate? E.g. I use a make-table query at work to create an Access table from the Oracle table. At home I then link the XE equivalent table and run an MS Access append query. Is there a more efficient way to achieve the same results?
MISServices (Author) Commented:
I have now managed to attempt an append that resulted in an error. The error, as reported back by MS Access, is:
ODBC insert on linked table [TableName] failed.
[Oracle][ODBC][Ora]ORA-30036: unable to extend segment by 8 in undo tablespace 'UNDO'(#30036)

Both the UNDO tablespace and USERS tablespace are at 99-100%.

The append query (100,000 rows) took 2 hours before it failed!

MISServices (Author) Commented:
Thanks schwertner

Based on your advice I created a VBA function that dynamically creates append queries of 500 rows at a time. It worked a dream, and none of the issues previously mentioned arose. I also found that turning LOGGING off for both the table and its indexes reduced the append time by a further 8-10 seconds per 500 rows. So I would suggest this is the best solution if you are using Access as the vehicle for appending the data.

Having said all of this, I have procured a copy of DATA LOADER, which seems to handle things in a tidier way.

Thanks again for your advice.
That's fantastic news!
Good luck!
Question has a verified solution.


