Best method to load a 50 MB XML file into an Oracle table to search on?
Posted on 2014-07-28
What is the best method (binary XML, LOB, CLOB, external table, etc.) to load this 50 MB XML file into an Oracle table to search on?
I've been reading through the Oracle XML DB Developer's Guide, 11g Release 2 (11.2). I've tried various methods of loading a very large 50 MB XML file of XMLType into a table (CLOB, LOB, VARCHAR2, etc.) so I can run a SQL query against the table to feed latitude and longitude values into an Oracle geospatial procedure I have set up. The next method I want to try is binary XML storage:
CREATE TABLE XMLBIN OF XMLType
  XMLTYPE STORE AS BINARY XML;
but I wanted to check here first, since I've spent a lot of time on this already and I don't want to damage our heavily used test DB.
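Once the binary XML table exists, the search itself can be done with XMLTABLE. A minimal sketch is below; the XPath and attribute names (points/point, @lat, @lon) are placeholders, since the real document structure isn't shown here:

```sql
-- Shred latitude/longitude pairs out of the stored XML.
-- OBJECT_VALUE is the pseudocolumn for an object table of XMLType.
-- Element and attribute names are assumptions -- adjust them to
-- match the actual XML document.
SELECT x.lat, x.lon
FROM   XMLBIN t,
       XMLTABLE('/points/point'
                PASSING t.OBJECT_VALUE
                COLUMNS lat NUMBER PATH '@lat',
                        lon NUMBER PATH '@lon') x;
```

With binary XML storage (optionally plus an XMLIndex) a query like this may avoid fully materializing the 50 MB document for each row.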
I'm trying to use sqlldr to load the 50 MB XML file into an Oracle table. With every method I've used, the locally managed tablespace grows past 10 GB before loading stops after about 300 records. The error messages vary by method, but all of them consume roughly 10 GB of tablespace before stopping. I had 16 GB allocated, but this seems like far too much space for a 50 MB file.
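To see where the 10 GB is actually going, it may help to compare segment sizes before and after a load attempt; the LOB and LOBINDEX segments behind the XML column should show up near the top:

```sql
-- Sizes of the current schema's segments, largest first.
SELECT segment_name, segment_type,
       ROUND(bytes / 1024 / 1024) AS mb
FROM   user_segments
ORDER  BY bytes DESC;
```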
Table:

CREATE TABLE TABLE_XML_DATA

sqlldr control file:

INTO TABLE TEST_USER.TABLE_XML_DATA
FIELDS TERMINATED BY ','
(COL_XML_DATA LOBFILE(CONSTANT 'c:\Geospat.xml') TERMINATED BY EOF)
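For context, a complete control file along these lines would look roughly like the sketch below. The LOAD DATA/INFILE header, the FILLER field, and the single BEGINDATA record are assumptions filled in around the fragment above:

```
-- Sketch of a full control file; each data record after BEGINDATA
-- loads the entire LOBFILE into COL_XML_DATA for that row.
LOAD DATA
INFILE *
INTO TABLE TEST_USER.TABLE_XML_DATA
FIELDS TERMINATED BY ','
(
  fname FILLER CHAR(100),
  COL_XML_DATA LOBFILE(CONSTANT 'c:\Geospat.xml') TERMINATED BY EOF
)
BEGINDATA
dummy
```

One thing worth checking: with a LOBFILE(CONSTANT ...) clause, every input record stores the entire file into its own LOB. If each of the 300 records loaded ends up holding the full 50 MB document, that alone would be roughly 15 GB, which may explain the tablespace growth described above.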