How to insert records

Hi, I need to do some massive inserts by extracting data from an XML document stored as a CLOB in a table column. The number of records is about 40,000; I extract a particular node with SUBSTR etc. and then insert. I tried this and was successful for some 10 records, but I am not sure how to handle such a massive insert. Any help? I also tried creating a PL/SQL block for 500 records and wasn't successful.

What is the right approach?
Agreed; if you're already extracting OK and it's taking that long, it probably doesn't matter much how you do the insert.

However, to answer your question, depending on how you're doing the extraction, I'd either do a direct insert like ...

insert into your_table (select <your-extraction-query here>)  -- most recommended

-- next recommended
Or, if you're using PL/SQL to iterate through the XML, then as you pull the pieces out into variables, use a table of records with the data you want to insert. Then do a single bulk insert at the end.

The last choice (least recommended) is to pull the data out into variables in your PL/SQL and insert them one row at a time as you extract them.

The only option I absolutely do not recommend is building dynamic SQL out of concatenated values. Do any of the three above, but don't insert literals.
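To make the "don't insert literals" point concrete, here is a minimal sketch (table and variable names are assumptions, not from the thread): if dynamic SQL really is unavoidable, bind the values instead of concatenating them, so one statement is parsed once and reused for every row.

```sql
DECLARE
    v_a VARCHAR2(10) := 'a1';
BEGIN
    -- Avoid: EXECUTE IMMEDIATE 'insert into my_table (a) values (''' || v_a || '''  )';
    -- Prefer a bind variable:
    EXECUTE IMMEDIATE 'insert into my_table (a) values (:1)' USING v_a;
END;
/
```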
you can extract the xml data from the nodes directly using xpath queries.

Something like this perhaps....

insert into my_table (a, b, c)
select extractvalue(value(x), '/xquery/path/@a') as a,
       extractvalue(value(x), '/xquery/path/@b') as b,
       extractvalue(value(x), '/xquery/path/@c') as c
from my_xml_table mxt,
     table(xmlsequence(extract(mxt.my_xml_column, '/xqueryparent/xquery'))) x;

Mark Geerlings, Database Administrator, commented:
What is the "right approach"? I'm not a master of working with XML data in Oracle, but we have an approach that works for us. We also save the *.xml file into an Oracle table, but we don't use a CLOB column; we use a SYS.XMLTYPE column. Then we use a view like this to present the XML data in a format that we are used to working with (I removed many of the columns to save space):

create or replace view xml_icaras_orders_view as
select
          SUBSTR (EXTRACTVALUE (VALUE (htable), ...), ...),
          SUBSTR (EXTRACTVALUE (VALUE (htable), ...), ...),
          SUBSTR (EXTRACTVALUE (VALUE (htable), ...), ...),
          SUBSTR (EXTRACTVALUE (VALUE (htable), ...), ...),
          -- line data starts here
          SUBSTR (EXTRACTVALUE (VALUE (ltable), ...), ...),
          SUBSTR (EXTRACTVALUE (VALUE (ltable), ...), ...),
          SUBSTR (EXTRACTVALUE (VALUE (ltable), ...), ...),
          to_number (EXTRACTVALUE (VALUE (ltable), ...))
FROM icaras_orders_xml x,
-- extract headers from XML
  TABLE (XMLSEQUENCE (EXTRACT (x.<your-xmltype-column>,
    '/requirementsexport/requirements_export_header'))) htable,
-- extract lines from HTABLE to get correlated join
  TABLE (XMLSEQUENCE (EXTRACT (VALUE (htable),
    '/requirements_export_header/requirements_export_detail'))) ltable;

This works well for an interface we run regularly that processes between a few dozen and a few hundred records, and occasionally a thousand or more. It allows us to use "standard" SQL queries and/or PL/SQL procedures to process this data into Oracle by querying the view.
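As a minimal sketch of what "querying the view" can look like (the target table and column names here are assumptions, not from the thread), loading the relational tables becomes an ordinary set-based insert once the view exposes the XML as columns:

```sql
-- Hypothetical target table; the view presents the XML relationally.
INSERT INTO orders_staging (order_no, line_no, qty)
SELECT order_no, line_no, qty
  FROM xml_icaras_orders_view;
```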
Whew! I'm glad I'm not the only one to slap a view on top of a collection of EXTRACTVALUEs in order to expose XML nicely.
samir25 (Author) commented:
My XML is huge, and EXTRACTVALUE doesn't work; we store it in a CLOB (I wouldn't like to justify that now). I know how to extract the node. All I want to know is the consequence of inserting so many records at one time, or whether I should insert in ranges, like rownum 1-100. I know how to do it, but I am not sure of the impact of following one process.
Mark Geerlings, Database Administrator, commented:
No, I do not believe that this is the real problem: " the consequence of inserting so many records".  I think the real problem is the performance issue of extracting the data to be inserted into relational tables from the xml data.  That is usually the bottleneck with xml data processing in my experience.

How long does it take to extract the data from one node? How long would it take to extract the data from *ALL* nodes/records in the XML data (without doing any inserts)? If it takes almost as long to extract the data from one node as from all nodes, then that indicates the performance problem you face. I suspect the problem is on the extraction side, not the actual insert.
Definitely insert as many as you can (all of them) in one step; don't try to break it into a loop and insert small chunks. That consumes more resources and takes longer. And, as markgeer said, most of your work will come from the data extraction, not the insert itself.
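As a hedged illustration of the one-step approach (the APPEND hint is standard Oracle, but the table name and extraction query below are placeholders): a single set-based statement, optionally with a direct-path load, would look like this.

```sql
INSERT /*+ APPEND */ INTO my_table (a, b, c)
SELECT <your-extraction-query here>;
-- A direct-path insert must be committed before the session re-reads the table.
COMMIT;
```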

You can turn your CLOB data into XMLTYPE and use EXTRACTVALUE if you want. But if you already have a way to pull all the data from your nodes, that's fine too.
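A minimal sketch of that CLOB-to-XMLTYPE conversion, assuming hypothetical table, column, and path names (XMLTYPE(), EXTRACT, and XMLSEQUENCE are standard Oracle functions):

```sql
INSERT INTO my_table (a, b)
SELECT extractvalue(value(x), '/row/a'),
       extractvalue(value(x), '/row/b')
  FROM my_clob_table t,
       TABLE(XMLSEQUENCE(EXTRACT(XMLTYPE(t.my_clob_column), '/rows/row'))) x;
```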
samir25 (Author) commented:
EXTRACTVALUE doesn't work; it raises the 64K error because the nodes are big.
"How long would it take to extract the data from *ALL* nodes/records in the XML data (without doing any inserts)?" Probably an hour?
Mark Geerlings, Database Administrator, commented:
OK, it should take less than a second to insert an individual record into an Oracle table, but if the extract of the data from xml takes an hour, that clearly indicates where the problem is (in the extract from xml).

How can you optimize that? Great question! Like I said, I don't consider myself an expert in this area. We found a way (using EXTRACTVALUE) that works for us, but it may have limitations. You may need to keep searching, or keep studying the Oracle documentation (and examples, if you can find them), for ways of extracting information from XML data in Oracle.

Or, you may have to post a new question with a title more like: "How to extract XML data for Oracle to use?"
samir25 (Author) commented:
Referring to your suggestion:
"as you pull the pieces out into variables, use a table of records with the data you want to insert. Then do a single bulk insert at the end."

Can you show one example? Thanks for that in advance.
I'm assuming you have some PL/SQL block that looks something like this:

loop
    v_a := dbms_lob.substr(big_xml, dbms_lob.instr(.....), dbms_lob.instr(.....));
    v_b := dbms_lob.substr(big_xml, dbms_lob.instr(.....), dbms_lob.instr(.....));

    insert into your_table (a, b) values (v_a, v_b);
end loop;

So something like the following is what I'm imagining. I'm not using XML in this example, just to keep my parsing simple; where I do my simple SUBSTR, you'd put whatever XML parsing you already have.

As already noted in previous posts, the code below is an efficient way to do mass inserts; your parsing time is very likely to be your biggest performance sink.

Hope this helps!

CREATE TABLE my_table (a VARCHAR2(10), b VARCHAR2(10), c VARCHAR2(10));

DECLARE
    v_a_tab    DBMS_SQL.varchar2s;
    v_b_tab    DBMS_SQL.varchar2s;
    v_c_tab    DBMS_SQL.varchar2s;
    v_string   VARCHAR2(100)      := 'a1,b1,c1,a2,b2,c2,a3,b3,c3';
BEGIN
    -- parse the string into three collections (stand-in for your XML parsing)
    FOR i IN 1 .. 3
    LOOP
        v_a_tab(i) := SUBSTR(v_string, (i - 1) * 9 + 1, 2);
        v_b_tab(i) := SUBSTR(v_string, (i - 1) * 9 + 4, 2);
        v_c_tab(i) := SUBSTR(v_string, (i - 1) * 9 + 7, 2);
    END LOOP;

    -- single bulk insert of everything collected above
    FORALL i IN 1 .. 3
        INSERT INTO my_table
                    (a, b, c)
             VALUES (v_a_tab(i), v_b_tab(i), v_c_tab(i));
END;
/

SELECT *
  FROM my_table;
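For the 40,000-row case, here is a hedged sketch of the same collect-then-FORALL pattern with bounded memory: if the parsed rows can be produced by a query (the source table name below is a hypothetical placeholder), BULK COLLECT ... LIMIT fetches and inserts in large batches instead of either one row at a time or all 40,000 in one collection.

```sql
DECLARE
    CURSOR c IS
        SELECT a, b, c FROM parsed_xml_rows;   -- hypothetical source of parsed data
    TYPE t_rows IS TABLE OF c%ROWTYPE;
    v_rows t_rows;
BEGIN
    OPEN c;
    LOOP
        FETCH c BULK COLLECT INTO v_rows LIMIT 1000;  -- 1,000 rows per batch
        EXIT WHEN v_rows.COUNT = 0;
        FORALL i IN 1 .. v_rows.COUNT
            INSERT INTO my_table (a, b, c)
            VALUES (v_rows(i).a, v_rows(i).b, v_rows(i).c);
    END LOOP;
    CLOSE c;
    COMMIT;
END;
/
```

The LIMIT value trades memory for round trips; anywhere in the hundreds to low thousands is a common choice.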
