Solved

Need help inserting data from a datatable to an Oracle table using ODP.Net

Posted on 2011-02-23
31
830 Views
Last Modified: 2012-05-11
I have an Oracle table that I am going to start using for archive purposes for some of my application data.  I have a datatable that has the data that I want to archive.  My column names in the Oracle table are not the same as what is currently in the datatable.

Do I have to do "INSERT INTO .....(oracle.column1....) Values (..datatable.column1...)
or is there a better way?
0
Comment
Question by:rgn2121
  • 16
  • 15
31 Comments
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34962942
Where does the data in the datatable come from initially?

Worst case, you should be able to do a bulk insert.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 34962956
The data comes from a couple of other Oracle tables....I compile the data from those into one large table for reporting and now I want to upload it to a different Oracle DB.  I have tried OracleBulkCopy, but I am guessing my column names might need to match?
0
 
LVL 76

Assisted Solution

by:slightwv (Netminder)
slightwv (Netminder) earned 500 total points
ID: 34963019
Let me start by telling you that I'm far from a .Net Expert.  I know enough to be dangerous.  I'm an Oracle DBA by trade.

That said:  The columns shouldn't matter if you use parameter binding.

I've used array binding for a straight insert.  The code in the following link uses a stored procedure.  Just change the command text to the insert statement, remove the stored procedure command type, and it should work.

http://www.oracle.com/technology/sample_code/tech/windows/odpnet/howto/arraybind/index.html

I'll see if I can scrub my code so I can post an example of what I did.
0
 
LVL 76

Accepted Solution

by:
slightwv (Netminder) earned 500 total points
ID: 34963058
I guess it didn't need that much scrubbing...

It accepts an array of strings and inserts them into a table with a varchar2 column.  Hopefully you can adapt this to your dataset.
' Needs Imports Oracle.DataAccess.Client and a reference to System.Configuration
Sub bulkInsert(stringArr As String())

		' Connection string comes from the config file
		Dim con As New OracleConnection(ConfigurationManager.AppSettings("ConnectionString"))
		Dim cmd As New OracleCommand()

		cmd.Connection = con
		cmd.CommandType = CommandType.Text

		Try
			cmd.CommandText = " insert into myTestTable(someColumn) values(:myvalue)"

			' Bind the whole array at once: one insert execution per array element
			cmd.ArrayBindCount = stringArr.Length

			Dim param1 As OracleParameter = cmd.Parameters.Add("myvalue", OracleDbType.Varchar2, _
				50, stringArr, ParameterDirection.Input)

			con.Open()
			cmd.ExecuteNonQuery()

...

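To call it, you would do something like the following (untested here, and myDataTable / "SOME_COLUMN" are just placeholders for whatever you have):

' Pull one datatable column into a String() and hand it to the sub above
Dim values As New List(Of String)
For Each row As DataRow In myDataTable.Rows
	values.Add(row("SOME_COLUMN").ToString())
Next
bulkInsert(values.ToArray())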

0
 
LVL 12

Author Comment

by:rgn2121
ID: 34963228
So I have to add a parameter for every column?...I was hoping for something where I didn't have to type them all.
I will try this out and let you know...thanks
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34963512
>>So I have to add a parameter for every column?...

Not sure.  Never tried any other way.

>>I was hoping for something where I didn't have to type them all.

Then if someone adds a new column to the table, your code breaks.

0
 
LVL 12

Author Comment

by:rgn2121
ID: 34969277
I am assuming that stringArr is the value that is getting inserted...?  I don't see where you are looping through your data, but hitting each column and each row to do the insert seems time consuming.  How much data do you upload when you do this and how long does it take?
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34969843
>>I am assuming that stringArr is that value that is getting inserted...?

All the string values contained in the Array are getting 'bulk' inserted.

>>I don't see where you are looping through your data

With bulk binding there is no need to loop to do the inserts.

>>but hitting each column and each row to do the insert seems time consuming.

Not sure I understand where you are going with this.  You should be able to create an array that matches the columns and still do a bulk insert.

>> How much data do you upload when you do this and how long does it take?

So far it's a few hundred rows.  Never had to time it since it runs fast enough that there was no need to attempt to tune it.

If you run experiments I would love to see what you come up with on how it performs under a load.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 34970184
I am still confused...

cmd.commandText= " insert into myTestTable(someColumn) values(:myvalue)"
This looks to me like you are saying insert into one column, a set of values...

so if I have 50 columns then I need to say
"Insert into myTestTable(col1, col2, col3, col4, ..., col50) Values(:myValues1, myvalues2...)????
Which would mean I would need multiple paramters listed below?

cmd.ArrayBindCount = stringArr.Length
Dim param1 as OracleParameter = cmd.Parameters.Add("myvalue", OracleDbType.Varchar2, _
                        50,stringArr, ParameterDirection.Input)

0
 
LVL 12

Author Comment

by:rgn2121
ID: 34970217
maybe I do see now...The column is an array.  So if I have 3000 rows, col1 is an array of those values from row 0 to row 2999?
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34970219
I'll see if I can work on a better test case a little later.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 34970356
I appreciate that, but I won't need a test case if it turns out I have to create an array for each column.  An article I found makes it look like that is what I need to do.  So I will have 50 arrays of 3000 items each, and each one gets bound to its column...
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34971931
I'll still need to do some research.  I just read up on it long enough to get my code working.  It just happened that I had a single column, so a single array worked.

You might be right about multiple arrays.  Not sure.  If that is true, I would probably forget this approach and do the good old loop-and-insert approach.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 34972185
Thanks...I am working on the array binding.  I think I might try a number of different approaches to test speed.  I was really hoping to be able to use the MERGE feature in Oracle to upsert my data...I might have to have all my columns named the same for that and then write a stored procedure.

I will leave this open and post back when I get it working.  Thanks for the help....
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34972584
Using the merge should be independent of how you call it.  It is after all just a SQL command like any other.
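
For what it's worth, the merge command text would just be something like this (table and column names are made up, purely to show the shape):

cmd.CommandText = _
	"merge into archive_tab t " & _
	"using (select :id as id, :val as val from dual) s " & _
	"on (t.id = s.id) " & _
	"when matched then update set t.val = s.val " & _
	"when not matched then insert (id, val) values (s.id, s.val)"
' Array binding should work the same way here, I believe: bind :id and :val to
' arrays, set cmd.ArrayBindCount, and each pair of elements becomes one upsert.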

You are still on my radar.  It's just today is hectic and I cannot find the time right now to work on actual code.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 34972946
No worries....this isn't pressing.  Just doing it in between waiting for Hyperion to return result sets.... :(
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 34983326
I finally got a chance to look into this a little today.

Having re-read this to make sure I was still on track, does there have to be a .Net component to this?

I've not been able to find anything complex on array binding either.

I'm now looking at passing refcursors around, and that's when the question about the .Net aspect hit me.

Please confirm the .Net aspect is necessary.
0
 
LVL 76

Assisted Solution

by:slightwv (Netminder)
slightwv (Netminder) earned 500 total points
ID: 34983415
I think refcursors will work.

Here's the ODP.Net reference showing the OracleDbType.RefCursor type and some sample code messing with them.

http://download.oracle.com/docs/cd/B19306_01/win.102/b14307/featRefCursor.htm#sthref342

Below is some PL/SQL that passes them around and moves data from one table to the other without referencing columns.

Hopefully between all this, something will work?

If you have specific questions, please ask.  I'll try to answer them.
drop table tab1 purge;
create table tab1(col1 char(1), col2 char(1));

drop table tab2 purge;
create table tab2(col1 char(1), col2 char(1));

insert into tab1 values('a','b');
insert into tab1 values('c','d');
commit;


-- opens a cursor over the source table without naming any columns
create or replace procedure outCur ( myCur out sys_refcursor )
is
begin
	open myCur for 'select * from tab1';
end;
/

show errors

-- bulk fetches the cursor into a %rowtype collection and inserts whole rows
create or replace procedure inCur ( myCur in sys_refcursor )
is
	TYPE array_t IS TABLE OF tab2%rowtype INDEX BY BINARY_INTEGER;
	rec_array array_t;
begin
	FETCH myCur BULK COLLECT INTO rec_array;

	-- 1 .. COUNT is safe even if the cursor returned no rows
	FOR i IN 1 .. rec_array.COUNT
	LOOP
		insert into tab2 values rec_array(i);
	END LOOP;
end;
/

show errors


declare
	myCur sys_refcursor;
begin
	outCur(myCur);
	inCur(myCur);
end;
/

select * from tab2;


0
 
LVL 12

Author Comment

by:rgn2121
ID: 34996209
Thanks...I am going to try and get the parameters working right now and might get into this after that.  Once I get the parameters finished I will close this.  Thanks for the help...
0
 
LVL 12

Author Comment

by:rgn2121
ID: 35037501
WORKS!  

Okay, I was able to use what you posted above to do the parameter array binding and upload to my Oracle DB.  I had to create 58 arrays, 1 for each of my columns.  I then looped through the in-memory datatable (4,137 rows) and added items to their respective arrays...then executed the query.
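
Trimmed way down, the pattern looks roughly like this (dt is the in-memory datatable, cmd is an OracleCommand on an open connection, names are made up, and only 2 of the 58 columns are shown):

' One array per Oracle column, each sized to the number of datatable rows
Dim col1Arr(dt.Rows.Count - 1) As String
Dim col2Arr(dt.Rows.Count - 1) As String

For i As Integer = 0 To dt.Rows.Count - 1
	col1Arr(i) = dt.Rows(i)("SourceCol1").ToString()
	col2Arr(i) = dt.Rows(i)("SourceCol2").ToString()
Next

' The Oracle column names don't have to match the datatable names
cmd.CommandText = "insert into ARCHIVE_TABLE (ORA_COL1, ORA_COL2) values (:p1, :p2)"
cmd.ArrayBindCount = dt.Rows.Count
cmd.Parameters.Add("p1", OracleDbType.Varchar2, 50, col1Arr, ParameterDirection.Input)
cmd.Parameters.Add("p2", OracleDbType.Varchar2, 50, col2Arr, ParameterDirection.Input)
cmd.ExecuteNonQuery()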

The entire routine takes less than 1.5 seconds.  The upload itself takes about .5 - .62 seconds...

Thanks for the help...I am going to move on to try OracleBulkCopy and rename all my columns.  I appreciate the help!
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 35037677
Glad to help.

Just curious:  How long did it take looping in code and making individual insert calls?
0
 
LVL 12

Author Comment

by:rgn2121
ID: 35037805
I didn't even try that....I might later.  I am saving the different projects so that I can test them out and see which is the fastest.  If I do the individual calls I will post back and let you know...
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 35037848
>>I didn't even try that....I might later.

I would have done that first as a baseline.

>> I had to create 58 arrays

Seems like a LOT of extra work just to potentially save milliseconds.

0
 
LVL 12

Author Comment

by:rgn2121
ID: 35038698
I assumed that the row-by-row insert would take the longest given the number of rows and columns that I had to upload.  I really wanted a bulk upload, but with my column names not being the same I didn't think it was feasible.

I have another person in my dept that is uploading about 8,000 records and 20 columns and it takes about 30 - 35 seconds, going row by row.

The array binding was only a couple hundred lines of code.  I can move that to about 8-10 lines though if the bulk copy works..
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 35038900
Cool.

Even though this is closed, if you can remember, please come back and post your findings when you are all done with your testing.

I'd be interested in what you found.  I'm sure other people that might stumble across this question will be as well.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 35069433
Will do...I won't be back to this for a while, but might be able to try it out in a few weeks.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 35085697
Okay...So I renamed all of my in-memory columns to match what I have in Oracle and ordered them the same way.  Using OracleBulkCopy took only slightly longer to upload...maybe 3 hundredths of a second.  I was only measuring the time it took to execute the upload command itself.

Running all of the upload code for 60 columns might bring the two closer together, since the parameter binding required about 300-400 lines of code vs. maybe 10 using the bulk copy, and the time spent creating the arrays for the binding helps balance things out.

The difference in time is so negligible in my case that the BulkCopy seems the way to go.  The only thing I liked about the parameter binding is that the insert errors that came back included column names, whereas the BulkCopy would return only a column index.

The info above is based off of about 4200 rows and 60 columns worth of data.
0
 
LVL 12

Author Comment

by:rgn2121
ID: 35085771
Okay...I went back and timed both routines from start to finish.  This way I take into account the time to create the arrays.  The parameter binding took 1.39 seconds and the BulkCopy took .69 seconds...this includes opening the connection.

I am currently testing the BulkCopy using a BatchSize of 5000.  I have not tested the refcursor method and don't know if I will.
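
In case it helps anyone who finds this later, the bulk copy side boils down to roughly this (the connection string and table name are placeholders for mine):

' Column names/order in the in-memory datatable now match the Oracle table
Dim bulkCopy As New OracleBulkCopy(connectionString)
bulkCopy.DestinationTableName = "ARCHIVE_TABLE"
bulkCopy.BatchSize = 5000
bulkCopy.WriteToServer(dt)   ' dt is the in-memory datatable
bulkCopy.Close()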
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 35085856
Thanks for the updates.

I always weigh performance against future maintenance.  I've seen very efficient, well-running code that would be a nightmare for anyone else to maintain, and likely even for the developer that wrote it 6 months later.

I will typically sacrifice some performance for more 'readable' code, because I KNOW I will have to touch that code later when requirements change, and I hate asking myself: Now, why did I do that?
0
 
LVL 12

Author Comment

by:rgn2121
ID: 35085903
HA!  Yeah...I have muttered that statement more than once for sure.
0
 
LVL 76

Expert Comment

by:slightwv (Netminder)
ID: 35086091
lol..  we all have!
0
