

Updating data with an Oracle trigger

Posted on 2013-02-07
Medium Priority
Last Modified: 2013-02-11
We have a requirement whereby we need to copy data from Column A to Column X in the same table via an insert/update trigger. My background isn't in Oracle, so I'm struggling a bit with the syntax.

For background, the copy is needed to shape data stored as Unicode types (NVARCHAR and NCLOB specifically) into VARCHAR and CLOB equivalents ready for Oracle Text indexing. We cannot change the original column types, and we cannot control the INSERT/UPDATE statements as these are auto-generated by an existing API.

So, let's say we have a table with existing columns including:

ColumnA NVARCHAR(100)
ColumnB NVARCHAR(300)

We then append new shadow columns to the same table (and initialise them with a copy of the source column data):

ColumnA_VAR VARCHAR(100)

We then need a trigger that ensures the new shadow data is kept in sync with the original data. The trigger needs to handle updates to a single column, or to all of them, i.e.

UPDATE SampleTable SET ColumnA = N'NewValue';
UPDATE SampleTable SET ColumnA = N'NewValue A', ColumnB = N'NewValueB'; -- etc.

Initial attempts have shown only one of the shadow fields updating while the others were wiped, so I must stress that the trigger needs to allow editing of any number of the source columns without losing shadow data for columns that haven't changed.

If it helps, a SQL Server equivalent might look like the following (which I realise doesn't cater for multiple rows):

CREATE TRIGGER [sample_trigger] ON SampleTable
AFTER INSERT, UPDATE
AS
    UPDATE t SET
                 ColumnA_VAR = t.ColumnA
                 ,ColumnB_VAR = t.ColumnB
                 ,ColumnC_CLOB = t.ColumnC
    FROM SampleTable t
      INNER JOIN INSERTED i ON t.RecordId = i.RecordId

Can anyone provide a working example for Oracle?
Question by:Phil Catterall
LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38863726
You might lose data going from an NVARCHAR2 to a VARCHAR2.

Here is your test case:

drop table tab1 purge;

create table tab1(
	columnA nvarchar2(100),
	columnB nvarchar2(300),
	columnC nclob,
	columnA_Var varchar2(100),
	columnB_Var varchar2(300),
	columnC_Var clob
);

create or replace trigger tab1_trig
before insert or update on tab1
for each row
begin
	:NEW.columnA_Var := :NEW.columnA;
	:NEW.columnB_Var := :NEW.columnB;
	:NEW.columnC_Var := :NEW.columnC;
end;
/

show errors

insert into tab1(columnA, columnB, columnC) values('a','b','c');
select * from tab1;

update tab1 set columnA='z', columnB='y', columnC='x';
select * from tab1;


LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38863734
Have you thought about a mirrored table instead of additional columns?

What type of Text queries are you going to be doing?

You might be able to get away with a USER_DATASTORE if you will be searching across columns.

For example if columnA has 'Fred' and columnC has 'Flintstone', if you search for 'Fred Flintstone', will you want the row back?

Author Comment

by:Phil Catterall
ID: 38864325
Thanks for your responses.

A mirrored table is up for consideration if we cannot achieve this with a simple trigger; it just feels as if this is such a straightforward thing that it shouldn't require one.

I'll try creating a dummy table with the trigger as per the suggestion, but this looks very similar (if not identical) in syntax to what we've tried.

Regarding the Text queries, yes, multi-column searches are the goal. I've created indexes with MULTI_COLUMN_DATASTORE and that part seems to be fine; it's just getting the shadow data to update that's going wrong (at the moment). Perhaps having the index set to sync on commit is interfering with the trigger(?) I'll try to decouple things to see what happens.
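For reference, a MULTI_COLUMN_DATASTORE setup along these lines might look like the following sketch. The preference, index, and column names here are illustrative, not our actual schema:

```sql
-- Sketch only: MY_MULTI, sample_multi_idx and the shadow column names are illustrative.
-- The preference lists the VARCHAR2 shadow columns to be indexed together.
begin
	ctx_ddl.create_preference('MY_MULTI', 'MULTI_COLUMN_DATASTORE');
	ctx_ddl.set_attribute('MY_MULTI', 'COLUMNS', 'ColumnA_VAR, ColumnB_VAR');
end;
/

-- The index is created on one column but indexes the combined datastore output.
create index sample_multi_idx on SampleTable(ColumnA_VAR)
	indextype is ctxsys.context
	parameters('DATASTORE MY_MULTI SYNC(ON COMMIT)');
```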

LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38864351
>>Perhaps having the index set to sync on commit is interfering with the trigger(?)

Not likely since the trigger fires before the commit.

I might consider a USER_DATASTORE:

With this you create a stored procedure that outputs a CLOB that is indexed.

Within this procedure you can convert your NVARCHAR2 and NCLOB data however you want.  Then append the converted data to a CLOB and you are done.

With this you shouldn't need a trigger at all.
LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38864360
I have a user_datastore sample somewhere.  I'll see if I can find it and mock something up for you.
LVL 77

Accepted Solution

slightwv (Netminder) earned 2000 total points
ID: 38864454
See if this would simplify your life somewhat. (I'm still thinking you might lose data when converting from NVARCHAR2 to VARCHAR2, but I'll let you worry about that since I don't do anything with multi-byte character sets.)

drop table tab1 purge;
create table tab1(columnA nvarchar2(20), columnB nvarchar2(20), columnC nclob, idx_col char(1));

create or replace procedure myproc(rid in rowid, tlob in out clob) is
begin
	for c1 in (select columnA, columnB, columnC from tab1 where rowid = rid) loop
		dbms_lob.writeappend(tlob, length(c1.columnA || ' '), c1.columnA || ' ');
		dbms_lob.writeappend(tlob, length(c1.columnB || ' '), c1.columnB || ' ');
		-- columnC is an NCLOB, so convert it before appending to the CLOB
		dbms_lob.append(tlob, to_clob(c1.columnC));
	end loop;
end;
/

show errors

begin
	ctx_ddl.create_preference('MY_DATASTORE', 'user_datastore');
	ctx_ddl.set_attribute('MY_DATASTORE', 'procedure', 'myproc');
	ctx_ddl.set_attribute('MY_DATASTORE', 'output_type', 'CLOB');
end;
/

create index tab1_idx on tab1(idx_col) indextype is ctxsys.context parameters('Datastore MY_DATASTORE sync(on commit)');

insert into tab1 values('Fred','Flintstone','Lives in Bedrock',null);
insert into tab1 values('Barney','Rubble','Lives next door to Fred',null);

select * from tab1 where contains(idx_col,'Fred',1) > 0;
select * from tab1 where contains(idx_col,'Bedrock',1) > 0;

--now the 'magic'
select * from tab1 where contains(idx_col,'near((Fred,Flintstone),10,true)',1) > 0;



Author Comment

by:Phil Catterall
ID: 38867704
This is looking very promising, thanks for the comprehensive example.

Excuse my ignorance, but does the alternative approach prohibit us from making relevancy-based searches using the about() operator?

If I search with:

... WHERE CONTAINS(search_idx_placeholder, :Terms, 1) > 0

Passing the :Terms parameter the value 'about(some phrase)', I get zero results. Using the old index, the same query returned data.

Note that the query is fixed (but can be modified). The goal is to allow a user to search by phrase and about() was giving us that.

I can use CONTAINS() on a single word (i.e. without wrapping it in about()) and the expected record is returned, so I'm reasonably sure the data is there.

Any ideas?

Author Comment

by:Phil Catterall
ID: 38867942
Sorry, I've noticed that some phrases are working with about(); it could be a freak example, as the keyword I was searching for happened to be "void".

I'll keep testing; I think we might be there now. Thank you.

Author Comment

by:Phil Catterall
ID: 38868439
So close.

The index doesn't appear to refresh when we update data in the included fields. Is there something else we need to do to keep the index in sync?

Note we cannot modify the UPDATE statements in any way since they are auto-generated by a closed system.
LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38868595
I'll play around with the user_datastore and updates. I've only played with them a little, which is why I had that lying around.

>>could be a freak example the keyword I was searching happened to be "void".

About queries rely on a compiled knowledge base. Oracle provides one, but I'm not sure how detailed it is. You might also need to compile it (I can't remember if this is done by default).

About queries get even more fun when you start adding your own terms to the Thesaurus.

Check out this command:
ctxkbtc -user ctxsys/ctxsys -name DEFAULT
LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38868846
Just like the multi-column datastore (at least as I remember it), you need to update the column the index is based on for the index to resync.

In the example I posted above, if you update one of the columns you need to also update idx_col (just set it to null) and the index will resync.

update tab1 set columnA='Wilma', idx_col=null where columnA='Fred';

You might look into a trigger to make sure idx_col is touched on any DML.
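A minimal sketch of such a trigger against the tab1 example (assuming every insert or update should mark the row for resync; the trigger name is illustrative):

```sql
-- Sketch: touch idx_col on every insert/update so the Text index resyncs.
create or replace trigger tab1_touch_trig
before insert or update on tab1
for each row
begin
	-- Any assignment to the indexed column marks the row for sync,
	-- even setting it back to null.
	:NEW.idx_col := null;
end;
/
```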

Author Comment

by:Phil Catterall
ID: 38875695
I couldn't change the update statements, however adding a simple trigger to update the placeholder column was enough to kick-start the index sync.

Thanks for all your help with this.

Author Closing Comment

by:Phil Catterall
ID: 38875705
Provided a better solution than the one we were aiming for by removing the need for unnecessary table columns.
LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 38875923
>>Thanks for all your help with this.

Glad to help!

Question has a verified solution.
