Here's a head-scratcher: for every table in a given schema, I need to duplicate the table's DDL for trigger-based change capture. For example, DEPT_AUD is created from DEPT using the DBMS_METADATA package. Standard WHO columns are appended, plus a column holding the DML type (I, U, or D).
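To make the DDL-duplication step concrete, here's a minimal sketch using DBMS_METADATA.GET_DDL, assuming the base table is DEPT in the current schema. The WHO column names, the _AUD suffix, and the REPLACE-based rename are all illustrative, not a finished implementation:

```sql
DECLARE
  l_ddl CLOB;
BEGIN
  -- Strip storage clauses and constraints so the copy creates cleanly
  -- (constraint names would otherwise collide with the base table's).
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'SEGMENT_ATTRIBUTES', FALSE);
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'CONSTRAINTS', FALSE);
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'REF_CONSTRAINTS', FALSE);

  l_ddl := DBMS_METADATA.GET_DDL('TABLE', 'DEPT', USER);
  -- Crude rename; good enough only when the table name doesn't recur in the DDL.
  l_ddl := REPLACE(l_ddl, '"DEPT"', '"DEPT_AUD"');
  EXECUTE IMMEDIATE l_ddl;

  -- Append the WHO columns and the DML-type flag (I/U/D); names are illustrative.
  EXECUTE IMMEDIATE 'ALTER TABLE dept_aud ADD (
      aud_user      VARCHAR2(30) DEFAULT USER,
      aud_timestamp DATE         DEFAULT SYSDATE,
      aud_action    VARCHAR2(1))';
END;
/
```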
I can manually write a row-level trigger on DEPT: on insert, update, or delete, the table's :OLD column values are inserted into DEPT_AUD with the appropriate key. But if DEPT has n columns, the CREATE TRIGGER statement must also list those n columns (plus the WHO columns, lest we forget).
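For concreteness, the hand-written version for the classic SCOTT.DEPT (DEPTNO, DNAME, LOC) might look like this; the trigger and WHO column names are illustrative:

```sql
CREATE OR REPLACE TRIGGER dept_aud_trg
  AFTER INSERT OR UPDATE OR DELETE ON dept
  FOR EACH ROW
DECLARE
  l_action VARCHAR2(1) :=
    CASE WHEN INSERTING THEN 'I'
         WHEN UPDATING  THEN 'U'
         ELSE 'D' END;
BEGIN
  -- Capture the :OLD image; note that on INSERT every :OLD value is NULL,
  -- so an I-row records only the WHO data.
  INSERT INTO dept_aud (deptno, dname, loc,
                        aud_user, aud_timestamp, aud_action)
  VALUES (:OLD.deptno, :OLD.dname, :OLD.loc,
          USER, SYSDATE, l_action);
END;
/
```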
For a few tables, the manual approach may be simplest. I've thought of dynamic SQL driven by user_tab_cols, but I can't figure out how to loop through the variable number of columns for each table. My guess is that %ROWTYPE would somehow eliminate the kludgy column-by-column mapping, but that's an area where I'm ignorant.
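A minimal sketch of the generator idea, assuming the _AUD tables already exist with the WHO columns named as above (all names illustrative). One caveat worth noting: :OLD is a pseudo-record that can't be inserted as a whole in SQL, which is why some column-by-column generation like this seems hard to avoid:

```sql
DECLARE
  l_cols VARCHAR2(32767);
  l_olds VARCHAR2(32767);
BEGIN
  FOR t IN (SELECT table_name FROM user_tables
            WHERE table_name NOT LIKE '%\_AUD' ESCAPE '\') LOOP
    l_cols := NULL;
    l_olds := NULL;
    -- Build matching column and :OLD.column lists for this table.
    FOR c IN (SELECT column_name FROM user_tab_cols
              WHERE table_name = t.table_name
                AND hidden_column = 'NO'
              ORDER BY column_id) LOOP
      l_cols := l_cols || c.column_name || ', ';
      l_olds := l_olds || ':OLD.' || c.column_name || ', ';
    END LOOP;
    EXECUTE IMMEDIATE
      'CREATE OR REPLACE TRIGGER ' || t.table_name || '_aud_trg'
   || ' AFTER INSERT OR UPDATE OR DELETE ON ' || t.table_name
   || ' FOR EACH ROW'
   || ' BEGIN'
   || '   INSERT INTO ' || t.table_name || '_AUD ('
   ||       l_cols || 'aud_user, aud_timestamp, aud_action)'
   || '   VALUES (' || l_olds || 'USER, SYSDATE,'
   || '     CASE WHEN INSERTING THEN ''I'''
   || '          WHEN UPDATING  THEN ''U'''
   || '          ELSE ''D'' END);'
   || ' END;';
  END LOOP;
END;
/
```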
For the record, GoldenGate, LogMiner, and similar log-based CDC approaches are off the table, so don't bother suggesting them. New 11g features are fine. Second, I'm only interested in capturing the :OLD data. Third, functionality takes priority over performance. Lastly, this audit requirement touches several thousand DW tables.
Thanks in advance, and I'm looking forward to a lively discussion.