Oracle 11gR2 Security Options

Please share your expert input: what are my best options in this security domain?

- track all DDL (object-level and system-level)
- monitor/track/log, or even stop, unnecessarily huge SELECTs (a million rows or more, or queries kicked off outside normal business hours)
- log/track all DML as well

Please suggest what the best practices or best solutions are in our scenario.

Thanks and very best regards
Mushfique Khan, Director Operations, asked:
Mark Geerlings, Database Administrator, commented:
Those are three different tasks that will require three different approaches.  (Of course, there is also the decision to be made of whether any or all of these are worth doing.  But apparently you or someone in your organization has already decided that these may be worth doing even before you know exactly how to do them, or how much they will cost.)

Task #1: "track all the DDL's (Object or System too)" You can write DDL triggers to log all DDL activity, at least for non-SYS objects.  For SYS-owned objects these triggers will not fire, since the SYS schema does not support user-written triggers.  Here is an example of a trigger I wrote some years ago in our Oracle 10g system to check for unexpected "create" activity (ddl_check is a user-written package):
create or replace trigger ddl_create_ck before create on database
declare
  dummy varchar2(30);
begin
  dummy := ddl_check.ddl_ok;
  if dummy > ' ' then
    raise_application_error(-20501,'You are not authorized to create objects in this database');
  end if;
end;
/

Task #2 "monitor/track/log or even stop, unnecessary huge SELECTs ..."  You can turn on auditing (which is off by default in Oracle) to at least monitor any or all "select..." activity.  Be aware that this auditing can collect a huge amount of information that you will need to manage.  Alternatively, you can turn on "resource limits".  These have the advantage of being able to stop queries that exceed a certain limit, but the limit is *NOT* based on the number of rows that will be returned.  Instead, it is based on the number of database blocks that the query forces Oracle to read.  It is possible for a query that will only return one row to hit this limit, and it is possible for queries that return millions of rows to never hit it.  It depends on a number of things: the complexity of the query (joins or subqueries or not); the number of bytes per row in the table(s); the number of rows per block in the table(s); and the availability of indexes matching the literal values (if any) in the query's "where" clauses.
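To make the two approaches concrete, here is a hedged sketch; the table, user, and profile names are hypothetical and the limit value is illustrative only:

audit select on hr.employees by access;    -- statement auditing for one table

alter system set resource_limit = true;    -- profiles are not enforced unless this is set
create profile limited_reader limit
  logical_reads_per_call 100000;           -- counted in database blocks, not rows
alter user report_user profile limited_reader;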

Task #3 "log/track all the DMLs"  This is another huge task.  It will require that you either:
a. write "after insert", "after update" and "before delete" triggers that include the "for each row" clause for *EVERY* table in your database (except those owned by SYS), plus at least one audit table to collect this information.  Another option is to create an audit table for every table you want to monitor.
b. use LogMiner to retrieve the DML activity that Oracle automatically captures.  But there are limitations to this approach: it is slow to query; it is organized chronologically by when the activity happened, not by table and column; the data is not saved long-term - it is only available as far back as you have archived redo log files on disk; it does not capture the actual DML statement, but rather reconstructs equivalent DML for each row affected; and the biggest limitation is that if anyone does DDL on a table, all prior LogMiner data for that table (or partition) becomes invalid and unretrievable.
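A minimal sketch of option (a), assuming a hypothetical "orders" table with an order_id key:

create table orders_audit (
  audited_at timestamp,
  audited_by varchar2(30),
  action     varchar2(6),
  order_id   number
);

create or replace trigger orders_dml_aud
after insert or update or delete on orders
for each row
begin
  insert into orders_audit (audited_at, audited_by, action, order_id)
  values (systimestamp, user,
          case when inserting then 'INSERT'
               when updating  then 'UPDATE'
               else 'DELETE' end,
          nvl(:new.order_id, :old.order_id));  -- :new is null on delete, :old on insert
end;
/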
Alexander Eßer [Alex140181], Software Developer, commented:
@markgeer: Thumbs up!! Great work! I'll have this one saved to my KB ;-)
Mark Geerlings, Database Administrator, commented:
My recommendations:

Task #1 "track all the DDL's" - It is not so difficult to write "before create...", "before alter..." and "before drop..." triggers, and this kind of activity will likely be infrequent, so the data collected will likely be only a moderate amount to manage.  You will have to do some testing of these triggers in a test system to make sure they collect exactly and only what you want, and that they do not accidentally cause errors that prevent legitimate activity.
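A sketch of a logging (rather than blocking) version, using Oracle's built-in DDL event attribute functions; ddl_audit_log is a hypothetical table:

create table ddl_audit_log (
  event_time timestamp,
  os_user    varchar2(100),
  db_user    varchar2(30),
  ddl_event  varchar2(30),
  obj_owner  varchar2(30),
  obj_name   varchar2(128)
);

create or replace trigger log_ddl
after create or alter or drop on database
begin
  insert into ddl_audit_log
  values (systimestamp,
          sys_context('USERENV','OS_USER'),
          ora_login_user,        -- event attributes available in DDL triggers
          ora_sysevent,
          ora_dict_obj_owner,
          ora_dict_obj_name);
end;
/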

Task #2 "monitor/track/log or even stop, unnecessary huge SELECTs ..." - I would create resource limits, probably multiple profiles with different levels, then assign these profiles to each user account.  These can at least stop terrible queries.  Here are examples of the CPU limits from two profiles we used some years ago (when servers were smaller):
    Profile 1:
      CPU_PER_SESSION 480000
      CPU_PER_CALL 12000
    Profile 2:
      CPU_PER_SESSION 720000
      CPU_PER_CALL 18000

The "LOGICAL_READS_PER_CALL" limit was the main one we used.  When a query forced Oracle to read more than this number of database blocks, the query was aborted and the user received an ORA-02395 error: "Exceeded call limit on I/O".

I would not try to monitor every query that gets executed with Oracle's auditing option, unless you have some kind of legal requirement for this.

Task #3 "log/track all the DMLs" - I would make sure that all of your tables include at least two audit columns (one date/timestamp, and the other either a number to record the userid or a varchar2 to record the user name) with column names something like: date_modified and modified_by.  Then make sure that all of your tables have "before insert for each row" and "before update for each row" triggers to save the current date (if time down to the second is close enough for you) or timestamp (if you need time down to the millisecond level) plus the userid or username.

I would not allow deletes from any of your master tables (customers, parts, prices, etc.), so you don't need to worry about tracking deletes on these.  Instead, make sure that each of these tables includes two date columns with names like: start_date and end_date, or effective_date and obsolete_date.  Make the start_date or effective_date NOT NULL, and make sure your application validation logic not only checks the record key for valid values, but also verifies that the start_date or effective_date is less than the current date and that the end_date or obsolete_date is either null or greater than the current date.  This also allows you to do data entry in advance for parts, prices, etc. that will become valid at a particular time in the future, by entering a future date value in the start_date column.
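The audit-column approach can be sketched like this (the table and column names are just examples):

alter table parts add (date_modified date, modified_by varchar2(30));

create or replace trigger parts_audit_cols
before insert or update on parts
for each row
begin
  :new.date_modified := sysdate;   -- or systimestamp for sub-second precision
  :new.modified_by   := user;
end;
/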

For tables where you need to allow deletes, you will need a history table (to write the deleted record to) and a pre-delete trigger that saves the record plus the time and the userid/name of the person doing the delete.
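A sketch of that history-table approach, assuming a hypothetical "prices" table (the column names in the trigger are invented for illustration):

create table prices_hist as select * from prices where 1 = 0;
alter table prices_hist add (deleted_at date, deleted_by varchar2(30));

create or replace trigger prices_del_hist
before delete on prices
for each row
begin
  -- price_id, part_id and amount are hypothetical columns of "prices"
  insert into prices_hist (price_id, part_id, amount, deleted_at, deleted_by)
  values (:old.price_id, :old.part_id, :old.amount, sysdate, user);
end;
/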

There may be tables where you also want to track exactly what changed on updates.  For these tables I recommend that you create a "sparse" history table that includes all of the columns of the base table with the same names and datatypes, but with all columns except the primary or unique key columns and the audit columns allowed to be null.  Then in your "before update for each row" trigger, insert the primary or unique key column(s) plus the :new audit column values, plus use CASE or DECODE to compare the :new.column_value to the :old.column_value for each column in the table, so only the :old (or "before") value gets saved and only for those columns that actually changed.  You don't need the new value saved in the history tables, since that is still in the base table.  (On inserts you don't need this, since the record itself records exactly what was inserted).
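A sketch of that sparse-history trigger for a hypothetical "parts" table with one tracked column (parts_hist is assumed to mirror parts, with all non-key columns nullable):

create or replace trigger parts_upd_hist
before update on parts
for each row
begin
  insert into parts_hist (part_id, date_modified, modified_by, unit_price)
  values (:old.part_id, sysdate, user,
          -- save the old value only if it actually changed
          case when :new.unit_price = :old.unit_price
                 or (:new.unit_price is null and :old.unit_price is null)
               then null else :old.unit_price end);
end;
/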

Mushfique Khan, Director Operations (Author), commented:
Thanks markgeer, this will be very helpful.  But what do you think: does Oracle provide any built-in tool for this?  How about LogMiner, or anything related?  Please share your thoughts on that too.

And thanks again for this help, much appreciated.
Mark Geerlings, Database Administrator, commented:
LogMiner can help a bit sometimes for some DML activity, but LogMiner doesn't track queries at all and it won't show you DDL activity either.  Also, when DDL activity happens, all DML activity on that object from before the DDL happened becomes unretrievable via LogMiner.  Plus, LogMiner is relatively slow, and it doesn't show you the exact SQL that was submitted.  LogMiner only reconstructs equivalent SQL, but for each individual row affected.  And, LogMiner can only look as far back as you have archived redo logs available.
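For reference, a hedged sketch of a basic LogMiner session (the archived log path and table name are hypothetical; this requires execute privilege on DBMS_LOGMNR):

begin
  dbms_logmnr.add_logfile('/arch/redo_1234.arc');
  dbms_logmnr.start_logmnr(options => dbms_logmnr.dict_from_online_catalog);
end;
/

select timestamp, username, sql_redo   -- sql_redo is reconstructed, not the original SQL
  from v$logmnr_contents
 where seg_name = 'ORDERS';

exec dbms_logmnr.end_logmnr;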