Oracle 11gr2 Security Options

Posted on 2014-08-13
Last Modified: 2014-08-24
Please share your expert input: what are my best options in this security domain?

- track all DDL (object-level and system-level)
- monitor/track/log, or even stop, unnecessarily huge SELECTs (one million rows or more, or queries kicked off outside normal business hours)
- log/track all DML as well

Please suggest/advise on the best practices or best solutions for our scenario.

Thanks and very best regards
Question by:mkhandba
    LVL 34

    Assisted Solution

    Those are three different tasks that will require three different approaches.  (Of course, there is also the decision to be made of whether any or all of these are worth doing.  But apparently you or someone in your organization has already decided that these may be worth doing even before you know exactly how to do them, or how much they will cost.)

    Task #1: "track all the DDL's (Object or System too)" You can use DDL triggers that you can write to log all DDL activity at least for non-SYS objects.  But for SYS-owned objects, these triggers will not apply since the SYS schema does not support user-written triggers.  Here is an example of a trigger I wrote some years ago in our Oracle10 system to check  for unexpected "create" activity:
    create or replace trigger ddl_create_ck before create on database
    declare
      dummy varchar2(30);
    begin
      -- ddl_check is a site-specific package; ddl_ok returns a non-blank value
      -- when the current user is not authorized to do DDL
      dummy := ddl_check.ddl_ok;
      if dummy > ' ' then
        raise_application_error(-20501,'You are not authorized to create objects in this database');
      end if;
    end;
    /

    Task #2 "monitor/track/log or even stop, unnecessary huge SELECTs ..."  You can turn on auditing (which is turned off by default in Oracle) to at least monitor any/all "select..." activity.  Be aware that this auditing can collect a huge amount of information that you will need to manage.  Or, you could turn on "resource limits".  These have the advantage of being able to stop queries that exceed a certain limit, but this limit is *NOT* based on the number of rows that will be returned.  Instead, this limit is based on the number of database blocks that the query forces Oracle to read.  It is possible for a query that will only return one row to hit this limit, and it is possible for queries that return millions of rows to not hit the limit.  It depends on a number of things: the complexity of the query (joins or/subqueries or not); the number of bytes per row in the records in the table(s); the number of rows per block in the table(s); and the availability of indexes to match the literal values (if any) that are provided in the "where" clauses of the query.

    Task #3 "log/track all the DMLs"  This is another huge task.  This will require that you either:
    a. write "after insert", "after update" and "before delete" triggers that include the "for each row" clause for *EVERY* table in your database (except those owned by SYS) and you will have to create at least one audit table to collect this information.  Another option would be to create an audit table for every table that you want to monitor.
    b. use LogMiner to retrieve the DML activity that Oracle automatically captures.  But there are some limitations to this approach: it is slow to query; it is not organized by table and column, but chronologically, based on when the activity happened; the data is not saved long-term (it is only available as far back as you have archived redo log files on disk); it does not capture the actual DML statement, but reconstructs equivalent DML for each row affected; and the biggest limitation of LogMiner is that if anyone performs DDL on a table, all prior LogMiner data for that table (or partition) becomes invalid and unretrievable.
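    As a minimal sketch of option (a), assuming a hypothetical "orders" table with an "order_id" key (all names here are placeholders, and a single after-row trigger is used for brevity instead of separate insert/update/delete triggers):

        -- one generic audit table shared by all monitored tables
        create table dml_audit (
          audited_on  timestamp    default systimestamp,
          audited_by  varchar2(30),
          table_name  varchar2(30),
          operation   varchar2(6),
          row_key     varchar2(100)
        );

        create or replace trigger orders_dml_audit
        after insert or update or delete on orders
        for each row
        begin
          insert into dml_audit (audited_by, table_name, operation, row_key)
          values (sys_context('USERENV','SESSION_USER'), 'ORDERS',
                  case when inserting then 'INSERT'
                       when updating  then 'UPDATE'
                       else 'DELETE' end,
                  to_char(coalesce(:new.order_id, :old.order_id)));
        end;
        /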
    LVL 13

    Expert Comment

    by:Alexander Eßer [Alex140181]
    @markgeer: Thumbs up!! Great work! I'll have this one saved to my KB ;-)
    LVL 34

    Accepted Solution

    My recommendations:

    Task #1 "track all the DDL's" - It is not so difficult to write "before create...",  "before alter..." and "before drop..." triggers, and this kind of activity will likely be infrequent, so the data collected will likely be only a moderate amount of data to manage.  You will have to do some testing of these triggers in a test system system to make sure they collect exactly and only what you want, and that they do not accidentally cause errors that will prevent legitimate activity.

    Task #2 ""monitor/track/log or even stop, unnecessary huge SELECTs ..."  - I would create resource limits, probably multiple ones with different levels, then assign these profiles to each user account.  These can at least stop terrible queries.  Here are examples of resource limits that we used some years ago (when servers were smaller):
        CPU_PER_SESSION 480000
        CPU_PER_CALL 12000
        CPU_PER_SESSION 720000
        CPU_PER_CALL 18000
        LOGICAL_READS_PER_CALL 120000;

    The "LOGICAL_READS_PER_CALL" was the main limit that we used.  Where queries forced Oracle to read this number of database blocks, the query was aborted and the user received an ORA-02395 error: "Exceeded call limit on I/O"

    I would not try to monitor every query that gets executed with Oracle's auditing option, unless you have some kind of legal requirement for this.

    Task #3 "log/track all the DMLs" - I would make sure that all of your tables include at least two audit columns (one date/timestamp and the other either a number to record the userid or a varchar2 to record the user name) with column names something like: date_modified and modified_by.  Then make sure that all of your tables include "before insert for each row"  and "before update for each row" triggers to save the current date (if time down to the second is close enough for you) or timestamp (if you need time down to the millisecond level) plus the userid or username.  I would not allow deletes from any of you master tables (customers, parts, prices, etc.) so you don't need to worry about tracking deletes on these.  Instead, make sure that each of these tables include two date columns with names like: start_date and end_date, or effective_date and obsolete_date and make sure the start_date or efective_date is set to NOT NULL and make sure that your application validation logic not only checks the record key for valid values, but also makes sure that the begin_date or effective_date is less than the current date and that the end_date or obsolete_date is either null or greater than the current date.  This also allows you to do data entry in advance for parts, prices etc. that will become valid at a particular time in the future by entering a future date value in the start_date column.

    For tables where you need to allow deletes, you will need a history table (to write the record to) and a pre-delete trigger to save the record plus the time and userid/name of the person doing the delete.
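    For example, assuming a hypothetical "orders" table and an "orders_hist" history table with matching columns plus deleted_on/deleted_by:

        create or replace trigger orders_del_hist
        before delete on orders
        for each row
        begin
          insert into orders_hist (order_id, customer_id, amount, deleted_on, deleted_by)
          values (:old.order_id, :old.customer_id, :old.amount,
                  systimestamp, sys_context('USERENV','SESSION_USER'));
        end;
        /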

    There may be tables where you also want to track exactly what changed on updates.  For these tables I recommend that you create a "sparse" history table that includes all of the columns of the base table with the same names and datatypes, but with all columns except the primary or unique key columns and the audit columns allowed to be null.  Then in your "before update for each row" trigger, insert the primary or unique key column(s) plus the :new audit column values, plus use CASE or DECODE to compare the :new.column_value to the :old.column_value for each column in the table, so only the :old (or "before") value gets saved and only for those columns that actually changed.  You don't need the new value saved in the history tables, since that is still in the base table.  (On inserts you don't need this, since the record itself records exactly what was inserted).
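    A sketch of that sparse-history idea for the same hypothetical "orders" table (note that in this scheme a NULL in the history row means "column did not change", so it cannot distinguish an unchanged column from one whose old value was actually NULL):

        create or replace trigger orders_upd_hist
        before update on orders
        for each row
        begin
          insert into orders_hist (order_id, customer_id, amount, date_modified, modified_by)
          values (:new.order_id,
                  -- save the :old value only when the column actually changed
                  case when (:old.customer_id = :new.customer_id)
                         or (:old.customer_id is null and :new.customer_id is null)
                       then null else :old.customer_id end,
                  case when (:old.amount = :new.amount)
                         or (:old.amount is null and :new.amount is null)
                       then null else :old.amount end,
                  systimestamp,
                  sys_context('USERENV','SESSION_USER'));
        end;
        /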

    Author Comment

    Thanks markgeer, this will be very helpful.  But what do you think: has Oracle provided any built-in tool for this, such as LogMiner or anything related?  Can you please share your thoughts here too.

    And thanks again for this help, much appreciated.
    LVL 34

    Expert Comment

    LogMiner can help a bit sometimes for some DML activity, but LogMiner doesn't track queries at all and it won't show you DDL activity either.  Also, when DDL activity happens, all DML activity on that object from before the DDL happened becomes unretrievable via LogMiner.  Plus, LogMiner is relatively slow, and it doesn't show you the exact SQL that was submitted.  LogMiner only reconstructs equivalent SQL, but for each individual row affected.  And, LogMiner can only look as far back as you have archived redo logs available.
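    For completeness, a minimal LogMiner session looks roughly like this (the archived log file path and table name are placeholders; supplemental logging should be enabled on the database for best results):

        begin
          dbms_logmnr.add_logfile(
            logfilename => '/u01/arch/arch_1_1234.arc',
            options     => dbms_logmnr.new);
          dbms_logmnr.start_logmnr(
            options => dbms_logmnr.dict_from_online_catalog);
        end;
        /

        -- the reconstructed per-row SQL is in SQL_REDO
        select scn, timestamp, username, operation, sql_redo
          from v$logmnr_contents
         where table_name = 'ORDERS';

        exec dbms_logmnr.end_logmnr;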
