Auditing SQL Server 2005 Standard Database on Insert Update and Delete operations

Posted on 2012-09-16
Last Modified: 2012-10-21

For one of my projects, I am required to record an audit trail for all additions and updates. The version of SQL Server in use is SQL Server 2005 Standard, and there is no possibility of upgrading it to SQL Server 2008 R2 Enterprise Edition (so the Change Data Capture feature is not an option). That leaves me with only one obvious option: AFTER INSERT and AFTER UPDATE triggers.

My concern is the performance of the database system: tens of tables will be involved, multiple users will be writing to the database concurrently, and periodic and overnight jobs will write to it as well.

Can you please suggest possible solutions to this issue? Are triggers the only way to achieve this, or are there alternatives? Your advice and suggestions are appreciated.

Question by:ezkhan
    LVL 35

    Expert Comment

    by:David Todd

    The key considerations for triggers are:
    - Keep them small, since they run inside the implicit transaction of the statement that fires them.
    - Write them to handle multiple rows. Don't assume the trigger fires once per row: a bulk update/insert/delete fires the trigger once, and the inserted/deleted pseudo-tables will contain more than one row.
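    The set-based pattern described above might be sketched like this; the table and column names (dbo.Orders, dbo.Orders_Audit, OrderID, Status) are hypothetical placeholders, not from the original question:

    ```sql
    -- Sketch of a multi-row-safe audit trigger (SQL Server 2005 syntax).
    -- dbo.Orders / dbo.Orders_Audit are illustrative names; adapt to your schema.
    CREATE TRIGGER trg_Orders_Audit
    ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Work against the inserted/deleted pseudo-tables as sets:
        -- this handles one row or thousands in a single pass.
        INSERT INTO dbo.Orders_Audit (OrderID, Status, AuditAction, AuditDate)
        SELECT i.OrderID, i.Status, 'I', GETDATE()          -- pure inserts
        FROM inserted i
        WHERE NOT EXISTS (SELECT 1 FROM deleted d WHERE d.OrderID = i.OrderID)
        UNION ALL
        SELECT i.OrderID, i.Status, 'U', GETDATE()          -- updates
        FROM inserted i
        INNER JOIN deleted d ON d.OrderID = i.OrderID
        UNION ALL
        SELECT d.OrderID, d.Status, 'D', GETDATE()          -- deletes
        FROM deleted d
        WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.OrderID = d.OrderID);
    END;
    ```

    Because the body is a single set-based INSERT, the extra time spent inside the caller's transaction stays roughly proportional to the number of affected rows, which is the "keep them small" advice in practice.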

    LVL 25

    Expert Comment

    The place where I work depends heavily on triggers. Believe me, we have had few issues so far, even though it is an OLTP environment with 1000+ users working simultaneously. Create the triggers and monitor performance in a test environment; they shouldn't be a problem unless a huge batch is applied at once.
    LVL 28

    Expert Comment

    by:Ryan McCauley
    Have you considered using C2 audit logs? It may be a bit of overkill for what you're looking for, but it will audit all the statements executed on the server for later review (not at the table level, but any DML statements will be captured).

    There are some caveats: you have to make sure you back up and archive those trace logs, because the SQL Server service will stop if it runs out of disk space and can't write more audit files, and there may be some performance impact. However, since it's tracing native to SQL Server, the impact will be minimal (or as minimal as possible for a process that records everything!), so it might meet your needs.
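    For reference, C2 auditing is switched on with the documented sp_configure option; this is a sketch of the standard sequence, and note that the setting only takes effect after a service restart:

    ```sql
    -- Enable C2 audit mode (advanced option, so expose it first).
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'c2 audit mode', 1;
    RECONFIGURE;
    -- Restart the SQL Server service for the change to take effect.
    -- Trace files are then written to the instance's default data directory.
    ```

    The same option set back to 0 (followed by a restart) turns auditing off again.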
    LVL 35

    Expert Comment

    by:David Todd

    Doesn't turning on C2 auditing basically kill a server's performance?

    LVL 28

    Accepted Solution

    Not any more than running a SQL Profiler session, in my experience - certainly, it would have less of an impact than putting a trigger on every table to capture data change activity. We use a product from Idera called SQL Compliance Manager to capture database activity, and this is essentially what it does - runs a trace on the server and pumps all the statements back to a centralized collection server (whereas C2 auditing writes it locally). We've never noticed an issue with performance on any monitored servers, especially since there's no network overhead involved.

    One potential performance killer is doing the C2 logging to the default data location, which in many cases is the same physical disk as your SQL Server data files. You may not have a choice: C2 writes its logs into the default data directory, and to my knowledge that's not configurable. To compensate, the best thing to do is set the default data directory to a dedicated disk where the logs can be captured; that way, the C2 logs have the LUN all to themselves (if you're on a SAN).

    If you're looking to capture all the DML activity on every table in the database, I think this may be your best option. There's no way to do it with zero overhead, but only testing will tell you for sure.
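    When it comes time to review what was captured, the trace files can be read back with the documented fn_trace_gettable function; the file path below is purely illustrative:

    ```sql
    -- Load a captured C2/audit trace file as a queryable result set.
    -- The path is a placeholder; point it at an actual .trc file.
    SELECT t.TextData, t.LoginName, t.DatabaseName, t.StartTime
    FROM fn_trace_gettable('C:\MSSQL\Data\audittrace_20120916.trc', DEFAULT) t
    WHERE t.TextData IS NOT NULL
    ORDER BY t.StartTime;
    ```

    Passing DEFAULT as the second argument makes the function roll forward through the subsequent numbered files in the same trace set, which matters here since C2 rolls to a new file as each one fills.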

    Author Closing Comment

    Many Thanks.
