Scripts / commands to detect failed DB login attempts and queries of unusually large amounts of data

Without DB Activity Monitoring in place, we would like a daily job/script
(Windows Scheduled Task for MS SQL and Unix cron for Oracle) to
detect whether the following took place:

a) failed login attempts to a DB (even a single failed attempt)
b) an unusually high amount of data being queried (say, 10 times more
    than the last 30 days' average; if this is not possible, then 15 MB of
    data)

Q1:
Can you provide the specific commands (SQL or OS) for both DB types
above, for Windows & UNIX (Solaris & RHEL)?

Q2:
For item a above, which file in MS SQL & Oracle DB contains that info?

As the script runs once a day, ideally it should cover the last 24 hours'
events/logs so that nothing is missed or double-counted in the next run.
sunhux Asked:

sunhux (Author) Commented:
https://www.netwrix.com/how_to_track_down_failed_oracle_logon_attempts.html

Q3:
Referring to the above URL, what's the equivalent Oracle command to set
' audit_trail="true" '? Is there any way to do it without restarting the
Oracle instance, and is it necessary to enable the Audit Trail to get this
info on failed login attempts?

Q4:
Our DBA said turning on the Audit Trail can have a major performance impact:
what can we do about this?  How do we estimate the disk space the Audit
Trail will take?

Q5:
What's the command to roll over (i.e. once/day) to a new Audit Trail file once
the SQL query to get the failed login attempts is done?
johnsone (Senior Oracle DBA) Commented:
Auditing is not required to trap failed login attempts.  It can be done with a trigger.  The basic idea is this:
create or replace trigger error_audit
after servererror
on database
begin

  if (is_servererror(1017)) then  -- ORA-01017: invalid username/password
    -- Insert information into a table here
  end if;

end;
/


Most of the information you probably want to insert is available from SYS_CONTEXT.
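For example, here is a fleshed-out sketch of the idea. The FAILED_LOGINS table, its columns, and the autonomous transaction are assumptions added for illustration, not part of the original answer:

-- Hypothetical table to hold failed login attempts:
create table failed_logins (
  attempt_time   timestamp default systimestamp,
  attempted_user varchar2(128),
  os_user        varchar2(128),
  host           varchar2(128),
  ip_address     varchar2(64)
);

create or replace trigger error_audit
after servererror
on database
declare
  pragma autonomous_transaction;  -- commit independently of the failing call
begin

  if (is_servererror(1017)) then  -- ORA-01017: invalid username/password
    insert into failed_logins (attempted_user, os_user, host, ip_address)
    values (sys_context('USERENV', 'AUTHENTICATED_IDENTITY'),
            sys_context('USERENV', 'OS_USER'),
            sys_context('USERENV', 'HOST'),
            sys_context('USERENV', 'IP_ADDRESS'));
    commit;
  end if;

end;
/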
Mark Geerlings (Database Administrator) Commented:
It is true that you can use a custom trigger like the one johnsone suggested to detect failed login attempts.  But the easiest way to report failed login attempts is to use Oracle's auditing features.  If the only activity that you audit is failed login attempts, you probably will not have to worry about the performance impact (assuming those are rare) or the space usage for this data.  But to be safe, before you turn on auditing, check the tablespace that your SYS.AUD$ table is in and make sure it is not the SYSTEM tablespace.  In Oracle 11.2 and later the default for this is SYSAUX, but if your database was upgraded from an earlier version of Oracle, this table may still be in the SYSTEM tablespace.  If it is still there, you can move it.  Yes, Oracle supports DBAs moving this SYS-owned table. (Just don't move or modify other tables owned by SYS.)
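A minimal sketch of that setup on 11g+ (the SYSAUX target and the 24-hour window are illustrative; note that AUDIT_TRAIL is a static parameter, so enabling it does require a restart, which also answers Q3):

-- Enable standard auditing; static parameter, takes effect after a restart:
alter system set audit_trail = db scope = spfile;
-- After the restart, audit only unsuccessful logins:
audit session whenever not successful;

-- Check which tablespace SYS.AUD$ currently lives in:
select tablespace_name
  from dba_tables
 where owner = 'SYS'
   and table_name = 'AUD$';

-- Supported way to move it out of SYSTEM (DBMS_AUDIT_MGMT ships with 11g+):
begin
  dbms_audit_mgmt.set_audit_trail_location(
    audit_trail_type           => dbms_audit_mgmt.audit_trail_aud_std,
    audit_trail_location_value => 'SYSAUX');
end;
/

-- Failed logins (non-zero return code) from the last 24 hours:
select os_username, username, userhost, timestamp, returncode
  from dba_audit_session
 where returncode <> 0
   and timestamp >= sysdate - 1;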

No, Oracle does not provide a built-in way to rename the SYS.AUD$ table each day.  You could create a custom table with a structure similar to SYS.AUD$ that you partition by day (or week, or month), and set up a custom PL/SQL job to run just after midnight and copy all of the contents (or at least the contents you want) from the SYS.AUD$ table to your custom table, then delete those rows from SYS.AUD$.
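A sketch of that nightly copy-then-purge job; AUDIT_ARCHIVE and ARCHIVE_AUD_TRAIL are hypothetical names, and NTIMESTAMP# (the AUD$ timestamp column, stored in UTC) is used as the cutoff:

-- Archive table with the same structure as SYS.AUD$, no rows:
create table audit_archive as
  select * from sys.aud$ where 1 = 0;

begin
  dbms_scheduler.create_job(
    job_name        => 'ARCHIVE_AUD_TRAIL',
    job_type        => 'PLSQL_BLOCK',
    job_action      => q'[begin
                            insert into audit_archive
                              select * from sys.aud$
                               where ntimestamp# < trunc(systimestamp);
                            delete from sys.aud$
                             where ntimestamp# < trunc(systimestamp);
                            commit;
                          end;]',
    start_date      => trunc(sysdate) + 1 + 5/1440,   -- ~00:05 tonight
    repeat_interval => 'FREQ=DAILY',
    enabled         => true);
end;
/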

johnsone (Senior Oracle DBA) Commented:
Personally, if the only thing you are auditing is failed login attempts, I prefer the trigger route.  Especially since you might have to write your own archiving routine anyway, you might as well build a custom table and procedures to handle it.  Custom auditing can also get you exactly what you want.  I remember writing custom audit triggers because Oracle auditing didn't capture everything we wanted; that was a very long time ago, though, so that may have changed.

Also, I forgot to mention: a custom audit trigger doesn't audit all users.  I believe that SYS and anyone who connects with the SYSDBA role are not subject to system triggers firing (that's from memory and could be incorrect).  This is to prevent errors in the triggers from blocking all access to the database.
sunhux (Author) Commented:
Thanks for the script, johnsone; how do we cut off the data queried on a daily basis?
Is it possible to add a criterion to query only the last 24 hours? Thanks.
Mark Geerlings (Database Administrator) Commented:
If you use a custom trigger to write your records to a custom table, you simply write your query to retrieve them based on whatever time period you want.  Or, you can set up the table to be range-partitioned by day, then write your query to retrieve the records for the partition you want.
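For example, assuming the hypothetical FAILED_LOGINS table from the trigger sketch above, the daily report is just a time filter:

-- Pull only the last 24 hours of failed login attempts:
select attempt_time, attempted_user, os_user, host, ip_address
  from failed_logins
 where attempt_time >= systimestamp - interval '1' day
 order by attempt_time;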

And yes, as johnsone said, logins by SYS (or anyone as SYSDBA) will not be tracked by a trigger.
sunhux (Author) Commented:
Thanks; regarding a sudden large amount of data being queried, how do we detect this?

OK, just as our DBA replied:
"We will not only monitor the 1017 error code.  Better to capture all login error scenarios,
  as the regulator wanted.
  By the way, turning on auditing at the table level will eat a lot of space and resources" <== to detect a large amount of data queried?
johnsone (Senior Oracle DBA) Commented:
The trigger to capture 1017 errors is an example; you can add whatever you want.  I'm not sure what other login errors there can be, though.

As far as removing data from a custom audit trail table goes, you have full control.  A trigger wouldn't remove data.  Whatever reporting process you come up with can remove the data.  If you are really doing this for regulatory purposes, I would think you have to keep the data around for quite some time.

I'm not aware of a way to audit the amount of data, but I was never asked to do it, so it may be out there.  We used to have a job that ran every 10 minutes or so that would look for queries that had been running for more than a specified amount of time.  But that has nothing to do with the amount of data.  As I recall, you can set up a resource profile to stop people from running queries that do too many reads.
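A sketch of that periodic check, using V$SESSION's LAST_CALL_ET (seconds since the current call started); the 30-minute threshold is illustrative:

-- Flag user sessions whose current statement has been running too long:
select sid, serial#, username, sql_id, last_call_et
  from v$session
 where status = 'ACTIVE'
   and type = 'USER'
   and last_call_et > 30 * 60;   -- more than 30 minutes in the current call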
Mark Geerlings (Database Administrator) Commented:
This condition will be very hard to detect: "unusually high amount of data being queried".  Yes, you can set up one or more custom resource profiles (your database will already have at least one default profile) and assign them to the various Oracle users in your database.  But these do not track the amount of data being sent to a client.  They only track the number of blocks of data that a query reads on the database server, regardless of how many of those rows actually get returned to the client.
sunhux (Author) Commented:
> set up a resource profile to stop people from running queries that do too many reads
Can you elaborate with an example profile that stops queries of more than 30 MB of data,
if it's not possible to restrict by the number of rows (I was looking at 300,000 rows)?
sunhux (Author) Commented:
If it doesn't block, perhaps an alert/detection that a query returning more than 30 MB
(or 300,000 rows) has been run (possibly within the last hour, as we plan to detect this
on an hourly basis)?
johnsone (Senior Oracle DBA) Commented:
The documentation is your friend.  Look at CREATE PROFILE.

The limit you are probably looking for is LOGICAL_READS_PER_CALL.
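A sketch of such a profile, assuming an 8 KB block size (so 30 MB is roughly 3,840 blocks); the profile name and user are illustrative.  Note this limits blocks read per call, not rows or bytes returned, as Mark pointed out above:

-- Queries exceeding the limit fail with ORA-02395:
create profile query_limit limit
  logical_reads_per_call 3840;              -- ~30 MB at an 8 KB block size

alter user report_user profile query_limit;  -- hypothetical user

-- Kernel resource limits are only enforced when RESOURCE_LIMIT is TRUE:
alter system set resource_limit = true;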