
Oracle redo logs on Linux and SAP

sword12 asked:
Hi Experts,

We have an SAP ERP system running on Red Hat Linux 7 with Oracle 12c.

From time to time, the file system reserved for Oracle logs fills up very quickly, all of a sudden.

We have not been able to find the root cause: which process, transaction, or user generates such an amount of logs.

Do you know any way or tool to find this out?

Thanks

Erman Arslan (Senior Director - Database and Systems Management) commented:
Hi,

You probably mean redo logs/archived logs.
You can use Oracle LogMiner to mine those logs and see the transactions that are causing the issue.

ermanarslan.blogspot.com

Author commented:
Hi,

Our problem is that there is only one Oracle service account, which SAP uses. Many users work in SAP, but SAP connects to the database with that single user.

We need to know which action or transaction created a job that generated the big logs.

Any advice here?

Thank you
Erman Arslan (Senior Director - Database and Systems Management) commented:
Yes, I can imagine.
Oracle's ERP (E-Business Suite) works in the same way.

You can use LogMiner for this.
Check the V$LOGMNR_CONTENTS view and write a query to get the info you need.

The following link may help you:
https://www.thegeekdiary.com/simple-steps-to-use-logminer-for-finding-high-redo-log-generation/
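For reference, a minimal LogMiner session along those lines might look like the sketch below. The archived log path is a placeholder, and DICT_FROM_ONLINE_CATALOG assumes you are mining on the source database while it is open:

```sql
BEGIN
  -- Placeholder path: substitute an archived log from your archive destination
  DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/oraarch/SID/arch_1_12345.dbf',
                          OPTIONS     => DBMS_LOGMNR.NEW);
  -- Use the online catalog as the dictionary (source DB must be open)
  DBMS_LOGMNR.START_LOGMNR(OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
END;
/

-- Which segments/operations account for most redo records in this log?
SELECT seg_owner, seg_name, operation, COUNT(*) AS redo_records
  FROM v$logmnr_contents
 GROUP BY seg_owner, seg_name, operation
 ORDER BY redo_records DESC;

EXECUTE DBMS_LOGMNR.END_LOGMNR
```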
johnsone (Senior Oracle DBA) commented:
As suggested, LogMiner would be what you are looking for.  There are columns in V$LOGMNR_CONTENTS to determine the machine, OS user, and transaction.

I highly recommend looking through only a very small number of files with LogMiner at once.  The more files you load into LogMiner, the slower it gets.  In the past, I found it was actually faster to query the logs one at a time.
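A grouping query along these lines (a sketch, assuming an active LogMiner session in the same session) surfaces the database user, OS user, machine, and transaction ID behind the bulk of the redo:

```sql
-- Assumes DBMS_LOGMNR.START_LOGMNR has already been run in this session
SELECT username,
       os_username,
       machine_name,
       xidusn || '.' || xidslt || '.' || xidsqn AS xid,
       COUNT(*) AS redo_records
  FROM v$logmnr_contents
 GROUP BY username, os_username, machine_name, xidusn, xidslt, xidsqn
 ORDER BY redo_records DESC;
```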
Mark Geerlings (Database Administrator) commented:
Yes, Oracle's LogMiner utility is the tool that you can use to look at the archived redo log files to determine which Oracle tables have had the most transactions in the time period you are looking at.  I agree with johnsone's recommendation that you should start with just one, or a very small number, of logs at first.

Queries of the LogMiner information are not particularly fast, especially if you try to query multiple log files at one time.  If you have a test system with the same Oracle database software version as production, you can copy a few log files there and run LogMiner on the test system instead of on your primary system.  (To do this, you will need to create a LogMiner dictionary file on your production system and copy it to the test system, unless the test system has an up-to-date clone of your production system.)  That way you avoid slowing down (or possibly crashing) your production system when you run LogMiner.  I haven't seen LogMiner cause crashes very often in Oracle 11, but in earlier versions of Oracle I saw a number of instance crashes due to LogMiner.
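Building the flat-file dictionary on production and then pointing LogMiner at the copy on the test system could be sketched like this (the paths are placeholders; in older releases the dictionary location must also appear in the UTL_FILE_DIR parameter):

```sql
-- On the PRODUCTION system: write the dictionary to a flat file
BEGIN
  DBMS_LOGMNR_D.BUILD(DICTIONARY_FILENAME => 'logmnr_dict.ora',
                      DICTIONARY_LOCATION => '/tmp',
                      OPTIONS             => DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);
END;
/

-- Copy logmnr_dict.ora and the log files to the TEST system, then there:
BEGIN
  DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/tmp/arch_1_12345.dbf',
                          OPTIONS     => DBMS_LOGMNR.NEW);
  DBMS_LOGMNR.START_LOGMNR(DICTFILENAME => '/tmp/logmnr_dict.ora');
END;
/
```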
johnsone (Senior Oracle DBA) commented:
Interesting....  I have never had LogMiner crash an instance.  And I used it a lot when it first came out.  Haven't used it much lately though.

This almost sounds like looking through one log might be sufficient to find what you are looking for.  Sounds like one rogue transaction.  If you can find a log that is mostly filled with one connection, that is probably your culprit.

If I ever needed to go through a large number of files, I scripted it to go through the files one at a time and extract the relevant information into a table that I could look at after it processed all the files.  It was a lot faster than trying to do multiple files at once.
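That one-file-at-a-time approach might be sketched as follows. The work table name, the one-day window, and the use of the online catalog dictionary are illustrative assumptions, not the original script:

```sql
-- Sketch only: work table to hold one aggregate row set per log file
CREATE TABLE redo_summary (
  log_name   VARCHAR2(513),
  seg_owner  VARCHAR2(128),
  seg_name   VARCHAR2(256),
  operation  VARCHAR2(32),
  cnt        NUMBER
);

BEGIN
  -- Walk the archived logs from the last day, one file at a time
  FOR r IN (SELECT name
              FROM v$archived_log
             WHERE completion_time > SYSDATE - 1
               AND name IS NOT NULL) LOOP
    DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => r.name,
                            OPTIONS     => DBMS_LOGMNR.NEW);
    DBMS_LOGMNR.START_LOGMNR(
      OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);

    -- Keep only an aggregate per segment/operation, then release the log
    INSERT INTO redo_summary
      SELECT r.name, seg_owner, seg_name, operation, COUNT(*)
        FROM v$logmnr_contents
       GROUP BY seg_owner, seg_name, operation;

    DBMS_LOGMNR.END_LOGMNR;
    COMMIT;
  END LOOP;
END;
/
```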
Mark Geerlings (Database Administrator) commented:
I also used that technique of copying some of the columns from LogMiner into work tables, where they can be queried much faster.  But you do have to try some LogMiner queries directly first to find some of the data you are looking for.

I'm also attaching a "log_miner_sample.sql" file that contains the actual LogMiner commands and queries I used in SQL*Plus on our database server some years ago, when I used LogMiner to help with a problem we had at the time.  Note that the "dbms_logmnr.ADD_LOGFILE…" lines contain actual file paths and file names from our server at the time.  You will have to supply the actual file paths and file names from your system when you run "dbms_logmnr.ADD_LOGFILE…" commands.
log_miner_sample.sql
Ora_Techie (Database Administrator) commented:
How many redo logs do you have, and of what size? Do you have any job that regularly rebuilds (or coalesces) the indexes?
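For context, the current log configuration and switch rate can be checked with standard queries against V$LOG and V$LOG_HISTORY:

```sql
-- Number, size, and state of the online redo log groups
SELECT group#, bytes / 1024 / 1024 AS size_mb, members, status
  FROM v$log;

-- Log switches per hour: a burst here marks the sudden fill-ups
SELECT TRUNC(first_time, 'HH24') AS hour, COUNT(*) AS switches
  FROM v$log_history
 GROUP BY TRUNC(first_time, 'HH24')
 ORDER BY hour;
```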
johnsone (Senior Oracle DBA) commented:
Why would you be regularly rebuilding indexes?  It is normally not necessary in Oracle.  The only reason would be having deleted a significant percentage of the table.
Ora_Techie (Database Administrator) commented:
"Why would you be regularly rebuilding indexes?"

Yes, this shouldn't be regular practice; that is why I asked the author whether they are doing it on a regular basis, because it also generates redo.

Author commented:
The root cause was SAP NetWeaver Gateway.