As suggested, LogMiner would be what you are looking for. There are columns in V$LOGMNR_CONTENTS to determine the machine, OS user, and transaction.
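For example, once a LogMiner session has been started, a query along these lines against V$LOGMNR_CONTENTS (the column names are the standard ones; the filter and grouping are just an example) shows which OS user/machine and which transactions generated the most redo:

SELECT seg_owner, seg_name, os_username, machine_name,
       xidusn || '.' || xidslt || '.' || xidsqn AS xid,
       COUNT(*) AS redo_records
  FROM v$logmnr_contents
 WHERE operation IN ('INSERT', 'UPDATE', 'DELETE')
 GROUP BY seg_owner, seg_name, os_username, machine_name,
          xidusn || '.' || xidslt || '.' || xidsqn
 ORDER BY redo_records DESC;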
I highly recommend looking through only a very small number of files at a time with LogMiner. The more files you load into LogMiner, the slower it gets. In the past, I found it was actually faster to query the logs one at a time.
Yes, Oracle's LogMiner utility is the tool you can use to look at the archived redo log files and determine which Oracle tables have had the most transactions in the time period you are interested in. I agree with johnsone's recommendation to start with just one log, or a very small number of logs, at first.
Queries of the LogMiner information are not particularly fast, especially if you try to query multiple log files at one time. If you have a test system running the same Oracle database software version as production, you can copy a few log files to the test system and run LogMiner there instead of on your primary system. (To do this, you will need to create a LogMiner dictionary file on your production system and copy it to your test system, unless your test system is an up-to-date clone of your production system.) That way you avoid slowing down (or possibly crashing) your production system when you run LogMiner. I haven't seen LogMiner cause crashes very often in Oracle 11g, but in earlier versions of Oracle I saw a number of instance crashes due to LogMiner.
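If you go the test-system route, the dictionary file is built on production with DBMS_LOGMNR_D.BUILD and then referenced when you start LogMiner on the test box. A minimal sketch (the file name and directory below are examples only; in older versions the directory must be one allowed by UTL_FILE_DIR):

-- On production: write the dictionary to a flat file, then copy it to the test system
BEGIN
  DBMS_LOGMNR_D.BUILD(
    dictionary_filename => 'logmnr_dict.ora',          -- example file name
    dictionary_location => '/u01/app/oracle/logmnr');  -- example directory
END;
/

-- On the test system (after adding the copied log files with DBMS_LOGMNR.ADD_LOGFILE):
BEGIN
  DBMS_LOGMNR.START_LOGMNR(dictfilename => '/u01/app/oracle/logmnr/logmnr_dict.ora');
END;
/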
johnsone
Interesting.... I have never had LogMiner crash an instance. And I used it a lot when it first came out. Haven't used it much lately though.
This almost sounds like looking through one log might be sufficient to find what you are looking for. Sounds like one rogue transaction. If you can find a log that is mostly filled with one connection, that is probably your culprit.
If I ever needed to go through a large number of files, I scripted it to go through the files one at a time and extract the relevant information into a table that I could look at after it had processed all the files. It was a lot faster than trying to do multiple files at once.
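A rough sketch of that kind of loop (the work table, file list, and filter are made-up examples; the DBMS_LOGMNR calls are the standard ones):

CREATE TABLE logmnr_work (
  seg_owner    VARCHAR2(128),
  seg_name     VARCHAR2(256),
  operation    VARCHAR2(32),
  os_username  VARCHAR2(128),
  machine_name VARCHAR2(128),
  log_time     DATE
);

DECLARE
  TYPE t_files IS TABLE OF VARCHAR2(513);
  -- Example paths only; list the archived logs you want to examine
  v_files t_files := t_files('/arch/arch_0001.arc', '/arch/arch_0002.arc');
BEGIN
  FOR i IN 1 .. v_files.COUNT LOOP
    -- Load exactly one log file per LogMiner session
    DBMS_LOGMNR.ADD_LOGFILE(logfilename => v_files(i), options => DBMS_LOGMNR.NEW);
    DBMS_LOGMNR.START_LOGMNR(options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);

    -- Keep only the columns/rows of interest in a normal table
    INSERT INTO logmnr_work
      SELECT seg_owner, seg_name, operation, os_username, machine_name, timestamp
        FROM v$logmnr_contents
       WHERE operation IN ('INSERT', 'UPDATE', 'DELETE');

    DBMS_LOGMNR.END_LOGMNR;
    COMMIT;
  END LOOP;
END;
/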
Mark Geerlings
I also used that technique of copying some of the columns from LogMiner into work tables, where they can be queried much faster. But you do have to try some LogMiner queries directly first to find some of the data you are looking for.
I'm also attaching a "log_miner_sample.sql" file that contains the actual LogMiner commands and queries I used in SQL*Plus on our database server some years ago, when I used LogMiner to help with a problem we had at the time. Note that the "dbms_logmnr.ADD_LOGFILE…" lines contain actual file paths and file names from our server at the time. You will have to supply the actual file paths and file names from your system when you run "dbms_logmnr.ADD_LOGFILE…" commands. log_miner_sample.sql
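For reference, the general shape of such a SQL*Plus session (the paths and file names below are placeholders, not the ones from the attached file) is:

-- Paths and file names are placeholders only; substitute your own
EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/u02/arch/arch_0001.arc', DBMS_LOGMNR.NEW);
EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/u02/arch/arch_0002.arc', DBMS_LOGMNR.ADDFILE);
EXECUTE DBMS_LOGMNR.START_LOGMNR(DictFileName => '/u01/app/oracle/logmnr/logmnr_dict.ora');

SELECT os_username, machine_name, seg_owner, seg_name, sql_redo
  FROM v$logmnr_contents;

EXECUTE DBMS_LOGMNR.END_LOGMNR;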
How many redo logs do you have, and of what size? Do you have any job that regularly rebuilds (or coalesces) the indexes?
johnsone
Why would you be regularly rebuilding indexes? That is normally not necessary in Oracle. The only reason would be deleting a significant percentage of the table.
Ora_Techie
Why would you be regularly rebuilding indexes?
Yes, this shouldn't be a regular practice; that is why I asked the author whether they are doing it on a regular basis, because it will also generate logs.
You probably mean redo logs/archived logs.
You can use Oracle LogMiner to mine those logs and see the transactions that are causing the issue.
ermanarslan.blogspot.com