Solved

Oracle Table Analyze

Posted on 2003-11-19
3,834 Views
Last Modified: 2008-04-20
I have one big table containing 1,000,000 records.
When I analyze this table, the system hangs.
Is there a problem with the SGA size?

If I analyze the same table in another database with double the SGA size, it takes only 3-4 minutes and does not hang.
How can I speed up the analysis of this table?
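
For context, the statement being run is presumably something like the following; a quick way to check the current SGA size is also shown (the table name BIG_TABLE is a placeholder, not from the question):

-- full statistics collection on the hypothetical 1,000,000-row table
ANALYZE TABLE big_table COMPUTE STATISTICS;

-- check the current SGA component sizes
SELECT * FROM v$sga;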
Question by:vishalgoyal123
8 Comments
 
LVL 48

Accepted Solution

by:
schwertner earned 125 total points
ID: 9779016
It should not hang. If the SGA is small, the analyze will simply take longer, so be patient and wait. Try running the analysis through OEM - it will show you whether it is actually working.
 
LVL 13

Expert Comment

by:anand_2000v
ID: 9779255
1,000,000 records will take some time if your SGA is not big enough. See if you can increase the SGA. Better still, use the stats pack to analyze the table.
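
If "stats pack" here refers to the DBMS_STATS package (an assumption - the comment does not spell it out), a minimal sketch would be:

-- gather table statistics with DBMS_STATS instead of ANALYZE
-- (schema name SCOTT and table name BIG_TABLE are placeholders)
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SCOTT',
    tabname          => 'BIG_TABLE',
    estimate_percent => 10);
END;
/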
 
LVL 23

Assisted Solution

by:seazodiac
seazodiac earned 125 total points
ID: 9779312
I think the correct way to get around the SGA problem is this:

Use "ANALYZE TABLE <table_name> ESTIMATE STATISTICS SAMPLE n PERCENT"
OR
"ANALYZE TABLE <table_name> ESTIMATE STATISTICS SAMPLE n ROWS"

By doing this you avoid analyzing all 1,000,000 records, and the CBO still gets an overall estimate of the statistics.

Give it a try and see if this command speeds up your analysis.
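
For illustration, with a placeholder table name BIG_TABLE, sampling 10 percent of the rows or a fixed 100,000 rows would look like:

-- sample 10 percent of the rows
ANALYZE TABLE big_table ESTIMATE STATISTICS SAMPLE 10 PERCENT;

-- or sample a fixed number of rows
ANALYZE TABLE big_table ESTIMATE STATISTICS SAMPLE 100000 ROWS;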
 
LVL 13

Expert Comment

by:anand_2000v
ID: 9779371
Estimate, IMHO, will cause more problems than it solves. I have seen situations where a *compute* results in the optimizer avoiding an index, and therefore *faster access*, while an *estimate* still goes for the index access.

Of course, every situation is unique.
 
LVL 23

Expert Comment

by:seazodiac
ID: 9780157
anand_2000v, what you said about "estimate" is NOT true:

If no sample size is provided when estimating statistics with the ANALYZE command, Oracle takes a default sample of only the first 1064 rows. That may not be representative and will most often result in bad query plans. But if the ESTIMATE sample size is greater than 50%, it is as good as the COMPUTE option, and you get a shorter analyze time by skipping the rest of the records.
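
To make that concrete (the table name is a placeholder):

-- no sample size: Oracle samples only a small default number of rows
ANALYZE TABLE big_table ESTIMATE STATISTICS;

-- sample above 50 percent: close to COMPUTE quality, but shorter analyze time
ANALYZE TABLE big_table ESTIMATE STATISTICS SAMPLE 60 PERCENT;

-- full computation over all rows
ANALYZE TABLE big_table COMPUTE STATISTICS;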

 
LVL 13

Expert Comment

by:anand_2000v
ID: 9780973
Oh... I thought that estimate uses 20% by default! 1064 rows... thanks for the info.
