analyze tables

For analyzing tables I run: analyze table table_name estimate statistics sample 10 percent. What is the best value to give for the sample percent? All my tables have millions of records. Let me explain the reason for asking. We will run a conversion job that inserts 7 million records from staging into target tables. It takes about 1.5 days to complete (as we have many rules for loading data into the target tables). Every 5 hours in between, I need to stop the job, analyze the tables, and restart it so that the job runs faster. We have 40 tables to analyze, and I don't want to waste more time (say 30-60 minutes) on analyzing them. I appreciate your help regarding this.
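For concreteness, a sketch of the statement under discussion, run once per table (target_table is a placeholder name):

analyze table target_table estimate statistics sample 10 percent;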
vamsibat asked:
 
sventhan commented:
> I need to stop the job, analyze the tables, and restart it so that the job runs faster
Why do you analyze those tables between job runs? Are there any changes to the schema causing you to do that?
Check the LAST_ANALYZED column for those tables in the DBA_TABLES view; you'll find the most recent dates the tables were analyzed.
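For example, a quick check along those lines (the schema name is a placeholder, not from the thread):

-- When were these tables last analyzed, and at what row count?
select table_name, num_rows, last_analyzed
  from dba_tables
 where owner = 'STAGE_OWNER'   -- placeholder: your staging/target schema
 order by last_analyzed nulls first;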
 
sventhan commented:
Is it 10g or 9i?
 
vamsibat (Author) commented:
It's 10g.
 
sventhan commented:
What is your statistics_level set to?
sql> show parameter statistics_level
If it's TYPICAL or ALL, then you're already analyzing those tables.
 
vamsibat (Author) commented:
statistics_level is set to TYPICAL.
 
vamsibat (Author) commented:
The reason for analyzing the tables in between is that this job populates them, i.e. before it starts those tables have 0 records. After the job has been running for, say, 4 hours, the tables hold a few lakh records, so the statistics have changed and the procedure runs slowly. That is why I am analyzing the tables in between.

I posted a different question and you are taking me down another route. My question is: what is the best number to give for the sample percent?
 
sventhan commented:
I know.
I've seen estimates as low as 5%-10% be effective.
It all depends on the data in that particular table and how much time you want to save. The estimate percent controls how much of the actual table is read to collect the statistics; common sense says the larger the sample, the more accurate the statistics. Conversely, the smaller the estimate, the faster statistics generation runs, so there's a tradeoff between how fast and how thorough the estimates are. Computed statistics (using 100% of the rows) are probably the most accurate but usually take too much time to generate.
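To make that concrete: on 10g the DBMS_STATS package is the usual way to gather statistics with a chosen sample size. A minimal sketch, with placeholder schema and table names (not from the thread):

begin
  dbms_stats.gather_table_stats(
    ownname          => 'STAGE_OWNER',   -- placeholder schema
    tabname          => 'TARGET_TABLE',  -- placeholder table
    estimate_percent => 10,              -- sample roughly 10% of the rows
    cascade          => true);           -- gather index statistics as well
end;
/

A 10% sample of a 7-million-row table reads far less data than a full compute, which is where the time saving comes from.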
 
awking00 commented:
Are you analyzing the staging or target tables?
 
vamsibat (Author) commented:
I am analyzing both the staging and the target tables.
 
sdstuber commented:
The reason you're having trouble getting the answer you want to hear is that there is no single "best" number. If there were, Oracle wouldn't make it a parameter at all; it would simply be a fixed number built into the process.

You can try automatic sampling and let Oracle determine how many rows to sample.
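A sketch of that, again with placeholder names:

begin
  dbms_stats.gather_table_stats(
    ownname          => 'STAGE_OWNER',                      -- placeholder schema
    tabname          => 'TARGET_TABLE',                     -- placeholder table
    estimate_percent => dbms_stats.auto_sample_size);       -- let Oracle choose the sample size
end;
/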
 
sdstuber commented:
I recommend splitting the points between comments 24330896 and 24346517.