Hope you're doing well.
I'm reaching out to get your input on statistics gathering in our Oracle Autonomous Database (version 19c), which we use to support our integration workloads.
We have a few very large tables (approximately 70 million records each) that serve as history tables for archiving and auditing purposes. On a daily basis, we dump data from staging tables into these history tables. Since these tables are not frequently queried, we have historically not gathered statistics on them regularly.
However, we've recently observed some performance degradation when querying these tables.
To address this, we are considering running GATHER_SCHEMA_STATS nightly on the entire schema. We assume this could help improve performance.
However, we are not very familiar with the DBMS_STATS APIs. Based on the documentation, we are planning to run the following:
BEGIN
  DBMS_STATS.GATHER_SCHEMA_STATS(
    ownname          => 'XXDB',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    method_opt       => 'FOR ALL COLUMNS SIZE AUTO',
    cascade          => TRUE);
END;
/
Could you please let us know if this is the right approach to gather schema stats? Also, do you by any chance have any automation scripts created for this purpose that we could refer to?
Thanks in advance.
Pretty much all the defaults are the way to go (in the vast majority of cases), which means you can just do:
DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'MYSCHEMA')
However - stats are gathered (where appropriate) every night in Autonomous (and even more frequently in certain situations), so before doing anything I'd be checking LAST_ANALYZED on ALL_TABLES to see what the current collection frequency seems to be.
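A quick way to check that (using the XXDB schema name from your question; substitute your own) is something along these lines:

```sql
-- When were stats last gathered for each table in the schema?
SELECT table_name,
       last_analyzed,
       num_rows
FROM   all_tables
WHERE  owner = 'XXDB'
ORDER  BY last_analyzed NULLS FIRST;
```

Tables with a NULL or very old LAST_ANALYZED are the ones the automatic job has been skipping, and that tells you whether a manual gather would add anything.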
The performance issue might not be stats frequency - it *might* be the type of stats, or it *might* not be a stats issue at all.
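One way to separate those cases: the dictionary tracks which tables Oracle considers stale. A sketch, again assuming the XXDB schema (note that staleness is based on table monitoring data, which you can flush with DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO before querying):

```sql
-- STALE_STATS = 'YES' means roughly 10% or more of the rows have
-- changed since the last gather for that table
SELECT table_name,
       stale_stats,
       last_analyzed
FROM   dba_tab_statistics
WHERE  owner = 'XXDB'
AND    stale_stats = 'YES';
```

If your history tables don't show up here and their plans still look wrong, the problem is more likely histogram/column-level stats or something unrelated to stats entirely, so capture the actual execution plan of a slow query before changing the gathering schedule.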