Great question, Daidipya! Identifying least-used tables in Oracle can be tricky because Oracle does not maintain per-table access counts by default. However, we can approximate "least used" tables using the following approaches:
Option 1: Use v$segment_statistics for runtime stats
This dynamic view tracks segment-level stats, including logical reads.
✅ Query: Least accessed tables by logical reads
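Something along these lines should work (a minimal sketch; it assumes you want to skip SYS/SYSTEM objects, and FETCH FIRST needs 12c or later, so use a ROWNUM filter on older releases):

-- 20 least-read tables, per in-memory segment statistics
SELECT owner,
       object_name,
       value AS logical_reads
FROM   v$segment_statistics
WHERE  statistic_name = 'logical reads'
AND    object_type = 'TABLE'
AND    owner NOT IN ('SYS', 'SYSTEM')
ORDER  BY value ASC
FETCH FIRST 20 ROWS ONLY;

Keep in mind that a table with no tracked activity since startup may not appear in this view at all, so absence from the result is itself a signal.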
Shows tables with the fewest logical reads since instance startup.
Option 2: Use DBA_HIST_SEG_STAT (if AWR licensed)
This gives historical usage, which is more reliable over time.
✅ Best choice if AWR is enabled and licensed. Reflects multi-day usage.
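One way to roll it up (again a sketch; it assumes the Diagnostics Pack license that AWR requires, and joins DBA_HIST_SEG_STAT to DBA_HIST_SEG_STAT_OBJ to resolve owner and table name):

-- Total logical reads per table across retained AWR snapshots
SELECT o.owner,
       o.object_name,
       SUM(s.logical_reads_delta) AS logical_reads
FROM   dba_hist_seg_stat s
JOIN   dba_hist_seg_stat_obj o
       ON  s.dbid = o.dbid
       AND s.obj# = o.obj#
WHERE  o.object_type = 'TABLE'
AND    o.owner NOT IN ('SYS', 'SYSTEM')
GROUP  BY o.owner, o.object_name
ORDER  BY logical_reads ASC;

To bound the report to specific days, add a SNAP_ID range filter taken from DBA_HIST_SNAPSHOT.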
Option 3: Look at last analyzed time (stale stats = possibly unused)
Helpful if you want to clean up old tables. Not precise for access frequency.
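For example (DBA_TABLES is the source here; tables never analyzed, or analyzed longest ago, sort first):

-- Oldest (or missing) optimizer statistics first
SELECT owner,
       table_name,
       last_analyzed
FROM   dba_tables
WHERE  owner NOT IN ('SYS', 'SYSTEM')
ORDER  BY last_analyzed ASC NULLS FIRST;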
Bonus Tip: Use AUDIT or Fine-Grained Auditing (FGA)
For precise table access tracking, consider enabling:
- AUDIT SELECT ON <table>; (standard auditing), or
- Fine-Grained Auditing (FGA) for selective column/row access logging.
But be careful: auditing adds overhead and should be used selectively.
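A sketch of both (HR.EMPLOYEES, the SALARY column, and the policy name are placeholders; standard auditing also assumes the AUDIT_TRAIL parameter is enabled):

-- Standard auditing: record every SELECT against one table
AUDIT SELECT ON hr.employees;

-- Fine-Grained Auditing: log only SELECTs that touch a sensitive column
BEGIN
  DBMS_FGA.ADD_POLICY(
    object_schema   => 'HR',
    object_name     => 'EMPLOYEES',
    policy_name     => 'EMP_SELECT_AUDIT',
    audit_column    => 'SALARY',
    statement_types => 'SELECT');
END;
/

FGA hits then show up in DBA_FGA_AUDIT_TRAIL.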
Recommendation
If AWR is licensed:
➡️ Use Option 2 (dba_hist_seg_stat) — most accurate.
If not licensed:
➡️ Use Option 1 (v$segment_statistics) — gives real-time insight.
Would you like a shell script or SQL report that runs this for all schemas and filters only user tables (excluding Oracle-maintained ones)?