Published on November 22nd, 2014 | by Warner Chaves
Finding good stats candidates for manual updating: wisdom for your SQL tuning
On any database system there are two main keys to an optimal execution plan:
- Proper objects to provide access to the data (i.e., the right indexes, partitions, etc.).
- Enough information about the data so the query engine can determine the best way to use those objects.
And when we're talking about the second point, we're mainly talking about statistics. The query optimizer reads the statistics, makes cardinality estimations, and generates a plan. As any DBA worth their salt knows, keeping stats up to date is critical to enabling the optimizer to make the best plan choices.
With that intro out of the way, the purpose of this post is to discuss a way to identify stats objects in your SQL Server that are good candidates for a manual refresh, instead of waiting for auto update stats to kick in and do the work. While researching how to do this, my good friend Fabiano Amorim reminded me of a very handy DMF: sys.dm_db_stats_properties. This DMF was introduced in SQL Server 2008 R2 SP2 and SQL Server 2012 SP1, and is available in all versions afterward.
Now, the beauty of this particular DMF is that it gives us access to all kinds of information about stats in a very easy-to-use function. We can plug it into our own analysis queries and come up with very useful results. And just as the title suggests, finding good candidates for a manual stats update is one of them!
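As a quick illustration of the pattern before we get to the full script, here is a minimal sketch that cross applies the DMF against sys.stats for a single table. The table name dbo.SalesOrders is just a placeholder; substitute any table in your own database:

```sql
-- Sketch: inspect every stat on one table via sys.dm_db_stats_properties.
-- 'dbo.SalesOrders' is a placeholder table name, not from the original post.
SELECT  st.name AS stats_name,
        sp.rows,
        sp.rows_sampled,
        sp.modification_counter,
        sp.last_updated
FROM    sys.stats AS st
CROSS APPLY sys.dm_db_stats_properties(st.object_id, st.stats_id) AS sp
WHERE   st.object_id = OBJECT_ID('dbo.SalesOrders');
```

The modification_counter and last_updated columns are what make this DMF so useful for staleness analysis: before it existed, getting at this information required undocumented commands or DBCC output parsing.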
As a best practice, we recommend that clients leave the Auto Update Stats setting ON for every database. However, even with this setting, a long time can pass between stats updates: by default, the automatic update only fires once roughly 20% of a table's rows (plus 500) have been modified, so on a large table millions of changes can accumulate first, and that can have a negative impact on our SQL Server's performance.
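If you want to confirm the setting is actually on across your databases, a quick check against the sys.databases catalog view does the job:

```sql
-- Verify that Auto Update Statistics is enabled for every database.
SELECT  name,
        is_auto_update_stats_on,
        is_auto_update_stats_async_on
FROM    sys.databases
ORDER BY name;
```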
Below is a sample script for the type of analysis we can do to find these 'stale' stats. I'm providing 4 different parameters:
- @rowsThreshold: in case we want to analyze only tables over a certain size.
- @modificationThreshold: in case we only want to see stats that have undergone a minimum amount of change.
- @sampleThreshold: in case we only want to see stats where the sample size used last time is below a minimum.
- @daysOldThreshold: in case we only want to see stats that haven't been updated in a certain number of days.
Play around with the parameters and see what works best for your environment. The script also generates the actual T-SQL to update the stats in the [TSQL] column, so it can easily be adapted to execute the update operations in an automated way, for example from a maintenance SQL Agent job.
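One way to do that automation, sketched below under the assumption that the analysis query's results have been captured into a temp table named #staleStats (a hypothetical name, not part of the script) with the [TSQL] column intact, is a simple cursor that executes each generated command. Review the generated statements before scheduling anything like this:

```sql
-- Sketch: execute each generated UPDATE STATISTICS command.
-- Assumes #staleStats holds the analysis results, including the [TSQL] column.
DECLARE @cmd NVARCHAR(MAX);

DECLARE statsCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT [TSQL] FROM #staleStats;

OPEN statsCursor;
FETCH NEXT FROM statsCursor INTO @cmd;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sys.sp_executesql @cmd;  -- run one UPDATE STATISTICS command
    FETCH NEXT FROM statsCursor INTO @cmd;
END;

CLOSE statsCursor;
DEALLOCATE statsCursor;
```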
Happy stats hunting!
DECLARE @rowsThreshold INT = 1000000;         -- minimum number of rows the table has to have
DECLARE @modificationThreshold INT = 1000000; -- minimum number of modifications the column has to have
DECLARE @sampleThreshold INT = 8;             -- sample size below this number
DECLARE @daysOldThreshold INT = 1;            -- how many days since the stat was updated

WITH bigTables ([object_id], [tableName], [sizeRows]) AS
(
    SELECT TOP 20
           object_id,
           name = OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id),
           rows = SUM(CASE WHEN index_id < 2 THEN row_count ELSE 0 END)
    FROM   sys.dm_db_partition_stats
    WHERE  object_id > 1024
    GROUP BY object_id
    HAVING SUM(CASE WHEN index_id < 2 THEN row_count ELSE 0 END) >= @rowsThreshold
    ORDER BY rows DESC
)
SELECT bt.tableName,
       st.name,
       bt.sizeRows,
       dsp.rows_sampled,
       dsp.last_updated,
       dsp.modification_counter,
       dsp.rows_sampled / (bt.sizeRows * 1.0) * 100 AS sample_pct,
       'UPDATE STATISTICS ' + bt.tableName + ' ' + st.name
           + ' WITH SAMPLE ' + CONVERT(VARCHAR(3), @sampleThreshold) + ' PERCENT;' AS [TSQL]
FROM   bigTables bt
       INNER JOIN sys.stats st ON bt.[object_id] = st.object_id
       CROSS APPLY sys.dm_db_stats_properties(bt.[object_id], st.stats_id) dsp
WHERE  dsp.rows_sampled / (bt.sizeRows * 1.0) * 100 <= @sampleThreshold
       AND DATEDIFF(DAY, dsp.last_updated, GETDATE()) >= @daysOldThreshold
       AND dsp.modification_counter >= @modificationThreshold
ORDER BY bt.sizeRows DESC;