Just getting into using various Brent Ozar tools like sp_BlitzCache and sp_BlitzIndex to identify and correct SQL Server performance issues. I've been experimenting with running sp_AskBrent (now sp_BlitzFirst) and sp_WhoIsActive as agent jobs as a way to baseline query performance.
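For context, the agent job step I've been testing looks roughly like the sketch below. The table name and the @Seconds value are just placeholders of mine, and I'm assuming the @Output* parameters of sp_AskBrent/sp_BlitzFirst behave the same way as sp_BlitzCache's output parameters:

-- Agent job step: sample activity for a short window, then append the results to a logging table.
-- Table name and sample length are placeholders, not anything prescribed by the scripts.
EXEC DbaTools.dbo.sp_BlitzFirst
    @Seconds = 30,
    @OutputDatabaseName = 'DbaTools',
    @OutputSchemaName = 'dbo',
    @OutputTableName = 'BlitzFirst_Results';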
Do any of you run sp_BlitzCache periodically as an agent job to capture history / provide a baseline and/or evidence for your SQL Server performance tuning efforts? If so:
How specifically has this helped you in your environment?
Which sp_BlitzCache sort_order do you find most helpful to log? I'm considering separate output tables by cpu, reads, duration, and executions per minute (I've sketched the full set of calls after the example below), but I'm wondering whether I should be using the "avg X" versions of those sorts instead.
Should I stick with the default of @top = 10? Or go for 20, 30, etc?
Here's an example where I'm using total reads and top 20:
EXEC DbaTools.dbo.sp_BlitzCache
    @top = 20,
    @sort_order = 'reads',
    @output_database_name = 'DbaTools',
    @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_reads';
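And here's the full set of calls I have in mind, one output table per sort order. The table names are just my own naming convention, and I'm not sure whether my version wants 'xpm' or 'executions per minute' for the last one, so treat that token as a guess:

-- One job step per sort order, each writing to its own table.
EXEC DbaTools.dbo.sp_BlitzCache @top = 20, @sort_order = 'cpu',
    @output_database_name = 'DbaTools', @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_cpu';

EXEC DbaTools.dbo.sp_BlitzCache @top = 20, @sort_order = 'duration',
    @output_database_name = 'DbaTools', @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_duration';

-- Not certain of the exact sort_order token for executions per minute in my version.
EXEC DbaTools.dbo.sp_BlitzCache @top = 20, @sort_order = 'xpm',
    @output_database_name = 'DbaTools', @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_xpm';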
How frequently would you run such a job? Weekly? The plan cache doesn't get wiped out until a host/service restart, an explicit DBCC FREEPROCCACHE, or some other perhaps unexpected (to some) event such as changing MAXDOP (our team learned the latter the hard way), so I'm leaning toward a weekly snapshot being enough, but I'd appreciate a sanity check.
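For what it's worth, to get a feel for how long plans actually survive in our cache (and therefore how much history a weekly snapshot might miss), I've been eyeballing the oldest creation_time in sys.dm_exec_query_stats with a quick check like this; it's just my own rough query, not part of the Blitz scripts:

-- Rough check: how far back does the current plan cache reach?
-- One row per cached statement, so the count is statements, not distinct plans.
SELECT MIN(creation_time) AS oldest_cached_plan,
       COUNT(*) AS cached_statement_count
FROM sys.dm_exec_query_stats;

If that oldest date keeps resetting after config changes like the MAXDOP one, I figure that's a sign the job needs to run more often than weekly.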
This overlaps with another StackExchange question I just posted about sp_AskBrent and sp_WhoIsActive, but how resource-intensive is it to run these SPs on a periodic basis?
That other StackExchange question is here:
sp_AskBrent not logging queries when saving to table vs. sp_WhoIsActive