
I have been looking at a stored procedure that performs poorly, even though it performs well for larger customer databases. I believe the reason is that SQL Server has not had a chance to build good statistics for the columns these queries use.

I used the Database Engine Tuning Advisor to recommend several statistics, and after applying those in a test environment with a restored backup of the database in question, performance was significantly better.

My question is this: if I create a script (a "maintenance script", as we call them in house) for support to run on customers who may also be experiencing this issue, would it be a potential performance bottleneck if a statistic had already been created automatically for the same column(s) by SQL Server and my script creates a duplicate statistic?

Auto create and auto update statistics are set to ON, in case you are wondering. We support customers who run SQL Server 2005 (unfortunately) or higher.

miracle173
Eric Reid

2 Answers


Would it be a potential performance bottleneck if a statistic had already been created automatically for the same column(s) by SQL Server and my script creates a duplicate statistic?

Surely you can write your script such that it detects such a duplicate statistic, instead of just blindly creating another one (possibly multiple times)? We do this for tables, views, procedures, functions, indexes... why not for statistics too? If you have the ability to create a statistic you have the ability to check the metadata for one that's already there (and maybe just needs to be updated).
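That metadata check can be sketched in T-SQL against `sys.stats` and `sys.stats_columns` (both available since SQL Server 2005). The table, column, and statistic names below are placeholders, not anything from the question:

```sql
-- Sketch: only create the statistic if no existing statistic
-- (auto-created or otherwise) already leads on the same column.
IF NOT EXISTS
(
    SELECT 1
    FROM sys.stats AS s
    JOIN sys.stats_columns AS sc
        ON s.object_id = sc.object_id
       AND s.stats_id  = sc.stats_id
    JOIN sys.columns AS c
        ON sc.object_id = c.object_id
       AND sc.column_id = c.column_id
    WHERE s.object_id       = OBJECT_ID(N'dbo.MyTable')  -- placeholder
      AND c.name            = N'MyColumn'                -- placeholder
      AND sc.stats_column_id = 1   -- leading column of the statistic
)
BEGIN
    CREATE STATISTICS st_MyTable_MyColumn
        ON dbo.MyTable (MyColumn);
END
ELSE
BEGIN
    -- A statistic already covers this column; a refresh may be all that's needed.
    UPDATE STATISTICS dbo.MyTable;
END
```

The same pattern extends to multi-column statistics by matching the full column list in `sys.stats_columns` rather than just the leading column.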

I haven't done any performance tests, and I'm not sure whether auto-stats (sync or async) will kick off multiple times or just pick one of them, but certainly nothing good can come of it. In one case you're going to spend twice (or more!) the amount of time updating redundant statistics; if they don't both get updated, then it's possible the optimizer will pick the wrong one when using the statistic for cardinality estimation.

I'd be more interested to know how often you find that - with auto-create stats turned on - SQL Server hasn't created a statistic that would yield better performance or estimates, and then suddenly the plans improve because of the statistic you decided to create (that SQL Server didn't bother to). My guess is that would be pretty rare (unless you haven't really enabled auto-create stats for those customers, or maybe these are filtered statistics, though those aren't possible for your 2005 folks).

As for stats going stale in general, you may be able to implement Trace Flag 2371 - which improves the thresholds for engaging auto-stats - for some of your customers.
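For the customers on SQL Server 2008 R2 SP1 or later (the flag does not exist on 2005), enabling it looks like this:

```sql
-- Trace Flag 2371 lowers the fixed ~20% row-modification threshold
-- for auto-updating statistics on large tables to a sliding scale.
DBCC TRACEON (2371, -1);  -- -1 = enable globally for the instance
-- To make it survive restarts, add -T2371 as a startup parameter instead.
```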

Glorfindel
Aaron Bertrand

I used the Database Engine Tuning Advisor to recommend several statistics, and after applying those in a test environment with a restored backup of the database in question, performance was significantly better.

Don't just blindly accept what the DTA suggests. Its recommendations should be thoroughly analyzed before jumping to a conclusion.

Would it be a potential performance bottleneck if a statistic had already been created automatically for the same column(s) by SQL Server and my script creates a duplicate statistic?

SQL Server will use only one of the statistics, rendering the other obsolete. BUT the duplicate will still be updated and managed by SQL Server, thereby consuming resources.
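You can find such redundant auto-created statistics from the metadata. This is a sketch, assuming "duplicate" means an auto-created (`_WA_Sys_`) statistic whose leading column is also the leading column of another statistic on the same table:

```sql
-- List auto-created statistics whose leading column is already covered
-- by another statistic (e.g. one created by an index or a script).
SELECT  OBJECT_NAME(s.object_id) AS table_name,
        s.name                   AS auto_created_stat,
        c.name                   AS leading_column
FROM sys.stats AS s
JOIN sys.stats_columns AS sc
    ON s.object_id = sc.object_id AND s.stats_id = sc.stats_id
JOIN sys.columns AS c
    ON sc.object_id = c.object_id AND sc.column_id = c.column_id
WHERE s.auto_created = 1
  AND sc.stats_column_id = 1
  AND EXISTS
  (
      SELECT 1
      FROM sys.stats AS s2
      JOIN sys.stats_columns AS sc2
          ON s2.object_id = sc2.object_id AND s2.stats_id = sc2.stats_id
      WHERE s2.object_id = s.object_id
        AND s2.stats_id <> s.stats_id
        AND sc2.stats_column_id = 1
        AND sc2.column_id = sc.column_id
  );
```

A redundant auto-created statistic can then be removed with `DROP STATISTICS` once you have confirmed it is not the one the optimizer is using.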

You can check which statistics are used by a given query with the method described by Paul White in How to Find the Statistics Used to Compile an Execution Plan:

-- Append to the query under investigation (a placeholder query is shown):
SELECT *
FROM dbo.MyTable
OPTION
(
    QUERYTRACEON 3604, -- direct trace output to the client
    QUERYTRACEON 9292, -- report statistics considered "interesting"
    QUERYTRACEON 9204  -- report statistics actually loaded and used
);

Note that these are undocumented trace flags, so use them only for investigation, not in production code.

If I create a script (a "maintenance script", as we call them in house) for support to run on customers who may also be experiencing this issue.

Use Ola Hallengren's maintenance scripts. They are more versatile and work really well.

Glorfindel
Kin Shah