
I have a query that hits a 1.4 billion row table.

Normally the query runs fine, finishing in under a second.

Then, once every five days or so, auto-update stats kicks in and the query starts taking almost 60 seconds; it gets a different plan.

I checked, and when auto-update stats kicks in, the sample rate is less than 1% of the table.
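For reference, this is a minimal sketch of how the effective sample rate of each statistics object can be checked with sys.dm_db_stats_properties; the table name dbo.BigTable is a placeholder:

```sql
-- Show last update time, row count, sampled rows, and the resulting
-- sample percentage for every statistics object on the table.
SELECT  s.name AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.rows_sampled,
        CAST(100.0 * sp.rows_sampled / NULLIF(sp.rows, 0) AS decimal(5, 2)) AS sample_pct
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.object_id = OBJECT_ID(N'dbo.BigTable');
```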

Now, my question is: can an auto-stats update cause the optimizer to choose a worse plan than before?

I can see that the estimates in the bad plan are wildly different from those in the good plan.

Good Plan:

(execution plan screenshot)

Bad Plan:

(execution plan screenshot)

The estimates are clearly way off in the bad plan, and yet the number of rows returned by the query is the same.

At this point, I am trying to confirm whether an auto-stats update can do this.

What I am trying now is a stats update with a 10% sample to see what happens.

Update: I have completed a stats update with a 10% sampling rate, and the queries are performing well again.

I have made this a regularly scheduled job.
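For illustration, a sketch of the kind of command such a job could run; the table and statistics names are placeholders, and 10 percent is simply the sample rate that worked here:

```sql
-- Refresh the relevant statistics with a fixed 10 percent sample,
-- instead of relying on the default (very low) auto-update sample.
UPDATE STATISTICS dbo.BigTable (IX_BigTable_SomeIndex)
WITH SAMPLE 10 PERCENT;
```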


1 Answer


At this point, I am trying to confirm whether an auto-stats update can do this.

Clearly it can. You're currently experiencing it.

For some larger tables, the default sampling rate is too low.

If you're on SQL Server 2016 or later, you can create or update statistics with the PERSIST_SAMPLE_PERCENT option to override the default behavior and maintain a minimum sampling rate. It's up to you to figure out what that rate needs to be.
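A rough sketch of that approach; the table and statistics names are placeholders, and 10 percent is only an example value:

```sql
-- Update the statistics with a 10 percent sample and persist that
-- sample rate, so later automatic updates reuse it rather than
-- falling back to the default sample for large tables.
UPDATE STATISTICS dbo.BigTable (IX_BigTable_SomeIndex)
WITH SAMPLE 10 PERCENT, PERSIST_SAMPLE_PERCENT = ON;
```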

If you're on an older version, or if the above doesn't work out for you because of other interactions, you may update statistics with NORECOMPUTE and handle stats updates manually at a sampling rate you find ideal.
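A sketch of that alternative, again with placeholder names:

```sql
-- NORECOMPUTE stops automatic updates for this statistics object,
-- so it must then be refreshed by a scheduled job at whatever
-- sample rate you choose.
UPDATE STATISTICS dbo.BigTable (IX_BigTable_SomeIndex)
WITH SAMPLE 10 PERCENT, NORECOMPUTE;
```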
