I recently dealt with a problematic stored proc. At times it runs beautifully fast; other times it runs very, very long. I determined that bad parameter sniffing was the cause.
Just for info - the parameters in the proc are datetime, and the query uses these parameters to search through date ranges. The shape is roughly like the sketch below.
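A minimal sketch of the proc's shape, assuming made-up object names (usp_SearchOrders, dbo.Orders, and OrderDate are placeholders, not the real objects):

```sql
-- Hypothetical shape of the proc; proc, table, and column names are placeholders.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.OrderId, o.CustomerId, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.OrderDate >= @StartDate      -- date-range predicate on the sniffed parameters
      AND o.OrderDate <  @EndDate;
END;
```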
Anyway, this is what I attempted (each variant is sketched after the list):

1. Recreated the proc and used WITH RECOMPILE - Didn't help
2. Recreated the proc and added OPTION (RECOMPILE) - Didn't help
3. Recreated the proc and added OPTION (OPTIMIZE FOR UNKNOWN) - Runs fast
4. Recreated the proc and used local variables - Runs fast
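Minimal sketches of variants 1-4, reusing the placeholder names from above:

```sql
-- 1. WITH RECOMPILE: a fresh plan on every call, but the actual values are still sniffed.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders_1 (@StartDate datetime, @EndDate datetime)
WITH RECOMPILE
AS
    SELECT o.OrderId FROM dbo.Orders AS o
    WHERE o.OrderDate >= @StartDate AND o.OrderDate < @EndDate;
GO

-- 2. OPTION (RECOMPILE): the statement is recompiled on every execution,
--    with the current parameter values visible to the optimizer.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders_2 (@StartDate datetime, @EndDate datetime)
AS
    SELECT o.OrderId FROM dbo.Orders AS o
    WHERE o.OrderDate >= @StartDate AND o.OrderDate < @EndDate
    OPTION (RECOMPILE);
GO

-- 3. OPTIMIZE FOR UNKNOWN: the optimizer ignores the actual values and
--    estimates from general statistics instead.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders_3 (@StartDate datetime, @EndDate datetime)
AS
    SELECT o.OrderId FROM dbo.Orders AS o
    WHERE o.OrderDate >= @StartDate AND o.OrderDate < @EndDate
    OPTION (OPTIMIZE FOR UNKNOWN);
GO

-- 4. Local variables: the values are unknown at compile time, so the
--    optimizer also falls back on general estimates.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders_4 (@StartDate datetime, @EndDate datetime)
AS
BEGIN
    DECLARE @s datetime = @StartDate, @e datetime = @EndDate;
    SELECT o.OrderId FROM dbo.Orders AS o
    WHERE o.OrderDate >= @s AND o.OrderDate < @e;
END;
GO
```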
To help my understanding: is using local variables and OPTIMIZE FOR UNKNOWN the exact same thing, in the sense that both use average density statistical data to produce a plan?
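For what it's worth, the average density the optimizer would fall back on can be inspected like this (the statistics name is a placeholder):

```sql
-- Show the density vector for the statistics on the date column.
-- 'IX_Orders_OrderDate' is a placeholder index/statistics name.
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_OrderDate') WITH DENSITY_VECTOR;

-- For an equality predicate, a density-based estimate is roughly:
--   estimated rows = "All density" * table row count
```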
I tried a couple of combinations of things as well (both sketched below):

5. Recreated the proc and added OPTIMIZE FOR UNKNOWN & OPTION (RECOMPILE) - Runs fast
6. Recreated the proc with local variables & OPTION (RECOMPILE) - Runs slow
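Sketches of the two combinations, same placeholder names (both hints can share one OPTION clause):

```sql
-- 5. Both hints together in a single OPTION clause.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders_5 (@StartDate datetime, @EndDate datetime)
AS
    SELECT o.OrderId FROM dbo.Orders AS o
    WHERE o.OrderDate >= @StartDate AND o.OrderDate < @EndDate
    OPTION (OPTIMIZE FOR UNKNOWN, RECOMPILE);
GO

-- 6. Local variables plus a statement-level recompile - the variant that runs slow.
CREATE OR ALTER PROCEDURE dbo.usp_SearchOrders_6 (@StartDate datetime, @EndDate datetime)
AS
BEGIN
    DECLARE @s datetime = @StartDate, @e datetime = @EndDate;
    SELECT o.OrderId FROM dbo.Orders AS o
    WHERE o.OrderDate >= @s AND o.OrderDate < @e
    OPTION (RECOMPILE);
END;
GO
```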
I have read about the potential dangers of using OPTIMIZE FOR UNKNOWN, and in a lot of cases using local variables is brought up as if it's the same thing. This is what leads me to think that they are the same.
BUT - how do I explain that attempt 6 runs slow?
I want to say that yes, stats are updated, but with a less than 1% sampling rate - the tables are HUGE, +- 1.6 billion rows.
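For reference, the actual sampling rate can be checked like this (the table name is a placeholder; sys.stats and sys.dm_db_stats_properties are standard catalog objects):

```sql
-- Rows sampled vs. total rows for every statistics object on the table.
SELECT s.name AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.rows_sampled,
       CAST(100.0 * sp.rows_sampled / NULLIF(sp.rows, 0) AS decimal(5, 2)) AS sample_pct
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.Orders');   -- placeholder table name
```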
Might also be worth noting - I used the awesome sp_BlitzCache and filtered on the specific proc, and there is a compilation timeout warning for it. My intuition is telling me that's something to note here.
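For reference, this is roughly how I filtered it - I'm going from memory on the parameter names, so treat them as an assumption:

```sql
-- Look at cached-plan warnings for just this one proc.
-- @DatabaseName and @StoredProcName are sp_BlitzCache parameters as I recall them;
-- the values are placeholders.
EXEC dbo.sp_BlitzCache
     @DatabaseName   = N'MyDatabase',
     @StoredProcName = N'usp_SearchOrders';
```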