I have a table MyTable with columns OldVal, NewVal, and Nme, all of which are VARCHAR(MAX).
Wherever Nme = 'MyValue', the OldVal and NewVal values are "integers", i.e. numeric values stored as VARCHARs.
Wherever Nme = 'SomethingElse', the OldVal and NewVal values are "bigints".
Wherever Nme = 'SomethingElseAgain', the OldVal and NewVal values are plain strings.
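For reference, a minimal sketch of the schema and data involved (the table and column names are from the real database; the sample values are illustrative, except for the one that appears in the error message below):

    CREATE TABLE MyTable (
        Nme    VARCHAR(MAX),
        OldVal VARCHAR(MAX),
        NewVal VARCHAR(MAX)
    );

    INSERT INTO MyTable (Nme, OldVal, NewVal) VALUES
        ('MyValue',            '100',        '50'),          -- INT-sized values
        ('SomethingElse',      '3666999000', '4000000000'),  -- BIGINT-sized values
        ('SomethingElseAgain', 'abc',        'def');         -- plain strings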
If I run the following query:
SELECT OldVal,
NewVal
FROM MyTable
WHERE Nme = 'MyValue' AND
CONVERT(INT,NewVal) < CONVERT(INT,OldVal)
The query runs as expected.
However, if I change the query to:
SELECT 1
FROM (
SELECT OldVal,
NewVal
FROM MyTable
WHERE Nme = 'MyValue' AND
CONVERT(INT,NewVal) < CONVERT(INT,OldVal)
) a
I get an error:
The conversion of the varchar value '3666999000' overflowed an int column.
The value in the error message is one where Nme = 'SomethingElse' (i.e. a BIGINT stored as a VARCHAR).
It seems to me that in the first query the optimizer filters by Nme = 'MyValue' first and then performs the conversions on the remaining rows, whereas in the second query it attempts the conversions first (so it tries to convert values that are not valid INTs) and only then applies the Nme = 'MyValue' filter.
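For what it's worth, I can avoid the error with a guarded rewrite. This is a sketch, not a fix for the underlying behaviour: TRY_CONVERT (available from SQL Server 2012 / compatibility level 110 upward) returns NULL instead of raising an error when a value cannot be converted, and NULL comparisons are simply not TRUE, so rows with out-of-range values drop out regardless of when the conversion is evaluated:

    SELECT 1
    FROM (
        SELECT OldVal,
               NewVal
        FROM MyTable
        WHERE Nme = 'MyValue' AND
        TRY_CONVERT(INT, NewVal) < TRY_CONVERT(INT, OldVal)
    ) a;

But I'd still like to understand why the two original queries behave differently.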
This is confirmed by the predicates shown in the tooltips of the Filter operators in the two (estimated) execution plans.
The database is on a SQL Server 2014 instance running at compatibility level 110. The behaviour persists if I force the 2014 cardinality estimator with OPTION (QUERYTRACEON 2312).
What would cause this to happen? I suspected that wrapping NewVal and OldVal in CONVERT might be interfering with statistics, but surely that would affect both queries?
NB: the code above has been abridged to its simplest form for the purpose of demonstrating the problem here. Also, I know storing integers as VARCHARs is bad practice, but this is a third-party database whose design we cannot control.