I'm currently studying for exam 70-433 (the Microsoft SQL Server certification exam), and I'm getting very confused about the "query cost" performance metric.
According to all the documentation I could find via Google, the query cost is a percentage figure representing the share of the whole batch taken up by any one part of it. This already seemed a little odd to me: I'm interested in the absolute merit of a particular query, not its merit relative to whatever other queries happen to appear alongside it.
But then I thought: well, maybe what you're expected to do is place two alternative queries side by side, run them together as a single batch, and then whichever one has a cost below 50% is the winner, as in the sketch below.
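For concreteness, this is the kind of comparison I mean. (A minimal sketch; the `Sales.Customers`/`Sales.Orders` tables and the two query shapes are just hypothetical placeholders, not anything from the book.)

    -- Enable "Include Actual Execution Plan" in SSMS (Ctrl+M), then run
    -- both candidates as one batch. SSMS labels each plan with
    -- "Query cost (relative to the batch): N%".

    -- Candidate 1: correlated subquery
    SELECT c.CustomerID,
           (SELECT MAX(o.OrderDate)
            FROM Sales.Orders AS o
            WHERE o.CustomerID = c.CustomerID) AS LastOrderDate
    FROM Sales.Customers AS c;

    -- Candidate 2: outer join plus aggregate
    SELECT c.CustomerID, MAX(o.OrderDate) AS LastOrderDate
    FROM Sales.Customers AS c
    LEFT JOIN Sales.Orders AS o
        ON o.CustomerID = c.CustomerID
    GROUP BY c.CustomerID;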
But the discussion of query cost in Chapter 6, Lesson 1 of Microsoft's Training Kit for exam 70-433 doesn't seem to bear any relation to this.
Here is an example: they show a query containing two correlated subqueries, then improve on it by replacing the subqueries with an OUTER APPLY. The result: "This query has a cost of roughly 76, while the first query's cost was double that, about 151." They then improve the query even further, reducing the cost from 76 to 3.6. Nothing implies these figures are percentages; on the contrary, they are presented as absolute figures describing the query as a standalone object, without reference to any other queries. And anyway, how could the first query have a cost of 151%?
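I don't have the book's exact query to hand, but the rewrite they describe looks roughly like this (hypothetical tables and columns again, just to show the shape of the transformation):

    -- Before: two correlated subqueries, each evaluated per outer row
    SELECT c.CustomerID,
           (SELECT MAX(o.OrderDate) FROM Sales.Orders AS o
            WHERE o.CustomerID = c.CustomerID) AS LastOrderDate,
           (SELECT MAX(o.TotalDue) FROM Sales.Orders AS o
            WHERE o.CustomerID = c.CustomerID) AS BiggestOrder
    FROM Sales.Customers AS c;

    -- After: one OUTER APPLY returning both columns in a single pass
    SELECT c.CustomerID, x.LastOrderDate, x.BiggestOrder
    FROM Sales.Customers AS c
    OUTER APPLY (SELECT MAX(o.OrderDate) AS LastOrderDate,
                        MAX(o.TotalDue)  AS BiggestOrder
                 FROM Sales.Orders AS o
                 WHERE o.CustomerID = c.CustomerID) AS x;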
Later in the chapter, they show a screenshot of an execution plan with three parts. The first says "Cost: 0%", the second "Cost: 1%", and the last "Cost: 99%", yet the book's own text below the screenshot says, "The cost of this query is 0.56." I'm guessing they mean some other kind of cost, but I can't find it referenced anywhere else.
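For what it's worth, the only way I know to get a raw numeric cost out of SQL Server myself is the TotalSubtreeCost column of a SHOWPLAN, so I'm wondering whether that's the figure the book means (again, the query here is just a hypothetical example):

    -- Return the estimated plan instead of executing the query.
    -- The TotalSubtreeCost column of the root row is an absolute
    -- number in the optimizer's internal cost units.
    SET SHOWPLAN_ALL ON;
    GO
    SELECT c.CustomerID, COUNT(*) AS OrderCount
    FROM Sales.Orders AS o
    JOIN Sales.Customers AS c ON c.CustomerID = o.CustomerID
    GROUP BY c.CustomerID;
    GO
    SET SHOWPLAN_ALL OFF;
    GO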
Can someone help? I'm thoroughly confused.
