29

I wonder if there is any reason - or if it is just an accident of history - that there are no !> and !< operators in most programming languages?

a >= b (a greater than OR equal to b) could be written as !(a < b) (a NOT less than b), which equals a !< b.

This question struck me when I was in the middle of coding my own expression tree builder. Most programming languages have an a != b operator for !(a == b), so why no !> and !<?

UPDATE:

  • !< (not lesser) is easier to pronounce than >= (greater or equals)
  • !< (not lesser) is shorter to type than >= (greater or equals)
  • !< (not lesser) is easier to understand* than >= (greater or equals)

*because OR is a binary operator, your brain needs to operate on two operands (greater, equals), while NOT is a unary operator, so your brain needs to operate on only one operand (lesser).

Alex Burtsev
  • 686

10 Answers

85

The D programming language and DMC's extension to C and C++ did support these operators (all 14 combinations of them), but interestingly, D is going to deprecate them, mainly because:

  1. what exactly is a !< b? It is a >= b || isNaN(a) || isNaN(b). !< is not the same as >=, because NaN !< NaN is true while NaN >= NaN is false (see the sketch after this list). IEEE 754 is hard to master, so using a !< b will just cause confusion over NaN handling; you can search for such operators in Phobos (D's standard library), and quite a number of uses have comments beside them to remind readers that NaN is involved,
  2. therefore, few people will use them, even in a language like D where such operators exist,
  3. and one has to define 8 more tokens for these seldom-used operators, which complicates the compiler for little benefit,
  4. and without those operators, one could still use the equivalent !(a < b) or, if one likes to be explicit, a >= b || isNaN(a) || isNaN(b), and these are easier to read.
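
Here is a minimal C++ sketch of the NaN point in item 1 (the variable names are mine, and the hypothetical a !< b is spelled !(a < b), since C++ has no such operator):

    #include <cmath>     // std::isnan
    #include <iostream>
    #include <limits>

    int main() {
        double a = std::numeric_limits<double>::quiet_NaN();
        double b = 1.0;

        // Under IEEE 754, every ordered comparison involving NaN is false...
        std::cout << (a >= b) << '\n';                                    // 0
        // ...so negating the opposite comparison gives a different answer:
        std::cout << !(a < b) << '\n';                                    // 1
        // which is exactly "a >= b, or one of the operands is NaN":
        std::cout << (a >= b || std::isnan(a) || std::isnan(b)) << '\n';  // 1
    }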

Besides, the relations (≮, ≯, ≰, ≱) are seldom seen in basic math, unlike != (≠) or >= (≥), so they are hard for many people to understand.

These are probably also the reasons why most languages do not support them.

kennytm
  • 977
47

Because it doesn't make much sense to have two different operators with exactly the same meaning.

  • “not greater” (!>) is exactly the same as “lesser or equal” (<=)
  • “not lesser” (!<) is exactly the same as “greater or equal” (>=)

This does not apply to “not equals” (!=): there is no other operator with the same meaning.

So, your modification would make the language more complicated with no benefit.
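
As a small sketch of that equivalence for plain integers (where no NaN-like value can interfere; the loop bounds are arbitrary):

    #include <cassert>

    int main() {
        // For a totally ordered type, "not greater" equals "less than or equal",
        // and "not lesser" equals "greater than or equal".
        for (int a = -3; a <= 3; ++a)
            for (int b = -3; b <= 3; ++b) {
                assert(!(a > b) == (a <= b));
                assert(!(a < b) == (a >= b));
            }
    }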

svick
  • 10,137
10

!< is synonymous with >=. The latter is just a way of typing the well-defined mathematical symbol ≥. You are right that "not less than" is used in spoken language; however, it is colloquial and can be ambiguous (it can be interpreted, or misinterpreted, as >). Programming and math, on the other hand, use clearly defined, unambiguous terminology.

Even in three-valued logic, such as ANSI SQL, not x < y is equivalent to x >= y, as they both give NULL if either x or y is NULL. However, there are non-ANSI-compliant SQL dialects where it is not equivalent, and they do have !<.

vartec
  • 20,846
8

Transact-SQL has !> (not greater than) and !< (not less than) operators.

So, other than you, someone at Sybase/Microsoft also thought it would be a good idea. Just like Microsoft Bob! :)

mousio
  • 103
4

I think the answer is simply that there's no need for a !< operator. As you pointed out in your question, there are already >= and <= along with the possibility of negating an existing expression, so why add another operator?

Bryan Oakley
  • 25,479
4

From RFC 1925

perfection has been reached not when there is nothing left to add, but when there is nothing left to take away.

Adding additional operators that duplicate existing functionality doesn't do anything other than add (unnecessary) complexity to the language (and thus tokenizer and parser).

Consider also that in languages where operator overloading is possible, you would have yet another operator to overload. Consider the confusion when bool operator<= and bool operator!> could return different things (yes, I know one can already make inconsistent comparisons).
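
As a hypothetical C++ illustration (the Sloppy type and its deliberately inconsistent overloads are invented for this sketch), <= and !(>) can already disagree today when operators are overloaded carelessly; an overloadable !> would simply be one more definition to keep in sync:

    #include <iostream>

    // Invented type with deliberately inconsistent comparison overloads.
    struct Sloppy { int v; };

    bool operator<=(Sloppy a, Sloppy b) { return a.v <= b.v; }
    bool operator> (Sloppy a, Sloppy b) { return a.v >= b.v; }  // bug: should be >

    int main() {
        Sloppy a{1}, b{1};
        std::cout << (a <= b) << '\n';   // prints 1
        std::cout << !(a > b) << '\n';   // prints 0: disagrees with a <= b
        // A separate operator!> would be yet another overload to keep consistent.
    }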

Lastly, think of languages where methods or operators can be defined in multiple places (Ruby - I'm looking at you): one programmer uses <= while another uses !>, and you end up with multiple code styles for the same expression.

3

!< is equal to >=. Now, why do we have the second one and not the first? Because languages implement the positive operators first and only then approach the negative ones. Since implementing >= also covers !<, and <= covers !>, language creators considered these redundant and skipped them.

Always try to implement the positive case first, then go to the negative case (positive thinking; my personal view only). :)

Notepad
  • 131
2

The reason is that operators in programming languages borrow from the mathematical tradition, and in mathematics no one really uses "not greater" and "not smaller", since "smaller or equal" and "greater or equal" do just as good a job.

So in programming languages we usually get a symbol that looks like ≠ for not-equals (!= or /=, unless someone is being fancy with <> or a textual operator)

and things that look like ≤ and ≥ (<= and >=)


Btw, I don't agree with your assertion that NOT is simpler to understand and reason about than OR. In mathematics, proofs involving lots of negations (like reductio ad absurdum) are usually frowned upon if there is a more direct alternative available. Also, in the ordering case, the basic knowledge we have (and that is used when thinking or proving something) is the trichotomy between <, = and >, so any !< statement probably has to be converted to >= if you want to do anything useful with it.
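
A one-line version of that conversion (assuming the trichotomy of a total order, so it deliberately ignores IEEE NaN), written in LaTeX notation:

    % exactly one of a < b, a = b, a > b holds
    \neg(a < b) \iff (a = b) \lor (a > b) \iff a \ge b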

hugomg
  • 2,102
2

I'd partially blame the assembly instruction set. You've got instructions such as jge for "jump if greater than or equal", as opposed to "jump if not less than".

Compiler writers may have gone off of what assembly writers came up with, which was presumably based on how it was labeled when being designed on the chip.

...possibly.

Ed Marty
  • 179
1

I think I saw some languages a few years ago where, instead of the != operator (not equals), something like <> was used. Can't remember their names, though...

I think that it's harder to read !(a < b) or a !< b than a >= b. That's probably the reason why !< is not used (it does look ugly, in my opinion).

Radu Murzea
  • 1,820