
As far as I have been able to find, the first language to use ^ for exponentiation was BASIC, in 1964. Earlier languages, such as Fortran, used other symbols for exponentiation, e.g. ** (though in Fortran's case this was likely influenced by its limited character set compared with later languages).
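For concreteness, here is a minimal sketch of the caret operator in a classic line-numbered BASIC dialect; the program is illustrative rather than taken from any manual, and Fortran would write the same expression as 2 ** 3:

    10 REM exponentiation written with the caret, as in later BASIC dialects
    20 LET P = 2 ^ 3
    30 PRINT P
    40 END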

My question is, why did BASIC choose to use ^ for exponentiation? It is not a case of simply using existing mathematical notation (unlike + and -), since the ^ symbol was not initially used in math to mean exponentiation (e.g. TeX usage is more recent than BASIC).

I am looking for an objective answer backed up with a proper source.


As pointed out in the accepted answer, the original 1964 BASIC used ↑ (up arrow) for exponentiation (as can be found in the original manual, page 5). ASCII did not even include a ^ until 1965. Later versions of BASIC did, however, use ^ for exponentiation.

Robert Harvey

2 Answers


The BASIC article on Wikipedia provides a link to the first user manual, written by the inventors of the language. At that time, October 1964, the power operator was an up arrow, ↑ (page 9), and the character was available on the keyboard used on the system (page 15). It was, however, not a standard character: in the manual, the up arrows are not printed but added as manual corrections.
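For comparison, here is a minimal sketch of what that original notation looked like, assuming a Dartmouth-style line-numbered program (the example is hypothetical, not copied from the manual):

    10 REM exponentiation written with the up arrow, as in the 1964 manual
    20 LET P = 2 ↑ 3
    30 PRINT P
    40 END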

Other languages also used the up-arrow symbol for exponentiation, for example ALGOL, which, along with FORTRAN, was one of the sources that BASIC drew on.

Around the same time, in 1963, the first version of the ASCII character set was published. There are documented discussions within the standards committee about which characters to include in the new standard set. This article provides historical references based on paper archives; it shows that the popularity of ALGOL influenced the ASCII committee's choices (for example, the square brackets). The article also gives three references on the use of the caret as a substitute for the up arrow when it is not used as an accent.

So, in conclusion, the use of the caret was not a choice of the language designers, but a consequence of the decisions the ASCII standards committee made about which characters would be available.

Christophe

I think the proper answer is the one you already preemptively rejected.

"It is not a case of simply using existing mathematical notation (unlike + and -), since the ^ symbol is not really used in math to mean exponentiation."

Why do you say that? Sure, in "formal" mathematical notation exponentiation is written as a superscript, but ASCII has no concept of superscripts or subscripts, and there is a simple, informal notation (one might even say a notation "for beginners," which is the B in BASIC) that uses the ^ symbol. It's the simplest, most intuitive way to express the operation, given the constraints of the ASCII character set and the explicit target audience of "beginners" rather than people with a heavy math or computer science background.

Mason Wheeler