Math History

Today in class we had an interesting guest lecturer. Among other things, he showed us an image of the first ever recorded use of the = sign. Apparently, Robert Recorde invented the “equal” sign in 1557 with the publication of the Whetstone of Witte. According to Wikipedia, this book also served to introduce algebra to England.

My question is, does anybody else know of any other fascinating math factoids like this? It is one thing to know that [insert crazy old math person] invented zero, it is another to actually see it. Might be a bit too obscure, but if anybody has anything, I’d love to hear it.

I’ve placed this in GQ because I’m looking for factual answers, e.g., “the first recorded use of zero can be found in Old Man Hubbard’s ‘The World is Not Enough’ from 1453, which is housed at Oxford University” — not just the typical factoids I can dredge up myself.

(I’m only interested in this because my fiancee is a math major, so I’d love to compile a little collection of interesting math history for her.)

The Wikipedia article “History of Mathematical Notation” is well-cited, including a reference to Florian Cajori’s A History of Mathematical Notations, which seems like a standard text on this subject.

There are also these three links:
Earliest Uses of Mathematical Notation
History of Mathematical Notation
Mathematical Notation: Past and Future

The word Algebra itself is taken from the Arabic word al-jabr, and is widely attributed to the Arabic mathematician Al-Khwarizmi.

EDIT… Read some more… in fact, the term Algorithm is believed to be a Latinization of his name.

As for Zero:

I’m not sure exactly what kind of factoids you’re looking for, but the trend so far seems to be history of notation and terminology.

Well, this isn’t along those lines, but one factoid which I enjoy noting is that, of the four most prominent inventors/discoverers of the modern concept of “computable function”, three died tragic deaths: Alan Turing (who gave us the Turing machine in 1936) committed suicide by eating a cyanide-laced apple, in response to government prosecution for his homosexuality. Emil Post (who essentially independently discovered the Turing machine, also in 1936) died of a heart attack subsequent to electroshock therapy for bipolar disorder. And Kurt Goedel (who gave us the concept of the mu-recursive function (i.e., Herbrand-Goedel computability)) grew convinced he was being poisoned, refused to eat, and thus starved himself to death. Only Alonzo Church (who gave us the lambda calculus, also in 1936) managed to live into old age and die of natural causes.

Many of the mathematical notations we use today, including e, π, i and f(x), were originated or popularized by the great (and enormously prolific) eighteenth-century mathematician Leonhard Euler (pronounced “Oiler”). (Note that, although the concept of pi has been known about for millennia, it’s only relatively recently that there has been a special symbol for it.) Euler spent the last few years of his life blind, yet continued to crank out lots of mathematical papers. Here’s a List of topics named after Leonhard Euler.
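Incidentally, the symbols e, π, and i all come together in Euler’s identity, e^(iπ) + 1 = 0. A quick numerical check in Python (just an illustration, not anything from Euler himself):

```python
import cmath

# Euler's identity: e^(i*pi) + 1 = 0, combining the symbols e, pi, and i,
# all notations that Euler originated or popularized.
result = cmath.exp(1j * cmath.pi) + 1

# The result is zero up to floating-point rounding error.
print(abs(result) < 1e-12)  # True
```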

Wow. Thanks everybody. Clearly the SDMB has a strong mathematical component to its readership.

Derleth, thank you for pointing me to Cajori’s text.

The BBC has been running a series, “The Story of Maths” (note the British spelling of Math). It covers all this sort of stuff, including the first recorded usage of a zero (on an Indian temple wall, I think, but that was last week and I may have it wrong), how the ancient Egyptians calculated, the almost simultaneous development of calculus by Newton and Leibniz, etc. It’s not the best documentary ever, due mainly to the presenter (he spent far too much of last night’s episode getting drunk with descendants of Euler and the Bernoulli clan. Are physicists inherently better at this sort of thing than mathematicians?), but it’s worth looking out for on PBS or wherever such shows turn up where you are.

The lambda in lambda calculus was originally a caret placed above the abstracted variable. The printers were unable to typeset this, so they moved it to the left of the abstracted variable. Eventually, a typist mistook the caret for a lambda, and typeset a paper using that notation. Church liked it, so it stuck.
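For what it’s worth, Church’s abstraction notation survives almost unchanged in modern programming languages. A minimal sketch in Python, whose lambda keyword is named directly after the lambda calculus (the examples here are my own, not from Church):

```python
# Church's λx.x (the identity function), written with Python's lambda keyword.
identity = lambda x: x

# Application: (λx.x) 42 reduces to 42.
print(identity(42))  # 42

# A Church numeral: the number n is the function that applies f n times.
# TWO = λf.λx.f (f x)
two = lambda f: lambda x: f(f(x))
print(two(lambda n: n + 1)(0))  # 2
```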

Probably because he attained closure.

That’s horrible.

This book is supposed to be a pretty good read on the history of algebra, although I can’t vouch for it personally.

I can; it is a good read.

Heh, good, you brought it back to notation. While we’re at it, this use of a circumflex to indicate the abstracted variable goes back to Russell and Whitehead’s Principia Mathematica.

To which I add this reference:

Click on View a Clip and watch for the event at 2:49.