Mathematical symbols, notation systems: at what point were they created? Which are absolutely critical to their application?

Off the top of my head: from the invention of the plus sign to the integral sign, Greek letters up the wazoo (and graphical futzing with them), logical notation, bra-ket notation, Feynman diagrams, Penrose graphical notation.

This is what I’m thinking about: one could get by for a while without a multiplication sign; but at a discrete point in time somebody worked out that repeated addition could be subsumed by a new symbol.

OK, so that’s question 1), regarding symbols:

What/when/how/which symbols are created when a tipping point is reached, in whatever field (pure math, all the sciences)?

Engineering, computer science, resource management, and hundreds of fields of investigation of which I have no conception are all invited. Engineering and CS may become unmanageable for fascinating reasons I can’t put my finger on, perhaps because they relate so strongly to question 2).

Question 2): some symbols are ad hoc, like writing out a long phrase and then "hereafter [abbrev.]".

In math, “that thing we’re looking for” became x. I believe this is considered, like the numeral zero, a big deal for two reasons: conceptual novelty and its critical nature in the projects of algebra and arithmetic, respectively.

I have Feynman diagrams in mind for a different application. If this is wrong in itself, it would be good to hear, and could elucidate the questions.

I believe my questions, hazily explored here, are in the main understandable. :slight_smile:

Just realized that Feynman diagrams are not a mathematical notation system.

Question still stands, if anyone wants to take a crack at it.

An important grammatical symbol is the question mark, which in this case would help indicate what your question is. I can’t figure it out from your post.

On a lark I googled fMRI and mathematical notation (more specifically, this) and found this. Not sure if it’s the sort of thing you’re after.

If I’m reading it right, the idea seems to be that the symbols are in fact processed in a way similar to the way language is, but not in the same area of the brain. Although language isn’t, at least according to Wikipedia, absolutely hard-coded to use that region.

So if math is processed similarly to language, maybe that suggests that like words, symbols are just an expression and codification rather than things in themselves.

IDK. Sounds slick, right?

:slight_smile:

Will wait a couple of posts… The thread title has the queries in short form.

When it comes to math, even very basic things like the arithmetic signs were invented fairly recently. For example, the Welsh mathematician Robert Recorde invented the = sign in the 16th century.

The rules of basic algebra had been known for centuries before that, but lacking a notation system you had to write out equations in words. For example, imagine having to solve something like “twice an unknown quantity less three is equal to the same quantity plus four.” This is a trivial problem for any grade-school algebra student today, because we can write it in much more concise symbols. More importantly, those symbols can be manipulated according to very precise rules. Equations written in English are much harder to manipulate, because you are constrained by the rules of the English language. And remember, you can’t go back and forth: without those symbols, you don’t have the mathematical language available to think in, even if you have to write in English.
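
As a small illustration of that last point, here is the sentence above translated into modern symbols and solved step by step; the notation is of course exactly the luxury the older writers didn’t have:

```latex
\begin{align*}
2x - 3 &= x + 4 && \text{the sentence, in symbols} \\
x - 3  &= 4     && \text{subtract } x \text{ from both sides} \\
x      &= 7     && \text{add } 3 \text{ to both sides}
\end{align*}
```

Each line follows from the previous one by a rule that cares only about the shape of the expression, not about English grammar.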

It’s not surprising that the progress of mathematics accelerated exponentially once people got the hang of symbolic notation.

Nope. Multiplication is centuries or even millennia older than the multiplication symbol. Look at some of the early texts like Euclid (4th century BC) and al-Khwārizmī (9th century AD) to see how math (including multiplication, squares, roots and solving quadratic equations) was done before mathematical notation.

Although some early mathematicians used some symbols in their works, mostly they wrote problems and methods out in plain language, and what we consider basic mathematical notation, like the plus sign, didn’t come into use until the 14th century.

I think reformulating your question is critical to having people understand what it actually is.

As we all heard in grade school, the most critical “invention” was the symbol for zero. Having a placeholder for “nothing” made many math concepts easier to write down, including other numbers. You could write “50” or “5”, recycling the digits 0-9, instead of needing a separate set of symbols, like “L” and “V”, for each order of magnitude.
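
Just to make the “recycling the digits” point concrete, here is a minimal sketch in Python; the Roman-numeral table is the usual modern subtractive convention, not anything historical:

```python
# Positional notation: ten symbols (0-9) cover every magnitude, because the
# zero placeholder lets the *position* of a digit carry the order of magnitude.
def positional(n: int) -> str:
    return str(n)  # "5" and "50" reuse the same digit symbols

# Roman numerals: a fresh pair of symbols for each order of magnitude
# (I/V, X/L, C/D, M...), and no symbol for "nothing" at all.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n: int) -> str:
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

print(positional(5), positional(50))  # 5 50
print(to_roman(5), to_roman(50))      # V L
```

Ten digit symbols plus a placeholder scale to any magnitude; the Roman scheme needs new symbols every time the numbers get bigger.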

There’s a lot more to zero than just its symbol, though. The very concept of a number signifying nothing was quite controversial for a long time.

There’s a good book on the subject called Zero: The Biography of a Dangerous Idea by Charles Seife.

I am not sure I understand what the question is supposed to be either, but two of the most important figures in the development of modern algebraic notation (quite apart from their contributions to mathematics as such) were Thomas Harriot, in his Artis Analyticae Praxis (1631) and René Descartes in his La Géométrie (1637) (which really deals with algebra as much as geometry). Amongst other things, IIRC, Harriot introduced the writing of such things as x times y simply as xy, and Descartes introduced exponents written as superscripted numbers.

Harriot probably deserves to be much better known than he is. Not only was he important in the development of mathematics, but he observed and drew the Moon through a telescope before Galileo, much more famously, did so.

Symbolic notation was re-invented after the Renaissance. Roman and medieval authors wrote everything out in words, making things cumbersome, but streamlined shorthands existed in quite ancient times. The Rhind Papyrus, from Middle Kingdom Egypt, uses the logogram “heap” for “x”, legs walking forward for “plus”, and legs walking backward for “minus”.

Circa 1992, a community college math teacher mentioned this to me:

Early in the 20th century, allegedly, there was an (international?) convention of math teachers who got together to standardize notation. (I think that was referring, at least, to basic algebraic notation.) It was here, for example, that factorial notation was standardized to be “x!” (using the exclamation mark). I have seen other notations for factorial in some old math textbooks.

I have read (somewhere) that in Descartes’ day, the notation for quadratic polynomials used p for the linear term and q for the quadratic term. Thus, what we write today as:
ax^2 + bx + c
was then written as
aq + bp + c
One can see that today’s methods for dealing with quadratics would have been more awkward with that earlier notation.
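
For contrast, here are the general quadratic and the formula we manipulate it into, written in today’s notation; this is just the standard modern form to set against the aq + bp + c style described above, not anything specific to Descartes-era practice:

```latex
\begin{align*}
ax^2 + bx + c &= 0, \qquad a \neq 0 \\
x &= \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\end{align*}
```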

On the history of notation, see Earliest Uses of Various Mathematical Symbols.

There is a lot to be said about the import of notation. E.g., what was the importance of the Begriffsschrift?

Regarding tipping points, I would hazard a guess that there was some tipping point where people realised they could come up with lots of useful notations. (A trivial tipping point, so described?)

I can imagine that, at some point, as people stopped formulating math like:
“twice an unknown quantity less three is equal to the same quantity plus four.”
and started assigning each distinct idea and unique process its own identifying symbol:

2x - 3 = x + 4

math suddenly became an international language, breaking down communication barriers.

I assume there was some general consensus as to which symbol corresponded to which idea.

Nowadays at least, mathematicians invent notation at the drop of a hat. For example, some years ago, a logician was creating something called linear logic, and he had “linear and” and “linear or” operators that he denoted by an upside-down & and an upside-down ?. These have now become standard and are included in symbol fonts. Or someone wanted a sign for something called an adjoint and denoted it (roughly) -| (but joined). And if, say, I write about something called integrally closed rings (don’t ask), I will call them IC rings just out of laziness.

Oh, and don’t get the idea that, say, pi always means the ratio of the circumference to the diameter of a circle. We heavily overuse all symbols.

I was taught a different notation for factorial when I was at high school in the 1960s. It was a sort of half a box around the number: a vertical line to the left connecting to a horizontal underline.

I haven’t heard of such a convention, though I’m not saying you’re wrong, but I would be interested in a cite of some sort.

Not all notation is standardized today. For example, I can think of at least three different notations for the complement of a set that I’ve seen in different textbooks.
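
For example, all of these turn up in different textbooks for “the complement of the set A” (and there are more):

```latex
% three notations for the complement of a set A
A^{c} \qquad \overline{A} \qquad A'
```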

That’s mentioned in the (excellent) site TATG linked to.

It’s a lot more than that. There were already various lingua francas for mathematical study in different times and places: Greek, Arabic, Latin, Sanskrit, etc., depending on where and when you were. The point is not just that mathematical symbols were universal, but that mathematical language allows you to think in a way that verbal languages are not designed for. You were able to translate that English equation into modern notation, but how long would it take you to solve it in English? Remember, you don’t know any special notation, so no cheating by imagining the symbols in your head and translating each step back and forth.

Killer, killer cite. Thanks.

Still need to think big picture using this knowledge…