Accepting a numerical symbol to indicate zero was (and still is) a conceptual obstacle (“hey, it’s not a number, it’s nothing, so why should it be a number?”). It’s necessary for place-value systems to work; otherwise you’re stuck with a single string of consecutive symbols, as with Roman numerals. Those are easy to add and subtract, but try anything like division, let alone algebra, and you’re screwed.
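To make the place-value point concrete, here’s a minimal sketch in Python (the function digits_base10 is just my own illustration, obviously nothing the Romans had): decomposing a number into positional digits only works because one of those digits can mark an empty place.

def digits_base10(n):
    # Decompose n into its base-10 digits, most significant first.
    out = []
    while n > 0:
        out.append(n % 10)
        n //= 10
    return out[::-1] or [0]

print(digits_base10(205))  # [2, 0, 5] -- the 0 is all that keeps 205 from reading as 25
print(digits_base10(25))   # [2, 5]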
I understand not having the concept of zero as a number, but I have no clue how you can live without 0 as a numeral. Did Romans not do simple math problems?
You know (using whatever proper arithmetic notation they would use for this):
V + VI = XI
IV + (II + IV) = X
X + (XX - XV) = XV
II * (X + V) - III * (I + IV) = XV
II * (X + V) - III * X = … per impossibile?
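For what it’s worth, you can see that dead end in code too. Here’s a rough sketch of the usual subtractive Roman-numeral scheme (to_roman is just my own toy function, not anything historical): the value table has no entry for 0, so the “impossible” line above simply falls through and produces no numeral at all.

ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    # Greedily spend the largest values first; note that nothing handles n == 0.
    out = []
    for value, glyph in ROMAN:
        while n >= value:
            out.append(glyph)
            n -= value
    return "".join(out)

print(to_roman(2 * (10 + 5) - 3 * (1 + 4)))   # XV
print(repr(to_roman(2 * (10 + 5) - 3 * 10)))  # '' -- zero leaves you with nothing to write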
No no, I’m perfectly aware of the limitation. I’m just asking how they wrote down the answer if the answer was zero. I guess the analogy I’m trying to make is this:
Now, 1/0 is not really a number in our system, so we term it undefined (in the context of arithmetic; let’s just pretend we don’t have limits, asymptotes, calculus, or whatever).
The Romans must have done simple arithmetic using their system, in which zero is not only not a numeral but doesn’t even exist as a numeric concept.
V - I = IV, V - II = III, V - III = II, V - IV = I, V - V = ___________
What would a Roman accountant, mathematician, student, whatever, put in the blank? Did they use some word like “nothing”, or was it more akin to our “undefined”?
In the Academy Award winning short, Why Man Creates, (http://www.pyramidmedia.com/item.php3?title_id=1312) there’s a wonderful beginning cartoon version of the history of civilization. At one point, a window opens up in a minaret and shows a man working at a desk who says, “Allah be praised! I’ve invented the zero!” And another man in the room says, “What?” To which the first man replies, “Nothing. Nothing.” And the window closes. Brilliant.
Without arithmetic, you would not think of asking how much nothing is. At the very least it’s not a trivial question - does it even make sense to ask how much nothing is? It requires accepting nothing as a cardinal number. You can see what one apple is, but not zero apples; having no apples just leaves you hungry. For most of us numbers exist outside of space, but when numbers are strongly associated with counting and simple mathematics, it seems strange to treat nothing as a quantity. Saying nothing plus nothing equals nothing, although natural to us, is a level of abstraction higher than simple counting like one plus one equals two.
Someone correct me if I’m wrong, but it’s my impression that no one tried to write equations the way we do until the invention of algebra in the Middle Ages. They understood the concept of operating on a number with addition, multiplication, squaring, etc. They understood the concept of two numbers being equal. They even understood particular kinds of equations, like quadratic equations. They did all this, though, by writing the equation out in ordinary words. It didn’t occur to them that this could be done much more easily with special symbols until after the time of the ancient Romans. Without the idea of writing equations with special symbols, it’s not as easy to see the use of a symbol for zero.
It’s kind of interesting that “none”, one, two, “a few”, and “many” seem to be hard-wired into our brains. Because of this, the concept of zero and a word (though not necessarily a symbol) representing it have always appeared very early in any human culture’s development. Numbering systems come much later than zero, because our brains aren’t hard-wired to understand numbers.
DarrenS guessed correctly. The Romans did indeed use the word “nulla” to represent zero. Even though the Romans didn’t have a symbol for zero, they had a word for it. Sometimes it was abbreviated to just an “n”, but when you’re mostly counting things you don’t really need a zero, so it didn’t show up very often. It’s not like Roman teenagers were being forced to learn algebra in Roman high schools. Things were different back then. If you weren’t counting money or trying to figure out a holiday or a phase of the moon, you didn’t really need math much. And if you’re counting out coins, “three, two, one, I don’t got no more” works just as well as having a specific symbol for zero.
A lot of mathematics and science developed during Roman times and under Roman jurisdiction was actually written in Greek. It used the standard Greek alphanumeric notation, which was not place-value so it didn’t require a zero digit. But it sometimes included a zero symbol as a placeholder in sexagesimal place-value numbers. This symbol is thought to have been derived from a somewhat similar one in cuneiform Late Babylonian astronomical texts; its form was possibly influenced by its resemblance to the Greek letter omicron, the initial letter of Greek ouden, “nothing”, as in the Latin use of “n” for nulla.
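As a rough illustration of why an additive alphabetic system gets by without a zero digit, here’s a little Python sketch (greek_numeral and its layout are my own toy reconstruction, assuming values from 1 to 999): each of 1–9, 10–90 and 100–900 has its own letter, so an “empty” tens or hundreds place simply contributes no letter, and no placeholder is ever needed.

ONES = " αβγδεϛζηθ"      # 1-9 (ϛ = stigma, used for 6)
TENS = " ικλμνξοπϟ"      # 10-90 (ϟ = koppa, used for 90)
HUNDREDS = " ρστυφχψωϡ"  # 100-900 (ϡ = sampi, used for 900)

def greek_numeral(n):
    # Additive, not positional: write the hundreds letter, then tens, then ones.
    h, t, o = n // 100, (n // 10) % 10, n % 10
    return (HUNDREDS[h] if h else "") + (TENS[t] if t else "") + (ONES[o] if o else "")

print(greek_numeral(205))  # σε -- 200 + 5, no zero placeholder required
print(greek_numeral(61))   # ξα -- 60 + 1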
Nobody in the western world, true (if you don’t count the syncopated quasi-algebraic notation of Diophantos). Indian mathematics in Sanskrit from about the middle of the first millennium onward had a notation to represent equations including various powers of unknowns, which was actually somewhat closer to modern symbolic algebra than Hellenistic or medieval Arabic notation systems, and which of course used decimal place-value numerals including zero. However, the Indian algebraic notation doesn’t seem to have directly influenced the development of early modern European algebra.
Just wondering…did anyone bother to “play clip”? You’re missing something truly spectacular. 35 years old and as up to date as if it were made this morning.