Thus spake The Frantics on the subject: http://www.youtube.com/watch?v=fjFaKD9BuOc
I’m impressed with the Mayan system. Easy to use for a running count, visually represents the quantity, easy to add and subtract, uses positional placement, has the zero concept. Those Mayans were pretty smart.
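For what it’s worth, the positional part is simple enough to sketch in code. Here’s a minimal sketch in Python, assuming pure base-20 (the Mayan calendrical variant made the third position worth 18×20 rather than 400, which this ignores):

```python
# Sketch of Mayan-style positional notation: each digit 0-19 is written
# with dots (worth 1) and bars (worth 5), plus a shell glyph for zero.

def to_mayan(n: int) -> list[str]:
    """Base-20 digits of n, most significant first; '.' = dot, '-' = bar,
    '0' stands in for the shell (zero) glyph."""
    if n == 0:
        return ["0"]
    digits = []
    while n > 0:
        n, d = divmod(n, 20)
        bars, dots = divmod(d, 5)
        digits.append("-" * bars + "." * dots if d else "0")
    return digits[::-1]

print(to_mayan(429))  # 429 = 1*400 + 1*20 + 9 -> ['.', '.', '-....']
```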
Of course, the reason the number systems survived and were used was practicality - they worked.
In what way did anyone need a million or even 20,000 in Rome or Babylon? Almost never. Simply change the units to stay in the realm of the measurable and quantifiable. Riches were calculated in pounds (or whatever) of gold, silver, etc., not millions of “dollars”. What was the biggest army put to field (ignoring historians’ typical exaggerations)? How big a flock of sheep or camels could any one person manage? IIRC the tax collection process was more like a franchise pyramid than a complex accounting office, simply to avoid the record-keeping and accounting that a centralized system would require. Did anyone actually do quantitative math to the extent we do today, simply because we can…?
It’s instructive that the most practical early use for Arabic numerals, IIRC, was the accounting industry, where they had graduated to percentages and such - beyond the elementary “six-for-five” type of previous arrangements.
Yeah, but I’d hate to have to memorize the multiplication table!
I saw on a PBS documentary that there was one big advantage to Roman numerals, which slowed the acceptance of the current one.
I didn’t catch all the details, but evidently, using the Roman system, it was very easy for a marketplace money changer to convert between currencies in a way that the customer, who was not math-savvy, could see was being done according to the prevailing rate. Partly that was due just to everyone’s familiarity with the Roman system, but partly it was due to being able to break the numbers apart easily, without doing subtraction, and handle the exchange part by part.
The exchanger would push coins from his customer’s pile toward the workspace, then move a corresponding number of the other currency to the workspace. When the customer nodded, he’d scrape up the customer’s bit and the customer would pocket the new coins. This would proceed until the customer’s original pile was empty.
But that’s not using Roman numerals. That’s basically just barter. It’s not using any number system at all.
Up until our current number system was in use (and probably for a good while after), most calculations in Europe were not done using numerals at all. Instead the use of counting boards was widespread. This, as Wiki says, is a predecessor to the abacus.
Supposedly the Romans invented the abacus, but I’ve never seen a picture of a European abacus. The closest I’ve seen is a counting board that had grooves that counters were slid along. Never one with the counters on wires, as the Asian ones have (and I assume that true abaci have).
Tally notation, in which “IIIIIII” denotes 7, is sometimes called base-1. (Though this name annoys some people.)
Another notation, sometimes used as a Huffman-like code in compression (I don’t know if it has other uses), is usually called Fibonacci coding, or loosely “base-φ”, because the digits have the Fibonacci weights 1, 2, 3, 5, 8, 13, …, whose ratios approach φ. (It isn’t ancient, but that restriction has already been waived upthread.)
I = 1
IO = 2
IOO = 3
IOI = 4
IOOO = 5
IOOI = 6
IOIO = 7
The key feature of this representation is that ‘II’ (two consecutive I’s) never occurs. By reversing the digits and appending an ‘I’, a complete prefix code is obtained.
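Here’s a minimal sketch of that encoder in Python (writing 1 for ‘I’ and 0 for ‘O’); the terminating ‘11’ can’t occur anywhere else, which is what makes it a prefix code:

```python
# Fibonacci (Zeckendorf) coding: greedy representation over the weights
# 1, 2, 3, 5, 8, 13, ..., which never produces two consecutive 1s;
# reverse it and append a final 1 to get the prefix code.

def fib_encode(n: int) -> str:
    assert n >= 1
    fibs = [1, 2]
    while fibs[-1] < n:
        fibs.append(fibs[-1] + fibs[-2])
    bits = []
    for f in reversed(fibs):        # greedy, largest weight first
        if f <= n:
            bits.append("1")
            n -= f
        else:
            bits.append("0")
    return "".join(bits).lstrip("0")[::-1] + "1"

for i in range(1, 8):
    print(i, fib_encode(i))
# 1 11 / 2 011 / 3 0011 / 4 1011 / 5 00011 / 6 10011 / 7 01011
```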
Has no one mentioned the work of Denise Schmandt-Besserat?
Briefly, counters were used to represent crops and animals very early in the Mesopotamian Neolithic. Eleven clay cylinders might represent eleven sheep. These and other counters might be placed in a clay pouch as a record of debt or tax paid. A numeric notation to summarize the contents was engraved on the pouch so that it needn’t be opened to read the count. Eventually man realized that with that numeric notation, the actual counters were unnecessary!
Roman numerals, the old system (without the subtractive shorthand for 9), had the same advantage as Egyptian: you could write them down as you counted bunches, deliveries, whatever, and add them later.
VI III XI C I LVI CLXI X V II V III XXII I X XI II X I LXI
It doesn’t matter what order, and if you run out of space, you can write the digits in between where you left spaces.
Having tallied up your total, you could then divvy up the load to various destinations.
CI for you, VXXXXXXXIIIIIIII for you …
Just cross them out as you go.
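A minimal sketch of that tally-and-add workflow in Python, assuming the purely additive forms (no IV/IX shorthand):

```python
# Because position carries no meaning in additive Roman numerals, tallies
# jotted in any order can be pooled and totalled later.

VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def tally_value(tallies: str) -> int:
    """Sum the symbol values; order and spacing are irrelevant."""
    return sum(VALUES[ch] for ch in tallies if ch in VALUES)

def to_additive_roman(n: int) -> str:
    """Greedy additive form, no subtractive shorthand."""
    out = []
    for sym, val in sorted(VALUES.items(), key=lambda kv: -kv[1]):
        count, n = divmod(n, val)
        out.append(sym * count)
    return "".join(out)

jottings = "VI III XI C I LVI CLXI X V II V III XXII I X XI II X I LXI"
total = tally_value(jottings)
print(total, "=", to_additive_roman(total))  # 481 = CCCCLXXXI
```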
If you’re scribing various amounts in a rush with positional digits, a mistake in the gaps between digits leads to errors: you don’t know if a ‘3’ represents 3, 30 or 300.
In modern times we might have pre-printed forms, but the idea was that a lump of clay was rather hard to prepare… you squashed it down flat and started recording tallies, and the pressure was on you to do this as quickly as you could.
I am sure someone thought of using binary before modern times.
Its advantage is that a human can count to 1023 on his fingers!
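A quick sketch of the finger-binary idea, one bit per finger:

```python
# Ten fingers, one bit each, covers 0 through 2**10 - 1 = 1023.

def fingers(n: int) -> str:
    """Render n (0..1023) as ten finger-bits, 1 = finger up."""
    assert 0 <= n < 2**10
    return format(n, "010b")

print(fingers(1023))  # '1111111111' -- all ten fingers up
print(fingers(132))   # '0010000100'
```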
Depends on the context. It’s true that people in ancient civilizations were unlikely to need to count or denote a million physical objects, but they did use such numbers in more abstract contexts. For example, the astronomical work of the first-millennium BCE Babylonians computed planetary period relations with notional cycles of thousands or tens of thousands of years, and consequently millions of days.
Wrong. Ancient Romans did refer to the fortunes of some wealthy people as being on the order of a million or even a hundred million sesterces.
Your idea of “primitive” ancient societies dealing only with tangible and consequently not very numerous objects is way too simplistic.
If they referred to them in millions, they had the concept and the notation would logically follow.
It sounds more like “he has 100 pounds of gold, each pound is 40 aurei, so he has 4,000 aurei; each would be 25 silver denarii, so he has as much as 100,000 silver denarii, or 400,000 sesterces.”
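That chain is easy to check; here’s a quick sketch using the rates quoted above (40 aurei to the pound of gold, 25 denarii to the aureus, 4 sesterces to the denarius):

```python
# The unit-changing trick: each step is a small multiplication, so nobody
# ever has to write a big number until the very end.

pounds_of_gold = 100
aurei     = pounds_of_gold * 40   # 4,000
denarii   = aurei * 25            # 100,000
sesterces = denarii * 4           # 400,000
print(aurei, denarii, sesterces)  # 4000 100000 400000
```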
I’m not saying the more astute educated and/or monied classes were incapable of this, I just suggest that in real life such math was from rare to irrelevant for 99% of the people.
Take as an example the pounds-shillings-pence system of England. Stuff that was low-priced was denominated in pence, ha’pence, etc. More expensive? Price it in shillings and pence. Even more expensive, use pounds. The average Joe on the street never had to grapple with multiples of 100, etc. in most real-life situations (there’s a sketch of this below). For the guy in the market, a system that counts to twelve probably works most of the time. Add a character for 10, and he’s set for life.
Think of it the same way we work with cash. Almost nobody walks around with 150 ones or fives in their pocket - we have a few 1’s, 5’s, 10’s, and 20’s; maybe a hundred or two. Very analogous to counting with Roman numerals…
Of course, when we get to the point where accountants for kings have to figure sums at the level of fielding armies or building and outfitting ships, and account for money to the penny, with sums hitting the tens of thousands or higher, then suddenly Arabic notation and decimal position becomes an incredible asset.
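To make the pounds-shillings-pence point above concrete, here’s a minimal sketch of the unit split (12 pence to the shilling, 20 shillings to the pound):

```python
# Everyday sums stay small within each unit; only the bookkeeper at the
# top ever deals with large counts of the big unit.

def to_lsd(pence: int) -> tuple[int, int, int]:
    """Split a quantity of pence into (pounds, shillings, pence)."""
    shillings, d = divmod(pence, 12)
    pounds, s = divmod(shillings, 20)
    return pounds, s, d

print(to_lsd(30))    # (0, 2, 6)  -- "two and six"
print(to_lsd(1000))  # (4, 3, 4)  -- 4 pounds, 3 shillings, 4 pence
```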
Jeez, I would disagree. I much prefer doing base-10 calculations in my head, but then I have been brought up on a purely decimal system. I just did the calcs in my head and it was pretty damn easy. My father was a woodworker, and although he grew up in pre-decimal times, he used decimals for all calculations.
So just because my culture or yours adopts one or the other does not make one superior, although I would say that for working on paper, decimal is a lot easier.
md2000, if the last paragraph holds, then would not such numerals (which already existed in a culture in contact with the Romans) be more useful when the continent was under an emperor rather than kings? The empire had more population, and fielded larger armies and navies. Even before the imperial age we find such large numbers relatively commonly. Caesar famously quipped that he needed 25 million sesterces just to own nothing. When Pompey made kings his clients and set up tax systems in the east, he needed paperwork on dozens of kings, each making contributions of millions of sesterces. And his conquests were far from unique, what with Rome having just trod over Africa, Greece, Illyria, all with populations in the millions, all needing accounting for. Not to mention calculating daily grain shipments in a city of one million, and an empire of 50.
Maybe it’s not the ‘average Joe’, but using numbers in the tens of thousands and higher was common in Europe a thousand years before the introduction of Arabic notation to the continent. Then, I’d argue, your ‘accountants for kings’ are probably just as removed from the ‘average Joe’.
I’m thinking that the taxation systems were farmed out and numbers were rounded.
The issue is not whether they could count that high - obviously they could, if the concept and word “million” existed in some way. The issue was whether they needed that level of precision in everyday activity.
For example, even back when people were paid, what, $10 a week or 10 shillings a week, the smallest change was at least a ha’penny or a quarter-penny. That would be like having nothing smaller than a quarter or a 50-cent piece (or even a dollar?) today.
Canada just discontinued the penny, so cash transactions round to the nearest nickel, but any electronic transactions still count the individual cents.
The impetus for decimal digits and fancy accounting is driven by the need to account for precise numbers, not the size of those numbers. If we’re talking round millions, that can be handled with the same systems as units, but with a special indicator “these are millions”.
Caesar may have tossed around 25,000,000, but did he toss around “I have 25,632,095.5 sesterces and I am expecting Brutus to pay me 452,685 more by the Ides of March”?
So to re-cast the OP’s question: it’s not the size that matters, it’s the precision. Adding 1 million to 25 million is trivial; adding 8-digit numbers, less so.
Then there’s the whole discussion to be had over decimal fractions. Once Arabic numerals arrive, decimal parts become familiar (perhaps as a side effect of pennies-to-dollars type calculations). Then you start to get more esoteric calculations like complex interest rates, log and trig tables, etc.
18 pieces means you have 17 gaps, not 16, so now you have 4 1/4 in waste and 55 3/4 to divide by 18 …
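Those figures fit a 60-inch board and a 1/4-inch saw kerf (my inference from the numbers; the post doesn’t state them). A quick check with exact fractions:

```python
from fractions import Fraction

board  = Fraction(60)      # assumed 60-inch board
kerf   = Fraction(1, 4)    # assumed 1/4-inch kerf per cut
pieces = 18

waste  = (pieces - 1) * kerf           # 17 gaps, not 16 -> 4 1/4
usable = board - waste                 # 55 3/4
print(waste, usable, usable / pieces)  # 17/4 223/4 223/72
```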
And I’m amazed that all woodworkers are supposedly so proficient in more than one positional numbering system that the number of even fractions matters more to them than familiarity and habit.
The exchange rate was specified in Roman numerals, and that made the process easier, according to the PBS documentary. This was an example of an impediment to the adoption of the new number system.
I was reading about the Venice money exchangers. The article said that Roman numerals were mandated so that customers would trust the transaction; the money changers wanted to use other numerals, but they were prohibited from doing so. I couldn’t find anything which said that Roman numerals made the transaction easier.