In what way is math distinct from notation?

I don’t think Sage Rat disagrees with you; after all, his position is that higher-level programming allows for many advances, changes the ballgame, etc. When he says “CPU instructions that you could have written out by hand”, he doesn’t mean anyone actually would sit down and crank them out manually; he uses that phrase only in order to grant that machine code and higher-level code are both just strings of bytes. But his overarching point is in agreement with you: that though the latter is, in some sense, mechanically translatable into the former, the equivalence this appears to provide is only illusory, as the conceptual space the two live in is generally very different (e.g., the semantic universe of the one includes functions and objects, while the other is focused on things like jumps and registers).

No, I’m a physicist. I use whatever tools are easiest for the job, and sometimes but not always those tools are mathematical. And when I use nonmathematical tools, I freely admit that they’re nonmathematical.

I think the surprising thing isn’t your last sentence in post #15, so much, but your second one (“But without the notation, they’re not mathematics.”). What say you to Ruminator’s example?

(Myself, I might say that the language Ruminator’s measurer uses to describe his process, whatever it is, is the “notation” with which we are concerned, regardless of whether it takes the particular form of glyphs on paper. [And if the “language” is nothing but carrying out the activity itself, then so be it. But at that point, the idea of separating “notation” from the rest of it becomes somewhat incoherent…])

In Ruminator’s example, what he’s doing while he’s actually measuring the tree isn’t mathematics, but what he did before that to realize that that method of measuring the tree would work may well have been. Or he might have used non-mathematical methods for that, too… I don’t know.

Sure, but your claim that mathematics is nothing but a series of rules for manipulating notation sounds a lot like Hilbert’s philosophy of formalism (though I’m no philosopher).

I’ve always had this impression that notation was a big deal in math, as opposed to science, say, where the technical language and precise definitions are obviously an aid to deeper understanding rather than the main point. But I’ve never really thought it through, and reading the responses so far will certainly help with that.

I didn’t want to imply that I think math is nothing more than notation. What sparked this thread was Capt. Ridley’s Shooting Party’s comment in the other thread that “Mathematics isn’t notation”.

Calculus is one example of this. Archimedes and Eudoxus pretty much invented integral calculus around 300 BC, but it took 2000 years, plus decimal and algebraic notation, for mathematicians to begin building on their work. It also took a more sophisticated concept of infinity, and I don’t know if we can call that “notation” by any stretch.

Or how about complex numbers? The entire field is the consequence of pushing our notation far outside of the bounds of what was previously considered reasonable. Or non-Euclidean geometry, where it was noticed that one of the axioms seemed superfluous, but changes in that axiom ended up changing the entire structure of the system.
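To make the notation-pushing concrete (this is the usual textbook move, not a historical reconstruction): keep applying the ordinary rules of algebra to a symbol whose square is -1, and the arithmetic stays consistent:

```latex
x^2 + 1 = 0 \;\Rightarrow\; x = \pm\sqrt{-1}; \qquad
\text{write } i \text{ for } \sqrt{-1}, \text{ so } i^2 = -1, \text{ and then}\qquad
(a + bi)(c + di) = (ac - bd) + (ad + bc)i .
```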

Notation doesn’t have to be written. I would argue this tree-measurer is actually manipulating symbols in his head. Maybe even word symbols – “my shadow’s length/my height = tree shadow length/tree height” and solve from there. It is a crappy notation and therefore not so much math as measurement. It really becomes math when we introduce a more general notation, with x’s and such, so that we can apply this type of measurement to a whole class of objects.
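To see the jump to the more general notation concretely, here is the same proportion with x’s and made-up numbers (a 1.8 m person casting a 2.4 m shadow next to a 12 m tree shadow):

```latex
\frac{s_{\text{me}}}{h_{\text{me}}} = \frac{s_{\text{tree}}}{x}
\quad\Longrightarrow\quad
x = s_{\text{tree}} \cdot \frac{h_{\text{me}}}{s_{\text{me}}}
  = 12\,\mathrm{m} \cdot \frac{1.8\,\mathrm{m}}{2.4\,\mathrm{m}}
  = 9\,\mathrm{m}.
```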
Ultimately, I don’t think math leads us closer to some deep Platonic knowledge; rather, it’s a symbolic game that incidentally happens to have amazing and powerful applications to the real world. It is this ‘incidental’ power that really wows me about math. I can’t see any reason that math should have this power. Math is pretty much the most mystical thing I believe in; it is almost a religion.

The impression I’ve gotten since I’ve been old enough to think about it (although I’m finding it hard to explain why) is that math is just a mental game invented thousands of years ago by early civilization. Very much like religion. Except math works. I see no reason why this should be so, but it is one of the most amazing things I can think of.

Well, first off, you might want to review the definition of mathematical notation.

Secondly, if you review the history of calculus, you will see that progress was being made in several parts of the world using different notations. It was not notation that led to calculus; rather, it was the advancement of the underlying concepts. The same could be said of complex analysis.

Thirdly, while axioms may be written with specific notation, the axioms actually represent abstract concepts. In the case of non-Euclidean geometry it is the concept of parallel lines that is being altered.

Because you have set it up as an accident, like most formalists. Mathematical behavior came before the notation, much like spoken words came before literacy. Formalism is like Jabberwocky, IMO.

You are welcome to think of it as a symbol-manipulation game. I think it is very instructive to do so. Some of the most interesting things about math, IMO, come from this perspective. But if you forget why we play it, the sense behind the symbols, then it would seem mystical, when in fact it’s the most ordinary thing in the world. People balance their checkbooks every day! (Not me, of course, I’m above all that. But I hear it said…)

These days higher level code might actually be less misleading, since machine language instructions will often be executed out of order on a modern processor.

Higher-level concepts change the ballgame because most of us get lost in the lower-level jungle. Von Neumann was famously against floating point because he could keep track of what was going on at that level himself, without hardware assistance. He was against assemblers for the same reason.

So here is a question: would a theorem-proving program work better or worse using more sophisticated notation as primitives? Would it work better if it could invent notation?
Back when I paid attention to this stuff I thought there was a base set of primitives that could be manipulated, and things at a higher level were broken down into them - but that was a long time ago, and I don’t know what the state of the art is any more.

I can see arguing for complex numbers to have arisen through something akin to an abuse of notation. But how is modifying the axioms of geometry an example of notation?

Yes, I believe that math is formalism, but I wouldn’t call myself a formalist, for the same reason I don’t call myself a mathematician.

There’s a selection effect going on there. Mathematics can go down all sorts of different paths, and in fact does. But it’s only the paths which happen to correspond to the real world which get all of the attention. As an example, the group of the integers modulo 12 under addition is familiar to elementary school students (even if they don’t know it by that name), because that’s the basis of our timekeeping system. But so far as I know, nobody has ever put the group of the integers modulo 1000007 under multiplication to any use. A mathematician will tell you that those are both perfectly valid groups, but one gets a lot more use than the other.

(Thoroughly pedantic nitpick: integer multiplication modulo 1000007 doesn’t form a group, merely a monoid; 0, 29, and 34483 lack multiplicative inverses.)

Huh? Isn’t 1/29 the multiplicative inverse of 29, and 1/34483 the m.i. of 34483?

Think integer arithmetic modulo n. The non-zero elements which are relatively prime to n form a group under multiplication. Since 1000007 = 29 * 34483, neither 29 nor 34483 is relatively prime to the modulus, so they don’t have multiplicative inverses.
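To put numbers on that, here is a quick sketch (assuming Python 3.8+ for `pow(a, -1, n)`; the specific values are only illustrative):

```python
from math import gcd

n = 1000007
assert n == 29 * 34483          # so n is not prime

# a has a multiplicative inverse mod n exactly when gcd(a, n) == 1
units = [a for a in range(1, n) if gcd(a, n) == 1]
print(len(units))               # 965496 = 28 * 34482, i.e. phi(n); not n - 1,
                                # so the nonzero residues alone are not a group

print(pow(3, -1, n))            # 333336, since 3 * 333336 = 1000008 = 1 (mod n)

try:
    pow(29, -1, n)              # 29 shares the factor 29 with n...
except ValueError:
    print("29 has no inverse mod", n)   # ...so Python raises ValueError
```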

modular multiplicative inverse

Yes, the most sophisticated theorem proving environments today (specifically Isabelle) use intuitionistic higher-order logic as a meta-logic, and define all object logics within this framework. In effect, you have a formal proof that looks a lot like an informal mathematical proof, as you’d find in a text book, which is just syntactic sugar for a simpler underlying representation. There’s no utility in moving to more expressive meta-logics than HOL: higher-order unification is semi-decidable, but Huet’s algorithm works well in practice (anything more complex and you’re at risk of losing this), and nearly all mathematics can be encoded within HOL.
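As a toy illustration of the “syntactic sugar for a simpler underlying representation” point, here is a sketch in Lean 4 (not Isabelle, chosen only because it is compact): the tactic-style proof is surface syntax, and `#print` shows the primitive proof term the kernel actually checks.

```lean
-- A readable, tactic-style proof (surface syntax / "sugar"):
theorem add_comm' (a b : Nat) : a + b = b + a := by
  exact Nat.add_comm a b

-- Show the elaborated proof term underneath (roughly `fun a b => Nat.add_comm a b`),
-- which is what the kernel verifies:
#print add_comm'
```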

Huh, I had it stuck in the memory banks that a million and seven was a prime. I guess I’ll need to find some other generic large prime to pull out in situations like this.

You’d need to say something like “non-zero” at some point. Though, people do take things like “the multiplicative group of integers modulo n” to implicitly mean “the multiplicative group of those integers which have inverses modulo n”, so my pedantry was, and is, rather unfair. [Does terminology count as notation?]

On the bright side, a million and three actually is prime and is probably what you had originally put in the memory banks. You’ve just earned some interest over time. :)

That, or I was thinking of some other power-of-ten plus 7. But thanks, a million and three will work fine.