Math Semantics: Are negative, subtract and minus the same thing?

There’s a lot to unpack in the definition you gave. “Real number” itself is nonsense unless given a definition.

As a jumping-off point for further exploration of number systems, I don’t see anything wrong. The additive inverse corresponds to the negative for the real numbers, but it may not in other number systems (7 is the additive inverse of 3 in the integers modulo 10), so it’s nice to treat it as a general concept. It also offers a parallel with the multiplicative inverse, which might seem unrelated at first glance but has some similarities (and differences).
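To make that parallel concrete, here’s a small worked example (my own, not part of the original definition) in the integers modulo 10, where 7 happens to be both the additive and the multiplicative inverse of 3:

```latex
% Additive inverse: the element that sums to 0
3 + 7 \equiv 10 \equiv 0 \pmod{10} \quad\Longrightarrow\quad -3 \equiv 7 \pmod{10}
% Multiplicative inverse: the element that multiplies to 1
3 \cdot 7 \equiv 21 \equiv 1 \pmod{10} \quad\Longrightarrow\quad 3^{-1} \equiv 7 \pmod{10}
```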

If your definition were given without any further elaboration, I’d say it was a pretty terrible one. But don’t students get introduced to negative numbers in elementary school, and to algebra only in junior high or high school? I don’t think I’d expect them to have a clear understanding of the real numbers, but they should know fractions and negatives, and maybe have had some limited exposure to irrationals like pi.

Yeah. What I meant was that the advent of computers in the 1950s munged together a bunch of distinct symbols-for-ideas to force-fit them into the then-limited character sets you explain so well. FYI, back in Ye Olden Tymes I earned my living writing both FORTRAN and COBOL at different jobs, on paper, transcribed via an IBM 029 keypunch onto punch cards. Oh, the pain.

Since roughly the mid-90s, computers have gradually gained the ability to handle (e.g., Unicode) and display (high-resolution screens and printers, LaTeX, various math-specific markup languages, etc.) the full panoply of anything that was ever doable in traditional book printing.

But somewhere between then and now, much of the human user audience pretty well lost those fine distinctions. Em-dash, subtract, negate, hyphenate? It’s all the same idea because it’s all typed with the same key on the keyboard, the one next to the “=”.

Me, I still like APL although it’s been years since I did any. Plenty of dedicated symbols for everything.

Perhaps the minus sign is overloaded in the sense that it is used for both a unary and a binary operation, but that usage antedates computers by centuries. Also, people in this thread are saying that students are not confused by this. Do you suggest changing the notation, and, if so, to what?
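For what it’s worth, programming languages live with the same overloading and resolve it from context, just as readers do. Here’s a minimal Python sketch (my own illustration, not anything from the thread) showing that both the parser and the object model keep unary negation and binary subtraction distinct even though they share a key:

```python
import ast

# The same "-" character parses into two different operators.
unary = ast.parse("-x", mode="eval").body      # UnaryOp with op USub
binary = ast.parse("a - b", mode="eval").body  # BinOp with op Sub
print(type(unary.op).__name__, type(binary.op).__name__)  # USub Sub

# Operator overloading keeps the distinction as well: unary "-" dispatches
# to __neg__, binary "-" to __sub__.
class Quantity:
    def __init__(self, value):
        self.value = value
    def __neg__(self):
        return Quantity(-self.value)
    def __sub__(self, other):
        return Quantity(self.value - other.value)

q = Quantity(3)
print((-q).value)               # -3  (negation)
print((q - Quantity(1)).value)  # 2   (subtraction)
```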

Eh, even before computers, those symbols all looked nearly identical. How many people actually knew they were even different?

Heck, a lot of old mechanical typewriters didn’t even bother having a key for 1, because they already had the lowercase letter l, which looked almost exactly the same.

If anything, computers have enhanced, or even created, the distinction between those characters, not erased them, because a computer (with the proper character set) can distinguish between them, even when a human can’t.

Well, sure, if you check, the hyphen, minus, and dashes of different lengths that show up on your screen are each represented by a different character code. And those distinctions predate computers as well.
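If anyone wants to see those character codes spelled out, a quick check with Python’s standard unicodedata module (again, my own sketch, not from the thread) lists the look-alikes being discussed:

```python
import unicodedata

# Visually similar horizontal strokes, each with its own code point and name.
for ch in ["-", "\u2010", "\u2212", "\u2013", "\u2014"]:
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")

# Output:
# U+002D  HYPHEN-MINUS
# U+2010  HYPHEN
# U+2212  MINUS SIGN
# U+2013  EN DASH
# U+2014  EM DASH
```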