Is there a field in math that uses something other than the natural numbers (and zero) for its subscript space? Is there a “theory of subscripts”?
What do you mean by “subscript space”? Mathematicians can represent a set as elements with any subscript. The natural numbers are preferable for all sets (except uncountably infinite ones) because people are used to dealing with them.
A subscript doesn’t mean anything mathematically; its only purpose is convenience. Subscripts allow you to write symbols with multiple characters without making them look like they’re multiplied together.
Unlike exponents (superscripts), they don’t denote an operation.
Yeah, subscripts can be pretty much anything you like, and it’s not at all uncommon in math to see things other than natural numbers as subscripts.
For example, with uncountable sets that you want to index with subscripts, you might see the subscripts come from an arbitrary ordinal rather than from the natural numbers. (An ordinal can be thought of as an extension of the natural numbers that is still well-ordered: every nonempty subset has a smallest element.)
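To make that concrete, here’s a minimal LaTeX sketch (the names X and x are placeholders of mine, and ω₁, the first uncountable ordinal, is just one possible choice of index set):

[code]
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% A transfinite sequence: the subscripts run through every ordinal
% below \omega_1 (the first uncountable ordinal), so the index set
% is uncountable but still well-ordered.
\[
  (x_\alpha)_{\alpha < \omega_1}
\]
\end{document}
[/code]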
Another example that is not uncommon: say you have some set/space X, and you want to look at all of the bounded, real-valued functions whose domain is X. One of these functions, call it f, maps X into some real number interval, call it I. You then might want to talk about all of those intervals that have been mapped into.
A nice way to keep these intervals distinct is to write a given interval as I[sub]f[/sub]; you could interpret this notation as meaning, “This is the particular interval that this particular function f maps into”, i.e., you label the intervals by their corresponding functions.
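In LaTeX that family might be written like so (calling the set of bounded real-valued functions on X “B(X)” is my label, not anything from the post above):

[code]
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Each bounded function f : X -> R gets its own interval I_f
% containing its image, so the intervals are indexed by the
% functions themselves rather than by numbers.
\[
  \{\, I_f \,\}_{f \in B(X)},
  \qquad f(X) \subseteq I_f \subset \mathbb{R}.
\]
\end{document}
[/code]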
Basically, a subscript is a convenient notation for a map from one set to another. There are no restrictions on the domain, although N is by far the most common in basic mathematics.
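Spelled out, that “map” point of view is the standard set-theoretic definition of an indexed family; in LaTeX:

[code]
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% An indexed family (x_i)_{i in I} is just a function x from the
% index set I to some set X, with x_i shorthand for x(i).
\[
  x \colon I \to X,
  \qquad x_i := x(i),
  \qquad (x_i)_{i \in I}.
\]
\end{document}
[/code]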
I don’t think there’s any specific theory of subscripts; rather, it just falls under set theory. The normal term for the set of subscripts is the index set.
For a very simple example, one often encounters negative integers (or other numbers) as indices, as well. For instance, if I have an infinite line of objects which I want to label, and I’m standing next to the line, I might label the nearest object to myself as m[sub]0[/sub], the next ones to that in one direction as m[sub]1[/sub], m[sub]2[/sub], and so on, and the ones in the other direction as m[sub]-1[/sub], m[sub]-2[/sub], etc. There you go; indices which are not restricted to the natural numbers.
Or an example where the subscripts aren’t numbers at all: If I’m talking about objects in the Solar System, I might refer to M[sub]Sun[/sub], M[sub]Jupiter[/sub], etc.
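Both of those examples fit the “map from an index set” picture; a quick LaTeX sketch (the name P for the set of body names is my invention for illustration):

[code]
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% A line of objects indexed by the integers rather than by N:
\[
  (m_n)_{n \in \mathbb{Z}}
\]
% Masses indexed by a finite set of names rather than by numbers:
\[
  \{\, M_p \,\}_{p \in P},
  \qquad P = \{\mathrm{Sun}, \mathrm{Mercury}, \dots, \mathrm{Neptune}\}.
\]
\end{document}
[/code]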
One can even have indices which themselves have indices; I’ve encountered this a time or two in general relativity.
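I can’t speak to the relativity usage, but a familiar everyday case of indices carrying their own indices is subsequence notation from analysis (standard usage, not anything from the post above):

[code]
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% A subsequence of (x_n): the subscript n_k is itself subscripted,
% where k -> n_k is a strictly increasing map from N to N.
\[
  (x_{n_k})_{k \in \mathbb{N}},
  \qquad n_1 < n_2 < n_3 < \cdots
\]
\end{document}
[/code]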
Halfway relevant rant-
Perhaps the biggest pain I’ve encountered so far in mathematics is that anyone can make damn near anything mean damn near whatever they like. As Chronos said, you could make M[sub]moon[/sub] refer to our satellite, or to a fraction of the area of a circle based on the current phase. Natural numbers are used pretty much because they’re easy to make sense of in most situations, but there’s no rule that says you can’t define a notation where complex numbers or words have meaning.
In most college classes, the book, the professor, & (if applicable) the TA will have three different notations without much overlap.
This only becomes dangerous when they start stepping on each other’s toes: overlapping on terms but not meaning. For instance, in one of my current classes, the symbol the book uses for a vector meant “Real Numbers” everywhere else I’ve seen it. The professor uses it for the rank (the dimension of the column space). It’s annoying when the professor goes over a problem, & his exact answer on homework (graded by a grad student we never meet) gets no points because it’s “meaningless.”
Even worse, Pi occasionally stands for “an undefined basis with such & such qualities.” There is no mention of when it’s an irrational number & when it’s a basis.
[/HWRant]
(a) Mathematicians have to say a lot and there are only so many symbols. We’re not going to start to use Zapf Dingbats (which are really hard to draw in lecture) just because you can’t keep up.
(b) Mathematicians are (usually) very clear on what their notation is in a given situation. Most papers are riddled with “let ____ be _____”. You say one class uses a symbol for vectors that others use to denote the real numbers (what the hell symbol are you talking about, by the way?). So they’ve done their job. In that class, this symbol means this thing rather than that thing. Again, there are too many things to talk about not to reuse notation.
(c) In the few cases where notation is not explicitly mentioned, it’s intended to be clear from context. Are you dividing pi by 2? Then it’s not a bleeding basis, now, is it?
There are as many symbols as you could possibly want, but only a limited number beneath a certain level of simplicity. You have to strike a balance between maintaining simplicity and avoiding duplicate meanings. In my humble opinion, mathematicians have pushed too far towards simple symbols with overlap, and need to start moving in the opposite direction. For example, consider how many different things are denoted by a vertical line: the number one, divisibility, the mid bar in set notation, absolute value, evaluation of integrals, etc. Certainly I’ve seen tons of cases where confusion arose out of jumbled notation, and time was wasted clearing it up. And there’s no telling how many students get confused by notation but fail to speak up about it.
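For what it’s worth, here are those vertical-bar uses side by side in LaTeX (each is a standard meaning; the letters are placeholders):

[code]
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Several standard meanings of one and the same vertical stroke:
\[
  a \mid b                        % divisibility: a divides b
  \qquad \{\, x \mid x > 0 \,\}   % "such that" in set-builder notation
  \qquad |x|                      % absolute value
  \qquad F(x)\big|_{a}^{b}        % evaluation, e.g. of an antiderivative
  \qquad |A|                      % cardinality (or determinant)
\]
\end{document}
[/code]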
Thank you, ITR, for explaining my point more clearly.
I called it a halfway relevant rant for a reason- it’s merely annoying.
The one damn near everybody uses for the reals: it looks like |R; ℝ is the closest I can get here. This would be petty if not for the last bit-
The grader can’t be bothered to learn the professor’s notation, and nobody in the class interacts with the grader. The professor has better things to do than grade homework. No matter; that’s why exams are weighted.
Absolutely clear from context. But, I’m still a naïve young babe who has hope for some things to be sacred. And being shocked by this big & scary new idea makes my brain imagine it going one step further.
If someone decides to use the same symbol as a vector/matrix with such & such properties (or any number other than 3.14…), and interchanges freely without mentioning it, then dividing by 2 (or multiplying by a vector) could be less than clear.
Are you a professional mathematician? Then write your own papers with your own notation. Go on, I dare you.
Absolute value lines (and sometimes determinants) occur in pairs around a semantic group and extend to the height of the group. It’s hard to miss these.
The “mid bar” also shows up in presentations of algebraic structures by generators and relations, and is (again) very clear in context.
The divisibility relation is the only one here that shows up on its own.
Now, if this is a lecture at a board then there’s a chance that sloppy writing can make these indistinct. However, that’s why writing on the board is a supplement to a spoken lecture. If the instructor says “such that” when describing a set and the inattentive student reads half of an absolute-value pair (incidentally disregarding the fact that this makes no semantic sense) then it’s hardly a fault of the notation. If the instructor doesn’t lecture along with the writing, then it’s a pedagogical fault and – yet again – not one of the notation.
Blackboard bold. I really want to see now what text this is that uses that for a vector. Are you certain that it’s not meant to describe a given vector space?
I would say then that this is a problem between the professor and his grader. The grader is responsible for grading the material and not the notation. If he refuses to use the notation of the class (which is de facto set by the professor) then he’s not doing his job. If he were my grader I’d make sure he changed or was fired.
You’re right on the money here, though I don’t think you realize it. If someone uses the same symbol in the same section for two different concepts (and I mean the exact same symbol) and doesn’t carefully note the switches, then there’s a problem in how the notation is being used. Bad writing, yes. Bad notation? Not necessarily.