I barely know more than the first sentence of the Wikipedia articles!
I actually didn’t know offhand what Hamiltonian or Eulerian paths were, but they are apparently “visit every vertex once” and “visit every edge once”, and a cycle just means they’re closed. I know you can build a dual graph such that every vertex turns into an edge and vice versa, so I was initially surprised that you would say the Hamiltonian path problem is harder, since it seems like you can just do the V-E swap and use your Eulerian path algorithm.
But! You were careful to say directed graphs, and I suspect this is where my previous mental image falls apart. And maybe there’s a problem with non-planar graphs, too; not sure about that yet.
Better yet, suppose you want, in an arbitrary base, a way to run through all strings of a given length precisely once and return to the string you started from, in such a way that you only change one digit at a time, and all such changes consist of increasing a digit by 1 (with wraparound but no carries; i.e., increasing the maximum digit just turns it back to 0). [In base 2, this is a familiar Gray code].
How can you do this? Well, there are many different ways. But one is quite simple: on the n-th time you need to bump up a digit, bump up the digit in the same position (counting from the end) as the least significant nonzero digit of n, written in the relevant base.
For example, in base ten, you would bump up the units digit the first 9 times, then bump the 10s digit the 10th time, then bump up the units digit again the next 9 times, then bump up the 10s digit again the 20th time, and so on until you bump up the 100s digit the 100th time, the 1s digit the 106th time, the 10s digit the 130th time, the 100s digit the 200th time, etc.
Eventually, you’ll be asked to bump up a digit that doesn’t exist in strings of your given length; at this point, just bump up the most significant digit you can, and you’ll return back to where you started.
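Here’s a minimal sketch of that rule in Python (the names gray_cycle and lowest_nonzero_position are just mine for illustration), including the wraparound step at the end that closes the cycle:

```python
def lowest_nonzero_position(n, base):
    """Position (0 = units digit) of the least significant nonzero digit of n in the given base."""
    pos = 0
    while n % base == 0:
        n //= base
        pos += 1
    return pos

def gray_cycle(base, length):
    """Yield the starting all-zeros string, then the string after each of base**length bumps;
    the final string equals the first, closing the cycle."""
    digits = [0] * length                  # digits[0] is the units digit
    yield tuple(digits)
    for n in range(1, base ** length + 1):
        pos = lowest_nonzero_position(n, base)
        if pos >= length:                  # asked to bump a digit that doesn't exist:
            pos = length - 1               # bump the most significant digit we have instead
        digits[pos] = (digits[pos] + 1) % base
        yield tuple(digits)

states = list(gray_cycle(3, 2))
assert len(set(states[:-1])) == 3 ** 2     # every length-2 string in base 3 appears exactly once
assert states[-1] == states[0]             # and we end up back where we started
```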
Ah, you can’t build a “dual graph” where each vertex turns into an edge and vice versa quite so straightforwardly (even for undirected graphs). There are constructions like the edge graph, where each edge turns into a vertex and each vertex turns into multiple edges, but that isn’t a straightforward flipping of edges and vertices, since vertices and edges don’t play quite symmetric roles in arbitrary graphs (for example, each edge is connected to precisely two vertices, but it’s not the case that each vertex is connected to precisely two edges).
[For planar graphs with given planar embeddings, there’s also a notion of dual that turns vertices into faces and vice versa, but again, this doesn’t swap vertices and edges in a way relevant to Hamiltonian vs. Eulerian paths]
I suppose a much simpler way to describe it is this: on turn T, the digit in any particular position of the current string is the difference between the digit of T in that same position and the digit of T one position higher [in wrap-around arithmetic].
So on turn 658, we would have the value 693, because 8 - 5 = 3, 5 - 6 = 9 [in wrap-around land], and 6 - 0 = 6.
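As a sanity check, here’s that closed-form rule in Python (again, the names are mine), verified against the 658 → 693 example and against the defining one-bump-at-a-time property:

```python
def gray_digits(T, base, length):
    """String on turn T: digit i is (digit i of T) minus (digit i+1 of T), mod base."""
    t = [(T // base ** i) % base for i in range(length + 1)]
    return tuple((t[i] - t[i + 1]) % base for i in range(length))

# Turn 658 in base ten: 8 - 5 = 3, 5 - 6 = 9 (with wraparound), 6 - 0 = 6, i.e. 693.
assert gray_digits(658, 10, 3) == (3, 9, 6)   # (units, tens, hundreds)

# Consecutive turns differ by +1 (mod base) in exactly one digit,
# and every string of the given length shows up exactly once per cycle.
base, length = 3, 2
strings = [gray_digits(T, base, length) for T in range(base ** length)]
assert len(set(strings)) == base ** length
for s, t in zip(strings, strings[1:]):
    diffs = [(y - x) % base for x, y in zip(s, t)]
    assert sorted(diffs) == [0] * (length - 1) + [1]
```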
The asymptotic frequency as you described in #57. Taking initial segments of increasing length, what is the frequency of (say) the 9s in base 10?
It’s easy to picture what happens for most normal numbers: there’s some roughness early on, but the frequency smoothly converges to 1/10.
Here, there are dips in the frequency just after we pass an order-N stopping point, but then it recovers.
Maybe these dips get smaller in magnitude as the length increases, but I doubt it; it seems like the reverse might be true, since as the lengths increase we have more possible de Bruijn sequences, so the opportunity for back-loading the high digits increases.
So it seems like it’s asking for the limit of sin(x) as x->inf: there isn’t one. But it seems pretty weird to have a number where we can’t ask how many of each digit it has.
Ah, yeah. Maybe it seems weird, but there are indeed such sequences: for a simple example, consider the sequence consisting of 1 A then 10 Bs then 10^2 As then 10^3 Bs then 10^4 As, and so on. If we stop just after a block of Bs, there are 10 times as many Bs as As. But if we stop just after a block of As, there are just over 10 times as many As as Bs. So, neither As nor Bs can have a well defined asymptotic frequency; both their frequencies oscillate from 1/11 to 10/11 and back, over and over forever.
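If it helps to see the numbers, here’s a quick Python illustration of that oscillation (my own, using the block sizes from the example above):

```python
# Blocks of sizes 1, 10, 10**2, 10**3, ... alternate between As and Bs.
a_count = b_count = 0
for k in range(8):
    if k % 2 == 0:
        a_count += 10 ** k      # even-indexed blocks are As
    else:
        b_count += 10 ** k      # odd-indexed blocks are Bs
    freq_a = a_count / (a_count + b_count)
    print(f"after block {k}: frequency of A = {freq_a:.4f}")
# After the first couple of blocks, the printed frequency swings between
# roughly 1/11 ≈ 0.0909 and roughly 10/11 ≈ 0.9091, and keeps doing so forever.
```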
And it’s in this sense that (for instance) the icosahedron is dual to the pentagonal dodecahedron, or the Catalan solids are dual to the Archimedean solids (and I can’t understand why the Archimedean solids always get all the attention, when the Catalans are so much cooler).
And the “edge graph” construction is the one where each edge becomes a vertex, and two of these new vertices are connected iff they met at the same vertex in the original graph, right?
Yup. Or, in a directed multigraph sense, each edge becomes a vertex, and each path of length 2 becomes an edge (from its first half to its second half). And in general, each path of length n + 1 becomes a path of length n in the same way.
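For concreteness, here’s a small sketch of that directed edge-graph (line digraph) construction in Python (the function name and example are mine); if I’m not mistaken, this is also the sense in which an Eulerian path in the original digraph corresponds to a Hamiltonian path in the new one:

```python
def line_digraph(edges):
    """edges: distinct (u, v) pairs of a directed graph.
    Each original edge becomes a vertex; each length-2 path e -> f
    (e ends at the vertex where f begins) becomes an edge."""
    return [(e, f) for e in edges for f in edges if e[1] == f[0]]

# Example: a directed triangle a -> b -> c -> a.
g = [("a", "b"), ("b", "c"), ("c", "a")]
print(line_digraph(g))
# [(('a', 'b'), ('b', 'c')), (('b', 'c'), ('c', 'a')), (('c', 'a'), ('a', 'b'))]
```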
(Out of curiosity, why do you find the Catalan solids cooler than the Archimedean solids? I guess mathematically there’s no real difference, so it must be about their physical aesthetics.)
Well, for one thing, you can make dice out of the Catalan solids. And most of them are members of infinite families of solids that retain the essential characteristic of being face-transitive: You can, for instance, smoothly transition between all of the face-transitive dodecahedra (except for the bipyramids, of course, but they’re trivial).
Ah, the dice thing is an interesting perspective; hadn’t thought of it that way.
Not sure I understand what kinds of face-transitive infinite families/smooth transitions you have in mind. Would their duals not just as well comprise vertex-transitive infinite families with smooth transitions?
…Hm. Yeah, I guess you can smoothly transition between the icosahedron, truncated tetrahedron, and cuboctahedron, too (well, at least as smoothly as the Catalan transitions, anyway; in both cases you have edges appearing or disappearing). I’m not sure why I found those to be harder to visualize.
In any event, though, both categories are usually described primarily in terms of their sides, not of their vertices, and it just seems like “all the same” is a more interesting property for the sides to have than “each a regular polygon” (granting, of course, that those aren’t complete descriptions of either category).
What’s all this business about a normal number and finite strings being “likely” to appear in them? Isn’t it correct to say, in layman’s terms, that if a number is normal, that any finite string of digits appears in it? (If not, I hereby define a number in which every finite string of digits appears as a Grady Number.)
Grady numbers are “disjunctive numbers”, as noted above. And, yes, every “normal number” is “disjunctive”, but not every “disjunctive number” is “normal”. The “likelihood” terminology some may use in defining normal numbers is not in the sense of whether a string appears at all, but rather, how frequently it appears, relative to other strings. Thus, if every finite string shows up, but, say, there are generally twice as many 7s as 1s, the sequence is disjunctive but not normal.
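To make the distinction concrete, here’s a toy construction along those lines in Python (my own illustration, not anything from the thread): enumerate every finite digit string, but pad each one with a same-length run of 7s. Every finite string still appears, so the result is disjunctive, but 7s show up far more often than a tenth of the time, so it isn’t normal:

```python
from itertools import count, islice, product

def disjunctive_not_normal_digits():
    """Digits of a sequence that contains every finite string but heavily over-represents 7s."""
    for length in count(1):
        for s in product("0123456789", repeat=length):
            yield from s                 # the string itself: guarantees every string appears
            yield from "7" * length      # same-length padding of 7s: skews the frequencies

prefix = list(islice(disjunctive_not_normal_digits(), 10 ** 6))
print(prefix.count("7") / len(prefix))   # roughly 0.55, nowhere near 1/10
```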
Incidentally, in the ordinary language sense of the term “normal”, most normal numbers are nothing like any of this. You won’t find every string in the decimal expansion of, like, 12, or 1/7, or such things. The most normal numbers in the world just end in bunches of 0s, or other repeating cycles of digits. And, as noted, even things like sqrt(2) or π, which are quite normal in the ordinary language sense, aren’t known to be “normal” in the technical sense.
That depends on what sorts of numbers one considers “normal” in the colloquial sense. Most of the numbers which are most often referred to are rational and hence assuredly not normal, but most of the quantities which are ever actually encountered are presumably irrational. Yes, the recipe might have said to use a half-cup of flour, but it almost certainly wasn’t.
Oh, I don’t think that’s true. The flour consists of an integral number of atoms, so doesn’t have an irrational mass. For similar reasons, I don’t think it could be said to have an irrational volume, when you get down to the Planck scale. Intuitively, it seems impossible for physical objects to have irrational characteristics. That would imply that an infinite amount of information is somehow encoded in the physical object.
But the atoms are jiggling around slightly, changing the mass via kinetic energy. And atoms can move arbitrarily slowly.
I might argue that the existence of derivatives (i.e., that physical law often follows differential equations) implies that physical numbers aren’t just the full set of real numbers, but include infinitesimals as well.
Well, there’s quantities and there’s quantities. Half a cup of flour is hard to measure exactly, I will grant that. But I do have exactly 2 parents and 4 grandparents and 1/2 as many parents as grandparents, no ambiguity.
But, yeah, in an ordinary language sense, what’s “normal” is context-dependent, and in some contexts typical numbers may be normal in the technical jargon sense, while in others typical numbers will not.