I would be skeptical of this, but the video does a good job of explaining why 52!, the number of ways a standard deck can be arranged, is an unfathomably large number.
This video is also good and has an extraordinary visual to depict how big 52! is. Imagine you are standing at the Equator and take one step every billion years. Each time you walk all the way around the Equator, you take one drop from the Pacific Ocean. Once you have drained the Pacific, you put a single sheet of paper on the ground, then repeat the whole process, adding a sheet each time, until the stack of paper reaches the Sun. The amount of time you have spent is still utterly tiny compared to 52! seconds.
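If anyone wants to check the scale for themselves, here's a quick Python sketch comparing 52! seconds to the age of the universe (the age figure is the commonly quoted ~13.8 billion years):

```python
import math

# Number of distinct orderings of a standard 52-card deck
orderings = math.factorial(52)

# Age of the universe: ~13.8 billion years, in seconds
age_universe_s = 13.8e9 * 365.25 * 24 * 3600

print(f"52! = {orderings:.3e}")                     # ~8.066e+67
print(f"age of universe = {age_universe_s:.2e} s")  # ~4.35e+17 s
print(f"ratio = {orderings / age_universe_s:.1e}")  # ~1.9e+50
```

So 52! seconds is about fifty orders of magnitude longer than the universe has existed.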
What all this shows to me is how bad our intuition is for really large numbers.
No, it is true. You probably need to say "well-shuffled" deck, but if you do, the chances are overwhelming that the sequence has never been seen before in the history of the universe.
Let's accept the notion that an insignificant proportion of the 52! combinations has ever been produced, and that for a statistically significant proportion to have been produced, there would need to have been on the order of 51! shuffles.
But if “Every shuffling of a deck produces a unique result never seen before” then the shuffling cannot be fully random.
Conversely, your statement that "chances are overwhelming that the sequence has never been seen before" is true.
I think we need to make a distinction here between “true” (every shuffle must always produce a unique card sequence) and “overwhelmingly likely to be true”, where truth is not a mathematical certainty, but probabilistically a near-certainty. In practice we call many things certainties that have far lower probabilities.
There are a few ways of trying to grasp this number. One is to consider a vaguely comparable arrangement, but instead of having cards with 52 possible values, you have bits with only two possible values. Even here, the number of possible 52-bit binary strings is 2^52, or 4,503,599,627,370,496. But another difference here is that you have an unlimited supply of 1s and 0s.
If you had an unlimited supply of cards of 52 different kinds (a base-52 number system) and randomly inserted them into 52 different slots, the number of combinations would be 52^52. But since you have only 52 cards, as soon as you put one in, you have only 51 left, and with two inserted, only 50 left, etc. Hence the factorial instead of the exponent. But either one is an unimaginably huge number. 52! is equal to approximately 8.066 x 10^67. There are not many things in the universe that require that large an exponent to express. For example, the total mass of all baryonic matter in the observable universe is estimated to be 1.5 x 10^53 kg. The diameter of the observable universe, if defined as including all signals since the end of the inflationary epoch, is about 93 billion light-years; expressed in meters, that number is 8.8 x 10^26 m, or about 41 orders of magnitude less than the possible sequences in a deck of cards.
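The three counts mentioned above are easy to verify, since Python has arbitrary-precision integers:

```python
import math

bits = 2 ** 52                 # 52-bit binary strings
with_replacement = 52 ** 52    # base-52 strings of length 52
deck = math.factorial(52)      # orderings of a real 52-card deck

print(f"2^52   = {bits:,}")              # 4,503,599,627,370,496
print(f"52^52  = {with_replacement:.3e}")  # ~1.7e+89
print(f"52!    = {deck:.3e}")              # ~8.066e+67
```

As expected, 52! sits between the two: far bigger than 2^52, but smaller than 52^52 because each card you place removes one option from the remaining slots.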
Or consider that the age of the universe is a mere 4.36 x 10^17 seconds. Thus if something out there had been shuffling a deck of cards once every second since the beginning of time, over the course of 13.82 billion years it would have been exceedingly unlikely to produce even one duplicate sequence.
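You can put a rough number on "exceedingly unlikely" with the standard birthday-problem approximation (my addition, not from the comment above): after N uniformly random shuffles, the probability of at least one duplicate is roughly N^2 / (2 * 52!).

```python
import math

N = 4.36e17                 # one shuffle per second for the age of the universe
deck = math.factorial(52)

# Birthday-problem approximation: P(collision) ~ N^2 / (2 * 52!)
p_dup = N ** 2 / (2 * deck)
print(f"P(at least one duplicate) ~ {p_dup:.1e}")  # ~1.2e-33
```

Even a universe-long shuffling marathon leaves the collision probability at about 10^-33.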
You have to start with an already shuffled and perfectly randomized deck, though. If you start with a brand new deck, the number of combinations is much, much, much less. The top cards will never end up on the bottom of the newly shuffled deck, and the bottom cards will never end up on top. The first shuffle produces a large variety of orderings, but nowhere near 52!
Interesting point, but we're really talking idealized theory here. If you prefer, imagine that the cards are put into a well-designed shuffling machine. They're placed in a hopper like old-style punchcards, and to prevent any bias due to the physical properties of the cards, the machine sorts them according to a statistically verified pseudo-random number generator started from a truly random seed, like the value of a very fast 64-bit realtime clock read at a radioactive decay event (or, with the aid of a cat, measuring the time it takes to kill Schrödinger's favourite feline).
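A software version of that machine is essentially a Fisher-Yates shuffle driven by an unbiased random source. A minimal sketch in Python, using the `secrets` module for OS-level randomness (the radioactive-decay clock is left as an exercise):

```python
import secrets

def fisher_yates(deck):
    """Shuffle in place. If randbelow() is unbiased, every one of the
    len(deck)! orderings is equally likely."""
    for i in range(len(deck) - 1, 0, -1):
        j = secrets.randbelow(i + 1)       # uniform on 0..i inclusive
        deck[i], deck[j] = deck[j], deck[i]
    return deck

ranks = "A23456789TJQK"
suits = "SHDC"
deck = [r + s for s in suits for r in ranks]
fisher_yates(deck)
print(deck[:5])   # first five cards of a fresh random ordering
```

The key design point is that each position swaps with a uniformly chosen earlier-or-equal position exactly once; naive approaches like "swap two random cards 100 times" do not produce a uniform distribution over the 52! orderings.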
Right. Otherwise we would be saying “it’s impossible for a well-shuffled deck of cards to be the same as another well-shuffled deck of cards.” Which it is not - it’s still possible, just highly unlikely.
All this assumes a universe where an Infinite Improbability Drive doesn’t exist.
Now the existence of an Infinite Improbability Drive is highly, but not infinitely, improbable… hmm… let me go and have a nice hot cup of tea and think about this…
But, the rotation of the deal and the shuffling process are not meticulously random. The cards are continually sorted by the players during play and by habitual shuffling techniques. Is there an ordering process acting on the deck?
When I was in the military I observed that serious poker players believed in hot table (bunk) positions in a game. Especially games that had gone on for a long time with the same deck.
I don’t understand this. Are you assuming a one-pass shuffle with no cutting and rearranging? When I think of a shuffle, I think of a few riffles, a few hand-over-hand passes, a few riffles, and then maybe a cut to end. To me, to “shuffle a deck of cards” means not one-pass, but something like that.
If you start from a fresh deck, there must be some minimum number of shuffles needed to get the uniqueness discussed here. So cutting a fresh deck roughly in half and roughly interleaving the two halves once must produce the same results often enough to notice.
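That intuition checks out. A counting argument (it appears in Bayer and Diaconis's analysis of riffle shuffles) says a single riffle of n cards can produce only 2^n - n distinct orderings; for n = 52 that is about 4.5 x 10^15, a vanishingly small fraction of 52!:

```python
import math

n = 52
one_riffle = 2 ** n - n        # orderings reachable in a single riffle
total = math.factorial(n)      # all possible orderings

print(f"{one_riffle:,}")                             # 4,503,599,627,370,444
print(f"fraction of 52!: {one_riffle / total:.1e}")  # ~5.6e-53
```

So after one riffle of a fresh deck, two tables doing the same thing really could collide at observable rates.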
How fast do the possible arrangements grow with each successive shuffle of that kind?
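A crude upper bound (my sketch, not a full analysis): each riffle has at most 2^52 possible outcomes, so k riffles can reach at most 2^(52k) orderings. Covering all 52! orderings therefore requires 2^(52k) >= 52!, i.e. k >= log2(52!)/52:

```python
import math

# log2(52!) = sum of log2(i) for i = 1..52
log2_fact = sum(math.log2(i) for i in range(1, 53))

print(f"log2(52!) ~ {log2_fact:.1f} bits")              # ~225.6
print(f"minimum riffles: {math.ceil(log2_fact / 52)}")  # 5
```

So at least 5 riffles are needed just by counting; the well-known Bayer-Diaconis result says around 7 riffles are needed before the deck is actually close to uniformly random.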