Dice Computation Question

To expand on what the others have said, you just need to think in different number bases. And I have to say that this really smacks of homework.

For 1-24 you need to recall that 24 in base 10 is 40 in base 6, so just roll your d4 and count a 4 as 0 (for a 0-3 result), multiply by 6, then roll a d6 and add.

For 64, you need to recall that 64 in base 10 is 100 in base 8, so roll 2d8 similarly to the above.

For 81, you need to recall that 81 in base 10 is 100 in base 9 but also 10000 in base 3, so just roll 4d6, treating each d6 as a 0-2 digit (1-2 → 0, 3-4 → 1, 5-6 → 2), and read 0000 as 81.

78 is the difficult one. It’s 60 in base 13 but you don’t have a d13. The nearest is base 20, so roll d4 and d20 as above and re-roll results of 79 and 80.
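To make the base-conversion trick concrete, here is a quick Python sketch of all four ranges. The `roll` helper stands in for throwing a physical die, and the function names are mine:

```python
import random

def roll(n):
    """Stand-in for rolling a fair physical n-sided die: returns 1..n."""
    return random.randint(1, n)

def d24():
    # d4 with a 4 counted as 0 gives the base-6 "tens" digit (0-3).
    return (roll(4) % 4) * 6 + roll(6)          # 1..24

def d64():
    # 64 is 100 in base 8: two d8s make a two-digit base-8 number.
    return (roll(8) - 1) * 8 + roll(8)          # 1..64

def d81():
    # 81 is 10000 in base 3: four d6s, each read as a 0-2 digit.
    n = 0
    for _ in range(4):
        n = n * 3 + (roll(6) - 1) // 2          # 1-2 -> 0, 3-4 -> 1, 5-6 -> 2
    return 81 if n == 0 else n                  # 0000 stands for 81

def d78():
    # d4 (as 0-3) and d20 give 1..80; re-roll results of 79 and 80.
    while True:
        r = (roll(4) % 4) * 20 + roll(20)
        if r <= 78:
            return r
```

The d78 case is the only one that can discard a throw, which is exactly the inefficiency the entropy-recycling discussion below is about.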

Well there you go, there’s an example already of where you might use this algorithm: if you’re implementing a randint function for a calculator.

All algorithm optimization requires some measure of the total cost you’re trying to minimize. For an algorithm performed by a computer, this is generally straightforward: you usually take the cost to be the amount of time, each operation takes some amount of time, and you add up the times needed for all the operations. Usually there’s some operation you’re using that takes much more time than the others, so you can get a very good approximation by just multiplying the cost of that operation by the number of times you need to use it. In particular, the equivalent of rolling a die tends to be quite an expensive operation (in fact, some computer systems that need high security actually do it by having a little box attached that literally rolls a die), which is why I was taking that as my constraint.

When the algorithm is to be used by humans, though, it’s a lot more complicated. The goal is still to keep the total “cost” as low as possible, but it becomes much harder to figure out what the cost of each operation is. Some operations that are trivially easy for a computer are nontrivial for humans, like (20×18+7) mod 33. And RAM is generally more expensive for humans than for computers: it is hardly ever the relevant limitation for a modern computer, but often is for a human.

One convenient way to think of how to use “leftover” entropy is like this:

At any moment, you keep track of intervals G and C, both starting out equal to [0, 1).

Whenever you roll an n-sided die, you partition the interval G into n equal pieces, and then update G to equal one of those pieces, according to the result of the die.

Whenever you need the result of an m-sided die, you partition the interval C into m equal pieces, and then wait for G to be contained in one of those pieces. Once it is, you spit out the corresponding result, and then update C to equal that piece.

In this way, no entropy ever goes to waste; on asymptotic average, the results will be produced at a rate of log(X)/log(Y) results per roll when emulating a sequence of dYs using repeated rolls of a dX with this method. But this method can also handle generating randomness from a sequence of differently-sided dice, or emulating a heterogeneous, non-predetermined sequence of differently-sided dice. (It can even handle unfair dice on either side of things as well, so long as one generalizes the “partition into equal pieces” step to “partition into pieces of the appropriate (not necessarily uniform) distribution”.)

[To emulate just a single die roll, however, this method is not optimal (in the sense that the mean number of throws used is not minimized); optimality is only reached in the rate as amortized over many results.]
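For the curious, here is one way the interval bookkeeping above might look in code. This is my own sketch (the class name is made up), using exact rationals so the intervals never lose precision, with a Python random call standing in for the physical source die:

```python
from fractions import Fraction
from random import randint

class EntropyRecycler:
    """Track the two intervals G and C described above, both starting at [0, 1).
    G records what the physical rolls have told us; C records what we've emitted."""
    def __init__(self, source_sides):
        self.n = source_sides
        self.g_lo, self.g_hi = Fraction(0), Fraction(1)  # G = [g_lo, g_hi)
        self.c_lo, self.c_hi = Fraction(0), Fraction(1)  # C = [c_lo, c_hi)

    def _roll(self):
        # A physical d(n) roll narrows G to one of n equal sub-pieces.
        k = randint(0, self.n - 1)
        w = (self.g_hi - self.g_lo) / self.n
        self.g_lo += k * w
        self.g_hi = self.g_lo + w

    def emit(self, m):
        """Return a uniform result in 0..m-1, rolling the source die only as needed."""
        while True:
            w = (self.c_hi - self.c_lo) / m
            # G is always inside C, so g_lo falls in piece k of C's m-way partition.
            k = (self.g_lo - self.c_lo) // w
            # Emit only once G fits entirely inside that piece.
            if self.c_lo + (k + 1) * w >= self.g_hi:
                self.c_lo += k * w
                self.c_hi = self.c_lo + w
                return int(k)
            self._roll()
```

For example, `EntropyRecycler(2).emit(6)` emulates a d6 (as 0-5) from coin flips. Note that the denominators of the fractions grow without bound as results are emitted, which is why practical implementations switch to the integer-arithmetic bookkeeping described later in the thread.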

Chronos’ method is essentially conversion of a string of random digits in base 20 to base 33 (or whatever). Phrased like that it is easy to see that the system maximises extraction of information from the entropy provided. It is also easier to see uses for this algorithm. You just won’t catch me doing it in the average board game.

I don’t recall ever throwing a physical die with other than six (or two) faces: my prolonged misspent youth was misspent during the LBJ and RMN Administrations; such things weren’t in widespread use then. But I have “tossed” perhaps trillions (:dubious:) of multi-faced “dice” in computerized simulations. One of my hobbies is coding perfectionism so, even though there are PRNGs that are quite fast and good enough for all but cryptological purposes, I have implemented generators similar to what Indistinguishable describes.

In practice one uses integer arithmetic and the details are slightly different. At any point one has log M entropy, represented as a variate X from {0, 1, …, M-1}. If M is a multiple of 6, one can “toss a 6-sided die” by simply returning X%6 and then replacing M with M/6 and X with X/6. When M is not a multiple of 6 and X ≥ 6·Floor(M/6), one must “start over;” however there is some leftover entropy in X that a perfectionist may want to preserve.
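A minimal sketch of that integer-arithmetic pool (the class name is mine; I’m assuming fair coin flips via `random.getrandbits(1)` as the refilling source, and I’ve included the perfectionist’s step of keeping the leftover entropy on rejection):

```python
import random

class DicePool:
    """Integer-arithmetic entropy pool: the state is a uniform variate X
    on {0, ..., M-1}, refilled one fair bit at a time."""
    def __init__(self):
        self.M, self.X = 1, 0

    def _refill(self):
        # Mix in one fair coin flip: M doubles, X gains a low-order bit.
        self.M *= 2
        self.X = self.X * 2 + random.getrandbits(1)

    def toss(self, n=6):
        """Toss an n-sided die, returning a face 1..n."""
        while True:
            while self.M < n:
                self._refill()
            q = self.M // n
            if self.X < n * q:
                r = self.X % n
                self.M, self.X = q, self.X // n   # keep the quotient's entropy
                return r + 1
            # Rejection: X is uniform on the top M - n*q states, so the
            # remainder is itself a smaller uniform variate -- preserve it
            # instead of starting over from nothing.
            self.M, self.X = self.M - n * q, self.X - n * q
```

When M is a multiple of n the rejection branch never fires, matching the easy case above; otherwise the branch salvages the leftover log(M − n·⌊M/n⌋) entropy.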

Random-generation software observed in the wild often has severe flaws. For example, the Fisher-Yates shuffle is no less than 75 years old but remains beyond the ken of many programmers.
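For reference, the correct shuffle is only a few lines; the classic blunder is drawing the swap index from the whole array on every iteration, which makes some permutations more likely than others. A sketch:

```python
import random

def fisher_yates(items):
    """Fisher-Yates (Durstenfeld) shuffle: uniform over all n! orderings."""
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        # The key step: j must be uniform on 0..i INCLUSIVE, not 0..len(a)-1.
        j = random.randint(0, i)
        a[i], a[j] = a[j], a[i]
    return a
```

Each position is swapped with a uniformly chosen not-yet-fixed position, so all n! orderings get probability 1/n!; the flawed whole-range variant produces n^n equally likely swap sequences, which cannot map evenly onto n! outcomes.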

I omitted details to forestall Googling “Nerds who post pseudocode,” but should have avoided a round of rejoinders by pointing out that preserving the entire entropy of X is not an option.