Suppose you lost your mind and decided to build a database containing the optimal solution to every possible permutation of a Rubik’s Cube. That is, for each possible cube layout, the database contains the fastest solution. In order to look up the solution to a given permutation, that permutation must have a unique name, and since it’s a computer program, the name ultimately needs to be in binary. Given that there are approximately 4.3*10^19 permutations, how many bits long must such a name be?

My approach was to take the base-2 logarithm of 4.3*10^19 and round up to the next whole bit, which gives 66 bits to name each permutation uniquely. Have I got that right, or have I missed something important?
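That rounding-up approach can be sanity-checked directly in Python. This is just a sketch; it assumes the exact permutation count, 43,252,003,274,489,856,000, which is the number the approximate 4.3*10^19 comes from:

```python
# Exact Rubik's Cube permutation count (approximately 4.3 * 10**19)
N = 43_252_003_274_489_856_000

# Python integers know their own minimal binary length,
# i.e. the number of bits needed for a unique binary name
print(N.bit_length())  # 66
```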

That’s actually where I got the number of permutations I used in my OP, but I’m not asking how many permutations there are; I’m asking for the minimum number of bits needed to give each permutation a unique name that would be searchable in an index. I came up with a guess, which I put in the OP, but I was wondering if I’d missed something.

Another calculation that agrees with what you got: The number of bits it takes to represent a number is one more than the integer part of its base-2 log. The base-2 log of 4.3*10^19 is 65 and change, so it takes 66 bits to represent it.
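That rule, floor(log2 N) + 1, can be sketched in Python (assuming the exact count; floating-point log2 is safe here since the result is nowhere near an integer boundary):

```python
import math

N = 43_252_003_274_489_856_000  # exact permutation count, ~4.3 * 10**19

log2_N = math.log2(N)           # roughly 65.23 -- the "65 and change"
bits = math.floor(log2_N) + 1   # one more than the integer part
print(bits)  # 66
```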

So if you wanted to give each position a “name” from our regular 26-letter alphabet, single case, you’d need “words” of 14 letters (the number of base-26 digits it takes to write the permutation count), if my calculations are correct.
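One way to check the 14-letter figure without any floating-point log is repeated integer division by 26, which counts base-26 digits directly (again a sketch assuming the exact permutation count):

```python
N = 43_252_003_274_489_856_000  # exact cube permutation count

# Count how many base-26 "letters" N needs, digit by digit
letters, x = 0, N
while x:
    letters += 1
    x //= 26
print(letters)  # 14
```

The same loop works for any base, so swapping 26 for 2 reproduces the 66-bit answer above.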

Where does that formula for the number of permutations come from? (How is it derived?)