I was given a problem in a job interview for a computer programming job. I was to write a routine that cuts a computer-simulated deck of cards and performs a perfect shuffle. In this perfect shuffle, you cut x cards from the top of the deck. Then the bottom card of the top stack goes down first, then the bottom card of the bottom stack goes on top of it, one card at a time, until one of the stacks runs out; the remaining cards then go on top of the deck. Simulating this in Java was pretty easy, and it works fine. The question he had was: if the deck has 1001 cards, and you cut 102 cards from the top and then perfect shuffle, how many times would you have to shuffle the deck to return it to its original order? My algorithm works, but it runs slowly for large decks. I let it run for 16 hours; it had shuffled 800,000,000+ times when I suspended it. He says there is a much simpler, faster way to come up with the answer than the way I did it.
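To make the description concrete, here is a sketch of the shuffle as I've described it (a reconstruction for illustration, not my actual interview code):

```java
import java.util.Arrays;

public class PerfectShuffle {
    // Cut `cut` cards off the top, then interleave from the bottoms of the
    // two stacks: bottom of the top stack goes down first, bottom of the
    // bottom stack goes on top of it, until one stack runs out. Whatever is
    // left goes on top of the new deck, order kept.
    static int[] shuffle(int[] deck, int cut) {
        int n = deck.length;
        int[] out = new int[n];
        int pos = n - 1;      // build the new pile bottom-up
        int a = cut - 1;      // bottom of the top stack (indices 0..cut-1)
        int b = n - 1;        // bottom of the bottom stack (indices cut..n-1)
        while (a >= 0 && b >= cut) {
            out[pos--] = deck[a--];   // top-stack card goes down first
            out[pos--] = deck[b--];   // bottom-stack card on top of it
        }
        while (a >= 0)   out[pos--] = deck[a--];   // leftovers go on top,
        while (b >= cut) out[pos--] = deck[b--];   // keeping their order
        return out;
    }

    public static void main(String[] args) {
        int[] deck = new int[10];
        for (int i = 0; i < deck.length; i++) deck[i] = i;
        // 10-card deck, cut 3 from the top:
        System.out.println(Arrays.toString(shuffle(deck, 3)));
        // prints [3, 4, 5, 6, 7, 0, 8, 1, 9, 2]
    }
}
```

With a 10-card deck cut at 3, the top stack is {0, 1, 2} and the interleaved section ends up as 0, 8, 1, 9, 2 at the bottom, with the 7 leftover cards 3..7 on top.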

The way I did it: I created an array of 1001 cards in computer memory, then created a second array, shuffled the first into the second, and compared to see whether the new array was in the original order. If not, I copied the second array back into the deck and repeated until it came up in the right order.
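For reference, a compact version of that brute-force loop looks like this (again a sketch; the shuffle routine below is my reading of the description above, and my real code keeps separate arrays rather than reallocating):

```java
import java.util.Arrays;

public class BruteForceOrder {
    // Same shuffle as described in the post: cut `cut` off the top,
    // interleave from the bottoms of both stacks, leftovers go on top.
    static int[] shuffle(int[] deck, int cut) {
        int n = deck.length;
        int[] out = new int[n];
        int pos = n - 1, a = cut - 1, b = n - 1;
        while (a >= 0 && b >= cut) {
            out[pos--] = deck[a--];
            out[pos--] = deck[b--];
        }
        while (a >= 0)   out[pos--] = deck[a--];
        while (b >= cut) out[pos--] = deck[b--];
        return out;
    }

    // Shuffle repeatedly until the deck is back in 0, 1, 2, ... order,
    // counting the shuffles as we go.
    static long countShuffles(int n, int cut) {
        int[] original = new int[n];
        for (int i = 0; i < n; i++) original[i] = i;
        int[] deck = shuffle(original, cut);
        long count = 1;
        while (!Arrays.equals(deck, original)) {
            deck = shuffle(deck, cut);
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countShuffles(10, 3));
    }
}
```

This is fine for small decks, but for 1001 cards cut at 102 it just grinds, which is the problem.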

Does anybody have any ideas on how to do this simpler and faster?

I do know that in this perfect shuffle, the top card gets lowered into the deck two positions every shuffle. Also, every shuffle I am only interleaving 204 cards (102 off the top + 102 off the bottom) and putting the remaining 797 cards on the top of the deck.

I could use this job.