I have a solitaire game on my Palm Pilot. It’s written by smallware.com, is freeware and works really well!

:sdimbert holds two thumbs up:

Anyway, I have been playing it for a couple of weeks. I noticed right away that it tracks basic statistics, and I want one of you math geniuses to think about this.

The game is “Klondike” - your basic, vanilla, 7 piles, play red on black or vice versa, build up from the Ace in the top 4 piles variety - and I want to know if anyone can figure:

Based on a standard deck of cards, if I play fair and straight, making no mistakes, what percentage of games should I win?

I ask because I have noticed that my winning percentage has been within a certain range since I started paying attention to it. The game counts games you’ve played and games you’ve won and tells you both along with your winning percentage ([sup]games won[/sup]/[sub]games played[/sub].)

I have played over 150 games (a fairly significant sample size) and my winning percentage is currently 13% (22/154). I can’t recall it ever being below 12% or above 14 or 15%.
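For what it's worth, here is a quick sanity check on how much a win rate based on only 154 games can wobble, using a rough normal-approximation interval (the 22/154 figure is from above; note the ratio actually works out to about 14.3%):

```python
import math

wins, games = 22, 154
p = wins / games                       # observed win rate, about 0.143
se = math.sqrt(p * (1 - p) / games)    # standard error of a proportion
print(f"{p:.3f} +/- {1.96 * se:.3f}")  # rough 95% interval
```

So anything from roughly 9% to 20% is still consistent with this sample; 154 games is less decisive than it feels.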

Is it me, and the way I play, or is it… in the cards?

It’s not always possible to estimate probability when strategy is involved. Assuming you are playing the “flip three” variety of Klondike, you are at certain points better off not playing a playable card. Also if you have, for example, two black sevens covering piles and you’ve just played a red eight, which seven do you play on top of it?

You would have to make rigid assumptions about how you play in every possible situation. Then it would be possible to compute winning probabilities … although I still wouldn’t know how to do it. It’d be a bitch of a problem to solve!

I think for your computer program, it’s a little of both.

My reasons:
[ol]
[li]Another popular solitaire game, FreeCell, has a feature where you can select a game number. This number is between 1 and 32767, which happens to be the range of positive integers that can be represented in a 2-byte word in the computer. A number of random-number generators use a formula with 32768 (2[sup]15[/sup]) built into it as a modulus. So it seems to me that this game number is either a seed for a deck-shuffling algorithm, or an index to a pre-defined deck order. If it’s the latter, they may have selected a set of deck orders that are either winnable or close to it, so that players get some sense of satisfaction from the game. If the game’s too tough, people would quit playing it.[/li]
[li]In some casinos, there is a Klondike solitaire table. You are given a deck of cards for $52, then are paid $2 for each card you place in the Ace piles (I forget the term for them). Usually in games like this, the payoff is only about half of the break-even payout; i.e., if it’s possible to win 1 out of every 4 games of Klondike solitaire, it’d be break-even if they paid $4 per card. So they divide by 2 so that they’re assured a profit.[/li]
So going by the Vegas payout, a really good player can probably win 25% of Klondike games. A crummy one like me can only win 5%. Since you’re up to 15%, I’d say you’re halfway to going pro.
[/ol]
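The break-even arithmetic behind that payout is simple: the deck costs $52, so at $p per foundation card you must average 52/p cards per game just to break even. A quick sketch:

```python
cost = 52                         # dollars paid for the deck
for payout in (2, 5):             # dollars per card sent to the aces
    cards_needed = cost / payout  # average cards per game to break even
    print(payout, cards_needed)
```

At $2 per card you would have to average 26 of the 52 cards up per game just to break even, which shows how safe the house is.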

As to figuring the actual probability, that’d be quite a task. First, a really good Klondike-playing computer algorithm would need to be written. Then, for each of the 52! possible deck arrangements, it would have to play out a game. Even with a really fast computer, that’d take quite a while. (I imagine the probabilities that Vegas uses are from empirical statistics of the games played at their tables.)
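To put numbers on “quite a while”: even granting the algorithm a billion deals per second, brute force over every ordering of the deck is hopeless. A back-of-the-envelope sketch:

```python
import math

# Enumerating every ordering of a 52-card deck, at one deal per
# nanosecond, takes astronomically longer than the age of the universe.
arrangements = math.factorial(52)       # about 8.07e67 orderings
seconds = arrangements / 1e9            # at a billion deals per second
years = seconds / (365.25 * 24 * 3600)
print(f"{arrangements:.2e} deals, {years:.1e} years")
```

This is why sampling deals, rather than enumerating them, is the only realistic option.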

Agreed; the actual calculation would be a bitch.

Further agreeing with all the little nuances that could have an impact (given a choice of two black sevens, which one do you move?); however, I venture to suggest that those situations are relatively rare, and will not have a major impact on the probable outcomes.

When dealing with bitchy probabilities, one way of estimating is simply to run a zillion trials and see how the distribution comes out.

We’ve got one sample so far, with 150 games at around 13%. Anyone else care to toss in their statistics? … and then someone can compile.

AWB, the problem with the gambling set-up is that you can only go through the deck once: you pay $52 to play, and you get $5 (I thought) for each card on the ‘ace’ stacks. But you go through the deck one card at a time (not every third card), and you can only do it once. There’s an alternative where you go through the deck every third card, and you can go through three times. But that game has an unnatural constraint that makes some situations losers that would be winners if you could go through the deck again.

Probability theory and statistics are not sufficient to solve this problem. To solve it you would need to address the problem from a game theoretic view. This would allow you to look at the effects of different strategies of play.

I agree in theory, Kesagiri. But I think the amount of “strategy” involved in standard solitaire is minimal (if not negligible). The number of times that one doesn’t make the obvious play is small, very small; certainly not as many as one such decision per round.

Thus, if we were talking about trying to beat the odds at the casino, where a fraction of a percent can make a huge difference, I would agree with you entirely. But we’re not talking that; we’re just asking, roughly, what proportion of games are winnable, and if we could comfortably say 15% to 20%, that would be close enough.

So, I think the game is (within epsilon) deterministic, and esoteric bits of strategy needn’t concern us. Once the cards are set, we could have a computer play for us (with a few decision trees on what to do in case of two plays being available).

With the standard Windows Solitaire game, the “Vegas” setup lets you go through the deck three times, but three cards at a time. This is how my mom played it in Vegas, and its payoff was only $2 per card after paying $52 for the deck. Even I, as crappy as I play, could probably get at least 11 cards up often enough to make $5 per card profitable.

Can we at least start by agreeing on what constitutes “perfect strategy”? The decisions I know of are the following: [ul][li]Given two cards to place on a newly-played card, which to choose?[/li][li]Given the opportunity to move a card up onto an ace stack, move it or leave it in place to stack down from?[/li][li]Given the opportunity to move a card from the deck to a stack, place it or save the space for another card?[/li][/ul]My answers to these are the following:
[ul][li]Always place the card with the most hidden cards under it. If both options have the same number of cards below, then the decision is arbitrary. The only exception is when I have a surplus of kings covering hidden cards, and need to make room for them. I’m not sure how strong this exception should be.[/li][li]Always move a card n to the aces if all cards of rank (n-2) of the opposite color, and the other card of the same color of rank (n-3), have already been placed on the aces. Otherwise, never move a card onto the aces unless this enables some other play at the time which can bring a card into play or reveal a hidden card. When such an opportunity does occur, always move the card to the ace. I know that this is less than optimal, but I’ve never been able to figure out how to do it better.[/li][li]Unless I know definitely of cards later in the deck which I can better use in a space, I always take the opportunity.[/li][/ul]Can anyone point out anything that I missed, and does this help anyone in making simulations/calculations?
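For what it's worth, the second rule above is mechanical enough to write down directly. A sketch, assuming suits C/D/H/S with D and H red, ranks 1 (ace) through 13 (king), and a `foundations` dict mapping each suit to the highest rank already played there (0 if empty); this encodes the rule exactly as stated, not a claimed improvement on it:

```python
RED = {"D", "H"}  # suit colors (assumed encoding)

def safe_to_promote(rank, suit, foundations):
    """Is it 'safe' to send this card to the foundations, per the rule above?"""
    if foundations[suit] != rank - 1:
        return False  # the card can't legally go up at all
    # The other suit of the same color, and both opposite-color suits.
    same = [s for s in foundations
            if (s in RED) == (suit in RED) and s != suit][0]
    opposite = [s for s in foundations if (s in RED) != (suit in RED)]
    # Both opposite-color foundations at rank-2 or better,
    # and the same-color sibling at rank-3 or better.
    return (all(foundations[s] >= rank - 2 for s in opposite)
            and foundations[same] >= rank - 3)

print(safe_to_promote(4, "C", {"C": 3, "S": 3, "D": 2, "H": 2}))  # prints True
```

Aces and twos always come out safe under this test, which matches common practice.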

You cannot tractably determine the probability of winning solitaire using analytical means. In probabilistic game theory, each decision point usually requires a multiplication and division by a factorial. Solitaire starts off with seven such decision points (each of the seven face-up cards might or might not be playable), and contains at least 24 more. More than a few factorial expressions quickly become incomputable, even in computer time.

One might analytically determine a simple probability, such as the odds of at least one legal move existing at the start of the game, but the formula even for such a simple question would be quite complex.

The only tractable way to solve the problem is to run a Monte Carlo simulation.
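As a sketch of what such a simulation looks like, here is a Monte Carlo estimate of the one "simple" quantity mentioned above: the chance of at least one legal move at the start of a deal. To stay minimal it looks only at the seven face-up tableau cards and ignores the stock, so it understates the true odds:

```python
import random

def deal_tops(rng):
    """Shuffle a deck and return the seven initially face-up tableau cards."""
    deck = [(rank, suit) for rank in range(13) for suit in range(4)]
    rng.shuffle(deck)
    return deck[:7]

def is_red(card):
    return card[1] >= 2  # suits 0,1 black; 2,3 red (arbitrary encoding)

def has_opening_move(tops):
    if any(rank == 0 for rank, _ in tops):  # rank 0 = ace
        return True                          # an ace goes straight up
    # Otherwise: can any face-up card move onto one a rank higher,
    # of the opposite color?
    return any(a[0] == b[0] - 1 and is_red(a) != is_red(b)
               for a in tops for b in tops if a is not b)

rng = random.Random(1)
trials = 100_000
p = sum(has_opening_move(deal_tops(rng)) for _ in range(trials)) / trials
print(f"~{p:.2f}")
```

The same skeleton, with `has_opening_move` replaced by a full game-playing routine, is what the win-rate estimate would need.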

I disagree; for example, look at the comment above about playing the flip-three variety. Sometimes not playing a playable card is the better strategy. This is why you might want to consider a game-theoretic approach, as it would include a decision-making element in the game. Different strategies would result in different outcomes.

Suppose we have two different strategies and we know from prior experience that on average people win 7% of the games. Now if you have two people each playing a different strategy, then it is possible for one guy to win 5% of the time (assuming he plays a large number of times) and the other guy 9% of the time. One player might have a more sophisticated strategy than the other. An interesting thing to do would be to see if sdimbert’s winning percentage increases (i.e., he is learning better strategies) the more he plays. Too bad we can’t go back and see some sort of time series on his winning percentage.

Chronos,

In game theory there is no such thing as a perfect strategy. That is why there are so many refinements of equilibrium (sub-game perfect equilibria, sequential equilibria, etc.).

First of all, thank you all for your comments… keep 'em coming!

Joe Malik:

I am not much of a mathematician, but this doesn’t seem right to me. I mean, a Game-Boy can play chess, you know? Certainly there are more “decision points” in chess than there are in Klondike!

This seems to be just the sort of thing computers do best!

Oh, and if anyone cares, I am now at [sup]22[/sup]/[sub]160[/sub]. That is 13%.

I think it should be made clear what exactly we’re trying to find out about the game. Are we comparing sdimbert to a perfect player (one who will always win if it’s possible) or to a ‘good’ player that uses good strategy?

Usually when the probabilities are given for solitaire games, they are taken to be the ratio of winnable deals to all deals. And it’s true that you won’t get the answer in this lifetime if you try to check every possible hand against every possible winning hand (even if you could calculate winnability in half a nanosecond, it would be several yotta-yotta-years before you’d have the answer). But it’s probably been done for at least several thousand, or a million, hands, and that sample should be sufficient. (One other possible way of cutting down the calculation might be to work backwards to find all winning games, i.e., the king is always played last, then another king, etc. But it would still likely take a long, long time.)

Depending on how good our human player is, that may not mean much to him except as an ultimate (and likely unattainable) goal, since winning play in some deals may depend on knowing cards that haven’t even been seen yet. So perhaps he’d rather be compared to a player with excellent strategy. Of course, what that strategy might be is what the game theorists are out to figure out, and as kesagiri points out there’s no ‘perfect’ strategy that can win every game. Although IIRC, the best strategy (most likely to win) is referred to as ‘optimal’, and perhaps there is an optimal strategy, to which a human player with some unknown strategy might be compared.

Chronos: There’s one strategy you overlooked for the three-card-flip game (probably because it doesn’t come up very often)…

If, while I’m working through the deck, I get to the second-to-last flip and remember that I’d extracted three cards earlier during this pass through the deck, I do not play the top card of that flip unless it uncovers a hidden (face-down) card (which I consider to be central to mid-game strategy).

My reasoning is that I can go through the deck another time seeing if there’s anything clearly better (which is possible as a result of the displacement caused by previous extractions). If so, I can use that and get the NEXT card of the penultimate flip, PLUS the next card of the final flip (assuming I hadn’t used it last time).

If, on the other hand, the additional pass through the deck doesn’t turn up anything worthwhile, I can use the top card of the penultimate flip (which will be exactly the same as it was before) and the card after that (and maybe after that).

If you have trouble following all that, don’t worry: it’s not going to make a huge difference. I’m merely demonstrating that there are some exceedingly subtle strategies out there.

Actually, it indexes a predefined game. If you search the web, you should be able to find a list of which of those games are impossible and which are extremely difficult (having nearly a unique solution). I think the link I saw once was on a Texas A&M math prof.’s site.

sdimbert, I am not sure about your percentage, but I bet if you had 50 monkeys and 50 Palm Pilots playing for infinity, you would eventually have some victories.

jmullaney’s correct: the game number is an index to a predefined list of deals (in the Windows version). Well, sort of. According to this site, the number is used as the seed of the Microsoft C compiler’s random number generator; the same number always gives the same deal. The individual deals were not carefully chosen beforehand, and are essentially as random as they could conveniently be made.

That site also goes on to say that all of the Microsoft deals except for one are winnable. Number 11982 is the only one that, to the knowledge of the page’s author, has never been solved by either a human or a computer. Documented solutions for all other MS deals exist, and I think the site has links to them.

The statement in the MS help file that “it is believed that all games are winnable” is complete BS. It’s quite trivial to construct an obviously unwinnable Freecell deal; one is shown here.

I can confirm that the number of the game in FreeCell is the seed to a random number generator. I have programmed several solitaire-like games for the HP 100LX and 200LX palmtop PCs, including FreeCell.

For the FreeCell game, I first came up with a six-digit game number, which would generate a million different games. Then later, when I got my hands on the Microsoft algorithm, I coded that in and - guess what - it works! Even though I did the game in Turbo Pascal instead of C, the algorithm ported right over. Now a user can choose between my million-deal march and the Microsoft 32K way.
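The Microsoft shuffle being described is widely documented elsewhere; here is a sketch of it in Python (the LCG constants and the card encoding are the commonly published ones, not something taken from this thread):

```python
def ms_deal(game):
    """Deal FreeCell game `game` using the widely documented Microsoft
    C runtime generator: state = state*214013 + 2531011 (mod 2**32),
    returning the low 15 bits of the high word."""
    state = game

    def rand():
        nonlocal state
        state = (state * 214013 + 2531011) & 0xFFFFFFFF
        return (state >> 16) & 0x7FFF

    # Card k encodes rank k//4 (0 = ace .. 12 = king), suit k%4 (C, D, H, S).
    deck = list(range(52))
    dealt = []
    while deck:
        j = rand() % len(deck)
        dealt.append(deck[j])
        deck[j] = deck[-1]  # swap the chosen card with the last one
        deck.pop()
    return dealt            # laid out row by row across the 8 columns

RANKS = "A23456789TJQK"
SUITS = "CDHS"
name = lambda c: RANKS[c // 4] + SUITS[c % 4]
print([name(c) for c in ms_deal(1)[:4]])
```

Because the published algorithm is deterministic, the same game number always reproduces the same deal, which is what lets lists of "impossible" game numbers exist at all.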

<< I disagree, for example look at the comment about if you are playing the three flips. Sometimes not playing a playable card is a better strategy. >>

But my question is, how often do these situations arise?

So, let’s start with some assumptions. I am going to define a game as WINNABLE if a standard strategy leads to a winning situation. One can certainly imagine games where there is a winning situation, but no one would ever know it without seeing all the cards (“Don’t play ANY cards at all until you get to the last card in the deck, and THEN play that one, and all the rest come falling into place.”)

So, my formulation of sdimbert’s problem is: given normal play, what is the expectation of success? That is, what is the ratio of winnable games to total games played?

From that perspective, I consider that these minor elements of strategy are just that – minor. They may alter the results by a few percentage points (games that could be winnable by esoteric play), but I’m calling that margin of error.

I am suggesting that we can test the likelihood of a win with empirical evidence: we can look at the ratio of wins amongst competent players. sdimbert’s result is based on a small number of games (a few hundred), and I suggest we compile results. If we had a few thousand games, with the number of wins and number of losses, we would have strong empirical evidence for the win ratio.

Again, I contend that the standard solitaire game is programmable with an optimal strategy; that there are a very small number of cases where decision-making comes into play; and that there is a way, way smaller number of cases where optimal strategic decision-making does not produce a WIN but some other play does. Thus, I believe that a Monte Carlo approach – play lots of games and record the results – will give us a reasonable estimate of the win ratio.