PDA

View Full Version : Math Gurus: Probability & Statistics of Solitaire?

sdimbert
09-14-2000, 12:50 AM
I have a solitaire game on my Palm Pilot. It's written by smallware.com, is freeware and works really well!

:sdimbert holds two thumbs up:

Anyway, I have been playing it for a couple of weeks. I noticed right away that it tracks basic statistics, and I want one of you math geniuses to think about this.

The game is "Klondike" - your basic, vanilla, 7 piles, play red on black or vice versa, build up from the Ace in the top 4 piles variety - and I want to know if anyone can figure:

Based on a standard deck of cards, if I play fair and straight, making no mistakes, what percentage of games should I win?

I ask because I have noticed that my winning percentage has been within a certain range since I started paying attention to it. The game counts games you've played and games you've won and tells you both along with your winning percentage (games won/games played.)

I have played over 150 games (a fairly significant sample size) and my winning percentage is currently 13% (22/154). I can't recall it ever being below 12% or above 14 or 15%.

Is it me, and the way I play, or is it.... in the cards?

Greg Charles
09-14-2000, 10:43 AM
It's not always possible to estimate probability when strategy is involved. Assuming you are playing the "flip three" variety of Klondike, you are at certain points better off not playing a playable card. Also if you have, for example, two black sevens covering piles and you've just played a red eight, which seven do you play on top of it?

You would have to make rigid assumptions about how you play in every possible situation. Then it would be possible to compute winning probabilities ... although I still wouldn't know how to do it. It'd be a bitch of a problem to solve!

sdimbert
09-14-2000, 10:46 AM
Originally posted by Greg Charles

It'd be a bitch of a problem to solve!

Which is why I posted it here!

:D

AWB
09-14-2000, 11:05 AM
I think for your computer program, it's a little of both.

My reasons:

Another popular solitaire game, Freecell, has a feature where you can select a game number. This number is between 1 and 32767, which happens to be the range of positive integers that can be represented in a 2-byte word in the computer. A number of random-number generators use a formula that has a factor of 32768 (2^15) built into it. So it seems to me that this game number is either a seed for a deck-shuffling algorithm, or an index to a pre-defined deck order. If it's the latter, they may have selected a set of deck orders that are either winnable or close to it, so that players get some sense of satisfaction with the game. If the game's too tough, people would quit playing it.

In some casinos, there is a Klondike solitaire table. You are given a deck of cards for $52, then are paid $2 for each card you place in the Ace piles (forgot the term for them). Usually in games like this, the payoff is set at only one half of the break-even rate; i.e., if it's possible to win 1 out of every 4 games of Klondike solitaire, it'd be break-even if they paid $4 per card. So they divide by 2 so that they're assured a profit.
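That break-even arithmetic can be sanity-checked with a quick calculation. The $52 buy-in and $2-per-card payout are from the description above; the crude win/lose model (which ignores cards placed in losing games) is my own simplifying assumption:

```python
# Vegas Klondike payout: pay $52 for the deck, earn $2 per card
# placed on the "ace" piles.
buy_in = 52
per_card = 2

# Break-even requires placing buy_in / per_card cards per game on average.
break_even_cards = buy_in / per_card    # 26 cards

def avg_cards_needed(win_rate, cards_when_losing=0):
    """Average foundation cards per game under a crude model:
    a win places all 52 cards, a loss places (by assumption) none."""
    return win_rate * 52 + (1 - win_rate) * cards_when_losing

# Under this crude model a 50% win rate breaks even at $2 per card,
# and a 25% win rate would break even at $4 per card -- matching the
# post's "half of break-even" reasoning.
print(break_even_cards)                  # 26.0
print(avg_cards_needed(0.5))             # 26.0
print(avg_cards_needed(0.25) * 4)        # 52.0
```

Losing games do place some cards, so the real break-even win rate is a bit lower than this model suggests.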

So going by the Vegas payout, a really good player can probably win 25% of Klondike games. A crummy one like me can only win 5%. Since you're up to 15%, I'd say you're halfway to going pro. :)

As to figuring the actual probability, that'd be quite a task. First, a really good Klondike-playing computer algorithm would need to be written. Then for each of the 52! possible deck arrangements, a game would be played against it. Even with a really fast computer, that'd take quite a while. (I imagine the probabilities that Vegas uses are from empirical statistics of the games played at their tables.)

C K Dexter Haven
09-14-2000, 11:31 AM
Agreeing, that actual calculation would be a bitch.

Further agreeing with all the little nuisances that could have an impact (choice of two black sevens, which one do you move?), however I venture to suggest that those situations are relatively rare, and will not have a major impact on the probable outcomes.

When dealing with bitchy probabilities, one of the ways of estimating is just by running a zillion trials and seeing how the distribution curve comes out.

We've got one sample so far, with 150 games at around 13%. Anyone else care to toss in their statistics? ... and then someone can compile.
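As an aside on that one sample: the usual binomial error bars on 22 wins in 154 games (the OP's figures) can be sketched like this, to see how much of the 12-15% wobble is plain sampling noise:

```python
import math

def win_rate_interval(wins, games, z=1.96):
    """Point estimate and approximate 95% normal interval for a win rate."""
    p = wins / games
    se = math.sqrt(p * (1 - p) / games)   # binomial standard error
    return p, p - z * se, p + z * se

p, lo, hi = win_rate_interval(22, 154)
# About 14.3%, with a 95% interval of roughly 8.8% to 19.8% -- so the
# observed 12-15% range is well within ordinary sampling noise.
print(round(p, 3), round(lo, 3), round(hi, 3))
```

A few thousand pooled games would shrink that interval to a couple of percentage points either side.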

AWB, the problem with the gambling set-up is that you can only go through the deck once: you pay $52 to play, and you get $5 (I thought) for each card on the 'ace' stacks. But you go through the deck one card at a time (not every third card) and you can only do it once. There's an alternative where you go through the deck every third card, and you can go through three times. But there's an unnatural restraint in that game that would cause some situations to be losers that would be winners if you could go through the deck again.

kesagiri
09-14-2000, 12:22 PM
Probability theory and statistics are not sufficient to solve this problem. To solve it you would need to address the problem from a game theoretic view. This would allow you to look at the effects of different strategies of play.

C K Dexter Haven
09-14-2000, 12:41 PM
I agree in theory, Kesagiri. But I think the amount of "strategy" involved in standard solitaire is minimal (if not negligible). The number of times that one doesn't make the obvious play is small, very small; certainly not as many as one such decision per round.

Thus, if we were talking about trying to beat the odds at the casino, where a fraction of a percent can make a huge difference, I would agree with you entirely. But we're not talking that, we're just asking what proportion of games are winnable, roughly; and if we could comfortably say, 15% to 20%, that would be close enough.

So, I think the game is (within epsilon) deterministic, and esoteric bits of strategy needn't concern us. Once the cards are set, we could have a computer play for us (with a few decision trees on what to do in case of two plays being available.)

AWB
09-14-2000, 12:46 PM
Originally posted by CKDextHavn
AWB, the problem with the gambling set-up is that you can only go through the deck once: you pay $52 to play, and you get $5 (I thought) for each card on the 'ace' stacks. But you go through the deck one card at a time (not every third card) and you can only do it once. There's an alternative where you go through the deck every third card, and you can go through three times. But there's an unnatural restraint in that game that would cause some situations to be losers that would be winners if you could go through the deck again.

With the standard Windows Solitaire game, the "Vegas" setup lets you go through the deck 3 times, but three cards at a time. This is how my mom played it in Vegas. And its payoff was only $2 per card after paying $52 for the deck. Even me, as crappy as I play, could probably get at least 11 cards up often enough to make $5 per card profitable.

Chronos
09-14-2000, 12:51 PM
Can we at least start by agreeing on what constitutes "perfect strategy"? The decisions I know of are the following:

1. Given two cards to place on a newly-played card, which to choose?
2. Given the opportunity to move a card up onto an ace stack, move it or leave it in place to stack down from?
3. Given the opportunity to move a card from the deck to a stack, place it or save the space for another card?

My answers to these are the following:

1. Always place the card with the most hidden cards under it. If both options have the same number of cards below, then the decision is arbitrary. The only exception is when I have a surplus of kings covering hidden cards, and need to make room for them. I'm not sure how strong this exception should be.
2. Always move a card of rank n to the aces if all cards of rank (n-2) of the opposite color, and the other card of the same color of rank (n-3), have already been placed on the aces. Otherwise, never move a card onto the aces unless this enables some other play at the time which can bring a card into play or reveal a hidden card. When such an opportunity does occur, always move the card to the ace. I know that this is less than optimal, but I've never been able to figure out how to do it better.
3. Unless I know definitely of cards later in the deck which I could better use in a space, I always take the opportunity.

Can anyone point out anything that I missed, and does this help anyone in making simulations/calculations?
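Chronos's rule for when it is "safe" to send a card up to the aces can be written down mechanically. A sketch, with the rule exactly as stated above; the suit letters and the foundation representation are my own (rank 1 = ace, 13 = king):

```python
# Foundations: dict suit -> highest rank already played up (0 if empty).
# Suits 'H' and 'D' are red; 'S' and 'C' are black.
RED, BLACK = {'H', 'D'}, {'S', 'C'}

def safe_to_play_up(rank, suit, foundations):
    """Chronos's rule: a card of rank n is safe to send to the aces once
    both opposite-color cards of rank (n-2) and the other same-color
    card of rank (n-3) are already up.  Aces and twos come out safe
    automatically, since the thresholds are then trivially met."""
    same = RED if suit in RED else BLACK
    opposite = BLACK if suit in RED else RED
    other_same_color_suit = next(s for s in same if s != suit)
    return (all(foundations[s] >= rank - 2 for s in opposite)
            and foundations[other_same_color_suit] >= rank - 3)

f = {'H': 2, 'D': 1, 'S': 3, 'C': 3}
print(safe_to_play_up(4, 'H', f))  # True: both black 2s and the diamond ace are up
print(safe_to_play_up(5, 'S', f))  # False: the red 3s are not both up
```

A simulation could call this predicate at every opportunity to move a card up, instead of hand-waving about "obvious" plays.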

Joe Malik
09-14-2000, 12:59 PM
You cannot tractably determine the probability of winning solitaire using analytical means. In probabilistic game theory, each decision point usually requires a multiplication and division by a factorial. Solitaire starts off with seven such decision points (each of the seven cards might or might not be playable), and contains at least 24 more. More than a few factorial expressions quickly become incomputable even in computer time.

One might analytically determine a simple probability, such as the odds of at least one legal move existing at the start of the game, but the formula even for such a simple question would be quite complex.

The only tractable way to solve the problem is to run a Monte Carlo simulation.
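The simpler sub-question mentioned above -- the odds of at least one legal move existing at the start of the game -- is exactly the kind of thing a Monte Carlo run answers quickly. A sketch, under a deliberately narrow definition of "move" that is my own simplification (only the seven face-up tableau cards; no deck flips, so the true figure is higher still):

```python
import random

# Cards as (rank, color): rank 1..13 (ace..king), color 0 = red, 1 = black.
# Two cards of each rank/color, 52 in all; suits don't matter here.
DECK = [(r, c) for r in range(1, 14) for c in (0, 0, 1, 1)]

def opening_has_move(faceup):
    """Any ace showing, or any face-up card placeable on another
    (one rank lower, opposite color)?"""
    if any(r == 1 for r, _ in faceup):
        return True
    return any(r1 == r2 - 1 and c1 != c2
               for r1, c1 in faceup for r2, c2 in faceup)

def estimate(trials=100_000, rng=random.Random(42)):
    hits = 0
    for _ in range(trials):
        deal = rng.sample(DECK, 7)    # the seven face-up tableau cards
        hits += opening_has_move(deal)
    return hits / trials

# Well over half of random deals show an immediate play even under
# this narrow definition.
print(round(estimate(), 3))
```

Estimating the full win rate works the same way; the only hard part is replacing `opening_has_move` with a complete game-playing strategy.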

kesagiri
09-14-2000, 01:14 PM
CKDextHavn,

I disagree; for example, look at the comment about playing the three-card flip. Sometimes not playing a playable card is a better strategy. This is why you might want to consider a game-theoretic approach, as it would include a decision-making element in the game. Different strategies would result in different outcomes.

Suppose we have two different strategies and we know from prior experience that on average people win 7% of the games. Now if you have two people each playing a different strategy, then it is possible for one guy to win 5% of the time (assuming he plays a large number of times) and the other guy 9% of the time. One player might have a more sophisticated strategy than the other. An interesting thing to do would be to see if sdimbert's winning percentage increases (i.e. he is learning better strategies) the more he plays. Too bad we can't go back and see some sort of time series on his winning percentage.

Chronos,

In game theory there is no such thing as a perfect strategy. That is why there are so many refinements to equilibria (sub-game perfect equilibria, sequential equilibria, etc.).

sdimbert
09-14-2000, 01:48 PM
First of all, thank you all for your comments... keep 'em coming!

Joe Malik:

You cannot tractably determine the probability of winning solitare using analytical means. In probabilistic game theory each decision point usually requires a multiplication and division by a factorial. Solitare starts off with seven such decision points... More than a few factorial expressions quickly become incomputable even in computer time.

I am not much of a mathematician, but this doesn't seem right to me. I mean, a Game-Boy can play chess, you know? Certainly there are more "decision points" in chess than there are in Klondike!

This seems to be just the sort of thing computers do best!

Oh, and if anyone cares, I am now at 22/160. That is 13%.

panamajack
09-14-2000, 03:44 PM
I think it should be made clear what exactly we're trying to find out about the game. Are we comparing sdimbert to a perfect player (one who will always win if it's possible) or to a 'good' player that uses good strategy?

Usually when the probabilities are given for solitaire games, they are considered to be the ratio of winnable deals to unwinnable deals. And it's true that you won't get the answer in this lifetime if you try to calculate every possible hand vs. every possible winning hand (even if you could calculate winnability in half a nanosecond, it would be several yotta-yotta-years before you'd have the answer). But it's probably been done for at least several thousand, or million, hands, and that sample should be sufficient. (One other possible way of cutting down the calculation might be to work backwards to find all winning games, i.e. the king is always played last, then another king, etc. But it would still likely take a long, long time.)
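The "yotta-yotta-years" figure actually checks out, assuming (very generously) one deal resolved per nanosecond:

```python
import math

deals = math.factorial(52)             # 52! ~ 8.07e67 distinct shuffles
per_second = 1e9                       # one deal resolved per nanosecond
seconds_per_year = 365.25 * 24 * 3600

years = deals / per_second / seconds_per_year
print(f"{years:.2e}")                  # ~2.6e51 years

# A yotta is 1e24, so a yotta-yotta is 1e48: a few thousand
# yotta-yotta-years, just as claimed.
print(round(years / 1e48))
```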

Depending on how good our human player is, that may not mean much to him except as an ultimate (and likely unattainable) goal, since winning play in some deals may depend on knowing cards that haven't even been seen yet. So perhaps he'd rather be compared to a player with excellent strategy. Of course, what that strategy might be is what the game theorists are out to figure out, and as kesagiri points out there's no 'perfect' strategy that can win every game. Although IIRC, the best strategy (most likely to win) is referred to as 'optimal', and perhaps there is an optimal strategy, to which a human player with some unknown strategy might be compared.

panama jack

____________

I just like saying yotta-yotta-years.

Timothy Campbell
09-14-2000, 04:15 PM
Chronos: There's one strategy you overlooked for the three-card-flip game (probably because it doesn't come up very often)...

If, while I'm working through the deck, I get to the second-to-last flip and remember that I'd extracted three cards earlier during this pass through the deck, I do not play the top card of that flip unless it uncovers a hidden (face-down) card (which I consider to be central to mid-game strategy).

My reasoning is that I can go through the deck another time seeing if there's anything clearly better (which is possible as a result of the displacement caused by previous extractions). If so, I can use that and get the NEXT card of the penultimate flip, PLUS the next card of the final flip (assuming I hadn't used it last time).

If, on the other hand, the additional pass through the deck doesn't turn up anything worthwhile, I can use the top card of the penultimate flip (which will be exactly the same as it was before) and the card after that (and maybe after that).

If you have trouble following all that, don't worry: it's not going to make a huge difference. I'm merely demonstrating that there are some exceedingly subtle strategies out there.

jmullaney
09-14-2000, 04:38 PM
Originally posted by AWB
So it seems to me that this game number is either a seed for a deck shuffling algorithm, or an index to a pre-defined deck order. If it's the latter, they may have selected a set of deck orders that are either winnable or close to it, so as the players get some sense of satisfaction with the game. If the game's too tough, people would quit playing it.

Actually, it indexes a predefined game. If you search the web, you should be able to find a list of which of those games are impossible and which are extremely difficult (near one unique solution). I think the link I saw once was on a Texas A&M math prof.'s site.

Bear_Nenno
09-14-2000, 05:53 PM
sdimbert, I am not sure about your percentage, but I bet if you had 50 monkeys and 50 Palm Pilots playing for infinity, you would eventually have some victories. :)

09-14-2000, 06:15 PM
jmullaney's correct: the game number is an index to a predefined list of deals (in the Windows version). Well, sort of. According to this site (http://members.aol.com/wgreview/fcfaq.html), the number is used as the seed of the Microsoft C compiler's random number generator; the same number always gives the same deal. The individual deals were not carefully chosen beforehand, and are essentially as random as they could conveniently be made.

That site also goes on to say that all of the Microsoft deals except for one are winnable. Number 11982 is the only one that, to the knowledge of the page's author, has never been solved by either a human or a computer. Documented solutions for all other MS deals exist, and I think the site has links to them.

The statement in the MS help file that "it is believed that all games are winnable" is complete BS. It's quite trivial to construct an obviously unwinnable Freecell deal; one is shown here (http://www.cs.ruu.nl/~hansb/d.freecell/node2.html#SECTION0002).

sdimbert
09-14-2000, 06:50 PM
This is my thread, damnit! I want it to be about math, not solitaire!!!

OK... I calmed down. :)

Seriously though, did I really stump the Math Geniuses around here?

CurtC
09-14-2000, 10:39 PM
I can confirm that the number of the game in FreeCell is the seed to a random number generator. I have programmed several solitaire-like games for the HP 100LX and 200LX palmtop PC's, including FreeCell.

For the FreeCell game, I first came up with a six-digit game number, which would generate a million different games. Then later, when I got my hands on the Microsoft algorithm, I coded that in and - guess what - it works! Even though I did the game in Turbo Pascal instead of C, the algorithm ported right over. Now a user can choose between my million-deal march and the Microsoft 32K way.
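For the curious, the algorithm CurtC refers to has been widely reverse-engineered. A sketch using the commonly cited reconstruction -- the Microsoft C runtime rand() LCG feeding a selection-style shuffle; treat the constants and the exact dealing order as assumptions drawn from that folklore reconstruction rather than official documentation:

```python
def ms_rand(seed):
    """Microsoft C runtime rand(): a linear congruential generator whose
    output is 15 bits -- which is why game numbers run 1..32767 (2**15 - 1)."""
    state = seed
    while True:
        state = (state * 214013 + 2531011) & 0x7FFFFFFF
        yield (state >> 16) & 0x7FFF

def deal(game_number):
    """Deal 52 cards for a given game number: repeatedly pick a random
    remaining card, then move the last remaining card into its slot."""
    rnd = ms_rand(game_number)
    cards = list(range(52))            # 0..51 encode rank and suit
    dealt = []
    for remaining in range(52, 0, -1):
        i = next(rnd) % remaining
        dealt.append(cards[i])
        cards[i] = cards[remaining - 1]
    return dealt

# The same game number always reproduces the same deal, which is why
# the algorithm ported cleanly between compilers and languages.
print(deal(1) == deal(1))   # True
print(deal(1) != deal(2))   # True
```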

C K Dexter Haven
09-15-2000, 08:15 AM
<< I disagree, for example look at the comment about if you are playing the three flips. Sometimes not playing a playable card is a better strategy. >>

But my question is, how often do these situations arise?

So, let's start with some assumptions. I am going to define a game as WINNABLE if a standard strategy leads to a winning situation. One can certainly imagine games where there is a winning situation, but no one would ever know it without seeing all the cards ("Don't play ANY cards at all until you get to the last card in the deck, and THEN play that one, and all the rest come falling into place.")

So, my formulation of sdimbert's problem is: given normal play, what is the expectation of success? That is, what is the ratio of winnable games to total games played?

From that perspective, I consider that these minor elements of strategy are just that -- minor. They may alter the results by a few percentage points (games that could be winnable by esoteric play), but I'm calling that margin of error.

I am suggesting that we can test out the likelihood of a win by empirical evidence: we can look at the ratio of wins amongst competent players. Sdimbert's results are based on a small number of games (a few hundred), and I suggest we compile results. If we had a few thousand games, number of wins, number of losses, we would have strong empirical evidence for the win-ratio.

Again, I contend that the standard solitaire game is programmable with an optimal strategy; that there are a very small number of cases where decision-making comes into play; and that there is a way, way smaller number of cases where optimally strategized decision-making does not produce a WIN, but some other play does. Thus, I believe that a Monte Carlo approach -- play lots of games and record the results -- will give us a reasonable estimate of the win-ratio.

hawthorne
09-15-2000, 09:24 AM
I'm not sure that the difference between playing the percentages and an upper bound for wins is at all trivial. The question of whether to move a king from the deck, or a king from a column containing cards, to a vacant column is pretty tricky early in the game, when you don't know how many aces are in the deck or whether you will get any other chances to cycle through the deck.

My only conclusions so far: (i) this problem is a nightmare. I can't even think how to begin to set it up. (ii) sdimbert's winning fraction is pretty damn high.

picmr

sdimbert
09-15-2000, 09:41 AM
Originally posted by CKDextHavn
...my formulation of sdimbert's problem is: given normal play, what is the expectation of success? That is, what is the ratio of winnable games to total games played?

From that perspective, I consider that these minor elements of strategy are just that -- minor. They may alter the results by a few percentage points (games that could be winnable by esoteric play), but I'm calling that margin of error.

Dex, I think that you are on to something here.

I am suggesting that we can test out the likelihood of a win by empirical evidence: we can look at the ratio of wins amongst competent players. Sdimbert's results are based on a small number of games (a few hundred), and I suggest we compile results. If we had a few thousand games, number of wins, number of losses, we would have strong empirical evidence for the win-ratio.

FWIW, I actually have access to two Palm Pilots and play this game on both. I have already posted my first set of results. My second set of stats is 16/110 for a winning percentage of 13%.

picmr,

Thanks! :)

CurtC
09-15-2000, 11:25 AM
My winning percentage on the 200LX palmtop is around 11%. However, it does depend on what rules you play. I've found that there are many situations where I could win if I did a certain thing which I consider to be cheating, so I don't do it.

The Microsoft version, and the version I wrote for the 200LX, will allow you to move a partial stack of cards to another stack. In other words, if one stack contains 9-8-7-6-5, and you need to get to that 8, it will let you move the 7-6-5 to another 8. I consider this cheating - either you move the whole stack or nothing.

sdimbert
09-15-2000, 11:29 AM
Curt,

I don't think that the move you describe is cheating, according to Hoyle.

And, No, I am not going to provide a cite. I'm too tired.

kesagiri
09-15-2000, 12:43 PM
Dex,

Sorry but I still disagree.

Here is sdimbert's original question:

Based on a standard deck of cards, if I play fair and straight, making no mistakes, what percentage of games should I win?

So his strategy is important, especially if it is true that the average is only 5%. If that is true, your margin of error is unacceptable. With his current strategy we do have an estimate of how often he wins: 13%. However, we have not factored in such things as making no mistakes and not cheating. I think what he is looking for is: given his strategy, what percentage would he win if he played it perfectly? He is trying to find out how well he is doing relative to a perfect player using his strategy. By ignoring strategy you won't be able to answer his question.

C K Dexter Haven
09-15-2000, 01:40 PM
I'm not saying ignore strategy. Well, OK, so I am. But I'm saying it in context. Bear with me here.

I'm saying that the winning ratio is (presumably) the number of all games that wind up a WIN (all cards in the aces pile) divided by the number of all possible games. We could determine the number of all possible games fairly easily; it's just straight probability: how many arrangements of the deck are there. Now comes determining the number of all games that end up a WIN.

We first need to define whether a game is a WIN or not. Why is that a problem? Well, for example, if a hundred people play the same game and all 100 people win, we'd call that a WIN. Right? But suppose only 90 people win -- the other people didn't move the right red 8 to the black 9. Or only 50? or only 10? Or only 1?

Suppose the game is set up that no actual human player of the 100 won, but if you could see all the cards, you could determine an illogical but winning strategy?

So how do we count the number of games where you win, if we don't even know what constitutes a winning game?

Now, my main hypothesis: for MOST games, out of 100 plays, there will be either 100 wins or 100 losses (OK, 99, 'cause some jerk won't even SEE the black 9.) Let's not quibble, you know what I mean here: my hypothesis is that standard play (perhaps with an algorithm for deciding what to do in certain ambiguous cases) will result in a WIN or in a LOSS. My sub-hypothesis is that the number of games where "strategy" is important, where two different players can get two different results, is small. (NOTE: This is NOT true for Freecell, because there are LOTS of different moves available. But for normal straight solitaire, it's normally mindless, there aren't a lot of choices of moves. Freecell claims that EVERY game can be won if you're clever enough.)

So, bear with me. HYPOTHESIS: The number of games where different players get different results is small. OK? Now, IF my hypothesis is true, THEN we can approximate the percent of games ending in wins by sampling. That's called the Monte Carlo method: playing lots and lots of games and seeing what the empirical results are. OK so far?

IF my hypothesis (that the number of games that would yield different results for different players is small) is false, then we're stuck. There's no way to categorize games as wins or losses, and no way to answer the question. It now becomes like predicting the outcome in a game of chess. And, by the way, that leaves us with only one approach -- again, to take samples, and hope we can get some coherent results. I'm sure that people know the approximate number of times that an expert chess game ends in a draw, even if they have no idea what the true underlying probability/mathematics are.

So, whether my hypothesis is true or false, I'm still left with the idea of using a sampling approach to estimate the probability. QED.

sdimbert
09-15-2000, 02:30 PM
I am not sure if this matters or not, but I don't really pay much attention when I play solitaire.

I mean, I look at the cards and think about my moves, but, out of the (almost) 300 games I've logged, I don't think that I have been "tricky" more than 4 or 5 times.

By "tricky" I mean skipping a legal play in order to allow some other, better play to come about.

Usually, I just slog through it, starting over whenever I lose, because I know it is just a matter of time until I win.

So, have I proven that a simple computer could win 12 - 15% of the time?

Or is it just that only 12-15% of the hands dealt randomly are winnable?

kesagiri
09-15-2000, 02:32 PM
We first need to define whether a game is a WIN or not. Why is that a problem? Well, for example, if a hundred people play the same game and all 100 people win, we'd call that a WIN. Right? But suppose only 90 people win -- the other people didn't move the right red 8 to the black 9. Or only 50? or only 10? Or only 1?

Thanks for making my point. :D

Your concept of winning depends on strategy. Now if we ignore strategy we could state that on average x% of the games will be won. However, for a sophisticated player who has been playing for some time this would be an underestimate IMO, and an overestimate for those just trying out the game the first time.

Let me try to reword what I think sdimbert's question is (he can correct me if I am wrong). The way I read it he is wondering how he is doing against the perfect player who is using his strategy. That is a player who has perfect recall and has the ability to see all the options. That is, how does he stack up against this "super-human" sdimbert. Not how is he doing against everyone else (although maybe that is what he is interested in).

It now becomes like predicting the outcome in a game of chess.

IIRC chess is solvable via backward induction. The problem is that nobody can see all the possible moves and hence can't actually solve the problem.

sdimbert
09-15-2000, 03:05 PM
Originally posted by kesagiri
Let me try to reword what I think sdimbert's
question is (he can correct me if I am wrong). The way I read it he is wondering how he is doing against the perfect player who is using his strategy. That is a player who has perfect recall and has the ability to see all the options.

Ummmmmmmm... no.

Sorry, kesagiri, but that's not what I meant at all. What's the use of creating a construct who can "see all the options?" That's just silly.

I think we can all agree that there exist some deals from a random deck which are unwinnable by any player of any level of skill or sophistication.

Once that is stipulated, we can turn our attention to the more important (and more complicated) question. How many such deals exist?

I mean, this isn't a very tough question, is it? I'm not a math person, but how many different ways are there to arrange 52 cards? Isn't the answer 52!, as someone pointed out a while ago? If I understand what a factorial is, then 52! = 80,658,175,170,943,878,571,660,636,856,403,766,975,289,505,440,883,277,824,000,000,000,000.

Or, 8.06581751709439E+67, if you prefer. (That is a *big* number.)

So, now that we have established a finite number of deck combinations, wouldn't the number of possible "deals" be the same? I think so. If so, we have established a finite number of possible "starting deals." 15% of that number is: 1.20987E+67. Still a big-ass number, but much smaller than the previous one.

I am not sure what my point here is, but I think that the magnitude of these numbers supports Dex's assertion that strategy is a negligible factor in considering the original question.

So... where is my mistake?

kesagiri
09-15-2000, 03:49 PM
sdimbert,

Thanks, I guess you didn't mean

Based on a standard deck of cards, if I play fair and straight, making no mistakes, what percentage of games should I win?

But actually meant "what percentage of the games are winnable on average ignoring strategy."

The 'making no mistakes' I took to mean that you had perfect recall (i.e. you would know when to play a playable card and when not to) and also where to place a playable card when given the choice. (In practice you'd use some sort of probabilistic rule based on past games, probably a perfect Bayesian strategy or something like that.)

Given the number of games the computer has in its memory, I don't think the possible number of deals is 52!. See Brad's link; there are 32,000 deals. Of course, even with this number of deals it might be quite difficult to calculate how a given strategy would perform.

Think of it this way. Suppose there is only one deal (no 52! or 32,000) then looking at strategy wouldn't be that tough IMO.

sdimbert
09-15-2000, 04:25 PM
I think I might be miffed at having someone tell me what I meant. I'm not sure...

Anyhow, we still haven't answered the question I asked. Look back up to the OP and read it again. The whole thing. The question is in the last paragraph.

What I was really asking is, do other people win significantly more (or less) than 12% - 15% of the time?

Based on what was said so far, I would guess that Dex would say no, relying on the assertion that strategy does not have an appreciable effect in a game as simple as Klondike. It seems kesagiri swings the other way.

I am still up in the air.

Anyone else play a lot of solitaire? We need more data!

09-15-2000, 04:42 PM
Originally posted by kesagiri
Given the number of games the computer has in its memory, I don't think the possible number of deals is 52!. See Brad's link; there are 32,000 deals. Of course, even with this number of deals it might be quite difficult to calculate how a given strategy would perform.
Kesagiri, the link I got that off of was discussing Freecell, not Klondike.

I haven't found any info on how MS Klondike gets its deals, or how many possibilities it restricts itself to. It wouldn't surprise me if it was very similar to Freecell, though. Maybe somebody else knows....

kesagiri
09-15-2000, 04:53 PM
Is it me, and the way I play, or is it.... in the cards?

Oh yes, I see "...the way I play..." doesn't refer to strategy. How silly of me. Sheesh.

What I was really asking is, do other people win significantly more (or less) than 12% - 15% of the time?

Nope, strategy doesn't affect one's chances of winning a game...nope.[/sarcasm]

sdimbert
09-15-2000, 04:58 PM
Originally posted by kesagiri
Nope, strategy doesn't affect one's chances of winning a game...nope.[/sarcasm]

Hey, do us all a favor, OK? Thicken the skin a bit and leave the sarcasm at home. :rolleyes:

The question you have so easily dismissed is exactly what I was asking. There is no doubt in my mind that there are many people who are better solitaire players than I am. What I want to know is if their additional skill/strategy/know-how translates into a significant increase in winning percentage or not?

Now, Dex (and others) have theorized that the answer is no. You claim otherwise. So, prove it.

kesagiri
09-15-2000, 06:52 PM
Dex seems to have concluded a priori that the answer is no. I, however, think he is not recognizing what a strategy is. For example:

Now, my main hypothesis: for MOST games, out of 100 plays, there will be either 100 wins or 100 losses (OK, 99, 'cause some jerk won't even SEE the black 9.) Let's not quibble, you know what I mean here: my hypothesis is that standard play (perhaps with an algorithm for deciding what to do in certain ambiguous cases) will result in a WIN or in a LOSS.

Wouldn't his algorithm correspond to a strategy? Couldn't different algorithms make different "decisions" based on different criteria? It seems that way to me.

Further, he is limiting himself to the case where, for a given deal, all players either win the game or all lose it. I contend that different strategies will have different success rates; his test couldn't detect this, since it excludes the cases where some people win and some don't.

So, bear with me. HYPOTHESIS: The number of games where different players get different results is small. OK?

No, not okay. This is not a hypothesis but an assumption. You have defined winning in such a way as to exclude the case where some players win and others lose.

IF my hypothesis (that the number of games that would yield different results for different players is small) is false, then we're stuck. There's no way to categorize games as wins or losses, and no way to answer the question.

Maybe not. How about defining winning given a strategy? That is, a game is winnable with strategy S1 but not with strategy S2. This would allow us to test the hypothesis. Of course, we'd have to try to figure out when a player makes a bad choice out of carelessness vs. strategy. A computer simulation might help here.

Chronos
09-15-2000, 09:18 PM
There are most definitely situations where I've won a game that I would have lost, had I not used my strategy, and games where I lost that I would have won, had I had access to, or recall of, a greater amount of information. Further, these games seem to occur more frequently than one in a hundred (although I've never counted), and would thus be a change of more than one percentage point.

As for defining deals as "win" or "loss", let's consider a hypothetical computer (hypothetical because this would probably take a real computer longer than the lifetime of the Universe), that remembers the position of every card which it has seen, and based on all available information, computes the probability of winning, for any given possible move at each decision point. I think that we can safely say that this computer employs perfect strategy. Game theory doesn't even come into play, since there is no intelligent opponent basing decisions on our actions.
Now, we take our perfect computer, and let it try all of the 52! possible deals, and record when it wins and loses. We now have an absolute number of games that will be won with perfect strategy, and hence a win probability. This number may be extremely difficult (read: impossible) to compute, but it is possible to define it.
Now, for the third step: We write a real computer program that uses a set of heuristics such as those I listed to play as well as we can design it, to (hopefully) approximate the hypothetical perfect strategy. We then run this program on a thousand, or a million, or 2^15, or however many random deals, to approximate the 52! possibilities, and count the results. Further, we could write several different strategies for our computer, including a mindless one (always make a move if possible, and when more than one is possible, choose one randomly), and compare the performance of the various algorithms, and thus test Dex's claim that strategy is insignificant. This problem is by no means intractable, especially to any programmers who have already written a Solitaire game.
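The scale Chronos describes can be put in rough numbers. Here is a minimal sketch (Python, mine, not from the thread) of why exhaustive enumeration is hopeless but sampling works fine: 52! is astronomically large, yet the standard error of an estimated win rate shrinks as 1/sqrt(n), so even a modest random sample pins the rate down well.

```python
import math

# The deal space: 52! possible orderings of the deck -- enumeration is hopeless.
total_deals = math.factorial(52)   # roughly 8.07e67

# Standard error of an observed win proportion p over n independent deals.
def stderr(p, n):
    return math.sqrt(p * (1 - p) / n)

# sdimbert's 22 wins in 154 games:
p_hat = 22 / 154
se = stderr(p_hat, 154)                  # about 0.028, i.e. +/- 2.8 percentage points

# A million simulated deals, as suggested above:
se_million = stderr(p_hat, 1_000_000)    # under 0.0004 -- three decimal places
```

So 154 hand-played games only locate the true rate within roughly five or six percentage points either way, which is one reason the hand-collected statistics in this thread are hard to compare; a computer sample of a million deals would not have that problem.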

Manlob
09-16-2000, 01:04 AM
I wrote a program using a simple strategy to play solitaire. It does not allow splitting of a sequence once it is made. I had it play one million games with each of three different methods of dealing from the hand: 3 cards at a time (7.55% wins), 2 cards at a time (15.96%) and 1 card at a time (29.45%).

The usual method of playing is dealing 3 cards at a time, but by looking at all the cards in the hand and very careful planning of moves based on the order of those cards, it should be possible to increase the winning percentage, though not beyond the 1-at-a-time figure. The one-at-a-time method gives full access to every card in the hand, since you are allowed to go through the hand as many times as you want (unlike Vegas rules). Plotting when to use playable cards with the 3-at-a-time method greatly increases the accessibility of the cards in the hand, but doesn't give full access, so it shouldn't be quite as good as one-at-a-time dealing.

kesagiri
09-16-2000, 01:39 PM
Game theory doesn't even come into play, since there is no intelligent opponent basing decisions on our actions.

Hmmm, well it certainly isn't non-cooperative game theory, but I think you could still use game theory's structure to help look at this problem.

As for your computer, I was thinking more in probabilistic terms. That is, given what the computer knows, when it moves the last card from one of the seven stacks to one of the ace stacks, what is the probability that the card uncovered is going to be, for example, a red king?

That is we have

Prob(Red King|I)

where I is the information available at that time.

Then once we observe the new card we can revise the above probability using Bayes Theorem.

There have been some interesting applications of Bayesian learning to junk e-mail filters. My intuition tells me something similar might work here.
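The Prob(Red King|I) above is easy to compute when the information I is just the set of cards already seen: every unseen card is equally likely to be the next one exposed, so the Bayesian update reduces to counting. A minimal sketch (Python; the two-character card labels and the particular "seen" set are my own illustration, not from the thread):

```python
from fractions import Fraction

def prob_next(targets, seen, deck_size=52):
    """P(the next unseen card is one of `targets`), given the set of cards
    already observed.  Counting does the Bayesian update for us: each of
    the unseen cards is equally likely to turn up next."""
    unseen = deck_size - len(seen)
    live = [t for t in targets if t not in seen]
    return Fraction(len(live), unseen)

red_kings = {"KH", "KD"}

# Before any card has been seen:
prob_next(red_kings, set())            # 2/52 = 1/26

# After ten cards have gone by, one of them the king of hearts:
seen = {"KH", "2C", "9S", "4D", "7H", "AS", "3C", "8D", "JC", "5S"}
prob_next(red_kings, seen)             # 1/42
```

Each new observation just shrinks the denominator (and sometimes the numerator), which is the revision kesagiri describes.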

sdimbert
09-16-2000, 09:27 PM
[b]Chronos[/b]: I am confused by your comments. You seem to be saying, on the one hand, that writing a program to play solitaire (or building a computer to run said program) is impossible, while on the other hand, you say that it can be done. I am sure that the fault is mine - I am misunderstanding you. I just don't see where.

[b]Manlob[/b]: Your program seems to be lousier than even I am, since I average 15% playing three-at-a-time.

[b]kesagiri[/b]: You have officially moved over my head. Rock on! Just keep the rest of us idjits posted and try to use small words when possible. :)

Chronos
09-17-2000, 03:37 PM
Sorry about that, sdimbert. What I meant was that it's practically impossible (but still theoretically possible) to build a computer to play perfect solitaire, but that it should be easy to make a computer that's just pretty darn good, but imperfect.

[b]kesagiri[/b], that may well work, but I didn't realize that that was considered game theory. Consider me corrected.

sdimbert
09-17-2000, 05:34 PM
Chronos, I guess lousy code is contagious! :D

C K Dexter Haven
09-18-2000, 11:52 AM
<< Kesagiri, the link I got that off of was discussing Freecell, not Klondike. >>

And those are very different; in Freecell, there are four columns of six cards and four columns of seven cards; if the first column and the fourth column are interchanged, it's the same deal. Also, I'm not sure whether that number of deals just means Freecell has set up a fixed number of hands arbitrarily. I don't think that's all possible hands if you were dealing the deck yourself; that's just the selected numbered hands that the computer-game authors allow, so that you can replay a hand by recalling the number of that hand. (The game instructions for Freecell claim that all games are winnable.)

But, of course, in Freecell, you can see all the cards. You know when to make bizarre plays. In solitaire, you can't see all the cards, and therefore you don't know (most of the time) when to make a bizarre play.

Kesagiri: << Wouldn't his algorithm correspond to a strategy? >>

Yeah, OK, I can see I was using sloppy language. And I think our conversation may be mostly semantics: I was distinguishing between the algorithm and a strategy. What I meant is that, generally speaking, the rules of solitaire totally dictate the play. You turn over a card, there is either a legitimate play or there isn't. And usually there is only one legitimate play, not two or three or six. Thus, you can play "mindlessly" or write a simple computer algorithm that makes each play.

If you wanna call that a strategy, OK. I've been thinking of that as the rules of the game, not the strategy. Example: in chess, the pieces move a certain way. Those are the rules. When you sit down to play chess, the opening move by white could be any one of about 20 moves. When you sit down to play solitaire, the opening moves are pretty much set. There's very little choice. It would normally be considered "weird" not to make a play, and there is normally only one way to make a play. I'm saying there is a sort of algorithm to playing solitaire, dictated by the rules, that leaves a limited number of moves.

I've been using the term "strategy" to mean a decision point at which the play of cards is not mindless. The algorithm does not cover all cases. We'd already talked about situation where there are two black 9's on which to play the red 8, and similar.

When I've been using the term "strategy", I have not meant the game-theoretic use of the word but the common use. The strategy that says, try to uncover the largest pile of unknown cards first; or hold up a play to the ace pile because blah blah. I confess, I didn't even think of having memorized the order of the deck as you sort through, so that you would know whether to make an "unusual" play. My assumption is that at almost every moment there is a "usual" play; unlike chess, where there are a small number of times when there is a "usual" or "required" play. The fact that one can even think of a play as "usual" or "unusual" is confirmation of what I'm talking about.

My assumption is still that "unusual" plays will not affect the overall winnability of solitaire. If you have two competent players, playing the same games, I argue that they will tend to win or lose in synch; one player will not have an advantage over the other because of thinking deeper. And that's the way I read what sdim was asking.

That is, the basic competent approach to play will include things like trying to get through the largest piles first (because they'll have the highest probability of having the most usable cards), etc. I think that some of the casino games REQUIRE that a card be played on the ace pile if it can be. And my hypothesis is that added degrees of subtlety to this basic approach will not impact the outcomes.

If you will: in chess, added degrees of subtlety clearly affect the outcome. In tic-tac-toe, if both players know the basic strategy/algorithm, added degrees of subtlety won't help; the winning probability is zero (all games end in ties). I contend solitaire is closer to tic-tac-toe in that regard than to chess.

Now the easy way to test my hypothesis is to get some others to play and tell us their statistics. (I don't have solitaire on my computer at work, so I can't help, and I don't know if my home game keeps stats.) Manlob's made a start, but those were computer runs.

I think Sdim has given us the rules he's been using (since there are lots of variants), let's copy those and go to it.

kesagiri
09-18-2000, 01:50 PM
Dex,

By strategy I mean that the player may have already run through the deck once and knows where each card is in the draw pile (sorry, I don't know the names). Then, when going through again, you may see a playable card but forgo that play to reach an even more desirable card. See Chronos' statements on this a few posts up. The "mindless" (perhaps simple is better) strategy would be to just play a playable card as soon as it comes up. A more sophisticated strategy would be to try to remember what you have seen previously, do some probability calculations, and determine whether or not to play a card at a given time. In some cases it might be the only playable option; other times you might have a choice.
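The mechanics behind this are easy to make concrete. In a draw-three game only the top card of each flip is playable, so which stock cards you can reach on the next pass depends on what you removed this pass; declining a play can therefore change everything downstream. A minimal sketch (Python, using card positions 0-23 rather than real cards; my own illustration, not from the thread):

```python
def visible(hand, draw=3):
    """Top card of each flip as you go once through the hand, drawing
    `draw` cards at a time.  Only these cards are playable that pass."""
    return [hand[i:i + draw][-1] for i in range(0, len(hand), draw)]

hand = list(range(24))        # the 24 stock cards, labelled by position
first_pass = visible(hand)    # [2, 5, 8, 11, 14, 17, 20, 23]

# Play the card at position 5 and recycle the stock:
second_pass = visible([c for c in hand if c != 5])
# [2, 6, 9, 12, 15, 18, 21, 23] -- six previously buried cards surface
```

Removing one card shifts every later group by one, which is exactly why a player who knows the deck order might forgo a playable card now to engineer which cards come up next cycle.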

Chronos
09-18-2000, 01:50 PM
I think that some of the casino games REQUIRE that a card be played on the ace pile if it can be.
Why would they have such a rule, if it has very little effect on the winnability?

C K Dexter Haven
09-18-2000, 03:08 PM
Kesagiri: OK, then we've got definitions agreed.

Chronos: To avoid conflict later? Remember that in the casino games, you pay an amount for the deck and you win an amount for each card on the ace pile. If someone could put a card on the ace pile but doesn't, and covers it with a lower-card-opposite-color, then they could complain later that they could have earned the money for it.

I don't know, I'm speculating. And I confess, my notion that "strategy" makes little difference is based on the way most people play -- what I called mindless and what Kesa has more tactfully called "simple" or straight-forward.

The other way to test the waters would be to set up a few hands, have some people play them one way (straightforward) and others play them with great thought (memorizing the order of the deck, etc.), and see how the results come out.

In short, the numbers preclude a mathematical solution, so I am suggesting an empirical approach.

Indysloth
09-18-2000, 06:21 PM
Chronos said "Why would they have such a rule, if it has very little effect on the winnability?"

Funny but I think it is to prevent a certain strategy.

Imagine a scenario where you have the 3 of hearts as a card in one of the seven piles. Also assume that the Ace and two of hearts have been played. Under the rules you would have to play the three to the two and score points.

Now let's assume that you have yet to see either black ace or either black two, and that you are fairly early in the draw deck. If you play the three to score, then draw a black two before a matching black ace, you will have to discard that two. Having the option of not playing the three gives you the opportunity to use it to hold the two in anticipation of getting the ace at some point.

If either two never comes up then you have lost nothing.

If one or both aces come up before the twos, you have lost nothing.

If the 2 comes up before the ace and would have been unplayable otherwise, you have gained the value of the two, plus whatever other upside that affords.

If the 2 comes up but the ace never does, then you have lost the value of the 3 (and any playable cards under it)

When playing on the computer under Vegas rules, I find I make more money by playing a card for a score as soon as I can, but although I can't prove it, I think I would win fewer games (play all cards on the aces) using that strategy.

But I don't know if the difference is material, so this could be one of the things Dex can throw out as immaterial.

Dean

Indysloth
09-18-2000, 06:26 PM
an addendum - if the casino has the rule that forces the play on the ace as soon as possible, then my belief that I make more money by following that strategy must be incorrect. I would imagine the casino knows what such a rule does to the take, and would adjust accordingly.

Cabbage
09-18-2000, 06:30 PM
Why would they have such a rule, if it has very little effect on the winnability?

I don't know what the Vegas rules are, but keep in mind we are talking strictly about winning the game as a whole. Such a rule could clearly have an effect on how much money you win in a given game; I'm with Dex, though, in thinking such a rule won't have much effect on actually winning the game.

Manlob
09-18-2000, 11:45 PM
Manlob: Your program seems to be lousier than even I am, since I average 15% playing three-at-a-time.

The program plays by the simplest possible strategy: it never passes up an opportunity to move, and does no planning ahead. So its winning percentage is about what a person should get if he plays mindlessly. For this strategy, it was easier to program it than to play a bunch of games.

Anyone who thinks strategy makes no difference is mistaken. I played some games while concentrating much more than in a casual game and won 10 out of 21 games, for 48%. This is with Microsoft Solitaire, drawing 3 cards at a time. The key is to ignore most playable cards, and only play the ones you really need or which set up the hand so the cards you need come up next cycle through the hand. Unfortunately, I didn't know it was possible to bring cards back off the aces pile until the 20th game, so the potential winning percentage is bound to be even higher. In both games where I knew of this rule, bringing cards off the aces pile saved me from losing. Playing with this much concentration isn't a whole lot of fun, so I don't plan on continuing more trials, but it seems like it should be possible to win more than half the time.

There is an upper limit to the winning percentage. A simple estimate can be found by just looking at the opening deal. If, after the opening deal, a pile contains a card with everything it needs to move buried beneath it, then there is no possible way to win. For example, if a pile initially contains (starting from the face-up card): ace of hearts, ace of spades, 6 of clubs, 7 of hearts, 7 of diamonds, and 3 of clubs, there is no way to win. This is because the 6 of clubs can only be moved off the pile onto a red 7 or onto the aces pile. Both red 7's are beneath the 6, and with the 3 of clubs also covered, the pile on the ace of clubs will never reach the 5 of clubs, so the 6 can never be moved. Similarly, multiple cards can combine so that each blocks the cards the other needs in order to move. I did a Monte Carlo simulation of millions of opening deals; it looks like 2.1% of games are hopeless because of this.
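The blocking condition described above can be checked mechanically. A sketch (Python; my own simplified encoding, covering only the single-pile pattern in the example above - a real detector would also need the cross-pile combinations mentioned):

```python
RANKS = "A23456789TJQK"

def color(card):                       # 'R' for hearts/diamonds, 'B' otherwise
    return "R" if card[1] in "HD" else "B"

def hosts(card):
    """The two tableau cards this card could legally be placed on."""
    r = RANKS.index(card[0])
    if r == 12:
        return []                      # kings only move to empty piles
    suits = "HD" if color(card) == "B" else "CS"
    return [RANKS[r + 1] + s for s in suits]

def dead_card(pile):
    """pile listed from the face-up card down.  True if some card is
    permanently stuck: both its tableau hosts AND a lower same-suit card
    (needed for the foundation route) lie deeper in the same pile."""
    for i, card in enumerate(pile):
        below = set(pile[i + 1:])
        h = hosts(card)
        hosts_buried = bool(h) and all(x in below for x in h)
        foundation_blocked = any(
            c[1] == card[1] and RANKS.index(c[0]) < RANKS.index(card[0])
            for c in below)
        if hosts_buried and foundation_blocked:
            return True
    return False

dead_card(["AH", "AS", "6C", "7H", "7D", "3C"])   # True: the pile described above
```

On the example pile, the 6 of clubs trips the check: both red 7's and the 3 of clubs sit beneath it, so neither escape route can ever open up.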

Another case of a futile game is not being able to play anything from the starting hand. 0.75% of the time absolutely nothing can be done from the beginning. 6.67% of the time there is no opportunity to play from the hand (even after being able to move around some of the starting face-up cards). Although there is some overlap between not being able to play from the hand (6.67%) and the hopeless situation described above (2.1%), it looks like about 9% of games are doomed from the start. There are certainly other impossible situations, but they get increasingly complex to identify when some moves are possible before the game is lost.
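These percentages come from simulation, and the simplest slice of the method is easy to reproduce: how often do the seven face-up cards of the opening deal admit no move at all (no ace to play, no face-up card that fits on another)? This is a narrower condition than the ones measured above, since it ignores the hand entirely, so its result is not comparable to the 0.75% or 6.67% figures; it is only a sketch of the Monte Carlo approach (Python, my own encoding):

```python
import random

def has_opening_move(faceup):
    """faceup: list of (rank, color) with rank 0 = ace ... 12 = king."""
    if any(r == 0 for r, _ in faceup):
        return True                        # an ace can go straight to a foundation
    return any(c1 != c2 and r2 == r1 + 1   # one face-up card fits on another
               for r1, c1 in faceup for r2, c2 in faceup)

def estimate_no_move(trials=100_000, seed=1):
    """Fraction of random opening deals whose seven face-up cards allow no move."""
    rng = random.Random(seed)
    deck = [(r, s % 2) for r in range(13) for s in range(4)]   # color = suit parity
    stuck = 0
    for _ in range(trials):
        rng.shuffle(deck)
        if not has_opening_move(deck[:7]):
            stuck += 1
    return stuck / trials
```

Swapping in richer playability checks (the hand, buried blockers, and so on) is how estimates like the ones above would be built up, one condition at a time.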

C K Dexter Haven
09-19-2000, 08:20 AM
<< Unfortunately I didn't know it was possible to bring cards back off the aces pile until the 20th game, so the potential winning percentage is bound to be even higher. >>

Yeah, we'd better agree on rules before we start collecting Monte Carlo samples!

It's interesting that your play with concentration did produce a much higher winning rate. Of course, with only 21 games, that could be sheer dumb luck. My hypothesis has been that a deeper, more strategic player will not have a very large edge (a few percentage points, perhaps) over a standard player. Your statistic tends to invalidate my hypothesis; I guess I'd like to see more evidence. Sigh. I suppose this means a weekend at solitaire.
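"Sheer dumb luck" is actually testable: under the mindless baseline of about 7.5% from the simulation upthread, the chance of doing at least as well as 10 wins in 21 games is an exact binomial tail. A quick sketch (Python; the 7.5% figure is the simulated baseline from this thread, not an established constant):

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): exact, no approximation."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Attentive play: 10 wins in 21 games against a 7.5% mindless baseline.
p_value = binom_tail(10, 21, 0.075)   # on the order of 1e-6: not luck

# By contrast, a mildly good run of 3 wins in 21 would be unremarkable:
binom_tail(3, 21, 0.075)              # about 0.21
```

So the 48% figure, small sample or not, is far too extreme to be a fluke of the deals if the 7.5% baseline is right; the remaining question is whether the baseline itself is right.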

I find it also interesting that your simulation gives only 2% as the number of games that are unwinnable on account of a "buried" card being unplayable. I guess if we could quantify all unwinnable situations, and estimate their occurrence, we could solve the initial problem.

Chronos
09-19-2000, 02:41 PM
I guess if we could quantify all unwinnable situations, and estimate their occurrence, we could solve the initial problem.

Depends. In addition to the situations which are totally hopeless, there are also situations where the a priori "best strategy" would not lead to a win, but a nonobvious, "wrong" strategy, by luck, would. It seems to me that it's not meaningful to compare a player to the results of an omniscient strategy.

C K Dexter Haven
09-20-2000, 07:33 AM
I did my bit. I played ten games last night, mostly straightforward (no attempt to memorize the deck, etc)... and I lost all ten. Then I gave up in disgust. So much for my willingness to participate in a noble experiment for the glory of the science of probability.

Manlob
09-20-2000, 11:16 PM
CKDextHavn, don't give up so soon; you are the only other doper providing any data. If you play without concentrating, you can expect to win 7.5% of the time (this assumes you don't bring down cards from the aces or split sequences, like the pre-Microsoft rules), so losing 10 straight should not be discouraging. To win more than half the time you need to pay attention to the hand. No need to memorize: just lay out real cards to match the computer hand, or just play analog solitaire.

Dealing 3 cards at a time is frustrating to plan out, so try playing with dealing one at a time instead. I just tried 15 games this way, won 67% of the time, and I am sure the winning percentage could be raised with more thought. Dealing three cards at a time can give no better a winning percentage than one-at-a-time dealing. Although I was winning only 48% of the time when dealing 3 at a time, most of the games I lost were because of buried facedown cards rather than not being able to access cards from the hand. So I think the winning percentages of 3-at-a-time and 1-at-a-time dealing may be fairly close together. That 48% with 3-at-a-time dealing was before finding out about bringing cards back down from the aces, and using that extra rule will only raise the winning percentage.

C K Dexter Haven
09-21-2000, 07:31 AM
Huh? Dealing one card at a time will of course give a better winning rate than dealing three cards at a time, unless you have a rule about only going through the deck once, for instance.

The idea of gathering data was that we all use the same rules. Otherwise, the data is useless.

I was figuring that if sdim's 15% winning rate was in line with the national average, I'd win at least one game in ten. I didn't, so I gave up. I mean, after all, it's a borrrrrring game.

sdimbert
09-21-2000, 09:56 AM
Originally posted by CKDextHavn
I was figuring that if sdim's 15% winning rate was in line with national average...

You wouldn't say such things about me if you knew I grew up in Lake Wobegon! ;)

...that I'd win at least one game in ten. I didn't, so I gave up. I mean, after all, it's a borrrrrring game.

It's only boring because you're playing it on the computer. I have it on my Palm... I spend a lot more time in the bathroom nowadays!

sdimbert
09-26-2000, 04:41 PM
If anyone still cares, I am now at 21/137 for a winning percentage of 14%.

Still in the groove...
