Challenging Probability Question

I’m not convinced of this, and since both the numerator and denominator are going to infinity, you have to be careful.

Say you play each game a million (or whatever) times. Take n to be the maximum number of flips that have occurred in any of the Game 1 games. Take m to be the same thing for the Game 2 games. It’s highly unlikely they will ever be equal, and even if they’re close (within 1 of each other), that’s enough to screw up the convergence.

I don’t think m and n as I had them above will necessarily increase at the same rate, so I don’t think there’s a limit.


…ebius sig. This is a moebius sig. This is a mo…
(sig line courtesy of WallyM7)

[QUOTE]
Originally posted by Cabbage:
I’m not convinced of this, and since both the numerator and denominator are going to infinity, you have to be careful.

Cabbage, are you sure that the numerator (the average winnings of Game 1) and the denominator (the average winnings of Game 2) are going to infinity? Certainly they will not be converging to any finite number, but does this imply that they will go to infinity? Perhaps they will, ah, just bounce around indefinitely, i.e., not converge at all.

Also, the suggestion was made that we can consider variant games where the maximum number of flips allowable is limited to, say, N, and then see what happens as we increase N. This approach, however, won’t tell us what will happen when N is unlimited (as it is in the original question). Just because 2p/p = 2 for every finite value of p does not mean that it equals 2 when p is infinite.

I’m not Cabbage, but the numerator and denominator both approach infinity, as evidenced by the fact that each is a value larger than one raised to the nth power. This has no effect on their ratio, however. See below…

The basic rules of algebra still apply here. 2p/p will still converge to 2 even as p goes to infinity. Infinities of the same magnitude cancel out. I know that’s not proper mathematical wording, but I’m just trying to get the point across. Since both the infinities are (3/2)^n, and not, say, n^2 and ln(n), you can safely cancel them out. The two values approach infinity at the same rate. There is no ambiguity here. The ratio of the two infinities converges to 1/2. Not all infinities are born equal, but these two are :slight_smile:
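Concretely: truncate both games at the same maximum flip count n, so each expected value becomes a finite partial sum of (3/2)^k terms, with Game 2’s terms doubled. A quick Python sketch (the helper name is mine, not from the thread) shows the ratio is exactly 1/2 for every n:

```python
# Truncate both games at the same maximum flip count n.
# Game 1's expected value is the partial sum of (3/2)^k, k = 1..n;
# Game 2's terms are doubled. The (3/2)^k factors cancel in the ratio.

def partial_ev(n, scale=1.0):
    """Partial expected value: sum of scale * (3/2)^k for k = 1..n."""
    return sum(scale * 1.5 ** k for k in range(1, n + 1))

for n in (5, 20, 100):
    print(n, partial_ev(n) / partial_ev(n, scale=2.0))   # 0.5 every time
```

The point of the sketch is only that *with the same truncation point n in both numerator and denominator*, the ratio is 1/2 at every stage; whether that truncation is legitimate is exactly what the rest of the thread argues about.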

And jcgmoi, you cannot assume that 2 * Payoff(1) = Payoff(1) since you’re attempting to find the ratio of 2 * Payoff(1) / Payoff(1). That assumption you made is only valid after you’ve found the actual ratio to converge to infinity (e.g. you find the result to be 2 * infinity, and just write down infinity as your answer).

hmmm… bump? (see atmb)

Cabbage sez:

OK, I will be.

OK, look at this:

Fix n. Same type of modification to both the games as I used above–restrict game 1 to end after n flips (if necessary), restrict game 2 to end after n+1 flips. Take the ratio of the expected payouts. Now let n go to infinity. It won’t converge to 1/2.
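A quick numeric check of this model (a Python sketch of my own; the partial sums are of the expected-value terms (3/2)^k) bears that out:

```python
# Cabbage's (n, n+1) model: Game 1 truncated at n flips, Game 2 at n+1.
# The partial sums are geometric, so the extra term in the denominator
# shifts the limit of the ratio away from 1/2.

def partial_ev(n):
    """Partial sum of (3/2)^k for k = 1..n."""
    return sum(1.5 ** k for k in range(1, n + 1))

for n in (10, 30, 60):
    print(n, partial_ev(n) / (2.0 * partial_ev(n + 1)))
# the ratio approaches 1/3, not 1/2
```

Since the partial sums grow like (3/2)^n, pushing the denominator’s cutoff one flip further multiplies it by roughly 3/2, which is why the limit comes out at (1/2)·(2/3) = 1/3.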

m and n as I had them weren’t equal to begin with. And they don’t necessarily grow at the same rate, either. Certainly, there’s an expected rate (value) here at which they’ll grow, and that rate will be the same for both, but neither will actually follow that expected rate consistently; they’ll kind of oscillate around it, averaging out to it, but not actually equalling it. Statistically, there’s absolutely nothing wrong with that. That oscillation around the expected rate (which will be different for m and n) will be enough to prevent convergence.



We’re not dealing with a straightforward sequence (or ratio of sequences) here. We’re dealing with the Expected Value, and the calculation of Expected Value is a divergent SERIES (sum of a sequence).

That is, for Game 1, the Expected Value E(1) = SUM (3/2)^n as n –> infinity. That series (that is, the sum) diverges; the sum gets larger and larger without bound as n gets larger and larger. Don’t talk to me about “equals” infinity; say instead, it diverges, it grows without bound.

For Game 2, the Expected Value E(2)= 2* SUM (3/2)^n, and this sum also diverges.

Is it the case that the ratio E(1)/E(2) = 1/2? I confess, I’m way too far away from infinite series to remember how that works. I can try to remember to look it up when I get some time with some reference books.

Or someone will cite the theorem before then, and I’ll be embarrassed at not remembering.

Tony, you will NOT be able to get conclusive indications from a computer run that a series doesn’t converge. Simple example: a series might seem to converge for the first billion terms, and then diverge… no computer run would show divergence. Similarly, you won’t get proof of convergence from a computer run… You will only get an “indication”, if you’re lucky and if you run the simulation enough times (and don’t ask me what “enough” means, a billion runs may not be enough.)
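For illustration, here is the kind of simulation at issue, as a Python sketch (my own, not WhiteKnight’s program): the sample average of Game 1’s payoffs is dominated by rare huge payouts, so repeated runs of the same size give wildly different “averages.”

```python
import random

def play():
    """One play of Game 1: flip until the first head; payoff is 3^n,
    where n is the flip on which the head appeared."""
    n = 1
    while random.random() < 0.5:   # tails: keep flipping
        n += 1
    return 3 ** n

random.seed(1)  # arbitrary seed, for repeatability
for run in range(5):
    games = 100_000
    avg = sum(play() for _ in range(games)) / games
    print(avg)   # varies wildly from run to run; a rare long run of
                 # tails can single-handedly dominate the whole average
```

No finite number of such runs can distinguish “diverges” from “converges to something enormous,” which is CKDextHavn’s point.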

Practical Terms
Someone asked about “practical terms.” In practical terms, of course, the whole point of the paradox is that you’d NEVER have a run of 10,000 tails. Well, OK, not NEVER, but you’d be running the computer a VERRRRRRY long time before you could expect such a thing to show up.

One of the paradox questions (let’s stick with Game 1) is: How much would you be willing to PAY to play this game? You pay $X, and then get a winning based on the coin toss. If you pay $7.60 (= (3/2)^5) and you get a head before toss 5, you’ve lost. If you get four tails and then a head on toss 5, you’re even. If you get five tails (or more) and then a head, you’ve won.

If you pay $57.66 (= (3/2)^10), then you win only if you get a string of at least 10 tails before you get the first head.

In theory, to make a “fair” game, you’d be willing to pay the Expected Value. In this case, that’s Infinite! … but this seems silly. You’d be hard pressed to find someone to bet $57.66 on getting a string of 10 tails before they get heads.

And of course, the paradox goes on… don’t just play once! Play hundreds of times! You pay the $57.66 each time, and you lose and lose and lose, game after game, and then on the thousandth game or so (expectation of 1024 games), you win one! Well, you’re way way far behind, of course – you’ve lost every game but one so far. However, keep playing… sooner or later, after a billion years, maybe, you’ll have a string of a thousand tails before you get heads… and THAT’s when you’ll get a huge payoff.

So the paradox is that the math behind the game relies on an infinite number of plays; in the actual play of the game, you’d give up after a few billion games (even with the best computers wholly dedicated to this game, it could take thousands of years of computing time before you hit some of the huge payouts.)

Zor says:

And jcgmoi, you cannot assume that 2 * Payoff(1) = Payoff(1) since you’re
attempting to find the ratio of 2 * Payoff(1) / Payoff(1). That assumption
you made is only valid after you’ve found the actual ratio to converge to
infinity (e.g. you find the result to be 2 * infinity, and just write down
infinity as your answer).

But Zor, there is no assumption, only a paradox. Payoff(1) increases without
bound but it is a countable infinity. Twice a countable infinity is a countable infinity.

Look at it like this. Every 3rd positive integer is divisible by 3. Let’s put those in S. Let’s put all other positive integers in T. You would say T has twice as many members as S when in fact they contain the same number, as can be shown by pairing them off 1-1. And if you sum the elements of S and T you find they are equal, not to ‘infinity’, but to a number that increases
without bound. In fact, their sum is Payoff(1).

I agree with most of your argument, but you messed up on the arithmetic on these two examples. If you pay $7.60 to play and you get a head on toss five, you’re not even - you will have won 3^5 = 243 dollars. Pretty good deal.

And if you pay $57.66 and get a head on flip 10, then you’ll collect over $59,000.

So in practical terms, I’d sure line up to play this game.

Reading the discussion, I realize (with embarrassment) that I slightly misstated the question. Perhaps that is the explanation for some of the disagreement. With respect to Game 2, I should have designated the number of coin flips as “m,” not “n.” As many grasped, I did not intend to assert that the payoff of Game 2 was based on the same number of coin flips as Game 1. More explicitly: Game 1 and Game 2 are played simultaneously but independently. Though the total winnings for each game will go up, they need not increase at the same rate for any given segment of sequential plays.

I also now realize that instead of asking, What will be the ratio of average winnings of the games?, I could simply have asked, What will be the ratio of the total winnings of the games?

Nevertheless, I am still interested in the more basic (?) question, Will the average winnings of either game converge to infinity or not converge? If the answer to this question is that it will converge, I think the answer to the ratio question is 1/2 because Game 2 will converge more quickly. But if the answer is that the average will not converge, that to my mind leaves the question open regarding the ratio of total winnings.

With respect to average winnings of either game, the answer is not clear to me. Intuitively, because extremely large payoffs will occasionally come up, the average winnings could converge to infinity. But perhaps they will come up so randomly that there will be no convergence at all.

CKDextHavn: I agree with much of what you have said. If you could find the relevant theorem (or even suggest to me the right reference work), I would appreciate it.

Tony

I apologize, I did indeed screw up the math. The point is still valid: you pay $X and there’s a certain point at which you win back WILDLY more than what you paid… after a sufficient number of tosses.

<< Will the average winnings of either game converge to infinity or not converge? >> NO, Tony, there’s no such thing as “converge to infinity”… there’s only converge (to some finite number) or diverge.

I have to think that, given a divergent series and another divergent series whose terms are all twice those of the first, the ratio is undefined. “Infinity” divided by “infinity” is not defined.

Example: Take the series A = 1 + 1 + 1 + 1 + …
which is divergent. Take the series B = 2 + 2 + 2 + … which is divergent. The quotient A/B is undefined (not convergent).

Sure, you say, you could write B = 2 * A and then A/(2A) = 1/2. But you could also write B as: (1 + 1) + (1 + 1) + (1 + 1) + … and then, lo! B = A, so A/B = 1.

Or you could write A = (1 + 1 + 1 + 1 + 1 + 1 + 1 + 1) + (1 + 1 + 1 + 1 + 1 + 1 + 1 + 1) + … = 8 + 8 + … so then A/B = 4.
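The regrouping game is easy to check with partial sums; here is a Python sketch (my own; note that it is grouping A’s ones into blocks of eight, against B’s twos, that yields a term ratio of 4):

```python
# Partial sums of A = 1 + 1 + 1 + ...  and  B = 2 + 2 + 2 + ...
def A(n):
    """Sum of the first n terms of A (each term is 1)."""
    return n

def B(n):
    """Sum of the first n terms of B (each term is 2)."""
    return 2 * n

n = 1_000_000
print(A(n) / B(n))       # 0.5: compare the series term for term
print(A(2 * n) / B(n))   # 1.0: regroup A's ones in pairs, so A "equals" B
print(A(8 * n) / B(n))   # 4.0: regroup A's ones in blocks of eight
```

Each regrouping amounts to pairing off different partial sums of the two series, and the “ratio” you get depends entirely on which pairing you choose.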

So, I am standing by my original guess, that the ratio of the two series does NOT converge.

Notice that you can’t pull this trick with a convergent series. Rearranging the terms of an absolutely convergent series does not affect the sum. Rearranging the terms of a divergent series, however, can change everything. So, I think this is another of those questions where “common sense” says that the ratio is 1/2, but “common sense” is mathematically not correct in this case. The ratio is not defined (that is, not convergent.)

Thus, when we have people running 100,000 trials, we get a DIFFERENT result each time.

CKDextHavn,

I have to speak tentatively here because it’s not my area, but I think you are confusing sums and series. The series .9, .99, .999, . . . converges to 1. The series 1, 2, 3, . . . converges to “infinity” (that is, gets greater and greater without limit). The series 1, -2, 4, -8, 16 . . . does not converge at all. The average winnings for Game 1 will be a series, maybe something like 9, 6, 3, 18 . . . (I make these #s up). So it makes sense to ask whether the series will converge to infinity or not converge at all.

I agree with you that infinity/infinity is undefined. Nevertheless I do not see how that answers the question. The total winnings of Games 1 and 2 will both increase without limit as the games are played repeatedly. But there still may be a well defined ratio of the winnings. Likewise, even if the average winnings of both games form a divergent series, it may be that their ratio is a convergent series–or at least you would have to refer me to a theorem to the contrary (I don’t trust my intuitions).

In sum, I do not see how your discussion of expressions such as 1+1+1+1 . . . can answer the problem. But I may well be missing something, and I would be happy to be enlightened.

Tony

Ok, this was sort of interesting and I wanted to see what happened, so I wrote a program to test this.
http://www.members.home.com/whitenight/files/CoinToss.cpp - This is the C source http://www.members.home.com/whitenight/files/CoinToss.exe - This is the Win32 .exe

It’s very fast; it plays about 70,000 games/second on my Pentium 200.

I added in a bunch of features so you could analyze the results. The program features three main modes of operation: one where it shows the results of each series of tosses for each type of game, another where it shows only tosses that go past a certain threshold (more than eight tosses in a series, etc.), and a silent mode where it only displays results when you hit a key to exit. Needless to say, silent is fastest, but the least informative.

I’ll detail the options, though it comes with some help.
Options: -t # -g # -z # -s -h

-g: Display every Nth game, speeds simulation. Default 1
This lets you check the stats every so many games, without exiting, without having to see each turn.

-t: Display only games that go past N tosses, speeds simulation. Default 8
This shows you just the exceptional turns, so you don’t watch the massively more common 2 and 3 toss turns.

-z: Game Types. The number of types of games played. Default 2

The game defaults to two game types, it will work with up to ten. The payback is x(3^n) where x is the type of game. This lets you see if a trend continues.

-s: Silent. Display no output except the final results.
Will show only the final results. Use this if you want to let it run while you go eat and check the results later.

Run the program, hit a key to exit. Read the output. It’s pretty simple.
And, because I’m bored, and want to show off my mad ASM Sk1llz :slight_smile: here’s a little demo I wrote. 1238 bytes, with a 768B palette. My first (and hopefully only) x86 all-ASM project.
http://www.members.home.com/whitenight/files/fire.com
Enjoy. BTW, please let me know if you see any problems.

My arithmetic error was confusing the original statement of payout (3^n) with the expected value terms (3/2)^n. Sorry about that.

The point is still the same. How much do you pay to play?

Tony, if you understand the difference between “converge to infinity” and “equals infinity” (as opposed to other divergence), that’s fine. My concern was that most posters didn’t, and are cheerfully dividing one infinity by another infinity.

My point is that you can’t do that. When you are dealing with divergent sequences (or the sum of a divergent sequence) – and I am using the term divergent to mean not convergent, whether it increases without bound or not – it is NOT TRUE that A/2A = 1/2. I used the example of 1’s and 2’s because it’s easier to manipulate than your example of 3^n and 2*3^n – or, more correctly, than your expected values of (3/2)^n and 2*(3/2)^n.

The example of the 1’s and 2’s was meant to show that one cannot “cancel out” the infinite sum in the numerator and denominator.

I still contend that the ratio of the two games is undefined. The “average” winning of Game (1) (that is, the expected value) is unbounded, converges to infinity, is undefined. Ditto the “average” winning of Game (2). The ratio of the two is thus not defined – there is no single numeric value (finite or infinite) that can represent that quantity. It’s like 0/0, it is not a defined quantity.

CKD, you write:

I still contend that the ratio of the two games is undefined. The “average” winning of Game (1) (that is, the expected value) is unbounded, converges to infinity, is undefined. Ditto the “average” winning of Game (2). The ratio of the two is thus not defined – there is no single numeric value (finite or infinite) that can represent that quantity. It’s like 0/0, it is not a defined quantity.

I don’t mean to be tedious, but if the average winnings of games 1 and 2 will both converge to infinity (given enough plays), why do you think their ratio is undefined (will not converge)? It seems more plausible to me that the average winnings of game 2 would converge to infinity faster than those of game 1, or at least the rates of convergence could be meaningfully compared. Clearly the series of average winnings cannot converge on a finite number, or that number would equal an amount you should be willing to pay to play the game, and all amounts are too small (the expected value of a play is infinite). But mightn’t the average winnings, as the number of plays increases, not converge at all, i.e., simply bounce around over an unlimited range in a purely random way?

Finally, for what it’s worth, playing a game with a payoff of 2^n for a million plays produced surprisingly consistent average winnings: 20, 22, 24, 19.
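Tony’s 2^n experiment is easy to reproduce; here is a Python sketch of my own (not WhiteKnight’s program). With payoff 2^n, each term of the expected-value sum contributes exactly 1, so the average of N plays tends to land near log2(N), which is about 20 for a million plays:

```python
import random

random.seed(0)  # arbitrary seed, for repeatability

def play():
    """Flip until the first head; payoff is 2^n, where n is the flip
    on which the first head appeared."""
    n = 1
    while random.random() < 0.5:   # tails: keep flipping
        n += 1
    return 2 ** n

N = 1_000_000
avg = sum(play() for _ in range(N)) / N
print(avg)   # tends to land somewhere around log2(N), roughly 20 here,
             # though the heavy tail makes individual runs vary
```

This is consistent with the 19–24 range Tony reports: the effective truncation point of the sum is set by the longest run of tails you happen to see.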

WhiteKnight, I tried running your program, but was unable to find any controls. When I hit a key, the program window shut down. But thanks anyway!

Tony

When you’re evaluating a limit such as this…

lim { f(n) } as n --> x

You cannot assume that n = x right off the bat. The same goes for evaluating any limit, including limits at infinity. Take the following limit for example…

lim { sin(n) / n } as n --> 0

If you just set n=0 right off the bat, you’d get 0/0 as an answer. The correct solution, however, is 1, which you can get by applying l’Hopital’s rule (i.e. find the ratio of the rate at which the numerator and denominator approach 0). Now if we return to the original question…

lim { (3/2)^n / 2(3/2)^n } as n --> infinity

Again, you cannot set n to infinity right away; they cancel out first. Furthermore, you cannot set the 2 n’s to different values. They are exactly the same by definition; n represents the maximum number of flips that are allowed to be played within a single game. We evaluate the ratio of the average winnings by continuously raising the value of n, simultaneously for both games. You cannot raise the maximum number of flips allowed for Game #1 first, compare its average winnings to that of Game #2, then raise the maximum number of flips allowed for Game #2 later. The maximum number of flips allowed must be the same for both games. That is why you can cancel the two (3/2)^n out and get 1/2.

Take another limit as an example of this…

lim { n / n^2 } as n --> infinity

Now what is your answer? Undefined? No, it’s 0, because the denominator approaches infinity faster than the numerator. Here’s one last example…

lim { n / n } as n --> infinity

So what is it this time? Undefined? Nope, it’s 1, because here the denominator approaches infinity at the same rate the numerator does.
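These two limits are trivial to check numerically (a quick Python sketch):

```python
# Numeric check of the two limits above:
# n / n^2 tends to 0 as n grows; n / n is identically 1.
for n in (10, 1_000, 100_000):
    print(n / n ** 2, n / n)   # first value shrinks toward 0, second is 1.0
```

The point is that both numerator and denominator “go to infinity” in all of these examples, yet the limits differ; what matters is the relative rate of growth, not the fact of divergence.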
p.s. Just in case any of you have forgotten or are unfamiliar with l’Hopital’s rule, its mathematical form is as follows:

lim { f(n)/g(n) } as n --> infinity = lim { f’(n)/g’(n) } as n --> infinity

Under the condition that the ratio is of the indeterminate form 0/0 or infinity/infinity.

Zor,

I agree with your math, I just don’t see how the equations you advance are relevant to solving the problem. Perhaps you could explain this more clearly.

It seems to me that you are imagining versions of Games 1 and 2 where the maximum number of flips is limited and then examining the ratio of winnings as that limit is increased. Is that right? If so, I do not think it is a correct approach to the problem. To put it simply, in the actual problem, n, the maximum permitted number of flips, is infinite. Examining bounded variants of the game is misleading. It’s like saying that because the limit of n/n as n goes to infinity equals 1, then when n is infinite, n/n equals 1. But it’s undefined.

At least so it seems to me.

Tony

First, I just want to mention that the limit mentioned there at the top should actually have sums in both the numerator and denominator.

Suppose that the ratios actually do converge. I claim that I should be able to model it the way I did, i.e., by allowing m and n to be different, letting both go off to infinity, and taking the limit. This will still take the limit of the ratios of the partial sums as both partial sums go to infinity.

As I mentioned before, however, say I take the top sum to be from 1 through n, the bottom from 1 through n+1 (sure, I’m setting the max number of allowed flips to one more in the second game, but in a moment I’m going to let n go to infinity, so that won’t matter after all). Take the ratio. Now take the limit of that ratio as n goes to infinity. If the ratio of the two infinite series is going to make any sense at all, this limit should be 1/2 (as it would be if I had the sum from 1 through n in both top and bottom). It’s not.

This describes a way we can model the ratios of the infinite series where we do not get 1/2 as their limit. In fact, we can try to model their ratio many different ways, and get many different answers.
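To make “many different models, many different answers” concrete, here is a Python sketch (my own) using the closed form of the geometric partial sum S(n) = 3((3/2)^n − 1):

```python
def S(n):
    """Closed form of the partial sum of (3/2)^k for k = 1..n."""
    return 3.0 * (1.5 ** n - 1.0)

n = 200
print(S(n) / (2 * S(n)))        # model (n, n):    ratio -> 1/2
print(S(n) / (2 * S(n + 1)))    # model (n, n+1):  ratio -> 1/3
print(S(n) / (2 * S(2 * n)))    # model (n, 2n):   ratio -> 0
```

Three perfectly reasonable-looking ways of letting both partial sums go to infinity, three different limits, which is exactly Cabbage’s point.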

Conclusion: This limit we’re trying to find simply does not exist; the ratios do not converge.



Cabbage,

Interesting answer; perhaps it is right. It’s the most persuasive thing so far, I think. I’ll have to think about it more (even though I don’t think I’ll be able to conclusively evaluate it). But let me ask you this:

You argue that the ratio doesn’t exist because different ways of “modelling” the ratio result in different answers. But let’s say that this was not the case. Let’s say that for a different pair of games (with different payoff functions), the various models you consider (n, n; n, n+1; n, n+2, etc.) all converged on 1/2 as n went to infinity. Would you then conclude that the ratio was 1/2? Based on my previous post, as I explained to Zor (please see that post), this would NOT justify the conclusion that the true ratio was 1/2. In other words, I have doubts whether the modelling technique you use is correct in the first place, so that, when it produces different answers, you are warranted in concluding that there is no single correct answer. The failure of an invalid technique to produce an answer does not mean there is no answer.

Tony

Yeah, I would, if ALL the “models” converged to 1/2. That means (n, n), (n, n+1), (n, n+2), (n, n^2), (n^3, the nth prime); ANY of the uncountably many possible pairs of increasing sequences of integers.

For example, if we changed it like this: Game 1, where the payoff on the winning flip was just $1 each time, and Game 2 where the payoff was $2 each time, then the ratio of payoffs as you played the games over and over would converge to 1/2.

As for justifying the model I used: first off, it doesn’t make sense to compare winnings after the games have been played an infinite number of times–both will have won an infinite amount of money, and you can’t say this infinity is a certain percentage of that; it doesn’t make sense. What we can do is play each game an ever increasing number of times (so the payoff will be finite at all times) and compare those ratios and see how they behave in the long run.

We can do this by considering the expected payout. The expected payout is the sum of all the possible payouts, weighted with their corresponding probabilities. In general, the more a game is played, the closer the average winnings per game will come to the expected payout. An important point: Statistics says nothing (or if it does, then not enough, in this case) about this rate of convergence. Sometimes, by pure luck, your average winnings may exactly match the expected payout early on. Other times, it may take a while to settle down and get close. The rate of convergence is not consistent, all we know is that it eventually will converge.

That’s not the case in this game, however–the expected payout is infinite, which is why it’s called a paradox. It does imply, however, that the more the game is played, the more the average winnings will increase without bound.

So what to do about this? We’re playing a finite number of games, and seeing what happens to the ratio of winnings as the number of games increases. If we play a finite number of games, there will also be an expected maximum number of flips having occurred in those games at some point. If I play the game only once, probability says I can expect to flip the coin once or twice. If I play the game 1000 times, probability says that the expected maximum number of flips that occurred in a single game is 10; 1000000 games, an expected max number of 20 flips has occurred (I’m not claiming these numbers are accurate, just illustrating the point). The more games, the larger the expected max number of flips.
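That growth is easy to see in a simulation (a Python sketch of my own): the longest run of flips observed in a batch of games grows roughly like the base-2 logarithm of the batch size.

```python
import random

random.seed(2)  # arbitrary seed, for repeatability

def flips():
    """Number of flips until the first head appears."""
    n = 1
    while random.random() < 0.5:   # tails: keep flipping
        n += 1
    return n

for games in (100, 10_000, 1_000_000):
    longest = max(flips() for _ in range(games))
    print(games, longest)   # the longest run grows roughly like log2(games)
```

A hundredfold increase in the number of games only adds about log2(100), i.e. six or seven, to the expected longest run, which matches Cabbage’s illustrative 10-flips-per-1000-games, 20-per-1000000 figures.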

So if we play a finite number of games, we can safely go ahead and limit the number of flips allowed in a game, because if you play a set of n games, then play another set of n games, over and over, on average the maximum number of flips occurring in a single game will converge to the expected maximum number of flips.

Allowing only a finite number of flips like this has the benefit of giving a finite expected payoff for each game (the sum from i=1 to n I mentioned in a previous post). Now we can take the ratio of these payoffs.

If we play the “finite flips” version of the game over and over, the ratio of the payoffs of the games will converge to the ratio of the expected values.
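Here is a sketch of that claim (Python, my own; I assume the finite-flips variant pays nothing when the cap is reached without a head, which is what makes the expected value exactly the partial sum from 1 to n):

```python
import random

random.seed(4)  # arbitrary seed, for repeatability

def play_capped(cap, scale=1):
    """Finite-flips variant: payoff is scale * 3^i if the first head
    comes on flip i <= cap, else 0 (so the expected value is finite)."""
    for flip in range(1, cap + 1):
        if random.random() < 0.5:   # heads
            return scale * 3 ** flip
    return 0

games, cap = 200_000, 8
total1 = sum(play_capped(cap) for _ in range(games))
total2 = sum(play_capped(cap, scale=2) for _ in range(games))
print(total1 / total2)   # settles near 1/2 as the number of games grows
```

With the cap fixed, both games have finite expected values (in the exact ratio 1/2), so the ordinary law of large numbers applies and the ratio of total winnings does settle down. The trouble, as the next paragraphs explain, starts when the cap has to grow with the number of games.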

Now, we take the step we’ve been working towards all along–what happens to this ratio we just obtained as we play more and more games?

First of all, we have to start increasing the max number of flips allowed–the more games played, the larger the expected max number of flips becomes. Correspondingly, this will increase the expected payoff, which will start going to infinity for both games.

Here’s where the important point I mentioned above comes in. There’s nothing guaranteeing the payoff in Game 1 will keep pace with the payoff in Game 2; they may converge at different rates (not to mention the “max number of flips in a single game” may increase at different rates as well, which only makes things worse). Sometimes 1 may increase faster, sometimes 2; there will be nothing consistent about it.

That’s why we have to consider all the possible different “models” of this convergence I mentioned at the beginning. Using the different models, sometimes the ratio converges to 1/2, sometimes to 1/3, and sometimes it just bounces around and never settles down at all.

Since we have no consistency at this point, we’re done, we’ve found out that the ratios will not converge.

Sorry I got kind of long-winded, but there are some subtle points that need to be watched out for. Interesting question, though.

