Challenging Probability Question

panamajack,

The thing you forget here is that you’re discussing infinity, which is not like a regular real number. You seem to imply in your post that a) board one has an infinite number of markers on it, b) board two has an infinite number of markers on it, c) board two pays twice as much per marker, and d) (2*infinity)/infinity = 2. You can’t “cancel out” infinity because 2*infinity = 3*infinity = googol*infinity = infinity. I offered a particular proof of this earlier in the thread. I realize this property of infinity is not intuitive (in fact, it’s probably counter-intuitive), but that’s the way it is.

Panamajack, I like your board analogy, but I think the point at which it fails is when you look at it after infinite time.

Take the two boards where you put the winnings on each number as they occur.

Forget about the “distribution” at that point of the two boards; ask what’s the TOTAL? And the answer is, there’s a (countably) infinite number of markers on each board. It’s like asking the question, what’s the ratio of even natural numbers to ALL natural numbers: that ratio is NOT 1/2. There are just as many even natural numbers as there are natural numbers altogether, even though the one is a proper subset of the other.

Again, there are situations where it is possible to take the ratio of infinite sums and get a finite limit, but I don’t think this is one of them.

Good. The conversation has gotten interesting again (I believe).

First, briefly, CDK. It’s not really important, but you have mistaken my point about the “paradoxical” nature of the StP paradox. Yours is the common line on the StP paradox: it is paradoxical because the expected payoff is infinite while “common sense” says it’s limited. ($20 seems to be the average people are actually willing to pay.) But I’m saying it’s paradoxical because the expected payoff according to standard utility theory (infinite) is inconsistent not with common sense but with what in fact must happen. The payoff for a given play must be finite even though its expected value is infinite. Usually, it’s at least possible that the expected payoff and the actual payoff will be the same. I think it’s bizarre that the expected payoff will always be more than the actual payoff and you can know this in advance. You know your expectations are wrong, in effect. That’s paradoxical.

Whoops. I’ve been notified that zut has just posted something. I’ll send this, read zut’s and then respond to it.

Tony


Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

A good many sensible posts have clarified what this problem means and why the “obvious” answer of 1/2 is not so obvious. Now several posters have gone back and asked what IS the correct answer.

Since the limit of the two ratios is a random variable, it need not always converge to the same figure. In fact, it’s easy to see that certain possible specific sequences of flips could give many different numbers as a limit. But, all of these specific sequences would have probability zero.

My guess is that the ratio fails to converge, except on a set of measure zero. That is, with probability 1, the ratio does not converge.

However, I do not know how to demonstrate this result. Maybe a proof by contradiction:

Let F(k) be the average result of Game 1 after k trials and G(k) = the average result of Game 2 after k trials.

Suppose limit {F(k)/G(k)} = r. Then, for any epsilon, there exists a K such that F(k)/G(k) is within epsilon of r for all k > K.

Now, I would like to show that with probability = 1, there would be some k for which F(k)/G(k) is not within epsilon of r.
Presumably this would happen because one or the other sequence would be expected to change drastically, due to a very long run.

Anyone care to complete this proof or to offer a better one?

Panama Jack,

Welcome to the conversation.

I believe I see your argument. In a nutshell, I think you’re saying that after an infinite number of games, the distribution of payoffs would be the “expected” distribution and therefore we get the 1:2 ratio. That’s Konrad’s argument too, I think. But it’s impossible to play an infinite number of games. We can only play more and ask about limits. To be concrete (my math is very basic), after a billion plays, the ratio of 3’s in game 1 to 6’s in game 2 will be very close to 1:1. We can be pretty sure of that. Granted, that’s a good foundation for thinking the ratio of total winnings will be 1:2. But what about the possible 3^billion and 2*3^billion payoffs? Even after a billion plays there will always be payoffs that are (1) large enough to throw off the ratio, (2) likely enough to be a worry, and (3) not so likely that we can say anything about their frequency with confidence. That’s why the ratios will always bounce around.
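Tony’s “bounce around” intuition is easy to watch numerically. Here is a rough Monte Carlo sketch (the function names, seed, and cutoffs are mine, and any single run is only suggestive): play both games independently many times and print the running ratio of total winnings. The ratio typically lurches whenever one side hits a long run of tails, rather than settling down.

```python
import random

def payoff(base=3, factor=1):
    """One play: flip a fair coin until heads; pay factor * base**n,
    where n is the number of flips used (including the final heads)."""
    n = 1
    while random.random() < 0.5:  # tails, keep flipping
        n += 1
    return factor * base ** n

random.seed(42)
total1 = total2 = 0
for i in range(1, 100_001):
    total1 += payoff(3, 1)  # Game 1 pays 3^n
    total2 += payoff(3, 2)  # Game 2 pays 2*3^m, on independent flips
    if i % 20_000 == 0:
        print(f"after {i} rounds, total1/total2 = {total1 / total2:.4f}")
```

On most seeds the printed ratio jumps around instead of homing in on any single value, which is exactly the behavior described above.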

That’s the argument, even if it is not quite a proof.

Tony


Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

Zut,

Yours is the point I am really interested in. If I understand you, you believe that whenever Game 1 and Game 2 have infinite expected utilities, the expected ratio of the winnings will be undefined. And you believe so because infinity/infinity is undefined. So it doesn’t matter if Game 2’s payoff is (10^10n!)! or whatever (as long as Game 2’s payoff is not dependent on Game 1’s or vice-versa). Infinity over infinity is undefined. This is your position, right?

Do others agree?

It seems to me that we have to at least ask what the likelihood is that Game 1 will produce a payoff that will screw up the ratio. It may be that the more plays there are, the greater the lead that Game 2 will build up, so that it becomes decreasingly likely that Game 1 will ever have a payoff that significantly screws up the ratio. Of course, there is always the possibility that it might, but if that possibility is expected to decrease, then you will get your convergence.

I am willing to admit that if the payoff of Game 2 is t*3^n, then for any t, there will be no convergence, i.e., the ratio will be undefined. Undoubtedly the class of nonconvergent payoff functions is larger than that. But I would need more argument to accept no-defined-ratio for any payoff function with infinite expected utility.

Tony

Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

Tony: << The payoff for a given play must be finite even though its expected value is infinite. Usually, it’s at least possible that the expected payoff and the actual payoff will be the same. I think it’s bizarre that the expected payoff will always be more than the actual payoff and you can know this in advance. You know your expectations are wrong, in effect. That’s paradoxical.>>

Yes, agreed. It’s a fascinating problem.

I think I have finally come up with a proof that will satisfy everyone. Warning: it’s kinda tedious and not at all elegant.

OK, the set up, from several hundred posts ago:
Player 1 plays Game 1, and gets 3^n if the first heads is on the nth toss.
Player 2 plays Game 2, and gets 2*(3^m) if the first heads is on the mth toss.

NOW… here’s my new insight.

Let’s add a new Player 3, where Player 3 gets the ratio of winnings of Player 2 to Player 1. Got it?

EXAMPLE: Suppose in the first round, Player 1 flips 2 tails and a head (TTH) and gets 3^3 = 27. Player 2 flips TH and gets 2*3^2 = 18. So Player 3 gets 18/27 that round.

Everyone clear? Now, we can reframe the ratio question by asking: what’s the expected value of the game for Player 3?

Obviously, if the same coin-sequence is used (n = m), then on each round, Player 3 gets $2. That’s the dependent games issue, that I think we’ve already dealt with.
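The dependent case is easy to verify mechanically. A trivial sketch (using exact rationals so there’s no floating-point noise): when the same coin sequence is used, Player 3’s take is 2*3^n / 3^n, which is exactly 2 for every n.

```python
from fractions import Fraction

# Shared coin sequence: Game 2's payoff over Game 1's, same n for both.
for n in range(1, 21):
    ratio = Fraction(2 * 3**n, 3**n)
    assert ratio == 2
print("shared coin sequence: Player 3 gets exactly $2 every round")
```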

OK, so now we need to solve for the expected value of Player 3 when n and m are independent. I contend this is also infinite (unbounded). I prove this by enumerating all the outcomes:

Game 1: H
Game 2: H, TH, TTH, TTTH, …

Game 1: TH
Game 2: H, TH, TTH, TTTH, …

Game 1: TTH
Game 2: H, TH, TTH, TTTH, …

OK? If you’re with me so far, then let’s calculate the expected value for each outcome (probability x payoff for Player 3), and the expected value for Player 3 (which will be the ratio of Game 2/Game 1) will be the SUM of the expected values for each outcome. Ready?

Game 1: H (Prob = 1/2) Pays $3
Game 2: H (Prob = 1/2) Pays $6
Game 3: 1/4 X payoff $2 = $1/2

Game 1:H (Prob = 1/2) Pays $3
Game 2:TH (Prob = 1/4) Pays $18
Game 3:1/8 X payoff $6 = 3/4
Game 1:H (Prob = 1/2) Pays $3
Game 2:TTH (Prob: 1/8) Pays $54
Game 3: 1/16 X payoff $18 = 9/8

Game 1:H (Prob 1/2) Pays $3
Game 2:TTTH (Prob 1/16) Pays $162
Game 3:1/32 x payoff $54 = 27/16

Etc. for Game 1 = H. Then eventually, we list all the outcomes when Game 1 = TH.
Game 1:TH (Prob 1/4) Pays $9
Game 2:H (Prob 1/2) Pays $6
Game 3: 1/8 x payoff 2/3 = $1/12

Game 1:TH (Prob 1/4) Pays $9
Game 2:TH (Prob 1/4) Pays $18
Game 3: 1/16 x $2 = $1/8

Etc. And then, eventually, Game 1 = TTH, etc.

In this way, we can enumerate all the outcomes, their probability of occurring, and the payoff for Player 3.

But it’s clear from the first set of outcomes (Player 1 gets H on first flip) that the sum is already unbounded; and since all the terms are positive, the expected value for Player 3 is unbounded. QED.

Thus, the ratio of Game 2 to Game 1 is, in fact, undefined (and even unbounded!)
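The enumeration above can be tabulated directly. A sketch (the `term` helper is my naming): each outcome (n, m) contributes probability (1/2)^(n+m) times the ratio (2*3^m)/(3^n), and the n = 1 row alone, i.e. the outcomes listed above, already sums without bound, each term 3/2 times the last.

```python
from fractions import Fraction

def term(n, m):
    """Probability of (first heads on toss n in Game 1, toss m in Game 2)
    times Player 3's payoff: Game 2's payoff over Game 1's."""
    prob = Fraction(1, 2) ** (n + m)
    payoff = Fraction(2 * 3**m, 3**n)
    return prob * payoff

# The n = 1 row: 1/2, 3/4, 9/8, 27/16, ... matching the hand calculation above.
row = [term(1, m) for m in range(1, 11)]
assert row[:3] == [Fraction(1, 2), Fraction(3, 4), Fraction(9, 8)]
print(float(sum(row)))  # the partial sums keep growing; there is no finite total
```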

Does this do it for y’all??

CKD,

Great attempt. Very helpful. Certainly it establishes the proper framework for analysis. But a small point of clarification:

You repeatedly equate unbounded and undefined. As I said many posts ago, these are not the same concepts. Unbounded means greater than any finite number. The series 1, 2, 3, . . . is unbounded and so is the sum of the terms in the series. But it’s not undefined in the sense that 1/0 or (1 + -1 + 1 + -1 . . . ) is undefined. There we cannot say anything about the quantity: not that it’s greater than 2, or greater than any finite number, or anything.

So exactly what are you saying about the ratio of Game 1 and Game 2?

Are you saying that “thinking about the ratio of winnings does not give us any reason to prefer G2, because the ratio is undefined,” or “thinking about ratios gives us a reason to prefer G2, because the ratio is greater than any finite number”?

Of course, I could study your proof, but it’s easier to just ask.

Tony


Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

CKDex,
I like your proof because of a corollary I didn’t expect: You can rewrite the proof so that Player 3 gets the ratio of winnings of Player 1 to Player 2 (i.e., the multiplicative inverse of the way you wrote it). In this case, following the same logic, Player 3’s expected winnings are still undefined and unbounded. Pretty neat.

Tony, I think you’ve basically summed up my position. However, in response to your suggestion that Game 1 becomes less and less likely to throw off the ratio:

But the more plays there are, the more likely Game 1 is to have an even bigger payout, which affects the ratio even more. So, in some sense, the chance for Game 1 to “significantly screw up the ratio” keeps pace with the number of plays.

Zut,

I agree that the more plays there are, the more likely Game 1 is to have an even bigger payoff, but that doesn’t mean the odds of it coming up are great enough to keep pace with the number of plays. It may grow, but still not keep pace. Or at least, I need a proof, like CKD’s, that it won’t. I don’t think you can simply wave infinity/infinity in the air and say the ratio is undefined. If so, CDK’s long attempt at a proof is unneeded. Do you think it’s unneeded?

OK, I don’t need a formal proof that it will keep pace, but I need either a proof, a convincing narrative, or consensus from the intelligent participants on the thread. (Please don’t take my scepticism personally.)

Some of this message could be clearer, but I hope my point comes across.

Tony


Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

Dex:

That was basically the same approach I was trying to use in my 4/15 1:56 AM post.

So the question was raised: What does this mean?

What does it mean that the expected value for the ratio is unbounded?

I think it means that if you play the two games, take the ratio of the payoffs, and do this again and again, you will get a sequence of ratios whose average value will continually increase. In other words:

[(p1/q1) + (p2/q2) + … + (pn/qn)] / n

goes to infinity as n goes to infinity (pi is the payoff for the ith game of Game 1, qi is the payoff for the ith game of Game 2).

The question that occurs to me, though, is this: How does this relate to the ratio of the total Game 1 payoff to the total Game 2 payoff? In other words, how does the above expression relate (if at all) to:

(p1 + p2 + … + pn)

(q1 + q2 + … + qn)

as n goes to infinity?

I think the latter expression seems more akin to the nature of the OP.

I haven’t tried working with it, but my conjecture is: The fact that the former expression goes to infinity probably implies that the latter expression doesn’t converge. (Doesn’t necessarily go to infinity, but just bounces around).


…ebius sig. This is a moebius sig. This is a mo…
(sig line courtesy of WallyM7)

Since all the terms in this series are positive, I think that “unbounded” and “converge to infinity” and “undefined” are all the same. I have been using “unbounded” in the mathematical sense of “increasing without bound” because the phrase “converges to infinity” is like chalk on a blackboard to me. OK?

Thus, I am using “unbounded” in the sense of “summing to countable infinity,” to distinguish it from “divergent but bounded” like the alternating (+1 - 1) example.

Cabbage, today is kinda rushed, but I think your question is about lim(SUM(An)/SUM(Bn)) compared to lim(SUM(An))/lim(SUM(Bn)) and I think in this example that both diverge (OK, OK, “converge to countable infinity” or “are unbounded.”)

As you say, Tony, an interesting and challenging set up. Part of the problem here, I think, has been understanding what the “ratio” of winnings means when we’re dealing with an infinite sequence of rounds. However, I think I demonstrated that the expected ratio (expressed as an infinite series) has a sub-series that is unbounded (“converges to infinity,” Lordy, I hate that term), and since all elements are positive, the full series must also be unbounded.

Anyway, this topic now has almost an unbounded number of posts.

CKD,

Not to be too picky, but just because a series has an unbounded subseries embedded within it, it doesn’t mean the series is unbounded. Its limit may be undefined. For example, the series 1, 0, 2, 0, 4, 0, 8, 0, 16, 0 . . . . has an unbounded subseries, but the series does not converge (even at infinity). The limit of the series is undefined. So we are still stuck between “increases without limit” and “undefined” in the “indeterminate” sense.

If this is just a quibble, never mind. But it may be relevant to the bottom line of your analysis.

Tony


Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

Dex:

No, actually, I was conjecturing that, if the expected ratio is unbounded, then the total ratio (ratio of the sums of the payoffs) is divergent. In other words: If

lim [SUM(An/Bn)] / n

is unbounded, does that imply

lim {[SUM(An)] / [SUM(Bn)]}

is divergent? (The first comes from the fact that the individual ratios per game will have an unbounded average, which is what you proved. The second is what the OP asked: will the total winnings per game have converging ratios?)

I agree that both statements are true in this case, but I was wondering about the general case. If it is true in the general case, then we would also have a proof that, for example, Tony’s variation on the game involving the factorial payoffs would also involve divergent ratios. Anyway, I’ll try to look into it.

Tony:

I believe you’re confusing “sequences” with “series”. As a sequence, what you said is true: the sequence does not diverge to infinity. As a series (the sum of the terms in the sequence), it’s not true (when we’re dealing with all positive terms, anyway). In other words:

The “subseries”

1+2+4+8+… is unbounded,

and so is the original series:

1+0+2+0+4+0+8+…


…ebius sig. This is a moebius sig. This is a mo…
(sig line courtesy of WallyM7)

Yup, you’re right.

Tony


Two things fill my mind with ever-increasing wonder and awe: the starry skies above me and the moral law within me. – Kant

Thanks all for the replies. Upon reviewing my position, I’ve seen I was looking at the wrong distributions, and I’m slowly coming around to the idea that it involves a ratio of infinite sums that doesn’t converge. (I’d be coming around faster if I hadn’t sold my calculus book a few years back.) The points made are generally clear; I’d just like to point out the right formula, if someone cares to work with it. I think it may not matter, but the expectation of the ratio actually comes out something like this (although CKDex’s piecewise definition is probably good enough, since it proved the point):

If X1 is the random variable counting coin flips until heads shows up, the distribution is
P(X1 = n) = p^n, where p = 1/2, for n = 1, 2, 3, …

Call the payoff for game 1 G1, and for game 2 G2, and for game 3 G3

P(G1 = k) = p^n for k = 3^n, and 0 everywhere else
P(G2 = k) = p^n for k = 2*3^n, and 0 everywhere else

The expected value E(G1) for G1 is
SUM (over payoffs k) : k * P(G1 = k)
= SUM (n=1 to inf) : 3^n * p^n

and E(G2) is clearly 2*E(G1)

Since G1 and G2 are independent, E(G1/G2) = E(G1) * E(1/G2) (independence lets the expectation factor; note this is not the same thing as E(G1)/E(G2)).
This is E(G3), the expected value of the ratio of the two payoffs. So it will come out to 1/2 only if everything cancels just right, and I’m not really sure it does. In fact, I’m inclined to think it doesn’t.
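Whatever the exact bookkeeping, the key fact is that E(G1)’s terms are payoff times probability, 3^n * (1/2)^n = (3/2)^n, which grow geometrically, so the partial sums increase without bound. A quick sketch:

```python
p = 0.5
partial = 0.0
for n in range(1, 41):
    partial += 3**n * p**n  # each term is (3/2)^n, 1.5x the previous one
    if n % 10 == 0:
        print(n, partial)   # the printed partial sums grow without bound
```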

Another question, perhaps deeper (perhaps not, if this isn’t what’s behind it), is the concept of infinity as a limit versus infinity as an extant thing. In other words, does it make any sense to speak of playing for an infinite amount of time and having an infinite sequence that comes out to something, or is it just some large-number ratios bouncing around a lot that just keeps going, where we could look in on the game at any time? Here’s a thought about this: it is imaginable that, given a fair (and admittedly immutable and indestructible) coin, there is a game that is infinite, i.e., you could keep flipping the coin and it would never come up heads. So in an infinite amount of time this happens, yes, but how to account for it? Mathematics assigns this the probability of the limit, which is in fact 0. I always found both infinite and infinitesimal events to be interesting mathematical concepts (and I used to get really scared as a child that some day, while playing a simple game, I might end up having to roll the dice for all eternity until a tie was resolved, or I hit the winners’ circle on exact count, or whatever).

It’s not as important a question as whether the universe is infinite, but it might be interesting.

panama jack

Tony says: << Originally posted by Cabbage:
Dex:
I believe you’re confusing “sequences” with “series”. As a sequence, what you said is true–the sequence does not diverge to infinity. >>

Just to be clear, I wasn’t confusing sequences with series. A series (all positive terms) with an unbounded subseries must itself be unbounded.

For sequences, a sequence with an unbounded subsequence cannot converge (to a finite limit), but the divergence itself could be undefined or could be unbounded. (Repeating, I hate the term “converges to infinity.”)

Dex:

I don’t think your proof is quite sufficient to answer the question originally raised by the OP (not that I disagree with the conclusion, however), which is why I brought up my conjecture a few posts ago (which turned out to be false).

For example, consider the following two sequences (sorry for the sloppy notation, but it should be clear):

An = 1,4,1,9,1,16,1,25,…
Bn = 4,1,9,1,16,1,25,1,…

(Both sequences alternating between the constant 1 and the squares, shifted by 1 from each other).

If we look at the average of the individual ratios (Ai/Bi), that is unbounded. In other words:

[(A1/B1) + (A2/B2) + … + (An/Bn)] / n

is unbounded as n goes to infinity.

However, the ratio of the sum of the first n Ai’s to the sum of the first n (same n here) Bi’s converges to 1:

A1 + A2 + A3 + … + An

B1 + B2 + B3 + … + Bn

goes to 1 as n goes to infinity.
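Cabbage’s counterexample checks out numerically. A sketch (the 1-indexed helpers `A` and `B` are my encoding of the two sequences): the ratio of the partial sums equals exactly 1 at every even index, while the average of the termwise ratios blows up.

```python
def A(i):  # 1, 4, 1, 9, 1, 16, ...  (1-indexed)
    return 1 if i % 2 == 1 else (i // 2 + 1) ** 2

def B(i):  # 4, 1, 9, 1, 16, 1, ...  (A shifted by one)
    return ((i + 1) // 2 + 1) ** 2 if i % 2 == 1 else 1

sum_a = sum_b = ratio_sum = 0
N = 10_000  # even, so both sums contain the same squares and the same 1s
for i in range(1, N + 1):
    sum_a += A(i)
    sum_b += B(i)
    ratio_sum += A(i) / B(i)

print(sum_a / sum_b)  # exactly 1.0 at even N: the ratio of sums converges to 1
print(ratio_sum / N)  # average of the individual ratios: huge and still growing
```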

The point is that your proof shows that the individual ratios, (Ai/Bi), have unbounded expected value. That’s not enough to demonstrate that

A1 + A2 + A3 + … + An

B1 + B2 + B3 + … + Bn

doesn’t converge as n goes to infinity, which is what the OP was asking.

I can see a couple of ways around this. One would be to show that the set of pairs of sequences {An,Bn} where such a counterexample to my conjecture occurs has probability 0.

Another way would be to show that the expected value of:

A1 + A2 + A3 + … + An

B1 + B2 + B3 + … + Bn

doesn’t exist for n=1,2,3,… (Your proof took care of the case n=1).


…ebius sig. This is a moebius sig. This is a mo…
(sig line courtesy of WallyM7)

Cabbage, I’m not entirely following you (on the other hand, it’s early, and last night was a late night.)

We have two games. Define a “round” to be one play of Game 1 (that is, flipping the coin until it comes up heads and getting paid accordingly) and one play of Game 2.

I’m not showing the ratios of each round to be unbounded, I’m showing the EXPECTED VALUE of the ratios of a round to be unbounded. The expected value of the ratios is obtained by enumerating all the possible outcomes of a single round, multiplying each outcome by the probability of that outcome arising, and summing.

So, I’m not looking at a particular sequence of outcomes, I’m looking at an enumeration of ALL outcomes, and then summing the expected value of each one… and the sum is unbounded.

Not sure whether that’s connecting, but I’m not sure where your example of partial sums is leading. I’m not looking at the average of the ratios, I’m looking at the sum of the expected values.

Dex:

Basically, my point is this: Your proof shows that the expected value of (A1/B1) is unbounded; that is, the expected value of the ratio after each game has been played once.

But what’s the expected value of the ratio after the games have been played, say, twice: the expected value of (A1+A2)/(B1+B2)? Or, more generally, the expected value of the ratio after the games have been played n times (not just once): (A1 + A2 + … + An)/(B1 + B2 + … + Bn)? I don’t think it necessarily follows that, since the expected value of (A1/B1) is unbounded, the others are unbounded as well. (It may actually follow, but I don’t think it’s obvious.)

So, for n=2, we would have:

Game 1 payoffs: $3, $3
Game 2 payoffs: $6, $6
Ratio: 1/2
Probability of this: 1/16
Add to the expected value: (1/2)*(1/16)

Game 1 payoffs: $3, $9
Game 2 payoffs: $6, $6
Ratio: 1
Probability: 1/32
Add to the expected value: (1) * (1/32)

Game 1 payoffs: $9, $3
Game 2 payoffs: $6, $6
Ratio: 1
Probability: 1/32
Add to the expected value: (1) * (1/32)

Game 1 payoffs: $3, $27
Game 2 payoffs: $6, $6
Ratio: 30/12
Probability: 1/64
Add to the expected value: (30/12) * (1/64)

And so on, through all the possibilities, and then sum up the expected value of the ratio of two payoffs of Game 1 divided by two payoffs of Game 2.

In general, it seems at least possible to me that this expected value (or the expected value of the ratios of n games, for some n) could exist, even if the expected value of the ratio of the payoffs after a single game of each doesn’t exist.
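This can at least be probed numerically. A sketch (the truncation cutoff is my choice, and a truncation experiment suggests rather than proves anything): cut every first-heads time off at N tosses and sum probability times ratio over all four-tuples, exactly the enumeration above. The truncated expectation keeps climbing as N grows, hinting that the n = 2 expected value fails to be finite as well.

```python
def truncated_ev(N):
    """E[(A1 + A2)/(B1 + B2)] with every first-heads time cut off at N tosses.
    A-payoffs are 3^n (Game 1); B-payoffs are 2*3^m (Game 2)."""
    total = 0.0
    for n1 in range(1, N + 1):
        for n2 in range(1, N + 1):
            for m1 in range(1, N + 1):
                for m2 in range(1, N + 1):
                    prob = 0.5 ** (n1 + n2 + m1 + m2)
                    ratio = (3**n1 + 3**n2) / (2 * 3**m1 + 2 * 3**m2)
                    total += prob * ratio
    return total

for N in (5, 10, 15):
    print(N, truncated_ev(N))  # keeps growing as the cutoff is raised
```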


…ebius sig. This is a moebius sig. This is a mo…
(sig line courtesy of WallyM7)