Hi. I have a reasonably difficult probability question that I would really like some help with. If you don’t have some background in probability theory, you probably needn’t read further.
If you do have some background in probability theory, I won’t waste your time explaining the basis for my interest. Rest assured, however, I am not just trying to get an answer for a homework problem set. I am really interested in the question, but don’t possess the training to answer it. Perhaps it will be easy for you.
The problem is below. Again, any help would be sincerely appreciated.
Many thanks,
Question:
In Game 1, a fair coin is flipped repeatedly until it comes up heads. If it first comes up heads on the nth toss, you are awarded 3^n dollars.
In Game 2, a fair coin is flipped repeatedly until it comes up heads. If it first comes up heads on the nth toss, you are awarded 2*(3^n) dollars.
Game 1 and Game 2 are played repeatedly an equal number of times. As the number of plays increases, will the ratio of the average winnings of Game 1 to the average winnings of Game 2 converge? If so, what will the ratio converge to? 1? ½? 0?
(The question appears nontrivial to me because, for both games, the expected winnings of a single play are infinite: heads first appears on toss n with probability 2^-n, and the payoff 3^n grows faster than that probability shrinks, so the series 3^n/2^n diverges.)
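In case it is useful, here is a minimal Monte Carlo sketch of the two games (in Python, which I am assuming is convenient). Since both games have infinite expectation, the sample averages are dominated by rare, enormous payoffs, so the numbers it prints are suggestive at best and certainly not a proof of anything:

```python
import random

def play(rng, multiplier):
    # Flip a fair coin until it comes up heads; heads first appearing
    # on toss n pays multiplier * 3**n dollars.
    n = 1
    while rng.random() < 0.5:  # tails with probability 1/2: flip again
        n += 1
    return multiplier * 3 ** n

def average_winnings(rng, multiplier, plays):
    # Sample average over a fixed number of independent plays.
    return sum(play(rng, multiplier) for _ in range(plays)) / plays

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so the run is reproducible
    for plays in (10**3, 10**4, 10**5):
        avg1 = average_winnings(rng, 1, plays)  # Game 1: pays 3**n
        avg2 = average_winnings(rng, 2, plays)  # Game 2: pays 2 * 3**n
        print(f"{plays:>7} plays: avg1/avg2 = {avg1 / avg2:.4f}")
```

Repeated runs with different seeds wander considerably rather than settling down, which is what drew me to the question in the first place.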