The specific numbers don’t matter at all. Let’s consolidate the premise:
Highest number is -432
Second highest number is -1256
998 of the cards have a number lower than -1256
Divide the deck into two halves. Each of the following four scenarios is equally likely:
The first half contains both -432 and -1256
The first half contains -432 and the second half contains -1256
The second half contains both -432 and -1256
The first half contains -1256 and the second half contains -432
If your strategy is as I detailed in post #2 (make a mental note of the highest card in stack 1, and choose the first card in stack 2 that beats it), then in the first two scenarios you lose, in the third scenario you have some chance of winning, and in the fourth scenario you’re guaranteed to win. Even without doing the odds for scenario 3, you’re already looking at at least a 25% win rate. On a bet that pays 8:1, you’d be foolish not to take it.
Now replace -432 and -1256 with whatever numbers you want. Forget about the other 998 numbers, the logic works out the same regardless. Forget that -432 is a “low number,” whatever that means. It’s higher than -1256, which was the highest card you saw in stack 1, so you should pick it.
All that matters is that the cards be orderable: That is to say, given any two cards, you should always be able to say which of the two is better, and “better” is transitive. In the secretary problem, the hiring manager probably can’t assign a numerical value to the applicants at all, but can still make judgements about which one is better than another. And that’s all that’s needed.
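That half-split reasoning is easy to sanity-check numerically. Here is a quick Monte Carlo sketch of the strategy (the card values are arbitrary distinct integers, since, as noted, only the ordering matters):

```python
import random

def play_half_split(n=1000, trials=10_000):
    """Estimate the win rate of the post-#2 strategy: note the best card
    in the first half, then stop on the first second-half card that beats
    it.  A win means that stop lands on the overall best card."""
    wins = 0
    for _ in range(trials):
        deck = random.sample(range(10 * n), n)  # any distinct values work
        threshold = max(deck[: n // 2])
        best = max(deck)
        for card in deck[n // 2:]:
            if card > threshold:
                wins += card == best
                break
    return wins / trials

print(play_half_split())  # typically ≈ 0.35 for a half split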
OK, I think I understand your theory. Bear with me while I type out loud.
The player only stops on a card in the 2nd stack.
In scenario 1, the two highest numbers are located in the 1st stack. The player loses every time.
In scenario 2, the highest number is in the 1st stack. The player loses every time.
In scenario 3, the two highest numbers are located in the 2nd stack. If -1256 appears before -432, the player stops on -1256 because it’s higher than any number in stack 1. The player loses every time.
Or
-432 appears before -1256, and the player wins (strictly speaking, only if no other 2nd-stack card above the stack-1 maximum shows up even earlier). Calling it a 50/50 chance of one 25% scenario gives 12.5%.
In scenario 4, the player stops on -432, and the player wins every time.
Assuming the game is played long enough (and ignoring any excessive winning or losing streaks common to gambling), there is roughly a 37.5% chance of winning a game.
1000 games would net the player $40 × 375 wins = $15,000.
1000 games would net the dealer $5 × 625 losses = $3,125.
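Assuming the stakes implied above ($5 per game, paying 8:1 on a win, which is my reading of the thread's figures), the arithmetic checks out:

```python
games = 1000
win_rate = 0.375                  # the 37.5% scenario count above
wins = round(games * win_rate)    # 375 wins
losses = games - wins             # 625 losses
player_gross = 40 * wins          # $40 per win at 8:1 on a $5 bet
dealer_gross = 5 * losses         # $5 per loss
print(player_gross, dealer_gross, player_gross - dealer_gross)
# 15000 3125 11875 -> the player comes out $11,875 ahead
```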
If I got it right, thanks for clearing it up. (If I’m still wrong, I’ll blame it on Alzheimer’s, or an aneurysm. )
Martin Gardner put this in one of his “Mathematical Games” columns in Scientific American. In essence, steronz, all the way back in post #2, got it right.
Martin Gardner said the strategy is to look at the first 32 numbers – the square root of the total number of cards – and then to choose the first number that exceeds the maximum of those 32. If no later number exceeds that maximum…you lose. The square root of the total population gives the optimum sample size, giving you as much information as you are going to get without starting to eat into the likely winners. Picking half the population as a sample means you have at most a 1/2 chance of winning. Throwing away only the first 32 numbers raises this substantially.
ETA: Gardner’s analysis involved only a limited number of cards…but absolutely no limits on the numbers printed on the cards.
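For what it’s worth, the sample-then-commit strategy has a standard exact win probability: with n cards and a sample of the first r, P(win) = (r/n) · Σ_{k=r+1}^{n} 1/(k−1). A quick comparison of the cutoffs mentioned in the thread (a sketch of mine, not from Gardner’s column):

```python
from math import e

def p_win(n, r):
    """Exact win probability of the cutoff strategy: sample the first r
    cards, then take the first later card that beats all of them."""
    return (r / n) * sum(1 / (k - 1) for k in range(r + 1, n + 1))

n = 1000
for label, r in [("sqrt(n)", round(n ** 0.5)), ("n/2", n // 2), ("n/e", round(n / e))]:
    print(f"{label:8s} r={r:4d}  P(win) = {p_win(n, r):.4f}")
```

On this formula the n/e cutoff beats both the √n and half-deck cutoffs for picking the single best card, consistent with the 1/e posts later in the thread.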
I haven’t looked at any spoiler yet, but here is a method that will win at least 1/4 of the time, so I will play. There is 1 chance in 4 that the highest card is among the last 500 cards and the second highest among the first 500. If you play by passing on the first 500 and then looking for the first card in the second 500 higher than all you saw in the first 500, then this works subject to the first sentence above. That is one chance in 4. However, there are other winning conditions. If the third highest is in the first 500 and the highest and second highest in the last 500, but the highest comes earlier, you will still win. That adds another 1 chance in 8 for a total of 3 chances in 8 and there are other winning conditions. From an earlier post, I gather that you should turn over 1/e of the cards and then your chances are 1 in e (both approximate, of course).
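These fractional estimates can also be checked exhaustively for a small deck (my own sketch; n = 6 keeps it to 720 orderings):

```python
from itertools import permutations

def exact_win_rate(n, r):
    """Enumerate all n! orderings: pass on the first r cards, then stop
    on the first card beating all of them; count the orderings where
    that stop lands on the overall best card (here the value n - 1)."""
    wins = total = 0
    for deck in permutations(range(n)):
        total += 1
        threshold = max(deck[:r])
        for card in deck[r:]:
            if card > threshold:
                wins += card == n - 1
                break
    return wins / total

print(exact_win_rate(6, 2))  # 77/180 ≈ 0.4278, matching (2/6)*(1/2+1/3+1/4+1/5)
```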
There’s an unlimited number of secretaries, but you’re going to interview a finite number, just like in the card problem there’s a maximum number of cards you get to look at before stopping. Once you have decided the maximum number you’d be willing to interview, you can solve the problem as suggested.
Congratulations to steronz for being first to vote and post the right answer. I don’t know why I had such a problem with this puzzle when I first heard it. Perhaps because it seems so counter-intuitive. Math is a funny thing.
**steronz** has the right basic strategy, but *half* the cards is the wrong place to flip. It’s about 36.8% (1/e).
**doorhinge** - think of it this way: no matter what numbers are on the cards, the player is hoping to see the 2nd-highest card in the first ‘section’ without seeing the highest card. Then, they wait for a card to beat it. It works about 37% of the time, if you use about 37% of the ‘deck’ to hope for the 2nd-highest card without the highest.
This problem is an excellent test. You need to
(1) come up with a parametric form of a solution.
(2) express the objective in a simple mathematical form.
(3) solve the resulting calculus expressions.
If you do (2) elegantly, (3) will require only trivial junior-high calculus.
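For the original problem, steps (1)–(3) go through cleanly in the large-n limit (a sketch of the standard derivation, with x the fraction of cards sampled; the integrand x/t is the probability that, when the overall best card sits at fractional position t, the best of the first t lies inside the sample):

```latex
P(x) = \int_x^1 \frac{x}{t}\,dt = -x \ln x,
\qquad
\frac{dP}{dx} = -\ln x - 1 = 0
\;\Rightarrow\;
x = \frac{1}{e}, \quad P\!\left(\tfrac{1}{e}\right) = \frac{1}{e}.
```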
To test my own fading skills, I tried to solve this variant. Same as OP’s problem but you also win if you pick the 2nd-highest card.
I found the solution to be easier than I thought it would be. The solution form uses two parameters, and I found that one parameter has a simple optimal value (simpler than 1/e), with the optimal value independent of the other parameter.
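Out of curiosity, here’s a rough Monte Carlo harness for exploring that variant. The two-threshold family below is my own illustrative guess, not the poster’s solution form: skip the first fraction a of the cards, then accept any best-so-far, and after fraction b also accept a second-best-so-far.

```python
import random

def top2_win_rate(n=1000, a=0.35, b=0.65, trials=5_000):
    """Monte Carlo for the variant where stopping on the 2nd-highest card
    also counts as a win.  Parameters a and b are illustrative guesses."""
    wins = 0
    for _ in range(trials):
        deck = random.sample(range(10 * n), n)
        top2 = set(sorted(deck)[-2:])
        best = second = float("-inf")
        pick = deck[-1]                # forced to take the last card
        for i, card in enumerate(deck):
            if (i >= a * n and card > best) or (i >= b * n and card > second):
                pick = card            # stop here
                break
            if card > best:
                best, second = card, best
            elif card > second:
                second = card
        wins += pick in top2
    return wins / trials

print(top2_win_rate())
```

Sweeping a and b over a grid would give an empirical picture to compare against the analytic two-parameter answer.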