I’m with Bricker on this. I don’t see any notable distinction here.
With that said, I also agree with Measure for Measure that the primary effect here is the damage to pride, not to the pocketbook. I think a $100 bet is only going to show marginally more confidence than a $10 bet on the same issue. Humans aren’t built to be fully logical about differences along this scale. The primary issue, in my opinion, is confirmation bias. Our minds can whitewash our memories to erase all the mistaken statements we make, thus avoiding the hit to our pride. By forcing us to confront our mistake for at least as long as it takes to make a $10 payment, a bet encourages more consideration.
Most especially, it forces people to consider carefully precisely what statement they’re defending, so there can be no weaseling on the wording later. (“Well, what I really meant was…”) This just seems psychologically obvious to me, and the distinctions that people are trying to draw between confirming facts today and waiting for a prediction next year don’t seem relevant in the slightest. People are people, and we’re going to naturally be more careful about what we say if there’s some trigger that will cause even a slight bit of pain if we’re wrong. Small wagers on matters of fact, even if they are future matters of fact, will lead most people to be more careful.
Why would it matter if anyone bet on these things? Now, if I had spent a long time arguing that enacting a piece of legislation would reduce the number of homeless people, I can kind of understand the point of betting on an upcoming homelessness report, because I would be betting on the very argument I’ve been trying to convince others with (but too bad I don’t bet on anything anyway).
But I fail to see how a person’s prediction for the election could be the least bit important. You’re not even betting on the argument that you’re invested in; you’re just betting on a horse.
Yes, he is talking about predictions where the outcome can be seen unambiguously to be correct or not at some point in the future. For example, “Will Obama win a second term?” “Will the Yankees win the World Series this year?” “Will there be an earthquake of magnitude 6 or larger in Los Angeles within the next five years?” Etc. That’s what the whole book is about.
One point he makes over and over is that some people who seem to be making predictions are doing nothing of the sort; they are just entertainers. For example, most political pundits on the talk shows. Their predictive power is so poor that they can’t possibly believe what comes out of their own mouths.
Nate Silver, in his book, is specifically not talking about deciding who is right or wrong; he categorically says to try to eliminate all biases as much as possible, because bias clouds your ability to make accurate predictions. The idea is to simply predict what actually will happen (admittedly very difficult in the case of SCOTUS decisions).
If what you think will happen is that the SCOTUS will decide in a way very much against your personal beliefs, so be it. If you’re an honest predictor, that’s what you have to say. That’s what got all those Romney supporters in trouble. They didn’t want to believe the evidence that pointed against them.
I’m still grappling with his idea that in cases of disagreement, one should always bet. I’m not sure that I agree completely myself. But I think I understand where he’s coming from. It goes like this. If you think you’re right, then you believe that there’s a greater chance that your prediction will come true than the other guy’s. So if you make a bet with him, chances are greater than 50% (or some figure, depending on your degree of certainty) that you will win. It’s not certain, but if you make a lot of bets like these then in the long run you will come out ahead.
And that last point is the important one. Nate Silver has always been a big betting man, so for him it’s second nature to have lots of bets going on; if one doesn’t pan out, another one will. And the concept of a constant stream of bets that is certain to pay off in the long run, because each one has better than 50% odds, makes intuitive sense to him.
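To put rough numbers on that long-run logic, here is a minimal sketch (all figures are assumptions for illustration, not anything from Silver’s book): even-money bets of $10 each, with a true 55% chance of winning each one.

```python
import random

def fraction_ahead(n_bets=1000, p_win=0.55, stake=10, trials=10_000):
    """Simulate many streams of even-money bets and report how often
    the bettor ends up with a net profit.

    Hypothetical numbers: 1,000 bets of $10, each won with probability 0.55.
    Expected value per bet is 0.55*10 - 0.45*10 = +$1, so a long stream
    almost always finishes ahead even though any single bet can lose.
    """
    ahead = 0
    for _ in range(trials):
        bankroll = 0
        for _ in range(n_bets):
            bankroll += stake if random.random() < p_win else -stake
        if bankroll > 0:
            ahead += 1
    return ahead / trials

print(fraction_ahead())  # typically prints something around 0.999
```

The sketch only shows that a small per-bet edge compounds over volume; it says nothing about whether the 55% estimate was honest in the first place, which is the part people are actually arguing about.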
Then why did you make a post that amounts to “I make bets on things! Look at this study! It says money makes people think better! It means I’m better than all you guys! Tell me I’m right!”
You could have decided to just discuss the study. But you decided to make it personal, to turn it into an attack. Why do you feel the need to attack if you don’t feel persecuted? There was no reason to insinuate things about other people unless that was the point of your post.
You deliberately poisoned the well before you started. Why did you do that if you just wanted to discuss it rationally?
There are many decisions that go into making a bet. Do I think it’s moral to waste money over crap like this (or cause someone else to do so)? Do I trust that person? Can I afford the risk, even if I am over 50% sure I’m right? Do I like that person? Have I said things in the past that would indicate I wouldn’t take a bet, and is this poster one of those “gotcha” posters who will call me out on it? Heck, is the guy even serious, and will I look like an idiot if I act like he is? Or do I value calling him on his bluff enough to create strife?
And that’s just off the top of my head, after less than five minutes thinking about this. Simple things you are smart enough to have thought of. But, no, the fact that we won’t bet you means that you are right and we are wrong. That’s the premise you go with.
That seems more consistent with your proclivity to act like you are getting back at the liberals on this board than it is with the idea that you are genuinely asking. Heck, if it were possible to determine the truth, I’d make a bet on it.
And hence my valuation of Nate Silver as a rational person has diminished greatly. On the plus side, that means that there will be people who can make even more accurate predictions in the future.
Betting is a frivolous activity; if you have enough free money to do it with, something is wrong with your monetary priorities. You can get a much better return on pretty much anything else.
I’d love to see studies of the sort described in the OP applied to sportswriters who confidently make dumb guesses.
If there were financial consequences to repeatedly mispredicting the outcomes of playoff games, championships, statistical titles, draft success, etc., would sports pundits make fewer stupid predictions?
As for the partisan gotcha angle, I don’t know which factions are overall most prone to embracing incorrect information and bad predictions. My own observation (based in part on Wall Street Journal coverage of the 2012 campaign) is that conservatives had by far the most horrible record in predicting outcomes, making the average Doper look like a genius by comparison.
How about stock or business analysts? Not so much the professionals, for whom there is already a financial consequence for consistent error, but people like Jim Cramer of CNBC’s Mad Money, who confidently and aggressively declares that certain events will happen, with very little accountability.
Loss aversion is a well-known facet of human psychology; you are probably at least a little familiar with it.
When a choice presents opportunity for gain with no risk of loss, people will take it every time (as in the study in the OP).
When a choice presents both opportunity and risk, people will weight the risk disproportionately to the actual probability involved (as in a wager on a prediction).
When a choice presents a risk of loss with no opportunity for gain, people will participate only if the outcome itself is valued for its informational content (which is to say, not usually).
So if a weatherman forecasts a 60% chance of rain but won’t wager $500 on it, it doesn’t mean he doesn’t believe his prediction or that he isn’t a good weatherman; it just means the odds aren’t strong enough to overcome his aversion to losing $500.
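To make the weatherman example concrete, here is a minimal sketch using a standard loss-aversion weighting (the 2.25 multiplier is the classic Tversky/Kahneman prospect-theory estimate, assumed here for illustration; the even-money $500 stake is likewise assumed):

```python
def subjective_value(p_win, gain, loss, loss_aversion=2.25):
    """Value of a bet when losses are felt loss_aversion times as
    strongly as equal-sized gains (2.25 is the classic Tversky/Kahneman
    estimate; treat it as an assumed illustration, not a measured fact
    about any particular weatherman)."""
    return p_win * gain - (1 - p_win) * loss_aversion * loss

# Pure expected value of an even-money $500 bet on a 60% forecast:
print(0.6 * 500 - 0.4 * 500)            # +100.0, so the bet is "worth taking"

# The same bet filtered through loss aversion:
print(subjective_value(0.6, 500, 500))  # -150.0, so it feels like a loser
```

So a perfectly sincere 60% forecast can coexist with refusing a $500 even-money wager on it.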
Similarly, when someone on a message board won’t take your bet, they’re likely thinking the same thing, though they can’t articulate it in mathematical terms.
So, the comparison in your OP is demonstrably not valid… putting money on predictions is different from answering test questions, and the reason has nothing to do with misrepresenting one’s convictions. Not unless the person says the probability of the event is 100%. In that case, if they say they won’t take your money, then of course they have said an untrue thing. But I would challenge you to go back and find a situation where someone gave 100% probability on a prediction that they also refused to wager on.
As HMS Irruncible points out, they would likely be unwilling to make predictions at all, for the simple reason of risk aversion. All people involved in sports know quite well that sports are inherently unpredictable, so any prediction a person would take a fair bet on is very close to being a toss-up once the odds are agreed upon. A sportscaster would therefore be risking his money for no good reason, and risking the possibility of long downswings that would arise purely by chance.
Professional sports bettors - people who actually make a living at this - beat the vigorish by 4 percent, 5 percent, numbers like that. They make enormous numbers of bets and just barely beat the odds, avoiding serious downswings by making large volumes of bets on sports they have tremendous insight into.
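For a sense of scale, here is a minimal sketch of that arithmetic (the risk-$110-to-win-$100 line is the standard American convention, assumed here purely for illustration):

```python
def breakeven_win_rate(risk=110.0, win=100.0):
    """Win rate needed just to break even against the bookmaker's
    vigorish on a standard risk-$110-to-win-$100 line."""
    return risk / (risk + win)

def roi(win_rate, risk=110.0, win=100.0):
    """Long-run return per dollar risked at a given win rate."""
    return (win_rate * win - (1 - win_rate) * risk) / risk

print(breakeven_win_rate())  # ~0.524: must win about 52.4% just to tread water
print(roi(0.55))             # ~0.05: a 55% win rate earns about 5 cents per dollar risked
```

A few points of edge over thousands of bets is a living; the same edge over a handful of bets is basically noise.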
The “you don’t really believe what you’re saying unless you’re willing to bet on it” nonsense is based on the complete refusal to accept the concept of risk aversion. I genuinely believed Barack Obama was probably going to win the 2012 presidential election, but I sure as hell wasn’t going to bet my car on it. There are many people who simply have extreme risk aversion to betting.
The OP is badly confusing **three** different phenomena. The study shows that people are simply prone to telling pollsters bullshit, and will become truthful when incentivized to do so. Betting on a prediction is (sometimes) a completely different thing. A third different thing still is the crap talking heads spew on television, which is just said for entertainment value and has not the first connection whatsoever with the truth. (Nate Silver in The Signal and the Noise does a study of talking heads on TV and concludes that their predictions are effectively just random or political roleplaying.)
Was it Perry to whom Romney offered to bet $10K over whether he could prove some point about his record, right at the debate? IIRC it did momentarily shut up the contender (though mostly out of stunned disbelief), but it only made Mitt look like a rich prick for whom 10K is mere bar-bet money.
But I imagine I’d be able to find at least a couple of instances in which someone says, “There’s no way X will do Y,” and then refuses a wager. Doesn’t “no way” indicate certainty?
Well, yes, but it doesn’t mean they are insincere in their belief; it means the prospect of loss forced them to quantify the odds and conclude that they aren’t 100% certain.
Of course we also cannot rule out that people simply lie as long as it is a cost-free rhetorical device. My point is simply that lying is neither the sole explanation nor the likeliest one.
Yes, they should. Because strong opinions about the future often have more grounding in emotion than in insight into objective reality. Because in practice those who claim not to be betting as a matter of principle often can’t articulate that principle very well. Because the amounts involved can often be small: I usually offer (probabilistic) bets in the range of about a dollar or even less; again, the process is about being careful and setting odds. Alternatively, you can gamble larger amounts where a gift to charity is at stake.
It’s not enough to say, “I don’t bet as a matter of principle.” You have to give a reason why. The underlying point of the exercise, after all, is to advance a particular form of humility. Predicting the future is hard (and random!), and strong opinions are often the province of the blowhard. Discouraging certain strong opinions, ones that are insufficiently formulated, is a good thing.