Daniel Kahneman has died, and I remember reading him and thinking he was wrong. What do you think?

For those who want to check the research:

http://cemi.ehess.fr/docannexe/file/2780/tversjy_kahneman_advances.pdf (table on p 307)

The odd part of these things is that I fret way more over trivial amounts than large ones. I feel super bad if, say, something ruins a $20 lunch. But if the stock market is down that day and I lose $20,000? Meh. The $20 is just way more concrete than the $20,000 and hurts more.

I think that for the small amounts, there is a bit more bias against losses. Losing $20 feels worse than gaining $20. But for the abstract amounts, it’s about the same.

And then there’s stuff like spending way too much time figuring out which toothpaste is cheapest, even when the difference is like 15 cents and can’t possibly pay back the time I wasted staring at the wall of options. But not doing that also feels bad.

I can relate to the toothpaste thing, but the other way around: I tend to buy the more expensive one, even though I know it is not really better. But I buy it nonetheless, despite being aware that the 15 cents (more like €1, but never mind) is a rounding error. Do you think Elon Musk thinks about toothpaste? Ever? I remember reading somewhere that the founder of Walmart not only took the cheap ballpoint pens from hotels, but instructed his staff to do the same. What could be more trivially irrelevant than that? (And there was a time I had dozens of those pens :smiley: But I only used the nice one I bought. It was not better, it just looked cooler and felt better in my hand. It was a gold-plated Cross. Now I use a Caran d'Ache my wife gave me for my 50th. And I no longer take the cheap ones.)

The last time, I think there was even less than 15 cents at stake. I noticed that the 3-pack of some brand was actually a couple of cents more expensive than three individual units, so I ended up getting those. But just the extra time spent grabbing the three separate ones and putting them through the self-checkout wasn't worth those few cents!

I dunno about Musk, but there’s this story about Bill Gates and Warren Buffett:

Buffett probably made $1,000 in the time he spent just digging those coupons out of his pocket.

Losing money creates a risk. Yes, it’s a different risk than a hungry tiger, but that’s a matter of degree.

Perhaps a tiger was a distracting example of a threat. In the environment where our ancestors arose, losing your clothing, losing the food you had already gathered, or losing the water you were carrying were all potentially serious threats. We perceive threats more acutely than benefits.

I said it depends.

If I were to wake up tomorrow to find that I'd lost a million dollars in the stock market, it would make me considerably more miserable than the happiness I would feel if I had somehow gained a million. That's because losing a million would mean a material change in my retirement prospects; I'd probably choose to work longer. Gaining a million might give me more of a cushion, but would likely benefit my heirs more than it would me.

The same thing for $100k each way would be meh. $10k I wouldn't even notice. That's why I don't gamble. The prospect of going to a casino and winning $500 (or even $5,000) seems hardly meaningful. But that's a Big Day for some people I know who have much higher income and wealth than I do. There's zero excitement for me in taking a chance on a tiny fraction of my net worth. Winning a tennis tournament is a big deal to me; winning a few hands of blackjack or a few spins of the roulette wheel, not so much.

It’s pretty obvious that people think that way, if you watch game shows. Even though that money they’ve “won” isn’t their money, they didn’t work for it, etc… they still agonize over whether to press their luck, or go for the final round or whatever.

If they were being rational about it, they wouldn’t be sweating the game show money, as there’s literally nowhere to go but up, in the context of the real world. You don’t come out of a game show poorer than before; the question is to what degree do you come out ahead.

Hell no, for me the latter would sting much worse… except that I'm in it for the long run with the market, and diversified, so I don't care about day-to-day swings and know they will happen. I've got twenty years or more to make up for that swing. It's not a $20k loss to me; it's just part of the random walk with an upward trajectory (hopefully).

On the other hand, if I were shortsightedly invested all in one stock and lost $20k because the stock went completely bust, that would really, really suck. With the $20 dinner, I don't think it's so much concreteness as that with the investment you expect the loss to be temporary, or just a small part of an overall growing portfolio. With the dinner, you're simply out that money. (And I wouldn't be upset about $20 anyway. Who cares?)

I think Kahneman’s work was clever and groundbreaking. But it is a tremendous simplification to reduce it to this one point. I was sad to hear this news of his passing.

Maybe I’m misinterpreting what you’re saying, but that doesn’t sound like “being rational”; it sounds like the “House Money Effect”: the tendency to think of money differently depending on its source, and specifically to treat money you win as different from money you earn. This is part of Mental Accounting, which, like loss aversion, is an important concept in behavioral economics.

Some gift links. WARNING! Shoppers are reminded that soup purchases are limited to 12 cans.

But Kahneman did not present "an argument" based on his "feelings" regarding what he perceived. He ran a study, got results, and interpreted them.

Since then, other people ran other studies, and interpreted them as well. Some agreed with Prospect Theory, some did not.

This paper contains a review of some of the relevant work that has been done on the topic in the last 30 years:

You said you believe this theory may have a US bias. Fine - did you look up any studies done in other countries that support or dismiss that claim?

To put it another way…

Prospect Theory is a model used to describe the decisions made by human beings. It is not a perfect model; human decision-making is not as rigid as the acceleration of a ball towards the earth at 9.8 m/s^2. But it has more explanatory power than traditional economic models that assume we are all rational actors working to maximize utility.
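If you want to see the shape of the model rather than just argue about it, here is a minimal Python sketch of the prospect-theory value function. It uses the functional form and the median parameter estimates reported in Tversky & Kahneman's 1992 paper (alpha = beta = 0.88, lambda = 2.25); the dollar amounts are just the ones from earlier in the thread.

```python
# Minimal sketch of the prospect-theory value function, using the median
# parameter estimates from Tversky & Kahneman (1992):
# alpha = beta = 0.88 (diminishing sensitivity), lambda = 2.25 (loss aversion).

ALPHA = 0.88          # curvature for gains: value grows slower than money
BETA = 0.88           # curvature for losses
LOSS_AVERSION = 2.25  # losses loom about 2.25x larger than equal gains

def value(x: float) -> float:
    """Subjective value of gaining/losing x dollars relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LOSS_AVERSION * ((-x) ** BETA)

# Losing $20 "hurts" more than twice as much as gaining $20 feels good:
print(value(20), value(-20))      # ~13.96 vs ~-31.4

# Diminishing sensitivity: a $20,000 swing doesn't feel 1,000x bigger than $20.
print(value(20_000) / value(20))  # ~437, not 1000
```

The lambda of 2.25 is the formal version of "losing $20 feels worse than gaining $20", and the curvature below 1 is why the abstract $20,000 market swing doesn't register a thousand times more strongly than the ruined lunch.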

A number of empirical tests have shown that what happens in reality lines up with Prospect Theory's predictions better than with a simple analysis of utility. Other studies came to other conclusions - the models didn't work perfectly in every situation they were tested under - and researchers have come up with different explanations for why this might be the case.

All of that is much stronger evidence than how I feel a theory lines up with my own subjective experience. Especially since the whole point of the theory is that we humans are pretty bad at assessing this kind of stuff.

That may be, but it might also be the endowment effect, which causes people to place a higher value on an object simply because they possess it. The experiment gave mugs to half of a class of MBA students. The researchers then immediately asked the students with mugs how much they would need to be paid to sell them, and the students without mugs how much they would pay to get them. The students with mugs quoted a significantly higher amount.
I've observed this effect in engineering (and have a paper that mentions it). To make a chip easier to test, you add some hardware to it. In the old days designers objected vehemently to this, because the area of the chip went up by, say, 10%, and that costs money. Then they stopped objecting. The reason, I think, is that the tools that synthesize the chip design knew this overhead was going to be added, and put it in from the beginning. The final area was exactly the same, but the designers never saw the smaller pre-insertion area, and so never felt the loss.

It might be innumeracy, but how many people would actually do the calculation rather than go with their gut instinct? And many similar effects have been shown to work on people with high levels of mathematical skill.
Here is an example from a class my daughter and I taught on behavioral economics applied to engineering. (She has a PhD in Judgment and Decision Making, which is the more general term for the field.)
The class consisted of engineers expert in testing computer chips, who all knew about failure rates and defects-per-million (DPM) rates. We surveyed the class with two questions. Half the class got the question "Is the defect rate for ICs in the field more or less than 1 per million?" The other half got the question "Is the defect rate more or less than 100,000 per million?" The answer is "more" for the first and "less" for the second.
Then they all got the question "Estimate the DPM rate for all chips." There is obviously no good answer to this, because it depends on all sorts of things.
However, the first group's estimates were much lower than the second group's. The results were statistically significant even for a small sample size. In fact, the distribution was perfectly bimodal.
It was a great example because the audience, like me, prided itself on being rational engineers, and they all certainly have great math skills. This effect, called anchoring, is hardwired into us.
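To make the setup concrete, here is a hypothetical sketch of how the two groups' answers might be compared. The DPM estimates below are invented for illustration (the actual survey numbers aren't in this thread); only the shape of the analysis - two anchored groups, one-sided test - follows the story above.

```python
# Hypothetical sketch of testing an anchoring result like the one described
# above. All DPM estimates here are made up; only the structure of the
# comparison follows the classroom survey.
from scipy.stats import mannwhitneyu

# Estimates from the group anchored on "1 per million" (invented numbers):
low_anchor = [5, 10, 20, 50, 100, 150, 200, 300]

# Estimates from the group anchored on "100,000 per million" (invented numbers):
high_anchor = [5_000, 10_000, 20_000, 40_000, 50_000, 75_000, 80_000, 90_000]

# One-sided test: are the low-anchor estimates systematically smaller?
stat, p = mannwhitneyu(low_anchor, high_anchor, alternative="less")
print(f"U = {stat}, p = {p:.5f}")  # a tiny p even with only 8 people per group
```

A rank-based test like Mann-Whitney seems a reasonable choice here, since estimates like these tend to be heavily skewed; with groups this cleanly separated, even n = 8 per group comes out significant.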

Well, clearly primitive man didn’t play the market. The point is that the biases we have built in due to the proverbial tiger have impacts on our decisions today.
It would be odd if they were too US-centric, given that they were both Israeli. Western-centric, perhaps. I'm not aware of any studies done in non-Western countries. I'll ask my daughter; she'll probably know. However, I do know of similar studies for other effects done in China, with the same kind of results.

What I’m saying is that when you go on a game show, there’s no chance that your bank account will actually go down. The very absolute worst thing that happens is that you don’t win anything and you’re exactly where you were when you went in. Everything else is varying degrees of winning money.

But people act like there’s some sort of loss involved with money you don’t actually have (yet), and are very risk-averse to losing this fictitious money, and will often not take chances and leave rather than “lose” the money.

Which is absurd; you don’t ever come out behind.

I don’t agree with that analysis at all. There are two different situations you could be describing.

  1. The person gets to keep the money they win. If that’s the case, then they effectively have been given the money when it is awarded to them, and do lose money when it is taken away. That they don’t have it now is no different than the fact that they haven’t received their paycheck yet. It doesn’t mean you’d be okay with having your paycheck reduced.

  2. They don’t get to keep the money unless they win. In that case, they are concerned with losing progress towards winning, rather than the money itself. The more points you have, the more likely you are to win and get to convert those points to money.

Neither one is irrational at all. If anything, your argument seems the more irrational one. The fact that I won't be any worse off than when I started doesn't change the fact that I was guaranteed, say, $10,000, until I risked it all on a question. Even if I claw back $500, I still could have had $10,500. And a net win of $500 is still $10,000 less than a win of $10,500.
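For what it's worth, the "rational" version of that choice is just an expected-value comparison. A quick sketch with hypothetical numbers (the $10,000 bank and $500 consolation are from the example above; the win probabilities are invented):

```python
# Expected value of risking a $10,000 bank on one more question, versus
# walking away with it. Numbers are hypothetical except the $10,000 bank
# and $500 consolation mentioned above.
def ev_of_gambling(p_correct: float, bank: float = 10_000,
                   prize: float = 10_000, consolation: float = 500) -> float:
    """EV: win bank + prize with probability p_correct, else keep only the consolation."""
    return p_correct * (bank + prize) + (1 - p_correct) * consolation

# Walking away guarantees $10,000; the gamble only beats that if p is high:
for p in (0.4, 0.5, 0.6):
    print(p, ev_of_gambling(p))  # 8300.0, 10250.0, 12200.0

# Break-even: p = (10_000 - 500) / (20_000 - 500), about 0.487
```

Whether a risk-neutral expected-value comparison is even the right yardstick is, of course, exactly what this thread is arguing about.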

Not to mention the risk of making yourself look like a jerk in front of millions of people, and bringing shame and disgrace on your family name for generations to come.

You don’t get to come back tomorrow. You don’t even get a lousy copy of our home game. You’re a complete loser!

I hope I did not come across as saying that he was dumb or that I was glad he died: I just believe the risk/loss aversion statement is not always correct. I think it is wrong in my case.

That argument reminds me of the Freudian analysts who claim that if you don't agree with their assessment of your problem, it just proves that you have exactly that problem and are suppressing it. If Kahneman states "people are bad at assessing this" and I claim "but this is not true in my experience", then answering "see what I mean? You are denying the facts, which proves the facts are as I stated" is a circular argument.

I may be wrong, but I think the relevant studies were done in the USA, with US students. This is known as the WEIRD problem (Western, Educated, Industrialized, Rich, Democratic), and I think in Kahneman's case it is more US-specific than Western-specific.
Has your daughter come up with any interesting results for non-Western countries? In one of the links provided by other posters in this thread (I haven't found which one, there are so many, sorry) it was stated that Kahneman's assertion held up reasonably well for China, but not so much for Africa, FWIW. Checking the details, in particular the methodology, is beyond my capabilities.

Just in general: I did not expect or wish for this thread to become adversarial. I am surprised it even got this controversial (maybe I am being thin-skinned here). As of today (51 answers to the poll), just under 50% agree with Kahneman, 30% see a nuance (it depends), and 20% disagree. I think that validates the question: it was not asked as a gotcha or in bad faith.