Is there a psych term for "extreme decision justification"?

Often I notice that when people make a certain decision, such as purchasing a car or joining a political party or accepting a job, they then go to extremes to justify their decision. They tout their car as the best on the road, become mindlessly partisan, convince themselves their company is the greatest in the world to work for, etc.

All their arguments are created first to justify their choice; only secondly is data found to support it.

Is there a term for this mental process?

Cognitive dissonance?

It’s certainly similar to what you’re describing, although probably not an exact fit.

That’s darn close, Crusoe, maybe the closest thing formally described. Thanks!

Clearly, one is experiencing cog. diss. when they can’t digest the data that other cars are more reliable, or that their political leader lies like the rest of 'em, etc.

Not a psychology term, but I would call it post-hoc rationalisation.

I disagree with your OP. Most people will assemble a justification for whatever purchase and/or decision they made, but few if any people I know are trying to “justify” their life choices in terms of some manic pretense bordering on “cognitive dissonance”.

It’s human nature to become emotionally invested in things we choose to do, or in major decisions we make about the most appropriate course of action. I think the majority of the time people try to make the best decision they can given the resources or inputs at hand, and even if they feel obligated to defend or support that decision, it’s rarely at some manic level of delusion (except in the case of sports teams :stuck_out_tongue: ) unless they are ignorant of the alternatives, in which case it’s less delusion and more simple ignorance.

An example would be Rolex watch owners thinking or saying that Rolex is the best watch in the world. Rolex is certainly the best-marketed watch in the world, but in terms of movement quality and overall fit and finish there are lesser-known watches levels above Rolex. The Rolex owners claiming “best in the world” simply don’t know any better.

I didn’t mean to imply that everybody does it at the level of cognitive dissonance. I think a good number of people do, though; YMMV of course.

[nitpick]

Cognitive dissonance isn’t as pathological as astro would have it…sure, in some cases, but it’s also a fairly typical coping mechanism when dealing with decisions, specifically ones that involve choices with a lot of trade-offs. The mind needs to find some way to say that there are more advantages and fewer disadvantages to the choice made than to the one not taken.

[/nitpick]

So, yes, it is cognitive dissonance.

It sounds (from the link) like cognitive dissonance is pretty extreme – basically a state of denial. In general terms I’d say that post-hoc rationalization is what you’re describing.

Cognitive Dissonance must be why people are willing to pay $100 just for some food at an expensive restaurant.

I did not mean to imply that cognitive dissonance could not apply in extreme examples–but that’s just what the link in question provides, an extreme example, largely (I’m guessing) for the purposes of making the point as clear as possible. As a result, though, it ends up being misleading.

Cognitive dissonance as a phenomenon does not have to involve complete denial–a lot of research has been done based on minor forms, including one study (unfortunately I don’t remember the names of the researchers) in which subjects who were paid $1 for participating in a boring task were more likely than subjects compensated to the tune of $20 to say that they enjoyed participating and would do it again. We all do it all the time–it’s a simple matter of making the things we do seem rational. The more serious the dissonance, the more fundamental the self-deception necessary to get it to work out. It’s also important to keep in mind that this isn’t a conscious process.

This link provides a more balanced representation of cognitive dissonance theory (scroll down to Hypotheses 1, 2, and 3 for the information most relevant to this discussion).

Sorry for posting again, but I should also say that post-hoc rationalization sounds like a good characterization of the actual resolution of the dissonance. I don’t recall encountering it in any Psych textbooks, but I could be wrong.

A more complete description of the study I mentioned above is available here, if anybody’s interested.

And looking back at the OP, there’s another related phenomenon which might be helpful: confirmation bias, the tendency to look for (and see) only the evidence that supports one’s side of an argument.

The opposite of “sour grapes”: the others were sour, mine were sweet. Might be seen in narcissistic types. Denial could be a psychic mechanism, or devaluation (of the other choices). Of course, the psych term for sour-grapes behavior/thinking is rationalization.

I heard of a study similar to the one you cite where subjects were asked to write an essay which happened to support a view contrary to their current opinion. Half were paid, half were not. The premise was that if you actually pay somebody to express an opinion, they would more willingly adopt that opinion. They found the opposite was true. Due to the need to resolve cognitive dissonance, the subjects who were not paid had to rationalize the essay so they started to change their opinion. The ones who were paid could simply rationalize that they were doing it for the money.

Great line uttered by Jeff Goldblum in The Big Chill. He claimed that rationalizing was better than sex, and when challenged on it he said, “When was the last time you went a week without rationalizing?”

Rationalizing to resolve cognitive dissonance is not pathological at all; it’s a healthy mechanism to keep you sane.

As my SO, the psychotherapist, is fond of saying, “When you rationalize, you tell yourself rational lies.”

How about “polarized opinion” or “black and white thinking?”

I find that most people don’t think probabilistically, but instead habitually make all-or-nothing choices. It’s not that they rationalize afterwards; it’s that they simplify things right at the start by jumping to black or white and avoiding the gray. They aren’t well practiced in half-beliefs or three-quarters convictions.

Living in the “gray area” requires constant thinking. It’s much easier to think briefly, then throw yourself into a yes/no choice. The alternative would be to maintain tentative viewpoints and constantly change opinions as new information comes in. For this, one must REVEL in thinking, not try to avoid it!

Here’s an article about this:

THE NATURE OF KNOWLEDGE by R. A. Lyttleton
“Keep your bead on the wire”
http://amasci.com/freenrg/bead.txt