How do you determine "The Truth"?

So a big issue that keeps popping up is how Conservatives and Liberals have their own version of The Truth, largely driven by a combination of both automatically curated and self-selected media. Each side likes to think that they have it all figured out and that the other side are a bunch of idiots and indoctrinated sheep, but how do you know?

I don’t want to poison the well by bringing up a specific example, so I’ll just leave it open for now.

It comes down to evidence. What is the claim being made? What evidence is presented in support of this claim? Does the conclusion reasonably follow from the evidence provided?

And of course, it is important to continually review the evidence again and again. You can never find truth, only seek it and come closer and closer to it. But any belief, no matter how dearly held, must be challenged at all times. An unchallenged belief is worse than useless - it is destructive to the rational mind.

The truth is the underlying facts. Any attempt to categorize, label, extrapolate, or communicate them can only be done with varying degrees of inaccuracy. I am not so postmodern as to think that truth is relative or that no communication is possible: some words and theories are more accurate than others, to the degree that they precisely match the underlying facts.

With regard to the media, some sources are so inaccurate that it is not worthwhile to analyze them for possible truth value. That doesn’t mean that the other media outlets know The Truth. You have to go by their track record of presenting things that match the underlying facts, and of neither presenting wholly untrue things (i.e. “fake news”) nor containing intolerable amounts of spin (which is a subcategory of using language to misrepresent or mislabel things, whether or not you do it consciously).

There is no Truth, there are just things that are true.

Anything you know today could be contradicted by new information, so you evaluate your sources, keep multiple viewpoints in mind, and think logically. It’s not a bad idea to apply your own smell test, but that requires some experiential knowledge, and even there it only goes so far.

An open mind, trusted sources, and a broad range of experiences.

You need a willingness to admit you’re wrong. And while it’s easy for most of us to admit we were wrong about a minor fact, it is much more difficult to admit we were wrong about something that’s integral to our identity. Because of that difficulty, we tend not to be critical of information that supports our preconceived notions. So you need both a willingness to admit you’re wrong and the habit of examining information even when it agrees with you.

As noted above, it all comes down to facts and evidence. Unfortunately, bias and prejudice so thoroughly permeate many things that it’s very difficult for people on both the giving and receiving ends to tell truth consistently. (Note how, for instance, in some floods, black people who looted were called “looters” by the media while white people who looted were described as “having found” the things they were carrying. Technically, both descriptions were factually correct, but one was inflammatory while the other was charitable.)

And there’s the saying “You can’t get a man to accept something when his livelihood depends on him rejecting it” (or some variation of it.) Go ask a Chinese Communist official, in public, if there is persecution of the Uighurs in Xinjiang. Of course he’d deny it; he’d have to if he wants to keep his job.

So you unfortunately sometimes have to read multiple sources in order to glean the truth.

I agree.

It’s wrong to be confident that we know the truth if we are unwilling to even consider that we might be wrong.

In matters where there is significant controversy, if I believe, or think I know, that X is true, it’s often not enough to just have evidence or reasons why X is true. It’s also important to know the evidence against: the reasons why people believe X is false. And to consider falsifiability: if X were not, in fact, true, how would we know? What would look different?

This is a great point - being right for the wrong reasons can be as bad as being wrong.

In support of most of what was already said, I have a pithy little saying that I try to internalize and live by:

Just because you DO believe it doesn’t mean it IS true. Just because you DON’T believe it doesn’t mean it’s NOT true.

I also try to consume news from as wide a variety of sources as possible. It’s easy to recognize the ‘spin’ coming from the other side. It takes a lot of work to learn to recognize the spin that your side puts out, too.

As my sainted mother once said, “The older I get, the more I realize just how much we’re all being manipulated.”

Also …

According to a follow-up survey by Fairleigh Dickinson University’s PublicMind, NPR and Sunday morning political talk shows are the most informative news outlets, while exposure to partisan sources, such as Fox News and MSNBC, has a negative impact on people’s current events knowledge.

“A negative impact” means that people who watched the partisan sources actually had less correct current event information than those who didn’t watch the news at all.

Think about that.

We expect that watching the news should help people learn, but the most popular of the national media sources – Fox, CNN, MSNBC – seem to be the least informative.

It’s hard to tease out how much is cause and how much is effect, but … as I used to say quite often … it’s best to get your news the way you should be getting your nutrition: from as wide a variety of sources as possible and from as close to the source (i.e., with as little processing) as possible.

I don’t think differences arise primarily from competing assessments of “The Truth” but rather of “What’s Best.” “The Truth” as an expression is just a proxy for that. It’s how people take the same evidence and arrive at different conclusions. They may then end up, through motivated reasoning, rejecting or giving less credit to some evidence that another would accept at face value, and vice versa. The fundamental disparity is not in the framework used to arrive at “The Truth” per se, but rather in the foundation they set that framework upon. Someone who thinks “What’s Best” is to be found in a religious text, for instance, may well arrive at (and be further motivated to arrive at) different conclusions from the same evidence as someone who thinks “What’s Best” ought to be determined by what makes the most people materially or emotionally well-off.

“Truth” can be a problematic term, as it can refer to unprovable philosophical positions. “We hold these truths to be self-evident…” “We both have truths, are mine the same as yours?”

Well put, and spot on.

A lot of people have very strong views about how they believe things are or should be. Many of these people are also ill-informed, of moderate education, and relatively isolated. So if you give these people the excuse that they are being misled, they will happily use that to reconcile any cognitive dissonance.

Although, I suppose the question is where one gets the evidence to challenge one’s own views. If you’re presented with a well-crafted narrative backed by cherry-picked evidence, it can be difficult to distinguish between non-traditional but accurate sources and “fringe bullshit”.

I was going to say that you can sort of “smell the truth”, but fact is most people probably can’t. So for me it helps to base the truth on what people I know are smelling and what I know about how those people smell. i.e., If I see people who I knew to be jerks, morons, bullies, racists, and self-serving SOBs before I knew anything about their politics, chances are I’m going to question the things they hold “true”.

At first, I was going to suggest reading books (or taking courses) on Logic or Debating. But instead, I suggest browsing a fact-checking site to see how they confirm or debunk various urban legends.

But let’s get real here. The current state of play in the U.S. is not remotely that “both sides do it”. These two things are not the same. MSNBC is generally a source of reliable facts; the commentary is left-leaning. Fox News goes far beyond partisan commentary: it deliberately and persistently lies. It is an unethical propaganda machine. It’s exceptional and notable when Fox does report information honestly and accurately, as in the few cases, such as the Fox electoral analysis team, that are credible sources of facts.

Or to an organization. I just got involved in a training issue where a trainer wanted to use “the learning pyramid”. The trainer wanted me to prepare the graphic for use in a presentation, but I quickly found that it was a myth. Very busy now, but I will expand later on the research I did to check, and then to prevent the distribution of that myth in one of my workplaces.

If you read the NY Times and listen to NPR, you’re going to get pretty close to the TRUTH. Both of those organizations are very concerned about their reputations as high quality sources. The Wall Street Journal, not including the editorial page, is an excellent source of financial news. BBC News and the Economist as well. I think the Washington Post is pretty highly respected. TV news will be too shallow and too sensationalistic, so you can skip CNN, MSNBC, and FOX News.

If you stuck to the sources I mentioned, you should have an excellent, reality-based view of the world.

I’ve seen charts like this one that claim to show the reliability and bias of various news sources:

I think it would be cool if we had a rule that cites for news-related claims had to come from within the green Most Reliable box. If it’s real news, it will end up there, although it might take half a day or something. If it never ends up there, it’s probably bullshit.

This is probably not the answer the OP is looking for, but it goes much deeper than what source(s) you use to get your news. There is a whole branch of philosophy called epistemology that goes into the question of how you know what is true about the world, at any level. I wish it weren’t so esoteric a subject, because most people, I think, could benefit from an introductory course. Perhaps the first step, already mentioned, is to learn to doubt what you think you know. If you understand how easy it is just to misinterpret the things you see or the sounds you hear, you begin to understand how little you actually know about more difficult things, like what other people are thinking or feeling, what motivates them, what scares them. Those are the kinds of things we all make all kinds of assumptions about, based on our own very limited past experiences. That’s why human relationships are so difficult.

An introductory course in logic wouldn’t hurt either, but that’s for another day.