Truth: a practical guide to finding it

Whateverman

Well-known member
I thought I'd try to create an apolitical thread to establish how truth is found. Regardless of who you vote for, what you believe, which values you hold - everyone recognizes that both truth and falsehood exist; some beliefs are true, and some are not.

So, if we keep this in the abstract, it should be possible to describe the process for determining whether "X is true" is true or not. Of course, if this abstract process is defined correctly, "X" could be anything: a claim about a politician, the supposed existence of a dangerous virus, the motivation behind the publishing of an article in a blog, etc.

I expect very few responses to this thread, but for those who decide to take a stab, you don't need to go into as much detail as I have below. Also, don't bother listing which sources you pay more attention to; I'm curious about the steps you take to decide whether ANY claim is true or false.

To start this off, I saw a claim that the number of people currently at hospitals around the country is abnormally high:

  • The very first step I took is to assign this claim some amount of credibility. This is more instinctual than anything else: without doing any investigation, how do I feel about the truthiness of the claim?
  • The second step is to see how well the claim fits in with other things I know or have heard. This is a big step, in that there are a lot of things to compare the claim against:
    - Do I know medical professionals who confirm or cast doubt on the claim?
    - Have I heard similar claims from other peer sources (i.e. blogs if I got the information from a blog, YouTube videos if I got it from there, friends if I got it from them, etc.)?
    - Have I heard similar things from non-peer sources? For example, if I saw someone on TV saying the number of patients around the country is large, what do my friends say about this? Bloggers? Facebook users? The senators of my state?
  • Does this claim stand up to reasonable scrutiny? For example, is it possible for hospitals to experience an abnormally high volume of health issues? In this case, the answer is obviously yes. However, if the claim were about something miraculous (e.g. the sun falling from the sky), then this step might be enough to stop the assessment of the claim...
  • If we assume the claim is true/false, what would be the logical consequences of it being true/false? For example, if there are large numbers of people at the hospital, we might expect to see healthcare workers complaining about a lack of supplies. We also might see a lot of people talking about their friends/relatives being hospitalized. We shouldn't expect to see hospital staff on tv talking about there being plenty of beds available, etc.
  • How much do I actually care about knowing whether this is true or not? Related to this is the extent to which I'm willing to put actual effort into coming to a conclusion. I may not be willing to put much effort in at all, or I may feel the question is important and want to know more...
  • Finally, how much risk am I putting myself in by making an incorrect assessment about the truth/falsehood of the claim? There's a massive difference in risk between being told "liver and onions are tasty" and "there's a bus about to run over you". So how much risk is there in falsely believing the opposite of how many people there are at hospitals right now?
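The checklist above can be sketched as a toy scoring function (a hypothetical illustration only: every weight, threshold, and parameter name here is invented to show the shape of the process, not a real method):

```python
def assess_claim(gut_credibility, fits_prior_knowledge, survives_scrutiny,
                 consequences_observed, stakes):
    """Toy model of the checklist above (all weights invented).
    Returns (confidence in [0, 1], whether the stakes warrant more digging)."""
    if not survives_scrutiny:              # step 3: an impossible claim ends the assessment
        return 0.0, False
    score = gut_credibility                # step 1: instinctual starting credibility
    score += 0.2 if fits_prior_knowledge else -0.2   # step 2: fit with what I already know
    score += 0.2 if consequences_observed else -0.2  # step 4: predicted consequences seen?
    score = round(max(0.0, min(1.0, score)), 2)
    # steps 5-6: a high-stakes claim with an uncertain score means keep investigating
    dig_deeper = stakes > 0.5 and 0.3 < score < 0.7
    return score, dig_deeper

print(assess_claim(0.5, True, True, True, 0.9))  # (0.9, False)
```

The point of the sketch is only that the verdict (the score) and the decision about how much effort to spend (the stakes check) are separate questions, which is what steps 5 and 6 say in prose.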
 

Whateverman

Well-known member
I'm 100% confident that the political differences between people in the US today are based entirely on the processes they use to decide whether a claim is true or false.
 

Gus Bovona

Active member
I think one big problem is that priors (from Bayes’ Theorem; in other words, background knowledge, the things we already know are true) get mixed up with our biases.
 

vibise

Active member
I don't think you should rule sources of information out of the discussion. They are critical to making an informed decision.

When I read about something new in the news, I click on the links provided.

If the story is about a SCOTUS decision, the link should take me to that decision.
If the story is about a scientific finding, it should take me to the published paper, or to statements made by the investigator.
If the story is about a crime, the link should go to the police report, eyewitnesses, available evidence, etc.
If the story is about an election result, the link should go to the relevant authority.

If, OTOH, the provided links take me to youtube videos, opinion pieces by nonexperts, press secretaries of the losing candidate, etc., I know to ignore that news.
 

Whateverman

Well-known member
I think one big problem is that priors (from Bayes’ Theorem; in other words, background knowledge, the things we already know are true) get mixed up with our biases.
I haven't read Bayes at all, though I'm familiar with the term. Without knowing the details, I'd say it's literally impossible to not rely upon things we already know when deciding whether to accept a new piece of information. We have to view the new through the biases of our past.

To your point, though, the question of how heavy that reliance is certainly is a valid one. I'd put a finer point on it: the real problem lies in the extent to which your priors filter out information.
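That filtering effect can be shown numerically (a hypothetical sketch; the probabilities are made up for illustration): with an extreme prior, even strong evidence barely moves the conclusion.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: updated belief in hypothesis H after seeing evidence E."""
    # Total probability of the evidence, P(E)
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# The same strong evidence: 8x more likely if the claim is true than if false
open_minded = posterior(0.50, 0.8, 0.1)  # moderate prior -> belief jumps
entrenched  = posterior(0.01, 0.8, 0.1)  # near-zero prior filters it out
print(round(open_minded, 2), round(entrenched, 2))  # 0.89 0.07
```

Both observers saw identical evidence; the entrenched prior absorbs almost all of it, which is the "filtering" problem in numbers.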
 

Whateverman

Well-known member
I don't think you should rule sources of information out of the discussion. They are critical to making an informed decision.

When I read about something new in the news, I click on the links provided.

If the story is about a SCOTUS decision, the link should take me to that decision.
If the story is about a scientific finding, it should take me to the published paper, or to statements made by the investigator.
If the story is about a crime, the link should go to the police report, eyewitnesses, available evidence, etc.
If the story is about an election result, the link should go to the relevant authority.

If, OTOH, the provided links take me to youtube videos, opinion pieces by nonexperts, press secretaries of the losing candidate, etc., I know to ignore that news.
Yes, I'm hyper-aware of being misled, and an article forcing me to click on more than one thing just to answer a question in my head is an immediate red flag. Journalists exist to tell us what we want/need to know, and if a writer starts avoiding doing that, I know to ignore whatever it is they've written.
 

Whateverman

Well-known member
Like I said in the OP, I didn't have high hopes for the thread; the subject is boring and probably involves more "hashing out" than most people in this forum are willing to engage in.

Part of my motivation for posting this comes from the following suspicion: the people I disagree most strongly with (in terms of politics) are unwilling to have their truth-determination process examined. They don't want to admit to other people the criteria they use to decide whether something is true or false.
 

Gus Bovona

Active member
I haven't read Bayes at all, though I'm familiar with the term. Without knowing the details, I'd say it's literally impossible to not rely upon things we already know when deciding whether to accept a new piece of information. We have to view the new through the biases of our past.

To your point, though, the question of how heavy that reliance is - certainly is a valid one. I think I'd put a finer point on it, in that the extent to which your priors filter out information is where the real problem lies.
OMG, you totally have to check out Bayes’. Do a search for Richard Carrier and Bayes. If you come up empty-handed, lemme know and I’ll do some research when I have time.

Bayes is consistent with everything you say above.
 

vibise

Active member
Yes, I'm hyper-aware of being misled, and an article forcing me to click on more than one thing just to answer a question in my head is an immediate red flag. Journalists exist to tell us what we want/need to know, and if a writer starts avoiding doing that, I know to ignore whatever it is they've written.
A writer can give you a two-sentence summary of a SCOTUS decision, but he should also provide a link so you can read it yourself.

After all, AG Barr provided a summary of the Mueller report that was completely misleading, and the actual report was not available for weeks. So the misinformation spread and took hold.
 

Backup

Active member
The effort is appreciated, but on this forum there is a large percentage of people who think lunatic conspiracy blogs are gospel and peer-reviewed academic journals are “fake news.”

They are idiots, and information gathering is relatively simple if you are really interested in the truth.
 

Gus Bovona

Active member
I haven't read Bayes at all, though I'm familiar with the term. Without knowing the details, I'd say it's literally impossible to not rely upon things we already know when deciding whether to accept a new piece of information. We have to view the new through the biases of our past.

To your point, though, the question of how heavy that reliance is - certainly is a valid one. I think I'd put a finer point on it, in that the extent to which your priors filter out information is where the real problem lies.
Here’s more to whet your appetite for Bayes’ theorem:

You are already using it. Every time you reach any conclusion about how probable something is. You are unconsciously assuming a prior probability. You are unconsciously assuming a likelihood of the evidence. You are unconsciously feeling whether the one is enough to outweigh the other. And then that causes you to feel confident that something is true or false. Or more likely the one than the other. Or else you feel you aren’t confident either way. And that feeling? A product of Bayes’ Theorem. Already running in your head. Only, like all intuition, by not examining it, you often fuck it up.
from a good intro to BT
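The unconscious process that quote describes can be made explicit in a few lines of Python, applying Bayes' Theorem, P(H|E) = P(E|H)·P(H) / P(E), to one piece of evidence at a time (the claim and every number below are invented for illustration):

```python
# Claim H: "hospitals around the country are abnormally full."

def update(prior, p_e_given_h, p_e_given_not_h):
    """One Bayesian update: revise belief in H after evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)  # P(E)
    return p_e_given_h * prior / p_e                           # P(H|E)

belief = 0.30  # starting hunch: the "gut credibility" step
evidence = [
    (0.8, 0.2),  # a nurse friend confirms it: likely if true, unlikely if false
    (0.7, 0.4),  # similar reports from non-peer sources
    (0.9, 0.3),  # no one is reporting plenty of empty beds
]
for p_if_true, p_if_false in evidence:
    belief = update(belief, p_if_true, p_if_false)

print(round(belief, 2))  # 0.9
```

Each pair is the quote's "likelihood of the evidence": how expected that observation is if the claim is true versus false. Writing the numbers down is exactly the "examining it" step that keeps intuition from, as Carrier puts it, fucking it up.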
 

Whateverman

Well-known member
Here’s more to whet your appetite for Bayes’ theorem:

You are already using it. Every time you reach any conclusion about how probable something is. You are unconsciously assuming a prior probability. You are unconsciously assuming a likelihood of the evidence. You are unconsciously feeling whether the one is enough to outweigh the other. And then that causes you to feel confident that something is true or false. Or more likely the one than the other. Or else you feel you aren’t confident either way. And that feeling? A product of Bayes’ Theorem. Already running in your head. Only, like all intuition, by not examining it, you often fuck it up.

from a good intro to BT
You were right: I was expressing something pretty close to the idea in that quote. I can't remember exactly where/when I heard Bayes' Theorem being mentioned, though I suspect it was here in this forum (possibly in a discussion involving Metacrock)
 

Authentic Nouveau

Well-known member
Joe Biden says 200 million people have died from COVID-19.

“lying dog-faced pony soldier” Joe

A few weeks later

“Masks matter. These masks, they matter. It matters. It saves lives. It prevents the spread of the disease … two hundred and ten million — 210,000 people have died. You have, you know, about 1,000 people a day getting the coronavirus. Fifty thousand, I mean — so it’s a great concern.”

Lying is in his blood.
All pro-deathers lie
 