2024-04-14 80,000 Hours Podcast - #180 – Hugo Mercier on Why Gullibility and Misinformation Are Overrated

@tags:: #lit✍/🎧podcast/highlights
@links::
@ref:: #180 – Hugo Mercier on Why Gullibility and Misinformation Are Overrated
@author:: 80,000 Hours Podcast

=this.file.name

Book cover of "#180 – Hugo Mercier on Why Gullibility and Misinformation Are Overrated"

Reference

Notes

Quote

(highlight:: People Are Generally Amenable to Having Their Minds Changed (Anti-Vaxxers Are the Exception)
Transcript:
Speaker 2
Do you think to some extent people are maybe cherry-picking cases where folks are resistant to arguments, and ignoring the familiar everyday cases where arguments persuade us sensibly all the time?
Speaker 1
Yes, yes. I think there are at least two documented cases of this backfire effect. There is another one with vaccine hesitancy, I think, for one specific vaccine and one specific segment of the population. But yes, there are now dozens, if not hundreds, of experiments showing that in the overwhelming majority, the quasi-entirety, of cases, when you give people a good argument for something, something that is based in fact or that comes from some authority they trust, then they are going to change their mind. Maybe not enough, not as much as we'd like, but the change will be in the direction that you would expect. And in a way, that's the sensible thing to do. And you're right that both laypeople and professional psychologists have been, and still are, very much attracted to demonstrations that human adults are irrational and a bit silly, because it's more interesting. If you show, well, look, people can speak, we have language, which is maybe the most amazing thing in the biological world, it's like, well, sure, obviously we have language. But if you say, oh, sometimes, maybe every 50,000 words, there's a word that you can't remember, you have a tip-of-the-tongue moment, then it's: oh my God, this is amazing, how are our brains working so poorly? So we are attracted by mistakes, by errors, by silly behavior. But that doesn't mean this is representative at all.)
- Time 0:21:56
-

Quote

(highlight:: The Influence of Existing Beliefs on Incorporating New Information
Transcript:
Speaker 2
So our defensive posture is that if something conflicts with our existing beliefs, so it doesn't pass that initial plausibility check of just being consistent with what we think, then it has to have something else going for it that allows it to pass through and be incorporated into our ideas. It could be that it comes from an authority that we trust, and then we might take it very seriously, or it could be an argument that we feel ourselves qualified to check, to see whether the reasoning holds up. But if it's just an assertion from someone we don't trust, and we don't feel qualified to pass judgment on the soundness of the argument ourselves, then the default is just to not incorporate it into our beliefs. Is that right?
Speaker 1
Yes, exactly. And that's what we see. If you consider, and we'll talk about this later as well, I guess, but if you consider a mass-persuasion attempt, if you consider advertising, propaganda, religious proselytizing, all of these things, you're typically in one of those situations: when you see an ad on the subway or on TV, you know who is sending you the message, but you don't have any information about their real competence. You even tend to suspect that there's a conflict of interest. And they don't really have time to give you any arguments that might change your mind, not at any length. So in most of these mass-persuasion situations, people are mostly going to react on the basis of whether the message they're hearing jibes with what they were already believing or not.)
- Time 0:28:34
-
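The "plausibility check plus source trust" picture above can be read as a simple Bayesian filter: how far an assertion moves you depends both on your prior and on how much more likely the source is to assert the claim when it is true than when it is false. This is my own illustrative formalization, not something from the episode, and the numbers are made up:

```python
def posterior(prior, likelihood_ratio):
    """Bayes' rule in odds form.

    likelihood_ratio is how much more likely the source is to assert
    the claim when it is true than when it is false; it bundles
    together the source's competence and our trust in their incentives.
    (Illustrative model; the parameterization is an assumption.)
    """
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# An implausible claim (prior 0.01) from an ad-like source that would
# assert it almost regardless of truth (ratio 1.2) barely moves us:
low = posterior(0.01, 1.2)    # ~0.012
# The same claim from a source far more likely to assert only truths:
high = posterior(0.01, 20.0)  # ~0.17
```

On this reading, ignoring an untrusted assertion isn't stubbornness: with a likelihood ratio near 1, the rational posterior is almost identical to the prior.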

Quote

(highlight:: The Myth of Brainwashing and Subliminal Influence
Transcript:
Speaker 2
But you say that research suggests actually the opposite: that when people are distracted, not paying that much attention, or they're tired, or they don't feel in a position to judge ideas, what happens is they just stop changing their mind at all, which is of course a very sensible thing to do. Because those are the points at which, if you tried to evaluate the arguments, you would be most likely to make a mistake. And so you simply close your ears, more or less, or you simply become unwilling to shift your opinions.
Speaker 1
Yes, completely. And in a way that's the idea that has led both to the myth of brainwashing and the myth of subliminal influence. In mid-century America you have the idea that if you go to the movie theater and in the middle of the movie they show, very, very quickly, words like Coca-Cola, then it will make you drink more Coca-Cola. The idea is that precisely because your brain can process the information below any conscious level, you're unaware of the influence attempt and you completely fall prey to it. And there's no data showing that at all. The original claims were just completely made up by someone who wanted to sell books, and there is no evidence that any of this works at all. The other thing, which has had much more dramatic consequences, is the idea of brainwashing: the idea that you could take prisoners of war and submit them to really, really harsh treatment, give them no food, stop them from sleeping, beat them up, so you make them, as you were describing, extremely tired and very, very foggy, and then you get them to read propaganda for hours on end. Are they going to become communists? Well, we know the answer, because unfortunately the North Koreans and the Chinese tried this during the Korean War, and it just doesn't work at all. They managed to kill a lot of POWs, and they managed to get, I think, two of them to go to China and claim that they had converted to communism. But after the fact, it was revealed that these people had just converted because they wanted to stop being beaten and starved to death, and as soon as they could go back to the US, they did so.)
- Time 0:30:22
-

Quote

(highlight:: Intuitive vs. Reflective Beliefs
Transcript:
Speaker 2
So can you explain the distinction between intuitive and reflective beliefs?
Speaker 1
Yeah, intuitive beliefs are beliefs that are formed usually through perception. If I see there's a desk in front of me, I have an intuitive belief that there's a desk in front of me: I'm not going to try walking through it, and I know I can put my laptop on it. We also form beliefs through testimony: if my wife tells me she's at home tonight, then I'm going to intuitively believe she's at home tonight. It will guide me, I will base my behavior on it, and I will act as if I had perceived that she was at home tonight. That's the vast majority of our beliefs, and things work really well; these beliefs tend to be consequential and to have behavioral impact. By contrast, reflective beliefs are beliefs that we can hold just as strongly as intuitive beliefs, so it's not just a matter of confidence, but they tend to be largely divorced from our behavior. You can believe something, but either because you don't really know how to act on the basis of that belief, or for some other reason, it doesn't really translate into the kind of behavior we'd expect if you held the same belief intuitively. An example that is really striking is conspiracy theories. If you take someone who intuitively believes in a conspiracy, for instance someone who is working in a company or in a government, and they've seen that their boss was shredding documents or doing something really fishy, and they have good evidence that something really bad is going on, their reaction is going to be to shut up. They're going to be afraid for their jobs, maybe for their lives. There will be a really strong emotional component, and they will really not want to say anything, or if they say anything, they won't want to shout it from the rooftops: they'll contact a journalist anonymously, or something like this.
If you contrast that with the behavior of conspiracy theorists who don't have actual perceptual or first-hand evidence of a conspiracy going on, these people tend not to be afraid. They can say, oh, I believe the CIA orchestrated 9/11, and they are this all-powerful evil institution, and yet they're not going to kill me if I say this. At worst they're going to say things, but their emotional and behavioral reactions are really stunted, or really different from what you would expect from someone who had a similar intuitive belief.)
- Time 0:32:53
-

Quote

(highlight:: Trust = Competence + Aligned Incentives
Transcript:
Speaker 1
So there are two main dimensions of trust, really. One has to do with competence: essentially, how likely is it that what you're telling me is true? And that depends on how well-informed you are, how much of an expert you are, whether you're someone who is very knowledgeable in a given area. For this, we keep track of things like informational access. If we have a friend in common, and I know that you've seen her recently, and you tell me something about her, I will tend to believe you, because presumably you're better informed, since you've seen her more recently. More generally, we are pretty good at figuring out who is an expert in a given area, sometimes on the basis of relatively subtle cues. If you have a friend who manages to fix your computer, you're going to think, well, they're a good computer person, and maybe you'll turn to them the next time you have a computer problem. So that's the competence dimension: does that person know the truth? Do they themselves have accurate beliefs? The other dimension, which is maybe what we really call trust in everyday life, is: are they going to tell us that? Because even if I believe you're the most expert person in the world in a given area, if I don't trust you, if I don't believe that you will share with me the accurate beliefs that you hold, then it's no use to me. And that dimension of trust per se depends broadly on two things. One is short-term incentives. Even if you're my brother, or a very good friend, if we play poker together, I'm not going to believe you. Because I know that if you tell me to fold, you have no incentive to be truthful: in the context of poker, we have purely opposite incentives.
So there's this kind of short term: what can you get from me with that specific message? And then there are long-term incentives: are you someone whose interests are kind of intermeshed with mine? Someone who would benefit from me doing well? And is that going to stay true moving forward? If you're a family member, if you're a good friend, I know that you have no incentive, or only very small incentives, to mislead me, because that would jeopardize our relationship, and the cost to you, as well as to me, would be quite high.)
- Time 0:41:20
-
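The two dimensions can be combined in a toy model: for a binary claim, whether the message is true depends on whether the speaker's own belief is accurate (competence) and whether they report it honestly (incentives). This is my own hypothetical formalization, not something stated in the episode:

```python
def chance_message_is_true(competence, honesty):
    """Toy decomposition of the two trust dimensions, for a binary claim.

    competence: P(the speaker's own belief is accurate)
    honesty:    P(the speaker reports what they actually believe),
                driven by short- and long-term incentive alignment.

    A message is true if the speaker is right and honest, or wrong and
    lying (a lying speaker asserts the opposite of their belief, so a
    mistaken liar is accidentally right).  Simplified, assumed model.
    """
    return competence * honesty + (1 - competence) * (1 - honesty)

# Poker with a skilled brother: high competence, zero honesty.
# His advice is reliably wrong, so discounting it is exactly right.
poker = chance_message_is_true(0.9, 0.0)     # ~0.1
# A knowledgeable friend with intermeshed long-term interests:
friendly = chance_message_is_true(0.9, 0.95)  # ~0.86
```

The poker case shows why the honesty dimension dominates: discounting a highly competent but misaligned source is not gullibility or stubbornness, it follows directly from the incentive term.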

Quote

(highlight:: People Believe Things or Join Groups to Satisfy Needs Unrelated to the Group
Transcript:
Speaker 1
Usually, as we were mentioning earlier for religious conversions, people join a new religious group because they have practical reasons: they get along well with the people, they get stuff in the short term that they enjoy. So if you want to convince them that the doctrine is ridiculous, it's not going to do much. What you have to do is provide them with an environment that gives them what the other environment was able to provide, in terms of status, in terms of brotherhood, that sort of thing. But to come back to conspiracy theories, and maybe flat earthers in particular: when you have a really good idea and you think you're the first person in the world to have that idea, even if it's not something massive, it feels really awesome. You figured out something that no one else has figured out. As you mentioned, if you have that belief about the earth being flat, it's like all the scientists, everybody in the world, is getting this thing completely wrong, and I know this. I have this truth that is better than what everybody else is thinking. It's maybe quite a high. If you're able to convince yourself of that, I can see how it would be quite pleasant in a way.)
- Time 0:55:28
-

Quote

(highlight:: Vaccination Receives Skepticism Because It Is Unintuitive, Not Because It "Causes Autism"
Transcript:
Speaker 1
So obviously, given everything we know about vaccination, you should do it for all the vaccines that are recommended by the health system. But I can see how it's not the most intuitive therapy. It's not like having a broken arm put in a cast, where you'd say, right, sure, yeah, let's do it. Instead someone says: your kid is perfectly fine, but we should take this thing from that sick person, transform it, and then put it in your kid. It doesn't sound great. So there's an intuition, I think, that many people share, that vaccination isn't the best therapy. And we know that this is the prime driver, and not the stories about vaccination causing autism, for instance, because while in every culture there are people who are going to doubt vaccination, the reasons they offer to justify that doubt vary tremendously from one culture to the next. In the West, it has recently been a lot about vaccines, the MMR vaccine in particular, causing autism. It used to be that the smallpox inoculation would turn you into a cow. There are many cultures in which it's going to make you sterile, it's going to make you age, it's going to do all sorts of bad things to you. So the justifications vary a lot, because those are the ones you get from your environment, but the underlying motivation to dislike vaccines is pretty much universal. I don't mean universal in the sense that everybody shares it, but in every population you'll have people who are very keen on being anti-vax.
Speaker 3
Yeah, so it does make a lot of sense.)
- Time 1:08:16
-

Quote

(highlight:: The Stigma Against Nuclear Power is Not Irrational, But It is Immoral To Some Extent
Transcript:
Speaker 1
We don't really understand very well why people seem to have such negative preconceptions about nuclear power, nearly everywhere in the world. But whatever the cause, these misperceptions have had dramatic policy consequences. It is indeed one of the domains in which you can make a plausible case that it is public opinion, to some extent, that led countries like Germany or Belgium to dismantle their nuclear fleet. And studies have shown that thousands of people have died because of this, because of the coal plants that had to be used instead of nuclear power. So it is a case in which it's not irrational for people to get this wrong, but you could make the case that it is bordering on being a bit immoral, to the extent that they are inflicting costs on others.)
- Time 1:17:27
-

Quote

(highlight:: The Disgust Response to Nuclear Energy
Transcript:
Speaker 1
We have some evidence, though it's not completely clear, that one of the things going on is that nuclear power is tapping into people's disgust mechanisms. We have this psychology that evolved to help us avoid things that are going to make us sick, so we have an intuition that we shouldn't touch, let alone eat, feces, urine, anything that comes out of people's bodies, or rotten flesh that smells really horrendous. The way these mechanisms work is they tell us: these things contain small things that you can't really perceive, but that are going to make people sick. And the amount of the thing doesn't matter, which is mostly true: you can get sick from a very small amount of viruses or bacteria, obviously. And it's contagious; it can be transmitted from one person to the next. I think it is that template that people apply to nuclear energy. They think: radiation is this invisible thing, like germs and viruses are invisible to the naked eye. It's this invisible thing that makes people sick, even if they're exposed to a very, very small amount of it. And then the people who have gotten sick can make others sick. You see that, for instance, in the way people who had been hurt by radiation, mostly after the Hiroshima and Nagasaki bombings, were treated: it was sometimes hard to find people willing to treat them, because they were perceived as being contagious themselves, which was not the case by and large. Some of their clothes might have carried some radiation, but they themselves were not contagious. So we have this false image of nuclear energy, and I think that's what gives us this bad feeling about it. And it's funny.
I think particulate matter would work in a similar way, in that people have the intuition that smoking is going to make you sick. When data started coming out that smoking causes lung cancer, I think people intuitively thought: oh yeah, I can see that happening. And if there were more media discussion of the effects of particulate matter, I think people would get it, even though you can't really see the particles, in the same way you can't really see the particles in smoke. But the contagion aspect still wouldn't carry over: if you're exposed to it, you might get sick, but you don't have this very insidious feeling you have with things that are contagious.
Speaker 2
Yeah, I would have thought that one of the differences between particulate pollution and radiation is that the former has a very simple physical mechanism that I think I intuitively understand: you burn coal and it produces smoke, or you burn wood and it produces smoke, and then I breathe it in, and that sounds bad, but also comprehensible. Whereas with nuclear power, you're like: where is the radiation? I can't see it. How much is there? How much is bad? Are there different types? I'm very educated and I still find it very confusing. So you can imagine that for someone who just doesn't understand it, radiation is super weird. And I think that confusion means you just have to take the belief that it's safe on trust, because you don't understand it yourself, you're not a graduate physicist. And then if you don't trust people, if you don't trust engineers to that degree, well, you're just screwed. You're kind of always going to be suspicious of it.)
- Time 1:19:27
-

Quote

(highlight:: The Role of Skeptics in Stopping Harmful Information Cascades
Transcript:
Speaker 1
In an information cascade, the way it's usually described is: people are influenced by the people before them, and it looks to you as if each new person has made up their mind independently of the others, when in fact they themselves had been influenced by the people before them. So it looks as if you have a lot of confirmatory evidence, when in fact it just so happens that at the beginning of the chain you had a few people who thought so, and then everybody else was overwhelmingly influenced by them. And that's supposed to get increasingly worse, because if you have five, and then 10, and then 50 people who agree, then obviously the apparent weight of evidence that they're right grows increasingly large. For the record, that doesn't happen when you try to run the experiments. I mean, these are nice models of how that should happen, in a way, if people are rational to some extent. But in fact, in every group you have enough people who are pig-headed and just say: well, no, I'm going to ignore what everybody else is saying, I'm just going to go my own way. And these people completely break the cascade. So they can be annoying, but at least they play that kind of useful role sometimes.)
- Time 1:26:11
-
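The dynamic described here, herding on predecessors versus pig-headed signal-followers, can be sketched in a toy simulation. This is an illustrative model in the spirit of classic cascade models, not anything from the episode; all parameters and thresholds are assumptions:

```python
import random

def run_cascade(n_agents=200, p_correct=0.6, contrarian_rate=0.0, seed=0):
    """Toy sequential-choice model of an information cascade.

    Each agent receives a private signal about the true state that is
    correct with probability p_correct, and sees every earlier public
    choice.  Ordinary agents follow a clear majority of predecessors,
    discarding their own signal; pig-headed contrarians always follow
    their private signal.  Returns the fraction of correct choices.
    """
    rng = random.Random(seed)
    true_state = 1
    choices = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < p_correct else 1 - true_state
        ones = sum(choices)
        zeros = len(choices) - ones
        if rng.random() < contrarian_rate or abs(ones - zeros) < 2:
            choice = signal                      # use private information
        else:
            choice = 1 if ones > zeros else 0    # join the herd
        choices.append(choice)
    return sum(c == true_state for c in choices) / n_agents

# Worst case over many runs: with no contrarians, a couple of wrong
# early signals can lock the whole chain into the wrong answer; a
# minority of contrarians keeps feeding private signals into the pool.
worst_herd = min(run_cascade(seed=s) for s in range(50))
worst_mixed = min(run_cascade(contrarian_rate=0.2, seed=s) for s in range(50))
```

Because contrarians' choices also enter the public tally, they can pull a lopsided count back toward even, at which point ordinary agents start using their own signals again, which is the "breaking the cascade" effect described above.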

Quote

(highlight:: The Impact of Disinformation is Constrained By How Much and Where People Direct Attention
Transcript:
Speaker 1
Well, first of all, there's already an essentially infinite amount of information on the internet. So the bottleneck is not how many articles there are on any given topic, because there are already way more than anybody will ever read. The bottleneck is people's individual attention. And that bottleneck is largely controlled, to some extent by what colleagues share on social networks, but otherwise mostly by the big actors in the field: by cable news, by big newspapers. And there's no reason to believe these things are going to change dramatically. So if you add another thousand articles on a given issue, just no one is going to read them.)
- Time 1:58:10
-

Quote

(highlight:: Fake News Is Likely to Entrench Opinion More Than It Is Likely to Change It
Transcript:
Speaker 2
Let's accept for the sake of argument that it does become easier to produce misleading content in future. I think some people envisage that the outcome would be people's opinions being changed in all sorts of random directions all the time. Whereas I think your model, and now my model, since I've read the book, is that it's not that people would start changing their minds more, it's that people would start changing their minds less. Just as when people make really complex arguments in some field that I don't understand, and I don't trust the person, I don't believe that they are really an authority, and I can't check the argument they're making for myself because I don't understand it well enough, I simply don't change my mind. Likewise, in future, if people start noticing that it's possible to trick them into believing stuff all the time, because they're incapable of noticing that a video is doctored, for example, then they'll just stop changing their minds at all in response to these inputs, because they always have the option of keeping their current views. So, yeah, do you agree that that would be where things would, in a bad case, potentially bottom out?
Speaker 1
Yes. I mean, it's increasingly easy to say, well, that video has been made up, and so on. But I'm not sure how much the technological impossibility of doing something ever was such a strong argument. If 40 years ago someone had pointed to a picture that appeared in the New York Times and told you, I think it's a fake picture, would your argument really have been, well, it's impossible to doctor a photo? Or would your argument have been, well, it's in the New York Times, and also in the Washington Post, and also everywhere else? I think the argument always rests ultimately on reputation, and not on the technical possibility of doing such and such trick. So there will always be people who want to say, well, I don't believe in that, and they'll have more excuses, but I don't think it's going to make a big difference.
Speaker 2
So here I'm picturing a scenario where we end up in a worse case than that, where the New York Times doesn't exist, or the New York Times is no longer credible and so you don't trust it. There's no particular authority that you trust to determine the provenance of an image or a video, and whether it's real or not. In that case, I think what happens is you just stop paying attention and you stop changing your mind. So hopefully we can solve it by having trustworthy sources and institutions that people believe have done the legwork to figure out if things are true. But if we don't, it won't be mass persuasion. It'll be mass indifference and mass stubbornness, I think.)
- Time 2:24:59
-


dg-publish: true
created: 2024-07-01
modified: 2024-07-01
title: #180 – Hugo Mercier on Why Gullibility and Misinformation Are Overrated
source: snipd

@tags:: #lit✍/🎧podcast/highlights
@links::
@ref:: #180 – Hugo Mercier on Why Gullibility and Misinformation Are Overrated
@author:: 80,000 Hours Podcast

=this.file.name

Book cover of "#180 – Hugo Mercier on Why Gullibility and Misinformation Are Overrated"

Reference

Notes

Quote

People Are Generally Amenable to Having Their Minds Change (Anti-Vaxers are the Exception
Transcript:
Speaker 2
Do you think to some extent people are maybe cherry picking cases where folks are resistant to arguments and they ignore just the familiar everyday cases when arguments persuade us Sensibly all the time?
Speaker 1
Yes, yes. I think there are at least two documented cases of this Blackfire effect. There is another one with vaccine hesitancy, I think, for one specific vaccine and one specific range of the population. But, but yes, I mean, there are kind of now dozens, if not hundreds of experiments showing that in the overwhelming or like the quasi, you know, entirety of the cases, when you give people A good argument for something, something that is based in fact that they would, you know, some authority that they trust, then they are going to change their mind, maybe not enough, Not as much as we'd like to, but the change will be in the direction that you would expect. Yeah. So that's really, and that's in a way that's the sensible thing to do. And you're right that both going to lay people and professional psychologists are and have been and still are very much attracted to demonstrations that human adults are irrational And a bit silly because it's more interesting. Like if you show, well, look, people can speak, you know, it's like the most amazing thing maybe in the, you know, in the biological world, we have language. It's like, well, sure, obviously we have language. I mean, but if you say, oh, sometimes, you know, when times every maybe 50,000 words, there's a word that you can't remember, you have a tip of the tongue. Oh my God, this is amazing. You know, how are we brines, you know, working so poorly. So we are attracted by mistakes, by errors, by, by silly behavior. But that doesn't mean this is representative at all.)
- Time 0:21:56
-

Quote

(highlight:: The Influence of Existing Beliefs on Incorporating New Information
Transcript:
Speaker 2
So our defensive posture is that if something conflicts with our existing beliefs, so it kind of doesn't pass that initial plausibility check of just being consistent with what we Think, then it has to have something else going for it that allows it to pass through and be incorporated into our ideas. So it could be something like it comes from an authority that we have trust in, and then we might take it very seriously, or it could be an argument that we feel ourselves qualified to check And to see whether the reasoning holds up. But if it's just an assertion from something that we don't trust and that we don't feel qualified to pass judgment on the soundness of the argument ourselves, then the default thing Is just to not try to incorporate it into our beliefs. Is that right?
Speaker 1
Yes, no, exactly. Yeah. And that's what we see. So if you consider, and we'll talk about that later as well, I guess, but if you consider a mass-pass version attempt, if you consider advertising, propaganda, religious, proselytizing, All of these things, you're typically in one of these situations in that when you see an ad on the subway or something or on TV, I mean, you know who is sending you the message, but you don't Have any information by their real competence. You even tend to suspect that there's a conflict of interest. They don't really have time to give you any arguments that might change your mind. I mean, not at any length. So in most of these mass-pass version situations, you're in a situation in which people are mostly going to react on the basis of whether the message that they're hearing drives with What they were already believing or not.)
- Time 0:28:34
-

Quote

(highlight:: The Myth of Brainwashing and Subliminal Influence
Transcript:
Speaker 2
But you say that research suggests actually the opposite, that when people are distracted, not paying that much attention or they're tired or they don't feel in a position to judge Ideas, what happens is they just stop changing their mind at all, which is of course a very sensible thing to do. Because those are the points at which you would be, if you tried to evaluate the arguments, you will be most likely to make a mistake. And so you simply close your ears more or less, or you simply become unwilling to shift your opinions.
Speaker 1
Yes, no, completely. And that's in a way that's this idea that has led both to the myth of brainwashing and the myth of subliminal influence. So both of the mid-century America you have the idea that if you if you know you go to the to the movie theater and in the middle of the movie, they're going to show very, very quickly words Like, you know, Coca-Cola or something, then it will make you drink more Coca-Cola. Because and the idea is that precisely because your brain can process the information on any conscious level, then you're unaware of the influence attempt and you fall, you're like Completely falling prey for it. And so there's no data showing that at all. So the original claims were just completely made up by someone who wanted to sell books. And there is no evidence that any of this works at all. And the other thing, which is which is which has had much more dramatic consequences is the idea of brainwashing, the idea that you could take prisoners of war and you submit them to really, Really harsh treatment, you give them no food, you stop them from sleeping, you're beating them up. So you make them as you are describing, you make them extremely tired and very, very foggy. And then you get them to read hours and hours and end. Are they going to become communists? Well, we know the answer because unfortunately, the Koreans and the Chinese have tried during the war and during the Korean war, and it just doesn't work at all. They've managed to kill a lot of POWs and they managed to get, I think, two of them to go back to China and to claim that they had converted to communism. But in fact, after the fact, it was revealed that these people were just had just converted because they wanted to stop being beaten and starved to death. And that as soon as they could revert back to go back to the US, they did so.)
- Time 0:30:22
-

Quote

(highlight:: Intuitive v.s. Reflective Beliefs
Transcript:
Speaker 2
So can you explain the distinction between intuitive and reflective beliefs?
Speaker 1
Yeah, intuitive beliefs are beliefs that are formed usually through perception. If I see that there's a desk in front of me, I have an intuitive belief that there's a desk in front of me: I'm not going to try walking through it, and I know I can put my laptop on it. Beliefs can also be formed through testimony: if my wife tells me she's at home tonight, then I'm going to intuitively believe she's at home tonight. I will base my behavior on that, and I will act as if I had perceived that she was at home tonight, for instance. That's the vast majority of our beliefs, and things work really well. These beliefs tend to be consequential and to have behavioral impact. By contrast, reflective beliefs are beliefs that we can hold just as strongly as intuitive beliefs, so it's not just a matter of confidence, but they tend to be largely divorced from our behavior. You can believe something, but either because you don't really know how to act on the basis of that belief, or for some other reason, it doesn't really translate into the kind of behavior we'd expect if you held the same belief intuitively. An example that is really striking is conspiracy theories. Take someone who intuitively believes in a conspiracy: for instance, someone working in a company or in a government who has seen their boss shredding documents or doing something really fishy, and who has good evidence that something really bad is going on. Their reaction is going to be to shut up. They're going to be afraid for their jobs, perhaps for their lives. There will be a strong emotional component, and their behavior will be that they really won't want to say anything, or if they do say anything, they won't shout it from the rooftops: they'll contact a journalist anonymously, or something like this.
And if you contrast that with the behavior of conspiracy theorists who don't have actual perceptual or first-hand evidence of a conspiracy going on, these people tend not to be afraid. They can say, "I believe the CIA orchestrated 9/11, and they are this all-powerful evil institution, and yet they're not going to kill me if I say this." At worst, they're going to say things, but their emotional and behavioral reactions are really stunted, really different from what you would expect from someone who held a similar intuitive belief.)
- Time 0:32:53
-

Quote

(highlight:: Trust = Competence + Aligned Incentives
Transcript:
Speaker 1
So there are two main dimensions of trust, really. One has to do with competence: essentially, how likely is it that what you're telling me is true? That depends on how well informed you are, how much of an expert you are, whether you're someone who is very knowledgeable in a given area. For this, we keep track of informational access, for instance. Say we have a friend in common, and I know that you've seen her recently: if you tell me something about her, I will tend to believe you, because presumably you're better informed, since you've seen her more recently. More generally, we are pretty good at figuring out who is an expert in a given area, sometimes on the basis of relatively subtle cues. Say you have a friend who manages to fix your computer: you're going to think, well, they're a good computer person, and maybe you'll turn to them the next time you have a computer problem. So that's the competence dimension: does that person know the truth? Do they themselves have accurate beliefs? The other dimension, which is maybe what we really call trust in everyday life, is: are they going to tell us that? Even if I believe that you're the most expert person in the world in a given area, if I don't trust you, if I don't believe that you will share with me the accurate beliefs that you hold, then it's no use to me. And that dimension, trust per se, depends broadly on two things. One is short-term incentives. Even if you're my brother or a very good friend, if we play poker together, I'm not going to believe you, because I know that if you tell me to fold, you have no incentive to be truthful: in the context of poker, we have purely opposite incentives.
So there's this kind of short-term question: what can you get from me with that specific message? And then there are long-term incentives: are you someone whose interests are intermeshed with mine? Someone who would benefit from me doing well? And is that going to remain true moving forward? If you're a family member or a good friend, I know that you have no incentive, or only very small incentives, to mislead me, because that would jeopardize our relationship, and the cost to you, as well as to me, would be quite high.)
- Time 0:41:20
-

Quote

(highlight:: People Believe Things or Join Groups to Satisfy Needs Unrelated to the Group
Transcript:
Speaker 1
Usually, as we were mentioning earlier for religious conversions, people join a new religious group because they have practical reasons: they get along well with the people, and they get stuff in the short term that they enjoy. So if you want to convince them that the doctrine is ridiculous, it's not going to do much. What you have to do is provide them with an environment that gives them what the other environment provides, in terms of status, in terms of brotherhood, those sorts of things. As for conspiracy theories, and maybe flat-earthers in particular: when you have a really good idea and you think you're the first person in the world to have that idea, even if it's not something massive, it feels really awesome. You figured out something that no one else has figured out. Like you mentioned, if you have that belief about the earth being flat, it's like all the scientists, everybody in the world, is getting this thing completely wrong, and I know this. I have this truth that is better than what everybody else is thinking. It's maybe quite a high. If you're able to convince yourself of that, I can see how it would be quite pleasant in a way.)
- Time 0:55:28
-

Quote

(highlight:: Vaccination Receives Skepticism Because It Is Unintuitive, Not Because It "Causes Autism"
Transcript:
Speaker 1
So obviously, given everything we know about vaccination, you should do it for all the vaccines that are recommended by the health system. But I can see how it's not the most intuitive therapy. It's not like a broken arm, where you're told, "we should set the bone": sure, let's do it. Here I'm saying: your kid is perfectly fine, but we should take this thing from that sick person, transform it, and then put it in your kid. It doesn't sound great. So there's an intuition, which I think many people share, that vaccination isn't the best therapy. And we know that this intuition is the prime driver, not the stories about vaccination causing autism, for instance, because while in every culture there are people who are going to doubt vaccination, the reasons they offer to justify that doubt vary tremendously from one culture to the next. In the West, it has recently been a lot about vaccines, the MMR vaccine in particular, causing autism. It used to be that the smallpox inoculation would turn you into a cow. There are many cultures in which it's said to make you sterile, to make you age, to do all sorts of bad things to you. So the justifications vary a lot, because they are the ones you get from your environment, but the underlying suspicion of vaccines is pretty much universal. Not universal in the sense that everybody shares it, but in every population you'll have people who are very keen on being anti-vax.
Speaker 3
Yeah, that does make a lot of sense.)
- Time 1:08:16
-

Quote

(highlight:: The Stigma Against Nuclear Power is Not Irrational, But It is Immoral To Some Extent
Transcript:
Speaker 1
We don't really understand very well why people seem to have such negative preconceptions about nuclear power nearly everywhere in the world. But whatever the cause, these misperceptions have had dramatic policy consequences. It is indeed one of the domains in which you can make a plausible case that it is public opinion, to some extent, that led countries like Germany or Belgium to dismantle their nuclear fleets. And studies have shown that thousands of people have died because of this, because of the coal plants that had to be used instead of nuclear power. So yeah, it is a case in which it's not irrational for people to get this wrong, but you could make the case that it is bordering on being a bit immoral, to the extent that they are inflicting costs on others.)
- Time 1:17:27
-

Quote

(highlight:: The Disgust Response to Nuclear Energy
Transcript:
Speaker 1
We have some evidence, though it's not completely clear, that one of the things going on is that nuclear power is tapping into people's disgust mechanisms. We have this psychology that evolved to help us avoid things that are going to make us sick, so we have an intuition that we shouldn't touch, even less eat, feces and urine and anything that comes out of people's bodies, or rotten flesh, things that smell really horrendous. The way these mechanisms work is they tell us: these things contain small things that you can't really perceive, but that are going to make people sick, and the amount doesn't matter. Which is mostly true: you can get sick from a very small amount of viruses or bacteria, obviously. And it's contagious: it can be transmitted from one person to the next. I think it is that template that people apply to nuclear energy, because they think: radiation is this invisible thing, a bit like germs and viruses are invisible to the naked eye. It's this invisible thing that makes people sick, even if they're exposed to a very, very small amount of it. And the people who have gotten sick can make others sick. You see that, for instance, in the way people who had been hurt by radiation, mostly after the Hiroshima and Nagasaki bombings, were treated: it was sometimes hard to find people willing to treat them, because they were perceived as being contagious themselves, which was by and large not the case. Some of their clothes might have carried some radiation, but they themselves were not radioactive anymore. So we have this false image of nuclear energy that is, again, overwhelmingly mistaken, and I think that gives us this bad feeling about it. It's funny.
I think particulate matter would work in the same way: people have the intuition that smoking is going to make you sick. When data started coming out that smoking causes lung cancer, the intuitive reaction was, "oh yeah, I can see that happening." And if there were more media discussion of the effects of particulate matter, even though you can't really see it, in the same way you can't really see the particles in smoke, I think people would get it. But the contagion element still wouldn't carry over: if you're exposed to it, you might get sick, but you don't have this very insidious feeling you have with things that are contagious.
Speaker 2
Yeah, I would have thought that one of the differences between particulate pollution and radiation is that the former has a very simple physical mechanism that I think I intuitively understand: you burn coal or you burn wood, it produces smoke, and then I breathe it in. That sounds bad, but also comprehensible. Whereas with nuclear power, you're like: where is the radiation? I can't see it. How much is there? How much is bad? Are there different types? I'm very educated and I still find it very confusing, so you can imagine that for someone who just doesn't understand it, radiation is super weird. And I think that confusion means you just have to take the belief that it's safe on trust, because you don't understand it yourself; you don't have a graduate physics degree. And then if you don't trust engineers to that degree, well, you're just screwed: you're kind of always going to be suspicious of it.)
- Time 1:19:27
-

Quote

(highlight:: The Role of Skeptics in Stopping Harmful Information Cascades
Transcript:
Speaker 1
In an information cascade, the way it's usually described is: people are influenced by the people before them, but it looks to you as if each new person has made up their mind independently of the others, when in fact they themselves were influenced by the people before them. So it looks as if you have a lot of confirmatory evidence, when in fact it just so happens that at the beginning of the chain a few people thought so, and then everybody was overwhelmingly influenced by them. And that's supposed to get increasingly worse, because if you have five and then 10 and then 50 people who agree, then obviously the weight of the evidence that they're right becomes increasingly large. For the record, that doesn't happen when you try to do the experiments. These are nice models of how it should happen, in a way, if people are rational to some extent. But in fact, in every group you have enough people who are pig-headed and just say, "no, I'm going to ignore what everybody else is saying, I'm just going to go my own way." And these people completely break the cascade. So they can be annoying, but at least they sometimes play that kind of useful role.)
- Time 1:26:11
-

Quote

(highlight:: The Impact of Disinformation is Constrained By How Much and Where People Direct Attention
Transcript:
Speaker 1
Well, first of all, there's already an essentially infinite amount of information on the internet. So the bottleneck is not how many articles there are on any given topic, because there are already way more than anybody will ever read. The bottleneck is people's individual attention. And that bottleneck is controlled to some extent by what their friends and colleagues share on social networks, but otherwise mostly by the big actors in the field: by cable news, by big newspapers. There's no reason to believe these things are going to change dramatically. So if you have another thousand articles on a given issue, just no one is going to read them.)
- Time 1:58:10
-

Quote

(highlight:: Fake News Is Likely to Entrench Opinion More Than It Is Likely to Change It
Transcript:
Speaker 2
Let's accept for the sake of argument that it does become easier to produce misleading content in future. I think some people envisage that the outcome would be people's opinions being changed in all sorts of random directions all the time. Whereas I think your model, and now my model, having read the book, is that it's not that people would start changing their minds more, it's that they would start changing their minds less. When people make really complex arguments in some field that I don't understand, and I don't trust the person or believe that they're really an authority, and I can't check the argument they're making for myself because I don't understand it well enough, I simply don't change my mind. Likewise, in future, if people start noticing that it's possible to trick them into believing stuff all the time, because they're incapable of noticing that a video is doctored, for example, then they'll just stop changing their minds at all in response to these inputs, because they always have the option of keeping their current views. So, yeah, do you agree that that's where things would, in a bad case, potentially bottom out?
Speaker 1
Yes. It's increasingly easy to say, "well, that video has been made up," etc. But I'm not sure how much the technological impossibility of doing something was ever such a strong argument. If 40 years ago someone had pointed to a picture that appeared in the New York Times and told you, "I think it's a fake picture," would your argument really have been, "well, it's impossible to doctor a photo"? Or would your argument have been, "well, it's in the New York Times, and also in the Washington Post, and also everywhere else"? I think the argument always rests ultimately on reputation, and not on the technical possibility of doing such and such a trick. There will always be people who want to say, "well, I don't believe that," and they'll have more excuses, but I don't think it's going to make a big difference.
Speaker 2
So here I'm picturing a scenario worse than that: let's say the New York Times doesn't exist, or the New York Times is no longer credible, so you don't trust it. There's no particular authority that you trust to determine the provenance of an image or a video, to determine whether it's real or not. In that case, I think what happens is you just stop paying attention and you stop changing your mind. Yes, hopefully we can solve this by having trustworthy sources and institutions that people believe have done the legwork to figure out if things are true. But if we don't, it won't be mass persuasion. It'll be mass indifference and mass stubbornness, I think.)
- Time 2:24:59
-