How to (not) get duped in Silicon Valley
CA's latest homelessness numbers. What a parcel tax means for your wallet. How the State's actually spending its cheddar. Whatever the topic, it's easy to fall prey to misleading political narratives, as info literacy expert Melanie Trecek-King explains on YouTube, if we don't understand the web of internal and external forces influencing our judgments.
We live in a time of a lot of information and a lot of misinformation, and our goal is to spot the difference between the two. Misinformation is defined as information that's false or misleading, regardless of the intention of the person spreading it. But, importantly, information impacts our beliefs. It impacts how we see the world and, therefore, the decisions that we make. And so, we want to make sure that we can sort the good from the bad … (3:54–4:27)
So belief is anything that we accept as true, broadly speaking. We are not blank slates. We have wiring built into the brain that helps us make sense of information. And that wiring includes things like looking for patterns, trying to understand if those patterns are actually real. Intuitive thinking, our fast thinking … the biases and heuristics. All of that is part of that fast decision-making. Our biases and, importantly, our emotions. So that's the internal part of us, what we're born with.
And then there are external sources, both personal experiences and received wisdom. So personal experiences … I recently went to my very first Bigfoot Museum … one of her main lines of evidence was “I saw Bigfoot. And I know other people who tell me they saw Bigfoot.” Right? So we believe ourselves, and we believe what others tell us. …
[A]ll of this comes down to trust. We trust ourselves. We trust our perceptions. We trust our thinking. And we also trust people in our social groups. We trust our authorities: parents, clergy, members of our tribe, our social group.
Historically speaking, humans were a tribal species. We lived in small groups, and those groups were the source of our protection. We had to trust them. We don't live in that environment anymore, though … we can go online and find anybody who believes the same thing that we believe, and next thing you know, they're our tribe, and we believe what they tell us … And this is important because … it behooves us to make sure that we place our trust wisely, that we're skeptical of certain things … (6:11–8:29)
So why are we vulnerable to misinformation? Jonathan Haidt's “elephant and the rider” is an oversimplification, of course; but it's a really helpful way to understand what's going on in our head. Basically, our elephant is the part of the brain that's always on. It's emotional. It's biased. It uses shortcuts. And then our rider, instead of thinking critically about things, goes, “Yeah, that sounds about right,” and often just finds ways to rationalize them.
So the kinds of things that can cause our elephant to rampage—motivated reasoning, especially if our motivations are things like identity protection or our social groups, our values … our emotions …
[T]rust is closely related to polarization. We live in a climate where we're highly polarized; and so what we do is we just retreat to our group, and we assume that our group is right. Not only is it highly improbable that all of our beliefs are true, our tribe is probably not right about everything either.
So how do we know the difference? For me, it's important to know what's going on in our brain—our vulnerabilities—so that we can be aware when we are more likely to fall for misinformation.
There's one more thing I want to point out up there, and that's repetition: the more we hear something, the more likely we are to accept it as true. Our brain mistakes fluency (ease of processing) for truth. This is really a danger if you're in an echo chamber, for example, where the same false information just keeps circulating … (11:12–13:12)
I'm going to broadly [categorize misinformation] into fake news and science's pretenders. [As far as fake news,] things that are out of context … Next is things that are manipulated … And then, finally, many things that are completely wrong. … Now, again, why would we fall for these things? They confirm what we think is true. We are emotional, especially if it involves our group. Dunking on the other side is really satisfying. You have to be careful of self-righteousness …
And then we have another type of misinformation, which I call science's pretenders. … So pseudoscience pretends to be scientific, but it doesn't adhere to science's methodology—whereas science denial is a refusal to accept strongly supported scientific conclusions. Now, notice the motivations and the result. So pseudoscience is motivated by wanting to believe, a need for hope … But, conversely, science denial is motivated by not wanting to believe; and so, as a result, we set a standard of evidence that's impossibly high. (13:38–18:27)
We are all vulnerable. The moment we think we're not vulnerable is when we're vulnerable. Overconfidence can really get in our own way. And no one is here to save us, right? The social media companies aren't doing it. It's on us. So check before you believe it, and especially before you share it, because people trust you. (30:14–30:43)
Watch the whole thing here.
Follow Opportunity Now on Twitter @svopportunity
We prize letters from our thoughtful readers. Typed on a Smith Corona. Written in longhand on fine stationery. Scribbled on a napkin. Hey, even composed on email. Feel free to send your comments to us at opportunitynowsv@gmail.com or (snail mail) 1590 Calaveras Ave., SJ, CA 95126. Remember to be thoughtful and polite. We will post letters on an irregular basis on the main Opp Now site.