
Michael Shermer: The pattern behind self-deception

So since I was here last, in '06, we discovered that global climate change is turning out to be a pretty serious issue. So we covered that fairly extensively in Skeptic magazine. We investigate all kinds of scientific and quasi-scientific controversies. But it turns out we don't have to worry about any of this because the world's going to end in 2012. Another update: You will recall I introduced you guys to the Quadro Tracker. It's like a water dowsing device. It's just a hollow piece of plastic with an antenna that swivels around. And you walk around, and it points to things. Like if you're looking for marijuana in students' lockers, it'll point right to somebody. Oh, sorry. (Laughter) This particular one that was given to me finds golf balls, especially if you're at a golf course and you check under enough bushes. Well, under the category of "What's the harm of silly stuff like this?" this device, the ADE 651, was sold to the Iraqi government for 40,000 dollars apiece. It's just like this one, completely worthless, in which it allegedly worked by "electrostatic magnetic ion attraction," which translates to "pseudoscientific baloney" -- would be the nice word -- in which you string together a bunch of words that sound good, but it does absolutely nothing. In this case, at trespass points, allowing people to go through because your little tracker device said they were okay actually cost lives. So there is a danger to pseudoscience, in believing in this sort of thing.

So what I want to talk about today is belief. I want to believe, and you do too. And in fact, I think my thesis here is that belief is the natural state of things. It is the default option. We just believe. We believe all sorts of things. Belief is natural. Disbelief, skepticism, science, is not natural. It's more difficult. It's uncomfortable to not believe things. So, like Fox Mulder on "X-Files," who wants to believe in UFOs? Well, we all do.
And the reason for that is because we have a belief engine in our brains. Essentially, we are pattern-seeking primates. We connect the dots: A is connected to B; B is connected to C. And sometimes A really is connected to B, and that's called association learning. We find patterns, we make those connections, whether it's Pavlov's dog here associating the sound of the bell with the food, and then he salivates to the sound of the bell, or whether it's a Skinnerian rat, in which he's making an association between his behavior and a reward for it, and therefore he repeats the behavior. In fact, what Skinner discovered is that, if you put a pigeon in a box like this, and he has to press one of these two keys, and he tries to figure out what the pattern is, and you give him a little reward in the hopper box there -- if you just randomly assign rewards such that there is no pattern, pigeons will find any kind of pattern. And whatever they were doing just before they got the reward, they repeat that particular pattern. Sometimes it was even spinning around twice counterclockwise, once clockwise, then pecking the key twice. And that's called superstition. And that, I'm afraid, we will always have with us. I call this process "patternicity," that is, the tendency to find meaningful patterns in both meaningful and meaningless noise. When we do this process, we make two types of errors. A Type I error, or false positive, is believing a pattern is real when it's not. Our second type of error is a false negative. A Type II error is not believing a pattern is real when it is. So let's do a thought experiment. You are a hominid three million years ago walking on the plains of Africa. Your name is Lucy, okay? And you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Your next decision could be the most important one of your life.
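The two error types Shermer defines form a small truth table: belief versus reality. As an illustration (the function name and labels are mine, not from the talk), a minimal Python sketch:

```python
# Classify a pattern judgment against reality.
# Type I (false positive): believing a pattern is real when it is not.
# Type II (false negative): not believing a pattern that is real.
def classify(believed: bool, real: bool) -> str:
    if believed and not real:
        return "Type I error (false positive)"
    if not believed and real:
        return "Type II error (false negative)"
    return "correct judgment"

print(classify(believed=True, real=False))   # Type I error (false positive)
print(classify(believed=False, real=True))   # Type II error (false negative)
```

The two remaining cells of the table (believing a real pattern, doubting an absent one) are the correct judgments.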
Well, if you think the rustle in the grass is a dangerous predator and it turns out it's just the wind, you've made an error in cognition, made a Type I error, false positive. But no harm. You just move away. You're more cautious. You're more vigilant. On the other hand, if you believe that the rustle in the grass is just the wind, and it turns out it's a dangerous predator, you're lunch. You've just won a Darwin award. You've been taken out of the gene pool. Now the problem here is that patternicities will occur whenever the cost of making a Type I error is less than the cost of making a Type II error. This is the only equation in the talk, by the way. We have a pattern detection problem: assessing the difference between a Type I and a Type II error is highly problematic, especially in split-second, life-and-death situations. So the default position is just "believe all patterns are real." "All rustles in the grass are dangerous predators and not just the wind." And so I think that, through natural selection, we evolved the propensity for our belief engines, our pattern-seeking brain processes, to always find meaningful patterns and infuse them with these sorts of predatory or intentional agencies that I'll come back to. So for example, what do you see here? It's a horse head, that's right. It looks like a horse. It must be a horse. That's a pattern. And is it really a horse? Or is it more like a frog? See, our pattern detection device, which appears to be located in the anterior cingulate cortex -- it's our little detection device there -- can be easily fooled, and this is the problem. For example, what do you see here? Yes, of course. It's a cow. Once I prime the brain -- it's called cognitive priming -- once I prime the brain to see it, it pops back out again even without the pattern that I've imposed on it. And what do you see here? Some people see a dalmatian dog. Yes, there it is. And there's the prime.
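Shermer's "only equation in the talk" -- believe the pattern whenever the cost of a Type I error is less than the cost of a Type II error -- can be written as a one-line decision rule. The sketch below uses illustrative cost numbers of my own choosing, not figures from the talk:

```python
def should_believe_pattern(cost_type1: float, cost_type2: float) -> bool:
    """Believe the pattern whenever a false positive is cheaper
    than a false negative -- Shermer's patternicity condition."""
    return cost_type1 < cost_type2

# Lucy on the plains of Africa: fleeing the wind is cheap,
# while ignoring a real predator is fatal, so belief wins.
cost_flee_wind = 1.0        # Type I: wasted caution
cost_ignore_predator = 1e6  # Type II: taken out of the gene pool
print(should_believe_pattern(cost_flee_wind, cost_ignore_predator))  # True
```

Because the Type II cost dwarfs the Type I cost in the predator scenario, the rule returns True for essentially any rustle -- which is exactly the "believe all patterns are real" default the talk describes.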
So when I go back without the prime, your brain already has the model, so you can see it again. What do you see here? Planet Saturn. Yes, that's good. How about here? Just shout out anything you see. That's a good audience, Chris. Because there's nothing in this. Well, allegedly there's nothing. This is an experiment done by Jennifer Whitson at U.T. Austin on corporate environments and whether feelings of uncertainty and being out of control make people see illusory patterns. That is, almost everybody sees the planet Saturn. People who are put in a condition of feeling out of control are more likely to see something in this, which is allegedly patternless. In other words, the propensity to find these patterns goes up when there's a lack of control. For example, baseball players are notoriously superstitious when they're batting, but not so much when they're fielding. Because fielders are successful 90 to 95 percent of the time. The best batters fail seven out of 10 times. So their superstitions, their patternicities, are all associated with feelings of lack of control and so forth. What do you see in this particular one here, in this field? Anybody see an object there? There actually is something here, but it's degraded. While you're thinking about that, this was an experiment done by Susan Blackmore, a psychologist in England, who showed subjects this degraded image and then ran a correlation between their scores on an ESP test and how much they believe in the paranormal, supernatural, angels and so forth. And those who scored high on the ESP scale tended not only to see more patterns in the degraded images, but incorrect patterns.

Video Details

Duration: 19 minutes and 1 second
Country: American Samoa
Language: English
Genre: None
Producer: TED
Director: TED
Views: 83
Posted by: iamwagner on Jun 17, 2010

Michael Shermer says the human tendency to believe strange things -- from alien abductions to dowsing rods -- boils down to two of the brain's most basic, hard-wired survival skills. He explains what they are, and how they get us into trouble.
