
Annotated captions of Bruce Schneier: The security mirage in English

[00:00] So security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge.

[00:26] And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.

[00:33] So if you look at security in economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether this is a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision -- whether you're going to invade some foreign country -- you're going to trade off something: either money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at a security anything is not whether this makes us safer, but whether it's worth the trade-off.
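A worked example can make this trade-off arithmetic concrete. The Python sketch below is illustrative only and not from the talk: the helper expected_benefit and every number in it (alarm cost, burglary probability, average loss, risk reduction) are invented assumptions.

    # Security as a trade-off: a measure is "worth it" only when the
    # expected loss it prevents exceeds what you give up for it.
    # All figures below are invented for illustration.

    def expected_benefit(p_incident: float, loss: float, risk_reduction: float) -> float:
        """Expected loss avoided per year by adopting the measure."""
        return p_incident * loss * risk_reduction

    annual_cost = 300.0     # alarm subscription, dollars/year (assumed)
    p_burglary = 0.02       # chance of a burglary in a given year (assumed)
    avg_loss = 5_000.0      # average loss per burglary, dollars (assumed)
    risk_reduction = 0.5    # fraction of the risk the alarm removes (assumed)

    benefit = expected_benefit(p_burglary, avg_loss, risk_reduction)  # 50.0
    print(f"expected benefit: ${benefit:.0f}/year vs. cost: ${annual_cost:.0f}/year")
    print("worth the trade-off?", benefit > annual_cost)  # False

Under these assumed numbers the alarm does make you safer, yet still fails the trade-off test -- which is exactly the distinction being drawn.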

[01:07] You've heard in the past several years that the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security -- in terms of the trade-off.

[01:26] Now, there's often no right or wrong here. Some of us have a burglar alarm system at home, and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

[01:53] Now, people have a natural intuition about these trade-offs. We make them every day -- last night in my hotel room, when I decided to double-lock the door, or you in your car when you drove here, when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.

[02:21] Imagine a rabbit in a field, eating grass, and the rabbit's going to see a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve. So you'd think that we, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question.

[02:59] I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality.

[03:10] Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African highlands in 100,000 B.C. 2010 New York, not so much.

[03:41] Now, there are several biases in risk perception. A lot of good experiments in this. And you can see certain biases that come up again and again. So I'll give you four. First, we tend to exaggerate spectacular and rare risks and downplay common risks -- so flying versus driving. Second, the unknown is perceived to be riskier than the familiar. One example would be: people fear kidnapping by strangers, when the data shows that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks -- so Bin Laden is scarier because he has a name. And the fourth is that people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control.

[04:47] There are a bunch of other of these biases, these cognitive biases, that affect our risk decisions. There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works: if you hear a lot about tiger attacks, there must be a lot of tigers around; you don't hear about lion attacks, so there aren't a lot of lions around. This works until you invent newspapers. Because what newspapers do is repeat rare risks again and again. I tell people: if it's in the news, don't worry about it. Because by definition, news is something that almost never happens. (Laughter) When something is so common, it's no longer news -- car crashes, domestic violence -- those are the risks you worry about.
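A cartoon of the availability heuristic in code may help make the mechanism concrete. The sketch is illustrative only: both the death counts and the story counts are invented assumptions, chosen to mimic the pattern of rare risks dominating coverage.

    # Availability heuristic, cartoon version: perceived risk tracks how
    # often we hear about a thing, not how often it actually happens.
    # All counts below are invented for illustration.
    actual_deaths = {"car crashes": 40_000, "terrorism": 50}      # per year (assumed)
    news_stories  = {"car crashes": 100,    "terrorism": 10_000}  # per year (assumed)

    def shares(counts: dict) -> dict:
        """Normalize raw counts into fractions of the total."""
        total = sum(counts.values())
        return {k: round(v / total, 3) for k, v in counts.items()}

    print("actual risk share:   ", shares(actual_deaths))  # crashes dominate
    print("perceived risk share:", shares(news_stories))   # terrorism dominates

The gap between the two printed lines is the filter the talk describes: feeling driven by coverage, reality driven by incidence.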

[05:38] We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "One, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes; 10,000 mangoes, 100,000 mangoes -- it's still more mangoes than you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.
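The "one in a million, one in a billion" line rewards a quick piece of arithmetic, because the two differ by a factor of a thousand even though both feel like "never." The population figure below is an assumption for illustration.

    # Scale a per-person annual risk over a population, and the difference
    # between "one in a million" and "one in a billion" becomes visible.
    population = 300_000_000  # roughly US-sized, assumed for illustration

    for annual_risk in (1e-6, 1e-9):
        expected_cases = population * annual_risk
        print(f"{annual_risk:.0e} per person per year -> "
              f"about {expected_cases:,.1f} cases per year")
    # 1e-6 -> ~300 cases a year; 1e-9 -> ~0.3. Intuition rounds both to
    # "almost never," which is exactly the innumeracy being described.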

[06:10] And what these cognitive biases do is act as filters between us and reality. And the result is that feeling and reality get out of whack; they get different. Now, you either have a feeling -- you feel more secure than you are; there's a false sense of security. Or the other way, and that's a false sense of insecurity. I write a lot about "security theater," which is products that make people feel secure but don't actually do anything. There's no real word for stuff that makes us secure but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.

[06:48] So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do, given the economic incentives, is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice.

[07:20] So what makes people notice? Well, a couple of things: understanding of the security, of the risks, the threats, the countermeasures, how they work. If you know stuff, you're more likely to have your feelings match reality. Enough real-world examples help. Now, we all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.

[07:59] Okay, so what makes people not notice? Well, a poor understanding. If you don't understand the risks and you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures. Also, feelings that are clouding the issue -- the cognitive biases I talked about earlier; fears, folk beliefs -- basically, an inadequate model of reality.
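The low-probability-events problem lends itself to a tiny simulation. The sketch below is illustrative only; the attack rates and the halving effect of the countermeasure are invented assumptions.

    # With rare events, "nothing happened" is consistent both with a
    # countermeasure that works and with one that does nothing at all.
    import random

    def attacks_in(years: int, p_per_year: float, rng: random.Random) -> int:
        """Count years in which an attack occurs, at a fixed annual rate."""
        return sum(rng.random() < p_per_year for _ in range(years))

    rng = random.Random(0)
    p_base = 0.001       # assumed baseline: 1-in-1,000 chance per year
    p_defended = 0.0005  # assumed: the countermeasure halves the risk
    decade = 10

    print("attacks per decade, undefended:", attacks_in(decade, p_base, rng))
    print("attacks per decade, defended:  ", attacks_in(decade, p_defended, rng))
    # Both lines print 0 in almost every run, so a quiet decade says nothing
    # about whether the countermeasure -- or the unicorn defense -- worked.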

[08:47] So let me complicate things. I have feeling and reality. I want to add a third element: I want to add model. Feeling and model are in our head; reality is the outside world. It doesn't change; it's real. So feeling is based on our intuition; model is based on reason. That's basically the difference. In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face.

[09:27] There's no feeling about germs. You need a model to understand them. So this model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see them. It's limited by our cognitive biases. But it has the ability to override our feelings.

[09:56] Where do we get these models? We get them from others. We get them from religion, from culture, teachers, elders. A couple of years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive. And it depended on whether you were attacked by a lion or a leopard or a rhino or an elephant -- when you had to run away, when you couldn't run away, when you had to climb a tree, when you could never climb a tree. I would have died in a day, but he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day. (Laughter) Because we had different models based on our different experiences.

[10:43] Models can come from the media, from our elected officials. Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras and ID cards; quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, of bird flu, swine flu, SARS. All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media.

[11:25] So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings. An example might be: if you go back 100 years, when electricity was first becoming common, there were a lot of fears about it. I mean, there were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It hasn't changed as we were growing up. And we're good at it.

[12:12] Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will. Models eventually fade into the background. "Intuitive" is just another word for familiar. So as your model is close to reality and it converges with feelings, you often don't know it's there.

[12:37] So a nice example of this came from last year and swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now it had a name, which made it scarier than the regular flu, even though it was more deadly. And people thought doctors should be able to deal with it. So there was that feeling of lack of control. And those two things made the risk seem greater than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it: a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.

[13:57] I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security. I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. And I call that their agenda. And you see agenda -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.

[14:46] An example, a great example, is the risk of smoking. In the history of the past 50 years, the smoking risk shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate -- probably about 30 years behind. All examples of models changing.

[15:21] What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention.

[15:53] New models that extend over long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do the next harvest. We can often do until our kids grow up. But 80 years -- we're just not good at that. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we're holding both beliefs together, the cognitive dissonance. Eventually, the new model will replace the old model.

[16:29] Strong feelings can create a model. September 11th created a security model in a lot of people's heads. Also, personal experiences with crime can do it, a personal health scare, a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.

[16:54] So in the technological world, we don't have experience to judge models. And we rely on others. We rely on proxies. And this works, as long as those others get it right. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, and none of us fears the roof is going to collapse on us -- not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's okay.

[17:42] Now, what we want is for people to get familiar enough with better models -- have it reflected in their feelings -- to allow them to make security trade-offs. When these go out of whack, you have two options. One, you can fix people's feelings -- directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model.

[18:11] Change happens slowly. The smoking debate took 40 years, and that was an easy one. Some of this stuff is hard. Really, though, information seems like our best hope.

tedtalks 18:25
18:27

And I lied.

tedtalks 18:27
18:29

Remember I said feeling, model, reality;

tedtalks 18:29
18:32

I said reality doesn't change. It actually does.

tedtalks 18:32
18:34

We live in a technological world;

tedtalks 18:34
18:37

reality changes all the time.

tedtalks 18:37
18:40

So we might have -- for the first time in our species --

tedtalks 18:40
18:43

feeling chases model, model chases reality, reality's moving --

tedtalks 18:43
18:46

they might never catch up.

tedtalks 18:47
18:49

We don't know.

tedtalks 18:49
18:51

But in the long-term,

tedtalks 18:51
18:54

both feeling and reality are important.

tedtalks 18:54
18:57

And I want to close with two quick stories to illustrate this.

[18:57] 1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story: someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf. Someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps -- that came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security better match the reality.

[19:39] Last story: a few years ago, a friend of mine gave birth. I visited her in the hospital. It turns out that when a baby's born now, they put an RFID bracelet on the baby and a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I go home, I look it up. It basically never happens. But if you think about it, if you are a hospital and you need to take a baby away from its mother, out of the room, to run some tests, you'd better have some good security theater, or she's going to rip your arm off. (Laughter)

[20:18] So it's important for us -- those of us who design security, who look at security policy, or even look at public policy in ways that affect security -- to remember: it's not just reality; it's feeling and reality. What's important is that they be about the same. It's important that, if our feelings match reality, we make better security trade-offs. Thank you. (Applause)