1
00:00:01,321 --> 00:00:03,656
Morgan: If you think
you see everyone as equal,
2
00:00:03,657 --> 00:00:05,626
you're kidding yourself.
3
00:00:05,627 --> 00:00:07,660
We all have biases.
4
00:00:07,661 --> 00:00:09,030
[ Sighs ]
5
00:00:11,032 --> 00:00:13,568
And no matter
how open-minded...
6
00:00:15,170 --> 00:00:17,804
We think we are...
7
00:00:17,805 --> 00:00:20,573
Morgan: Stereotypes color
our judgment of others.
8
00:00:20,574 --> 00:00:21,676
[ Gun cocks ]
9
00:00:24,512 --> 00:00:26,547
...and can lead us badly astray.
10
00:00:28,716 --> 00:00:33,120
We live in a society fractured
by race, religion,
11
00:00:33,121 --> 00:00:34,955
even our favorite sports teams.
12
00:00:34,956 --> 00:00:36,324
[ Baseball bat cracks ]
13
00:00:36,325 --> 00:00:38,291
Man:
Going, going, gone!
14
00:00:38,292 --> 00:00:39,993
Yes! Yes!
15
00:00:39,994 --> 00:00:44,664
Morgan: We divide ourselves
into rival tribes.
16
00:00:44,665 --> 00:00:47,667
The political divide
between us grows deeper
17
00:00:47,668 --> 00:00:50,471
with every passing year.
18
00:00:50,472 --> 00:00:53,874
When did hate become hard-wired
into our brains?
19
00:00:53,875 --> 00:00:56,843
We live in two different
Americas, one for the rich...
20
00:00:56,844 --> 00:01:02,048
Are we all born to discriminate
against our fellow humans?
21
00:01:02,049 --> 00:01:04,987
Are we all bigots?
22
00:01:08,755 --> 00:01:13,094
Space, time, life itself.
23
00:01:15,496 --> 00:01:20,400
The secrets of the cosmos
lie through the wormhole.
24
00:01:20,401 --> 00:01:23,403
-- Captions by VITAC --
www.Vitac.Com
25
00:01:23,404 --> 00:01:26,441
Captions paid for by
Discovery Communications
26
00:01:33,421 --> 00:01:37,724
Have you ever thought about
who you instinctively trust
27
00:01:37,725 --> 00:01:40,362
and who you instinctively fear?
28
00:01:42,764 --> 00:01:47,334
Someone's walking toward you
down a dark alley...
29
00:01:47,335 --> 00:01:50,239
[ Metallic click ]
Holding something in his hand.
30
00:01:53,876 --> 00:01:57,946
Now, I think of myself
as an open-minded person.
31
00:01:57,947 --> 00:02:00,815
But scientists tell me
I'm kidding myself
32
00:02:00,816 --> 00:02:03,185
and so are you.
33
00:02:03,186 --> 00:02:06,453
We all look at the world
with prejudice,
34
00:02:06,454 --> 00:02:10,524
and when you have
only a split second to decide,
35
00:02:10,525 --> 00:02:14,329
your own snap judgments
may shock you.
36
00:02:18,801 --> 00:02:22,870
Josh Correll grew up
solving puzzles for fun,
37
00:02:22,871 --> 00:02:25,307
but now, as a psychologist,
38
00:02:25,308 --> 00:02:28,909
he's trying to solve
the puzzle of racism.
39
00:02:28,910 --> 00:02:32,546
And his work is
a matter of life and death.
40
00:02:32,547 --> 00:02:37,351
Correll:
So, my research was originally inspired by Amadou Diallo.
41
00:02:37,352 --> 00:02:41,154
He came home, went back outside
to sit on his front porch,
42
00:02:41,155 --> 00:02:42,856
basically, the stoop
of the apartment building.
43
00:02:42,857 --> 00:02:44,159
[ Vehicle approaches, stops ]
44
00:02:44,160 --> 00:02:46,862
Morgan: It was the early hours
of February 4, 1999.
45
00:02:46,863 --> 00:02:48,095
[ Car doors open ]
46
00:02:48,096 --> 00:02:51,231
Four New York police officers
approached him.
47
00:02:51,232 --> 00:02:54,000
Diallo reached for his wallet
48
00:02:54,001 --> 00:02:56,337
when one of the officers
shouted,
49
00:02:56,338 --> 00:02:58,206
"Gun! He's got a gun!"
50
00:02:58,207 --> 00:03:05,080
[ Gunfire ]
51
00:03:05,081 --> 00:03:10,118
They fired 41 rounds,
killing him at the scene.
52
00:03:10,119 --> 00:03:13,255
The police officers were
acquitted of all charges,
53
00:03:13,256 --> 00:03:15,390
sparking
a heated national debate.
54
00:03:15,391 --> 00:03:16,590
[ Crowd chanting indistinctly ]
55
00:03:16,591 --> 00:03:19,659
It was a tragedy that has
since repeated itself
56
00:03:19,660 --> 00:03:23,098
in the death of Michael Brown
in Ferguson, Missouri,
57
00:03:23,099 --> 00:03:26,767
the death of Tamir Rice
in Cleveland, Ohio,
58
00:03:26,768 --> 00:03:30,671
the death of John Crawford
in Beavercreek, Ohio.
59
00:03:30,672 --> 00:03:33,540
And so, that presented a puzzle.
60
00:03:33,541 --> 00:03:35,376
That presented a question.
61
00:03:35,377 --> 00:03:38,278
How can we determine whether
or not race was actually
62
00:03:38,279 --> 00:03:40,082
what drove the officers
to shoot?
63
00:03:42,851 --> 00:03:44,518
Some of them will be armed.
64
00:03:44,519 --> 00:03:47,755
Morgan:
Josh is about to run an experiment with live ammunition
65
00:03:47,756 --> 00:03:50,993
and with participants
who are not policemen.
66
00:03:52,261 --> 00:03:54,928
He asks his white test subject
67
00:03:54,929 --> 00:03:59,200
to treat the simulation
as if it is real life.
68
00:03:59,201 --> 00:04:04,005
A potentially lethal person is
about to confront him,
69
00:04:04,006 --> 00:04:08,244
and he will have less than
one second to make a decision.
70
00:04:10,478 --> 00:04:12,580
There's time pressure.
They have to react quickly.
71
00:04:12,581 --> 00:04:14,447
And we can look and see
72
00:04:14,448 --> 00:04:16,483
whether when we change
the race of the suspect
73
00:04:16,484 --> 00:04:18,853
from black to white
or white to black,
74
00:04:18,854 --> 00:04:20,856
does that influence
a person's behavior.
75
00:04:20,857 --> 00:04:26,127
Morgan:
The subject will see a scene appear on a screen downrange.
76
00:04:26,128 --> 00:04:29,796
Then, a white man will appear
holding either a gun
77
00:04:29,797 --> 00:04:31,332
or a cellphone
78
00:04:31,333 --> 00:04:38,306
or it will be a black man
with a gun or a cellphone.
79
00:04:38,307 --> 00:04:42,310
The image of the man is only up
for one second.
80
00:04:42,311 --> 00:04:44,344
Time to decide.
81
00:04:44,345 --> 00:04:46,882
Shoot or hold fire?
82
00:04:48,016 --> 00:04:51,351
Mistake the gun for a phone
and die.
83
00:04:51,352 --> 00:04:56,191
Mistake the phone for a gun
and kill an innocent person.
84
00:04:59,026 --> 00:04:59,961
[ Gun cocks ]
85
00:05:01,996 --> 00:05:04,565
So, what we want to look at is,
in that situation
86
00:05:04,566 --> 00:05:06,834
where there's not good,
clear information
87
00:05:06,835 --> 00:05:08,503
where people have
to respond quickly,
88
00:05:08,504 --> 00:05:10,972
did they use race
to inform their decisions?
89
00:05:11,939 --> 00:05:16,144
Morgan: The simulation cycles
through dozens of confrontations
90
00:05:16,145 --> 00:05:19,615
equally split between white
and black male subjects.
91
00:05:22,050 --> 00:05:24,252
Josh records
how long it takes subjects
92
00:05:24,253 --> 00:05:25,986
to make a decision
93
00:05:25,987 --> 00:05:28,822
and whether or not they kill
an innocent person
94
00:05:28,823 --> 00:05:30,390
or die themselves.
95
00:05:30,391 --> 00:05:31,959
It's worth noting
that, in this game,
96
00:05:31,960 --> 00:05:33,260
people are pretty good.
97
00:05:33,261 --> 00:05:34,863
They don't make
a ton of mistakes.
98
00:05:36,064 --> 00:05:38,266
10%, 15% of the time,
they make a mistake.
99
00:05:39,468 --> 00:05:40,969
But when we look
at those mistakes,
100
00:05:40,970 --> 00:05:42,570
we see racial bias
in the errors.
101
00:05:42,571 --> 00:05:44,472
[ Gunshot ]
102
00:05:44,473 --> 00:05:46,308
So they're faster
to shoot the armed target
103
00:05:46,309 --> 00:05:47,710
if he's black rather than white.
104
00:05:49,078 --> 00:05:50,210
[ Gunshot ]
105
00:05:50,211 --> 00:05:51,613
When the target's got
a cellphone,
106
00:05:51,614 --> 00:05:53,414
they're much more likely
to make that decision
107
00:05:53,415 --> 00:05:55,718
to shoot an innocent target
when he's black...
108
00:05:58,420 --> 00:05:59,321
Rather than white.
109
00:06:02,357 --> 00:06:06,226
Morgan:
Josh has run this experiment on thousands of people.
110
00:06:06,227 --> 00:06:10,964
On average, white subjects are
quicker to shoot the black male
111
00:06:10,965 --> 00:06:17,171
and are 30% to 40% more likely
to mistake his phone for a gun.
112
00:06:17,172 --> 00:06:19,841
When Josh performs
this experiment
113
00:06:19,842 --> 00:06:22,142
with law-enforcement officers,
114
00:06:22,143 --> 00:06:24,678
he finds
that their expert training
115
00:06:24,679 --> 00:06:28,849
significantly reduces the
occurrence of fatal mistakes.
116
00:06:28,850 --> 00:06:31,786
But no matter what
their background or training,
117
00:06:31,787 --> 00:06:37,293
most participants are quicker
to shoot at a black target.
118
00:06:38,694 --> 00:06:39,827
Does this mean
119
00:06:39,828 --> 00:06:43,330
that white Americans are
inherently bigoted?
120
00:06:43,331 --> 00:06:47,902
An utterly shocking trend
with Josh's black participants
121
00:06:47,903 --> 00:06:51,607
suggests that it's
much more complicated than that.
122
00:07:01,316 --> 00:07:02,285
[ Gunshot ]
123
00:07:03,953 --> 00:07:08,623
We see that black participants
show the same anti-black bias
124
00:07:08,624 --> 00:07:10,126
that white participants do.
125
00:07:10,992 --> 00:07:12,293
Actually when we test
126
00:07:12,294 --> 00:07:14,729
to see if there is a difference
in the two groups,
127
00:07:14,730 --> 00:07:16,729
white participants
versus black participants,
128
00:07:16,730 --> 00:07:18,767
they are not statistically
different from each other.
129
00:07:21,237 --> 00:07:22,205
[ Gunshot ]
130
00:07:23,872 --> 00:07:26,339
So, we think this represents
131
00:07:26,340 --> 00:07:28,877
an awareness
of a cultural stereotype,
132
00:07:28,878 --> 00:07:32,447
not that our participants
believe necessarily
133
00:07:32,448 --> 00:07:34,882
that black men are
more dangerous than white men.
134
00:07:34,883 --> 00:07:36,017
[ Gun cocks ]
135
00:07:36,018 --> 00:07:38,853
But by virtue of the movies
that they watch,
136
00:07:38,854 --> 00:07:41,223
the music that they listen to,
news reports,
137
00:07:41,224 --> 00:07:42,322
they're getting the idea
138
00:07:42,323 --> 00:07:44,892
that "black male" goes
with "violent."
139
00:07:44,893 --> 00:07:46,327
[ Siren wails ]
140
00:07:46,328 --> 00:07:49,430
The group and the idea are
linked together in their brains
141
00:07:49,431 --> 00:07:52,101
whether they agree
with that stereotype or not.
142
00:07:53,634 --> 00:07:57,205
Why would we make
life-and-death decisions
143
00:07:57,206 --> 00:08:01,241
based on stereotypes
we don't even believe?
144
00:08:01,242 --> 00:08:02,310
I've always thought
145
00:08:02,311 --> 00:08:05,346
we could overcome
these bigoted ideas.
146
00:08:05,347 --> 00:08:10,284
But one neuroscientist says
it's not that simple.
147
00:08:10,285 --> 00:08:15,491
Racist stereotypes hijack
our subconscious minds.
148
00:08:17,693 --> 00:08:19,928
Neuroscientist
Jon Freeman believes
149
00:08:19,929 --> 00:08:24,164
that we all carry around
stereotypes in our subconscious.
150
00:08:24,165 --> 00:08:28,435
In fact, the instant
you see another person's face,
151
00:08:28,436 --> 00:08:32,606
your brain first perceives them
as a stereotype of their race,
152
00:08:32,607 --> 00:08:36,044
gender, or sexual orientation.
153
00:08:36,045 --> 00:08:39,646
So when you first lay eyes
on a young Asian student,
154
00:08:39,647 --> 00:08:42,550
she might register as...
155
00:08:43,617 --> 00:08:46,554
The stereotypical
Asian overachiever...
156
00:08:47,789 --> 00:08:50,691
But only for an instant.
157
00:08:50,692 --> 00:08:52,894
The way we make snap judgments
about others is
158
00:08:52,895 --> 00:08:55,462
nowhere near
politically correct.
159
00:08:55,463 --> 00:08:57,765
Whether you like it or not,
160
00:08:57,766 --> 00:09:00,333
a well-groomed man
may first trigger
161
00:09:00,334 --> 00:09:03,103
a stale stereotype
in your subconscious mind...
162
00:09:03,104 --> 00:09:05,874
[ Whistle blows,
upbeat music plays ]
163
00:09:05,875 --> 00:09:09,510
Until your conscious mind
corrects you.
164
00:09:09,511 --> 00:09:11,946
Jon wants
to understand precisely
165
00:09:11,947 --> 00:09:15,116
why first impressions
conjure up clichés.
166
00:09:15,117 --> 00:09:18,986
Woman:
Excuse me. Jon?
167
00:09:18,987 --> 00:09:20,756
I'm a docile white girl.
168
00:09:21,689 --> 00:09:24,859
And he wants to learn
if there's a way for us
169
00:09:24,860 --> 00:09:27,895
to see through these clichés.
170
00:09:27,896 --> 00:09:29,496
Jon: It takes hundreds
of milliseconds
171
00:09:29,497 --> 00:09:32,466
for that judgment
to sort of crystallize and form,
172
00:09:32,467 --> 00:09:34,568
and a lot happens
during that process.
173
00:09:34,569 --> 00:09:36,136
And we are only beginning
to understand
174
00:09:36,137 --> 00:09:38,338
what the implications
of that might be.
175
00:09:38,339 --> 00:09:41,609
Jon uses brain scanners
to determine exactly
176
00:09:41,610 --> 00:09:43,144
what is going on in the brain
177
00:09:43,145 --> 00:09:46,280
during the first fraction
of a second of perception,
178
00:09:46,281 --> 00:09:48,616
long before we are
consciously aware
179
00:09:48,617 --> 00:09:50,086
of what we're looking at.
180
00:09:51,152 --> 00:09:53,688
The brain is like an office,
181
00:09:53,689 --> 00:09:57,957
where two key desks handle most
of the face-analyzing workload,
182
00:09:57,958 --> 00:10:00,195
the fusiform face area...
183
00:10:02,463 --> 00:10:05,699
And the orbitofrontal cortex.
184
00:10:05,700 --> 00:10:09,370
When a visual signal arrives,
185
00:10:09,371 --> 00:10:12,606
they both get to work
simultaneously to process it
186
00:10:12,607 --> 00:10:15,242
in their own specialized ways.
187
00:10:15,243 --> 00:10:17,111
Jon: The fusiform face area is
188
00:10:17,112 --> 00:10:19,947
involved in taking
visual information
189
00:10:19,948 --> 00:10:21,750
and forming
a coherent representation
190
00:10:21,751 --> 00:10:24,318
of the identity
and, say, the gender
191
00:10:24,319 --> 00:10:25,688
and the race of the face.
192
00:10:28,290 --> 00:10:31,693
Morgan: But across the way,
the orbitofrontal cortex is
193
00:10:31,694 --> 00:10:34,895
focused on associating that face
with all the knowledge it has
194
00:10:34,896 --> 00:10:36,198
about the world.
195
00:10:37,399 --> 00:10:40,067
Jon: The orbitofrontal cortex is
retrieving
196
00:10:40,068 --> 00:10:43,438
all of the associations
spontaneously without awareness.
197
00:10:43,439 --> 00:10:45,973
Morgan:
When it sees a black man's face,
198
00:10:45,974 --> 00:10:48,610
the orbitofrontal cortex
quickly looks up
199
00:10:48,611 --> 00:10:52,047
all the general information
the brain has about black men,
200
00:10:52,048 --> 00:10:59,220
including many stereotypes,
and alters the visual signal.
201
00:10:59,221 --> 00:11:02,456
So, some brain regions can sort
of convince other brain regions
202
00:11:02,457 --> 00:11:04,792
to be in line with them.
203
00:11:04,793 --> 00:11:07,863
Morgan: Because of this,
stereotypes can hijack
204
00:11:07,864 --> 00:11:09,864
the signal from our eyes...
205
00:11:10,833 --> 00:11:15,035
And change what we perceive.
206
00:11:15,036 --> 00:11:16,603
When test subjects look
207
00:11:16,604 --> 00:11:19,240
at a black male
with a neutral expression,
208
00:11:19,241 --> 00:11:21,842
their brains
immediately light up
209
00:11:21,843 --> 00:11:24,611
as if they are perceiving anger.
210
00:11:24,612 --> 00:11:26,947
And even though
they don't realize it,
211
00:11:26,948 --> 00:11:30,018
when they look at a white female
with a blank gaze,
212
00:11:30,019 --> 00:11:34,454
their brain's instant reaction
is to perceive happiness.
213
00:11:34,455 --> 00:11:39,492
These stereotypes take place in
all of the brains Jon studied,
214
00:11:39,493 --> 00:11:44,030
no matter their gender, race,
or sexual orientation.
215
00:11:44,031 --> 00:11:47,266
These prejudiced thoughts are
quickly snuffed out
216
00:11:47,267 --> 00:11:48,803
by the conscious mind,
217
00:11:48,804 --> 00:11:51,472
but that doesn't mean
that they're harmless.
218
00:11:51,473 --> 00:11:54,141
Those stereotypes can actually
wind up impacting behavior.
219
00:11:54,142 --> 00:11:55,443
So, for example,
220
00:11:55,444 --> 00:11:59,380
if individuals unconsciously see
African-American faces
221
00:11:59,381 --> 00:12:02,183
as being slightly more angry
than they are,
222
00:12:02,184 --> 00:12:05,586
that's probably going to impact
how much they approach
223
00:12:05,587 --> 00:12:08,621
or avoid that individual
at a spontaneous level.
224
00:12:08,622 --> 00:12:13,059
Morgan: If we recognize that we
are all prone to these biases,
225
00:12:13,060 --> 00:12:14,162
maybe we can compensate...
226
00:12:14,163 --> 00:12:15,130
[ Gunshot ]
227
00:12:15,664 --> 00:12:18,698
...and avoid unintended acts
of prejudice.
228
00:12:18,699 --> 00:12:23,069
But one biologist is attempting
to go one step further
229
00:12:23,070 --> 00:12:25,073
to manipulate animal minds...
230
00:12:27,075 --> 00:12:29,144
And override their bigotry.
231
00:12:30,821 --> 00:12:36,158
A stereotype is
the brain's way of saving time.
232
00:12:36,159 --> 00:12:39,228
It looks at people or objects
233
00:12:39,229 --> 00:12:43,733
and makes quick decisions
about them.
234
00:12:43,734 --> 00:12:47,836
Who'd want to eat
this disgusting thing?
235
00:12:47,837 --> 00:12:51,576
But these mental shortcuts
can lead us astray.
236
00:12:55,244 --> 00:12:57,113
Mmm.
237
00:12:57,114 --> 00:12:59,349
Delicious.
238
00:12:59,350 --> 00:13:02,185
Can we look beyond appearances
239
00:13:02,186 --> 00:13:04,855
and see people
for who they really are?
240
00:13:10,427 --> 00:13:12,494
Neuroscientist Peggy Mason knows
241
00:13:12,495 --> 00:13:14,464
that getting through
her daily routine
242
00:13:14,465 --> 00:13:20,836
requires looking at everything
as a stereotype.
243
00:13:20,837 --> 00:13:23,372
A basket
of freshly plucked vegetables is
244
00:13:23,373 --> 00:13:25,708
a vegetable basket.
245
00:13:25,709 --> 00:13:29,378
Vegetable baskets contain
vegetables.
246
00:13:29,379 --> 00:13:32,315
Don't they?
247
00:13:32,316 --> 00:13:34,749
Mason: We make expectations
about everything,
248
00:13:34,750 --> 00:13:36,953
and they smooth the way.
249
00:13:36,954 --> 00:13:38,387
They're shortcuts.
250
00:13:38,388 --> 00:13:41,558
They make our life happen much
faster and much more easily.
251
00:13:44,695 --> 00:13:46,597
Morgan: Without the ability
to stereotype,
252
00:13:46,598 --> 00:13:49,431
everything we do would take
enormous mental effort
253
00:13:49,432 --> 00:13:52,835
to understand.
254
00:13:52,836 --> 00:13:54,871
We could take nothing
for granted.
255
00:13:59,243 --> 00:14:01,978
We have relied
on stereotyping for eons
256
00:14:01,979 --> 00:14:05,580
to quickly tell our tribe
from outsiders.
257
00:14:05,581 --> 00:14:08,551
For all the hurt
that stereotyping causes,
258
00:14:08,552 --> 00:14:12,687
it's fundamental
to how our brains work.
259
00:14:12,688 --> 00:14:15,758
So, we're more likely
to help those closest to us,
260
00:14:15,759 --> 00:14:17,359
and for complete strangers
261
00:14:17,360 --> 00:14:19,862
that we've never even seen
the likes of,
262
00:14:19,863 --> 00:14:22,297
we're not so likely
to help them.
263
00:14:22,298 --> 00:14:24,700
Morgan: Peggy wanted to see
if there might be a way
264
00:14:24,701 --> 00:14:28,336
to get the brain
to overcome these biases.
265
00:14:28,337 --> 00:14:32,108
I think
that we humans act in part
266
00:14:32,109 --> 00:14:35,878
due to our shared
mammalian biology,
267
00:14:35,879 --> 00:14:39,381
and I think that we can
increase social cohesion
268
00:14:39,382 --> 00:14:41,452
in modern society
amongst humans.
269
00:14:43,854 --> 00:14:47,223
Morgan:
She began with a mammal that has a simpler brain than ours...
270
00:14:47,224 --> 00:14:48,257
[ Rat squeaking ]
271
00:14:48,258 --> 00:14:50,694
Mason: Hey, little guys.
How you doing?
272
00:14:50,695 --> 00:14:52,427
You're okay, little buddy.
273
00:14:52,428 --> 00:14:55,397
Morgan: ...The rat,
a creature who typically
274
00:14:55,398 --> 00:14:59,536
only aids members
of its own strain.
275
00:14:59,537 --> 00:15:01,170
Peggy's experiment involves
276
00:15:01,171 --> 00:15:04,440
temporarily trapping a rat
in a plastic tube.
277
00:15:04,441 --> 00:15:06,410
The tube has just one way out,
278
00:15:06,411 --> 00:15:10,814
through a door that can
only be opened by another rat.
279
00:15:10,815 --> 00:15:13,449
When another rat
from the same strain is added
280
00:15:13,450 --> 00:15:15,551
to the chamber,
it's not long
281
00:15:15,552 --> 00:15:20,723
before he works out how to free
his imprisoned fellow tribesman.
282
00:15:20,724 --> 00:15:25,328
Mason: These are all albino rats
of the Sprague Dawley stock.
283
00:15:25,329 --> 00:15:28,564
And so,
while they are not identical
284
00:15:28,565 --> 00:15:30,633
and they've never met
each other,
285
00:15:30,634 --> 00:15:34,903
they also might look like
the fifth cousin twice removed.
286
00:15:34,904 --> 00:15:38,208
Morgan: If the rat looks
familiar, the other rat helps.
287
00:15:38,209 --> 00:15:40,410
But then Peggy repeats
the experiment
288
00:15:40,411 --> 00:15:43,179
with rats of unrelated strains.
289
00:15:43,180 --> 00:15:45,980
Now it's a black capped rat
in the tube,
290
00:15:45,981 --> 00:15:49,951
and an albino rat has
the option to free him.
291
00:15:49,952 --> 00:15:52,253
Mason: They've never met
a black capped rat before.
292
00:15:52,254 --> 00:15:53,822
They don't open for them.
293
00:15:53,823 --> 00:15:55,258
They have no affiliative bond,
294
00:15:55,259 --> 00:15:57,060
and therefore
they do not act prosocially
295
00:15:57,061 --> 00:16:02,665
towards these very
strange-looking types of rats.
296
00:16:02,666 --> 00:16:05,501
Morgan: But can a rat
ever change its ways?
297
00:16:05,502 --> 00:16:09,370
To find out,
Peggy exposes a white rat
298
00:16:09,371 --> 00:16:11,040
to a black capped rat.
299
00:16:11,041 --> 00:16:13,309
Mason:
We took albino rats.
300
00:16:13,310 --> 00:16:15,344
We housed them
with black capped rats
301
00:16:15,345 --> 00:16:16,811
for two weeks.
302
00:16:16,812 --> 00:16:20,648
Then we rehoused them
with an albino rat
303
00:16:20,649 --> 00:16:23,553
so they've known
one black capped rat.
304
00:16:24,721 --> 00:16:28,157
Morgan: Does this experience
make the albino empathetic
305
00:16:28,158 --> 00:16:30,528
to all black capped rats?
306
00:16:32,562 --> 00:16:35,263
To find out, she places him
in the arena
307
00:16:35,264 --> 00:16:38,868
with a trapped black capped rat
that he's never met before.
308
00:16:38,869 --> 00:16:40,738
[ Rat squeaking ]
309
00:16:49,278 --> 00:16:54,117
The albino rat breaks
through the color line.
310
00:16:54,118 --> 00:16:57,753
Mason: What that suggested was
that just having a bond
311
00:16:57,754 --> 00:17:01,790
to one black capped rat
would allow an albino rat
312
00:17:01,791 --> 00:17:05,128
to generalize to
all the black capped rats.
313
00:17:05,129 --> 00:17:07,830
They've known one.
They've lived with one.
314
00:17:07,831 --> 00:17:09,698
Now they get tested
with strangers.
315
00:17:09,699 --> 00:17:10,766
And lo and behold,
316
00:17:10,767 --> 00:17:13,368
they're perfectly helpful
to the strangers.
317
00:17:13,369 --> 00:17:17,439
So that was
really exceptional to me
318
00:17:17,440 --> 00:17:21,644
because it showed
that experience was so powerful.
319
00:17:21,645 --> 00:17:23,614
Morgan: It may not be
as hard as you think
320
00:17:23,615 --> 00:17:26,080
for a bigot to have
a change of heart.
321
00:17:26,081 --> 00:17:28,518
If any of us has
a positive experience
322
00:17:28,519 --> 00:17:31,121
with someone
from a different racial group,
323
00:17:31,122 --> 00:17:34,657
biology has the power
to make us feel empathy
324
00:17:34,658 --> 00:17:36,891
for a stranger from that group.
325
00:17:36,892 --> 00:17:41,930
In fact, Peggy believes that
empathy is a primal instinct
326
00:17:41,931 --> 00:17:43,299
for all mammals.
327
00:17:43,300 --> 00:17:48,437
What rats tell us is that
we have a mammalian inheritance
328
00:17:48,438 --> 00:17:53,775
which makes us want to help
another in distress.
329
00:17:53,776 --> 00:17:56,579
But the amazing thing
that we learn from the rats is
330
00:17:56,580 --> 00:18:00,081
that what the rats need to do is
to have an experience
331
00:18:00,082 --> 00:18:01,817
with a different type of rat,
332
00:18:01,818 --> 00:18:05,989
and then that rat can be part
of their ingroup, too.
333
00:18:05,990 --> 00:18:11,360
And that's really an amazing
and hopeful message, I think.
334
00:18:11,361 --> 00:18:15,764
Empathy has enormous power.
335
00:18:15,765 --> 00:18:19,167
Images of Nelson Mandela
behind bars
336
00:18:19,168 --> 00:18:24,139
evoked such compassion
from people of all races
337
00:18:24,140 --> 00:18:29,278
that they helped end apartheid
in South Africa.
338
00:18:29,279 --> 00:18:33,549
But there's another darker side
to the human mind,
339
00:18:33,550 --> 00:18:35,884
one that allows us
to take pleasure
340
00:18:35,885 --> 00:18:39,054
in the pain of others
341
00:18:39,055 --> 00:18:43,693
and could make us addicted
to bigotry.
342
00:18:45,821 --> 00:18:50,591
Bigotry is as old
as human society.
343
00:18:50,592 --> 00:18:56,464
We persecute people
of different skin color,
344
00:18:56,465 --> 00:19:01,102
of different religion.
345
00:19:01,103 --> 00:19:06,241
We discriminate
between men and women.
346
00:19:06,242 --> 00:19:10,811
But bigotry isn't just about
the circumstances of your birth.
347
00:19:10,812 --> 00:19:15,650
Even fans of rival sports teams
can learn to hate one another
348
00:19:15,651 --> 00:19:19,054
with all the venom of a bigot.
349
00:19:20,022 --> 00:19:23,758
Harvard psychologist Mina Cikara
has been thinking
350
00:19:23,759 --> 00:19:28,897
about how human beings move
from individuals, to groups
351
00:19:28,898 --> 00:19:31,399
to bitter, violent rivals.
352
00:19:31,400 --> 00:19:34,302
Imagine a group
of perfect strangers.
353
00:19:34,303 --> 00:19:39,206
It takes very little for them
to form devout tribal alliances.
354
00:19:39,207 --> 00:19:41,842
Well, one of the most amazing
things about humans is
355
00:19:41,843 --> 00:19:43,611
how readily they form groups.
356
00:19:43,612 --> 00:19:46,313
In fact, psychological research
has shown
357
00:19:46,314 --> 00:19:48,349
that you can
randomly assign people
358
00:19:48,350 --> 00:19:50,284
to red team or blue team,
359
00:19:50,285 --> 00:19:52,521
and that's enough
to make them show
360
00:19:52,522 --> 00:19:54,321
what we call ingroup bias.
361
00:19:54,322 --> 00:19:57,693
They prefer their ingroup,
they treat them better,
362
00:19:57,694 --> 00:20:00,162
they devote
more resources to them,
363
00:20:00,163 --> 00:20:02,899
and, in general, it's just
a part of human nature.
364
00:20:07,369 --> 00:20:09,670
Morgan:
Since the dawn of humanity,
365
00:20:09,671 --> 00:20:14,643
we have needed the support
of others to thrive and survive.
366
00:20:14,644 --> 00:20:17,613
So when two groups come
into direct competition,
367
00:20:17,614 --> 00:20:21,383
no matter how arbitrarily
those groups were formed,
368
00:20:21,384 --> 00:20:24,453
the individuals will put
the needs of the group
369
00:20:24,454 --> 00:20:26,221
above themselves.
370
00:20:26,222 --> 00:20:29,192
A line is drawn in the sand.
371
00:20:36,032 --> 00:20:37,666
Out.
Nice job.
372
00:20:37,667 --> 00:20:39,967
"Out"?
What... give me a break.
373
00:20:39,968 --> 00:20:44,272
All-out violence needs
only a little provocation.
374
00:20:44,273 --> 00:20:45,674
You were out.
375
00:20:45,675 --> 00:20:46,875
[ Scoffs ]
376
00:20:46,876 --> 00:20:49,511
A dose of escalation...
377
00:20:49,512 --> 00:20:51,112
And both sides will...
378
00:20:51,113 --> 00:20:52,214
[ All shouting indistinctly ]
379
00:20:52,215 --> 00:20:53,481
Charge.
380
00:20:53,482 --> 00:20:55,483
[ Shouting continues ]
381
00:20:55,484 --> 00:20:57,484
[ Upbeat music plays ]
382
00:20:57,485 --> 00:20:59,522
[ Crashing, thumping ]
383
00:21:01,956 --> 00:21:06,628
In general, people are averse
to treating other people badly.
384
00:21:06,629 --> 00:21:08,997
But that's the thing
about competition.
385
00:21:08,998 --> 00:21:10,732
Being a good ingroup member
386
00:21:10,733 --> 00:21:12,466
means being a jerk
to the outgroup.
387
00:21:12,467 --> 00:21:13,768
Aah!
388
00:21:13,769 --> 00:21:16,072
It's not just that you want
your own team to do well.
389
00:21:16,073 --> 00:21:18,939
It's that you have to make sure
that the other team doesn't.
390
00:21:18,940 --> 00:21:20,009
Aaaaaah!
391
00:21:20,010 --> 00:21:21,476
Aaaah!
392
00:21:21,477 --> 00:21:24,445
Morgan: Mina wants to know
why this desire
393
00:21:24,446 --> 00:21:27,782
to persecute the other overrides
our better judgment.
394
00:21:27,783 --> 00:21:29,285
Aaaaah!
395
00:21:32,921 --> 00:21:37,091
These two are
a couple of stand-up guys.
396
00:21:37,092 --> 00:21:40,595
They certainly would never beat
each other into a pulp...
397
00:21:40,596 --> 00:21:42,098
Unless it's game day.
398
00:21:43,032 --> 00:21:46,667
Today Mina is going
to scan their brains
399
00:21:46,668 --> 00:21:49,437
as they watch
their rival team suffer.
400
00:21:49,438 --> 00:21:50,704
So, what we did was we recruited
401
00:21:50,705 --> 00:21:53,841
18 die-hard Red Sox
and Yankees fans.
402
00:21:53,842 --> 00:21:57,312
And the idea was
we wanted people to watch plays
403
00:21:57,313 --> 00:22:00,214
where their rivals did poorly
against a third team,
404
00:22:00,215 --> 00:22:00,982
the Orioles.
405
00:22:00,983 --> 00:22:01,849
[ Crowd cheering ]
406
00:22:01,850 --> 00:22:04,019
The Red Sox fan watches a video
407
00:22:04,020 --> 00:22:07,188
where Alex Rodriguez
of the Yankees is pelted
408
00:22:07,189 --> 00:22:09,357
by a 100-mile-per-hour
fastball.
409
00:22:09,358 --> 00:22:10,559
[ Baseball thuds,
crowd groans ]
410
00:22:10,560 --> 00:22:13,762
Man: Ooh!
That's gonna leave a mark.
411
00:22:13,763 --> 00:22:17,366
Morgan: The Yankees fan gets
to enjoy an embarrassing mistake
412
00:22:17,367 --> 00:22:21,537
that cost the Red Sox three runs
in a single play.
413
00:22:21,538 --> 00:22:23,538
[ Baseball bat cracks,
crowd groans ]
414
00:22:23,539 --> 00:22:26,308
Man: What a disaster.
415
00:22:26,309 --> 00:22:30,345
Ooh, how embarrassing
for the Red Sox.
416
00:22:30,346 --> 00:22:31,646
Morgan: Mina discovered
417
00:22:31,647 --> 00:22:35,215
that this feeling of pleasure
at our rival's pain,
418
00:22:35,216 --> 00:22:38,387
what the Germans call
schadenfreude,
419
00:22:38,388 --> 00:22:42,256
is something our brains learn
to crave.
420
00:22:42,257 --> 00:22:44,859
When participants watch
their rivals fail,
421
00:22:44,860 --> 00:22:46,561
what we saw was
that there was activation
422
00:22:46,562 --> 00:22:48,730
of the region called
the ventral striatum.
423
00:22:48,731 --> 00:22:51,400
The way that this region
purportedly works is
424
00:22:51,401 --> 00:22:54,336
that it basically tags
positive information,
425
00:22:54,337 --> 00:22:57,539
rewarding events,
so that people then can say,
426
00:22:57,540 --> 00:23:00,908
"oh, I should come back to this
in order to get pleasure again."
427
00:23:00,909 --> 00:23:05,813
Morgan: The ventral striatum is
at the core of many addictions.
428
00:23:05,814 --> 00:23:08,182
When a smoker sees a cigarette,
429
00:23:08,183 --> 00:23:11,720
their ventral striatum
reminds them of its pleasures.
430
00:23:11,721 --> 00:23:15,524
And just as a cigarette a day
can soon become a pack a day,
431
00:23:15,525 --> 00:23:17,692
couldn't seeing
your rival suffer
432
00:23:17,693 --> 00:23:20,462
make you want to see it happen
more and more?
433
00:23:20,463 --> 00:23:21,997
Cikara:
So, the question then is
434
00:23:21,998 --> 00:23:24,765
whether or not watching
your rival suffer misfortune
435
00:23:24,766 --> 00:23:26,801
makes you then more likely
to endorse harm
436
00:23:26,802 --> 00:23:28,803
or actually do harm
to the rival team
437
00:23:28,804 --> 00:23:30,772
and affiliated individuals.
438
00:23:30,773 --> 00:23:32,942
Well, we have evidence
to suggest that it does.
439
00:23:34,275 --> 00:23:36,211
Morgan:
What troubles Mina is
440
00:23:36,212 --> 00:23:38,515
that this line
of group-oriented thinking
441
00:23:38,516 --> 00:23:41,281
extends beyond sports teams.
442
00:23:41,282 --> 00:23:44,485
In fact,
we see everybody belonging
443
00:23:44,486 --> 00:23:47,422
to one
of four social categories.
444
00:23:47,423 --> 00:23:50,024
The first time you meet
a new person or a group,
445
00:23:50,025 --> 00:23:52,026
there are two questions
you need answered right away.
446
00:23:52,027 --> 00:23:53,727
The first is "friend or foe,"
447
00:23:53,728 --> 00:23:55,797
and the second is,
"how capable are they
448
00:23:55,798 --> 00:23:58,666
of enacting their intentions
toward me, good or ill?"
449
00:23:58,667 --> 00:24:01,636
Morgan: First, there are
the friendly groups.
450
00:24:01,637 --> 00:24:04,005
Bright, young kids and doctors,
for example,
451
00:24:04,006 --> 00:24:06,442
we usually see as competent.
452
00:24:06,443 --> 00:24:10,445
Less capable friendly groups,
like the elderly and infirm,
453
00:24:10,446 --> 00:24:12,914
usually invoke pity.
454
00:24:12,915 --> 00:24:15,584
Drug addicts
or teenage Internet bullies,
455
00:24:15,585 --> 00:24:17,919
we categorize as foe.
456
00:24:17,920 --> 00:24:20,422
But these groups aren't
competent enough
457
00:24:20,423 --> 00:24:22,991
for us to spend much energy hating.
458
00:24:22,992 --> 00:24:27,928
It's the people seen both
as foe and highly competent
459
00:24:27,929 --> 00:24:31,099
who stir the strong urges
toward bigotry.
460
00:24:31,100 --> 00:24:34,903
This includes investment bankers
but also groups like Asians
461
00:24:34,904 --> 00:24:38,308
or professional women in domains
where men generally dominate.
462
00:24:39,975 --> 00:24:42,109
Morgan: Mina studied hundreds
of subjects
463
00:24:42,110 --> 00:24:46,580
who report feeling pleasure when
members of these groups suffer.
464
00:24:46,581 --> 00:24:48,649
Cikara: When you ask them
who they're most likely to harm,
465
00:24:48,650 --> 00:24:51,252
to just hurt, not actually kill,
466
00:24:51,253 --> 00:24:53,588
they're most willing
to harm these competent groups
467
00:24:53,589 --> 00:24:56,457
that are competitive
with our own interests.
468
00:24:56,458 --> 00:24:58,893
So what's really interesting
about these groups is
469
00:24:58,894 --> 00:25:00,695
that, in times
of social stability,
470
00:25:00,696 --> 00:25:02,696
people go along
to get along with them
471
00:25:02,697 --> 00:25:04,666
because they control resources.
472
00:25:04,667 --> 00:25:07,403
But they're also the first ones
to get scapegoated
473
00:25:07,404 --> 00:25:09,705
when social relations
become unstable.
474
00:25:10,821 --> 00:25:12,624
Morgan:
Being part of a group is
475
00:25:12,625 --> 00:25:15,326
an unavoidable part
of being human.
476
00:25:15,327 --> 00:25:18,328
But groupism does more
than just block
477
00:25:18,329 --> 00:25:20,731
our natural empathy for others.
478
00:25:20,732 --> 00:25:23,867
When it involves
a political agenda,
479
00:25:23,868 --> 00:25:26,937
groupism may actually hack
our brains
480
00:25:26,938 --> 00:25:29,807
into perceiving a false reality.
481
00:25:29,808 --> 00:25:32,609
Do you see the world
as it really is
482
00:25:32,610 --> 00:25:37,116
or how your political party
wants you to see it?
483
00:25:38,821 --> 00:25:40,989
We're a tribal species,
484
00:25:40,990 --> 00:25:43,660
and we all want
to be in the winning tribe.
485
00:25:45,228 --> 00:25:49,163
But surely humanity can aspire
to rise above this,
486
00:25:49,164 --> 00:25:52,534
to bridge the divide
between us.
487
00:25:52,535 --> 00:25:55,304
Democracy was founded
on the principle of equality,
488
00:25:55,305 --> 00:25:58,739
that we could reach
across the aisle and compromise.
489
00:25:58,740 --> 00:26:00,643
But with
every passing year,
490
00:26:00,644 --> 00:26:02,544
political parties seem
to be getting
491
00:26:02,545 --> 00:26:05,314
more and more divided.
492
00:26:05,315 --> 00:26:09,485
Maybe it's because
conservatives are bigoted
493
00:26:09,486 --> 00:26:12,587
against liberals.
494
00:26:12,588 --> 00:26:14,790
I think
you have it backward.
495
00:26:19,729 --> 00:26:22,631
Darren Schreiber is
an American political scientist
496
00:26:22,632 --> 00:26:25,800
now working in Exeter, U.K.
497
00:26:25,801 --> 00:26:27,402
If there's one thing
he's learned
498
00:26:27,403 --> 00:26:29,604
from moving across the pond,
499
00:26:29,605 --> 00:26:32,172
it's that no matter
where you go,
500
00:26:32,173 --> 00:26:37,113
liberals and conservatives are
not from the same planet.
501
00:26:37,114 --> 00:26:38,813
Do you see yourself
as being more liberal
502
00:26:38,814 --> 00:26:39,882
or more conservative?
503
00:26:39,883 --> 00:26:41,350
Definitely
more conservative.
504
00:26:41,351 --> 00:26:42,950
I'd see myself
more liberal.
505
00:26:42,951 --> 00:26:44,752
Military intervention
in the middle east...
506
00:26:44,753 --> 00:26:45,754
How do you feel
about that?
507
00:26:45,755 --> 00:26:47,156
I think
it's absolutely fundamental.
508
00:26:47,157 --> 00:26:50,325
Each country should be allowed
to determine their own future.
509
00:26:50,326 --> 00:26:52,294
What do you think
about immigration policy?
510
00:26:52,295 --> 00:26:53,862
The borders need
to be slightly more closed.
511
00:26:53,863 --> 00:26:55,599
We should have
an open-border policy.
512
00:26:55,600 --> 00:27:00,102
Liberals and conservatives
rarely see eye to eye.
513
00:27:00,103 --> 00:27:04,472
Could it be that they have
different brains?
514
00:27:04,473 --> 00:27:07,575
Darren decided he would try
to uncover the truth
515
00:27:07,576 --> 00:27:10,945
by using an MRI brain scanner
to see
516
00:27:10,946 --> 00:27:14,016
how the brains of liberals
and conservatives handled
517
00:27:14,017 --> 00:27:17,818
decision-making
in a simple gambling game.
518
00:27:17,819 --> 00:27:22,156
Today, Darren and his students
are re-creating that experiment
519
00:27:22,157 --> 00:27:24,692
but without the MRI.
520
00:27:24,693 --> 00:27:29,632
Each test subject is given £1...
About a buck and a half.
521
00:27:29,633 --> 00:27:33,134
They can keep the money,
or they can gamble with it.
522
00:27:33,135 --> 00:27:35,569
There's a chance
to double the winnings
523
00:27:35,570 --> 00:27:38,707
but also a risk
of losing it all.
524
00:27:38,708 --> 00:27:41,209
Do you want to keep that £1,
or do you want to take a risk
525
00:27:41,210 --> 00:27:43,412
at potentially winning
or losing £2?
526
00:27:43,413 --> 00:27:44,813
I will risk £2.
527
00:27:44,814 --> 00:27:47,214
All right,
so open up the envelope
528
00:27:47,215 --> 00:27:48,684
and see what you get.
529
00:27:50,452 --> 00:27:51,652
I've won £2.
530
00:27:51,653 --> 00:27:52,755
Good. All right.
531
00:27:52,756 --> 00:27:54,021
So here's £2.
532
00:27:54,022 --> 00:27:55,423
Do you want to keep
the £1,
533
00:27:55,424 --> 00:27:57,425
or do you want
to risk potentially winning
534
00:27:57,426 --> 00:27:58,727
or losing £2?
535
00:27:58,728 --> 00:28:01,331
I think I'll take a risk
at winning £2.
536
00:28:01,332 --> 00:28:02,298
Okay.
537
00:28:03,200 --> 00:28:04,098
I've won £2.
538
00:28:04,099 --> 00:28:05,432
Whoo-hoo.
Congratulations.
539
00:28:05,433 --> 00:28:07,536
So, here's your £2.
540
00:28:07,537 --> 00:28:08,702
[ Laughing ]
I've lost £1.
541
00:28:08,703 --> 00:28:10,005
You've lost.
All right.
542
00:28:10,006 --> 00:28:12,108
Well, you have
to give me £4 now.
543
00:28:12,109 --> 00:28:13,442
Thanks, Sophie.
544
00:28:14,544 --> 00:28:17,013
Darren doesn't care
who wins or loses.
545
00:28:17,014 --> 00:28:20,149
His only interest is
in how their brains deal
546
00:28:20,150 --> 00:28:22,816
with taking gambling risks,
547
00:28:22,817 --> 00:28:26,221
an act that has
no intrinsic political slant.
548
00:28:26,222 --> 00:28:30,793
But to his surprise, liberals
and conservatives processed risk
549
00:28:30,794 --> 00:28:33,930
with wildly different regions
of their brains.
550
00:28:35,897 --> 00:28:39,367
Conservative brains consistently
use the amygdala
551
00:28:39,368 --> 00:28:42,770
to make the decisions
to risk everything.
552
00:28:42,771 --> 00:28:45,606
This region is associated
with gut feeling
553
00:28:45,607 --> 00:28:48,009
and fight-or-flight responses.
554
00:28:48,010 --> 00:28:50,112
Red brains experienced risk
555
00:28:50,113 --> 00:28:53,150
as a threat
with a potential reward.
556
00:28:55,050 --> 00:28:59,287
Liberal brains consistently use
the insula when gambling,
557
00:28:59,288 --> 00:29:00,823
a region associated
558
00:29:00,824 --> 00:29:03,392
with perception
of one's own feelings.
559
00:29:03,393 --> 00:29:08,563
Blue brains experienced risk
as a problem to be solved.
560
00:29:08,564 --> 00:29:12,902
Both brains end up at similar
conclusions about taking risks,
561
00:29:12,903 --> 00:29:17,471
but their brains experience it
differently.
562
00:29:17,472 --> 00:29:21,043
What it tells us is
that being a political liberal
563
00:29:21,044 --> 00:29:23,612
or a political conservative
influences
564
00:29:23,613 --> 00:29:25,780
everything that we see,
565
00:29:25,781 --> 00:29:27,882
that we see the world
in really different ways,
566
00:29:27,883 --> 00:29:30,818
we use different mental tools
when we're processing
567
00:29:30,819 --> 00:29:32,520
even basic things like gambling
568
00:29:32,521 --> 00:29:35,524
that appear to have nothing
to do with politics.
569
00:29:35,525 --> 00:29:36,826
And that just blew our minds.
570
00:29:37,559 --> 00:29:39,595
Morgan:
So, why is this?
571
00:29:39,596 --> 00:29:41,796
Are we born
with our future politics
572
00:29:41,797 --> 00:29:45,700
already fixed
in the structure of our brains?
573
00:29:45,701 --> 00:29:47,102
We're unsure about this,
574
00:29:47,103 --> 00:29:48,836
and I actually have a study
where we're gonna be looking
575
00:29:48,837 --> 00:29:51,672
to see with a data set
that has studied children
576
00:29:51,673 --> 00:29:53,408
from age 4 to 20,
577
00:29:53,409 --> 00:29:55,210
the differences
in people's brains
578
00:29:55,211 --> 00:29:57,479
and whether they change over time.
579
00:29:57,480 --> 00:29:59,716
But what we know right now is
that people,
580
00:29:59,717 --> 00:30:01,216
when they're around 20,
581
00:30:01,217 --> 00:30:03,820
seem to have different sizes
of amygdala and insula
582
00:30:03,821 --> 00:30:06,954
depending on whether they're
liberals or conservatives.
583
00:30:06,955 --> 00:30:09,923
Morgan: Darren expects
that, by the time we are 20,
584
00:30:09,924 --> 00:30:11,692
most of us will have solidified
585
00:30:11,693 --> 00:30:14,895
the color
of our political brain.
586
00:30:14,896 --> 00:30:16,664
The possibility of harmony
587
00:30:16,665 --> 00:30:18,866
between liberals
and conservatives is
588
00:30:18,867 --> 00:30:22,002
therefore unlikely.
589
00:30:22,003 --> 00:30:24,605
Politicians and politically
active citizens cannot
590
00:30:24,606 --> 00:30:27,843
truly see things
from the other side.
591
00:30:27,844 --> 00:30:31,145
Their brains won't let them.
592
00:30:31,146 --> 00:30:35,284
Our intense partisanship is
probably here to stay.
593
00:30:37,019 --> 00:30:39,688
But there may be a way
to reshape our brains
594
00:30:39,689 --> 00:30:44,325
to cherish the greater good
of all mankind
595
00:30:44,326 --> 00:30:48,997
and open our minds and hearts
to one another
596
00:30:48,998 --> 00:30:51,766
if we all play
597
00:30:51,767 --> 00:30:55,437
ultra-violent video games.
598
00:30:55,438 --> 00:30:57,538
[ Gunfire ]
599
00:30:57,539 --> 00:30:58,874
[ Gun cocks ]
600
00:31:01,821 --> 00:31:04,890
The world is full of hate.
601
00:31:04,891 --> 00:31:08,493
Eliminating bigotry seems
hopeless.
602
00:31:08,494 --> 00:31:11,130
But there may be a way...
603
00:31:11,131 --> 00:31:17,670
Pure, unadulterated violence
in an alternate reality.
604
00:31:18,771 --> 00:31:20,072
[ Gunfire ]
Man: Aah!
605
00:31:20,073 --> 00:31:22,809
Professor Matthew Grizzard has
spent a lot of time
606
00:31:22,810 --> 00:31:24,610
playing video games.
607
00:31:24,611 --> 00:31:25,678
[ Gunfire ]
608
00:31:25,679 --> 00:31:27,746
But
as a communications researcher,
609
00:31:27,747 --> 00:31:32,751
he knows these mediated realities are
more than just entertainment.
610
00:31:32,752 --> 00:31:34,151
[ Gunfire ]
611
00:31:34,152 --> 00:31:36,053
We don't necessarily distinguish
very much
612
00:31:36,054 --> 00:31:40,659
between mediated reality
and real reality.
613
00:31:40,660 --> 00:31:42,026
We see visual elements.
614
00:31:42,027 --> 00:31:45,097
We hear things,
auditory elements.
615
00:31:45,098 --> 00:31:47,232
And our bodies respond
to those elements
616
00:31:47,233 --> 00:31:48,766
as if they were real.
617
00:31:48,767 --> 00:31:50,901
Morgan:
Immersed in a game,
618
00:31:50,902 --> 00:31:53,104
Matthew can feel
his pulse pounding
619
00:31:53,105 --> 00:31:56,909
and his stomach churning from
the intensity of the experience.
620
00:31:56,910 --> 00:31:58,310
[ Gunfire ]
621
00:31:58,311 --> 00:32:04,016
But one day, a level in a game
downright disturbed him.
622
00:32:07,186 --> 00:32:10,188
Grizzard:
So, I'm in an elevator.
623
00:32:10,189 --> 00:32:11,657
And I'm looking around,
624
00:32:11,658 --> 00:32:15,462
and there's a lot of armed men
in military fatigues with me.
625
00:32:17,729 --> 00:32:19,397
At that point,
elevator doors open.
626
00:32:19,398 --> 00:32:20,766
[ Elevator bell dings ]
627
00:32:23,736 --> 00:32:26,405
And we step out into what
appears to be a crowded airport.
628
00:32:26,406 --> 00:32:29,142
[ Intercom chatter,
indistinct conversations ]
629
00:32:29,143 --> 00:32:31,409
And at that point,
the order is given,
630
00:32:31,410 --> 00:32:35,080
and we raise our guns
to point at innocent civilians
631
00:32:35,081 --> 00:32:37,047
surrounding this airport.
632
00:32:37,048 --> 00:32:40,118
[ Gun cocks ]
633
00:32:40,119 --> 00:32:41,753
And then, we start firing.
634
00:32:41,754 --> 00:32:43,622
[ Gunfire,
people screaming ]
635
00:32:51,831 --> 00:32:53,233
[ People moaning, sobbing ]
636
00:32:56,436 --> 00:32:57,703
Morgan: Matthew felt guilty
637
00:32:57,704 --> 00:33:01,907
for murdering
so many innocent pixels.
638
00:33:01,908 --> 00:33:06,277
But as a social psychologist,
he knows that guilt is a feeling
639
00:33:06,278 --> 00:33:10,082
that can profoundly change
our behavior.
640
00:33:10,083 --> 00:33:11,617
Grizzard:
We wanted to see if this guilt
641
00:33:11,618 --> 00:33:13,251
that was elicited
from virtual environments
642
00:33:13,252 --> 00:33:16,055
could cause people to think more
about real-world morality
643
00:33:16,056 --> 00:33:18,723
and could actually increase
their moral sensitivity
644
00:33:18,724 --> 00:33:20,159
to real-world issues.
645
00:33:21,393 --> 00:33:24,361
Morgan: Media pundits
often accuse violent video games
646
00:33:24,362 --> 00:33:27,666
of destroying
the morality of our youth.
647
00:33:27,667 --> 00:33:29,735
Is that really true?
648
00:33:29,736 --> 00:33:33,237
Matthew has a series
of test subjects play a game
649
00:33:33,238 --> 00:33:37,176
where they can hurt
simulated human beings.
650
00:33:37,177 --> 00:33:39,378
So, Matthew gives the order
651
00:33:39,379 --> 00:33:44,282
to commit blatant crimes
against humanity.
652
00:33:44,283 --> 00:33:47,052
So, we set up a situation
where people are gonna play
653
00:33:47,053 --> 00:33:48,719
a violent first-person shooter
654
00:33:48,720 --> 00:33:51,156
where they're engaging
in terrorist behaviors,
655
00:33:51,157 --> 00:33:53,192
where they're
committing genocides
656
00:33:53,193 --> 00:33:55,026
and they're killing
innocent civilians,
657
00:33:55,027 --> 00:33:57,294
they're bombing areas,
they're engaging in things
658
00:33:57,295 --> 00:34:01,332
that would be considered morally
reprehensible in the real world.
659
00:34:01,333 --> 00:34:03,134
Morgan:
Matthew's subjects surrender
660
00:34:03,135 --> 00:34:07,504
to the alternative reality
of the game.
661
00:34:07,505 --> 00:34:08,505
[ Gun cocks ]
662
00:34:08,506 --> 00:34:10,107
Inside their minds...
663
00:34:10,108 --> 00:34:11,210
[ Gun cocks ]
664
00:34:11,211 --> 00:34:13,446
They are living
through the experience
665
00:34:13,447 --> 00:34:15,214
of being a mass murderer.
666
00:34:17,583 --> 00:34:19,951
The game is guilt-inducing...
[ Gun cocks ]
667
00:34:19,952 --> 00:34:21,787
To say the least.
668
00:34:26,291 --> 00:34:29,428
So we also had a control group
because we wanted to see
669
00:34:29,429 --> 00:34:31,362
and distinguish
video-game-induced guilt
670
00:34:31,363 --> 00:34:33,598
from real-world guilt.
671
00:34:33,599 --> 00:34:36,268
So we had individuals remember
a situation
672
00:34:36,269 --> 00:34:38,937
in which they felt
particularly guilty.
673
00:34:38,938 --> 00:34:42,975
Morgan: Writing this out is
emotionally taxing.
674
00:34:42,976 --> 00:34:45,009
It brings back painful memories,
675
00:34:45,010 --> 00:34:48,212
perhaps of the time
they cheated on a lover
676
00:34:48,213 --> 00:34:51,516
or lied
and got someone else in trouble
677
00:34:51,517 --> 00:34:55,187
or sabotaged a friend
for selfish gain.
678
00:34:55,188 --> 00:35:00,658
In every case,
real people were really hurt.
679
00:35:00,659 --> 00:35:04,830
Matthew compared this group
to the murder-simulator group.
680
00:35:06,599 --> 00:35:09,634
So, our findings showed
that individuals recalling
681
00:35:09,635 --> 00:35:13,638
a real-world guilty experience
actually felt more guilt
682
00:35:13,639 --> 00:35:15,708
but that the guilt elicited
by the video game
683
00:35:15,709 --> 00:35:19,712
was positively associated with
increased moral sensitivity.
684
00:35:19,713 --> 00:35:22,580
Morgan:
Committing virtual mass murder
685
00:35:22,581 --> 00:35:26,016
gave his subjects
a stronger sense of morality.
686
00:35:26,017 --> 00:35:27,051
[ Gun cocks ]
687
00:35:27,052 --> 00:35:28,752
It's a surprising result,
688
00:35:28,753 --> 00:35:32,123
but Matthew thinks
he knows why it's the case.
689
00:35:32,124 --> 00:35:36,962
The players violated their
own personal sense of fairness.
690
00:35:36,963 --> 00:35:40,765
They cannot right the wrongs
they have committed,
691
00:35:40,766 --> 00:35:44,569
so they atone
with a subconscious desire
692
00:35:44,570 --> 00:35:47,172
to be a better person.
693
00:35:47,173 --> 00:35:49,741
I think that's
the real power of video games.
694
00:35:49,742 --> 00:35:52,043
You can think of them
as kind of moral sandboxes,
695
00:35:52,044 --> 00:35:55,346
as areas where we can explore
different aspects of morality
696
00:35:55,347 --> 00:35:57,181
or even take viewpoints
that are opposed
697
00:35:57,182 --> 00:35:59,417
to our very core of morality.
698
00:35:59,418 --> 00:36:02,420
Morgan: But it's hard to imagine
everyone agreeing
699
00:36:02,421 --> 00:36:06,056
to play
guilt-inducing video games.
700
00:36:06,057 --> 00:36:09,094
And there will
always be sociopaths
701
00:36:09,095 --> 00:36:11,362
whose bigotry spreads
through society
702
00:36:11,363 --> 00:36:12,763
like a deadly virus.
703
00:36:12,764 --> 00:36:17,902
Could it be that we're just
too tolerant of intolerance?
704
00:36:17,903 --> 00:36:21,874
What would really happen if
we cut off the worst offenders?
705
00:36:23,276 --> 00:36:24,644
Could we ever do it?
706
00:36:29,881 --> 00:36:32,683
In the past century,
707
00:36:32,684 --> 00:36:36,754
we've broken down
a lot of walls that divided us.
708
00:36:36,755 --> 00:36:40,824
More social groups have been
accepted into the mainstream.
709
00:36:40,825 --> 00:36:45,062
Segregation and prejudice are
no longer the laws of the land.
710
00:36:45,063 --> 00:36:47,498
But there are still those
who think they're superior
711
00:36:47,499 --> 00:36:52,337
because of their skin color,
their age, or their gender.
712
00:36:52,338 --> 00:36:54,906
There may be a way to deal
713
00:36:54,907 --> 00:36:56,977
with these bigots
and their bigotry...
714
00:36:59,144 --> 00:37:01,246
Build walls around them.
715
00:37:04,483 --> 00:37:08,652
Sociologist and physician
Nicholas Christakis is taking
716
00:37:08,653 --> 00:37:12,190
a bird's-eye view
of human society.
717
00:37:12,191 --> 00:37:13,757
From his perspective,
718
00:37:13,758 --> 00:37:16,728
when bigots flourish
in a social network,
719
00:37:16,729 --> 00:37:19,665
it is partly the fault
of the group itself.
720
00:37:19,666 --> 00:37:22,132
Christakis: So we're embedded
in these networks.
721
00:37:22,133 --> 00:37:23,801
How we act
in the world is affected
722
00:37:23,802 --> 00:37:26,069
by how the people we know act
723
00:37:26,070 --> 00:37:28,574
but also even
by how people we don't know act
724
00:37:28,575 --> 00:37:30,174
as things ripple
through the network
725
00:37:30,175 --> 00:37:32,345
and come to affect us.
726
00:37:33,021 --> 00:37:36,323
Morgan: Nicholas and long-time
collaborator James Fowler are
727
00:37:36,324 --> 00:37:39,825
studying how social connections
change the behavior
728
00:37:39,826 --> 00:37:42,928
of the group as a whole.
729
00:37:42,929 --> 00:37:45,298
Christakis: When you add
these ties between people,
730
00:37:45,299 --> 00:37:48,433
the particular number of ties,
the particular pattern of ties
731
00:37:48,434 --> 00:37:50,336
then confers on the group
732
00:37:50,337 --> 00:37:53,674
properties it didn't
necessarily have before.
733
00:37:53,675 --> 00:37:56,609
Morgan:
Patterns and wiring matter.
734
00:37:56,610 --> 00:37:58,310
Think of the Internet.
735
00:37:58,311 --> 00:38:01,147
It's a highly dynamic network
of computers.
736
00:38:01,148 --> 00:38:03,183
If one area goes down,
737
00:38:03,184 --> 00:38:06,252
the data simply move
around the blockage.
738
00:38:06,253 --> 00:38:09,923
But the electrical grid is
more vulnerable.
739
00:38:09,924 --> 00:38:14,061
One bad node can damage all of
its neighboring connections...
740
00:38:16,029 --> 00:38:20,033
Or even bring
the whole network down.
741
00:38:20,034 --> 00:38:21,901
Nicholas is
performing experiments
742
00:38:21,902 --> 00:38:25,071
to see if he can re-engineer
human social networks
743
00:38:25,072 --> 00:38:27,339
to be more like the Internet.
744
00:38:27,340 --> 00:38:30,642
His student volunteers
form a single social network
745
00:38:30,643 --> 00:38:33,513
divided
into four separate groups.
746
00:38:33,514 --> 00:38:38,551
Every student's goal is to make
as much money as possible.
747
00:38:38,552 --> 00:38:40,285
Okay, guys, so here's
what we're gonna do.
748
00:38:40,286 --> 00:38:42,654
Each of you has a dollar
and a pad in front of you.
749
00:38:42,655 --> 00:38:44,991
In a moment, I'm gonna ask you
to write "give" or "take."
750
00:38:44,992 --> 00:38:46,560
You can contribute
to the collective,
751
00:38:46,561 --> 00:38:48,294
or you can be a parasite
and take advantage
752
00:38:48,295 --> 00:38:49,863
of the collective...
"Give" or "take."
753
00:38:49,864 --> 00:38:52,165
Then you'll be asked
to reveal your choice
754
00:38:52,166 --> 00:38:54,702
and to contribute your dollar
or not towards the middle.
755
00:38:54,703 --> 00:38:55,935
I will double the pot,
756
00:38:55,936 --> 00:38:57,370
and then we'll divide
the money equally
757
00:38:57,371 --> 00:38:59,339
amongst everyone at every table.
758
00:38:59,340 --> 00:39:00,973
If everybody shares,
759
00:39:00,974 --> 00:39:03,910
everybody makes
a good amount of money.
760
00:39:03,911 --> 00:39:07,446
If nobody shares,
nobody makes any money.
761
00:39:07,447 --> 00:39:09,083
So, reveal.
762
00:39:10,617 --> 00:39:11,917
All right.
You're all givers.
763
00:39:11,918 --> 00:39:14,287
So make your contributions,
as well, accordingly.
764
00:39:14,288 --> 00:39:15,388
And so, what we're gonna do is
765
00:39:15,389 --> 00:39:19,625
we're gonna double that
and share.
766
00:39:19,626 --> 00:39:22,395
Morgan: At first,
most are nice to each other
767
00:39:22,396 --> 00:39:25,031
and share in the rewards.
768
00:39:25,032 --> 00:39:28,401
But a few players choose
not to contribute.
769
00:39:28,402 --> 00:39:32,705
When the pot is doubled,
they get more than their peers.
770
00:39:32,706 --> 00:39:34,174
In these rounds,
771
00:39:34,175 --> 00:39:37,812
the students cannot change
their society's social wiring.
772
00:39:37,813 --> 00:39:39,946
So you're assigned
a position in the network,
773
00:39:39,947 --> 00:39:41,848
and you're told,
"these are your neighbors.
774
00:39:41,849 --> 00:39:44,551
These are your friends
for the next hour.
775
00:39:44,552 --> 00:39:47,086
And you're stuck now interacting
with these jerks."
776
00:39:47,087 --> 00:39:49,890
And you don't like it,
but all you can do is control
777
00:39:49,891 --> 00:39:54,060
whether you cooperate or defect,
whether you share or take.
778
00:39:54,061 --> 00:39:55,696
And what happens
after you've been sharing
779
00:39:55,697 --> 00:39:57,131
and some of the people
around you are taking is
780
00:39:57,132 --> 00:39:59,633
you say,
"I'm not gonna do that anymore,"
781
00:39:59,634 --> 00:40:02,067
and you switch to taking,
as well.
782
00:40:02,068 --> 00:40:04,537
And sharing disappears
from the system.
783
00:40:04,538 --> 00:40:07,073
Cooperation goes away.
784
00:40:07,074 --> 00:40:10,243
And what you find is that you
have then a society of takers.
785
00:40:10,244 --> 00:40:14,481
Morgan: But now Nicholas adds
one important rule.
786
00:40:14,482 --> 00:40:16,482
After each round,
787
00:40:16,483 --> 00:40:20,419
each student has
the option of moving seats.
788
00:40:20,420 --> 00:40:24,457
Now you can cut the ties
to the people who are defectors,
789
00:40:24,458 --> 00:40:26,459
the people who are
taking advantage of you
790
00:40:26,460 --> 00:40:27,828
and preferentially form ties
791
00:40:27,829 --> 00:40:29,462
to other nice people
in the network,
792
00:40:29,463 --> 00:40:30,698
to other cooperators.
793
00:40:30,699 --> 00:40:31,997
And if you do that,
794
00:40:31,998 --> 00:40:35,368
if you make that small change
in the social order,
795
00:40:35,369 --> 00:40:39,471
what you find is cooperation
flourishes in the system.
796
00:40:39,472 --> 00:40:41,707
Morgan: The selfish people
haven't gone away,
797
00:40:41,708 --> 00:40:45,011
but the society
that everybody now lives in
798
00:40:45,012 --> 00:40:48,583
has a completely
different culture.
799
00:40:49,617 --> 00:40:51,952
Nicholas believes humans
and societies are
800
00:40:51,953 --> 00:40:55,488
like groups of carbon atoms.
801
00:40:55,489 --> 00:40:56,790
Arrange them one way,
802
00:40:56,791 --> 00:40:59,524
and you get soft,
opaque graphite.
803
00:40:59,525 --> 00:41:03,462
But if you arrange
those same atoms just right,
804
00:41:03,463 --> 00:41:08,000
you get strong, clear,
sparkling diamond.
805
00:41:08,001 --> 00:41:10,069
And so, these properties
of softness and darkness
806
00:41:10,070 --> 00:41:12,640
aren't properties
of the carbon atoms.
807
00:41:12,641 --> 00:41:15,975
And it's just like that
with our social groups.
808
00:41:15,976 --> 00:41:19,078
The same human beings
connected in different ways
809
00:41:19,079 --> 00:41:21,348
give the group
different properties.
810
00:41:21,349 --> 00:41:23,283
Some corporations are
experimenting
811
00:41:23,284 --> 00:41:26,486
with an open social network
architecture.
812
00:41:26,487 --> 00:41:28,354
Employees are free
to break bonds
813
00:41:28,355 --> 00:41:31,191
with any other employee
they don't work well with
814
00:41:31,192 --> 00:41:33,392
and form new ones.
815
00:41:33,393 --> 00:41:36,062
Could a similar strategy work
for our national
816
00:41:36,063 --> 00:41:39,098
or even global society?
817
00:41:39,099 --> 00:41:42,667
If we were all free
to move around,
818
00:41:42,668 --> 00:41:44,803
would there be less hate?
819
00:41:44,804 --> 00:41:47,473
Christakis: Network science
doesn't offer one prescription.
820
00:41:47,474 --> 00:41:50,377
It's not as if there's
one network structure
821
00:41:50,378 --> 00:41:52,578
that's optimal for everything.
822
00:41:52,579 --> 00:41:54,915
What I can tell you is
that network structure matters
823
00:41:54,916 --> 00:41:57,785
to lots of problems,
and understanding the rules
824
00:41:57,786 --> 00:42:00,252
of social network structure
and function gives us
825
00:42:00,253 --> 00:42:03,558
a new set of tools to intervene
in the world to make it better.
826
00:42:05,259 --> 00:42:08,261
Morgan: We may not find
a solution to bigotry soon,
827
00:42:08,262 --> 00:42:12,066
but science is
at last exposing its roots...
828
00:42:13,034 --> 00:42:16,202
Our biased,
snap judgments of others,
829
00:42:16,203 --> 00:42:18,537
our innate groupism,
830
00:42:18,538 --> 00:42:21,808
our rigid political filters.
831
00:42:21,809 --> 00:42:26,979
For now, our best tool
to fight bigotry lies
832
00:42:26,980 --> 00:42:29,381
within ourselves...
833
00:42:29,382 --> 00:42:32,651
The courage to walk away.
834
00:42:32,652 --> 00:42:36,189
We all have bigotry inside us.
835
00:42:36,190 --> 00:42:40,692
Most of us work hard to suppress
our innate prejudices.
836
00:42:40,693 --> 00:42:42,696
But some don't.
837
00:42:42,697 --> 00:42:46,165
And their bigotry is infectious.
838
00:42:46,166 --> 00:42:48,334
The solution
to bigotry does not start
839
00:42:48,335 --> 00:42:49,736
with governments and laws.
840
00:42:49,737 --> 00:42:54,374
It starts with understanding
and neutralizing its source
841
00:42:54,375 --> 00:42:58,346
and with you and me doing
our best to change.