1
00:00:03,990 --> 00:00:06,020
We're at war,
2
00:00:06,030 --> 00:00:08,190
and the battle lines
have been redrawn.
3
00:00:10,860 --> 00:00:13,130
Terrorism can strike anywhere...
4
00:00:15,730 --> 00:00:18,840
...and everyone is a target.
5
00:00:18,840 --> 00:00:23,140
Science is peering into the dark
heart of terror networks
6
00:00:23,140 --> 00:00:29,010
to find out why terrorists
embrace unspeakable atrocities.
7
00:00:29,010 --> 00:00:31,580
Are they so different from us,
8
00:00:31,580 --> 00:00:33,620
or do we all have the ability
9
00:00:33,620 --> 00:00:36,890
to give our lives for a cause?
10
00:00:36,890 --> 00:00:41,060
Could we learn to
dehumanize our enemies,
11
00:00:41,060 --> 00:00:44,690
or could science show us
a way to stop terrorism...
12
00:00:47,000 --> 00:00:50,470
...and understand
what makes a terrorist?
13
00:00:56,240 --> 00:01:00,810
Space, time, life itself.
14
00:01:03,180 --> 00:01:07,520
The secrets of the cosmos
lie through the wormhole.
15
00:01:21,000 --> 00:01:22,770
When four planes dropped out
16
00:01:22,770 --> 00:01:27,140
of a clear-blue sky
on September 11th, 2001,
17
00:01:27,140 --> 00:01:30,070
it was shocking to realize
18
00:01:30,080 --> 00:01:33,510
we were all potential targets.
19
00:01:33,510 --> 00:01:37,810
Today, al-Qaeda and ISIS
claim the headlines,
20
00:01:37,820 --> 00:01:40,280
but it was once the IRA
21
00:01:40,290 --> 00:01:43,620
and Basque separatists
in Europe, or anarchists
22
00:01:43,620 --> 00:01:47,220
and the Ku Klux Klan
in the United States.
23
00:01:47,230 --> 00:01:49,030
Terrorists, like any group,
24
00:01:49,030 --> 00:01:51,260
have ideals they believe in...
25
00:01:51,260 --> 00:01:54,200
a cause to rally around.
26
00:01:54,200 --> 00:01:57,470
But for them, the cause
justifies the deliberate
27
00:01:57,470 --> 00:02:01,370
targeting of innocent civilians.
28
00:02:01,370 --> 00:02:04,470
How can they think that way?
29
00:02:04,480 --> 00:02:08,410
Can we peer inside
the mind of a terrorist?
30
00:02:18,120 --> 00:02:21,290
Anthropologist
Scott Atran has spent years
31
00:02:21,290 --> 00:02:24,390
trying to understand the enemy.
32
00:02:24,400 --> 00:02:26,260
He went to the front lines
of Iraq,
33
00:02:26,270 --> 00:02:28,770
embedding himself
with American allies
34
00:02:28,770 --> 00:02:31,740
to meet terrorists face to face.
35
00:02:31,740 --> 00:02:35,710
This is the foremost
position on the front lines.
36
00:02:35,710 --> 00:02:38,940
I'm 10 kilometers west
of Makhmur.
37
00:02:38,940 --> 00:02:41,780
So, we went to Iraq in 2016
38
00:02:41,780 --> 00:02:43,950
to follow up on a battle.
39
00:02:43,950 --> 00:02:46,580
Some 500 Kurdish
soldiers had cornered
40
00:02:46,590 --> 00:02:50,150
fewer than 100 ISIS fighters.
41
00:02:50,160 --> 00:02:52,260
When all appeared
to be lost to them
42
00:02:52,260 --> 00:02:54,260
and the last 15 were retreating,
43
00:02:54,260 --> 00:02:58,900
7 of them blew themselves up
to cover their retreat.
44
00:02:58,900 --> 00:03:02,100
The Kurdish forces, seeing this,
45
00:03:02,100 --> 00:03:04,330
decided it just wasn't
worth trying
46
00:03:04,340 --> 00:03:05,700
to hold on to the territory,
47
00:03:05,700 --> 00:03:07,600
even though
they vastly outnumbered
48
00:03:07,610 --> 00:03:09,540
the Islamic State fighters.
49
00:03:09,540 --> 00:03:12,540
Scott wanted to know
how such a small group
50
00:03:12,540 --> 00:03:16,280
was able to hold off
a much more formidable force.
51
00:03:16,280 --> 00:03:19,650
He sat down to interview
captured ISIS fighters
52
00:03:19,650 --> 00:03:22,020
and put them through
a battery of tests
53
00:03:22,020 --> 00:03:24,350
to assess the strength
of the group.
54
00:03:28,130 --> 00:03:30,230
Yes.
55
00:03:30,230 --> 00:03:33,460
Scott discovered there
was an incredible bond
56
00:03:33,470 --> 00:03:35,600
between these fighters.
57
00:03:35,600 --> 00:03:39,000
They would do anything
for their cause.
58
00:03:41,570 --> 00:03:42,910
Scott has been studying
59
00:03:42,910 --> 00:03:45,840
the dynamics of groups
for years.
60
00:03:45,840 --> 00:03:48,580
Humans are
the preeminent group animal.
61
00:03:48,580 --> 00:03:52,250
That's what allows us
to build things like that.
62
00:03:52,250 --> 00:03:53,780
It's a collective effort.
63
00:03:53,790 --> 00:03:54,920
They have to coordinate,
64
00:03:54,920 --> 00:03:57,090
and they have to take risks
for one another.
65
00:03:57,090 --> 00:04:00,290
In fact, they have to be able
to sacrifice for one another.
66
00:04:00,290 --> 00:04:03,860
One World Trade stands
as a testament to America's
67
00:04:03,860 --> 00:04:07,900
resolve after September 11th.
68
00:04:07,900 --> 00:04:10,870
When the towers
came down in 2001,
69
00:04:10,870 --> 00:04:14,540
it was the iron workers
from Local 40, amongst others,
70
00:04:14,540 --> 00:04:18,910
who rebuilt the site
a decade later.
71
00:04:18,910 --> 00:04:24,180
Terrorists and iron workers
seem like polar opposites to us.
72
00:04:24,180 --> 00:04:25,880
But to an anthropologist,
73
00:04:25,880 --> 00:04:28,620
both groups share
a crucial trait.
74
00:04:32,260 --> 00:04:35,890
These New York City
iron workers risk their lives
75
00:04:35,890 --> 00:04:40,030
every day for each other,
maneuvering these massive beams
76
00:04:40,030 --> 00:04:43,600
while balancing hundreds
of feet in the air.
77
00:04:43,600 --> 00:04:46,040
Just set it down here.
78
00:04:46,040 --> 00:04:49,170
They have an extremely
close group dynamic.
79
00:04:49,170 --> 00:04:53,240
Today, Scott is going to measure
how strong it is,
80
00:04:53,250 --> 00:04:55,950
administering to them
the same test
81
00:04:55,950 --> 00:04:59,420
he has given
to terrorists in Iraq.
82
00:04:59,420 --> 00:05:02,020
We're interested
in how individuals relate
83
00:05:02,020 --> 00:05:05,290
to the groups they live with
and they work with.
84
00:05:05,290 --> 00:05:07,320
The test is simple.
85
00:05:07,330 --> 00:05:10,090
It involves five cards.
86
00:05:10,100 --> 00:05:12,660
Each card shows
a pair of circles.
87
00:05:12,660 --> 00:05:15,170
The small one represents me.
88
00:05:15,170 --> 00:05:18,340
The large circle
represents the group.
89
00:05:18,340 --> 00:05:20,470
In the first pairing,
the "me" circle
90
00:05:20,470 --> 00:05:23,770
and the "group"
circle are totally separate.
91
00:05:23,780 --> 00:05:27,040
But gradually,
they begin to overlap.
92
00:05:27,050 --> 00:05:29,450
In the fifth pairing,
the "me" circle
93
00:05:29,450 --> 00:05:32,520
is entirely contained
within the "group" circle,
94
00:05:32,520 --> 00:05:35,790
the two identities fully fused.
95
00:05:35,790 --> 00:05:37,590
And what I'd like you to do
96
00:05:37,590 --> 00:05:41,620
is pick the one that defines
the way you think you are.
97
00:05:41,630 --> 00:05:43,460
The iron workers
see their identities
98
00:05:43,460 --> 00:05:46,760
closely overlapped
by the group's.
99
00:05:46,770 --> 00:05:48,800
Scott expects this...
100
00:05:48,800 --> 00:05:51,530
it's a natural byproduct
of a tight-knit group
101
00:05:51,540 --> 00:05:54,600
in a highly risky profession.
102
00:05:54,610 --> 00:05:57,570
So, how do the iron workers'
answers to the test
103
00:05:57,580 --> 00:05:59,940
compare to those of terrorists?
104
00:06:03,450 --> 00:06:05,980
The terrorists' answers
were unlike those
105
00:06:05,980 --> 00:06:08,690
of any other group
Scott had studied.
106
00:06:08,690 --> 00:06:11,120
For them, the group
was all consuming.
107
00:06:11,120 --> 00:06:14,360
The individual was nonexistent.
108
00:06:14,360 --> 00:06:16,960
Well, they did this
with guerillas in Libya,
109
00:06:16,960 --> 00:06:18,960
where they actually
rubbed it out
110
00:06:18,960 --> 00:06:20,800
and blackened out
the "me" in the group,
111
00:06:20,800 --> 00:06:22,330
as if there were no difference.
112
00:06:22,330 --> 00:06:25,270
Well, then we say that they're
completely fused with the group.
113
00:06:27,510 --> 00:06:30,910
What makes their
dedication so extreme?
114
00:06:30,910 --> 00:06:33,010
Scott says it has
to do with the values
115
00:06:33,010 --> 00:06:35,480
that hold the group together.
116
00:06:35,480 --> 00:06:38,050
Iron workers are loyal
to their brothers,
117
00:06:38,050 --> 00:06:41,320
but probably wouldn't go to work
if they weren't paid
118
00:06:41,320 --> 00:06:44,350
or their families
were being threatened.
119
00:06:44,360 --> 00:06:46,120
Not so for ISIS fighters,
120
00:06:46,120 --> 00:06:48,930
who see their group
as their salvation
121
00:06:48,930 --> 00:06:53,300
and an Islamic state
as absolutely non-negotiable.
122
00:06:53,300 --> 00:06:56,900
These values have become
sacred to them.
123
00:06:56,900 --> 00:06:59,640
Sacred values are
often religious values,
124
00:06:59,640 --> 00:07:01,600
because they're transcendent.
125
00:07:01,610 --> 00:07:04,070
You wouldn't compromise
or negotiate
126
00:07:04,080 --> 00:07:06,310
to trade off your children.
127
00:07:06,310 --> 00:07:08,450
Many of us wouldn't
negotiate or compromise
128
00:07:08,450 --> 00:07:11,610
to trade off
our country or our religion
129
00:07:11,620 --> 00:07:15,350
for any amount of money,
under any social pressure.
130
00:07:22,060 --> 00:07:24,390
They respond on
all other measures...
131
00:07:24,400 --> 00:07:27,760
willingness to fight, to die,
to sacrifice for one other,
132
00:07:27,770 --> 00:07:31,070
to torture,
to have their families suffer.
133
00:07:31,070 --> 00:07:35,140
And so, if you have a group
of people fused around
134
00:07:35,140 --> 00:07:36,710
a set of ideas they hold sacred,
135
00:07:36,710 --> 00:07:39,880
for which no negotiation
and compromise is possible,
136
00:07:39,880 --> 00:07:41,480
well, then you have
the most formidable
137
00:07:41,480 --> 00:07:44,610
fighting force possible.
138
00:07:44,620 --> 00:07:46,920
And ISIS is no exception.
139
00:07:46,920 --> 00:07:50,490
Scott's work has shown
how terrorists are single-minded
140
00:07:50,490 --> 00:07:54,060
and absolutely committed
to a cause.
141
00:07:54,060 --> 00:07:55,830
But how do they
take this behavior
142
00:07:55,830 --> 00:07:59,360
so far as to load a car
full of explosives
143
00:07:59,360 --> 00:08:02,300
and blow up a building
full of innocent people?
144
00:08:07,040 --> 00:08:11,940
How do we shut down our
ability to care for others?
145
00:08:11,940 --> 00:08:15,040
According to psychologist
Jay Van Bavel,
146
00:08:15,050 --> 00:08:17,310
our brains have
a remarkable ability
147
00:08:17,320 --> 00:08:21,890
to dehumanize others
under the right circumstances.
148
00:08:21,890 --> 00:08:24,690
Jay has designed
a test to demonstrate
149
00:08:24,690 --> 00:08:28,390
how we recognize humanity
in the first place.
150
00:08:28,390 --> 00:08:31,830
People came in,
we took an image of a doll
151
00:08:31,830 --> 00:08:35,430
and morphed it with a human face
that looked very similar.
152
00:08:35,430 --> 00:08:38,030
And when does this face
look human to you?
153
00:08:41,310 --> 00:08:44,470
Jay asked participants
the same question.
154
00:08:44,480 --> 00:08:47,180
And right about now.
155
00:08:47,180 --> 00:08:48,680
Just past the halfway mark
156
00:08:48,680 --> 00:08:51,480
between the doll face
and the real face,
157
00:08:51,480 --> 00:08:54,150
most of us perceive
that the face is human
158
00:08:54,150 --> 00:08:55,690
and has a mind.
159
00:08:55,690 --> 00:08:57,520
Right... There.
160
00:08:57,520 --> 00:09:02,030
In Jay's field, it's called
"perception of mind."
161
00:09:02,030 --> 00:09:05,930
But he says it's not fixed.
162
00:09:05,930 --> 00:09:09,300
Jay adjusted the experiment,
and told his subjects
163
00:09:09,300 --> 00:09:12,540
what college the person
in the photo went to.
164
00:09:12,540 --> 00:09:15,370
Their own college,
an in-group...
165
00:09:15,370 --> 00:09:17,670
And right about now.
166
00:09:17,680 --> 00:09:20,040
...or a rival college
across town,
167
00:09:20,050 --> 00:09:21,740
an out-group.
168
00:09:21,750 --> 00:09:25,320
This rivalry had
a measurable effect.
169
00:09:25,320 --> 00:09:27,980
What we found
was that these students
170
00:09:27,990 --> 00:09:31,690
were able to see the mind behind
the in-group face much faster.
171
00:09:31,690 --> 00:09:33,690
And they were slower
to see the humanity
172
00:09:33,690 --> 00:09:35,990
in students
from another college.
173
00:09:35,990 --> 00:09:38,030
Right about now.
174
00:09:38,030 --> 00:09:41,500
These are subtle kinds
of dehumanization.
175
00:09:41,500 --> 00:09:44,270
How do we get into
the mind of a terrorist
176
00:09:44,270 --> 00:09:46,570
who can view a tower
full of civilians
177
00:09:46,570 --> 00:09:49,440
as a justifiable target?
178
00:09:49,440 --> 00:09:52,540
Jay believes it's easier
than you think.
179
00:09:52,540 --> 00:09:55,550
For a second experiment,
Jay divides subjects,
180
00:09:55,550 --> 00:09:57,950
arbitrarily, into two teams...
181
00:09:57,950 --> 00:10:00,980
the "rattlers"
and the "eagles."
182
00:10:00,990 --> 00:10:03,790
We told people that we
were keeping track of points,
183
00:10:03,790 --> 00:10:06,160
and the winning team
would walk away with the money.
184
00:10:06,160 --> 00:10:09,030
Jay's creating
the essence of conflict...
185
00:10:09,030 --> 00:10:12,300
two groups competing
for resources.
186
00:10:12,300 --> 00:10:15,870
So, now I'm gonna read both
teams a number of statements.
187
00:10:15,870 --> 00:10:18,330
He reads the group
short stories about people
188
00:10:18,340 --> 00:10:22,370
who supposedly belong
to neither team.
189
00:10:22,370 --> 00:10:26,840
Jane managed to get indoors
before it started to rain.
190
00:10:26,850 --> 00:10:30,250
These stories
are designed to measure empathy.
191
00:10:30,250 --> 00:10:32,550
If you empathize
with the person's story,
192
00:10:32,550 --> 00:10:34,820
you put your thumb up.
193
00:10:34,820 --> 00:10:37,450
If you don't care, thumb down.
194
00:10:37,460 --> 00:10:42,590
Brandon got soaked by
a taxi driving through a puddle.
195
00:10:42,590 --> 00:10:47,930
Eagles and rattlers give
equally empathetic responses.
196
00:10:47,930 --> 00:10:51,530
Then Jay changes the game.
197
00:10:51,540 --> 00:10:53,670
Andrew sat in gum
on a park bench.
198
00:10:53,670 --> 00:10:56,740
Andrew is a member
of the eagles.
199
00:10:56,740 --> 00:10:58,510
Serves him right.
200
00:10:58,510 --> 00:11:01,880
Even in an entirely
artificial setting,
201
00:11:01,880 --> 00:11:04,050
the results are profound.
202
00:11:04,050 --> 00:11:06,420
Once there's competition
added to the mix,
203
00:11:06,420 --> 00:11:08,220
conflict can
escalate very quickly
204
00:11:08,220 --> 00:11:09,790
towards the out-group.
205
00:11:09,790 --> 00:11:12,320
Team members
actually take pleasure
206
00:11:12,320 --> 00:11:14,690
in the other team's pain.
207
00:11:14,690 --> 00:11:17,230
Remember, they've only been
assigned to these teams
208
00:11:17,230 --> 00:11:19,830
for a matter of minutes.
209
00:11:19,830 --> 00:11:21,300
The thing
that's really remarkable
210
00:11:21,300 --> 00:11:23,430
is that there needs to be
no history of conflict
211
00:11:23,440 --> 00:11:25,440
or any stereotypes
towards the groups
212
00:11:25,440 --> 00:11:28,370
for them to start feeling
negative towards them.
213
00:11:28,370 --> 00:11:30,440
Right there.
214
00:11:30,440 --> 00:11:33,480
Jay's tests reveal
that dehumanization can happen
215
00:11:33,480 --> 00:11:35,480
with very little prodding.
216
00:11:35,480 --> 00:11:37,180
Whenever groups compete,
217
00:11:37,180 --> 00:11:40,120
the dehumanization process
begins.
218
00:11:42,190 --> 00:11:45,790
When that competition
includes whole cultures,
219
00:11:45,790 --> 00:11:48,020
the results can be deadly.
220
00:11:54,370 --> 00:11:57,930
If science has discovered
anything about terrorists,
221
00:11:57,940 --> 00:12:00,770
it's that we all have
the potential
222
00:12:00,770 --> 00:12:02,910
to think like one.
223
00:12:02,910 --> 00:12:07,110
All of our brains
are programmed to dehumanize.
224
00:12:07,110 --> 00:12:11,410
We all have values
that we can't compromise.
225
00:12:11,420 --> 00:12:14,480
So what stops us from thinking
226
00:12:14,490 --> 00:12:18,190
and acting in extreme ways?
227
00:12:18,190 --> 00:12:21,960
The answer to that lies less
in what you believe
228
00:12:21,960 --> 00:12:24,160
and more in who you know.
229
00:12:28,330 --> 00:12:30,530
After every big
coordinated terrorist attack
230
00:12:30,540 --> 00:12:34,140
in Europe, the U.S., or Asia,
231
00:12:34,140 --> 00:12:37,610
we wonder how it
could have happened again.
232
00:12:37,610 --> 00:12:39,140
How did a terrorist cell
233
00:12:39,140 --> 00:12:41,680
live among us for months
or years
234
00:12:41,680 --> 00:12:44,350
before deciding to strike?
235
00:12:44,350 --> 00:12:47,650
What's the glue
that binds them together,
236
00:12:47,650 --> 00:12:51,450
and how can we dismantle them
before they strike again.
237
00:12:53,790 --> 00:12:55,190
A café in central London
238
00:12:55,190 --> 00:12:57,890
may seem like
a strange place for a lab,
239
00:12:57,900 --> 00:13:00,130
but this is where
social psychologist
240
00:13:00,130 --> 00:13:03,100
Nafees Hamid is trying
to uncover
241
00:13:03,100 --> 00:13:05,700
the structure
of terror networks.
242
00:13:05,700 --> 00:13:09,870
My goal is to understand
the "how" of recruitment.
243
00:13:09,880 --> 00:13:12,440
If you start with how,
you understand
244
00:13:12,440 --> 00:13:15,040
the networks that are in play.
245
00:13:15,050 --> 00:13:16,950
Nafees is trying to figure out
246
00:13:16,950 --> 00:13:19,180
how terror networks function
247
00:13:19,180 --> 00:13:22,050
and how they can
be pulled apart.
248
00:13:22,050 --> 00:13:24,250
It's not a simple task.
249
00:13:24,260 --> 00:13:25,460
There's a lot of
theories out there,
250
00:13:25,460 --> 00:13:27,390
but there's very little data,
251
00:13:27,390 --> 00:13:29,290
and this is largely
because it's hard
252
00:13:29,290 --> 00:13:31,860
to get jihadis
to go into a laboratory.
253
00:13:31,860 --> 00:13:33,230
But Nafees,
254
00:13:33,230 --> 00:13:35,970
a Pakistani-American
living in London,
255
00:13:35,970 --> 00:13:39,040
tried something a little out
of the ordinary.
256
00:13:41,170 --> 00:13:43,570
He called them.
257
00:13:43,580 --> 00:13:45,280
Salaam alaikum, brother.
258
00:13:45,280 --> 00:13:47,780
How are you?
How's it going?
259
00:13:47,780 --> 00:13:51,010
I decided to just contact them
directly online...
260
00:13:51,020 --> 00:13:54,050
found them on social media,
on Twitter, on various accounts.
261
00:13:54,050 --> 00:13:56,450
Getting terrorists on the phone
262
00:13:56,450 --> 00:13:58,920
is not as hard as you think.
263
00:13:58,920 --> 00:14:00,590
I'm very honest,
and that's the key.
264
00:14:00,590 --> 00:14:02,590
There's so many people,
intelligence officers
265
00:14:02,590 --> 00:14:04,460
and so forth,
who are pretending to be people
266
00:14:04,460 --> 00:14:06,460
they're not to contact
these people,
267
00:14:06,460 --> 00:14:09,000
and they can usually
sniff those people out.
268
00:14:09,000 --> 00:14:11,270
How's the weather in Sham today?
269
00:14:11,270 --> 00:14:13,670
Nafees uses
these conversations to make
270
00:14:13,670 --> 00:14:17,510
social network analysis models
of terrorist groups.
271
00:14:17,510 --> 00:14:20,610
He's not tracking
"likes" on Facebook.
272
00:14:20,610 --> 00:14:24,010
He's looking at terrorists'
real-life social networks...
273
00:14:24,020 --> 00:14:27,180
friends, family,
and acquaintances.
274
00:14:27,190 --> 00:14:30,320
He's talked to members
of ISIS, al-Qaeda,
275
00:14:30,320 --> 00:14:32,520
groups all over the world.
276
00:14:32,520 --> 00:14:35,290
Nafees' research
leads him to believe
277
00:14:35,290 --> 00:14:37,730
the reasons people join
a terror cell
278
00:14:37,730 --> 00:14:41,360
have little to do with how
religious their family is
279
00:14:41,370 --> 00:14:45,270
or how poor
their neighborhood is.
280
00:14:45,270 --> 00:14:46,800
There doesn't seem
to be a correlation
281
00:14:46,810 --> 00:14:49,740
between the ecology
of an environment
282
00:14:49,740 --> 00:14:52,440
and who radicalizes
and who doesn't radicalize.
283
00:14:52,440 --> 00:14:53,880
Instead, your best predictor
284
00:14:53,880 --> 00:14:56,110
of whether someone's
going to go join
285
00:14:56,110 --> 00:14:59,020
an Islamist group
is whether they have a friend
286
00:14:59,020 --> 00:15:01,080
who's already a member
of that group.
287
00:15:01,090 --> 00:15:03,920
Friendships basically create
an echo chamber
288
00:15:03,920 --> 00:15:08,760
that allows the radicalization
process to take shape.
289
00:15:08,760 --> 00:15:11,660
So, does knowing
how these networks form
290
00:15:11,660 --> 00:15:14,900
tell us how to break them apart?
291
00:15:14,900 --> 00:15:17,300
Paris, 2015.
292
00:15:19,740 --> 00:15:23,640
Brussels, 2016.
293
00:15:23,640 --> 00:15:25,580
After the attacks,
attention focused
294
00:15:25,580 --> 00:15:27,010
on an ISIS terror cell
295
00:15:27,010 --> 00:15:30,380
from the poor Brussels
suburb of Molenbeek
296
00:15:30,380 --> 00:15:34,380
and their radical leader
Abdelhamid Abaaoud.
297
00:15:37,090 --> 00:15:41,020
What's amazing about
the Paris, Brussels attack
298
00:15:41,030 --> 00:15:43,490
is that a terror cell
carried out
299
00:15:43,500 --> 00:15:46,030
a major attack on European soil,
300
00:15:46,030 --> 00:15:48,630
then had the weight
of all of the police
301
00:15:48,630 --> 00:15:51,170
and intelligence agencies
of Europe
302
00:15:51,170 --> 00:15:52,900
brought down on it,
303
00:15:52,900 --> 00:15:54,970
and then was able
to successfully carry out
304
00:15:54,970 --> 00:15:58,680
a second attack within months.
305
00:15:58,680 --> 00:16:01,240
How were they able to do this?
306
00:16:01,250 --> 00:16:05,280
Nafees applies social network
analysis to the Molenbeek cell
307
00:16:05,280 --> 00:16:08,620
to see how the group functioned.
308
00:16:08,620 --> 00:16:11,590
Nafees measures
each man's importance
309
00:16:11,590 --> 00:16:13,720
to the group by the quantity
310
00:16:13,730 --> 00:16:16,360
and type of his connections
within it.
311
00:16:16,360 --> 00:16:19,730
How long had he known
the others, from where?
312
00:16:19,730 --> 00:16:21,800
Had they served
on the battlefield
313
00:16:21,800 --> 00:16:24,970
or done time together in prison?
314
00:16:24,970 --> 00:16:28,400
In order to carry off
an attack of this magnitude,
315
00:16:28,410 --> 00:16:30,840
you need deep friendships
with people.
316
00:16:30,840 --> 00:16:32,980
In the end, you're trusting
these people with your lives,
317
00:16:32,980 --> 00:16:34,680
and so if they're siblings,
318
00:16:34,680 --> 00:16:37,980
if they're childhood
friends, prison mates.
319
00:16:37,980 --> 00:16:41,180
And you really see
this in this network.
320
00:16:41,190 --> 00:16:43,650
Predictably, Abdelhamid Abaaoud
321
00:16:43,660 --> 00:16:46,420
has the most connections.
322
00:16:46,420 --> 00:16:48,930
But Abaaoud was killed
in a police shootout
323
00:16:48,930 --> 00:16:52,430
right after the Paris attacks.
324
00:16:52,430 --> 00:16:54,560
And instead of falling apart,
325
00:16:54,570 --> 00:16:57,830
the terror cell struck
again in Brussels.
326
00:16:57,840 --> 00:16:59,700
Nafees says
that network analysis
327
00:16:59,700 --> 00:17:02,510
doesn't just track
how many connections you have,
328
00:17:02,510 --> 00:17:06,180
it also tracks how you know
those connections.
329
00:17:06,180 --> 00:17:09,810
"Are you the only one
bridging one group to another?"
330
00:17:09,810 --> 00:17:13,380
"Do you only know people
who live close to you?"
331
00:17:13,390 --> 00:17:17,720
Using all of these metrics,
a seemingly minor player
332
00:17:17,720 --> 00:17:22,830
named Salah Abdeslam
rises to the surface.
333
00:17:22,830 --> 00:17:25,290
Salah was a guy
who never went to Syria,
334
00:17:25,300 --> 00:17:27,400
never recruited anyone himself,
335
00:17:27,400 --> 00:17:30,730
was supposed to
blow himself up in Paris,
336
00:17:30,740 --> 00:17:32,670
but chickened out
at the last minute.
337
00:17:32,670 --> 00:17:34,940
This was a guy who was
clearly less radicalized
338
00:17:34,940 --> 00:17:37,040
than Abdelhamid Abaaoud.
339
00:17:37,040 --> 00:17:40,240
Salah didn't know
as many people as Abaaoud,
340
00:17:40,250 --> 00:17:42,610
but he knew
more kinds of people.
341
00:17:42,610 --> 00:17:46,320
He was often the only bridge
between different groups.
342
00:17:46,320 --> 00:17:49,090
Salah Abdeslam
was driving all over Europe,
343
00:17:49,090 --> 00:17:51,120
back and forth,
connecting people,
344
00:17:51,120 --> 00:17:52,620
moving money around.
345
00:17:52,620 --> 00:17:54,420
He was connecting players
346
00:17:54,430 --> 00:17:57,090
that wouldn't have been
connected to each other
347
00:17:57,100 --> 00:18:00,700
had he not been filling
that social network up.
348
00:18:00,700 --> 00:18:03,900
Facilitators
like Salah are often
349
00:18:03,900 --> 00:18:06,640
less radical, less visible.
350
00:18:06,640 --> 00:18:10,970
They stay off the radar
and help a terror group survive,
351
00:18:10,980 --> 00:18:14,080
even after the leader
is taken out.
352
00:18:14,080 --> 00:18:16,580
Generally, police,
intelligence agencies
353
00:18:16,580 --> 00:18:19,720
don't pay enough attention
to the facilitators.
354
00:18:19,720 --> 00:18:21,120
If you go after those people,
355
00:18:21,120 --> 00:18:24,220
you will do more in reducing
the efficacy of that network
356
00:18:24,220 --> 00:18:27,460
than if you go
after the central figures.
357
00:18:27,460 --> 00:18:30,390
"Focus on the facilitators."
358
00:18:30,400 --> 00:18:34,200
This is a good rule of thumb,
until it's not.
359
00:18:34,200 --> 00:18:36,570
As law enforcement
has grown more successful
360
00:18:36,570 --> 00:18:38,530
tracking terror cells,
361
00:18:38,540 --> 00:18:42,610
more attacks are being
carried out by solo actors.
362
00:18:42,610 --> 00:18:47,440
And these lone wolves
could be anywhere.
363
00:18:51,010 --> 00:18:53,310
The lone wolf...
364
00:18:53,310 --> 00:18:56,780
the killer
who's part of no cell,
365
00:18:56,780 --> 00:19:01,050
who arouses no suspicion
until he acts.
366
00:19:01,050 --> 00:19:05,450
In September 2014,
Isis issued a global call
367
00:19:05,450 --> 00:19:08,290
for its followers
to attack the West.
368
00:19:08,290 --> 00:19:09,960
And since then,
369
00:19:09,960 --> 00:19:11,630
the pace of lone wolf attacks
370
00:19:11,630 --> 00:19:13,830
has been accelerating.
371
00:19:13,830 --> 00:19:17,230
The Orlando nightclub shooting,
the deadliest attack
372
00:19:17,230 --> 00:19:20,200
by a lone gunman
in American history,
373
00:19:20,200 --> 00:19:23,840
was part of that
horrifying trend.
374
00:19:23,840 --> 00:19:27,570
Can science trace
the origin of lone wolves
375
00:19:27,580 --> 00:19:31,850
and help us stop them?
376
00:19:31,850 --> 00:19:33,280
Social psychologist
377
00:19:33,280 --> 00:19:36,480
Sophia Moskalenko
researches radicalism
378
00:19:36,490 --> 00:19:39,520
at a University
of Maryland think tank.
379
00:19:39,520 --> 00:19:42,220
She grew up immersed
in the subject,
380
00:19:42,220 --> 00:19:44,060
but not by choice.
381
00:19:44,060 --> 00:19:47,230
I was born
in the Republic of Ukraine,
382
00:19:47,230 --> 00:19:49,000
and when I was growing up,
383
00:19:49,000 --> 00:19:53,270
I was part of
the Soviet Union's effort
384
00:19:53,270 --> 00:19:56,470
to radicalize all
of its citizens.
385
00:19:56,470 --> 00:19:59,970
I often questioned that
and got into a lot of trouble
386
00:19:59,980 --> 00:20:02,640
at school for questioning it.
387
00:20:02,650 --> 00:20:06,150
Radicalism is still
the focus of her life.
388
00:20:06,150 --> 00:20:10,220
Now she studies the path
people take to terrorism.
389
00:20:10,220 --> 00:20:12,990
More frequently,
people become radicalized
390
00:20:12,990 --> 00:20:16,360
through groups or through people
that they already know
391
00:20:16,360 --> 00:20:19,190
who belong to terrorist
or radical groups.
392
00:20:19,200 --> 00:20:21,330
On rare occasions, however,
393
00:20:21,330 --> 00:20:23,760
people radicalize on their own.
394
00:20:23,770 --> 00:20:25,930
Typically, these lone wolves,
395
00:20:25,940 --> 00:20:28,600
like Ted Kaczynski,
the Unabomber,
396
00:20:28,600 --> 00:20:32,710
or Omar Mateen,
the Orlando mass shooter,
397
00:20:32,710 --> 00:20:34,980
are disturbed misfits.
398
00:20:34,980 --> 00:20:37,480
And people like that can turn
to terrorism
399
00:20:37,480 --> 00:20:40,410
to escape their demons.
400
00:20:40,420 --> 00:20:43,320
But in 2004,
UK investigators uncovered
401
00:20:43,320 --> 00:20:45,720
a plot that challenged
everyone's notion
402
00:20:45,720 --> 00:20:48,220
of what makes a terrorist.
403
00:20:48,220 --> 00:20:50,560
They were tailing a terror cell.
404
00:20:50,560 --> 00:20:52,990
All the members were known
to law enforcement,
405
00:20:53,000 --> 00:20:56,060
except one man
who only was discovered
406
00:20:56,070 --> 00:20:59,600
when another member picked him
up from the airport.
407
00:20:59,600 --> 00:21:02,400
They saw someone
they didn't yet know
408
00:21:02,410 --> 00:21:06,670
who, in fact, turned out
to be Momin Khawaja.
409
00:21:06,680 --> 00:21:08,680
24-year-old Momin Khawaja
410
00:21:08,680 --> 00:21:11,210
lived in Ottawa, Canada.
411
00:21:11,210 --> 00:21:14,920
He was a computer programmer
with no criminal record.
412
00:21:14,920 --> 00:21:18,520
He had become radicalized
in total isolation,
413
00:21:18,520 --> 00:21:20,650
just like a lone wolf.
414
00:21:20,660 --> 00:21:23,290
Police arrested Khawaja
and the others
415
00:21:23,290 --> 00:21:25,560
before they had time to act.
416
00:21:25,560 --> 00:21:29,360
The cell was plotting
to make and detonate
417
00:21:29,370 --> 00:21:32,930
a number of fertilizer bombs
in and around London,
418
00:21:32,940 --> 00:21:37,840
and Khawaja's role in
the plot was to design
419
00:21:37,840 --> 00:21:41,140
and make the detonators.
420
00:21:41,140 --> 00:21:43,780
Captured terrorists
rarely take you back
421
00:21:43,780 --> 00:21:46,210
to the turning points
of their youth,
422
00:21:46,220 --> 00:21:48,780
but Khawaja was different.
423
00:21:48,780 --> 00:21:53,590
In blogs and e-mails,
he chronicled his journey...
424
00:21:53,590 --> 00:21:56,890
growing up in the suburbs
of Ottawa, Canada,
425
00:21:56,890 --> 00:21:59,990
in a house much like this.
426
00:22:00,000 --> 00:22:03,500
Khawaja was unique in that
427
00:22:03,500 --> 00:22:05,930
he liked to write down
his thoughts,
428
00:22:05,940 --> 00:22:09,640
his feelings, his intentions.
429
00:22:09,640 --> 00:22:12,870
For Sophia,
finding these written records
430
00:22:12,880 --> 00:22:16,380
was like stumbling upon
a terrorist's private journal.
431
00:22:18,780 --> 00:22:21,110
I once was a normal kid, too.
432
00:22:21,120 --> 00:22:24,520
I played basketball,
went swimming, bike riding,
433
00:22:24,520 --> 00:22:29,020
and did all the naughty,
little things kids do.
434
00:22:29,020 --> 00:22:30,560
For us, as researchers,
435
00:22:30,560 --> 00:22:34,460
this was a very rare opportunity
to look into the mind
436
00:22:34,460 --> 00:22:40,400
of a terrorist as radicalization
was unfolding in real time.
437
00:22:40,400 --> 00:22:43,240
How did he turn to terror?
438
00:22:43,240 --> 00:22:46,770
He and his family
had no contact with terrorists.
439
00:22:46,780 --> 00:22:50,680
He lived a typical teenage life
of friends and schoolwork.
440
00:22:50,680 --> 00:22:53,450
In his writing,
he expressed one motive
441
00:22:53,450 --> 00:22:58,020
for his radicalization...
empathy.
442
00:22:58,020 --> 00:22:59,450
He identified deeply
443
00:22:59,460 --> 00:23:02,520
with the suffering
of fellow Muslims.
444
00:23:02,530 --> 00:23:05,390
Khawaja
was emotionally affected.
445
00:23:05,390 --> 00:23:08,430
He watched fundamentalist
Islamic videos
446
00:23:08,430 --> 00:23:10,360
that depicted injustices
447
00:23:10,370 --> 00:23:13,070
perpetrated by the West
on Muslims
448
00:23:13,070 --> 00:23:17,940
or revenge that Muslims
take on Westerners.
449
00:23:17,940 --> 00:23:21,610
Then, right after
I got out of college,
450
00:23:21,610 --> 00:23:24,310
the invasion of
Afghanistan happened.
451
00:23:24,310 --> 00:23:27,980
I felt that something
was wrong... terribly wrong.
452
00:23:27,980 --> 00:23:30,250
He was driven by ideology
453
00:23:30,250 --> 00:23:33,620
and felt compelled
to pick a side.
454
00:23:33,620 --> 00:23:37,220
He picked the side of terror.
455
00:23:37,230 --> 00:23:40,460
He was
a very sensitive individual
456
00:23:40,460 --> 00:23:43,630
who couldn't just stand idly by
457
00:23:43,630 --> 00:23:46,970
while someone else
was suffering.
458
00:23:46,970 --> 00:23:49,870
Finally, fully radicalized,
459
00:23:49,870 --> 00:23:51,770
Khawaja boarded a plane
460
00:23:51,770 --> 00:23:54,980
to a terror training
camp in Pakistan.
461
00:23:54,980 --> 00:23:58,980
That's where he first met
the other members of his cell.
462
00:23:58,980 --> 00:24:01,680
Khawaja combined many qualities
463
00:24:01,680 --> 00:24:05,220
that can be prized
in any society.
464
00:24:05,220 --> 00:24:08,390
He was a self-starter,
he was very smart,
465
00:24:08,390 --> 00:24:10,660
and under different
circumstances,
466
00:24:10,660 --> 00:24:13,190
if he had become a doctor,
which, you know,
467
00:24:13,200 --> 00:24:16,400
he contemplated at one point,
he would have utilized
468
00:24:16,400 --> 00:24:18,970
all of those talents
and propensities
469
00:24:18,970 --> 00:24:20,870
in a very different way.
470
00:24:20,870 --> 00:24:23,300
But he was instead
put on this path
471
00:24:23,310 --> 00:24:25,510
that led him to terrorism.
472
00:24:30,480 --> 00:24:34,210
The goals of men
like Momin Khawaja
473
00:24:34,220 --> 00:24:38,490
are a despicable
perversion of empathy.
474
00:24:38,490 --> 00:24:41,050
Psychologists
say most young terrorists
475
00:24:41,060 --> 00:24:44,660
share the same traits
as young people everywhere.
476
00:24:44,660 --> 00:24:48,760
They desperately want
to belong to something.
477
00:24:48,760 --> 00:24:50,160
But how can they decide
478
00:24:50,170 --> 00:24:53,600
to kill themselves
for their cause?
479
00:24:55,770 --> 00:24:57,940
It may be easier than you think.
480
00:25:00,940 --> 00:25:03,040
Suicide attacks,
481
00:25:03,050 --> 00:25:06,780
the most insidious
aspect of terrorism...
482
00:25:06,780 --> 00:25:09,380
killers who walk calmly
among their victims
483
00:25:09,380 --> 00:25:11,790
before they strike.
484
00:25:11,790 --> 00:25:15,820
The phenomenon has haunted us
for over a century.
485
00:25:15,820 --> 00:25:18,190
The first we know of was
a Russian revolutionary
486
00:25:18,190 --> 00:25:20,930
who assassinated a czar.
487
00:25:20,930 --> 00:25:22,300
Since the 1980s,
488
00:25:22,300 --> 00:25:25,300
the number of suicide attacks
has risen dramatically,
489
00:25:25,300 --> 00:25:30,000
taking the lives
of almost 50,000 people.
490
00:25:30,010 --> 00:25:32,740
How can this happen so often?
491
00:25:32,740 --> 00:25:36,080
What can make someone
want to end their life
492
00:25:36,080 --> 00:25:38,280
in an act of mass murder?
493
00:25:43,750 --> 00:25:46,750
It's a question
psychologist Arie Kruglanski
494
00:25:46,760 --> 00:25:49,060
has long grappled with.
495
00:25:49,060 --> 00:25:51,220
The early assumption on the part
496
00:25:51,230 --> 00:25:53,960
of social scientists
was that it reflects
497
00:25:53,960 --> 00:25:57,330
a kind of psychopathology...
that is, people are basically
498
00:25:57,330 --> 00:25:59,730
mentally disturbed and abnormal.
499
00:25:59,740 --> 00:26:03,740
That was in the '70s
and early '80s.
500
00:26:03,740 --> 00:26:05,840
But research has never shown
501
00:26:05,840 --> 00:26:08,040
that terrorists have
any more mental problems
502
00:26:08,040 --> 00:26:10,780
than the general population.
503
00:26:10,780 --> 00:26:12,750
So how do they decide
to kill themselves
504
00:26:12,750 --> 00:26:15,020
in order to murder many?
505
00:26:15,020 --> 00:26:19,120
Surely, most people
would not make this choice.
506
00:26:19,120 --> 00:26:20,290
Or could they?
507
00:26:20,290 --> 00:26:23,660
Let's go, Big Red!
508
00:26:23,660 --> 00:26:26,460
Could these
cheerleaders become martyrs?
509
00:26:26,460 --> 00:26:27,660
Let's go!
510
00:26:27,660 --> 00:26:29,900
It's an absurd question,
511
00:26:29,900 --> 00:26:34,400
but Arie believes anyone can,
given the right push.
512
00:26:34,400 --> 00:26:36,170
In the particular experiment,
513
00:26:36,170 --> 00:26:40,970
we try to emulate the process
of group identification...
514
00:26:40,980 --> 00:26:44,780
what are you ready to commit
on behalf of the group?
515
00:26:44,780 --> 00:26:47,950
Arie divides
his subjects into two groups,
516
00:26:47,950 --> 00:26:51,120
then he asks each group
to read a story
517
00:26:51,120 --> 00:26:54,350
and to circle
the pronouns as they go.
518
00:26:54,360 --> 00:26:55,890
But Arie has given each group
519
00:26:55,890 --> 00:26:59,260
slightly different versions
of the story.
520
00:26:59,260 --> 00:27:00,660
One group has a story
521
00:27:00,660 --> 00:27:03,560
which only has
singular pronouns...
522
00:27:03,570 --> 00:27:05,730
"I," "me," and "my."
523
00:27:05,730 --> 00:27:07,330
"I go to the city often."
524
00:27:07,340 --> 00:27:08,970
"My anticipation fills me
525
00:27:08,970 --> 00:27:11,770
"as I see the skyscrapers
come into view."
526
00:27:11,770 --> 00:27:14,370
The other group
has the same story,
527
00:27:14,380 --> 00:27:18,780
but with plural pronouns...
"we," "us," and "ours."
528
00:27:18,780 --> 00:27:20,780
"We go to the city often."
529
00:27:20,780 --> 00:27:22,720
"Our anticipation fills us
530
00:27:22,720 --> 00:27:25,790
"as we see the skyscrapers
come into view."
531
00:27:25,790 --> 00:27:28,190
Arie's priming his subjects,
532
00:27:28,190 --> 00:27:31,690
activating their subconscious
to focus on belonging
533
00:27:31,690 --> 00:27:34,490
or not belonging to a group.
534
00:27:34,500 --> 00:27:38,130
It has been demonstrated
that once you prime these
535
00:27:38,130 --> 00:27:39,930
plural pronouns,
536
00:27:39,940 --> 00:27:43,500
a person gets into a mindset
of group identification.
537
00:27:44,840 --> 00:27:46,470
After this priming,
538
00:27:46,480 --> 00:27:49,880
it's time to test exactly
how much the "we"
539
00:27:49,880 --> 00:27:52,010
and the "I"
groups would sacrifice
540
00:27:52,010 --> 00:27:55,080
for members of their own group.
541
00:27:55,080 --> 00:27:57,780
He asks the cheerleaders
to participate
542
00:27:57,790 --> 00:27:59,950
in a classic
psychological scenario
543
00:27:59,960 --> 00:28:02,690
called the trolley problem.
544
00:28:02,690 --> 00:28:06,090
It portrays individuals
who are in danger
545
00:28:06,090 --> 00:28:08,330
of being run over by a trolley,
546
00:28:08,330 --> 00:28:10,130
and the dilemma is for a person
547
00:28:10,130 --> 00:28:13,100
who could save them
by sacrificing their own life,
548
00:28:13,100 --> 00:28:16,500
throwing themselves
in front of the trolley.
549
00:28:16,510 --> 00:28:20,070
First, Arie tests
the subjects who were primed
550
00:28:20,080 --> 00:28:23,280
to think of themselves
as individuals.
551
00:28:23,280 --> 00:28:25,680
Let's go, Big Red!
552
00:28:30,120 --> 00:28:32,320
Only 30% say they would give
553
00:28:32,320 --> 00:28:34,850
their lives for their friends.
554
00:28:34,860 --> 00:28:37,620
Then Arie tests the "we" group.
555
00:28:37,630 --> 00:28:39,930
Let's go, Big Red!
556
00:28:41,800 --> 00:28:44,800
The difference is profound.
557
00:28:44,800 --> 00:28:49,370
Over 90% of people primed
with the "we," "ours,"
558
00:28:49,370 --> 00:28:53,970
"us" pronouns were willing
to self-sacrifice for the group.
559
00:28:57,310 --> 00:28:59,950
Arie's experiment
shows that group dynamics
560
00:28:59,950 --> 00:29:04,050
have incredible influence
on individual behavior.
561
00:29:04,050 --> 00:29:06,090
Even a subtle feeling
of belonging
562
00:29:06,090 --> 00:29:10,120
can dramatically change
what you're willing to do.
563
00:29:10,130 --> 00:29:13,690
And that doesn't mean that
everybody's equally susceptible
564
00:29:13,700 --> 00:29:16,930
to the influence
of violent ideologies.
565
00:29:16,930 --> 00:29:18,600
So there are
individual differences,
566
00:29:18,600 --> 00:29:19,830
but by and large,
567
00:29:19,840 --> 00:29:22,370
it's not a psychopathological
phenomenon.
568
00:29:22,370 --> 00:29:23,800
These people are not crazy,
569
00:29:23,810 --> 00:29:27,470
so it's a question of group
pressure, group influence.
570
00:29:27,480 --> 00:29:29,410
Arie says that group pressure
571
00:29:29,410 --> 00:29:32,450
and the human desire to belong
are the levers
572
00:29:32,450 --> 00:29:37,250
that allow terrorists
to give their lives for a cause.
573
00:29:37,250 --> 00:29:41,620
Under certain circumstances,
even the most normal person
574
00:29:41,620 --> 00:29:44,820
can become a violent extremist.
575
00:29:44,830 --> 00:29:46,760
If subtle cues can push someone
576
00:29:46,760 --> 00:29:50,100
towards violent self-sacrifice,
577
00:29:50,100 --> 00:29:54,800
the right push in the other
direction might stop it.
578
00:29:54,800 --> 00:29:56,940
What kind of push?
579
00:29:56,940 --> 00:30:00,540
The first step to changing
an extremist's mind
580
00:30:00,540 --> 00:30:03,010
might be to agree with him.
581
00:30:06,130 --> 00:30:10,330
We've looked inside
the mind of a terrorist.
582
00:30:10,340 --> 00:30:13,570
What we really want to know
is whether there is a way
583
00:30:13,570 --> 00:30:18,240
to get into that mind
and change it.
584
00:30:18,240 --> 00:30:19,810
You've heard the expression
585
00:30:19,810 --> 00:30:23,350
"winning the hearts
and minds of the enemy."
586
00:30:23,350 --> 00:30:27,480
With terrorists,
that seems like a tall order.
587
00:30:27,490 --> 00:30:30,090
But there might be a way,
588
00:30:30,090 --> 00:30:32,720
and it starts by making
a slight adjustment
589
00:30:32,730 --> 00:30:35,690
to another old phrase...
590
00:30:35,690 --> 00:30:39,360
"if you can't beat them,
agree with them."
591
00:30:42,130 --> 00:30:45,400
Psychology professor
Eran Halperin studies
592
00:30:45,400 --> 00:30:47,570
the science of changing minds.
593
00:30:47,570 --> 00:30:51,480
It's an uphill battle
in his country, Israel,
594
00:30:51,480 --> 00:30:53,280
home to the intractable struggle
595
00:30:53,280 --> 00:30:56,750
between Israelis
and Palestinians.
596
00:30:56,750 --> 00:30:59,050
One of the reasons most
traditional peace interventions
597
00:30:59,050 --> 00:31:02,420
do not work is that we have
two groups of people
598
00:31:02,420 --> 00:31:06,690
with opposing views trying
to reason with each other.
599
00:31:06,690 --> 00:31:10,090
Israelis and Palestinians
argue like humans everywhere...
600
00:31:10,100 --> 00:31:14,870
one side expresses an opinion,
the other counters it.
601
00:31:14,870 --> 00:31:17,940
When someone tells you something
that you disagree with,
602
00:31:17,940 --> 00:31:19,600
the most automatic reaction
603
00:31:19,610 --> 00:31:23,610
is to try to confront him
with a counter message.
604
00:31:23,610 --> 00:31:25,580
But we've all been in arguments
605
00:31:25,580 --> 00:31:29,250
where reason gets you nowhere,
for deeply held beliefs,
606
00:31:29,250 --> 00:31:32,680
say, for gun control
or abortion.
607
00:31:32,690 --> 00:31:35,290
Science has shown
that counter messages
608
00:31:35,290 --> 00:31:38,520
can actually be
counterproductive.
609
00:31:38,520 --> 00:31:40,020
Because, basically,
their beliefs,
610
00:31:40,030 --> 00:31:42,260
their attitudes are part
of their identity,
611
00:31:42,260 --> 00:31:43,960
and then anything
that contradicts
612
00:31:43,960 --> 00:31:45,900
what they believe
in sounds to them,
613
00:31:45,900 --> 00:31:48,030
or looks to them, like a threat.
614
00:31:48,030 --> 00:31:51,740
In psychology, we call
this state of mind "frozen."
615
00:31:51,740 --> 00:31:54,200
This kind of freezing
can happen around
616
00:31:54,210 --> 00:31:56,440
any tightly held belief,
617
00:31:56,440 --> 00:31:59,440
and it happens with terrorists.
618
00:31:59,450 --> 00:32:01,680
Their beliefs
are sacred to them,
619
00:32:01,680 --> 00:32:03,180
and they become frozen,
620
00:32:03,180 --> 00:32:07,020
resisting any argument,
no matter how rational.
621
00:32:08,650 --> 00:32:12,160
As a young man,
Eran looked around his homeland.
622
00:32:12,160 --> 00:32:15,360
He saw a conflict
seemingly without end
623
00:32:15,360 --> 00:32:18,400
and a sea of frozen minds.
624
00:32:18,400 --> 00:32:21,470
I was very seriously
injured in the Israeli army,
625
00:32:21,470 --> 00:32:24,700
and I decided
that this is my mission.
626
00:32:24,700 --> 00:32:28,040
We cannot just accept
it as the reality.
627
00:32:28,040 --> 00:32:30,170
This cannot be
the only situation
628
00:32:30,180 --> 00:32:32,310
in which we can live.
629
00:32:32,310 --> 00:32:34,610
But how do you change minds
630
00:32:34,610 --> 00:32:37,350
without having an argument?
631
00:32:37,350 --> 00:32:40,320
Eran and his students
at Tel Aviv University
632
00:32:40,320 --> 00:32:42,390
had an idea.
633
00:32:42,390 --> 00:32:45,360
We are going to tell
people with extreme views,
634
00:32:45,360 --> 00:32:47,930
"You know what?
You're right."
635
00:32:47,930 --> 00:32:51,530
Eran calls it
"paradoxical thinking."
636
00:32:51,530 --> 00:32:53,700
It's like mental jujitsu...
637
00:32:53,700 --> 00:32:58,070
you use people's own
opinions against them.
638
00:32:58,070 --> 00:33:01,840
Let's go back to a rivalry
we already know.
639
00:33:01,840 --> 00:33:03,310
Say you have a friend
640
00:33:03,310 --> 00:33:05,940
who is a rabid fan
of the Rattlers.
641
00:33:05,950 --> 00:33:08,580
You could tell him
how awesome the Eagles are,
642
00:33:08,580 --> 00:33:10,610
but this will just
freeze his love
643
00:33:10,620 --> 00:33:12,820
of the Rattlers in place.
644
00:33:12,820 --> 00:33:16,150
However, if you use the
paradoxical thinking technique,
645
00:33:16,160 --> 00:33:19,520
you tell him how awesome
the Rattlers are,
646
00:33:19,530 --> 00:33:22,330
you tell him
it's the best team ever,
647
00:33:22,330 --> 00:33:24,830
better than family,
better than love,
648
00:33:24,830 --> 00:33:26,430
better than anything.
649
00:33:26,430 --> 00:33:30,230
Overwhelmed,
your friend might reconsider
650
00:33:30,240 --> 00:33:33,840
his love of his team.
651
00:33:33,840 --> 00:33:36,240
Eran wanted to test
his theory on Israel's
652
00:33:36,240 --> 00:33:38,440
most pressing issue...
653
00:33:38,440 --> 00:33:42,150
the Israeli-palestinian
conflict.
654
00:33:42,150 --> 00:33:44,680
So, he made some ads about it.
655
00:33:48,090 --> 00:33:51,560
It's meant to shock.
656
00:33:51,560 --> 00:33:53,360
The text reads...
657
00:34:01,270 --> 00:34:04,300
We took the core ideas
that Israelis believe in,
658
00:34:04,300 --> 00:34:07,440
and we practically
took them to the extreme.
659
00:34:07,440 --> 00:34:09,270
Israelis, by and large,
660
00:34:09,280 --> 00:34:12,580
believe their side
of the conflict is just.
661
00:34:12,580 --> 00:34:13,940
But the ads concluded
662
00:34:13,950 --> 00:34:17,250
that not only was
the conflict just,
663
00:34:17,250 --> 00:34:21,150
but Israelis needed
the conflict.
664
00:34:21,150 --> 00:34:23,050
For the sake of these ideas,
665
00:34:23,060 --> 00:34:24,660
to preserve these ideas,
666
00:34:24,660 --> 00:34:26,590
we have to preserve
the conflict.
667
00:34:26,590 --> 00:34:30,090
And this is absurd
in the eyes of most Israelis.
668
00:34:30,100 --> 00:34:32,560
Eran started
the experiment small,
669
00:34:32,560 --> 00:34:35,630
with a focus group
of conservative Israelis.
670
00:34:35,630 --> 00:34:38,370
While they were watching
these videos for the first time,
671
00:34:38,370 --> 00:34:41,040
Israelis got really,
really angry.
672
00:34:41,040 --> 00:34:42,410
But as time went by,
673
00:34:42,410 --> 00:34:44,710
and as they saw
these video clips again
674
00:34:44,710 --> 00:34:46,210
and again and again,
675
00:34:46,210 --> 00:34:51,180
they started what we call
"a process of unfreezing."
676
00:34:51,180 --> 00:34:52,620
When bombarded by views
677
00:34:52,620 --> 00:34:54,650
even more extreme
than their own,
678
00:34:54,650 --> 00:34:58,760
the test subjects started to
question their own positions.
679
00:34:58,760 --> 00:35:00,120
Even those who had said
680
00:35:00,130 --> 00:35:02,960
they would never compromise
with the Palestinians
681
00:35:02,960 --> 00:35:06,900
suddenly signaled
a willingness to talk.
682
00:35:06,900 --> 00:35:10,100
It seemed to work
with Eran's small sample.
683
00:35:10,100 --> 00:35:13,740
How would it work
in a real Israeli town?
684
00:35:13,740 --> 00:35:15,540
It is called Giv'at Shmuel,
685
00:35:15,540 --> 00:35:18,810
a city mainly dominated
by Israeli
686
00:35:18,810 --> 00:35:21,040
rightist and centrist people.
687
00:35:21,050 --> 00:35:23,880
And we tried to implement
this intervention
688
00:35:23,880 --> 00:35:27,650
on this entire city
of Giv'at Shmuel.
689
00:35:27,650 --> 00:35:29,890
Eran and the other researchers
690
00:35:29,890 --> 00:35:31,790
handed out fliers
on the streets,
691
00:35:31,790 --> 00:35:33,060
put up billboards,
692
00:35:33,060 --> 00:35:35,290
and targeted the video clips
to people
693
00:35:35,290 --> 00:35:38,430
in the neighborhood
who were online.
694
00:35:38,430 --> 00:35:40,360
What they discovered was
695
00:35:40,370 --> 00:35:44,370
a change beyond
their expectations.
696
00:35:44,370 --> 00:35:47,270
Again, in some cases
in the beginning,
697
00:35:47,270 --> 00:35:48,940
people got very angry,
698
00:35:48,940 --> 00:35:50,710
but then they discussed
these issues,
699
00:35:50,710 --> 00:35:52,040
when they talked about them,
700
00:35:52,040 --> 00:35:54,850
when they really exposed
themselves to these ideas,
701
00:35:54,850 --> 00:35:58,620
suddenly they started
reconsidering their positions.
702
00:35:58,620 --> 00:36:01,220
In fact,
the intervention worked best
703
00:36:01,220 --> 00:36:04,860
with people who had
the most hard-line views.
704
00:36:04,860 --> 00:36:07,760
One year later, Eran found
the shift in opinions
705
00:36:07,760 --> 00:36:11,190
in the test group
had held steady.
706
00:36:11,200 --> 00:36:14,060
We hope that by
exposing more and more people
707
00:36:14,070 --> 00:36:16,370
to this paradoxical
thinking intervention,
708
00:36:16,370 --> 00:36:20,040
we can help them consider
more seriously positive
709
00:36:20,040 --> 00:36:24,210
or peaceful solutions
to the conflict.
710
00:36:24,210 --> 00:36:26,310
But one scientist has another
711
00:36:26,310 --> 00:36:30,780
far more radical proposition
to reduce terrorism...
712
00:36:30,780 --> 00:36:33,320
complete disengagement.
713
00:36:40,590 --> 00:36:42,960
Our struggle against
terrorism feels like
714
00:36:42,960 --> 00:36:46,060
a war with no foreseeable end.
715
00:36:46,060 --> 00:36:49,400
We take out Osama bin Laden,
716
00:36:49,400 --> 00:36:52,270
and we find ourselves
fighting Isis.
717
00:36:52,270 --> 00:36:56,540
And a military victory
over Isis in Iraq or Syria
718
00:36:56,540 --> 00:37:00,510
won't end their attacks
on civilians around the world.
719
00:37:00,510 --> 00:37:02,810
For every terrorist
we eliminate,
720
00:37:02,820 --> 00:37:06,250
many more seem
to take their place.
721
00:37:06,250 --> 00:37:08,490
What can we do?
722
00:37:08,490 --> 00:37:12,690
Is there any way to stop
terrorism once and for all?
723
00:37:15,690 --> 00:37:18,160
Evolutionary
anthropologist Peter turchin
724
00:37:18,160 --> 00:37:21,900
has combed through the library
of our collective history
725
00:37:21,900 --> 00:37:25,140
to study the rise
and fall of nations.
726
00:37:25,140 --> 00:37:26,640
He rose to prominence
727
00:37:26,640 --> 00:37:30,270
by making one
stunning prediction.
728
00:37:30,280 --> 00:37:31,810
In my book "War and Peace and War,"
729
00:37:31,810 --> 00:37:34,040
which I published in 2005,
730
00:37:34,050 --> 00:37:36,180
two years
after the occupation of Iraq
731
00:37:36,180 --> 00:37:40,050
by the U.S. and allies, I wrote
the following prediction...
732
00:37:40,050 --> 00:37:41,950
"The Western intrusion
733
00:37:41,950 --> 00:37:44,590
will eventually generate
a counter-response,
734
00:37:44,590 --> 00:37:49,060
"possibly in the form
of a new theocratic caliphate."
735
00:37:51,700 --> 00:37:53,830
The U.S. went to Iraq in part
736
00:37:53,830 --> 00:37:56,200
to promote nation-building.
737
00:37:56,200 --> 00:37:57,600
According to Peter,
738
00:37:57,600 --> 00:38:01,510
that effort was
a stunning success.
739
00:38:01,510 --> 00:38:05,180
The name of that success
is "Isis,"
740
00:38:05,180 --> 00:38:09,510
an Islamic caliphate declared
in 2013,
741
00:38:09,520 --> 00:38:12,150
eight years after
Peter's prediction.
742
00:38:14,850 --> 00:38:19,090
What had Peter seen
that others had missed?
743
00:38:19,090 --> 00:38:22,290
Peter is not your
typical anthropologist.
744
00:38:22,290 --> 00:38:25,130
He takes a dim view
of most so-called
745
00:38:25,130 --> 00:38:27,060
"lessons" of history.
746
00:38:27,070 --> 00:38:29,500
One German historian counted
747
00:38:29,500 --> 00:38:32,100
how many explanations
people have proposed
748
00:38:32,100 --> 00:38:34,340
for the fall
of the Roman Empire,
749
00:38:34,340 --> 00:38:37,570
and he found more than 220.
750
00:38:37,580 --> 00:38:39,540
Peter wanted to find a way
751
00:38:39,550 --> 00:38:43,210
to put historical hypotheses
to a scientific test.
752
00:38:43,220 --> 00:38:45,920
He developed a new
mathematical approach
753
00:38:45,920 --> 00:38:49,520
to history called cliodynamics.
754
00:38:49,520 --> 00:38:52,090
I see it as a slayer
of theories.
755
00:38:52,090 --> 00:38:54,690
By the time you're done,
I want to have whole cemeteries
756
00:38:54,690 --> 00:38:57,690
of dead theories out there.
757
00:38:57,700 --> 00:38:59,200
In the early 2000s,
758
00:38:59,200 --> 00:39:01,260
he started to build
mathematical models
759
00:39:01,270 --> 00:39:03,400
to do just that.
760
00:39:03,400 --> 00:39:05,100
One theory he wanted to explore
761
00:39:05,100 --> 00:39:08,770
was what causes the rise
of strong nation-states.
762
00:39:08,770 --> 00:39:11,880
He had hypothesized
that strong nations form
763
00:39:11,880 --> 00:39:15,550
when two dramatically
different cultures go to war.
764
00:39:15,550 --> 00:39:18,050
He thought of these wars
as a global version
765
00:39:18,050 --> 00:39:22,050
of Darwin's survival
of the fittest.
766
00:39:22,050 --> 00:39:24,250
It's a little like
a backgammon tournament
767
00:39:24,260 --> 00:39:28,160
where players stand in
for nation-states.
768
00:39:28,160 --> 00:39:30,990
The conditions
of intense warfare create
769
00:39:31,000 --> 00:39:35,100
a constant struggle for survival
amongst armed groups.
770
00:39:38,340 --> 00:39:42,040
The group that wins is the one
which is the most cohesive,
771
00:39:42,040 --> 00:39:43,410
the most functional,
772
00:39:43,410 --> 00:39:45,980
and oftentimes,
the nastiest one.
773
00:39:52,150 --> 00:39:55,720
Peter set a team
to work collecting data points,
774
00:39:55,720 --> 00:39:58,960
over 100,000 in all...
775
00:39:58,960 --> 00:40:00,690
birth and death rates,
776
00:40:00,690 --> 00:40:04,030
whether they had iron weapons,
agriculture,
777
00:40:04,030 --> 00:40:07,700
anything that might measure
a nation's strength or weakness.
778
00:40:10,270 --> 00:40:13,400
He observed that the more
the groups fought,
779
00:40:13,410 --> 00:40:16,140
the stronger the winning
group became.
780
00:40:16,140 --> 00:40:19,040
Eventually, he developed
a mathematical formula
781
00:40:19,040 --> 00:40:24,410
that predicted the rise
and fall of the Roman Empire,
782
00:40:24,420 --> 00:40:26,920
the rise of the Ottoman Empire,
783
00:40:26,920 --> 00:40:28,420
and the historical growth
784
00:40:28,420 --> 00:40:31,190
of Islamic caliphates
in the Middle East.
785
00:40:31,190 --> 00:40:32,960
It seemed to work in the past.
786
00:40:32,960 --> 00:40:35,030
How would it work in the future?
787
00:40:35,030 --> 00:40:37,430
That's when Peter
turned his attention
788
00:40:37,430 --> 00:40:40,300
to the U.S. invasion of Iraq.
789
00:40:42,470 --> 00:40:47,440
So, after the U.S.
overthrew Saddam Hussein,
790
00:40:47,440 --> 00:40:51,470
this created a large
swath of territory,
791
00:40:51,480 --> 00:40:54,540
which was essentially
stateless with multiple
792
00:40:54,550 --> 00:40:58,520
armed groups battling
against each other.
793
00:40:58,520 --> 00:41:00,620
Eventually, the strongest,
794
00:41:00,620 --> 00:41:04,750
most ruthless group survived...
795
00:41:04,760 --> 00:41:07,190
Isis.
796
00:41:07,190 --> 00:41:10,360
If data can predict
the rise of Isis,
797
00:41:10,360 --> 00:41:12,500
can it tell us
how the Islamic State
798
00:41:12,500 --> 00:41:14,000
could be defeated?
799
00:41:14,000 --> 00:41:16,430
So, how should we deal
with the Islamic State?
800
00:41:16,440 --> 00:41:19,970
We have three options...
one of them is stay the course,
801
00:41:19,970 --> 00:41:22,940
which essentially means
continue using air power.
802
00:41:25,010 --> 00:41:28,550
Peter believes this
will only perpetuate warfare,
803
00:41:28,550 --> 00:41:29,810
the evolutionary pressure
804
00:41:29,820 --> 00:41:32,880
that gave rise to Isis
in the first place.
805
00:41:32,880 --> 00:41:35,290
The second option
is to escalate.
806
00:41:35,290 --> 00:41:39,620
This means we must be
more ruthless than our enemy,
807
00:41:39,620 --> 00:41:43,760
but given that this would cause
widespread civilian deaths,
808
00:41:43,760 --> 00:41:46,730
Peter believes
it's not an option.
809
00:41:46,730 --> 00:41:50,200
The opposite extreme
is to do nothing.
810
00:41:50,200 --> 00:41:53,540
In the face of a brutal
organization like Isis,
811
00:41:53,540 --> 00:41:57,010
doing nothing seems shocking.
812
00:41:57,010 --> 00:41:59,040
But Peter is convinced that
813
00:41:59,040 --> 00:42:01,640
if we completely turn
our backs on the region,
814
00:42:01,650 --> 00:42:05,420
the war that created
Isis will diminish.
815
00:42:05,420 --> 00:42:08,320
They'll be left like
a backgammon champion
816
00:42:08,320 --> 00:42:10,720
with no one to play against.
817
00:42:10,720 --> 00:42:13,690
Peter knows this course
will be painful.
818
00:42:13,690 --> 00:42:15,890
Isis and the horrors
they perpetrate
819
00:42:15,890 --> 00:42:18,260
will not go away in an instant.
820
00:42:18,260 --> 00:42:21,400
We'll have to sit by
and let it happen.
821
00:42:21,400 --> 00:42:24,070
But Peter points to his data.
822
00:42:24,070 --> 00:42:28,470
This is the best option
in terms of saving most lives.
823
00:42:28,470 --> 00:42:32,440
Essentially, it means
closing the board
824
00:42:32,440 --> 00:42:33,680
and going home.
825
00:42:36,050 --> 00:42:39,080
In this era of terrorism,
826
00:42:39,080 --> 00:42:42,150
we may not like our options.
827
00:42:42,150 --> 00:42:46,420
But disagreements and debate
are what make us free.
828
00:42:46,420 --> 00:42:50,890
Today, the tools of science
offer us new approaches.
829
00:42:50,900 --> 00:42:55,100
The war against terror
is a war of ideas.
830
00:42:55,100 --> 00:42:59,640
As terrorists seek
to impose their rigid ideas,
831
00:42:59,640 --> 00:43:04,610
our greatest weapon
is our openness to new ideas.