1
00:00:02,002 --> 00:00:04,303
BEN: Scientists design machines that think for themselves...
2
00:00:04,338 --> 00:00:06,371
I guess I have an aversion
to the word "drone".
3
00:00:06,407 --> 00:00:09,141
It implies something mindless.
4
00:00:09,176 --> 00:00:11,777
...with the potential to become weaponized...
5
00:00:11,812 --> 00:00:13,312
(Popping)
6
00:00:13,347 --> 00:00:16,815
The military would be crazy
not to take these technologies
7
00:00:16,850 --> 00:00:19,484
and utilize them in
ways that would help us
8
00:00:19,520 --> 00:00:21,553
perform better on
the battlefield.
9
00:00:21,589 --> 00:00:22,955
...or even hacked.
10
00:00:24,592 --> 00:00:27,492
There will always
be vulnerabilities.
11
00:00:27,528 --> 00:00:29,995
You can't design them
out of your system
12
00:00:30,030 --> 00:00:33,966
just with a sheer force
of will and intelligence.
13
00:00:34,001 --> 00:00:36,702
But how will it change the way we fight wars?
14
00:00:36,737 --> 00:00:41,640
It is a fantasy to think that
you can delegate the decision
15
00:00:41,675 --> 00:00:44,309
over life and death to a robot.
16
00:00:46,313 --> 00:00:56,321
♪
17
00:01:05,699 --> 00:01:07,833
50 years ago, it was pure science fiction to imagine
18
00:01:07,868 --> 00:01:10,202
a U.S. soldier sitting in a bunker in Nevada
19
00:01:10,237 --> 00:01:13,338
could remotely pilot a plane flying thousands of miles away,
20
00:01:13,374 --> 00:01:17,042
and then drop a Hellfire missile on an enemy target.
21
00:01:17,077 --> 00:01:19,611
But this is reality in 2016.
22
00:01:19,647 --> 00:01:23,348
Today, unmanned aerial vehicles, or UAVs,
23
00:01:23,384 --> 00:01:25,183
are a vital part of warfare.
24
00:01:25,219 --> 00:01:27,319
They're in the stockpiles of more than two dozen nations,
25
00:01:27,354 --> 00:01:30,155
and target and kill thousands around the world.
26
00:01:30,190 --> 00:01:32,124
The next generation of drone technology
27
00:01:32,159 --> 00:01:34,026
is even more sophisticated.
28
00:01:34,061 --> 00:01:36,395
I found it here, in this idyllic little field
29
00:01:36,430 --> 00:01:38,730
just outside of Budapest, Hungary.
30
00:01:38,766 --> 00:01:41,400
When you look at a
flock of starlings in the sky,
31
00:01:41,435 --> 00:01:44,169
their motion is based
on very simple rules.
32
00:01:44,204 --> 00:01:47,372
The whole pattern, their
motion is some kind of
33
00:01:47,408 --> 00:01:50,242
meta level intelligence or
collective intelligence,
34
00:01:50,277 --> 00:01:53,412
and the same kind of thing
applies to drones as well.
35
00:01:53,447 --> 00:01:54,746
When I see the
whole flock flying,
36
00:01:54,782 --> 00:01:57,382
it's always magical for
me after so many years.
37
00:01:57,418 --> 00:01:59,918
This is Gabor Vasarhelyi.
38
00:01:59,953 --> 00:02:02,187
He's part of a team developing technology they hope will be
39
00:02:02,222 --> 00:02:04,656
used in agriculture or search and rescue.
40
00:02:04,692 --> 00:02:06,758
They're working on drone swarms,
41
00:02:06,794 --> 00:02:09,494
which is to say a flock of autonomous drones in the sky,
42
00:02:09,530 --> 00:02:12,631
moving and making decisions as a posse of like-minded robots.
43
00:02:12,666 --> 00:02:14,099
(Beeping noise)
44
00:02:18,038 --> 00:02:19,838
Unlike the typical drones used today, there's no
45
00:02:19,873 --> 00:02:22,174
human operator directing their movement individually.
46
00:02:22,209 --> 00:02:23,542
(Buzzing)
47
00:02:23,577 --> 00:02:25,510
Oh, now you're hearing them
kinda creep up, eh?
48
00:02:25,546 --> 00:02:27,112
(Buzzing)
49
00:02:27,147 --> 00:02:28,747
These machines are given a set of instructions...
50
00:02:28,782 --> 00:02:30,582
BEN: Holy ----.
51
00:02:30,617 --> 00:02:32,751
...then they figure out how to execute them as a team,
52
00:02:32,786 --> 00:02:34,219
solving problems in real time.
53
00:02:35,923 --> 00:02:37,756
(Shrugging)
54
00:02:37,791 --> 00:02:39,624
That beacon in my hands makes the drones track
55
00:02:39,660 --> 00:02:41,860
whoever's holding it, wherever they go.
56
00:02:43,897 --> 00:02:45,464
Notice how they coordinate and keep their distance
57
00:02:45,499 --> 00:02:47,199
from one another automatically.
58
00:02:49,236 --> 00:02:50,669
Ah!
59
00:02:54,608 --> 00:02:56,441
How are they communicating
with one another?
60
00:02:56,477 --> 00:03:00,212
The drones have a local
communication network.
61
00:03:00,247 --> 00:03:04,116
Every single drone just sends
out some info about itself,
62
00:03:04,151 --> 00:03:07,119
and all the drones that are
close enough to receive this,
63
00:03:07,154 --> 00:03:09,554
they receive and
integrate into the decisions.
64
00:03:09,590 --> 00:03:11,123
Why do you think the
military's interested
65
00:03:11,158 --> 00:03:12,524
in this kind of technology?
66
00:03:12,559 --> 00:03:15,093
If you have one big aircraft
and someone shoots it,
67
00:03:15,129 --> 00:03:19,498
then it's a lot of damage and
then your whole man is gone.
68
00:03:19,533 --> 00:03:23,268
When you have 100 drones instead
and someone shoots at it,
69
00:03:23,303 --> 00:03:26,471
then one drone is taken away
and the rest can do the same.
70
00:03:26,507 --> 00:03:28,340
That's how mosquitos
work, for example.
71
00:03:28,375 --> 00:03:30,308
Like there are many small
mosquitos, everybody is just
72
00:03:30,344 --> 00:03:33,078
sipping a little bit of blood,
but the whole species goes on.
73
00:03:34,348 --> 00:03:36,415
Air Forces of the future may be stoked on the idea
74
00:03:36,450 --> 00:03:39,151
of drone swarms, but what about the navies?
75
00:03:39,186 --> 00:03:42,154
Here at a NATO research facility in Italy, scientists are
76
00:03:42,189 --> 00:03:44,990
essentially developing autonomous unmanned subs.
77
00:03:45,025 --> 00:03:47,626
But signals travel more slowly underwater,
78
00:03:47,661 --> 00:03:50,695
and that makes it harder to create a submarine swarm.
79
00:03:50,731 --> 00:03:53,765
One of the things
we're trying to do is
80
00:03:53,801 --> 00:03:56,168
create the internet
for underwater robots.
81
00:03:56,203 --> 00:03:58,203
So we have the internet on land,
82
00:03:58,238 --> 00:04:01,039
and now we're getting
the Internet of Things.
83
00:04:01,074 --> 00:04:05,143
Well, we're doing Internet of
Things for underwater things.
84
00:04:05,179 --> 00:04:07,179
So, how is the
internet underwater?
85
00:04:07,214 --> 00:04:08,613
It sucks!
86
00:04:08,649 --> 00:04:10,282
(Laughing)
87
00:04:10,317 --> 00:04:13,285
John Potter is the scientist in charge of strategic development.
88
00:04:13,320 --> 00:04:14,886
He hopes that the drones they're developing here
89
00:04:14,922 --> 00:04:16,655
will be used to map the oceans.
90
00:04:16,690 --> 00:04:19,024
So when it comes
to autonomous...
91
00:04:19,059 --> 00:04:22,227
or when it comes to, let's
say, drones in the water,
92
00:04:22,262 --> 00:04:23,795
what is coming over the horizon?
93
00:04:23,831 --> 00:04:26,398
I guess I have an aversion
to the word "drone".
94
00:04:26,433 --> 00:04:29,334
It implies something mindless.
95
00:04:29,369 --> 00:04:31,203
From the outset,
there was never an option
96
00:04:31,238 --> 00:04:35,373
to go the "drone" route,
of having a dumb device.
97
00:04:35,409 --> 00:04:37,576
Because once you put it
in the sea, and it dives
98
00:04:37,611 --> 00:04:39,878
and it's gone more than
100 metres away from you,
99
00:04:39,913 --> 00:04:42,314
you don't know what it's up to.
100
00:04:42,349 --> 00:04:43,882
And if you ever want
to see it come back...
101
00:04:43,917 --> 00:04:45,417
(Laughing)
102
00:04:45,452 --> 00:04:47,052
...you want to have something
a little smarter than that.
103
00:04:47,087 --> 00:04:48,854
So what would you
call them, unmanned...?
104
00:04:48,889 --> 00:04:52,724
So these vehicles are a
step forward in autonomy
105
00:04:52,759 --> 00:04:54,392
with respect to
what's currently...
106
00:04:54,428 --> 00:04:56,695
typically in operation
at the moment.
107
00:04:56,730 --> 00:05:01,066
They're able to adaptively
change what they are doing.
108
00:05:01,101 --> 00:05:04,035
So you can put vehicles in
and have them work as a team.
109
00:05:04,071 --> 00:05:07,339
And if one vehicle sees
something and it knows it has
110
00:05:07,374 --> 00:05:09,708
a team member with maybe
a high resolution camera
111
00:05:09,743 --> 00:05:12,511
or imaging system that's
just over here somewhere,
112
00:05:12,546 --> 00:05:15,113
and it knows how far away
it is, it says, "Hey, Bill.
113
00:05:15,148 --> 00:05:17,382
Come and take a look at this,
I think I found a so-and-so.
114
00:05:17,417 --> 00:05:19,518
You wanna check that out,
see what you think of it?"
115
00:05:19,553 --> 00:05:21,286
And I guess that's one of the
other things people are going to
116
00:05:21,321 --> 00:05:23,722
wonder, is like... is something
like an underwater system
117
00:05:23,757 --> 00:05:25,957
like that, could that be
weaponized at some point?
118
00:05:25,993 --> 00:05:28,793
That's not something we
are really working on here.
119
00:05:28,829 --> 00:05:33,098
How these autonomous abilities
eventually translate into
120
00:05:33,133 --> 00:05:37,302
operational systems depends
on lots of other folk
121
00:05:37,337 --> 00:05:39,938
and organizations within
NATO and elsewhere.
122
00:05:39,973 --> 00:05:42,040
It is something
that's being considered.
123
00:05:42,075 --> 00:05:46,011
But I would like to see all the
stakeholders - which basically
124
00:05:46,046 --> 00:05:50,749
means all of society - become
informed, well informed,
125
00:05:50,784 --> 00:05:52,784
and think carefully about this.
126
00:05:52,819 --> 00:05:55,720
What is it that we actually
want from autonomous systems?
127
00:05:58,525 --> 00:05:59,958
And that really hit me.
128
00:05:59,993 --> 00:06:02,060
The people developing these systems aren't the ones
129
00:06:02,095 --> 00:06:04,563
who will decide how they're used in military conflict.
130
00:06:06,800 --> 00:06:08,099
But whether or not you're comfortable
131
00:06:08,135 --> 00:06:10,402
with autonomous drone swarms, they're coming.
132
00:06:10,437 --> 00:06:12,737
I'm in Portugal, heading out to a NATO research ship
133
00:06:12,773 --> 00:06:14,773
in the Atlantic to get a first-hand view
134
00:06:14,808 --> 00:06:16,408
of underwater drone testing.
135
00:06:17,811 --> 00:06:20,478
I'm gonna have to climb
that puppy right there,
136
00:06:20,514 --> 00:06:22,147
like Blackbeard or something.
137
00:06:23,383 --> 00:06:24,950
(Grunting)
138
00:06:26,320 --> 00:06:32,791
♪
139
00:06:32,826 --> 00:06:35,860
This is more or less our
control centre where we keep
140
00:06:35,896 --> 00:06:38,997
the picture of everything that
we have deployed at sea.
141
00:06:39,032 --> 00:06:41,299
This here is the
position of the ship,
142
00:06:41,335 --> 00:06:42,734
this is where we are now.
143
00:06:42,769 --> 00:06:46,037
You'll see here these green
points is our field of
144
00:06:46,073 --> 00:06:48,340
deployed assets for the
current tests we're doing.
145
00:06:48,375 --> 00:06:50,041
This is basically
your ocean lab.
146
00:06:50,077 --> 00:06:51,643
It's our ocean lab, exactly.
147
00:06:51,678 --> 00:06:53,345
That's a good way to put it,
it's our ocean lab.
148
00:06:54,381 --> 00:06:57,682
Joao Alves is the Coordinator for Underwater Communication.
149
00:06:57,718 --> 00:07:00,251
He and his crew spend weeks at sea experimenting with things
150
00:07:00,287 --> 00:07:04,623
like submarine communication and anti-sub warfare tech.
151
00:07:04,658 --> 00:07:05,991
"Time Bandit."
152
00:07:06,026 --> 00:07:07,792
So these are the
anti-submarine warfare ones?
153
00:07:07,828 --> 00:07:10,662
These are the ones that we
employ for our multistatic
154
00:07:10,697 --> 00:07:13,031
anti-submarine warfare
missions, yes indeed.
155
00:07:13,066 --> 00:07:15,467
Now I know these machines
right now are just...
156
00:07:15,502 --> 00:07:18,036
they're doing nothing more than
reconnaissance or surveillance
157
00:07:18,071 --> 00:07:21,239
or even just mapping, or
anti-submarine warfare
158
00:07:21,274 --> 00:07:25,343
in terms of detecting
different possible enemy craft,
159
00:07:25,379 --> 00:07:26,645
but do you think at any point
160
00:07:26,680 --> 00:07:28,713
these things could
be weaponized?
161
00:07:28,749 --> 00:07:30,181
Do you think
that's a possibility?
162
00:07:30,217 --> 00:07:32,350
We, as a science-based
research centre,
163
00:07:32,386 --> 00:07:35,053
are interested in
developing the autonomy,
164
00:07:35,088 --> 00:07:36,655
the capabilities
of these machines.
165
00:07:36,690 --> 00:07:40,325
Then, I mean, it's totally out
of our scope, the usage of...
166
00:07:40,360 --> 00:07:43,728
We are very excited on
the examples you just gave,
167
00:07:43,764 --> 00:07:47,032
on the improving
the reconnaissance,
168
00:07:47,067 --> 00:07:51,436
improving the mapping
capabilities...
169
00:07:51,471 --> 00:07:53,705
other than that, I mean, we...
170
00:07:53,740 --> 00:07:54,839
You're not touching the rest?
171
00:07:54,875 --> 00:07:55,907
No, not at all.
172
00:07:55,942 --> 00:07:57,509
I mean, not even our interest.
173
00:07:59,046 --> 00:08:01,746
Joao says he's not interested in weaponizing these drones.
174
00:08:01,782 --> 00:08:04,516
But like other people working on the same autonomous machines,
175
00:08:04,551 --> 00:08:07,052
he doesn't decide how they're ultimately used.
176
00:08:07,087 --> 00:08:09,387
That's up to the military.
177
00:08:14,695 --> 00:08:16,594
BEN: Unlike the carpet bombings of the past,
178
00:08:16,630 --> 00:08:18,630
drone strikes are supposed to be precise,
179
00:08:18,665 --> 00:08:20,598
limiting civilian casualties.
180
00:08:20,634 --> 00:08:22,801
That said, it's tough to know exactly how many people
181
00:08:22,836 --> 00:08:25,570
around the world were killed by drones in the past decade.
182
00:08:25,605 --> 00:08:27,238
Estimates range from the thousands
183
00:08:27,274 --> 00:08:29,841
to the tens of thousands, and it's even harder to figure out
184
00:08:29,876 --> 00:08:32,143
how many of those killed or injured were civilians.
185
00:08:32,179 --> 00:08:35,480
♪
186
00:08:35,515 --> 00:08:38,249
But it's not just the numbers that alarm some critics.
187
00:08:38,285 --> 00:08:40,919
Have you seen a US
president go on TV to say,
188
00:08:40,954 --> 00:08:43,688
"Tonight, I ordered
airstrikes in Libya"?
189
00:08:43,724 --> 00:08:45,690
Or Pakistan or Yemen?
190
00:08:45,726 --> 00:08:47,692
They don't do that anymore.
191
00:08:47,728 --> 00:08:50,095
Because something about
drone technology and other
192
00:08:50,130 --> 00:08:55,133
weapons technology has enabled
US presidents and politicians
193
00:08:55,168 --> 00:08:57,769
to basically shift
what the norm is.
194
00:08:57,804 --> 00:09:01,139
Naureen Shah is the director of Amnesty International USA's
195
00:09:01,174 --> 00:09:03,341
Security and Human Rights Program.
196
00:09:03,376 --> 00:09:05,043
She's a vocal opponent of the US military's
197
00:09:05,078 --> 00:09:06,878
targeted killing campaigns,
198
00:09:06,913 --> 00:09:09,013
and worries about the autonomous technologies in development.
199
00:09:10,383 --> 00:09:13,885
So the idea that this could
reduce civilian casualties
200
00:09:13,920 --> 00:09:16,154
to you is impossible?
201
00:09:17,324 --> 00:09:19,657
Our concern is it actually would
increase civilian casualties,
202
00:09:19,693 --> 00:09:21,659
increase the risk of
civilian casualties.
203
00:09:21,695 --> 00:09:24,896
I of course agree that if we can
keep people out of harm's way,
204
00:09:24,931 --> 00:09:27,565
that's vital.
205
00:09:27,601 --> 00:09:30,468
But if you don't have
governments having to weigh
206
00:09:30,504 --> 00:09:34,739
the costs to their own citizens,
when they decide to go to war,
207
00:09:34,775 --> 00:09:36,474
then you're really making
it so that their stakes
208
00:09:36,510 --> 00:09:38,576
are a lot lower.
209
00:09:38,612 --> 00:09:41,813
And that could mean that
the US and other governments
210
00:09:41,848 --> 00:09:44,549
are just engaging a lot more in
warfare than they did before,
211
00:09:44,584 --> 00:09:47,252
just not calling it warfare,
and we already see that.
212
00:09:47,287 --> 00:09:50,722
The US is using lethal force
right now in Libya, Syria,
213
00:09:50,757 --> 00:09:55,360
Iraq, Afghanistan,
Yemen, Pakistan, Somalia.
214
00:09:55,395 --> 00:09:58,496
There's something kind of
invisible about the use of
215
00:09:58,532 --> 00:10:00,765
autonomous weapons, or robots.
216
00:10:00,801 --> 00:10:03,668
Something that enables
policy makers to think, well,
217
00:10:03,703 --> 00:10:06,704
we could go... we could use
lethal force in a surgical way,
218
00:10:06,740 --> 00:10:08,506
in a limited way, and we
wouldn't even have to have
219
00:10:08,542 --> 00:10:10,008
a big public debate about it.
220
00:10:10,043 --> 00:10:13,144
We're supposedly
just at war all the time.
221
00:10:13,180 --> 00:10:16,848
But at what point is this
technology just inevitable?
222
00:10:19,386 --> 00:10:22,587
One of the things
that is so difficult
223
00:10:22,622 --> 00:10:25,924
is that technology is
fascinating for all of us.
224
00:10:25,959 --> 00:10:30,094
And it's so fascinating
that it creates
225
00:10:30,130 --> 00:10:31,796
almost a kind of glee.
226
00:10:31,832 --> 00:10:34,232
It's a fixation that
our generation has
227
00:10:34,267 --> 00:10:37,202
on the possibilities
provided by technology.
228
00:10:37,237 --> 00:10:39,204
It can make us better
than our own selves,
229
00:10:39,239 --> 00:10:44,209
that somehow a robot is more
selfless than a human being is,
230
00:10:44,244 --> 00:10:46,544
doesn't have the prejudices
of a human being, and that
231
00:10:46,580 --> 00:10:50,748
can solve the fundamental
gruesomeness of war.
232
00:10:50,784 --> 00:10:55,753
It is a fantasy to think that
you can delegate the decision
233
00:10:55,789 --> 00:11:00,291
over life and death to a robot
and somehow that inherently
234
00:11:00,327 --> 00:11:03,761
makes it more humane and
more precise and more lawful.
235
00:11:03,797 --> 00:11:06,764
Because there's nothing
that would give us any reason
236
00:11:06,800 --> 00:11:09,300
to believe that a robot
has human empathy,
237
00:11:09,336 --> 00:11:12,403
that it has the ability to
make a judgment about who is
238
00:11:12,439 --> 00:11:15,607
a civilian and who is not a
civilian, and whether or not in
239
00:11:15,642 --> 00:11:18,743
those particular circumstances a
civilian really poses a threat.
240
00:11:20,714 --> 00:11:23,448
But some scientists completely disagree with Naureen,
241
00:11:23,483 --> 00:11:25,850
and think that it is in fact possible to program a robot
242
00:11:25,886 --> 00:11:28,453
to act in a more humane, precise and lawful way
243
00:11:28,488 --> 00:11:30,288
than a human soldier.
244
00:11:30,323 --> 00:11:33,258
And here on the Georgia Tech campus, they tried to prove it.
245
00:11:33,293 --> 00:11:38,696
♪
246
00:11:38,732 --> 00:11:42,267
So if you did something bad,
and you felt guilty about it,
247
00:11:42,302 --> 00:11:44,369
you would be less
likely to do it again.
248
00:11:44,404 --> 00:11:47,171
We want the robot to
experience... to behave in...
249
00:11:47,207 --> 00:11:48,673
A similar way, right.
250
00:11:48,708 --> 00:11:51,709
The robots... None of these
robots feel emotions, okay?
251
00:11:51,745 --> 00:11:52,877
They don't feel anything.
252
00:11:52,913 --> 00:11:54,279
They're not sentient.
253
00:11:54,314 --> 00:11:56,681
But people can perceive
them as feeling things.
254
00:11:58,551 --> 00:12:00,818
Ron Arkin is a roboticist and professor.
255
00:12:00,854 --> 00:12:02,320
He's developing technology
256
00:12:02,355 --> 00:12:04,555
designed to program ethics into robots.
257
00:12:04,591 --> 00:12:07,992
There are issues associated
with human war fighters
258
00:12:08,028 --> 00:12:12,230
who occasionally are
careless, make mistakes,
259
00:12:12,265 --> 00:12:14,732
and in some cases
commit atrocities.
260
00:12:14,768 --> 00:12:18,202
So you really think that
you can program a robot to be
261
00:12:18,238 --> 00:12:22,373
a better and more efficient
killer than a human being,
262
00:12:22,409 --> 00:12:25,977
and without making the same kind
of mess ups that we might see,
263
00:12:26,012 --> 00:12:29,681
say in friendly fire
or civilian casualties?
264
00:12:29,716 --> 00:12:31,516
I'm not interested in making
them better and more efficient
265
00:12:31,551 --> 00:12:34,852
killers; I'm interested in
making them better protectors
266
00:12:34,888 --> 00:12:38,256
of non-combatants and better
protectors of civilians while
267
00:12:38,291 --> 00:12:41,592
they are conducting missions
than human war fighters are.
268
00:12:41,628 --> 00:12:45,363
So my goal is to make them
better adhere to international
269
00:12:45,398 --> 00:12:48,399
humanitarian law as embodied
in the Geneva conventions
270
00:12:48,435 --> 00:12:49,834
and the rules of engagement.
271
00:12:49,869 --> 00:12:52,036
You cannot shoot
in a no-kill zone,
272
00:12:52,072 --> 00:12:55,440
you cannot shoot people
that have surrendered,
273
00:12:55,475 --> 00:12:58,509
you cannot carry out
some summary executions.
274
00:12:58,545 --> 00:13:00,044
These systems should
have the right to
275
00:13:00,080 --> 00:13:01,713
refuse an order as well, too.
276
00:13:01,748 --> 00:13:02,947
- Really?
- Yeah.
277
00:13:02,983 --> 00:13:04,749
This is not just a
decision of when to fire,
278
00:13:04,784 --> 00:13:06,551
it's also a decision
of when not to fire.
279
00:13:06,586 --> 00:13:11,356
So that if someone tells it
to attack something that is
280
00:13:11,391 --> 00:13:13,291
against international
humanitarian law,
281
00:13:13,326 --> 00:13:15,193
and its programming tells
it that this is against
282
00:13:15,228 --> 00:13:17,595
international humanitarian
law, it should not engage
283
00:13:17,630 --> 00:13:19,097
that particular target.
284
00:13:19,132 --> 00:13:20,565
So will these weapon systems,
285
00:13:20,600 --> 00:13:23,301
could they be like the
Tesla self-driving car?
286
00:13:23,336 --> 00:13:26,604
You know, you won't get
into as many accidents.
287
00:13:26,639 --> 00:13:29,040
Yes, and that's the argument
that self-driving cars use.
288
00:13:29,075 --> 00:13:30,541
It's the same sort of thing.
289
00:13:30,577 --> 00:13:32,477
They say that human beings
are the most dangerous things
290
00:13:32,512 --> 00:13:35,480
on the road because
we get angry, we drink,
291
00:13:35,515 --> 00:13:38,149
we are distracted.
292
00:13:38,184 --> 00:13:40,885
It'd be better to have
the robots driving us there.
293
00:13:40,920 --> 00:13:42,387
Look at 9/11.
294
00:13:42,422 --> 00:13:44,288
There was no reason an
aircraft should've crashed
295
00:13:44,324 --> 00:13:45,790
into those buildings.
296
00:13:45,825 --> 00:13:49,160
It's an easy task for a
control system, now and then,
297
00:13:49,195 --> 00:13:51,629
to be able to change its
altitude when it recognizes
298
00:13:51,664 --> 00:13:55,099
it's in a collision course and
avoid that particular object.
299
00:13:55,135 --> 00:13:56,801
We chose not to do that.
300
00:13:56,836 --> 00:14:01,105
We choose to trust human beings
over and over and over again,
301
00:14:01,141 --> 00:14:03,241
but that's not
always the best solution.
302
00:14:08,782 --> 00:14:10,214
BEN: If there's one thing I've learned over the last few years,
303
00:14:10,250 --> 00:14:12,316
it's that it's possible to hack virtually everything
304
00:14:12,352 --> 00:14:16,054
running on code - from a nuclear enrichment facility to an SUV.
305
00:14:16,089 --> 00:14:19,190
But what about the drone swarms of the future?
306
00:14:19,225 --> 00:14:21,726
Here in Texas, there's a teamof researchers that demonstrated
307
00:14:21,761 --> 00:14:25,029
how military drones used today are hackable.
308
00:14:25,065 --> 00:14:28,066
Todd Humphreys is the director of the Radionavigation Lab.
309
00:14:29,135 --> 00:14:30,668
So this is our arena.
310
00:14:30,703 --> 00:14:32,170
BEN: Nets for the drones?
311
00:14:32,205 --> 00:14:33,905
TODD: The nets are for the FAA.
312
00:14:33,940 --> 00:14:35,940
So there's a
future, you think,
313
00:14:35,975 --> 00:14:38,076
wherein you could see
a ton of these things?
314
00:14:38,111 --> 00:14:42,046
Dinner table-sized drones
possibly, like in a swarm?
315
00:14:42,082 --> 00:14:43,247
Oh yeah.
316
00:14:43,283 --> 00:14:44,916
- Attacking a target?
- That's right.
317
00:14:44,951 --> 00:14:48,753
And they have no regard
for their own life, right?
318
00:14:48,788 --> 00:14:50,421
So these are suicide drones.
319
00:14:50,457 --> 00:14:53,558
They pick their target,
they go directly at it,
320
00:14:53,593 --> 00:14:57,562
and the kinds of close-in
weapons systems and...
321
00:14:57,597 --> 00:15:00,264
large-scale weapon systems
that our US destroyers
322
00:15:00,300 --> 00:15:02,867
and the Navy have
today or other ships,
323
00:15:02,902 --> 00:15:05,403
they'll be no match against
16 of these at once.
324
00:15:05,438 --> 00:15:07,872
Right, and it all
starts here in Austin.
325
00:15:07,907 --> 00:15:12,110
Well, we don't intend
to do development
326
00:15:12,145 --> 00:15:13,544
of war machines here.
327
00:15:13,580 --> 00:15:16,280
But I will say that our
somewhat whimsical games
328
00:15:16,316 --> 00:15:19,550
that we'll be playing,
they are going to engage
329
00:15:19,586 --> 00:15:22,153
our operators and
our drones in scenarios
330
00:15:22,188 --> 00:15:25,289
that are applicable
to all sorts of fields.
331
00:15:25,325 --> 00:15:28,226
One of those so-called whimsical games he and his students
332
00:15:28,261 --> 00:15:31,629
showed me was a drone version of Capture the Flag.
333
00:15:32,632 --> 00:15:34,298
(Popping)
334
00:15:34,334 --> 00:15:36,501
But that's not all they've been up to.
335
00:15:36,536 --> 00:15:38,569
Back in 2011, when the Iranians claimed to have hacked
336
00:15:38,605 --> 00:15:41,239
a US military drone, Todd and his team proved that
337
00:15:41,274 --> 00:15:43,641
it was indeed possible, and demonstrated how
338
00:15:43,676 --> 00:15:45,977
a simple method called spoofing could have been used
339
00:15:46,012 --> 00:15:47,745
to jack US military hardware.
340
00:15:50,150 --> 00:15:52,517
Every drone has
a few vital links.
341
00:15:52,552 --> 00:15:55,153
One of them is to
its ground controller,
342
00:15:55,188 --> 00:15:58,689
and one of them is to overhead
satellites for navigation.
343
00:15:58,725 --> 00:16:01,092
Spoofing attacks one
of those vital links.
344
00:16:01,127 --> 00:16:05,263
It basically falsifies a GPS
signal, makes a forged signal,
345
00:16:05,298 --> 00:16:09,767
sends it over to the drone
and convinces the drone that
346
00:16:09,802 --> 00:16:12,537
it's in a different place or
at a different time, you know,
347
00:16:12,572 --> 00:16:15,606
because GPS gives us
both time and position.
348
00:16:15,642 --> 00:16:18,042
And you've actually proven
this is possible?
349
00:16:18,077 --> 00:16:19,810
We've demonstrated it, yeah.
350
00:16:19,846 --> 00:16:22,013
So we've done this,
we've done it with a drone,
351
00:16:22,048 --> 00:16:24,982
we've done it with
a 210-foot super yacht.
352
00:16:25,018 --> 00:16:27,118
So this is back in
2011 it was possible.
353
00:16:27,153 --> 00:16:28,519
- Yeah.
- How about now?
354
00:16:28,555 --> 00:16:30,221
Have drones kind of
caught up to this?
355
00:16:30,256 --> 00:16:32,790
Have engineers realized
that, you know,
356
00:16:32,825 --> 00:16:34,292
when you put this
autonomous thing in the air,
357
00:16:34,327 --> 00:16:37,628
essentially it can be
overtaken by a hostile actor?
358
00:16:37,664 --> 00:16:40,965
I'd like to be able
to say yes, but no.
359
00:16:41,000 --> 00:16:46,404
The FAA has... has charged a
tiger team with looking into this,
360
00:16:46,439 --> 00:16:48,673
and they came back after
two years of study and have
361
00:16:48,708 --> 00:16:52,143
put together a set of
proposals that would make
362
00:16:52,178 --> 00:16:54,979
commercial airliners
more resilient to spoofing,
363
00:16:55,014 --> 00:16:57,648
more... have better
defenses against spoofing.
364
00:16:57,684 --> 00:17:02,486
They've also looked at even
smaller unmanned aircraft.
365
00:17:02,522 --> 00:17:05,590
But things move slowly in the
world of commercial airliners.
366
00:17:05,625 --> 00:17:10,194
And as far as smaller unmanned
aircraft, I think those of us
367
00:17:10,230 --> 00:17:13,497
who are just playing around
with the small toys and such
368
00:17:13,533 --> 00:17:16,334
aren't really thinking so much
about security at this point.
369
00:17:16,369 --> 00:17:19,437
So we know that hacking
drones is possible, and yet
370
00:17:19,472 --> 00:17:22,540
people are starting to think
about creating drone swarms.
371
00:17:22,575 --> 00:17:23,774
Mm-hmm.
372
00:17:23,810 --> 00:17:26,210
What if your drone
swarm... you send one off
373
00:17:26,246 --> 00:17:29,347
and you're thinking you're
gonna destroy your enemy,
374
00:17:29,382 --> 00:17:31,115
and all of a sudden it
turns right back around
375
00:17:31,150 --> 00:17:32,950
and it comes at you
because it's been hacked?
376
00:17:32,986 --> 00:17:34,385
Right.
377
00:17:34,420 --> 00:17:36,220
I mean, do you have to prepare
it to be able to destroy
378
00:17:36,256 --> 00:17:38,456
your own drone swarm?
379
00:17:38,491 --> 00:17:40,024
Absolutely.
380
00:17:40,059 --> 00:17:43,694
I believe you have to have a
kill switch for your own swarm,
381
00:17:43,730 --> 00:17:46,697
your own resources,
your assets, and that
382
00:17:46,733 --> 00:17:49,133
that must be an
ironclad kill switch.
383
00:17:49,168 --> 00:17:53,070
The only way to think about
security is as an arms race.
384
00:17:53,106 --> 00:17:57,141
There will always
be vulnerabilities,
385
00:17:57,176 --> 00:18:02,046
and you can't somehow design
them out of your system
386
00:18:02,081 --> 00:18:06,217
just with sheer force
of will and intelligence.
387
00:18:06,252 --> 00:18:10,087
♪
388
00:18:12,692 --> 00:18:14,258
BEN: I'm at the Pentagon in Washington
389
00:18:14,294 --> 00:18:16,127
to meet with Robert Work.
390
00:18:16,162 --> 00:18:18,696
He's basically the number two at the Department of Defense,
391
00:18:18,731 --> 00:18:21,999
which is why he gets the Blackhawk helicopter treatment.
392
00:18:22,035 --> 00:18:24,835
We're heading out to the US Army Research Lab
393
00:18:24,871 --> 00:18:27,138
at the Aberdeen Proving Ground in Maryland.
394
00:18:27,173 --> 00:18:30,341
Work is leading the Third Offset Strategy, an initiative
395
00:18:30,376 --> 00:18:32,677
aimed at building up the military's tech capabilities
396
00:18:32,712 --> 00:18:36,814
to counter big-time adversaries like Russia or China.
397
00:18:36,849 --> 00:18:39,850
The strategy calls for the US military to focus on robotics,
398
00:18:39,886 --> 00:18:43,888
miniaturization, 3D printing, and autonomous systems.
399
00:18:43,923 --> 00:18:46,957
So why is the US government
so interested in autonomous
400
00:18:46,993 --> 00:18:49,960
weapons systems and autonomous
machinery for the military?
401
00:18:49,996 --> 00:18:52,963
Autonomy and
artificial intelligence
402
00:18:52,999 --> 00:18:55,666
is changing our lives every day.
403
00:18:55,702 --> 00:19:00,971
And the military would be crazy
not to take these technologies
404
00:19:01,007 --> 00:19:04,508
and utilize them in ways that
would help us perform better
405
00:19:04,544 --> 00:19:05,843
on the battlefield.
406
00:19:05,878 --> 00:19:07,912
I mean, would it be crazy
because other countries
407
00:19:07,947 --> 00:19:09,313
are going to do it too?
408
00:19:09,349 --> 00:19:11,749
We know that other great
powers like Russia and China
409
00:19:11,784 --> 00:19:15,853
are investing a lot of
money in autonomy and AI,
410
00:19:15,888 --> 00:19:18,189
and they think differently
about it than we do.
411
00:19:18,224 --> 00:19:21,192
You know, we think
about autonomy and AI
412
00:19:21,227 --> 00:19:23,694
as enabling the
human to be better.
413
00:19:23,730 --> 00:19:28,099
Authoritarian regimes sometimes
think about taking the human
414
00:19:28,134 --> 00:19:30,167
out of the equation and
allowing the machine
415
00:19:30,203 --> 00:19:32,737
to make the decision, and we
think that's very dangerous.
416
00:19:32,772 --> 00:19:34,672
That's not our
conception at all.
417
00:19:34,707 --> 00:19:36,273
It's more like Iron Man,
418
00:19:36,309 --> 00:19:40,411
where you would use the
machine as an exoskeleton
419
00:19:40,446 --> 00:19:43,547
to make the human stronger,
allow the human to do more;
420
00:19:43,583 --> 00:19:46,684
an autonomous intelligence
that's a part of the machine
421
00:19:46,719 --> 00:19:49,754
to help the human make
better decisions.
422
00:19:49,789 --> 00:19:51,622
Would taking humans
out of the loop
423
00:19:51,657 --> 00:19:54,058
give your adversary
an advantage?
424
00:19:54,093 --> 00:19:56,794
This is a question just
like cyber vulnerability
425
00:19:56,829 --> 00:19:58,396
that keeps us up at night.
426
00:19:58,431 --> 00:20:04,435
Would a network that is working
at machine speed all the time
427
00:20:04,470 --> 00:20:06,537
be able to beat a network
428
00:20:06,572 --> 00:20:10,107
in which machines and
humans work together?
429
00:20:10,143 --> 00:20:13,878
Um, and in certain
instances like...
430
00:20:13,913 --> 00:20:18,115
as I said, cyber warfare,
electronic warfare,
431
00:20:18,151 --> 00:20:19,784
machines will
always beat humans.
432
00:20:19,819 --> 00:20:21,752
I mean, that will always happen.
433
00:20:21,788 --> 00:20:25,389
This is a competition, and we
think that the way this will go
434
00:20:25,425 --> 00:20:29,160
for the next multiple decades is
435
00:20:29,195 --> 00:20:31,128
AI and autonomy will
help the human.
436
00:20:31,164 --> 00:20:34,498
And you will never try to
make it go all automatic.
437
00:20:34,534 --> 00:20:36,667
But we have to watch,
and we have to be careful,
438
00:20:36,702 --> 00:20:38,436
and make sure that
that doesn't happen.
439
00:20:40,139 --> 00:20:42,440
But not everyone thinks about the Third Offset Strategy
440
00:20:42,475 --> 00:20:44,442
in such an optimistic way.
441
00:20:44,477 --> 00:20:46,143
In fact, some people think it's only creating a new
442
00:20:46,179 --> 00:20:49,914
arms race for robotic war tools that could escalate quickly.
443
00:20:49,949 --> 00:20:51,982
Naureen Shah is one of those people.
444
00:20:53,653 --> 00:20:56,454
What would you say to the DOD
policy-makers who are actually
445
00:20:56,489 --> 00:21:00,458
looking into researching
autonomous weapons systems?
446
00:21:00,493 --> 00:21:03,461
I would say ban
these weapons systems,
447
00:21:03,496 --> 00:21:06,931
because you yourselves know
what the consequences would be
448
00:21:06,966 --> 00:21:08,632
if other governments had them.
449
00:21:08,668 --> 00:21:10,968
Look at this issue not
from the perspective just of
450
00:21:11,003 --> 00:21:13,871
the US government, but from
how to keep communities
451
00:21:13,906 --> 00:21:17,475
all around the world safe from
unlawful use of lethal force.
452
00:21:17,510 --> 00:21:20,711
If the US government is
concerned about what it sees as
453
00:21:20,746 --> 00:21:22,780
bad actors having
these weapons systems,
454
00:21:22,815 --> 00:21:24,281
then it shouldn't develop them.
455
00:21:24,317 --> 00:21:26,817
Once you start to manufacture
these weapons systems and
456
00:21:26,853 --> 00:21:29,353
authorize arms sales
to all these countries,
457
00:21:29,388 --> 00:21:31,288
this technology is
going to proliferate,
458
00:21:31,324 --> 00:21:34,959
and it will unfortunately
spiral out of control.
459
00:21:34,994 --> 00:21:36,327
We know that you can do it.
460
00:21:36,362 --> 00:21:38,529
We know that the
technology is tantalizing,
461
00:21:38,564 --> 00:21:42,032
but you know that if
you start down this road,
462
00:21:42,068 --> 00:21:44,235
you're going to a
very, very dangerous place,
463
00:21:44,270 --> 00:21:45,870
and you shouldn't go there.
464
00:21:47,573 --> 00:21:49,373
At this point, it might be too late.
465
00:21:49,408 --> 00:21:51,242
These war toys are on the way.
466
00:21:51,277 --> 00:21:54,144
And the thing is, almost every developer of autonomous machines
467
00:21:54,180 --> 00:21:56,981
I met is separated from the people who actually decide
468
00:21:57,016 --> 00:21:59,416
how their creations will enter war.
469
00:21:59,452 --> 00:22:02,887
So while researchers toss around ideas about humane battles
470
00:22:02,922 --> 00:22:05,556
or friendly killer robots, you get the feeling
471
00:22:05,591 --> 00:22:07,525
conflict isn't going to be any less horrifying
472
00:22:07,560 --> 00:22:09,593
using autonomous bots.
473
00:22:09,629 --> 00:22:13,063
Whether it's sticks and stones, guns, missiles, or drones,
474
00:22:13,099 --> 00:22:16,233
people will always go to war, and it's always brutal.
475
00:22:16,269 --> 00:22:18,202
Even if a robot is fighting for you.