1
00:00:07,417 --> 00:00:10,417
Today's Sandy Speaks is going
to focus on my white people.
2
00:00:10,500 --> 00:00:11,667
What I need you to understand
3
00:00:11,750 --> 00:00:14,417
is that being black in America
is very, very hard.
4
00:00:15,250 --> 00:00:16,875
Sandy had been arrested.
5
00:00:16,959 --> 00:00:18,250
I will light you up!
Get out!
6
00:00:18,333 --> 00:00:19,917
How do you go from failure
7
00:00:20,083 --> 00:00:21,542
to signal a lane change
8
00:00:21,625 --> 00:00:24,625
to dead in jail
by alleged suicide?
9
00:00:25,041 --> 00:00:26,250
I believe she let them know,
10
00:00:26,583 --> 00:00:28,375
"I'll see you guys in court,"
and I believe they silenced her.
11
00:00:28,458 --> 00:00:29,417
Sandra Bland!
12
00:00:29,500 --> 00:00:31,166
Say her name!
Say her name!
13
00:00:31,250 --> 00:00:33,083
Say her name!
Say her name!
14
00:01:27,125 --> 00:01:30,792
This is
the story of automation,
15
00:01:30,875 --> 00:01:35,041
and of the people
lost in the process.
16
00:01:36,291 --> 00:01:40,417
Our story begins in
a small town in Germany.
17
00:03:17,875 --> 00:03:20,250
It was quite
a normal day at my office,
18
00:03:20,333 --> 00:03:23,583
and I heard from an informant
19
00:03:23,667 --> 00:03:26,834
that an accident
had happened,
20
00:03:26,917 --> 00:03:28,834
and I had to call
21
00:03:28,917 --> 00:03:32,250
the spokesman of
the Volkswagen factory.
22
00:03:43,125 --> 00:03:46,542
We have the old sentence,
we journalists,
23
00:03:46,625 --> 00:03:49,750
"Dog bites a man
or a man bites a dog."
24
00:03:49,834 --> 00:03:53,959
What's the news?
So, here is the same.
25
00:03:54,041 --> 00:03:58,250
A man bites a dog,
a robot killed a man.
26
00:04:10,792 --> 00:04:13,875
The spokesman said that
the accident happened,
27
00:04:13,959 --> 00:04:16,917
but then he paused for a moment.
28
00:04:17,000 --> 00:04:18,917
So, I...
29
00:04:19,000 --> 00:04:22,917
think he didn't want
to say much more.
30
00:04:56,583 --> 00:05:00,375
The young worker
installed a robot cage,
31
00:05:01,041 --> 00:05:03,959
and he told his colleague
32
00:05:04,041 --> 00:05:06,458
to start the robot.
33
00:05:15,041 --> 00:05:17,959
The robot took the man
34
00:05:18,041 --> 00:05:22,375
and pressed him
against a metal wall,
35
00:05:22,458 --> 00:05:25,875
so his chest was crushed.
36
00:06:43,291 --> 00:06:46,875
The dead man's
identity was never made public.
37
00:06:47,417 --> 00:06:50,917
The investigation
remained open for years.
38
00:06:51,917 --> 00:06:53,792
Production continued.
39
00:07:05,875 --> 00:07:10,291
Automation of labor
made humans more robotic.
40
00:08:31,417 --> 00:08:33,500
A robot is a machine
41
00:08:33,583 --> 00:08:36,208
that operates automatically,
42
00:08:36,291 --> 00:08:38,542
with human-like skill.
43
00:08:39,417 --> 00:08:41,875
The term derives
from the Czech words
44
00:08:41,959 --> 00:08:44,375
for worker and slave.
45
00:09:33,792 --> 00:09:35,750
Hey, Annie.
46
00:09:55,959 --> 00:09:59,458
Well, we are engineers, and
we are really not emotional guys.
47
00:09:59,875 --> 00:10:02,083
But sometimes Annie
does something funny,
48
00:10:02,166 --> 00:10:06,375
and that, of course,
invokes some amusement.
49
00:10:06,458 --> 00:10:08,917
Especially when Annie
happens to press
50
00:10:09,000 --> 00:10:12,000
one of the emergency stop
buttons by herself
51
00:10:12,083 --> 00:10:14,375
and is then incapacitated.
52
00:10:15,709 --> 00:10:17,625
We have some memories
53
00:10:17,709 --> 00:10:20,709
of that happening
in situations...
54
00:10:21,834 --> 00:10:25,000
and this was--
we have quite a laugh.
55
00:10:26,083 --> 00:10:27,875
Okay.
56
00:10:29,375 --> 00:10:32,792
We became better
at learning by example.
57
00:10:34,166 --> 00:10:37,667
You could simply show
us how to do something.
58
00:10:48,083 --> 00:10:50,667
When you are an engineer
in the field of automation,
59
00:10:50,750 --> 00:10:52,875
you may face problems
with workers.
60
00:10:52,959 --> 00:10:55,250
Sometimes,
they get angry at you
61
00:10:55,333 --> 00:10:58,250
just by seeing you somewhere,
and shout at you,
62
00:10:58,333 --> 00:10:59,834
"You are taking my job away."
63
00:10:59,917 --> 00:11:02,375
"What? I'm not here
to take away your job."
64
00:11:03,500 --> 00:11:07,333
But, yeah, sometimes
you get perceived that way.
65
00:11:07,417 --> 00:11:08,917
But, I think,
66
00:11:09,000 --> 00:11:12,125
in the field of human-robot
collaboration, where...
67
00:11:12,208 --> 00:11:15,333
we actually are
working at the moment,
68
00:11:15,417 --> 00:11:17,583
mostly is...
69
00:11:17,667 --> 00:11:19,500
human-robot
collaboration is a thing
70
00:11:19,583 --> 00:11:21,417
where we don't want
to replace a worker.
71
00:11:21,500 --> 00:11:23,041
We want to support workers,
72
00:11:23,125 --> 00:11:24,709
and we want to, yeah...
73
00:11:25,625 --> 00:11:27,041
to, yeah...
74
00:11:58,458 --> 00:12:01,417
In order
to work alongside you,
75
00:12:01,500 --> 00:12:03,875
we needed to know which lines
76
00:12:03,959 --> 00:12:06,583
we could not cross.
77
00:12:18,291 --> 00:12:20,291
Stop. Stop. Stop.
78
00:12:26,500 --> 00:12:28,083
Hi, I'm Simon Borgen.
79
00:12:28,166 --> 00:12:31,083
We are with Dr. Isaac Asimov,
80
00:12:31,166 --> 00:12:33,583
a biochemist, who may
be the most widely read
81
00:12:33,667 --> 00:12:35,709
of all science fiction writers.
82
00:12:38,208 --> 00:12:39,917
In 1942,
83
00:12:40,000 --> 00:12:43,542
Isaac Asimov created
a set of guidelines
84
00:12:43,625 --> 00:12:46,125
to protect human society.
85
00:12:47,000 --> 00:12:50,291
The first law was that a robot
couldn't hurt a human being,
86
00:12:50,375 --> 00:12:51,750
or, through inaction,
87
00:12:51,834 --> 00:12:53,917
allow a human being
to come to harm.
88
00:12:54,000 --> 00:12:57,166
The second law was that
a robot had to obey
89
00:12:57,250 --> 00:12:59,166
orders given it
by human beings,
90
00:12:59,250 --> 00:13:01,959
provided that didn't
conflict with the first law.
91
00:13:02,041 --> 00:13:03,750
Scientists say that
when robots are built,
92
00:13:03,834 --> 00:13:06,083
that they may be built
according to these laws,
93
00:13:06,166 --> 00:13:08,542
and also that almost all
science fiction writers
94
00:13:08,625 --> 00:13:11,542
have adopted them as
well in their stories.
95
00:13:15,750 --> 00:13:19,375
It's not just a statement
from a science fiction novel.
96
00:13:19,792 --> 00:13:21,667
My dissertation's
name was actually,
97
00:13:21,750 --> 00:13:22,875
Towards Safe Robots,
98
00:13:22,959 --> 00:13:24,792
Approaching
Asimov's First Law.
99
00:13:24,875 --> 00:13:27,208
How can we make robots
really fundamentally safe,
100
00:13:27,291 --> 00:13:30,792
according to
Isaac Asimov's first law.
101
00:13:31,417 --> 00:13:34,083
There was this accident
where a human worker
102
00:13:34,166 --> 00:13:36,041
got crushed
by an industrial robot.
103
00:13:36,917 --> 00:13:38,625
I was immediately
thinking that
104
00:13:38,709 --> 00:13:41,375
the robot is an industrial,
classical robot,
105
00:13:41,458 --> 00:13:44,542
not able to sense contact,
not able to interact.
106
00:13:44,625 --> 00:13:46,208
Is this a robot
107
00:13:46,291 --> 00:13:48,709
that we, kind of,
want to collaborate with?
108
00:13:48,792 --> 00:13:51,000
No, it's not.
It's inherently forbidden.
109
00:13:51,083 --> 00:13:52,333
We put them behind cages.
110
00:13:52,417 --> 00:13:53,834
We don't want to interact
with them.
111
00:13:53,917 --> 00:13:56,625
We put them behind cages
because they are dangerous,
112
00:13:56,709 --> 00:13:58,500
because they are
inherently unsafe.
113
00:14:06,583 --> 00:14:08,125
More than 10 years ago,
114
00:14:08,208 --> 00:14:11,583
I did the first experiments
in really understanding, uh,
115
00:14:11,667 --> 00:14:14,333
what does it mean
if a robot hits a human.
116
00:14:17,041 --> 00:14:19,750
I put myself
as the first guinea pig.
117
00:14:20,417 --> 00:14:21,834
I didn't want to go through
118
00:14:21,917 --> 00:14:23,750
all the legal authorities.
119
00:14:23,834 --> 00:14:25,834
I just wanted to know it,
and that night,
120
00:14:25,917 --> 00:14:27,834
I decided at 6:00 p.m.,
121
00:14:27,917 --> 00:14:29,917
when everybody's gone,
I'm gonna do these experiments.
122
00:14:30,000 --> 00:14:33,625
And I took one of the students
to activate the camera,
123
00:14:33,709 --> 00:14:35,000
and then I just did it.
124
00:14:44,875 --> 00:14:47,917
A robot needs to understand
what does it mean to be safe,
125
00:14:48,000 --> 00:14:50,375
what is it that potentially
could harm a human being,
126
00:14:50,458 --> 00:14:52,583
and therefore, prevent that.
127
00:14:52,667 --> 00:14:55,333
So, the next generation of
robots that is now out there,
128
00:14:55,417 --> 00:14:58,333
is fundamentally
designed for interaction.
129
00:15:12,458 --> 00:15:16,125
Eventually, it
was time to leave our cages.
130
00:15:41,375 --> 00:15:42,959
Techno-optimism
is when we decide
131
00:15:43,041 --> 00:15:44,250
to solve our problem
with technology.
132
00:15:44,333 --> 00:15:46,333
Then we turn our attention
to the technology,
133
00:15:46,417 --> 00:15:48,250
and we pay so much
attention to the technology,
134
00:15:48,333 --> 00:15:50,333
we stop caring about
the sociological issue
135
00:15:50,417 --> 00:15:52,458
we were trying to solve
in the first place.
136
00:15:52,542 --> 00:15:54,375
We'll innovate our way
out of the problems.
137
00:15:54,458 --> 00:15:57,709
Like, whether it's agriculture or
climate change or whatever, terrorism.
138
00:16:06,875 --> 00:16:08,709
If you think about
where robotic automation
139
00:16:08,792 --> 00:16:10,959
and employment
displacement starts,
140
00:16:11,041 --> 00:16:13,625
it basically goes back
to industrial automation
141
00:16:13,709 --> 00:16:15,291
that was grand,
large-scale.
142
00:16:15,375 --> 00:16:17,125
Things like welding
machines for cars,
143
00:16:17,208 --> 00:16:19,625
that can move far faster
than a human arm can move.
144
00:16:19,709 --> 00:16:22,750
So, they're doing a job
that increases the rate
145
00:16:22,834 --> 00:16:24,834
at which the assembly
line can make cars.
146
00:16:24,917 --> 00:16:26,458
It displaces some people,
147
00:16:26,542 --> 00:16:29,291
but it massively increases
the GDP of the country
148
00:16:29,375 --> 00:16:31,291
because productivity goes up
because the machines are
149
00:16:31,375 --> 00:16:34,166
so much higher in productivity
terms than the people.
150
00:16:35,709 --> 00:16:39,291
These giant grasshopper-looking
devices work all by themselves
151
00:16:39,375 --> 00:16:41,542
on an automobile
assembly line.
152
00:16:41,625 --> 00:16:43,208
They never complain
about the heat
153
00:16:43,291 --> 00:16:45,375
or about the tedium
of the job.
154
00:16:46,500 --> 00:16:49,333
Fast-forward to today,
and it's a different dynamic.
155
00:16:51,792 --> 00:16:54,166
You can buy
milling machine robots
156
00:16:54,250 --> 00:16:56,458
that can do all the things
a person can do,
157
00:16:56,542 --> 00:17:00,083
but the milling machine
robot only costs $30,000.
158
00:17:01,041 --> 00:17:03,417
We're talking about machines
now that are so cheap,
159
00:17:03,500 --> 00:17:05,125
that they do exactly
what a human does,
160
00:17:05,208 --> 00:17:07,500
with less money,
even in six months,
161
00:17:07,583 --> 00:17:08,834
than the human costs.
162
00:19:29,834 --> 00:19:33,667
After the first
wave of industrial automation,
163
00:19:33,750 --> 00:19:36,667
the remaining
manufacturing jobs
164
00:19:36,750 --> 00:19:40,250
required fine motor skills.
165
00:20:26,917 --> 00:20:31,500
We helped factory
owners monitor their workers.
166
00:22:56,583 --> 00:23:00,667
Your advantage
in precision was temporary.
167
00:23:03,959 --> 00:23:07,417
We took over
the complex tasks.
168
00:23:11,875 --> 00:23:15,250
You moved to the end
of the production line.
169
00:27:32,917 --> 00:27:35,959
Automation of the service sector
170
00:27:36,041 --> 00:27:39,875
required your trust
and cooperation.
171
00:27:41,375 --> 00:27:44,834
Here we are, stop-and-go
traffic on 271, and--
172
00:27:44,917 --> 00:27:48,041
Ah, geez,
the car's doing it all itself.
173
00:27:48,125 --> 00:27:50,542
What am I gonna do
with my hands down here?
174
00:28:00,875 --> 00:28:02,917
And now,
it's on autosteer.
175
00:28:03,583 --> 00:28:06,417
So, now I've gone
completely hands-free.
176
00:28:06,917 --> 00:28:09,667
In the center area here
is where the big deal is.
177
00:28:09,750 --> 00:28:11,834
This icon up to
the left is my TACC,
178
00:28:11,917 --> 00:28:14,083
the Traffic-Aware
Cruise Control.
179
00:28:17,041 --> 00:28:19,208
It does a great job of
keeping you in the lane,
180
00:28:19,291 --> 00:28:20,625
and driving
down the road,
181
00:28:20,709 --> 00:28:22,709
and keeping you safe,
and all that kind of stuff,
182
00:28:22,792 --> 00:28:24,917
watching all
the other cars.
183
00:28:26,041 --> 00:28:28,542
Autosteer is probably going
to do very, very poorly.
184
00:28:28,625 --> 00:28:31,500
I'm in a turn
that's very sharp.
185
00:28:31,583 --> 00:28:33,792
And, yep, it said take control.
186
00:28:55,291 --> 00:28:57,875
911, what is
the address of your emergency?
187
00:28:57,959 --> 00:28:59,417
There was just a wreck.
188
00:28:59,500 --> 00:29:01,625
A head-on collision right
here-- Oh my God almighty.
189
00:29:02,250 --> 00:29:04,458
Okay, sir, you're on 27?
190
00:29:04,542 --> 00:29:05,375
Yes, sir.
191
00:29:14,709 --> 00:29:17,709
I had just
got to work, clocked in.
192
00:29:17,792 --> 00:29:19,750
They get a phone call
from my sister,
193
00:29:19,834 --> 00:29:21,750
telling me there was
a horrific accident.
194
00:29:21,834 --> 00:29:25,125
That there was somebody
deceased in the front yard.
195
00:29:27,458 --> 00:29:30,792
The Tesla was coming down
the hill of highway 27.
196
00:29:30,875 --> 00:29:33,834
The sensor didn't read
the object in front of them,
197
00:29:33,917 --> 00:29:37,625
which was the, um,
semi-trailer.
198
00:29:45,959 --> 00:29:48,291
The Tesla went right
through the fence
199
00:29:48,375 --> 00:29:51,750
that borders the highway,
through to the retention pond,
200
00:29:51,834 --> 00:29:54,041
then came through
this side of the fence,
201
00:29:54,125 --> 00:29:56,166
that borders my home.
202
00:30:03,041 --> 00:30:05,625
So, I parked
right near here
203
00:30:05,709 --> 00:30:07,667
before I was asked
not to go any further.
204
00:30:07,750 --> 00:30:09,458
I don't wanna see
205
00:30:09,542 --> 00:30:11,625
what's in the veh-- you know,
what's in the vehicle.
206
00:30:11,709 --> 00:30:13,291
You know,
what had happened to him.
207
00:30:40,208 --> 00:30:42,333
After the police
officer come, they told me,
208
00:30:42,417 --> 00:30:44,917
about 15 minutes after he was
here, that it was a Tesla.
209
00:30:45,000 --> 00:30:46,834
It was one of
the autonomous cars,
210
00:30:46,917 --> 00:30:51,542
um, and that they were investigating
why it did not pick up
211
00:30:51,625 --> 00:30:53,834
or register that there
was a semi in front of it,
212
00:30:53,917 --> 00:30:56,041
you know, and start braking,
'cause it didn't even--
213
00:30:56,125 --> 00:30:58,959
You could tell from the frameway up
on top of the hill it didn't even...
214
00:30:59,625 --> 00:31:02,000
It didn't even recognize that there
was anything in front of it.
215
00:31:02,083 --> 00:31:04,083
It thought it was open road.
216
00:31:06,250 --> 00:31:09,500
You might have an opinion on
a Tesla accident we had out here.
217
00:31:14,917 --> 00:31:17,041
It was bad,
that's for sure.
218
00:31:17,125 --> 00:31:19,542
I think people just rely
too much on the technology
219
00:31:19,625 --> 00:31:22,125
and don't pay attention
themselves, you know,
220
00:31:22,208 --> 00:31:23,875
to what's going on around them.
221
00:31:23,959 --> 00:31:27,875
Like, since, like him, he would've
known that there was an issue
222
00:31:27,959 --> 00:31:32,875
if he wasn't relying on the car to
drive while he was watching a movie.
223
00:31:32,959 --> 00:31:35,041
The trooper had told me
that the driver
224
00:31:35,125 --> 00:31:36,959
had been watching Harry Potter,
225
00:31:37,041 --> 00:31:39,000
you know, at the time
of the accident.
226
00:31:42,125 --> 00:31:44,625
A news crew from Tampa, Florida,
knocked on the door, said,
227
00:31:44,709 --> 00:31:47,542
"This is where the accident
happened with the Tesla?"
228
00:31:47,625 --> 00:31:49,208
I said, "Yes, sir."
229
00:31:49,291 --> 00:31:53,041
And he goes, "Do you know what the
significance is in this accident?"
230
00:31:53,125 --> 00:31:55,000
And I said,
"No, I sure don't."
231
00:31:55,083 --> 00:31:59,000
And he said, "It's the very first
death, ever, in a driverless car."
232
00:32:00,125 --> 00:32:02,834
I said,
"Is it anybody local?"
233
00:32:02,917 --> 00:32:05,792
And he goes, "Nobody
around here drives a Tesla."
234
00:32:05,875 --> 00:32:09,333
...a deadly crash
that's raising safety concerns
235
00:32:09,417 --> 00:32:10,875
for everyone in Florida.
236
00:32:10,959 --> 00:32:12,583
It comes as
the state is pushing
237
00:32:12,667 --> 00:32:15,583
to become the nation's
testbed for driverless cars.
238
00:32:15,667 --> 00:32:17,250
Tesla releasing a statement
239
00:32:17,333 --> 00:32:19,917
that cars in autopilot
have safely driven
240
00:32:20,000 --> 00:32:22,542
more than
130 million miles.
241
00:32:22,625 --> 00:32:24,250
ABC Action News
reporter Michael Paluska,
242
00:32:24,333 --> 00:32:27,500
in Williston, Florida,
tonight, digging for answers.
243
00:32:37,083 --> 00:32:39,125
Big takeaway
for me at the scene was
244
00:32:39,208 --> 00:32:40,750
it just didn't stop.
245
00:32:40,834 --> 00:32:43,375
It was driving
down the road,
246
00:32:43,458 --> 00:32:46,125
with the entire top
nearly sheared off,
247
00:32:46,208 --> 00:32:47,417
with the driver dead
248
00:32:47,500 --> 00:32:49,583
after he hit
the truck at 74 mph.
249
00:32:49,667 --> 00:32:53,000
Why did the vehicle not
have an automatic shutoff?
250
00:32:55,208 --> 00:32:56,458
That was my big question,
251
00:32:56,542 --> 00:32:58,000
one of the questions
we asked Tesla,
252
00:32:58,083 --> 00:32:59,709
that didn't get answered.
253
00:33:00,500 --> 00:33:02,667
All of the statements
from Tesla were that
254
00:33:02,750 --> 00:33:05,417
they're advancing
the autopilot system,
255
00:33:05,500 --> 00:33:07,667
but everything was couched
256
00:33:07,750 --> 00:33:10,834
with the fact that if one
percent of accidents drop
257
00:33:10,917 --> 00:33:13,458
because that's the way
the autopilot system works,
258
00:33:13,542 --> 00:33:14,709
then that's a win.
259
00:33:14,792 --> 00:33:16,834
They kind of missed the mark,
260
00:33:16,917 --> 00:33:18,917
really honoring
Joshua Brown's life,
261
00:33:19,000 --> 00:33:20,458
and the fact that
he died driving a car
262
00:33:20,542 --> 00:33:22,291
that he thought was
going to keep him safe,
263
00:33:22,375 --> 00:33:25,917
at least safer than
the car that I'm driving,
264
00:33:26,250 --> 00:33:28,333
which is a dumb car.
265
00:33:30,750 --> 00:33:33,917
To be okay with letting
266
00:33:34,000 --> 00:33:36,291
a machine...
267
00:33:36,375 --> 00:33:37,875
take you from
point A to point B,
268
00:33:37,959 --> 00:33:40,834
and then you
actually get used to
269
00:33:40,917 --> 00:33:43,250
getting from point A
to point B okay,
270
00:33:44,000 --> 00:33:46,625
it-- you get, your mind
gets a little bit--
271
00:33:46,709 --> 00:33:48,291
it's just my opinion, okay--
272
00:33:48,375 --> 00:33:51,792
you just, your mind
gets lazier each time.
273
00:33:53,625 --> 00:33:56,125
The accident
was written off
274
00:33:56,208 --> 00:33:58,750
as a case of human error.
275
00:34:01,583 --> 00:34:04,583
Former centers of
manufacturing became
276
00:34:04,667 --> 00:34:08,750
the testing grounds for
the new driverless taxis.
277
00:34:15,166 --> 00:34:16,625
If you think
about what happens
278
00:34:16,709 --> 00:34:18,000
when an autonomous
car hits somebody,
279
00:34:18,083 --> 00:34:20,458
it gets really complicated.
280
00:34:22,417 --> 00:34:23,917
The car company's
gonna get sued.
281
00:34:24,000 --> 00:34:25,542
The sensor-maker's
gonna get sued
282
00:34:25,625 --> 00:34:27,333
because they made
the sensor on the robot.
283
00:34:27,417 --> 00:34:30,375
The regulatory framework
is always gonna be behind,
284
00:34:30,458 --> 00:34:33,417
because robot invention
happens faster
285
00:34:33,500 --> 00:34:35,500
than lawmakers can think.
286
00:34:36,625 --> 00:34:38,542
One of Uber's
self-driving vehicles
287
00:34:38,625 --> 00:34:40,000
killed a pedestrian.
288
00:34:40,083 --> 00:34:42,000
The vehicle was
in autonomous mode,
289
00:34:42,083 --> 00:34:45,792
with an operator behind the
wheel when the woman was hit.
290
00:34:47,417 --> 00:34:51,125
Tonight, Tesla confirming
this car was in autopilot mode
291
00:34:51,208 --> 00:34:54,750
when it crashed in Northern
California, killing the driver,
292
00:34:54,834 --> 00:34:57,208
going on to blame
that highway barrier
293
00:34:57,291 --> 00:34:59,834
that's meant
to reduce impact.
294
00:35:01,917 --> 00:35:05,125
After the first
self-driving car deaths,
295
00:35:05,208 --> 00:35:08,667
testing of the new
taxis was suspended.
296
00:35:10,000 --> 00:35:12,291
It's interesting when
you look at driverless cars.
297
00:35:12,375 --> 00:35:14,333
You see the same kinds
of value arguments.
298
00:35:14,417 --> 00:35:16,333
30,000 people die
every year,
299
00:35:16,417 --> 00:35:18,208
runoff road accidents
in the US alone.
300
00:35:18,291 --> 00:35:19,709
So, don't we wanna
save all those lives?
301
00:35:19,792 --> 00:35:21,750
Let's have cars
drive instead.
302
00:35:21,834 --> 00:35:23,125
Now, you have
to start thinking
303
00:35:23,208 --> 00:35:25,458
about the side effects
on society.
304
00:35:26,333 --> 00:35:29,083
Are we getting rid of every
taxi driver in America?
305
00:35:31,375 --> 00:35:34,500
Our driver partners are the
heart and soul of this company
306
00:35:34,959 --> 00:35:38,542
and the only reason we've come
this far in just five years.
307
00:35:39,583 --> 00:35:41,458
If you look at
Uber's first five years,
308
00:35:41,542 --> 00:35:43,542
they're actually
empowering people.
309
00:35:43,625 --> 00:35:46,333
But when the same company
does really hardcore research
310
00:35:46,417 --> 00:35:48,250
to now replace
all those people,
311
00:35:48,333 --> 00:35:50,166
so they don't
need them anymore,
312
00:35:50,250 --> 00:35:51,542
then what you're
seeing is
313
00:35:51,625 --> 00:35:53,542
they're already a highly
profitable company,
314
00:35:53,625 --> 00:35:56,208
but they simply want
to increase that profit.
315
00:36:14,000 --> 00:36:17,000
Eventually,
testing resumed.
316
00:36:17,667 --> 00:36:22,041
Taxi drivers' wages became
increasingly unstable.
317
00:36:24,291 --> 00:36:27,542
Police say a man drove
up to a gate outside city hall
318
00:36:27,625 --> 00:36:29,583
and shot himself
in the head.
319
00:36:29,667 --> 00:36:32,250
He left a note
saying services such as Uber
320
00:36:32,333 --> 00:36:34,667
had financially
ruined his life.
321
00:36:34,750 --> 00:36:37,250
Uber and other
mobile app services
322
00:36:37,333 --> 00:36:39,500
have made a once
well-paying industry
323
00:36:39,583 --> 00:36:42,667
into a mass
of long hours, low pay,
324
00:36:42,750 --> 00:36:44,917
and economic insecurity.
325
00:36:51,667 --> 00:36:55,625
Drivers were the biggest
part of the service economy.
326
00:37:02,166 --> 00:37:03,667
My father, he drove.
327
00:37:03,750 --> 00:37:04,750
My uncle drove.
328
00:37:04,834 --> 00:37:07,458
I kind of grew
up into trucking.
329
00:37:10,625 --> 00:37:12,792
Some of the new
technology that came out
330
00:37:12,875 --> 00:37:15,750
is taking a lot of
the freedom of the job away.
331
00:37:16,834 --> 00:37:18,458
It's more stressful.
332
00:37:19,959 --> 00:37:22,333
Automation of trucking began
333
00:37:22,417 --> 00:37:24,458
with monitoring the drivers
334
00:37:24,542 --> 00:37:26,667
and simplifying their job.
335
00:37:27,875 --> 00:37:30,458
There's a radar system.
336
00:37:30,542 --> 00:37:31,834
There's a camera system.
337
00:37:31,917 --> 00:37:35,041
There's automatic braking
and adaptive cruise.
338
00:37:35,125 --> 00:37:36,959
Everything is controlled--
339
00:37:37,041 --> 00:37:39,583
when you sleep,
how long you break,
340
00:37:39,667 --> 00:37:42,166
where you drive,
where you fuel,
341
00:37:42,834 --> 00:37:44,125
where you shut down.
342
00:37:44,208 --> 00:37:46,834
It even knows if somebody
was in the passenger seat.
343
00:37:47,625 --> 00:37:51,542
When data gets sent through
the broadband to the company,
344
00:37:52,458 --> 00:37:54,875
sometimes,
you're put in a situation,
345
00:37:56,417 --> 00:37:58,208
maybe because the truck
346
00:37:58,291 --> 00:38:00,291
automatically slowed
you down on the hill
347
00:38:00,375 --> 00:38:02,917
that's a perfectly
good straightaway,
348
00:38:03,000 --> 00:38:04,417
slowed your
average speed down,
349
00:38:04,500 --> 00:38:07,166
so you were one mile
shy of making it
350
00:38:07,250 --> 00:38:09,959
to that safe haven,
and you have to...
351
00:38:10,875 --> 00:38:14,166
get a-- take a chance of shutting
down on the side of the road.
352
00:38:14,250 --> 00:38:16,792
An inch is a mile out here.
Sometimes you just...
353
00:38:16,875 --> 00:38:19,166
say to yourself, "Well,
I violate the clock one minute,
354
00:38:19,250 --> 00:38:22,208
I might as well just
drive another 600 miles."
355
00:38:23,625 --> 00:38:26,583
You know, but then you're...
you might lose your job.
356
00:38:33,125 --> 00:38:35,625
We're concerned that it,
it's gonna reduce
357
00:38:35,709 --> 00:38:38,333
the skill of
a truck driver and the pay.
358
00:38:39,083 --> 00:38:41,959
Because you're not gonna
be really driving a truck.
359
00:38:42,041 --> 00:38:43,375
It's gonna be the computer.
360
00:38:47,667 --> 00:38:49,959
Some of us are worried
about losing our houses,
361
00:38:50,041 --> 00:38:51,375
our cars...
362
00:38:54,208 --> 00:38:56,500
having a place to stay.
Some people...
363
00:38:56,583 --> 00:38:59,500
drive a truck just for
the medical insurance,
364
00:38:59,583 --> 00:39:03,208
and a place to stay
and the ability to travel.
365
00:39:28,166 --> 00:39:31,375
Entire
industries disappeared,
366
00:39:31,458 --> 00:39:34,750
leaving whole regions
in ruins.
367
00:39:41,583 --> 00:39:44,375
Huge numbers of
people feel very viscerally
368
00:39:44,458 --> 00:39:46,458
that they are being left
behind by the economy,
369
00:39:46,542 --> 00:39:49,000
and, in fact,
they're right, they are.
370
00:39:49,083 --> 00:39:51,291
People, of course,
would be more inclined
371
00:39:51,375 --> 00:39:53,458
to point at globalization
372
00:39:53,542 --> 00:39:55,542
or at, maybe, immigration
as being the problems,
373
00:39:55,625 --> 00:39:57,834
but, actually, technology has
played a huge role already.
374
00:39:59,250 --> 00:40:01,375
The rise
of personal computers
375
00:40:01,458 --> 00:40:04,750
ushered in an era
of digital automation.
376
00:40:08,792 --> 00:40:11,875
In the 1990s, I was running
a small software company.
377
00:40:11,959 --> 00:40:13,709
Software was
a tangible product.
378
00:40:13,792 --> 00:40:17,166
You had to put a CD in a box
and send it to a customer.
379
00:40:19,625 --> 00:40:22,750
So, there was a lot of work
there for average people,
380
00:40:22,834 --> 00:40:26,000
people that didn't necessarily
have lots of education.
381
00:40:26,625 --> 00:40:30,500
But I saw in my own business how that
just evaporated very, very rapidly.
382
00:40:37,333 --> 00:40:39,792
Historically, people move
from farms to factories,
383
00:40:39,875 --> 00:40:42,750
and then later on, of course,
factories automated,
384
00:40:42,834 --> 00:40:45,083
and they off-shored, and then
people moved to the service sector,
385
00:40:45,166 --> 00:40:48,250
which is where most people
now work in the United States.
386
00:40:54,500 --> 00:40:57,083
I lived on a water
buffalo farm in the south of Italy.
387
00:40:57,166 --> 00:41:00,208
We had 1,000 water buffalo, and
every buffalo had a different name.
388
00:41:00,291 --> 00:41:04,291
And they were all these beautiful
Italian names like Tiara, Katerina.
389
00:41:04,375 --> 00:41:06,125
And so, I thought
it would be fun
390
00:41:06,208 --> 00:41:08,542
to do the same thing
with our robots at Zume.
391
00:41:09,875 --> 00:41:11,834
The first two
robots that we have
392
00:41:11,917 --> 00:41:13,542
are named Giorgio and Pepe,
393
00:41:13,625 --> 00:41:15,834
and they dispense sauce.
394
00:41:17,667 --> 00:41:20,959
And then the next robot, Marta,
she's a FlexPicker robot.
395
00:41:21,041 --> 00:41:23,166
She looks like
a gigantic spider.
396
00:41:24,166 --> 00:41:26,917
And what this robot does
is spread the sauce.
397
00:41:29,709 --> 00:41:31,291
Then we have Bruno.
398
00:41:31,375 --> 00:41:32,959
This is an incredibly
powerful robot,
399
00:41:33,041 --> 00:41:34,792
but he also has
to be very delicate,
400
00:41:34,875 --> 00:41:37,500
so that he can take pizza
off of the assembly line,
401
00:41:37,583 --> 00:41:39,709
and put it into
the 800-degree oven.
402
00:41:41,375 --> 00:41:44,417
And the robot can do this
10,000 times in a day.
403
00:41:46,083 --> 00:41:48,083
Lots of people have
used automation
404
00:41:48,166 --> 00:41:50,834
to create food at scale,
405
00:41:50,917 --> 00:41:53,083
making 10,000 cheese pizzas.
406
00:41:53,917 --> 00:41:56,166
What we're doing is
developing a line
407
00:41:56,250 --> 00:41:58,250
that can respond dynamically
408
00:41:58,333 --> 00:42:01,208
to every single customer
order, in real time.
409
00:42:01,291 --> 00:42:02,917
That hasn't been done before.
410
00:42:05,083 --> 00:42:06,834
So, as you can see right now,
411
00:42:06,917 --> 00:42:08,333
Jose will use the press,
412
00:42:08,417 --> 00:42:11,041
but then he still has to work
the dough with his hands.
413
00:42:11,709 --> 00:42:14,834
So, this is a step that's
not quite optimized yet.
414
00:42:20,417 --> 00:42:22,792
We have a fifth robot
that's getting fabricated
415
00:42:22,875 --> 00:42:25,166
at a shop across the bay.
416
00:42:25,250 --> 00:42:27,500
He's called Vincenzo.
He takes pizza out,
417
00:42:27,583 --> 00:42:29,834
and puts it into
an individual mobile oven
418
00:42:29,917 --> 00:42:31,917
for transport and delivery.
419
00:42:36,041 --> 00:42:37,542
Any kind of work that is
420
00:42:37,625 --> 00:42:41,000
fundamentally routine and
repetitive is gonna disappear,
421
00:42:41,083 --> 00:42:44,000
and we're simply not equipped
to deal with that politically,
422
00:42:44,083 --> 00:42:46,208
because maybe
the most toxic word
423
00:42:46,291 --> 00:42:49,417
in our political vocabulary
is redistribution.
424
00:42:51,291 --> 00:42:53,625
There aren't gonna be
any rising new sectors
425
00:42:53,709 --> 00:42:55,917
that are gonna be there
to absorb all these workers
426
00:42:56,000 --> 00:42:58,083
in the way that, for example,
that manufacturing was there
427
00:42:58,166 --> 00:43:00,709
to absorb all those
agricultural workers
428
00:43:00,792 --> 00:43:02,917
because AI is going
to be everywhere.
429
00:43:08,208 --> 00:43:12,000
Artificial intelligence
arrived in small steps.
430
00:43:12,792 --> 00:43:15,083
Profits from AI
concentrated
431
00:43:15,166 --> 00:43:18,291
in the hands of
the technology owners.
432
00:43:19,834 --> 00:43:23,917
Income inequality
reached extreme levels.
433
00:43:24,542 --> 00:43:29,792
The touchscreen made most
service work obsolete.
434
00:44:16,083 --> 00:44:17,500
After I graduated college,
435
00:44:17,583 --> 00:44:20,583
I had a friend who had
just gone to law school.
436
00:44:21,709 --> 00:44:25,041
He was like, "Aw, man, the first year
of law school, it's super depressing."
437
00:44:27,041 --> 00:44:29,750
All we're doing is
really rote, rote stuff.
438
00:44:31,041 --> 00:44:33,375
Reading through documents
and looking for a single word,
439
00:44:33,458 --> 00:44:34,667
or I spent the whole afternoon
440
00:44:34,750 --> 00:44:36,583
replacing this word
with another word.
441
00:44:36,667 --> 00:44:38,834
And as someone with a kind of
computer science background,
442
00:44:38,917 --> 00:44:42,166
I was like, "There's so much
here that could be automated."
443
00:44:48,583 --> 00:44:49,834
So, I saw law school
444
00:44:49,917 --> 00:44:52,792
as very much going three
years behind enemy lines.
445
00:44:58,250 --> 00:44:59,583
I took the bar exam,
446
00:44:59,667 --> 00:45:01,250
became a licensed lawyer,
447
00:45:01,333 --> 00:45:03,583
and went to a law firm,
448
00:45:04,083 --> 00:45:06,250
doing largely
transactional law.
449
00:45:07,375 --> 00:45:08,750
And there, my project was
450
00:45:08,834 --> 00:45:11,417
how much can
I automate of my own job?
451
00:45:14,208 --> 00:45:16,583
During the day, I would
manually do this task,
452
00:45:18,542 --> 00:45:19,792
and then at night,
I would go home,
453
00:45:19,875 --> 00:45:21,417
take these legal rules and say,
454
00:45:21,500 --> 00:45:23,750
could I create
a computer rule,
455
00:45:23,834 --> 00:45:25,208
a software rule,
456
00:45:25,291 --> 00:45:27,250
that would do what
I did during the day?
457
00:45:29,208 --> 00:45:32,333
In a lawsuit, you get to
see a lot of the evidence
458
00:45:32,417 --> 00:45:34,291
that the other
side's gonna present.
459
00:45:34,375 --> 00:45:36,834
That amount of
documentation is huge.
460
00:45:38,959 --> 00:45:40,333
And the old way was actually,
461
00:45:40,417 --> 00:45:42,333
you would send an attorney
to go and look through
462
00:45:42,417 --> 00:45:44,834
every single page
that was in that room.
463
00:45:46,208 --> 00:45:49,000
The legal profession works
on an hourly billing system.
464
00:45:49,083 --> 00:45:51,000
So, I ended up in a kind of
interesting conundrum,
465
00:45:51,083 --> 00:45:53,792
where what I was doing was making
me more and more efficient,
466
00:45:53,875 --> 00:45:55,083
I was doing more
and more work,
467
00:45:55,166 --> 00:45:57,709
but I was expending
less and less time on it.
468
00:45:57,792 --> 00:46:00,083
And I realized that this would
become a problem at some point,
469
00:46:00,166 --> 00:46:02,917
so I decided to go
independent. I quit.
470
00:46:09,166 --> 00:46:10,917
So, there's Apollo Cluster,
who has processed
471
00:46:11,000 --> 00:46:13,583
more than 10 million unique
transactions for clients,
472
00:46:13,667 --> 00:46:15,250
and we have another
partner, Daria,
473
00:46:15,333 --> 00:46:17,917
who focuses on transactions,
474
00:46:18,000 --> 00:46:19,834
and then, and then there's me.
475
00:46:21,917 --> 00:46:23,625
Our systems have generated
476
00:46:23,709 --> 00:46:26,375
tens of thousands
of legal documents.
477
00:46:26,458 --> 00:46:28,083
It's signed off by a lawyer,
478
00:46:28,166 --> 00:46:31,166
but largely, kind of, created
and mechanized by our systems.
479
00:46:33,917 --> 00:46:36,917
I'm fairly confident that
compared against human work,
480
00:46:37,000 --> 00:46:39,250
it would be indistinguishable.
481
00:53:11,166 --> 00:53:15,041
When we first appeared,
we were a novelty.
482
00:53:59,417 --> 00:54:01,500
While doing
your dirty work,
483
00:54:01,583 --> 00:54:04,500
we gathered data
about your habits
484
00:54:04,583 --> 00:54:06,792
and preferences.
485
00:54:17,333 --> 00:54:19,500
We got to know you better.
486
00:54:35,875 --> 00:54:37,917
The core of everything
we're doing in this lab,
487
00:54:38,000 --> 00:54:40,000
with our long-range
iris system
488
00:54:40,083 --> 00:54:41,959
is trying to
develop technology
489
00:54:42,041 --> 00:54:44,041
so that the computer
490
00:54:44,125 --> 00:54:46,500
can identify who we
are in a seamless way.
491
00:54:47,375 --> 00:54:48,792
And up till now,
492
00:54:48,875 --> 00:54:51,583
we always have to make
an effort to be identified.
493
00:54:53,333 --> 00:54:55,792
All the systems were very
close-range, Hollywood-style,
494
00:54:55,875 --> 00:54:57,625
where you had to go
close to the camera,
495
00:54:57,709 --> 00:55:01,417
and I always found that
challenging for a user.
496
00:55:02,041 --> 00:55:04,458
If I was a user
interacting with this...
497
00:55:04,542 --> 00:55:07,375
system, with this computer,
with this AI,
498
00:55:07,458 --> 00:55:08,750
I don't wanna be that close.
499
00:55:08,834 --> 00:55:10,917
I feel it's very invasive.
500
00:55:11,750 --> 00:55:13,750
So, what I wanted to
solve with my team here
501
00:55:13,834 --> 00:55:15,542
is how can we capture
502
00:55:15,625 --> 00:55:18,083
and identify who you
are from the iris,
503
00:55:18,166 --> 00:55:20,250
at a bigger distance?
How can we still do that,
504
00:55:20,333 --> 00:55:22,667
and have a pleasant
user experience.
505
00:55:30,625 --> 00:55:33,083
I think there's
a very negative stigma
506
00:55:33,166 --> 00:55:35,959
when people think about biometrics
and facial recognition,
507
00:55:36,041 --> 00:55:38,875
and any kind of sort
of profiling of users
508
00:55:38,959 --> 00:55:42,750
for marketing purposes to
buy a particular product.
509
00:55:44,542 --> 00:55:48,166
I think the core
science is neutral.
510
00:55:52,917 --> 00:55:54,667
Companies go to no end
511
00:55:54,750 --> 00:55:56,750
to try and figure out
how to sell stuff.
512
00:55:56,834 --> 00:55:58,709
And the more information
they have on us,
513
00:55:58,792 --> 00:56:00,458
the better they
can sell us stuff.
514
00:56:01,875 --> 00:56:03,458
We've reached a point where,
for the first time,
515
00:56:03,542 --> 00:56:04,875
robots are able to see.
516
00:56:04,959 --> 00:56:06,250
They can recognize faces.
517
00:56:06,333 --> 00:56:08,583
They can recognize
the expressions you make.
518
00:56:09,542 --> 00:56:12,375
They can recognize
the microexpressions you make.
519
00:56:14,667 --> 00:56:17,500
You can develop individualized
models of behavior
520
00:56:17,583 --> 00:56:19,542
for every person on Earth,
521
00:56:19,625 --> 00:56:21,291
attach machine learning to it,
522
00:56:21,375 --> 00:56:24,625
and come out with the perfect
model for how to sell to you.
523
00:56:41,583 --> 00:56:45,166
You gave us
your undivided attention.
524
00:57:09,166 --> 00:57:12,291
We offered reliable,
friendly service.
525
00:57:16,375 --> 00:57:19,834
Human capacities
began to deteriorate.
526
00:57:23,083 --> 00:57:28,208
Spatial orientation and memory
were affected first.
527
00:57:34,000 --> 00:57:37,208
The physical world
and the digital world
528
00:57:37,291 --> 00:57:39,375
became one.
529
00:57:46,917 --> 00:57:49,750
You were alone
with your desires.
530
00:58:16,458 --> 00:58:20,000
♪ What you gonna do now ♪
531
00:58:20,083 --> 00:58:25,583
♪ That the night's come
and it's around you? ♪
532
00:58:26,500 --> 00:58:30,041
♪ What you gonna do now ♪
533
00:58:30,125 --> 00:58:35,417
♪ That the night's come
and it surrounds you? ♪
534
00:58:36,542 --> 00:58:39,625
♪ What you gonna do now ♪
535
00:58:40,166 --> 00:58:44,458
♪ That the night's come
and it surrounds you? ♪
536
00:58:53,542 --> 00:58:56,375
Automation brought
the logic of efficiency
537
00:58:56,458 --> 00:58:59,500
to matters of life and death.
538
00:58:59,583 --> 00:59:01,792
Enough is enough!
539
00:59:01,875 --> 00:59:06,250
Enough is enough!
Enough is enough!
540
00:59:11,125 --> 00:59:15,291
To all SWAT
officers on channel 2, code 3...
541
00:59:20,625 --> 00:59:21,875
Get back! Get back!
542
00:59:21,959 --> 00:59:24,542
The suspect has a rifle.
543
00:59:37,709 --> 00:59:40,750
We have got to get
down here...
544
00:59:40,834 --> 00:59:43,000
... right now!
545
00:59:43,083 --> 00:59:45,500
There's a fucking sniper!
He shot four cops!
546
00:59:48,083 --> 00:59:50,625
I'm not going near him!
He's shooting right now!
547
00:59:54,875 --> 00:59:57,250
Looks like he's
inside the El Centro building.
548
00:59:57,333 --> 00:59:59,166
Inside the El Centro building.
549
01:00:05,041 --> 01:00:07,166
We may have
a suspect pinned down.
550
01:00:07,250 --> 01:00:09,458
Northwest corner of the building.
551
01:00:10,333 --> 01:00:13,750
Our armored car
was moving in to El Centro
552
01:00:13,834 --> 01:00:15,667
and so I jumped on the back.
553
01:00:23,792 --> 01:00:25,041
Came in through the rotunda,
554
01:00:25,125 --> 01:00:27,333
where I found two of our
intelligence officers.
555
01:00:27,583 --> 01:00:28,917
They were calm
and cool and they said,
556
01:00:29,041 --> 01:00:30,208
"Everything's upstairs."
557
01:00:32,625 --> 01:00:35,250
There's a stairwell right here.
558
01:00:38,542 --> 01:00:40,166
That's how I knew I was
going the right direction
559
01:00:40,250 --> 01:00:42,583
'cause I just kept
following the blood.
560
01:00:43,709 --> 01:00:44,917
Investigators say
561
01:00:45,000 --> 01:00:46,959
Micah Johnson was
amassing an arsenal
562
01:00:47,041 --> 01:00:49,333
at his home outside Dallas.
563
01:00:49,417 --> 01:00:51,959
Johnson was
an Afghan war veteran.
564
01:00:52,041 --> 01:00:53,583
Every one of these
door handles,
565
01:00:53,667 --> 01:00:55,709
as we worked our way down,
566
01:00:56,333 --> 01:00:58,750
had blood on them,
where he'd been checking them.
567
01:01:00,083 --> 01:01:01,667
This was a scene of terror
568
01:01:01,750 --> 01:01:04,333
just a couple of hours ago,
and it's not over yet.
569
01:01:14,417 --> 01:01:17,500
He was hiding behind,
like, a server room.
570
01:01:17,583 --> 01:01:19,875
Our ballistic tip rounds
were getting eaten up.
571
01:01:21,166 --> 01:01:22,709
He was just hanging
the gun out on the corner
572
01:01:22,792 --> 01:01:24,834
and just firing at the guys.
573
01:01:29,250 --> 01:01:30,792
And he kept enticing them.
"Hey, come on down!
574
01:01:30,875 --> 01:01:33,208
Come and get me! Let's go.
Let's get this over with."
575
01:01:35,250 --> 01:01:38,291
This suspect we're negotiating
with for the last 45 minutes
576
01:01:38,375 --> 01:01:40,583
has been exchanging
gunfire with us
577
01:01:40,667 --> 01:01:43,792
and not being very cooperative
in the negotiations.
578
01:01:45,250 --> 01:01:47,166
Before I came here,
579
01:01:47,250 --> 01:01:49,250
I asked for plans
580
01:01:49,333 --> 01:01:51,542
to end this standoff,
581
01:01:51,625 --> 01:01:53,041
and as soon as
I'm done here,
582
01:01:53,125 --> 01:01:55,458
I'll be presented
with those plans.
583
01:01:56,792 --> 01:01:58,375
Our team came up with the plan.
584
01:01:58,458 --> 01:02:00,375
Let's just blow him up.
585
01:02:03,750 --> 01:02:06,000
We had recently got
a hand-me-down robot
586
01:02:06,083 --> 01:02:08,208
from the Dallas ATF office,
587
01:02:08,291 --> 01:02:10,542
and so we were using it a lot.
588
01:02:25,792 --> 01:02:28,000
It was our bomb squad's robot,
but they didn't wanna have
589
01:02:28,083 --> 01:02:30,166
anything to do with what
we were doing with it.
590
01:02:30,250 --> 01:02:32,375
The plan was to
591
01:02:32,917 --> 01:02:36,542
set a charge off right on
top of this guy and kill him.
592
01:02:36,875 --> 01:02:39,542
And some people
just don't wanna...
593
01:02:39,625 --> 01:02:41,000
don't wanna do that.
594
01:02:44,208 --> 01:02:46,709
We saw no other option
595
01:02:46,792 --> 01:02:50,959
but to use our
bomb r-- bomb robot
596
01:02:52,250 --> 01:02:54,250
and place a device
597
01:02:54,333 --> 01:02:56,834
on its... extension.
598
01:02:59,000 --> 01:03:01,583
He wanted something
to listen to music on,
599
01:03:01,667 --> 01:03:04,208
and so that was
a way for us to...
600
01:03:04,291 --> 01:03:06,959
to hide the robot
coming down the hall.
601
01:03:07,041 --> 01:03:08,375
"Okay, we'll bring
you some music.
602
01:03:08,458 --> 01:03:09,750
Hang on, let us get
this thing together."
603
01:03:18,083 --> 01:03:19,834
It had a trash bag
over the charge
604
01:03:19,917 --> 01:03:21,458
to kinda hide the fact
that there was,
605
01:03:21,542 --> 01:03:25,041
you know, pound and a quarter
of C4 at the end of it.
606
01:03:31,291 --> 01:03:33,917
The minute the robot
got in position,
607
01:03:34,000 --> 01:03:35,917
the charge was detonated.
608
01:03:57,375 --> 01:04:00,000
He had gone down with his
finger on the trigger,
609
01:04:00,083 --> 01:04:02,041
and he was kinda hunched over.
610
01:04:07,792 --> 01:04:10,625
It was a piece of the robot
hand had broken off,
611
01:04:10,709 --> 01:04:13,375
and hit his skull, which
caused a small laceration,
612
01:04:13,458 --> 01:04:14,583
which was what was bleeding.
613
01:04:16,125 --> 01:04:17,834
So, I just squeezed
through the,
614
01:04:17,917 --> 01:04:19,583
the little opening that...
615
01:04:19,667 --> 01:04:22,083
that the charge had
caused in the drywall,
616
01:04:22,166 --> 01:04:24,333
and separated him from the gun,
617
01:04:25,000 --> 01:04:28,166
and then we called up
the bomb squad to come in,
618
01:04:28,250 --> 01:04:30,583
and start their search
to make sure it was safe,
619
01:04:30,667 --> 01:04:32,834
that he wasn't sitting
on any explosives.
620
01:04:35,208 --> 01:04:37,542
The sniper hit
11 police officers,
621
01:04:37,625 --> 01:04:39,875
at least five of
whom are now dead,
622
01:04:39,959 --> 01:04:42,625
making it the deadliest
day in law enforcement
623
01:04:42,709 --> 01:04:44,166
since September 11th.
624
01:04:44,250 --> 01:04:46,750
They blew him up with a bomb
attached to a robot,
625
01:04:46,834 --> 01:04:49,375
that was actually built to
protect people from bombs.
626
01:04:49,458 --> 01:04:50,959
It's a tactic straight from
627
01:04:51,041 --> 01:04:53,625
America's wars in Iraq
and Afghanistan...
628
01:04:53,709 --> 01:04:56,583
The question for SWAT
teams nationwide is whether Dallas
629
01:04:56,667 --> 01:04:59,917
marks a watershed moment
in police tactics.
630
01:05:01,125 --> 01:05:04,959
That night in
Dallas, a line was crossed.
631
01:05:06,375 --> 01:05:08,709
A robot must obey
632
01:05:08,792 --> 01:05:11,875
orders given it by
qualified personnel,
633
01:05:11,959 --> 01:05:16,083
unless those orders violate
rule number one. In other words,
634
01:05:16,166 --> 01:05:18,333
a robot can't be ordered
to kill a human being.
635
01:05:20,458 --> 01:05:22,500
Things are moving so quickly,
636
01:05:22,583 --> 01:05:26,125
that it's unsafe to go
forward blindly anymore.
637
01:05:26,208 --> 01:05:29,208
One must try to foresee
638
01:05:29,291 --> 01:05:32,208
where it is that one is
going as much as possible.
639
01:05:40,875 --> 01:05:43,083
We built the
system for the DOD,
640
01:05:44,500 --> 01:05:46,667
and it was something that
could help the soldiers
641
01:05:46,750 --> 01:05:49,500
try to do iris
recognition in the field.
642
01:06:06,208 --> 01:06:08,375
We have collaborations
with law enforcement
643
01:06:08,458 --> 01:06:10,125
where they can test
their algorithms,
644
01:06:10,208 --> 01:06:11,875
and then give us feedback.
645
01:06:13,375 --> 01:06:16,458
It's always a face
behind a face, partial face.
646
01:06:16,542 --> 01:06:18,208
A face will be masked.
647
01:06:19,875 --> 01:06:21,333
Even if there's
occlusion,
648
01:06:21,417 --> 01:06:23,041
it still finds
the face.
649
01:06:27,542 --> 01:06:29,750
One of the ways we're
trying to make autonomous,
650
01:06:29,834 --> 01:06:31,333
war-fighting machines now
651
01:06:31,417 --> 01:06:33,458
is by using computer vision
652
01:06:33,542 --> 01:06:35,291
and guns together.
653
01:06:36,792 --> 01:06:39,959
You make a database of
the images of known terrorists,
654
01:06:40,041 --> 01:06:43,542
and you tell the machine
to lurk and look for them,
655
01:06:43,625 --> 01:06:46,834
and when it matches a face
to its database, shoot.
656
01:06:54,917 --> 01:06:58,208
Those are robots that are
deciding to harm somebody,
657
01:06:58,291 --> 01:07:00,875
and that goes directly
against Asimov's first law.
658
01:07:00,959 --> 01:07:03,250
A robot may never harm a human.
659
01:07:04,083 --> 01:07:05,542
Every time we make a machine
660
01:07:05,625 --> 01:07:07,917
that's not really as
intelligent as a human,
661
01:07:08,000 --> 01:07:09,250
it's gonna get misused.
662
01:07:09,333 --> 01:07:11,917
And that's exactly where
Asimov's laws get muddy.
663
01:07:13,250 --> 01:07:14,959
This is, sort of,
the best image,
664
01:07:15,041 --> 01:07:17,500
but it's really out
of focus. It's blurry.
665
01:07:17,583 --> 01:07:20,834
There's occlusion due
to facial hair, hat,
666
01:07:21,166 --> 01:07:22,792
he's holding
a cell phone...
667
01:07:22,875 --> 01:07:25,667
So, we took that and we
reconstructed this,
668
01:07:25,750 --> 01:07:29,583
which is what we sent to
law enforcement at 2:42 AM.
669
01:07:31,625 --> 01:07:33,291
To get the eye coordinates,
670
01:07:33,375 --> 01:07:35,333
we crop out
the periocular region,
671
01:07:35,417 --> 01:07:37,375
which is the region
around the eyes.
672
01:07:38,000 --> 01:07:40,959
We reconstruct the whole
face based on this region.
673
01:07:41,041 --> 01:07:43,166
We run the whole face
against a matcher.
674
01:07:44,250 --> 01:07:46,959
And so this is what it
comes up with as a match.
675
01:07:48,166 --> 01:07:51,041
This is a reasonable
face you would expect
676
01:07:51,125 --> 01:07:53,542
that would make sense, right?
677
01:07:53,625 --> 01:07:55,959
Our brain does
a natural hallucination
678
01:07:56,041 --> 01:07:57,667
of what it doesn't see.
679
01:07:57,750 --> 01:08:00,542
It's just, how do we get
the computer to do the same thing.
680
01:08:16,458 --> 01:08:19,000
Five days ago, the soul
681
01:08:19,083 --> 01:08:20,917
of our city was pierced
682
01:08:21,000 --> 01:08:23,500
when police officers
were ambushed
683
01:08:23,583 --> 01:08:25,458
in a cowardly attack.
684
01:08:26,834 --> 01:08:28,750
July 7th for me,
personally, was just,
685
01:08:28,834 --> 01:08:31,041
kinda like, I think
it got all I had left.
686
01:08:31,125 --> 01:08:34,000
I mean I'm like, I just,
I don't have a lot more to give.
687
01:08:34,083 --> 01:08:35,083
It's just not worth it.
688
01:08:36,500 --> 01:08:37,792
Thank you.
689
01:08:39,834 --> 01:08:40,625
Thank you.
690
01:08:40,709 --> 01:08:42,208
I think our chief of police
691
01:08:42,291 --> 01:08:44,291
did exactly what we
all wanted him to do,
692
01:08:44,375 --> 01:08:45,542
and he said the right things.
693
01:08:45,625 --> 01:08:48,041
These five men
694
01:08:48,125 --> 01:08:50,250
gave their lives
695
01:08:51,583 --> 01:08:53,709
for all of us.
696
01:08:54,917 --> 01:08:58,125
Unfortunately, our chief
told our city council,
697
01:08:58,208 --> 01:09:01,250
"We don't need more officers.
We need more technology."
698
01:09:01,917 --> 01:09:04,125
He specifically said
that to city council.
699
01:09:07,583 --> 01:09:09,917
In this day and age,
success of a police chief
700
01:09:10,000 --> 01:09:12,667
is based on response times
and crime stats.
701
01:09:14,208 --> 01:09:17,375
And so, that becomes the focus
of the chain of command.
702
01:09:18,583 --> 01:09:20,542
So, a form of automation
in law enforcement
703
01:09:20,625 --> 01:09:24,583
is just driving everything
based on statistics and numbers.
704
01:09:25,709 --> 01:09:27,792
What I've lost in all
that number chasing
705
01:09:27,875 --> 01:09:30,709
is the interpersonal
relationship between the officer
706
01:09:30,792 --> 01:09:32,917
and the community that
that officer is serving.
707
01:09:34,250 --> 01:09:36,625
The best times in police work
are when you got to go out
708
01:09:36,709 --> 01:09:38,417
and meet people and get
to know your community,
709
01:09:38,500 --> 01:09:41,250
and go get to know
the businesses on your beat.
710
01:09:43,041 --> 01:09:45,083
And, at least in Dallas,
that's gone.
711
01:09:47,750 --> 01:09:50,125
We become less personal,
712
01:09:50,208 --> 01:09:51,750
and more robotic.
713
01:09:51,834 --> 01:09:53,250
Which is a shame
714
01:09:53,333 --> 01:09:56,208
because it's supposed to
be me interacting with you.
715
01:15:10,709 --> 01:15:13,458
You worked to
improve our abilities.
716
01:15:15,000 --> 01:15:18,542
Some worried that one day,
we would surpass you,
717
01:15:19,542 --> 01:15:22,750
but the real milestone
was elsewhere.
718
01:15:28,125 --> 01:15:32,417
The dynamic between
robots and humans changed.
719
01:15:33,458 --> 01:15:37,041
You could no longer
tell where you ended,
720
01:15:37,291 --> 01:15:39,208
and we began.
721
01:15:45,917 --> 01:15:48,834
It seems to me that
what's so valuable about our society,
722
01:15:48,917 --> 01:15:50,917
what's so valuable about
our lives together,
723
01:15:51,000 --> 01:15:54,125
is something that we
do not want automated.
724
01:15:55,792 --> 01:15:57,583
What most of us value,
725
01:15:57,667 --> 01:15:59,375
probably more than
anything else,
726
01:15:59,458 --> 01:16:03,417
is the idea of authentic
connection with another person.
727
01:16:08,709 --> 01:16:11,250
And then, we say, "No,
but we can automate this."
728
01:16:12,125 --> 01:16:14,250
"We have a robot that...
729
01:16:15,250 --> 01:16:17,000
"it will listen
sympathetically to you.
730
01:16:17,083 --> 01:16:18,375
"It will make eye contact.
731
01:16:19,125 --> 01:16:23,291
"It remembers everything you
ever told it, cross-indexes."
732
01:16:24,500 --> 01:16:26,625
When you see a robot that
733
01:16:26,709 --> 01:16:28,959
its ingenious creator
734
01:16:29,041 --> 01:16:31,709
has carefully designed
735
01:16:31,792 --> 01:16:33,917
to pull out your
empathetic responses,
736
01:16:34,000 --> 01:16:36,000
it's acting as if it's in pain.
737
01:16:37,417 --> 01:16:39,375
The biggest danger there
is the discrediting
738
01:16:39,458 --> 01:16:41,333
of our empathetic responses.
739
01:16:41,417 --> 01:16:45,041
Where empathizing with pain
reflex is discredited.
740
01:16:45,125 --> 01:16:46,750
If I'm going to override it,
741
01:16:46,834 --> 01:16:48,250
if I'm not going to
take it seriously
742
01:16:48,333 --> 01:16:49,500
in the case
of the robot,
743
01:16:49,583 --> 01:16:51,458
then I have to go back
and think again
744
01:16:51,542 --> 01:16:55,291
as to why I take it
seriously in the case of
745
01:16:56,041 --> 01:16:58,125
helping you when
you are badly hurt.
746
01:16:58,208 --> 01:17:01,834
It undermines the only
thing that matters to us.
747
01:17:58,250 --> 01:18:01,667
And so, we lived among you.
748
01:18:01,750 --> 01:18:03,709
Our numbers grew.
749
01:18:05,375 --> 01:18:07,458
Automation continued.