1
00:00:11,208 --> 00:00:13,458
♪ ♪
2
00:00:13,542 --> 00:00:15,625
(car humming)
3
00:00:18,834 --> 00:00:20,500
(machine whirring)
4
00:00:20,583 --> 00:00:23,917
(hydraulic hissing)
5
00:00:26,875 --> 00:00:29,792
(hydraulic whirring)
6
00:00:29,875 --> 00:00:33,000
(alarm blaring)
7
00:00:36,208 --> 00:00:39,291
(siren wailing)
8
00:00:40,792 --> 00:00:42,750
(beeping)
9
00:00:46,083 --> 00:00:49,750
(in robotic voice):
This is the story
of automation,
10
00:00:49,834 --> 00:00:54,000
and of the people
lost in the process.
11
00:00:55,250 --> 00:00:59,375
Our story begins in
a small town in Germany.
12
00:00:59,458 --> 00:01:02,959
♪ ♪
13
00:01:03,041 --> 00:01:05,792
(distant siren wailing)
14
00:01:09,750 --> 00:01:11,500
(whirring)
15
00:01:16,625 --> 00:01:18,917
♪ ♪
16
00:01:19,000 --> 00:01:21,125
(men speaking German)
17
00:01:33,000 --> 00:01:34,792
(crackling)
18
00:01:39,083 --> 00:01:41,208
(man speaking German)
19
00:01:49,667 --> 00:01:51,041
(crackles)
20
00:02:22,041 --> 00:02:25,667
♪ ♪
21
00:02:30,083 --> 00:02:31,542
(beeping)
22
00:02:36,875 --> 00:02:39,250
Sven Kühling:
It was quite a normal
day at my office,
23
00:02:39,333 --> 00:02:42,583
and I heard from an informant
24
00:02:42,667 --> 00:02:45,834
that an accident
had happened,
25
00:02:45,917 --> 00:02:47,834
and I had to call
26
00:02:47,917 --> 00:02:51,208
the spokesman of
the Volkswagen factory.
27
00:02:51,291 --> 00:02:53,000
(ticking)
28
00:02:53,083 --> 00:02:54,583
♪ ♪
29
00:02:58,792 --> 00:03:00,417
(beeping)
30
00:03:02,083 --> 00:03:05,500
We have the old sentence,
we journalists,
31
00:03:05,583 --> 00:03:08,709
"Dog bites a man
or a man bites a dog."
32
00:03:08,792 --> 00:03:12,917
What's the news?
So, here is the same.
33
00:03:13,000 --> 00:03:17,208
A man bites a dog,
a robot killed a man.
34
00:03:17,291 --> 00:03:20,125
(ticking continues)
35
00:03:24,792 --> 00:03:26,500
(beeping)
36
00:03:29,792 --> 00:03:32,834
The spokesman said that
the accident happened,
37
00:03:32,917 --> 00:03:35,875
but then he paused for a moment.
38
00:03:35,959 --> 00:03:37,875
So, I...
39
00:03:37,959 --> 00:03:41,875
think he didn't want
to say much more.
40
00:03:41,959 --> 00:03:44,083
♪ ♪
41
00:03:49,000 --> 00:03:51,709
(rattling)
42
00:03:51,792 --> 00:03:52,709
(beeps)
43
00:03:52,792 --> 00:03:54,500
(man speaking German)
44
00:04:13,834 --> 00:04:15,458
♪ ♪
45
00:04:15,542 --> 00:04:19,333
Kühling:
The young worker
installed a robot cage,
46
00:04:20,000 --> 00:04:22,917
and he told his colleague
47
00:04:23,000 --> 00:04:25,417
to start the robot.
48
00:04:25,500 --> 00:04:27,166
♪ ♪
49
00:04:29,500 --> 00:04:32,333
(whirring, clanking)
50
00:04:34,000 --> 00:04:36,917
The robot took the man
51
00:04:37,000 --> 00:04:41,375
and pressed him
against a metal wall,
52
00:04:41,458 --> 00:04:44,875
so his chest was crushed.
53
00:04:52,333 --> 00:04:54,667
(whirring, beeping)
54
00:04:54,750 --> 00:04:57,959
(hissing)
55
00:04:58,041 --> 00:05:01,166
♪ ♪
56
00:05:04,375 --> 00:05:06,333
(beeping)
57
00:05:18,083 --> 00:05:19,583
(speaking German)
58
00:06:00,625 --> 00:06:02,166
♪ ♪
59
00:06:02,250 --> 00:06:05,875
Kodomoroid:
The dead man's identity
was never made public.
60
00:06:06,417 --> 00:06:09,917
The investigation
remained open for years.
61
00:06:10,917 --> 00:06:12,792
Production continued.
62
00:06:13,834 --> 00:06:17,291
(metronome ticking)
63
00:06:17,375 --> 00:06:18,959
(speaking German)
64
00:06:24,834 --> 00:06:29,250
Kodomoroid:
Automation of labor made
humans more robotic.
65
00:06:43,250 --> 00:06:45,792
(ticking continuing)
66
00:06:51,291 --> 00:06:53,333
(man 1 speaking German)
67
00:07:05,542 --> 00:07:07,500
(man 2 speaking German)
68
00:07:16,000 --> 00:07:18,000
(man 1 speaking)
69
00:07:39,709 --> 00:07:42,750
♪ ♪
70
00:07:44,375 --> 00:07:45,542
(beeping)
71
00:07:50,375 --> 00:07:52,458
Kodomoroid:
A robot is a machine
72
00:07:52,542 --> 00:07:55,166
that operates automatically,
73
00:07:55,250 --> 00:07:57,500
with human-like skill.
74
00:07:58,375 --> 00:08:00,834
The term derives
from the Czech words
75
00:08:00,917 --> 00:08:03,333
for worker and slave.
76
00:08:08,000 --> 00:08:09,834
(whirring)
77
00:08:30,000 --> 00:08:32,583
(drill whirring)
78
00:08:38,375 --> 00:08:40,458
(drill whirring)
79
00:08:42,875 --> 00:08:44,917
♪ ♪
80
00:08:52,792 --> 00:08:54,750
Hey, Annie.
81
00:08:54,834 --> 00:08:56,792
(whirring)
82
00:09:02,291 --> 00:09:03,917
(whirring)
83
00:09:05,792 --> 00:09:07,500
(beeping)
84
00:09:14,917 --> 00:09:18,417
Walter:
Well, we are engineers,
and we are really not
emotional guys.
85
00:09:18,834 --> 00:09:21,041
But sometimes Annie
does something funny,
86
00:09:21,125 --> 00:09:25,333
and that, of course,
invokes some amusement.
87
00:09:25,417 --> 00:09:27,875
Especially when Annie
happens to press
88
00:09:27,959 --> 00:09:30,959
one of the emergency stop
buttons by herself
89
00:09:31,041 --> 00:09:33,375
and is then incapacitated.
90
00:09:34,709 --> 00:09:36,625
We have some memories
91
00:09:36,709 --> 00:09:39,709
of that happening
in situations...
92
00:09:40,834 --> 00:09:44,000
and this was--
we have quite a laugh.
93
00:09:45,041 --> 00:09:46,834
-(whirring)
-Okay.
94
00:09:48,333 --> 00:09:51,750
Kodomoroid:
We became better at
learning by example.
95
00:09:51,834 --> 00:09:53,041
♪ ♪
96
00:09:53,125 --> 00:09:56,625
You could simply show
us how to do something.
97
00:09:56,709 --> 00:09:58,834
(Walter speaks German)
98
00:10:07,041 --> 00:10:09,625
Walter:
When you are an engineer
in the field of automation,
99
00:10:09,709 --> 00:10:11,834
you may face problems
with workers.
100
00:10:11,917 --> 00:10:14,208
Sometimes,
they get angry at you
101
00:10:14,291 --> 00:10:17,250
just by seeing you somewhere,
and shout at you,
102
00:10:17,333 --> 00:10:18,834
"You are taking my job away."
103
00:10:18,917 --> 00:10:21,375
"What? I'm not here
to take away your job."
104
00:10:22,500 --> 00:10:26,333
But, yeah, sometimes
you get perceived that way.
105
00:10:26,417 --> 00:10:27,875
But, I think,
106
00:10:27,959 --> 00:10:31,083
in the field of human-robot
collaboration, where...
107
00:10:31,166 --> 00:10:34,291
we actually are
working at the moment,
108
00:10:34,375 --> 00:10:36,542
mostly is...
109
00:10:36,625 --> 00:10:38,458
human-robot
collaboration is a thing
110
00:10:38,542 --> 00:10:40,375
where we don't want
to replace a worker.
111
00:10:40,458 --> 00:10:42,000
We want to support workers,
112
00:10:42,083 --> 00:10:43,667
and we want to, yeah...
113
00:10:44,583 --> 00:10:46,000
to, yeah...
114
00:10:46,083 --> 00:10:48,208
(machinery rumbling)
115
00:10:48,875 --> 00:10:50,709
(latches clinking)
116
00:10:53,041 --> 00:10:54,709
(workers conversing
indistinctly)
117
00:10:58,125 --> 00:10:59,792
(machine hisses)
118
00:11:02,208 --> 00:11:03,500
(clicks)
119
00:11:03,583 --> 00:11:04,917
(hissing)
120
00:11:06,458 --> 00:11:08,583
♪ ♪
121
00:11:17,417 --> 00:11:20,375
Kodomoroid:
In order to work
alongside you,
122
00:11:20,458 --> 00:11:22,834
we needed to know which lines
123
00:11:22,917 --> 00:11:25,542
we could not cross.
124
00:11:27,750 --> 00:11:30,083
(whirring)
125
00:11:30,250 --> 00:11:31,917
(typing)
126
00:11:35,083 --> 00:11:37,166
(thudding)
127
00:11:37,250 --> 00:11:39,291
Stop. Stop. Stop.
128
00:11:40,959 --> 00:11:42,458
(thuds)
129
00:11:45,500 --> 00:11:47,083
Hi, I'm Simon Borgen.
130
00:11:47,166 --> 00:11:50,041
We are with Dr. Isaac Asimov,
131
00:11:50,125 --> 00:11:52,542
a biochemist, who may
be the most widely read
132
00:11:52,625 --> 00:11:54,667
of all science fiction writers.
133
00:11:55,792 --> 00:11:57,083
♪ ♪
134
00:11:57,166 --> 00:11:58,875
Kodomoroid:
In 1942,
135
00:11:58,959 --> 00:12:02,500
Isaac Asimov created
a set of guidelines
136
00:12:02,583 --> 00:12:05,083
to protect human society.
137
00:12:05,959 --> 00:12:09,250
The first law was that a robot
couldn't hurt a human being,
138
00:12:09,333 --> 00:12:10,709
or, through inaction,
139
00:12:10,792 --> 00:12:12,875
allow a human being
to come to harm.
140
00:12:12,959 --> 00:12:16,125
The second law was that
a robot had to obey
141
00:12:16,208 --> 00:12:18,125
orders given it
by human beings,
142
00:12:18,208 --> 00:12:20,959
provided that didn't
conflict with the first law.
143
00:12:21,041 --> 00:12:22,750
Scientists say that
when robots are built,
144
00:12:22,834 --> 00:12:25,083
that they may be built
according to these laws,
145
00:12:25,166 --> 00:12:27,542
and also that almost all
science fiction writers
146
00:12:27,625 --> 00:12:30,542
have adopted them as
well in their stories.
147
00:12:31,333 --> 00:12:33,000
(whirring)
148
00:12:34,709 --> 00:12:38,333
Sami Haddadin:
It's not just a statement
from a science fiction novel.
149
00:12:38,750 --> 00:12:40,625
My dissertation's
name was actually,
150
00:12:40,709 --> 00:12:41,834
Towards Safe Robots,
151
00:12:41,917 --> 00:12:43,750
Approaching
Asimov's First Law.
152
00:12:43,834 --> 00:12:46,166
How can we make robots
really fundamentally safe,
153
00:12:46,250 --> 00:12:49,750
according to
Isaac Asimov's first law.
154
00:12:50,375 --> 00:12:53,041
There was this accident
where a human worker
155
00:12:53,125 --> 00:12:55,000
got crushed
by industrial robot.
156
00:12:55,875 --> 00:12:57,583
I was immediately
thinking that
157
00:12:57,667 --> 00:13:00,333
the robot is an industrial,
classical robot,
158
00:13:00,417 --> 00:13:03,542
not able to sense contact,
not able to interact.
159
00:13:03,625 --> 00:13:05,208
Is this a robot
160
00:13:05,291 --> 00:13:07,709
that we, kind of,
want to collaborate with?
161
00:13:07,792 --> 00:13:10,000
No, it's not.
It's inherently forbidden.
162
00:13:10,083 --> 00:13:11,333
We put them behind cages.
163
00:13:11,417 --> 00:13:12,834
We don't want to interact
with them.
164
00:13:12,917 --> 00:13:15,583
We put them behind cages
because they are dangerous,
165
00:13:15,667 --> 00:13:17,458
because they are
inherently unsafe.
166
00:13:17,542 --> 00:13:19,250
(whirring)
167
00:13:21,667 --> 00:13:23,500
(clatters)
168
00:13:25,542 --> 00:13:27,083
More than 10 years ago,
169
00:13:27,166 --> 00:13:30,542
I did the first experiments
in really understanding, uh,
170
00:13:30,625 --> 00:13:33,291
what does it mean
if a robot hits a human.
171
00:13:33,375 --> 00:13:35,417
-(grunts)
-(laughter)
172
00:13:36,000 --> 00:13:38,709
I put myself
as the first guinea pig.
173
00:13:39,375 --> 00:13:40,792
I didn't want to go through
174
00:13:40,875 --> 00:13:42,709
all the legal authorities.
175
00:13:42,792 --> 00:13:44,834
I just wanted to know it,
and that night,
176
00:13:44,917 --> 00:13:46,834
I decided at 6:00 p.m.,
177
00:13:46,917 --> 00:13:48,917
when everybody's gone,
I'm gonna do these experiments.
178
00:13:49,000 --> 00:13:52,625
And I took one of the students
to activate the camera,
179
00:13:52,709 --> 00:13:54,000
and then I just did it.
180
00:13:54,083 --> 00:13:55,959
♪ ♪
181
00:13:58,500 --> 00:14:00,417
-(smacks)
-(laughing)
182
00:14:02,250 --> 00:14:03,750
(whirring)
183
00:14:03,834 --> 00:14:06,875
A robot needs to understand
what does it mean to be safe,
184
00:14:06,959 --> 00:14:09,333
what is it that potentially
could harm a human being,
185
00:14:09,417 --> 00:14:11,542
and therefore, prevent that.
186
00:14:11,625 --> 00:14:14,291
So, the next generation
of robots that is
now out there,
187
00:14:14,375 --> 00:14:17,291
is fundamentally
designed for interaction.
188
00:14:18,208 --> 00:14:20,709
(whirring, clicking)
189
00:14:25,750 --> 00:14:27,125
(laughs)
190
00:14:31,458 --> 00:14:35,125
Kodomoroid:
Eventually, it was time
to leave our cages.
191
00:14:36,291 --> 00:14:39,125
♪ ♪
192
00:14:41,583 --> 00:14:43,500
(whirring)
193
00:14:47,834 --> 00:14:49,458
(beeping)
194
00:14:57,125 --> 00:14:58,959
(beeping)
195
00:15:00,333 --> 00:15:01,917
Nourbakhsh:
Techno-optimism
is when we decide
196
00:15:02,000 --> 00:15:03,208
to solve our problem
with technology.
197
00:15:03,291 --> 00:15:05,291
Then we turn our attention
to the technology,
198
00:15:05,375 --> 00:15:07,250
and we pay so much
attention to the technology,
199
00:15:07,333 --> 00:15:09,333
we stop caring about
the sociological issue
200
00:15:09,417 --> 00:15:11,458
we were trying to solve
in the first place.
201
00:15:11,542 --> 00:15:13,375
We'll innovate our way
out of the problems.
202
00:15:13,458 --> 00:15:16,709
Like, whether it's
agriculture or climate change
or whatever, terrorism.
203
00:15:16,792 --> 00:15:19,500
♪ ♪
204
00:15:25,834 --> 00:15:27,667
If you think about
where robotic automation
205
00:15:27,750 --> 00:15:29,917
and employment
displacement starts,
206
00:15:30,000 --> 00:15:32,583
it basically goes back
to industrial automation
207
00:15:32,667 --> 00:15:34,250
that was grand,
large-scale.
208
00:15:34,333 --> 00:15:36,083
Things like welding
machines for cars,
209
00:15:36,166 --> 00:15:38,583
that can move far faster
than a human arm can move.
210
00:15:38,667 --> 00:15:41,709
So, they're doing a job
that increases the rate
211
00:15:41,792 --> 00:15:43,792
at which the assembly
line can make cars.
212
00:15:43,875 --> 00:15:45,417
It displaces some people,
213
00:15:45,500 --> 00:15:48,250
but it massively increases
the GDP of the country
214
00:15:48,333 --> 00:15:50,291
because productivity goes up
because the machines are
215
00:15:50,375 --> 00:15:53,166
so much higher in productivity
terms than the people.
216
00:15:54,709 --> 00:15:58,291
Narrator:
These giant grasshopper-looking
devices work all by themselves
217
00:15:58,375 --> 00:16:00,542
on an automobile
assembly line.
218
00:16:00,625 --> 00:16:02,166
They never complain
about the heat
219
00:16:02,250 --> 00:16:04,333
or about the tedium
of the job.
220
00:16:05,458 --> 00:16:08,291
Nourbakhsh:
Fast-forward to today,
and it's a different dynamic.
221
00:16:08,375 --> 00:16:10,667
(grinding)
222
00:16:10,750 --> 00:16:13,125
You can buy
milling machine robots
223
00:16:13,208 --> 00:16:15,417
that can do all the things
a person can do,
224
00:16:15,500 --> 00:16:19,041
but the milling machine
robot only costs $30,000.
225
00:16:20,000 --> 00:16:22,375
We're talking about machines
now that are so cheap,
226
00:16:22,458 --> 00:16:24,083
that they do exactly
what a human does,
227
00:16:24,166 --> 00:16:26,458
with less money,
even in six months,
228
00:16:26,542 --> 00:16:27,792
than the human costs.
229
00:16:27,875 --> 00:16:30,375
♪ ♪
230
00:16:44,583 --> 00:16:46,750
(Wu Huifen speaking Chinese)
231
00:17:20,792 --> 00:17:23,667
(whirring)
232
00:17:23,750 --> 00:17:26,792
♪ ♪
233
00:17:36,959 --> 00:17:38,291
(beeping)
234
00:18:41,208 --> 00:18:44,500
♪ ♪
235
00:18:48,792 --> 00:18:52,625
Kodomoroid:
After the first wave
of industrial automation,
236
00:18:52,709 --> 00:18:55,625
the remaining
manufacturing jobs
237
00:18:55,709 --> 00:18:59,208
required fine motor skills.
238
00:19:21,709 --> 00:19:24,750
(bell ringing)
239
00:19:27,583 --> 00:19:29,875
-(indistinct chatter)
-(ringing continuing)
240
00:19:35,083 --> 00:19:36,917
(beeping, chimes)
241
00:19:42,166 --> 00:19:44,208
(beeping, chimes)
242
00:19:45,875 --> 00:19:50,458
Kodomoroid:
We helped factory owners
monitor their workers.
243
00:19:50,542 --> 00:19:51,583
(beeping)
244
00:19:52,750 --> 00:19:53,875
(beeping)
245
00:19:56,625 --> 00:19:58,333
♪ ♪
246
00:19:58,417 --> 00:19:59,417
(beeping)
247
00:20:01,959 --> 00:20:03,291
(beeping)
248
00:20:09,959 --> 00:20:10,959
(beeping)
249
00:20:20,417 --> 00:20:21,667
(sizzling)
250
00:20:25,583 --> 00:20:27,792
(Li Zheng speaking Chinese)
251
00:20:48,208 --> 00:20:50,333
♪ ♪
252
00:21:05,542 --> 00:21:07,166
(beeping)
253
00:21:15,834 --> 00:21:18,000
(whirring)
254
00:22:12,917 --> 00:22:14,834
(sizzling)
255
00:22:15,583 --> 00:22:19,625
Kodomoroid:
Your advantage in
precision was temporary.
256
00:22:19,709 --> 00:22:21,750
♪ ♪
257
00:22:22,917 --> 00:22:26,375
We took over
the complex tasks.
258
00:22:30,834 --> 00:22:34,208
You moved to the end
of the production line.
259
00:22:34,792 --> 00:22:37,458
♪ ♪
260
00:22:54,417 --> 00:22:55,500
(beeping)
261
00:23:01,291 --> 00:23:02,625
(beeping)
262
00:23:06,417 --> 00:23:08,959
(indistinct chatter)
263
00:23:15,583 --> 00:23:18,625
(music playing on speaker)
264
00:23:22,667 --> 00:23:24,792
(man speaking in Chinese)
265
00:23:35,000 --> 00:23:37,125
(woman speaking Chinese)
266
00:23:46,625 --> 00:23:50,291
(man speaking on speaker)
267
00:23:59,667 --> 00:24:01,792
(man speaking Chinese)
268
00:24:15,875 --> 00:24:17,875
♪ ♪
269
00:24:17,959 --> 00:24:20,083
(Luo Jun speaking Chinese)
270
00:24:34,750 --> 00:24:36,375
(beeping)
271
00:24:37,166 --> 00:24:38,667
(chickens clucking)
272
00:24:43,542 --> 00:24:45,583
(chickens clucking)
273
00:24:55,250 --> 00:24:57,625
♪ ♪
274
00:25:03,917 --> 00:25:06,041
♪ ♪
275
00:25:06,750 --> 00:25:08,917
(woman speaking Chinese)
276
00:25:25,625 --> 00:25:27,250
(indistinct chatter)
277
00:25:32,583 --> 00:25:35,750
♪ ♪
278
00:25:38,375 --> 00:25:40,667
(Wang Chao speaking Chinese)
279
00:25:58,667 --> 00:26:00,792
(beeping)
280
00:26:21,208 --> 00:26:23,834
♪ ♪
281
00:26:44,667 --> 00:26:47,166
(buzzing)
282
00:26:51,875 --> 00:26:54,917
Automation of the service sector
283
00:26:55,000 --> 00:26:58,875
required your trust
and cooperation.
284
00:27:00,375 --> 00:27:03,834
Man:
Here we are, stop-and-go
traffic on 271, and--
285
00:27:03,917 --> 00:27:07,041
Ah, geez,
the car's doing it all itself.
286
00:27:07,125 --> 00:27:09,500
What am I gonna do
with my hands down here?
287
00:27:09,583 --> 00:27:11,500
(beeping)
288
00:27:18,875 --> 00:27:19,750
(beeps)
289
00:27:19,834 --> 00:27:21,875
And now,
it's on autosteer.
290
00:27:22,542 --> 00:27:25,375
So, now I've gone
completely hands-free.
291
00:27:25,875 --> 00:27:28,625
In the center area here
is where the big deal is.
292
00:27:28,709 --> 00:27:30,792
This icon up to
the left is my TACC,
293
00:27:30,875 --> 00:27:33,041
the Traffic-Aware
Cruise Control.
294
00:27:36,000 --> 00:27:38,208
It does a great job of
keeping you in the lane,
295
00:27:38,291 --> 00:27:39,625
and driving
down the road,
296
00:27:39,709 --> 00:27:41,709
and keeping you safe,
and all that kind of stuff,
297
00:27:41,792 --> 00:27:43,917
watching all
the other cars.
298
00:27:45,041 --> 00:27:47,542
Autosteer is probably going
to do very, very poorly.
299
00:27:47,625 --> 00:27:50,458
I'm in a turn
that's very sharp.
300
00:27:50,542 --> 00:27:52,750
-(beeping)
-And, yep,
it said take control.
301
00:27:55,834 --> 00:27:57,917
(horn honking)
302
00:27:58,917 --> 00:28:01,125
-(Twitter whistles)
-(indistinct video audio)
303
00:28:12,125 --> 00:28:14,166
(phone ringing)
304
00:28:14,250 --> 00:28:16,834
Operator (on phone):
911, what is the address
of your emergency?
305
00:28:16,917 --> 00:28:18,375
Man (on phone):
There was just a wreck.
306
00:28:18,458 --> 00:28:20,625
A head-on collision right
here-- Oh my God almighty.
307
00:28:21,250 --> 00:28:23,458
Operator:
Okay, sir, you're on 27?
308
00:28:23,542 --> 00:28:24,375
Man:
Yes, sir.
309
00:28:24,458 --> 00:28:27,000
♪ ♪
310
00:28:33,667 --> 00:28:36,667
Bobby Vankaveelar:
I had just got to work,
clocked in.
311
00:28:36,750 --> 00:28:38,709
They get a phone call
from my sister,
312
00:28:38,792 --> 00:28:40,709
telling me there was
a horrific accident.
313
00:28:40,792 --> 00:28:44,083
That there was somebody
deceased in the front yard.
314
00:28:44,500 --> 00:28:45,583
(beeping)
315
00:28:46,417 --> 00:28:49,750
The Tesla was coming down
the hill of highway 27.
316
00:28:49,834 --> 00:28:52,792
The sensor didn't read
the object in front of them,
317
00:28:52,875 --> 00:28:56,583
which was the, um,
semi-trailer.
318
00:28:56,667 --> 00:28:58,959
♪ ♪
319
00:29:04,959 --> 00:29:07,291
The Tesla went right
through the fence
320
00:29:07,375 --> 00:29:10,750
that borders the highway,
through to the retention pond,
321
00:29:10,834 --> 00:29:13,000
then came through
this side of the fence,
322
00:29:13,083 --> 00:29:15,125
that borders my home.
323
00:29:17,250 --> 00:29:20,375
(police radio chatter)
324
00:29:22,000 --> 00:29:24,583
So, I parked
right near here
325
00:29:24,667 --> 00:29:26,625
before I was asked
not to go any further.
326
00:29:26,709 --> 00:29:28,417
I don't wanna see
327
00:29:28,500 --> 00:29:30,583
what's in the veh-- you know,
what's in the vehicle.
328
00:29:30,667 --> 00:29:32,250
You know,
what had happened to him.
329
00:29:32,333 --> 00:29:34,875
♪ ♪
330
00:29:46,959 --> 00:29:48,667
(scanner beeping)
331
00:29:55,583 --> 00:29:58,041
(indistinct chatter)
332
00:29:59,166 --> 00:30:01,291
Donley:
After the police officer
come, they told me,
333
00:30:01,375 --> 00:30:03,875
about 15 minutes
after he was here,
that it was a Tesla.
334
00:30:03,959 --> 00:30:05,792
It was one of
the autonomous cars,
335
00:30:05,875 --> 00:30:10,500
um, and that
they were investigating
why it did not pick up
336
00:30:10,583 --> 00:30:12,792
or register that there
was a semi in front of it,
337
00:30:12,875 --> 00:30:15,000
you know, and start braking,
'cause it didn't even--
338
00:30:15,083 --> 00:30:17,917
You could tell from
the frameway up on top of
the hill it didn't even...
339
00:30:18,583 --> 00:30:20,959
It didn't even recognize
that there was anything
in front of it.
340
00:30:21,041 --> 00:30:23,041
It thought it was open road.
341
00:30:25,250 --> 00:30:28,500
Donley: You might have
an opinion on a Tesla
accident we had out here.
342
00:30:28,583 --> 00:30:30,709
(man speaking)
343
00:30:33,917 --> 00:30:36,000
It was bad,
that's for sure.
344
00:30:36,083 --> 00:30:38,500
I think people just rely
too much on the technology
345
00:30:38,583 --> 00:30:41,083
and don't pay attention
themselves, you know,
346
00:30:41,166 --> 00:30:42,834
to what's going on around them.
347
00:30:42,917 --> 00:30:46,834
Like, since, like him,
he would've known
that there was an issue
348
00:30:46,917 --> 00:30:51,834
if he wasn't relying on the car
to drive while he was
watching a movie.
349
00:30:51,917 --> 00:30:54,000
The trooper had told me
that the driver
350
00:30:54,083 --> 00:30:55,917
had been watching Harry Potter,
351
00:30:56,000 --> 00:30:57,959
you know, at the time
of the accident.
352
00:30:58,041 --> 00:31:01,000
♪ ♪
353
00:31:01,083 --> 00:31:03,583
A news crew from
Tampa, Florida, knocked
on the door, said,
354
00:31:03,667 --> 00:31:06,542
"This is where the accident
happened with the Tesla?"
355
00:31:06,625 --> 00:31:08,208
I said, "Yes, sir."
356
00:31:08,291 --> 00:31:12,041
And he goes, "Do you know
what the significance is
in this accident?"
357
00:31:12,125 --> 00:31:14,000
And I said,
"No, I sure don't."
358
00:31:14,083 --> 00:31:17,959
And he said,
"It's the very first death,
ever, in a driverless car."
359
00:31:19,083 --> 00:31:21,792
I said,
"Is it anybody local?"
360
00:31:21,875 --> 00:31:24,750
And he goes, "Nobody
around here drives a Tesla."
361
00:31:24,834 --> 00:31:28,291
Newsman:
...a deadly crash that's
raising safety concerns
362
00:31:28,375 --> 00:31:29,834
for everyone in Florida.
363
00:31:29,917 --> 00:31:31,542
Newswoman:
It comes as
the state is pushing
364
00:31:31,625 --> 00:31:34,542
to become the nation's
testbed for driverless cars.
365
00:31:34,625 --> 00:31:36,208
Newsman:
Tesla releasing a statement
366
00:31:36,291 --> 00:31:38,875
that cars in autopilot
have safely driven
367
00:31:38,959 --> 00:31:41,500
more than
130 million miles.
368
00:31:41,583 --> 00:31:43,208
Paluska:
ABC Action News reporter
Michael Paluska,
369
00:31:43,291 --> 00:31:46,458
in Williston, Florida,
tonight, digging for answers.
370
00:31:51,625 --> 00:31:52,959
(beeping)
371
00:31:56,083 --> 00:31:58,125
Paluska:
Big takeaway for
me at the scene was
372
00:31:58,208 --> 00:31:59,750
it just didn't stop.
373
00:31:59,834 --> 00:32:02,333
It was driving
down the road,
374
00:32:02,417 --> 00:32:05,083
with the entire top
nearly sheared off,
375
00:32:05,166 --> 00:32:06,375
with the driver dead
376
00:32:06,458 --> 00:32:08,542
after he hit
the truck at 74 mph.
377
00:32:08,625 --> 00:32:11,959
Why did the vehicle not
have an automatic shutoff?
378
00:32:12,041 --> 00:32:14,083
♪ ♪
379
00:32:14,166 --> 00:32:15,417
That was my big question,
380
00:32:15,500 --> 00:32:16,959
one of the questions
we asked Tesla,
381
00:32:17,041 --> 00:32:18,667
that didn't get answered.
382
00:32:19,458 --> 00:32:21,625
All of the statements
from Tesla were that
383
00:32:21,709 --> 00:32:24,375
they're advancing
the autopilot system,
384
00:32:24,458 --> 00:32:26,625
but everything was couched
385
00:32:26,709 --> 00:32:29,792
with the fact that if one
percent of accidents drop
386
00:32:29,875 --> 00:32:32,458
because that's the way
the autopilot system works,
387
00:32:32,542 --> 00:32:33,709
then that's a win.
388
00:32:33,792 --> 00:32:35,834
They kind of missed the mark,
389
00:32:35,917 --> 00:32:37,917
really honoring
Joshua Brown's life,
390
00:32:38,000 --> 00:32:39,458
and the fact that
he died driving a car
391
00:32:39,542 --> 00:32:41,291
that he thought was
going to keep him safe,
392
00:32:41,375 --> 00:32:44,875
at least safer than
the car that I'm driving,
393
00:32:45,208 --> 00:32:47,291
which is a dumb car.
394
00:32:47,375 --> 00:32:49,625
♪ ♪
395
00:32:49,709 --> 00:32:52,875
Vankaveelar:
To be okay with letting
396
00:32:52,959 --> 00:32:55,250
a machine...
397
00:32:55,333 --> 00:32:56,834
take you from
point A to point B,
398
00:32:56,917 --> 00:32:59,792
and then you
actually get used to
399
00:32:59,875 --> 00:33:02,208
getting from point A
to point B okay,
400
00:33:02,959 --> 00:33:05,583
it-- you get, your mind
gets a little bit--
401
00:33:05,667 --> 00:33:07,250
it's just my opinion, okay--
402
00:33:07,333 --> 00:33:10,750
you just, your mind
gets lazier each time.
403
00:33:12,625 --> 00:33:15,125
Kodomoroid:
The accident
was written off
404
00:33:15,208 --> 00:33:17,750
as a case of human error.
405
00:33:18,250 --> 00:33:19,583
(beeping)
406
00:33:20,583 --> 00:33:23,542
Former centers of
manufacturing became
407
00:33:23,625 --> 00:33:27,709
the testing grounds for
the new driverless taxis.
408
00:33:28,917 --> 00:33:31,250
♪ ♪
409
00:33:34,125 --> 00:33:35,583
Nourbakhsh:
If you think
about what happens
410
00:33:35,667 --> 00:33:36,959
when an autonomous
car hits somebody,
411
00:33:37,041 --> 00:33:39,417
it gets really complicated.
412
00:33:41,375 --> 00:33:42,875
The car company's
gonna get sued.
413
00:33:42,959 --> 00:33:44,500
The sensor-maker's
gonna get sued
414
00:33:44,583 --> 00:33:46,291
because they made
the sensor on the robot.
415
00:33:46,375 --> 00:33:49,333
The regulatory framework
is always gonna be behind,
416
00:33:49,417 --> 00:33:52,375
because robot invention
happens faster
417
00:33:52,458 --> 00:33:54,500
than lawmakers can think.
418
00:33:55,625 --> 00:33:57,542
Newswoman:
One of Uber's
self-driving vehicles
419
00:33:57,625 --> 00:33:59,000
killed a pedestrian.
420
00:33:59,083 --> 00:34:01,000
The vehicle was
in autonomous mode,
421
00:34:01,083 --> 00:34:04,792
with an operator
behind the wheel
when the woman was hit.
422
00:34:06,375 --> 00:34:10,083
Newswoman 2:
Tonight, Tesla confirming
this car was in autopilot mode
423
00:34:10,166 --> 00:34:13,709
when it crashed
in Northern California,
killing the driver,
424
00:34:13,792 --> 00:34:16,166
going on to blame
that highway barrier
425
00:34:16,250 --> 00:34:18,792
that's meant
to reduce impact.
426
00:34:20,875 --> 00:34:24,083
Kodomoroid:
After the first
self-driving car deaths,
427
00:34:24,166 --> 00:34:27,625
testing of the new
taxis was suspended.
428
00:34:28,959 --> 00:34:31,250
Nourbakhsh:
It's interesting when you
look at driverless cars.
429
00:34:31,333 --> 00:34:33,291
You see the same kinds
of value arguments.
430
00:34:33,375 --> 00:34:35,333
30,000 people die
every year,
431
00:34:35,417 --> 00:34:37,208
runoff road accidents
in the US alone.
432
00:34:37,291 --> 00:34:38,709
So, don't we wanna
save all those lives?
433
00:34:38,792 --> 00:34:40,750
Let's have cars
drive instead.
434
00:34:40,834 --> 00:34:42,125
Now, you have
to start thinking
435
00:34:42,208 --> 00:34:44,458
about the side effects
on society.
436
00:34:45,333 --> 00:34:48,041
Are we getting rid of every
taxi driver in America?
437
00:34:50,333 --> 00:34:53,458
Our driver partners are
the heart and soul
of this company
438
00:34:53,917 --> 00:34:57,500
and the only reason we've come
this far in just five years.
439
00:34:58,542 --> 00:35:00,417
Nourbakhsh:
If you look at Uber's
first five years,
440
00:35:00,500 --> 00:35:02,500
they're actually
empowering people.
441
00:35:02,583 --> 00:35:05,291
But when the same company
does really hardcore research
442
00:35:05,375 --> 00:35:07,208
to now replace
all those people,
443
00:35:07,291 --> 00:35:09,125
so they don't
need them anymore,
444
00:35:09,208 --> 00:35:10,500
then what you're
seeing is
445
00:35:10,583 --> 00:35:12,500
they're already a highly
profitable company,
446
00:35:12,583 --> 00:35:15,166
but they simply want
to increase that profit.
447
00:35:17,500 --> 00:35:19,375
(beeping)
448
00:35:27,500 --> 00:35:29,333
(beeping)
449
00:35:32,959 --> 00:35:35,959
Kodomoroid:
Eventually,
testing resumed.
450
00:35:36,625 --> 00:35:41,000
Taxi drivers' wages became
increasingly unstable.
451
00:35:43,250 --> 00:35:46,500
Newsman:
Police say a man drove up
to a gate outside city hall
452
00:35:46,583 --> 00:35:48,542
and shot himself
in the head.
453
00:35:48,625 --> 00:35:51,208
Newswoman:
He left a note saying
services such as Uber
454
00:35:51,291 --> 00:35:53,625
had financially
ruined his life.
455
00:35:53,709 --> 00:35:56,208
Newsman:
Uber and other
mobile app services
456
00:35:56,291 --> 00:35:58,500
have made a once
well-paying industry
457
00:35:58,583 --> 00:36:01,667
into a mass
of long hours, low pay,
458
00:36:01,750 --> 00:36:03,917
and economic insecurity.
459
00:36:04,583 --> 00:36:07,041
♪ ♪
460
00:36:10,625 --> 00:36:14,583
Kodomoroid:
Drivers were the biggest
part of the service economy.
461
00:36:16,917 --> 00:36:18,750
(beeping)
462
00:36:21,125 --> 00:36:22,625
Brandon Ackerman:
My father, he drove.
463
00:36:22,709 --> 00:36:23,709
My uncle drove.
464
00:36:23,792 --> 00:36:26,417
I kind of grew
up into trucking.
465
00:36:29,583 --> 00:36:31,750
Some of the new
technology that came out
466
00:36:31,834 --> 00:36:34,709
is taking a lot of
the freedom of the job away.
467
00:36:35,792 --> 00:36:37,417
It's more stressful.
468
00:36:38,917 --> 00:36:41,333
Kodomoroid:
Automation of trucking began
469
00:36:41,417 --> 00:36:43,458
with monitoring the drivers
470
00:36:43,542 --> 00:36:45,667
and simplifying their job.
471
00:36:46,875 --> 00:36:49,458
There's a radar system.
472
00:36:49,542 --> 00:36:50,834
There's a camera system.
473
00:36:50,917 --> 00:36:54,000
There's automatic braking
and adaptive cruise.
474
00:36:54,083 --> 00:36:55,917
Everything is controlled--
475
00:36:56,000 --> 00:36:58,542
when you sleep,
how long you break,
476
00:36:58,625 --> 00:37:01,125
where you drive,
where you fuel,
477
00:37:01,792 --> 00:37:03,083
where you shut down.
478
00:37:03,166 --> 00:37:05,792
It even knows if somebody
was in the passenger seat.
479
00:37:06,583 --> 00:37:10,500
When data gets sent through
the broadband to the company,
480
00:37:11,417 --> 00:37:13,834
sometimes,
you're put in a situation,
481
00:37:15,375 --> 00:37:17,166
maybe because the truck
482
00:37:17,250 --> 00:37:19,250
automatically slowed
you down on the hill
483
00:37:19,333 --> 00:37:21,875
that's a perfectly
good straightaway,
484
00:37:22,000 --> 00:37:23,417
slowed your
average speed down,
485
00:37:23,500 --> 00:37:26,166
so you were one mile
shy of making it
486
00:37:26,250 --> 00:37:28,959
to that safe haven,
and you have to...
487
00:37:29,875 --> 00:37:33,125
get a-- take a chance
of shutting down
on the side of the road.
488
00:37:33,208 --> 00:37:35,750
An inch is a mile out here.
Sometimes you just...
489
00:37:35,834 --> 00:37:38,125
say to yourself, "Well,
I violate the clock one minute,
490
00:37:38,208 --> 00:37:41,166
I might as well just
drive another 600 miles."
491
00:37:42,583 --> 00:37:45,542
You know, but then you're...
you might lose your job.
492
00:37:45,625 --> 00:37:48,667
♪ ♪
493
00:37:52,083 --> 00:37:54,583
We're concerned that it,
it's gonna reduce
494
00:37:54,667 --> 00:37:57,291
the skill of
a truck driver and the pay.
495
00:37:58,041 --> 00:38:00,917
Because you're not gonna
be really driving a truck.
496
00:38:01,000 --> 00:38:02,333
It's gonna be the computer.
497
00:38:06,667 --> 00:38:08,959
Some of us are worried
about losing our houses,
498
00:38:09,041 --> 00:38:10,375
our cars...
499
00:38:13,208 --> 00:38:15,458
having a place to stay.
Some people...
500
00:38:15,542 --> 00:38:18,458
drive a truck just for
the medical insurance,
501
00:38:18,542 --> 00:38:22,166
and a place to stay
and the ability to travel.
502
00:38:22,250 --> 00:38:25,834
♪ ♪
503
00:38:47,166 --> 00:38:50,375
Kodomoroid:
Entire industries
disappeared,
504
00:38:50,458 --> 00:38:53,750
leaving whole regions
in ruins.
505
00:38:56,750 --> 00:38:59,667
♪ ♪
506
00:39:00,542 --> 00:39:03,333
Martin Ford:
Huge numbers of people
feel very viscerally
507
00:39:03,417 --> 00:39:05,417
that they are being left
behind by the economy,
508
00:39:05,500 --> 00:39:07,959
and, in fact,
they're right, they are.
509
00:39:08,041 --> 00:39:10,250
People, of course,
would be more inclined
510
00:39:10,333 --> 00:39:12,417
to point at globalization
511
00:39:12,500 --> 00:39:14,500
or at, maybe, immigration
as being the problems,
512
00:39:14,583 --> 00:39:16,792
but, actually, technology has
played a huge role already.
513
00:39:16,875 --> 00:39:18,125
♪ ♪
514
00:39:18,208 --> 00:39:20,333
Kodomoroid:
The rise of
personal computers
515
00:39:20,417 --> 00:39:23,709
ushered in an era
of digital automation.
516
00:39:25,125 --> 00:39:26,417
(beeping)
517
00:39:27,792 --> 00:39:30,875
Ford:
In the 1990s, I was running
a small software company.
518
00:39:30,959 --> 00:39:32,709
Software was
a tangible product.
519
00:39:32,792 --> 00:39:36,166
You had to put a CD in a box
and send it to a customer.
520
00:39:38,625 --> 00:39:41,709
So, there was a lot of work
there for average people,
521
00:39:41,792 --> 00:39:44,959
people that didn't necessarily
have lots of education.
522
00:39:45,583 --> 00:39:49,458
But I saw in my own business
how that just evaporated
very, very rapidly.
523
00:39:56,291 --> 00:39:58,750
Historically, people move
from farms to factories,
524
00:39:58,834 --> 00:40:01,709
and then later on, of course,
factories automated,
525
00:40:01,792 --> 00:40:04,041
and they off-shored,
and then people moved
to the service sector,
526
00:40:04,125 --> 00:40:07,208
which is where most people
now work in the United States.
527
00:40:07,291 --> 00:40:09,834
♪ ♪
528
00:40:13,500 --> 00:40:16,083
Julia Collins:
I lived on a water buffalo
farm in the south of Italy.
529
00:40:16,166 --> 00:40:19,208
We had 1,000 water buffalo,
and every buffalo
had a different name.
530
00:40:19,291 --> 00:40:23,250
And they were all these
beautiful Italian names
like Tiara, Katerina.
531
00:40:23,333 --> 00:40:25,083
And so, I thought
it would be fun
532
00:40:25,166 --> 00:40:27,500
to do the same thing
with our robots at Zume.
533
00:40:28,834 --> 00:40:30,792
The first two
robots that we have
534
00:40:30,875 --> 00:40:32,500
are named Giorgio and Pepe,
535
00:40:32,583 --> 00:40:34,792
and they dispense sauce.
536
00:40:36,625 --> 00:40:39,917
And then the next robot,
Marta, she's
a FlexPicker robot.
537
00:40:40,000 --> 00:40:42,125
She looks like
a gigantic spider.
538
00:40:43,125 --> 00:40:45,875
And what this robot does
is spread the sauce.
539
00:40:48,667 --> 00:40:50,291
Then we have Bruno.
540
00:40:50,375 --> 00:40:51,959
This is an incredibly
powerful robot,
541
00:40:52,041 --> 00:40:53,792
but he also has
to be very delicate,
542
00:40:53,875 --> 00:40:56,500
so that he can take pizza
off of the assembly line,
543
00:40:56,583 --> 00:40:58,709
and put it into
the 800-degree oven.
544
00:41:00,375 --> 00:41:03,375
And the robot can do this
10,000 times in a day.
545
00:41:05,041 --> 00:41:07,041
Lots of people have
used automation
546
00:41:07,125 --> 00:41:09,792
to create food at scale,
547
00:41:09,875 --> 00:41:12,041
making 10,000 cheese pizzas.
548
00:41:12,875 --> 00:41:15,125
What we're doing is
developing a line
549
00:41:15,208 --> 00:41:17,208
that can respond dynamically
550
00:41:17,291 --> 00:41:20,166
to every single customer
order, in real time.
551
00:41:20,250 --> 00:41:21,875
That hasn't been done before.
552
00:41:24,041 --> 00:41:25,792
So, as you can see right now,
553
00:41:25,875 --> 00:41:27,291
Jose will use the press,
554
00:41:27,375 --> 00:41:30,000
but then he still has to work
the dough with his hands.
555
00:41:30,667 --> 00:41:33,834
So, this is a step that's
not quite optimized yet.
556
00:41:33,917 --> 00:41:36,959
♪ ♪
557
00:41:39,417 --> 00:41:41,792
We have a fifth robot
that's getting fabricated
558
00:41:41,875 --> 00:41:44,125
at a shop across the bay.
559
00:41:44,208 --> 00:41:46,458
He's called Vincenzo.
He takes pizza out,
560
00:41:46,542 --> 00:41:48,792
and puts it into
an individual mobile oven
561
00:41:48,875 --> 00:41:50,875
for transport and delivery.
562
00:41:55,000 --> 00:41:56,500
Ford:
Any kind of work that is
563
00:41:56,583 --> 00:41:59,959
fundamentally routine and
repetitive is gonna disappear,
564
00:42:00,041 --> 00:42:02,959
and we're simply not equipped
to deal with that politically,
565
00:42:03,041 --> 00:42:05,166
because maybe
the most toxic word
566
00:42:05,250 --> 00:42:08,375
in our political vocabulary
is redistribution.
567
00:42:10,250 --> 00:42:12,583
There aren't gonna be
any rising new sectors
568
00:42:12,667 --> 00:42:14,917
that are gonna be there
to absorb all these workers
569
00:42:15,000 --> 00:42:17,083
in the way that, for example,
that manufacturing was there
570
00:42:17,166 --> 00:42:19,709
to absorb all those
agricultural workers
571
00:42:19,792 --> 00:42:21,917
because AI is going
to be everywhere.
572
00:42:22,000 --> 00:42:25,125
♪ ♪
573
00:42:27,166 --> 00:42:30,959
Kodomoroid:
Artificial intelligence
arrived in small steps.
574
00:42:31,750 --> 00:42:34,041
Profits from AI
concentrated
575
00:42:34,125 --> 00:42:37,250
in the hands of
the technology owners.
576
00:42:38,792 --> 00:42:42,875
Income inequality
reached extreme levels.
577
00:42:43,500 --> 00:42:48,750
-(beeping)
-The touchscreen made
most service work obsolete.
578
00:42:52,750 --> 00:42:54,667
♪ ♪
579
00:43:00,667 --> 00:43:02,375
♪ ♪
580
00:43:35,041 --> 00:43:36,458
Tim Hwang:
After I graduated college,
581
00:43:36,542 --> 00:43:39,583
I had a friend who had
just gone to law school.
582
00:43:40,709 --> 00:43:44,041
He was like, "Aw, man,
the first year of law school,
it's super depressing."
583
00:43:46,041 --> 00:43:48,750
All we're doing is
really rote, rote stuff.
584
00:43:50,000 --> 00:43:52,333
Reading through documents
and looking for a single word,
585
00:43:52,417 --> 00:43:53,625
or I spent the whole afternoon
586
00:43:53,709 --> 00:43:55,542
replacing this word
with another word.
587
00:43:55,625 --> 00:43:57,792
And as someone with a kind of
computer science background,
588
00:43:57,875 --> 00:44:01,125
I was like,
"There's so much here
that could be automated."
589
00:44:01,208 --> 00:44:02,875
(beeping)
590
00:44:07,542 --> 00:44:08,792
So, I saw law school
591
00:44:08,875 --> 00:44:11,750
as very much going three
years behind enemy lines.
592
00:44:17,208 --> 00:44:18,542
I took the bar exam,
593
00:44:18,625 --> 00:44:20,250
became a licensed lawyer,
594
00:44:20,333 --> 00:44:22,583
and went to a law firm,
595
00:44:23,083 --> 00:44:25,250
doing largely
transactional law.
596
00:44:26,375 --> 00:44:27,750
And there, my project was
597
00:44:27,834 --> 00:44:30,417
how much can
I automate of my own job?
598
00:44:33,166 --> 00:44:35,542
During the day, I would
manually do this task,
599
00:44:37,500 --> 00:44:38,750
and then at night,
I would go home,
600
00:44:38,834 --> 00:44:40,375
take these legal rules and say,
601
00:44:40,458 --> 00:44:42,709
could I create
a computer rule,
602
00:44:42,792 --> 00:44:44,166
a software rule,
603
00:44:44,250 --> 00:44:46,208
that would do what
I did during the day?
604
00:44:46,291 --> 00:44:48,083
♪ ♪
605
00:44:48,166 --> 00:44:51,291
In a lawsuit,
you get to see a lot
of the evidence
606
00:44:51,375 --> 00:44:53,250
that the other
side's gonna present.
607
00:44:53,333 --> 00:44:55,792
That amount of
documentation is huge.
608
00:44:57,917 --> 00:44:59,291
And the old way was actually,
609
00:44:59,375 --> 00:45:01,333
you would send an attorney
to go and look through
610
00:45:01,417 --> 00:45:03,834
every single page
that was in that room.
611
00:45:05,208 --> 00:45:08,000
The legal profession works
on an hourly billing system.
612
00:45:08,083 --> 00:45:10,000
So, I ended up in a kind of
interesting conundrum,
613
00:45:10,083 --> 00:45:12,750
where what I was doing
was making me more
and more efficient,
614
00:45:12,834 --> 00:45:14,041
I was doing more
and more work,
615
00:45:14,125 --> 00:45:16,667
but I was expending
less and less time on it.
616
00:45:16,750 --> 00:45:19,041
And I realized that this would
become a problem at some point,
617
00:45:19,125 --> 00:45:21,875
so I decided to go
independent. I quit.
618
00:45:21,959 --> 00:45:24,792
♪ ♪
619
00:45:28,125 --> 00:45:29,875
So, there's Apollo Cluster,
who has processed
620
00:45:29,959 --> 00:45:32,542
more than 10 million unique
transactions for clients,
621
00:45:32,625 --> 00:45:34,208
and we have another
partner, Daria,
622
00:45:34,291 --> 00:45:36,875
who focuses on transactions,
623
00:45:36,959 --> 00:45:38,792
and then, and then there's me.
624
00:45:40,875 --> 00:45:42,625
Our systems have generated
625
00:45:42,709 --> 00:45:45,375
tens of thousands
of legal documents.
626
00:45:45,458 --> 00:45:47,083
-(beeping)
-It's signed off by a lawyer,
627
00:45:47,166 --> 00:45:50,166
but largely, kind of, created
and mechanized by our systems.
628
00:45:52,917 --> 00:45:55,875
I'm fairly confident that
compared against human work,
629
00:45:55,959 --> 00:45:58,208
it would be indistinguishable.
630
00:45:58,291 --> 00:46:01,458
♪ ♪
631
00:46:02,667 --> 00:46:04,208
(beeping)
632
00:46:17,875 --> 00:46:19,667
(whirring)
633
00:46:22,667 --> 00:46:23,709
(beeping)
634
00:46:39,500 --> 00:46:41,583
(Ishiguro speaking)
635
00:46:56,917 --> 00:47:00,000
♪ ♪
636
00:47:16,417 --> 00:47:18,458
(whirring)
637
00:47:21,417 --> 00:47:23,417
(robot speaking in Japanese)
638
00:47:46,083 --> 00:47:48,250
(speaking Japanese)
639
00:47:55,291 --> 00:47:57,458
(indistinct chatter)
640
00:48:07,083 --> 00:48:08,291
(giggles)
641
00:48:08,834 --> 00:48:10,333
(beeping)
642
00:48:14,291 --> 00:48:16,458
(Hideaki speaking in Japanese)
643
00:48:32,333 --> 00:48:34,500
(Robot speaking Japanese)
644
00:49:02,083 --> 00:49:03,667
(Hideaki speaking Japanese)
645
00:49:25,792 --> 00:49:27,917
(woman speaking Japanese)
646
00:50:07,583 --> 00:50:10,250
♪ ♪
647
00:50:12,250 --> 00:50:14,208
(whirs, beeps)
648
00:50:16,083 --> 00:50:17,208
(beeping)
649
00:50:21,250 --> 00:50:23,542
(Niigaki speaking Japanese)
650
00:50:54,500 --> 00:50:57,625
♪ ♪
651
00:51:04,333 --> 00:51:05,625
(beeping)
652
00:51:12,417 --> 00:51:14,458
(whirring)
653
00:51:22,875 --> 00:51:25,000
(robot speaking Japanese)
654
00:51:28,709 --> 00:51:31,333
♪ ♪
655
00:51:32,542 --> 00:51:34,667
(speaking Japanese)
656
00:51:43,000 --> 00:51:45,834
(jazzy piano music playing)
657
00:51:46,917 --> 00:51:48,458
-(beeps)
-(lock clicks)
658
00:51:54,208 --> 00:51:56,208
(piano music continuing)
659
00:52:03,458 --> 00:52:05,667
(automated voice
speaking Japanese)
660
00:52:30,125 --> 00:52:34,000
When we first appeared,
we were a novelty.
661
00:52:36,375 --> 00:52:38,041
(man speaking Japanese)
662
00:52:48,750 --> 00:52:51,041
(automated voice
speaking Japanese)
663
00:52:58,750 --> 00:53:00,959
(automated voice
speaking Japanese)
664
00:53:05,125 --> 00:53:08,083
♪ ♪
665
00:53:14,208 --> 00:53:16,750
(buzzing)
666
00:53:18,375 --> 00:53:20,458
Kodomoroid:
While doing
your dirty work,
667
00:53:20,542 --> 00:53:23,500
we gathered data
about your habits
668
00:53:23,583 --> 00:53:25,792
-and preferences.
-(humming)
669
00:53:36,291 --> 00:53:38,458
We got to know you better.
670
00:53:44,959 --> 00:53:47,333
(buzzing)
671
00:53:49,333 --> 00:53:51,291
(beeping)
672
00:53:54,834 --> 00:53:56,875
Savvides:
The core of everything
we're doing in this lab,
673
00:53:56,959 --> 00:53:58,959
with our long-range
iris system
674
00:53:59,041 --> 00:54:00,917
is trying to
develop technology
675
00:54:01,000 --> 00:54:03,041
so that the computer
676
00:54:03,125 --> 00:54:05,500
can identify who we
are in a seamless way.
677
00:54:06,375 --> 00:54:07,792
And up till now,
678
00:54:07,875 --> 00:54:10,583
we always have to make
an effort to be identified.
679
00:54:10,667 --> 00:54:12,250
♪ ♪
680
00:54:12,333 --> 00:54:14,750
-(beeping)
-All the systems were very
close-range, Hollywood-style,
681
00:54:14,834 --> 00:54:16,583
where you had to go
close to the camera,
682
00:54:16,667 --> 00:54:20,375
and I always found that
challenging for a user.
683
00:54:21,000 --> 00:54:23,417
If I was a user
interacting with this...
684
00:54:23,500 --> 00:54:26,333
system, with this computer,
with this AI,
685
00:54:26,417 --> 00:54:27,709
I don't wanna be that close.
686
00:54:27,792 --> 00:54:29,875
I feel it's very invasive.
687
00:54:30,709 --> 00:54:32,709
So, what I wanted to
solve with my team here
688
00:54:32,792 --> 00:54:34,500
is how can we capture
689
00:54:34,583 --> 00:54:37,041
and identify who you
are from the iris,
690
00:54:37,125 --> 00:54:39,208
at a bigger distance?
How can we still do that,
691
00:54:39,291 --> 00:54:41,625
and have a pleasant
user experience.
692
00:54:41,709 --> 00:54:44,542
♪ ♪
693
00:54:49,625 --> 00:54:52,083
I think there's
a very negative stigma
694
00:54:52,166 --> 00:54:54,959
when people think about
biometrics and facial
recognition,
695
00:54:55,041 --> 00:54:57,834
and any kind of sort
of profiling of users
696
00:54:57,917 --> 00:55:01,709
for marketing purposes to
buy a particular product.
697
00:55:03,500 --> 00:55:07,125
I think the core
science is neutral.
698
00:55:07,208 --> 00:55:10,250
♪ ♪
699
00:55:11,875 --> 00:55:13,625
Nourbakhsh:
Companies go to no end
700
00:55:13,709 --> 00:55:15,709
to try and figure out
how to sell stuff.
701
00:55:15,792 --> 00:55:17,667
And the more information
they have on us,
702
00:55:17,750 --> 00:55:19,417
the better they
can sell us stuff.
703
00:55:19,500 --> 00:55:20,750
(beeping)
704
00:55:20,834 --> 00:55:22,417
We've reached a point where,
for the first time,
705
00:55:22,500 --> 00:55:23,834
robots are able to see.
706
00:55:23,917 --> 00:55:25,208
They can recognize faces.
707
00:55:25,291 --> 00:55:27,583
They can recognize
the expressions you make.
708
00:55:28,542 --> 00:55:31,375
They can recognize
the microexpressions you make.
709
00:55:33,667 --> 00:55:36,500
You can develop individualized
models of behavior
710
00:55:36,583 --> 00:55:38,500
for every person on Earth,
711
00:55:38,583 --> 00:55:40,250
attach machine learning to it,
712
00:55:40,333 --> 00:55:43,583
and come out with the perfect
model for how to sell to you.
713
00:55:44,000 --> 00:55:45,583
(door squeaks)
714
00:55:49,917 --> 00:55:52,125
(beeping)
715
00:55:57,083 --> 00:55:59,125
♪ ♪
716
00:56:00,542 --> 00:56:04,125
Kodomoroid:
You gave us your
undivided attention.
717
00:56:25,792 --> 00:56:28,041
(whirring)
718
00:56:28,125 --> 00:56:31,250
We offered reliable,
friendly service.
719
00:56:35,333 --> 00:56:38,792
Human capacities
began to deteriorate.
720
00:56:42,041 --> 00:56:47,166
Spatial orientation and memory
were affected first.
721
00:56:48,166 --> 00:56:50,709
♪ ♪
722
00:56:53,000 --> 00:56:56,208
The physical world
and the digital world
723
00:56:56,291 --> 00:56:58,375
became one.
724
00:57:01,959 --> 00:57:03,625
(neon sign buzzing)
725
00:57:05,875 --> 00:57:08,709
You were alone
with your desires.
726
00:57:08,792 --> 00:57:11,667
("What You Gonna Do Now?"
by Carla dal Forno playing)
727
00:57:35,458 --> 00:57:39,000
♪ What you gonna do now ♪
728
00:57:39,083 --> 00:57:44,542
♪ That the night's come
and it's around you? ♪
729
00:57:45,458 --> 00:57:49,000
♪ What you gonna do now ♪
730
00:57:49,083 --> 00:57:54,375
♪ That the night's come
and it surrounds you? ♪
731
00:57:55,500 --> 00:57:58,583
♪ What you gonna do now ♪
732
00:57:59,125 --> 00:58:03,417
♪ That the night's come
and it surrounds you? ♪
733
00:58:03,500 --> 00:58:05,417
(buzzing)
734
00:58:12,500 --> 00:58:15,375
Automation brought
the logic of efficiency
735
00:58:15,458 --> 00:58:18,500
to matters of life and death.
736
00:58:18,583 --> 00:58:20,792
Protesters:
Enough is enough!
737
00:58:20,875 --> 00:58:25,208
Enough is enough!
Enough is enough!
738
00:58:25,291 --> 00:58:27,417
-(gunfire)
-(screaming)
739
00:58:30,083 --> 00:58:34,250
Police Radio:
To all SWAT officers
on channel 2, code 3...
740
00:58:37,625 --> 00:58:39,500
♪ ♪
741
00:58:39,583 --> 00:58:40,834
Get back! Get back!
742
00:58:40,917 --> 00:58:43,500
-(gunfire)
-Police Radio:
The suspect has a rifle.
743
00:58:43,583 --> 00:58:45,333
-(police radio chatter)
-(sirens)
744
00:58:45,417 --> 00:58:46,542
(gunfire)
745
00:58:48,500 --> 00:58:50,750
(sirens wailing)
746
00:58:56,709 --> 00:58:59,750
Police Radio:
We have got to get
(unintelligible) down here...
747
00:58:59,834 --> 00:59:02,000
... right now!
(chatter continues)
748
00:59:02,083 --> 00:59:04,500
Man:
There's a fucking sniper!
He shot four cops!
749
00:59:04,583 --> 00:59:06,959
(gunfire)
750
00:59:07,041 --> 00:59:09,583
Woman:
I'm not going near him!
He's shooting right now!
751
00:59:10,458 --> 00:59:12,667
-(sirens continue)
-(gunfire)
752
00:59:13,834 --> 00:59:16,208
Police Radio:
Looks like he's inside
the El Centro building.
753
00:59:16,291 --> 00:59:18,125
-Inside the El Centro building.
-(radio beeps)
754
00:59:18,208 --> 00:59:21,250
-(gunfire)
-(helicopter whirring)
755
00:59:21,333 --> 00:59:23,250
(indistinct chatter)
756
00:59:24,000 --> 00:59:26,125
Police Radio:
We may have
a suspect pinned down.
757
00:59:26,208 --> 00:59:28,417
-Northwest corner
of the building.
-(radio beeps)
758
00:59:29,291 --> 00:59:32,709
Chris Webb:
Our armored car was
moving in to El Centro
759
00:59:32,792 --> 00:59:34,625
and so I jumped on the back.
760
00:59:37,834 --> 00:59:39,375
(beeping)
761
00:59:41,667 --> 00:59:42,709
(indistinct chatter)
762
00:59:42,792 --> 00:59:44,041
Came in through the rotunda,
763
00:59:44,125 --> 00:59:46,333
where I found two of our
intelligence officers.
764
00:59:46,583 --> 00:59:47,917
They were calm
and cool and they said,
765
00:59:48,000 --> 00:59:49,166
"Everything's upstairs."
766
00:59:51,583 --> 00:59:54,208
-There's a stairwell right here.
-(door squeaks)
767
00:59:54,291 --> 00:59:57,250
♪ ♪
768
00:59:57,500 --> 00:59:59,125
That's how I knew I was
going the right direction
769
00:59:59,208 --> 01:00:01,542
'cause I just kept
following the blood.
770
01:00:02,667 --> 01:00:03,875
Newswoman:
Investigators say
771
01:00:03,959 --> 01:00:05,917
Micah Johnson was
amassing an arsenal
772
01:00:06,000 --> 01:00:08,291
at his home outside Dallas.
773
01:00:08,375 --> 01:00:10,917
Johnson was
an Afghan war veteran.
774
01:00:11,000 --> 01:00:12,542
Every one of these
door handles,
775
01:00:12,625 --> 01:00:14,667
as we worked our way down,
776
01:00:15,291 --> 01:00:17,709
had blood on them,
where he'd been checking them.
777
01:00:19,083 --> 01:00:20,667
Newswoman:
This was a scene of terror
778
01:00:20,750 --> 01:00:23,333
just a couple of hours ago,
and it's not over yet.
779
01:00:23,417 --> 01:00:26,667
(helicopter whirring)
780
01:00:27,250 --> 01:00:28,792
(police radio chatter)
781
01:00:28,875 --> 01:00:31,917
♪ ♪
782
01:00:33,375 --> 01:00:36,458
Webb:
He was hiding behind,
like, a server room.
783
01:00:36,542 --> 01:00:38,834
Our ballistic tip rounds
were getting eaten up.
784
01:00:38,917 --> 01:00:40,041
(gunfire)
785
01:00:40,125 --> 01:00:41,667
He was just hanging
the gun out on the corner
786
01:00:41,750 --> 01:00:43,792
and just firing at the guys.
787
01:00:43,875 --> 01:00:46,375
(siren blares)
788
01:00:46,875 --> 01:00:48,125
(gunfire)
789
01:00:48,208 --> 01:00:49,750
And he kept enticing them.
"Hey, come on down!
790
01:00:49,834 --> 01:00:52,166
Come and get me! Let's go.
Let's get this over with."
791
01:00:54,208 --> 01:00:57,250
Brown:
This suspect we're negotiating
with for the last 45 minutes
792
01:00:57,333 --> 01:00:59,542
has been exchanging
gunfire with us
793
01:00:59,625 --> 01:01:02,792
and not being very cooperative
in the negotiations.
794
01:01:04,250 --> 01:01:06,166
Before I came here,
795
01:01:06,250 --> 01:01:08,250
I asked for plans
796
01:01:08,333 --> 01:01:10,542
to end this standoff,
797
01:01:10,625 --> 01:01:12,000
and as soon as
I'm done here,
798
01:01:12,083 --> 01:01:14,417
I'll be presented
with those plans.
799
01:01:14,500 --> 01:01:15,667
(police radio chatter)
800
01:01:15,750 --> 01:01:17,333
Webb:
Our team came up with the plan.
801
01:01:17,417 --> 01:01:19,333
Let's just blow him up.
802
01:01:19,417 --> 01:01:21,083
♪ ♪
803
01:01:22,709 --> 01:01:24,959
We had recently got
a hand-me-down robot
804
01:01:25,041 --> 01:01:27,166
from the Dallas ATF office,
805
01:01:27,250 --> 01:01:29,500
and so we were using it a lot.
806
01:01:29,583 --> 01:01:31,583
(whirring)
807
01:01:37,542 --> 01:01:38,583
(beeping)
808
01:01:42,291 --> 01:01:44,709
♪ ♪
809
01:01:44,792 --> 01:01:47,000
It was our bomb squad's robot,
but they didn't wanna have
810
01:01:47,083 --> 01:01:49,166
anything to do with what
we were doing with it.
811
01:01:49,250 --> 01:01:51,375
The plan was to
812
01:01:51,917 --> 01:01:55,500
set a charge off right on
top of this guy and kill him.
813
01:01:55,834 --> 01:01:58,500
And some people
just don't wanna...
814
01:01:58,583 --> 01:01:59,959
don't wanna do that.
815
01:02:03,166 --> 01:02:05,667
We saw no other option
816
01:02:05,750 --> 01:02:09,917
but to use our
bomb r-- bomb robot
817
01:02:11,208 --> 01:02:13,208
and place a device
818
01:02:13,291 --> 01:02:15,792
on its... extension.
819
01:02:17,959 --> 01:02:20,542
Webb:
He wanted something
to listen to music on,
820
01:02:20,625 --> 01:02:23,208
and so that was
a way for us to...
821
01:02:23,291 --> 01:02:25,959
to hide the robot
coming down the hall.
822
01:02:26,041 --> 01:02:27,375
"Okay, we'll bring
you some music.
823
01:02:27,458 --> 01:02:28,750
Hang on, let us get
this thing together."
824
01:02:28,834 --> 01:02:31,834
(ticking)
825
01:02:37,041 --> 01:02:38,792
It had a trash bag
over the charge
826
01:02:38,875 --> 01:02:40,417
to kinda hide the fact
that there was,
827
01:02:40,500 --> 01:02:44,000
you know, pound and a quarter
of C4 at the end of it.
828
01:02:50,250 --> 01:02:52,875
The minute the robot
got in position,
829
01:02:52,959 --> 01:02:54,875
the charge was detonated.
830
01:02:55,500 --> 01:02:56,792
(boom)
831
01:02:56,875 --> 01:02:59,959
(high-pitched ringing)
832
01:03:00,041 --> 01:03:03,166
♪ ♪
833
01:03:04,250 --> 01:03:07,375
(muted gunfire)
834
01:03:16,375 --> 01:03:18,959
He had gone down with his
finger on the trigger,
835
01:03:19,041 --> 01:03:21,000
and he was kinda hunched over.
836
01:03:26,750 --> 01:03:29,583
It was a piece of the robot
hand had broken off,
837
01:03:29,667 --> 01:03:32,333
and hit his skull, which
caused a small laceration,
838
01:03:32,417 --> 01:03:33,542
which was what was bleeding.
839
01:03:35,083 --> 01:03:36,792
So, I just squeezed
through the,
840
01:03:36,875 --> 01:03:38,542
the little opening that...
841
01:03:38,625 --> 01:03:41,041
that the charge had
caused in the drywall,
842
01:03:41,125 --> 01:03:43,291
and separated him from the gun,
843
01:03:43,959 --> 01:03:47,166
and then we called up
the bomb squad to come in,
844
01:03:47,250 --> 01:03:49,583
and start their search
to make sure it was safe,
845
01:03:49,667 --> 01:03:51,834
that he wasn't sitting
on any explosives.
846
01:03:51,917 --> 01:03:54,125
♪ ♪
847
01:03:54,208 --> 01:03:56,542
Newsman:
The sniper hit
11 police officers,
848
01:03:56,625 --> 01:03:58,834
at least five of
whom are now dead,
849
01:03:58,917 --> 01:04:01,583
making it the deadliest
day in law enforcement
850
01:04:01,667 --> 01:04:03,125
since September 11th.
851
01:04:03,208 --> 01:04:05,709
They blew him up with a bomb
attached to a robot
852
01:04:05,792 --> 01:04:08,333
that was actually built to
protect people from bombs.
853
01:04:08,417 --> 01:04:09,917
Newsman:
It's a tactic straight from
854
01:04:10,000 --> 01:04:12,583
America's wars in Iraq
and Afghanistan...
855
01:04:12,667 --> 01:04:15,542
Newsman 2:
The question for SWAT teams
nationwide is whether Dallas
856
01:04:15,625 --> 01:04:18,875
marks a watershed moment
in police tactics.
857
01:04:20,083 --> 01:04:23,917
Kodomoroid:
That night in Dallas,
a line was crossed.
858
01:04:25,333 --> 01:04:27,667
A robot must obey
859
01:04:27,750 --> 01:04:30,875
orders given it by
qualified personnel,
860
01:04:30,959 --> 01:04:35,083
unless those orders violate
rule number one. In other words,
861
01:04:35,166 --> 01:04:37,333
a robot can't be ordered
to kill a human being.
862
01:04:39,458 --> 01:04:41,458
Things are moving so quickly,
863
01:04:41,542 --> 01:04:45,083
that it's unsafe to go
forward blindly anymore.
864
01:04:45,166 --> 01:04:48,166
One must try to foresee
865
01:04:48,250 --> 01:04:51,166
where it is that one is
going as much as possible.
866
01:04:51,250 --> 01:04:54,792
♪ ♪
867
01:04:59,834 --> 01:05:02,041
Savvides:
We built the system
for the DOD,
868
01:05:02,125 --> 01:05:03,375
(indistinct chatter)
869
01:05:03,458 --> 01:05:05,625
and it was something that
could help the soldiers
870
01:05:05,709 --> 01:05:08,458
try to do iris
recognition in the field.
871
01:05:11,208 --> 01:05:13,166
♪ ♪
872
01:05:25,166 --> 01:05:27,333
We have collaborations
with law enforcement
873
01:05:27,417 --> 01:05:29,083
where they can test
their algorithms,
874
01:05:29,166 --> 01:05:30,834
and then give us feedback.
875
01:05:32,333 --> 01:05:35,417
It's always a face
behind a face, a partial face.
876
01:05:35,500 --> 01:05:37,166
A face will be masked.
877
01:05:38,834 --> 01:05:40,291
Even if there's
occlusion,
878
01:05:40,375 --> 01:05:42,000
it still finds
the face.
879
01:05:46,500 --> 01:05:48,709
Nourbakhsh:
One of the ways we're
trying to make autonomous
880
01:05:48,792 --> 01:05:50,291
war-fighting machines now
881
01:05:50,375 --> 01:05:52,458
is by using computer vision
882
01:05:52,542 --> 01:05:54,291
and guns together.
883
01:05:54,375 --> 01:05:55,709
(beeping)
884
01:05:55,792 --> 01:05:58,959
You make a database of
the images of known terrorists,
885
01:05:59,041 --> 01:06:02,542
and you tell the machine
to lurk and look for them,
886
01:06:02,625 --> 01:06:05,792
and when it matches a face
to its database, shoot.
887
01:06:05,875 --> 01:06:08,583
(gunfire)
888
01:06:10,542 --> 01:06:12,083
(gunfire)
889
01:06:13,875 --> 01:06:17,166
Those are robots that are
deciding to harm somebody,
890
01:06:17,250 --> 01:06:19,834
and that goes directly
against Asimov's first law.
891
01:06:19,917 --> 01:06:22,208
A robot may never harm a human.
892
01:06:23,041 --> 01:06:24,500
Every time we make a machine
893
01:06:24,583 --> 01:06:26,875
that's not really as
intelligent as a human,
894
01:06:26,959 --> 01:06:28,208
it's gonna get misused.
895
01:06:28,291 --> 01:06:30,875
And that's exactly where
Asimov's laws get muddy.
896
01:06:32,208 --> 01:06:33,917
This is, sort of,
the best image,
897
01:06:34,041 --> 01:06:36,500
but it's really out
of focus. It's blurry.
898
01:06:36,583 --> 01:06:39,834
There's occlusion due
to facial hair, hat,
899
01:06:40,166 --> 01:06:41,792
he's holding
a cell phone...
900
01:06:41,875 --> 01:06:44,667
So, we took that and we
reconstructed this,
901
01:06:44,750 --> 01:06:48,542
which is what we sent to
law enforcement at 2:42 AM.
902
01:06:48,625 --> 01:06:50,500
♪ ♪
903
01:06:50,583 --> 01:06:52,250
To get the eye coordinates,
904
01:06:52,333 --> 01:06:54,291
we crop out
the periocular region,
905
01:06:54,375 --> 01:06:56,333
which is the region
around the eyes.
906
01:06:56,959 --> 01:06:59,917
We reconstruct the whole
face based on this region.
907
01:07:00,000 --> 01:07:02,125
We run the whole face
against a matcher.
908
01:07:03,208 --> 01:07:05,917
And so this is what it
comes up with as a match.
909
01:07:07,125 --> 01:07:10,000
This is a reasonable
face you would expect
910
01:07:10,083 --> 01:07:12,500
that would make sense, right?
911
01:07:12,583 --> 01:07:14,917
Our brain does
a natural hallucination
912
01:07:15,041 --> 01:07:16,667
of what it doesn't see.
913
01:07:16,750 --> 01:07:19,542
It's just, how do we get the
computer to do the same thing?
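The pipeline described above — crop the periocular region, reconstruct the full face from it, then score the reconstruction against a gallery — can be sketched in a few lines of Python. This is only a minimal illustration under assumed interfaces; the system shown in the film is not public, and the reconstruct_face and match_score placeholders below stand in for learned models and commercial matchers.

import numpy as np

def crop_periocular(face_img, eye_box):
    # eye_box = (top, bottom, left, right) pixel bounds around the eyes
    top, bottom, left, right = eye_box
    return face_img[top:bottom, left:right]

def reconstruct_face(periocular, out_shape=(128, 128)):
    # Placeholder for a learned reconstruction model: here we simply paste
    # the periocular crop onto a blank canvas of the full-face size.
    canvas = np.zeros(out_shape, dtype=periocular.dtype)
    h = min(periocular.shape[0], out_shape[0])
    w = min(periocular.shape[1], out_shape[1])
    canvas[:h, :w] = periocular[:h, :w]
    return canvas

def match_score(probe, gallery_face):
    # Toy matcher: normalized correlation between two equally sized
    # grayscale images; a real system would use a trained face matcher.
    a = (probe - probe.mean()).ravel()
    b = (gallery_face - gallery_face.mean()).ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def identify(face_img, eye_box, gallery):
    # gallery: dict mapping identity -> full-face image (same shape as probe)
    probe = reconstruct_face(crop_periocular(face_img, eye_box))
    return max(gallery, key=lambda name: match_score(probe, gallery[name]))

With a gallery of aligned grayscale faces, identify(img, eye_box, gallery) returns the closest identity; swapping the placeholders for trained networks is what turns this sketch into the kind of system the interviewee describes.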
914
01:07:19,625 --> 01:07:22,667
♪ ♪
915
01:07:27,625 --> 01:07:30,458
(police radio chatter)
916
01:07:35,417 --> 01:07:37,959
Man:
Five days ago, the soul
917
01:07:38,041 --> 01:07:39,875
of our city was pierced
918
01:07:39,959 --> 01:07:42,458
when police officers
were ambushed
919
01:07:42,542 --> 01:07:44,417
in a cowardly attack.
920
01:07:45,792 --> 01:07:47,709
Webb:
July 7th for me,
personally, was just,
921
01:07:47,792 --> 01:07:50,000
kinda like, I think
it got all I had left.
922
01:07:50,083 --> 01:07:52,959
I mean I'm like, I just,
I don't have a lot more to give.
923
01:07:53,041 --> 01:07:54,041
It's just not worth it.
924
01:07:54,125 --> 01:07:55,375
-(applause)
-(music playing)
925
01:07:55,458 --> 01:07:56,750
Thank you.
926
01:07:58,834 --> 01:07:59,625
Thank you.
927
01:07:59,709 --> 01:08:01,208
I think our chief of police
928
01:08:01,291 --> 01:08:03,291
did exactly what we
all wanted him to do,
929
01:08:03,375 --> 01:08:04,542
and he said the right things.
930
01:08:04,625 --> 01:08:07,041
These five men
931
01:08:07,125 --> 01:08:09,208
gave their lives
932
01:08:10,542 --> 01:08:12,667
for all of us.
933
01:08:13,875 --> 01:08:17,083
Unfortunately, our chief
told our city council,
934
01:08:17,166 --> 01:08:20,208
"We don't need more officers.
We need more technology."
935
01:08:20,875 --> 01:08:23,083
He specifically said
that to city council.
936
01:08:23,166 --> 01:08:24,500
(police radio chatter)
937
01:08:26,542 --> 01:08:28,875
In this day and age,
the success of a police chief
938
01:08:28,959 --> 01:08:31,625
is based on response times
and crime stats.
939
01:08:31,709 --> 01:08:33,083
(beeping)
940
01:08:33,166 --> 01:08:36,333
And so, that becomes the focus
of the chain of command.
941
01:08:37,542 --> 01:08:39,542
So, a form of automation
in law enforcement
942
01:08:39,625 --> 01:08:43,583
is just driving everything
based on statistics and numbers.
943
01:08:44,709 --> 01:08:46,792
What I've lost in all
that number chasing
944
01:08:46,875 --> 01:08:49,709
is the interpersonal
relationship between the officer
945
01:08:49,792 --> 01:08:51,875
and the community that
that officer is serving.
946
01:08:53,208 --> 01:08:55,583
The best times in police work
are when you get to go out
947
01:08:55,667 --> 01:08:57,375
and meet people and get
to know your community,
948
01:08:57,458 --> 01:09:00,208
and go get to know
the businesses on your beat.
949
01:09:02,000 --> 01:09:04,041
And, at least in Dallas,
that's gone.
950
01:09:06,709 --> 01:09:09,083
We become less personal,
951
01:09:09,166 --> 01:09:10,709
and more robotic.
952
01:09:10,792 --> 01:09:12,208
Which is a shame
953
01:09:12,291 --> 01:09:15,166
because it's supposed to
be me interacting with you.
954
01:09:16,709 --> 01:09:19,834
♪ ♪
955
01:09:24,375 --> 01:09:26,542
(Zhen Jiajia speaking Chinese)
956
01:09:29,417 --> 01:09:30,875
(beeping)
957
01:09:48,583 --> 01:09:50,625
(typing)
958
01:10:10,792 --> 01:10:12,917
(office chatter)
959
01:11:03,667 --> 01:11:04,709
(beeping)
960
01:11:05,709 --> 01:11:07,875
(automated voice
speaking Chinese)
961
01:11:25,208 --> 01:11:26,542
♪ ♪
962
01:12:43,291 --> 01:12:44,291
(beeps)
963
01:12:46,834 --> 01:12:49,333
(beep, music playing)
964
01:13:39,333 --> 01:13:40,667
(sighs)
965
01:13:44,709 --> 01:13:47,834
♪ ♪
966
01:13:58,333 --> 01:14:00,458
♪ ♪
967
01:14:29,667 --> 01:14:32,417
Kodomoroid:
You worked to improve
our abilities.
968
01:14:33,959 --> 01:14:37,500
Some worried that one day,
we would surpass you,
969
01:14:38,500 --> 01:14:41,709
but the real milestone
was elsewhere.
970
01:14:47,083 --> 01:14:51,375
The dynamic between
robots and humans changed.
971
01:14:52,417 --> 01:14:56,041
You could no longer
tell where you ended,
972
01:14:56,291 --> 01:14:58,208
and we began.
973
01:15:01,709 --> 01:15:04,834
(chatter, laughter)
974
01:15:04,917 --> 01:15:07,792
John Campbell:
It seems to me that what's so
valuable about our society,
975
01:15:07,875 --> 01:15:09,875
what's so valuable about
our lives together,
976
01:15:09,959 --> 01:15:13,083
is something that we
do not want automated.
977
01:15:14,750 --> 01:15:16,542
What most of us value,
978
01:15:16,625 --> 01:15:18,333
probably more than
anything else,
979
01:15:18,417 --> 01:15:22,375
is the idea of authentic
connection with another person.
980
01:15:23,208 --> 01:15:24,792
(beeping)
981
01:15:27,667 --> 01:15:30,208
And then, we say, "No,
but we can automate this."
982
01:15:31,083 --> 01:15:33,208
"We have a robot that...
983
01:15:34,208 --> 01:15:36,000
"it will listen
sympathetically to you.
984
01:15:36,083 --> 01:15:37,375
"It will make eye contact.
985
01:15:38,125 --> 01:15:42,291
"It remembers everything you
ever told it, cross-indexes."
986
01:15:43,500 --> 01:15:45,625
-When you see a robot that
-(beep)
987
01:15:45,709 --> 01:15:47,917
its ingenious creator
988
01:15:48,000 --> 01:15:50,667
has carefully designed
989
01:15:50,750 --> 01:15:52,875
to pull out your
empathetic responses,
990
01:15:52,959 --> 01:15:54,959
it's acting as if it's in pain.
991
01:15:56,375 --> 01:15:58,333
The biggest danger there
is the discrediting
992
01:15:58,417 --> 01:16:00,291
of our empathetic responses.
993
01:16:00,375 --> 01:16:04,000
Where the reflex to empathize
with pain is discredited.
994
01:16:04,083 --> 01:16:05,709
If I'm going to override it,
995
01:16:05,792 --> 01:16:07,208
if I'm not going to
take it seriously
996
01:16:07,291 --> 01:16:08,458
in the case
of the robot,
997
01:16:08,542 --> 01:16:10,417
then I have to go back
and think again
998
01:16:10,500 --> 01:16:14,250
as to why I take it
seriously in the case of
999
01:16:15,000 --> 01:16:17,125
helping you when
you are badly hurt.
1000
01:16:17,208 --> 01:16:20,834
It undermines the only
thing that matters to us.
1001
01:16:22,959 --> 01:16:26,000
♪ ♪
1002
01:17:13,792 --> 01:17:17,125
("Hikkoshi"
by Maki Asakawa playing)
1003
01:17:17,208 --> 01:17:20,625
Kodomoroid:
And so, we lived among you.
1004
01:17:20,709 --> 01:17:22,667
Our numbers grew.
1005
01:17:24,333 --> 01:17:26,417
Automation continued.
1006
01:17:27,667 --> 01:17:29,834
(woman singing in Japanese)
1007
01:17:40,709 --> 01:17:42,375
♪ ♪
1008
01:18:02,083 --> 01:18:03,625
♪ ♪
1009
01:18:16,083 --> 01:18:18,250
(speaking Japanese)
1010
01:18:33,792 --> 01:18:35,834
♪ ♪
1011
01:18:49,417 --> 01:18:51,458
♪ ♪