1
00:00:04,166 --> 00:00:08,500
♪ ♪
2
00:00:08,500 --> 00:00:10,766
MILES O'BRIEN:
Machines that think like humans.
3
00:00:10,766 --> 00:00:14,300
Our dream to create machines
in our own image
4
00:00:14,300 --> 00:00:17,000
that are smart and intelligent
5
00:00:17,000 --> 00:00:19,566
goes back to antiquity.
6
00:00:19,566 --> 00:00:21,666
Well, can it bring it to me?
7
00:00:21,666 --> 00:00:23,866
O'BRIEN:
Is it possible that the dream
of artificial intelligence
8
00:00:23,866 --> 00:00:26,233
has become reality?
9
00:00:27,333 --> 00:00:28,933
They're able to do things
10
00:00:28,933 --> 00:00:31,266
that we didn't
think they could do.
11
00:00:32,866 --> 00:00:36,533
MANOLIS KELLIS:
Go was thought to be a game
where machines would never win.
12
00:00:36,533 --> 00:00:39,933
The number of choices
for every move is enormous.
13
00:00:39,933 --> 00:00:44,033
O'BRIEN:
And now, the possibilities
seem endless.
14
00:00:44,033 --> 00:00:45,833
MUSTAFA SULEYMAN:
And this is going to be
15
00:00:45,833 --> 00:00:47,866
one of the greatest boosts
16
00:00:47,866 --> 00:00:50,300
to productivity in the history
of our species.
17
00:00:52,466 --> 00:00:55,966
That looks like just a hint
of some type of smoke.
18
00:00:55,966 --> 00:00:58,333
O'BRIEN:
Identifying problems
before a human can...
19
00:00:58,333 --> 00:01:00,633
LECIA SEQUIST:
We taught the model to recognize
20
00:01:00,633 --> 00:01:02,833
developing lung cancer.
21
00:01:02,833 --> 00:01:06,000
O'BRIEN:
...and inventing new drugs.
22
00:01:06,000 --> 00:01:08,100
PETRINA KAMYA:
I never thought
that we would be able
23
00:01:08,100 --> 00:01:10,033
to be doing the things
we're doing with A.I.
24
00:01:10,033 --> 00:01:12,400
O'BRIEN:
But along with the hope...
25
00:01:12,400 --> 00:01:14,200
(imitating Obama):
This is a dangerous time.
26
00:01:14,200 --> 00:01:17,100
O'BRIEN:
...comes deep concern.
27
00:01:17,100 --> 00:01:18,900
One of the first drops
in the feared flood
28
00:01:18,900 --> 00:01:20,500
of A.I.-created disinformation.
29
00:01:20,500 --> 00:01:24,533
We have lowered barriers
to entry to manipulate reality.
30
00:01:24,533 --> 00:01:27,266
We're going to live in a world
where we don't know what's real.
31
00:01:27,266 --> 00:01:31,166
The risks are uncertain
and potentially enormous.
32
00:01:31,166 --> 00:01:35,366
O'BRIEN:
How powerful is A.I.?
How does it work?
33
00:01:35,366 --> 00:01:37,933
And how can we reap
its extraordinary benefits...
34
00:01:37,933 --> 00:01:39,400
Sybil looked here,
35
00:01:39,400 --> 00:01:42,100
and anticipated
that there would be a problem.
36
00:01:42,100 --> 00:01:44,300
O'BRIEN:
...without jeopardizing
our future?
37
00:01:44,300 --> 00:01:45,633
"A.I. Revolution"
38
00:01:45,633 --> 00:01:48,400
right now, on "NOVA"!
39
00:01:48,400 --> 00:01:51,533
(whirring)
40
00:01:51,533 --> 00:02:11,700
♪ ♪
41
00:02:12,966 --> 00:02:16,100
Tell me the backstory
on Inflection A.I.
42
00:02:16,100 --> 00:02:20,300
(voiceover):
Our story begins
with the making of this story.
43
00:02:20,300 --> 00:02:25,266
PI (on computer):
The story of Inflection A.I.
is an exciting one.
44
00:02:25,266 --> 00:02:26,333
O'BRIEN (voiceover):
I was researching
45
00:02:26,333 --> 00:02:27,500
an interview subject.
46
00:02:27,500 --> 00:02:30,033
Who is Mustafa Suleyman?
47
00:02:30,033 --> 00:02:31,633
(voiceover):
Something I've done
48
00:02:31,633 --> 00:02:34,066
a thousand times
in my 40-year career.
49
00:02:34,066 --> 00:02:35,900
PI (on computer):
Mustafa Suleyman is a
true pioneer
50
00:02:35,900 --> 00:02:38,366
in the field
of artificial intelligence.
51
00:02:38,366 --> 00:02:40,566
(voiceover):
But this time, it was different:
52
00:02:40,566 --> 00:02:43,700
I wasn't typing out
search terms.
53
00:02:43,700 --> 00:02:46,500
What is machine learning?
54
00:02:46,500 --> 00:02:49,933
O'BRIEN (voiceover):
I was having a conversation
with a computer.
55
00:02:49,933 --> 00:02:52,066
PI:
Sounds like an
exciting project, Miles.
56
00:02:52,066 --> 00:02:55,500
(voiceover):
It felt like something
big had changed.
57
00:02:55,500 --> 00:02:59,166
PI:
Machine learning, ML, is a type
of artificial intelligence.
58
00:02:59,166 --> 00:03:00,766
O'BRIEN (voiceover):
And as it happened,
59
00:03:00,766 --> 00:03:05,566
I was focused on one of the
innovators of this revolution.
60
00:03:05,566 --> 00:03:07,100
Okay, so if I do this...
61
00:03:07,100 --> 00:03:10,100
(voiceover):
Mustafa Suleyman
is co-founder
62
00:03:10,100 --> 00:03:11,833
of a startup called Inflection.
63
00:03:11,833 --> 00:03:15,100
It makes an artificial
intelligence assistant
64
00:03:15,100 --> 00:03:16,766
called Pi.
65
00:03:16,766 --> 00:03:18,166
So now you can speak...
66
00:03:18,166 --> 00:03:21,933
O'BRIEN (voiceover):
I met them both in London.
67
00:03:21,933 --> 00:03:23,433
It's fundamentally
different, isn't it?
68
00:03:23,433 --> 00:03:24,933
Yeah,
it's a conversational style.
69
00:03:24,933 --> 00:03:27,433
All of us humans
learn through stories,
70
00:03:27,433 --> 00:03:30,300
and through narrative, and
through interactive dialogue.
71
00:03:30,300 --> 00:03:32,933
And now, the machine can
kind of come alive,
72
00:03:32,933 --> 00:03:34,700
and talk to you
about whatever it is
73
00:03:34,700 --> 00:03:36,100
that's on top of your mind.
74
00:03:36,100 --> 00:03:40,233
O'BRIEN:
Tell me about the PBS program
"NOVA."
75
00:03:40,233 --> 00:03:43,600
(voiceover):
Chatbots can offer up
quick answers, write poems,
76
00:03:43,600 --> 00:03:46,166
finish essays,
and translate languages
77
00:03:46,166 --> 00:03:48,333
among many other things.
78
00:03:48,333 --> 00:03:51,033
PI (over phone):
"NOVA" is a science
documentary series...
79
00:03:51,033 --> 00:03:52,533
O'BRIEN (voiceover):
They aren't perfect,
80
00:03:52,533 --> 00:03:54,800
but they have put artificial
intelligence in our hands,
81
00:03:54,800 --> 00:03:57,233
and into
the public consciousness.
82
00:03:57,233 --> 00:04:00,866
And it seems
we're equal parts leery
83
00:04:00,866 --> 00:04:02,533
and intrigued.
84
00:04:02,533 --> 00:04:04,233
SULEYMAN:
A.I. is a tool
85
00:04:04,233 --> 00:04:07,833
for helping us to understand
the world around us,
86
00:04:07,833 --> 00:04:11,966
predict what's likely to happen,
and then invent
87
00:04:11,966 --> 00:04:15,000
solutions that help improve
the world around us.
88
00:04:15,000 --> 00:04:18,533
My motivation was to try
to use A.I. tools
89
00:04:18,533 --> 00:04:20,800
to, uh, you know,
invent the future.
90
00:04:20,800 --> 00:04:23,500
The rise
in artificial intelligence...
91
00:04:23,500 --> 00:04:25,333
REPORTER:
A.I. technology is developing...
92
00:04:25,333 --> 00:04:28,800
O'BRIEN (voiceover):
Lately, it seems a dark future
is already here...
93
00:04:28,800 --> 00:04:32,966
The technology could replace
millions of jobs...
94
00:04:32,966 --> 00:04:34,733
O'BRIEN (voiceover):
...if you listen
to the news reporting.
95
00:04:34,733 --> 00:04:37,966
The moment civilization
was transformed.
96
00:04:37,966 --> 00:04:40,433
O'BRIEN (voiceover):
So how can
artificial intelligence help us,
97
00:04:40,433 --> 00:04:42,800
and how might it hurt us?
98
00:04:42,800 --> 00:04:45,966
At the center of
the public handwringing:
99
00:04:45,966 --> 00:04:49,966
how should we put
guardrails around it?
100
00:04:49,966 --> 00:04:52,900
We definitely need
more regulations in place...
101
00:04:52,900 --> 00:04:54,300
O'BRIEN (voiceover):
Artificial intelligence
is moving fast
102
00:04:54,300 --> 00:04:56,200
and changing the world.
103
00:04:56,200 --> 00:04:57,766
Can we keep up?
104
00:04:57,766 --> 00:04:59,700
Non-human minds
smarter than our own.
105
00:04:59,700 --> 00:05:02,133
O'BRIEN (voiceover):
The news coverage may make it
seem like
106
00:05:02,133 --> 00:05:04,866
artificial intelligence
is something new.
107
00:05:04,866 --> 00:05:06,866
At a moment of revolution...
108
00:05:06,866 --> 00:05:09,300
O'BRIEN (voiceover):
But human beings have been
thinking about this
109
00:05:09,300 --> 00:05:12,433
for a very long time.
110
00:05:12,433 --> 00:05:16,300
I have a very fine brain.
111
00:05:16,300 --> 00:05:20,633
Our dream to create machines
in our own image
112
00:05:20,633 --> 00:05:24,700
that are smart and intelligent
goes back to antiquity.
113
00:05:24,700 --> 00:05:27,166
Uh, it's,
it's something that has,
114
00:05:27,166 --> 00:05:31,966
has permeated the evolution
of society and of science.
115
00:05:31,966 --> 00:05:34,400
(mortars firing)
116
00:05:34,400 --> 00:05:36,800
O'BRIEN (voiceover):
The modern origins
of artificial intelligence
117
00:05:36,800 --> 00:05:38,966
can be traced
back to World War II,
118
00:05:38,966 --> 00:05:43,400
and the prodigious
human brain of Alan Turing.
119
00:05:43,400 --> 00:05:46,100
The legendary
British mathematician
120
00:05:46,100 --> 00:05:48,100
developed a machine
121
00:05:48,100 --> 00:05:52,233
capable of deciphering
coded messages from the Nazis.
122
00:05:52,233 --> 00:05:56,300
After the war, he was among
the first to predict computers
123
00:05:56,300 --> 00:05:59,566
might one day match
the human brain.
124
00:05:59,566 --> 00:06:02,633
There are no surviving
recordings of Turing's voice,
125
00:06:02,633 --> 00:06:08,233
but in 1951, he gave
a short lecture on BBC radio.
126
00:06:08,233 --> 00:06:12,766
We asked an A.I.-generated voice
to read a passage.
127
00:06:12,766 --> 00:06:15,066
TURING A.I. VOICE:
I think it is probable,
for instance,
128
00:06:15,066 --> 00:06:17,066
that at the end of the century,
129
00:06:17,066 --> 00:06:19,133
it will be possible
to program a machine
130
00:06:19,133 --> 00:06:21,100
to answer questions
in such a way
131
00:06:21,100 --> 00:06:23,200
that it will be extremely
difficult to guess
132
00:06:23,200 --> 00:06:25,300
whether the answers are being
given by a man
133
00:06:25,300 --> 00:06:27,400
or by the machine.
134
00:06:27,400 --> 00:06:30,300
O'BRIEN (voiceover):
And so,
the Turing test was born.
135
00:06:30,300 --> 00:06:32,433
Could anyone build a machine
136
00:06:32,433 --> 00:06:34,800
that could converse
with a human in a way
137
00:06:34,800 --> 00:06:37,866
that is indistinguishable
from another person?
138
00:06:37,866 --> 00:06:41,166
In 1956,
139
00:06:41,166 --> 00:06:43,466
a group of pioneering scientists
spent the summer
140
00:06:43,466 --> 00:06:46,233
brainstorming
at Dartmouth College.
141
00:06:47,266 --> 00:06:49,400
And they told the world that
they had coined
142
00:06:49,400 --> 00:06:51,166
a new academic field of study.
143
00:06:51,166 --> 00:06:53,300
They called it
artificial intelligence.
144
00:06:53,300 --> 00:06:56,833
O'BRIEN (voiceover):
For decades,
their aspirations remained
145
00:06:56,833 --> 00:06:59,600
far ahead of
the capabilities of computers.
146
00:07:01,333 --> 00:07:03,066
In 1978,
147
00:07:03,066 --> 00:07:07,933
"NOVA" released its first film
on artificial intelligence.
148
00:07:07,933 --> 00:07:09,800
We have seen the first
crude beginnings
149
00:07:09,800 --> 00:07:11,500
of artificial intelligence...
150
00:07:11,500 --> 00:07:13,300
O'BRIEN (voiceover):
And the legendary science
fiction writer,
151
00:07:13,300 --> 00:07:17,833
Arthur C. Clarke was,
as always, prescient.
152
00:07:17,833 --> 00:07:19,433
It doesn't really exist yet at
any level,
153
00:07:19,433 --> 00:07:23,466
because our most complex
computers are still morons,
154
00:07:23,466 --> 00:07:26,400
high-speed morons,
but still morons.
155
00:07:26,400 --> 00:07:29,200
Nevertheless, we have
the possibility of machines
156
00:07:29,200 --> 00:07:31,466
which can outpace their
creators,
157
00:07:31,466 --> 00:07:35,966
and therefore,
become more intelligent than us.
158
00:07:37,233 --> 00:07:41,333
At the time, researchers were
developing "expert systems,"
159
00:07:41,333 --> 00:07:46,500
purpose-built
to perform specific tasks.
160
00:07:46,500 --> 00:07:48,100
So the thing that we need to do
161
00:07:48,100 --> 00:07:52,666
to make machines understand, um,
you know, our world,
162
00:07:52,666 --> 00:07:55,600
is to put all our knowledge
into a machine
163
00:07:55,600 --> 00:07:58,433
and then provide it
with some rules.
164
00:07:58,433 --> 00:08:00,466
♪ ♪
165
00:08:00,466 --> 00:08:03,600
O'BRIEN (voiceover):
Classic A.I. reached a pivotal
moment in 1997
166
00:08:03,600 --> 00:08:07,766
when an artificial intelligence
program devised by IBM,
167
00:08:07,766 --> 00:08:10,833
called "Deep Blue" defeated
world chess champion
168
00:08:10,833 --> 00:08:14,133
and grandmaster Garry Kasparov.
169
00:08:14,133 --> 00:08:18,166
It searched about 200 million
positions a second,
170
00:08:18,166 --> 00:08:20,600
navigating through
a tree of possibilities
171
00:08:20,600 --> 00:08:23,100
to determine the best move.
172
00:08:23,100 --> 00:08:25,566
RUS:
The program analyzed
the board configuration,
173
00:08:25,566 --> 00:08:28,533
could project forward
millions of moves
174
00:08:28,533 --> 00:08:31,133
to examine millions of
possibilities,
175
00:08:31,133 --> 00:08:33,666
and then picked the best path.
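[Editor's note: to make the "tree of possibilities" concrete, here is a minimal Python sketch of minimax search over a toy game tree. The tree and its scores are invented for illustration; Deep Blue's actual search and evaluation were vastly more elaborate.]

```python
# A toy minimax search over a hand-made game tree. The scores below
# are invented purely to show how a program walks the tree of
# possibilities and picks the best path.

def minimax(node, maximizing):
    """Return the best score reachable from this node."""
    if isinstance(node, (int, float)):      # leaf: a board evaluation
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Each top-level branch is a candidate move; the opponent replies next.
tree = [[3, 5], [2, [9, 1]], [0, 7]]

best_move = max(range(len(tree)),
                key=lambda i: minimax(tree[i], maximizing=False))
print("best move:", best_move)   # move 0 guarantees a score of 3
```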
176
00:08:33,666 --> 00:08:36,366
O'BRIEN (voiceover):
Effective, but brittle,
177
00:08:36,366 --> 00:08:40,400
Deep Blue wasn't
strategizing as a human does.
178
00:08:40,400 --> 00:08:43,300
From the outset, artificial
intelligence researchers
179
00:08:43,300 --> 00:08:46,033
imagined making machines
180
00:08:46,033 --> 00:08:47,800
that think like us.
181
00:08:47,800 --> 00:08:51,100
The human brain, with
more than 80 billion neurons,
182
00:08:51,100 --> 00:08:54,000
learns not by following rules,
183
00:08:54,000 --> 00:08:57,266
but rather by taking
in a steady stream of data,
184
00:08:57,266 --> 00:08:59,633
and looking for patterns.
185
00:09:01,066 --> 00:09:03,400
KELLIS:
The way that learning
actually works
186
00:09:03,400 --> 00:09:06,266
in the human brain is by
updating the weights
187
00:09:06,266 --> 00:09:07,933
of the synaptic connections
188
00:09:07,933 --> 00:09:09,800
that are underlying this
neural network.
189
00:09:09,800 --> 00:09:13,633
O'BRIEN (voiceover):
Manolis Kellis is a
Professor of Computer Science
190
00:09:13,633 --> 00:09:17,966
at the Massachusetts Institute
of Technology.
191
00:09:17,966 --> 00:09:19,966
So we have trillions
of parameters in our brain
192
00:09:19,966 --> 00:09:22,100
that we can adjust
based on experience.
193
00:09:22,100 --> 00:09:24,000
I'm getting a reward.
194
00:09:24,000 --> 00:09:25,833
I will update
the strength of the connections
195
00:09:25,833 --> 00:09:27,900
that led to this reward--
I'm getting punished,
196
00:09:27,900 --> 00:09:29,666
I will diminish the strength
of the connections
197
00:09:29,666 --> 00:09:31,333
that led to the punishment.
198
00:09:31,333 --> 00:09:33,500
So this is
the original neural network.
199
00:09:33,500 --> 00:09:37,066
We did not invent it,
we, you know, we inherited it.
200
00:09:37,066 --> 00:09:41,166
O'BRIEN (voiceover):
But could an artificial
neural network
201
00:09:41,166 --> 00:09:44,200
be made in our own image?
Turing imagined it.
202
00:09:44,200 --> 00:09:46,500
But computers were nowhere near
203
00:09:46,500 --> 00:09:50,033
powerful enough to do it
until recently.
204
00:09:51,666 --> 00:09:53,466
It's only with the advent
of extraordinary data sets
205
00:09:53,466 --> 00:09:56,133
that we have, uh,
since the early 2000s,
206
00:09:56,133 --> 00:09:59,133
that we were able to build up
enough images,
207
00:09:59,133 --> 00:10:00,766
enough annotations,
208
00:10:00,766 --> 00:10:03,866
enough text to be able
to finally train
209
00:10:03,866 --> 00:10:06,833
these sufficiently powerful
models.
210
00:10:08,300 --> 00:10:10,733
O'BRIEN (voiceover):
An artificial neural network is,
in fact,
211
00:10:10,733 --> 00:10:13,266
modeled on the human brain.
212
00:10:13,266 --> 00:10:16,600
It uses interconnected nodes,
or neurons,
213
00:10:16,600 --> 00:10:18,900
that communicate with
each other.
214
00:10:18,900 --> 00:10:21,633
Each node receives
inputs from other nodes
215
00:10:21,633 --> 00:10:25,566
and processes those inputs
to produce outputs,
216
00:10:25,566 --> 00:10:29,133
which are then passed on to
still other nodes.
217
00:10:29,133 --> 00:10:32,500
It learns by adjusting
the strength of the connections
218
00:10:32,500 --> 00:10:37,033
between the nodes based on
the data it is exposed to.
219
00:10:37,033 --> 00:10:39,500
This process
of adjusting the connections
220
00:10:39,500 --> 00:10:41,400
is called training,
221
00:10:41,400 --> 00:10:43,866
and it allows an
artificial neural network
222
00:10:43,866 --> 00:10:47,233
to recognize patterns
and learn from its experiences
223
00:10:47,233 --> 00:10:49,466
like humans do.
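[Editor's note: a minimal sketch of the node just described, assuming a single neuron with three inputs and invented values. It computes a weighted sum of its inputs, squashes it, and nudges its connection strengths toward a target output, which is the essence of "training."]

```python
import math

# One artificial "node": weighted inputs, a squashing function, and a
# training rule that adjusts connection strengths based on the data.
# All numbers are made up; real networks stack millions of these nodes.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

weights = [0.1, -0.2, 0.05]     # connection strengths (to be learned)
inputs  = [1.0, 0.5, -1.0]      # signals arriving from other nodes
target  = 1.0                   # the output we wanted for this input

for _ in range(200):
    output = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
    error = output - target
    # Strengthen or weaken each connection against its share of the error.
    weights = [w - 0.5 * error * output * (1 - output) * x
               for w, x in zip(weights, inputs)]

print(round(output, 3))          # creeps toward the target as it trains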
224
00:10:51,166 --> 00:10:52,700
A child,
how is it learning so fast?
225
00:10:52,700 --> 00:10:54,366
It is learning so fast
226
00:10:54,366 --> 00:10:56,700
because it's constantly
predicting the future
227
00:10:56,700 --> 00:10:59,100
and then seeing what happens
228
00:10:59,100 --> 00:11:02,266
and updating the weights in
its neural network
229
00:11:02,266 --> 00:11:04,266
based on what just happened.
230
00:11:04,266 --> 00:11:05,566
Now you can take this
231
00:11:05,566 --> 00:11:07,000
self-supervised learning
paradigm
232
00:11:07,000 --> 00:11:09,166
and apply it to machines.
233
00:11:10,700 --> 00:11:13,933
O'BRIEN (voiceover):
At first, some of these
artificial neural networks
234
00:11:13,933 --> 00:11:16,733
were trained on vintage
Atari video games
235
00:11:16,733 --> 00:11:18,700
like "Space Invaders"
236
00:11:18,700 --> 00:11:21,700
and "Breakout."
237
00:11:21,700 --> 00:11:24,933
Games reduce the complexity
of the real world
238
00:11:24,933 --> 00:11:28,600
to a very narrow set
of actions that can be taken.
239
00:11:28,600 --> 00:11:31,166
O'BRIEN (voiceover):
Before he started Inflection,
240
00:11:31,166 --> 00:11:34,266
Mustafa Suleyman co-founded
a company called
241
00:11:34,266 --> 00:11:36,766
DeepMind in 2010.
242
00:11:36,766 --> 00:11:40,766
It was acquired by Google
four years later.
243
00:11:40,766 --> 00:11:42,166
When an A.I. plays a game,
244
00:11:42,166 --> 00:11:45,766
we show it frame-by-frame,
every pixel
245
00:11:45,766 --> 00:11:47,966
in the moving image.
246
00:11:47,966 --> 00:11:49,900
And so the A.I. learns
to associate pixels
247
00:11:49,900 --> 00:11:52,000
with actions that it can take
248
00:11:52,000 --> 00:11:55,800
moving left or right
or pressing the fire button.
249
00:11:57,133 --> 00:12:00,433
O'BRIEN (voiceover):
When it obliterates blocks
or shoots aliens,
250
00:12:00,433 --> 00:12:03,800
the connections between the
nodes that enabled that success
251
00:12:03,800 --> 00:12:05,566
are strengthened.
252
00:12:05,566 --> 00:12:07,933
In other words, it is rewarded.
253
00:12:07,933 --> 00:12:10,966
When it fails, no reward.
254
00:12:10,966 --> 00:12:13,566
Eventually,
all those reinforced connections
255
00:12:13,566 --> 00:12:15,833
overrule the weaker ones.
256
00:12:15,833 --> 00:12:18,666
The program has learned
how to win.
257
00:12:20,400 --> 00:12:22,833
This sort of repeated allocation
of reward
258
00:12:22,833 --> 00:12:27,366
for repetitive behavior
is a great way to train a dog.
259
00:12:27,366 --> 00:12:29,266
It's a great way to teach a kid.
260
00:12:29,266 --> 00:12:32,033
It's a great way for us
as adults to adapt our behavior.
261
00:12:32,033 --> 00:12:34,433
And in fact,
it's actually a good way
262
00:12:34,433 --> 00:12:37,100
to train machine learning
algorithms to get better.
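[Editor's note: a small sketch of the reward mechanism described above, using a hypothetical three-action game with made-up payoff odds rather than Atari pixels. Connections (here, action values) that lead to reward are strengthened until they dominate.]

```python
import random

# A tiny sketch of reward-driven, reinforcement-style learning:
# actions that earn reward get strengthened, the rest fade away.
# Action 2 secretly pays off most often in this invented game.

values = [0.0, 0.0, 0.0]        # learned worth of each action
pay_off = [0.1, 0.3, 0.8]       # hidden chance each action is rewarded

for _ in range(3000):
    # Mostly pick what looks best so far, sometimes explore at random.
    if random.random() < 0.1:
        a = random.randrange(3)
    else:
        a = values.index(max(values))
    reward = 1.0 if random.random() < pay_off[a] else 0.0
    values[a] += 0.05 * (reward - values[a])    # reinforce or weaken

print([round(v, 2) for v in values])   # action 2 ends up valued highest
```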
263
00:12:39,900 --> 00:12:43,133
O'BRIEN (voiceover):
In 2014, DeepMind began work
on an artificial neural network
264
00:12:43,133 --> 00:12:45,733
called "AlphaGo"
265
00:12:45,733 --> 00:12:47,200
that could play the ancient,
266
00:12:47,200 --> 00:12:50,133
and deceptively complex,
board game of Go.
267
00:12:51,966 --> 00:12:55,400
KELLIS:
Go was thought to be a game
where machines would never win.
268
00:12:55,400 --> 00:12:58,800
The number of choices
for every move is enormous.
269
00:12:58,800 --> 00:13:01,000
O'BRIEN (voiceover):
But at DeepMind,
270
00:13:01,000 --> 00:13:02,766
they were counting on
271
00:13:02,766 --> 00:13:07,100
the astounding growth
of compute power.
272
00:13:07,100 --> 00:13:09,666
And I think that's the key
concept to try to grasp,
273
00:13:09,666 --> 00:13:14,133
is that we are massively,
exponentially growing
274
00:13:14,133 --> 00:13:16,966
the amount of computation used,
and in some sense,
275
00:13:16,966 --> 00:13:19,666
that computation is a proxy
276
00:13:19,666 --> 00:13:22,700
for how intelligent
the model is.
277
00:13:23,800 --> 00:13:27,633
O'BRIEN (voiceover):
AlphaGo was trained two ways.
278
00:13:27,633 --> 00:13:30,433
First, it was fed a large
data set of expert Go games
279
00:13:30,433 --> 00:13:33,800
so that it could
learn how to play the game.
280
00:13:33,800 --> 00:13:36,133
This is known
as supervised learning.
281
00:13:36,133 --> 00:13:41,633
Then the software played against
itself many millions of times,
282
00:13:41,633 --> 00:13:44,366
so-called
reinforcement learning.
283
00:13:44,366 --> 00:13:47,800
This gradually improved
its skills and strategies.
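[Editor's note: a toy illustration of the two-phase recipe just described, with rock-paper-scissors standing in for Go. Phase one imitates an invented "expert" dataset (supervised learning); phase two lets the policy play against itself and reinforces winning moves (reinforcement learning). Only the shape of the pipeline is faithful to AlphaGo.]

```python
import random
from collections import Counter

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

# Phase 1: supervised learning -- imitate a dataset of "expert" moves.
expert_games = ["rock"] * 60 + ["paper"] * 30 + ["scissors"] * 10
counts = Counter(expert_games)
policy = {m: counts[m] / len(expert_games) for m in MOVES}

def sample(p):
    return random.choices(MOVES, weights=[p[m] for m in MOVES])[0]

# Phase 2: reinforcement learning through self-play -- the policy plays
# itself, and whichever move wins gets its probability nudged upward.
for _ in range(20000):
    a, b = sample(policy), sample(policy)
    if a != b:
        winner = a if BEATS[a] == b else b
        policy[winner] += 0.001
        total = sum(policy.values())
        policy = {m: v / total for m, v in policy.items()}  # renormalize

print({m: round(v, 2) for m, v in policy.items()})  # reshaped by self-play
```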
284
00:13:47,800 --> 00:13:50,600
In March 2016,
285
00:13:50,600 --> 00:13:52,366
AlphaGo faced Lee Sedol,
286
00:13:52,366 --> 00:13:54,333
one of the world's
top-ranking players,
287
00:13:54,333 --> 00:13:57,866
in a five-game match in
Seoul, South Korea.
288
00:13:57,866 --> 00:13:59,933
AlphaGo not only won,
289
00:13:59,933 --> 00:14:04,266
but also made a move so novel,
the Go cognoscenti
290
00:14:04,266 --> 00:14:07,166
thought it was a huge blunder.
That's a very
surprising move.
291
00:14:08,933 --> 00:14:11,066
There's no question to me
that these A.I. models
292
00:14:11,066 --> 00:14:12,766
are creative.
293
00:14:12,766 --> 00:14:15,600
They're incredibly creative.
294
00:14:15,600 --> 00:14:19,700
O'BRIEN (voiceover):
It turns out the move
was a stroke of brilliance.
295
00:14:19,700 --> 00:14:21,700
And this
emergent creative behavior
296
00:14:21,700 --> 00:14:23,733
was a hint of what was to come:
297
00:14:23,733 --> 00:14:26,900
generative A.I.
298
00:14:26,900 --> 00:14:28,633
Meanwhile,
299
00:14:28,633 --> 00:14:31,066
a company called OpenAI
was creating
300
00:14:31,066 --> 00:14:33,066
a generative A.I. model
301
00:14:33,066 --> 00:14:36,200
that would become ChatGPT.
302
00:14:36,200 --> 00:14:38,333
It allows users
to engage in a dialogue
303
00:14:38,333 --> 00:14:42,200
with a machine
that seems uncannily human.
304
00:14:42,200 --> 00:14:44,866
It was first released in 2018,
305
00:14:44,866 --> 00:14:48,933
but it was a subsequent version
that became a global sensation
306
00:14:48,933 --> 00:14:51,233
in late 2022.
307
00:14:51,233 --> 00:14:53,966
This promises to be
the viral sensation
308
00:14:53,966 --> 00:14:56,633
that could completely reset
how we do things.
309
00:14:56,633 --> 00:14:58,633
Cranking out entire essays
310
00:14:58,633 --> 00:15:00,500
in a matter of seconds.
311
00:15:00,500 --> 00:15:03,400
O'BRIEN (voiceover):
Not only did it wow the public,
it also caught
312
00:15:03,400 --> 00:15:06,666
artificial intelligence
innovators off guard.
313
00:15:08,033 --> 00:15:10,133
YOSHUA BENGIO:
It surprised me a lot
314
00:15:10,133 --> 00:15:12,200
that they're able
to do things that
315
00:15:12,200 --> 00:15:15,700
we didn't think they could do
simply by
316
00:15:15,700 --> 00:15:19,933
learning to imitate
how humans respond.
317
00:15:19,933 --> 00:15:23,500
And I thought this
kind of abilities would take
318
00:15:23,500 --> 00:15:26,300
many more years or decades.
319
00:15:26,300 --> 00:15:30,033
O'BRIEN (voiceover):
ChatGPT is
a large language model.
320
00:15:30,033 --> 00:15:34,266
LLMs start by consuming massive
amounts of text:
321
00:15:34,266 --> 00:15:36,300
books, articles and websites,
322
00:15:36,300 --> 00:15:39,200
which are publicly available on
the internet.
323
00:15:39,200 --> 00:15:42,833
By recognizing patterns
in billions of words,
324
00:15:42,833 --> 00:15:46,066
they can make guesses
at the next word in a sentence.
325
00:15:46,066 --> 00:15:49,333
That's how ChatGPT
generates unique answers
326
00:15:49,333 --> 00:15:51,466
to your questions.
327
00:15:51,466 --> 00:15:54,033
If I ask for a haiku
about the blue sky,
328
00:15:54,033 --> 00:15:58,866
it writes something
that seems completely original.
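[Editor's note: a minimal sketch of next-word prediction, using simple bigram counts in place of a large language model. The "training text" is a made-up stand-in for books and websites; the principle of guessing the next word from observed patterns is the same.]

```python
import random
from collections import Counter, defaultdict

text = ("the blue sky is clear today and the blue sea is calm "
        "the sky is wide and the sea is deep").split()

follows = defaultdict(Counter)
for prev, nxt in zip(text, text[1:]):
    follows[prev][nxt] += 1          # learn which words tend to come next

def generate(word, n=8):
    out = [word]
    for _ in range(n):
        options = follows[out[-1]]
        if not options:
            break
        # Sample the next word in proportion to how often it was seen.
        words, weights = zip(*options.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("the"))   # e.g. "the blue sky is wide and the sea is"
```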
329
00:15:58,866 --> 00:16:00,833
KELLIS:
If you're good at predicting
330
00:16:00,833 --> 00:16:02,766
this next word,
331
00:16:02,766 --> 00:16:04,800
it means you're understanding
something about the sentence.
332
00:16:04,800 --> 00:16:07,500
What the style
of the sentence is,
333
00:16:07,500 --> 00:16:10,233
what the feeling
of the sentence is.
334
00:16:10,233 --> 00:16:13,766
And you can't tell whether
this was a human or a machine.
335
00:16:13,766 --> 00:16:15,800
That's basically the definition
of the Turing test.
336
00:16:15,800 --> 00:16:19,633
O'BRIEN (voiceover):
So, how is this changing
our world?
337
00:16:19,633 --> 00:16:23,333
Well, it might change my world--
as an arm amputee.
338
00:16:23,333 --> 00:16:25,366
Ready for my casting call,
right?
339
00:16:25,366 --> 00:16:26,666
MONROE (chuckling):
Yes.
340
00:16:26,666 --> 00:16:28,466
Let's do it.
All right.
341
00:16:28,466 --> 00:16:30,266
O'BRIEN (voiceover):
That's Brian Monroe of
the Hanger Clinic.
342
00:16:30,266 --> 00:16:31,566
He's been my prosthetist
343
00:16:31,566 --> 00:16:34,500
since an injury
took my arm above the elbow
344
00:16:34,500 --> 00:16:36,266
ten years ago.
345
00:16:36,266 --> 00:16:38,800
So what we're going to do today
is take a mold of your arm.
Uh-huh.
346
00:16:38,800 --> 00:16:41,200
Kind of is like
a cast for a broken bone.
347
00:16:41,200 --> 00:16:45,266
O'BRIEN (voiceover):
Up until now, I have used
a body-powered prosthetic.
348
00:16:45,266 --> 00:16:48,400
A harness and a cable allow me
to move it
349
00:16:48,400 --> 00:16:50,200
by shrugging my shoulders.
350
00:16:50,200 --> 00:16:54,333
The technology is
more than a century old.
351
00:16:54,333 --> 00:16:56,133
But artificial intelligence,
352
00:16:56,133 --> 00:16:58,933
coupled with small
electric motors,
353
00:16:58,933 --> 00:17:03,566
is finally pushing prosthetics
into the 21st century.
354
00:17:05,100 --> 00:17:07,300
Which brings me to Chicago
355
00:17:07,300 --> 00:17:10,600
and the offices of
a small company called Coapt.
356
00:17:10,600 --> 00:17:13,600
I met the C.E.O., Blair Lock,
357
00:17:13,600 --> 00:17:17,466
a pioneer in the push
to apply artificial intelligence
358
00:17:17,466 --> 00:17:21,500
to artificial limbs.
359
00:17:21,500 --> 00:17:23,933
So, what do we have here?
What are we going to do?
360
00:17:23,933 --> 00:17:27,000
This allows us to very easily
test how your control would be
361
00:17:27,000 --> 00:17:30,100
using a pretty simple cuff;
this has electrodes in it,
362
00:17:30,100 --> 00:17:32,033
and we'll let the power
of the electronics
363
00:17:32,033 --> 00:17:33,600
that are doing
the machine learning
364
00:17:33,600 --> 00:17:35,733
see what you're capable of.
All right, let's give it a try.
365
00:17:35,733 --> 00:17:38,066
(voiceover):
Like most amputees,
366
00:17:38,066 --> 00:17:42,133
I feel my missing hand almost
as if it was still there--
367
00:17:42,133 --> 00:17:43,600
a phantom.
368
00:17:43,600 --> 00:17:45,300
Everything will touch.
Is that okay?
369
00:17:45,300 --> 00:17:46,366
Yeah.
Not too tight?
370
00:17:46,366 --> 00:17:48,200
No. All good.
Okay.
371
00:17:48,200 --> 00:17:50,400
O'BRIEN (voiceover):
It's almost entirely immobile,
stuck in molasses.
372
00:17:50,400 --> 00:17:52,933
Make a fist, not too hard.
373
00:17:52,933 --> 00:17:57,100
O'BRIEN (voiceover):
But I am able to imagine
moving it ever so slightly.
374
00:17:57,100 --> 00:17:58,766
And I'm gonna have you squeeze
into that a little bit harder.
375
00:17:58,766 --> 00:18:01,566
Very good, and I see the
pattern on the screen
376
00:18:01,566 --> 00:18:02,833
change a little bit.
377
00:18:02,833 --> 00:18:04,400
O'BRIEN (voiceover):
And when I do,
378
00:18:04,400 --> 00:18:07,333
I generate an array of faint
electrical signals in my stump.
379
00:18:07,333 --> 00:18:09,166
That's your muscle information.
380
00:18:09,166 --> 00:18:11,033
It feels,
it feels like I'm overcoming
381
00:18:11,033 --> 00:18:12,866
something that's really stuck.
382
00:18:12,866 --> 00:18:14,366
I don't know,
is that enough signal?
383
00:18:14,366 --> 00:18:16,300
Should be.
Oh, okay.
384
00:18:16,300 --> 00:18:17,533
We don't need a lot of signal,
385
00:18:17,533 --> 00:18:18,866
we're going for information
386
00:18:18,866 --> 00:18:20,733
in the signal,
not how loud it is.
387
00:18:20,733 --> 00:18:23,466
O'BRIEN (voiceover):
And this is where artificial
intelligence comes in.
388
00:18:25,266 --> 00:18:28,566
Using a virtual
prosthetic depicted on a screen,
389
00:18:28,566 --> 00:18:32,300
I trained a machine learning
algorithm to become fluent
390
00:18:32,300 --> 00:18:36,800
in the language
of my nerves and muscles.
391
00:18:36,800 --> 00:18:38,433
We see eight different signals
on the screen.
392
00:18:38,433 --> 00:18:40,866
All eight of those
sensor sites
393
00:18:40,866 --> 00:18:42,400
are going to
feed in together
394
00:18:42,400 --> 00:18:44,100
and let the algorithm
sort out the data.
395
00:18:44,100 --> 00:18:46,166
What you are experiencing
396
00:18:46,166 --> 00:18:48,800
is your ability
to teach the system
397
00:18:48,800 --> 00:18:50,633
what is hand-closed to you.
398
00:18:50,633 --> 00:18:52,633
And that's different
than what it would be to me.
399
00:18:52,633 --> 00:18:57,033
O'BRIEN (voiceover):
I told the software
what motion I desired,
400
00:18:57,033 --> 00:18:59,433
open, close, or rotate,
401
00:18:59,433 --> 00:19:03,633
then imagined moving
my phantom limb accordingly.
402
00:19:03,633 --> 00:19:05,833
This generates an array
of electromyographic,
403
00:19:05,833 --> 00:19:08,700
or EMG, signals in
my remaining muscles.
404
00:19:08,700 --> 00:19:12,333
I was training the A.I.
to connect the pattern
405
00:19:12,333 --> 00:19:15,133
of these electrical signals
with a specific movement.
406
00:19:17,366 --> 00:19:18,700
LOCK:
The system adapts,
407
00:19:18,700 --> 00:19:21,300
and as you add more data
and use it over time,
408
00:19:21,300 --> 00:19:23,300
it becomes more robust,
409
00:19:23,300 --> 00:19:27,100
and it learns
to improve upon use.
410
00:19:27,100 --> 00:19:29,500
O'BRIEN:
Is it me that's learning, or
the algorithm that's learning?
411
00:19:29,500 --> 00:19:31,600
Or are we learning together?
LOCK:
You're learning together.
412
00:19:31,600 --> 00:19:32,633
Okay.
413
00:19:34,133 --> 00:19:37,300
O'BRIEN (voiceover):
So, how does the Coapt pattern
recognition system work?
414
00:19:37,300 --> 00:19:42,166
It's called a Bayesian
classification model.
415
00:19:42,166 --> 00:19:43,966
As I train the software,
416
00:19:43,966 --> 00:19:46,600
it labels my
various EMG patterns
417
00:19:46,600 --> 00:19:49,300
into corresponding
classes of movement--
418
00:19:49,300 --> 00:19:53,533
hand open, hand closed,
wrist rotation, for example.
419
00:19:53,533 --> 00:19:56,166
As I use the arm,
420
00:19:56,166 --> 00:19:58,833
it compares the electrical
signals I'm transmitting
421
00:19:58,833 --> 00:20:02,666
to the existing library
of classifications I taught it.
422
00:20:02,666 --> 00:20:05,633
It relies on
statistical probability
423
00:20:05,633 --> 00:20:08,466
to choose the best match.
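[Editor's note: a minimal sketch of Bayesian classification in the spirit of the description above, not Coapt's actual implementation. Invented EMG samples are grouped by movement class, a Gaussian is fit per electrode channel, and a new pattern is assigned to the most probable class.]

```python
import math
from statistics import mean, stdev

# Two movement classes, each with a few two-channel EMG samples.
# All signal values are invented for illustration.
training = {
    "hand_open":  [[0.9, 0.10], [0.8, 0.20], [1.0, 0.15]],
    "hand_close": [[0.2, 0.90], [0.1, 0.80], [0.25, 1.0]],
}

# Fit a Gaussian per class and channel (mean and spread per electrode).
model = {}
for label, samples in training.items():
    channels = list(zip(*samples))
    model[label] = [(mean(c), stdev(c) + 1e-3) for c in channels]

def log_likelihood(x, params):
    return sum(-((v - m) ** 2) / (2 * s ** 2) - math.log(s)
               for v, (m, s) in zip(x, params))

def classify(signal):
    # Pick the movement class that makes this signal most probable.
    return max(model, key=lambda label: log_likelihood(signal, model[label]))

print(classify([0.85, 0.18]))   # -> hand_open
```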
424
00:20:08,466 --> 00:20:10,466
And this is just one way
machine learning
425
00:20:10,466 --> 00:20:13,300
is quietly
revolutionizing medicine.
426
00:20:16,100 --> 00:20:18,600
Computer scientist
Regina Barzilay
427
00:20:18,600 --> 00:20:21,766
first started working on
artificial intelligence
428
00:20:21,766 --> 00:20:26,133
in the 1990s, just as
rule-based A.I. like Deep Blue
429
00:20:26,133 --> 00:20:28,633
was giving way
to neural networks.
430
00:20:28,633 --> 00:20:30,766
She used the techniques
431
00:20:30,766 --> 00:20:32,733
to decipher dead languages.
432
00:20:32,733 --> 00:20:35,866
You might call it
a small language model.
433
00:20:35,866 --> 00:20:38,433
Something that is fun and
intellectually very challenging,
434
00:20:38,433 --> 00:20:40,466
but it's not like
it's going to change our life.
435
00:20:41,966 --> 00:20:44,766
O'BRIEN (voiceover):
And then her life changed
in an instant.
436
00:20:44,766 --> 00:20:47,466
CONSTANCE LEHMAN:
We see a spot there.
437
00:20:47,466 --> 00:20:51,000
O'BRIEN (voiceover):
In 2014, she was diagnosed
with breast cancer.
438
00:20:51,000 --> 00:20:52,833
BARZILAY (voiceover):
When you go through the
treatment,
439
00:20:52,833 --> 00:20:54,066
there are a lot of
people who are suffering.
440
00:20:54,066 --> 00:20:55,600
I was interested in
441
00:20:55,600 --> 00:20:58,833
what I can do about it, and
clearly it was not continuing
442
00:20:58,833 --> 00:21:00,633
deciphering dead languages,
443
00:21:00,633 --> 00:21:02,966
and it was quite a journey.
444
00:21:02,966 --> 00:21:07,500
O'BRIEN (voiceover):
Not surprisingly, she began that
journey with mammograms.
445
00:21:07,500 --> 00:21:09,166
LEHMAN:
It's a little bit
more prominent.
446
00:21:09,166 --> 00:21:10,966
O'BRIEN (voiceover):
She and Constance Lehman,
447
00:21:10,966 --> 00:21:14,966
a radiologist at
Massachusetts General Hospital,
448
00:21:14,966 --> 00:21:17,600
realized the Achilles heel
in the diagnostic system
449
00:21:17,600 --> 00:21:20,400
is the human eye.
450
00:21:20,400 --> 00:21:22,600
BARZILAY (voiceover):
So the question that we ask is,
451
00:21:22,600 --> 00:21:24,400
what is the likelihood
of these patients
452
00:21:24,400 --> 00:21:27,666
to develop cancer
within the next five years?
453
00:21:27,666 --> 00:21:29,466
We, with our human eyes,
454
00:21:29,466 --> 00:21:31,366
cannot really make these
assertions
455
00:21:31,366 --> 00:21:33,966
because the patterns
are so subtle.
456
00:21:33,966 --> 00:21:37,566
LEHMAN:
Now, is that different
from the surrounding tissue?
457
00:21:37,566 --> 00:21:40,066
O'BRIEN (voiceover):
It's a perfect use case
for pattern recognition
458
00:21:40,066 --> 00:21:43,733
using what is known as
a convolutional neural network.
459
00:21:43,733 --> 00:21:45,566
♪ ♪
460
00:21:45,566 --> 00:21:48,900
Here's an example
of how CNNs get smart:
461
00:21:48,900 --> 00:21:53,600
they comb through a picture with
many virtual magnifying glasses.
462
00:21:53,600 --> 00:21:57,000
Each one is looking for
a specific kind of puzzle piece,
463
00:21:57,000 --> 00:21:59,600
like an edge,
a shape, or a texture.
464
00:21:59,600 --> 00:22:01,900
Then it makes
simplified versions,
465
00:22:01,900 --> 00:22:05,533
repeating the process
on larger and larger sections.
466
00:22:05,533 --> 00:22:08,233
Eventually
the puzzle can be assembled.
467
00:22:08,233 --> 00:22:10,366
And it's time to make a guess.
468
00:22:10,366 --> 00:22:13,566
Is it a cat? A dog? A tree?
469
00:22:13,566 --> 00:22:17,633
Sometimes the guess is right,
but sometimes it's wrong.
470
00:22:17,633 --> 00:22:19,900
And here's the learning part:
471
00:22:19,900 --> 00:22:22,366
with a process
called backpropagation,
472
00:22:22,366 --> 00:22:27,233
the error on labeled images is sent
backward to adjust the earlier layers.
473
00:22:27,233 --> 00:22:29,900
So the next time
it plays the guessing game,
474
00:22:29,900 --> 00:22:31,933
it will be even better.
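[Editor's note: a minimal sketch of the "magnifying glass" (convolution) step, with a tiny invented image and a hand-coded edge filter. In a real CNN, backpropagation would learn these filter values from labeled data instead.]

```python
# Slide a small filter over an image and record how strongly
# each patch matches the pattern the filter is looking for.

image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# A 2x2 filter that responds to a vertical edge (dark left, bright right).
kernel = [[-1, 1],
          [-1, 1]]

def convolve(img, k):
    out = []
    for r in range(len(img) - 1):
        row = []
        for c in range(len(img[0]) - 1):
            # Dot product of the filter with one image patch.
            row.append(sum(k[i][j] * img[r + i][c + j]
                           for i in range(2) for j in range(2)))
        out.append(row)
    return out

for row in convolve(image, kernel):
    print(row)   # peaks at column 1, where the dark-to-bright edge sits
```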
475
00:22:31,933 --> 00:22:35,233
To validate the model,
Regina and her team gathered up
476
00:22:35,233 --> 00:22:38,233
more than 128,000 mammograms
477
00:22:38,233 --> 00:22:41,466
collected at seven sites
in four countries.
478
00:22:41,466 --> 00:22:45,166
More than 3,800 of them
led to a cancer diagnosis
479
00:22:45,166 --> 00:22:48,933
within five years.
480
00:22:48,933 --> 00:22:50,766
You just give to it the image,
481
00:22:50,766 --> 00:22:53,100
and then
the five years of outcomes,
482
00:22:53,100 --> 00:22:57,400
and it can learn the likelihood
of getting a cancer diagnosis.
483
00:22:57,400 --> 00:23:01,300
O'BRIEN (voiceover):
The software, called Mirai,
was a success.
484
00:23:01,300 --> 00:23:05,566
In fact, it is between
75% and 84% accurate
485
00:23:05,566 --> 00:23:08,966
in predicting
future cancer diagnoses.
486
00:23:11,433 --> 00:23:16,133
Then, a friend of
Regina's developed lung cancer.
487
00:23:16,133 --> 00:23:17,833
SEQUIST:
In lung cancer, it's actually
488
00:23:17,833 --> 00:23:20,266
sort of mind-boggling
how much has changed.
489
00:23:20,266 --> 00:23:23,566
O'BRIEN (voiceover):
Her friend saw oncologist
Lecia Sequist.
490
00:23:24,800 --> 00:23:26,000
She and Regina wondered
491
00:23:26,000 --> 00:23:29,433
if artificial intelligence
could be applied
492
00:23:29,433 --> 00:23:31,766
to CAT scans of patients' lungs.
493
00:23:31,766 --> 00:23:33,400
SEQUIST:
We taught the model
494
00:23:33,400 --> 00:23:37,733
to recognize the patterns
of developing lung cancer
495
00:23:37,733 --> 00:23:40,433
by using thousands of CAT scans
496
00:23:40,433 --> 00:23:41,566
from patients who were
participating
497
00:23:41,566 --> 00:23:42,866
in a clinical trial.
498
00:23:42,866 --> 00:23:45,033
From the new study?
Oh, interesting.
Correct.
499
00:23:45,033 --> 00:23:47,133
SEQUIST (voiceover):
We had a lot of information
about them.
500
00:23:47,133 --> 00:23:48,800
We had demographic information,
501
00:23:48,800 --> 00:23:50,633
we had health information,
502
00:23:50,633 --> 00:23:52,266
and we had outcomes information.
503
00:23:52,266 --> 00:23:55,366
O'BRIEN (voiceover):
They call the model Sybil.
504
00:23:55,366 --> 00:23:56,766
In the retrospective study,
right,
505
00:23:56,766 --> 00:23:58,366
so the retrospective data...
506
00:23:58,366 --> 00:23:59,900
O'BRIEN (voiceover):
Radiologist Florian Fintelmann
507
00:23:59,900 --> 00:24:01,933
showed me what it can do.
508
00:24:01,933 --> 00:24:05,066
FINTELMANN:
This is earlier,
and this is later.
509
00:24:05,066 --> 00:24:06,766
There is nothing
510
00:24:06,766 --> 00:24:10,100
that I can perceive, pick up,
or describe.
511
00:24:10,100 --> 00:24:12,666
There's no, what we call,
a precursor lesion
512
00:24:12,666 --> 00:24:13,800
on this CT scan.
513
00:24:13,800 --> 00:24:15,366
Sybil looked here
514
00:24:15,366 --> 00:24:17,566
and then anticipated that
there would be a problem
515
00:24:17,566 --> 00:24:20,233
based on the baseline scan.
What is it seeing?
516
00:24:20,233 --> 00:24:21,866
That's the million dollar
question.
517
00:24:21,866 --> 00:24:24,066
And, and maybe not
the million dollar question.
518
00:24:24,066 --> 00:24:26,500
Does it really matter? Does it?
519
00:24:26,500 --> 00:24:28,866
O'BRIEN (voiceover):
When they compared
the predictions
520
00:24:28,866 --> 00:24:33,233
to actual outcomes from previous
cases, Sybil fared well.
521
00:24:33,233 --> 00:24:35,533
It correctly forecast cancer
522
00:24:35,533 --> 00:24:38,500
between 80% and 95% of the time,
523
00:24:38,500 --> 00:24:41,400
depending on the population
it studied.
524
00:24:41,400 --> 00:24:44,100
The technique is
still in the trial phase.
525
00:24:44,100 --> 00:24:46,133
But once it is deployed,
526
00:24:46,133 --> 00:24:49,766
it could provide
a potent tool for prevention.
527
00:24:52,533 --> 00:24:54,933
The hope is that if you
can predict very early on
528
00:24:54,933 --> 00:24:57,400
that the patient
is in the wrong way,
529
00:24:57,400 --> 00:25:00,300
you can do clinical trials,
you can develop the drugs
530
00:25:00,300 --> 00:25:05,100
that are doing the prevention,
rather than treatment
531
00:25:05,100 --> 00:25:07,666
of very advanced disease
that we are doing today.
532
00:25:09,000 --> 00:25:12,300
O'BRIEN (voiceover):
Which takes us back to DeepMind
and AlphaGo.
533
00:25:12,300 --> 00:25:14,733
The fun and games
were just the beginning,
534
00:25:14,733 --> 00:25:17,300
a means to an end.
535
00:25:17,300 --> 00:25:20,900
We have always set out
at DeepMind
536
00:25:20,900 --> 00:25:24,466
to, um, use our technologies to
make the world a better place.
537
00:25:24,466 --> 00:25:27,400
O'BRIEN (voiceover):
In 2021,
538
00:25:27,400 --> 00:25:29,400
the company released AlphaFold.
539
00:25:29,400 --> 00:25:31,733
It is pattern
recognition software
540
00:25:31,733 --> 00:25:34,066
designed to make
it easier for researchers
541
00:25:34,066 --> 00:25:35,800
to understand proteins,
542
00:25:35,800 --> 00:25:39,100
long chains of amino acids
543
00:25:39,100 --> 00:25:41,166
involved in nearly
every function in our bodies.
544
00:25:41,166 --> 00:25:43,066
How a protein folds
545
00:25:43,066 --> 00:25:45,466
into a specific,
three-dimensional shape
546
00:25:45,466 --> 00:25:50,133
determines how it interacts
with other molecules.
547
00:25:50,133 --> 00:25:52,000
SULEYMAN:
There's this correlation between
548
00:25:52,000 --> 00:25:55,233
what the protein does
and how it's structured.
549
00:25:55,233 --> 00:25:58,733
So if we can predict
how the protein folds,
550
00:25:58,733 --> 00:26:01,366
then we can say something
about its function.
551
00:26:01,366 --> 00:26:04,766
O'BRIEN:
If we know how a disease's
protein is shaped, or folded,
552
00:26:04,766 --> 00:26:08,700
we can sometimes create
a drug to disable it.
553
00:26:08,700 --> 00:26:12,800
But the shape of millions
of proteins remained a mystery.
554
00:26:12,800 --> 00:26:15,333
DeepMind trained AlphaFold
555
00:26:15,333 --> 00:26:18,333
on thousands of
known protein structures.
556
00:26:18,333 --> 00:26:20,600
It leveraged this knowledge
to predict
557
00:26:20,600 --> 00:26:23,266
200 million
protein structures,
558
00:26:23,266 --> 00:26:27,766
nearly all the proteins
known to science.
559
00:26:27,766 --> 00:26:30,466
SULEYMAN:
You take some high-quality
known data,
560
00:26:30,466 --> 00:26:33,733
and you use that to,
you know,
561
00:26:33,733 --> 00:26:37,800
make a prediction about how
a similar piece of information
562
00:26:37,800 --> 00:26:40,100
is likely to unfold
over some time series,
563
00:26:40,100 --> 00:26:42,166
and the structure
of proteins is,
564
00:26:42,166 --> 00:26:44,366
you know, in that sense,
no different to
565
00:26:44,366 --> 00:26:47,533
making a prediction in
the game of Go or in Atari
566
00:26:47,533 --> 00:26:49,266
or in a mammography scan,
567
00:26:49,266 --> 00:26:51,800
or indeed,
in a large language model.
568
00:26:51,800 --> 00:26:53,333
KAMYA:
These thin sticks here?
569
00:26:53,333 --> 00:26:55,700
Yeah?
They represent
the amino acids
570
00:26:55,700 --> 00:26:57,233
that make up a protein.
571
00:26:57,233 --> 00:26:58,400
O'BRIEN (voiceover):
Theoretical chemist
572
00:26:58,400 --> 00:27:01,266
Petrina Kamya works for
a company called
573
00:27:01,266 --> 00:27:03,333
Insilico Medicine.
574
00:27:03,333 --> 00:27:05,133
It uses AlphaFold
575
00:27:05,133 --> 00:27:07,200
and its own
deep-learning models
576
00:27:07,200 --> 00:27:12,533
to make accurate predictions
about protein structures.
577
00:27:12,533 --> 00:27:14,766
What we're doing in drug design
is we're designing a molecule
578
00:27:14,766 --> 00:27:17,966
that is analogous
to the natural molecule
579
00:27:17,966 --> 00:27:19,166
that binds to the protein,
580
00:27:19,166 --> 00:27:20,966
but instead it will lock it,
if this molecule
581
00:27:20,966 --> 00:27:23,533
is involved in a disease
where it's hyperactive.
582
00:27:24,533 --> 00:27:26,333
O'BRIEN (voiceover):
If the molecule fits well,
583
00:27:26,333 --> 00:27:29,266
it can inhibit the
disease-causing proteins.
584
00:27:29,266 --> 00:27:30,800
So you're
filtering it down
585
00:27:30,800 --> 00:27:33,333
like you're choosing
an Airbnb or something to,
586
00:27:33,333 --> 00:27:35,300
you know, number of bedrooms,
whatever.
To suit your needs.
587
00:27:35,300 --> 00:27:36,466
(laughs)
Exactly, right.
588
00:27:36,466 --> 00:27:38,100
Right, yeah.
That's a very good analogy.
589
00:27:38,100 --> 00:27:39,966
It's sort of like Airbnb.
590
00:27:39,966 --> 00:27:42,033
So you are putting in
your criteria,
591
00:27:42,033 --> 00:27:43,666
and then Airbnb will
filter out
592
00:27:43,666 --> 00:27:44,900
all the different
properties
593
00:27:44,900 --> 00:27:46,100
based on your criteria.
594
00:27:46,100 --> 00:27:47,433
So you can be very, very
restrictive
595
00:27:47,433 --> 00:27:48,833
or you can be very,
very free...
Right.
596
00:27:48,833 --> 00:27:51,133
In terms of guiding the
generative algorithms
597
00:27:51,133 --> 00:27:52,533
and telling them
what types of molecules
598
00:27:52,533 --> 00:27:54,266
you want them to generate.
599
00:27:54,266 --> 00:27:59,266
O'BRIEN (voiceover):
It will take 48 to 72 hours
of computing time
600
00:27:59,266 --> 00:28:02,666
to identify the best
candidates, ranked in order.
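[Editor's note: a sketch of the filter-and-rank step in the Airbnb analogy above. The molecules and property scores are entirely invented; real pipelines score candidates with learned models rather than fixed thresholds.]

```python
# Screen generated candidate molecules against hard criteria,
# then rank the survivors by a predicted score.

candidates = [
    {"name": "mol-A", "binding": 8.2, "toxicity": 0.10, "solubility": 0.7},
    {"name": "mol-B", "binding": 9.1, "toxicity": 0.60, "solubility": 0.5},
    {"name": "mol-C", "binding": 7.5, "toxicity": 0.05, "solubility": 0.9},
]

# Hard filters first (like "must have two bedrooms")...
viable = [m for m in candidates
          if m["toxicity"] < 0.3 and m["solubility"] > 0.6]

# ...then rank what survives by predicted binding strength.
ranked = sorted(viable, key=lambda m: m["binding"], reverse=True)
for m in ranked:
    print(m["name"], m["binding"])
```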
601
00:28:02,666 --> 00:28:04,166
How long would it
have taken you
602
00:28:04,166 --> 00:28:07,100
to figure that out
as a computational chemist?
603
00:28:07,100 --> 00:28:08,700
I would have thought of
some of these,
604
00:28:08,700 --> 00:28:09,733
but not all of them.
Okay.
605
00:28:11,100 --> 00:28:13,600
O'BRIEN (voiceover):
While there are no shortcuts
for human trials,
606
00:28:13,600 --> 00:28:15,700
nor should we hope for that,
607
00:28:15,700 --> 00:28:20,100
this could greatly speed up
the drug development pipeline.
608
00:28:21,500 --> 00:28:23,300
There will not be the need
to invest so heavily
609
00:28:23,300 --> 00:28:25,500
in preclinical discovery,
610
00:28:25,500 --> 00:28:29,100
and so,
drugs can be cheaper.
611
00:28:29,100 --> 00:28:30,800
And you can go
after those diseases
612
00:28:30,800 --> 00:28:33,566
that are
otherwise neglected,
613
00:28:33,566 --> 00:28:35,266
because you don't have
to invest so heavily
614
00:28:35,266 --> 00:28:36,433
in order for you
to come up with a drug,
615
00:28:36,433 --> 00:28:38,666
a viable drug.
616
00:28:38,666 --> 00:28:40,766
O'BRIEN (voiceover):
But medicine isn't
the only place
617
00:28:40,766 --> 00:28:43,033
where A.I. is breaking
new frontiers.
618
00:28:43,033 --> 00:28:46,400
It's conducting
financial analysis,
619
00:28:46,400 --> 00:28:49,200
and helping with fraud detection.
620
00:28:49,200 --> 00:28:50,666
(mechanical whirring)
621
00:28:50,666 --> 00:28:53,966
It's now being deployed
to discover novel materials
622
00:28:53,966 --> 00:28:58,366
and could help us build
clean energy technology.
623
00:28:58,366 --> 00:29:02,900
And it is even helping
to save lives
624
00:29:02,900 --> 00:29:04,800
as the climate crisis
boils over.
625
00:29:06,033 --> 00:29:07,600
(indistinct radio chatter)
626
00:29:07,600 --> 00:29:08,966
In St. Helena, California,
627
00:29:08,966 --> 00:29:10,300
dispatchers at the
628
00:29:10,300 --> 00:29:13,966
CAL FIRE Sonoma-Lake-Napa
Command Center
629
00:29:13,966 --> 00:29:16,433
caught a break in 2023.
630
00:29:16,433 --> 00:29:22,100
Wildfires blackened nearly
700 acres of their territory.
631
00:29:22,100 --> 00:29:24,266
We were at 400,000 acres
in 2020.
632
00:29:25,533 --> 00:29:27,300
Something like that would
generate a response from us...
633
00:29:27,300 --> 00:29:30,500
O'BRIEN (voiceover):
Chief Mike Marcucci has
been fighting fires
634
00:29:30,500 --> 00:29:32,533
for more than 30 years.
635
00:29:32,533 --> 00:29:34,666
MARCUCCI (voiceover):
Once we started having
these devastating fires,
636
00:29:34,666 --> 00:29:35,766
we needed more intel.
637
00:29:35,766 --> 00:29:37,566
The need for intelligence
638
00:29:37,566 --> 00:29:40,200
is just overwhelming
in today's fire service.
639
00:29:41,400 --> 00:29:43,200
O'BRIEN (voiceover):
Over the past 20 years,
640
00:29:43,200 --> 00:29:45,166
California
has installed a network
641
00:29:45,166 --> 00:29:47,366
of more than
1,000 remotely operated
642
00:29:47,366 --> 00:29:51,633
pan-tilt-zoom surveillance
cameras on mountaintops.
643
00:29:53,000 --> 00:29:55,000
PETE AVANSINO:
Vegetation fire,
Highway 29 at Doton Road.
644
00:29:56,833 --> 00:29:59,933
O'BRIEN (voiceover):
All those cameras generate
petabytes of video.
645
00:30:00,966 --> 00:30:03,833
CAL FIRE partnered with
scientists at U.C. San Diego
646
00:30:03,833 --> 00:30:05,766
to train a neural network
647
00:30:05,766 --> 00:30:08,200
to spot the early signs
of trouble.
648
00:30:08,200 --> 00:30:11,500
It's called
ALERT California.
649
00:30:11,500 --> 00:30:13,133
SeLEGUE:
So here's one
that just popped up.
650
00:30:13,133 --> 00:30:15,166
Here's an anomaly.
651
00:30:15,166 --> 00:30:19,166
O'BRIEN (voiceover):
CAL FIRE Staff Chief of Fire
and Intelligence Philip SeLegue
652
00:30:19,166 --> 00:30:22,533
showed me how it works
while it was in action,
653
00:30:22,533 --> 00:30:24,433
detecting nascent fires,
654
00:30:24,433 --> 00:30:26,800
micro fires.
655
00:30:26,800 --> 00:30:28,266
That looks like
just a little hint
656
00:30:28,266 --> 00:30:30,733
of some type of smoke
that was there...
657
00:30:30,733 --> 00:30:32,433
O'BRIEN (voiceover):
Based on this,
dispatchers can orchestrate
658
00:30:32,433 --> 00:30:34,066
a fast response.
659
00:30:35,733 --> 00:30:40,500
A.I. has given us the ability
to detect and to see
660
00:30:40,500 --> 00:30:42,333
where those fires
are starting.
661
00:30:42,333 --> 00:30:45,100
AVANSINO:
Transport 1447
responding via MDC.
662
00:30:45,100 --> 00:30:46,533
O'BRIEN (voiceover):
For all they know,
663
00:30:46,533 --> 00:30:49,966
they have nipped
some megafires in the bud.
664
00:30:49,966 --> 00:30:51,200
The successes are the fires
665
00:30:51,200 --> 00:30:52,833
that you don't hear about
in the news.
666
00:30:52,833 --> 00:30:55,200
O'BRIEN (voiceover):
Artificial intelligence
667
00:30:55,200 --> 00:30:57,833
can't put out
wildfires just yet.
668
00:30:57,833 --> 00:31:01,833
Human firefighters
still need to do that job.
669
00:31:03,366 --> 00:31:05,600
But researchers are pushing hard
670
00:31:05,600 --> 00:31:07,700
to combine neural networks
671
00:31:07,700 --> 00:31:10,300
with mobility and dexterity.
672
00:31:11,766 --> 00:31:13,400
This is where people
get nervous.
673
00:31:13,400 --> 00:31:15,133
Will they take our jobs?
674
00:31:15,133 --> 00:31:17,133
Or could they turn against us?
675
00:31:18,100 --> 00:31:19,766
But at M.I.T.,
676
00:31:19,766 --> 00:31:22,433
they're exploring ideas
to make robots
677
00:31:22,433 --> 00:31:24,100
good human partners.
678
00:31:25,900 --> 00:31:27,900
We are interested in
making machines
679
00:31:27,900 --> 00:31:30,633
that help people with
physical and cognitive tasks.
680
00:31:30,633 --> 00:31:32,266
So this is
really great,
681
00:31:32,266 --> 00:31:35,266
it has the stiffness
that we wanted...
682
00:31:35,266 --> 00:31:38,066
O'BRIEN (voiceover):
Daniela Rus is director of
M.I.T.'s Computer Science
683
00:31:38,066 --> 00:31:41,133
and
Artificial Intelligence Lab.
684
00:31:41,133 --> 00:31:42,133
Oh, can you
bring it to me?
685
00:31:42,133 --> 00:31:44,000
O'BRIEN (voiceover):
CSAIL.
686
00:31:44,000 --> 00:31:46,066
They are different, like,
kind of like muscles
687
00:31:46,066 --> 00:31:47,466
or actuators.
688
00:31:47,466 --> 00:31:49,066
RUS (voiceover):
We can do so much more
689
00:31:49,066 --> 00:31:52,066
when we get people and machines
working together.
690
00:31:53,233 --> 00:31:54,533
We can get better reach.
691
00:31:54,533 --> 00:31:55,533
We can get lift,
692
00:31:55,533 --> 00:31:58,700
precision, strength, vision.
693
00:31:58,700 --> 00:32:00,466
All of these are
physical superpowers
694
00:32:00,466 --> 00:32:01,633
we can get
through machines.
695
00:32:03,033 --> 00:32:04,066
O'BRIEN (voiceover):
So, they're focusing
696
00:32:04,066 --> 00:32:05,633
on making it safe for humans
697
00:32:05,633 --> 00:32:08,866
to work in close proximity
to machines.
698
00:32:08,866 --> 00:32:11,733
They're using some of
the technology that's inside
699
00:32:11,733 --> 00:32:13,166
my prosthetic arm.
700
00:32:13,166 --> 00:32:15,233
Electrodes
that can read
701
00:32:15,233 --> 00:32:17,666
the faint
EMG signals generated
702
00:32:17,666 --> 00:32:19,000
as our nerves command
703
00:32:19,000 --> 00:32:20,466
our muscles to move.
704
00:32:22,866 --> 00:32:25,566
They have the capability to
interact with a human,
705
00:32:25,566 --> 00:32:26,933
to understand the human,
706
00:32:26,933 --> 00:32:29,600
to step in and help the human
as needed.
707
00:32:29,600 --> 00:32:33,466
I am at your disposal with
187 other languages,
708
00:32:33,466 --> 00:32:35,133
along with their various
709
00:32:35,133 --> 00:32:36,966
dialects and sub-tongues.
710
00:32:36,966 --> 00:32:39,133
O'BRIEN (voiceover):
But making robots as useful
711
00:32:39,133 --> 00:32:41,933
as they are in the movies
is a big challenge.
712
00:32:41,933 --> 00:32:43,733
♪ ♪
713
00:32:43,733 --> 00:32:47,433
Most neural networks run on
powerful supercomputers--
714
00:32:47,433 --> 00:32:51,733
thousands of processors
occupying entire buildings.
715
00:32:53,266 --> 00:32:54,966
RUS:
We have brains that require
716
00:32:54,966 --> 00:32:58,800
massive computation,
which you cannot include
717
00:32:58,800 --> 00:33:01,433
on a self-contained body.
718
00:33:01,433 --> 00:33:04,766
We address
the size challenge by
719
00:33:04,766 --> 00:33:06,533
making liquid networks.
720
00:33:06,533 --> 00:33:08,533
O'BRIEN (voiceover):
Liquid networks.
721
00:33:08,533 --> 00:33:10,033
So it looks like
an autonomous vehicle
722
00:33:10,033 --> 00:33:11,266
like I've seen before,
723
00:33:11,266 --> 00:33:12,600
but it is a little
different, right?
724
00:33:12,600 --> 00:33:13,966
ALEXANDER AMINI:
Very different.
725
00:33:13,966 --> 00:33:15,300
This is an
autonomous vehicle
726
00:33:15,300 --> 00:33:16,833
that can drive in
brand-new environments
727
00:33:16,833 --> 00:33:19,466
that it has never seen
before for the first time.
728
00:33:20,600 --> 00:33:22,700
O'BRIEN (voiceover):
Most self-driving cars
today rely,
729
00:33:22,700 --> 00:33:25,600
to some extent,
on detailed databases
730
00:33:25,600 --> 00:33:28,500
that help them recognize
their immediate environment.
731
00:33:28,500 --> 00:33:33,300
Those robot cars get lost
in unfamiliar terrain.
732
00:33:34,733 --> 00:33:37,033
O'BRIEN:
In this case,
you're not relying on
733
00:33:37,033 --> 00:33:39,866
a huge, expansive
neural network.
734
00:33:39,866 --> 00:33:41,366
You're running on
19 neurons, right?
735
00:33:41,366 --> 00:33:43,366
Correct.
736
00:33:43,366 --> 00:33:45,333
O'BRIEN (voiceover):
Computer scientist
Alexander Amini
737
00:33:45,333 --> 00:33:48,633
took me on a ride
in an autonomous vehicle
738
00:33:48,633 --> 00:33:52,233
with a liquid neural
network brain.
739
00:33:52,233 --> 00:33:54,366
AMINI:
We've become very accustomed
to relying on
740
00:33:54,366 --> 00:33:57,033
big, giant data centers
and cloud compute.
741
00:33:57,033 --> 00:33:58,633
But in an autonomous vehicle,
742
00:33:58,633 --> 00:34:00,266
you cannot make
such assumptions, right?
743
00:34:00,266 --> 00:34:01,833
You need to be able to operate,
744
00:34:01,833 --> 00:34:03,700
even if you lose
internet connectivity
745
00:34:03,700 --> 00:34:06,200
and you cannot
talk to the cloud anymore,
746
00:34:06,200 --> 00:34:07,766
your entire neural network,
747
00:34:07,766 --> 00:34:10,033
the brain of the car,
needs to live on the car,
748
00:34:10,033 --> 00:34:12,800
and that imposes a lot
of interesting constraints.
749
00:34:13,966 --> 00:34:15,366
O'BRIEN (voiceover):
To build a brain smart enough
750
00:34:15,366 --> 00:34:17,333
and small enough to
do this job,
751
00:34:17,333 --> 00:34:20,133
they took some inspiration
from nature,
752
00:34:20,133 --> 00:34:24,333
a lowly worm
called C. elegans.
753
00:34:24,333 --> 00:34:27,633
Its brain contains all of
300 neurons,
754
00:34:27,633 --> 00:34:30,100
but it's a very
different kind of neuron.
755
00:34:32,133 --> 00:34:33,633
It can capture
more complex behaviors
756
00:34:33,633 --> 00:34:35,166
in every single piece
of that puzzle.
757
00:34:35,166 --> 00:34:36,366
And also the wiring,
758
00:34:36,366 --> 00:34:38,800
how a neuron talks to
another neuron
759
00:34:38,800 --> 00:34:40,666
is completely different
than what we see
760
00:34:40,666 --> 00:34:42,366
in today's
neural networks.
761
00:34:43,800 --> 00:34:47,233
O'BRIEN (voiceover):
Autonomous cars that tap
into today's neural networks
762
00:34:47,233 --> 00:34:50,966
require huge amounts of
compute power in the cloud.
763
00:34:52,500 --> 00:34:55,300
But this car is using
just 19 liquid neurons.
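[Editor's note: a rough, hypothetical sketch of a single liquid time-constant neuron, loosely following the published LTC formulation (Hasani et al.), with invented parameters. The neuron's state follows a differential equation whose effective time constant changes with the input, integrated here with simple Euler steps.]

```python
import math

tau, A = 1.0, 1.0          # base time constant and bias parameter
w, dt, x = 2.0, 0.05, 0.0  # input weight, step size, neuron state

def gate(x, inp):
    # Input-dependent gate; in a real network this is a learned function.
    return math.tanh(w * inp + 0.5 * x)

for step in range(200):
    inp = 1.0 if step < 100 else 0.0           # input pulse, then silence
    g = gate(x, inp)
    dxdt = -(1.0 / tau + g) * x + g * A        # LTC state equation
    x += dt * dxdt                             # Euler integration step

print(round(x, 3))   # the state relaxes after the input is removed
```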
764
00:34:56,433 --> 00:34:59,533
A worm at the wheel...
sort of.
765
00:34:59,533 --> 00:35:01,000
AMINI (voiceover):
Today's A.I. models
766
00:35:01,000 --> 00:35:02,700
are really
pushing the boundaries
767
00:35:02,700 --> 00:35:05,300
of the scale of compute
that we have.
768
00:35:05,300 --> 00:35:07,100
They're also pushing
the boundaries
769
00:35:07,100 --> 00:35:08,433
of the data sets
that we have.
770
00:35:08,433 --> 00:35:09,966
And that's not sustainable,
771
00:35:09,966 --> 00:35:11,900
because ultimately,
we need to deploy A.I.
772
00:35:11,900 --> 00:35:13,500
onto the device itself,
right?
773
00:35:13,500 --> 00:35:15,766
Onto the cars,
onto the surgical robots.
774
00:35:15,766 --> 00:35:17,366
All of these edge devices
775
00:35:17,366 --> 00:35:20,733
that actually make
the decisions.
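To make that constraint concrete: a network's weight memory is roughly its parameter count times bytes per parameter. A back-of-the-envelope comparison in Python, with layer sizes invented for illustration rather than taken from any production model:

```python
# Rough on-device memory for weights, at 4 bytes per float32 parameter.
def dense_params(sizes):
    # fully connected layers: weights + biases between consecutive sizes
    return sum(a * b + b for a, b in zip(sizes, sizes[1:]))

big_cloud_net = dense_params([1024, 4096, 4096, 1024, 10])   # ~25M parameters
tiny_liquid_policy = dense_params([8, 19, 1]) + 19 * 19      # 19 recurrent neurons

for name, p in [("cloud-scale net", big_cloud_net),
                ("19-neuron policy", tiny_liquid_policy)]:
    print(f"{name}: {p:,} parameters, ~{4 * p / 1e6:.2f} MB")
```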
776
00:35:20,733 --> 00:35:23,366
O'BRIEN (voiceover):
The A.I. worm may, in fact,
777
00:35:23,366 --> 00:35:24,666
turn.
778
00:35:27,500 --> 00:35:28,833
The portability of
artificial intelligence
779
00:35:28,833 --> 00:35:32,100
was on my mind
when it came time
780
00:35:32,100 --> 00:35:35,833
to pick up
my new myoelectric arm...
781
00:35:35,833 --> 00:35:38,833
equipped with
Coapt A.I. pattern recognition.
782
00:35:38,833 --> 00:35:40,666
All right,
let's just check this
783
00:35:40,666 --> 00:35:42,133
real quick...
784
00:35:42,133 --> 00:35:43,500
O'BRIEN (voiceover):
A few weeks after
785
00:35:43,500 --> 00:35:44,833
my trip to Chicago,
786
00:35:44,833 --> 00:35:46,266
I met Brian Monroe
787
00:35:46,266 --> 00:35:50,000
at his home office
outside Washington, D.C.
788
00:35:50,000 --> 00:35:52,166
Are you happy with
the way it came out?
Yeah.
789
00:35:52,166 --> 00:35:54,133
Would you
tell me otherwise?
790
00:35:54,133 --> 00:35:56,933
(laughing):
Yeah, I would, yeah...
791
00:35:58,066 --> 00:35:59,366
O'BRIEN (voiceover):
As usual,
792
00:35:59,366 --> 00:36:02,200
he did a great job
making a tight socket.
793
00:36:03,566 --> 00:36:05,233
How's the socket feel?
Does it feel like
794
00:36:05,233 --> 00:36:06,766
it's sliding down or
795
00:36:06,766 --> 00:36:09,200
falling out...
No, it fits like a glove.
796
00:36:10,333 --> 00:36:11,900
O'BRIEN (voiceover):
It's really important in
this case,
797
00:36:11,900 --> 00:36:15,233
because the electrodes designed
to read the signals
798
00:36:15,233 --> 00:36:17,600
from my muscles...
799
00:36:17,600 --> 00:36:19,266
...have to stay in place snugly
800
00:36:19,266 --> 00:36:23,266
in order to generate
accurate, reliable commands
801
00:36:23,266 --> 00:36:24,900
to the actuators
in my new hand.
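Coapt's pattern-recognition software is proprietary, but the textbook recipe it belongs to is well documented: slice the multi-electrode EMG stream into short windows, compute simple time-domain features per channel, and train a lightweight classifier to map each window to an intended motion. A minimal sketch under those assumptions, with synthetic numbers standing in for real muscle recordings:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def emg_features(window):
    """Classic time-domain features per channel: mean absolute value,
    waveform length, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

# Synthetic stand-in: 8 electrodes, 200-sample windows,
# 3 intended motions (open, close, rest).
rng = np.random.default_rng(1)
X, y = [], []
for label, gain in zip(["open", "close", "rest"], [2.0, 1.0, 0.2]):
    for _ in range(50):                      # 50 training windows per motion
        window = gain * rng.normal(size=(200, 8))
        X.append(emg_features(window))
        y.append(label)

clf = LinearDiscriminantAnalysis().fit(np.array(X), y)

# At run time, each new window becomes a command for the actuators.
new_window = 2.0 * rng.normal(size=(200, 8))
print(clf.predict([emg_features(new_window)]))   # -> ['open']
```

This is also why the snug socket matters: if an electrode shifts, the features drift away from the patterns the classifier was trained on.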
802
00:36:26,533 --> 00:36:28,200
Wait, is that you?
That's me.
803
00:36:30,066 --> 00:36:32,366
(voiceover):
He also provided me with
804
00:36:32,366 --> 00:36:34,366
a human-like bionic hand.
805
00:36:35,633 --> 00:36:37,400
But getting it
to work just right
806
00:36:37,400 --> 00:36:39,333
took some time.
807
00:36:39,333 --> 00:36:41,566
That's open and it's closing.
808
00:36:41,566 --> 00:36:42,766
It's backwards?
809
00:36:42,766 --> 00:36:44,166
Yeah.
Now try.
810
00:36:44,166 --> 00:36:45,533
If it's reversed,
811
00:36:45,533 --> 00:36:46,966
I can swap the electrodes.
There we go.
812
00:36:46,966 --> 00:36:48,966
That's got it.
Is it the right direction?
813
00:36:48,966 --> 00:36:50,333
Yeah.
Uh-huh. Okay.
814
00:36:50,333 --> 00:36:53,766
O'BRIEN (voiceover):
It's a long way from the movies,
815
00:36:53,766 --> 00:36:55,300
and I'm no Luke Skywalker.
816
00:36:55,300 --> 00:36:59,266
But my new arm and I
are now together.
817
00:36:59,266 --> 00:37:01,033
And I'm heartened
to know
818
00:37:01,033 --> 00:37:02,666
that I have the freedom
and independence
819
00:37:02,666 --> 00:37:04,133
to teach and tweak it
820
00:37:04,133 --> 00:37:05,166
on my own.
821
00:37:05,166 --> 00:37:06,733
That's kind of cool.
Yeah.
822
00:37:06,733 --> 00:37:09,266
(voiceover):
Hopefully we will listen to
each other.
823
00:37:09,266 --> 00:37:10,766
It's pretty awesome.
824
00:37:10,766 --> 00:37:12,466
O'BRIEN (voiceover):
But we might want to listen
825
00:37:12,466 --> 00:37:14,533
with a skeptical ear.
826
00:37:15,766 --> 00:37:19,300
JORDAN PEELE (imitating Obama):
You see, I would never
say these things,
827
00:37:19,300 --> 00:37:21,833
at least not in
a public address,
828
00:37:21,833 --> 00:37:23,766
but someone else would.
829
00:37:23,766 --> 00:37:26,100
Someone like Jordan Peele.
830
00:37:27,766 --> 00:37:29,900
This is a dangerous time.
831
00:37:29,900 --> 00:37:33,200
O'BRIEN (voiceover):
It's even more dangerous now
than it was in 2018
832
00:37:33,200 --> 00:37:35,333
when comedian Jordan Peele
833
00:37:35,333 --> 00:37:38,400
combined his pitch-perfect
Obama impression
834
00:37:38,400 --> 00:37:44,233
with A.I. software to make
this convincing fake video.
835
00:37:44,233 --> 00:37:47,166
...or whether we become some
kind of (bleep) up dystopia.
836
00:37:47,166 --> 00:37:49,166
♪ ♪
837
00:37:49,166 --> 00:37:51,233
O'BRIEN (voiceover):
Fakes are about as old as
838
00:37:51,233 --> 00:37:53,200
photography itself.
839
00:37:53,200 --> 00:37:56,566
Mussolini, Hitler,
and Stalin
840
00:37:56,566 --> 00:37:59,633
all ordered that pictures be
doctored or redacted,
841
00:37:59,633 --> 00:38:03,100
erasing those
who fell out of favor,
842
00:38:03,100 --> 00:38:05,233
consolidating power,
843
00:38:05,233 --> 00:38:08,333
manipulating their followers
through images.
844
00:38:08,333 --> 00:38:09,600
HANY FARID:
They've always been manipulated,
845
00:38:09,600 --> 00:38:12,000
throughout history, but--
846
00:38:12,000 --> 00:38:14,266
there was literally,
you can count on one hand,
847
00:38:14,266 --> 00:38:15,833
the number of people
in the world who could do this.
848
00:38:15,833 --> 00:38:18,166
But now,
you need almost no skill.
849
00:38:18,166 --> 00:38:19,900
And we said,
"Give us an image
850
00:38:19,900 --> 00:38:21,100
"of a middle-aged woman,
newscaster,
851
00:38:21,100 --> 00:38:22,833
sitting at her desk,
reading the news."
852
00:38:22,833 --> 00:38:25,100
O'BRIEN (voiceover):
Hany Farid is a professor
of computer science
853
00:38:25,100 --> 00:38:27,033
at U.C. Berkeley.
854
00:38:27,033 --> 00:38:29,333
(on computer):
And this is your daily dose
of future flash.
855
00:38:29,333 --> 00:38:30,600
O'BRIEN (voiceover):
He and his team
856
00:38:30,600 --> 00:38:32,933
are trying to navigate
the house of mirrors
857
00:38:32,933 --> 00:38:36,266
that is the world of
A.I.-enabled deepfake imagery.
858
00:38:37,266 --> 00:38:38,566
Not perfect.
859
00:38:38,566 --> 00:38:40,900
She's not blinking,
but it's pretty good.
860
00:38:40,900 --> 00:38:43,633
And by the way, he did this
in a day and a half.
861
00:38:43,633 --> 00:38:45,233
FARID (voiceover):
It's the
classic automation story.
862
00:38:45,233 --> 00:38:47,266
We have lowered
barriers to entry
863
00:38:47,266 --> 00:38:49,133
to manipulate reality.
864
00:38:49,133 --> 00:38:50,666
And when you do that,
865
00:38:50,666 --> 00:38:52,066
more and more people
will do it.
866
00:38:52,066 --> 00:38:53,200
Some good people
will do it,
867
00:38:53,200 --> 00:38:54,433
but lots of bad people
will do it.
868
00:38:54,433 --> 00:38:55,800
There'll be some
interesting use cases,
869
00:38:55,800 --> 00:38:57,500
and there'll be a lot of
nefarious use cases.
870
00:38:57,500 --> 00:39:00,766
Okay, so, um...
871
00:39:00,766 --> 00:39:02,866
Glasses off.
How's the framing?
872
00:39:02,866 --> 00:39:03,966
Everything okay?
873
00:39:03,966 --> 00:39:05,333
(voiceover):
About a week before
874
00:39:05,333 --> 00:39:06,966
I got on a plane to see him...
Hold on.
875
00:39:06,966 --> 00:39:08,900
O'BRIEN (voiceover):
He asked me to meet him on Zoom
876
00:39:08,900 --> 00:39:10,533
so he could
get a good recording
877
00:39:10,533 --> 00:39:11,866
of my voice and mannerisms.
878
00:39:11,866 --> 00:39:14,366
And I assume
you're recording, Miles.
879
00:39:14,366 --> 00:39:16,466
O'BRIEN (voiceover):
And he turned the tables on me
a little bit,
880
00:39:16,466 --> 00:39:18,400
asking me a lot of questions
881
00:39:18,400 --> 00:39:20,100
to get a good sampling.
882
00:39:20,100 --> 00:39:21,700
FARID (on computer):
How are you feeling about
883
00:39:21,700 --> 00:39:24,800
the role of A.I.
as it enters into our world
884
00:39:24,800 --> 00:39:26,266
on a daily basis?
885
00:39:26,266 --> 00:39:28,233
I think it's very important,
first of all,
886
00:39:28,233 --> 00:39:31,000
to calibrate the concern level.
887
00:39:31,000 --> 00:39:33,366
Let's take it away from
the "Terminator" scenario...
888
00:39:34,566 --> 00:39:36,633
(voiceover):
The "Terminator" scenario.
889
00:39:36,633 --> 00:39:38,066
Come with me
if you want to live.
890
00:39:39,333 --> 00:39:42,400
O'BRIEN (voiceover):
You know, a malevolent
neural network
891
00:39:42,400 --> 00:39:44,133
hellbent on exterminating
humanity.
892
00:39:44,133 --> 00:39:45,600
You're really real.
893
00:39:45,600 --> 00:39:47,533
O'BRIEN (voiceover):
In the film series,
894
00:39:47,533 --> 00:39:48,866
the cyborg assassin
895
00:39:48,866 --> 00:39:51,966
is memorably played
by Arnold Schwarzenegger.
896
00:39:51,966 --> 00:39:54,033
Hany thought it would be fun
897
00:39:54,033 --> 00:39:57,266
to use A.I.
to turn Arnold into me.
898
00:39:57,266 --> 00:39:58,300
Okay.
899
00:39:59,500 --> 00:40:01,000
O'BRIEN (voiceover):
A week later, I showed up at
900
00:40:01,000 --> 00:40:02,933
Berkeley's
School of Information,
901
00:40:02,933 --> 00:40:06,800
ironically located in
the oldest building on campus.
902
00:40:08,366 --> 00:40:10,300
So you had me do
this strange thing on Zoom.
903
00:40:10,300 --> 00:40:12,666
Here I am.
What did you do with me?
904
00:40:12,666 --> 00:40:14,300
Yeah, well,
it's gonna teach you
905
00:40:14,300 --> 00:40:15,800
to let me record
your Zoom call, isn't it?
906
00:40:15,800 --> 00:40:17,833
I did this
with some trepidation.
907
00:40:17,833 --> 00:40:20,000
(voiceover):
I was excited to see what tricks
908
00:40:20,000 --> 00:40:21,400
were up his sleeve.
909
00:40:21,400 --> 00:40:23,033
FARID (voiceover):
I uploaded 90 seconds of audio,
910
00:40:23,033 --> 00:40:25,133
and I clicked a box saying
911
00:40:25,133 --> 00:40:27,500
"Miles has given me
permission to use his voice,"
912
00:40:27,500 --> 00:40:28,600
which I don't actually
913
00:40:28,600 --> 00:40:30,700
think you did.
(chuckles)
914
00:40:30,700 --> 00:40:32,800
Um, and, I waited about,
eh, maybe 20 seconds,
915
00:40:32,800 --> 00:40:35,600
and it said, "Okay, what would
you like for Miles to say?"
916
00:40:35,600 --> 00:40:37,300
And I started typing,
917
00:40:37,300 --> 00:40:39,566
and I generated an audio
of you saying
918
00:40:39,566 --> 00:40:40,933
whatever I wanted you to say.
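Farid used a commercial web service, but comparable few-shot voice cloning exists in open source. As one hedged example, the coqui TTS package exposes an XTTS model that clones a voice from a short reference clip; the file names below are placeholders, and the consent checkbox he mentions is exactly the safeguard such tools lean on.

```python
# pip install TTS  (coqui-ai); the model downloads on first use
from TTS.api import TTS

# Multilingual model that clones a voice from a short reference clip.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# ~90 seconds of the target speaker is ample reference audio.
tts.tts_to_file(
    text="Terminators were science fiction back then...",
    speaker_wav="miles_zoom_recording.wav",   # placeholder for the Zoom audio
    language="en",
    file_path="cloned_voice.wav",
)
```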
919
00:40:40,933 --> 00:40:43,133
We are synthesizing,
920
00:40:43,133 --> 00:40:45,500
at much, much lower
resolution.
921
00:40:45,500 --> 00:40:46,833
O'BRIEN (voiceover):
You could have knocked me over
922
00:40:46,833 --> 00:40:49,566
with a feather
when I watched this.
923
00:40:49,566 --> 00:40:50,966
A.I. O'BRIEN:
Terminators were
science fiction back then,
924
00:40:50,966 --> 00:40:54,266
but if you follow the
recent A.I. media coverage,
925
00:40:54,266 --> 00:40:57,200
you might think that Terminators
are just around the corner.
926
00:40:57,200 --> 00:40:58,966
The reality is...
927
00:40:58,966 --> 00:41:00,933
O'BRIEN (voiceover):
The eyes and the mouth
need some work,
928
00:41:00,933 --> 00:41:03,366
but it sure does
sound like me.
929
00:41:04,366 --> 00:41:07,733
And consider what happened
in May of 2023.
930
00:41:07,733 --> 00:41:10,833
Someone posted
this A.I.-generated image
931
00:41:10,833 --> 00:41:13,300
of what appeared to be
a terrorist bombing
932
00:41:13,300 --> 00:41:14,766
at the Pentagon.
933
00:41:14,766 --> 00:41:16,033
NEWS ANCHOR:
Today we may have witnessed
934
00:41:16,033 --> 00:41:18,033
one of the first drops
in the feared flood
935
00:41:18,033 --> 00:41:20,133
of A.I.-created
disinformation.
936
00:41:20,133 --> 00:41:21,733
O'BRIEN (voiceover):
It was shared on Twitter
937
00:41:21,733 --> 00:41:23,200
via what seemed to be
938
00:41:23,200 --> 00:41:26,500
a verified account
from Bloomberg News.
939
00:41:26,500 --> 00:41:28,466
NEWS ANCHOR:
It only took seconds
to spread fast.
940
00:41:28,466 --> 00:41:31,766
The Dow now down about
200 points...
941
00:41:31,766 --> 00:41:33,633
Two minutes later,
the stock market dropped
942
00:41:33,633 --> 00:41:36,000
a half a trillion dollars
943
00:41:36,000 --> 00:41:38,366
from a single fake image.
944
00:41:38,366 --> 00:41:39,933
Anybody could've made
that image,
945
00:41:39,933 --> 00:41:41,833
whether it was intentionally
manipulating the market
946
00:41:41,833 --> 00:41:43,000
or unintentionally,
947
00:41:43,000 --> 00:41:44,233
in some ways,
it doesn't really matter.
948
00:41:45,333 --> 00:41:46,800
O'BRIEN (voiceover):
So what are the technological
949
00:41:46,800 --> 00:41:50,166
innovations that make this tool
widely available?
950
00:41:51,866 --> 00:41:53,800
One technique is called
951
00:41:53,800 --> 00:41:56,033
the generative
adversarial network,
952
00:41:56,033 --> 00:41:57,266
or GAN.
953
00:41:57,266 --> 00:41:58,600
Two algorithms
954
00:41:58,600 --> 00:42:02,233
in a dizzying
student-teacher back and forth.
955
00:42:02,233 --> 00:42:05,300
Let's say it's learning how to
generate a cat.
956
00:42:05,300 --> 00:42:07,733
FARID:
And it starts by
just splatting down
957
00:42:07,733 --> 00:42:09,233
a bunch of pixels onto a canvas.
958
00:42:09,233 --> 00:42:12,266
And it sends it over to
a discriminator.
959
00:42:12,266 --> 00:42:14,266
And the discriminator
has access
960
00:42:14,266 --> 00:42:16,233
to millions and millions
of images
961
00:42:16,233 --> 00:42:17,466
of the category that
you want.
962
00:42:17,466 --> 00:42:18,933
And it says,
963
00:42:18,933 --> 00:42:20,833
"Nope, that doesn't look
like all these other things."
964
00:42:20,833 --> 00:42:24,033
So it goes back to the generator
and says, "Try again."
965
00:42:24,033 --> 00:42:25,233
Modifies some pixels,
966
00:42:25,233 --> 00:42:26,533
sends it back
to the discriminator,
967
00:42:26,533 --> 00:42:28,000
and they do this in
what's called
968
00:42:28,000 --> 00:42:29,233
an adversarial loop.
969
00:42:29,233 --> 00:42:30,633
O'BRIEN (voiceover):
And eventually,
970
00:42:30,633 --> 00:42:33,066
after many
thousands of volleys,
971
00:42:33,066 --> 00:42:36,000
the generator
finally serves up a cat.
972
00:42:36,000 --> 00:42:38,000
And the discriminator says,
973
00:42:38,000 --> 00:42:40,233
"Do more like that."
974
00:42:40,233 --> 00:42:42,433
Today, we have a whole new way
of doing these things.
975
00:42:42,433 --> 00:42:43,933
They're called diffusion-based.
976
00:42:44,966 --> 00:42:46,333
What diffusion does
977
00:42:46,333 --> 00:42:48,933
is it has vacuumed up
billions of images
978
00:42:48,933 --> 00:42:51,466
with captions
that are descriptive.
979
00:42:51,466 --> 00:42:54,133
O'BRIEN (voiceover):
It starts by making those
labeled images
980
00:42:54,133 --> 00:42:56,000
visually noisy on purpose.
981
00:42:57,833 --> 00:42:59,933
FARID:
And then it corrupts it more,
and it goes backwards
982
00:42:59,933 --> 00:43:01,466
and corrupts it more,
and goes backwards
983
00:43:01,466 --> 00:43:02,533
and corrupts it more
and goes backwards--
984
00:43:02,533 --> 00:43:04,766
and it does that
six billion times.
985
00:43:05,833 --> 00:43:07,200
O'BRIEN (voiceover):
Eventually it corrupts it
986
00:43:07,200 --> 00:43:11,933
so it's unrecognizable
from the original image.
987
00:43:11,933 --> 00:43:14,566
Now that it knows how
to turn an image into nothing,
988
00:43:14,566 --> 00:43:16,466
it can reverse the process,
989
00:43:16,466 --> 00:43:20,166
turning seemingly nothing
into a beautiful image.
990
00:43:21,200 --> 00:43:22,933
FARID:
What it's learned is how to take
991
00:43:22,933 --> 00:43:26,533
a completely nondescript image,
just pure noise,
992
00:43:26,533 --> 00:43:30,200
and go back to a coherent image,
conditioned on a text prompt.
993
00:43:30,200 --> 00:43:33,466
You're basically
reverse engineering an image
994
00:43:33,466 --> 00:43:34,866
down to the pixel.
995
00:43:34,866 --> 00:43:36,266
Yeah, exactly, yeah.
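The corrupt-and-reverse process is just as compact to state in code. A toy sketch in PyTorch: noise the data one step at a time, train a small network to predict the added noise, then run the chain backwards to pull a sample out of pure static. Real systems like the ones Farid studies do this at billions-of-images scale and condition the reversal on a text prompt; none of that is attempted here.

```python
import torch
import torch.nn as nn

T = 100                                  # number of corruption steps
betas = torch.linspace(1e-4, 0.05, T)    # noise schedule
alphas = 1.0 - betas
abar = torch.cumprod(alphas, dim=0)      # cumulative signal retention

# Tiny denoiser: given a noisy sample and the step index, predict the noise.
net = nn.Sequential(nn.Linear(16 + 1, 64), nn.ReLU(), nn.Linear(64, 16))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def data_batch(n=64):                    # stand-in "images": smooth ramps
    return torch.linspace(-1, 1, 16).repeat(n, 1) + 0.05 * torch.randn(n, 16)

for step in range(5000):                 # training: learn to undo corruption
    x0 = data_batch()
    t = torch.randint(0, T, (64,))
    eps = torch.randn_like(x0)
    xt = abar[t].sqrt()[:, None] * x0 + (1 - abar[t]).sqrt()[:, None] * eps
    pred = net(torch.cat([xt, t[:, None] / T], dim=1))
    loss = ((pred - eps) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Sampling: start from pure static and reverse the chain step by step.
with torch.no_grad():
    x = torch.randn(1, 16)
    for t in reversed(range(T)):
        eps_hat = net(torch.cat([x, torch.tensor([[t / T]])], dim=1))
        x = (x - betas[t] / (1 - abar[t]).sqrt() * eps_hat) / alphas[t].sqrt()
        if t > 0:
            x = x + betas[t].sqrt() * torch.randn_like(x)
```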
996
00:43:36,266 --> 00:43:38,333
And it's-- and by the way--
if you had asked me,
997
00:43:38,333 --> 00:43:39,866
"Will this work?"
I would have said,
998
00:43:39,866 --> 00:43:41,333
"No, there's no way
this system works."
999
00:43:41,333 --> 00:43:43,700
It just, it just doesn't
seem like it should work.
1000
00:43:43,700 --> 00:43:45,500
And that's sort of the magic
1001
00:43:45,500 --> 00:43:47,133
of when you get this much data
1002
00:43:47,133 --> 00:43:49,466
and very powerful algorithms
and very powerful computing
1003
00:43:49,466 --> 00:43:52,600
to be able to crunch
these massive data sets.
1004
00:43:52,600 --> 00:43:54,466
I mean, we're not
going to contain it.
1005
00:43:54,466 --> 00:43:55,566
That's done.
1006
00:43:55,566 --> 00:43:56,600
(voiceover):
I sat down with Hany
1007
00:43:56,600 --> 00:43:57,766
and two of his grad students:
1008
00:43:57,766 --> 00:44:01,566
Justin Norman
and Sarah Barrington.
1009
00:44:01,566 --> 00:44:04,066
We looked at some of
the A.I. trickery
1010
00:44:04,066 --> 00:44:05,700
they have seen and made.
1011
00:44:05,700 --> 00:44:08,100
Somebody else
wrote some base code
1012
00:44:08,100 --> 00:44:09,533
and it got grown on to
1013
00:44:09,533 --> 00:44:11,300
and grown on to and
grown on to and eventually...
1014
00:44:11,300 --> 00:44:12,733
O'BRIEN (voiceover):
In a world where anything
1015
00:44:12,733 --> 00:44:14,800
can be manipulated
with such ease
1016
00:44:14,800 --> 00:44:16,033
and seeming authenticity,
1017
00:44:16,033 --> 00:44:19,633
how are we to know
what's real anymore?
1018
00:44:19,633 --> 00:44:20,800
How you look at the world,
1019
00:44:20,800 --> 00:44:22,200
how you interact with
people in it,
1020
00:44:22,200 --> 00:44:24,033
and where you look for
your threats... all of that changes.
1021
00:44:24,033 --> 00:44:28,366
O'BRIEN (voiceover):
Generative A.I. is now
part of a larger ecosystem
1022
00:44:28,366 --> 00:44:31,666
that is built on mistrust.
1023
00:44:31,666 --> 00:44:32,900
We're going to live
in a world where
1024
00:44:32,900 --> 00:44:34,433
we don't know what's real.
1025
00:44:34,433 --> 00:44:35,633
FARID (voiceover):
There is distrust of
governments,
1026
00:44:35,633 --> 00:44:37,266
there is distrust of media,
1027
00:44:37,266 --> 00:44:38,633
there is distrust
of academics.
1028
00:44:38,633 --> 00:44:41,466
And now throw on top of that
video evidence.
1029
00:44:41,466 --> 00:44:43,133
So-called video evidence.
1030
00:44:43,133 --> 00:44:44,900
I think this is
the very definition
1031
00:44:44,900 --> 00:44:47,100
of throwing jet fuel onto
a dumpster fire.
1032
00:44:47,100 --> 00:44:48,866
And it's already happening,
1033
00:44:48,866 --> 00:44:50,400
and I imagine
we will see more of it.
1034
00:44:50,400 --> 00:44:52,033
(Arnold's voice):
Come with me
if you want to live.
1035
00:44:52,033 --> 00:44:53,700
O'BRIEN (voiceover):
But it also can be
1036
00:44:53,700 --> 00:44:54,700
kind of fun.
1037
00:44:54,700 --> 00:44:55,866
As Hany promised,
1038
00:44:55,866 --> 00:44:58,000
here's my face
1039
00:44:58,000 --> 00:44:59,933
on the Terminator's body.
1040
00:44:59,933 --> 00:45:01,366
(gunfire blasting)
1041
00:45:01,366 --> 00:45:03,833
Long before A.I. might take
1042
00:45:03,833 --> 00:45:06,166
an existential turn
against humanity,
1043
00:45:06,166 --> 00:45:08,966
we will need to
reckon with the likes...
1044
00:45:08,966 --> 00:45:11,533
Go! Now!
O'BRIEN (voiceover):
Of the Milesinator.
1045
00:45:11,533 --> 00:45:13,566
TRAILER NARRATOR:
This time, he's back.
1046
00:45:13,566 --> 00:45:15,200
(booming)
1047
00:45:15,200 --> 00:45:16,700
O'BRIEN (voiceover):
Who will no doubt, be back.
1048
00:45:16,700 --> 00:45:18,266
Trust me.
1049
00:45:19,633 --> 00:45:20,800
O'BRIEN (voiceover):
Trust,
1050
00:45:20,800 --> 00:45:23,100
but always verify.
1051
00:45:23,100 --> 00:45:26,366
So, what kind of A.I. magic
1052
00:45:26,366 --> 00:45:28,733
is readily available online?
1053
00:45:28,733 --> 00:45:30,600
It's pretty simple
to make it look
1054
00:45:30,600 --> 00:45:33,066
like you're fluent
in another language.
1055
00:45:33,066 --> 00:45:35,466
(speaking Mandarin):
1056
00:45:36,800 --> 00:45:37,966
It was pretty easy to do,
1057
00:45:37,966 --> 00:45:40,400
I just had to upload
a video and wait.
1058
00:45:40,400 --> 00:45:43,333
(speaking German):
1059
00:45:44,766 --> 00:45:47,533
And, suddenly,
I look pretty darn smart.
1060
00:45:47,533 --> 00:45:50,766
(speaking Greek):
1061
00:45:51,733 --> 00:45:53,900
Sure, it's fun,
but I think you can see
1062
00:45:53,900 --> 00:45:55,600
where it leads to mischief
1063
00:45:55,600 --> 00:45:57,966
and possibly even mayhem.
1064
00:45:58,966 --> 00:46:03,466
(voiceover):
Yoshua Bengio is an
artificial intelligence pioneer.
1065
00:46:03,466 --> 00:46:05,166
He says he didn't spend
much time
1066
00:46:05,166 --> 00:46:07,633
thinking about
science fiction dystopia
1067
00:46:07,633 --> 00:46:10,566
as he was creating
the technology.
1068
00:46:10,566 --> 00:46:13,500
But as his brilliant ideas
became reality,
1069
00:46:13,500 --> 00:46:15,733
reality set in.
1070
00:46:15,733 --> 00:46:17,100
BENGIO:
And the more I read,
1071
00:46:17,100 --> 00:46:19,000
the more
I thought about it...
1072
00:46:19,000 --> 00:46:21,200
the more concerned I got.
1073
00:46:22,200 --> 00:46:25,233
If we are not honest
with ourselves,
1074
00:46:25,233 --> 00:46:26,233
we're gonna fool ourselves.
1075
00:46:26,233 --> 00:46:28,233
We're gonna...
lose.
1076
00:46:29,466 --> 00:46:30,866
O'BRIEN (voiceover):
Avoiding that outcome
1077
00:46:30,866 --> 00:46:33,366
is now his main priority.
1078
00:46:33,366 --> 00:46:35,433
He has signed
several public warnings
1079
00:46:35,433 --> 00:46:37,600
issued by A.I. thought leaders,
1080
00:46:37,600 --> 00:46:40,966
including this stark
single-sentence statement
1081
00:46:40,966 --> 00:46:43,066
in May of 2023.
1082
00:46:43,066 --> 00:46:45,933
"Mitigating the risk of
extinction from A.I.
1083
00:46:45,933 --> 00:46:47,900
"should be a global priority
1084
00:46:47,900 --> 00:46:50,633
"alongside other
societal scale risks,
1085
00:46:50,633 --> 00:46:52,166
"such as pandemics
1086
00:46:52,166 --> 00:46:53,600
and nuclear war."
1087
00:46:56,666 --> 00:47:00,500
As we approach more and more
capable A.I. systems
1088
00:47:00,500 --> 00:47:04,833
that might even become stronger
than humans in many areas,
1089
00:47:04,833 --> 00:47:06,366
they become
more and more dangerous.
1090
00:47:06,366 --> 00:47:07,766
Can't we just pull
the plug on the thing?
1091
00:47:07,766 --> 00:47:09,366
Oh, that's
the safest thing to do,
1092
00:47:09,366 --> 00:47:10,633
pull the plug.
1093
00:47:10,633 --> 00:47:13,066
Before it gets
so powerful that
1094
00:47:13,066 --> 00:47:14,700
it prevents us from
pulling the plug.
1095
00:47:14,700 --> 00:47:16,633
DAVE:
Open the pod bay doors, Hal.
1096
00:47:16,633 --> 00:47:18,400
HAL:
I'm sorry, Dave,
1097
00:47:18,400 --> 00:47:20,400
I'm afraid I can't do that.
1098
00:47:21,566 --> 00:47:23,366
O'BRIEN (voiceover):
It may be some time
1099
00:47:23,366 --> 00:47:24,933
before computers are able
1100
00:47:24,933 --> 00:47:27,300
to act like
movie supervillains...
1101
00:47:27,300 --> 00:47:28,333
HAL:
Goodbye.
1102
00:47:29,766 --> 00:47:33,133
O'BRIEN (voiceover):
But there are near-term dangers
already emerging.
1103
00:47:33,133 --> 00:47:36,366
Besides deepfakes and
misinformation,
1104
00:47:36,366 --> 00:47:40,300
A.I. can also supercharge bias
and hate content,
1105
00:47:40,300 --> 00:47:43,133
replace human jobs...
1106
00:47:43,133 --> 00:47:44,766
This is why
we're striking, everybody.
(crowd exclaiming)
1107
00:47:45,900 --> 00:47:47,100
O'BRIEN (voiceover):
And make it easier
1108
00:47:47,100 --> 00:47:50,500
for terrorists
to create bioweapons.
1109
00:47:50,500 --> 00:47:53,300
And A.I. systems are so complex
1110
00:47:53,300 --> 00:47:56,000
that they are difficult
to comprehend,
1111
00:47:56,000 --> 00:47:58,600
all but impossible to audit.
1112
00:47:58,600 --> 00:48:00,466
RUS (voiceover):
Nobody really understands
1113
00:48:00,466 --> 00:48:03,133
how those systems
reach their decisions.
1114
00:48:03,133 --> 00:48:05,300
So we have to be
much more thoughtful
1115
00:48:05,300 --> 00:48:07,700
about how we
test and evaluate them
1116
00:48:07,700 --> 00:48:09,100
before releasing them.
1117
00:48:09,100 --> 00:48:12,200
They're concerned
whether the machine will be able
1118
00:48:12,200 --> 00:48:14,400
to begin to
think for itself.
1119
00:48:14,400 --> 00:48:17,133
O'BRIEN (voiceover):
The U.S. and Europe have begun
charting a strategy
1120
00:48:17,133 --> 00:48:18,866
to try to ensure safe, secure,
1121
00:48:18,866 --> 00:48:22,300
and trustworthy
artificial intelligence.
1122
00:48:22,300 --> 00:48:24,766
RISHI SUNAK:
...in a way that will
be safe for our communities...
1123
00:48:24,766 --> 00:48:26,233
O'BRIEN (voiceover):
But how to do that
1124
00:48:26,233 --> 00:48:28,433
in the midst of a frenetic race
1125
00:48:28,433 --> 00:48:29,433
to dominate a technology
1126
00:48:29,433 --> 00:48:33,466
with a predicted economic impact
1127
00:48:33,466 --> 00:48:37,200
of 13 trillion dollars by 2030?
1128
00:48:37,200 --> 00:48:40,566
There is such a strong
commercial incentive
1129
00:48:40,566 --> 00:48:43,066
to develop this
and win the competition
1130
00:48:43,066 --> 00:48:44,333
against the other companies,
1131
00:48:44,333 --> 00:48:46,700
not to mention
the other countries,
1132
00:48:46,700 --> 00:48:49,500
that it's hard
to stop that train.
1133
00:48:50,600 --> 00:48:54,000
But that's what
governments should be doing.
1134
00:48:54,000 --> 00:48:56,600
NEWS ANCHOR:
The titans of social media
1135
00:48:56,600 --> 00:48:59,366
didn't want to come to
Capitol Hill.
1136
00:48:59,366 --> 00:49:00,933
O'BRIEN (voiceover):
Historically, the tech industry
1137
00:49:00,933 --> 00:49:03,833
has bridled against regulation.
1138
00:49:03,833 --> 00:49:06,900
You have an army of lawyers
and lobbyists
1139
00:49:06,900 --> 00:49:08,166
that have fought us on this...
1140
00:49:08,166 --> 00:49:09,266
SULEYMAN (voiceover):
There's no question that
1141
00:49:09,266 --> 00:49:10,566
guardrails
will slow things down,
1142
00:49:10,566 --> 00:49:11,800
but the risks are uncertain
1143
00:49:11,800 --> 00:49:15,200
and potentially enormous.
1144
00:49:15,200 --> 00:49:16,600
So, it makes sense for us
1145
00:49:16,600 --> 00:49:18,400
to start having
the conversation right now.
1146
00:49:19,733 --> 00:49:21,300
O'BRIEN (voiceover):
For me, the conversation
1147
00:49:21,300 --> 00:49:23,966
about A.I. is personal.
1148
00:49:23,966 --> 00:49:26,766
Okay, no network detected.
1149
00:49:26,766 --> 00:49:28,033
Okay, um...
1150
00:49:28,033 --> 00:49:30,366
Oh, here we go.
Okay.
1151
00:49:30,366 --> 00:49:32,166
And now I'm going to open,
open, open, open, open...
1152
00:49:33,633 --> 00:49:35,700
(voiceover):
I used the Coapt app
1153
00:49:35,700 --> 00:49:38,900
to train the A.I.
inside my new prosthetic.
1154
00:49:38,900 --> 00:49:41,900
♪ ♪
1155
00:49:41,900 --> 00:49:43,566
It says all of my
training data is good,
1156
00:49:43,566 --> 00:49:44,866
it's four of five stars.
1157
00:49:44,866 --> 00:49:46,266
And now let's try to close.
1158
00:49:46,266 --> 00:49:47,866
(whirring)
1159
00:49:47,866 --> 00:49:49,033
All right.
1160
00:49:49,033 --> 00:49:53,900
Seems to be doing
what it was told.
1161
00:49:53,900 --> 00:49:55,400
(voiceover):
Was my new arm listening?
1162
00:49:55,400 --> 00:49:56,766
Maybe.
1163
00:49:56,766 --> 00:49:58,666
I decided to make things
simpler.
1164
00:49:59,733 --> 00:50:03,733
I took off the hand and
attached a myoelectric hook.
1165
00:50:03,733 --> 00:50:05,633
(quietly):
All right.
1166
00:50:05,633 --> 00:50:08,266
(voiceover):
Function over form.
1167
00:50:08,266 --> 00:50:10,966
Not a conversation piece
necessarily at a cocktail party
1168
00:50:10,966 --> 00:50:12,933
like this thing is.
1169
00:50:12,933 --> 00:50:15,666
This looks more like
Luke Skywalker, I suppose.
1170
00:50:15,666 --> 00:50:18,933
But this thing has a tremendous
amount of function to it.
1171
00:50:18,933 --> 00:50:21,633
Although, right now,
it wants to stay open.
1172
00:50:21,633 --> 00:50:23,600
(voiceover):
And that problem persisted.
1173
00:50:23,600 --> 00:50:25,500
Find a tripod plate...
1174
00:50:25,500 --> 00:50:26,900
(voiceover):
When I tried using it
1175
00:50:26,900 --> 00:50:28,800
to set up my basement studio
1176
00:50:28,800 --> 00:50:30,133
for a live broadcast.
1177
00:50:30,133 --> 00:50:32,800
Come on, close.
1178
00:50:32,800 --> 00:50:34,900
(voiceover):
I was quickly frustrated.
1179
00:50:34,900 --> 00:50:37,266
(item drops, audio beep)
1180
00:50:37,266 --> 00:50:38,566
Really annoying.
1181
00:50:38,566 --> 00:50:41,100
Not useful.
1182
00:50:41,100 --> 00:50:44,433
(voiceover):
The hook continuously
opened on its own.
1183
00:50:44,433 --> 00:50:46,133
(clattering)
Damn it!
1184
00:50:46,133 --> 00:50:48,600
(voiceover):
So I completely reset
1185
00:50:48,600 --> 00:50:50,833
and retrained the arm.
1186
00:50:51,800 --> 00:50:53,800
And... reset,
there we go.
1187
00:50:53,800 --> 00:50:56,600
Add data...
1188
00:50:56,600 --> 00:50:59,300
(voiceover):
But the software was
1189
00:50:59,300 --> 00:51:00,766
artificially unhappy.
1190
00:51:02,700 --> 00:51:04,800
"Electrodes are not
making good skin contact."
1191
00:51:04,800 --> 00:51:07,233
Maybe that is my problem,
ultimately.
1192
00:51:08,533 --> 00:51:10,200
(voiceover):
My problem really is
1193
00:51:10,200 --> 00:51:12,466
I haven't given this
enough time.
1194
00:51:12,466 --> 00:51:14,733
Amputees tell me it can take
1195
00:51:14,733 --> 00:51:16,433
many months to really learn
1196
00:51:16,433 --> 00:51:18,400
how to use an arm like
this one.
1197
00:51:19,400 --> 00:51:22,133
The choke point isn't
artificial intelligence.
1198
00:51:22,133 --> 00:51:24,766
Dead as a doornail.
1199
00:51:24,766 --> 00:51:26,500
(voiceover):
But rather, what is the best way
1200
00:51:26,500 --> 00:51:28,400
to communicate
my intentions to it?
1201
00:51:29,766 --> 00:51:31,266
Little reboot there,
I guess.
1202
00:51:31,266 --> 00:51:33,033
All right.
1203
00:51:33,033 --> 00:51:34,166
Close.
1204
00:51:34,166 --> 00:51:36,700
Open, close.
1205
00:51:36,700 --> 00:51:39,066
(voiceover):
It turns out machine learning
1206
00:51:39,066 --> 00:51:42,100
isn't smart enough to
give me a replacement arm
1207
00:51:42,100 --> 00:51:44,066
like Luke Skywalker got.
1208
00:51:44,066 --> 00:51:47,900
Nor is it capable
of creating the Terminator.
1209
00:51:47,900 --> 00:51:51,833
Right now, it seems many
hopes and fears
1210
00:51:51,833 --> 00:51:52,900
for artificial intelligence...
1211
00:51:52,900 --> 00:51:54,366
Oh!
1212
00:51:54,366 --> 00:51:57,100
(voiceover):
...are rooted
in science fiction.
1213
00:51:59,066 --> 00:52:03,166
But we are walking down a road
to the unknown.
1214
00:52:03,166 --> 00:52:06,066
The door is opening to
a revolution.
1215
00:52:07,566 --> 00:52:08,600
(door closes)
1216
00:52:08,600 --> 00:52:12,633
♪ ♪
1217
00:52:31,766 --> 00:52:39,300
♪ ♪
1218
00:52:43,133 --> 00:52:50,666
♪ ♪
1219
00:52:52,300 --> 00:52:59,833
♪ ♪
1220
00:53:01,533 --> 00:53:09,066
♪ ♪
1221
00:53:14,800 --> 00:53:21,966
♪ ♪