2
00:00:07,941 --> 00:00:10,118
[RUMINATIVE MUSIC PLAYING]
3
00:00:29,572 --> 00:00:30,660
[DEVICE CHIMES]
4
00:00:30,703 --> 00:00:31,835
JULIETTE LOVE: Hi, Alpha.
5
00:00:32,705 --> 00:00:34,055
ALPHA: Hello.
6
00:00:34,098 --> 00:00:35,317
LOVE: Can you
help me write code?
7
00:00:37,580 --> 00:00:39,103
ALPHA: I was trained
to answer questions,
8
00:00:40,583 --> 00:00:42,237
but I'm able to learn.
9
00:00:43,716 --> 00:00:45,979
LOVE: That's very
open-minded of you.
10
00:00:46,023 --> 00:00:48,678
ALPHA: Thank you.
I'm glad you're happy with me.
11
00:00:50,332 --> 00:00:51,333
What's this guy doing?
12
00:00:54,553 --> 00:00:56,642
ALPHA: That's a developer.
13
00:00:56,686 --> 00:00:58,035
What do you think
he's working on?
14
00:00:59,558 --> 00:01:01,038
ALPHA:
That's a tough question.
15
00:01:01,082 --> 00:01:02,909
He might be working
on a new feature,
16
00:01:02,953 --> 00:01:05,260
a bug fix or something else.
17
00:01:05,303 --> 00:01:06,348
It's quite possible.
18
00:01:06,391 --> 00:01:07,349
ALPHA: Yes.
19
00:01:10,569 --> 00:01:11,875
LOVE: Do you see my backpack?
20
00:01:14,007 --> 00:01:15,313
ALPHA:
That's a badminton racket.
21
00:01:15,357 --> 00:01:17,924
It's a squash racket,
but that's pretty close.
22
00:01:20,144 --> 00:01:21,711
ALPHA:
That's a badminton racket.
23
00:01:21,754 --> 00:01:23,147
No, but you're not
the first person
24
00:01:23,191 --> 00:01:24,670
to make that mistake.
25
00:01:24,714 --> 00:01:26,846
[UPBEAT MUSIC PLAYING]
26
00:01:34,724 --> 00:01:35,986
NEWSREADER 1:
AI, the technology
27
00:01:36,029 --> 00:01:38,423
that has been advancing
at breakneck speed.
28
00:01:38,467 --> 00:01:40,817
NEWSREADER 2: Artificial
intelligence is all the rage.
29
00:01:40,860 --> 00:01:42,340
NEWSREADER 3: Some are now
raising alarm about...
30
00:01:42,384 --> 00:01:43,907
NEWSREADER 4:
It is definitely concerning.
31
00:01:43,950 --> 00:01:45,343
NEWSREADER 5:
This is an AI arms race.
32
00:01:45,387 --> 00:01:46,475
NEWSREADER 6: We don't know
33
00:01:46,518 --> 00:01:47,606
how this is all
going to shake out,
34
00:01:47,650 --> 00:01:48,912
but it's clear
something is happening.
35
00:01:53,830 --> 00:01:55,136
DEMIS HASSABIS:
I'm kind of restless.
36
00:01:57,094 --> 00:02:00,141
Trying to build AGI
is the most exciting journey,
37
00:02:00,184 --> 00:02:02,404
in my opinion, that humans
have ever embarked on.
38
00:02:04,275 --> 00:02:06,321
If you're really going
to take that seriously,
39
00:02:06,364 --> 00:02:08,366
there isn't a lot of time.
40
00:02:08,410 --> 00:02:09,846
Life's very short.
41
00:02:12,544 --> 00:02:14,503
My whole life goal is to solve
42
00:02:14,546 --> 00:02:16,896
artificial
general intelligence.
43
00:02:16,940 --> 00:02:20,378
And on the way,
use AI as the ultimate tool
44
00:02:20,422 --> 00:02:21,771
to solve all the world's
45
00:02:21,814 --> 00:02:23,251
most complex
scientific problems.
46
00:02:25,470 --> 00:02:27,124
I think that's bigger
than the Internet.
47
00:02:27,168 --> 00:02:28,473
I think that's bigger
than mobile.
48
00:02:29,692 --> 00:02:31,041
I think it's more like
49
00:02:31,084 --> 00:02:32,912
the advent
of electricity or fire.
50
00:02:45,708 --> 00:02:46,752
ANNOUNCER: World leaders
51
00:02:46,796 --> 00:02:48,537
and artificial
intelligence experts
52
00:02:48,580 --> 00:02:49,929
are gathering
for the first ever
53
00:02:49,973 --> 00:02:52,802
global AI safety summit,
54
00:02:52,845 --> 00:02:54,369
set to look at the risks
55
00:02:54,412 --> 00:02:56,632
of the fast-growing technology
and also...
56
00:02:56,675 --> 00:02:57,894
HASSABIS: I think
this is a hugely
57
00:02:57,937 --> 00:03:00,157
critical moment
for all humanity.
58
00:03:01,114 --> 00:03:03,639
It feels like
we're on the cusp
59
00:03:03,682 --> 00:03:05,815
of some incredible things
happening.
60
00:03:05,858 --> 00:03:07,077
NEWSREADER:
Let me take you through
61
00:03:07,120 --> 00:03:08,296
some of the reactions
in today's papers.
62
00:03:08,339 --> 00:03:10,036
HASSABIS: AGI is pretty close,
I think.
63
00:03:10,080 --> 00:03:12,865
There's clearly huge interest
in what it is capable of,
64
00:03:12,909 --> 00:03:14,824
where it's taking us.
65
00:03:14,867 --> 00:03:15,999
HASSABIS: This is the moment
66
00:03:16,042 --> 00:03:17,957
I've been living
my whole life for.
67
00:03:18,001 --> 00:03:20,177
[MID-TEMPO
ELECTRONIC MUSIC PLAYS]
68
00:03:22,223 --> 00:03:25,008
I've always been fascinated
by the mind.
69
00:03:25,051 --> 00:03:28,011
So I set my heart
on studying neuroscience
70
00:03:28,054 --> 00:03:29,969
because I wanted
to get inspiration
71
00:03:30,013 --> 00:03:31,667
from the brain for AI.
72
00:03:31,710 --> 00:03:33,321
ELEANOR MAGUIRE:
I remember asking Demis,
73
00:03:33,364 --> 00:03:34,670
"What's the end game?"
74
00:03:34,713 --> 00:03:36,193
You know?
So you're going to come here
75
00:03:36,237 --> 00:03:38,239
and you're going
to study neuroscience
76
00:03:38,282 --> 00:03:41,459
and you're going to maybe
get a Ph.D. if you work hard.
77
00:03:42,895 --> 00:03:44,157
And he said,
78
00:03:44,201 --> 00:03:46,986
"You know, I want
to be able to solve AI.
79
00:03:47,030 --> 00:03:49,380
"I want to be able
to solve intelligence."
80
00:03:49,424 --> 00:03:51,382
HASSABIS: The human brain
is the only existent proof
81
00:03:51,426 --> 00:03:53,515
we have, perhaps
in the entire universe,
82
00:03:53,558 --> 00:03:56,300
that general intelligence
is possible at all.
83
00:03:56,344 --> 00:03:59,303
And I thought
someone in this building
84
00:03:59,347 --> 00:04:00,391
should be interested
85
00:04:00,435 --> 00:04:02,611
in general intelligence
like I am.
86
00:04:02,654 --> 00:04:04,830
And then Shane's name
popped up.
87
00:04:04,874 --> 00:04:07,355
HOST: Our next speaker today
is Shane Legg.
88
00:04:07,398 --> 00:04:08,617
He's from New Zealand,
89
00:04:08,660 --> 00:04:11,533
where he trained in math
and classical ballet.
90
00:04:11,576 --> 00:04:14,100
Are machines actually
becoming more intelligent?
91
00:04:14,144 --> 00:04:16,538
Some people say yes,
some people say no.
92
00:04:16,581 --> 00:04:17,669
It's not really clear.
93
00:04:17,713 --> 00:04:18,888
We know they're getting
a lot faster
94
00:04:18,931 --> 00:04:20,237
at doing computations.
95
00:04:20,281 --> 00:04:21,630
But are we actually
going forwards
96
00:04:21,673 --> 00:04:23,849
in terms
of general intelligence?
97
00:04:23,893 --> 00:04:25,721
HASSABIS: We were both
obsessed with AGI,
98
00:04:25,764 --> 00:04:27,418
artificial
general intelligence.
99
00:04:27,462 --> 00:04:29,725
So today I'm going
to be talking about
100
00:04:29,768 --> 00:04:32,118
different approaches
to building AGI.
101
00:04:32,162 --> 00:04:33,903
With my colleague
Demis Hassabis,
102
00:04:33,946 --> 00:04:35,905
we're looking at ways
to bring in ideas
103
00:04:35,948 --> 00:04:37,472
from theoretical neuroscience.
104
00:04:37,515 --> 00:04:41,606
I felt like we were
the keepers of a secret
105
00:04:41,650 --> 00:04:43,173
that no one else knew.
106
00:04:43,216 --> 00:04:45,436
Shane and I knew
no one in academia
107
00:04:45,480 --> 00:04:47,830
would be supportive
of what we were doing.
108
00:04:47,873 --> 00:04:50,876
AI was almost
an embarrassing word
109
00:04:50,920 --> 00:04:52,878
to use in academic circles,
right?
110
00:04:52,922 --> 00:04:54,924
If you said
you were working on AI,
111
00:04:54,967 --> 00:04:58,014
then you clearly weren't
a serious scientist.
112
00:04:58,057 --> 00:05:00,103
So I convinced Shane
the right way to do it
113
00:05:00,146 --> 00:05:01,365
would be to start a company.
114
00:05:01,409 --> 00:05:03,236
SHANE LEGG: Okay,
we're going to try to do
115
00:05:03,280 --> 00:05:04,934
artificial
general intelligence.
116
00:05:04,977 --> 00:05:06,936
It may not even be possible.
117
00:05:06,979 --> 00:05:08,503
We're not quite sure
how we're going to do it,
118
00:05:08,546 --> 00:05:11,549
but we have some ideas
or, kind of, approaches.
119
00:05:11,593 --> 00:05:14,987
Huge amounts of money,
huge amounts of risk,
120
00:05:15,031 --> 00:05:16,554
lots and lots of compute.
121
00:05:18,730 --> 00:05:20,341
And if we pull this off,
122
00:05:20,384 --> 00:05:23,561
it'll be the biggest thing
ever, right?
123
00:05:23,605 --> 00:05:26,129
That is a very hard thing
for a typical investor
124
00:05:26,172 --> 00:05:27,348
to put their money on.
125
00:05:27,391 --> 00:05:29,654
It's almost like
buying a lottery ticket.
126
00:05:29,698 --> 00:05:32,483
I'm going to be speaking about
the system of neuroscience
127
00:05:32,527 --> 00:05:36,748
and how it might be used
to help us build AGI.
128
00:05:36,792 --> 00:05:37,967
HASSABIS:
Finding initial funding
129
00:05:38,010 --> 00:05:39,229
for this was very hard.
130
00:05:39,272 --> 00:05:41,274
We're going to solve
all of intelligence.
131
00:05:41,318 --> 00:05:42,928
You can imagine
some of the looks I got
132
00:05:42,972 --> 00:05:44,800
when we were
pitching that around.
133
00:05:44,843 --> 00:05:47,933
So I'm a V.C.
and I look at about
134
00:05:47,977 --> 00:05:51,023
700 to 1,000 projects a year.
135
00:05:51,067 --> 00:05:54,810
And I fund
literally 1% of those.
136
00:05:54,853 --> 00:05:57,160
About eight projects a year.
137
00:05:57,203 --> 00:06:00,293
So that means 99% of the time,
you're in "No" mode.
138
00:06:00,337 --> 00:06:01,643
"Wait a minute.
I'm telling you,
139
00:06:01,686 --> 00:06:03,819
"this is the most important
thing of all time.
140
00:06:03,862 --> 00:06:05,124
"I'm giving you
all this build-up
141
00:06:05,168 --> 00:06:06,256
"about how... explain
142
00:06:06,299 --> 00:06:07,692
"how it connects
with the brain,
143
00:06:07,736 --> 00:06:09,433
"why the time's right now,
and then you're asking me,
144
00:06:09,477 --> 00:06:11,000
"'But what's your, how are you
going to make money?
145
00:06:11,043 --> 00:06:12,175
"'What's your product?'"
146
00:06:12,218 --> 00:06:16,005
It's like,
so prosaic a question.
147
00:06:16,527 --> 00:06:17,963
You know?
148
00:06:18,007 --> 00:06:19,138
"Have you not been listening
to what I've been saying?"
149
00:06:19,182 --> 00:06:21,097
LEGG: We needed investors
150
00:06:21,140 --> 00:06:23,491
who aren't necessarily
going to invest
151
00:06:23,534 --> 00:06:25,144
because they think
it's the best
152
00:06:25,188 --> 00:06:26,711
investment decision.
153
00:06:26,755 --> 00:06:27,886
They're probably
going to invest
154
00:06:27,930 --> 00:06:29,497
because they just think
it's really cool.
155
00:06:29,540 --> 00:06:31,499
NEWSREADER:
He's the Silicon Valley
156
00:06:31,542 --> 00:06:33,544
version of the man
behind the curtain
157
00:06:33,588 --> 00:06:34,893
in The Wizard of Oz.
158
00:06:34,937 --> 00:06:36,286
He had a lot to do
with giving you
159
00:06:36,329 --> 00:06:39,028
PayPal, Facebook,
YouTube and Yelp.
160
00:06:39,071 --> 00:06:40,464
LEGG: If everyone says "X,"
161
00:06:40,508 --> 00:06:43,206
Peter Thiel suspects
that the opposite of X
162
00:06:43,249 --> 00:06:44,729
is quite possibly true.
163
00:06:44,773 --> 00:06:47,297
HASSABIS: So Peter Thiel
was our first big investor.
164
00:06:47,340 --> 00:06:50,126
But he insisted that
we come to Silicon Valley
165
00:06:50,169 --> 00:06:51,823
because that
was the only place we could...
166
00:06:51,867 --> 00:06:52,998
There would be the talent,
167
00:06:53,042 --> 00:06:55,087
and we could build
that kind of company.
168
00:06:55,131 --> 00:06:57,002
But I was pretty adamant
we should be in London
169
00:06:57,046 --> 00:06:59,048
because I think
London's an amazing city.
170
00:06:59,091 --> 00:07:01,485
Plus, I knew there were
really amazing people
171
00:07:01,529 --> 00:07:03,835
trained at Cambridge
and Oxford and UCL.
172
00:07:03,879 --> 00:07:05,054
In Silicon Valley,
173
00:07:05,097 --> 00:07:06,664
everybody's founding
a company every year,
174
00:07:06,708 --> 00:07:07,839
and then if it doesn't work,
175
00:07:07,883 --> 00:07:09,624
you chuck it
and you start something new.
176
00:07:09,667 --> 00:07:11,190
That is not conducive
177
00:07:11,234 --> 00:07:14,585
to a long-term
research challenge.
178
00:07:14,629 --> 00:07:17,370
So we were totally
an outlier for him.
179
00:07:17,414 --> 00:07:20,983
Hi, everyone.
Welcome to DeepMind.
180
00:07:21,026 --> 00:07:22,550
So, what is our mission?
181
00:07:23,768 --> 00:07:25,727
We summarize it as...
182
00:07:25,770 --> 00:07:28,033
DeepMind's mission is to build
the world's first
183
00:07:28,077 --> 00:07:29,600
general learning machine.
184
00:07:29,644 --> 00:07:31,733
So we always stress the word
"general" and "learning" here
185
00:07:31,776 --> 00:07:32,908
are the key things.
186
00:07:32,951 --> 00:07:35,127
LEGG: Our mission
was to build an AGI,
187
00:07:35,171 --> 00:07:36,781
an artificial
general intelligence.
188
00:07:36,825 --> 00:07:40,393
And so that means that we need
a system which is general.
189
00:07:40,437 --> 00:07:42,744
It doesn't learn to do
one specific thing.
190
00:07:42,787 --> 00:07:45,094
That's a really key part
of human intelligence.
191
00:07:45,137 --> 00:07:46,704
We can learn to do
many, many things.
192
00:07:46,748 --> 00:07:48,706
It's going to, of course,
be a lot of hard work.
193
00:07:48,750 --> 00:07:51,100
But one of the things
that keeps me up at night
194
00:07:51,143 --> 00:07:53,189
is to not waste this
opportunity to, you know,
195
00:07:53,232 --> 00:07:54,451
to really make
a difference here,
196
00:07:54,495 --> 00:07:56,845
and have a big impact
on the world.
197
00:07:56,888 --> 00:07:58,324
LEGG: The first people
that came
198
00:07:58,368 --> 00:08:00,283
and joined DeepMind
really believed in the dream.
199
00:08:00,326 --> 00:08:02,111
But this was, I think,
one of the first times
200
00:08:02,154 --> 00:08:04,461
they found a place
full of other dreamers.
201
00:08:04,505 --> 00:08:06,419
You know, we collected
this Manhattan Project,
202
00:08:06,463 --> 00:08:08,421
if you like,
together to solve AI.
203
00:08:08,465 --> 00:08:09,771
HELEN KING:
In the first two years,
204
00:08:09,814 --> 00:08:10,815
we were in total stealth mode.
205
00:08:10,859 --> 00:08:12,295
And so we couldn't
say to anyone
206
00:08:12,338 --> 00:08:14,776
what we were doing
or where we worked.
207
00:08:14,819 --> 00:08:16,125
It was all quite vague.
208
00:08:16,168 --> 00:08:17,605
BEN COPPIN: It had
no public presence at all.
209
00:08:17,648 --> 00:08:18,954
You couldn't
look at a website.
210
00:08:18,997 --> 00:08:21,130
The office
was at a secret location.
211
00:08:21,173 --> 00:08:24,002
When we would interview people
in those early days,
212
00:08:24,046 --> 00:08:26,309
they would show up
very nervously.
213
00:08:26,352 --> 00:08:27,440
[LAUGHS]
214
00:08:27,484 --> 00:08:29,660
I had at least one candidate
who said,
215
00:08:29,704 --> 00:08:32,010
"I just messaged my wife
to tell her exactly
216
00:08:32,054 --> 00:08:33,359
"where I'm going just in case
217
00:08:33,403 --> 00:08:34,535
"this turns out to be some
kind of horrible scam
218
00:08:34,578 --> 00:08:35,840
"and I'm going
to get kidnapped."
219
00:08:35,884 --> 00:08:39,714
Well, my favorite new person
who's an investor,
220
00:08:39,757 --> 00:08:43,021
who I've been working with
for a year, is Elon Musk.
221
00:08:43,065 --> 00:08:44,153
So for those of you
who don't know,
222
00:08:44,196 --> 00:08:45,371
this is what he looks like.
223
00:08:45,415 --> 00:08:47,504
And he hadn't really thought
much about AI
224
00:08:47,548 --> 00:08:49,158
until we chatted.
225
00:08:49,201 --> 00:08:51,290
His mission is to die on Mars
or something.
226
00:08:51,334 --> 00:08:52,944
- But not on impact.
- [LAUGHTER]
227
00:08:52,988 --> 00:08:54,206
So...
228
00:08:55,207 --> 00:08:57,166
We made some big decisions
229
00:08:57,209 --> 00:08:59,342
about how we were going
to approach building AI.
230
00:08:59,385 --> 00:09:01,126
This is a reinforcement
learning setup.
231
00:09:01,170 --> 00:09:02,824
This is the kind of setup
that we think about
232
00:09:02,867 --> 00:09:06,349
when we say we're building,
you know, an AI agent.
233
00:09:06,392 --> 00:09:08,525
It's basically the agent,
which is the AI,
234
00:09:08,569 --> 00:09:10,005
and then there's
the environment
235
00:09:10,048 --> 00:09:11,180
that it's interacting with.
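
The agent-environment loop described above can be sketched in a few lines of Python. This is a minimal toy illustration, not DeepMind's code; ToyEnv, its dynamics, and the random policy are invented for the example.

    import random

    class ToyEnv:
        # Stand-in environment: the state is a number the agent tries to drive to 10.
        def __init__(self):
            self.state = 0

        def step(self, action):
            # action is -1 or +1; reaching 10 ends the episode with a reward of 1
            self.state += action
            done = self.state == 10
            reward = 1.0 if done else 0.0
            return self.state, reward, done

    env = ToyEnv()
    total_reward = 0.0
    for t in range(10_000):  # the loop described above: observe, act, receive reward
        action = random.choice([-1, 1])  # random policy; a learner would improve this
        state, reward, done = env.step(action)
        total_reward += reward
        if done:
            break
    print("episode reward:", total_reward)
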
236
00:09:11,223 --> 00:09:12,485
We decided that games,
237
00:09:12,529 --> 00:09:14,009
as long as
you're very disciplined
238
00:09:14,052 --> 00:09:15,314
about how you use them,
239
00:09:15,358 --> 00:09:17,273
are the perfect
training ground
240
00:09:17,316 --> 00:09:19,362
for AI development.
241
00:09:19,405 --> 00:09:21,712
LEGG: We wanted
to try to create one algorithm
242
00:09:21,756 --> 00:09:23,671
that could be
trained up to play
243
00:09:23,714 --> 00:09:26,064
several dozen
different Atari games.
244
00:09:26,108 --> 00:09:27,500
So just like a human,
245
00:09:27,544 --> 00:09:29,546
you have to use the same brain
to play all the games.
246
00:09:29,590 --> 00:09:30,895
DAVID SILVER:
You can think of it
247
00:09:30,939 --> 00:09:33,028
that you provide the agent
with the cartridge.
248
00:09:33,071 --> 00:09:34,333
And you say,
249
00:09:34,377 --> 00:09:35,726
"Okay, imagine you're born
into that world
250
00:09:35,770 --> 00:09:37,554
"with that cartridge,
and you just get to interact
251
00:09:37,598 --> 00:09:39,425
"with the pixels
and see the score.
252
00:09:40,383 --> 00:09:41,689
"What can you do?"
253
00:09:43,995 --> 00:09:47,390
So what you're going to do is
take your Q function. Q-K...
254
00:09:47,433 --> 00:09:49,218
HASSABIS: Q-learning
is one of the oldest methods
255
00:09:49,261 --> 00:09:50,915
for reinforcement learning.
256
00:09:50,959 --> 00:09:53,701
And what we did was combine
reinforcement learning
257
00:09:53,744 --> 00:09:56,791
with deep learning
in one system.
258
00:09:56,834 --> 00:09:59,358
No one had ever combined
those two things together
259
00:09:59,402 --> 00:10:01,447
at scale to do
anything impressive,
260
00:10:01,491 --> 00:10:03,536
and we needed
to prove out this thesis.
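
Q-learning itself can be stated compactly. Below is a minimal tabular sketch with toy values; DQN, the combination described above, replaces this lookup table with a deep neural network that reads raw pixels. All names here are illustrative, not DeepMind's code.

    import random
    from collections import defaultdict

    alpha, gamma, epsilon = 0.1, 0.99, 0.1  # learning rate, discount, exploration rate
    actions = [-1, 1]
    Q = defaultdict(float)  # maps (state, action) to an estimated long-term value

    def choose(state):
        # epsilon-greedy: usually exploit the best-known action, occasionally explore
        if random.random() < epsilon:
            return random.choice(actions)
        return max(actions, key=lambda a: Q[(state, a)])

    def update(state, action, reward, next_state):
        # the Q-learning rule: move Q toward reward + discounted best next-state value
        best_next = max(Q[(next_state, a)] for a in actions)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

    update(0, +1, 0.0, 1)  # example transition: from state 0, action +1 led to state 1
    print(choose(0))       # pick an action for state 0 under the current estimates
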
261
00:10:03,580 --> 00:10:06,801
LEGG: We tried doing Pong
as the first game.
262
00:10:06,844 --> 00:10:08,150
It seemed like the simplest.
263
00:10:08,193 --> 00:10:10,239
It hasn't been told
264
00:10:10,282 --> 00:10:11,849
anything about
what it's controlling
265
00:10:11,893 --> 00:10:12,894
or what it's supposed to do.
266
00:10:12,937 --> 00:10:14,678
All it knows
is that score is good
267
00:10:14,722 --> 00:10:18,073
and it has to learn
what its controls do,
268
00:10:18,116 --> 00:10:20,641
and build everything...
from first principles.
269
00:10:21,250 --> 00:10:22,773
[GAME BEEPING]
270
00:10:29,214 --> 00:10:30,433
It wasn't really working.
271
00:10:32,609 --> 00:10:34,089
HASSABIS: I was just
saying to Shane,
272
00:10:34,132 --> 00:10:37,092
"Maybe we're just wrong,
and we can't even do Pong."
273
00:10:37,135 --> 00:10:38,833
LEGG: It was a bit
nerve-racking,
274
00:10:38,876 --> 00:10:40,443
thinking how far we had to go
275
00:10:40,486 --> 00:10:42,358
if we were going
to really build
276
00:10:42,401 --> 00:10:44,447
a generally
intelligent system.
277
00:10:44,490 --> 00:10:45,753
HASSABIS: And it felt like
it was time
278
00:10:45,796 --> 00:10:47,276
to give up and move on.
279
00:10:48,190 --> 00:10:49,408
And then suddenly...
280
00:10:49,452 --> 00:10:51,367
[STIRRING MUSIC PLAYS]
281
00:10:51,410 --> 00:10:53,630
We got our first point.
282
00:10:53,674 --> 00:10:56,633
And then it was like,
"Is this random?"
283
00:10:56,677 --> 00:10:59,201
"No, no, it's really
getting a point now."
284
00:10:59,244 --> 00:11:00,724
It was really exciting
that this thing
285
00:11:00,768 --> 00:11:02,247
that previously
couldn't even figure out
286
00:11:02,291 --> 00:11:03,553
how to move a paddle
287
00:11:03,596 --> 00:11:05,947
had suddenly been able
to totally get it right.
288
00:11:05,990 --> 00:11:07,165
HASSABIS: Then it was getting
a few points.
289
00:11:07,209 --> 00:11:08,645
And then it won
its first game.
290
00:11:08,689 --> 00:11:10,995
And then three months later,
no human could beat it.
291
00:11:11,039 --> 00:11:14,259
You hadn't told it the rules,
how to get the score, nothing.
292
00:11:14,303 --> 00:11:16,131
And you just tell it
to maximize the score,
293
00:11:16,174 --> 00:11:17,393
and it goes away and does it.
294
00:11:17,436 --> 00:11:18,699
This is the first time
295
00:11:18,742 --> 00:11:20,570
anyone had done
this end-to-end learning.
296
00:11:20,613 --> 00:11:24,313
"Okay, so we have this working
in quite a general way.
297
00:11:24,356 --> 00:11:25,793
"Now let's try another game."
298
00:11:25,836 --> 00:11:27,533
HASSABIS: So then
we tried Breakout.
299
00:11:27,577 --> 00:11:29,144
At the beginning,
after 100 games,
300
00:11:29,187 --> 00:11:30,798
the agent is not very good.
301
00:11:30,841 --> 00:11:32,495
It's missing the ball
most of the time,
302
00:11:32,538 --> 00:11:34,410
but it's starting to get
the hang of the idea
303
00:11:34,453 --> 00:11:36,020
that the bat should go
towards the ball.
304
00:11:36,064 --> 00:11:37,587
Now, after 300 games,
305
00:11:37,630 --> 00:11:40,677
it's about as good as
any human can play this.
306
00:11:40,721 --> 00:11:42,070
We thought,
"Well, that's pretty cool,"
307
00:11:42,113 --> 00:11:44,333
but we left the system playing
for another 200 games,
308
00:11:44,376 --> 00:11:46,074
and it did this amazing thing.
309
00:11:46,117 --> 00:11:47,379
It found the optimal strategy
310
00:11:47,423 --> 00:11:49,425
was to dig a tunnel
around the side
311
00:11:49,468 --> 00:11:51,688
and put the ball
around the back of the wall.
312
00:11:51,732 --> 00:11:53,255
KORAY KAVUKCUOGLU:
Finally, the agent
313
00:11:53,298 --> 00:11:54,386
is actually achieving
314
00:11:54,430 --> 00:11:55,648
what you thought
it would achieve.
315
00:11:55,692 --> 00:11:57,346
That is a great feeling.
Right?
316
00:11:57,389 --> 00:11:59,304
Like, I mean,
when we do research,
317
00:11:59,348 --> 00:12:00,697
that is the best
we can hope for.
318
00:12:00,741 --> 00:12:03,221
We started generalizing
to 50 games,
319
00:12:03,265 --> 00:12:05,484
and we basically
created a recipe.
320
00:12:05,528 --> 00:12:06,834
We could just take a game
321
00:12:06,877 --> 00:12:08,400
that we have
never seen before.
322
00:12:08,444 --> 00:12:09,924
We would run
the algorithm on that,
323
00:12:09,967 --> 00:12:13,057
and DQN could train itself
from scratch,
324
00:12:13,101 --> 00:12:14,450
achieving human level
325
00:12:14,493 --> 00:12:16,017
or sometimes better
than human level.
326
00:12:16,060 --> 00:12:18,454
LEGG: We didn't build it
to play any of them.
327
00:12:18,497 --> 00:12:20,543
We could just give it
a bunch of games
328
00:12:20,586 --> 00:12:22,675
and it would figure it out
for itself.
329
00:12:22,719 --> 00:12:25,200
And there was something
quite magical in that.
330
00:12:25,243 --> 00:12:26,549
MURRAY SHANAHAN:
Suddenly you had something
331
00:12:26,592 --> 00:12:27,942
that would respond and learn
332
00:12:27,985 --> 00:12:30,379
whatever situation
it was parachuted into.
333
00:12:30,422 --> 00:12:33,034
And that was like a huge,
huge breakthrough.
334
00:12:33,077 --> 00:12:35,297
It was in many respects
335
00:12:35,340 --> 00:12:36,646
the first example
336
00:12:36,689 --> 00:12:39,083
of any kind of thing
you could call
337
00:12:39,127 --> 00:12:40,563
a general intelligence.
338
00:12:42,130 --> 00:12:43,914
HASSABIS: Although we were
a well-funded startup,
339
00:12:43,958 --> 00:12:47,396
what was holding us back
was not enough compute power.
340
00:12:47,439 --> 00:12:49,224
I realized that
this would accelerate
341
00:12:49,267 --> 00:12:51,487
our time scale
to AGI massively.
342
00:12:51,530 --> 00:12:53,184
I used to see Demis
quite frequently.
343
00:12:53,228 --> 00:12:55,056
We'd have lunch, and he did...
344
00:12:56,231 --> 00:12:58,886
say to me that
he had two companies
345
00:12:58,929 --> 00:13:02,585
that were involved
in buying DeepMind.
346
00:13:02,628 --> 00:13:04,630
And he didn't know
which one to go with.
347
00:13:04,674 --> 00:13:08,547
The issue was,
would any commercial company
348
00:13:08,591 --> 00:13:12,682
appreciate the real importance
of the research?
349
00:13:12,725 --> 00:13:15,859
And give the research time
to come to fruition
350
00:13:15,903 --> 00:13:17,818
and not be breathing down
their necks,
351
00:13:17,861 --> 00:13:21,560
saying, "We want some kind of
commercial benefit from this."
352
00:13:23,432 --> 00:13:24,607
[MACHINERY HUMMING]
353
00:13:27,305 --> 00:13:32,484
Google has bought DeepMind
for a reported £400,000,000,
354
00:13:32,528 --> 00:13:34,747
making the artificial
intelligence firm
355
00:13:34,791 --> 00:13:37,968
its largest
European acquisition so far.
356
00:13:38,012 --> 00:13:39,665
The company was founded
357
00:13:39,709 --> 00:13:43,365
by 37-year-old entrepreneur
Demis Hassabis.
358
00:13:43,408 --> 00:13:45,584
After the acquisition,
I started mentoring
359
00:13:45,628 --> 00:13:47,282
and spending time with Demis,
360
00:13:47,325 --> 00:13:48,936
and just listening to him.
361
00:13:48,979 --> 00:13:52,417
And this is a person
who fundamentally
362
00:13:52,461 --> 00:13:55,594
is a scientist
and a natural scientist.
363
00:13:55,638 --> 00:13:58,510
He wants science to solve
every problem in the world,
364
00:13:58,554 --> 00:14:00,643
and he believes it can do so.
365
00:14:00,686 --> 00:14:03,646
That's not a normal person
you find in a tech company.
366
00:14:05,517 --> 00:14:07,476
HASSABIS: We were able
to not only join Google
367
00:14:07,519 --> 00:14:10,261
but run independently
in London,
368
00:14:10,305 --> 00:14:11,349
build our culture,
369
00:14:11,393 --> 00:14:13,482
which was optimized
for breakthroughs
370
00:14:13,525 --> 00:14:15,440
and not deal with products,
371
00:14:15,484 --> 00:14:17,747
just do pure research.
372
00:14:17,790 --> 00:14:19,314
Our investors
didn't want to sell,
373
00:14:19,357 --> 00:14:20,706
but we decided
374
00:14:20,750 --> 00:14:22,752
that this was the best thing
for the mission.
375
00:14:22,795 --> 00:14:24,667
In many senses,
we were underselling
376
00:14:24,710 --> 00:14:26,190
in terms of value
before it more matured,
377
00:14:26,234 --> 00:14:28,149
and you could have sold it
for a lot more money.
378
00:14:28,192 --> 00:14:32,849
And the reason is because
there's no time to waste.
379
00:14:32,893 --> 00:14:35,199
There's so many things
that got to be cracked
380
00:14:35,243 --> 00:14:37,680
while the brain
is still in gear.
381
00:14:37,723 --> 00:14:39,029
You know, I'm still alive.
382
00:14:39,073 --> 00:14:40,901
There's all these things
that gotta be done.
383
00:14:40,944 --> 00:14:42,946
So you haven't got--
I mean, how many...
384
00:14:42,990 --> 00:14:44,513
How many billions
would you trade for
385
00:14:44,556 --> 00:14:45,818
another five years of life,
you know,
386
00:14:45,862 --> 00:14:48,386
to do what you set out to do?
387
00:14:48,430 --> 00:14:49,779
Okay, all of a sudden,
388
00:14:49,822 --> 00:14:52,738
we've got this massive scale
compute available to us.
389
00:14:52,782 --> 00:14:53,957
What can we do with that?
390
00:14:56,612 --> 00:14:59,963
HASSABIS: Go is the pinnacle
of board games.
391
00:15:00,007 --> 00:15:04,663
It is the most complex game
ever devised by man.
392
00:15:04,707 --> 00:15:06,709
There are more possible
board configurations
393
00:15:06,752 --> 00:15:09,842
in the game of Go than there
are atoms in the universe.
394
00:15:09,886 --> 00:15:13,411
SILVER: Go is the holy grail
of artificial intelligence.
395
00:15:13,455 --> 00:15:14,543
For many years,
396
00:15:14,586 --> 00:15:16,023
people have looked
at this game
397
00:15:16,066 --> 00:15:17,938
and they've thought,
"Wow, this is just too hard."
398
00:15:17,981 --> 00:15:20,201
Everything we've ever
tried in AI,
399
00:15:20,244 --> 00:15:22,725
it just falls over when
you try the game of Go.
400
00:15:22,768 --> 00:15:23,987
And so that's why
it feels like
401
00:15:24,031 --> 00:15:26,120
a real litmus test
of progress.
402
00:15:26,163 --> 00:15:28,600
We had just bought DeepMind.
403
00:15:28,644 --> 00:15:30,733
They were working
on reinforcement learning
404
00:15:30,776 --> 00:15:32,996
and they were the world's
experts in games.
405
00:15:33,040 --> 00:15:34,867
And so when
they introduced the idea
406
00:15:34,911 --> 00:15:37,218
that they could beat
the top level Go players
407
00:15:37,261 --> 00:15:40,134
in a game that was thought
to be incomputable,
408
00:15:40,177 --> 00:15:42,745
I thought, "Well,
that's pretty interesting."
409
00:15:42,788 --> 00:15:46,488
Our ultimate next step
is to play the legendary
410
00:15:46,531 --> 00:15:49,230
Lee Sedol
in just over two weeks.
411
00:15:50,535 --> 00:15:52,059
NEWSREADER 1:
A match like no other
412
00:15:52,102 --> 00:15:54,278
is about to get underway
in South Korea.
413
00:15:54,322 --> 00:15:57,847
NEWSREADER 2: Lee Sedol
is getting ready to rumble.
414
00:15:57,890 --> 00:15:59,283
HASSABIS:
Lee Sedol is probably
415
00:15:59,327 --> 00:16:01,720
one of the greatest players
of the last decade.
416
00:16:01,764 --> 00:16:04,245
I describe him
as the Roger Federer of Go.
417
00:16:05,594 --> 00:16:06,943
ERIC SCHMIDT: He showed up,
418
00:16:06,987 --> 00:16:09,859
and all of a sudden
we have a thousand Koreans
419
00:16:09,902 --> 00:16:13,080
who represent
all of Korean society,
420
00:16:13,123 --> 00:16:14,211
the top Go players.
421
00:16:15,604 --> 00:16:17,823
And then we have Demis.
422
00:16:17,867 --> 00:16:19,869
And the great
engineering team.
423
00:16:20,609 --> 00:16:22,306
He's very famous
424
00:16:22,350 --> 00:16:25,527
for very creative
fighting play.
425
00:16:25,570 --> 00:16:28,791
So this could be
difficult for us.
426
00:16:28,834 --> 00:16:32,012
SCHMIDT: I figured Lee Sedol
is going to beat these guys,
427
00:16:32,055 --> 00:16:34,318
but they'll make
a good showing.
428
00:16:34,362 --> 00:16:35,754
Good for a startup.
429
00:16:38,061 --> 00:16:39,584
I went over
to the technical group
430
00:16:39,628 --> 00:16:40,933
and they said,
431
00:16:40,977 --> 00:16:42,326
"Let me show you
how our algorithm works."
432
00:16:43,675 --> 00:16:45,112
RESEARCHER: If you step
through the actual game,
433
00:16:45,155 --> 00:16:47,810
we can see, kind of,
how AlphaGo thinks.
434
00:16:47,853 --> 00:16:50,247
HASSABIS: The way we start off
on training AlphaGo
435
00:16:50,291 --> 00:16:52,989
is by showing it 100,000 games
436
00:16:53,033 --> 00:16:54,512
that strong amateurs
have played.
437
00:16:54,556 --> 00:16:55,731
And we initially
438
00:16:55,774 --> 00:16:58,908
get AlphaGo to mimic
the human player,
439
00:16:58,951 --> 00:17:00,823
and then through
reinforcement learning,
440
00:17:00,866 --> 00:17:02,390
it plays against
different versions of itself
441
00:17:02,433 --> 00:17:05,741
many millions of times
and learns from its errors.
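
Schematically, that is a two-stage training schedule: supervised imitation of the 100,000 strong-amateur games, then reinforcement learning through self-play against past versions. The sketch below is illustrative only; imitation_step and self_play_step are hypothetical stubs, and the loop sizes echo the figures quoted above rather than the real schedule.

    import random

    def imitation_step(network, human_game):
        # Stage 1 stub: nudge the policy toward the move the human actually played.
        return network

    def self_play_step(network, opponent):
        # Stage 2 stub: play a game against a past version, learn from the errors.
        return network

    network = {"weights": 0}      # placeholder for the policy network
    for game in range(100_000):  # stage 1: mimic the strong amateurs
        network = imitation_step(network, game)

    past_versions = [dict(network)]
    for _ in range(1_000):  # stage 2: many millions of games, in the real system
        opponent = random.choice(past_versions)  # different versions of itself
        network = self_play_step(network, opponent)
        past_versions.append(dict(network))
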
442
00:17:05,784 --> 00:17:07,612
Hmm, this is interesting.
443
00:17:07,656 --> 00:17:08,700
ANNOUNCER 1: All right, folks,
444
00:17:08,744 --> 00:17:10,615
you're going to see
history made.
445
00:17:10,659 --> 00:17:11,877
[ANNOUNCER 2 SPEAKING KOREAN]
446
00:17:12,791 --> 00:17:14,793
SCHMIDT: So the game starts.
447
00:17:14,837 --> 00:17:16,012
ANNOUNCER 1:
He's really concentrating.
448
00:17:16,056 --> 00:17:17,666
ANNOUNCER 3:
If you really look at the...
449
00:17:19,494 --> 00:17:20,886
[ANNOUNCERS EXCLAIM]
450
00:17:20,930 --> 00:17:25,369
That's a very surprising move.
451
00:17:25,413 --> 00:17:27,937
ANNOUNCER 3: I think we're
seeing an original move here.
452
00:17:34,422 --> 00:17:35,814
Yeah, that's an exciting move.
453
00:17:36,206 --> 00:17:37,251
I like...
454
00:17:37,294 --> 00:17:38,600
SILVER:
Professional commentators
455
00:17:38,643 --> 00:17:40,123
almost unanimously said
456
00:17:40,167 --> 00:17:43,344
that not a single human player
would have chosen move 37.
457
00:17:43,387 --> 00:17:45,824
So I actually had a poke
around in AlphaGo
458
00:17:45,868 --> 00:17:47,435
to see what AlphaGo thought.
459
00:17:47,478 --> 00:17:50,351
And AlphaGo actually agreed
with that assessment.
460
00:17:50,394 --> 00:17:53,832
AlphaGo said there was a one
in 10,000 probability
461
00:17:53,876 --> 00:17:57,793
that move 37 would have been
played by a human player.
462
00:17:57,836 --> 00:18:00,796
[SEDOL SPEAKING IN KOREAN]
463
00:18:08,847 --> 00:18:10,197
SILVER: The game of Go
has been studied
464
00:18:10,240 --> 00:18:11,589
for thousands of years.
465
00:18:11,633 --> 00:18:15,158
And AlphaGo discovered
something completely new.
466
00:18:16,899 --> 00:18:19,728
ANNOUNCER: He resigned.
Lee Sedol has just resigned.
467
00:18:19,771 --> 00:18:21,208
He's beaten.
468
00:18:21,251 --> 00:18:22,687
[ELECTRONIC MUSIC PLAYING]
469
00:18:22,731 --> 00:18:24,689
NEWSREADER 1: The battle
between man versus machine,
470
00:18:24,733 --> 00:18:26,256
a computer just came out
the victor.
471
00:18:26,300 --> 00:18:28,258
NEWSREADER 2: Google
put its DeepMind team
472
00:18:28,302 --> 00:18:29,694
to the test against
473
00:18:29,738 --> 00:18:32,044
one of the brightest minds
in the world and won.
474
00:18:32,088 --> 00:18:33,742
SCHMIDT:
That's when we realized
475
00:18:33,785 --> 00:18:35,265
the DeepMind people knew
what they were doing
476
00:18:35,309 --> 00:18:37,572
and to pay attention
to reinforcement learning
477
00:18:37,615 --> 00:18:38,921
as they had invented it.
478
00:18:40,096 --> 00:18:41,837
Based on that experience,
479
00:18:41,880 --> 00:18:44,840
AlphaGo got better
and better and better.
480
00:18:44,883 --> 00:18:45,971
And they had a little chart
481
00:18:46,015 --> 00:18:47,538
of how much better
they were getting.
482
00:18:47,582 --> 00:18:49,323
And I said,
"When does this stop?"
483
00:18:50,106 --> 00:18:51,020
And Demis said,
484
00:18:51,063 --> 00:18:52,717
"When we beat the Chinese guy,
485
00:18:52,761 --> 00:18:55,764
"the top-rated player
in the world."
486
00:18:56,982 --> 00:18:59,246
ANNOUNCER 1:
Ke Jie versus AlphaGo.
487
00:19:03,641 --> 00:19:04,816
ANNOUNCER 2:
And I think we will see
488
00:19:04,860 --> 00:19:06,166
AlphaGo pushing through there.
489
00:19:06,209 --> 00:19:08,211
ANNOUNCER 1:
AlphaGo is ahead quite a bit.
490
00:19:08,255 --> 00:19:11,475
SCHMIDT: About halfway
through the first game,
491
00:19:11,519 --> 00:19:14,696
the best player in the world
was not doing so well.
492
00:19:14,739 --> 00:19:17,829
ANNOUNCER 1:
What can black do here?
493
00:19:19,135 --> 00:19:21,268
ANNOUNCER 2: Looks difficult.
494
00:19:21,311 --> 00:19:23,357
SCHMIDT:
And at a critical moment...
495
00:19:33,018 --> 00:19:35,934
the Chinese government
ordered the feed cut off.
496
00:19:38,328 --> 00:19:41,766
It was at that moment
we were telling the world
497
00:19:41,810 --> 00:19:44,987
that something new
had arrived on earth.
498
00:19:47,642 --> 00:19:48,991
In the 1950s
499
00:19:49,034 --> 00:19:51,950
when Russia's Sputnik
satellite was launched,
500
00:19:53,300 --> 00:19:55,215
it changed
the course of history.
501
00:19:55,258 --> 00:19:57,565
TV HOST: It is a challenge
that America must meet
502
00:19:57,608 --> 00:19:59,654
to survive in the Space Age.
503
00:19:59,697 --> 00:20:02,396
SCHMIDT: This has been
called the Sputnik moment.
504
00:20:02,439 --> 00:20:06,356
The Sputnik moment created
a massive reaction in the US
505
00:20:06,400 --> 00:20:10,055
in terms of funding
for science and engineering,
506
00:20:10,099 --> 00:20:12,275
and particularly
for space technology.
507
00:20:12,319 --> 00:20:15,844
For China,
AlphaGo was the wakeup call,
508
00:20:15,887 --> 00:20:17,149
the Sputnik moment.
509
00:20:17,193 --> 00:20:19,891
It launched an AI space race.
510
00:20:21,241 --> 00:20:23,068
HASSABIS: We had this
huge idea that worked,
511
00:20:23,112 --> 00:20:26,376
and now the whole world knows.
512
00:20:26,420 --> 00:20:28,900
It's always easier
to land on the moon
513
00:20:28,944 --> 00:20:30,859
if someone's already
landed there.
514
00:20:32,077 --> 00:20:34,471
It is going to matter
who builds AI,
515
00:20:34,515 --> 00:20:36,647
and how it gets built.
516
00:20:36,691 --> 00:20:38,345
I always feel that pressure.
517
00:20:42,087 --> 00:20:43,785
SILVER: There's been
a big chain of events
518
00:20:43,828 --> 00:20:46,701
that followed on from all
of the excitement of AlphaGo.
519
00:20:46,744 --> 00:20:48,093
When we played
against Lee Sedol,
520
00:20:48,137 --> 00:20:49,269
we actually had a system
521
00:20:49,312 --> 00:20:50,705
that had been trained
on human data,
522
00:20:50,748 --> 00:20:52,272
on all of the millions
of games
523
00:20:52,315 --> 00:20:55,057
that have been played
by human experts.
524
00:20:55,100 --> 00:20:57,015
We eventually found
a new algorithm,
525
00:20:57,059 --> 00:20:59,191
a much more elegant approach
to the whole system,
526
00:20:59,235 --> 00:21:01,193
which actually stripped out
all of the human knowledge
527
00:21:01,237 --> 00:21:03,718
and just started
completely from scratch.
528
00:21:03,761 --> 00:21:06,721
And that became a project
which we called AlphaZero.
529
00:21:06,764 --> 00:21:09,550
Zero, meaning having zero
human knowledge in the loop.
530
00:21:11,900 --> 00:21:13,075
Instead of learning
from human data,
531
00:21:13,118 --> 00:21:15,817
it learned from its own games.
532
00:21:15,860 --> 00:21:17,862
So it actually
became its own teacher.
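
That self-teaching loop has a simple shape: generate games with the current network, train on those games, repeat. A hedged sketch with stub functions follows (self_play and train are hypothetical names, not DeepMind's implementation):

    import random

    def self_play(net):
        # Stub: self-play yields (position, outcome) training pairs; placeholders here.
        return [(f"position_{i}", random.choice([+1, -1])) for i in range(10)]

    def train(net, examples):
        # Stub: in the real system, a gradient step toward its own game outcomes.
        return net

    net = {"weights": 0}  # zero human knowledge in the loop
    for iteration in range(100):
        examples = self_play(net)   # the network generates its own data...
        net = train(net, examples)  # ...and becomes its own teacher
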
533
00:21:21,301 --> 00:21:23,520
HASSABIS:
AlphaZero is an experiment
534
00:21:23,564 --> 00:21:26,741
in how little knowledge
can we put into these systems
535
00:21:26,784 --> 00:21:28,264
and how quickly
and how efficiently
536
00:21:28,308 --> 00:21:29,744
can they learn?
537
00:21:29,787 --> 00:21:32,573
But the other thing is AlphaZero
doesn't have any rules.
538
00:21:32,616 --> 00:21:33,574
It learns through experience.
539
00:21:36,141 --> 00:21:38,883
The next stage
was to make it more general,
540
00:21:38,927 --> 00:21:41,016
so that it could play
any two-player game.
541
00:21:41,059 --> 00:21:42,365
Things like chess,
542
00:21:42,409 --> 00:21:44,106
and in fact,
any kind of two-player
543
00:21:44,149 --> 00:21:45,412
perfect information game.
544
00:21:45,455 --> 00:21:46,674
It's going really well.
545
00:21:46,717 --> 00:21:47,849
It's going
really, really well.
546
00:21:47,892 --> 00:21:50,112
- Oh, wow.
- It's going down, like fast.
547
00:21:50,155 --> 00:21:53,071
HASSABIS: AlphaGo used
to take a few months to train,
548
00:21:53,115 --> 00:21:55,683
but AlphaZero could start
in the morning
549
00:21:55,726 --> 00:21:57,815
playing completely randomly
550
00:21:57,859 --> 00:22:01,036
and then by tea
be at superhuman level.
551
00:22:01,079 --> 00:22:03,386
And by dinner it will be
the strongest chess entity
552
00:22:03,430 --> 00:22:04,779
there's ever been.
553
00:22:04,822 --> 00:22:06,650
- Amazing, it's amazing.
- Yeah.
554
00:22:06,694 --> 00:22:09,392
It's discovered its own
attacking style, you know,
555
00:22:09,436 --> 00:22:11,438
to take on the current
level of defense.
556
00:22:11,481 --> 00:22:12,787
I mean, I never
in my wildest dreams...
557
00:22:12,830 --> 00:22:14,963
I agree. Actually, I was not
expecting that either.
558
00:22:15,006 --> 00:22:16,443
And it's fun for me.
559
00:22:16,486 --> 00:22:18,619
I mean, it's inspired me
to get back into chess again,
560
00:22:18,662 --> 00:22:20,185
because it's cool to see
561
00:22:20,229 --> 00:22:22,405
that there's even more depth
than we thought in chess.
562
00:22:24,494 --> 00:22:25,452
[HORN BLOWS]
563
00:22:31,632 --> 00:22:34,461
HASSABIS: I actually got
into AI through games.
564
00:22:35,679 --> 00:22:37,681
Initially, it was board games.
565
00:22:37,725 --> 00:22:40,075
I was thinking,
"How is my brain doing this?"
566
00:22:40,118 --> 00:22:41,990
Like, what is it doing?
567
00:22:43,383 --> 00:22:47,169
I was very aware of that
from a very young age.
568
00:22:47,212 --> 00:22:50,085
So I've always been thinking
about thinking.
569
00:22:50,128 --> 00:22:52,783
NEWSREADER: The British
and American chess champions
570
00:22:52,827 --> 00:22:55,046
meet to begin
a series of matches.
571
00:22:55,090 --> 00:22:56,613
Playing alongside them
are the cream
572
00:22:56,657 --> 00:22:59,094
of Britain and America's
youngest players.
573
00:22:59,137 --> 00:23:01,488
NEWSREADER 2: Demis Hassabis
is representing Britain.
574
00:23:06,231 --> 00:23:07,885
COSTAS HASSABIS:
When Demis was four,
575
00:23:07,929 --> 00:23:11,280
he first showed
an aptitude for chess.
576
00:23:12,760 --> 00:23:14,065
By the time he was six,
577
00:23:14,109 --> 00:23:18,069
he became London
under-eight champion.
578
00:23:18,113 --> 00:23:19,506
HASSABIS: My parents
were very interesting
579
00:23:19,549 --> 00:23:20,855
and unusual, actually.
580
00:23:20,898 --> 00:23:23,771
I'd probably describe them
as quite bohemian.
581
00:23:23,814 --> 00:23:25,512
My father
was a singer-songwriter
582
00:23:25,555 --> 00:23:26,730
when he was younger,
583
00:23:26,774 --> 00:23:28,384
and Bob Dylan was his hero.
584
00:23:32,867 --> 00:23:34,521
[HORN HONKS]
585
00:23:34,564 --> 00:23:36,131
[ANGELA HASSABIS SPEAKING]
586
00:23:38,089 --> 00:23:39,308
Yeah, yeah.
587
00:23:41,658 --> 00:23:44,052
HOST: What is it
that you like about this game?
588
00:23:45,096 --> 00:23:47,142
It's just a good
thinking game.
589
00:23:49,274 --> 00:23:51,059
HASSABIS: At the time,
I was the second-highest rated
590
00:23:51,102 --> 00:23:52,713
chess player in the world
for my age.
591
00:23:52,756 --> 00:23:54,366
But although I was on track
592
00:23:54,410 --> 00:23:55,977
to be a professional
chess player,
593
00:23:56,020 --> 00:23:57,544
and I thought that was what
I was going to do,
594
00:23:57,587 --> 00:23:59,241
No matter how much
I loved the game,
595
00:23:59,284 --> 00:24:01,243
it was incredibly stressful.
596
00:24:01,286 --> 00:24:03,375
Definitely was not fun
and games for me.
597
00:24:03,419 --> 00:24:05,247
My parents used to, you know,
598
00:24:05,290 --> 00:24:06,944
get very upset
when I lost the game
599
00:24:06,988 --> 00:24:10,426
and angry
if I forgot something.
600
00:24:10,470 --> 00:24:12,428
And because it was quite high
stakes for them, you know,
601
00:24:12,472 --> 00:24:14,082
it cost a lot of money
to go to these tournaments.
602
00:24:14,125 --> 00:24:15,736
And my parents
didn't have much money.
603
00:24:18,434 --> 00:24:19,740
My parents thought, you know,
604
00:24:19,783 --> 00:24:22,090
"If you interested
in being a chess professional,
605
00:24:22,133 --> 00:24:25,397
"this is really important.
It's like your exams."
606
00:24:27,312 --> 00:24:30,098
I remember
I was about 12 years old
607
00:24:30,141 --> 00:24:32,100
and I was at this
international chess tournament
608
00:24:32,143 --> 00:24:34,276
in Liechtenstein
up in the mountains.
609
00:24:36,583 --> 00:24:39,324
[BELL TOLLING]
610
00:24:43,328 --> 00:24:45,592
And we were in this
huge church hall
611
00:24:47,202 --> 00:24:48,420
with, you know,
612
00:24:48,464 --> 00:24:50,205
hundreds of international
chess players.
613
00:24:52,424 --> 00:24:56,037
And I was playing
the ex-Danish champion.
614
00:24:56,080 --> 00:24:58,822
He must have been
in his 30s, probably.
615
00:25:00,432 --> 00:25:03,000
In those days,
there was a long time limit.
616
00:25:03,044 --> 00:25:05,089
The games could
literally last all day.
617
00:25:05,612 --> 00:25:08,310
[YAWNS]
618
00:25:08,353 --> 00:25:10,834
- [TIMER TICKING]
- We were into our tenth hour.
619
00:25:10,878 --> 00:25:12,401
[TIMER TICKS FRANTICALLY]
620
00:25:17,711 --> 00:25:20,235
[MOUSE GASPS]
621
00:25:20,278 --> 00:25:23,107
And we were in this
incredibly unusual ending.
622
00:25:23,151 --> 00:25:24,587
I think it should be a draw.
623
00:25:26,328 --> 00:25:28,678
But he kept on trying
to win for hours.
624
00:25:32,203 --> 00:25:33,422
[HORSE NEIGHS]
625
00:25:35,380 --> 00:25:38,340
Finally, he tried
one last cheap trick.
626
00:25:42,562 --> 00:25:44,476
All I had to do
was give away my queen.
627
00:25:44,520 --> 00:25:45,695
Then it would be stalemate.
628
00:25:47,044 --> 00:25:48,655
But I was so tired,
629
00:25:48,698 --> 00:25:50,178
I thought it was inevitable
I was going to be checkmated.
630
00:25:52,310 --> 00:25:53,529
And so I resigned.
631
00:25:57,315 --> 00:25:59,579
He jumped up.
Just started laughing.
632
00:25:59,622 --> 00:26:00,580
[LAUGHING]
633
00:26:01,798 --> 00:26:02,930
And he went,
634
00:26:02,973 --> 00:26:04,192
"Why have you resigned?
It's a draw."
635
00:26:04,235 --> 00:26:05,367
And he immediately,
with a flourish,
636
00:26:05,410 --> 00:26:06,673
sort of showed me
the drawing move.
637
00:26:09,066 --> 00:26:12,417
I felt so sick to my stomach.
638
00:26:12,461 --> 00:26:14,158
It made me think of
the rest of that tournament.
639
00:26:14,202 --> 00:26:16,683
Like, are we wasting
our minds?
640
00:26:16,726 --> 00:26:19,729
Is this the best use
of all this brain power?
641
00:26:19,773 --> 00:26:22,384
Everybody's, collectively,
in that building?
642
00:26:22,427 --> 00:26:24,081
If you could somehow plug in
643
00:26:24,125 --> 00:26:27,563
those 300 brains
into a system,
644
00:26:27,607 --> 00:26:29,043
you might be able
to solve cancer
645
00:26:29,086 --> 00:26:30,522
with that level
of brain power.
646
00:26:31,654 --> 00:26:33,525
This intuitive feeling
came over me
647
00:26:33,569 --> 00:26:35,092
that although I love chess,
648
00:26:35,136 --> 00:26:38,356
this is not the right thing
to spend my whole life on.
649
00:26:51,500 --> 00:26:53,110
LEGG: Demis and myself,
650
00:26:53,154 --> 00:26:56,113
our plan was always
to fill DeepMind
651
00:26:56,157 --> 00:26:57,245
with some of the most
652
00:26:57,288 --> 00:26:59,247
brilliant scientists
in the world.
653
00:26:59,290 --> 00:27:01,075
So we had the human brains
654
00:27:01,118 --> 00:27:04,948
necessary to create
an AGI system.
655
00:27:04,992 --> 00:27:09,300
By definition, the "G"
in AGI is about generality.
656
00:27:09,344 --> 00:27:13,130
What I imagine is being able
to talk to an agent,
657
00:27:13,174 --> 00:27:15,219
the agent can talk back,
658
00:27:15,263 --> 00:27:18,919
and the agent is able to solve
novel problems
659
00:27:18,962 --> 00:27:20,616
that it hasn't seen before.
660
00:27:20,660 --> 00:27:22,749
That's a really key part
of human intelligence,
661
00:27:22,792 --> 00:27:24,489
and it's that
cognitive breadth
662
00:27:24,533 --> 00:27:27,754
and flexibility
that's incredible.
663
00:27:27,797 --> 00:27:29,494
The only natural
general intelligence
664
00:27:29,538 --> 00:27:30,931
we know of is humans, and
665
00:27:30,974 --> 00:27:33,455
we obviously learn a lot
from our environment.
666
00:27:33,498 --> 00:27:35,892
So we think that
simulated environments
667
00:27:35,936 --> 00:27:38,895
are one of the ways
to create an AGI.
668
00:27:40,549 --> 00:27:42,377
SIMON CARTER:
The very early humans
669
00:27:42,420 --> 00:27:44,335
were having to solve
logic problems.
670
00:27:44,379 --> 00:27:46,773
They were having to solve
navigation, memory,
671
00:27:46,816 --> 00:27:48,992
and we evolved
in that environment.
672
00:27:50,385 --> 00:27:52,517
If we can create
a virtual recreation
673
00:27:52,561 --> 00:27:54,650
of that kind of environment,
674
00:27:54,694 --> 00:27:56,347
that's the perfect
testing ground
675
00:27:56,391 --> 00:27:57,479
and training ground
676
00:27:57,522 --> 00:27:59,350
for everything
we do at DeepMind.
677
00:28:04,268 --> 00:28:05,748
GUY SIMMONS:
What they were doing here
678
00:28:05,792 --> 00:28:09,273
was creating environments
for childlike beings,
679
00:28:09,317 --> 00:28:11,711
the agents, to exist
within and play.
680
00:28:12,450 --> 00:28:13,669
That just sounded like
681
00:28:13,713 --> 00:28:16,803
the most interesting thing
in all the world.
682
00:28:16,846 --> 00:28:19,153
SHANAHAN: A child
learns by tearing things up
683
00:28:19,196 --> 00:28:20,676
and then throwing food around
684
00:28:20,720 --> 00:28:23,374
and getting a response
from mommy or daddy.
685
00:28:23,418 --> 00:28:25,637
This seems like an important
idea to incorporate
686
00:28:25,681 --> 00:28:27,901
in the way you train an agent.
687
00:28:27,944 --> 00:28:30,991
RESEARCHER 1: The humanoid
is supposed to stand up.
688
00:28:31,034 --> 00:28:32,993
As his center
of gravity rises,
689
00:28:33,036 --> 00:28:34,429
it gets more points.
690
00:28:37,475 --> 00:28:38,825
You have a reward
691
00:28:38,868 --> 00:28:40,870
and the agent
learns from the reward,
692
00:28:40,914 --> 00:28:43,394
like, you do something well,
you get a positive reward.
693
00:28:43,438 --> 00:28:47,529
You do something bad,
you get a negative reward.
694
00:28:47,572 --> 00:28:49,400
RESEARCHER 2: [EXCLAIMS]
It looks like it's standing.
695
00:28:50,880 --> 00:28:52,186
It's still a bit drunk.
696
00:28:52,229 --> 00:28:53,665
RESEARCHER 1:
It likes to walk backwards.
697
00:28:53,709 --> 00:28:55,363
RESEARCHER 2: [CHUCKLES] Yeah.
698
00:28:55,406 --> 00:28:57,539
The whole algorithm
is trying to optimize
699
00:28:57,582 --> 00:28:59,715
for receiving as much reward
as possible,
700
00:28:59,759 --> 00:29:02,326
and it's found that
walking backwards,
701
00:29:02,370 --> 00:29:05,373
is good enough
to get very good scores.
702
00:29:07,767 --> 00:29:09,594
RAIA HADSELL:
When we learn to navigate,
703
00:29:09,638 --> 00:29:11,292
when we learn to get around
in our world,
704
00:29:11,335 --> 00:29:13,511
we don't start with maps.
705
00:29:13,555 --> 00:29:16,340
We just start
with our own exploration,
706
00:29:16,384 --> 00:29:18,038
adventuring off
across the park,
707
00:29:18,081 --> 00:29:21,911
without our parents
by our side,
708
00:29:21,955 --> 00:29:24,348
or finding our way home
from school when we're young.
709
00:29:24,392 --> 00:29:25,567
[FAST ELECTRONIC
MUSIC PLAYING]
710
00:29:26,960 --> 00:29:28,744
HADSELL: A few of us
came up with this idea
711
00:29:28,788 --> 00:29:32,008
that if we had an environment
where a simulated robot
712
00:29:32,052 --> 00:29:33,749
just had to run forward,
713
00:29:33,793 --> 00:29:36,317
we could put all sorts of
obstacles in its way
714
00:29:36,360 --> 00:29:38,232
and see if it could manage
to navigate
715
00:29:38,275 --> 00:29:40,495
different types of terrain.
716
00:29:40,538 --> 00:29:43,063
The idea would be like
a parkour challenge.
717
00:29:46,414 --> 00:29:48,764
It's not graceful,
718
00:29:48,808 --> 00:29:51,854
but it was never trained to hold
a glass whilst it was running
719
00:29:51,898 --> 00:29:53,073
and not spill water.
720
00:29:54,204 --> 00:29:55,727
You set this objective
that says,
721
00:29:55,771 --> 00:29:58,252
"Just move forward,
forward velocity,
722
00:29:58,295 --> 00:30:00,602
"and you'll get
a reward for that."
723
00:30:00,645 --> 00:30:02,691
And the learning algorithm
figures out
724
00:30:02,734 --> 00:30:05,433
how to move
this complex set of joints.
725
00:30:06,173 --> 00:30:07,391
That's the power of
726
00:30:07,435 --> 00:30:10,090
reward-based
reinforcement learning.
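
The objective Hadsell describes fits in one line: reward equals forward velocity. A minimal sketch (parkour_reward is a made-up name; real setups typically add small penalty terms, for instance for wasted energy):

    def parkour_reward(prev_x, new_x, dt):
        # Reward is forward velocity: distance gained divided by the control timestep.
        return (new_x - prev_x) / dt

    # A walker that moves 0.3 m forward during a 0.1 s step earns a reward of 3.0;
    # moving backwards would earn a negative reward.
    print(parkour_reward(prev_x=12.0, new_x=12.3, dt=0.1))
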
727
00:30:10,133 --> 00:30:12,788
SILVER: Our goal
is to try and build agents
728
00:30:12,832 --> 00:30:15,791
which, we drop them in,
they know nothing,
729
00:30:15,835 --> 00:30:18,402
they get to play around in
whatever problem you give them
730
00:30:18,446 --> 00:30:22,363
and eventually figure out how
to solve it for themselves.
731
00:30:22,406 --> 00:30:24,844
Now we want something
which can do that
732
00:30:24,887 --> 00:30:27,411
in as many different types
of problems as possible.
733
00:30:29,283 --> 00:30:32,939
A human needs diverse skills
to interact with the world.
734
00:30:32,982 --> 00:30:35,071
How to deal
with complex images,
735
00:30:35,115 --> 00:30:37,900
how to manipulate
thousands of things at once,
736
00:30:37,944 --> 00:30:40,468
how to deal
with missing information.
737
00:30:40,511 --> 00:30:42,035
We think all of these things
together
738
00:30:42,078 --> 00:30:45,299
are represented
by this game called StarCraft.
739
00:30:45,342 --> 00:30:47,301
All it's being trained
to do is,
740
00:30:47,344 --> 00:30:50,478
given this situation,
this screen,
741
00:30:50,521 --> 00:30:51,871
what would a human do?
742
00:30:51,914 --> 00:30:55,265
We took inspiration from
large language models
743
00:30:55,309 --> 00:30:57,659
where you simply train
a model
744
00:30:57,702 --> 00:30:59,574
to predict the next word,
745
00:31:03,752 --> 00:31:05,493
which is exactly the same as
746
00:31:05,536 --> 00:31:07,756
predict the next
StarCraft move.
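
The analogy is literal: both systems are trained on (context, next item) pairs, whether the items are words or game actions. A tiny illustrative sketch (the move names are invented):

    def training_pairs(sequence):
        # Turn one sequence into (context, next-item) examples; the same recipe
        # works whether the items are words or StarCraft actions.
        return [(sequence[:i], sequence[i]) for i in range(1, len(sequence))]

    print(training_pairs(["the", "cat", "sat", "on", "the", "mat"]))
    print(training_pairs(["build_worker", "build_supply_depot", "attack"]))
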
747
00:31:07,799 --> 00:31:08,975
SILVER: Unlike chess or Go,
748
00:31:09,018 --> 00:31:11,238
where players take turns
to make moves,
749
00:31:11,281 --> 00:31:14,154
in StarCraft there's a
continuous flow of decisions.
750
00:31:15,155 --> 00:31:16,504
On top of that,
751
00:31:16,547 --> 00:31:18,680
you can't even see
what the opponent is doing.
752
00:31:18,723 --> 00:31:20,900
There is no longer
a clear definition
753
00:31:20,943 --> 00:31:22,336
of what it means
to play the best way.
754
00:31:22,379 --> 00:31:23,946
It depends on
what your opponent does.
755
00:31:23,990 --> 00:31:25,513
HADSELL: This is the way
that we'll get to
756
00:31:25,556 --> 00:31:27,515
a much more fluid,
757
00:31:27,558 --> 00:31:31,693
more natural, faster,
more reactive agent.
758
00:31:31,736 --> 00:31:33,042
ORIOL VINYALS:
This is a huge challenge
759
00:31:33,086 --> 00:31:35,436
and let's see how far
we can push.
760
00:31:35,479 --> 00:31:36,480
TIM LILLICRAP: Oh!
761
00:31:36,524 --> 00:31:38,221
Holy monkey!
762
00:31:38,265 --> 00:31:40,397
I'm a pretty
low-level amateur.
763
00:31:40,441 --> 00:31:42,878
I'm okay, but I'm
a pretty low-level amateur.
764
00:31:42,922 --> 00:31:46,012
These agents have
a long ways to go.
765
00:31:46,055 --> 00:31:48,536
HASSABIS: We couldn't
beat someone of Tim's level.
766
00:31:48,579 --> 00:31:50,451
You know, that was
a little bit alarming.
767
00:31:50,494 --> 00:31:52,018
LILLICRAP:
At that point, it felt like
768
00:31:52,061 --> 00:31:53,541
it was going to be, like,
a really big long challenge,
769
00:31:53,584 --> 00:31:55,021
maybe a couple of years.
770
00:31:58,198 --> 00:32:01,853
VINYALS: Dani is the best
DeepMind StarCraft 2 player.
771
00:32:01,897 --> 00:32:05,118
I've been playing the agent
every day for a few weeks now.
772
00:32:07,207 --> 00:32:08,904
I could feel that the agent
773
00:32:08,948 --> 00:32:11,124
was getting better
really fast.
774
00:32:11,167 --> 00:32:13,387
[CHEERING, LAUGHTER]
775
00:32:13,430 --> 00:32:15,041
Wow, we beat Dani.
That, for me,
776
00:32:15,084 --> 00:32:16,956
was already
like a huge achievement.
777
00:32:18,305 --> 00:32:19,393
HASSABIS: The next step is
778
00:32:19,436 --> 00:32:21,743
we're going to book in
a pro to play.
779
00:32:23,092 --> 00:32:24,224
[KEYBOARD TAPPING]
780
00:32:27,053 --> 00:32:28,271
[GROANS]
781
00:32:28,315 --> 00:32:30,534
[CHEERING, WHOOPING]
782
00:32:33,015 --> 00:32:35,583
[CHEERING, WHOOPING]
783
00:32:35,626 --> 00:32:37,889
- [LAUGHS]
- [PEOPLE CLAPPING]
784
00:32:42,024 --> 00:32:44,070
It feels a bit unfair.
All you guys against me.
785
00:32:44,113 --> 00:32:45,593
[ALL LAUGH]
786
00:32:45,636 --> 00:32:46,986
HASSABIS: We're way
ahead of what I thought
787
00:32:47,029 --> 00:32:49,423
we would do, given where
we were two months ago.
788
00:32:49,466 --> 00:32:50,815
Just trying to digest it all,
actually.
789
00:32:50,859 --> 00:32:52,774
But it's very, very cool.
790
00:32:52,817 --> 00:32:54,167
SILVER: Now we're in
a position where
791
00:32:54,210 --> 00:32:56,082
we can finally share
the work that we've done
792
00:32:56,125 --> 00:32:57,213
with the public.
793
00:32:57,257 --> 00:32:58,606
This is a big step.
794
00:32:58,649 --> 00:33:00,695
We are really putting
ourselves on the line here.
795
00:33:00,738 --> 00:33:02,610
- Take it away. Cheers.
- Thank you.
796
00:33:02,653 --> 00:33:04,481
We're going to be live
from London.
797
00:33:04,525 --> 00:33:05,743
It's happening.
798
00:33:08,659 --> 00:33:10,618
ANNOUNCER 1:
Welcome to London.
799
00:33:10,661 --> 00:33:13,273
We are going to have
a live exhibition match,
800
00:33:13,316 --> 00:33:15,362
MaNa against AlphaStar.
801
00:33:15,405 --> 00:33:17,190
[CHEERING, APPLAUSE]
802
00:33:18,365 --> 00:33:20,019
At this point now,
803
00:33:20,062 --> 00:33:23,848
AlphaStar, 10 and 0
against professional gamers.
804
00:33:23,892 --> 00:33:25,894
Any thoughts
before we get into this game?
805
00:33:25,937 --> 00:33:27,504
VINYALS: I just want to see
a good game, yeah.
806
00:33:27,548 --> 00:33:28,984
I want to see a good game.
807
00:33:29,028 --> 00:33:30,507
SILVER: Absolutely,
good game. We're all excited.
808
00:33:30,551 --> 00:33:33,032
ANNOUNCER: All right. Let's
see what MaNa can pull off.
809
00:33:34,990 --> 00:33:36,426
ANNOUNCER 2:
AlphaStar is definitely
810
00:33:36,470 --> 00:33:38,385
dominating the pace
of this game.
811
00:33:38,428 --> 00:33:41,127
[SPORADIC CHEERING]
812
00:33:41,170 --> 00:33:44,173
ANNOUNCER 1: Wow. AlphaStar
is playing so smartly.
813
00:33:44,217 --> 00:33:46,828
[LAUGHTER]
814
00:33:46,871 --> 00:33:48,482
This really looks like
I'm watching
815
00:33:48,525 --> 00:33:49,961
a professional human gamer
816
00:33:50,005 --> 00:33:51,311
from the AlphaStar
point of view.
817
00:33:53,052 --> 00:33:54,879
[KEYBOARD TAPPING]
818
00:33:57,273 --> 00:34:01,973
HASSABIS: I hadn't really seen
a pro play StarCraft up close,
819
00:34:02,017 --> 00:34:03,584
and the 800 clicks per minute.
820
00:34:03,627 --> 00:34:06,021
I don't understand how anyone
can even click 800 times,
821
00:34:06,065 --> 00:34:09,503
let alone doing
800 useful clicks.
822
00:34:09,546 --> 00:34:11,113
ANNOUNCER 1:
Oh, another good hit.
823
00:34:11,157 --> 00:34:13,115
- [ALL GROAN]
- AlphaStar is just
824
00:34:13,159 --> 00:34:14,638
completely relentless.
825
00:34:14,682 --> 00:34:16,162
SILVER: We need to be careful
826
00:34:16,205 --> 00:34:19,382
because many of us grew up
as gamers and are gamers.
827
00:34:19,426 --> 00:34:21,384
And so to us,
it's very natural
828
00:34:21,428 --> 00:34:23,821
to view games
as what they are,
829
00:34:23,865 --> 00:34:26,998
which is pure vehicles
for fun,
830
00:34:27,042 --> 00:34:29,740
and not to see
that more militaristic side
831
00:34:29,784 --> 00:34:32,874
that the public might see
if they looked at this.
832
00:34:32,917 --> 00:34:37,574
You can't look at gunpowder
and only make a firecracker.
833
00:34:37,618 --> 00:34:41,230
All technologies inherently
point into certain directions.
834
00:34:43,145 --> 00:34:44,712
MARGARET LEVI:
I'm very worried about
835
00:34:44,755 --> 00:34:46,627
the certain ways in which AI
836
00:34:46,670 --> 00:34:49,673
will be used
for military purposes.
837
00:34:51,327 --> 00:34:55,070
And that makes it even clearer
how important it is
838
00:34:55,114 --> 00:34:58,378
for our societies
to be in control
839
00:34:58,421 --> 00:35:01,076
of these new technologies.
840
00:35:01,120 --> 00:35:05,211
The potential for abuse
from AI will be significant.
841
00:35:05,254 --> 00:35:08,779
Wars that occur faster
than humans can comprehend
842
00:35:08,823 --> 00:35:11,086
and more powerful
surveillance.
843
00:35:12,479 --> 00:35:15,786
How do you keep power forever
844
00:35:15,830 --> 00:35:19,442
over something that's
much more powerful than you?
845
00:35:19,486 --> 00:35:21,401
[STEPHEN HAWKING SPEAKING]
846
00:35:43,074 --> 00:35:45,773
Technologies can be used
to do terrible things.
847
00:35:47,514 --> 00:35:50,473
And technology can be used
to do wonderful things
848
00:35:50,517 --> 00:35:52,171
and solve
all kinds of problems.
849
00:35:53,607 --> 00:35:54,999
When DeepMind
was acquired by Google...
850
00:35:55,043 --> 00:35:56,610
- Yeah.
- ...you got Google to promise
851
00:35:56,653 --> 00:35:58,046
that technology you developed
won't be used by the military
852
00:35:58,089 --> 00:35:59,439
- for surveillance.
- Right.
853
00:35:59,482 --> 00:36:00,614
- Yes.
- Tell us about that.
854
00:36:00,657 --> 00:36:03,182
I think technology
is neutral in itself,
855
00:36:03,225 --> 00:36:05,619
um, but how, you know,
we as a society
856
00:36:05,662 --> 00:36:07,403
or humans and companies
and other things,
857
00:36:07,447 --> 00:36:09,536
other entities and governments
decide to use it
858
00:36:09,579 --> 00:36:12,582
is what determines whether
things become good or bad.
859
00:36:12,626 --> 00:36:16,282
You know, I personally think
having autonomous weaponry
860
00:36:16,325 --> 00:36:17,500
is just a very bad idea.
861
00:36:19,198 --> 00:36:21,287
ANNOUNCER 1:
AlphaStar is playing
862
00:36:21,330 --> 00:36:24,115
an extremely intelligent game
right now.
863
00:36:24,159 --> 00:36:27,380
CUKIER: There is an element to
what's being created
864
00:36:27,423 --> 00:36:28,903
at DeepMind in London
865
00:36:28,946 --> 00:36:34,169
that does seem like
the Manhattan Project.
866
00:36:34,213 --> 00:36:37,607
There's a relationship between
Robert Oppenheimer
867
00:36:37,651 --> 00:36:39,696
and Demis Hassabis
868
00:36:39,740 --> 00:36:44,223
in which they're unleashing
a new force upon humanity.
869
00:36:44,266 --> 00:36:46,225
ANNOUNCER 1:
MaNa is fighting back, though.
870
00:36:46,268 --> 00:36:48,183
Oh, man!
871
00:36:48,227 --> 00:36:50,229
HASSABIS:
I think that Oppenheimer
872
00:36:50,272 --> 00:36:52,492
and some of the other leaders
of that project got caught up
873
00:36:52,535 --> 00:36:54,929
in the excitement
of building the technology
874
00:36:54,972 --> 00:36:56,191
and seeing if it was possible.
875
00:36:56,235 --> 00:36:58,541
ANNOUNCER 1:
Where is AlphaStar?
876
00:36:58,585 --> 00:36:59,803
Where is AlphaStar?
877
00:36:59,847 --> 00:37:01,979
I don't see AlphaStar's units
anywhere.
878
00:37:02,023 --> 00:37:03,546
HASSABIS: They did not think
carefully enough
879
00:37:03,590 --> 00:37:07,333
about the morals of what
they were doing early enough.
880
00:37:07,376 --> 00:37:08,986
What we should do
as scientists
881
00:37:09,030 --> 00:37:11,032
with powerful new technologies
882
00:37:11,075 --> 00:37:13,904
is try and understand them in
controlled conditions first.
883
00:37:14,949 --> 00:37:16,820
ANNOUNCER 1: And that is that.
884
00:37:16,864 --> 00:37:19,432
MaNa has defeated AlphaStar.
885
00:37:29,572 --> 00:37:31,357
I mean, my honest feeling is
that I think it is
886
00:37:31,400 --> 00:37:33,228
a fair representation
of where we are.
887
00:37:33,272 --> 00:37:35,970
And I think that part feels...
feels okay.
888
00:37:36,013 --> 00:37:37,450
- I'm very happy for you.
- I'm happy.
889
00:37:37,493 --> 00:37:38,886
So well... well done.
890
00:37:38,929 --> 00:37:40,888
My view is that the approach
to building technology
891
00:37:40,931 --> 00:37:43,369
which is embodied by
"move fast and break things,"
892
00:37:43,412 --> 00:37:46,241
is exactly what
we should not be doing,
893
00:37:46,285 --> 00:37:47,982
because you can't afford
to break things
894
00:37:48,025 --> 00:37:49,070
and then fix them afterwards.
895
00:37:49,113 --> 00:37:50,419
- Cheers.
- Thank you so much.
896
00:37:50,463 --> 00:37:52,029
Yeah, get... get some rest.
You did really well.
897
00:37:52,073 --> 00:37:53,944
- Cheers, yeah?
- Thank you for having us.
898
00:38:01,648 --> 00:38:03,302
[ELECTRONIC MUSIC PLAYING]
899
00:38:04,259 --> 00:38:05,521
HASSABIS: When I was eight,
900
00:38:05,565 --> 00:38:06,870
I bought my first computer
901
00:38:06,914 --> 00:38:09,569
with the winnings
from a chess tournament.
902
00:38:09,612 --> 00:38:11,179
I sort of had this intuition
903
00:38:11,222 --> 00:38:13,747
that computers
are this magical device
904
00:38:13,790 --> 00:38:15,923
that can extend
the power of the mind.
905
00:38:15,966 --> 00:38:17,403
I had a couple
of school friends,
906
00:38:17,446 --> 00:38:19,187
and we used to have
a hacking club,
907
00:38:19,230 --> 00:38:21,929
writing code, making games.
908
00:38:26,281 --> 00:38:27,848
And then over
the summer holidays,
909
00:38:27,891 --> 00:38:29,240
I'd spend the whole day
910
00:38:29,284 --> 00:38:31,547
flicking through
games magazines.
911
00:38:31,591 --> 00:38:33,462
And one day I noticed
there was a competition
912
00:38:33,506 --> 00:38:35,899
to write an original version
of Space Invaders.
913
00:38:35,943 --> 00:38:39,642
And the winner won a job
at Bullfrog.
914
00:38:39,686 --> 00:38:42,341
Bullfrog at the time was the
best game development house
915
00:38:42,384 --> 00:38:43,777
in all of Europe.
916
00:38:43,820 --> 00:38:45,300
You know, I really wanted
to work at this place
917
00:38:45,344 --> 00:38:48,608
and see how they build games.
918
00:38:48,651 --> 00:38:50,436
NEWSCASTER: Bullfrog,
based here in Guildford,
919
00:38:50,479 --> 00:38:52,307
began with a big idea.
920
00:38:52,351 --> 00:38:54,701
That idea turned into the game
Populous,
921
00:38:54,744 --> 00:38:56,572
which became
a global bestseller.
922
00:38:56,616 --> 00:38:59,880
In the '90s, there were
no recruitment agencies.
923
00:38:59,923 --> 00:39:02,143
You couldn't go out and say,
you know,
924
00:39:02,186 --> 00:39:04,972
"Come and work
in the games industry."
925
00:39:05,015 --> 00:39:08,192
It was still not even
considered an industry.
926
00:39:08,236 --> 00:39:11,239
So we came up with the idea
to have a competition
927
00:39:11,282 --> 00:39:13,676
and we got
a lot of applicants.
928
00:39:14,721 --> 00:39:17,637
And one of those was Demis's.
929
00:39:17,680 --> 00:39:20,727
I can still remember clearly
930
00:39:20,770 --> 00:39:23,991
the day that Demis came in.
931
00:39:24,034 --> 00:39:27,168
He walked in the door,
he looked about 12.
932
00:39:28,691 --> 00:39:30,040
I thought, "Oh, my God,
933
00:39:30,084 --> 00:39:31,607
"what the hell are we going
to do with this guy?"
934
00:39:31,651 --> 00:39:33,000
I applied to Cambridge.
935
00:39:33,043 --> 00:39:35,394
I got in but they said
I was way too young.
936
00:39:35,437 --> 00:39:37,874
So...
So I needed to take a year off
937
00:39:37,918 --> 00:39:39,920
so I'd be at least 17
before I got there.
938
00:39:39,963 --> 00:39:42,792
And that's when I decided
to spend that entire gap year
939
00:39:42,836 --> 00:39:44,490
working at Bullfrog.
940
00:39:44,533 --> 00:39:46,187
They couldn't even
legally employ me,
941
00:39:46,230 --> 00:39:48,319
so I ended up being paid
in brown paper envelopes.
942
00:39:48,363 --> 00:39:49,364
[CHUCKLES]
943
00:39:50,844 --> 00:39:54,325
I got a feeling of being
really at the cutting edge
944
00:39:54,369 --> 00:39:58,068
and how much fun that was
to invent things every day.
945
00:39:58,112 --> 00:40:00,593
And then you know,
a few months later,
946
00:40:00,636 --> 00:40:03,683
maybe everyone... a million
people will be playing it.
947
00:40:03,726 --> 00:40:06,686
MOLYNEUX: In those days
computer games had to evolve.
948
00:40:06,729 --> 00:40:08,557
There had to be new genres
949
00:40:08,601 --> 00:40:11,778
which were more
than just shooting things.
950
00:40:11,821 --> 00:40:14,084
Wouldn't it be amazing
to have a game
951
00:40:14,128 --> 00:40:18,828
where you design and build
your own theme park?
952
00:40:18,872 --> 00:40:21,048
[GAME CHARACTERS SCREAMING]
953
00:40:22,615 --> 00:40:25,966
Demis and I started to talk
about Theme Park.
954
00:40:26,009 --> 00:40:28,925
It allows the player
to build a world
955
00:40:28,969 --> 00:40:31,885
and see the consequences
of the choices
956
00:40:31,928 --> 00:40:34,148
that you've made
in that world.
957
00:40:34,191 --> 00:40:36,106
HASSABIS: A human player
set out the layout
958
00:40:36,150 --> 00:40:38,587
of the theme park and designed
the roller coaster
959
00:40:38,631 --> 00:40:41,372
and set the prices
in the chip shop.
960
00:40:41,416 --> 00:40:43,636
What I was working on was
the behaviors of the people.
961
00:40:43,679 --> 00:40:45,159
They were autonomous
962
00:40:45,202 --> 00:40:47,335
and that was the AI
in this case.
963
00:40:47,378 --> 00:40:48,902
So what I was trying to do
was mimic
964
00:40:48,945 --> 00:40:51,034
interesting human behavior
965
00:40:51,078 --> 00:40:52,340
so that the simulation
would be
966
00:40:52,383 --> 00:40:54,603
more interesting
to interact with.
967
00:40:54,647 --> 00:40:56,562
MOLYNEUX: Demis worked
on ridiculous things,
968
00:40:56,605 --> 00:40:59,434
like you could place down
these shops
969
00:40:59,478 --> 00:41:03,612
and if you put a shop too near
a very dangerous ride,
970
00:41:03,656 --> 00:41:05,396
then people on the ride
would throw up
971
00:41:05,440 --> 00:41:08,051
because they'd just eaten.
972
00:41:08,095 --> 00:41:09,662
And then that would make
other people throw up
973
00:41:09,705 --> 00:41:12,142
when they saw the throwing-up
on the floor,
974
00:41:12,186 --> 00:41:14,580
so you then had to have
lots of sweepers
975
00:41:14,623 --> 00:41:17,844
to quickly sweep it up
before the people saw it.
976
00:41:17,887 --> 00:41:19,541
That's the cool thing
about it.
977
00:41:19,585 --> 00:41:22,805
You as the player tinker with
it and then it reacts to you.
978
00:41:22,849 --> 00:41:25,895
MOLYNEUX: All those nuanced
simulation things he did
979
00:41:25,939 --> 00:41:28,115
and that was an invention
980
00:41:28,158 --> 00:41:31,248
which never really
existed before.
981
00:41:31,292 --> 00:41:34,251
It was
unbelievably successful.
982
00:41:34,295 --> 00:41:35,731
DAVID GARDNER:
Theme Park actually turned out
983
00:41:35,775 --> 00:41:37,211
to be a top ten title
984
00:41:37,254 --> 00:41:39,953
and that was the first time
we were starting to see
985
00:41:39,996 --> 00:41:43,043
how AI could make
a difference.
986
00:41:43,086 --> 00:41:44,827
[BRASS BAND PLAYING]
987
00:41:46,176 --> 00:41:47,613
CARTER: We were doing
some Christmas shopping
988
00:41:47,656 --> 00:41:51,268
and were waiting for the taxi
to take us home.
989
00:41:51,312 --> 00:41:54,968
I have this very clear memory
of Demis talking about AI
990
00:41:55,011 --> 00:41:56,230
in a very different way,
991
00:41:56,273 --> 00:41:58,449
in a way that we didn't
commonly talk about.
992
00:41:58,493 --> 00:42:02,366
This idea of AI being useful
for things
993
00:42:02,410 --> 00:42:04,107
other than entertainment.
994
00:42:04,151 --> 00:42:07,458
So being useful for, um,
helping the world
995
00:42:07,502 --> 00:42:10,331
and the potential of AI
to change the world.
996
00:42:10,374 --> 00:42:13,247
I just said to Demis,
"What is it you want to do?"
997
00:42:13,290 --> 00:42:14,553
And he said to me,
998
00:42:14,596 --> 00:42:16,816
"I want to be the person
that solves AI."
999
00:42:22,691 --> 00:42:25,781
HASSABIS:
Peter offered me £1 million
1000
00:42:25,825 --> 00:42:27,696
to not go to university.
1001
00:42:30,220 --> 00:42:32,614
But I had a plan
from the beginning.
1002
00:42:32,658 --> 00:42:35,835
And my plan was always
to go to Cambridge.
1003
00:42:35,878 --> 00:42:36,923
I think a lot of
my schoolfriends
1004
00:42:36,966 --> 00:42:38,054
thought I was mad.
1005
00:42:38,098 --> 00:42:39,273
Why would you not...
1006
00:42:39,316 --> 00:42:40,709
I mean, £1 million,
that's a lot of money.
1007
00:42:40,753 --> 00:42:43,538
In the '90s,
that is a lot of money, right?
1008
00:42:43,582 --> 00:42:46,367
For a...
For a poor 17-year-old kid.
1009
00:42:46,410 --> 00:42:50,240
He's like this little seed
that's going to burst through,
1010
00:42:50,284 --> 00:42:53,679
and he's not going to be able
to do that at Bullfrog.
1011
00:42:56,464 --> 00:42:59,119
I had to drop him off
at the train station
1012
00:42:59,162 --> 00:43:02,601
and I can still see
that picture
1013
00:43:02,644 --> 00:43:07,040
of this little elven character
disappearing down that tunnel.
1014
00:43:07,083 --> 00:43:09,825
That was an incredibly
sad moment.
1015
00:43:13,263 --> 00:43:14,656
HASSABIS:
I had this romantic ideal
1016
00:43:14,700 --> 00:43:16,963
of what Cambridge
would be like,
1017
00:43:17,006 --> 00:43:18,660
1,000 years of history,
1018
00:43:18,704 --> 00:43:21,054
walking the same streets
that Turing,
1019
00:43:21,097 --> 00:43:23,622
Newton and Crick had walked.
1020
00:43:23,665 --> 00:43:26,668
I wanted to explore
the edge of the universe.
1021
00:43:26,712 --> 00:43:27,756
[CHURCH BELLS TOLLING]
1022
00:43:29,105 --> 00:43:30,367
When I got to Cambridge,
1023
00:43:30,411 --> 00:43:32,674
I'd basically been working
my whole life.
1024
00:43:33,762 --> 00:43:35,111
Every single summer,
1025
00:43:35,155 --> 00:43:37,157
I was either playing chess
professionally,
1026
00:43:37,200 --> 00:43:39,725
or I was working,
doing an internship.
1027
00:43:39,768 --> 00:43:43,729
So I was, like, "Right,
I am gonna have fun now
1028
00:43:43,772 --> 00:43:46,732
"and explore what it means
to be a normal teenager."
1029
00:43:47,994 --> 00:43:50,213
[PEOPLE CHEERING, LAUGHING]
1030
00:43:50,257 --> 00:43:52,259
Come on! Go, boy, go!
1031
00:43:52,302 --> 00:43:54,043
TIM STEVENS: It was work hard
and play hard.
1032
00:43:54,087 --> 00:43:55,828
[ALL SINGING]
1033
00:43:55,871 --> 00:43:57,046
I first met Demis
1034
00:43:57,090 --> 00:43:59,222
because we both attended
Queens' College.
1035
00:44:00,136 --> 00:44:01,224
Our group of friends,
1036
00:44:01,268 --> 00:44:03,226
we'd often drink beer
in the bar,
1037
00:44:03,270 --> 00:44:04,967
play table football.
1038
00:44:05,011 --> 00:44:07,361
HASSABIS: In the bar,
I used to play speed chess,
1039
00:44:07,404 --> 00:44:09,276
pieces flying off the board,
1040
00:44:09,319 --> 00:44:11,104
you know, the whole game
in one minute.
1041
00:44:11,147 --> 00:44:12,322
Demis sat down opposite me.
1042
00:44:12,366 --> 00:44:13,584
And I looked at him
and I thought,
1043
00:44:13,628 --> 00:44:15,238
"I remember you
from when we were kids."
1044
00:44:15,282 --> 00:44:17,197
HASSABIS: I had actually been
in the same chess tournament
1045
00:44:17,240 --> 00:44:18,807
as Dave in Ipswich,
1046
00:44:18,851 --> 00:44:20,461
where I used to go and try
and raid his local chess club
1047
00:44:20,504 --> 00:44:22,724
to win a bit of prize money.
1048
00:44:22,768 --> 00:44:24,639
COPPIN: We were studying
computer science.
1049
00:44:24,683 --> 00:44:26,815
Some people,
at the age of 17,
1050
00:44:26,859 --> 00:44:28,425
would have come in and made
sure to tell everybody
1051
00:44:28,469 --> 00:44:29,513
everything about themselves.
1052
00:44:29,557 --> 00:44:30,993
"Hey, I worked at Bullfrog
1053
00:44:31,037 --> 00:44:33,039
"and built the world's
most successful video game."
1054
00:44:33,082 --> 00:44:34,736
But he wasn't like that
at all.
1055
00:44:34,780 --> 00:44:36,433
SILVER: At Cambridge,
Demis and myself
1056
00:44:36,477 --> 00:44:38,435
both had an interest
in computational neuroscience
1057
00:44:38,479 --> 00:44:40,263
and trying to understand
how computers and brains
1058
00:44:40,307 --> 00:44:42,657
intertwined
and linked together.
1059
00:44:42,701 --> 00:44:44,311
JOHN DAUGMAN:
Both David and Demis
1060
00:44:44,354 --> 00:44:46,443
came to me for supervisions.
1061
00:44:46,487 --> 00:44:49,795
It happens just by coincidence
that the year 1997,
1062
00:44:49,838 --> 00:44:51,666
their third and final year
at Cambridge,
1063
00:44:51,710 --> 00:44:55,322
was also the year when
the first chess grandmaster
1064
00:44:55,365 --> 00:44:56,802
was beaten by
a computer program.
1065
00:44:56,845 --> 00:44:58,281
[CAMERA SHUTTERS CLICKING]
1066
00:44:58,325 --> 00:45:00,109
NEWSCASTER: Round one today
of a chess match
1067
00:45:00,153 --> 00:45:03,722
between the reigning
world champion Garry Kasparov
1068
00:45:03,765 --> 00:45:06,028
and an opponent named
Deep Blue
1069
00:45:06,072 --> 00:45:10,511
to test to see if the human
brain can outwit a machine.
1070
00:45:10,554 --> 00:45:11,642
HASSABIS: I remember the drama
1071
00:45:11,686 --> 00:45:13,819
of Kasparov
losing the last match.
1072
00:45:13,862 --> 00:45:15,255
NEWSCASTER 2: Whoa!
1073
00:45:15,298 --> 00:45:17,213
Kasparov has resigned!
1074
00:45:17,257 --> 00:45:19,607
When Deep Blue
beat Garry Kasparov,
1075
00:45:19,650 --> 00:45:21,478
that was a real
watershed event.
1076
00:45:21,522 --> 00:45:23,176
HASSABIS:
My main memory of it was
1077
00:45:23,219 --> 00:45:25,352
I wasn't that impressed
with Deep Blue.
1078
00:45:25,395 --> 00:45:27,267
I was more impressed
with Kasparov's mind.
1079
00:45:27,310 --> 00:45:29,530
That he could play chess
to this level,
1080
00:45:29,573 --> 00:45:31,662
where he could compete
on an equal footing
1081
00:45:31,706 --> 00:45:33,273
with the brute of a machine,
1082
00:45:33,316 --> 00:45:35,188
but of course, Kasparov can do
1083
00:45:35,231 --> 00:45:36,972
everything else humans can do,
too.
1084
00:45:37,016 --> 00:45:38,278
It was a huge achievement.
1085
00:45:38,321 --> 00:45:39,409
But the truth
of the matter was,
1086
00:45:39,453 --> 00:45:40,889
Deep Blue
could only play chess.
1087
00:45:42,456 --> 00:45:44,545
What we would regard
as intelligence
1088
00:45:44,588 --> 00:45:46,895
was missing from that system.
1089
00:45:46,939 --> 00:45:49,855
This idea of generality
and also learning.
1090
00:45:53,772 --> 00:45:55,425
Cambridge was amazing,
because of course, you know,
1091
00:45:55,469 --> 00:45:56,687
you're mixing with people
1092
00:45:56,731 --> 00:45:58,254
who are studying
many different subjects.
1093
00:45:58,298 --> 00:46:01,431
SILVER: There were scientists,
philosophers, artists...
1094
00:46:01,475 --> 00:46:04,478
STEVENS: ...geologists,
biologists, ecologists.
1095
00:46:04,521 --> 00:46:07,437
You know, everybody is talking
about everything all the time.
1096
00:46:07,481 --> 00:46:10,789
I was obsessed with
the protein folding problem.
1097
00:46:10,832 --> 00:46:13,269
HASSABIS: Tim Stevens used
to talk obsessively,
1098
00:46:13,313 --> 00:46:15,402
almost like religiously
about this problem,
1099
00:46:15,445 --> 00:46:17,186
the protein folding problem.
1100
00:46:17,230 --> 00:46:18,884
STEVENS:
Proteins are, you know,
1101
00:46:18,927 --> 00:46:22,104
one of the most beautiful and
elegant things about biology.
1102
00:46:22,148 --> 00:46:24,759
They are the machines of life.
1103
00:46:24,803 --> 00:46:27,022
They build everything,
they control everything,
1104
00:46:27,066 --> 00:46:29,590
they're why biology works.
1105
00:46:29,633 --> 00:46:32,680
Proteins are made from strings
of amino acids
1106
00:46:32,723 --> 00:46:37,076
that fold up to create
a protein structure.
1107
00:46:37,119 --> 00:46:39,905
If we can predict
the structure of proteins
1108
00:46:39,948 --> 00:46:43,125
from just their amino acid
sequences,
1109
00:46:43,169 --> 00:46:46,128
then a new protein
to cure cancer
1110
00:46:46,172 --> 00:46:49,436
or break down plastic
to help the environment
1111
00:46:49,479 --> 00:46:50,829
is definitely something
1112
00:46:50,872 --> 00:46:52,961
that you could begin
to think about.
1113
00:46:53,962 --> 00:46:55,050
I kind of thought,
1114
00:46:55,094 --> 00:46:58,314
"Well, is a human being
clever enough
1115
00:46:58,358 --> 00:47:00,012
"to actually fold a protein?"
1116
00:47:00,055 --> 00:47:02,101
We can't work it out.
1117
00:47:02,144 --> 00:47:04,103
JOHN MOULT: Since the 1960s,
1118
00:47:04,146 --> 00:47:05,974
we thought that in principle,
1119
00:47:06,018 --> 00:47:08,934
if I know what the amino acid
sequence of a protein is,
1120
00:47:08,977 --> 00:47:11,458
I should be able to compute
what the structure's like.
1121
00:47:11,501 --> 00:47:13,721
So, if you could
just press a button,
1122
00:47:13,764 --> 00:47:16,332
and they'd all come
popping out, that would be...
1123
00:47:16,376 --> 00:47:18,030
that would have some impact.
1124
00:47:20,293 --> 00:47:21,598
HASSABIS: It stuck in my mind.
1125
00:47:21,642 --> 00:47:23,426
"Oh, this is
a very interesting problem."
1126
00:47:23,470 --> 00:47:26,865
And it felt to me
like it would be solvable.
1127
00:47:26,908 --> 00:47:29,955
But I thought
it would need AI to do it.
1128
00:47:31,521 --> 00:47:34,046
If we could just solve
protein folding,
1129
00:47:34,089 --> 00:47:35,656
it could change the world.
1130
00:47:50,889 --> 00:47:52,847
HASSABIS: Ever since
I was a student at Cambridge,
1131
00:47:54,022 --> 00:47:55,632
I've never
stopped thinking about
1132
00:47:55,676 --> 00:47:57,156
the protein folding problem.
1133
00:47:59,810 --> 00:48:02,813
If you were
to solve protein folding,
1134
00:48:02,857 --> 00:48:05,686
then the potential
to help solve problems like
1135
00:48:05,729 --> 00:48:09,690
Alzheimer's, dementia
and drug discovery is huge.
1136
00:48:09,733 --> 00:48:11,692
Solving disease is probably
1137
00:48:11,735 --> 00:48:13,346
the most major impact
we could have.
1138
00:48:13,389 --> 00:48:14,608
[CLICKS MOUSE]
1139
00:48:15,478 --> 00:48:16,697
Thousands of very smart people
1140
00:48:16,740 --> 00:48:18,786
have tried
to solve protein folding.
1141
00:48:18,829 --> 00:48:21,006
I just think now
is the right time
1142
00:48:21,049 --> 00:48:22,529
for AI to crack it.
1143
00:48:22,572 --> 00:48:24,313
[THRILLING MUSIC PLAYING]
1144
00:48:24,357 --> 00:48:26,533
[INDISTINCT CONVERSATION]
1145
00:48:26,576 --> 00:48:28,404
RICHARD EVANS: We needed
a reasonable way
1146
00:48:28,448 --> 00:48:29,536
to apply machine learning
1147
00:48:29,579 --> 00:48:30,537
to the protein folding
problem.
1148
00:48:30,580 --> 00:48:32,408
[CLICKING MOUSE]
1149
00:48:32,452 --> 00:48:35,194
We came across
this Foldit game.
1150
00:48:35,237 --> 00:48:38,806
The goal is to move around
this 3D model of a protein
1151
00:48:38,849 --> 00:48:41,722
and you get a score
every time you move it.
1152
00:48:41,765 --> 00:48:43,071
The more accurate
you make these structures,
1153
00:48:43,115 --> 00:48:45,552
the more useful
they will be to biologists.
1154
00:48:46,248 --> 00:48:47,510
I spent a few days
1155
00:48:47,554 --> 00:48:48,859
just kind of seeing
how well we could do.
1156
00:48:48,903 --> 00:48:50,644
[GAME DINGING]
1157
00:48:50,687 --> 00:48:52,428
We did reasonably well.
1158
00:48:52,472 --> 00:48:53,864
But even if you were
1159
00:48:53,908 --> 00:48:55,301
the world's
best Foldit player,
1160
00:48:55,344 --> 00:48:57,520
you wouldn't
solve protein folding.
1161
00:48:57,564 --> 00:48:59,522
That's why we had
to move beyond the game.
1162
00:48:59,566 --> 00:49:00,741
HASSABIS: Games
are always just
1163
00:49:00,784 --> 00:49:03,744
the proving ground
for our algorithms.
1164
00:49:03,787 --> 00:49:07,748
The ultimate goal was not just
to crack Go and StarCraft.
1165
00:49:07,791 --> 00:49:10,055
It was to crack
real-world challenges.
1166
00:49:10,925 --> 00:49:13,058
[THRILLING MUSIC CONTINUES]
1167
00:49:16,148 --> 00:49:18,541
JOHN JUMPER: I remember
hearing this rumor
1168
00:49:18,585 --> 00:49:21,196
that Demis was
getting into proteins.
1169
00:49:21,240 --> 00:49:23,764
I talked to some people
at DeepMind and I would ask,
1170
00:49:23,807 --> 00:49:25,070
"So are you doing
protein folding?"
1171
00:49:25,113 --> 00:49:26,941
And they would
artfully change the subject.
1172
00:49:26,985 --> 00:49:30,118
And when that happened twice,
I pretty much figured it out.
1173
00:49:30,162 --> 00:49:32,860
So I thought
I should submit a resume.
1174
00:49:32,903 --> 00:49:35,689
HASSABIS: All right, everyone,
welcome to DeepMind.
1175
00:49:35,732 --> 00:49:37,647
I know some of you,
this may be your first week,
1176
00:49:37,691 --> 00:49:39,214
but I hope you all set...
1177
00:49:39,258 --> 00:49:40,911
JUMPER: The really appealing
part for me about the job
1178
00:49:40,955 --> 00:49:42,696
was this, like,
sense of connection
1179
00:49:42,739 --> 00:49:44,524
to the larger purpose.
1180
00:49:44,567 --> 00:49:45,612
HASSABIS: If we can crack
1181
00:49:45,655 --> 00:49:48,049
some fundamental problems
in science,
1182
00:49:48,093 --> 00:49:49,181
many other people
1183
00:49:49,224 --> 00:49:50,921
and other companies
and labs and so on
1184
00:49:50,965 --> 00:49:52,488
could build
on top of our work.
1185
00:49:52,532 --> 00:49:53,837
This is your chance now
1186
00:49:53,881 --> 00:49:55,839
to add your chapter
to this story.
1187
00:49:55,883 --> 00:49:57,319
JUMPER: When I arrived,
1188
00:49:57,363 --> 00:49:59,626
I was definitely [CHUCKLES]
quite a bit nervous.
1189
00:49:59,669 --> 00:50:00,757
I'm still trying to keep...
1190
00:50:00,801 --> 00:50:02,933
I haven't taken
any biology courses.
1191
00:50:02,977 --> 00:50:05,414
We haven't spent
years of our lives
1192
00:50:05,458 --> 00:50:07,938
looking at these structures
and understanding them.
1193
00:50:07,982 --> 00:50:09,984
We are just going off the data
1194
00:50:10,028 --> 00:50:11,290
and our machine learning
models.
1195
00:50:12,726 --> 00:50:13,857
JUMPER: In machine learning,
1196
00:50:13,901 --> 00:50:15,816
you train a network
like flashcards.
1197
00:50:15,859 --> 00:50:18,819
Here's the question.
Here's the answer.
1198
00:50:18,862 --> 00:50:20,777
Here's the question.
Here's the answer.
1199
00:50:20,821 --> 00:50:22,344
But in protein folding,
1200
00:50:22,388 --> 00:50:25,478
we're not doing the kind
of standard task at DeepMind
1201
00:50:25,521 --> 00:50:28,176
where you have unlimited data.
1202
00:50:28,220 --> 00:50:30,787
Your job is to get better
at chess or Go
1203
00:50:30,831 --> 00:50:33,007
and you can play
as many games of chess or Go
1204
00:50:33,051 --> 00:50:34,835
as your computers will allow.
1205
00:50:35,531 --> 00:50:36,793
With proteins,
1206
00:50:36,837 --> 00:50:39,753
we're sitting on
a very thin slice of data
1207
00:50:39,796 --> 00:50:42,016
that's been determined
by a half century
1208
00:50:42,060 --> 00:50:46,499
of time-consuming experimental
methods in laboratories.
1209
00:50:46,542 --> 00:50:49,719
These painstaking methods
can take months or years
1210
00:50:49,763 --> 00:50:52,374
to determine
a single protein structure,
1211
00:50:52,418 --> 00:50:55,812
and sometimes, a structure
can never be determined.
1212
00:50:55,856 --> 00:50:57,292
[TYPING]
1213
00:50:57,336 --> 00:51:00,295
That's why we're working
with such small datasets
1214
00:51:00,339 --> 00:51:02,254
to train our algorithms.
1215
00:51:02,297 --> 00:51:04,386
EWAN BIRNEY: When DeepMind
started to explore
1216
00:51:04,430 --> 00:51:05,996
the folding problem,
1217
00:51:06,040 --> 00:51:07,998
they were talking to us about
which datasets they were using
1218
00:51:08,042 --> 00:51:09,913
and what would be
the possibilities
1219
00:51:09,957 --> 00:51:11,654
if they did
solve this problem.
1220
00:51:12,394 --> 00:51:13,961
Many people have tried,
1221
00:51:14,004 --> 00:51:16,659
and yet no one on the planet
has solved protein folding.
1222
00:51:16,703 --> 00:51:18,226
[CHUCKLES]
I did think to myself,
1223
00:51:18,270 --> 00:51:19,967
"Well, you know, good luck."
1224
00:51:20,010 --> 00:51:22,752
JUMPER: If we can solve
the protein folding problem,
1225
00:51:22,796 --> 00:51:25,712
it would have an incredible
kind of medical relevance.
1226
00:51:25,755 --> 00:51:27,757
HASSABIS:
This is the cycle of science.
1227
00:51:27,801 --> 00:51:30,064
You do a huge amount
of exploration,
1228
00:51:30,108 --> 00:51:31,935
and then you go
into exploitation mode,
1229
00:51:31,979 --> 00:51:33,589
and you focus and you see
1230
00:51:33,633 --> 00:51:35,374
how good
are those ideas, really?
1231
00:51:35,417 --> 00:51:36,549
And there's nothing better
1232
00:51:36,592 --> 00:51:38,116
than external competition
for that.
1233
00:51:39,856 --> 00:51:43,077
So we decided
to enter the CASP competition.
1234
00:51:43,121 --> 00:51:47,255
We started CASP to try
and speed up
1235
00:51:47,299 --> 00:51:49,823
the solution to
the protein folding problem.
1236
00:51:49,866 --> 00:51:52,086
CASP is when we say,
1237
00:51:52,130 --> 00:51:54,393
"Look, DeepMind
is doing protein folding,
1238
00:51:54,436 --> 00:51:55,611
"this is how good we are,
1239
00:51:55,655 --> 00:51:57,613
"and maybe it's better
than everybody else.
1240
00:51:57,657 --> 00:51:58,701
"Maybe it isn't."
1241
00:51:58,745 --> 00:52:00,094
CASP is a bit like
1242
00:52:00,138 --> 00:52:02,096
the Olympic Games
of protein folding.
1243
00:52:03,663 --> 00:52:06,056
CASP is
a community-wide assessment
1244
00:52:06,100 --> 00:52:08,146
that's held every two years.
1245
00:52:09,582 --> 00:52:10,887
Teams are given
1246
00:52:10,931 --> 00:52:14,282
the amino acid sequences
of about 100 proteins,
1247
00:52:14,326 --> 00:52:17,546
and then they try
to solve this folding problem
1248
00:52:17,590 --> 00:52:20,854
using computational methods.
1249
00:52:20,897 --> 00:52:23,422
These proteins have
already been determined
1250
00:52:23,465 --> 00:52:25,989
by experiments
in a laboratory,
1251
00:52:26,033 --> 00:52:29,297
but have not yet
been revealed publicly.
1252
00:52:29,341 --> 00:52:30,777
And these known structures
1253
00:52:30,820 --> 00:52:33,301
represent the gold standard
against which
1254
00:52:33,345 --> 00:52:36,957
all the computational
predictions will be compared.
1255
00:52:37,958 --> 00:52:39,351
MOULT: We've got a score
1256
00:52:39,394 --> 00:52:42,180
that measures the accuracy
of the predictions.
1257
00:52:42,223 --> 00:52:44,747
And you would expect
a score of over 90
1258
00:52:44,791 --> 00:52:47,446
to be a solution to
the protein folding problem.
1259
00:52:47,489 --> 00:52:48,534
[INDISTINCT CHATTER]
1260
00:52:48,577 --> 00:52:50,057
MAN: Welcome, everyone,
1261
00:52:50,100 --> 00:52:52,059
to our first, uh, semifinals
in the winners' bracket.
1262
00:52:52,102 --> 00:52:54,975
Nick and John
versus Demis and Frank.
1263
00:52:55,018 --> 00:52:57,238
Please join us, come around.
This will be an intense match.
1264
00:52:57,282 --> 00:52:59,327
STEVENS:
When I learned that Demis was
1265
00:52:59,371 --> 00:53:02,069
going to tackle
the protein folding issue,
1266
00:53:02,112 --> 00:53:04,811
um, I wasn't at all surprised.
1267
00:53:04,854 --> 00:53:06,813
It's very typical of Demis.
1268
00:53:06,856 --> 00:53:08,945
You know,
he loves competition.
1269
00:53:08,989 --> 00:53:10,164
And that's the end
1270
00:53:10,208 --> 00:53:12,993
- of the first game, 10-7.
- [ALL CHEERING]
1271
00:53:13,036 --> 00:53:14,124
HASSABIS:
The aim for CASP would be
1272
00:53:14,168 --> 00:53:15,996
to not just
win the competition,
1273
00:53:16,039 --> 00:53:19,913
but sort of, um,
retire the need for it.
1274
00:53:19,956 --> 00:53:23,438
So, 20 targets total
have been released by CASP.
1275
00:53:23,482 --> 00:53:24,831
JUMPER: We were thinking maybe
1276
00:53:24,874 --> 00:53:26,920
throw in the standard
kind of machine learning
1277
00:53:26,963 --> 00:53:28,835
and see how far
that could take us.
1278
00:53:28,878 --> 00:53:30,750
Instead of having a couple
of days on an experiment,
1279
00:53:30,793 --> 00:53:33,579
we can turn around
five experiments a day.
1280
00:53:33,622 --> 00:53:35,363
Great. Well done, everyone.
1281
00:53:35,407 --> 00:53:36,756
[TYPING]
1282
00:53:36,799 --> 00:53:38,714
Can you show me the real one
instead of ours?
1283
00:53:38,758 --> 00:53:39,802
MAN 1: The true answer is
1284
00:53:39,846 --> 00:53:42,065
supposed to look
something like that.
1285
00:53:42,109 --> 00:53:45,068
MAN 2: It's a lot more
cylindrical than I thought.
1286
00:53:45,112 --> 00:53:47,419
JUMPER: The results
were not very good.
1287
00:53:47,462 --> 00:53:48,724
Okay.
1288
00:53:48,768 --> 00:53:49,856
JUMPER: We throw
all the obvious ideas at it
1289
00:53:49,899 --> 00:53:51,814
and the problem laughs at you.
1290
00:53:52,902 --> 00:53:54,382
This makes no sense.
1291
00:53:54,426 --> 00:53:56,036
EVANS: We thought
we could just throw
1292
00:53:56,079 --> 00:53:58,343
some of our best algorithms
at the problem.
1293
00:53:59,387 --> 00:54:00,997
We were slightly naive.
1294
00:54:01,041 --> 00:54:02,303
JUMPER:
We should be learning this,
1295
00:54:02,347 --> 00:54:04,305
you know,
in the blink of an eye.
1296
00:54:05,393 --> 00:54:07,003
The thing
I'm worried about is,
1297
00:54:07,047 --> 00:54:08,396
we take the field from
1298
00:54:08,440 --> 00:54:10,920
really bad answers
to moderately bad answers.
1299
00:54:10,964 --> 00:54:13,967
I feel like we need
some sort of new technology
1300
00:54:14,010 --> 00:54:15,185
for moving around
these things.
1301
00:54:15,229 --> 00:54:17,362
[THRILLING MUSIC CONTINUES]
1302
00:54:20,452 --> 00:54:22,236
HASSABIS: With only
a week left of CASP,
1303
00:54:22,280 --> 00:54:24,369
it's now a sprint
to get it deployed.
1304
00:54:24,412 --> 00:54:25,370
[MUSIC FADES]
1305
00:54:26,675 --> 00:54:28,111
You've done your best.
1306
00:54:28,155 --> 00:54:29,896
Then there's
nothing more you can do
1307
00:54:29,939 --> 00:54:32,420
but wait for CASP
to deliver the results.
1308
00:54:32,464 --> 00:54:34,422
[HOPEFUL MUSIC PLAYING]
1309
00:54:52,614 --> 00:54:53,746
This famous thing of Einstein,
1310
00:54:53,789 --> 00:54:55,051
the last couple of years
of his life,
1311
00:54:55,095 --> 00:54:57,402
when he was here,
he overlapped with Kurt Gödel
1312
00:54:57,445 --> 00:54:59,795
and he said one of the reasons
he still comes in to work
1313
00:54:59,839 --> 00:55:01,667
is so that
he gets to walk home
1314
00:55:01,710 --> 00:55:03,538
and discuss things with Gödel.
1315
00:55:03,582 --> 00:55:05,932
It's a pretty big compliment
for Kurt Gödel,
1316
00:55:05,975 --> 00:55:07,499
shows you how amazing he was.
1317
00:55:09,152 --> 00:55:10,589
MAN: The Institute
for Advanced Study
1318
00:55:10,632 --> 00:55:12,939
was formed in 1933.
1319
00:55:12,982 --> 00:55:14,244
In the early years,
1320
00:55:14,288 --> 00:55:16,508
the intense scientific
atmosphere attracted
1321
00:55:16,551 --> 00:55:19,380
some of the most brilliant
mathematicians and physicists
1322
00:55:19,424 --> 00:55:22,557
ever concentrated
in a single place and time.
1323
00:55:22,601 --> 00:55:24,603
HASSABIS: The founding
principle of this place,
1324
00:55:24,646 --> 00:55:28,041
is the idea of unfettered
intellectual pursuits,
1325
00:55:28,084 --> 00:55:30,173
even if you don't know
what you're exploring,
1326
00:55:30,217 --> 00:55:32,175
will result
in some cool things,
1327
00:55:32,219 --> 00:55:34,917
and sometimes that then
ends up being useful,
1328
00:55:34,961 --> 00:55:36,397
which, of course,
1329
00:55:36,441 --> 00:55:38,007
is partially what I've been
trying to do at DeepMind.
1330
00:55:38,051 --> 00:55:40,009
How many big breakthroughs
do you think are required
1331
00:55:40,053 --> 00:55:41,707
to get all the way to AGI?
1332
00:55:41,750 --> 00:55:43,099
And, you know,
I estimate maybe
1333
00:55:43,143 --> 00:55:44,362
there's about
a dozen of those.
1334
00:55:44,405 --> 00:55:46,146
You know, I hope
it's within my lifetime.
1335
00:55:46,189 --> 00:55:47,626
- Yes, okay.
- HASSABIS: But then,
1336
00:55:47,669 --> 00:55:49,192
all scientists
hope that, right?
1337
00:55:49,236 --> 00:55:51,151
EMCEE: Demis has
many accolades.
1338
00:55:51,194 --> 00:55:54,023
He was elected a Fellow of
the Royal Society last year.
1339
00:55:54,067 --> 00:55:55,982
He is also a Fellow
of the Royal Society of Arts.
1340
00:55:56,025 --> 00:55:57,549
A big hand for Demis Hassabis.
1341
00:56:02,989 --> 00:56:04,207
[MUSIC FADES]
1342
00:56:04,251 --> 00:56:05,948
HASSABIS: My dream
has always been to try
1343
00:56:05,992 --> 00:56:08,124
and make
AI-assisted science possible.
1344
00:56:08,168 --> 00:56:09,256
And what I think is
1345
00:56:09,299 --> 00:56:11,171
our most exciting project
of last year,
1346
00:56:11,214 --> 00:56:13,173
which is our work
in protein folding.
1347
00:56:13,216 --> 00:56:15,349
Uh, and we call this system
AlphaFold.
1348
00:56:15,393 --> 00:56:18,352
We entered it into CASP
and our system, uh,
1349
00:56:18,396 --> 00:56:20,528
was the most accurate,
uh, predicting structures
1350
00:56:20,572 --> 00:56:24,967
for 25 out of the 43 proteins
in the hardest category.
1351
00:56:25,011 --> 00:56:26,229
So we're state of the art,
1352
00:56:26,273 --> 00:56:27,535
but we still...
I have to make... be clear,
1353
00:56:27,579 --> 00:56:28,580
we're still a long way from
1354
00:56:28,623 --> 00:56:30,495
solving the protein
folding problem.
1355
00:56:30,538 --> 00:56:31,887
We're working hard
on this, though,
1356
00:56:31,931 --> 00:56:33,759
and we're exploring
many other techniques.
1357
00:56:33,802 --> 00:56:35,500
[SOMBER MUSIC PLAYING]
1358
00:56:49,209 --> 00:56:50,253
Let's get started.
1359
00:56:50,297 --> 00:56:53,169
JUMPER: So kind of
a rapid debrief,
1360
00:56:53,213 --> 00:56:55,476
these are
our final rankings for CASP.
1361
00:56:56,521 --> 00:56:57,565
HASSABIS:
We beat the second team
1362
00:56:57,609 --> 00:57:00,176
in this competition
by nearly 50%,
1363
00:57:00,220 --> 00:57:01,613
but we've still got
a long way to go
1364
00:57:01,656 --> 00:57:04,354
before we've solved
the protein folding problem
1365
00:57:04,398 --> 00:57:07,053
in a sense that
a biologist could use it.
1366
00:57:07,096 --> 00:57:09,011
JUMPER: It is an area of concern.
1367
00:57:11,623 --> 00:57:14,234
JANET THORNTON: The quality
of predictions varied
1368
00:57:14,277 --> 00:57:16,758
and they were no more useful
than the previous methods.
1369
00:57:16,802 --> 00:57:19,935
PAUL NURSE: AlphaFold didn't
produce good enough data
1370
00:57:19,979 --> 00:57:22,547
for it to be useful
in a practical way
1371
00:57:22,590 --> 00:57:24,026
to, say, somebody like me
1372
00:57:24,070 --> 00:57:28,248
investigating
my own biological problems.
1373
00:57:28,291 --> 00:57:30,337
JUMPER: That was kind of
a humbling moment
1374
00:57:30,380 --> 00:57:32,774
'cause we thought we'd worked
very hard and succeeded.
1375
00:57:32,818 --> 00:57:34,907
And what we'd found is
we were the best in the world
1376
00:57:34,950 --> 00:57:36,474
at a problem
the world's not good at.
1377
00:57:37,692 --> 00:57:38,954
We knew we sucked.
1378
00:57:38,998 --> 00:57:40,434
[INDISTINCT CHATTER]
1379
00:57:40,478 --> 00:57:42,349
JUMPER: It doesn't help
if you have the tallest ladder
1380
00:57:42,392 --> 00:57:44,656
when you're going to the moon.
1381
00:57:44,699 --> 00:57:47,136
HASSABIS: The opinion of quite
a few people on the team was
1382
00:57:47,180 --> 00:57:51,489
that this is sort of
a fool's errand in some ways.
1383
00:57:51,532 --> 00:57:54,100
And I might have been wrong
with protein folding.
1384
00:57:54,143 --> 00:57:55,580
Maybe it's too hard still
1385
00:57:55,623 --> 00:57:58,452
for where we're at
generally with AI.
1386
00:57:58,496 --> 00:58:01,194
If you want to do
biological research,
1387
00:58:01,237 --> 00:58:03,065
you have to be
prepared to fail
1388
00:58:03,109 --> 00:58:06,591
because biology
is very complicated.
1389
00:58:06,634 --> 00:58:09,811
I've run a laboratory
for nearly 50 years,
1390
00:58:09,855 --> 00:58:11,117
and half my time,
1391
00:58:11,160 --> 00:58:12,640
I'm just
an amateur psychiatrist
1392
00:58:12,684 --> 00:58:18,124
to keep, um, my colleagues
cheerful when nothing works.
1393
00:58:18,167 --> 00:58:22,563
And quite a lot of the time,
I mean, 80, 90%,
1394
00:58:22,607 --> 00:58:24,434
it does not work.
1395
00:58:24,478 --> 00:58:26,741
If you are
at the forefront of science,
1396
00:58:26,785 --> 00:58:30,136
I can tell you,
you will fail a great deal.
1397
00:58:32,486 --> 00:58:33,487
[CLICKS MOUSE]
1398
00:58:35,184 --> 00:58:37,186
HASSABIS:
I just felt disappointed.
1399
00:58:38,710 --> 00:58:41,626
The lesson I learned is that
ambition is a good thing,
1400
00:58:41,669 --> 00:58:43,715
but you need
to get the timing right.
1401
00:58:43,758 --> 00:58:46,805
There's no point being
50 years ahead of your time.
1402
00:58:46,848 --> 00:58:48,154
You will never survive
1403
00:58:48,197 --> 00:58:49,938
50 years of
that kind of endeavor
1404
00:58:49,982 --> 00:58:51,984
before it yields something.
1405
00:58:52,027 --> 00:58:53,289
You'll literally die trying.
1406
00:58:53,333 --> 00:58:55,117
[TENSE MUSIC PLAYING]
1407
00:59:08,957 --> 00:59:11,307
CUKIER:
When we talk about AGI,
1408
00:59:11,351 --> 00:59:14,397
the holy grail
of artificial intelligence,
1409
00:59:14,441 --> 00:59:15,529
it becomes really difficult
1410
00:59:15,573 --> 00:59:17,836
to know what
we're even talking about.
1411
00:59:17,879 --> 00:59:19,664
HASSABIS: Which bits
are we gonna see today?
1412
00:59:19,707 --> 00:59:21,666
MAN: We're going
to start in the garden.
1413
00:59:21,709 --> 00:59:23,102
[MACHINE BEEPS]
1414
00:59:23,145 --> 00:59:25,670
This is the garden looking
from the observation area.
1415
00:59:25,713 --> 00:59:27,454
Research scientists
and engineers
1416
00:59:27,497 --> 00:59:30,892
can analyze and collaborate
and evaluate
1417
00:59:30,936 --> 00:59:33,025
what's going on in real time.
1418
00:59:33,068 --> 00:59:34,635
CUKIER: So in the 1800s,
1419
00:59:34,679 --> 00:59:37,029
we'd think of things like
television and the submarine
1420
00:59:37,072 --> 00:59:38,160
or a rocket ship to the moon
1421
00:59:38,204 --> 00:59:40,249
and say these things
are impossible.
1422
00:59:40,293 --> 00:59:41,511
Yet Jules Verne
wrote about them and,
1423
00:59:41,555 --> 00:59:44,427
a century and a half later,
they happened.
1424
00:59:44,471 --> 00:59:45,472
HASSABIS: We'll be
experimenting
1425
00:59:45,515 --> 00:59:47,909
on civilizations really,
1426
00:59:47,953 --> 00:59:50,608
civilizations of AI agents.
1427
00:59:50,651 --> 00:59:52,740
Once the experiments
start going,
1428
00:59:52,784 --> 00:59:54,263
it's going to be
the most exciting thing ever.
1429
00:59:54,307 --> 00:59:57,005
- So how will we get sleep?
- [MAN LAUGHS]
1430
00:59:57,049 --> 00:59:58,703
I won't be able to sleep.
1431
00:59:58,746 --> 01:00:00,705
LEGG: Full AGI
will be able to do
1432
01:00:00,748 --> 01:00:03,882
any cognitive task
a person can do.
1433
01:00:03,925 --> 01:00:08,408
It will be at a scale,
potentially, far beyond that.
1434
01:00:08,451 --> 01:00:10,323
STUART RUSSELL:
It's really impossible for us
1435
01:00:10,366 --> 01:00:14,849
to imagine the outputs
of a superintelligent entity.
1436
01:00:14,893 --> 01:00:18,984
It's like asking a gorilla
to imagine, you know,
1437
01:00:19,027 --> 01:00:20,202
what Einstein does
1438
01:00:20,246 --> 01:00:23,423
when he produces
the theory of relativity.
1439
01:00:23,466 --> 01:00:25,512
LEGG: People often ask me
these questions like,
1440
01:00:25,555 --> 01:00:29,516
"What happens if you're wrong,
and AGI is quite far away?"
1441
01:00:29,559 --> 01:00:31,474
And I'm like,
I never worry about that.
1442
01:00:31,518 --> 01:00:33,868
I actually
worry about the reverse.
1443
01:00:33,912 --> 01:00:37,263
I actually worry
that it's coming faster
1444
01:00:37,306 --> 01:00:39,744
than we can
really prepare for.
1445
01:00:39,787 --> 01:00:42,050
[ROBOTIC ARM WHIRRING]
1446
01:00:42,094 --> 01:00:45,880
HADSELL: It really feels
like we're in a race to AGI.
1447
01:00:45,924 --> 01:00:49,928
The prototypes and the models
that we are developing now
1448
01:00:49,971 --> 01:00:51,843
are actually transforming
1449
01:00:51,886 --> 01:00:54,236
the space of what
we know about intelligence.
1450
01:00:54,280 --> 01:00:57,326
[WHIRRING]
1451
01:00:57,370 --> 01:00:58,806
LEGG: Recently,
we've had agents
1452
01:00:58,850 --> 01:01:00,068
that are powerful enough
1453
01:01:00,112 --> 01:01:03,463
to actually start
playing games in teams,
1454
01:01:03,506 --> 01:01:06,161
then competing
against other teams.
1455
01:01:06,205 --> 01:01:08,816
We're seeing
co-operative social dynamics
1456
01:01:08,860 --> 01:01:10,513
coming out of agents
1457
01:01:10,557 --> 01:01:13,342
where we haven't
pre-programmed in
1458
01:01:13,386 --> 01:01:15,605
any of these sorts
of dynamics.
1459
01:01:15,649 --> 01:01:19,261
It's completely learned
from their own experiences.
1460
01:01:20,828 --> 01:01:23,309
When we started,
we thought we were
1461
01:01:23,352 --> 01:01:25,746
out to build
an intelligent system
1462
01:01:25,790 --> 01:01:28,357
and convince the world
that we'd done it.
1463
01:01:28,401 --> 01:01:29,968
We're now starting
to wonder whether
1464
01:01:30,011 --> 01:01:31,317
we're gonna build systems
1465
01:01:31,360 --> 01:01:32,927
that we're not convinced
are fully intelligent,
1466
01:01:32,971 --> 01:01:34,712
and we're trying to convince
the world that they're not.
1467
01:01:34,755 --> 01:01:35,800
[CHUCKLES]
1468
01:01:35,843 --> 01:01:36,801
[CELL PHONE DINGS]
1469
01:01:38,672 --> 01:01:40,021
Hi, Alpha.
1470
01:01:40,065 --> 01:01:41,544
ALPHA: Hello there.
1471
01:01:41,588 --> 01:01:43,938
LOVE: Where are we today?
1472
01:01:43,982 --> 01:01:46,680
You're at the Museum of
Modern Art in New York City.
1473
01:01:48,421 --> 01:01:53,034
Kind of.
Um, what painting is this?
1474
01:01:53,078 --> 01:01:55,515
This is The Creation of Adam
by Michelangelo.
1475
01:01:55,558 --> 01:01:58,431
I don't think that painting
is in New York City.
1476
01:01:58,474 --> 01:02:01,564
You are right.
It's in Vatican City.
1477
01:02:01,608 --> 01:02:02,827
LOVE: Do you think
that's where we are?
1478
01:02:04,698 --> 01:02:05,830
ALPHA: No.
1479
01:02:05,873 --> 01:02:07,614
LOVE: Then how
am I looking at this?
1480
01:02:10,530 --> 01:02:11,661
ALPHA: You're looking at
1481
01:02:11,705 --> 01:02:12,967
a digital copy
of the painting.
1482
01:02:13,011 --> 01:02:14,012
Yes.
1483
01:02:15,056 --> 01:02:16,362
MAN 1: Who lives
in this house?
1484
01:02:16,405 --> 01:02:19,191
[EXCITING MUSIC PLAYING]
1485
01:02:19,234 --> 01:02:20,366
ALPHA: An astronaut.
1486
01:02:26,067 --> 01:02:28,026
What's this astronaut's name?
1487
01:02:28,069 --> 01:02:29,418
ALPHA: His name is Mark.
1488
01:02:29,462 --> 01:02:32,073
This is a famous equation
by Albert Einstein.
1489
01:02:32,117 --> 01:02:35,598
It shows that energy and mass
are equivalent.
1490
01:02:35,642 --> 01:02:37,687
How would you explain it
to a five-year-old?
1491
01:02:40,952 --> 01:02:43,563
ALPHA: Well, imagine
you have a big rock,
1492
01:02:43,606 --> 01:02:45,347
and you want to know
how much it weighs.
1493
01:02:45,391 --> 01:02:47,610
MAN 2: Is Mark in danger?
1494
01:02:47,654 --> 01:02:49,743
ALPHA: No.
Mark is not in danger.
1495
01:02:49,787 --> 01:02:50,788
Why is that?
1496
01:02:54,443 --> 01:02:56,184
ALPHA: Because the dinosaur
is not real.
1497
01:02:56,228 --> 01:02:57,316
[MEN CHUCKLING]
1498
01:02:57,359 --> 01:02:58,491
LOVE: All right.
1499
01:02:58,534 --> 01:03:00,536
Why could you
not answer my question?
1500
01:03:00,580 --> 01:03:01,929
ALPHA: I think I couldn't
answer your question
1501
01:03:01,973 --> 01:03:04,062
because I don't know
enough about the subject.
1502
01:03:04,932 --> 01:03:06,151
Okay.
1503
01:03:06,194 --> 01:03:08,675
How could you learn
more about the subject?
1504
01:03:08,718 --> 01:03:09,981
ALPHA: I can learn more
about the subject
1505
01:03:10,024 --> 01:03:12,897
by reading books,
articles, and websites.
1506
01:03:12,940 --> 01:03:14,289
You can teach me.
1507
01:03:14,333 --> 01:03:15,856
HASSABIS: My conclusion is,
1508
01:03:15,900 --> 01:03:17,640
if you now think about
what we're doing,
1509
01:03:17,684 --> 01:03:20,078
is learning from all humans,
1510
01:03:20,121 --> 01:03:22,602
all their knowledge at once
put on the Internet,
1511
01:03:22,645 --> 01:03:24,952
you would actually
know a lot about the world.
1512
01:03:24,996 --> 01:03:27,737
Like a significant portion
of everything humans can do.
1513
01:03:27,781 --> 01:03:29,652
And now,
I think it's more like,
1514
01:03:29,696 --> 01:03:30,958
"Well, it might just work."
1515
01:03:31,002 --> 01:03:33,134
This is a big moment.
1516
01:03:33,178 --> 01:03:34,657
Who is this?
1517
01:03:34,701 --> 01:03:36,572
ALPHA: This is God
reaching out to Adam.
1518
01:03:36,616 --> 01:03:38,009
RUSSELL: The advent of AGI
1519
01:03:38,052 --> 01:03:40,881
will divide human history
into two parts.
1520
01:03:40,925 --> 01:03:43,971
The part up to that point
and the part after that point.
1521
01:03:44,015 --> 01:03:47,279
LOVE: Why is he reaching out
to touch Adam?
1522
01:03:47,322 --> 01:03:48,933
ALPHA: God is
reaching out to touch Adam
1523
01:03:48,976 --> 01:03:50,238
to give him life.
1524
01:03:50,282 --> 01:03:52,240
RUSSELL:
It will give us a tool
1525
01:03:52,284 --> 01:03:57,550
that can completely reinvent
our entire civilization.
1526
01:03:57,593 --> 01:03:59,421
LOVE: What does
this painting mean to you?
1527
01:04:01,771 --> 01:04:03,773
ALPHA: The painting
means a lot to me.
1528
01:04:03,817 --> 01:04:04,818
Okay. Like what?
1529
01:04:09,083 --> 01:04:10,302
[MUSIC FADES]
1530
01:04:10,345 --> 01:04:11,477
ALPHA: I think
the painting is a reminder
1531
01:04:11,520 --> 01:04:12,695
that we are all
connected to each other
1532
01:04:12,739 --> 01:04:13,958
and that we are
all part of something
1533
01:04:14,001 --> 01:04:15,133
bigger than ourselves.
1534
01:04:16,482 --> 01:04:17,787
That's pretty nice.
1535
01:04:19,050 --> 01:04:21,400
LEGG: When you cross
that barrier of
1536
01:04:21,443 --> 01:04:23,968
"AGI might happen
one day in the future"
1537
01:04:24,011 --> 01:04:26,666
to "No, actually, this could
really happen in a time frame
1538
01:04:26,709 --> 01:04:28,711
"that is sort of, like,
on my watch, you know,"
1539
01:04:28,755 --> 01:04:30,496
something changes
in your thinking.
1540
01:04:30,539 --> 01:04:32,715
MAN: ...learned to orient
itself by looking...
1541
01:04:32,759 --> 01:04:35,066
HASSABIS: We have to be
careful with how we use it
1542
01:04:35,109 --> 01:04:37,198
and thoughtful about
how we deploy it.
1543
01:04:37,242 --> 01:04:39,809
[GRIPPING MUSIC BUILDING]
1544
01:04:39,853 --> 01:04:41,159
HASSABIS:
You'd have to consider
1545
01:04:41,202 --> 01:04:42,508
what its top-level goal is.
1546
01:04:42,551 --> 01:04:45,032
If it's to keep humans happy,
1547
01:04:45,076 --> 01:04:48,949
which set of humans?
What does happiness mean?
1548
01:04:48,993 --> 01:04:52,039
A lot of our collective goals
are very tricky,
1549
01:04:52,083 --> 01:04:54,912
even for humans to figure out.
1550
01:04:54,955 --> 01:04:58,524
CUKIER: Technology always
embeds our values.
1551
01:04:58,567 --> 01:05:01,701
It's not just technical,
it's ethical as well.
1552
01:05:01,744 --> 01:05:02,920
So we've got
to be really cautious
1553
01:05:02,963 --> 01:05:04,312
about what
we're building into it.
1554
01:05:04,356 --> 01:05:06,097
MAN: We're trying to find
a single algorithm which...
1555
01:05:06,140 --> 01:05:07,837
SILVER: The reality is
that this is an algorithm
1556
01:05:07,881 --> 01:05:11,058
that has been created
by people, by us.
1557
01:05:11,102 --> 01:05:13,234
You know, what does it mean
to endow our agents
1558
01:05:13,278 --> 01:05:15,628
with the same kind of values
that we hold dear?
1559
01:05:15,671 --> 01:05:17,673
What is the purpose
of making these AI systems
1560
01:05:17,717 --> 01:05:19,066
appear so humanlike
1561
01:05:19,110 --> 01:05:20,763
so that they do
capture hearts and minds
1562
01:05:20,807 --> 01:05:21,982
because they're kind of
1563
01:05:22,026 --> 01:05:24,724
exploiting a human
vulnerability also?
1564
01:05:24,767 --> 01:05:26,552
The heart and mind
of these systems
1565
01:05:26,595 --> 01:05:28,075
are very much
human-generated data...
1566
01:05:28,119 --> 01:05:29,076
WOMAN: Mmm-hmm.
1567
01:05:29,120 --> 01:05:30,512
...for all the good
and the bad.
1568
01:05:30,556 --> 01:05:32,036
LEVI:
There is a parallel
1569
01:05:32,079 --> 01:05:34,038
between
the Industrial Revolution,
1570
01:05:34,081 --> 01:05:36,779
which was an incredible
moment of displacement
1571
01:05:36,823 --> 01:05:42,394
and the current technological
change created by AI.
1572
01:05:42,437 --> 01:05:43,743
[CHANTING] Pause AI!
1573
01:05:43,786 --> 01:05:45,745
LEVI: We have to think
about who's displaced
1574
01:05:45,788 --> 01:05:48,617
and how we're going
to support them.
1575
01:05:48,661 --> 01:05:50,097
This technology
is coming a lot sooner,
1576
01:05:50,141 --> 01:05:52,447
uh, than really
the world knows or kind of
1577
01:05:52,491 --> 01:05:55,929
even we thought
18, 24 months ago.
1578
01:05:55,973 --> 01:05:57,278
So there's
a tremendous opportunity,
1579
01:05:57,322 --> 01:05:58,410
tremendous excitement,
1580
01:05:58,453 --> 01:06:00,412
but also
tremendous responsibility.
1581
01:06:00,455 --> 01:06:01,761
It's happening so fast.
1582
01:06:02,675 --> 01:06:04,024
How will we govern it?
1583
01:06:05,156 --> 01:06:06,244
How will we decide
1584
01:06:06,287 --> 01:06:08,202
what is okay
and what is not okay?
1585
01:06:08,246 --> 01:06:10,944
AI-generated images are
getting more sophisticated.
1586
01:06:10,988 --> 01:06:14,556
RUSSELL: The use of AI
for generating disinformation
1587
01:06:14,600 --> 01:06:17,037
and manipulating
human psychology
1588
01:06:17,081 --> 01:06:20,258
is only going to get
much, much worse.
1589
01:06:21,215 --> 01:06:22,608
LEGG: AGI is coming,
1590
01:06:22,651 --> 01:06:24,653
whether we do it here
at DeepMind or not.
1591
01:06:25,480 --> 01:06:26,786
CUKIER: It's gonna happen,
1592
01:06:26,829 --> 01:06:29,049
so we better create
institutions to protect us.
1593
01:06:29,093 --> 01:06:30,616
It's gonna require
global coordination.
1594
01:06:30,659 --> 01:06:32,748
And I worry that humanity is
1595
01:06:32,792 --> 01:06:35,403
increasingly getting worse
at that rather than better.
1596
01:06:35,447 --> 01:06:37,144
LEGG: We need
a lot more people
1597
01:06:37,188 --> 01:06:40,060
really taking this seriously
and thinking about this.
1598
01:06:40,104 --> 01:06:43,020
It's, yeah, it's serious.
It worries me.
1599
01:06:44,064 --> 01:06:45,892
It worries me. Yeah.
1600
01:06:45,935 --> 01:06:48,634
RUSSELL: If you received
an email saying
1601
01:06:48,677 --> 01:06:50,853
this superior
alien civilization
1602
01:06:50,897 --> 01:06:52,812
is going to arrive on Earth,
1603
01:06:52,855 --> 01:06:54,596
there would be
emergency meetings
1604
01:06:54,640 --> 01:06:56,294
of all the governments.
1605
01:06:56,337 --> 01:06:58,165
We would go into overdrive
1606
01:06:58,209 --> 01:07:00,124
trying to figure out
how to prepare.
1607
01:07:00,167 --> 01:07:01,647
- [MUSIC FADES]
- [BELL TOLLING FAINTLY]
1608
01:07:01,690 --> 01:07:03,997
The arrival of AGI will be
1609
01:07:04,041 --> 01:07:06,956
the most important moment
that we have ever faced.
1610
01:07:07,000 --> 01:07:09,176
[BELL CONTINUES
TOLLING FAINTLY]
1611
01:07:14,399 --> 01:07:17,576
HASSABIS: My dream
was that on the way to AGI,
1612
01:07:17,619 --> 01:07:20,709
we would create
revolutionary technologies
1613
01:07:20,753 --> 01:07:23,103
that would be
of use to humanity.
1614
01:07:23,147 --> 01:07:25,192
That's what I wanted
with AlphaFold.
1615
01:07:26,715 --> 01:07:28,674
I think
it's more important than ever
1616
01:07:28,717 --> 01:07:31,068
that we should solve
the protein folding problem.
1617
01:07:32,025 --> 01:07:34,245
This is gonna be really hard,
1618
01:07:34,288 --> 01:07:36,812
but I won't give up
until it's done.
1619
01:07:36,856 --> 01:07:37,900
You know,
we need to double down
1620
01:07:37,944 --> 01:07:40,338
and go as fast as possible
from here.
1621
01:07:40,381 --> 01:07:41,817
I think we've got
no time to lose.
1622
01:07:41,861 --> 01:07:45,778
So we are going to make
a protein folding strike team.
1623
01:07:45,821 --> 01:07:47,562
Team lead for the strike team
will be John.
1624
01:07:47,606 --> 01:07:48,694
Yeah, we've seen Alpha...
1625
01:07:48,737 --> 01:07:50,304
You know,
we're gonna try everything,
1626
01:07:50,348 --> 01:07:51,349
kitchen sink, the whole lot.
1627
01:07:52,219 --> 01:07:53,351
CASP14 is about
1628
01:07:53,394 --> 01:07:55,179
proving we can
solve the whole problem.
1629
01:07:56,354 --> 01:07:57,746
And I felt that to do that,
1630
01:07:57,790 --> 01:08:00,358
we would need to incorporate
some domain knowledge.
1631
01:08:00,401 --> 01:08:01,881
[EXCITING MUSIC PLAYING]
1632
01:08:01,924 --> 01:08:03,752
We had some
fantastic engineers on it,
1633
01:08:03,796 --> 01:08:05,754
but they were
not trained in biology.
1634
01:08:08,496 --> 01:08:10,281
KATHRYN TUNYASUVUNAKOOL:
As a computational biologist,
1635
01:08:10,324 --> 01:08:12,152
when I initially joined
the AlphaFold team,
1636
01:08:12,196 --> 01:08:14,241
I didn't immediately feel
confident about anything.
1637
01:08:14,285 --> 01:08:15,373
[CHUCKLES] You know,
1638
01:08:15,416 --> 01:08:17,244
whether we were
gonna be successful.
1639
01:08:17,288 --> 01:08:21,118
Biology is so
ridiculously complicated.
1640
01:08:21,161 --> 01:08:25,122
It just felt like this very
far-off mountain to climb.
1641
01:08:25,165 --> 01:08:26,775
MAN: I'm starting to play with
the underlying temperatures
1642
01:08:26,819 --> 01:08:27,994
to see if we can get...
1643
01:08:28,037 --> 01:08:29,169
As one of the few people
on the team
1644
01:08:29,213 --> 01:08:31,867
who's done work
in biology before,
1645
01:08:31,911 --> 01:08:34,870
you feel this huge sense
of responsibility.
1646
01:08:34,914 --> 01:08:36,133
"We're expecting you to do
1647
01:08:36,176 --> 01:08:37,699
"great things
on this strike team."
1648
01:08:37,743 --> 01:08:38,918
That's terrifying.
1649
01:08:40,485 --> 01:08:42,748
But one of the reasons
why I wanted to come here
1650
01:08:42,791 --> 01:08:45,577
was to do
something that matters.
1651
01:08:45,620 --> 01:08:48,493
This is the number
of missing things.
1652
01:08:48,536 --> 01:08:49,972
What about making use
1653
01:08:50,016 --> 01:08:52,584
of whatever understanding
you have of physics?
1654
01:08:52,627 --> 01:08:54,412
Using that
as a source of data?
1655
01:08:54,455 --> 01:08:55,500
But if it's systematic...
1656
01:08:55,543 --> 01:08:56,805
Then that can't be
right, though.
1657
01:08:56,849 --> 01:08:58,329
If it's systematically wrong
in some weird way,
1658
01:08:58,372 --> 01:09:01,245
you might be learning that
systematically wrong physics.
1659
01:09:01,288 --> 01:09:02,376
The team is already
1660
01:09:02,420 --> 01:09:04,770
trying to think
of multiple ways that...
1661
01:09:04,813 --> 01:09:06,250
TUNYASUVUNAKOOL:
Biological relevance
1662
01:09:06,293 --> 01:09:07,816
is what we're going for.
1663
01:09:09,078 --> 01:09:11,385
So we rewrote
the whole data pipeline
1664
01:09:11,429 --> 01:09:13,300
that AlphaFold uses to learn.
1665
01:09:13,344 --> 01:09:15,607
HASSABIS: You can't
force the creative phase.
1666
01:09:15,650 --> 01:09:18,262
You have to give it space
for those flowers to bloom.
1667
01:09:19,263 --> 01:09:20,307
We won CASP.
1668
01:09:20,351 --> 01:09:22,091
Then it was
back to the drawing board
1669
01:09:22,135 --> 01:09:24,137
and like,
what are our new ideas?
1670
01:09:24,181 --> 01:09:26,966
Um, and then it's taken
a little while, I would say,
1671
01:09:27,009 --> 01:09:28,707
for them to get back
to where they were,
1672
01:09:28,750 --> 01:09:30,361
but with the new ideas.
1673
01:09:30,404 --> 01:09:31,536
And then now I think
1674
01:09:31,579 --> 01:09:33,973
we're seeing the benefits
of the new ideas.
1675
01:09:34,016 --> 01:09:35,757
They can go further, right?
1676
01:09:35,801 --> 01:09:38,151
So, um, that's a really
important moment.
1677
01:09:38,195 --> 01:09:40,980
I've seen that moment
so many times now,
1678
01:09:41,023 --> 01:09:42,634
but I know
what that means now.
1679
01:09:42,677 --> 01:09:44,505
And I know
this is the time now to press.
1680
01:09:44,549 --> 01:09:45,898
[EXCITING MUSIC CONTINUES]
1681
01:09:45,941 --> 01:09:48,030
JUMPER: Adding side-chains
improves direct folding.
1682
01:09:48,074 --> 01:09:49,684
That drove
a lot of the progress.
1683
01:09:49,728 --> 01:09:51,033
- We'll talk about that.
- Great.
1684
01:09:51,077 --> 01:09:54,820
The last four months,
we've made enormous gains.
1685
01:09:54,863 --> 01:09:56,474
EVANS: During CASP13,
1686
01:09:56,517 --> 01:09:59,520
it would take us a day or two
to fold one of the proteins,
1687
01:09:59,564 --> 01:10:01,783
and now we're folding, like,
1688
01:10:01,827 --> 01:10:03,959
hundreds of thousands
a second.
1689
01:10:04,003 --> 01:10:05,657
Yeah, it's just insane.
[CHUCKLES]
1690
01:10:05,700 --> 01:10:07,006
KAVUKCUOGLU: Now,
this is a model
1691
01:10:07,049 --> 01:10:09,922
that is
orders of magnitude faster,
1692
01:10:09,965 --> 01:10:12,272
while at the same time
being better.
1693
01:10:12,316 --> 01:10:13,665
We're getting
a lot of structures
1694
01:10:13,708 --> 01:10:15,275
into the high-accuracy regime.
1695
01:10:15,319 --> 01:10:17,538
We're rapidly improving
to a system
1696
01:10:17,582 --> 01:10:18,844
that is starting to really
1697
01:10:18,887 --> 01:10:20,498
get at the core and heart
of the problem.
1698
01:10:20,541 --> 01:10:21,716
HASSABIS: It's great work.
1699
01:10:21,760 --> 01:10:23,109
It looks like
we're in good shape.
1700
01:10:23,152 --> 01:10:26,243
So we got, what, six,
five weeks left? Six weeks?
1701
01:10:26,286 --> 01:10:29,637
So what's, uh... Is it...
You got enough compute power?
1702
01:10:29,681 --> 01:10:31,552
MAN: I... We could use more.
1703
01:10:31,596 --> 01:10:32,945
[ALL LAUGHING]
1704
01:10:32,988 --> 01:10:34,381
TUNYASUVUNAKOOL:
I was nervous about CASP
1705
01:10:34,425 --> 01:10:36,601
but as the system
is starting to come together,
1706
01:10:36,644 --> 01:10:37,993
I don't feel as nervous.
1707
01:10:38,037 --> 01:10:39,517
I feel like things
have, sort of,
1708
01:10:39,560 --> 01:10:41,214
come into perspective
recently,
1709
01:10:41,258 --> 01:10:44,261
and, you know,
it's gonna be fine.
1710
01:10:47,351 --> 01:10:48,874
NEWSCASTER: The Prime Minister
has announced
1711
01:10:48,917 --> 01:10:51,311
the most drastic limits
to our lives
1712
01:10:51,355 --> 01:10:53,879
the U.K. has ever seen
in living memory.
1713
01:10:53,922 --> 01:10:55,054
BORIS JOHNSON:
I must give the British people
1714
01:10:55,097 --> 01:10:56,925
a very simple instruction.
1715
01:10:56,969 --> 01:10:59,058
You must stay at home.
1716
01:10:59,101 --> 01:11:02,540
HASSABIS: It feels like we're
in a science fiction novel.
1717
01:11:02,583 --> 01:11:04,890
You know, I'm delivering food
to my parents,
1718
01:11:04,933 --> 01:11:08,241
making sure
they stay isolated and safe.
1719
01:11:08,285 --> 01:11:10,591
I think it just highlights
the incredible need
1720
01:11:10,635 --> 01:11:12,898
for AI-assisted science.
1721
01:11:17,119 --> 01:11:18,382
TUNYASUVUNAKOOL:
You always know that
1722
01:11:18,425 --> 01:11:21,036
something like this
is a possibility.
1723
01:11:21,080 --> 01:11:23,909
But nobody ever really
believes it's gonna happen
1724
01:11:23,952 --> 01:11:25,606
in their lifetime, though.
1725
01:11:25,650 --> 01:11:26,955
[COMPUTER BEEPS]
1726
01:11:26,999 --> 01:11:29,175
- JUMPER: Are you recording yet?
- RESEARCHER: Yes.
1727
01:11:29,218 --> 01:11:31,046
- Okay, morning, all.
- Hey.
1728
01:11:31,090 --> 01:11:32,700
Good. CASP has started.
1729
01:11:32,744 --> 01:11:36,095
It's nice I get to sit around
in my pajama bottoms all day.
1730
01:11:36,138 --> 01:11:37,618
TUNYASUVUNAKOOL: I never
thought I'd live in a house
1731
01:11:37,662 --> 01:11:39,185
where so much was going on.
1732
01:11:39,228 --> 01:11:41,448
I would be trying to solve
protein folding in one room,
1733
01:11:41,492 --> 01:11:42,580
and my husband would be trying
1734
01:11:42,623 --> 01:11:43,929
to make robots walk
in the other.
1735
01:11:45,409 --> 01:11:46,932
[EXHALES]
1736
01:11:46,975 --> 01:11:49,413
One of the hardest proteins
we've gotten in CASP thus far
1737
01:11:49,456 --> 01:11:51,240
is the SARS-CoV-2 protein
1738
01:11:51,284 --> 01:11:52,241
called ORF8.
1739
01:11:52,285 --> 01:11:54,940
ORF8 is
a coronavirus protein.
1740
01:11:54,983 --> 01:11:56,985
It's one of the main proteins,
um,
1741
01:11:57,029 --> 01:11:58,770
that dampens
the immune system.
1742
01:11:58,813 --> 01:12:00,075
TUNYASUVUNAKOOL:
We tried really hard
1743
01:12:00,119 --> 01:12:01,773
to improve our prediction.
1744
01:12:01,816 --> 01:12:03,514
Like, really, really hard.
1745
01:12:03,557 --> 01:12:05,603
Probably the most time
that we have ever spent
1746
01:12:05,646 --> 01:12:07,126
on a single target.
1747
01:12:07,169 --> 01:12:08,954
To the point where
my husband is, like,
1748
01:12:08,997 --> 01:12:12,218
"It's midnight.
You need to go to bed."
1749
01:12:12,261 --> 01:12:16,440
So I think we're at
Day 102 since lockdown.
1750
01:12:16,483 --> 01:12:19,965
My daughter
is keeping a journal.
1751
01:12:20,008 --> 01:12:22,141
Now you can go out
as much as you want.
1752
01:12:25,057 --> 01:12:27,233
JUMPER: We have received
the last target.
1753
01:12:27,276 --> 01:12:29,670
They've said they will be
sending out no more targets
1754
01:12:29,714 --> 01:12:31,368
in our category of CASP.
1755
01:12:32,673 --> 01:12:33,674
So we're just making sure
1756
01:12:33,718 --> 01:12:35,502
we get
the best possible answer.
1757
01:12:40,551 --> 01:12:43,336
MOULT: As soon as we started
to get the results,
1758
01:12:43,380 --> 01:12:48,254
I'd sit down and start looking
at how close did anybody come
1759
01:12:48,297 --> 01:12:50,604
to getting the protein
structures correct.
1760
01:12:54,869 --> 01:12:56,958
[ROBOT SQUEAKING]
1761
01:12:59,134 --> 01:13:00,222
[INCOMING CALL BEEPING]
1762
01:13:00,266 --> 01:13:01,572
- Oh, hi there.
- MAN: Hello.
1763
01:13:01,615 --> 01:13:03,791
[ALL CHUCKLING]
1764
01:13:03,835 --> 01:13:07,099
It is an unbelievable thing,
CASP has finally ended.
1765
01:13:07,142 --> 01:13:09,493
I think it's at least time
to raise a glass.
1766
01:13:09,536 --> 01:13:11,233
Um, I don't know
if everyone has a glass
1767
01:13:11,277 --> 01:13:12,844
of something
that they can raise.
1768
01:13:12,887 --> 01:13:14,976
If not, raise,
I don't know, your laptops.
1769
01:13:15,020 --> 01:13:17,109
- Um...
- [LAUGHTER]
1770
01:13:17,152 --> 01:13:18,632
I'll probably make a speech
in a minute.
1771
01:13:18,676 --> 01:13:20,504
I feel like I should, but I
just have no idea what to say.
1772
01:13:21,026 --> 01:13:24,290
So... let's see.
1773
01:13:24,333 --> 01:13:27,075
I feel like
a reading of email...
1774
01:13:27,119 --> 01:13:28,555
is the right thing to do.
1775
01:13:28,599 --> 01:13:29,904
[ALL CHUCKLING]
1776
01:13:29,948 --> 01:13:31,253
TUNYASUVUNAKOOL:
When John said,
1777
01:13:31,297 --> 01:13:33,212
"I'm gonna read an email,"
at a team social,
1778
01:13:33,255 --> 01:13:35,519
I thought, "Wow, John,
you know how to have fun."
1779
01:13:35,562 --> 01:13:38,391
We're gonna read an email now.
[LAUGHS]
1780
01:13:38,435 --> 01:13:41,655
Uh, I got this
about four o'clock today.
1781
01:13:42,743 --> 01:13:44,745
Um, it is from John Moult.
1782
01:13:45,746 --> 01:13:47,052
And I'll just read it.
1783
01:13:47,095 --> 01:13:49,402
It says,
"As I expect you know,
1784
01:13:49,446 --> 01:13:53,624
"your group has performed
amazingly well in CASP14,
1785
01:13:53,667 --> 01:13:55,408
"both relative to other groups
1786
01:13:55,452 --> 01:13:57,932
"and in absolute
model accuracy."
1787
01:13:57,976 --> 01:13:59,804
[PEOPLE CLAPPING]
1788
01:13:59,847 --> 01:14:01,240
"Congratulations on this work.
1789
01:14:01,283 --> 01:14:03,068
"It is really outstanding."
1790
01:14:03,111 --> 01:14:05,287
The structures were so good,
1791
01:14:05,331 --> 01:14:07,464
it was... it was just amazing.
1792
01:14:07,507 --> 01:14:09,117
[TRIUMPHANT INSTRUMENTAL
MUSIC PLAYING]
1793
01:14:09,161 --> 01:14:10,771
After half a century,
1794
01:14:10,815 --> 01:14:12,251
we finally have a solution
1795
01:14:12,294 --> 01:14:14,949
to the protein folding
problem.
1796
01:14:14,993 --> 01:14:17,430
When I saw this email,
I read it,
1797
01:14:17,474 --> 01:14:19,606
I go, "Oh, shit!"
1798
01:14:19,650 --> 01:14:21,608
And my wife goes,
"Is everything okay?"
1799
01:14:21,652 --> 01:14:24,263
I call my parents, and just,
like, "Hey, Mum.
1800
01:14:24,306 --> 01:14:26,265
"Um, got something
to tell you.
1801
01:14:26,308 --> 01:14:27,571
"We've done this thing
1802
01:14:27,614 --> 01:14:29,834
"and it might be kind of
a big deal." [LAUGHS]
1803
01:14:29,877 --> 01:14:31,662
When I learned of
the CASP14 results,
1804
01:14:32,663 --> 01:14:34,055
I was gobsmacked.
1805
01:14:34,099 --> 01:14:35,840
I was just excited.
1806
01:14:35,883 --> 01:14:38,930
This is a problem
that I was beginning to think
1807
01:14:38,973 --> 01:14:42,107
would not get solved
in my lifetime.
1808
01:14:42,150 --> 01:14:44,762
NURSE: Now we have a tool
that can be used
1809
01:14:44,805 --> 01:14:46,633
practically by scientists.
1810
01:14:46,677 --> 01:14:48,461
SENIOR: These people
are asking us, you know,
1811
01:14:48,505 --> 01:14:50,245
"I've got this protein
involved in malaria,"
1812
01:14:50,289 --> 01:14:52,160
or, you know,
some infectious disease.
1813
01:14:52,204 --> 01:14:53,248
"We don't know the structure.
1814
01:14:53,292 --> 01:14:55,207
"Can we use AlphaFold
to solve it?"
1815
01:14:55,250 --> 01:14:56,991
JUMPER: We can easily predict
all known sequences
1816
01:14:57,035 --> 01:14:58,297
in a month.
1817
01:14:58,340 --> 01:14:59,994
All known sequences
in a month?
1818
01:15:00,038 --> 01:15:01,300
- Yeah, easily.
- Mmm-hmm?
1819
01:15:01,343 --> 01:15:02,606
JUMPER:
A billion, two billion.
1820
01:15:02,649 --> 01:15:03,694
Um, and they're...
1821
01:15:03,737 --> 01:15:05,217
So why don't we just do that?
Yeah.
1822
01:15:05,260 --> 01:15:07,132
- We should just do that a lot.
- Well, I mean...
1823
01:15:07,175 --> 01:15:09,264
That's way better.
Why don't we just do that?
1824
01:15:09,308 --> 01:15:11,136
SENIOR: So that's
one of the options.
1825
01:15:11,179 --> 01:15:12,659
- HASSABIS: Right.
- There's this...
1826
01:15:12,703 --> 01:15:15,140
We should just...
Right, that's a great idea.
1827
01:15:15,183 --> 01:15:17,534
We should just run
every protein in existence.
1828
01:15:18,317 --> 01:15:19,492
And then release that.
1829
01:15:19,536 --> 01:15:21,015
Why didn't someone
suggest this before?
1830
01:15:21,059 --> 01:15:22,147
Of course that's
what we should do.
1831
01:15:22,190 --> 01:15:23,975
Why are we thinking about
making a service
1832
01:15:24,018 --> 01:15:25,672
and then people submit
their protein?
1833
01:15:25,716 --> 01:15:26,934
We just fold everything.
1834
01:15:26,978 --> 01:15:28,675
And then give it to
everyone in the world.
1835
01:15:28,719 --> 01:15:31,504
Who knows how many discoveries
will be made from that?
1836
01:15:31,548 --> 01:15:33,811
BIRNEY: Demis called us up
and said,
1837
01:15:33,854 --> 01:15:35,639
"We want to make this open.
1838
01:15:35,682 --> 01:15:37,858
"Not just make sure
the code is open,
1839
01:15:37,902 --> 01:15:39,599
"but we're gonna make it
really easy
1840
01:15:39,643 --> 01:15:42,689
"for everybody to get access
to the predictions."
1841
01:15:45,083 --> 01:15:47,259
THORNTON: That is fantastic.
1842
01:15:47,302 --> 01:15:49,348
It's like drawing back
the curtain
1843
01:15:49,391 --> 01:15:52,873
and seeing the whole world
of protein structures.
1844
01:15:52,917 --> 01:15:55,223
[ETHEREAL MUSIC PLAYING]
1845
01:15:55,267 --> 01:15:57,008
SCHMIDT:
They released the structures
1846
01:15:57,051 --> 01:15:59,793
of 200 million proteins.
1847
01:15:59,837 --> 01:16:01,839
These are gifts to humanity.
1848
01:16:07,671 --> 01:16:10,935
JUMPER: The moment AlphaFold
is live to the world,
1849
01:16:10,978 --> 01:16:13,894
we will no longer be
the most important people
1850
01:16:13,938 --> 01:16:15,243
in AlphaFold's story.
1851
01:16:15,287 --> 01:16:16,854
HASSABIS: Can't quite believe
it's all out.
1852
01:16:16,897 --> 01:16:18,377
PEOPLE: Aw!
1853
01:16:18,420 --> 01:16:20,335
WOMAN: A hundred
and sixty-four users.
1854
01:16:20,379 --> 01:16:22,599
HASSABIS:
Loads of activity in Japan.
1855
01:16:22,642 --> 01:16:24,949
RESEARCHER 1:
We have 655 users currently.
1856
01:16:24,992 --> 01:16:26,951
RESEARCHER 2: We currently
have 100,000 concurrent users.
1857
01:16:26,994 --> 01:16:28,213
Wow!
1858
01:16:31,129 --> 01:16:33,914
Today is just crazy.
1859
01:16:33,958 --> 01:16:36,525
HASSABIS: What an absolutely
unbelievable effort
1860
01:16:36,569 --> 01:16:37,744
from everyone.
1861
01:16:37,788 --> 01:16:38,571
We're gonna all remember
these moments
1862
01:16:38,615 --> 01:16:40,051
for the rest of our lives.
1863
01:16:40,094 --> 01:16:41,748
I'm excited about AlphaFold.
1864
01:16:41,792 --> 01:16:45,622
For my research, it's already
propelling lots of progress.
1865
01:16:45,665 --> 01:16:47,406
And this is
just the beginning.
1866
01:16:47,449 --> 01:16:48,929
SCHMIDT: My guess is,
1867
01:16:48,973 --> 01:16:53,064
every single biological
and chemical achievement
1868
01:16:53,107 --> 01:16:55,719
will be related to AlphaFold
in some way.
1869
01:16:55,762 --> 01:16:57,895
[TRIUMPHANT INSTRUMENTAL
MUSIC PLAYING]
1870
01:17:13,388 --> 01:17:15,434
AlphaFold is an index moment.
1871
01:17:15,477 --> 01:17:18,089
It's a moment
that people will not forget
1872
01:17:18,132 --> 01:17:20,265
because the world changed.
1873
01:17:39,676 --> 01:17:41,503
HASSABIS:
Everybody's realized now
1874
01:17:41,547 --> 01:17:43,767
what Shane and I have known
for more than 20 years,
1875
01:17:43,810 --> 01:17:46,639
that AI is going to be
the most important thing
1876
01:17:46,683 --> 01:17:48,467
humanity's ever gonna invent.
1877
01:17:48,510 --> 01:17:50,251
TRAIN ANNOUNCER:
We will shortly be arriving
1878
01:17:50,295 --> 01:17:52,079
at our final destination.
1879
01:17:52,123 --> 01:17:53,602
[ELECTRONIC MUSIC PLAYING]
1880
01:18:02,089 --> 01:18:04,788
HASSABIS: The pace of
innovation and capabilities
1881
01:18:04,831 --> 01:18:06,528
is accelerating,
1882
01:18:06,572 --> 01:18:09,314
like a boulder rolling down
a hill that we've kicked off
1883
01:18:09,357 --> 01:18:12,665
and now it's continuing
to gather speed.
1884
01:18:12,709 --> 01:18:15,320
NEWSCASTER: We are at
a crossroads in human history.
1885
01:18:15,363 --> 01:18:16,756
AI has the potential
1886
01:18:16,800 --> 01:18:19,193
to transform our lives
in every aspect.
1887
01:18:19,237 --> 01:18:23,807
It's no less important than
the discovery of electricity.
1888
01:18:23,850 --> 01:18:26,505
HASSABIS: We should be looking
at the scientific method
1889
01:18:26,548 --> 01:18:28,855
and trying to understand
each step of the way
1890
01:18:28,899 --> 01:18:30,117
in a rigorous way.
1891
01:18:30,161 --> 01:18:32,685
This is a moment
of profound opportunity.
1892
01:18:32,729 --> 01:18:34,774
SUNAK:
Harnessing this technology
1893
01:18:34,818 --> 01:18:37,734
could eclipse anything
we have ever known.
1894
01:18:40,432 --> 01:18:42,173
[ELECTRONIC DEVICE BEEPS]
1895
01:18:42,216 --> 01:18:43,696
HASSABIS: Hi, Alpha.
1896
01:18:44,697 --> 01:18:45,785
ALPHA: Hi.
1897
01:18:47,178 --> 01:18:48,440
What is this?
1898
01:18:50,703 --> 01:18:53,750
ALPHA: This is a chessboard.
1899
01:18:53,793 --> 01:18:56,535
If I was to play white, what
move would you recommend?
1900
01:18:59,886 --> 01:19:00,974
ALPHA: I would recommend
1901
01:19:01,018 --> 01:19:02,802
that you move your pawn
from E2 to E4.
1902
01:19:05,892 --> 01:19:08,808
And now if you were black,
what would you play now?
1903
01:19:11,593 --> 01:19:13,639
ALPHA: I would play
the Sicilian Defense.
1904
01:19:15,859 --> 01:19:16,903
That's a good choice.
1905
01:19:19,427 --> 01:19:21,473
- ALPHA: Thanks.
- [CHUCKLES]
1906
01:19:23,736 --> 01:19:25,912
So what do you see?
What is this object?
1907
01:19:28,567 --> 01:19:30,525
ALPHA:
This is a pencil sculpture.
1908
01:19:32,832 --> 01:19:35,052
What happens if I move
one of the pencils?
1909
01:19:38,011 --> 01:19:39,491
ALPHA: If you move
one of the pencils,
1910
01:19:39,534 --> 01:19:42,102
the sculpture will fall apart.
1911
01:19:42,146 --> 01:19:44,322
I'd better leave it alone,
then.
1912
01:19:44,365 --> 01:19:45,889
ALPHA: That's probably
a good idea.
1913
01:19:45,932 --> 01:19:47,412
[HASSABIS CHUCKLES]
1914
01:19:50,589 --> 01:19:52,765
HASSABIS:
AGI is on the horizon now.
1915
01:19:54,854 --> 01:19:56,682
Very clearly
the next generation
1916
01:19:56,725 --> 01:19:58,162
is going to live
in a future world
1917
01:19:58,205 --> 01:20:01,078
where things will be radically
different because of AI.
1918
01:20:02,514 --> 01:20:05,517
And if you want to steward
that responsibly,
1919
01:20:05,560 --> 01:20:09,260
every moment is vital.
1920
01:20:09,303 --> 01:20:12,698
This is the moment I've been
living my whole life for.
1921
01:20:19,183 --> 01:20:21,141
It's just
a good thinking game.
1922
01:20:22,490 --> 01:20:24,623
[UPLIFTING INSTRUMENTAL
MUSIC PLAYING]