1
00:00:01,000 --> 00:00:05,100
If you want to understand the weird
and wonderful world around you,
2
00:00:05,100 --> 00:00:07,660
then maths is your friend.
3
00:00:07,660 --> 00:00:10,980
But the real trick is that, with
a little bit of mathematical
4
00:00:10,980 --> 00:00:15,980
thinking, you can take those rules
and patterns, bend and twist them,
5
00:00:15,980 --> 00:00:18,500
and fire them straight back at you.
6
00:00:20,700 --> 00:00:25,500
Tonight, we are going to explore
just how far we can push those
7
00:00:25,500 --> 00:00:28,180
barriers to achieve amazing things.
8
00:00:51,660 --> 00:00:55,980
CHEERING AND APPLAUSE
9
00:01:03,380 --> 00:01:06,460
Welcome to the Christmas Lectures.
I am Dr Hannah Fry
10
00:01:06,460 --> 00:01:08,980
and tonight we are going
to be talking about how to give
11
00:01:08,980 --> 00:01:11,380
ourselves superhuman skills.
12
00:01:11,380 --> 00:01:14,580
Now, we're very good at using maths
to understand the world around us.
13
00:01:14,580 --> 00:01:17,580
But if we want to bend the world
to our will, we're going
14
00:01:17,580 --> 00:01:19,380
to need to be a little bit
more inventive.
15
00:01:19,380 --> 00:01:21,780
We're going to need to take
this maths up a level here.
16
00:01:21,780 --> 00:01:25,340
So I thought what we'd do is we
would start off tonight with
17
00:01:25,340 --> 00:01:26,860
something of a competition.
18
00:01:26,860 --> 00:01:31,140
So I wonder, does anyone know
how to solve one of these?
19
00:01:31,140 --> 00:01:33,540
Who doesn't? Oh, actually,
quite a few of you.
20
00:01:33,540 --> 00:01:35,180
OK. Who doesn't mind coming down?
21
00:01:35,180 --> 00:01:36,820
Let's go with you, actually, yeah.
22
00:01:36,820 --> 00:01:39,180
If you want to come down.
Round of applause, if you can,
23
00:01:39,180 --> 00:01:41,220
as he comes to the stage.
24
00:01:42,740 --> 00:01:44,380
What's your name? George. George.
25
00:01:44,380 --> 00:01:46,220
OK. How good are you
at these, George?
26
00:01:46,220 --> 00:01:48,980
Solved it a few times. OK. All
right. Let's give you a go.
27
00:01:48,980 --> 00:01:51,100
We're going to give you lots of
encouragement as you go.
28
00:01:51,100 --> 00:01:53,660
Can I look at it first, like...?
Yeah, you can have a little look,
29
00:01:53,660 --> 00:01:56,060
yeah. Cheers. We'll be patient
with you, though. OK.
30
00:01:56,060 --> 00:01:58,900
All right. Yeah.
I reckon I'm ready to go.
31
00:01:58,900 --> 00:02:01,460
OK. Come on, then, George.
Come on, George!
32
00:02:01,460 --> 00:02:03,940
Hang on, George. Hang on, George.
33
00:02:03,940 --> 00:02:05,460
AUDIENCE MURMURS
34
00:02:07,260 --> 00:02:08,580
George?
35
00:02:08,580 --> 00:02:09,820
AUDIENCE: Whoo!
36
00:02:09,820 --> 00:02:11,500
AUDIENCE WHOOPS
37
00:02:15,460 --> 00:02:17,780
George, you've solved these more
than a few times, haven't you?
38
00:02:17,780 --> 00:02:20,900
Yeah, quite a few, actually. Tell
us who you actually are, George.
39
00:02:20,900 --> 00:02:23,900
I'm actually the UK champion
for solving a Rubik's Cube.
40
00:02:23,900 --> 00:02:25,660
Yeah. You're the best in Britain.
41
00:02:25,660 --> 00:02:27,820
Yeah. Currently reigning UK
champion. Yeah.
42
00:02:27,820 --> 00:02:30,180
So tell us, OK, how do you
actually solve one of these?
43
00:02:30,180 --> 00:02:33,300
OK, so I guess the first thing to
know is that, like, it's got quite
44
00:02:33,300 --> 00:02:35,860
a lot of combinations. So it's not
like you just like learn one thing
45
00:02:35,860 --> 00:02:37,580
and then you can solve
it straight away.
46
00:02:37,580 --> 00:02:40,300
So, like, when you scramble it,
chances are that scramble's
47
00:02:40,300 --> 00:02:42,340
actually never, ever been seen
before. It just has so many
48
00:02:42,340 --> 00:02:44,380
combinations - it has 43 quintillion
49
00:02:44,380 --> 00:02:47,060
combinations... That's a big number.
43 followed by 18 zeros.
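George's "big number" is the standard count of reachable positions of a 3x3x3 cube:

    8! x 3^7 x 12! x 2^11 / 2 = 43,252,003,274,489,856,000

which is roughly 4.3 x 10^19 - about 43 quintillion, or 43 followed by 18 zeros.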
50
00:02:47,060 --> 00:02:51,980
So, you kind of need to split it up
into steps and you need to learn
51
00:02:51,980 --> 00:02:54,780
algorithms, like this whole
actual... A list of instructions.
52
00:02:54,780 --> 00:02:55,980
Yes.
53
00:02:55,980 --> 00:02:58,700
So, algorithms, when applied
to a cube, that kind of means
54
00:02:58,700 --> 00:03:01,780
you're temporarily mixing
up the cube, then putting
55
00:03:01,780 --> 00:03:04,100
it back, having switched
a few pieces.
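A concrete example of the kind of sequence George means, in standard face-turn notation (an illustration, not one he names here): the four moves R U R' U' disturb only a handful of pieces around the top-right of the cube, and repeating the sequence six times brings every piece back to where it started. Speedcubers chain short sequences like this to fix a few pieces at a time while leaving the rest of the cube alone.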
56
00:03:04,100 --> 00:03:07,300
So the more algorithms you know,
the more efficient you'll be.
57
00:03:07,300 --> 00:03:09,980
If I combine that efficiency
with faster turning, it allows me
58
00:03:09,980 --> 00:03:11,340
to be a faster solver.
59
00:03:11,340 --> 00:03:13,700
Now, there's a very key point
here in what George is saying,
60
00:03:13,700 --> 00:03:16,660
which is that sometimes maths
doesn't look like numbers and
61
00:03:16,660 --> 00:03:19,580
equations, sometimes maths can just
look like a list of instructions.
62
00:03:19,580 --> 00:03:22,300
I mean, you can solve this by
following a list of instructions.
63
00:03:22,300 --> 00:03:23,940
An algorithm, as you say.
64
00:03:23,940 --> 00:03:27,540
And that means, George, that if I
want to beat you at this, all I've
65
00:03:27,540 --> 00:03:30,140
got to do is follow those
instructions faster than you,
66
00:03:30,140 --> 00:03:33,780
right? Yeah. Well, thankfully I
have just the thing that's going to
67
00:03:33,780 --> 00:03:39,100
help me, and Illius here
is the creator of this machine.
68
00:03:39,100 --> 00:03:42,020
So I tell you what, hang on, let me
just scramble this one up again.
69
00:03:42,020 --> 00:03:45,020
We're going to have a bit of a
competition. We're going to do you
70
00:03:45,020 --> 00:03:46,740
against the machine.
OK, so here we go.
71
00:03:46,740 --> 00:03:47,940
Let's scramble this one.
72
00:03:47,940 --> 00:03:49,780
You're not allowed to look at this
now. All right.
73
00:03:49,780 --> 00:03:52,020
So I'll tell you when you're
allowed to look.
74
00:03:52,020 --> 00:03:53,580
Is it ready to go, Illius?
75
00:03:53,580 --> 00:03:55,940
OK. All right. We're going to count
it down. And then you and the
76
00:03:55,940 --> 00:03:58,340
machine can both look. Ready?
Three, two, one, go.
77
00:03:59,260 --> 00:04:02,140
Who's going to win?
Who's going to win?
78
00:04:02,140 --> 00:04:04,060
Ohhh!
79
00:04:04,060 --> 00:04:07,820
George! You're letting yourself
down! I know!
80
00:04:07,820 --> 00:04:10,220
That is absolutely incredible.
81
00:04:10,220 --> 00:04:12,340
APPLAUSE AND CHEERING
82
00:04:15,660 --> 00:04:18,380
I mean, that was amazing.
That was amazing. OK.
83
00:04:18,380 --> 00:04:20,980
Let's go over here and have a little
look back at what we're seeing.
84
00:04:20,980 --> 00:04:23,660
I mean, this machine is doing the
same things as you. Exactly, yeah.
85
00:04:23,660 --> 00:04:27,300
It's just, it's, like, seeing when
to use those algorithms way more
86
00:04:27,300 --> 00:04:29,380
quickly and it does it,
like, all in one go,
87
00:04:29,380 --> 00:04:31,940
so it doesn't have to pause
to look at when to use the next
88
00:04:31,940 --> 00:04:34,540
algorithm or anything like that.
So we can look at a little slow-mo
89
00:04:34,540 --> 00:04:36,980
replay, I think, of this.
90
00:04:36,980 --> 00:04:39,620
And it is a properly mixed-up
cube, there.
91
00:04:39,620 --> 00:04:40,900
Wow! Amazing.
92
00:04:40,900 --> 00:04:42,780
That was absolutely incredible.
93
00:04:42,780 --> 00:04:45,940
So it's not better than you.
It's just faster.
94
00:04:45,940 --> 00:04:48,980
I suppose so. Yeah. It'll know
more algorithms than I do.
95
00:04:48,980 --> 00:04:51,620
Yeah. Amazing. George
and the cube-solving machine,
96
00:04:51,620 --> 00:04:54,740
thank you very much. Thank you.
97
00:04:54,740 --> 00:04:58,220
Thank you. That was amazing.
So impressed.
98
00:04:58,220 --> 00:05:00,700
That really, I think,
was a great example
99
00:05:00,700 --> 00:05:02,940
of a machine following instructions.
100
00:05:02,940 --> 00:05:06,180
The question is, how do you get
those instructions into the machine
101
00:05:06,180 --> 00:05:07,260
in the first place?
102
00:05:07,260 --> 00:05:11,740
Well, one man who knows the answer
is coder extraordinaire
103
00:05:11,740 --> 00:05:13,740
Seb Lee-Delisle.
104
00:05:15,180 --> 00:05:16,940
Hey, Seb!
105
00:05:18,740 --> 00:05:20,940
So, Seb, tell us about the
kind of things that you code.
106
00:05:20,940 --> 00:05:24,060
Well, I like to make really big
light installations
107
00:05:24,060 --> 00:05:25,700
that are interactive. Oh, OK.
108
00:05:25,700 --> 00:05:28,540
And we're seeing some footage
of this, here. On the side of
109
00:05:28,540 --> 00:05:29,940
buildings, no less. Yeah. Yeah.
110
00:05:29,940 --> 00:05:33,660
See, my ambition really is
to replace fireworks. With lasers.
111
00:05:33,660 --> 00:05:36,460
With lasers. Yeah, cos, like,
fireworks are old technology.
112
00:05:36,460 --> 00:05:40,420
Uh-huh. Now we've got technology
that's just as bright as fireworks,
113
00:05:40,420 --> 00:05:43,020
lasers, super-bright LEDs.
114
00:05:43,020 --> 00:05:45,660
But all of my technology
is controlled by computers.
115
00:05:45,660 --> 00:05:48,060
OK, so talking about computers,
then, how do
116
00:05:48,060 --> 00:05:49,300
you control these things?
117
00:05:49,300 --> 00:05:52,380
Well, I'm going to show you
how we make something called
118
00:05:52,380 --> 00:05:53,700
a particle system.
119
00:05:53,700 --> 00:05:55,820
So I've got a browser open.
120
00:05:55,820 --> 00:05:58,580
I'm going to do some coding
in JavaScript to make some graphics.
121
00:05:58,580 --> 00:06:01,140
You're going to make an algorithm.
An algorithm, yeah. A series of
122
00:06:01,140 --> 00:06:02,940
instructions for the machine.
Exactly right.
123
00:06:02,940 --> 00:06:05,180
So the first thing I'm going to do
is make a single particle,
124
00:06:05,180 --> 00:06:07,860
which is essentially just a shape.
And we've got to decide
125
00:06:07,860 --> 00:06:11,140
what colour that shape should be.
Any ideas what colour?
126
00:06:11,140 --> 00:06:13,340
AUDIENCE SHOUTS SUGGESTIONS
127
00:06:15,180 --> 00:06:17,980
I think first thing I heard
was "pink".
128
00:06:17,980 --> 00:06:20,940
So I'm going to actually
use magenta,
129
00:06:20,940 --> 00:06:23,180
which is my favourite type of pink,
130
00:06:23,180 --> 00:06:25,660
and it's a very nice bright pink.
131
00:06:25,660 --> 00:06:29,460
And the next thing I've got to do
is draw a filled-in shape, so
132
00:06:29,460 --> 00:06:30,540
let's hear some shapes.
133
00:06:30,540 --> 00:06:31,860
AUDIENCE SHOUTS SUGGESTIONS
134
00:06:31,860 --> 00:06:34,420
Circle. OK, OK!
135
00:06:34,420 --> 00:06:36,900
Square. I think I heard "square".
136
00:06:36,900 --> 00:06:39,980
Definitely. OK. Square.
So a square is just a rectangle.
137
00:06:39,980 --> 00:06:42,220
So let's draw a filled-in rectangle.
138
00:06:42,220 --> 00:06:45,860
We're going to draw it at 0,0,
which is the top left of the screen
139
00:06:45,860 --> 00:06:50,140
in X and Y coordinates, and we're
going to make it 100 wide and
140
00:06:50,140 --> 00:06:52,540
100 high, so let's look at our
square. There it is.
141
00:06:52,540 --> 00:06:54,780
So this X and Y that you were
talking about there... Yeah.
142
00:06:54,780 --> 00:06:56,620
..are you treating
this as though it's a graph?
143
00:06:56,620 --> 00:06:59,220
Yeah. So this huge
canvas in JavaScript,
144
00:06:59,220 --> 00:07:02,940
it's got pixels and I can identify
each pixel by its X and Y
145
00:07:02,940 --> 00:07:06,020
co-ordinate. OK. Yeah. This is
just straight-up maths so far.
146
00:07:06,020 --> 00:07:09,340
It's all maths. It's maths all the
way down, and I can change this X
147
00:07:09,340 --> 00:07:13,980
value and you can see that
the rectangle moves to whatever
148
00:07:13,980 --> 00:07:17,620
co-ordinate I specify for the X
position. As you increase X,
149
00:07:17,620 --> 00:07:20,140
this is moving right. Yeah. Exactly.
150
00:07:20,140 --> 00:07:22,140
So let's do the same thing
for the Y value.
151
00:07:22,140 --> 00:07:25,180
You can see it moves down
and to the right. Ahh!
152
00:07:25,180 --> 00:07:27,300
Right. So this is good.
It's the start of our particle.
153
00:07:27,300 --> 00:07:28,380
But it's not moving.
154
00:07:28,380 --> 00:07:29,940
It's completely static.
155
00:07:29,940 --> 00:07:33,300
So in order to get it moving, I'm
going to wrap all of this stuff up
156
00:07:33,300 --> 00:07:37,380
in a function, and I'm going
to call this function 60
157
00:07:37,380 --> 00:07:38,820
times a second.
158
00:07:38,820 --> 00:07:43,380
But now I can do X+=1,
which adds one to X every time.
159
00:07:43,380 --> 00:07:46,820
And now, we should have the
rectangle in a brand-new place
160
00:07:46,820 --> 00:07:48,660
every time. We've made some
animation. Aha!
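A minimal sketch of the steps Seb has described so far, assuming an ordinary <canvas> element in the page (the element lookup and the exact numbers are illustrative, not his actual code):

    // Get a 2D drawing context from a <canvas> element on the page.
    const canvas = document.querySelector('canvas');
    const ctx = canvas.getContext('2d');

    let x = 0;   // X position of the square, in pixels from the left
    let y = 0;   // Y position, in pixels from the top

    function draw() {
      ctx.clearRect(0, 0, canvas.width, canvas.height); // wipe the previous frame
      ctx.fillStyle = 'magenta';                        // Seb's favourite pink
      ctx.fillRect(x, y, 100, 100);                     // filled square at (x, y)
      x += 1;                                           // nudge it right each frame
    }

    setInterval(draw, 1000 / 60);                       // call draw 60 times a second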
161
00:07:48,660 --> 00:07:50,740
Doesn't look much like a firework.
No.
162
00:07:50,740 --> 00:07:52,820
So we're getting there.
163
00:07:52,820 --> 00:07:55,900
So, in particle systems, usually
the particles are very small.
164
00:07:55,900 --> 00:07:57,700
That's the first thing we can fix.
165
00:07:57,700 --> 00:07:59,700
Let's just make it really little.
Aww!
166
00:07:59,700 --> 00:08:05,220
And now, let's make some new
variables for the X velocity
167
00:08:05,220 --> 00:08:07,460
and for the Y velocity.
168
00:08:07,460 --> 00:08:11,900
The best part is when I give those
velocity values a random
169
00:08:11,900 --> 00:08:16,140
number. Oh, OK. So you can do that
with the command "random".
170
00:08:16,140 --> 00:08:20,140
If I pass in minus ten to plus ten,
then we're going to get a random
171
00:08:20,140 --> 00:08:23,100
number in the X velocity
between those two numbers
172
00:08:23,100 --> 00:08:25,620
and the same with the Y velocity.
173
00:08:25,620 --> 00:08:30,180
And you'll see now that every time
we run it...
174
00:08:30,180 --> 00:08:32,220
Goes in a different direction!
..it goes in a completely
175
00:08:32,220 --> 00:08:34,220
different direction. A random
direction every time.
176
00:08:34,220 --> 00:08:36,340
Right. Exactly.
So that's kind of fun.
177
00:08:36,340 --> 00:08:38,020
So we've got one particle.
178
00:08:38,020 --> 00:08:39,500
One particle is cute.
179
00:08:39,500 --> 00:08:42,500
But really, the idea of particle
systems is that we make a whole
180
00:08:42,500 --> 00:08:44,020
bunch of particles.
181
00:08:44,020 --> 00:08:45,860
So now, in this next bit of code,
182
00:08:45,860 --> 00:08:48,180
I'm doing essentially
the same thing,
183
00:08:48,180 --> 00:08:50,260
but I'm creating an array
of particles.
184
00:08:50,260 --> 00:08:53,380
And we get an effect like this. Ahh!
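Putting the later steps together - small particles, random velocities and a whole array of them - a sketch along the lines Seb describes might look like this (again illustrative, including the particle count and the random() helper):

    const canvas = document.querySelector('canvas');
    const ctx = canvas.getContext('2d');

    // random(min, max): a value somewhere between the two limits, as Seb describes.
    function random(min, max) {
      return min + Math.random() * (max - min);
    }

    // An array of particles, each with its own position and velocity.
    const particles = [];
    for (let i = 0; i < 200; i++) {
      particles.push({
        x: canvas.width / 2,      // start in the middle, like a sparkler
        y: canvas.height / 2,
        vx: random(-10, 10),      // X velocity
        vy: random(-10, 10),      // Y velocity
      });
    }

    function update() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.fillStyle = 'magenta';
      for (const p of particles) {
        p.x += p.vx;                    // move each particle by its velocity
        p.y += p.vy;
        p.vy += 0.2;                    // a touch of gravity, which Seb adds later
        ctx.fillRect(p.x, p.y, 4, 4);   // really little squares
      }
    }

    setInterval(update, 1000 / 60);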
185
00:08:53,380 --> 00:08:55,980
That looks like a sparkler!
We're getting there. Aren't we?
186
00:08:55,980 --> 00:08:57,220
We're getting to the sparkler.
187
00:08:57,220 --> 00:09:00,540
Now, this is an ordinary projector,
so it's not very bright.
188
00:09:00,540 --> 00:09:03,060
If we want to create something
that's truly spectacular,
189
00:09:03,060 --> 00:09:06,140
like a real firework,
we need to use a laser.
190
00:09:06,140 --> 00:09:08,260
Ah. Wow! They're extremely bright.
Right.
191
00:09:08,260 --> 00:09:10,020
They're really, really bright,
aren't they?
192
00:09:10,020 --> 00:09:12,340
And so I can adjust all
the settings here.
193
00:09:12,340 --> 00:09:14,940
I can change the speed,
how fast they are,
194
00:09:14,940 --> 00:09:18,060
make them fade out over
time a little bit.
195
00:09:18,060 --> 00:09:23,620
And, yeah, I can add some gravity
to make them fall down. Amazing!
196
00:09:23,620 --> 00:09:25,860
But, of course, the stuff
that you are really well known
197
00:09:25,860 --> 00:09:29,140
for, Seb, is not
just you having control
198
00:09:29,140 --> 00:09:31,940
of where this is on the Y axis,
where it is on the graph...
199
00:09:31,940 --> 00:09:34,900
Yeah. ..but letting the audience
control it. Exactly. I love doing
200
00:09:34,900 --> 00:09:38,180
interactive things, so what I
thought I'd do is actually take
201
00:09:38,180 --> 00:09:42,980
the sound from the microphone
in my laptop and apply that level
202
00:09:42,980 --> 00:09:46,940
to the Y origin of the particles
and then we could make a game
203
00:09:46,940 --> 00:09:48,980
out of it.
So here we've got this.
204
00:09:48,980 --> 00:09:50,820
You can see the sparkler
on the left.
205
00:09:50,820 --> 00:09:53,580
Now it should be that the more
noise that you make, the higher
206
00:09:53,580 --> 00:09:55,580
this sparkler rises. All right.
207
00:09:55,580 --> 00:09:59,100
Here we go. Three, two, one, go.
208
00:09:59,100 --> 00:10:02,260
AUDIENCE CHEERS RAUCOUSLY
209
00:10:06,180 --> 00:10:09,620
NOISE DROWNS SPEECH
210
00:10:15,620 --> 00:10:17,420
This is it, guys!
211
00:10:20,740 --> 00:10:23,580
It's getting harder.
It's getting harder.
212
00:10:25,780 --> 00:10:27,980
NOISE DROWNS SEB'S SPEECH
213
00:10:29,900 --> 00:10:33,300
Ohh! That was amazing!
That was such a high score.
214
00:10:33,300 --> 00:10:37,580
Well done, everyone.
Amazing, amazing. Seb, thank you
215
00:10:37,580 --> 00:10:40,940
for showing us all your lasers.
Very, very welcome. Thank you.
216
00:10:40,940 --> 00:10:42,300
Thank you very much. Thank you.
217
00:10:42,300 --> 00:10:44,140
CHEERING AND APPLAUSE
218
00:10:45,620 --> 00:10:48,180
Now, Seb's fireworks
look very realistic, there,
219
00:10:48,180 --> 00:10:51,380
but I think if you want something
that looks as real as possible,
220
00:10:51,380 --> 00:10:54,820
then you really can't do better
than looking at Hollywood films -
221
00:10:54,820 --> 00:10:57,340
the visual effects that
they use in their movies.
222
00:10:57,340 --> 00:10:59,500
Now, these are the people
who create those -
223
00:10:59,500 --> 00:11:02,620
they don't
only copy reality, but they really
224
00:11:02,620 --> 00:11:05,060
know how to bend the rules
to their will
225
00:11:05,060 --> 00:11:06,860
for our entertainment.
226
00:11:06,860 --> 00:11:10,620
And we are very lucky, because
today, we are joined by someone who
227
00:11:10,620 --> 00:11:14,460
does exactly that.
From Weta Digital in New Zealand,
228
00:11:14,460 --> 00:11:17,700
please join me
in welcoming Anders Langlands.
229
00:11:17,700 --> 00:11:21,060
CHEERING AND APPLAUSE
230
00:11:21,060 --> 00:11:22,820
Hi. Hey, Anders!
231
00:11:24,300 --> 00:11:26,220
So what films have you worked on,
Anders?
232
00:11:26,220 --> 00:11:28,660
Well, Weta's got quite a long
history, going all the way back
233
00:11:28,660 --> 00:11:32,180
to The Lord Of The Rings, Avatar,
Planet Of The Apes and then, most
234
00:11:32,180 --> 00:11:34,540
recently, Avengers this year.
Oh, crikey.
235
00:11:34,540 --> 00:11:36,460
I mean, you don't get much bigger
than Avengers. No.
236
00:11:36,460 --> 00:11:39,380
How much of the animations that we
see - the CGI that we see in films -
237
00:11:39,380 --> 00:11:41,300
how much of it is maths?
238
00:11:41,300 --> 00:11:42,620
Well, it's all maths, really.
239
00:11:42,620 --> 00:11:44,780
It's like a combination
between art and science.
240
00:11:44,780 --> 00:11:49,580
And we use algorithms to simulate
how things move, to give animators
241
00:11:49,580 --> 00:11:52,980
controls, to make things move
according to their desires,
242
00:11:52,980 --> 00:11:56,140
and to render the pictures and make
the images through mathematical
243
00:11:56,140 --> 00:11:57,340
equations, as well.
244
00:11:57,340 --> 00:12:00,820
But presumably, if you want to make
things look realistic on the screen,
245
00:12:00,820 --> 00:12:03,300
you have to understand how
they work in real life.
246
00:12:03,300 --> 00:12:07,220
And that really is what this tower
behind us, that has just been built,
247
00:12:07,220 --> 00:12:08,820
is here for.
248
00:12:08,820 --> 00:12:12,580
Because I would like a volunteer
who would like to come
249
00:12:12,580 --> 00:12:17,500
down and help me to dismantle
this tower.
250
00:12:17,500 --> 00:12:19,780
Let's go for, yeah,
do you want to come down?
251
00:12:19,780 --> 00:12:21,740
Round of applause as
they come to the stage!
252
00:12:21,740 --> 00:12:24,820
CHEERING
253
00:12:27,220 --> 00:12:28,260
What's your name?
254
00:12:28,260 --> 00:12:29,580
APPLAUSE DROWNS RESPONSE
255
00:12:29,580 --> 00:12:32,300
All right, so we have got
this bowling ball.
256
00:12:32,300 --> 00:12:34,060
Now, I want you to roll it
very gently.
257
00:12:34,060 --> 00:12:37,580
This is a very old and important
building and we will send you a
258
00:12:37,580 --> 00:12:40,180
bill. So, very gently, try
and smash down this tower.
259
00:12:40,180 --> 00:12:42,780
Can I stand somewhere else? You
can stand a bit further back here
260
00:12:42,780 --> 00:12:45,940
if you want to, give you a bit
of a run up, but let's count it
261
00:12:45,940 --> 00:12:50,020
down. Ready? OK. Three...
ALL: Two, one.
262
00:12:50,020 --> 00:12:51,580
Go!
263
00:12:51,580 --> 00:12:53,260
Oh!
264
00:12:53,260 --> 00:12:54,980
It's pretty good!
265
00:12:54,980 --> 00:12:58,540
Yay! Ha-ha. Well done!
266
00:12:58,540 --> 00:13:00,380
CHEERING
267
00:13:01,740 --> 00:13:05,460
But stuff like that, it isn't just
for fun for you guys, right?
268
00:13:05,460 --> 00:13:08,820
No, I mean, we have to do things
a little bit more complex than that,
269
00:13:08,820 --> 00:13:12,340
but the basic equations of motion
that govern something like that,
270
00:13:12,340 --> 00:13:14,580
as you've just seen,
are pretty simple.
271
00:13:14,580 --> 00:13:16,620
Simple enough to solve by hand,
in a lot of cases.
272
00:13:16,620 --> 00:13:19,340
And if you've done GCSE maths, then
you've probably encountered some
273
00:13:19,340 --> 00:13:21,700
of that, too. Just the physics
behind what just happened?
274
00:13:21,700 --> 00:13:24,060
The physics behind what just
happened. Of course, we want to
275
00:13:24,060 --> 00:13:25,740
simulate much more complex things,
and even
276
00:13:25,740 --> 00:13:28,140
something like this tower
falling over,
277
00:13:28,140 --> 00:13:31,060
we need to use computers to do
that so we can do it fast enough
278
00:13:31,060 --> 00:13:32,900
to make a moving animation.
279
00:13:32,900 --> 00:13:36,100
And, in fact, talking of moving
animations, you have brought one
280
00:13:36,100 --> 00:13:38,860
with you, so let's have a little
look over here.
281
00:13:38,860 --> 00:13:40,260
This is our tower, right?
282
00:13:40,260 --> 00:13:43,220
This is kind of the same as our
tower? Yes, pretty similar.
283
00:13:43,220 --> 00:13:45,900
But this one is
a complete animation.
284
00:13:45,900 --> 00:13:49,740
Yes. So we look at the forces of
gravity and the force of the ball
285
00:13:49,740 --> 00:13:52,820
being thrown, analyse
the collisions, and then,
286
00:13:52,820 --> 00:13:53,940
every single frame -
287
00:13:53,940 --> 00:13:57,020
so, at least 24 times a second -
we update all of that and work out
288
00:13:57,020 --> 00:13:58,860
where the blocks are going
to be on the next frame
289
00:13:58,860 --> 00:14:01,260
to create an animation. So let's
have a little look at it
290
00:14:01,260 --> 00:14:03,740
as it's animated. It's pretty good.
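The frame-by-frame update Anders describes amounts to stepping the equations of motion a frame at a time. A toy sketch for a single falling block, leaving out the collision handling that makes the real thing hard (the numbers here are illustrative):

    const GRAVITY = -9.8;   // metres per second squared, pulling downwards
    const FPS = 24;         // at least 24 updates a second, as Anders says
    const dt = 1 / FPS;     // time covered by one frame

    let block = { y: 10, vy: 0 };  // height above the ground and vertical velocity

    function stepFrame() {
      block.vy += GRAVITY * dt;    // gravity changes the velocity...
      block.y += block.vy * dt;    // ...and the velocity changes the position
      if (block.y < 0) {           // crude collision with the ground
        block.y = 0;
        block.vy = 0;
      }
    }

    // Run two seconds' worth of frames and see where the block ends up.
    for (let frame = 0; frame < 2 * FPS; frame++) {
      stepFrame();
    }
    console.log(block);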
291
00:14:03,740 --> 00:14:05,940
I mean, you've even got the
little bit I had to kick down.
292
00:14:05,940 --> 00:14:07,380
Just the wrong way round.
293
00:14:07,380 --> 00:14:09,980
I haven't got a digital Hannah
to kick it down, I'm afraid. OK.
294
00:14:09,980 --> 00:14:13,260
So once you are at this stage,
once you have this stuff created,
295
00:14:13,260 --> 00:14:15,580
you understand how the basics work,
then what do you do?
296
00:14:15,580 --> 00:14:18,260
Well, then we can have a little fun
with it. So because we create the
297
00:14:18,260 --> 00:14:19,860
world, we can do things like change
298
00:14:19,860 --> 00:14:22,500
all the boxes into china
so they shatter when it gets hit,
299
00:14:22,500 --> 00:14:24,420
rather than falling over.
300
00:14:24,420 --> 00:14:27,260
We could change all the boxes into
balloons and then pump them full
301
00:14:27,260 --> 00:14:28,940
of helium so they float away.
302
00:14:28,940 --> 00:14:31,660
That's nice! Escaping round
the side. That's nice.
303
00:14:31,660 --> 00:14:33,900
I like that a lot. It's getting
away. Or maybe if we wanted
304
00:14:33,900 --> 00:14:36,020
something a bit more exciting,
we could douse them
305
00:14:36,020 --> 00:14:38,460
in petrol and then set them on fire.
Oh, wow!
306
00:14:38,460 --> 00:14:39,620
AUDIENCE LAUGHS
307
00:14:39,620 --> 00:14:41,500
That looks very realistic.
308
00:14:41,500 --> 00:14:44,180
Thanks. That's kind of the point!
309
00:14:44,180 --> 00:14:47,020
How do you make that look
that realistic, though?
310
00:14:47,020 --> 00:14:48,860
Well, like you said,
we have to analyse what's
311
00:14:48,860 --> 00:14:51,540
going on in the real world and use
maths to try and simulate it
312
00:14:51,540 --> 00:14:55,500
as closely as possible. So in the
case of fire or explosions, for
313
00:14:55,500 --> 00:14:57,700
example, we look at the chemical
processes that drive that.
314
00:14:57,700 --> 00:15:02,420
So, for fire, the combustion process
is turning a fuel into water
315
00:15:02,420 --> 00:15:04,660
and carbon dioxide and some carbon.
316
00:15:04,660 --> 00:15:08,500
And then the temperature
decides how hot that soot gets,
317
00:15:08,500 --> 00:15:11,580
and that decides what the brightness
and the colour of the flame then is.
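Taking methane as an example fuel (the specific fuel is just for illustration), the reaction Anders is describing is:

    CH4 + 2 O2 -> CO2 + 2 H2O (+ heat)

with incomplete burning also leaving unburnt carbon - the soot - whose temperature sets how bright the flame is and what colour it glows.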
318
00:15:11,580 --> 00:15:14,780
When you're working with
buildings, say, do you take into
319
00:15:14,780 --> 00:15:17,220
account the things that the
buildings are made from?
320
00:15:17,220 --> 00:15:19,580
Yeah, we want to analyse all
the different material properties
321
00:15:19,580 --> 00:15:21,860
that are in a building, so, like,
322
00:15:21,860 --> 00:15:23,980
how bricks shatter
when they get hit.
323
00:15:23,980 --> 00:15:26,980
Mortar turning into dust,
the steel bending.
324
00:15:26,980 --> 00:15:30,500
We have to layer all of these things
up using either one big simulation
325
00:15:30,500 --> 00:15:33,300
or lots of simulations that get
put together.
326
00:15:33,300 --> 00:15:37,060
And, you know, if you're doing
a giant bath toy knocking a building
327
00:15:37,060 --> 00:15:39,700
down, you're kind of outside of
the realm of real-world physics
328
00:15:39,700 --> 00:15:42,660
at that point. So a lot of what our
effects artists do, a lot of their
329
00:15:42,660 --> 00:15:47,940
skill comes into figuring out how to
tune the physics of the simulation
330
00:15:47,940 --> 00:15:50,740
to give something that you've never
seen before that the director wants,
331
00:15:50,740 --> 00:15:53,020
but also something that the audience
can believe is real.
332
00:15:53,020 --> 00:15:56,060
Take reality and bend it. Yeah.
So a big part of what our team of
333
00:15:56,060 --> 00:15:58,020
scientists and software engineers
334
00:15:58,020 --> 00:16:01,140
at Weta do is use maths to figure
out cool tricks to make the
335
00:16:01,140 --> 00:16:03,460
simulations run faster,
while still looking real.
336
00:16:03,460 --> 00:16:05,700
That is amazing.
Anders, thank you very much.
337
00:16:05,700 --> 00:16:07,220
Thank you.
338
00:16:07,220 --> 00:16:09,860
CHEERING
339
00:16:09,860 --> 00:16:14,020
As Anders mentioned there, it
takes a huge amount of brainpower
340
00:16:14,020 --> 00:16:18,140
to compute those simulations -
takes even machines days to do.
341
00:16:18,140 --> 00:16:20,700
But, of course, it's not just fun
and frivolity that's on the table
342
00:16:20,700 --> 00:16:23,500
when you can get machines
to do your maths for you,
343
00:16:23,500 --> 00:16:27,340
because working out how to get
computers to crunch the numbers
344
00:16:27,340 --> 00:16:29,180
also saves lives.
345
00:16:29,180 --> 00:16:32,340
And something that has had a huge
impact from using algorithms
346
00:16:32,340 --> 00:16:34,620
is organ-donor matching.
347
00:16:34,620 --> 00:16:36,980
Let me explain what I am
talking about here.
348
00:16:36,980 --> 00:16:39,740
So let's imagine
that you're a bit poorly
349
00:16:39,740 --> 00:16:41,900
and you need a new kidney.
350
00:16:41,900 --> 00:16:46,780
Now, luckily, your friend here
has very kindly and generously
351
00:16:46,780 --> 00:16:49,380
offered to donate to you
one of hers.
352
00:16:49,380 --> 00:16:52,460
Almost all of us are born
with two kidneys and you can live
353
00:16:52,460 --> 00:16:55,860
a very long and happy life
with just one healthy kidney.
354
00:16:55,860 --> 00:17:00,260
Now, if you were both good tissue
matches or blood matches for each
355
00:17:00,260 --> 00:17:03,620
other, then this would be absolutely
fine, the transplant could go ahead.
356
00:17:03,620 --> 00:17:08,300
But if for some reason your blood
or tissue type didn't match,
357
00:17:08,300 --> 00:17:10,660
then you would be in a position
where you had one person
358
00:17:10,660 --> 00:17:12,220
who needed a kidney,
359
00:17:12,220 --> 00:17:14,660
one person who is willing
to donate a kidney,
360
00:17:14,660 --> 00:17:16,780
but nothing that you could do
about it.
361
00:17:16,780 --> 00:17:18,420
That was until very recently.
362
00:17:18,420 --> 00:17:21,700
All you could do was really wait
around on the transplant list,
363
00:17:21,700 --> 00:17:24,020
hoping for another donation.
364
00:17:24,020 --> 00:17:27,860
That was until quite recently,
when someone had a very bright idea.
365
00:17:27,860 --> 00:17:32,540
Because while you two might be stuck
in that position, perhaps,
366
00:17:32,540 --> 00:17:35,380
over here, there are another
two people
367
00:17:35,380 --> 00:17:37,420
who are in a similar position.
368
00:17:37,420 --> 00:17:39,860
So maybe you need a kidney.
369
00:17:39,860 --> 00:17:43,220
I mean, you have one that
you are willing to donate,
370
00:17:43,220 --> 00:17:46,420
but that unfortunately
doesn't match your loved one
371
00:17:46,420 --> 00:17:48,700
who you're trying to help.
372
00:17:48,700 --> 00:17:52,740
But what if your kidney, rather than
matching your friend here,
373
00:17:52,740 --> 00:17:56,540
what if your kidney was a match
for you?
374
00:17:56,540 --> 00:18:00,180
Now, that transplant could go ahead
and that would be fine.
375
00:18:00,180 --> 00:18:02,620
But the question is,
why would you want to give
376
00:18:02,620 --> 00:18:05,540
up your kidney to someone you've
never met before, especially
377
00:18:05,540 --> 00:18:10,100
when your friend is still sick
and in need of a transplant?
378
00:18:10,100 --> 00:18:14,180
But what if there was a third
pair of people?
379
00:18:14,180 --> 00:18:16,380
What if we could close this loop
380
00:18:16,380 --> 00:18:18,980
with another group who are over
here?
381
00:18:18,980 --> 00:18:23,620
So one more person and their loved
one who is willing to give
382
00:18:23,620 --> 00:18:26,180
them one, you've got a little loop
up here. Perfect. Thank you.
383
00:18:26,180 --> 00:18:27,540
I'll take that down.
384
00:18:27,540 --> 00:18:31,260
Now, in this situation,
if this could happen,
385
00:18:31,260 --> 00:18:33,780
if this happened to be a match
for you as well,
386
00:18:33,780 --> 00:18:35,340
then everyone's happy, right?
387
00:18:35,340 --> 00:18:39,940
Everyone who needs a transplant
has got one and everyone has helped
388
00:18:39,940 --> 00:18:43,100
make sure that their loved ones
have the operation that they need.
389
00:18:43,100 --> 00:18:46,940
Now, the only problem with this is
that there are often hundreds
390
00:18:46,940 --> 00:18:49,500
of people in the country
who will find themselves
391
00:18:49,500 --> 00:18:50,740
in this position.
392
00:18:50,740 --> 00:18:54,340
And finding these matches
is incredibly difficult.
393
00:18:54,340 --> 00:18:56,380
The number of possibilities
with hundreds of people
394
00:18:56,380 --> 00:18:57,900
is absolutely enormous.
395
00:18:57,900 --> 00:19:00,500
And you have to make sure that
you create a closed loop every
396
00:19:00,500 --> 00:19:04,140
single time so that
everybody ends up happy.
397
00:19:04,140 --> 00:19:06,700
But this is the kind of thing
that computers
398
00:19:06,700 --> 00:19:09,340
are absolutely perfect for.
399
00:19:09,340 --> 00:19:12,900
And, in fact, this is exactly
what a team at the University
400
00:19:12,900 --> 00:19:14,740
of Glasgow have done.
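As a toy sketch of the idea - not the Glasgow team's actual method, which also handles longer chains and chooses the best overall set of loops - you can picture compatibility as a directed graph and search it for the three-way loops described above:

    // donates[P] lists the pairs whose patient pair P's donor could give a kidney to.
    const donates = {
      A: ['B'],       // pair A's donor matches pair B's patient
      B: ['C'],
      C: ['A', 'B'],
    };

    // Find every closed three-way loop a -> b -> c -> a.
    function threeWayLoops(graph) {
      const loops = [];
      for (const a of Object.keys(graph)) {
        for (const b of graph[a]) {
          for (const c of graph[b] || []) {
            if (c !== a && (graph[c] || []).includes(a)) {
              loops.push([a, b, c]);
            }
          }
        }
      }
      return loops;
    }

    console.log(threeWayLoops(donates)); // the A -> B -> C loop, once per starting pair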
401
00:19:14,740 --> 00:19:17,660
So here is the output
from that algorithm.
402
00:19:17,660 --> 00:19:21,020
This algorithm
runs in seven seconds.
403
00:19:21,020 --> 00:19:23,660
And this is the network
that it spits out.
404
00:19:23,660 --> 00:19:28,140
So each one of these dots is a real
person, and each one of the lines
405
00:19:28,140 --> 00:19:30,340
is, effectively, a ribbon.
406
00:19:30,340 --> 00:19:34,420
And if we zoom in on this,
you can see here just how many
407
00:19:34,420 --> 00:19:38,300
ribbons this algorithm is
potentially considering.
408
00:19:38,300 --> 00:19:41,180
Now, clearly, there is no way that
any human would be able to weigh
409
00:19:41,180 --> 00:19:43,980
all of this up within their own
minds.
410
00:19:43,980 --> 00:19:47,020
But this algorithm, this isn't
just something that's theoretical.
411
00:19:47,020 --> 00:19:51,260
This is something that actually does
go on to save lives because,
412
00:19:51,260 --> 00:19:55,900
in fact, up there, holding
on to the green ribbon is Yuki,
413
00:19:55,900 --> 00:19:58,980
who was part of a real-life chain.
414
00:19:58,980 --> 00:20:02,980
So, Yuki, if you want to join us on
stage. Round of applause for Yuki.
415
00:20:02,980 --> 00:20:05,980
CHEERING
416
00:20:10,380 --> 00:20:12,340
Thank you for joining us, Yuki.
417
00:20:12,340 --> 00:20:14,340
So how long ago
did you have your operation?
418
00:20:14,340 --> 00:20:16,500
I had it three years ago.
419
00:20:16,500 --> 00:20:19,740
And who was the loved one
who donated a kidney on your behalf?
420
00:20:19,740 --> 00:20:22,940
It was my grandma from Japan.
From Japan.
421
00:20:22,940 --> 00:20:25,620
I think we have a photograph,
actually, here, of you guys
422
00:20:25,620 --> 00:20:27,060
outside Great Ormond Street.
423
00:20:27,060 --> 00:20:31,220
So your grandmother's kidney
went to someone else.
424
00:20:31,220 --> 00:20:33,140
Where did your kidney come from?
425
00:20:33,140 --> 00:20:36,060
My kidney came from someone
in Birmingham.
426
00:20:36,060 --> 00:20:38,620
And do you know how many people
there were in your chain?
427
00:20:38,620 --> 00:20:42,140
I think three. So exactly
the same as we have here.
428
00:20:42,140 --> 00:20:44,300
Yeah. It's a really
incredible story.
429
00:20:44,300 --> 00:20:48,100
Yuki, thank you so much for joining
us and telling your story.
430
00:20:48,100 --> 00:20:51,180
CHEERING
431
00:20:56,020 --> 00:20:58,940
Now, this work is a monumental
breakthrough, it's something that's
432
00:20:58,940 --> 00:21:01,900
been a real game-changer
for people within the UK.
433
00:21:01,900 --> 00:21:05,940
And the key thing that that
algorithm can do, the thing that no
434
00:21:05,940 --> 00:21:07,620
human can do on their own,
435
00:21:07,620 --> 00:21:10,380
is consider this vast number of
possibilities.
436
00:21:10,380 --> 00:21:15,180
And that is the superhuman power
that you open up when you hand
437
00:21:15,180 --> 00:21:18,980
over your maths to a machine,
because suddenly it can use
438
00:21:18,980 --> 00:21:22,180
all of its grunt to tailor things
to the individual, to work out
439
00:21:22,180 --> 00:21:24,300
precisely what is right for you,
440
00:21:24,300 --> 00:21:25,740
which kidney you need,
441
00:21:25,740 --> 00:21:29,580
and in a much smaller,
but no less profound way,
442
00:21:29,580 --> 00:21:32,980
which song or video you might like.
443
00:21:32,980 --> 00:21:36,500
Now YouTube videos, Netflix,
Amazon, BBC iPlayer - all of them
444
00:21:36,500 --> 00:21:40,500
are crunching through vast amounts
of information on their websites.
445
00:21:40,500 --> 00:21:44,140
But how do they make sense
of the content that is uploaded?
446
00:21:44,140 --> 00:21:47,500
And what kind of process
do they use to decide
447
00:21:47,500 --> 00:21:51,060
what you might like to see next?
Well, OK, I'll tell you what.
448
00:21:51,060 --> 00:21:54,620
Let's find out here
by making our own video.
449
00:21:54,620 --> 00:21:59,860
So I wonder, can I have a volunteer,
someone who is happy to help me
450
00:21:59,860 --> 00:22:02,740
commentate on
a little YouTube video?
451
00:22:02,740 --> 00:22:04,180
OK. Perfect. Yeah. Let's go there.
452
00:22:04,180 --> 00:22:06,660
Come on down. Round of applause as
he comes to the stage.
453
00:22:06,660 --> 00:22:09,140
Now, what's your name? Anthony.
454
00:22:09,140 --> 00:22:10,500
Anthony. OK, perfect. Anthony.
455
00:22:10,500 --> 00:22:13,060
Right. So we're going to make...
Have you got your phone? Yeah.
456
00:22:13,060 --> 00:22:15,580
OK. Perfect. Right. We're going
to make a video, Anthony.
457
00:22:15,580 --> 00:22:18,140
And we want it to be seen by
as many people as possible.
458
00:22:18,140 --> 00:22:21,620
Right? We want it to be picked up
by the algorithm.
459
00:22:21,620 --> 00:22:24,220
So we're going to make a great
YouTube video here.
460
00:22:24,220 --> 00:22:28,460
Now, thankfully, the world famous
Royal Institution demo team are on
461
00:22:28,460 --> 00:22:30,940
hand to help us make our video
as exciting as possible.
462
00:22:30,940 --> 00:22:33,980
So if you give me this
and I'll do your video for you.
463
00:22:33,980 --> 00:22:36,420
If you take these, you can come
and stand over here and help me
464
00:22:36,420 --> 00:22:37,860
commentate on this video.
465
00:22:37,860 --> 00:22:39,220
So I'll take the video.
466
00:22:39,220 --> 00:22:42,900
Here's Gemma. You start
off whenever you want to, Anthony.
467
00:22:44,460 --> 00:22:45,780
There we go!
468
00:22:45,780 --> 00:22:47,300
What's up, folks?
469
00:22:47,300 --> 00:22:49,060
It's me, Anthony, again.
470
00:22:49,060 --> 00:22:50,860
Thanks for subscribing.
471
00:22:50,860 --> 00:22:52,660
This is my best...
472
00:22:52,660 --> 00:22:56,780
..best friend
Gemma doing some science.
473
00:22:56,780 --> 00:23:01,020
They have some extremely
cold liquid nitrogen.
474
00:23:01,020 --> 00:23:05,340
Whoa! Look at all
these balloon dogs!
475
00:23:05,340 --> 00:23:07,500
Where are they coming from?
476
00:23:07,500 --> 00:23:10,300
It's the balloon-dog challenge!
477
00:23:10,300 --> 00:23:15,660
Now, Gemma is going to add hot
water to the liquid nitrogen and it
478
00:23:15,660 --> 00:23:17,700
will yeet everywhere!
479
00:23:17,700 --> 00:23:19,660
Let's give her a countdown.
480
00:23:19,660 --> 00:23:21,220
Nice and loud.
481
00:23:21,220 --> 00:23:24,940
ALL: Three, two, one...
482
00:23:24,940 --> 00:23:27,500
AUDIENCE WHOOPS
483
00:23:27,500 --> 00:23:32,500
Now, sure to get views,
it's YouTuber Tom Scott!
484
00:23:32,500 --> 00:23:35,940
And behind the cloud appears
485
00:23:35,940 --> 00:23:38,660
YouTube sensation Tom Scott.
486
00:23:38,660 --> 00:23:41,380
Anthony, thank you so much. There,
and you've got your video there.
487
00:23:41,380 --> 00:23:45,020
Thank you very much indeed.
Big round of applause if you can!
488
00:23:47,740 --> 00:23:50,780
Do you often appear
in clouds of smoke?
489
00:23:50,780 --> 00:23:53,220
No, it's the first time
I've been summoned like this.
490
00:23:53,220 --> 00:23:55,700
It's wonderful. Now, I should tell
you, for any of you who don't
491
00:23:55,700 --> 00:23:59,740
recognise Tom, Tom is one of
Britain's foremost YouTubers.
492
00:23:59,740 --> 00:24:03,380
If there is one person that we
can ask about how to make
493
00:24:03,380 --> 00:24:05,540
sure a video gets seen by as many
people as possible,
494
00:24:05,540 --> 00:24:07,620
I mean, you are that person. Yeah.
495
00:24:07,620 --> 00:24:11,100
I wish there was some magical way
to guarantee it but, I mean,
496
00:24:11,100 --> 00:24:12,540
I've done some pretty cool stuff.
497
00:24:12,540 --> 00:24:17,980
I've been in Zero G, but that video
didn't do well compared to, well,
498
00:24:17,980 --> 00:24:21,460
a two-minute shot of me continuously
looking at some toasters
499
00:24:21,460 --> 00:24:23,820
and talking about how long it's
going to take them to pop up.
500
00:24:23,820 --> 00:24:27,860
That video did better than me
flying about in zero G.
501
00:24:27,860 --> 00:24:31,780
But what did better than both of
those, to achieve, like, 25 million
502
00:24:31,780 --> 00:24:35,260
views, was me attaching some
garlic bread to a helium
503
00:24:35,260 --> 00:24:39,100
balloon and sending it to the edge
of space, then bringing it back
504
00:24:39,100 --> 00:24:41,980
down and eating it. Now...
25 million. 25 million.
505
00:24:41,980 --> 00:24:44,860
I kind of figured that one
was going to do well.
506
00:24:44,860 --> 00:24:48,460
I didn't think it was going to do
THAT well. I have never, ever been
507
00:24:48,460 --> 00:24:52,460
able to find anything that will get
these recommendation engines,
508
00:24:52,460 --> 00:24:55,980
these algorithms, to specifically
go for a particular video.
509
00:24:55,980 --> 00:24:58,620
Because it is an algorithm that's
deciding what gets promoted
510
00:24:58,620 --> 00:25:02,020
on different people's YouTubes,
right? Yes. Originally,
511
00:25:02,020 --> 00:25:05,260
years and years ago, it was all
down to the title and the thumbnail.
512
00:25:05,260 --> 00:25:06,620
CRACKLING
513
00:25:06,620 --> 00:25:09,180
Is that OK? That's crackling
a lot. I'm slightly nervous!
514
00:25:09,180 --> 00:25:11,420
I mean, this is going to make
quite a YouTube video
515
00:25:11,420 --> 00:25:14,460
if it explodes on us now, Tom!
I'm going to keep going.
516
00:25:14,460 --> 00:25:18,220
Keep going. Originally, when I
started in 2006, it was all about
517
00:25:18,220 --> 00:25:20,820
the title and the thumbnail
and that was it.
518
00:25:20,820 --> 00:25:23,140
So you could trick people
into clicking on something
519
00:25:23,140 --> 00:25:25,220
with a lie in the thumbnail,
a lie in the title,
520
00:25:25,220 --> 00:25:27,980
and that's still true now,
but it doesn't work as well,
521
00:25:27,980 --> 00:25:30,380
because after that, they said
it was more about how long people
522
00:25:30,380 --> 00:25:32,060
watched a video for.
523
00:25:32,060 --> 00:25:35,060
So at that point, people put,
you know, a little explosion
524
00:25:35,060 --> 00:25:36,380
at the end of a 20-minute video.
525
00:25:36,380 --> 00:25:38,940
So you'd have to watch all the way
through. Those long introductions
526
00:25:38,940 --> 00:25:40,500
that were very fashionable
for a while.
527
00:25:40,500 --> 00:25:44,060
Because they kept people watching.
So now YouTube doesn't give any
528
00:25:44,060 --> 00:25:48,020
advice at all on how to make
something please the algorithm.
529
00:25:48,020 --> 00:25:50,620
And as a YouTube creator,
all of us just call it
530
00:25:50,620 --> 00:25:54,060
"the algorithm".
This one monolithic thing.
531
00:25:54,060 --> 00:25:57,900
They say that if people
watch it and people like it,
532
00:25:57,900 --> 00:26:00,820
then the algorithm will also
recommend it to other people.
533
00:26:00,820 --> 00:26:04,100
But they will never, ever
say what the reasons are,
534
00:26:04,100 --> 00:26:07,460
because the minute they do that,
everyone will say, "Oh, yeah, well,
535
00:26:07,460 --> 00:26:08,700
"I'll start doing that."
536
00:26:08,700 --> 00:26:12,100
And suddenly 100,000, 200,000
people are all doing that and
537
00:26:12,100 --> 00:26:15,940
no-one's watching it any more.
Yeah. There are 500 hours of video
538
00:26:15,940 --> 00:26:19,260
uploaded to YouTube every minute.
Not watched - like, uploaded.
539
00:26:19,260 --> 00:26:22,500
So I'm sure it's a great video
on your phone.
540
00:26:22,500 --> 00:26:26,580
Statistically, it's probably not
going to go big, but it might, cos
541
00:26:26,580 --> 00:26:29,660
the more times you can roll that
dice, the better your chances are.
542
00:26:29,660 --> 00:26:32,860
I'm going to go create a garlic
bread video of my own now, I think.
543
00:26:32,860 --> 00:26:34,300
Good luck.
544
00:26:34,300 --> 00:26:36,660
Apparently it does work!
545
00:26:36,660 --> 00:26:39,780
Tom Scott, thank you very much
indeed. Thank you. Thank you.
546
00:26:39,780 --> 00:26:43,020
CHEERING
547
00:26:43,020 --> 00:26:47,260
Now, as Tom hinted there, there
is actually something slightly
548
00:26:47,260 --> 00:26:49,580
different going on here to the
fireworks
549
00:26:49,580 --> 00:26:52,900
or the organ-donor algorithms
that we saw earlier.
550
00:26:52,900 --> 00:26:56,620
That YouTube algorithm, it isn't
just crunching through a straight
551
00:26:56,620 --> 00:26:58,260
list of instructions any more.
552
00:26:58,260 --> 00:27:00,420
It's actually doing something
a little bit different.
553
00:27:00,420 --> 00:27:05,100
I want to explain this to you
using a cup of tea and a robot.
554
00:27:05,100 --> 00:27:06,180
Oh, OK.
555
00:27:06,180 --> 00:27:10,580
Maybe not an actual robot, but
definitely the next-best thing.
556
00:27:10,580 --> 00:27:14,420
Please join me
in welcoming robot Matt Parker.
557
00:27:14,420 --> 00:27:16,540
CHEERING
558
00:27:21,620 --> 00:27:24,460
That's a... Thank you!
559
00:27:26,660 --> 00:27:29,340
That's a great outfit, Matt.
Thank you. I've made it myself.
560
00:27:29,340 --> 00:27:30,820
Yeah. Sorry...
561
00:27:30,820 --> 00:27:32,980
ROBOTIC VOICE: I compiled it myself.
562
00:27:32,980 --> 00:27:36,460
OK. Now, robot Matt,
like all robots,
563
00:27:36,460 --> 00:27:38,260
he isn't particularly bright.
564
00:27:38,260 --> 00:27:41,940
Oh-oh. Offence registered.
565
00:27:41,940 --> 00:27:44,740
They also take things incredibly
literally,
566
00:27:44,740 --> 00:27:48,420
as we are about to discover,
because we are going to, together,
567
00:27:48,420 --> 00:27:53,940
try and instruct robot Matt Parker
with how to make a cup of tea.
568
00:27:53,940 --> 00:27:55,140
OK, so here we go.
569
00:27:55,140 --> 00:27:57,860
What's the first thing to do if you
make a cup of tea?
570
00:27:57,860 --> 00:28:00,220
You heat up the water in the
kettle.
571
00:28:00,220 --> 00:28:01,460
Heat up the water in the kettle.
572
00:28:01,460 --> 00:28:03,260
Perfect. Sounds sensible?
573
00:28:03,260 --> 00:28:05,500
There's a kettle. OK, next step.
Next step.
574
00:28:05,500 --> 00:28:08,940
You take out the teabags.
Take out the teabags.
575
00:28:10,980 --> 00:28:13,420
Ha-ha! That's good. Next step?
576
00:28:13,420 --> 00:28:15,220
Er, get a mug.
577
00:28:15,220 --> 00:28:16,260
Get a mug.
578
00:28:18,300 --> 00:28:21,140
Oh, it's a teeny, tiny one.
579
00:28:21,140 --> 00:28:22,380
What should we do next?
580
00:28:22,380 --> 00:28:24,780
Put the teabag in the mug.
581
00:28:24,780 --> 00:28:26,820
LAUGHTER
582
00:28:30,780 --> 00:28:32,420
Is that the right mug, there?
583
00:28:32,420 --> 00:28:34,420
Well, let's get another.
584
00:28:34,420 --> 00:28:36,660
Put the boiling water in.
585
00:28:36,660 --> 00:28:38,540
LAUGHTER
586
00:28:38,540 --> 00:28:41,940
He takes things very literally!
587
00:28:41,940 --> 00:28:42,980
HANNAH CHUCKLES
588
00:28:42,980 --> 00:28:45,700
Heat registering!
589
00:28:47,220 --> 00:28:50,980
Get a normal-sized teabag and cup.
590
00:28:52,420 --> 00:28:54,340
A bigger cup.
591
00:28:58,940 --> 00:29:01,340
The normal teabag in the mug.
592
00:29:01,340 --> 00:29:03,500
LAUGHTER
593
00:29:03,500 --> 00:29:06,980
I'm going to come round there.
Hang on. One second.
594
00:29:06,980 --> 00:29:08,660
Hold on a second.
I'm going to come up here.
595
00:29:08,660 --> 00:29:10,340
Here we go.
596
00:29:10,340 --> 00:29:11,700
What's next? What's next?
597
00:29:11,700 --> 00:29:14,100
Stop pouring the water.
598
00:29:16,100 --> 00:29:18,060
Put the teabag in the mug.
599
00:29:18,060 --> 00:29:20,780
Pour the water into the mug.
600
00:29:20,780 --> 00:29:23,300
AUDIENCE MEMBERS CHORTLE
601
00:29:26,940 --> 00:29:28,940
LAUGHTER
602
00:29:28,940 --> 00:29:31,580
Stop pouring the water.
603
00:29:31,580 --> 00:29:34,380
Add milk and sugar to taste.
604
00:29:34,380 --> 00:29:36,340
HANNAH CHUCKLES
605
00:29:37,940 --> 00:29:40,300
LAUGHTER
606
00:29:44,420 --> 00:29:48,380
Pour the milk from the first mug
into the mug with the tea.
607
00:29:52,700 --> 00:29:54,260
OK. I think we've got one more.
608
00:29:54,260 --> 00:29:57,420
Stop pouring the milk into the cup.
609
00:29:57,420 --> 00:29:59,660
And the last step?
610
00:29:59,660 --> 00:30:00,940
Drink it.
611
00:30:00,940 --> 00:30:03,100
LAUGHTER
612
00:30:04,860 --> 00:30:07,180
I made this for you.
613
00:30:08,580 --> 00:30:10,020
Cheers, everyone.
614
00:30:10,020 --> 00:30:11,460
Oh, that's...
615
00:30:11,460 --> 00:30:13,460
LAUGHTER AND CHEERING
616
00:30:13,460 --> 00:30:16,380
Thank you very much!
617
00:30:16,380 --> 00:30:19,540
OK. So I think it's quite obvious
that our list of instructions there
618
00:30:19,540 --> 00:30:21,620
needs to be quite long
and that is only for making
619
00:30:21,620 --> 00:30:23,620
a cup of tea -
something very easy.
620
00:30:23,620 --> 00:30:27,140
So imagine how much harder
it would be to write a list
621
00:30:27,140 --> 00:30:30,100
of instructions for
a very literal computer.
622
00:30:30,100 --> 00:30:34,540
OK. How would you explain to a
computer how to recognise a picture
623
00:30:34,540 --> 00:30:35,980
of a dog?
624
00:30:35,980 --> 00:30:37,620
What do dogs look like? Shout out.
625
00:30:37,620 --> 00:30:39,500
What are the important things
about what dogs look
626
00:30:39,500 --> 00:30:41,220
like that you would need?
627
00:30:41,220 --> 00:30:42,460
AUDIENCE OFFERS SUGGESTIONS
628
00:30:42,460 --> 00:30:43,980
Fluffy, fur, yeah.
629
00:30:43,980 --> 00:30:45,500
AUDIENCE SHOUTS
630
00:30:45,500 --> 00:30:46,980
Four legs.
631
00:30:46,980 --> 00:30:49,780
Er, tail? Yeah. OK.
632
00:30:49,780 --> 00:30:52,460
All right, all right.
So we got... We got...
633
00:30:52,460 --> 00:30:56,700
We got fluffy or furry, tail,
and four legs. Great.
634
00:30:56,700 --> 00:30:59,580
OK. But what about this? Uhhh...
635
00:30:59,580 --> 00:31:01,060
Slight problem, there.
636
00:31:01,060 --> 00:31:03,020
OK. All right. What else? What
else? What about its face?
637
00:31:03,020 --> 00:31:05,540
What's important to
recognise about its face?
638
00:31:05,540 --> 00:31:07,100
AUDIENCE OFFERS SUGGESTIONS
639
00:31:07,100 --> 00:31:09,140
Pointy ears.
640
00:31:09,140 --> 00:31:12,380
Pointy snout. OK. OK. OK.
Here we go.
641
00:31:12,380 --> 00:31:14,620
All right. So "pointy snout",
someone said.
642
00:31:14,620 --> 00:31:15,980
How about this, though?
643
00:31:15,980 --> 00:31:18,060
Ahhh. OK.
644
00:31:18,060 --> 00:31:22,620
Even if you could work out, you
know, maybe take into account
645
00:31:22,620 --> 00:31:24,540
the colour, all of that stuff,
646
00:31:24,540 --> 00:31:26,740
didn't you say that a dog
had to have four legs?
647
00:31:26,740 --> 00:31:28,700
Because what about this guy?
648
00:31:28,700 --> 00:31:30,820
Still definitely a dog.
649
00:31:30,820 --> 00:31:34,100
Still definitely a dog. But it just
goes to show just how hard it is
650
00:31:34,100 --> 00:31:37,700
if you are trying to write a long
list of instructions, just how hard
651
00:31:37,700 --> 00:31:41,180
it is to pin down exactly
what a dog looks like.
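To see why, here is a toy rule list in code, built from the features just suggested (the property names are made up for the sketch):

    // A literal rule list for "is this a dog?", and how quickly it breaks down.
    function looksLikeADog(animal) {
      return animal.fluffy &&
             animal.legs === 4 &&
             animal.hasTail &&
             animal.pointySnout;
    }

    console.log(looksLikeADog({ fluffy: true, legs: 4, hasTail: true, pointySnout: true }));
    // true - a typical dog passes

    console.log(looksLikeADog({ fluffy: true, legs: 3, hasTail: true, pointySnout: true }));
    // false - but a three-legged dog is still definitely a dog

    console.log(looksLikeADog({ fluffy: true, legs: 4, hasTail: true, pointySnout: false }));
    // false - flat-faced breeds fail the "pointy snout" rule too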
652
00:31:41,180 --> 00:31:43,060
Now, humans, we're amazing at this.
653
00:31:43,060 --> 00:31:46,460
We can do this without even
thinking, but explaining how we do
654
00:31:46,460 --> 00:31:49,340
it, especially to a pretty dumb
computer that takes things
655
00:31:49,340 --> 00:31:53,180
very literally, ends up being quite
a lot harder than you might imagine.
656
00:31:53,180 --> 00:31:56,180
But there is another way
around this. To explain,
657
00:31:56,180 --> 00:31:59,500
please join me in welcoming computer
scientist extraordinaire
658
00:31:59,500 --> 00:32:01,500
Anne-Marie Imafidon.
659
00:32:01,500 --> 00:32:05,020
CHEERING AND APPLAUSE
660
00:32:07,100 --> 00:32:09,300
There is another way to all
of this, Anne-Marie?
661
00:32:09,300 --> 00:32:12,020
There is another way. We need
a different type of program
662
00:32:12,020 --> 00:32:15,700
that isn't as specific. Isn't a
list of straightforward "do this,
663
00:32:15,700 --> 00:32:17,260
"do this, do this". Exactly.
664
00:32:17,260 --> 00:32:20,780
We need a type of program
that has an end point or a goal.
665
00:32:20,780 --> 00:32:23,580
And we let the computer
or the machine figure out
666
00:32:23,580 --> 00:32:26,580
its own rules to get to that
end goal. Without giving
667
00:32:26,580 --> 00:32:28,540
it straightforward instructions?
Without giving
668
00:32:28,540 --> 00:32:31,380
it any instructions other than
that's where we're going,
669
00:32:31,380 --> 00:32:34,060
and maybe when it's got it right.
OK.
670
00:32:34,060 --> 00:32:35,900
Anne-Marie,
this sounds like magic.
671
00:32:35,900 --> 00:32:38,500
Oh, it is computer science
more than it's magic!
672
00:32:38,500 --> 00:32:40,380
It's a lot like training an animal.
673
00:32:40,380 --> 00:32:43,780
When you get a dog, you're training
it to maybe do a particular trick.
674
00:32:43,780 --> 00:32:46,620
You don't say, "Move this muscle
and look that way."
675
00:32:46,620 --> 00:32:49,060
You kind of just give it a treat
when it does the right thing.
676
00:32:49,060 --> 00:32:51,260
And you've got a clear idea in
your mind of what it is you want
677
00:32:51,260 --> 00:32:52,980
them to do. Exactly.
Whether it's sitting
678
00:32:52,980 --> 00:32:55,180
or whether it's anything else.
OK. All right.
679
00:32:55,180 --> 00:32:58,580
Well, on that point, then, of
training animals, what's this kind
680
00:32:58,580 --> 00:33:01,100
of algorithm called, incidentally,
when it's in computer science?
681
00:33:01,100 --> 00:33:03,460
So we call it a reinforcement
training algorithm.
682
00:33:03,460 --> 00:33:06,540
And it's one that learns on its own,
so it's actually part of machine
683
00:33:06,540 --> 00:33:09,780
learning. OK. All right.
So reinforcement learning.
684
00:33:09,780 --> 00:33:12,780
Let's see how this works,
because we're going to try some
685
00:33:12,780 --> 00:33:14,460
reinforcement learning
on one of you.
686
00:33:14,460 --> 00:33:17,460
So who would like
to be our volunteer?
687
00:33:17,460 --> 00:33:20,900
You had your hand up, I think,
first. So if you want to come down.
688
00:33:20,900 --> 00:33:23,180
CHEERING
689
00:33:25,820 --> 00:33:28,500
What's your name?
Emily. Emily. OK. Perfect.
690
00:33:28,500 --> 00:33:30,220
Emily, if you want to come over
here. OK.
691
00:33:30,220 --> 00:33:32,540
So, Emily, what we're going to
do, then, is we're going to play
692
00:33:32,540 --> 00:33:34,940
a little game with you. If you just
come to the side while they bring
693
00:33:34,940 --> 00:33:38,140
all these things on. What we're
going to do is we're going to show
694
00:33:38,140 --> 00:33:40,180
you four pots. You can see
all the different rounds
695
00:33:40,180 --> 00:33:43,300
that we've got there. These pots are
coloured and they've got sums
696
00:33:43,300 --> 00:33:46,820
on them on the side, and so on.
Now, Anne-Marie and I, we've got in
697
00:33:46,820 --> 00:33:49,540
our minds, we've got a very
clear idea of which pot
698
00:33:49,540 --> 00:33:50,780
we want you to pick.
699
00:33:50,780 --> 00:33:53,820
So, in each round, all you
do is you just touch one pot.
700
00:33:53,820 --> 00:33:56,380
OK. If you are correct,
we're going to give you a treat,
701
00:33:56,380 --> 00:33:58,740
we're going to give you a little
reward, and I think they might just
702
00:33:58,740 --> 00:34:01,420
be in here. We're going to give
you a little reward of a humbug.
703
00:34:01,420 --> 00:34:04,300
And if you're wrong, we're going
to move on to the next round. OK?
704
00:34:04,300 --> 00:34:07,220
If you want to just stand in there
for me. Now, just to prove
705
00:34:07,220 --> 00:34:10,260
that this really is something
that is not specific to humans,
706
00:34:10,260 --> 00:34:13,660
you are going to play this game
against a competitor,
707
00:34:13,660 --> 00:34:15,620
by which I mean a pigeon.
708
00:34:15,620 --> 00:34:18,300
If we can welcome onto the stage
709
00:34:18,300 --> 00:34:22,020
Alfie and his trainer, Lloyd.
Round of applause.
710
00:34:27,380 --> 00:34:29,860
Now, I have to tell you, Emily,
711
00:34:29,860 --> 00:34:32,660
Alfie has had a bit of a head start
on this. Alfie has been trained.
712
00:34:32,660 --> 00:34:34,060
Look how sweet Alfie is!
713
00:34:34,060 --> 00:34:35,740
Very beautiful bird.
714
00:34:35,740 --> 00:34:38,020
You happy? You understand the game?
All right, let's go.
715
00:34:38,020 --> 00:34:39,220
Let's go for round one.
716
00:34:39,220 --> 00:34:40,940
So just touch one pot.
717
00:34:43,740 --> 00:34:45,100
Incorrect.
718
00:34:45,100 --> 00:34:47,980
Ah, Alfie got it right, though.
He did. He did!
719
00:34:47,980 --> 00:34:50,300
Alfie, one. Human, zero.
720
00:34:50,300 --> 00:34:51,580
LAUGHTER
721
00:34:51,580 --> 00:34:53,940
OK. Here we go.
722
00:34:53,940 --> 00:34:54,940
Round two.
723
00:34:57,340 --> 00:34:58,900
Incorrect.
724
00:34:58,900 --> 00:35:02,540
Ohhh! Alfie, two. Human, zero.
725
00:35:04,180 --> 00:35:05,980
And here we go.
726
00:35:07,820 --> 00:35:09,300
Next one.
727
00:35:11,420 --> 00:35:13,220
Correct. OK!
728
00:35:14,460 --> 00:35:16,100
Next one.
729
00:35:18,060 --> 00:35:20,340
Ohh! Incorrect.
730
00:35:20,340 --> 00:35:22,060
Alfie's got it correct again.
731
00:35:22,060 --> 00:35:24,660
OK. Let's go final round.
732
00:35:24,660 --> 00:35:28,340
Now. OK. The equations on this side
are getting harder and harder
733
00:35:28,340 --> 00:35:31,620
and harder, almost like
734
00:35:31,620 --> 00:35:33,740
there's no possible way
that you could solve them.
735
00:35:35,940 --> 00:35:37,660
Ohh! Incorrect!
736
00:35:37,660 --> 00:35:39,020
That is tough, that is tough!
737
00:35:39,020 --> 00:35:41,700
OK. Tell us, how did you find it?
Tell us what you thought.
738
00:35:41,700 --> 00:35:43,540
It was quite hard.
739
00:35:43,540 --> 00:35:45,940
I just basically picked random ones.
740
00:35:45,940 --> 00:35:48,060
There wasn't really a method.
741
00:35:48,060 --> 00:35:50,180
What were you thinking
when you got it right?
742
00:35:50,180 --> 00:35:54,300
I didn't really have a thing. I just
sort of picked one, apart from
743
00:35:54,300 --> 00:35:57,140
the one where the yellow
pot fell on the floor.
744
00:35:57,140 --> 00:35:59,740
Oh, did you see the yellow pot?
That was a big clue.
745
00:35:59,740 --> 00:36:01,020
In fact, actually.
746
00:36:01,020 --> 00:36:03,460
Which pot do you think that Alfie
was picking every time?
747
00:36:03,460 --> 00:36:06,100
Shall we tell her? The yellow one?
The yellow one.
748
00:36:06,100 --> 00:36:08,540
Exactly right. Exactly right.
749
00:36:08,540 --> 00:36:10,500
APPLAUSE
750
00:36:10,500 --> 00:36:13,420
Now, I know that was quite tough,
that was quite tough.
751
00:36:13,420 --> 00:36:15,860
But, Anne-Marie, tell us, how
similar is this to the things we see
752
00:36:15,860 --> 00:36:17,060
in computer science?
753
00:36:17,060 --> 00:36:19,780
So this is very similar to what
we see in computer science.
754
00:36:19,780 --> 00:36:24,260
But with a computer, it's able to do
this thousands, millions, billions
755
00:36:24,260 --> 00:36:25,660
of times in a second,
756
00:36:25,660 --> 00:36:30,580
and so can then pick up that pattern
quicker than we can as humans.
757
00:36:30,580 --> 00:36:34,140
And often it can see patterns that
we humans can't see and can make
758
00:36:34,140 --> 00:36:35,860
those kind of connections, as well.
759
00:36:35,860 --> 00:36:39,500
And that is a really important point
because you only had a few rounds,
760
00:36:39,500 --> 00:36:42,780
but if you were a computer,
you'd have had 10,000 in that time.
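For anyone who wants to see that reward-and-repeat loop written down, here is a minimal Python sketch of the pot game played by a simple reinforcement-style learner. The pot names, the treat value of 1, the exploration rate and the 10,000-round budget are illustrative assumptions for the sketch, not details from the programme.

```python
import random

# Toy version of the four-pot game: the learner only gets a "treat" (reward 1)
# when it touches the pot the trainers have in mind. Pot names, the reward
# value, the exploration rate and the round budget are illustrative choices.
POTS = ["red", "yellow", "green", "blue"]
CORRECT = "yellow"                   # the pot Alfie had been trained to pick

def play_round(choice):
    """Return 1 (a treat) for the right pot, 0 otherwise."""
    return 1 if choice == CORRECT else 0

value = {pot: 0.0 for pot in POTS}   # running estimate of each pot's reward
counts = {pot: 0 for pot in POTS}
epsilon = 0.1                        # fraction of purely random exploration

for _ in range(10_000):              # a computer can afford thousands of rounds
    if random.random() < epsilon:
        choice = random.choice(POTS)                  # explore
    else:
        choice = max(POTS, key=lambda p: value[p])    # exploit what worked so far
    reward = play_round(choice)
    counts[choice] += 1
    value[choice] += (reward - value[choice]) / counts[choice]  # update the average

print("Learned preference:", max(POTS, key=lambda p: value[p]))
```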
761
00:36:42,780 --> 00:36:45,100
So there you go, have another treat
and a big round of applause.
762
00:36:45,100 --> 00:36:47,020
Thank you! Thank you very much,
Emily. Thank you.
763
00:36:47,020 --> 00:36:48,420
CHEERING
764
00:36:50,660 --> 00:36:53,660
Now, the way that Alfie was trained
there was just using exactly
765
00:36:53,660 --> 00:36:56,380
that same technique. Alfie just
had a lot more goes at it.
766
00:36:56,380 --> 00:37:00,940
But as Anne-Marie said there, if an
animal can learn in this way
767
00:37:00,940 --> 00:37:04,300
with very simple rewards,
never hearing full instructions,
768
00:37:04,300 --> 00:37:09,020
then so can a machine. Now, that
might seem like a bit of a leap.
769
00:37:09,020 --> 00:37:12,500
How do computers learn, especially
when, as we found out
770
00:37:12,500 --> 00:37:16,020
with that cup of tea,
machines are incredibly dumb?
771
00:37:16,020 --> 00:37:18,020
Well, it is possible.
772
00:37:18,020 --> 00:37:21,740
And perhaps unsurprisingly, given
that it's in this programme, it is
773
00:37:21,740 --> 00:37:26,820
totally mathematical because it
turns out you CAN teach an inanimate
774
00:37:26,820 --> 00:37:28,780
object to learn.
775
00:37:28,780 --> 00:37:31,940
And we have been doing exactly
that for the last week.
776
00:37:31,940 --> 00:37:35,380
Please welcome Matt Parker
and the Menace Machine.
777
00:37:37,700 --> 00:37:41,380
Have to be honest, Matt. Menace
Machine looks quite a lot like a
778
00:37:41,380 --> 00:37:43,540
whole heap of matchboxes.
779
00:37:43,540 --> 00:37:47,060
That's because Menace IS
just a heap of matchboxes!
780
00:37:47,060 --> 00:37:50,260
But we've taught these matchboxes
to play noughts and crosses.
781
00:37:50,260 --> 00:37:52,660
OK, OK. Talk me through it.
How on Earth can matchboxes play
782
00:37:52,660 --> 00:37:55,380
noughts and crosses? Well, the great
thing about noughts and crosses is
783
00:37:55,380 --> 00:37:58,140
there aren't that many possible
games. There's only nine squares,
784
00:37:58,140 --> 00:37:59,820
it's a nought or a cross.
785
00:37:59,820 --> 00:38:03,420
And on the front of these boxes,
every single one is a different
786
00:38:03,420 --> 00:38:05,140
state that the game could be in.
787
00:38:05,140 --> 00:38:09,500
In fact, this is every possible
position of every possible game
788
00:38:09,500 --> 00:38:11,420
that Menace could possibly face.
789
00:38:11,420 --> 00:38:12,740
We've got all the first moves,
790
00:38:12,740 --> 00:38:16,220
second moves, the third moves are
in blue and the final, fourth
791
00:38:16,220 --> 00:38:18,140
moves are over here in pink.
792
00:38:18,140 --> 00:38:21,220
And inside each box, we've put
some coloured beads.
793
00:38:21,220 --> 00:38:24,780
So here's one that we haven't used
yet. And you can see there's a real
794
00:38:24,780 --> 00:38:27,100
mixture. There's, like, nine
different-coloured beads
795
00:38:27,100 --> 00:38:29,460
and each colour of bead corresponds
to a move Menace can make.
796
00:38:29,460 --> 00:38:30,580
Because, of course,
797
00:38:30,580 --> 00:38:32,980
there's only ever nine possible
places that you could play for
798
00:38:32,980 --> 00:38:35,420
noughts and crosses. Exactly.
799
00:38:35,420 --> 00:38:38,380
OK, so how do you actually train it,
though? Well, initially, there's
800
00:38:38,380 --> 00:38:40,660
just a random collection of beads
in every single one.
801
00:38:40,660 --> 00:38:44,060
So actually it's making every
possible move equally likely.
802
00:38:44,060 --> 00:38:46,180
And initially, Menace is terrible.
803
00:38:46,180 --> 00:38:49,900
But the reinforcement learning is
we track each game, and if Menace
804
00:38:49,900 --> 00:38:53,580
wins, we add more beads
of the same colour
805
00:38:53,580 --> 00:38:54,940
into the same boxes.
806
00:38:54,940 --> 00:38:57,540
If it loses, we confiscate
those beads, which makes
807
00:38:57,540 --> 00:39:00,900
it less likely to do that move
again. And to train it,
808
00:39:00,900 --> 00:39:03,420
we've had it play hundreds of games.
809
00:39:03,420 --> 00:39:06,180
And thank you so much, everyone
who, before the show tonight,
810
00:39:06,180 --> 00:39:07,500
played against Menace.
811
00:39:07,500 --> 00:39:10,100
Each time we either reward it
or we punish it.
812
00:39:10,100 --> 00:39:12,540
And with each game,
it gets a little bit better.
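To make the bead mechanics concrete, here is a rough Python sketch of a MENACE-style learner: one "matchbox" of beads per board position, with beads added after a win or a draw and a bead taken away after a loss. The starting bead counts, the reward sizes and the random practice opponent are assumptions made for this sketch; the real Menace was trained against the audience.

```python
import random

# MENACE-style noughts and crosses: one "matchbox" (a list of bead moves) per
# board state. Win: add beads for the moves played; draw: add one; lose: take
# a bead away. Bead counts, reward sizes and the random practice opponent are
# illustrative assumptions.

LINES = [(0,1,2), (3,4,5), (6,7,8), (0,3,6), (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return "draw" if " " not in board else None

boxes = {}  # board state (as a string) -> list of beads, each bead a move index

def menace_move(board):
    state = "".join(board)
    if state not in boxes:                        # a box we haven't opened yet
        boxes[state] = [i for i, s in enumerate(board) if s == " "] * 3
    return state, random.choice(boxes[state])     # shake the box, draw a bead

def train(games=2000):
    for _ in range(games):
        board, history, player = [" "] * 9, [], "O"   # MENACE plays O, goes first
        while winner(board) is None:
            if player == "O":
                state, move = menace_move(board)
                history.append((state, move))
            else:                                     # stand-in opponent: random
                move = random.choice([i for i, s in enumerate(board) if s == " "])
            board[move] = player
            player = "X" if player == "O" else "O"
        result = winner(board)
        for state, move in history:                   # reward or punish each box used
            if result == "O":
                boxes[state] += [move] * 3
            elif result == "draw":
                boxes[state].append(move)
            elif boxes[state].count(move) > 1:        # lost: confiscate a bead,
                boxes[state].remove(move)             # but never empty the box

train()
opening = boxes[" " * 9]
print("Opening-move beads (4 is the centre):",
      {m: opening.count(m) for m in sorted(set(opening))})
```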
813
00:39:12,540 --> 00:39:15,260
And how good is it? Do you think
it can beat a human?
814
00:39:15,260 --> 00:39:19,860
I think it will probably
not lose to a human.
815
00:39:19,860 --> 00:39:22,820
All right. Well, let's give it a go.
816
00:39:22,820 --> 00:39:26,100
Who would like to come
and play with Menace?
817
00:39:26,100 --> 00:39:29,740
Let's get you, there, yeah. Round
of applause as she comes down.
818
00:39:31,220 --> 00:39:33,900
So, what's your name?
Ali. Ali. OK, perfect.
819
00:39:33,900 --> 00:39:37,100
Ali. Now, Ali, you've got the
advantage of being a sentient being,
820
00:39:37,100 --> 00:39:40,020
so I think it's only fair that we
let the matchboxes go first.
821
00:39:40,020 --> 00:39:41,540
OK, so Menace is going to go first.
822
00:39:41,540 --> 00:39:44,060
And this is the blank box,
where there's nothing
823
00:39:44,060 --> 00:39:45,780
on it because we haven't
played at all yet.
824
00:39:45,780 --> 00:39:48,140
I'm going to give it a shake
and then I'm going to pull a bit
825
00:39:48,140 --> 00:39:50,340
out at random.
So while Menace is the brain,
826
00:39:50,340 --> 00:39:52,580
I've got to do the actual moving
around. OK.
827
00:39:52,580 --> 00:39:56,020
So the first bead is green,
828
00:39:56,020 --> 00:39:58,180
which is not surprising
because you look in the box -
829
00:39:58,180 --> 00:39:59,860
they're all green!
830
00:39:59,860 --> 00:40:03,100
There's one purple one,
because Menace
831
00:40:03,100 --> 00:40:05,220
has learnt very quickly
832
00:40:05,220 --> 00:40:09,780
if it goes in the middle to start
with, it's way more likely to win.
833
00:40:09,780 --> 00:40:11,780
Now, there you are, Ali.
Where would you like to go?
834
00:40:11,780 --> 00:40:13,860
Put a cross wherever you wish.
835
00:40:16,540 --> 00:40:18,500
Oh, a corner. Bold move.
836
00:40:18,500 --> 00:40:22,260
OK. I've now got to find,
in the group of second-move boxes,
837
00:40:22,260 --> 00:40:26,860
this game. OK, there it is.
838
00:40:26,860 --> 00:40:29,940
But if you have a look, Ali,
it's the same as this game,
839
00:40:29,940 --> 00:40:31,460
but it's the mirror image.
840
00:40:31,460 --> 00:40:34,180
And that's because we didn't want
to use twice as many boxes.
841
00:40:34,180 --> 00:40:35,700
It's not good for the environment.
842
00:40:35,700 --> 00:40:38,220
So if you don't mind me
turning the game over,
843
00:40:38,220 --> 00:40:40,220
are you happy that's still the same
game? Uh-huh.
844
00:40:40,220 --> 00:40:44,380
But now, it exactly matches what's
on the box. So I can give it a shake
845
00:40:44,380 --> 00:40:46,420
and Menace is going to go...
846
00:40:47,820 --> 00:40:50,460
Make sure it's properly random.
Yellow.
847
00:40:50,460 --> 00:40:52,900
They're all yellow.
OK. So, it's learnt...
848
00:40:52,900 --> 00:40:55,180
I mean, there are other moves
it could make, but it happens
849
00:40:55,180 --> 00:40:58,100
to have learnt that there is
a good second move.
850
00:40:58,100 --> 00:40:59,420
OK, your go.
851
00:41:02,300 --> 00:41:04,260
Predictable. OK.
852
00:41:04,260 --> 00:41:06,060
You're not letting it win
that easily.
853
00:41:06,060 --> 00:41:07,780
That's fine. OK.
854
00:41:07,780 --> 00:41:10,060
Third move. Where are we?
Where are we?
855
00:41:10,060 --> 00:41:11,380
There! OK.
856
00:41:11,380 --> 00:41:15,060
So you can see that one there,
that matches this state.
857
00:41:15,060 --> 00:41:17,100
Here we go. Give it a shake.
858
00:41:19,100 --> 00:41:20,900
OK. Oh, white. OK.
859
00:41:20,900 --> 00:41:23,900
So the next move
is going to be white.
860
00:41:23,900 --> 00:41:25,820
That's... Oh, that's not bad.
861
00:41:25,820 --> 00:41:27,140
There you are.
862
00:41:31,540 --> 00:41:34,940
You're just not letting Menace
have any luck, are you? OK.
863
00:41:34,940 --> 00:41:39,820
That's fine. So now I've got
to find this in here.
864
00:41:39,820 --> 00:41:41,460
Bear with me.
865
00:41:41,460 --> 00:41:46,140
This is exactly as much fun
as real noughts and crosses.
866
00:41:46,140 --> 00:41:47,540
HANNAH CHUCKLES
867
00:41:47,540 --> 00:41:51,020
Ah, OK. I think that's it.
868
00:41:51,020 --> 00:41:53,220
Let's have a look. That's...
869
00:41:53,220 --> 00:41:55,740
Yes. OK. Right.
870
00:41:55,740 --> 00:41:57,740
Menace is going to go...
871
00:41:57,740 --> 00:41:59,660
Oh, there's not many beads
left in this one.
872
00:41:59,660 --> 00:42:00,940
Oh, orange. OK.
873
00:42:00,940 --> 00:42:02,980
So I'm going to put an orange one
there.
874
00:42:02,980 --> 00:42:05,020
Now, that move.
875
00:42:06,460 --> 00:42:09,180
And now,
for the exciting conclusion...
876
00:42:09,180 --> 00:42:11,300
LAUGHTER
877
00:42:13,300 --> 00:42:16,580
Excellent. And now we go over...
It's a draw.
878
00:42:16,580 --> 00:42:19,140
And the problem with noughts and
crosses is if everyone's playing
879
00:42:19,140 --> 00:42:21,460
perfectly well,
it always ends in a draw.
880
00:42:21,460 --> 00:42:26,340
So in this case, I'm going to give
Menace one extra of those beads
881
00:42:26,340 --> 00:42:29,260
to say that was fine, but I'm
not going to give it as many
882
00:42:29,260 --> 00:42:31,180
as if it had won. Amazing.
883
00:42:31,180 --> 00:42:34,060
Well, there we go. You managed to
draw with a bunch of cardboard
884
00:42:34,060 --> 00:42:37,700
boxes. Very impressive! A big round
of applause, if you can. Thank you.
885
00:42:40,380 --> 00:42:43,420
The point here is that you don't
need to explain the rules
886
00:42:43,420 --> 00:42:46,460
or the instructions. The machine -
in that case, the matchboxes -
887
00:42:46,460 --> 00:42:48,780
learns from getting rewards
888
00:42:48,780 --> 00:42:52,260
when it does the right thing and it
turns out that you can use this for
889
00:42:52,260 --> 00:42:55,100
all kinds of things, so you can
teach matchboxes how to play noughts
890
00:42:55,100 --> 00:42:56,380
and crosses.
891
00:42:56,380 --> 00:43:01,060
And going back to that earlier
challenge of recognising dogs -
892
00:43:01,060 --> 00:43:04,020
turns out you can teach machines
to do it, too.
893
00:43:04,020 --> 00:43:06,860
Now, for this,
I would like a volunteer
894
00:43:06,860 --> 00:43:08,500
to help me demonstrate this.
895
00:43:08,500 --> 00:43:10,860
At the back there. Yeah.
896
00:43:10,860 --> 00:43:13,380
With the red jumper. Round of
applause as she comes to the stage.
897
00:43:13,380 --> 00:43:14,860
Thank you.
898
00:43:17,340 --> 00:43:19,420
What's your name? Felicity.
899
00:43:19,420 --> 00:43:22,020
OK, Felicity, right. So let's
have a little look over here.
900
00:43:22,020 --> 00:43:26,260
I'm going to demonstrate to you
how a machine can learn
901
00:43:26,260 --> 00:43:29,460
to recognise pictures of dogs
and not dogs.
902
00:43:29,460 --> 00:43:33,060
Now, the way that this works is,
what you do is you give
903
00:43:33,060 --> 00:43:37,220
the machine a bunch of pictures
of both dogs and not dogs,
904
00:43:37,220 --> 00:43:38,900
and then you label them
for the machine.
905
00:43:38,900 --> 00:43:41,300
So that one goes into the dog pile.
Just here.
906
00:43:41,300 --> 00:43:42,500
Perfect.
907
00:43:42,500 --> 00:43:45,780
Now, initially, when you give
these pictures to the machine,
908
00:43:45,780 --> 00:43:49,140
it's just going to be guessing
at random, like we saw earlier
909
00:43:49,140 --> 00:43:51,460
in the first stages
of that pigeon thing.
910
00:43:51,460 --> 00:43:56,260
But what you can do is if you play
this over and over and over again
911
00:43:56,260 --> 00:44:01,420
with tens or maybe hundreds
of thousands of different images,
912
00:44:01,420 --> 00:44:06,180
then eventually, if you do this
enough times, you can end up with
913
00:44:06,180 --> 00:44:09,340
something like this little app
over here.
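As a rough idea of what that label-and-adjust loop looks like in code, here is a toy Python sketch. The real app would be a deep network trained on many thousands of labelled photographs; this stand-in uses made-up feature vectors and a simple logistic-regression classifier, with the data sizes and learning rate chosen purely for illustration.

```python
import numpy as np

# Toy stand-in for "dog or not dog": each "picture" is reduced to a small
# feature vector, and a logistic-regression classifier is nudged towards the
# human-supplied labels on every pass through the pile. The synthetic data,
# feature size and learning rate are invented for illustration only.

rng = np.random.default_rng(0)
n_features = 16

dogs     = rng.normal(loc=+0.5, scale=1.0, size=(200, n_features))
not_dogs = rng.normal(loc=-0.5, scale=1.0, size=(200, n_features))
X = np.vstack([dogs, not_dogs])
y = np.concatenate([np.ones(200), np.zeros(200)])     # 1 = dog, 0 = not dog

w, b, lr = np.zeros(n_features), 0.0, 0.1

def predict(features):
    return 1.0 / (1.0 + np.exp(-(features @ w + b)))  # probability of "dog"

for _ in range(200):                                   # show it the pile again and again
    p = predict(X)
    w -= lr * (X.T @ (p - y)) / len(y)                 # nudge weights towards the labels
    b -= lr * np.mean(p - y)

accuracy = np.mean((predict(X) > 0.5) == y)
print(f"Training accuracy after 200 passes: {accuracy:.0%}")
```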
914
00:44:09,340 --> 00:44:11,260
If you want to come
and sit over here.
915
00:44:11,260 --> 00:44:14,180
Now, we've got a little
algorithm running on this iPod
916
00:44:14,180 --> 00:44:16,860
that has been trained in just
the way that we described.
917
00:44:16,860 --> 00:44:20,580
And we are going to show
this algorithm a series of objects.
918
00:44:20,580 --> 00:44:22,940
And the algorithm is going to work
out whether they are dogs
919
00:44:22,940 --> 00:44:24,780
or not dogs. All right.
920
00:44:24,780 --> 00:44:27,780
Is everyone ready to play
dog or not dog?
921
00:44:27,780 --> 00:44:29,940
ALL: Yes!
Here we go.
922
00:44:29,940 --> 00:44:33,260
Contestant number one, please.
Dog or not dog?
923
00:44:38,220 --> 00:44:40,020
Hello!
924
00:44:40,020 --> 00:44:41,020
DOG BARKS
925
00:44:41,020 --> 00:44:43,180
Ohh! Dog!
926
00:44:43,180 --> 00:44:44,540
Hooray!
927
00:44:44,540 --> 00:44:47,100
Thank you, Luna!
Contestant number two!
928
00:44:48,100 --> 00:44:50,340
Dog or not dog?
929
00:44:50,340 --> 00:44:52,780
Not dog! Thank you very much.
930
00:44:52,780 --> 00:44:55,020
Contestant number three.
931
00:44:55,020 --> 00:44:56,380
AUDIENCE: Awww...
932
00:44:56,380 --> 00:44:58,300
You OK, little guy?
933
00:44:58,300 --> 00:45:00,300
Dog! Amazing!
934
00:45:00,300 --> 00:45:02,740
Next contestant, please!
935
00:45:02,740 --> 00:45:04,740
HANNAH CACKLES
936
00:45:04,740 --> 00:45:07,820
Oh, this is very cute! Sit! Sit!
937
00:45:09,140 --> 00:45:11,500
Not dog! Well done.
938
00:45:11,500 --> 00:45:12,980
Haven't got one, mate!
939
00:45:12,980 --> 00:45:15,500
Next contestant, please!
940
00:45:19,380 --> 00:45:21,580
Aww! Look!
941
00:45:21,580 --> 00:45:23,500
Not dog!
942
00:45:23,500 --> 00:45:25,500
And our final contestant.
943
00:45:27,700 --> 00:45:29,980
AUDIENCE: Aww!
944
00:45:29,980 --> 00:45:32,340
Look how beautiful!
945
00:45:32,340 --> 00:45:34,180
Dog!
946
00:45:34,180 --> 00:45:35,780
Thank you very much,
947
00:45:35,780 --> 00:45:38,580
and thank you very much for
playing the game with us.
948
00:45:38,580 --> 00:45:41,180
Big round of applause for everyone
involved in dog or not dog!
949
00:45:49,500 --> 00:45:52,740
Now, this idea, this kind of
algorithm, a reinforcement learning
950
00:45:52,740 --> 00:45:55,780
algorithm, it can actually
have profound consequences
951
00:45:55,780 --> 00:45:58,500
because the algorithm doesn't
really care what it's looking at.
952
00:45:58,500 --> 00:46:01,060
If it can learn to identify
dogs, it can also learn
953
00:46:01,060 --> 00:46:03,220
to identify diseases.
954
00:46:03,220 --> 00:46:06,220
To explain, please welcome
an ophthalmologist
955
00:46:06,220 --> 00:46:09,020
from Moorfields
Eye Hospital, Pearse Keane.
956
00:46:12,020 --> 00:46:14,820
So, Pearse, tell me
what it is that you do.
957
00:46:14,820 --> 00:46:19,100
So I'm a consultant ophthalmologist
at Moorfields Eye Hospital
958
00:46:19,100 --> 00:46:22,180
and I specialise in the treatment
of retinal diseases.
959
00:46:22,180 --> 00:46:25,420
So, in particular, I specialise
in the treatment of a disease
960
00:46:25,420 --> 00:46:27,700
called macular degeneration,
961
00:46:27,700 --> 00:46:30,900
and macular degeneration is
the commonest cause of blindness
962
00:46:30,900 --> 00:46:33,540
in the United Kingdom.
And is it treatable?
963
00:46:33,540 --> 00:46:36,820
So the thing about macular
degeneration is that, if we pick
964
00:46:36,820 --> 00:46:40,660
it up early, we have some
good treatments and we can stop
965
00:46:40,660 --> 00:46:42,460
people going blind from it.
966
00:46:42,460 --> 00:46:45,660
The problem that we have is that it
affects so many people as they get
967
00:46:45,660 --> 00:46:49,620
older that sometimes we have a
challenge to actually find
968
00:46:49,620 --> 00:46:52,460
the people who've developed it
and treat them in a timely
969
00:46:52,460 --> 00:46:55,540
fashion. Because there are so many
people. Because there's so many.
970
00:46:55,540 --> 00:46:59,060
So we actually get 200 people who
develop the blinding forms
971
00:46:59,060 --> 00:47:03,500
of macular degeneration every
single day just in the UK.
972
00:47:03,500 --> 00:47:07,940
And so, for us, it seemed like
this would be a perfect example
973
00:47:07,940 --> 00:47:12,860
where we could apply artificial
intelligence to try to identify
974
00:47:12,860 --> 00:47:15,620
those people with the most
sight-threatening diseases at the
975
00:47:15,620 --> 00:47:17,220
earliest point and save their sight.
976
00:47:17,220 --> 00:47:19,060
And I've brought Elaine Manor,
977
00:47:19,060 --> 00:47:21,700
a patient of mine at Moorfields
Eye Hospital.
978
00:47:28,300 --> 00:47:30,860
Thank you very much
for joining us, Elaine.
979
00:47:30,860 --> 00:47:34,340
For me, Elaine exemplifies
actually why this is important,
980
00:47:34,340 --> 00:47:40,220
because the Elaine story is
that actually she had lost sight
981
00:47:40,220 --> 00:47:43,660
in one eye from this condition,
from macular degeneration,
982
00:47:43,660 --> 00:47:48,100
and then in 2012, she started
to develop it in her good eye.
983
00:47:48,100 --> 00:47:50,900
And she went to her high street
optician and was told,
984
00:47:50,900 --> 00:47:53,780
"You need to be urgently seen
by a retina specialist,"
985
00:47:53,780 --> 00:47:58,060
someone like me, but she got an
appointment for six weeks later.
986
00:47:58,060 --> 00:48:00,380
So you can imagine a situation
if you're losing sight
987
00:48:00,380 --> 00:48:04,140
in your good eye and there is an
effective treatment, but you're told
988
00:48:04,140 --> 00:48:05,780
that you have to wait six weeks.
989
00:48:05,780 --> 00:48:07,420
What was that like at the time,
Elaine?
990
00:48:07,420 --> 00:48:09,780
I was absolutely terrified.
991
00:48:09,780 --> 00:48:14,500
I felt that, when I went to bed,
would I wake up to a world
992
00:48:14,500 --> 00:48:17,740
of darkness?
Would I see my family again?
993
00:48:17,740 --> 00:48:19,780
It was worrying. Yeah.
994
00:48:19,780 --> 00:48:22,220
And this is all just because of
the sheer volume of patients?
995
00:48:22,220 --> 00:48:25,060
Because of the huge number
of patients.
996
00:48:25,060 --> 00:48:27,100
So what is this machine?
How does that work?
997
00:48:27,100 --> 00:48:28,900
So this is an eye scanner.
998
00:48:28,900 --> 00:48:30,900
It's something that scans
the retina.
999
00:48:30,900 --> 00:48:33,820
It's super-high resolution,
a three-dimensional image
1000
00:48:33,820 --> 00:48:34,980
of the back of your eye.
1001
00:48:34,980 --> 00:48:38,100
But it's something that these
algorithms can get to work on.
1002
00:48:38,100 --> 00:48:39,540
Well, that's exactly it.
1003
00:48:39,540 --> 00:48:42,380
I mean, if the algorithm
can tell dog or not dog,
1004
00:48:42,380 --> 00:48:46,460
then it seems like it could tell
macular degeneration or not macular
1005
00:48:46,460 --> 00:48:50,100
degeneration, and help people...help
prevent people from going blind.
1006
00:48:50,100 --> 00:48:55,140
So I hope that we can have a look
at how this algorithm looks
1007
00:48:55,140 --> 00:48:56,820
when it tries to process an image.
1008
00:48:56,820 --> 00:48:59,420
Can we have a little look? You can
see here on the scan, it's called
1009
00:48:59,420 --> 00:49:02,020
out that there is an urgent
problem with the eye.
1010
00:49:02,020 --> 00:49:04,420
It's identified choroidal
neovascularisation.
1011
00:49:04,420 --> 00:49:07,580
This red area here is some
blood vessel growth at the back
1012
00:49:07,580 --> 00:49:11,100
of the eye, which is suggestive of
the condition Pearse was describing.
1013
00:49:11,100 --> 00:49:13,260
We've published a research article.
1014
00:49:13,260 --> 00:49:17,420
What we've shown is the proof
of concept that the algorithm
1015
00:49:17,420 --> 00:49:22,180
that we've developed is actually
as good as me or other consultants
1016
00:49:22,180 --> 00:49:24,860
at Moorfields or other
world-leading eye doctors
1017
00:49:24,860 --> 00:49:26,660
at diagnosing these diseases.
1018
00:49:26,660 --> 00:49:29,900
And this stuff is, I mean,
it's incredibly important to get
1019
00:49:29,900 --> 00:49:33,100
you to those urgent cases quicker.
Yes.
1020
00:49:33,100 --> 00:49:34,340
Yes. Like Elaine, I guess.
1021
00:49:34,340 --> 00:49:36,980
But the great thing was, although
it was stressful for Elaine
1022
00:49:36,980 --> 00:49:39,780
at the start, she was able
to get the treatment
1023
00:49:39,780 --> 00:49:41,660
that she needed in this eye in time,
1024
00:49:41,660 --> 00:49:43,300
and we have been able
to save the sight
1025
00:49:43,300 --> 00:49:45,140
that she has in her good eye.
1026
00:49:45,140 --> 00:49:48,460
Pearse Keane and Elaine, thank you
very much indeed for joining me.
1027
00:49:48,460 --> 00:49:49,980
APPLAUSE
1028
00:49:55,420 --> 00:49:56,900
Wow.
1029
00:49:56,900 --> 00:49:59,780
As I think Pearse's work
there demonstrates, you can see
1030
00:49:59,780 --> 00:50:02,900
just how far you can go
with a machine that is capable
1031
00:50:02,900 --> 00:50:05,100
of categorising images.
As impressive
1032
00:50:05,100 --> 00:50:06,820
as all of this technology is,
1033
00:50:06,820 --> 00:50:09,060
it's not quite perfect.
1034
00:50:09,060 --> 00:50:11,180
It is important to say
that these algorithms,
1035
00:50:11,180 --> 00:50:14,740
they don't really understand the
world in the same way that we do.
1036
00:50:14,740 --> 00:50:17,740
They don't understand context,
they don't understand nuance,
1037
00:50:17,740 --> 00:50:21,060
and that means that they
really can make mistakes.
1038
00:50:21,060 --> 00:50:24,020
In particular, when you take
something and you put
1039
00:50:24,020 --> 00:50:27,660
it in a strange situation,
like this picture, here.
1040
00:50:27,660 --> 00:50:31,620
So if we take farmyard animals and
we put them in strange situations,
1041
00:50:31,620 --> 00:50:34,660
like being cuddled by a child,
for instance,
1042
00:50:34,660 --> 00:50:37,380
the algorithm labels it as a dog.
1043
00:50:37,380 --> 00:50:40,620
This one I quite like. If you take
a farmyard animal and put it in a
1044
00:50:40,620 --> 00:50:43,340
tree - this is a real photograph,
by the way -
1045
00:50:43,340 --> 00:50:44,940
the algorithm gets a bit confused!
1046
00:50:44,940 --> 00:50:46,020
LAUGHTER
1047
00:50:46,020 --> 00:50:48,060
Surely it's an orangutan?!
1048
00:50:48,060 --> 00:50:50,660
And my favourite one of all
is that, if you leave them back
1049
00:50:50,660 --> 00:50:54,260
in the original context but instead
paint them pink, the algorithm
1050
00:50:54,260 --> 00:50:57,860
thinks they must be flowers,
which I think is quite sweet!
1051
00:50:57,860 --> 00:50:59,260
LAUGHTER
1052
00:50:59,260 --> 00:51:02,700
This is the consequence of the
way that these machines learn.
1053
00:51:02,700 --> 00:51:05,780
As we have seen a couple
of times during this show,
1054
00:51:05,780 --> 00:51:08,820
in the beginning they just start off
doing loads of completely random
1055
00:51:08,820 --> 00:51:11,980
things, which means that as
they settle in to completing
1056
00:51:11,980 --> 00:51:14,540
their task they can end up
with some rather
1057
00:51:14,540 --> 00:51:17,780
strange little quirks. Have a
look at this little video, here.
1058
00:51:17,780 --> 00:51:19,780
So this is a simulation
1059
00:51:19,780 --> 00:51:24,860
of an algorithm that was given
a body, and its objective was to
1060
00:51:24,860 --> 00:51:27,300
work out how to move its body.
1061
00:51:27,300 --> 00:51:30,380
Every time it falls over,
it fails and starts again.
1062
00:51:30,380 --> 00:51:32,020
And you can see it in the beginning,
1063
00:51:32,020 --> 00:51:34,980
it's just trying lots of things,
lots of random things.
1064
00:51:34,980 --> 00:51:39,460
But the consequence of that is
that it really has, like, some quite
1065
00:51:39,460 --> 00:51:43,420
strange...quite strange habits
that it ends up picking up
1066
00:51:43,420 --> 00:51:46,980
as it's sort of flailing all of
its limbs around at random.
1067
00:51:46,980 --> 00:51:49,740
Now, this is what happens when
you put reinforcement learning
1068
00:51:49,740 --> 00:51:53,260
into a body - a simulated one
in that particular case -
1069
00:51:53,260 --> 00:51:56,740
but there is nothing stopping you
from putting the very same ideas
1070
00:51:56,740 --> 00:52:00,340
into the body of a robot too.
So, to explain,
1071
00:52:00,340 --> 00:52:03,180
please join me in welcoming back
Anne-Marie Imafidon.
1072
00:52:12,500 --> 00:52:14,740
So, Anne-Marie, robots -
talk me through it.
1073
00:52:14,740 --> 00:52:17,060
So, embodied robots, we've got
1074
00:52:17,060 --> 00:52:19,700
lots of them that we're trying
to train to do all kinds
1075
00:52:19,700 --> 00:52:22,820
of different tasks,
and we've got an example here.
1076
00:52:22,820 --> 00:52:25,660
OK. So this is from Professor
Pieter Abbeel
1077
00:52:25,660 --> 00:52:28,460
at the Berkeley AI Research Lab.
Exactly.
1078
00:52:28,460 --> 00:52:31,700
So we can see this is a robot
that has never kind of known
1079
00:52:31,700 --> 00:52:33,140
how to move its arm before.
1080
00:52:33,140 --> 00:52:35,620
So it's just basically flinging
its arm around at random,
1081
00:52:35,620 --> 00:52:37,220
trying to get the stuff in the box?
1082
00:52:37,220 --> 00:52:39,180
Yes, just trying to get the peg
into the hole.
1083
00:52:39,180 --> 00:52:40,420
It's not doing very well.
1084
00:52:40,420 --> 00:52:43,580
It's not, because it hasn't had
the time to learn what an elbow
1085
00:52:43,580 --> 00:52:46,620
is, kind of how it needs to move
around, but also that the shape
1086
00:52:46,620 --> 00:52:48,740
needs to fit the hole
that it's got there.
1087
00:52:48,740 --> 00:52:50,580
OK. So this is after five goes.
1088
00:52:50,580 --> 00:52:52,780
We can see we've jumped to
1089
00:52:52,780 --> 00:52:55,660
iteration eight as well, and it's
getting very close.
1090
00:52:55,660 --> 00:52:58,620
Yeah, it's still not managing it.
I mean, it's basically just trying
1091
00:52:58,620 --> 00:53:00,380
lots and lots of things at random.
1092
00:53:00,380 --> 00:53:03,780
Yes. This I think, actually, it's
something that pretty much every
1093
00:53:03,780 --> 00:53:06,980
parent of a young child will
recognise.
1094
00:53:06,980 --> 00:53:11,180
This... This is Juno. Come on, then,
1095
00:53:11,180 --> 00:53:15,060
Juno, do you want to come
and have a little play?
1096
00:53:15,060 --> 00:53:17,220
Now, this is something
that you really notice,
1097
00:53:17,220 --> 00:53:23,020
is that, when they're very little,
they don't necessarily understand
1098
00:53:23,020 --> 00:53:25,500
how to control their limbs.
Let's give you a go.
1099
00:53:25,500 --> 00:53:28,340
You can try this, too,
Juno, here we go.
1100
00:53:28,340 --> 00:53:30,300
Do you want to play with this one?
1101
00:53:30,300 --> 00:53:33,100
Is it a similar thing going on?
Exactly.
1102
00:53:33,100 --> 00:53:36,700
So you learn the kind of motor
skills you might need to actually
1103
00:53:36,700 --> 00:53:39,700
complete this task between
12 to 18 months,
1104
00:53:39,700 --> 00:53:41,740
and Juno here is only six months
1105
00:53:41,740 --> 00:53:43,980
so it's gone in her mouth,
1106
00:53:43,980 --> 00:53:47,380
which the robot didn't do,
admittedly!
1107
00:53:47,380 --> 00:53:49,860
Different kind of reward!
Exactly.
1108
00:53:49,860 --> 00:53:53,940
But it's only over time, by learning
through lots and lots of
1109
00:53:53,940 --> 00:53:55,580
random behaviour, initially,
1110
00:53:55,580 --> 00:53:57,380
that we learn how to
control our limbs.
1111
00:53:57,380 --> 00:53:59,420
We've all been there.
We've all been Juno!
1112
00:53:59,420 --> 00:54:01,700
So the robots are learning
in a very similar way,
1113
00:54:01,700 --> 00:54:03,500
just they do have fewer
distractions.
1114
00:54:03,500 --> 00:54:06,180
They don't want to put things
in their mouths, and their rewards
1115
00:54:06,180 --> 00:54:07,740
are just a little bit different.
1116
00:54:07,740 --> 00:54:11,340
So if the humans take about 18
months to master this task,
1117
00:54:11,340 --> 00:54:13,020
how long did that robot take?
1118
00:54:13,020 --> 00:54:15,780
So this particular robot that we
were just looking at learned
1119
00:54:15,780 --> 00:54:18,140
to do that task in under an hour.
1120
00:54:18,140 --> 00:54:21,020
Yeah. Well, not long to go now,
Juno. You might not manage it in
1121
00:54:21,020 --> 00:54:22,500
an hour, but give it a few months
1122
00:54:22,500 --> 00:54:24,260
and you'll definitely
be doing these.
1123
00:54:24,260 --> 00:54:27,660
Thank you very much to Juno
and Anne-Marie! Thank you.
1124
00:54:27,660 --> 00:54:29,540
APPLAUSE
1125
00:54:32,420 --> 00:54:36,980
That is the key idea - it's all
about harnessing the power
1126
00:54:36,980 --> 00:54:41,540
of randomness, using trial and error
to home in on something that works.
1127
00:54:41,540 --> 00:54:43,980
Babies do it, robots do it,
1128
00:54:43,980 --> 00:54:46,340
and as an audience now,
we're going to do it, too.
1129
00:54:46,340 --> 00:54:51,020
So, as a room, we're all going
to be artificial intelligence.
1130
00:54:51,020 --> 00:54:54,580
We're going to let you loose
like the machine-learning bots.
1131
00:54:54,580 --> 00:54:56,700
And for that, I'm
going to hand over
1132
00:54:56,700 --> 00:54:59,020
to the very wonderful WiFi Wars.
1133
00:54:59,020 --> 00:55:01,380
CHEERING AND APPLAUSE
1134
00:55:04,060 --> 00:55:06,700
OK, Steve, tell us what we're
going to do.
1135
00:55:06,700 --> 00:55:10,740
So what we're going to do is we'll
get you guys here to represent
1136
00:55:10,740 --> 00:55:12,780
random computer attempts at
navigating a maze.
1137
00:55:12,780 --> 00:55:14,540
What we need to do is get your
phones out
1138
00:55:14,540 --> 00:55:15,820
and go into your browsers.
1139
00:55:15,820 --> 00:55:18,420
And while you are going back
into there, I will show you the maze
1140
00:55:18,420 --> 00:55:20,900
if I can, please, Rob. There we are.
So what we're going to do,
1141
00:55:20,900 --> 00:55:23,460
we're going to put all of you in
this blue room in the bottom right
1142
00:55:23,460 --> 00:55:25,980
corner, but we're going to
incentivise you to go exploring.
1143
00:55:25,980 --> 00:55:28,220
So you will get ten points for each
square in the corridor
1144
00:55:28,220 --> 00:55:30,180
you get through, but, far more
importantly,
1145
00:55:30,180 --> 00:55:31,860
you'll get 1,000 bonus
points for each
1146
00:55:31,860 --> 00:55:33,860
of the coloured rooms
you manage to get to.
1147
00:55:33,860 --> 00:55:37,180
So the big prize in this one is to get
to these three interconnected rooms
1148
00:55:37,180 --> 00:55:39,900
in the top left. If you do that,
I will be very impressed.
1149
00:55:39,900 --> 00:55:41,860
Everything we do at WiFi Wars
is competitive
1150
00:55:41,860 --> 00:55:43,900
so what we've done is we split
you into two halves.
1151
00:55:43,900 --> 00:55:45,660
So you guys on this
half of the room,
1152
00:55:45,660 --> 00:55:49,180
I believe, are the blue computer.
CHEERING
1153
00:55:49,180 --> 00:55:51,740
And you guys are the much
louder red computer!
1154
00:55:51,740 --> 00:55:53,500
CHEERING/BOOING
1155
00:55:53,500 --> 00:55:55,900
That wasn't fair. That wasn't fair.
Booing already!
1156
00:55:55,900 --> 00:55:59,180
OK. So I'm going to give you
90 seconds to go through that.
1157
00:55:59,180 --> 00:56:00,780
So, because there's so many people
1158
00:56:00,780 --> 00:56:03,140
and we're going to be looking
at that big map... Yeah.
1159
00:56:03,140 --> 00:56:05,740
..it's really difficult to work
out who you are, right?
1160
00:56:05,740 --> 00:56:08,300
So you are essentially each
representing a random
1161
00:56:08,300 --> 00:56:10,180
attempt at winning this game.
Exactly.
1162
00:56:10,180 --> 00:56:12,580
So if it was one or two
you could find your way through,
1163
00:56:12,580 --> 00:56:14,940
but we're going to make that
much more complicated. OK.
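Here is a small Python sketch of what the room is about to simulate: many independent random walkers wander a grid maze, earn 10 points for each new square and a 1,000-point bonus for reaching the goal room, and the best random attempt wins. The maze layout, the number of walkers and the 90-step budget are invented; only the scoring idea mirrors the game described above.

```python
import random

# Lots of random attempts at a maze, scored like the audience game: 10 points
# per new square, a 1,000-point bonus for reaching the goal room "G". The maze
# layout, walker count and step budget are invented for this sketch.
MAZE = [
    "#########",
    "#G..#...#",
    "#.#.#.#.#",
    "#.#...#.#",
    "#.#####.#",
    "#......S#",
    "#########",
]
START = next((r, c) for r, row in enumerate(MAZE)
             for c, ch in enumerate(row) if ch == "S")

def run_walker(steps=90):
    """One random attempt: wander for a fixed number of steps and score it."""
    pos, visited, score = START, {START}, 0
    for _ in range(steps):
        r, c = pos
        moves = [(r + dr, c + dc)
                 for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if MAZE[r + dr][c + dc] != "#"]         # can't walk into walls
        pos = random.choice(moves)
        if pos not in visited:
            visited.add(pos)
            score += 10                                   # 10 points per new square
            if MAZE[pos[0]][pos[1]] == "G":
                score += 1000                             # bonus for the goal room
    return score

scores = [run_walker() for _ in range(500)]               # 500 random attempts
print("Best attempt:", max(scores), "points; average:", sum(scores) // len(scores))
```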
1164
00:56:14,940 --> 00:56:17,140
Good. Are you ready?
ALL: Yeah!
1165
00:56:17,140 --> 00:56:18,860
All right. Let's do a game, then.
1166
00:56:18,860 --> 00:56:21,220
If you send it into their phones
now, please, Rob. We'll put
1167
00:56:21,220 --> 00:56:23,660
90 seconds on the clock and
hopefully we're going to see...
1168
00:56:23,660 --> 00:56:25,460
Immediately we see you all
there appearing.
1169
00:56:25,460 --> 00:56:27,980
We've got a lot of blue and red
players from the two computers.
1170
00:56:27,980 --> 00:56:30,620
Already people are getting out - a
red person was the first one out.
1171
00:56:30,620 --> 00:56:32,980
My goodness me! Although
collectively the blue team...
1172
00:56:32,980 --> 00:56:34,580
I was going to say they're
doing better
1173
00:56:34,580 --> 00:56:36,900
but by the time I said it,
they're not, so the red computer
1174
00:56:36,900 --> 00:56:39,180
after 15 seconds have 6,000 points
and the blues 5,400.
1175
00:56:39,180 --> 00:56:41,180
No-one's found a coloured room yet.
1176
00:56:41,180 --> 00:56:42,540
And, as I've said, if you can get
1177
00:56:42,540 --> 00:56:44,180
to these three in
the top left corner...
1178
00:56:44,180 --> 00:56:46,980
Blues doing very well. Yeah, they're
all going for it. Going round
1179
00:56:46,980 --> 00:56:49,780
the two main channels. As you would
expect, sort of more often than not
1180
00:56:49,780 --> 00:56:52,500
they're picking the main
thoroughfares as they hit dead ends.
1181
00:56:52,500 --> 00:56:55,340
And actually we've got a few people
who are in the purple room now,
1182
00:56:55,340 --> 00:56:57,420
so well done. I think red was the
first so well done.
1183
00:56:57,420 --> 00:56:59,500
For context, if you're wondering
what's going on,
1184
00:56:59,500 --> 00:57:02,100
what Rob's created is a thing
that allows us - without installing
1185
00:57:02,100 --> 00:57:04,260
anything, on any device,
browser or operating system -
1186
00:57:04,260 --> 00:57:06,500
to beam games onto your devices.
A red one's on the way!
1187
00:57:06,500 --> 00:57:08,740
Wow. All right. Yeah. Good.
I don't know if you know...
1188
00:57:08,740 --> 00:57:12,380
If there's a red, if you now
are in a green room, cheer.
1189
00:57:12,380 --> 00:57:14,900
Yes, you! Well done to you.
You were very, very good.
1190
00:57:14,900 --> 00:57:16,980
You've got 30 seconds left to do
something about it.
1191
00:57:16,980 --> 00:57:19,700
Nobody's managed to find
the orange or the red yet,
1192
00:57:19,700 --> 00:57:20,900
but we have got a few...
1193
00:57:20,900 --> 00:57:23,700
Well, we've got another red coming
for the green, but a lot more blues,
1194
00:57:23,700 --> 00:57:25,380
actually, so the blues
might level this up
1195
00:57:25,380 --> 00:57:27,700
if they can all get in there,
but with 20 seconds left
1196
00:57:27,700 --> 00:57:29,660
it is 70,000 points
to the red computer.
1197
00:57:29,660 --> 00:57:31,100
Blues, you need to do better.
1198
00:57:31,100 --> 00:57:33,300
You've got only 60,000
with 16 seconds left.
1199
00:57:33,300 --> 00:57:35,500
So many people getting stuck
in this horrible dead end
1200
00:57:35,500 --> 00:57:37,380
that we cruelly put in the top
right corner.
1201
00:57:37,380 --> 00:57:39,740
I think blues will need a miracle.
The reds are cheering.
1202
00:57:39,740 --> 00:57:40,900
Ten seconds left.
1203
00:57:40,900 --> 00:57:44,060
Five seconds left now.
88,000 for the red computer.
1204
00:57:44,060 --> 00:57:45,700
82,000 now for the blues,
1205
00:57:45,700 --> 00:57:49,340
so that is a win, I'm afraid,
for the red computer!
1206
00:57:49,340 --> 00:57:51,260
Wahey! Well done to you all.
1207
00:57:51,260 --> 00:57:55,140
"Boo!" Polite applause.
1208
00:57:55,140 --> 00:57:57,540
Well done. That was absolutely
fantastic.
1209
00:57:57,540 --> 00:57:59,180
OK. Here is the big question
1210
00:57:59,180 --> 00:58:01,740
on top of all of that,
because we saw in the last lecture
1211
00:58:01,740 --> 00:58:04,180
the highs and the lows of
uncertainty. We saw what can go
1212
00:58:04,180 --> 00:58:07,540
right and what can go wrong
when you rely on something
1213
00:58:07,540 --> 00:58:09,780
that has randomness at its core.
1214
00:58:09,780 --> 00:58:14,100
So, OK, if we accept that perfection
is impossible - if we acknowledge
1215
00:58:14,100 --> 00:58:17,940
that these things are always going
to have errors - we have to ask,
1216
00:58:17,940 --> 00:58:22,300
do we want to put flawed machines
in a position of power?
1217
00:58:22,300 --> 00:58:25,180
That, I think, is one for
the next lecture.
1218
00:58:25,180 --> 00:58:28,140
But, for now, who'd like
to have a rematch of this?
1219
00:58:28,140 --> 00:58:30,300
CHEERING