1
00:00:08,720 --> 00:00:10,480
- [Gemma] Artificial
intelligence,
2
00:00:10,481 --> 00:00:14,199
for years, a dream of scientists
and Hollywood producers,
3
00:00:14,960 --> 00:00:16,400
but also the violent
4
00:00:16,401 --> 00:00:18,800
and destructive stuff
of our nightmares.
5
00:00:19,399 --> 00:00:21,199
My name is Gemma Chan.
6
00:00:21,200 --> 00:00:24,480
I play a robot in the
futuristic sci-fi
drama, "Humans."
7
00:00:24,481 --> 00:00:27,320
As the development of
artificial intelligence
accelerates
8
00:00:27,321 --> 00:00:29,920
and starts to pervade every
aspect of our lives,
9
00:00:29,921 --> 00:00:33,959
I'm hoping to find out whether
the world depicted
10
00:00:33,960 --> 00:00:35,960
in science fiction
is 10 years away,
11
00:00:35,961 --> 00:00:40,118
100 years away, or
closer than we think.
12
00:00:40,119 --> 00:00:43,520
I'm going to meet some of the
greatest minds
in science.
13
00:00:43,521 --> 00:00:44,919
They're divided.
14
00:00:44,920 --> 00:00:48,159
Some think AI holds the key to
a safe and prosperous future.
15
00:00:48,160 --> 00:00:50,719
- I see AI as an
opportunity actually,
16
00:00:50,720 --> 00:00:53,199
to unlock humanity's
full potential.
17
00:00:53,200 --> 00:00:54,759
- [Gemma] Others
think the nightmare
18
00:00:54,760 --> 00:00:56,639
may just be about to begin.
19
00:00:56,640 --> 00:01:00,478
- We're worried that this
will come out too soon,
20
00:01:00,479 --> 00:01:01,800
people will die.
21
00:01:01,801 --> 00:01:03,860
- Once you have a kind of
super intelligent genie
22
00:01:03,861 --> 00:01:05,239
that's out of the bottle,
23
00:01:04,959 --> 00:01:07,000
it might not be possible
to put it back in again.
24
00:01:07,001 --> 00:01:10,600
- [Gemma] To see just how far
we can take the power of AI,
25
00:01:10,601 --> 00:01:12,560
we're conducting a
unique experiment,
26
00:01:12,561 --> 00:01:14,560
building a robot version of me.
27
00:01:14,561 --> 00:01:17,680
- This is really the first
time we've tried this,
28
00:01:17,681 --> 00:01:19,358
so it's very, very new.
29
00:01:19,359 --> 00:01:20,319
(intense music)
30
00:01:20,320 --> 00:01:24,039
- It's really quite uncanny.
31
00:01:24,040 --> 00:01:26,480
A robot that looks
like me is one thing.
32
00:01:26,481 --> 00:01:28,199
That is my nose.
33
00:01:28,200 --> 00:01:31,280
But can it harness the power
of artificial intelligence
34
00:01:31,281 --> 00:01:33,598
to actually think like me?
35
00:01:33,599 --> 00:01:35,519
- [Robot Gemma] I like
the taste of cheese.
36
00:01:35,520 --> 00:01:36,679
(laughing)
37
00:01:36,680 --> 00:01:38,000
Can we build a human?
38
00:01:38,640 --> 00:01:39,359
It's so strange.
39
00:01:39,360 --> 00:01:42,799
(intense music)
40
00:01:47,879 --> 00:01:50,640
(eerie music)
41
00:01:57,040 --> 00:01:58,958
Hi!
42
00:01:58,959 --> 00:01:59,599
Hello.
43
00:01:59,519 --> 00:02:00,519
Nice to meet you.
44
00:02:00,520 --> 00:02:02,199
Day one of the robot build
45
00:02:02,200 --> 00:02:03,920
and we need a body
to house its brain.
46
00:02:03,921 --> 00:02:05,260
- Welcome to Millennium FX.
47
00:02:05,261 --> 00:02:07,240
- [Gemma] Thank you,
thanks for having us.
48
00:02:07,241 --> 00:02:10,080
Millennium FX is one of
Europe's leading suppliers
49
00:02:10,081 --> 00:02:13,320
of prosthetics, animatronics
and specialist makeup.
50
00:02:13,321 --> 00:02:16,158
(eerie music)
51
00:02:16,159 --> 00:02:17,519
Oh my goodness.
52
00:02:17,520 --> 00:02:19,439
- This is something that we
do a lot at Millennium.
53
00:02:19,440 --> 00:02:21,459
Babies come up in TV shows a lot
54
00:02:21,460 --> 00:02:23,599
but you can't film on
babies for very long
55
00:02:23,600 --> 00:02:25,240
so we produce these
lifelike babies
56
00:02:25,080 --> 00:02:26,039
so that people can hold them.
57
00:02:26,040 --> 00:02:28,438
- It's just so weird.
58
00:02:28,439 --> 00:02:31,679
The closer something
is to looking human,
59
00:02:31,680 --> 00:02:33,839
the weirder it feels.
60
00:02:33,840 --> 00:02:36,038
- It's what we call
the Uncanny Valley,
61
00:02:36,039 --> 00:02:37,919
where your mind kind of
knows, it kinda knows,
62
00:02:37,920 --> 00:02:41,198
that no matter how
perfect something is,
63
00:02:41,199 --> 00:02:42,520
no matter how good
the movement is
64
00:02:42,521 --> 00:02:43,900
or how realistic it looks,
65
00:02:43,901 --> 00:02:46,960
your mind knows that there's
something not
quite right.
66
00:02:46,961 --> 00:02:49,480
- I think I'm just
gonna put her down.
67
00:02:49,481 --> 00:02:52,620
Making a silicone duplicate
will enable the
robot builders
68
00:02:52,621 --> 00:02:54,319
to create a lifelike skin.
69
00:02:54,320 --> 00:02:57,638
A-E-I-O-U.
70
00:02:57,639 --> 00:03:01,679
My double will need to have
hundreds of facial expressions
71
00:03:01,680 --> 00:03:04,000
just like me and they'll
need to be in sync
72
00:03:04,001 --> 00:03:06,599
with what the artificial
brain is thinking.
73
00:03:06,600 --> 00:03:08,279
Hi, pleasure to meet you.
74
00:03:08,280 --> 00:03:10,039
Time for me to get
my face copied.
75
00:03:10,040 --> 00:03:13,199
- [Kate] We're going
to have you 3D scanned.
76
00:03:14,280 --> 00:03:16,839
- Kate Walsh is the
producer in charge.
77
00:03:16,840 --> 00:03:20,300
- Today we're gonna try and
watch you as closely as possible
78
00:03:20,301 --> 00:03:23,799
and see if we can pick up on
all the subtle expressions
79
00:03:23,800 --> 00:03:25,120
and movements of your face.
80
00:03:25,121 --> 00:03:28,478
- I'm intrigued to
see how far we can go
81
00:03:28,479 --> 00:03:30,679
and how lifelike we can make it.
82
00:03:30,680 --> 00:03:34,119
I really have no
idea what to expect.
83
00:03:34,120 --> 00:03:36,718
- [Man] Three,
two, one, scanning.
84
00:03:36,719 --> 00:03:37,560
(eerie music)
85
00:03:37,561 --> 00:03:39,080
- [Gemma] It's so cool.
86
00:03:41,599 --> 00:03:43,920
- This is the room where we'll
be doing your headcast
87
00:03:43,921 --> 00:03:46,579
and on the wall behind you
are some of the people
88
00:03:46,580 --> 00:03:49,158
who have had the pleasure
of life casting before.
89
00:03:49,159 --> 00:03:50,360
- Is that Gordon Ramsay there?
90
00:03:50,361 --> 00:03:51,479
- [Kate] It is, yes.
91
00:03:52,599 --> 00:03:53,878
- That's it, perfect.
92
00:03:53,879 --> 00:03:55,879
And if you wanna just open
your mouth, just
a fraction,
93
00:03:55,880 --> 00:03:57,240
and just blow out gently.
94
00:03:57,241 --> 00:03:58,319
That's it.
95
00:03:58,320 --> 00:03:59,960
Keep your eyes nice and relaxed.
96
00:03:59,800 --> 00:04:00,919
You're doing fantastically.
97
00:04:00,920 --> 00:04:02,799
- [Gemma] The cast consists
98
00:04:02,800 --> 00:04:04,479
of two different
types of silicone
99
00:04:04,480 --> 00:04:06,840
and is finished off with a
traditional plaster shell.
100
00:04:06,841 --> 00:04:08,840
- [Man] One, two, three. (camera
snapping)
101
00:04:08,841 --> 00:04:10,598
Great shot. Well done.
102
00:04:10,599 --> 00:04:12,639
- [Gemma] The whole process
needs to be quick,
103
00:04:12,640 --> 00:04:16,119
as after 20 minutes, the
heat becomes intolerable.
104
00:04:16,120 --> 00:04:18,398
- [Man] We're gonna
wipe that off.
105
00:04:18,399 --> 00:04:19,719
That's great.
106
00:04:19,720 --> 00:04:23,280
That's a great cast and you did
really, really,
really well.
107
00:04:23,281 --> 00:04:24,460
- Great, thank you!
108
00:04:24,461 --> 00:04:26,220
While the team will
need to turn silicone
109
00:04:26,221 --> 00:04:28,080
into something
that resembles me,
110
00:04:28,081 --> 00:04:30,959
a bigger challenge will
be to create its mind.
111
00:04:30,960 --> 00:04:34,279
For that, the robot needs
artificial intelligence,
112
00:04:35,040 --> 00:04:37,760
defined by computer
scientists as a machine
113
00:04:37,761 --> 00:04:40,120
having the capability to make
a human-like decision.
114
00:04:41,839 --> 00:04:44,860
Thank you so much for agreeing
to talk to
us today.
115
00:04:44,861 --> 00:04:47,278
Oxford professor of
philosophy, Nick Bostrom,
116
00:04:47,279 --> 00:04:48,920
is one of the world's
leading experts in AI.
117
00:04:48,921 --> 00:04:53,078
- So the goal of AI,
artificial intelligence,
118
00:04:53,079 --> 00:04:56,518
has all along, since its
beginnings in the 50s,
119
00:04:56,519 --> 00:04:59,780
been to make machines that
have the same general-purpose
120
00:04:59,781 --> 00:05:01,660
smartness that we humans have,
121
00:05:01,661 --> 00:05:04,839
that can do all the things
that the human
brain can do.
122
00:05:04,840 --> 00:05:06,198
(intense music)
123
00:05:06,199 --> 00:05:08,720
- [Gemma] From smartphones
to share prices,
124
00:05:08,721 --> 00:05:11,198
from online customer
support, to CCTV,
125
00:05:11,199 --> 00:05:13,758
we're now surrounded
by so much of it,
126
00:05:13,759 --> 00:05:16,359
we started to take
it for granted.
127
00:05:16,360 --> 00:05:18,000
- AI is a general
purpose technology.
128
00:05:18,001 --> 00:05:21,039
You can go through any
sector in the economy.
129
00:05:21,040 --> 00:05:23,999
You can go through healthcare,
entertainment,
130
00:05:24,000 --> 00:05:26,040
medicine, defense, you name it,
131
00:05:26,041 --> 00:05:29,398
it could think of ways
in which processes
132
00:05:29,399 --> 00:05:32,120
could be improved
by being smarter.
133
00:05:33,399 --> 00:05:36,119
- [Gemma] Machines are being
developed as soldiers,
134
00:05:36,120 --> 00:05:37,519
mastering real time translations
135
00:05:37,520 --> 00:05:40,518
using the latest in word
recognition software.
136
00:05:40,519 --> 00:05:41,959
- Can you hear me in French?
137
00:05:41,800 --> 00:05:42,959
(speaking foreign language)
138
00:05:42,960 --> 00:05:45,039
- Oh weird! (laughing)
139
00:05:45,040 --> 00:05:47,440
- [Gemma] And AI could soon
have a significant role
140
00:05:47,441 --> 00:05:48,800
in the legal system.
141
00:05:49,600 --> 00:05:51,319
An AI judge that
was shadowing cases
142
00:05:51,320 --> 00:05:53,319
from the European
Court of Human Rights,
143
00:05:53,320 --> 00:05:55,000
recently came up with
the same decision
144
00:05:55,001 --> 00:05:58,198
as the human judge in
four out of five cases.
145
00:05:58,199 --> 00:06:01,119
(eerie music)
146
00:06:01,120 --> 00:06:03,120
The medical profession
is also making use of it.
147
00:06:05,360 --> 00:06:07,920
These Canadian scientists
have recently been
148
00:06:07,921 --> 00:06:10,580
to the U.K. to sell
an AI diagnostic tool
149
00:06:10,581 --> 00:06:13,079
that they say can identify
a tumor instantly
150
00:06:13,080 --> 00:06:15,960
and accurately without
the need for a biopsy
151
00:06:15,961 --> 00:06:17,780
and with immediate results.
152
00:06:17,781 --> 00:06:20,639
- Just like a human,
the machine got trained
153
00:06:20,640 --> 00:06:25,759
on multiple thousands of images
to get an accuracy level.
154
00:06:26,600 --> 00:06:28,560
- [Gemma] Not only is it less
invasive and quicker,
155
00:06:28,561 --> 00:06:32,040
but it could save
the NHS millions of
pounds if introduced.
156
00:06:32,041 --> 00:06:35,839
(relaxing music)
157
00:06:36,600 --> 00:06:39,040
Some of us are already
entrusting our
lives to AI.
158
00:06:39,041 --> 00:06:42,879
(relaxing music)
159
00:06:47,480 --> 00:06:49,559
This is the Tesla Model S,
160
00:06:49,560 --> 00:06:53,039
on the market for a
cool 100,000 pounds.
161
00:06:53,040 --> 00:06:55,120
(relaxing music)
162
00:06:59,000 --> 00:07:00,159
Pick a lane.
163
00:07:00,160 --> 00:07:03,080
Noel Sharkey is an academic
specializing
in AI
164
00:07:03,081 --> 00:07:05,660
and is interested in how
autonomous decision making
165
00:07:05,661 --> 00:07:06,880
will affect us all.
166
00:07:06,881 --> 00:07:08,958
- You're gonna take
your hands off.
167
00:07:08,959 --> 00:07:10,680
- I've arranged to meet
him on a test track
168
00:07:10,681 --> 00:07:13,839
to try out one of the car's
most impressive features,
169
00:07:14,639 --> 00:07:18,040
the ability to drive itself
using artificial intelligence.
170
00:07:18,041 --> 00:07:21,759
(intense music)
171
00:07:22,680 --> 00:07:24,080
All right, the hands
are coming off.
172
00:07:25,279 --> 00:07:26,480
The hands are off.
173
00:07:26,240 --> 00:07:27,359
- Whoa.
- We're approaching a bend.
174
00:07:27,360 --> 00:07:28,319
- Oh!
175
00:07:28,320 --> 00:07:31,120
(dramatic music)
176
00:07:32,319 --> 00:07:33,559
(beeping)
177
00:07:33,560 --> 00:07:35,679
- Oh my goodness. See that?
178
00:07:35,680 --> 00:07:37,279
It nearly took us
off the road there.
179
00:07:37,240 --> 00:07:38,160
- [Noel] Yes.
180
00:07:38,161 --> 00:07:40,100
- I know this is meant to be
181
00:07:40,101 --> 00:07:43,160
a less stressful driving
experience but
I don't know.
182
00:07:44,920 --> 00:07:47,160
I'm sweating.
- Yeah, of course.
183
00:07:47,079 --> 00:07:48,120
- I'm sweating.
184
00:07:48,000 --> 00:07:49,040
Oh my goodness, right.
185
00:07:48,879 --> 00:07:50,240
- You try being the passenger.
186
00:07:50,241 --> 00:07:52,919
(laughing)
187
00:07:52,920 --> 00:07:54,639
- The semi-autonomous
cars like this
188
00:07:54,640 --> 00:07:56,878
are really quite controversial,
189
00:07:56,879 --> 00:07:59,239
especially in the
field of robotics
190
00:07:59,240 --> 00:08:02,679
and we really need much
broader societal debate.
191
00:08:02,680 --> 00:08:05,518
The U.K. government's rolling
out autonomous cars
192
00:08:05,519 --> 00:08:07,000
and they haven't really debated
193
00:08:07,001 --> 00:08:09,460
as to whether people
want it or not.
194
00:08:09,461 --> 00:08:11,199
- [Gemma] This controversy
has been fueled
195
00:08:11,200 --> 00:08:12,638
by drivers in America,
196
00:08:12,639 --> 00:08:14,120
ignoring the company's
safety guidelines
197
00:08:14,121 --> 00:08:17,799
and uploading outrageous
clips to the internet.
198
00:08:17,800 --> 00:08:20,000
- But I think what puts
it all in perspective
199
00:08:20,001 --> 00:08:22,679
is the recent fatal
crash in the U.S.A.
200
00:08:22,680 --> 00:08:26,638
- Yes, I remember
hearing something
about that in the news.
201
00:08:26,639 --> 00:08:30,438
Earlier this year, a
driver crashed and died,
202
00:08:30,439 --> 00:08:32,360
allegedly whilst watching
a Harry Potter film.
203
00:08:34,039 --> 00:08:38,279
His car hit the underside of a
trailer at 74 miles per hour.
204
00:08:39,279 --> 00:08:41,719
- It pulled out right in front
of it, bright white trailer,
205
00:08:41,720 --> 00:08:43,580
bright white sky, very sunny,
206
00:08:43,581 --> 00:08:45,560
and the camera couldn't
detect the trailer
207
00:08:45,561 --> 00:08:47,000
and that's why it ran into it.
208
00:08:47,001 --> 00:08:50,678
- And I suppose the
more autonomous vehicles
209
00:08:50,679 --> 00:08:52,200
you have on the road,
210
00:08:52,799 --> 00:08:56,119
the likelihood of incidents
like that, or accidents,
211
00:08:56,120 --> 00:08:58,000
they're bound to happen.
212
00:08:59,320 --> 00:09:02,839
- It's measuring the space now
with its sensors, I think.
213
00:09:02,840 --> 00:09:04,079
(beeping)
214
00:09:04,080 --> 00:09:07,480
- [Gemma] But while Tesla say AI
cars will have fewer accidents-
215
00:09:08,639 --> 00:09:09,638
Wow!
216
00:09:09,639 --> 00:09:12,038
Ultimately, we
won't be in control.
217
00:09:12,039 --> 00:09:13,000
Look at, look!
218
00:09:12,840 --> 00:09:14,240
It's correcting itself as well!
219
00:09:15,919 --> 00:09:17,799
- So you imagine
what's going to happen
220
00:09:17,559 --> 00:09:19,080
when there really
are fully autonomous cars
221
00:09:19,081 --> 00:09:21,380
because you're gonna
have to delegate
222
00:09:21,381 --> 00:09:23,278
all your driving
decisions to them.
223
00:09:23,279 --> 00:09:24,720
- [Gemma] Life and
death decisions.
224
00:09:24,721 --> 00:09:26,519
- It could well be life
and death decisions.
225
00:09:26,520 --> 00:09:29,120
Well look, do you see this
van coming down here.
226
00:09:29,121 --> 00:09:31,080
Well we're just
driving along here
227
00:09:31,081 --> 00:09:33,558
on that side in this direction.
228
00:09:33,559 --> 00:09:37,239
The van comes hammering
at us for some reason
229
00:09:37,240 --> 00:09:39,799
and you got that woman
there with the pushchair
230
00:09:39,800 --> 00:09:41,639
and she's gonna be
hit if we avoid it.
231
00:09:42,720 --> 00:09:45,500
The car has to make a
decision to take the hit
232
00:09:45,501 --> 00:09:47,518
or to kill the mother and baby.
233
00:09:47,519 --> 00:09:48,399
What would you do?
234
00:09:48,400 --> 00:09:49,679
- I don't know.
235
00:09:49,360 --> 00:09:51,399
I don't know what I would
do in that split second.
236
00:09:51,240 --> 00:09:52,440
I-
- Yeah, that's what I'm-
237
00:09:52,240 --> 00:09:53,919
thinking.
- I think it's a really,
238
00:09:53,559 --> 00:09:55,680
that's a really difficult
decision for anyone
to make,
239
00:09:55,681 --> 00:09:57,080
for a human to make.
240
00:09:57,799 --> 00:10:01,239
How can a car decide which
life is more valuable?
241
00:10:01,240 --> 00:10:03,440
- I don't think
it should, myself.
242
00:10:03,441 --> 00:10:05,599
(ominous music)
243
00:10:05,600 --> 00:10:06,860
- The robot we're building
244
00:10:06,861 --> 00:10:09,039
won't need to make life
and death decisions
245
00:10:12,399 --> 00:10:15,519
but it will have to harness
AI's decision making powers
246
00:10:15,520 --> 00:10:16,879
to think like me.
247
00:10:19,039 --> 00:10:21,600
It's being built on a
modest industrial estate
248
00:10:21,601 --> 00:10:23,679
on the outskirts of
Penryn, Cornwall.
249
00:10:24,480 --> 00:10:26,919
I wasn't expecting the
hub of the robot build
250
00:10:26,920 --> 00:10:29,840
to be next to a B & Q.
251
00:10:30,639 --> 00:10:33,080
I wonder if it's going
to be a mop head.
252
00:10:35,440 --> 00:10:37,180
Will Jackson is at the forefront
253
00:10:37,181 --> 00:10:39,279
of constructing
human-like robots.
254
00:10:39,280 --> 00:10:41,038
- [Robot] Well hello!
255
00:10:41,039 --> 00:10:42,119
- Hello.
256
00:10:42,120 --> 00:10:44,000
To be convincing,
robots need to reason,
257
00:10:44,001 --> 00:10:46,999
learn and understand
so they can react
258
00:10:47,000 --> 00:10:49,200
almost instinctively like us.
259
00:10:50,000 --> 00:10:51,119
- Hello.
260
00:10:51,120 --> 00:10:52,039
How are you?
261
00:10:51,879 --> 00:10:53,279
- I'm very well, thank you.
262
00:10:53,280 --> 00:10:55,319
How are you?
263
00:10:55,320 --> 00:10:56,440
- [Will] He's thinking about it.
264
00:10:56,441 --> 00:10:57,919
- I'm very well, thank you.
265
00:10:57,920 --> 00:10:59,398
- He's great!
266
00:10:59,399 --> 00:11:00,959
- He's very much a robot.
267
00:11:00,960 --> 00:11:03,480
You definitely know that this
is a machine,
not a person.
268
00:11:03,481 --> 00:11:05,319
He's pretty limited.
269
00:11:05,320 --> 00:11:07,639
- [Gemma] Will plans to apply
the knowledge acquired
270
00:11:07,640 --> 00:11:10,599
building RoboThespian
RT4 as a basis
271
00:11:10,600 --> 00:11:13,398
for constructing the
AI version of me.
272
00:11:13,399 --> 00:11:14,920
In order to be convincing,
273
00:11:14,921 --> 00:11:18,360
the robot will need to master
the complexity
of language.
274
00:11:18,361 --> 00:11:20,120
- So the first thing
that's gonna happen
275
00:11:20,121 --> 00:11:22,000
is somebody's gonna
speak to the robot
276
00:11:21,759 --> 00:11:23,480
and we've got
microphones in the ears,
277
00:11:23,481 --> 00:11:26,660
pick up the sound and at that
point, it's just sound.
278
00:11:26,661 --> 00:11:28,060
It doesn't mean anything
279
00:11:28,061 --> 00:11:30,080
so we have to turn the sound
into words,
into texts.
280
00:11:30,081 --> 00:11:33,359
Once we've got
the sound as text,
281
00:11:33,360 --> 00:11:35,300
we've gotta try and
find the meaning.
282
00:11:35,301 --> 00:11:37,479
What is this person
actually saying to me?
283
00:11:37,480 --> 00:11:38,399
What's the key word?
284
00:11:38,279 --> 00:11:39,320
What did they ask about?
285
00:11:39,321 --> 00:11:41,719
And once we've got that meaning,
286
00:11:41,720 --> 00:11:44,240
we have to try and think
of a sensible reply.
287
00:11:44,241 --> 00:11:47,398
We have to then turn
that back into speech.
288
00:11:47,399 --> 00:11:48,840
So we're gonna have to take
289
00:11:48,559 --> 00:11:50,080
the computer generated
version of your voice,
290
00:11:50,081 --> 00:11:52,239
that sounds like you.
291
00:11:52,240 --> 00:11:54,278
While we're doing all of this,
292
00:11:54,279 --> 00:11:57,300
the robot can't just sit
still so it has to have
293
00:11:57,301 --> 00:11:59,080
all of the little actions
that you would have
294
00:11:59,081 --> 00:12:00,660
the way you can listen,
295
00:12:00,661 --> 00:12:03,160
the way you think about what
somebody's going
to say,
296
00:12:03,161 --> 00:12:06,180
the way you think about
what you're going to say.
297
00:12:06,181 --> 00:12:08,300
You've gotta get all these
little subtle
things right
298
00:12:08,301 --> 00:12:09,600
all at the same time.
299
00:12:09,601 --> 00:12:12,280
The brain in this robot
has got to come up
300
00:12:12,281 --> 00:12:13,860
with an answer that makes sense,
301
00:12:13,861 --> 00:12:16,739
even when it's never heard
the question before.
302
00:12:16,740 --> 00:12:17,960
It's a huge challenge
303
00:12:17,961 --> 00:12:20,678
and we have so many
things to get right.
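To make the pipeline Will describes concrete, here is a minimal Python sketch of the same loop: sound in, text, meaning, reply, speech out. Every function and value in it is an illustrative stand-in, not the actual robot software.

```python
# A minimal sketch of the conversation pipeline described above:
# sound -> text -> meaning -> reply -> speech.
# Every function here is an illustrative stand-in, not the real system.

def speech_to_text(audio: bytes) -> str:
    """Turn raw microphone sound into words (a real recognizer would go here)."""
    return "what is your favourite food"      # placeholder recognition result

def extract_meaning(text: str) -> str:
    """Find the key word: what is the person actually asking about?"""
    for keyword in ("food", "work", "film"):
        if keyword in text:
            return keyword
    return "unknown"

def compose_reply(topic: str) -> str:
    """Come up with a sensible reply, even for a question never heard before."""
    replies = {"food": "I like the taste of cheese.",
               "unknown": "Tell me more about that."}
    return replies.get(topic, replies["unknown"])

def text_to_speech(reply: str) -> bytes:
    """Render the reply in the synthesized version of Gemma's voice."""
    return reply.encode("utf-8")              # stand-in for an audio waveform

def conversation_turn(audio: bytes) -> bytes:
    # On the real hardware, idle "listening" and "thinking" motions would be
    # triggered around these steps so the robot never just sits still.
    text = speech_to_text(audio)              # until here it is "just sound"
    topic = extract_meaning(text)
    reply = compose_reply(topic)
    return text_to_speech(reply)

print(conversation_turn(b"raw microphone samples"))
```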
304
00:12:20,679 --> 00:12:23,480
(intense music)
305
00:12:28,240 --> 00:12:30,779
- [Gemma] Will's going to use
his existing robot hardware
306
00:12:30,780 --> 00:12:32,940
to test out the speech
recognition software
307
00:12:32,941 --> 00:12:35,200
that he plans to
use on the robot me.
308
00:12:36,000 --> 00:12:37,960
- There's an old saying
in computer programming,
309
00:12:37,961 --> 00:12:40,079
it's garbage in, garbage out.
310
00:12:40,080 --> 00:12:43,120
If the robot cannot recognize
what you're saying,
311
00:12:43,121 --> 00:12:46,140
if it just gets one or two
words in a sentence wrong
312
00:12:46,141 --> 00:12:49,559
when you speak, what you get
back is complete gibberish.
313
00:12:50,440 --> 00:12:52,440
- [Gemma] Using speech
recognition software,
314
00:12:52,441 --> 00:12:55,560
can it recognize words it
hears, turn them
into text,
315
00:12:55,561 --> 00:12:57,558
and then accurately repeat them?
316
00:12:57,559 --> 00:12:58,559
- Echo on.
317
00:13:00,039 --> 00:13:01,519
- [Robot] Okay, Echo on.
318
00:13:02,279 --> 00:13:03,960
- Peter Piper picked a
piece of pickled pepper,
319
00:13:03,961 --> 00:13:06,878
put it on a panda car,
drove it around the moon,
320
00:13:06,879 --> 00:13:08,440
and ate a sausage
on the way home.
321
00:13:09,919 --> 00:13:12,339
- Peter Piper picked a
piece of pickled pepper,
322
00:13:12,340 --> 00:13:14,919
put in on a pan, pick our
dragon's random name generator,
323
00:13:14,920 --> 00:13:17,199
sausage on the way home.
324
00:13:17,200 --> 00:13:18,159
(laughing)
325
00:13:18,000 --> 00:13:19,440
- Random name generator sausage.
326
00:13:20,639 --> 00:13:22,440
- [Gemma] Part of the
challenge is to get
327
00:13:22,441 --> 00:13:24,960
our robot to respond as
quickly as a human would.
328
00:13:24,961 --> 00:13:27,758
That's within a
10th of a second.
329
00:13:27,759 --> 00:13:28,759
- [Will] Hello.
330
00:13:29,559 --> 00:13:30,440
- [Robot] Hello.
331
00:13:30,399 --> 00:13:31,240
- Too slow.
332
00:13:31,240 --> 00:13:32,240
Hello.
333
00:13:33,120 --> 00:13:33,960
- [Robot] Hello.
334
00:13:33,961 --> 00:13:35,200
- It's too slow.
335
00:13:35,720 --> 00:13:37,659
- [Gemma] To respond
as fast as a human,
336
00:13:37,660 --> 00:13:39,539
the robot needs to work
out what's being said
337
00:13:39,540 --> 00:13:41,960
and what it means before
the end of a sentence
338
00:13:41,961 --> 00:13:44,599
and reply with a
lightning-fast response,
339
00:13:44,600 --> 00:13:46,159
which involves
instinct, like us.
340
00:13:47,720 --> 00:13:50,798
- We've got to be so
quick with understanding
341
00:13:50,799 --> 00:13:53,080
what's being said that
we can be replying
342
00:13:53,081 --> 00:13:56,278
before even really the
last words come out.
343
00:13:56,279 --> 00:13:57,900
So it's gotta be super fast.
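As a rough illustration of that latency budget, the sketch below consumes partial recognizer output as it arrives and checks the reply time against a tenth-of-a-second target. The partial transcripts and topic rules are made up for the example.

```python
import time

REPLY_BUDGET_S = 0.1   # roughly a tenth of a second, as described above

def topic_from_partial(words_so_far):
    """Start guessing the topic before the sentence has finished."""
    if "quicker" in words_so_far:
        return "speed"            # enough context to prepare a reply already
    return None

def timed_reply(partial_stream):
    start = time.perf_counter()
    topic = None
    for partial in partial_stream:          # partial results arrive mid-sentence
        topic = topic_from_partial(partial) or topic
    reply = "A little, yes." if topic == "speed" else "Sorry, say that again?"
    elapsed = time.perf_counter() - start
    status = "fast enough" if elapsed <= REPLY_BUDGET_S else "too slow"
    print(f"reply={reply!r} latency={elapsed * 1000:.1f} ms ({status})")

# Simulated stream of partial recognizer output for
# "If I speak quicker, does it work better?"
timed_reply([
    ["if", "i", "speak"],
    ["if", "i", "speak", "quicker"],
    ["if", "i", "speak", "quicker", "does", "it", "work", "better"],
])
```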
344
00:13:57,901 --> 00:14:01,000
- [Gemma] Building a machine
that can understand a human
345
00:14:01,001 --> 00:14:02,679
and answer back convincingly
346
00:14:02,680 --> 00:14:04,919
is one of the toughest
challenges in AI.
347
00:14:07,440 --> 00:14:09,440
- If I speak quicker,
does it work better?
348
00:14:10,840 --> 00:14:14,079
- [Robot] If I speak quicker,
does it work better?
349
00:14:14,080 --> 00:14:15,639
- It's almost fast enough.
350
00:14:15,640 --> 00:14:18,039
It's almost there
but it's not quite.
351
00:14:18,040 --> 00:14:22,479
If we can't get the
recognition that quick,
352
00:14:22,480 --> 00:14:24,398
we've blown it on the robot.
353
00:14:24,399 --> 00:14:27,240
(intense music)
354
00:14:28,440 --> 00:14:30,539
- [Gemma] We're testing the
boundaries of science,
355
00:14:30,540 --> 00:14:33,320
with a unique artificial
intelligence test,
356
00:14:33,321 --> 00:14:37,359
building a robot that looks,
sounds and thinks
like me.
357
00:14:37,360 --> 00:14:40,439
(intense music)
358
00:14:40,440 --> 00:14:42,080
The robot team is
progressing well
359
00:14:42,081 --> 00:14:45,220
but it's my facial expressions
that are proving hard
360
00:14:45,221 --> 00:14:47,279
for the robot to
sync with its brain.
361
00:14:47,559 --> 00:14:49,799
- [Man] It's quite a scary
looking thing at the moment
362
00:14:49,559 --> 00:14:51,320
but once it's got
the rest of the core
363
00:14:51,321 --> 00:14:53,678
that goes on which has
got the top teeth in it,
364
00:14:53,679 --> 00:14:55,240
and then the skin
will go over the top,
365
00:14:55,241 --> 00:14:57,360
it will start to look
a lot more like Gemma.
366
00:14:57,361 --> 00:15:00,158
(intense music)
367
00:15:00,159 --> 00:15:02,079
- [Gemma] The physical part
of the build
is tricky
368
00:15:02,080 --> 00:15:04,000
but it's the robot's ability
to converse like
a human
369
00:15:04,001 --> 00:15:07,159
that's really going to
test their engineers.
370
00:15:08,159 --> 00:15:10,720
With Moore's Law stating that
computer processing power
371
00:15:10,721 --> 00:15:12,158
doubles every two years,
372
00:15:12,159 --> 00:15:14,080
artificial intelligence
achievements in
the real world
373
00:15:14,081 --> 00:15:17,320
have been accelerating fast.
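Taken at face value, that doubling every two years compounds very quickly; a quick worked calculation:

```python
# Moore's Law as stated above: processing power doubles roughly every two years.
# A quick worked example of what that compounding implies.

def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

for years in (10, 20, 40):
    print(f"after {years} years: about {growth_factor(years):,.0f}x the processing power")

# after 10 years: about 32x
# after 20 years: about 1,024x
# after 40 years: about 1,048,576x
```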
374
00:15:18,919 --> 00:15:23,039
In the 1950s, computers beat
us at noughts and crosses,
375
00:15:23,799 --> 00:15:27,000
then in the 1990s,
they beat us at chess.
376
00:15:28,039 --> 00:15:30,320
Those computers were
programmed to work out
377
00:15:30,321 --> 00:15:32,360
all the possible
outcomes of each move
378
00:15:32,361 --> 00:15:34,799
and then weighed up how each
move would contribute
379
00:15:34,800 --> 00:15:38,398
to a winning strategy but
we're now entering an age
380
00:15:38,399 --> 00:15:40,240
when computers aren't
just programmed
381
00:15:40,241 --> 00:15:44,119
but can learn for themselves,
computers like IBM's Watson.
382
00:15:44,120 --> 00:15:49,038
- [Announcer] This is
"Jeopardy," the
IBM Challenge.
383
00:15:49,039 --> 00:15:50,360
(audience applauding)
384
00:15:50,361 --> 00:15:51,599
Here are-
385
00:15:51,600 --> 00:15:53,760
- [Gemma] In 2011, Watson
was put up to one of
386
00:15:53,761 --> 00:15:55,380
the toughest challenges ever,
387
00:15:55,381 --> 00:15:56,759
a general knowledge quiz
388
00:15:56,760 --> 00:15:59,319
that requires logic
and quick thinking.
389
00:15:59,320 --> 00:16:00,519
- This is "Jeopardy."
390
00:16:00,520 --> 00:16:02,479
It's a bit of an
American institution.
391
00:16:02,480 --> 00:16:04,039
It's a general
knowledge quiz program
392
00:16:04,040 --> 00:16:07,798
and they ask questions
in a really strange way.
393
00:16:07,799 --> 00:16:09,518
They give you the answer
394
00:16:09,519 --> 00:16:12,159
and you have to work out
what the question is.
395
00:16:12,160 --> 00:16:14,540
- [Gemma] Duncan Anderson
is IBM's European
396
00:16:14,541 --> 00:16:16,840
Chief Technology
Officer for Watson.
397
00:16:16,841 --> 00:16:18,918
- So we've got the
two best players here.
398
00:16:18,919 --> 00:16:20,200
We've got the person who won
399
00:16:20,201 --> 00:16:22,080
the most amount of
money on the show
400
00:16:22,081 --> 00:16:25,158
and the person who has the
longest winning streak.
401
00:16:25,159 --> 00:16:26,960
- [Gemma] Two of the
most brilliant brains
402
00:16:26,961 --> 00:16:29,319
had won $5 million between them.
403
00:16:29,320 --> 00:16:31,639
This game was worth
another million.
404
00:16:31,759 --> 00:16:34,918
- Watson itself is not
connected to the internet.
405
00:16:34,919 --> 00:16:37,558
So it's not out there searching.
406
00:16:37,559 --> 00:16:40,319
It's there, standalone,
playing against
407
00:16:40,320 --> 00:16:41,960
these champion players.
408
00:16:41,961 --> 00:16:43,120
It was a big risk.
409
00:16:43,121 --> 00:16:44,919
- The category is
19th Century Novelists
410
00:16:44,920 --> 00:16:47,079
and here is the clue.
411
00:16:47,080 --> 00:16:48,278
(beeping)
412
00:16:48,279 --> 00:16:50,879
William Wilkinson's "An
Account of the Principalities
413
00:16:50,880 --> 00:16:52,878
"of Wallachia and Moldavia,"
414
00:16:52,879 --> 00:16:57,199
inspired this author's
most famous novel.
415
00:16:57,200 --> 00:16:59,518
- [Alex] Who is Bram Stoker?
416
00:16:59,519 --> 00:17:00,599
Hello!
417
00:17:00,600 --> 00:17:02,240
- And this is the point
where Watson's won!
418
00:17:02,241 --> 00:17:04,160
We've beat the best human
players at "Jeopardy."
419
00:17:06,920 --> 00:17:10,838
- So how exactly do
you program a machine
420
00:17:10,839 --> 00:17:12,860
to do something that
it's never done before?
421
00:17:12,861 --> 00:17:15,120
- Well, the first thing
is you don't program it.
422
00:17:15,121 --> 00:17:17,960
Trying to guess every single
question that
might come up
423
00:17:17,961 --> 00:17:19,699
and then program the computer
424
00:17:19,700 --> 00:17:21,780
with the right answer
for that question,
425
00:17:21,781 --> 00:17:23,219
we would be here forever,
426
00:17:23,220 --> 00:17:25,079
so we use this thing
called machine learning
427
00:17:25,080 --> 00:17:28,598
which is an approach
to solving problems
428
00:17:28,599 --> 00:17:31,639
whereby the machine can
learn from experiences,
429
00:17:31,640 --> 00:17:34,879
so we took Watson
and we taught it,
430
00:17:34,880 --> 00:17:37,159
we fed it lots of information.
431
00:17:37,160 --> 00:17:39,960
For example, back issues
of "Time" magazine,
432
00:17:39,961 --> 00:17:43,278
Wikipedia, encyclopedias-
433
00:17:43,279 --> 00:17:45,199
- So Watson was
learning, in a way.
434
00:17:45,200 --> 00:17:48,240
- Mm, and then we go through
a teaching process,
435
00:17:48,241 --> 00:17:50,560
just like you would
teach a child.
436
00:17:50,561 --> 00:17:52,919
We're teaching Watson,
we're testing it,
437
00:17:52,920 --> 00:17:54,999
and we're giving it feedback
when it gets
it right
438
00:17:55,000 --> 00:17:56,640
and feedback when
it gets it wrong
439
00:17:56,641 --> 00:18:00,159
and then it adjusts its
approach to making decisions.
440
00:18:00,160 --> 00:18:04,118
You could think of it a
bit like trying to find
441
00:18:04,119 --> 00:18:05,960
a pathway through a field.
442
00:18:05,961 --> 00:18:09,479
So you have a very
faint, indistinct path
443
00:18:09,480 --> 00:18:11,160
that maybe only one person
has trodden through
444
00:18:11,161 --> 00:18:14,798
and what you're trying to
do is to feed information
445
00:18:14,799 --> 00:18:17,459
so that that pathway
becomes more defined.
446
00:18:17,460 --> 00:18:19,000
As more people go
down that path,
447
00:18:18,759 --> 00:18:19,960
the path gets more
trodden through
448
00:18:19,961 --> 00:18:21,758
and becomes more obvious.
449
00:18:21,759 --> 00:18:25,758
- So the more data that
you feed into Watson,
450
00:18:25,759 --> 00:18:30,798
it's almost like the more or
the wider the
path becomes,
451
00:18:30,799 --> 00:18:33,598
or the more distinct
the path becomes.
452
00:18:33,599 --> 00:18:34,559
- Precisely.
453
00:18:34,560 --> 00:18:36,279
So Watson becomes more confident
454
00:18:36,280 --> 00:18:39,319
that that pathway is the
right pathway to take.
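A toy version of that teach-test-feedback loop can be written in a few lines: each candidate answer is a "pathway" whose confidence is reinforced when the feedback says it was right and weakened when it was wrong. This only illustrates the idea; it is not IBM Watson's actual learning algorithm.

```python
# Teach, test, give feedback, repeat: the chosen "pathway" becomes more defined.

confidence = {"Bram Stoker": 0.5, "Mary Shelley": 0.5}   # initially faint paths
LEARNING_RATE = 0.2

def answer(question: str) -> str:
    """Pick the best-trodden pathway so far."""
    return max(confidence, key=confidence.get)

def feedback(chosen: str, was_correct: bool) -> None:
    """Make the chosen pathway more (or less) defined."""
    target = 1.0 if was_correct else 0.0
    confidence[chosen] += LEARNING_RATE * (target - confidence[chosen])

clue = ("William Wilkinson's account of Wallachia and Moldavia "
        "inspired this author's most famous novel")
for _ in range(5):
    guess = answer(clue)
    feedback(guess, was_correct=(guess == "Bram Stoker"))

print(confidence)   # Bram Stoker's pathway is now more defined than the rest
```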
455
00:18:40,400 --> 00:18:42,320
- [Gemma] We now need
my robot to undergo
456
00:18:42,321 --> 00:18:44,000
a basic version of this process.
457
00:18:44,001 --> 00:18:46,439
It needs its own pathway
458
00:18:46,440 --> 00:18:48,560
and to be fed hundreds
of bespoke new rules
459
00:18:48,561 --> 00:18:50,439
on how to respond to a question
460
00:18:50,440 --> 00:18:52,160
and then learn how to use them.
461
00:18:54,119 --> 00:18:55,680
- Ready to go now
if you wanna try,
462
00:18:55,681 --> 00:18:57,519
try the latest
transcript from Bruce.
463
00:18:57,520 --> 00:18:59,999
- Okay. Key moment.
464
00:19:00,000 --> 00:19:01,680
- [Gemma] Will has commissioned
one of the world's
465
00:19:01,681 --> 00:19:04,420
leading computer programmers
to build a chatbot,
466
00:19:04,421 --> 00:19:07,380
a piece of software which
simulates human conversation
467
00:19:07,381 --> 00:19:10,279
and responds with the answer
it thinks I
would give.
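In the same spirit, a minimal chatbot can be sketched as keyword rules plus a memory of what has already been said, which is where the "You said that already" reply heard a moment later comes from. This sketch is purely illustrative and is not the commissioned software.

```python
# Keyword rules built from background material, plus a memory of past inputs.

RULES = {
    "repetition": "Yes.",
    "work": "How about we talk about work?",
    "science fiction": "I do love science fiction and things that explore "
                       "the boundaries of the possible.",
}

heard_before = set()

def chatbot_reply(utterance: str) -> str:
    normalized = utterance.lower().strip("?!. ")
    if normalized in heard_before:
        return "You said that already."
    heard_before.add(normalized)
    for keyword, reply in RULES.items():
        if keyword in normalized:
            return reply
    return "What's your next question?"

print(chatbot_reply("Do you like repetition?"))   # -> "Yes."
print(chatbot_reply("Do you like repetition?"))   # -> "You said that already."
```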
468
00:19:13,160 --> 00:19:14,680
- Do you like repetition?
469
00:19:15,519 --> 00:19:16,480
- [Chatbot] Yes.
470
00:19:16,481 --> 00:19:18,000
- Do you like repetition?
471
00:19:18,001 --> 00:19:20,320
- [Chatbot] You
said that already.
472
00:19:20,321 --> 00:19:23,039
- Aha! Now that's the kind of
reply I'm looking for.
473
00:19:23,040 --> 00:19:25,719
That's what I wanna get to.
474
00:19:25,720 --> 00:19:26,559
That was good.
475
00:19:26,560 --> 00:19:27,599
You're improving.
476
00:19:27,839 --> 00:19:29,838
- [Chatbot] Right.
477
00:19:29,839 --> 00:19:32,239
How about we talk about work?
478
00:19:32,240 --> 00:19:34,500
- [Will] You've played a lot
of supernatural characters.
479
00:19:34,501 --> 00:19:37,119
Are you a fan of the genre or
is it just a coincidence?
480
00:19:38,720 --> 00:19:40,319
- [Chatbot] I do
love science fiction
481
00:19:40,320 --> 00:19:41,500
and things that explore
482
00:19:41,501 --> 00:19:43,079
the boundaries of the possible,
483
00:19:43,080 --> 00:19:46,479
but actually having a bunch
of roles in
that genre
484
00:19:46,480 --> 00:19:48,039
is just a coincidence.
485
00:19:48,720 --> 00:19:50,840
- [Gemma] The team have
certainly done their homework.
486
00:19:51,680 --> 00:19:53,360
Just as Watson was
fed information
487
00:19:53,361 --> 00:19:54,879
from previous "Jeopardy" games,
488
00:19:54,880 --> 00:19:56,119
our chatbot has
been fed interviews
489
00:19:56,120 --> 00:19:58,318
and background
information on me.
490
00:19:58,319 --> 00:20:00,319
- Once we can get
the expressions
491
00:20:00,320 --> 00:20:03,358
and all those other
little subtle cues,
492
00:20:03,359 --> 00:20:04,679
movement, things in there,
493
00:20:04,680 --> 00:20:06,440
I think it will start
to come together.
494
00:20:06,319 --> 00:20:07,359
- [Chatbot] Fantastic!
495
00:20:07,360 --> 00:20:09,279
- Yeah, it will be fantastic.
496
00:20:10,200 --> 00:20:11,460
- [Chatbot] Are you sure?
497
00:20:11,461 --> 00:20:12,759
- I am positive.
498
00:20:13,559 --> 00:20:15,480
- What's your next question?
499
00:20:16,759 --> 00:20:17,920
- [Gemma] The next question
500
00:20:17,921 --> 00:20:21,118
is how to make it sound like me.
501
00:20:21,119 --> 00:20:22,358
Hello.
- Hello.
502
00:20:22,359 --> 00:20:24,300
- I'm Gemma.
- My name's Burdill.
503
00:20:24,301 --> 00:20:26,839
- [Gemma] The human voice
has a huge variation
504
00:20:26,840 --> 00:20:29,479
of inflection, pitch
and intonation.
505
00:20:29,480 --> 00:20:32,000
Our robot will need to
replicate the essence
of my voice,
506
00:20:32,001 --> 00:20:36,000
with all the specific quirks
that make it unique
to me.
507
00:20:37,119 --> 00:20:39,780
- So just keep in mind
that when you get commas,
508
00:20:39,781 --> 00:20:41,160
you make a small pause.
509
00:20:41,759 --> 00:20:44,000
- Burdill Madison is a
computational linguist
510
00:20:44,001 --> 00:20:45,960
and works for a company
that specializes
511
00:20:45,961 --> 00:20:47,640
in synthesized voices.
512
00:20:48,119 --> 00:20:51,140
It's not the words that you're
recording here. You're more
513
00:20:51,141 --> 00:20:53,800
interested in the different
sounds that I'm making.
514
00:20:53,801 --> 00:20:55,559
- Yeah, I'm not interested
in words at all.
515
00:20:55,359 --> 00:20:56,480
I'm interested in the
combinations
516
00:20:56,481 --> 00:20:58,078
of phonemes that we are getting.
517
00:20:58,079 --> 00:20:59,400
- Oh the phonemes, oh I see.
518
00:20:59,401 --> 00:21:01,240
- So the combinations of sounds.
519
00:21:01,241 --> 00:21:04,220
- [Gemma] How many sentences
do I need to record?
520
00:21:04,221 --> 00:21:07,278
- [Burdill] Well, in the end
it's about 1400 sentences.
521
00:21:07,279 --> 00:21:08,160
- [Gemma] 1400?
522
00:21:08,161 --> 00:21:09,199
- [Burdill] Yeah.
523
00:21:09,200 --> 00:21:11,039
- Today? This is gonna be quite
a long afternoon.
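The reason the recording script is measured in phoneme combinations rather than words can be sketched as follows; the phoneme transcriptions are abridged and purely illustrative, not a real lexicon.

```python
# The synthesizer needs a recorded example of each adjacent phoneme pair
# (diphone) so it can stitch new sentences together in the same voice.

script = {
    "I like the taste of cheese": ["AY", "L", "AY", "K", "DH", "AH",
                                   "T", "EY", "S", "T", "AH", "V",
                                   "CH", "IY", "Z"],
    "you play golf":              ["Y", "UW", "P", "L", "EY",
                                   "G", "AO", "L", "F"],
}

def diphones(phonemes):
    """Adjacent phoneme pairs: the joins a concatenative voice must cover."""
    return {(a, b) for a, b in zip(phonemes, phonemes[1:])}

covered = set()
for sentence, phonemes in script.items():
    covered |= diphones(phonemes)

print(f"{len(script)} sentences cover {len(covered)} distinct phoneme pairs")
# A full recording script (around 1,400 sentences here) is chosen so that the
# inventory covers the combinations needed to say almost anything.
```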
524
00:21:11,040 --> 00:21:13,639
(intense music)
525
00:21:13,640 --> 00:21:14,960
I certainly have appreciated
526
00:21:14,961 --> 00:21:17,118
that outlook for my creativity.
527
00:21:17,119 --> 00:21:19,399
(intense music)
528
00:21:19,400 --> 00:21:21,960
Imagine George Bush
singing in the shower.
529
00:21:21,961 --> 00:21:24,880
(intense music)
530
00:21:26,319 --> 00:21:28,079
- You will have to do
the previous one again.
531
00:21:32,599 --> 00:21:34,960
- We're teaching our robot
how to speak like me.
532
00:21:34,961 --> 00:21:37,798
You are an athlete
if you play golf.
533
00:21:37,799 --> 00:21:40,919
But in the future,
we may get to a stage
534
00:21:40,920 --> 00:21:44,000
where it can teach itself,
learning from experience,
535
00:21:44,001 --> 00:21:46,000
and coming up with
its own solutions,
536
00:21:46,001 --> 00:21:48,240
which is what AIs
are starting to do.
537
00:21:49,799 --> 00:21:51,000
- Hi.
- Hi, Demis!
538
00:21:51,001 --> 00:21:52,838
- Nice to meet you.
- Good to meet you.
539
00:21:52,839 --> 00:21:54,400
Thanks.
- So welcome to the offices.
540
00:21:54,401 --> 00:21:56,399
- Thanks for having me.
- No problem at all.
541
00:21:56,400 --> 00:21:57,399
So-
542
00:21:57,400 --> 00:21:59,119
- [Gemma] Demis Hassabis
was a child chess prodigy
543
00:21:59,120 --> 00:22:00,159
from North London.
544
00:22:00,160 --> 00:22:01,559
- It's actually quite
strange meeting you
545
00:22:01,400 --> 00:22:02,559
because I've watched you on,
546
00:22:02,560 --> 00:22:04,080
being, pretending to be an AI.
547
00:22:04,081 --> 00:22:05,879
- Pretending to be a robot.
- Yeah.
548
00:22:05,880 --> 00:22:06,839
(laughing)
549
00:22:06,840 --> 00:22:08,440
- [Gemma] In 2011, he launched
550
00:22:08,441 --> 00:22:11,560
a British artificial
intelligence company
called DeepMind.
551
00:22:11,561 --> 00:22:14,440
Just three years later,
Google bought his company
552
00:22:14,441 --> 00:22:16,440
for 400 million pounds.
553
00:22:17,200 --> 00:22:19,400
- One of the first things
we got our program to do
554
00:22:19,401 --> 00:22:21,360
was to play classic Atari games
555
00:22:21,361 --> 00:22:23,119
and we wanted the
AI to actually learn
556
00:22:23,120 --> 00:22:25,880
to play these games just
from the pixels
on the screen
557
00:22:25,881 --> 00:22:28,920
and no other information so
it had to learn
for itself
558
00:22:28,921 --> 00:22:32,439
what the rules of the game
were, how to get points,
559
00:22:32,440 --> 00:22:34,440
and how to sort of
master the game.
560
00:22:35,480 --> 00:22:37,800
The idea behind "Breakout"
is that you control
561
00:22:37,801 --> 00:22:40,280
a bat and a ball and you've
got to break out through
562
00:22:40,281 --> 00:22:41,960
a rainbow colored
wall brick by brick.
563
00:22:41,961 --> 00:22:44,838
- I think I remember this
from when I was younger.
564
00:22:44,839 --> 00:22:45,759
- Atari.
565
00:22:45,599 --> 00:22:46,839
So you can see it's starting
566
00:22:46,840 --> 00:22:48,798
to get the hang of
what it should do.
567
00:22:48,799 --> 00:22:49,839
It's not very good.
568
00:22:49,680 --> 00:22:51,160
It misses the ball quite a lot
569
00:22:50,960 --> 00:22:52,440
but it's starting to understand
570
00:22:52,160 --> 00:22:54,000
that it's got to move
the bat towards the ball
571
00:22:54,001 --> 00:22:55,679
if it wants to get points.
572
00:22:55,680 --> 00:22:58,959
(beeping)
573
00:22:58,960 --> 00:23:01,038
This is after 300 games.
574
00:23:01,039 --> 00:23:02,558
It's still not that many.
575
00:23:02,559 --> 00:23:04,079
So we thought this
is pretty good
576
00:23:04,080 --> 00:23:06,140
but what would happen if
we just let the program
577
00:23:06,141 --> 00:23:07,580
continue to play the game
578
00:23:07,581 --> 00:23:09,520
so we left it playing
for another 200 games
579
00:23:09,521 --> 00:23:12,100
and then we came back and it
did this amazing strategy
580
00:23:12,101 --> 00:23:14,580
of digging a tunnel around
the side of the wall
581
00:23:14,581 --> 00:23:16,440
and actually sending
the ball round the back.
582
00:23:16,200 --> 00:23:17,359
So it's really amazing.
- That's amazing!
583
00:23:17,200 --> 00:23:18,839
- It's discovered it for itself
584
00:23:18,840 --> 00:23:22,639
and obviously it can do it
with superhuman precision.
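The flavour of that trial-and-error learning can be shown with a toy reward-driven agent. DeepMind's real system was a deep neural network learning from raw screen pixels; this tiny tabular sketch only illustrates the underlying idea of learning from the score alone.

```python
import random

# The agent is only told the reward, and gradually learns that moving the bat
# toward the ball is what earns points.

ACTIONS = ("left", "stay", "right")
MOVE = {"left": -1, "stay": 0, "right": 1}
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
q = {}                                   # (state, action) -> estimated value

def choose(state):
    if random.random() < EPSILON:        # occasionally explore
        return random.choice(ACTIONS)
    # pick the highest-valued action, breaking ties at random
    return max(ACTIONS, key=lambda a: (q.get((state, a), 0.0), random.random()))

def learn(state, action, reward, next_state):
    best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + ALPHA * (reward + GAMMA * best_next - old)

def play_game(ball=3, steps=20):
    bat = 0
    for _ in range(steps):
        state = ball - bat               # where the ball sits relative to the bat
        action = choose(state)
        bat += MOVE[action]
        reward = 1.0 if bat == ball else 0.0   # "points" only when it connects
        learn(state, action, reward, ball - bat)

for _ in range(500):                     # like leaving it playing more games
    play_game()

# After training, the learned policy at distance 3 is (almost always) "right":
print(max(ACTIONS, key=lambda a: q.get((3, a), 0.0)))
```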
585
00:23:22,640 --> 00:23:24,960
(beeping)
586
00:23:27,559 --> 00:23:29,319
- [Gemma] Then last
year, Demis and his team
587
00:23:29,320 --> 00:23:31,200
built a computer program to play
588
00:23:31,201 --> 00:23:33,479
the most complex
game ever devised,
589
00:23:33,480 --> 00:23:36,518
an ancient Chinese
game called "Go."
590
00:23:36,519 --> 00:23:39,060
- The aim of the game in
"Go" is to either capture
591
00:23:39,061 --> 00:23:42,240
your opponent's pieces or to
surround areas
of the board
592
00:23:42,241 --> 00:23:43,719
and make it your territory.
593
00:23:43,720 --> 00:23:45,519
- [Gemma] In chess,
the board is made up
594
00:23:45,520 --> 00:23:47,719
of an eight by eight grid.
595
00:23:47,720 --> 00:23:50,199
This means the number of
possible moves in a game
596
00:23:50,200 --> 00:23:52,440
can be number crunched
by a computer.
597
00:23:52,441 --> 00:23:55,319
(robotic noises)
598
00:23:56,359 --> 00:24:00,799
With a 19 by 19 board, Go is
a much more complex game.
599
00:24:01,960 --> 00:24:04,219
- Even the best players,
they use their intuition
600
00:24:04,220 --> 00:24:06,759
and their instincts
more than calculation.
601
00:24:07,599 --> 00:24:10,120
- [Gemma] Astonishingly, the
number of possible moves
602
00:24:10,121 --> 00:24:13,260
is greater than the number
of atoms in the universe.
603
00:24:13,261 --> 00:24:15,758
- So even if you took all the
world's computing power
604
00:24:15,759 --> 00:24:17,319
and ran it for a million years,
605
00:24:17,320 --> 00:24:18,839
that would still not be enough
606
00:24:18,840 --> 00:24:21,480
to brute force a solution
to how to win "Go."
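A back-of-the-envelope version of that comparison, using commonly cited rough figures rather than exact counts (assumed here: about 35 legal moves per turn over about 80 turns for chess, about 250 over about 150 turns for Go, and roughly 10^80 atoms in the observable universe):

```python
import math

def tree_size_exponent(branching: float, moves: int) -> float:
    """log10 of branching**moves, i.e. roughly how many possible games."""
    return moves * math.log10(branching)

chess = tree_size_exponent(35, 80)     # ~ 10^124
go = tree_size_exponent(250, 150)      # ~ 10^360
atoms = 80                             # atoms in the observable universe ~ 10^80

print(f"chess: ~10^{chess:.0f} possible games")
print(f"go:    ~10^{go:.0f} possible games, ~10^{go - atoms:.0f} times the atoms in the universe")
```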
607
00:24:22,279 --> 00:24:24,220
- [Gemma] So Demis gave
a more powerful computer
608
00:24:24,221 --> 00:24:25,599
the same challenge he gave
609
00:24:25,600 --> 00:24:28,118
the Atari computer
five years before.
610
00:24:28,119 --> 00:24:29,980
Could it teach
itself how to play
611
00:24:29,981 --> 00:24:32,020
the world's most
complex board game
612
00:24:32,021 --> 00:24:34,160
and beat the
world's best player?
613
00:24:34,161 --> 00:24:35,479
(intense music)
614
00:24:35,480 --> 00:24:37,319
Earlier this year, Demis
took the computer program
615
00:24:37,320 --> 00:24:41,679
he'd named AlphaGo to Korea
to play the world champion.
616
00:24:41,680 --> 00:24:45,518
- It was actually a genuine
sort of excitement
617
00:24:45,519 --> 00:24:48,839
and sort of fear about what
was actually gonna happen.
618
00:24:49,759 --> 00:24:51,720
- [Gemma] The man on the
right is making the moves
619
00:24:51,721 --> 00:24:52,920
on behalf of AlphaGo.
620
00:24:52,921 --> 00:24:56,399
- That's a very surprising move.
621
00:24:56,400 --> 00:24:59,559
- I thought it was, I
thought it was a mistake.
622
00:25:00,559 --> 00:25:04,380
- AlphaGo played a move that
was just completely unthinkable
623
00:25:04,381 --> 00:25:06,000
for a human to play.
624
00:25:06,001 --> 00:25:08,558
(intense music)
625
00:25:08,559 --> 00:25:10,560
So there's two
important lines in "Go."
626
00:25:10,561 --> 00:25:12,279
If you play on the third line,
627
00:25:11,880 --> 00:25:13,920
you are trying to take
territory on the side
of the board.
628
00:25:13,759 --> 00:25:15,079
If you play on the fourth line,
629
00:25:14,920 --> 00:25:16,200
you are trying to take influence
630
00:25:16,201 --> 00:25:17,640
into the center of the board
631
00:25:17,641 --> 00:25:19,880
and what AlphaGo did is it
played on the fifth line
632
00:25:19,881 --> 00:25:23,380
and you never do that in the
position that it played in.
633
00:25:23,381 --> 00:25:26,079
And we were actually quite
worried because obviously at
that point,
634
00:25:26,080 --> 00:25:28,679
we didn't know if
this was a crazy move
635
00:25:28,680 --> 00:25:33,239
or a brilliant, original move,
and then 50 moves later,
636
00:25:33,240 --> 00:25:35,078
that move ended up joining up
637
00:25:35,079 --> 00:25:36,580
with another part of the board-
638
00:25:36,581 --> 00:25:37,799
- [Gemma] So it worked.
639
00:25:37,559 --> 00:25:39,239
- And sort of
magically just, yeah,
640
00:25:39,240 --> 00:25:41,319
resulting in helping
it win the game.
641
00:25:42,400 --> 00:25:44,860
- [Gemma] Demis' AI made
headlines around
the world
642
00:25:44,861 --> 00:25:46,240
when it won the match.
643
00:25:46,241 --> 00:25:49,598
- We're not there yet,
but in the next few years
644
00:25:49,599 --> 00:25:51,599
we would like to get
to the point where
645
00:25:51,600 --> 00:25:53,879
you could give it any data,
646
00:25:53,880 --> 00:25:56,439
scientific, medical
or commercial,
647
00:25:56,440 --> 00:25:59,399
and it would find these
structures or these patterns,
648
00:25:59,400 --> 00:26:01,079
that perhaps human
experts have missed
649
00:26:01,080 --> 00:26:04,999
and highlight those so that
improvements can
be made, yeah.
650
00:26:05,000 --> 00:26:07,879
(intense music)
651
00:26:07,880 --> 00:26:09,839
- I think what's really
interesting about it
652
00:26:09,840 --> 00:26:13,278
is the fact that this
program can teach itself.
653
00:26:13,279 --> 00:26:14,920
It can learn from its mistakes.
654
00:26:14,921 --> 00:26:17,880
It can come up with a
genuinely creative solution
655
00:26:17,881 --> 00:26:21,480
to a problem and really you
can apply that
to anything.
656
00:26:22,480 --> 00:26:26,119
So if we can give my robot the
ability to learn for itself,
657
00:26:26,120 --> 00:26:28,239
who knows where
it might take us.
658
00:26:28,240 --> 00:26:29,000
It's so strange.
659
00:26:29,001 --> 00:26:32,879
(intense music)
660
00:26:32,880 --> 00:26:34,279
We're testing the
boundaries of science
661
00:26:34,280 --> 00:26:37,159
with a unique artificial
intelligence test,
662
00:26:37,160 --> 00:26:40,079
building a robot that looks,
sounds and thinks like me.
663
00:26:42,440 --> 00:26:45,240
The team is working on getting
the robot's facial expressions
664
00:26:45,241 --> 00:26:47,160
to work in tandem with the AI.
665
00:26:48,599 --> 00:26:51,879
- If you look at other
robots of this type,
666
00:26:51,880 --> 00:26:54,559
this kind of flexible
silicone face robot,
667
00:26:54,560 --> 00:26:56,239
this is bloody good.
668
00:26:56,240 --> 00:26:57,240
Look at me.
669
00:26:58,279 --> 00:26:59,119
Look at me.
670
00:26:59,120 --> 00:27:00,999
Oh, spooky.
671
00:27:01,000 --> 00:27:02,278
(chuckling)
672
00:27:02,279 --> 00:27:04,279
- [Gemma] The heart of
the robot is conversation
673
00:27:04,280 --> 00:27:06,679
that seems real and the
software to do this,
674
00:27:06,680 --> 00:27:08,720
the chatbot, has just arrived.
675
00:27:09,680 --> 00:27:11,920
- [Chatbot] Hello, my
name is Gemma Chan.
676
00:27:11,921 --> 00:27:13,758
I am not a robot.
677
00:27:13,759 --> 00:27:16,959
- That sounds like
Gemma, doesn't it?
678
00:27:16,960 --> 00:27:18,860
I think it sounds like Gemma.
679
00:27:18,861 --> 00:27:20,959
Do you think? I think that
sounds like Gemma.
680
00:27:20,960 --> 00:27:22,400
- [Gemma] But it's
proving a real challenge
681
00:27:22,401 --> 00:27:24,920
to synchronize the
body with the mind.
682
00:27:24,921 --> 00:27:27,319
- When we can get all of the
actions going at the same time,
683
00:27:27,320 --> 00:27:29,399
it will look really good.
684
00:27:29,400 --> 00:27:31,160
- [Gemma] Hopefully,
the robot will be able
685
00:27:31,161 --> 00:27:33,919
to pick the right expression
for the right thought.
686
00:27:33,920 --> 00:27:35,960
- [Will] This is the
first stored pose
687
00:27:35,961 --> 00:27:38,798
and it's a little smile.
688
00:27:38,799 --> 00:27:39,680
- That's awesome.
689
00:27:39,681 --> 00:27:41,440
I like the look of that.
690
00:27:41,759 --> 00:27:45,038
- [Gemma] Will is
teaching the robot 180
691
00:27:45,039 --> 00:27:46,999
of the most common movements.
692
00:27:47,000 --> 00:27:48,440
- I think she's happy.
693
00:27:48,640 --> 00:27:51,360
- [Gemma] Hoping she'll
choose the right ones
694
00:27:51,361 --> 00:27:53,799
to react and be
convincing as a human.
695
00:27:53,800 --> 00:27:57,160
- Gemma now knows
how to smile forever
696
00:27:58,000 --> 00:28:00,679
and she will be able
to seamlessly blend
697
00:28:00,680 --> 00:28:03,439
from one smile to
a frown to anger
698
00:28:03,440 --> 00:28:07,159
to despair and the whole
range of human emotions
699
00:28:07,160 --> 00:28:08,519
but we have to teach her them.
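Blending between stored poses can be sketched as simple interpolation between actuator positions; the channel names and values below are made up for illustration and are not the robot's real control channels.

```python
# Each stored pose is a set of actuator positions; a smooth expression change
# is an interpolation from one stored pose to the next.

SMILE = {"mouth_corner_left": 0.8, "mouth_corner_right": 0.8, "brow": 0.2}
FROWN = {"mouth_corner_left": 0.1, "mouth_corner_right": 0.1, "brow": 0.9}

def blend(pose_a: dict, pose_b: dict, t: float) -> dict:
    """Linearly interpolate actuator positions; t=0 is pose_a, t=1 is pose_b."""
    return {name: (1 - t) * pose_a[name] + t * pose_b[name] for name in pose_a}

# Step the face from a smile to a frown over a few frames.
for step in range(5):
    t = step / 4
    print(f"t={t:.2f}", blend(SMILE, FROWN, t))
```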
700
00:28:11,599 --> 00:28:13,620
- If we can teach
AI almost anything
701
00:28:13,621 --> 00:28:15,780
that involves reasoning
and decision making,
702
00:28:15,781 --> 00:28:19,439
once it has those skills, what
might be the consequences?
703
00:28:19,440 --> 00:28:22,279
(intense music)
704
00:28:23,880 --> 00:28:26,639
Oh we're high.
705
00:28:26,640 --> 00:28:29,078
This is the London
Gateway Dockyard.
706
00:28:29,079 --> 00:28:30,679
(intense music)
707
00:28:30,680 --> 00:28:33,798
Every day, over 20,000
containers are moved
708
00:28:33,799 --> 00:28:36,340
by a highly complex
AI which controls
709
00:28:36,341 --> 00:28:40,119
the logistics and timing of
what happens when
and where.
710
00:28:41,799 --> 00:28:44,680
I can only see
about five people.
711
00:28:46,279 --> 00:28:48,560
Just 10 years ago,
a port of this size
712
00:28:48,561 --> 00:28:50,340
would've employed
thousands of workers
713
00:28:50,341 --> 00:28:51,960
to shift these containers.
714
00:28:53,559 --> 00:28:57,358
Today, many of the 500
employees spend
their time
715
00:28:57,359 --> 00:29:00,000
supervising the AI machines.
716
00:29:02,559 --> 00:29:04,240
The AI is incredibly efficient,
717
00:29:04,241 --> 00:29:07,598
moving the containers in the
fewest number
of moves,
718
00:29:07,599 --> 00:29:09,439
like a very basic AlphaGo.
719
00:29:09,440 --> 00:29:10,920
It comes up with
solutions faster
720
00:29:10,921 --> 00:29:13,358
than any human would be able to.
721
00:29:13,359 --> 00:29:15,239
- Wow, that's a
lot of containers.
722
00:29:15,240 --> 00:29:17,000
- [Gemma] AI expert and
writer, Martin Ford,
723
00:29:17,001 --> 00:29:19,400
thinks what we're
seeing here reflects
724
00:29:19,401 --> 00:29:21,199
the shape of things to come.
725
00:29:21,200 --> 00:29:23,000
- Look out at all
these containers here
726
00:29:23,001 --> 00:29:25,880
and think of those as
representing the
job market
727
00:29:25,881 --> 00:29:28,439
in the United Kingdom
and imagine 35%,
728
00:29:28,440 --> 00:29:30,660
roughly a third of
those disappearing
729
00:29:30,661 --> 00:29:33,780
and what would happen to our
society and our economy
730
00:29:33,781 --> 00:29:35,319
if that were to happen?
731
00:29:35,320 --> 00:29:37,519
(eerie music)
732
00:29:38,920 --> 00:29:41,300
There's an incredible
impact on all of us
733
00:29:41,301 --> 00:29:42,640
and on the economy.
734
00:29:42,641 --> 00:29:45,040
It's something that is gonna
be uniquely disruptive,
735
00:29:45,041 --> 00:29:46,979
something that we've never
seen before in history
736
00:29:46,980 --> 00:29:49,560
and one of the things
that is really driving it
737
00:29:49,561 --> 00:29:51,219
is that machines
in a limited way
738
00:29:51,220 --> 00:29:53,559
are at least, they're
beginning to think.
739
00:29:54,400 --> 00:29:56,679
- [Gemma] And it's not
just blue collar jobs.
740
00:29:56,680 --> 00:29:58,960
You may not realize this but
a lot of online journalism
741
00:29:58,961 --> 00:30:02,879
based on statistics like
sports and business articles
742
00:30:02,880 --> 00:30:05,520
is increasingly
being written by AIs.
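Statistics-driven articles of this kind are typically produced by dropping structured figures into sentence templates. A generic sketch using the Star Bulk figures quoted below; this is an illustration only, not the system that wrote the real report.

```python
# Structured figures dropped into a sentence template.

report = {
    "company": "Star Bulk Carriers Corporation",
    "city": "Athens, Greece",
    "period": "its first quarter",
    "loss_musd": 48.8,
}

TEMPLATE = ("{city} - {company} on Wednesday reported a loss of "
            "${loss_musd} million in {period}.")

print(TEMPLATE.format(**report))
# Athens, Greece - Star Bulk Carriers Corporation on Wednesday reported a
# loss of $48.8 million in its first quarter.
```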
743
00:30:05,521 --> 00:30:09,639
- This is a corporate earnings
report for Star Bulk Carriers,
744
00:30:09,640 --> 00:30:11,200
which is actually
one of the companies
745
00:30:11,201 --> 00:30:12,400
that utilizes this port.
746
00:30:12,401 --> 00:30:15,199
One of these items is
written by a machine
747
00:30:15,200 --> 00:30:16,919
and one is written
by a human journalist
748
00:30:16,920 --> 00:30:18,319
and just take a look and see
749
00:30:18,320 --> 00:30:21,318
if you can determine
which is which.
750
00:30:21,319 --> 00:30:23,800
- Yeah, so the first
one, Athens, Greece.
751
00:30:23,801 --> 00:30:26,520
Star Bulk Carriers
Corporation on Wednesday
752
00:30:26,521 --> 00:30:29,799
reported a loss of $48.8
million in its first quarter.
753
00:30:31,839 --> 00:30:34,319
Sounds pretty... Sounds
pretty human
to me.
754
00:30:34,320 --> 00:30:36,598
- And the other one starts,
755
00:30:36,599 --> 00:30:39,000
this Star Bulk Carriers
Corporation reported
756
00:30:39,001 --> 00:30:41,160
a net revenue decrease
of 14.9 percent
757
00:30:41,161 --> 00:30:42,838
in the first quarter of 2016.
758
00:30:42,839 --> 00:30:44,400
- Which one do
you think is human
759
00:30:44,401 --> 00:30:46,359
and which one do you
think is a machine?
760
00:30:46,360 --> 00:30:48,078
- It's really hard to tell.
761
00:30:48,079 --> 00:30:49,799
Which one is written
by, do you know?
762
00:30:49,800 --> 00:30:52,320
- I believe the one on the
right is written
by a person
763
00:30:52,321 --> 00:30:54,359
and the one on the left is
written by the machine.
764
00:30:54,360 --> 00:30:55,939
- They both sound like something
765
00:30:55,940 --> 00:30:57,679
that a person could've written,
766
00:30:57,680 --> 00:30:59,200
doesn't it.
- That's right.
767
00:30:58,839 --> 00:31:00,880
We are really heading toward
a kind of a tipping point,
768
00:31:00,881 --> 00:31:02,959
or a point at which things
are gonna accelerate
769
00:31:02,960 --> 00:31:04,559
beyond anything
we've seen before.
770
00:31:04,560 --> 00:31:06,100
This is just really historic.
771
00:31:06,101 --> 00:31:08,700
It's not just about
muscle power anymore.
772
00:31:08,701 --> 00:31:09,939
It's about brain power.
773
00:31:09,940 --> 00:31:12,518
Machines are moving in
to cognitive capability
774
00:31:12,519 --> 00:31:14,000
and that of course is the thing
775
00:31:14,001 --> 00:31:15,479
that really sets people apart.
776
00:31:15,480 --> 00:31:17,200
That's the reason
that most people today
777
00:31:17,201 --> 00:31:20,820
still have jobs whereas horses
have been put
out of work.
778
00:31:20,821 --> 00:31:22,620
It's because we have
this ability to think,
779
00:31:22,621 --> 00:31:24,760
to learn, to figure out
how to do new things
780
00:31:24,761 --> 00:31:26,560
and to solve problems
but increasingly,
781
00:31:26,561 --> 00:31:28,340
the machines are
pushing into that area
782
00:31:28,341 --> 00:31:31,540
and that's gonna have huge
implications for
the future.
783
00:31:31,541 --> 00:31:32,838
- Are any jobs safe?
784
00:31:32,839 --> 00:31:34,440
- Right now, it's really
hard to build robots
785
00:31:34,441 --> 00:31:37,000
that can approach human
ability and dexterity
786
00:31:37,001 --> 00:31:40,598
and mobility so a lot of
skilled trade type jobs,
787
00:31:40,599 --> 00:31:42,840
electricians and plumbers
and that type of things
788
00:31:42,841 --> 00:31:45,080
are probably gonna
be relatively safe.
789
00:31:45,081 --> 00:31:48,759
But that's thinking out over
the next 10, 20 maybe
30 years.
790
00:31:48,760 --> 00:31:52,300
Once you go beyond that, really
nothing is off the table.
791
00:31:52,301 --> 00:31:55,679
- [Gemma] But the biggest danger
may not be losing our jobs.
792
00:31:55,680 --> 00:31:57,440
Professor Stephen
Hawking recently warned
793
00:31:57,441 --> 00:32:00,959
that the creation of powerful
AI will be either
the best
794
00:32:00,960 --> 00:32:03,839
or the worst thing ever
to happen to humanity.
795
00:32:04,839 --> 00:32:07,939
Hawking was recently joined
by Tesla founder
Elon Musk
796
00:32:07,940 --> 00:32:10,360
and other leading
figures in an open letter
797
00:32:10,361 --> 00:32:13,559
highlighting the potential
dangers of unchecked AI.
798
00:32:14,920 --> 00:32:17,039
One of the most vocal was
Professor Nick Bostrom.
799
00:32:18,839 --> 00:32:21,399
- Developments in the last few
years in machine learning
800
00:32:21,400 --> 00:32:23,839
have just been more rapid
than people expected
801
00:32:23,840 --> 00:32:26,960
with these deep learning
algorithms and so forth.
802
00:32:26,961 --> 00:32:29,360
- How far away are
we from achieving
803
00:32:29,361 --> 00:32:31,439
human-level artificial
intelligence?
804
00:32:31,440 --> 00:32:33,400
- The median answer to 'by which
year do you think
805
00:32:33,401 --> 00:32:37,239
there's a 50% chance?'
was 2040 or 2050.
806
00:32:37,240 --> 00:32:39,039
- That's within our lifetime.
807
00:32:39,040 --> 00:32:40,318
- Yeah.
808
00:32:40,319 --> 00:32:43,479
Within the lifetime of a lot
of people alive today.
809
00:32:43,480 --> 00:32:44,759
The concern is you're building
810
00:32:44,760 --> 00:32:46,579
this very, very
powerful intelligence,
811
00:32:46,580 --> 00:32:49,380
and you wanna be really sure
then that this goal
that it has
812
00:32:49,381 --> 00:32:50,800
is the same as your goal,
813
00:32:50,801 --> 00:32:53,160
that it incorporates
human values in it
814
00:32:53,161 --> 00:32:55,200
because if it's not a goal
that you're happy with
815
00:32:54,960 --> 00:32:56,480
then you might see
the world transform
816
00:32:56,481 --> 00:32:58,979
into something that
maximizes the AI's goal
817
00:32:58,980 --> 00:33:01,440
but leaves no room for
you and your values.
818
00:33:02,400 --> 00:33:05,300
- [Gemma] This idea of
autonomous and
dangerous AIs
819
00:33:05,301 --> 00:33:08,639
is a recurring theme in the
world of science fiction.
820
00:33:08,640 --> 00:33:10,760
Nick thinks that
super-intelligent machines
821
00:33:10,761 --> 00:33:12,679
could one day inhabit
the real world
822
00:33:12,680 --> 00:33:14,979
and use their power
to negative effect
823
00:33:14,980 --> 00:33:17,640
if we don't put the right
safeguards in place.
824
00:33:18,480 --> 00:33:21,200
AIs could take their
instructions to logical
825
00:33:21,201 --> 00:33:23,318
but unanticipated extremes.
826
00:33:23,319 --> 00:33:27,838
- The concern is not that
these AIs would resent us
827
00:33:27,839 --> 00:33:30,838
or resent being exploited by us
828
00:33:30,839 --> 00:33:33,060
or that they would
hate us or something
829
00:33:33,061 --> 00:33:34,820
but that they would
be indifferent to us.
830
00:33:34,821 --> 00:33:38,100
So if you think maybe you have
some big department store
831
00:33:38,101 --> 00:33:40,400
and it wants to build
a new parking place,
832
00:33:40,401 --> 00:33:43,220
maybe there was an ant colony
there before, right?
833
00:33:43,221 --> 00:33:44,358
So it got paved over.
834
00:33:44,359 --> 00:33:46,039
It's not because
we hate the ants.
835
00:33:46,040 --> 00:33:49,838
It's just because they didn't
factor into
our goal.
836
00:33:49,839 --> 00:33:51,100
- I see.
- We didn't care.
837
00:33:51,101 --> 00:33:53,660
Similarly, if you had
a machine that wants
838
00:33:53,661 --> 00:33:55,279
to optimize the universe,
839
00:33:56,279 --> 00:33:58,000
to maximize the
realization of some goal,
840
00:33:58,001 --> 00:34:01,278
in realizing this goal, we
would just kind of be stomped out.
841
00:34:01,279 --> 00:34:02,720
- Collateral damage.
- In the same way that,
842
00:34:02,721 --> 00:34:04,118
yeah, collateral damage.
843
00:34:04,119 --> 00:34:05,079
(intense music)
844
00:34:05,080 --> 00:34:06,639
(ringing)
845
00:34:06,640 --> 00:34:08,760
- [Gemma] The way in
which AIs can be diverted
846
00:34:08,761 --> 00:34:11,000
from what their
architects intended
847
00:34:11,001 --> 00:34:12,379
played out earlier this year
848
00:34:12,380 --> 00:34:14,880
when Microsoft introduced
Tay to Twitter.
849
00:34:15,719 --> 00:34:18,039
The AI persona was designed to
act like an American teenager
850
00:34:18,040 --> 00:34:20,800
to attract a younger audience.
851
00:34:21,559 --> 00:34:23,920
The chatbot worked by
absorbing and mimicking
852
00:34:23,921 --> 00:34:26,200
the language of
other Twitter users
853
00:34:26,201 --> 00:34:27,960
but Tay was hijacked
by internet trolls,
854
00:34:27,961 --> 00:34:31,039
who gave it a very
different set of values.
855
00:34:31,760 --> 00:34:33,800
The original intention
was corrupted
856
00:34:33,801 --> 00:34:35,320
and Tay was unable to work out
857
00:34:35,321 --> 00:34:37,960
which views were acceptable
and which weren't.
858
00:34:37,961 --> 00:34:43,598
Within a day, Tay became a
Hitler-loving, racist sex pest.
859
00:34:43,599 --> 00:34:45,280
This shows what can happen to AI
860
00:34:45,281 --> 00:34:47,220
if it falls into the wrong hands
861
00:34:47,221 --> 00:34:49,440
but could a future
Tay be far worse?
862
00:34:51,360 --> 00:34:54,000
- Once you have a kind of
super-intelligent genie
863
00:34:54,001 --> 00:34:55,580
that's out of the bottle like,
864
00:34:55,581 --> 00:34:57,800
it might not be possible
to put it back in again.
865
00:34:57,801 --> 00:35:00,518
Like, you don't want a
super-intelligent adversary
866
00:35:00,519 --> 00:35:02,239
that is working at
cross-purposes with you
867
00:35:02,240 --> 00:35:04,859
that might then resist your
attempts to shut
it down.
868
00:35:04,860 --> 00:35:08,199
It's much better to get it
right on the first attempt,
869
00:35:08,200 --> 00:35:10,158
not to build a super-intelligent
870
00:35:10,159 --> 00:35:11,639
evil genie in the
first place, right?
871
00:35:11,640 --> 00:35:13,879
If you're gonna have a
super-intelligent genie,
872
00:35:13,880 --> 00:35:15,400
you want it to be-
- You want it to be on your side.
873
00:35:15,119 --> 00:35:17,000
- On your side, yeah. Exactly.
874
00:35:17,001 --> 00:35:18,839
(intense music)
875
00:35:18,840 --> 00:35:21,198
- [Gemma] In Cornwall, our genie
876
00:35:21,199 --> 00:35:23,440
is about to be let
out of its bottle
877
00:35:23,441 --> 00:35:25,559
and I want to know
whose side it's on.
878
00:35:25,560 --> 00:35:27,400
- Robot's upstairs here.
879
00:35:29,639 --> 00:35:33,479
Bear in mind this is
not your final skin so,
880
00:35:33,480 --> 00:35:37,439
let's have a look inside and
just let her peek right out.
881
00:35:37,440 --> 00:35:38,960
- [Gemma] Oh my goodness.
882
00:35:39,800 --> 00:35:40,919
(gasping)
883
00:35:40,920 --> 00:35:43,678
That's so weird.
884
00:35:43,679 --> 00:35:46,239
(eerie music)
885
00:35:48,199 --> 00:35:49,759
- See?
886
00:35:49,760 --> 00:35:50,800
It's quite warm.
887
00:35:50,801 --> 00:35:53,238
Now, just feel it
though, it's weird.
888
00:35:53,239 --> 00:35:56,000
(eerie music)
889
00:35:56,599 --> 00:35:58,079
What do you think of the eyes?
890
00:35:58,080 --> 00:36:00,800
(eerie music)
891
00:36:02,800 --> 00:36:03,920
- Ah! Oh my God!
892
00:36:03,921 --> 00:36:07,238
(laughing)
893
00:36:07,239 --> 00:36:09,039
(eerie music)
894
00:36:16,679 --> 00:36:20,759
It's so strange 'cause
she's not quite right
895
00:36:20,760 --> 00:36:26,839
but she, I can recognize,
like the nose is,
896
00:36:26,840 --> 00:36:30,519
well, I mean it's,
yeah, it's my nose.
897
00:36:31,320 --> 00:36:33,078
- [Will] Try asking
her something.
898
00:36:33,079 --> 00:36:34,158
- Okay.
899
00:36:34,159 --> 00:36:35,960
What have you been up to today?
900
00:36:36,880 --> 00:36:40,638
- We've been busy filming
season two of "Humans"
since April
901
00:36:40,639 --> 00:36:42,239
and it's been very exciting.
902
00:36:44,400 --> 00:36:45,598
- Not bad.
903
00:36:45,599 --> 00:36:47,520
- [Will] So we can
have a kind of guess
904
00:36:47,521 --> 00:36:49,638
of the sort of things
people might say.
905
00:36:49,639 --> 00:36:50,678
Say-
906
00:36:50,679 --> 00:36:52,119
- [Robot Gemma] I can't
resist cheese on toast.
907
00:36:52,120 --> 00:36:53,598
(laughing)
908
00:36:53,599 --> 00:36:55,599
- What did you
have for breakfast?
909
00:36:56,400 --> 00:36:58,320
- [Robot Gemma] I had a
toasted cheese sandwich.
910
00:36:57,960 --> 00:37:00,079
- Is that because you can't
resist cheese
on toast?
911
00:37:01,679 --> 00:37:03,840
- [Robot Gemma] I like
the taste of cheese.
912
00:37:03,841 --> 00:37:04,919
(laughing)
913
00:37:04,920 --> 00:37:07,639
- Is that true? Do you really
like cheese on toast?
914
00:37:07,640 --> 00:37:09,359
- I love cheese, yeah.
915
00:37:09,360 --> 00:37:12,439
- Oh, so there's a
little personality trait
916
00:37:12,440 --> 00:37:13,760
we might've got right.
917
00:37:13,761 --> 00:37:15,559
- Maybe not as much
as she likes cheese.
918
00:37:16,760 --> 00:37:18,480
It's the facial expressions.
919
00:37:18,481 --> 00:37:21,599
They're not quite in sync
with what she's saying.
920
00:37:21,600 --> 00:37:24,639
- Basically, it's a
slight software bug.
921
00:37:24,920 --> 00:37:27,559
- [Will] Don't worry too much.
922
00:37:29,679 --> 00:37:31,380
- [Gemma] She's confused.
923
00:37:31,381 --> 00:37:32,440
- [Will] Very strange.
924
00:37:32,441 --> 00:37:34,679
- [Gemma] A few facial tics
going on.
925
00:37:34,680 --> 00:37:37,879
- Yeah, she has
but this is really
926
00:37:37,880 --> 00:37:39,999
the first time we've tried this
927
00:37:40,000 --> 00:37:42,879
so it's very, very new
and what you'll find
928
00:37:42,880 --> 00:37:45,859
is that things will progress
very, very quickly.
929
00:37:45,860 --> 00:37:48,720
- [Gemma] Facial tics aside,
the build is going well.
930
00:37:48,721 --> 00:37:50,620
Every day, the robot
is making progress
931
00:37:50,621 --> 00:37:53,079
in terms of how it
looks, sounds and thinks.
932
00:37:53,080 --> 00:37:56,759
So we've come up with an idea
to put it to
the test.
933
00:37:56,760 --> 00:37:57,799
- Hi Gemma.
934
00:37:57,800 --> 00:37:58,800
- [Robot Gemma] Hi.
935
00:37:58,801 --> 00:38:00,159
- Robot Gemma, do not fail.
936
00:38:01,000 --> 00:38:04,799
(eerie music)
937
00:38:04,800 --> 00:38:07,400
- We're building an
artificially intelligent robot
938
00:38:07,401 --> 00:38:11,360
that looks like me, talks
like me and thinks
like me
939
00:38:12,440 --> 00:38:15,039
and today I'm going to meet
her in her finished form.
940
00:38:16,199 --> 00:38:19,678
Last time I saw her, she
needed quite a
bit of work
941
00:38:19,679 --> 00:38:23,118
so I'm hoping that
today we will have
942
00:38:23,119 --> 00:38:24,840
more of a finished product.
943
00:38:26,039 --> 00:38:28,159
Yeah, I don't know. I'm quite
nervous.
944
00:38:28,160 --> 00:38:29,839
(eerie music)
945
00:38:29,840 --> 00:38:31,919
- [Will] Here it is.
- Ah!
946
00:38:31,920 --> 00:38:34,399
Oh no.
947
00:38:34,400 --> 00:38:35,959
Really strange.
948
00:38:35,960 --> 00:38:37,360
- [Will] It is spooky, isn't it?
949
00:38:37,361 --> 00:38:39,078
(gasping)
950
00:38:39,079 --> 00:38:40,079
- Wow.
951
00:38:41,679 --> 00:38:45,158
It's really quite uncanny.
952
00:38:45,159 --> 00:38:47,920
(eerie music)
953
00:38:50,639 --> 00:38:54,518
In the 1950s, the visionary
godfather of British computing,
954
00:38:54,519 --> 00:38:56,039
Alan Turing, looked forward
to the days
955
00:38:56,040 --> 00:38:59,399
when a computer would
be able to think like us
956
00:38:59,400 --> 00:39:00,840
and he came up with a test.
957
00:39:01,840 --> 00:39:03,960
Could a bystander
tell if he was talking
958
00:39:03,961 --> 00:39:06,439
to a machine or a person?
959
00:39:06,440 --> 00:39:09,280
It's known as the Turing
Test and we've adapted it
960
00:39:09,281 --> 00:39:11,518
to try out on our own robot.
961
00:39:11,519 --> 00:39:13,000
- This is an enormous challenge.
962
00:39:13,001 --> 00:39:15,079
Basically, what
we're doing is taking
963
00:39:15,080 --> 00:39:17,959
a fictional vision of the future
964
00:39:17,960 --> 00:39:20,480
and trying to bring it here now.
965
00:39:21,960 --> 00:39:24,359
- She's ready for her closeup.
966
00:39:24,360 --> 00:39:27,500
I've invited some journalists
to come and interview her
967
00:39:27,501 --> 00:39:30,039
to see if they can tell
it's not actually me
968
00:39:30,840 --> 00:39:33,919
and we've rigged the place
with hidden cameras.
969
00:39:33,920 --> 00:39:34,920
(people talking)
970
00:39:34,921 --> 00:39:36,718
- Hi there.
971
00:39:36,719 --> 00:39:37,760
- Hello.
972
00:39:37,559 --> 00:39:39,400
- [Gemma] And I've
called in a favor
973
00:39:39,039 --> 00:39:41,280
from "Humans" castmates Emily
Barrington and
Will Tudor
974
00:39:41,281 --> 00:39:43,519
to give proceedings
an air of reality
975
00:39:43,520 --> 00:39:45,919
and the journalists
will meet them first.
976
00:39:45,920 --> 00:39:48,079
- I'll come back and tell you
when we've got two minutes.
977
00:39:48,080 --> 00:39:49,639
- Okay.
- All right.
978
00:39:49,640 --> 00:39:50,879
(intense music)
979
00:39:50,880 --> 00:39:53,100
- [Gemma] Before being
taken into a second room,
980
00:39:53,101 --> 00:39:54,879
where, to give us
a fighting chance,
981
00:39:54,880 --> 00:39:57,900
they'll interview Robot
Gemma over a video call.
982
00:39:57,901 --> 00:39:59,158
- Hey Gemma.
983
00:39:59,159 --> 00:40:01,920
- [Robot Gemma] Hi, I'm so
sorry I can't be there
in person.
984
00:40:02,880 --> 00:40:04,640
- [Gemma] I'll be in
a room down the hall,
985
00:40:04,641 --> 00:40:07,320
watching with the robot's
creators, Will and Kate.
986
00:40:08,360 --> 00:40:09,119
Come in.
987
00:40:09,120 --> 00:40:11,319
- [Will] I'm so excited.
988
00:40:11,320 --> 00:40:13,040
- [Gemma] But first,
there's just time
989
00:40:13,041 --> 00:40:15,638
to introduce Robot
Gemma to my castmates.
990
00:40:15,639 --> 00:40:17,158
(gasping)
991
00:40:17,159 --> 00:40:19,799
- Oh my goodness,
that's terrifying.
992
00:40:19,800 --> 00:40:22,960
What's extraordinary is the
tiny movements
in the face.
993
00:40:22,961 --> 00:40:24,959
- Are you real?
994
00:40:24,960 --> 00:40:25,960
- Maybe.
995
00:40:25,961 --> 00:40:27,400
- [Both] Oh!
996
00:40:29,679 --> 00:40:33,399
- So with everything in place,
the experiment begins.
997
00:40:33,400 --> 00:40:34,559
I'm so nervous.
998
00:40:34,560 --> 00:40:35,678
Oh my God.
999
00:40:35,679 --> 00:40:38,360
- So should I just get started
and ask you questions.
1000
00:40:38,361 --> 00:40:42,439
- I live in Notting Hill.
(laughing)
1001
00:40:42,440 --> 00:40:44,518
- That's very interesting.
1002
00:40:44,519 --> 00:40:46,060
- Could you say-
- Sorry?
1003
00:40:46,061 --> 00:40:47,940
- [Robot Gemma]
Apology accepted.
1004
00:40:47,941 --> 00:40:50,119
- [Gemma] It's not the
smoothest of starts.
1005
00:40:50,120 --> 00:40:51,460
- My name's Lareb.
1006
00:40:51,461 --> 00:40:52,680
- [Robot Gemma] Got it.
1007
00:40:52,681 --> 00:40:54,920
- How are you today?
- Your name is,
1008
00:40:54,921 --> 00:40:56,000
is your name?
1009
00:40:56,001 --> 00:40:57,840
- No, Lareb L-A-R-E-B.
1010
00:40:57,841 --> 00:41:00,159
- [Robot Gemma] Could you
say that one more time?
1011
00:41:01,920 --> 00:41:02,959
- Lareb.
1012
00:41:02,960 --> 00:41:06,399
- [Robot Gemma] Great,
yeah, what's the name?
1013
00:41:06,400 --> 00:41:07,598
(smacking)
1014
00:41:07,599 --> 00:41:09,500
- Thank you for
coming on and Skyping.
1015
00:41:09,501 --> 00:41:11,880
That's really nice of you. -
[Robot Gemma] Oh,
thank you
1016
00:41:11,881 --> 00:41:13,759
for taking the
time to talk to me.
1017
00:41:13,760 --> 00:41:14,719
(laughing)
1018
00:41:14,720 --> 00:41:16,559
- How was season one for you?
1019
00:41:16,599 --> 00:41:19,440
- [Robot Gemma] I wish I
could explain it to you
1020
00:41:19,441 --> 00:41:21,799
but I think it is
just an instinct.
1021
00:41:21,800 --> 00:41:22,400
(groaning)
1022
00:41:22,360 --> 00:41:23,360
- [Kate] No!
1023
00:41:23,880 --> 00:41:26,559
- I think he knows.
1024
00:41:27,760 --> 00:41:28,960
So far, Robot Gemma's responses
1025
00:41:28,961 --> 00:41:31,558
haven't gone as
well as I'd hoped
1026
00:41:31,559 --> 00:41:34,960
but gradually, she starts to
find her train
of thought.
1027
00:41:34,961 --> 00:41:37,359
- So tell me about series two.
1028
00:41:37,360 --> 00:41:39,199
- It's a key word
for Gemma Robot.
1029
00:41:39,200 --> 00:41:40,238
Series two.
1030
00:41:40,239 --> 00:41:45,558
If this journalist just
keeps saying series two,
1031
00:41:45,559 --> 00:41:47,199
everything will go fine.
1032
00:41:47,200 --> 00:41:51,718
- It seems that the scope has
got bigger for series two.
1033
00:41:51,719 --> 00:41:55,079
- [Robot Gemma] The main
difference is that the world
1034
00:41:55,080 --> 00:41:57,718
of the show has become bigger.
1035
00:41:57,719 --> 00:42:01,079
- And what does Mia
want in series two?
1036
00:42:02,039 --> 00:42:05,000
- [Robot Gemma] Mia's trying to
find her place in the world.
1037
00:42:05,001 --> 00:42:06,960
- The voice is a little
bit glitchy, isn't it?
1038
00:42:06,961 --> 00:42:08,700
- She's going, though.
1039
00:42:08,701 --> 00:42:10,199
She's getting into it.
1040
00:42:10,200 --> 00:42:14,559
- Do you think a
synth can enjoy art?
1041
00:42:15,400 --> 00:42:16,700
- That's a good question.
1042
00:42:16,701 --> 00:42:19,000
- [Robot Gemma] Yes,
especially like sculpture.
1043
00:42:19,001 --> 00:42:20,558
- Yes, good answer!
1044
00:42:20,559 --> 00:42:22,119
- Can you hear me now?
1045
00:42:22,880 --> 00:42:24,920
- [Robot Gemma] Yup, I can
see and hear
you okay.
1046
00:42:24,921 --> 00:42:27,518
- Lovely, perfect.
1047
00:42:27,519 --> 00:42:28,760
Okay, so-
1048
00:42:29,519 --> 00:42:31,340
- [Robot Gemma]
We're short on time.
1049
00:42:31,341 --> 00:42:32,920
- Okay, that's absolutely fine.
1050
00:42:32,760 --> 00:42:34,159
- I don't think she's clocked
1051
00:42:34,160 --> 00:42:35,760
that she's not
talking to me yet.
1052
00:42:35,761 --> 00:42:38,840
- [Robot Gemma] What would
you like to see
a robot do?
1053
00:42:38,841 --> 00:42:41,199
- Oh, I'd love a
robot to tidy my room.
1054
00:42:41,200 --> 00:42:43,638
(laughing)
1055
00:42:43,639 --> 00:42:44,640
- Oh my God.
1056
00:42:44,641 --> 00:42:46,480
- What can we expect
from season two?
1057
00:42:47,840 --> 00:42:49,039
- [Robot Gemma] The
synth characters
1058
00:42:49,040 --> 00:42:51,839
have fragmented, in a way.
1059
00:42:51,840 --> 00:42:52,919
- Yes, yes!
1060
00:42:52,920 --> 00:42:54,280
- [Robot Gemma] I can't tell
you anything more specific
1061
00:42:54,281 --> 00:42:56,078
about the plot, I'm afraid.
1062
00:42:56,079 --> 00:42:58,359
- Can you tell us where
we find your character?
1063
00:42:58,360 --> 00:43:00,000
- [Robot Gemma] Mia's not
with the Hawkins family.
1064
00:43:00,001 --> 00:43:03,078
Mia's trying to find
her place in the world.
1065
00:43:03,079 --> 00:43:04,518
- We've got her.
1066
00:43:04,519 --> 00:43:05,559
She totally believes it.
1067
00:43:06,079 --> 00:43:09,519
- And there's a new character
that you spend quite a lot of
time with.
1068
00:43:09,520 --> 00:43:11,078
- She's perfect.
1069
00:43:11,079 --> 00:43:14,599
- What can you tell us about
their relationship if anything?
1070
00:43:15,719 --> 00:43:17,079
- I can't answer that.
1071
00:43:17,880 --> 00:43:19,039
- Oh.
- She's not sure.
1072
00:43:19,040 --> 00:43:20,718
- That's okay.
1073
00:43:20,719 --> 00:43:22,119
It's okay to be evasive on that.
1074
00:43:22,120 --> 00:43:24,360
- [Robot Gemma] Who's
your favorite character?
1075
00:43:24,361 --> 00:43:25,919
- You, obviously.
1076
00:43:25,920 --> 00:43:30,799
- [Gemma] Time to meet the
journalists and reveal
the ruse.
1077
00:43:30,800 --> 00:43:32,759
(intense music)
1078
00:43:32,760 --> 00:43:33,760
- Hello?
1079
00:43:33,761 --> 00:43:34,999
- [Gemma] Hello.
1080
00:43:35,000 --> 00:43:36,038
Hi.
1081
00:43:36,039 --> 00:43:37,799
- This is so confusing.
1082
00:43:37,800 --> 00:43:40,639
- [Gemma] Just how
convincing was it?
1083
00:43:41,960 --> 00:43:43,598
- She looks so real!
1084
00:43:43,599 --> 00:43:45,580
What did you have for
breakfast this morning?
1085
00:43:45,581 --> 00:43:48,678
- The usual, got dressed
and sat at my computer.
1086
00:43:48,679 --> 00:43:51,118
(laughing)
1087
00:43:51,119 --> 00:43:52,559
- [Gemma] What do you think?
1088
00:43:52,360 --> 00:43:53,840
- Well, I wasn't
sure to be honest.
1089
00:43:53,841 --> 00:43:54,900
- Yeah.
- Yeah.
1090
00:43:54,901 --> 00:43:56,740
- [Gemma] What
were the giveaways?
1091
00:43:56,741 --> 00:43:58,100
- [Woman] The voice maybe.
1092
00:43:58,101 --> 00:44:00,639
- Fooled me when I first sat
down and looked
at it,
1093
00:44:00,640 --> 00:44:01,960
yeah, completely fooled me.
1094
00:44:01,961 --> 00:44:03,198
- [Gemma] Oh wow.
1095
00:44:03,199 --> 00:44:05,480
- It just looks like it's kind of
not the best Skype connection.
1096
00:44:05,481 --> 00:44:06,519
(laughing)
1097
00:44:06,520 --> 00:44:08,159
How are you today?
1098
00:44:08,760 --> 00:44:10,879
- I am very good, thank you.
1099
00:44:10,880 --> 00:44:13,078
(laughing)
1100
00:44:13,079 --> 00:44:15,439
- Did you believe you
were talking to me?
1101
00:44:15,440 --> 00:44:16,440
- I did, yeah.
1102
00:44:16,280 --> 00:44:17,559
I honestly thought it was you.
1103
00:44:17,560 --> 00:44:19,959
- Could you say
that one more time?
1104
00:44:19,960 --> 00:44:24,078
(intense music)
1105
00:44:24,079 --> 00:44:25,760
- [Gemma] What's
really surprised me
1106
00:44:25,761 --> 00:44:27,480
is that we've got
as far as we have.
1107
00:44:27,481 --> 00:44:30,920
- There was obviously something
slightly uncanny
about her.
1108
00:44:32,679 --> 00:44:34,899
- I'm really impressed
that she held her own.
1109
00:44:34,900 --> 00:44:37,559
I didn't know if anyone would
be convinced by our robot
1110
00:44:37,560 --> 00:44:43,400
and quite a few people were,
for at least a bit, so in
that sense, that's a success.
1111
00:44:43,719 --> 00:44:46,759
Having set out to find
out how far we are
1112
00:44:46,760 --> 00:44:49,879
from the unsettling fictional
world of "Humans,"
1113
00:44:49,880 --> 00:44:53,199
the answer is perhaps a bit
more complicated
than I thought.
1114
00:44:54,239 --> 00:44:57,020
Whilst human-like robots may
well be some
way away,
1115
00:44:57,021 --> 00:44:58,920
what's clear is that AI
is developing at speed
1116
00:44:58,921 --> 00:45:01,919
and we need to debate
the potential pitfalls
1117
00:45:01,920 --> 00:45:03,799
before it's too late.
1118
00:45:03,800 --> 00:45:05,399
(intense music)
1119
00:45:05,400 --> 00:45:07,439
- It's very clear that
these technologies
1120
00:45:07,440 --> 00:45:08,679
are getting better and better
1121
00:45:08,440 --> 00:45:10,000
and I do think we
have to understand
1122
00:45:09,719 --> 00:45:11,480
that we're approaching
this tipping point,
1123
00:45:11,481 --> 00:45:14,820
this point where it's gonna
have a greatly amplified effect
1124
00:45:14,821 --> 00:45:17,200
and we need to find a
way to adapt to that.
1125
00:45:17,201 --> 00:45:20,000
- The same technology can be
used for good or for bad
1126
00:45:20,001 --> 00:45:21,719
and I think it's down to society
1127
00:45:21,720 --> 00:45:26,399
and the inventors of that
technology and the public at
large
1128
00:45:26,400 --> 00:45:28,960
to make sure that it gets
used for the right things.
1129
00:45:28,961 --> 00:45:33,078
- [Gemma] This is
just the beginning.
1130
00:45:33,079 --> 00:45:36,000
(upbeat music)