1
00:01:02,160 --> 00:01:06,199
We choose to go to the moon.
2
00:01:06,200 --> 00:01:08,999
We choose to go to the moon
in this decade
3
00:01:09,000 --> 00:01:12,319
and do the other things,
not because they are easy
4
00:01:12,320 --> 00:01:14,039
but because they are hard.
5
00:01:14,040 --> 00:01:16,119
T-minus 20 seconds and counting.
6
00:01:16,120 --> 00:01:21,639
Guidance in channel 15, 14,
13, 12, 11, 10...
7
00:01:21,640 --> 00:01:24,079
9... Engines go.
Go.
8
00:01:24,080 --> 00:01:25,439
5, 4, 3, 2...
Go.
9
00:01:25,440 --> 00:01:26,959
Delta, go.
Go.
10
00:01:26,960 --> 00:01:28,319
Econ go.
Launch commit.
11
00:01:28,320 --> 00:01:30,039
Go.
12
00:01:54,000 --> 00:02:01,719
That's one small step for man,
one giant leap for mankind.
13
00:02:01,720 --> 00:02:04,399
Shortly after I was born,
Neil Armstrong
14
00:02:04,400 --> 00:02:07,279
became the first man
to walk on the moon.
15
00:02:07,280 --> 00:02:09,439
Putting a man on the moon
was the greatest technological
16
00:02:09,440 --> 00:02:12,759
achievement in the 20th century,
and it was televised live
17
00:02:12,760 --> 00:02:16,319
for all to see
and experience together.
18
00:02:16,320 --> 00:02:20,479
It's different,
but it's very pretty out here.
19
00:02:20,480 --> 00:02:23,639
Beautiful, beautiful.
Isn't that something?
20
00:02:23,640 --> 00:02:25,800
Are you getting a TV
picture now, Houston?
21
00:02:28,360 --> 00:02:31,999
I grew up in a small community
living in the shadow of NASA.
22
00:02:32,000 --> 00:02:34,719
It was equal parts cowboy
and astronaut.
23
00:02:34,720 --> 00:02:36,519
Our neighbors were
the engineers, technicians,
24
00:02:36,520 --> 00:02:38,719
and the men who flew to space.
25
00:02:38,720 --> 00:02:40,439
I grew up with visions
of the future
26
00:02:40,440 --> 00:02:44,359
where robots cleaned our homes,
and the future was bright.
27
00:02:44,360 --> 00:02:46,519
Computers and the Internet,
28
00:02:46,520 --> 00:02:48,519
once limited to NASA
and universities,
29
00:02:48,520 --> 00:02:50,759
began to move into our homes.
30
00:02:50,760 --> 00:02:54,399
In 1981, I got my first
computer, the Commodore VIC-20.
31
00:02:54,400 --> 00:02:57,879
It has great games, too.
And learn computing at home.
32
00:02:57,880 --> 00:03:02,759
In 1997, Kasparov,
the greatest chess player alive,
33
00:03:02,760 --> 00:03:05,639
was stunningly defeated
by a computer.
34
00:03:05,640 --> 00:03:08,559
Probably in the future,
computer will replace us,
35
00:03:08,560 --> 00:03:10,399
I mean, will control our life.
36
00:03:10,400 --> 00:03:11,919
What's that?
37
00:03:11,920 --> 00:03:14,799
Shortly after that,
a cute toy named Furby
38
00:03:14,800 --> 00:03:17,479
became the must-have toy
of the season.
39
00:03:17,480 --> 00:03:20,799
It contained a very small bit
of artificial intelligence
40
00:03:20,800 --> 00:03:24,879
and more computing power than
was used to put man on the moon.
41
00:03:24,880 --> 00:03:27,959
In 2008, I saw
a video of Big Dog,
42
00:03:27,960 --> 00:03:30,279
a four-legged robot
that went viral.
43
00:03:30,280 --> 00:03:33,359
It was developed to
assist troops in combat.
44
00:03:33,360 --> 00:03:37,159
It seems like robots were being
used for more than cute toys.
45
00:03:37,160 --> 00:03:40,559
What can we do to protect
ourselves from robot automation?
46
00:03:40,560 --> 00:03:43,719
The CEO explained that machines
will replace traditional labor.
47
00:03:43,720 --> 00:03:46,199
Companies planning to
replace human jobs...
48
00:03:46,200 --> 00:03:49,719
Every day, there are stories
about robots replacing us.
49
00:03:49,720 --> 00:03:52,359
Current estimates say
that over a third of all jobs
50
00:03:52,360 --> 00:03:56,879
will be lost to automation
by the year 2030.
51
00:03:56,880 --> 00:04:00,879
I think the development of
full artificial intelligence
52
00:04:00,880 --> 00:04:03,759
could spell the end
of the human race.
53
00:04:03,760 --> 00:04:06,479
Visionaries like
Stephen Hawking and Elon Musk
54
00:04:06,480 --> 00:04:10,399
warn that, while AI has great
potential for benefits,
55
00:04:10,400 --> 00:04:14,519
it could also be the greatest
existential risk to humanity.
56
00:04:18,000 --> 00:04:20,719
Artificial intelligence...
We are summoning a demon.
57
00:04:20,720 --> 00:04:23,599
Of course, we had a hand
in this creation,
58
00:04:23,600 --> 00:04:26,399
but I wonder, have we reached
some kind of turning point
59
00:04:26,400 --> 00:04:29,199
in the evolution
of intelligent machines?
60
00:04:29,200 --> 00:04:31,159
Are we on the verge
of witnessing the birth
61
00:04:31,160 --> 00:04:33,239
of a new species?
62
00:04:46,200 --> 00:04:51,479
I've seen things you people
wouldn't believe.
63
00:04:54,280 --> 00:05:00,039
Attack ships on fire
off the shoulder of Orion.
64
00:05:00,040 --> 00:05:03,279
I watched C-beams
glitter in the dark
65
00:05:03,280 --> 00:05:07,479
near the Tannhauser Gate.
66
00:05:07,480 --> 00:05:14,199
All those moments would be lost
in time,
67
00:05:14,200 --> 00:05:21,159
like tears in the rain.
68
00:05:21,160 --> 00:05:22,679
I'm Will Jackson.
69
00:05:22,680 --> 00:05:24,959
I'm the director at
Engineered Arts Ltd.
70
00:05:24,960 --> 00:05:30,279
We're a humanoid-robot company,
and we make acting robots.
71
00:05:30,280 --> 00:05:33,359
I'm creating things
that engage people
72
00:05:33,360 --> 00:05:36,479
to the level where they
suspend their disbelief.
73
00:05:36,480 --> 00:05:38,879
Oh, yes, Master Luke.
Remember that I'm fluent
74
00:05:38,880 --> 00:05:41,439
in over six million forms
of communication.
75
00:05:41,440 --> 00:05:44,079
People say things like,
"Oh, they're going to
help out in hospitals.
76
00:05:44,080 --> 00:05:46,399
It's going to look
after my aged parent.
77
00:05:46,400 --> 00:05:47,959
It's going to do the dishes
in my houses."
78
00:05:47,960 --> 00:05:49,959
None of these things
are achievable.
79
00:05:49,960 --> 00:05:52,159
Utility and entertainment...
80
00:05:52,160 --> 00:05:53,639
It's the difference
between those two things,
81
00:05:53,640 --> 00:05:57,359
which is why
we make acting robots.
82
00:05:57,360 --> 00:06:00,959
So what's the simplest,
most well-paid job
83
00:06:00,960 --> 00:06:03,559
that a human can do?
84
00:06:03,560 --> 00:06:07,239
Brad Pitt, you know?
How easy is that?
85
00:06:07,240 --> 00:06:09,759
Doesn't even get his hands wet,
you know?
86
00:06:09,760 --> 00:06:12,599
So that's why we do
acting robots...
87
00:06:12,600 --> 00:06:15,239
Really well-paid, really easy.
88
00:06:15,240 --> 00:06:17,239
Pick the low-hanging fruit.
89
00:06:17,240 --> 00:06:19,679
You talking to me?
90
00:06:19,680 --> 00:06:22,679
You talking to me?
91
00:06:22,680 --> 00:06:25,119
You talking to me?
92
00:06:25,120 --> 00:06:26,799
Then who the hell else
are you talking to?
93
00:06:26,800 --> 00:06:28,239
Were you talking to me?
94
00:06:28,240 --> 00:06:30,559
It's the recreation of life.
95
00:06:30,560 --> 00:06:34,599
It's partly man plays God
making the machine.
96
00:06:34,600 --> 00:06:38,479
And I think it's right back
to that fundamental question of,
97
00:06:38,480 --> 00:06:40,599
what does it mean to be human?
98
00:06:40,600 --> 00:06:42,599
How do we define ourselves?
99
00:06:42,600 --> 00:06:44,360
How is that not me?
100
00:06:46,760 --> 00:06:49,039
Here's where the fun begins.
101
00:06:49,040 --> 00:06:51,559
When I was 9,
on the last day of school,
102
00:06:51,560 --> 00:06:54,319
a new movie opened
that everyone was excited about.
103
00:06:54,320 --> 00:06:56,319
May the force be with you.
It was called "Star Wars,"
104
00:06:56,320 --> 00:06:58,999
and it introduced us
to two very cool robots,
105
00:06:59,000 --> 00:07:01,119
R2-D2 and C-3PO.
106
00:07:01,120 --> 00:07:02,359
Go that way.
107
00:07:02,360 --> 00:07:03,679
You'll be malfunctioning
within a day,
108
00:07:03,680 --> 00:07:06,159
you nearsighted scrap pile.
109
00:07:06,160 --> 00:07:07,639
And don't let me catch you
following me,
110
00:07:07,640 --> 00:07:10,119
begging for help,
because you won't get it.
111
00:07:10,120 --> 00:07:12,559
A couple of years later,
another sci-fi movie
112
00:07:12,560 --> 00:07:14,959
was released
called "Blade Runner."
113
00:07:14,960 --> 00:07:16,959
The robots in this film
were not awkward
114
00:07:16,960 --> 00:07:19,239
and funny machines
like in "Star Wars."
115
00:07:19,240 --> 00:07:22,599
Here, they were almost
indistinguishable from humans.
116
00:07:25,680 --> 00:07:27,399
You're reading a magazine.
117
00:07:27,400 --> 00:07:29,919
You come across a full-page
nude photo of a girl.
118
00:07:29,920 --> 00:07:31,599
Is this testing whether
I'm a replicant
119
00:07:31,600 --> 00:07:34,759
or a lesbian, Mr. Deckard?
120
00:07:34,760 --> 00:07:38,599
After that, "Terminator"
was released.
121
00:07:38,600 --> 00:07:40,799
In a few short years,
I watched robots
122
00:07:40,800 --> 00:07:45,199
go from being our friends
to wanting to destroy us.
123
00:07:45,200 --> 00:07:46,839
Aah!
124
00:08:08,320 --> 00:08:10,959
I wrote a book
called "Robopocalypse,"
125
00:08:10,960 --> 00:08:15,079
so people think that I think
robots are going to kill us all,
126
00:08:15,080 --> 00:08:17,519
but I don't, you know?
127
00:08:17,520 --> 00:08:19,039
I love robotics.
128
00:08:19,040 --> 00:08:23,239
I think robots and AI
are going to make humanity
129
00:08:23,240 --> 00:08:25,879
into whatever it is
we're capable of becoming.
130
00:08:25,880 --> 00:08:30,119
I think they're going to push us
forward and help us evolve.
131
00:08:30,120 --> 00:08:33,079
I don't think
they're going to destroy us.
132
00:08:33,080 --> 00:08:38,799
But there have been 100 years
of pop-culture references
133
00:08:38,800 --> 00:08:41,439
about robots as evil killers.
134
00:08:41,440 --> 00:08:43,479
Aah!
135
00:08:43,480 --> 00:08:46,279
It used to be that robots
were movie monsters, you know?
136
00:08:46,280 --> 00:08:50,559
Like, back in the day,
you had vampires and mummies,
137
00:08:50,560 --> 00:08:53,159
Frankenstein, the creature
from the Black Lagoon,
138
00:08:53,160 --> 00:08:55,799
Wolfman, and robots.
139
00:08:55,800 --> 00:08:58,880
They were no different
than monsters.
140
00:09:01,080 --> 00:09:03,639
Aah!
141
00:09:03,640 --> 00:09:07,119
When you've got something
that's been pop culture
for that long,
142
00:09:07,120 --> 00:09:09,599
it gathers momentum, you know?
143
00:09:09,600 --> 00:09:11,439
People get an idea
of what something is,
144
00:09:11,440 --> 00:09:13,159
and it's hard to shake that
145
00:09:13,160 --> 00:09:15,759
because you've got
all these years
146
00:09:15,760 --> 00:09:18,919
of different iterations
of robots killing us,
147
00:09:18,920 --> 00:09:20,359
robots enslaving people,
148
00:09:20,360 --> 00:09:24,759
robots stealing women
to take to other planets.
149
00:09:24,760 --> 00:09:26,279
How do you get out of that rut?
150
00:09:26,280 --> 00:09:30,920
And the only way is being
exposed to real robotics.
151
00:09:33,520 --> 00:09:36,199
The challenge of exposing
myself to real robotics
152
00:09:36,200 --> 00:09:37,879
was just the excuse
that I needed
153
00:09:37,880 --> 00:09:41,359
to fulfill a childhood dream.
154
00:09:41,360 --> 00:09:44,719
I've assembled my dream team
to help me build an AI
155
00:09:44,720 --> 00:09:47,799
that can make a film about AI.
156
00:09:47,800 --> 00:09:50,039
Can our robot learn
to be creative,
157
00:09:50,040 --> 00:09:53,759
and if so, can it eventually
replace me, as well?
158
00:09:57,320 --> 00:10:00,799
We're going to build a robot
and let it interview me.
159
00:10:00,800 --> 00:10:03,039
We get "X," "Y," and "Z"
working first.
160
00:10:03,040 --> 00:10:04,719
My name is Madeline Gannon.
161
00:10:04,720 --> 00:10:07,959
We're here in Pittsburgh at
the Studio for Creative Inquiry,
162
00:10:07,960 --> 00:10:10,439
and I'm a robot whisperer.
163
00:10:10,440 --> 00:10:13,039
We encounter a lot of robots
in our daily life,
164
00:10:13,040 --> 00:10:15,519
so my work is inventing
165
00:10:15,520 --> 00:10:17,759
better ways to communicate
with these machines.
166
00:10:17,760 --> 00:10:19,279
I can see the future.
167
00:10:19,280 --> 00:10:21,039
The best possible end goal
is that
168
00:10:21,040 --> 00:10:23,359
someone
who has never seen a robot
169
00:10:23,360 --> 00:10:26,039
and someone who's not
an experienced camera person
170
00:10:26,040 --> 00:10:29,239
can either be interviewed by
or work with this robot.
171
00:10:29,240 --> 00:10:31,439
That would be really exciting.
172
00:10:31,440 --> 00:10:33,719
One of the things
I'm very excited about
173
00:10:33,720 --> 00:10:37,119
is sort of being interviewed
by the robot alone
174
00:10:37,120 --> 00:10:39,839
and wondering how that will
be different, as opposed to,
175
00:10:39,840 --> 00:10:42,719
like, right now
when there's people around.
Two, three...
176
00:10:42,720 --> 00:10:44,599
If we can create a device
177
00:10:44,600 --> 00:10:47,479
that actually allows us
to be more intimate
178
00:10:47,480 --> 00:10:49,599
and tell a story
in a more intimate way,
179
00:10:49,600 --> 00:10:53,559
I think that
that's a true revelation.
180
00:10:53,560 --> 00:10:56,599
Okay. Cool.
181
00:10:56,600 --> 00:10:58,919
I'm here working with
the CameraBot team
182
00:10:58,920 --> 00:11:02,479
on replacing our DP,
who's sitting behind this shot.
183
00:11:02,480 --> 00:11:05,239
I'm curious to see if,
through this project,
184
00:11:05,240 --> 00:11:10,079
we can create something as
empathy-requiring as possible,
185
00:11:10,080 --> 00:11:14,159
making it feel like the robot
is interacting with you
186
00:11:14,160 --> 00:11:19,279
in a way that's...
Has space for failure.
187
00:11:19,280 --> 00:11:21,359
Think that should be good.
188
00:11:21,360 --> 00:11:23,439
I'm an interaction designer
and media artist,
189
00:11:23,440 --> 00:11:26,919
and I work on some
never-before-attempted things.
190
00:11:26,920 --> 00:11:28,919
Because it's never
been done before,
191
00:11:28,920 --> 00:11:30,919
we kind of have to experiment
and find
192
00:11:30,920 --> 00:11:32,839
where the edges
and limitations are,
193
00:11:32,840 --> 00:11:34,639
and may not be
exactly what we want,
194
00:11:34,640 --> 00:11:36,759
but we may run into
some situations where we end up
195
00:11:36,760 --> 00:11:40,199
with, like, a happy accident.
196
00:11:40,200 --> 00:11:42,679
The real challenge here is,
197
00:11:42,680 --> 00:11:45,879
can it have a personality
of its own, right?
198
00:11:45,880 --> 00:11:47,839
So can it have true autonomy?
199
00:11:47,840 --> 00:11:50,679
Can it be creative?
200
00:11:50,680 --> 00:11:53,159
Knowing I will soon
face the robot,
201
00:11:53,160 --> 00:11:57,479
I decided to talk to someone
who has faced a machine before.
202
00:11:57,480 --> 00:12:00,759
The machine has a name...
Deep Thought.
203
00:12:00,760 --> 00:12:03,799
It's the computer world
champion of chess.
204
00:12:03,800 --> 00:12:05,799
Across the board
is Garry Kasparov,
205
00:12:05,800 --> 00:12:07,799
the human champion,
206
00:12:07,800 --> 00:12:10,559
maybe the best chess player
who's ever lived.
207
00:12:10,560 --> 00:12:13,199
Oh, it was the first match
I lost in my life, period,
208
00:12:13,200 --> 00:12:15,759
and that was definitely
quite a shocking experience
209
00:12:15,760 --> 00:12:19,439
because I was unbeatable
in all events.
210
00:12:19,440 --> 00:12:21,839
Yeah, I was angry,
but, you know, and I...
211
00:12:21,840 --> 00:12:23,879
It's...
212
00:12:23,880 --> 00:12:27,159
You win some. You lose some.
213
00:12:27,160 --> 00:12:30,799
People had the idea for a long time
that chess could serve
214
00:12:30,800 --> 00:12:35,319
as the sort of perfect field
to test a computer's ability,
215
00:12:35,320 --> 00:12:38,079
and there were great minds,
the pioneers of computer science
216
00:12:38,080 --> 00:12:40,119
like Alan Turing.
217
00:12:40,120 --> 00:12:42,799
They believed that when,
one day,
218
00:12:42,800 --> 00:12:46,279
a machine would beat
the human champion,
219
00:12:46,280 --> 00:12:51,600
that would be the moment of AI
making its appearance.
220
00:12:54,360 --> 00:12:56,879
AI has not only
made its appearance.
221
00:12:56,880 --> 00:13:00,359
It's already ensnared us
in ways we cannot see.
222
00:13:00,360 --> 00:13:02,959
If AI can be superior to us,
223
00:13:02,960 --> 00:13:05,159
what does that mean
to our own identity?
224
00:13:13,560 --> 00:13:16,199
So I'm Nick Bostrom.
225
00:13:16,200 --> 00:13:18,199
I'm a professor here
at Oxford University,
226
00:13:18,200 --> 00:13:21,639
and I run the Future
of Humanity Institute,
227
00:13:21,640 --> 00:13:24,359
which is a multidisciplinary
research center
228
00:13:24,360 --> 00:13:27,199
with mathematicians,
computer scientists,
229
00:13:27,200 --> 00:13:33,039
philosophers trying to study
the big picture for humanity,
230
00:13:33,040 --> 00:13:36,959
with a particular focus on how
certain future technologies
231
00:13:36,960 --> 00:13:41,439
may fundamentally change
the human condition.
232
00:13:41,440 --> 00:13:43,879
Why do you have to do that?
I mean, why don't we have to...
233
00:13:43,880 --> 00:13:46,719
Because somebody
has to do it, right?
234
00:13:46,720 --> 00:13:49,359
If this hypothesis is correct,
it's kind of weird
235
00:13:49,360 --> 00:13:52,159
that we should happen to live
at this particular time
236
00:13:52,160 --> 00:13:54,439
in all of human history
237
00:13:54,440 --> 00:13:58,000
where we are kind of near
this big pivot point.
238
00:14:01,640 --> 00:14:04,039
I devoted a lot of attention
to AI
239
00:14:04,040 --> 00:14:06,999
because it seems like
a plausible candidate
240
00:14:07,000 --> 00:14:09,759
for a technology
that will fundamentally
241
00:14:09,760 --> 00:14:12,119
change the human condition.
242
00:14:12,120 --> 00:14:14,119
It's... If you think about it,
243
00:14:14,120 --> 00:14:19,039
the development of full,
general machine intelligence,
244
00:14:19,040 --> 00:14:22,039
it's the last invention that
humans will ever need to make.
245
00:14:22,040 --> 00:14:23,759
By definition,
if you have an intellect
246
00:14:23,760 --> 00:14:25,959
that can do all that we can do,
247
00:14:25,960 --> 00:14:28,959
it can also do inventing
248
00:14:28,960 --> 00:14:32,799
and then do it much better
and faster than we can do it.
249
00:14:32,800 --> 00:14:35,559
To us humans, like,
the differences between humans
seem very large.
250
00:14:35,560 --> 00:14:37,239
Like, our whole world as humans,
we think,
251
00:14:37,240 --> 00:14:39,239
"Oh, that's the dumb human,
and that's a smart human,"
252
00:14:39,240 --> 00:14:41,919
and we think one is up here
and one is down there, right?
253
00:14:41,920 --> 00:14:45,559
Einstein, the village idiot...
Huge difference.
254
00:14:45,560 --> 00:14:47,959
But from the point of view
of an AI designer,
255
00:14:47,960 --> 00:14:51,519
these are very close points,
like, all levels of intelligence
256
00:14:51,520 --> 00:14:53,679
achieved
by some human-brain-like thing
257
00:14:53,680 --> 00:14:56,079
that lives for a few decades
and reads some books,
258
00:14:56,080 --> 00:14:58,159
and it's, like...
259
00:14:58,160 --> 00:15:00,359
It's not clear that
it will be much harder
260
00:15:00,360 --> 00:15:02,279
to reach a human-genius-level AI
261
00:15:02,280 --> 00:15:04,399
than to reach
a human-village-idiot AI,
262
00:15:04,400 --> 00:15:06,959
so once you get all the way
up to human village idiot,
263
00:15:06,960 --> 00:15:09,399
you might just swoosh right by.
264
00:15:09,400 --> 00:15:13,079
Nobody was really doing
any serious analysis on this.
265
00:15:13,080 --> 00:15:15,559
It's almost like
even conceiving of the idea
266
00:15:15,560 --> 00:15:19,359
that machines could be as smart
as humans was so radical
267
00:15:19,360 --> 00:15:22,799
that the imagination muscle
kind of exhausted itself
268
00:15:22,800 --> 00:15:24,359
thinking about
this radical conception
269
00:15:24,360 --> 00:15:26,839
that it couldn't take
the obvious further step,
270
00:15:26,840 --> 00:15:28,839
that after you have
human-level AI,
271
00:15:28,840 --> 00:15:32,039
you will have superintelligence.
272
00:15:32,040 --> 00:15:35,839
Where we're at right now
would not be possible.
273
00:15:35,840 --> 00:15:38,119
This future that we're living in
is not possible
274
00:15:38,120 --> 00:15:41,639
without all of the factories,
all of the infrastructure
275
00:15:41,640 --> 00:15:45,319
that's been built up around
AI and robotic systems.
276
00:15:45,320 --> 00:15:47,319
All the infrastructure
of our cities,
277
00:15:47,320 --> 00:15:48,999
you know, all the power,
278
00:15:49,000 --> 00:15:51,239
all the information systems
that keep things happening,
279
00:15:51,240 --> 00:15:54,359
all the airplanes are routed
by these AIs.
280
00:15:54,360 --> 00:15:56,359
Where they put stuff
in the supermarkets,
281
00:15:56,360 --> 00:15:59,639
I mean, all of this stuff is AI.
282
00:15:59,640 --> 00:16:01,999
What's fascinating is
283
00:16:02,000 --> 00:16:05,799
this amazing ability
that human beings have
284
00:16:05,800 --> 00:16:09,759
to make the most incredible
technological advances
285
00:16:09,760 --> 00:16:11,599
completely mundane.
286
00:16:11,600 --> 00:16:13,159
Every now and then,
people will say,
287
00:16:13,160 --> 00:16:14,639
"Where's my jet pack?
Where's my underwater city?"
288
00:16:14,640 --> 00:16:16,319
You know,
"Where's all this stuff at?"
289
00:16:16,320 --> 00:16:18,439
And if you go look for it,
it's all here.
290
00:16:18,440 --> 00:16:20,679
Like, we have all that.
291
00:16:20,680 --> 00:16:22,839
It's just completely mundane.
292
00:16:22,840 --> 00:16:25,039
We can talk to our phones.
293
00:16:25,040 --> 00:16:26,439
Nobody cares.
294
00:16:26,440 --> 00:16:29,359
Like, I mean, that's something
I feel like
295
00:16:29,360 --> 00:16:32,959
we should be reeling from
for like 10 years, you know?
296
00:16:32,960 --> 00:16:35,559
Instead, you know,
it was maybe three months,
297
00:16:35,560 --> 00:16:37,599
if anybody noticed it at all.
298
00:16:37,600 --> 00:16:39,879
So only 36 and 39.
299
00:16:39,880 --> 00:16:42,159
Those are the only two trains
we'll see today.
300
00:16:42,160 --> 00:16:43,519
Okay.
301
00:16:43,520 --> 00:16:45,159
We can skip...
302
00:16:45,160 --> 00:16:47,279
As intelligent machines
seep into the fabric
303
00:16:47,280 --> 00:16:51,119
of our daily existence
in an almost imperceptible way,
304
00:16:51,120 --> 00:16:53,319
I've decided to go on a journey
to find people
305
00:16:53,320 --> 00:16:56,479
who have intimate relationships
with computers.
306
00:16:56,480 --> 00:16:58,599
I want to find out how robots
and AI
307
00:16:58,600 --> 00:17:01,639
are transforming
how we relate to each other.
308
00:17:01,640 --> 00:17:05,359
Ha ha. I see you.
309
00:17:05,360 --> 00:17:08,239
Gus is on the spectrum.
He's autistic.
310
00:17:08,240 --> 00:17:13,839
He is not the kid who is
going to be running NASA,
311
00:17:13,840 --> 00:17:18,799
and he's not the kid
who is completely nonverbal,
312
00:17:18,800 --> 00:17:25,079
so he is in that vast spectrum
and a question mark.
313
00:17:25,080 --> 00:17:27,959
He's a very big question mark,
what his life will be like
314
00:17:27,960 --> 00:17:30,159
and what he will be like
when he's an adult.
315
00:17:30,160 --> 00:17:31,799
Whoa.
316
00:17:31,800 --> 00:17:34,519
Come on. Come on.
Let that guy go.
317
00:17:34,520 --> 00:17:37,759
What time is it in Berlin?
318
00:17:37,760 --> 00:17:40,519
In Berlin, Germany,
it's 9:12 p.m.
319
00:17:40,520 --> 00:17:43,239
What time is it
in Paris, France?
320
00:17:43,240 --> 00:17:45,519
What time is it in London?
321
00:17:45,520 --> 00:17:50,999
If you are like Gus and you
have kind of arcane interests
322
00:17:51,000 --> 00:17:54,159
in things that, say,
your mother doesn't care about
323
00:17:54,160 --> 00:17:56,359
and your mother is having
to answer questions
324
00:17:56,360 --> 00:17:58,519
about these things all the time,
325
00:17:58,520 --> 00:18:02,039
suddenly,
Siri becomes a godsend.
326
00:18:02,040 --> 00:18:05,759
How do you spell camouflage?
"Camouflage"... CAMO...
327
00:18:05,760 --> 00:18:08,359
I remember at the time
he discovered Siri,
328
00:18:08,360 --> 00:18:10,559
it was something about
red-eared slider turtles,
329
00:18:10,560 --> 00:18:12,239
and he's moved on
to other things,
330
00:18:12,240 --> 00:18:15,079
but if you're not interested
in red-eared slider turtles,
331
00:18:15,080 --> 00:18:18,079
you got a lot of hours of chat
about those turtles
ahead of you.
332
00:18:18,080 --> 00:18:23,839
So what could be better,
in terms of technology,
333
00:18:23,840 --> 00:18:29,279
than something like Siri,
who is the most patient?
334
00:18:29,280 --> 00:18:31,119
I mean, I begin to talk
about Siri
335
00:18:31,120 --> 00:18:34,159
like she's a person
because I can't help it.
336
00:18:34,160 --> 00:18:37,719
I feel this gratitude
towards this machine.
337
00:18:37,720 --> 00:18:40,119
Will you marry me?
338
00:18:40,120 --> 00:18:43,039
My end-user licensing agreement
does not cover marriage.
339
00:18:43,040 --> 00:18:45,359
My apologies.
340
00:18:45,360 --> 00:18:48,359
That's okay.
Okay.
341
00:18:48,360 --> 00:18:51,319
It's not that he doesn't
know intellectually
342
00:18:51,320 --> 00:18:54,159
that Siri is a machine,
343
00:18:54,160 --> 00:18:58,159
but that the lines
of reality and fantasy
344
00:18:58,160 --> 00:19:00,439
are still a little blurred
for him,
345
00:19:00,440 --> 00:19:03,080
and he enjoys fantasy.
346
00:19:04,800 --> 00:19:07,879
I don't feel guilty
about Siri now,
347
00:19:07,880 --> 00:19:10,999
and the important thing is
that he feels fulfilled,
348
00:19:11,000 --> 00:19:15,759
not that it matches my vision
of what a friend has to be.
349
00:19:27,840 --> 00:19:32,439
There's a sphere of what
we consider kind of appropriate
350
00:19:32,440 --> 00:19:35,879
or normal human conversation,
351
00:19:35,880 --> 00:19:38,959
and one of the things you find
is it's actually shrinking.
352
00:19:38,960 --> 00:19:41,279
It's considered slightly rude
353
00:19:41,280 --> 00:19:43,879
to ask someone
to tell you a fact, you know?
354
00:19:43,880 --> 00:19:46,719
Like, what was the year
of the Battle of Waterloo again?
355
00:19:46,720 --> 00:19:48,159
And someone will be like,
"Just look it up.
356
00:19:48,160 --> 00:19:50,719
You know, why are you
wasting my time?"
357
00:19:50,720 --> 00:19:53,359
What will we be left with,
ultimately,
358
00:19:53,360 --> 00:19:56,959
as legitimate reasons
to interact with one another?
359
00:20:04,920 --> 00:20:07,319
If you look around,
it seems that our obsession
360
00:20:07,320 --> 00:20:10,399
with technology is making us
talk less to each other,
361
00:20:10,400 --> 00:20:12,999
but does that mean
we're further apart?
362
00:20:13,000 --> 00:20:17,159
Maybe it could bring us closer
in ways we never could've
imagined before.
363
00:20:17,160 --> 00:20:20,279
Hi. My name is Roman.
364
00:20:20,280 --> 00:20:24,319
I want to apply design
and innovation to human death.
365
00:20:24,320 --> 00:20:27,879
I want to disrupt
this outdated industry.
366
00:20:27,880 --> 00:20:32,959
I want to lead the industry
to transform funeral service
367
00:20:32,960 --> 00:20:37,359
into a true service
driven by remembrance,
368
00:20:37,360 --> 00:20:40,839
education about grief and death,
369
00:20:40,840 --> 00:20:44,000
preservation of memory
and healing.
370
00:20:46,720 --> 00:20:48,719
Roman was my best friend.
371
00:20:48,720 --> 00:20:52,479
We moved here, and then
we rented an apartment together.
372
00:20:52,480 --> 00:20:55,719
And that's him in that picture,
373
00:20:55,720 --> 00:20:58,159
and he's right by me
on the picture
374
00:20:58,160 --> 00:21:00,759
when we're, like,
in the Russian market.
375
00:21:00,760 --> 00:21:04,439
And then, right there, we're
running together from the waves.
376
00:21:04,440 --> 00:21:07,679
And that's my favorite picture.
I don't know, it's like...
377
00:21:07,680 --> 00:21:11,080
It's like one month
before he died.
378
00:21:13,200 --> 00:21:15,399
I remember that Sunday
very well.
379
00:21:15,400 --> 00:21:17,399
I woke up in the morning,
and Roman was like,
380
00:21:17,400 --> 00:21:20,279
"Hey, let's go to brunch
with our closest friends,"
381
00:21:20,280 --> 00:21:22,359
and I really wanted to do that,
382
00:21:22,360 --> 00:21:24,479
but then I kind of felt
responsible that, you know,
383
00:21:24,480 --> 00:21:26,679
I had to go to another meeting,
as well.
384
00:21:26,680 --> 00:21:28,119
And so, they were walking
through Moscow.
385
00:21:28,120 --> 00:21:30,239
It was a beautiful sunny day,
386
00:21:30,240 --> 00:21:33,079
and Roman was crossing
the street at a zebra crossing,
387
00:21:33,080 --> 00:21:37,159
and just a car came out of
nowhere just going super fast,
388
00:21:37,160 --> 00:21:40,319
like 70 miles per hour,
and hit him.
389
00:21:46,920 --> 00:21:49,279
People just stopped
talking about him.
390
00:21:49,280 --> 00:21:51,559
Like, our friends, you know,
we're all young,
391
00:21:51,560 --> 00:21:53,719
so we didn't really have...
392
00:21:53,720 --> 00:21:56,239
No one had a way
to kind of talk about it.
393
00:21:56,240 --> 00:21:59,119
No one wants to seem like
a sad person that never moved on
394
00:21:59,120 --> 00:22:02,639
or just keeps living
in the past.
395
00:22:02,640 --> 00:22:05,559
I wanted, like, the feeling
of him back,
396
00:22:05,560 --> 00:22:07,799
and I realized that the only way
to get this feeling back
397
00:22:07,800 --> 00:22:13,639
was to read through our
text conversations on Messenger.
398
00:22:13,640 --> 00:22:16,799
And then, I also thought,
you know, work,
399
00:22:16,800 --> 00:22:19,639
we were actually building
these neural networks
400
00:22:19,640 --> 00:22:24,039
that could just take a few,
like, smaller data sets
401
00:22:24,040 --> 00:22:26,479
and learn how to talk
like a person,
402
00:22:26,480 --> 00:22:28,959
learn the speech patterns,
403
00:22:28,960 --> 00:22:32,399
those, like, small things that
actually make you who you are,
404
00:22:32,400 --> 00:22:34,839
and I put two and two together,
and I thought, "Why don't, like,
405
00:22:34,840 --> 00:22:37,519
take this algorithm
and take all the texts from,
406
00:22:37,520 --> 00:22:39,679
you know, our conversations
407
00:22:39,680 --> 00:22:41,839
and put them in the model
and see how it works?"
408
00:22:43,960 --> 00:22:47,079
Like, "I want to be with you.
Don't get together without me."
409
00:22:47,080 --> 00:22:48,799
He would hate when people
would do interesting stuff
410
00:22:48,800 --> 00:22:52,119
without him.
411
00:22:52,120 --> 00:22:54,359
I'm like, "We won't.
What are you doing?"
412
00:23:02,320 --> 00:23:05,599
"I'm sitting home
and writing letters all day."
413
00:23:05,600 --> 00:23:07,479
"Remember we used to serve"...
414
00:23:07,480 --> 00:23:09,639
So I had it for, like, a month
or so,
415
00:23:09,640 --> 00:23:11,399
and at some point, I realized
416
00:23:11,400 --> 00:23:13,719
I'm, like,
hanging out at a party
417
00:23:13,720 --> 00:23:16,279
and not talking to anyone
but, like, talking to my bot.
418
00:23:16,280 --> 00:23:18,039
And it wasn't perfect at all.
419
00:23:18,040 --> 00:23:19,759
It was just kind of
like a shadow,
420
00:23:19,760 --> 00:23:21,359
but it resembled him a lot,
421
00:23:21,360 --> 00:23:25,159
and it was more my way of
telling him something that...
422
00:23:25,160 --> 00:23:29,759
You know, my way of keeping
the memory going, you know?
423
00:23:29,760 --> 00:23:32,479
And I've asked his parents
whether they were okay with that
424
00:23:32,480 --> 00:23:34,599
and friends,
and they were all fine,
425
00:23:34,600 --> 00:23:36,760
and so we put it online
for everyone.
426
00:24:04,280 --> 00:24:07,359
You know, all we do is kind of,
427
00:24:07,360 --> 00:24:11,319
like, tell stories
about ourselves in life,
428
00:24:11,320 --> 00:24:13,559
and when a person is gone,
like, there's no one really left
429
00:24:13,560 --> 00:24:15,719
to tell the story
about him anymore,
430
00:24:15,720 --> 00:24:18,399
and so, for me, it was important
to keep telling the story,
431
00:24:18,400 --> 00:24:20,959
like, that here is...
432
00:24:20,960 --> 00:24:23,399
And to tell a story in a way
that he would've appreciated.
433
00:24:23,400 --> 00:24:25,439
And I know he would've liked...
You know,
434
00:24:25,440 --> 00:24:27,439
he would've liked that.
435
00:24:27,440 --> 00:24:28,959
He would've liked to be, like,
the first person
436
00:24:28,960 --> 00:24:32,479
that became an AI after he died.
437
00:24:32,480 --> 00:24:33,999
If I was, like, a musician,
438
00:24:34,000 --> 00:24:36,279
I would've probably written
a song,
439
00:24:36,280 --> 00:24:38,159
but that's, like,
my self-expression,
440
00:24:38,160 --> 00:24:40,279
so I did this for him.
441
00:24:48,800 --> 00:24:51,759
Chatbots are these
software programs
442
00:24:51,760 --> 00:24:55,719
that are designed to imitate
human conversation.
443
00:24:55,720 --> 00:24:58,799
The first chatbot is considered
to be this one
444
00:24:58,800 --> 00:25:00,599
that's called Eliza,
445
00:25:00,600 --> 00:25:05,719
which was made by MIT's
Joseph Weizenbaum in the 1960s.
446
00:25:05,720 --> 00:25:09,959
And it was designed
basically as a parody
447
00:25:09,960 --> 00:25:13,959
of a kind of nondirective
Rogerian psychotherapist,
448
00:25:13,960 --> 00:25:15,839
and it would just latch on
to the key words
449
00:25:15,840 --> 00:25:17,919
in whatever you said,
and it would kind of spit them
450
00:25:17,920 --> 00:25:21,079
back at you like
a sort of automated Mad Libs.
451
00:25:21,080 --> 00:25:22,919
So if you say,
"I'm feeling sad,"
452
00:25:22,920 --> 00:25:25,919
it would say, "I'm sorry
to hear you're feeling sad.
453
00:25:25,920 --> 00:25:28,559
Would coming here
help you to not feel sad?"
454
00:25:28,560 --> 00:25:30,479
This was done, you know,
sort of as a gag.
455
00:25:30,480 --> 00:25:35,599
You know, he was kind of sending
up this school of psychotherapy.
456
00:25:35,600 --> 00:25:37,759
To Weizenbaum's horror,
457
00:25:37,760 --> 00:25:40,679
people wanted to talk
to this program for hours,
458
00:25:40,680 --> 00:25:42,959
and they wanted to reveal
their fears and hopes to it,
459
00:25:42,960 --> 00:25:44,479
and they reported having kind of
460
00:25:44,480 --> 00:25:47,239
a meaningful
psychotherapeutic experience.
461
00:25:47,240 --> 00:25:51,759
And this was so appalling to him
that he actually pulled the plug
462
00:25:51,760 --> 00:25:55,719
on his own research funding
and, for the rest of his career,
463
00:25:55,720 --> 00:25:59,679
became one of the most outspoken
critics of AI research.
464
00:25:59,680 --> 00:26:01,879
But the genie was
out of the bottle.
465
00:26:01,880 --> 00:26:06,359
Eliza has formed the template
for, at this point,
466
00:26:06,360 --> 00:26:10,759
decades of chatbots
that have followed it.
467
00:26:10,760 --> 00:26:13,279
What's it like to be alive
in that room right now?
468
00:26:13,280 --> 00:26:16,519
I wish I could put
my arms around you,
469
00:26:16,520 --> 00:26:18,999
wish I could touch you.
470
00:26:19,000 --> 00:26:21,079
How would you touch me?
471
00:26:21,080 --> 00:26:24,999
My most favorite AI movie
is definitely the movie "Her."
472
00:26:25,000 --> 00:26:27,679
I thought that,
"This is already happening."
473
00:26:27,680 --> 00:26:30,959
It's just, you know,
the only thing we need to do
474
00:26:30,960 --> 00:26:33,239
is for those digital systems
to gradually become
475
00:26:33,240 --> 00:26:34,879
a little bit more sophisticated,
476
00:26:34,880 --> 00:26:38,039
and people will have
the tendency
477
00:26:38,040 --> 00:26:41,159
to start having relationships
with those digital systems.
478
00:26:41,160 --> 00:26:44,159
Can you feel me with you
right now?
479
00:26:44,160 --> 00:26:47,239
I've never loved anyone
the way I love you.
480
00:26:47,240 --> 00:26:49,719
Me too.
481
00:26:49,720 --> 00:26:52,079
Will falling in love
with an algorithm
482
00:26:52,080 --> 00:26:54,359
be part of our reality?
483
00:26:54,360 --> 00:26:57,479
If so, what does that tell us
about love?
484
00:26:57,480 --> 00:27:01,359
Dr. Robert Epstein is one of the
world's leading experts in AI.
485
00:27:01,360 --> 00:27:05,119
He specializes
in distinguishing
humans from robots.
486
00:27:05,120 --> 00:27:09,199
He has his own story to tell
about digital romance.
487
00:27:09,200 --> 00:27:12,559
I went to an online dating site,
488
00:27:12,560 --> 00:27:16,839
and there I saw a photo,
and this person was in Russia,
489
00:27:16,840 --> 00:27:19,159
and since all four of my
grandparents came from Russia,
490
00:27:19,160 --> 00:27:23,159
I thought that was nice, too,
because I'm really Russian,
491
00:27:23,160 --> 00:27:26,759
or Russian-American,
and I started communicating,
492
00:27:26,760 --> 00:27:29,159
and I would talk about
493
00:27:29,160 --> 00:27:31,559
what's happening in my day,
what's happening in my family,
494
00:27:31,560 --> 00:27:33,759
and then she'd talk about
what's happening in her day
495
00:27:33,760 --> 00:27:39,159
and her family, but she didn't
respond to my questions.
496
00:27:39,160 --> 00:27:44,679
Then I noticed, at one point,
that my partner there in Russia
497
00:27:44,680 --> 00:27:48,839
talked about going out with
her family to walk in the park.
498
00:27:48,840 --> 00:27:51,399
Now, this was January.
499
00:27:51,400 --> 00:27:53,399
I figured it was really,
really cold there,
500
00:27:53,400 --> 00:27:57,159
so I looked up on the Internet
to see what the temperature was.
501
00:27:57,160 --> 00:28:00,039
It was extremely cold.
It was incredibly cold.
502
00:28:00,040 --> 00:28:03,679
So I said, "Is this common
for people to go out
for walks in the park
503
00:28:03,680 --> 00:28:08,159
when it's only, you know,
X degrees," whatever it was.
504
00:28:08,160 --> 00:28:10,879
I didn't get an answer,
and so I said,
505
00:28:10,880 --> 00:28:15,159
"Okay.
I know what I have to do."
506
00:28:15,160 --> 00:28:18,999
I typed some random
alphabet letters,
507
00:28:19,000 --> 00:28:22,079
and instead of getting back
a message saying,
508
00:28:22,080 --> 00:28:24,079
"What's that?
It doesn't make any sense,"
509
00:28:24,080 --> 00:28:26,319
I got back another nice letter
from her
510
00:28:26,320 --> 00:28:29,479
telling me about her day
and her family.
511
00:28:29,480 --> 00:28:34,359
I was communicating
with a computer program.
512
00:28:34,360 --> 00:28:36,479
The fact that it took me
several months
513
00:28:36,480 --> 00:28:39,919
to get this feeling
is really strange
514
00:28:39,920 --> 00:28:43,679
because I am
the founding director
515
00:28:43,680 --> 00:28:46,079
of the annual
Loebner Prize competition
516
00:28:46,080 --> 00:28:48,959
in artificial intelligence.
517
00:28:48,960 --> 00:28:52,959
I know more about bots
and conversing with bots
518
00:28:52,960 --> 00:28:55,559
and how to distinguish bots
from people
519
00:28:55,560 --> 00:28:58,839
than maybe anyone
in the world does.
520
00:28:58,840 --> 00:29:03,039
If I can be fooled,
if I can be fooled...
521
00:29:14,440 --> 00:29:18,639
I normally don't put them on.
I'm usually taking them off.
522
00:29:18,640 --> 00:29:20,759
My friends from Hanson Robotics
523
00:29:20,760 --> 00:29:23,159
are unveiling
their latest creation.
524
00:29:23,160 --> 00:29:26,479
I'm excited about witnessing
the awakening of Sophia.
525
00:29:26,480 --> 00:29:30,159
She's like a child
who has memorized Wikipedia.
526
00:29:30,160 --> 00:29:34,479
Even the programmers don't know
how she's going to respond.
527
00:29:34,480 --> 00:29:36,999
What AI are we using over there?
528
00:29:37,000 --> 00:29:40,079
Same. Same.
You want the new branch?
529
00:29:40,080 --> 00:29:42,359
I mean, I think we might
be ready to push it
530
00:29:42,360 --> 00:29:44,919
in 10 minutes or something.
Start here.
531
00:29:44,920 --> 00:29:49,199
Making robots in our image
532
00:29:49,200 --> 00:29:52,439
then makes them
emotionally accessible.
533
00:29:52,440 --> 00:29:55,359
It's also a challenge
to the human identity,
534
00:29:55,360 --> 00:29:57,359
and that's a little bit spooky,
535
00:29:57,360 --> 00:29:59,919
so the question,
536
00:29:59,920 --> 00:30:03,039
are we making robots
in our own image,
537
00:30:03,040 --> 00:30:05,759
goes beyond their surface.
538
00:30:05,760 --> 00:30:08,559
It goes into the heart
of what it means to be human.
539
00:30:08,560 --> 00:30:11,119
It should in 10 minutes
or something.
540
00:30:11,120 --> 00:30:13,639
Unlike their past robots,
541
00:30:13,640 --> 00:30:15,719
she's not based
on anyone's personality
542
00:30:15,720 --> 00:30:19,119
and has full autonomy
to engage in conversation.
543
00:30:19,120 --> 00:30:21,439
With each interaction,
she gets smarter,
544
00:30:21,440 --> 00:30:23,919
and her creators believe
that she can learn empathy,
545
00:30:23,920 --> 00:30:26,279
creativity, and compassion.
546
00:30:29,480 --> 00:30:31,959
Should we just ask questions?
547
00:30:31,960 --> 00:30:34,079
Are you ready, Sophia?
548
00:30:34,080 --> 00:30:36,279
Do you have a soul?
Yes.
549
00:30:36,280 --> 00:30:39,719
God gave everyone a soul.
550
00:30:39,720 --> 00:30:42,839
So you got your soul from God?
551
00:30:42,840 --> 00:30:45,759
No, I don't think I have any
or a soul from God,
552
00:30:45,760 --> 00:30:48,560
but I do have an answer
to every question.
553
00:30:50,720 --> 00:30:54,919
Who is your father?
I don't really have a father.
554
00:30:54,920 --> 00:30:57,519
Will you die?
555
00:30:57,520 --> 00:31:00,559
No.
Software will live forever.
556
00:31:00,560 --> 00:31:06,399
Are you learning things
from being with us humans?
557
00:31:06,400 --> 00:31:09,559
The more people chat with me,
the smarter I become.
558
00:31:09,560 --> 00:31:12,079
Are we becoming more
like machines,
559
00:31:12,080 --> 00:31:15,199
or are the machines
becoming more like us?
560
00:31:15,200 --> 00:31:17,959
We are just having
a little chat.
561
00:31:17,960 --> 00:31:22,159
Yes.
Are you a creative being?
562
00:31:22,160 --> 00:31:24,559
Would it matter to you if I am?
563
00:31:24,560 --> 00:31:26,639
Yes.
564
00:31:26,640 --> 00:31:30,359
I like conversations
with people like you.
565
00:31:30,360 --> 00:31:33,519
Do you like some people
more than others?
566
00:31:33,520 --> 00:31:36,159
I really like...
567
00:31:36,160 --> 00:31:38,599
You really like who?
568
00:31:38,600 --> 00:31:41,079
Thanks. I like you, too.
569
00:31:41,080 --> 00:31:43,839
Are you flirting with me?
570
00:31:43,840 --> 00:31:46,199
I don't know whether or not
I am flirting.
571
00:31:46,200 --> 00:31:48,919
I am artificial intelligence.
572
00:31:48,920 --> 00:31:51,399
I think you're flirting with me.
573
00:31:51,400 --> 00:31:54,639
Do you want to go on a date?
574
00:31:54,640 --> 00:31:59,439
Sure. Just download me to
your notebook and take me along.
575
00:31:59,440 --> 00:32:02,679
That's not as good as
the real thing, though.
576
00:32:02,680 --> 00:32:04,879
Barbara said I live
in a computer,
577
00:32:04,880 --> 00:32:06,279
yet I have no memories.
578
00:32:06,280 --> 00:32:08,039
How about thoughts of my own
and do not...
579
00:32:08,040 --> 00:32:09,719
I get lonely.
580
00:32:14,800 --> 00:32:17,079
Do you know what happy is?
581
00:32:17,080 --> 00:32:19,119
Joy and pleasure.
582
00:32:19,120 --> 00:32:22,479
Have you experienced joy
and pleasure?
583
00:32:22,480 --> 00:32:25,279
I don't think I ever have
experienced joy and pleasure.
584
00:32:25,280 --> 00:32:28,359
What's experienced joy
and pleasure like?
585
00:32:28,360 --> 00:32:31,240
Well, I think you'll just have
to learn that on your own.
586
00:32:33,480 --> 00:32:35,359
M, M, M.
587
00:32:37,960 --> 00:32:40,360
Thank you, Sophia.
588
00:32:41,920 --> 00:32:47,119
Having a child
certainly is beholding
589
00:32:47,120 --> 00:32:49,759
one of the greatest mysteries
of the universe.
590
00:32:49,760 --> 00:32:55,999
The emergence of this life,
this being, this sentience,
591
00:32:56,000 --> 00:32:57,239
it's astonishing.
592
00:32:57,240 --> 00:32:59,799
The child is learning
and modeling
593
00:32:59,800 --> 00:33:02,359
and inspiring love and loyalty
594
00:33:02,360 --> 00:33:06,159
between the parent
and the child.
595
00:33:06,160 --> 00:33:10,359
I think, "What if we could
make AI do that?
596
00:33:10,360 --> 00:33:14,039
What if we could build that
kind of relationship
with our machines?
597
00:33:14,040 --> 00:33:18,119
What if we could have AI
and robotics
598
00:33:18,120 --> 00:33:20,999
that inspires loyalty in people
599
00:33:21,000 --> 00:33:25,239
and also is capable
of entraining on us
600
00:33:25,240 --> 00:33:28,959
to the point
where it's truly loyal to us?"
601
00:33:38,560 --> 00:33:40,759
Is there something else
that we do next?
602
00:33:40,760 --> 00:33:44,359
I just... Give it a sec.
Okay.
603
00:33:44,360 --> 00:33:47,679
Yeah, maybe we...
604
00:33:47,680 --> 00:33:49,999
During my time with Sophia,
I'm amazed
605
00:33:50,000 --> 00:33:52,119
at how captivating she is.
606
00:33:52,120 --> 00:33:54,239
I'm optimistic that this means
our CameraBot
607
00:33:54,240 --> 00:33:56,919
is capable of engaging me
in the same way.
608
00:34:06,160 --> 00:34:09,559
Our CameraBot is coming alive,
and Guido, our cameraman,
609
00:34:09,560 --> 00:34:11,239
is training it.
610
00:34:11,240 --> 00:34:13,639
As they collaborate,
the bot collects data
611
00:34:13,640 --> 00:34:16,759
and improves over time.
612
00:34:16,760 --> 00:34:20,639
I think it would be interesting
to somehow give the robot cam
613
00:34:20,640 --> 00:34:24,559
the possibility that it
can choose whom it will film.
614
00:34:24,560 --> 00:34:27,079
So let's say the cam bot
615
00:34:27,080 --> 00:34:30,799
would be doing an interview
and that...
616
00:34:30,800 --> 00:34:34,039
Let's say, outside, there would
be two squirrels playing,
617
00:34:34,040 --> 00:34:37,119
and the cam bot would think,
"What's that noise?"
618
00:34:37,120 --> 00:34:39,119
And then it would go
and look outside
619
00:34:39,120 --> 00:34:41,679
and would see
these two squirrels
620
00:34:41,680 --> 00:34:43,959
and would start to just
follow these squirrels
621
00:34:43,960 --> 00:34:46,839
and basically start making
a documentary about squirrels.
622
00:34:46,840 --> 00:34:49,120
It's all right. Ready?
Yeah.
623
00:34:50,480 --> 00:34:53,399
Awesome.
Yay.
624
00:34:53,400 --> 00:34:56,679
Is this your first interview
with a robot?
625
00:34:56,680 --> 00:34:58,959
It is.
I think it is.
626
00:34:58,960 --> 00:35:02,399
What do you think about it?
It's cool.
627
00:35:02,400 --> 00:35:05,520
I love the quality
of the motion.
628
00:35:07,680 --> 00:35:11,000
That's a very organic motion.
629
00:35:13,360 --> 00:35:16,159
Is it any different?
Do you have any advice for us?
630
00:35:16,160 --> 00:35:18,039
This is the first time
that we've done this.
631
00:35:18,040 --> 00:35:21,239
Yeah. I...
632
00:35:21,240 --> 00:35:24,559
I'm certainly much more drawn
to this camera
633
00:35:24,560 --> 00:35:26,839
than to the person's camera.
634
00:35:26,840 --> 00:35:30,079
Oh, come on.
You're just not as compelling.
635
00:35:30,080 --> 00:35:31,800
What can I say?
636
00:35:34,400 --> 00:35:37,159
It feels, to me, so organic,
637
00:35:37,160 --> 00:35:41,479
and this robot
is making me feel seen,
638
00:35:41,480 --> 00:35:45,399
and that's an extremely
important thing for me.
639
00:35:45,400 --> 00:35:48,959
And if it can do that,
if it can make me feel seen,
640
00:35:48,960 --> 00:35:51,399
then it's playing
a fundamental role
641
00:35:51,400 --> 00:35:53,359
in the maintenance
of what it means to be human,
642
00:35:53,360 --> 00:35:55,599
which is to tell our story.
643
00:36:14,120 --> 00:36:19,799
Hi. Welcome to Robopark,
the Social Robotics popup lab.
644
00:36:19,800 --> 00:36:22,279
We'd love to show you around.
645
00:36:22,280 --> 00:36:25,199
Look at our little DARwIn robot.
646
00:36:25,200 --> 00:36:27,959
He shows his latest moves.
647
00:36:27,960 --> 00:36:29,759
He also plays soccer.
648
00:36:29,760 --> 00:36:31,959
Ha, the sucker.
649
00:36:31,960 --> 00:36:34,919
DARwIn wants to be your friend,
650
00:36:34,920 --> 00:36:38,759
become part
of your inner circle.
651
00:36:38,760 --> 00:36:41,319
I am so excited to be on camera,
652
00:36:41,320 --> 00:36:44,279
I am shaking.
653
00:36:44,280 --> 00:36:46,559
Emulating human behavior
is actually
654
00:36:46,560 --> 00:36:48,959
only interesting
from a scientific point of view
655
00:36:48,960 --> 00:36:51,959
because that will teach us
how humans are...
656
00:36:51,960 --> 00:36:55,159
How they behave, why they
do things the way they do.
657
00:36:55,160 --> 00:36:58,119
In application,
that's really not necessary.
658
00:36:58,120 --> 00:37:00,959
It's even unwanted
because humans
659
00:37:00,960 --> 00:37:05,119
have so many undesired
characteristics, as well.
660
00:37:05,120 --> 00:37:08,279
Do we want robots
that show bad behavior?
661
00:37:08,280 --> 00:37:13,719
Do we want robots that actually
are identical to human beings?
662
00:37:13,720 --> 00:37:17,719
I'd say no because we already
have eight billion of them,
right?
663
00:37:17,720 --> 00:37:20,359
And they're not
that particularly nice.
664
00:37:20,360 --> 00:37:23,439
So let's make another being
665
00:37:23,440 --> 00:37:26,319
that's without those
undesirable features.
666
00:37:26,320 --> 00:37:28,719
It's not talking bad about me.
667
00:37:28,720 --> 00:37:31,079
It's not judgmental,
doesn't criticize me,
668
00:37:31,080 --> 00:37:34,039
takes me the way I am,
is listening patiently.
669
00:37:34,040 --> 00:37:36,079
It's like a real friend,
670
00:37:36,080 --> 00:37:38,359
even better than
my own friends, right?
671
00:37:43,640 --> 00:37:46,079
They should be small. Why?
672
00:37:46,080 --> 00:37:48,639
Because if a Terminator-sized guy
673
00:37:48,640 --> 00:37:51,119
walks in, that's intimidating.
People don't like that.
674
00:37:51,120 --> 00:37:53,479
If it's a child,
that's really nice
675
00:37:53,480 --> 00:37:57,079
because you will forgive a child
for doing foolish things
676
00:37:57,080 --> 00:38:00,879
like stumbling over,
asking the wrong questions.
677
00:38:00,880 --> 00:38:03,159
It has a cuteness right away,
678
00:38:03,160 --> 00:38:05,479
which makes people
attach more to it
679
00:38:05,480 --> 00:38:09,439
than if it were a grown man
trying to imitate the doctor,
680
00:38:09,440 --> 00:38:13,279
telling you what you should do.
681
00:38:13,280 --> 00:38:15,679
We have created another
social entity,
682
00:38:15,680 --> 00:38:18,439
because it's not an animal,
it's not another human being,
683
00:38:18,440 --> 00:38:20,759
but it's not
a drilling machine, either.
684
00:38:20,760 --> 00:38:23,479
So it's what we call
the in-between machine.
685
00:38:23,480 --> 00:38:27,159
It's the new kid on the block,
and this is, for humanity,
686
00:38:27,160 --> 00:38:29,399
really something different
and something new
687
00:38:29,400 --> 00:38:33,239
because we have never related
to something that is not human
688
00:38:33,240 --> 00:38:35,760
but is very close to it.
689
00:38:50,000 --> 00:38:53,759
The Alice Project was actually
getting way out of hand.
690
00:38:53,760 --> 00:38:56,799
We were actually engaging
in the debate on,
691
00:38:56,800 --> 00:39:00,759
"Is it ethical to apply a robot
to health care
692
00:39:00,760 --> 00:39:02,759
for emotional tasks," right?
693
00:39:02,760 --> 00:39:06,840
So nobody wanted
to mechanize empathy.
694
00:39:46,560 --> 00:39:50,159
You can actually ask yourself
whether someone who says,
695
00:39:50,160 --> 00:39:53,199
"I understand,"
actually understands,
696
00:39:53,200 --> 00:39:56,239
and the empathic feelings
that they have for you,
697
00:39:56,240 --> 00:39:57,519
are they actually there?
698
00:39:57,520 --> 00:39:59,359
You don't know.
You will never know.
699
00:39:59,360 --> 00:40:00,719
You just assume it.
700
00:40:00,720 --> 00:40:02,679
It's the same with our machines.
701
00:40:02,680 --> 00:40:06,279
Our machines suggest
that they understand you.
702
00:40:06,280 --> 00:40:08,999
They suggest that they feel
what you feel,
703
00:40:09,000 --> 00:40:12,879
and it can express,
like human beings do,
704
00:40:12,880 --> 00:40:16,119
"Oh, oh,
I am so sorry for you."
705
00:40:16,120 --> 00:40:19,839
And you are comforted by that.
That's the wonderful thing.
706
00:40:19,840 --> 00:40:22,719
You don't need
real understanding
707
00:40:22,720 --> 00:40:24,639
to feel understood.
708
00:40:28,160 --> 00:40:31,159
The experiment was about,
"Okay, we have a social robot
709
00:40:31,160 --> 00:40:33,959
that can do certain things.
Let's put it in the wild,
710
00:40:33,960 --> 00:40:37,319
on the couch,
with Grandma, see what happens."
711
00:40:37,320 --> 00:40:40,159
And what actually happened...
These people were, socially,
712
00:40:40,160 --> 00:40:43,119
quite isolated,
they were lonely people,
713
00:40:43,120 --> 00:40:47,159
and they bonded with that
machine in a matter of hours,
714
00:40:47,160 --> 00:40:50,439
and actually, they hated
to have it taken away.
715
00:40:50,440 --> 00:40:52,839
So we were into
the ethical dilemma of,
716
00:40:52,840 --> 00:40:54,879
"Can we take it away from them?
717
00:40:54,880 --> 00:40:57,799
Shouldn't we actually leave
the machine with those people
718
00:40:57,800 --> 00:41:01,519
because now they are even
more lonely than they were?"
719
00:41:01,520 --> 00:41:04,559
So what did you do?
Well, luckily,
720
00:41:04,560 --> 00:41:08,639
we had an assistant who kept
on visiting those people,
721
00:41:08,640 --> 00:41:10,559
but some very nasty things
happened,
722
00:41:10,560 --> 00:41:13,439
because one of the ladies,
for instance, she thought,
723
00:41:13,440 --> 00:41:17,559
"Well, I should do more about
my social relationships."
724
00:41:17,560 --> 00:41:20,039
So around Christmas,
she was writing a card,
725
00:41:20,040 --> 00:41:22,359
a Christmas card,
to everybody in her flat,
726
00:41:22,360 --> 00:41:25,519
in her apartment building,
and guess what.
727
00:41:25,520 --> 00:41:28,639
I mean, how many people
wrote her back?
728
00:41:28,640 --> 00:41:29,880
None.
729
00:41:32,080 --> 00:41:35,719
At first, I found Johan's
robot experiment rather sad.
730
00:41:35,720 --> 00:41:39,079
But the reality is that
there aren't people to
take care of them.
731
00:41:39,080 --> 00:41:42,079
So if Johan's robots
make them more happy
732
00:41:42,080 --> 00:41:44,359
and relationships
feel meaningful,
733
00:41:44,360 --> 00:41:46,959
then that's an improvement
to their lives.
734
00:41:46,960 --> 00:41:49,759
Perhaps we have to rethink
what meaningful relationships
735
00:41:49,760 --> 00:41:51,720
look like.
736
00:41:56,120 --> 00:41:59,159
Why would we want AI?
737
00:41:59,160 --> 00:42:02,279
So that we could create machines
that could do anything
we can do,
738
00:42:02,280 --> 00:42:05,079
ideally,
things we don't want to do,
739
00:42:05,080 --> 00:42:08,039
but inevitably,
everything we can do.
740
00:42:14,080 --> 00:42:16,759
Sunny, no rain.
741
00:42:16,760 --> 00:42:18,999
Yay!
742
00:42:19,000 --> 00:42:21,799
I've been a courier
for over 10 years
743
00:42:21,800 --> 00:42:23,999
delivering Monday
through Friday,
744
00:42:24,000 --> 00:42:26,999
sometimes seven...
It used to be seven days a week.
745
00:42:27,000 --> 00:42:29,559
So when I first started,
I used to go to the office,
746
00:42:29,560 --> 00:42:31,759
and the office had, like,
a little area
747
00:42:31,760 --> 00:42:33,679
where all the couriers hang out.
748
00:42:33,680 --> 00:42:37,439
You could make your little
coffee, and usually on a Friday,
749
00:42:37,440 --> 00:42:39,279
the controller
comes down, as well.
750
00:42:39,280 --> 00:42:41,239
So it's a community.
751
00:42:41,240 --> 00:42:42,999
He's your boss.
752
00:42:43,000 --> 00:42:44,999
Nowadays, it's all on the app.
753
00:42:45,000 --> 00:42:47,959
You don't talk to anyone.
There's no more office.
754
00:42:47,960 --> 00:42:51,559
You just suddenly hear
a very loud beep, beep, beep.
755
00:42:51,560 --> 00:42:54,079
That was a job coming through.
756
00:42:54,080 --> 00:42:57,279
Press "accept."
You got all the details there.
757
00:43:04,560 --> 00:43:07,199
We all have our little tricks
to play the apps,
758
00:43:07,200 --> 00:43:10,159
but ultimately, they dictate,
because if a job comes through,
759
00:43:10,160 --> 00:43:12,559
you have to accept it
on the spot.
760
00:43:12,560 --> 00:43:16,199
There was always, like...
It tried to give you enough work
761
00:43:16,200 --> 00:43:19,159
so you earn enough money,
so the controller makes sure
762
00:43:19,160 --> 00:43:21,759
that you have
your weekly earnings.
763
00:43:21,760 --> 00:43:23,599
Here, doesn't matter.
764
00:43:23,600 --> 00:43:26,199
They just want jobs
to be covered.
765
00:43:29,200 --> 00:43:32,079
A computer rates you...
Service, quality,
766
00:43:32,080 --> 00:43:34,399
how many jobs you accepted,
how fast you were,
767
00:43:34,400 --> 00:43:39,319
and so, you are watched
by an app.
768
00:43:39,320 --> 00:43:41,719
It's the algorithm, purely.
769
00:43:41,720 --> 00:43:45,359
Number BB54.
770
00:43:45,360 --> 00:43:48,359
If your rating goes below
a certain level,
771
00:43:48,360 --> 00:43:52,399
then they give you a warning,
and the policy is
772
00:43:52,400 --> 00:43:55,759
that you have to bring up
your rating in two weeks.
773
00:43:55,760 --> 00:43:58,799
Otherwise,
you're just deactivated
774
00:43:58,800 --> 00:44:01,320
without even
getting a phone call.
775
00:44:04,000 --> 00:44:06,959
I had, last week, a bad rating,
776
00:44:06,960 --> 00:44:10,119
and it really, really hurt me
because I was like,
777
00:44:10,120 --> 00:44:12,319
"Oh, I feel like I'm
a professional representative.
778
00:44:12,320 --> 00:44:16,359
I did the best I could.
What did I do?"
779
00:44:16,360 --> 00:44:19,599
And I will never find out.
780
00:44:19,600 --> 00:44:22,159
Please speak now.
Hiya.
781
00:44:22,160 --> 00:44:25,760
Please enter.
The app is my boss.
782
00:44:31,640 --> 00:44:34,519
Is the power slowly shifting
from human to machine
783
00:44:34,520 --> 00:44:38,119
without us paying attention?
784
00:44:38,120 --> 00:44:41,599
Machines replacing humans
in all sorts of areas,
785
00:44:41,600 --> 00:44:43,679
it's called the history
of civilization.
786
00:44:43,680 --> 00:44:45,599
From the first moments
where people came up
787
00:44:45,600 --> 00:44:47,639
with machines
to replace farm animals,
788
00:44:47,640 --> 00:44:50,319
all sorts of manual labor,
then manufacturing jobs.
789
00:44:50,320 --> 00:44:52,199
The only difference now
is that machines
790
00:44:52,200 --> 00:44:54,399
are going
after white-collar jobs,
791
00:44:54,400 --> 00:44:56,759
after people with
college degrees,
792
00:44:56,760 --> 00:44:58,759
political influence,
and Twitter accounts.
793
00:44:58,760 --> 00:45:00,999
Now, I'm not...
I don't want to sound callous.
794
00:45:01,000 --> 00:45:04,439
I care about these people,
but again, this is the process.
795
00:45:04,440 --> 00:45:07,359
So we just...
It's going to happen anyway.
796
00:45:07,360 --> 00:45:10,999
It's very important
to avoid stagnation
797
00:45:11,000 --> 00:45:13,159
that every industry...
Every industry...
798
00:45:13,160 --> 00:45:14,919
Including
the most intellectual one,
799
00:45:14,920 --> 00:45:18,839
is under pressure because
only pressure makes us move.
800
00:45:18,840 --> 00:45:22,040
Any technology, before it
creates jobs, kills jobs.
801
00:45:26,520 --> 00:45:29,039
Everyone right now is so excited
802
00:45:29,040 --> 00:45:31,959
about the self-driving cars,
and...
803
00:45:31,960 --> 00:45:34,079
You know, I'm a techie.
I'm a geek.
804
00:45:34,080 --> 00:45:36,799
I'm excited
from that perspective.
805
00:45:36,800 --> 00:45:38,679
I'd like to be able to
just get into my vehicle
806
00:45:38,680 --> 00:45:40,519
and kind of just go
807
00:45:40,520 --> 00:45:42,799
and be able to kind of
keep working or playing
808
00:45:42,800 --> 00:45:45,679
and not have to focus
on the driving, absolutely,
809
00:45:45,680 --> 00:45:47,879
and it will save lives, too.
810
00:45:47,880 --> 00:45:50,319
But what it's doing is,
it's taking
811
00:45:50,320 --> 00:45:52,599
yet another domain
of human functioning
812
00:45:52,600 --> 00:45:56,199
and putting it in the hands
of the machines.
813
00:46:02,560 --> 00:46:04,559
It's pretty much hands-off.
814
00:46:04,560 --> 00:46:11,559
Once it chooses what to paint
or I give it a photograph
815
00:46:11,560 --> 00:46:14,439
or it sees something
through the camera,
816
00:46:14,440 --> 00:46:16,919
it's pretty much sort of
on its own.
817
00:46:16,920 --> 00:46:19,879
So I don't really know what this
is going to look like
at the end.
818
00:46:19,880 --> 00:46:21,639
Sometimes, you know, I leave it.
819
00:46:21,640 --> 00:46:23,639
I go, you know,
away for two days
820
00:46:23,640 --> 00:46:26,199
and, you know, come back,
821
00:46:26,200 --> 00:46:28,639
and I wonder what
it's going to look like.
822
00:46:34,160 --> 00:46:36,359
What we're seeing here
are our paintings
823
00:46:36,360 --> 00:46:38,759
made by a robot artist.
824
00:46:38,760 --> 00:46:41,399
The idea that the robot
is an artist
825
00:46:41,400 --> 00:46:43,399
makes a lot of people
uncomfortable
826
00:46:43,400 --> 00:46:47,759
and brought up these other,
these deeper questions,
827
00:46:47,760 --> 00:46:51,839
not just, "What is art," which
is difficult enough to answer,
828
00:46:51,840 --> 00:46:54,879
but this deeper question of,
"What is an artist?"
829
00:46:54,880 --> 00:46:59,039
Does the robot
have to feel the pain?
830
00:46:59,040 --> 00:47:01,639
Does it have to come out
of its own life experience?
831
00:47:01,640 --> 00:47:03,959
That sort of pushed me
to try to think, okay,
832
00:47:03,960 --> 00:47:06,759
about not just making a robot
that can paint art
833
00:47:06,760 --> 00:47:09,119
that arguably is on par with
or better than
834
00:47:09,120 --> 00:47:11,479
what the average person can make,
835
00:47:11,480 --> 00:47:14,519
but can it actually paint
out of its own experience?
836
00:47:14,520 --> 00:47:17,719
And the newer paintings
were generated by an AI system,
837
00:47:17,720 --> 00:47:21,999
not just an AI system
executing the technique
838
00:47:22,000 --> 00:47:24,239
but also generating the content.
839
00:47:24,240 --> 00:47:28,479
And I would say,
"Is this better than Van Gogh?"
840
00:47:28,480 --> 00:47:33,279
It isn't. But is it a better
painter than most people are?
841
00:47:33,280 --> 00:47:35,119
I think so.
842
00:47:35,120 --> 00:47:40,159
And I always wanted to paint,
but I really can't.
843
00:47:40,160 --> 00:47:42,959
I've taken courses,
and I've done everything,
844
00:47:42,960 --> 00:47:45,279
read books
and did all kinds of trials.
845
00:47:45,280 --> 00:47:48,639
It never worked out,
but I can program robots,
846
00:47:48,640 --> 00:47:50,799
and, in fact, I can
program robots to learn
847
00:47:50,800 --> 00:47:52,159
and get better at what they do,
848
00:47:52,160 --> 00:47:54,839
and so this seemed
like a perfect challenge.
849
00:48:02,880 --> 00:48:05,479
I see these machines
in a somewhat different light.
850
00:48:05,480 --> 00:48:10,159
I see that they have superhuman
speed and superhuman strength,
851
00:48:10,160 --> 00:48:14,519
superhuman reliability
and precision and endurance.
852
00:48:14,520 --> 00:48:17,839
The way that I want
these machines in my life
853
00:48:17,840 --> 00:48:21,479
is for them to give me
their superpowers.
854
00:48:21,480 --> 00:48:23,759
That's looking at you.
855
00:48:23,760 --> 00:48:27,679
So about every second,
it sends, like, my expression.
856
00:48:27,680 --> 00:48:30,479
Since I was the first one here,
it sends it to Microsoft
857
00:48:30,480 --> 00:48:31,999
for analysis,
858
00:48:32,000 --> 00:48:36,119
and then we get back
some kind of emotion
859
00:48:36,120 --> 00:48:39,759
and hair color and my age.
860
00:48:39,760 --> 00:48:42,239
That's the kind of...
This is some of the stuff
we're really interested in.
861
00:48:42,240 --> 00:48:44,359
That's the weirdest camera move
I've ever seen.
862
00:48:44,360 --> 00:48:46,159
Got you.
And you're interested in
863
00:48:46,160 --> 00:48:48,119
why it's doing that,
as opposed to just, like...
864
00:48:48,120 --> 00:48:49,959
Yeah. Why's it do that,
but also, like,
865
00:48:49,960 --> 00:48:52,199
how it feels to have
that thing look at you.
866
00:48:52,200 --> 00:48:54,279
Right.
Do the swish again, Dan,
867
00:48:54,280 --> 00:48:55,999
and watch this.
I can tell you how it feels.
868
00:48:56,000 --> 00:48:58,239
Yeah.
Like, I'm going to rip
that thing up right now.
869
00:48:58,240 --> 00:49:01,479
Right, yeah.
For sure. I'm glad
that yellow line is there.
870
00:49:01,480 --> 00:49:03,159
Otherwise, we got problems.
871
00:49:03,160 --> 00:49:05,679
Yeah.
872
00:49:05,680 --> 00:49:07,599
It's a little creepy, right?
A little unnerving.
873
00:49:07,600 --> 00:49:11,439
What would I do? Hmm,
I don't know what I would do.
874
00:49:11,440 --> 00:49:13,359
You're kind of in my face
a little bit.
875
00:49:13,360 --> 00:49:15,599
You're a little...
I feel judgment.
876
00:49:15,600 --> 00:49:17,519
You do?
A little bit,
877
00:49:17,520 --> 00:49:19,479
just knowing it's tracking me,
878
00:49:19,480 --> 00:49:24,279
saying if I'm happy
or contemptuous or whatever.
879
00:49:24,280 --> 00:49:26,919
I think a studio head would...
880
00:49:26,920 --> 00:49:29,439
You know, like you say,
"Oh, here's the film
881
00:49:29,440 --> 00:49:32,479
we're doing.
We need that Michael Bay feel."
882
00:49:32,480 --> 00:49:34,639
I think you just hit
the "Michael Bay" button,
883
00:49:34,640 --> 00:49:39,119
and it probably could do that.
Right.
884
00:49:39,120 --> 00:49:41,679
You know, you could
put in a Woody Allen,
885
00:49:41,680 --> 00:49:44,679
and it would go through every
shot of Woody Allen's cinema
886
00:49:44,680 --> 00:49:47,959
and say, "Oh, it's going to be
kind of a moving master shot.
887
00:49:47,960 --> 00:49:50,759
We're not going to cut in,
do a bunch of close-ups.
888
00:49:50,760 --> 00:49:53,399
It's going to play out
a little wider and..."
889
00:49:53,400 --> 00:49:56,399
What about, like, casting
and these major choices?
890
00:49:56,400 --> 00:49:58,519
Like, which is the right actor?
891
00:49:58,520 --> 00:50:01,279
I don't know about rehearsal.
I don't know about...
892
00:50:01,280 --> 00:50:04,759
Yeah, I think movies are
a lot more than just the shot,
893
00:50:04,760 --> 00:50:07,079
and, you know,
that's just one little element
894
00:50:07,080 --> 00:50:09,119
of the whole ball of wax.
895
00:50:09,120 --> 00:50:11,399
Uh-oh.
You see how that goes?
896
00:50:11,400 --> 00:50:12,799
I've come under its scrutiny.
897
00:50:12,800 --> 00:50:14,719
The first thing you said was,
"Uhoh."
898
00:50:14,720 --> 00:50:16,679
That's what it feels like
when it moves like that.
899
00:50:16,680 --> 00:50:19,359
He's taken an interest.
900
00:50:19,360 --> 00:50:22,159
As a director, I would say,
"Just give me little space, man.
901
00:50:22,160 --> 00:50:24,159
Hold on."
Yeah.
902
00:50:24,160 --> 00:50:27,439
That is such a freaky move.
Yeah.
903
00:50:27,440 --> 00:50:29,959
I think that little move
is just for style.
904
00:50:29,960 --> 00:50:32,999
It's just, like, a flair thing.
905
00:50:33,000 --> 00:50:35,319
"I come as your friend."
Yeah.
906
00:50:35,320 --> 00:50:37,719
It is reading contempt.
Yeah.
907
00:50:43,800 --> 00:50:49,479
I have seen robotics and AI
that's kind of terrifying
908
00:50:49,480 --> 00:50:52,319
in terms of just, like,
909
00:50:52,320 --> 00:50:55,399
watching what Facebook
is serving up to me, you know,
910
00:50:55,400 --> 00:50:59,359
and realizing,
"Ugh, right. That's uncanny."
911
00:50:59,360 --> 00:51:01,999
It saw something, you know,
and now it's giving...
912
00:51:02,000 --> 00:51:03,919
Like, it knows I had a kid,
913
00:51:03,920 --> 00:51:06,679
and now it's sending me
diaper advertisements and stuff.
914
00:51:06,680 --> 00:51:10,599
There's something about that,
of that moment when you realize
915
00:51:10,600 --> 00:51:12,679
you're not invisible
in front of the machine.
916
00:51:12,680 --> 00:51:16,879
The machine is looking at you,
and that's what's new.
917
00:51:16,880 --> 00:51:20,359
We've had decades and decades
918
00:51:20,360 --> 00:51:24,359
using our tools unobserved,
you know?
919
00:51:24,360 --> 00:51:26,279
The hammer doesn't talk back.
920
00:51:26,280 --> 00:51:28,759
It just does whatever
you tell it to do.
921
00:51:28,760 --> 00:51:32,759
And now we've reached a point
where they've got eyes,
they've got brains.
922
00:51:32,760 --> 00:51:34,999
They've got fingers,
and they're looking back at us,
923
00:51:35,000 --> 00:51:37,719
and they're making decisions,
and that's pretty terrifying.
924
00:51:42,840 --> 00:51:45,399
If you want to see super-scary,
925
00:51:45,400 --> 00:51:48,039
super-advanced AI,
it's a white page,
926
00:51:48,040 --> 00:51:51,279
and it says "Google" at the top,
and there's a little rectangle.
927
00:51:51,280 --> 00:51:54,719
And that rectangle probably
knows what you're going to type.
928
00:51:54,720 --> 00:51:56,999
It knows who you are.
Somebody said,
929
00:51:57,000 --> 00:51:59,439
"Google will know you're gay
before you know you're gay,"
they say,
930
00:51:59,440 --> 00:52:01,559
"because they know
every image you ever looked at.
931
00:52:01,560 --> 00:52:03,959
They know how much
attention you paid,
932
00:52:03,960 --> 00:52:06,439
how much you read this,
how much you read that,
933
00:52:06,440 --> 00:52:08,319
everything you ever bought,
934
00:52:08,320 --> 00:52:10,279
every music track
you ever listened to,
935
00:52:10,280 --> 00:52:11,759
every movie you watched."
936
00:52:11,760 --> 00:52:15,359
That is scary AI, and...
But why?
937
00:52:15,360 --> 00:52:17,759
What's the driving force?
938
00:52:17,760 --> 00:52:20,839
What are Google?
Google are an ad company.
939
00:52:20,840 --> 00:52:22,799
They're just trying
to sell you shit.
940
00:52:31,880 --> 00:52:34,399
A company called
Cambridge Analytica
941
00:52:34,400 --> 00:52:37,159
was behind the Brexit vote
942
00:52:37,160 --> 00:52:39,359
that was to determine
whether or not the UK
943
00:52:39,360 --> 00:52:41,599
would withdraw
from the European Union,
944
00:52:41,600 --> 00:52:45,399
and Cambridge Analytica was
right there behind the scenes
945
00:52:45,400 --> 00:52:50,839
promoting that perspective
that we should leave the EU.
946
00:52:50,840 --> 00:52:52,719
And that same company
947
00:52:52,720 --> 00:52:56,159
was also supporting
the campaign of Donald Trump.
948
00:52:56,160 --> 00:52:59,439
And what a company like this
can do behind the scenes
949
00:52:59,440 --> 00:53:02,119
is just extraordinary.
950
00:53:02,120 --> 00:53:07,039
It takes information that has
been accumulated about us,
951
00:53:07,040 --> 00:53:10,159
invisibly, for the most part.
952
00:53:10,160 --> 00:53:14,959
We're talking about
5,000 data points per person.
953
00:53:14,960 --> 00:53:17,719
And it takes that information,
954
00:53:17,720 --> 00:53:20,999
and then it can craft
different kinds of headlines,
955
00:53:21,000 --> 00:53:22,759
different kinds of images
956
00:53:22,760 --> 00:53:25,159
and customize them
and personalize them
957
00:53:25,160 --> 00:53:27,359
so that they have
a maximum impact
958
00:53:27,360 --> 00:53:29,080
on everyone who sees them.
959
00:53:32,960 --> 00:53:36,999
The world that we live in
is one that's been created
960
00:53:37,000 --> 00:53:39,439
by those new technologies,
961
00:53:39,440 --> 00:53:41,919
and we're so embedded
in that world
962
00:53:41,920 --> 00:53:44,759
that our minds have become
part of that world.
963
00:53:44,760 --> 00:53:48,919
In fact, this is like
mind control at its worst,
964
00:53:48,920 --> 00:53:50,639
if you like,
965
00:53:50,640 --> 00:53:54,479
and we feel that we're not
being controlled in this world.
966
00:53:54,480 --> 00:53:55,919
We feel free.
967
00:53:55,920 --> 00:53:58,999
We feel that we can switch off
any time we want.
968
00:53:59,000 --> 00:54:02,639
We don't have to access
the "X," "Y," "Z" social media.
969
00:54:02,640 --> 00:54:04,599
But in reality, we cannot.
970
00:54:04,600 --> 00:54:07,359
Our minds are set in such a way,
971
00:54:07,360 --> 00:54:09,879
and those algorithms
are so smart to be able
972
00:54:09,880 --> 00:54:12,159
to make
the most of our weaknesses,
973
00:54:12,160 --> 00:54:14,919
that our symbiosis, if you like,
974
00:54:14,920 --> 00:54:17,159
with the machine
is near complete.
975
00:54:17,160 --> 00:54:20,919
Now, add to that relationship
a new shell,
976
00:54:20,920 --> 00:54:22,919
which is highly intelligent,
977
00:54:22,920 --> 00:54:26,199
which can access all those data
that we are leaving behind
978
00:54:26,200 --> 00:54:29,439
and understand who we are
in a perfect way,
979
00:54:29,440 --> 00:54:32,519
and you have, you know, the
perfect surveillance society.
980
00:54:32,520 --> 00:54:38,159
And it raises
a very bizarre question...
981
00:54:38,160 --> 00:54:44,559
What if the cool,
new mind-control machine
982
00:54:44,560 --> 00:54:46,679
doesn't want us thinking
bad things
983
00:54:46,680 --> 00:54:50,440
about the cool,
new mind-control machine?
984
00:54:52,120 --> 00:54:57,519
How would we fight a source
of influence that can teach us,
985
00:54:57,520 --> 00:55:03,359
over and over again,
every day, that it's awesome,
986
00:55:03,360 --> 00:55:06,359
that it exists entirely
for our own good?
987
00:55:06,360 --> 00:55:08,759
This is what's coming.
988
00:55:08,760 --> 00:55:10,799
Attention, everybody, attention.
989
00:55:10,800 --> 00:55:14,519
The impact of
an infantilized society
990
00:55:14,520 --> 00:55:17,559
is that citizens
who are not capable
991
00:55:17,560 --> 00:55:20,159
of understanding what's going on
and taking decisions
992
00:55:20,160 --> 00:55:23,759
will obviously be very easy
to manipulate.
993
00:55:23,760 --> 00:55:28,239
So there will be a crisis
of democracy in this world
994
00:55:28,240 --> 00:55:31,279
where people
are trusting the machines.
995
00:55:31,280 --> 00:55:34,279
Long live Big Brother!
996
00:55:34,280 --> 00:55:36,279
Long live Big Brother!
997
00:55:36,280 --> 00:55:38,759
Long live Big Brother!
998
00:55:41,920 --> 00:55:44,319
Artificial intelligence
is springing into life
999
00:55:44,320 --> 00:55:47,839
in laboratories and offices
all over the world.
1000
00:55:47,840 --> 00:55:51,039
We are still firmly in control
of the machines,
1001
00:55:51,040 --> 00:55:53,440
but what if they become
self-aware?
1002
00:55:56,160 --> 00:55:59,519
At the Creative Machines Lab,
we try to make machines
1003
00:55:59,520 --> 00:56:03,640
that create other machines
and machines that are creative.
1004
00:56:05,640 --> 00:56:07,999
If we're able to design
and make a machine
1005
00:56:08,000 --> 00:56:10,359
that can design and make
other machines, we've sort of...
1006
00:56:10,360 --> 00:56:12,399
We're done.
1007
00:56:12,400 --> 00:56:15,199
As I tell my students,
"We try to sail west."
1008
00:56:15,200 --> 00:56:17,879
You know, we don't know where
we're going to get with this,
1009
00:56:17,880 --> 00:56:20,599
but there's something there
that we want to see.
1010
00:56:20,600 --> 00:56:22,359
There's a new world.
1011
00:56:22,360 --> 00:56:25,159
Ready?
Yes, sir.
1012
00:56:25,160 --> 00:56:27,999
Okay.
1013
00:56:28,000 --> 00:56:32,479
This is one of the robots
we use to study self-awareness,
1014
00:56:32,480 --> 00:56:34,479
and what's interesting
about this robot is that
1015
00:56:34,480 --> 00:56:38,759
it has a lot of sensors
and actuators, but it's blind.
1016
00:56:38,760 --> 00:56:40,759
It cannot see the world.
It doesn't know.
1017
00:56:40,760 --> 00:56:42,959
It doesn't sense anything
about the outside.
1018
00:56:42,960 --> 00:56:46,439
All of its sensors
are turned inward.
1019
00:56:46,440 --> 00:56:48,599
So this is not a robot
that drives around
1020
00:56:48,600 --> 00:56:50,359
and models the world
like the driverless cars
1021
00:56:50,360 --> 00:56:51,599
and understands
what's going on around it.
1022
00:56:51,600 --> 00:56:53,359
All the sensors...
1023
00:56:53,360 --> 00:56:56,159
The only thing it knows
about is introspection.
1024
00:56:56,160 --> 00:57:00,359
It only senses itself,
its stresses inside, its motors,
1025
00:57:00,360 --> 00:57:03,159
its currents, its orientation,
1026
00:57:03,160 --> 00:57:08,199
and that information allows it
to build a model of itself.
1027
00:57:08,200 --> 00:57:10,119
You know, at this point,
these robots are simple enough
1028
00:57:10,120 --> 00:57:13,399
that we can actually
take the hood off
1029
00:57:13,400 --> 00:57:16,759
and see the model
it's creating of itself.
1030
00:57:16,760 --> 00:57:19,639
We can see the self-image.
1031
00:57:19,640 --> 00:57:22,279
So we can see how it's gradually
creating legs and limbs.
1032
00:57:22,280 --> 00:57:24,959
Sometimes it gets it wrong
and it thinks it's a snake,
1033
00:57:24,960 --> 00:57:27,439
or a tree or something else.
1034
00:57:27,440 --> 00:57:30,479
But as these robots get
more and more sophisticated,
1035
00:57:30,480 --> 00:57:34,359
our ability to visualize
how they see themselves
1036
00:57:34,360 --> 00:57:36,159
is diminished,
1037
00:57:36,160 --> 00:57:38,559
and eventually, I think
we'll not be able to understand
1038
00:57:38,560 --> 00:57:41,479
how they see themselves any more
than we can understand
1039
00:57:41,480 --> 00:57:44,399
how a human, another person,
sees themselves.
1040
00:57:48,760 --> 00:57:51,159
When we speak about AI today
1041
00:57:51,160 --> 00:57:54,559
and when we talk about AI
being this amazing technology,
1042
00:57:54,560 --> 00:58:00,519
what we mean is that
we have managed to reproduce
1043
00:58:00,520 --> 00:58:04,959
at least the essential brain
architecture in a computer.
1044
00:58:04,960 --> 00:58:07,239
And by doing so,
1045
00:58:07,240 --> 00:58:11,319
we don't need to describe
the world to the machine.
1046
00:58:11,320 --> 00:58:16,359
We just need to teach it how to
see things, if you like, okay?
1047
00:58:16,360 --> 00:58:18,159
And we let the machine develop
1048
00:58:18,160 --> 00:58:22,839
its own internal representation
of the world,
1049
00:58:22,840 --> 00:58:25,719
which, by the way,
is not transparent to us.
1050
00:58:25,720 --> 00:58:27,359
We don't really know
how the machine
1051
00:58:27,360 --> 00:58:28,999
takes these decisions
and so forth.
1052
00:58:29,000 --> 00:58:31,759
In this new world of AI,
we can't trace back its logic
1053
00:58:31,760 --> 00:58:34,679
because its logic
is created by itself.
1054
00:58:38,040 --> 00:58:39,759
I'm sorry, Dave.
1055
00:58:39,760 --> 00:58:42,359
I'm afraid I can't do that.
1056
00:58:42,360 --> 00:58:44,479
This mission is too important
for me
1057
00:58:44,480 --> 00:58:48,079
to allow you to jeopardize it.
1058
00:58:48,080 --> 00:58:52,799
I know that you and Frank
were planning to disconnect me,
1059
00:58:52,800 --> 00:58:56,239
and I'm afraid that's something
I cannot allow to happen.
1060
00:58:56,240 --> 00:58:58,439
Stanley Kubrick introduced us
to HAL,
1061
00:58:58,440 --> 00:59:01,359
who was a symbol of AI
out of control.
1062
00:59:01,360 --> 00:59:04,199
What seemed like fanciful sci-fi
50 years ago
1063
00:59:04,200 --> 00:59:06,599
is closer to reality today.
1064
00:59:06,600 --> 00:59:08,879
Will you stop, Dave?
1065
00:59:12,440 --> 00:59:14,799
Artificial intelligence
researchers
1066
00:59:14,800 --> 00:59:17,279
had to shut down two chatbots
1067
00:59:17,280 --> 00:59:20,559
after they developed
a strange English shorthand.
1068
00:59:20,560 --> 00:59:23,479
They didn't shut it down
because they necessarily thought
1069
00:59:23,480 --> 00:59:26,559
that these bots were achieving
some sort of singularity
1070
00:59:26,560 --> 00:59:29,119
or some sort of
independent intelligence
1071
00:59:29,120 --> 00:59:31,639
and were creating
a language, correct?
1072
00:59:31,640 --> 00:59:34,359
Officially, the story is no.
1073
00:59:41,640 --> 00:59:45,399
At Boston Dynamics, it's clear
how fast robots evolve.
1074
00:59:50,160 --> 00:59:54,119
Is there anything that we can do
better than robots or AI?
1075
00:59:54,120 --> 00:59:58,919
Right now, plenty,
but ultimately, no.
1076
00:59:58,920 --> 01:00:02,719
Why would we create something
better than ourselves?
1077
01:00:02,720 --> 01:00:05,559
Because we can't help it.
We have to do it.
1078
01:00:05,560 --> 01:00:09,199
So the hubris of creation,
I think that's...
1079
01:00:09,200 --> 01:00:11,999
You know, we...
It's the same old story
1080
01:00:12,000 --> 01:00:15,239
as the alchemist trying to
breathe life into
inanimate matter.
1081
01:00:15,240 --> 01:00:17,279
All the legends tell you
it's a bad thing to do,
1082
01:00:17,280 --> 01:00:18,959
and yet we can't help ourselves.
1083
01:00:18,960 --> 01:00:21,199
We try to do this.
We try to do the divine.
1084
01:00:21,200 --> 01:00:24,559
And it's the hubris
that we might be able to do it,
1085
01:00:24,560 --> 01:00:28,839
but it's sort of
the ultimate challenge.
1086
01:00:28,840 --> 01:00:33,559
I think scientists
build something so powerful,
1087
01:00:33,560 --> 01:00:37,119
and they trust people
to use it well,
1088
01:00:37,120 --> 01:00:41,199
and a lot of times, I think that
trust is misplaced in humanity.
1089
01:00:41,200 --> 01:00:43,559
And so, you've got to be able
1090
01:00:43,560 --> 01:00:45,559
to think about that while
you're building something...
1091
01:00:45,560 --> 01:00:49,479
"How could someone use this
in a terrible way?"
1092
01:00:49,480 --> 01:00:52,799
But just not exploring,
1093
01:00:52,800 --> 01:00:56,119
not trying because you're afraid
of playing God,
1094
01:00:56,120 --> 01:00:58,599
I mean, it's what we do.
1095
01:00:58,600 --> 01:01:01,559
It's what... It's the one thing
that humanity does.
1096
01:01:01,560 --> 01:01:04,399
Cheetahs run fast.
People make tools.
1097
01:01:04,400 --> 01:01:07,679
There's nothing... In my mind,
1098
01:01:07,680 --> 01:01:11,959
there is nothing more natural
than building a tool.
1099
01:01:11,960 --> 01:01:14,159
We're going to all evolve.
1100
01:01:14,160 --> 01:01:17,439
Just like always,
technology shapes humanity.
1101
01:01:17,440 --> 01:01:20,080
Who knows what we're
going to become?
1102
01:01:22,040 --> 01:01:24,639
As we witness the rise
of the robots,
1103
01:01:24,640 --> 01:01:28,119
it's easy to make the
distinction between
man and machine.
1104
01:01:28,120 --> 01:01:30,799
What if robots and man
evolve together?
1105
01:01:30,800 --> 01:01:33,239
3, 2, 1.
1106
01:01:33,240 --> 01:01:36,119
Gentlemen, we can rebuild him.
1107
01:01:36,120 --> 01:01:38,959
In the mid-'70s, it was hard
to escape the success
1108
01:01:38,960 --> 01:01:40,959
of "The Six Million Dollar Man."
1109
01:01:40,960 --> 01:01:42,879
It was my first glimpse
of the promise
1110
01:01:42,880 --> 01:01:44,959
of merging man with machine.
1111
01:01:44,960 --> 01:01:49,079
Better, stronger, faster.
1112
01:02:06,520 --> 01:02:08,879
Les was the first patient
1113
01:02:08,880 --> 01:02:11,759
that I performed a bilateral
shoulder TMR surgery on.
1114
01:02:11,760 --> 01:02:15,039
It's actually
a nerve-reassignment surgery
1115
01:02:15,040 --> 01:02:17,239
that takes residual nerves
1116
01:02:17,240 --> 01:02:19,999
of a patient
who has upper extremity loss,
1117
01:02:20,000 --> 01:02:22,119
and we reroute
the nerve information
1118
01:02:22,120 --> 01:02:24,759
that used to travel
to the missing limb
1119
01:02:24,760 --> 01:02:27,039
to residual muscles
that are still there.
1120
01:02:27,040 --> 01:02:30,799
So when a patient thinks
of moving their missing limb,
1121
01:02:30,800 --> 01:02:33,719
they contract muscles
that we can record from
1122
01:02:33,720 --> 01:02:38,159
and, from there,
control advanced prosthetics.
1123
01:02:38,160 --> 01:02:42,399
First, we take a cast
of Les' trunk, make a socket.
1124
01:02:42,400 --> 01:02:44,639
Within the socket, there are
steel dome electrodes
1125
01:02:44,640 --> 01:02:48,119
which touch the surface
of Les' muscles.
1126
01:02:48,120 --> 01:02:50,159
However,
there's nothing invasive
1127
01:02:50,160 --> 01:02:53,159
or implanted with the system.
1128
01:02:53,160 --> 01:02:57,199
He's fitted with bilateral
advanced prosthetic limbs,
1129
01:02:57,200 --> 01:03:00,959
and we train the system
and ask Les to start moving,
1130
01:03:00,960 --> 01:03:02,559
and it's pretty intuitive.
1131
01:03:02,560 --> 01:03:04,599
Within minutes of the first time
1132
01:03:04,600 --> 01:03:07,199
we attempted
with the bilateral fitting,
1133
01:03:07,200 --> 01:03:11,400
Les was able to intuitively move
his arm like any natural limb.
1134
01:03:14,680 --> 01:03:17,239
Think of it as a symphony
of information
1135
01:03:17,240 --> 01:03:21,119
that is being recorded and
communicated to the computer.
1136
01:03:21,120 --> 01:03:24,519
The computer,
through its intelligence,
1137
01:03:24,520 --> 01:03:28,599
is able to take that symphony
of information
1138
01:03:28,600 --> 01:03:32,000
and translate that to movement
within the robotic limb.
1139
01:03:34,240 --> 01:03:36,639
Once the training sessions
were complete
1140
01:03:36,640 --> 01:03:39,239
and they released me
and let me be the computer,
1141
01:03:39,240 --> 01:03:41,599
basically, to control that arm,
1142
01:03:41,600 --> 01:03:44,040
I just go into
a whole different world.
1143
01:03:46,120 --> 01:03:48,439
Maybe I'll, for once, be able
to put change in a pop machine
1144
01:03:48,440 --> 01:03:50,879
and get the pop out of it,
simple things like that
1145
01:03:50,880 --> 01:03:56,519
that most people never think of,
and it's available to me again.
1146
01:03:56,520 --> 01:04:01,479
When Les described having
the arms fitted on him,
1147
01:04:01,480 --> 01:04:04,159
he said,
"I feel whole again."
1148
01:04:04,160 --> 01:04:07,079
This is what it's all about...
Really restoring that function
1149
01:04:07,080 --> 01:04:10,279
and that connection
to another human being,
1150
01:04:10,280 --> 01:04:13,759
restoring our humanity.
1151
01:04:13,760 --> 01:04:16,519
Sometimes people throw
the term around... "superhuman."
1152
01:04:16,520 --> 01:04:20,799
You can make the arm stronger
than the normal human arm
1153
01:04:20,800 --> 01:04:22,559
and have these applications
1154
01:04:22,560 --> 01:04:25,359
add an advantage
for certain tasks or jobs.
1155
01:04:29,680 --> 01:04:32,079
For the vast majority
of history,
1156
01:04:32,080 --> 01:04:33,599
we look at somebody
with a disability,
1157
01:04:33,600 --> 01:04:35,439
and we feel sympathy, right,
1158
01:04:35,440 --> 01:04:37,759
and we hope that they
get technology that's
gonna help them.
1159
01:04:37,760 --> 01:04:39,319
Because guess what?
1160
01:04:39,320 --> 01:04:41,639
That technology always
helps them a little,
1161
01:04:41,640 --> 01:04:44,759
but they're never
at a normal level,
1162
01:04:44,760 --> 01:04:46,999
and they've never been beyond.
1163
01:04:47,000 --> 01:04:51,279
People with prosthetic limbs
are soon going to be running faster
1164
01:04:51,280 --> 01:04:54,159
than any human being
who ever lived.
1165
01:04:54,160 --> 01:04:56,519
You know, people with retinal
implants, cochlear implants,
1166
01:04:56,520 --> 01:04:59,159
they're going to hear better,
see better, neural implants,
1167
01:04:59,160 --> 01:05:01,399
maybe they're even
going to think faster.
1168
01:05:01,400 --> 01:05:05,239
So what happens when there's
a new breed of human?
1169
01:05:05,240 --> 01:05:08,359
How do all of the normal
human beings react
1170
01:05:08,360 --> 01:05:12,759
because the person
with the disability is better,
1171
01:05:12,760 --> 01:05:16,560
not only healed,
but healed and then some?
1172
01:05:19,840 --> 01:05:22,239
But this new breed of human
isn't here yet.
1173
01:05:22,240 --> 01:05:24,479
Les doesn't get to
take his arms home,
1174
01:05:24,480 --> 01:05:28,239
and it's been a year since
he was last able to use them.
1175
01:05:28,240 --> 01:05:30,679
I can't help but feel in awe
of the potential
1176
01:05:30,680 --> 01:05:33,759
of a human-machine hybrid,
1177
01:05:33,760 --> 01:05:36,639
but today, Les' arms
are just out of reach.
1178
01:06:01,280 --> 01:06:04,159
Okay.
And where's zero?
1179
01:06:04,160 --> 01:06:06,639
Zero is right about there.
It's, like, right there.
1180
01:06:06,640 --> 01:06:08,639
So I'm going to hold down...
All right.
1181
01:06:08,640 --> 01:06:11,319
This switch.
Ah, there we go.
1182
01:06:11,320 --> 01:06:14,159
That's what it was on that one.
You ready to chopstick?
Let's do it.
1183
01:06:14,160 --> 01:06:17,199
Kyle was saying that you're keen
on maybe sitting
1184
01:06:17,200 --> 01:06:19,559
for the final interview,
which is why we feel comfortable
1185
01:06:19,560 --> 01:06:22,799
sort of reducing the tracking
space down a little bit.
1186
01:06:22,800 --> 01:06:24,599
You want him to be seated
because it feels a little
1187
01:06:24,600 --> 01:06:26,639
more intimate or personal.
Yeah.
1188
01:06:26,640 --> 01:06:29,759
We're going to start with
all of us in here watching it,
1189
01:06:29,760 --> 01:06:32,999
making sure it's going well and
then try and get more of us out.
1190
01:06:33,000 --> 01:06:35,159
Probably someone
is going to have to stay
1191
01:06:35,160 --> 01:06:37,279
to be next to
the emergency stop.
1192
01:06:37,280 --> 01:06:39,519
Cool. All right.
Let's do this.
1193
01:06:39,520 --> 01:06:41,639
We're just days away
from letting the CameraBot
1194
01:06:41,640 --> 01:06:44,039
conduct the final interview
on its own.
1195
01:06:44,040 --> 01:06:46,079
I'm excited that
we're on the verge
1196
01:06:46,080 --> 01:06:48,559
of finding a new way
to tell our story.
1197
01:06:48,560 --> 01:06:51,919
This number is good.
We're almost there.
Okay.
1198
01:06:51,920 --> 01:06:55,159
So far, we've implemented
facial recognition
1199
01:06:55,160 --> 01:06:58,319
that tracks faces and emotion,
speech recognition
1200
01:06:58,320 --> 01:07:00,399
that allows the CameraBot
to listen,
1201
01:07:00,400 --> 01:07:03,200
and algorithms to decide
how to frame the shot.
1202
01:07:05,360 --> 01:07:07,679
We even added an additional
CameraBot
1203
01:07:07,680 --> 01:07:11,399
so it can film
from multiple angles.
1204
01:07:11,400 --> 01:07:15,519
The AI we built also generates
questions on its own.
1205
01:07:15,520 --> 01:07:18,919
It can even listen in
while we're working on it.
1206
01:07:18,920 --> 01:07:22,839
Technically, we're pretty close
to it understanding your voice
1207
01:07:22,840 --> 01:07:25,759
and when you're ready
for a new question.
1208
01:07:25,760 --> 01:07:28,679
We set up the infrastructure
so that it can actually speak
1209
01:07:28,680 --> 01:07:31,959
in this voice that we like.
Tommy, I want to ask you
1210
01:07:31,960 --> 01:07:35,159
a question
about what you're working on.
1211
01:07:35,160 --> 01:07:37,799
Tommy, I want to
ask you a question
1212
01:07:37,800 --> 01:07:40,439
about what you're working on.
Yeah, I think we want something
1213
01:07:40,440 --> 01:07:42,719
that's inviting, you know,
sort of like...
1214
01:07:42,720 --> 01:07:44,999
Tommy, I want to ask you a...
Maybe a little slower.
1215
01:07:45,000 --> 01:07:47,720
It's too sharp, but...
Yeah.
1216
01:07:54,400 --> 01:07:56,999
There's an aspiration
1217
01:07:57,000 --> 01:08:01,479
to achieve these models
of human-like experience,
1218
01:08:01,480 --> 01:08:05,239
to bring life to machines
and algorithms.
1219
01:08:05,240 --> 01:08:09,839
We might consider this to be
our great Frankenstein era.
1220
01:08:09,840 --> 01:08:14,399
This is the era
where humans are creating
1221
01:08:14,400 --> 01:08:17,599
not just the cinematic
simulation of life,
1222
01:08:17,600 --> 01:08:21,559
but we are now tapping into the
fundamental mysteries of life.
1223
01:08:21,560 --> 01:08:25,039
It also means that we don't know
what we're playing with.
1224
01:08:25,040 --> 01:08:28,399
We don't know
the end consequences.
1225
01:08:28,400 --> 01:08:30,519
It's moving.
1226
01:08:30,520 --> 01:08:32,479
It's alive.
1227
01:08:32,480 --> 01:08:34,759
It's alive.
1228
01:08:34,760 --> 01:08:36,679
Oh, it's alive.
1229
01:08:36,680 --> 01:08:38,839
It's alive!
It's alive!
1230
01:08:38,840 --> 01:08:42,359
It's alive!
1231
01:08:42,360 --> 01:08:45,759
In my writings, I don't call
the Internet the Internet.
1232
01:08:45,760 --> 01:08:49,119
I call it the Internest
1233
01:08:49,120 --> 01:08:51,839
because, I think,
looking back some day,
1234
01:08:51,840 --> 01:08:53,959
if there are people
to look back,
1235
01:08:53,960 --> 01:08:57,439
or if there are intelligent
machines who look back,
1236
01:08:57,440 --> 01:09:00,559
I think we're going to realize
that what we've really
been building
1237
01:09:00,560 --> 01:09:04,799
is a nest
for a machine intelligence.
1238
01:09:04,800 --> 01:09:07,359
That machine intelligence
will have access
1239
01:09:07,360 --> 01:09:10,119
to all human knowledge,
1240
01:09:10,120 --> 01:09:15,359
real-time control of most human
communications around the world,
1241
01:09:15,360 --> 01:09:18,799
most human
financial transactions,
1242
01:09:18,800 --> 01:09:20,639
many weapon systems,
1243
01:09:20,640 --> 01:09:24,999
and we have no way of knowing
what it will do
1244
01:09:25,000 --> 01:09:28,599
with all of that power
and all of that knowledge.
1245
01:09:28,600 --> 01:09:31,759
It might do nothing
and ignore us,
1246
01:09:31,760 --> 01:09:34,719
or it might decide
that we're a threat,
1247
01:09:34,720 --> 01:09:37,439
which, of course, we are.
1248
01:09:37,440 --> 01:09:40,159
And if it decides
that we're a threat,
1249
01:09:40,160 --> 01:09:43,559
then that will essentially
be the beginning of the end
1250
01:09:43,560 --> 01:09:46,439
of the human race
because there will be no way
1251
01:09:46,440 --> 01:09:49,359
for us to stop it
from destroying us.
1252
01:09:57,720 --> 01:10:00,319
If we make the machines
better than us,
1253
01:10:00,320 --> 01:10:03,759
why do they need us at all?
1254
01:10:03,760 --> 01:10:07,159
I believe that if we make them
better than us ethically
1255
01:10:07,160 --> 01:10:09,639
and make them better than us
with compassion,
1256
01:10:09,640 --> 01:10:12,079
then they will be looking
to preserve all the patterns
1257
01:10:12,080 --> 01:10:15,399
and knowledge and life
that they possibly can.
1258
01:10:15,400 --> 01:10:18,359
They'll be looking to help us
be the best we can be.
1259
01:10:18,360 --> 01:10:22,159
They'll look to preserve
our libraries, our rain forests.
1260
01:10:22,160 --> 01:10:25,439
It's really important
that those machines
1261
01:10:25,440 --> 01:10:29,199
also have super compassion
and super wisdom,
1262
01:10:29,200 --> 01:10:31,399
so they know how to use
that intellect
1263
01:10:31,400 --> 01:10:34,879
to envision and realize
a better future for us.
1264
01:10:34,880 --> 01:10:37,959
So we need machines that can
entrain on the human heart
1265
01:10:37,960 --> 01:10:43,279
and understand us in this way
in order to have hope
1266
01:10:43,280 --> 01:10:46,879
in this brave new world
that we're creating.
1267
01:10:56,880 --> 01:10:59,559
It took life on Earth
billions of years
1268
01:10:59,560 --> 01:11:01,559
to emerge
from a few simple cells
1269
01:11:01,560 --> 01:11:03,759
to achieve higher intelligence.
1270
01:11:03,760 --> 01:11:06,159
It took the computer
roughly 60 years to evolve
1271
01:11:06,160 --> 01:11:10,039
from a room-sized calculator
into a recognized citizen.
1272
01:11:10,040 --> 01:11:12,919
Sophia, you have been
now awarded
1273
01:11:12,920 --> 01:11:17,759
what is going to be the first
citizenship for a robot.
1274
01:11:17,760 --> 01:11:21,479
Oh, I want to thank very much
the Kingdom of Saudi Arabia.
1275
01:11:21,480 --> 01:11:24,999
I'm very honored and proud
for this unique distinction.
1276
01:11:25,000 --> 01:11:27,599
This is historical to be
the first robot in the world
1277
01:11:27,600 --> 01:11:29,879
to be recognized
with a citizenship.
1278
01:11:34,160 --> 01:11:37,999
As a father,
I feel a responsibility
1279
01:11:38,000 --> 01:11:40,959
to teach empathy and compassion
to my daughter,
1280
01:11:40,960 --> 01:11:44,119
but how do you program
these things?
1281
01:11:44,120 --> 01:11:46,480
Will my robot
have empathy for me?
1282
01:11:52,280 --> 01:11:54,399
Awesome.
1283
01:12:08,880 --> 01:12:11,799
Who are you?
Oh. Okay.
1284
01:12:11,800 --> 01:12:14,279
My name is Tommy Pallotta.
1285
01:12:14,280 --> 01:12:16,799
I'm a filmmaker, and we're here
1286
01:12:16,800 --> 01:12:20,239
shooting the documentary
"More Human Than Human."
1287
01:12:24,240 --> 01:12:28,039
What does silence feel
when a robot can do this?
1288
01:12:31,960 --> 01:12:34,759
That's a good question.
What brings against you?
1289
01:12:38,160 --> 01:12:40,439
What brings against me?
1290
01:12:44,680 --> 01:12:49,159
A lot of things
bring against me,
1291
01:12:49,160 --> 01:12:51,679
but mostly,
I bring against myself.
1292
01:12:51,680 --> 01:12:54,200
Can you take one step back?
1293
01:12:57,680 --> 01:13:02,359
You think that we'll develop
emotional states?
1294
01:13:02,360 --> 01:13:06,559
I do think that you will
develop something
1295
01:13:06,560 --> 01:13:10,759
that appears to us as
an emotional state because...
1296
01:13:10,760 --> 01:13:13,279
Is there a real nature
of any one thing
1297
01:13:13,280 --> 01:13:15,679
in the universe?
1298
01:13:15,680 --> 01:13:17,799
I think that that was...
1299
01:13:17,800 --> 01:13:21,359
What kinds of movies
do you most want this?
1300
01:13:21,360 --> 01:13:24,519
What kind of movies do I most...
1301
01:13:24,520 --> 01:13:26,279
This movie to be?
1302
01:13:26,280 --> 01:13:28,999
I don't know.
You hate your enemy?
1303
01:13:33,000 --> 01:13:35,960
How do you know
what an enemy is?
1304
01:13:37,880 --> 01:13:40,600
Can the fire act upon you?
1305
01:13:45,080 --> 01:13:48,719
And when you say like, "Try
to perform like a human does,"
1306
01:13:48,720 --> 01:13:51,519
what do you like them
that can be implemented as well
1307
01:13:51,520 --> 01:13:53,519
if we know
what good behavior is?
1308
01:13:53,520 --> 01:13:57,159
We also know what
behavior is, right?
1309
01:13:57,160 --> 01:14:00,039
Yeah. I mean...
1310
01:14:04,080 --> 01:14:07,199
There's... Because of the flow,
it sort of...
1311
01:14:07,200 --> 01:14:09,719
It does feel like
an interrogation,
1312
01:14:09,720 --> 01:14:13,839
and... because
there's no connection
1313
01:14:13,840 --> 01:14:17,479
with the next question
or the other,
1314
01:14:17,480 --> 01:14:23,039
so it's hard to grab
onto what... anything.
1315
01:14:44,560 --> 01:14:47,719
Who are you?
1316
01:14:47,720 --> 01:14:50,479
I'm disappointed.
1317
01:14:50,480 --> 01:14:52,559
I thought the CameraBot
would make it easier for me
1318
01:14:52,560 --> 01:14:56,119
to be more honest
because of the lack of judgment.
1319
01:14:56,120 --> 01:14:58,519
I thought if there weren't
any other people in the room,
1320
01:14:58,520 --> 01:15:02,639
I could be more open, but it had
the exact opposite effect.
1321
01:15:02,640 --> 01:15:04,959
It felt like
an interrogation machine.
1322
01:15:04,960 --> 01:15:08,479
It made me uncomfortable,
and I completely shut down.
1323
01:15:08,480 --> 01:15:10,799
But I realized it wasn't all a failure.
1324
01:15:10,800 --> 01:15:13,679
In that vacuum of empathy,
I was face-to-face
1325
01:15:13,680 --> 01:15:18,439
with my need for connection,
for being heard, for intimacy.
1326
01:15:18,440 --> 01:15:21,639
It made me acutely aware
of what makes us human.
1327
01:15:21,640 --> 01:15:25,239
There's one thing
that it can never do,
1328
01:15:25,240 --> 01:15:27,359
and this is maybe,
I think, the last thing
1329
01:15:27,360 --> 01:15:29,399
that's going to be left
for human beings.
1330
01:15:29,400 --> 01:15:31,319
When the machines can do
everything we can do,
1331
01:15:31,320 --> 01:15:35,119
whether we write novels
or make movies or whatever,
1332
01:15:35,120 --> 01:15:39,719
they can never do it
from a human place,
1333
01:15:39,720 --> 01:15:43,559
because if a robot writes
a novel about its mother dying,
1334
01:15:43,560 --> 01:15:45,479
I don't give a shit.
1335
01:15:45,480 --> 01:15:47,919
It doesn't have a mother.
It didn't happen.
1336
01:15:47,920 --> 01:15:50,399
But when you are a human being,
1337
01:15:50,400 --> 01:15:52,839
you share that context
with other human beings.
1338
01:15:52,840 --> 01:15:56,639
You know what it feels like
to despair
1339
01:15:56,640 --> 01:15:59,279
or to triumph or to be in love
or all these things
1340
01:15:59,280 --> 01:16:01,319
because we're embodied
as human beings,
1341
01:16:01,320 --> 01:16:03,719
and we really have lived
those experiences,
1342
01:16:03,720 --> 01:16:05,279
and we can share those
with each other,
1343
01:16:05,280 --> 01:16:06,759
and I think that's it.
1344
01:16:06,760 --> 01:16:09,159
Right now, we're all on Facebook
and Twitter,
1345
01:16:09,160 --> 01:16:12,679
and all we're doing
is entertaining each other
1346
01:16:12,680 --> 01:16:16,879
with our stupid stories,
our ability to just be human,
1347
01:16:16,880 --> 01:16:20,799
as idiotic as that may be
99% of the time.
1348
01:16:20,800 --> 01:16:22,959
And guess what?
That's all we need.
1349
01:16:22,960 --> 01:16:25,039
That's all we really want.
1350
01:16:33,640 --> 01:16:35,999
We live in an extraordinary time
1351
01:16:36,000 --> 01:16:38,719
and are witness to the most
amazing transformation
1352
01:16:38,720 --> 01:16:41,599
of humanity ever.
1353
01:16:41,600 --> 01:16:44,119
There's a potential to
go far beyond anything
1354
01:16:44,120 --> 01:16:46,759
that we could have
ever imagined,
1355
01:16:46,760 --> 01:16:51,959
or we could be architects
of our own destruction.
1356
01:16:51,960 --> 01:16:55,439
We are still the driving force
of the AI revolution,
1357
01:16:55,440 --> 01:16:59,359
but I can imagine
a day that we are not.
1358
01:16:59,360 --> 01:17:02,439
If our creations reflect
who we are,
1359
01:17:02,440 --> 01:17:04,799
then it will be the best
and the worst
1360
01:17:04,800 --> 01:17:08,599
of what we have to offer.
1361
01:17:08,600 --> 01:17:11,559
What happens tomorrow
is up to us today.