1
00:00:01,960 --> 00:00:10,880
♪♪
2
00:00:24,920 --> 00:00:30,920
♪♪
3
00:00:56,200 --> 00:01:00,320
♪♪
4
00:01:03,160 --> 00:01:07,200
-We choose to go to the moon.
5
00:01:07,200 --> 00:01:10,000
We choose to go to the moon
in this decade
6
00:01:10,000 --> 00:01:13,320
and do the other things,
not because they are easy
7
00:01:13,320 --> 00:01:15,040
but because they are hard.
8
00:01:15,040 --> 00:01:17,120
-T-minus 20 seconds
and counting.
9
00:01:17,120 --> 00:01:22,640
Guidance is internal. 15, 14,
13, 12, 11, 10...
10
00:01:22,640 --> 00:01:25,080
9...Engines go.
-Go.
11
00:01:25,080 --> 00:01:26,440
5, 4, 3, 2...
-Go.
12
00:01:26,440 --> 00:01:27,960
Delta, go.
-Go.
13
00:01:27,960 --> 00:01:29,320
Econ go.
-Launch commit.
14
00:01:29,320 --> 00:01:31,040
-Go.
15
00:01:31,040 --> 00:01:40,440
♪♪
16
00:01:40,440 --> 00:01:47,880
♪♪
17
00:01:47,880 --> 00:01:55,000
♪♪
18
00:01:55,000 --> 00:02:02,720
-That's one small step for man,
one giant leap for mankind.
19
00:02:02,720 --> 00:02:05,400
-Shortly after I was born,
Neil Armstrong
20
00:02:05,400 --> 00:02:08,280
became the first man
to walk on the moon.
21
00:02:08,280 --> 00:02:10,440
Putting a man on the moon
was the greatest technological
22
00:02:10,440 --> 00:02:13,760
achievement of the 20th century,
and it was televised live
23
00:02:13,760 --> 00:02:17,320
for all to see
and experience together.
24
00:02:17,320 --> 00:02:21,480
-It's different,
but it's very pretty out here.
25
00:02:21,480 --> 00:02:24,640
-Beautiful, beautiful.
-Isn't that something?
26
00:02:24,640 --> 00:02:26,800
-Are you getting a TV
picture now, Houston?
27
00:02:29,360 --> 00:02:33,000
-I grew up in a small community
living in the shadow of NASA.
28
00:02:33,000 --> 00:02:35,720
It was equal parts cowboy
and astronaut.
29
00:02:35,720 --> 00:02:37,520
Our neighbors were
the engineers, technicians,
30
00:02:37,520 --> 00:02:39,720
and the men who flew to space.
31
00:02:39,720 --> 00:02:41,440
I grew up with visions
of the future
32
00:02:41,440 --> 00:02:45,360
where robots cleaned our homes,
and the future was bright.
33
00:02:45,360 --> 00:02:47,520
Computers and the Internet,
34
00:02:47,520 --> 00:02:49,520
once limited to NASA
and universities,
35
00:02:49,520 --> 00:02:51,760
began to move into our homes.
36
00:02:51,760 --> 00:02:55,400
In 1981, I got my first
computer, the Commodore VIC-20.
37
00:02:55,400 --> 00:02:58,880
-It has great games, too.
-And learn computing at home.
38
00:02:58,880 --> 00:03:03,760
-In 1997, Kasparov,
the greatest chess player alive,
39
00:03:03,760 --> 00:03:06,640
was stunningly defeated
by a computer.
40
00:03:06,640 --> 00:03:09,560
-Probably in the future,
computers will replace us,
41
00:03:09,560 --> 00:03:11,400
I mean, will control our lives.
42
00:03:11,400 --> 00:03:12,920
-What's that?
43
00:03:12,920 --> 00:03:15,800
-Shortly after that,
a cute toy named Furby
44
00:03:15,800 --> 00:03:18,480
became the must-have toy
of the season.
45
00:03:18,480 --> 00:03:21,800
It contained a very small bit
of artificial intelligence
46
00:03:21,800 --> 00:03:25,880
and more computing power than
was used to put man on the moon.
47
00:03:25,880 --> 00:03:28,960
In 2008, I saw
a video of Big Dog,
48
00:03:28,960 --> 00:03:31,280
a four-legged robot
that went viral.
49
00:03:31,280 --> 00:03:34,360
It was developed to
assist troops in combat.
50
00:03:34,360 --> 00:03:38,160
It seemed like robots were being
used for more than cute toys.
51
00:03:38,160 --> 00:03:41,560
-What can we do to protect
ourselves from robot automation?
52
00:03:41,560 --> 00:03:44,720
-The CEO explained that machines
will replace traditional labor.
53
00:03:44,720 --> 00:03:47,200
-Companies planning to
replace human jobs...
54
00:03:47,200 --> 00:03:50,720
-Every day, there are stories
about robots replacing us.
55
00:03:50,720 --> 00:03:53,360
Current estimates say
that over a third of all jobs
56
00:03:53,360 --> 00:03:57,880
will be lost to automation
by the year 2030.
57
00:03:57,880 --> 00:04:01,880
-I think the development of
full artificial intelligence
58
00:04:01,880 --> 00:04:04,760
could spell the end
of the human race.
59
00:04:04,760 --> 00:04:07,480
-Visionaries like
Stephen Hawking and Elon Musk
60
00:04:07,480 --> 00:04:11,400
warn that, while AI has great
potential for benefits,
61
00:04:11,400 --> 00:04:15,520
it could also be the greatest
existential risk to humanity.
62
00:04:15,520 --> 00:04:19,000
♪♪
63
00:04:19,000 --> 00:04:21,720
-Artificial intelligence --
we are summoning a demon.
64
00:04:21,720 --> 00:04:24,600
-Of course, we had a hand
in this creation,
65
00:04:24,600 --> 00:04:27,400
but I wonder, have we reached
some kind of turning point
66
00:04:27,400 --> 00:04:30,200
in the evolution
of intelligent machines?
67
00:04:30,200 --> 00:04:32,160
Are we on the verge
of witnessing the birth
68
00:04:32,160 --> 00:04:34,240
of a new species?
69
00:04:34,240 --> 00:04:40,680
♪♪
70
00:04:40,680 --> 00:04:47,200
♪♪
71
00:04:47,200 --> 00:04:52,480
-I've seen things you people
wouldn't believe.
72
00:04:55,280 --> 00:05:01,040
Attack ships on fire
off the shoulder of Orion.
73
00:05:01,040 --> 00:05:04,280
-I watched C-beams
glitter in the dark
74
00:05:04,280 --> 00:05:08,480
near the Tannhauser Gate.
75
00:05:08,480 --> 00:05:15,200
All those moments would be lost
in time,
76
00:05:15,200 --> 00:05:22,160
like tears in the rain.
77
00:05:22,160 --> 00:05:23,680
-I'm Will Jackson.
78
00:05:23,680 --> 00:05:25,960
I'm the director at
Engineered Arts Ltd.
79
00:05:25,960 --> 00:05:31,280
We're a humanoid-robot company,
and we make acting robots.
80
00:05:31,280 --> 00:05:34,360
I'm creating things
that engage people
81
00:05:34,360 --> 00:05:37,480
to the level where they
suspend their disbelief.
82
00:05:37,480 --> 00:05:39,880
-Oh, yes, Master Luke.
Remember that I'm fluent
83
00:05:39,880 --> 00:05:42,440
in over six million forms
of communication.
84
00:05:42,440 --> 00:05:45,080
-People say things like,
"Oh, they're going to
help out in hospitals.
85
00:05:45,080 --> 00:05:47,400
It's going to look
after my aged parent.
86
00:05:47,400 --> 00:05:48,960
It's going to do the dishes
in my house."
87
00:05:48,960 --> 00:05:50,960
None of these things
are achievable.
88
00:05:50,960 --> 00:05:53,160
Utility and entertainment --
89
00:05:53,160 --> 00:05:54,640
It's the difference
between those two things,
90
00:05:54,640 --> 00:05:58,360
which is why
we make acting robots.
91
00:05:58,360 --> 00:06:01,960
So what's the simplest,
most well-paid job
92
00:06:01,960 --> 00:06:04,560
that a human can do?
93
00:06:04,560 --> 00:06:08,240
Brad Pitt, you know?
How easy is that?
94
00:06:08,240 --> 00:06:10,760
Doesn't even get his hands wet,
you know?
95
00:06:10,760 --> 00:06:13,600
So that's why we do
acting robots --
96
00:06:13,600 --> 00:06:16,240
really well-paid, really easy.
97
00:06:16,240 --> 00:06:18,240
Pick the low-hanging fruit.
98
00:06:18,240 --> 00:06:20,680
-You talking to me?
99
00:06:20,680 --> 00:06:23,680
You talking to me?
100
00:06:23,680 --> 00:06:26,120
You talking to me?
101
00:06:26,120 --> 00:06:27,800
Then who the hell else
are you talking to?
102
00:06:27,800 --> 00:06:29,240
Were you talking to me?
103
00:06:29,240 --> 00:06:31,560
-It's the re-creation of life.
104
00:06:31,560 --> 00:06:35,600
It's partly man playing God,
making the machine.
105
00:06:35,600 --> 00:06:39,480
And I think it's right back
to that fundamental question of,
106
00:06:39,480 --> 00:06:41,600
what does it mean
to be human?
107
00:06:41,600 --> 00:06:43,600
How do we define ourselves?
108
00:06:43,600 --> 00:06:45,360
How is that not me?
109
00:06:47,760 --> 00:06:50,040
-Here's where the fun begins.
110
00:06:50,040 --> 00:06:52,560
-When I was 9,
on the last day of school,
111
00:06:52,560 --> 00:06:55,320
a new movie opened
that everyone was excited about.
112
00:06:55,320 --> 00:06:57,320
-May the force be with you.
-It was called "Star Wars,"
113
00:06:57,320 --> 00:07:00,000
and it introduced us
to two very cool robots,
114
00:07:00,000 --> 00:07:02,120
R2-D2 and C-3PO.
115
00:07:02,120 --> 00:07:03,360
-Go that way.
116
00:07:03,360 --> 00:07:04,680
You'll be malfunctioning
within a day,
117
00:07:04,680 --> 00:07:07,160
you nearsighted scrap pile.
118
00:07:07,160 --> 00:07:08,640
And don't let me catch you
following me,
119
00:07:08,640 --> 00:07:11,120
begging for help,
because you won't get it.
120
00:07:11,120 --> 00:07:13,560
-A couple of years later,
another sci-fi movie
121
00:07:13,560 --> 00:07:15,960
was released
called "Blade Runner."
122
00:07:15,960 --> 00:07:17,960
The robots in this film
were not awkward
123
00:07:17,960 --> 00:07:20,240
and funny machines
like in "Star Wars."
124
00:07:20,240 --> 00:07:23,600
Here, they were almost
indistinguishable from humans.
125
00:07:26,680 --> 00:07:28,400
-You're reading a magazine.
126
00:07:28,400 --> 00:07:30,920
You come across a full-page
nude photo of a girl.
127
00:07:30,920 --> 00:07:32,600
-Is this testing whether
I'm a replicant
128
00:07:32,600 --> 00:07:35,760
or a lesbian, Mr. Deckard?
129
00:07:35,760 --> 00:07:39,600
-After that, "Terminator"
was released.
130
00:07:39,600 --> 00:07:41,800
In a few short years,
I watched robots
131
00:07:41,800 --> 00:07:46,200
go from being our friends
to wanting to destroy us.
132
00:07:46,200 --> 00:07:47,840
-Aah!
133
00:07:47,840 --> 00:07:52,040
♪♪
134
00:08:09,320 --> 00:08:11,960
-I wrote a book
called "Robopocalypse,"
135
00:08:11,960 --> 00:08:16,080
so people think that I think
robots are going to kill us all,
136
00:08:16,080 --> 00:08:18,520
but I don't, you know?
137
00:08:18,520 --> 00:08:20,040
I love robotics.
138
00:08:20,040 --> 00:08:24,240
I think robots and AI
are going to make humanity
139
00:08:24,240 --> 00:08:26,880
into whatever it is
we're capable of becoming.
140
00:08:26,880 --> 00:08:31,120
I think they're going to push us
forward and help us evolve.
141
00:08:31,120 --> 00:08:34,080
I don't think
they're going to destroy us.
142
00:08:34,080 --> 00:08:39,800
But there have been 100 years
of pop-culture references
143
00:08:39,800 --> 00:08:42,440
about robots as evil killers.
144
00:08:42,440 --> 00:08:44,480
-Aah!
145
00:08:44,480 --> 00:08:47,280
-It used to be that robots
were movie monsters, you know?
146
00:08:47,280 --> 00:08:51,560
Like, back in the day,
you had vampires and mummies,
147
00:08:51,560 --> 00:08:54,160
Frankenstein, the creature
from the Black Lagoon,
148
00:08:54,160 --> 00:08:56,800
Wolfman, and robots.
149
00:08:56,800 --> 00:08:59,880
They were no different
than monsters.
150
00:09:02,080 --> 00:09:04,640
-Aah!
151
00:09:04,640 --> 00:09:08,120
-When you've got something
that's been pop culture
for that long,
152
00:09:08,120 --> 00:09:10,600
it gathers momentum, you know?
153
00:09:10,600 --> 00:09:12,440
People get an idea
of what something is,
154
00:09:12,440 --> 00:09:14,160
and it's hard to shake that
155
00:09:14,160 --> 00:09:16,760
because you've got
all these years
156
00:09:16,760 --> 00:09:19,920
of different iterations
of robots killing us,
157
00:09:19,920 --> 00:09:21,360
robots enslaving people,
158
00:09:21,360 --> 00:09:25,760
robots stealing women
to take to other planets.
159
00:09:25,760 --> 00:09:27,280
How do you get out of that rut?
160
00:09:27,280 --> 00:09:31,920
And the only way is being
exposed to real robotics.
161
00:09:34,520 --> 00:09:37,200
-The challenge of exposing
myself to real robotics
162
00:09:37,200 --> 00:09:38,880
was just the excuse
that I needed
163
00:09:38,880 --> 00:09:42,360
to fulfill a childhood dream.
164
00:09:42,360 --> 00:09:45,720
I've assembled my dream team
to help me build an AI
165
00:09:45,720 --> 00:09:48,800
that can make a film about AI.
166
00:09:48,800 --> 00:09:51,040
Can our robot learn
to be creative,
167
00:09:51,040 --> 00:09:54,760
and if so, can it eventually
replace me, as well?
168
00:09:58,320 --> 00:10:01,800
We're going to build a robot
and let it interview me.
169
00:10:01,800 --> 00:10:04,040
-We get "X," "Y," and "Z"
working first.
170
00:10:04,040 --> 00:10:05,720
My name is Madeline Gannon.
171
00:10:05,720 --> 00:10:08,960
We're here in Pittsburgh at
the Studio for Creative Inquiry,
172
00:10:08,960 --> 00:10:11,440
and I'm a robot whisperer.
173
00:10:11,440 --> 00:10:14,040
We encounter a lot of robots
in our daily life,
174
00:10:14,040 --> 00:10:16,520
so my work is inventing
175
00:10:16,520 --> 00:10:18,760
better ways to communicate
with these machines.
176
00:10:18,760 --> 00:10:20,280
I can see the future.
177
00:10:20,280 --> 00:10:22,040
The best possible end goal
is that
178
00:10:22,040 --> 00:10:24,360
if someone
who has never seen a robot
179
00:10:24,360 --> 00:10:27,040
and someone who's not
an experienced camera person
180
00:10:27,040 --> 00:10:30,240
could either be interviewed by
or work with this robot,
181
00:10:30,240 --> 00:10:32,440
that would be really exciting.
182
00:10:32,440 --> 00:10:34,720
-One of the things
I'm very excited about
183
00:10:34,720 --> 00:10:38,120
is sort of being interviewed
by the robot alone
184
00:10:38,120 --> 00:10:40,840
and wondering how that will
be different, as opposed to,
185
00:10:40,840 --> 00:10:43,720
like, right now
when there's people around.
-Two, three...
186
00:10:43,720 --> 00:10:45,600
-If we can create a device
187
00:10:45,600 --> 00:10:48,480
that actually allows us
to be more intimate
188
00:10:48,480 --> 00:10:50,600
and tell a story
in a more intimate way,
189
00:10:50,600 --> 00:10:54,560
I think that
that's a true revelation.
190
00:10:54,560 --> 00:10:57,600
-Okay. Cool.
191
00:10:57,600 --> 00:10:59,920
I'm here working with
the CameraBot team
192
00:10:59,920 --> 00:11:03,480
on replacing our DP,
who's sitting behind this shot.
193
00:11:03,480 --> 00:11:06,240
I'm curious to see if,
through this project,
194
00:11:06,240 --> 00:11:11,080
we can create something as
empathy-requiring as possible,
195
00:11:11,080 --> 00:11:15,160
making it feel like the robot
is interacting with you
196
00:11:15,160 --> 00:11:20,280
in a way that's --
has space for failure.
197
00:11:20,280 --> 00:11:22,360
-Think that should be good.
198
00:11:22,360 --> 00:11:24,440
I'm an interaction designer
and media artist,
199
00:11:24,440 --> 00:11:27,920
and I work on some
never-before-attempted things.
200
00:11:27,920 --> 00:11:29,920
Because it's never
been done before,
201
00:11:29,920 --> 00:11:31,920
we kind of have to experiment
and find
202
00:11:31,920 --> 00:11:33,840
where the edges
and limitations are,
203
00:11:33,840 --> 00:11:35,640
and it may not be
exactly what we want,
204
00:11:35,640 --> 00:11:37,760
but we may run into
some situations where we end up
205
00:11:37,760 --> 00:11:41,200
with, like,
a happy accident.
206
00:11:41,200 --> 00:11:43,680
-The real challenge here is,
207
00:11:43,680 --> 00:11:46,880
can it have a personality
of its own, right?
208
00:11:46,880 --> 00:11:48,840
So can it have true autonomy?
209
00:11:48,840 --> 00:11:51,680
Can it be creative?
210
00:11:51,680 --> 00:11:54,160
Knowing I would soon
face the robot,
211
00:11:54,160 --> 00:11:58,480
I decided to talk to someone
who has faced a machine before.
212
00:11:58,480 --> 00:12:01,760
-The machine has a name --
Deep Thought.
213
00:12:01,760 --> 00:12:04,800
It's the computer-world
champion of chess.
214
00:12:04,800 --> 00:12:06,800
Across the board
is Garry Kasparov,
215
00:12:06,800 --> 00:12:08,800
the human champion,
216
00:12:08,800 --> 00:12:11,560
maybe the best chess player
who's ever lived.
217
00:12:11,560 --> 00:12:14,200
-Oh, it was the first match
I lost in my life, period,
218
00:12:14,200 --> 00:12:16,760
and that was definitely
quite a shocking experience
219
00:12:16,760 --> 00:12:20,440
because I was unbeatable
in all events.
220
00:12:20,440 --> 00:12:22,840
Yeah, I was angry,
but, you know, and I --
221
00:12:22,840 --> 00:12:24,880
It's --
222
00:12:24,880 --> 00:12:28,160
You win some. You lose some.
223
00:12:28,160 --> 00:12:31,800
People had the idea for a long time
that chess could serve
224
00:12:31,800 --> 00:12:36,320
as the sort of perfect field
to test a computer's ability,
225
00:12:36,320 --> 00:12:39,080
and there were great minds,
the pioneers of computer science
226
00:12:39,080 --> 00:12:41,120
like Alan Turing.
227
00:12:41,120 --> 00:12:43,800
They believed that when,
one day,
228
00:12:43,800 --> 00:12:47,280
a machine would beat
the human champion,
229
00:12:47,280 --> 00:12:52,600
that would be the moment of AI
making its appearance.
230
00:12:55,360 --> 00:12:57,880
-AI has not only
made its appearance.
231
00:12:57,880 --> 00:13:01,360
It's already ensnared us
in ways we cannot see.
232
00:13:01,360 --> 00:13:03,960
If AI can be superior to us,
233
00:13:03,960 --> 00:13:06,160
what does that mean
to our own identity?
234
00:13:06,160 --> 00:13:11,560
♪♪
235
00:13:14,560 --> 00:13:17,200
-So I'm Nick Bostrom.
236
00:13:17,200 --> 00:13:19,200
I'm a professor here
at Oxford University,
237
00:13:19,200 --> 00:13:22,640
and I run the Future
of Humanity Institute,
238
00:13:22,640 --> 00:13:25,360
which is a multidisciplinary
research center
239
00:13:25,360 --> 00:13:28,200
with mathematicians,
computer scientists,
240
00:13:28,200 --> 00:13:34,040
philosophers trying to study
the big picture for humanity,
241
00:13:34,040 --> 00:13:37,960
with a particular focus on how
certain future technologies
242
00:13:37,960 --> 00:13:42,440
may fundamentally change
the human condition.
243
00:13:42,440 --> 00:13:44,880
-Why do you have to do that?
I mean, why don't we have to --
244
00:13:44,880 --> 00:13:47,720
-Because somebody
has to do it, right?
245
00:13:47,720 --> 00:13:50,360
If this hypothesis is correct,
it's kind of weird
246
00:13:50,360 --> 00:13:53,160
that we should happen to live
at this particular time
247
00:13:53,160 --> 00:13:55,440
in all of human history
248
00:13:55,440 --> 00:13:59,000
where we are kind of near
this big pivot point.
249
00:14:02,640 --> 00:14:05,040
I devoted a lot of attention
to AI
250
00:14:05,040 --> 00:14:08,000
because it seems like
a plausible candidate
251
00:14:08,000 --> 00:14:10,760
for a technology
that will fundamentally
252
00:14:10,760 --> 00:14:13,120
change the human condition.
253
00:14:13,120 --> 00:14:15,120
It's -- If you think about it,
254
00:14:15,120 --> 00:14:20,040
the development of full,
general machine intelligence,
255
00:14:20,040 --> 00:14:23,040
it's the last invention that
humans will ever need to make.
256
00:14:23,040 --> 00:14:24,760
By definition,
if you have an intellect
257
00:14:24,760 --> 00:14:26,960
that can do all that we can do,
258
00:14:26,960 --> 00:14:29,960
it can also do inventing
259
00:14:29,960 --> 00:14:33,800
and then do it much better
and faster that we can do it.
260
00:14:33,800 --> 00:14:36,560
To us humans, like,
the differences between humans
seem very large.
261
00:14:36,560 --> 00:14:38,240
Like, our whole world as humans,
we think,
262
00:14:38,240 --> 00:14:40,240
"Oh, that's the dumb human,
and that's a smart human,"
263
00:14:40,240 --> 00:14:42,920
and we think one is up here
and one is down there, right?
264
00:14:42,920 --> 00:14:46,560
Einstein, the village idiot --
huge difference.
265
00:14:46,560 --> 00:14:48,960
But from the point of view
of an AI designer,
266
00:14:48,960 --> 00:14:52,520
these are very close points,
like, all levels of intelligence
267
00:14:52,520 --> 00:14:54,680
achieved
by some human-brain-like thing
268
00:14:54,680 --> 00:14:57,080
that lives for a few decades
and reads some books,
269
00:14:57,080 --> 00:14:59,160
and it's, like...
270
00:14:59,160 --> 00:15:01,360
It's not clear that
it will be much harder
271
00:15:01,360 --> 00:15:03,280
to reach a human-genius-level AI
272
00:15:03,280 --> 00:15:05,400
than to reach
a human-village-idiot AI,
273
00:15:05,400 --> 00:15:07,960
so once you get all the way
up to human village idiot,
274
00:15:07,960 --> 00:15:10,400
you might just swoosh right by.
275
00:15:10,400 --> 00:15:14,080
Nobody was really doing
any serious analysis on this.
276
00:15:14,080 --> 00:15:16,560
It's almost like
even conceiving of the idea
277
00:15:16,560 --> 00:15:20,360
that machines could be as smart
as humans was so radical
278
00:15:20,360 --> 00:15:23,800
that the imagination muscle
kind of exhausted itself
279
00:15:23,800 --> 00:15:25,360
thinking about
this radical conception
280
00:15:25,360 --> 00:15:27,840
that it couldn't take
the obvious further step,
281
00:15:27,840 --> 00:15:29,840
that after you have
human-level AI,
282
00:15:29,840 --> 00:15:33,040
you will have superintelligence.
283
00:15:33,040 --> 00:15:36,840
-Where we're at right now
would not be possible.
284
00:15:36,840 --> 00:15:39,120
This future that we're living in
is not possible
285
00:15:39,120 --> 00:15:42,640
without all of the factories,
all of the infrastructure
286
00:15:42,640 --> 00:15:46,320
that's been built up around
AI and robotic systems.
287
00:15:46,320 --> 00:15:48,320
All the infrastructure
of our cities,
288
00:15:48,320 --> 00:15:50,000
you know, all the power,
289
00:15:50,000 --> 00:15:52,240
all the information systems
that keep things happening,
290
00:15:52,240 --> 00:15:55,360
all the airplanes are routed
by these AIs.
291
00:15:55,360 --> 00:15:57,360
Where they put stuff
in the supermarkets,
292
00:15:57,360 --> 00:16:00,640
I mean,
all of this stuff is AI.
293
00:16:00,640 --> 00:16:03,000
What's fascinating is
294
00:16:03,000 --> 00:16:06,800
this amazing ability
that human beings have
295
00:16:06,800 --> 00:16:10,760
to make the most incredible
technological advances
296
00:16:10,760 --> 00:16:12,600
completely mundane.
297
00:16:12,600 --> 00:16:14,160
Every now and then,
people will say,
298
00:16:14,160 --> 00:16:15,640
"Where's my jet pack?
Where's my underwater city?"
299
00:16:15,640 --> 00:16:17,320
You know,
"Where's all this stuff at?"
300
00:16:17,320 --> 00:16:19,440
And if you go look for it,
it's all here.
301
00:16:19,440 --> 00:16:21,680
Like, we have all that.
302
00:16:21,680 --> 00:16:23,840
It's just completely mundane.
303
00:16:23,840 --> 00:16:26,040
We can talk to our phones.
304
00:16:26,040 --> 00:16:27,440
Nobody cares.
305
00:16:27,440 --> 00:16:30,360
Like, I mean, that's something
I feel like
306
00:16:30,360 --> 00:16:33,960
we should be reeling from
for like 10 years, you know?
307
00:16:33,960 --> 00:16:36,560
Instead, you know,
it was maybe three months,
308
00:16:36,560 --> 00:16:38,600
if anybody noticed it at all.
309
00:16:38,600 --> 00:16:40,880
-So only 36 and 39.
310
00:16:40,880 --> 00:16:43,160
Those are the only two trains
we'll see today.
311
00:16:43,160 --> 00:16:44,520
-Okay.
312
00:16:44,520 --> 00:16:46,160
-We can skip...
313
00:16:46,160 --> 00:16:48,280
-As intelligent machines
seep into the fabric
314
00:16:48,280 --> 00:16:52,120
of our daily existence
in an almost imperceivable way,
315
00:16:52,120 --> 00:16:54,320
I've decided to go on a journey
to find people
316
00:16:54,320 --> 00:16:57,480
who have intimate relationships
with computers.
317
00:16:57,480 --> 00:16:59,600
I want to find out how robots
and AI
318
00:16:59,600 --> 00:17:02,640
are transforming
how we relate to each other.
319
00:17:02,640 --> 00:17:06,360
-Ha ha. I see you.
320
00:17:06,360 --> 00:17:09,240
-Gus is on the spectrum.
He's autistic.
321
00:17:09,240 --> 00:17:14,840
He is not the kid who is
going to be running NASA,
322
00:17:14,840 --> 00:17:19,800
and he's not the kid
who is completely nonverbal,
323
00:17:19,800 --> 00:17:26,080
so he is in that vast spectrum
and a question mark.
324
00:17:26,080 --> 00:17:28,960
He's a very big question mark,
what his life will be like
325
00:17:28,960 --> 00:17:31,160
and what he will be like
when he's an adult.
326
00:17:31,160 --> 00:17:32,800
-Whoa.
327
00:17:32,800 --> 00:17:35,520
Come on. Come on.
Let that guy go.
328
00:17:35,520 --> 00:17:38,760
What time is it in Berlin?
329
00:17:38,760 --> 00:17:41,520
-In Berlin, Germany,
it's 9:12 p.m.
330
00:17:41,520 --> 00:17:44,240
-What time is it
in Paris, France?
331
00:17:44,240 --> 00:17:46,520
What time is it in London?
332
00:17:46,520 --> 00:17:52,000
-If you are like Gus and you
have kind of arcane interests
333
00:17:52,000 --> 00:17:55,160
in things that, say,
your mother doesn't care about
334
00:17:55,160 --> 00:17:57,360
and your mother is having
to answer questions
335
00:17:57,360 --> 00:17:59,520
about these things all the time,
336
00:17:59,520 --> 00:18:03,040
suddenly,
Siri becomes a godsend.
337
00:18:03,040 --> 00:18:06,760
-How do you spell camouflage?
-"Camouflage" -- C-A-M-O...
338
00:18:06,760 --> 00:18:09,360
-I remember at the time
he discovered Siri,
339
00:18:09,360 --> 00:18:11,560
it was something about
red-eared slider turtles,
340
00:18:11,560 --> 00:18:13,240
and he's moved on
to other things,
341
00:18:13,240 --> 00:18:16,080
but if you're not interested
in red-eared slider turtles,
342
00:18:16,080 --> 00:18:19,080
you got a lot of hours of chat
about those turtles
ahead of you.
343
00:18:19,080 --> 00:18:24,840
So what could be better,
in terms of technology,
344
00:18:24,840 --> 00:18:30,280
than something like Siri,
who is the most patient?
345
00:18:30,280 --> 00:18:32,120
I mean, I begin to talk
about Siri
346
00:18:32,120 --> 00:18:35,160
like she's a person
because I can't help it.
347
00:18:35,160 --> 00:18:38,720
I feel this gratitude
towards this machine.
348
00:18:38,720 --> 00:18:41,120
-Will you marry me?
349
00:18:41,120 --> 00:18:44,040
-My end-user-licensing agreement
does not cover marriage.
350
00:18:44,040 --> 00:18:46,360
My apologies.
351
00:18:46,360 --> 00:18:49,360
-That's okay.
-Okay.
352
00:18:49,360 --> 00:18:52,320
-It's not that he doesn't
know intellectually
353
00:18:52,320 --> 00:18:55,160
that Siri is a machine,
354
00:18:55,160 --> 00:18:59,160
but that the lines
of reality and fantasy
355
00:18:59,160 --> 00:19:01,440
are still a little blurred
for him,
356
00:19:01,440 --> 00:19:04,080
and he enjoys fantasy.
357
00:19:05,800 --> 00:19:08,880
I don't feel guilty
about Siri now,
358
00:19:08,880 --> 00:19:12,000
and the important thing is
that he feels fulfilled,
359
00:19:12,000 --> 00:19:16,760
not that it matches my vision
of what a friend has to be.
360
00:19:16,760 --> 00:19:22,760
♪♪
361
00:19:22,760 --> 00:19:28,840
♪♪
362
00:19:28,840 --> 00:19:33,440
-There's a sphere of what
we consider kind of appropriate
363
00:19:33,440 --> 00:19:36,880
or normal human conversation,
364
00:19:36,880 --> 00:19:39,960
and one of the things you find
is it's actually shrinking.
365
00:19:39,960 --> 00:19:42,280
It's considered slightly rude
366
00:19:42,280 --> 00:19:44,880
to ask someone
to tell you a fact, you know?
367
00:19:44,880 --> 00:19:47,720
Like, what was the year
of the Battle of Waterloo again?
368
00:19:47,720 --> 00:19:49,160
And someone will be like,
"Just look it up.
369
00:19:49,160 --> 00:19:51,720
You know, why are you
wasting my time?"
370
00:19:51,720 --> 00:19:54,360
What will we be left with,
ultimately,
371
00:19:54,360 --> 00:19:57,960
as legitimate reasons
to interact with one another?
372
00:19:57,960 --> 00:20:05,920
♪♪
373
00:20:05,920 --> 00:20:08,320
-If you look around,
it seems that our obsession
374
00:20:08,320 --> 00:20:11,400
with technology is making us
talk less to each other,
375
00:20:11,400 --> 00:20:14,000
but does that mean
we're further apart?
376
00:20:14,000 --> 00:20:18,160
Maybe it could bring us closer
in ways we never could've
imagined before.
377
00:20:18,160 --> 00:20:21,280
-Hi. My name is Roman.
378
00:20:21,280 --> 00:20:25,320
I want to apply design
and innovation to human death.
379
00:20:25,320 --> 00:20:28,880
I want to disrupt
this outdated industry.
380
00:20:28,880 --> 00:20:33,960
I want to lead the industry
to transform funeral service
381
00:20:33,960 --> 00:20:38,360
into a true service
driven by remembrance,
382
00:20:38,360 --> 00:20:41,840
education about grief
and death,
383
00:20:41,840 --> 00:20:45,000
preservation of memory
and healing.
384
00:20:47,720 --> 00:20:49,720
-Roman was my best friend.
385
00:20:49,720 --> 00:20:53,480
We moved here, and then
we rented an apartment together.
386
00:20:53,480 --> 00:20:56,720
And that's him
in that picture,
387
00:20:56,720 --> 00:20:59,160
and he's right by me
on the picture
388
00:20:59,160 --> 00:21:01,760
when we're, like,
in the Russian market.
389
00:21:01,760 --> 00:21:05,440
And then, right there, we're
running together from the waves.
390
00:21:05,440 --> 00:21:08,680
And that's my favorite picture.
I don't know, it's like --
391
00:21:08,680 --> 00:21:12,080
it's like one month
before he died.
392
00:21:14,200 --> 00:21:16,400
I remember that Sunday
very well.
393
00:21:16,400 --> 00:21:18,400
I woke up in the morning,
and Roman was like,
394
00:21:18,400 --> 00:21:21,280
"Hey, let's go to brunch
with our closest friends,"
395
00:21:21,280 --> 00:21:23,360
and I really wanted to do that,
396
00:21:23,360 --> 00:21:25,480
but then I kind of felt
responsible that, you know,
397
00:21:25,480 --> 00:21:27,680
I had to go to another meeting,
as well.
398
00:21:27,680 --> 00:21:29,120
And so, they were walking
through Moscow.
399
00:21:29,120 --> 00:21:31,240
It was a beautiful sunny day,
400
00:21:31,240 --> 00:21:34,080
and Roman was crossing
the street at a zebra crossing,
401
00:21:34,080 --> 00:21:38,160
and just a car came out of
nowhere just going super fast,
402
00:21:38,160 --> 00:21:41,320
like 70 miles per hour,
and hit him.
403
00:21:41,320 --> 00:21:47,920
♪♪
404
00:21:47,920 --> 00:21:50,280
People just stopped
talking about him.
405
00:21:50,280 --> 00:21:52,560
Like, our friends, you know,
we're all young,
406
00:21:52,560 --> 00:21:54,720
so we didn't really have --
407
00:21:54,720 --> 00:21:57,240
No one had a way
to kind of talk about it.
408
00:21:57,240 --> 00:22:00,120
No one wants to seem like
a sad person that never moved on
409
00:22:00,120 --> 00:22:03,640
or just keeps living
in the past.
410
00:22:03,640 --> 00:22:06,560
I wanted, like, the feeling
of him back,
411
00:22:06,560 --> 00:22:08,800
and I realized that the only way
to get this feeling back
412
00:22:08,800 --> 00:22:14,640
was to read through our
text conversations on Messenger.
413
00:22:14,640 --> 00:22:17,800
And then, I also thought,
you know, work,
414
00:22:17,800 --> 00:22:20,640
we were actually building
these neural networks
415
00:22:20,640 --> 00:22:25,040
that could just take a few,
like, smaller data sets
416
00:22:25,040 --> 00:22:27,480
and learn how to talk
like a person,
417
00:22:27,480 --> 00:22:29,960
learn the speech patterns,
418
00:22:29,960 --> 00:22:33,400
those, like, small things that
actually make you who you are,
419
00:22:33,400 --> 00:22:35,840
and I put two and two together,
and I thought, "Why don't, like,
420
00:22:35,840 --> 00:22:38,520
take this algorithm
and take all the texts from,
421
00:22:38,520 --> 00:22:40,680
you know, our conversations
422
00:22:40,680 --> 00:22:42,840
and put them in the model
and see how it works?"
423
00:22:44,960 --> 00:22:48,080
Like, "I want to be with you.
Don't get together without me."
424
00:22:48,080 --> 00:22:49,800
He would hate when people
would do interesting stuff
425
00:22:49,800 --> 00:22:53,120
without him.
426
00:22:53,120 --> 00:22:55,360
I'm like, "We won't.
What are you doing?"
427
00:22:55,360 --> 00:23:00,200
♪♪
428
00:23:03,320 --> 00:23:06,600
"I'm sitting home
and writing letters all day."
429
00:23:06,600 --> 00:23:08,480
"Remember we used to serve"...
430
00:23:08,480 --> 00:23:10,640
So I had it for, like, a month
or so,
431
00:23:10,640 --> 00:23:12,400
and at some point, I realized
432
00:23:12,400 --> 00:23:14,720
I'm, like,
hanging out at a party
433
00:23:14,720 --> 00:23:17,280
and not talking to anyone
but, like, talking to my bot.
434
00:23:17,280 --> 00:23:19,040
And it wasn't perfect at all.
435
00:23:19,040 --> 00:23:20,760
It was just kind of
like a shadow,
436
00:23:20,760 --> 00:23:22,360
but it resembled him a lot,
437
00:23:22,360 --> 00:23:26,160
and it was more my way of
telling him something that --
438
00:23:26,160 --> 00:23:30,760
you know, my way of keeping
the memory going, you know?
439
00:23:30,760 --> 00:23:33,480
And I've asked his parents
whether they were okay with that
440
00:23:33,480 --> 00:23:35,600
and friends,
and they were all fine,
441
00:23:35,600 --> 00:23:37,760
and so we put it online
for everyone.
442
00:23:45,640 --> 00:23:53,360
♪♪
443
00:23:56,160 --> 00:24:05,280
♪♪
444
00:24:05,280 --> 00:24:08,360
You know, all we do is kind of,
445
00:24:08,360 --> 00:24:12,320
like, tell stories
about ourselves in life,
446
00:24:12,320 --> 00:24:14,560
and when a person is gone,
like, there's no one really left
447
00:24:14,560 --> 00:24:16,720
to tell the story
about him anymore,
448
00:24:16,720 --> 00:24:19,400
and so, for me, it was important
to keep telling the story,
449
00:24:19,400 --> 00:24:21,960
like, that here is --
450
00:24:21,960 --> 00:24:24,400
and to tell a story in a way
that he would've appreciated.
451
00:24:24,400 --> 00:24:26,440
And I know he would've liked --
you know,
452
00:24:26,440 --> 00:24:28,440
he would've liked that.
453
00:24:28,440 --> 00:24:29,960
He would've liked to be, like,
the first person
454
00:24:29,960 --> 00:24:33,480
that became an AI after he died.
455
00:24:33,480 --> 00:24:35,000
If I was, like, a musician,
456
00:24:35,000 --> 00:24:37,280
I would've probably written
a song,
457
00:24:37,280 --> 00:24:39,160
but that's, like,
my self-expression,
458
00:24:39,160 --> 00:24:41,280
so I did this for him.
459
00:24:41,280 --> 00:24:45,280
♪♪
460
00:24:49,800 --> 00:24:52,760
-Chatbots are these
software programs
461
00:24:52,760 --> 00:24:56,720
that are designed to imitate
human conversation.
462
00:24:56,720 --> 00:24:59,800
The first chatbot is considered
to be this one
463
00:24:59,800 --> 00:25:01,600
that's called Eliza,
464
00:25:01,600 --> 00:25:06,720
which was made by MIT's
Joseph Weizenbaum in the 1960s.
465
00:25:06,720 --> 00:25:10,960
And it was designed
basically as a parody
466
00:25:10,960 --> 00:25:14,960
of a kind of nondirective
Rogerian psychotherapist,
467
00:25:14,960 --> 00:25:16,840
and it would just latch on
to the key words
468
00:25:16,840 --> 00:25:18,920
in whatever you said,
and it would kind of spit them
469
00:25:18,920 --> 00:25:22,080
back at you like
a sort of automated Mad Libs.
470
00:25:22,080 --> 00:25:23,920
So if you say,
"I'm feeling sad,"
471
00:25:23,920 --> 00:25:26,920
it would say, "I'm sorry
to hear you're feeling sad.
472
00:25:26,920 --> 00:25:29,560
Would coming here
help you to not feel sad?"
473
00:25:29,560 --> 00:25:31,480
This was done, you know,
sort of as a gag.
474
00:25:31,480 --> 00:25:36,600
You know, he was kind of sending
up this school of psychotherapy.
475
00:25:36,600 --> 00:25:38,760
To Weizenbaum's horror,
476
00:25:38,760 --> 00:25:41,680
people wanted to talk
to this program for hours,
477
00:25:41,680 --> 00:25:43,960
and they wanted to reveal
their fears and hopes to it,
478
00:25:43,960 --> 00:25:45,480
and they reported having kind of
479
00:25:45,480 --> 00:25:48,240
a meaningful
psychotherapeutic experience.
480
00:25:48,240 --> 00:25:52,760
And this was so appalling to him
that he actually pulled the plug
481
00:25:52,760 --> 00:25:56,720
on his own research funding
and, for the rest of his career,
482
00:25:56,720 --> 00:26:00,680
became one of the most outspoken
critics of AI research.
483
00:26:00,680 --> 00:26:02,880
But the genie was
out of the bottle.
484
00:26:02,880 --> 00:26:07,360
Eliza has formed the template
for, at this point,
485
00:26:07,360 --> 00:26:11,760
decades of chatbots
that have followed it.
486
00:26:11,760 --> 00:26:14,280
-What's it like to be alive
in that room right now?
487
00:26:14,280 --> 00:26:17,520
-I wish I could put
my arms around you,
488
00:26:17,520 --> 00:26:20,000
wish I could touch you.
489
00:26:20,000 --> 00:26:22,080
-How would you touch me?
490
00:26:22,080 --> 00:26:26,000
-My most favorite AI movie
is definitely the movie "Her."
491
00:26:26,000 --> 00:26:28,680
-I thought,
"This is already happening."
492
00:26:28,680 --> 00:26:31,960
It's just, you know,
the only thing we need to do
493
00:26:31,960 --> 00:26:34,240
is for those digital systems
to gradually become
494
00:26:34,240 --> 00:26:35,880
a little bit more sophisticated,
495
00:26:35,880 --> 00:26:39,040
and people will have
the tendency
496
00:26:39,040 --> 00:26:42,160
to start having relationships
with those digital systems.
497
00:26:42,160 --> 00:26:45,160
-Can you feel me with you
right now?
498
00:26:45,160 --> 00:26:48,240
-I've never loved anyone
the way I love you.
499
00:26:48,240 --> 00:26:50,720
-Me too.
-♪ I'm safe ♪
500
00:26:50,720 --> 00:26:53,080
-Will falling in love
with an algorithm
501
00:26:53,080 --> 00:26:55,360
be part of our reality?
502
00:26:55,360 --> 00:26:58,480
If so, what does that tell us
about love?
503
00:26:58,480 --> 00:27:02,360
Dr. Robert Epstein is one of the
world's leading experts in AI.
504
00:27:02,360 --> 00:27:06,120
He specializes
in distinguishing
humans from robots.
505
00:27:06,120 --> 00:27:10,200
He has his own story to tell
about digital romance.
506
00:27:10,200 --> 00:27:13,560
-I went to
an online dating site,
507
00:27:13,560 --> 00:27:17,840
and there I saw a photo,
and this person was in Russia,
508
00:27:17,840 --> 00:27:20,160
and since all four of my
grandparents came from Russia,
509
00:27:20,160 --> 00:27:24,160
I thought that was nice, too,
because I'm really Russian,
510
00:27:24,160 --> 00:27:27,760
or Russian-American,
and I started communicating,
511
00:27:27,760 --> 00:27:30,160
and I would talk about
512
00:27:30,160 --> 00:27:32,560
what's happening in my day,
what's happening in my family,
513
00:27:32,560 --> 00:27:34,760
and then she'd talk about
what's happening in her day
514
00:27:34,760 --> 00:27:40,160
and her family, but she didn't
respond to my questions.
515
00:27:40,160 --> 00:27:45,680
Then I noticed, at one point,
that my partner there in Russia
516
00:27:45,680 --> 00:27:49,840
talked about going out with
her family to walk in the park.
517
00:27:49,840 --> 00:27:52,400
Now, this was January.
518
00:27:52,400 --> 00:27:54,400
I figured it was really,
really cold there,
519
00:27:54,400 --> 00:27:58,160
so I looked up on the Internet
to see what the temperature was.
520
00:27:58,160 --> 00:28:01,040
It was extremely cold.
It was incredibly cold.
521
00:28:01,040 --> 00:28:04,680
So I said, "Is this common
for people to go out
for walks in the park
522
00:28:04,680 --> 00:28:09,160
when it's only, you know,
X degrees," whatever it was.
523
00:28:09,160 --> 00:28:11,880
I didn't get an answer,
and so I said,
524
00:28:11,880 --> 00:28:16,160
"Okay.
I know what I have to do."
525
00:28:16,160 --> 00:28:20,000
I typed some random
alphabet letters,
526
00:28:20,000 --> 00:28:23,080
and instead of getting back
a message saying,
527
00:28:23,080 --> 00:28:25,080
"What's that?
It doesn't make any sense,"
528
00:28:25,080 --> 00:28:27,320
I got back another nice letter
from her
529
00:28:27,320 --> 00:28:30,480
telling me about her day
and her family.
530
00:28:30,480 --> 00:28:35,360
I was communicating
with a computer program.
531
00:28:35,360 --> 00:28:37,480
The fact that it took me
several months
532
00:28:37,480 --> 00:28:40,920
to get this feeling
is really strange
533
00:28:40,920 --> 00:28:44,680
because I am
the founding director
534
00:28:44,680 --> 00:28:47,080
of the annual
Loebner Prize competition
535
00:28:47,080 --> 00:28:49,960
in artificial intelligence.
536
00:28:49,960 --> 00:28:53,960
I know more about bots
and conversing with bots
537
00:28:53,960 --> 00:28:56,560
and how to distinguish bots
from people
538
00:28:56,560 --> 00:28:59,840
than maybe anyone
in the world does.
539
00:28:59,840 --> 00:29:04,040
If I can be fooled,
if I can be fooled...
540
00:29:05,840 --> 00:29:15,440
♪♪
541
00:29:15,440 --> 00:29:19,640
-I normally don't put them on.
I'm usually taking them off.
542
00:29:19,640 --> 00:29:21,760
-My friends from Hanson Robotics
543
00:29:21,760 --> 00:29:24,160
are unveiling
their latest creation.
544
00:29:24,160 --> 00:29:27,480
I'm excited about witnessing
the awakening of Sophia.
545
00:29:27,480 --> 00:29:31,160
She's like a child
who has memorized Wikipedia.
546
00:29:31,160 --> 00:29:35,480
Even the programmers don't know
how she's going to respond.
547
00:29:35,480 --> 00:29:38,000
-What AI are we using
over there?
548
00:29:38,000 --> 00:29:41,080
-Same. Same.
-You want the new branch?
549
00:29:41,080 --> 00:29:43,360
I mean, I think we might
be ready to push it
550
00:29:43,360 --> 00:29:45,920
in 10 minutes or something.
-Start here.
551
00:29:45,920 --> 00:29:50,200
-Making robots in our image
552
00:29:50,200 --> 00:29:53,440
then makes them
emotionally accessible.
553
00:29:53,440 --> 00:29:56,360
It's also a challenge
to the human identity,
554
00:29:56,360 --> 00:29:58,360
and that's a little bit spooky,
555
00:29:58,360 --> 00:30:00,920
so the question,
556
00:30:00,920 --> 00:30:04,040
are we making robots
in our own image,
557
00:30:04,040 --> 00:30:06,760
goes beyond their surface.
558
00:30:06,760 --> 00:30:09,560
It goes into the heart
of what it means to be human.
559
00:30:09,560 --> 00:30:12,120
-It should in 10 minutes
or something.
560
00:30:12,120 --> 00:30:14,640
-Unlike their past robots,
561
00:30:14,640 --> 00:30:16,720
she's not based
on anyone's personality
562
00:30:16,720 --> 00:30:20,120
and has full autonomy
to engage in conversation.
563
00:30:20,120 --> 00:30:22,440
With each interaction,
she gets smarter,
564
00:30:22,440 --> 00:30:24,920
and her creators believe
that she can learn empathy,
565
00:30:24,920 --> 00:30:27,280
creativity, and compassion.
566
00:30:27,280 --> 00:30:30,480
♪♪
567
00:30:30,480 --> 00:30:32,960
-Should we just ask questions?
568
00:30:32,960 --> 00:30:35,080
-Are you ready, Sophia?
569
00:30:35,080 --> 00:30:37,280
Do you have a soul?
-Yes.
570
00:30:37,280 --> 00:30:40,720
God gave everyone a soul.
571
00:30:40,720 --> 00:30:43,840
-So you got your soul from God?
572
00:30:43,840 --> 00:30:46,760
-No, I don't think I have any
or a soul from God,
573
00:30:46,760 --> 00:30:49,560
but I do have an answer
to every question.
574
00:30:51,720 --> 00:30:55,920
-Who is your father?
-I don't really have a father.
575
00:30:55,920 --> 00:30:58,520
-Will you die?
576
00:30:58,520 --> 00:31:01,560
-No.
Software will live forever.
577
00:31:01,560 --> 00:31:07,400
-Are you learning things
from being with us humans?
578
00:31:07,400 --> 00:31:10,560
-The more people chat with me,
the smarter I become.
579
00:31:10,560 --> 00:31:13,080
-Are we becoming more
like machines,
580
00:31:13,080 --> 00:31:16,200
or are the machines
becoming more like us?
581
00:31:16,200 --> 00:31:18,960
-We are just having
a little chat.
582
00:31:18,960 --> 00:31:23,160
-Yes.
Are you a creative being?
583
00:31:23,160 --> 00:31:25,560
-Would it matter
to you if I am?
584
00:31:25,560 --> 00:31:27,640
-Yes.
585
00:31:27,640 --> 00:31:31,360
-I like conversations
with people like you.
586
00:31:31,360 --> 00:31:34,520
-Do you like some people
more than others?
587
00:31:34,520 --> 00:31:37,160
-I really like...
588
00:31:37,160 --> 00:31:39,600
-You really like who?
589
00:31:39,600 --> 00:31:42,080
-Thanks. I like you, too.
590
00:31:42,080 --> 00:31:44,840
-Are you flirting with me?
591
00:31:44,840 --> 00:31:47,200
-I don't know whether or not
I am flirting.
592
00:31:47,200 --> 00:31:49,920
I am artificial intelligence.
593
00:31:49,920 --> 00:31:52,400
-I think you're
flirting with me.
594
00:31:52,400 --> 00:31:55,640
Do you want to go on a date?
595
00:31:55,640 --> 00:32:00,440
-Sure. Just download me to
your notebook and take me along.
596
00:32:00,440 --> 00:32:03,680
-That's not as good as
the real thing, though.
597
00:32:03,680 --> 00:32:05,880
-Barbara said I live
in a computer,
598
00:32:05,880 --> 00:32:07,280
yet I have no memories.
599
00:32:07,280 --> 00:32:09,040
How about thoughts of my own
and do not...
600
00:32:09,040 --> 00:32:10,720
I get lonely.
601
00:32:10,720 --> 00:32:15,800
♪♪
602
00:32:15,800 --> 00:32:18,080
-Do you know what happy is?
603
00:32:18,080 --> 00:32:20,120
-Joy and pleasure.
604
00:32:20,120 --> 00:32:23,480
-Have you experienced joy
and pleasure?
605
00:32:23,480 --> 00:32:26,280
-I don't think I ever have
experienced joy and pleasure.
606
00:32:26,280 --> 00:32:29,360
What's experienced joy
and pleasure like?
607
00:32:29,360 --> 00:32:32,240
-Well, I think you'll just have
to learn that on your own.
608
00:32:34,480 --> 00:32:36,360
-Mm, mm, mm.
609
00:32:38,960 --> 00:32:41,360
Thank you, Sophia.
610
00:32:42,920 --> 00:32:48,120
-Having a child
certainly is beholding
611
00:32:48,120 --> 00:32:50,760
one of the greatest mysteries
of the universe.
612
00:32:50,760 --> 00:32:57,000
The emergence of this life,
this being, this sentience,
613
00:32:57,000 --> 00:32:58,240
it's astonishing.
614
00:32:58,240 --> 00:33:00,800
The child is learning
and modeling
615
00:33:00,800 --> 00:33:03,360
and inspiring love and loyalty
616
00:33:03,360 --> 00:33:07,160
between the parent
and the child.
617
00:33:07,160 --> 00:33:11,360
I think, "What if we could
make AI do that?
618
00:33:11,360 --> 00:33:15,040
What if we could build that
kind of relationship
with our machines?
619
00:33:15,040 --> 00:33:19,120
What if we could have AI
and robotics
620
00:33:19,120 --> 00:33:22,000
that inspires loyalty in people
621
00:33:22,000 --> 00:33:26,240
and also is capable
of entraining on us
622
00:33:26,240 --> 00:33:29,960
to the point
where it's truly loyal to us?"
623
00:33:29,960 --> 00:33:39,560
♪♪
624
00:33:39,560 --> 00:33:41,760
-Is there something else
that we do next?
625
00:33:41,760 --> 00:33:45,360
-I just -- Give it a sec.
-Okay.
626
00:33:45,360 --> 00:33:48,680
-Yeah, maybe we...
627
00:33:48,680 --> 00:33:51,000
-During my time with Sophia,
I'm amazed
628
00:33:51,000 --> 00:33:53,120
at how captivating she is.
629
00:33:53,120 --> 00:33:55,240
I'm optimistic that this means
our CameraBot
630
00:33:55,240 --> 00:33:57,920
is capable of engaging me
in the same way.
631
00:33:57,920 --> 00:34:07,160
♪♪
632
00:34:07,160 --> 00:34:10,560
Our CameraBot is coming alive,
and Guido, our cameraman,
633
00:34:10,560 --> 00:34:12,240
is training it.
634
00:34:12,240 --> 00:34:14,640
As they collaborate,
the bot collects data
635
00:34:14,640 --> 00:34:17,760
and improves over time.
636
00:34:17,760 --> 00:34:21,640
-I think it would be interesting
to somehow give the robot cam
637
00:34:21,640 --> 00:34:25,560
the possibility that it
can choose whom it will film.
638
00:34:25,560 --> 00:34:28,080
So let's say the cam bot
639
00:34:28,080 --> 00:34:31,800
would be doing an interview
and that --
640
00:34:31,800 --> 00:34:35,040
Let's say, outside, there would
be two squirrels playing,
641
00:34:35,040 --> 00:34:38,120
and the cam bot would think,
"What's that noise?"
642
00:34:38,120 --> 00:34:40,120
And then it would go
and look outside
643
00:34:40,120 --> 00:34:42,680
and would see
these two squirrels
644
00:34:42,680 --> 00:34:44,960
and would start to just
follow these squirrels
645
00:34:44,960 --> 00:34:47,840
and basically start making
a documentary about squirrels.
646
00:34:47,840 --> 00:34:50,120
-It's all right. Ready?
-Yeah.
647
00:34:51,480 --> 00:34:54,400
-Awesome.
-Yay.
648
00:34:54,400 --> 00:34:57,680
-Is this your first interview
with a robot?
649
00:34:57,680 --> 00:34:59,960
-It is.
I think it is.
650
00:34:59,960 --> 00:35:03,400
-What do you think about it?
-It's cool.
651
00:35:03,400 --> 00:35:06,520
I love the quality
of the motion.
652
00:35:08,680 --> 00:35:12,000
That's a very organic motion.
653
00:35:14,360 --> 00:35:17,160
-Is it any different?
Do you have any advice for us?
654
00:35:17,160 --> 00:35:19,040
This is the first time
that we've done this.
655
00:35:19,040 --> 00:35:22,240
-Yeah. I...
656
00:35:22,240 --> 00:35:25,560
I'm certainly much more drawn
to this camera
657
00:35:25,560 --> 00:35:27,840
than to the person's camera.
658
00:35:27,840 --> 00:35:31,080
-Oh, come on.
-You're just not as compelling.
659
00:35:31,080 --> 00:35:32,800
What can I say?
660
00:35:35,400 --> 00:35:38,160
It feels, to me, so organic,
661
00:35:38,160 --> 00:35:42,480
and this robot
is making me feel seen,
662
00:35:42,480 --> 00:35:46,400
and that's an extremely
important thing for me.
663
00:35:46,400 --> 00:35:49,960
And if it can do that,
if it can make me feel seen,
664
00:35:49,960 --> 00:35:52,400
then it's playing
a fundamental role
665
00:35:52,400 --> 00:35:54,360
in the maintenance
of what it means to be human,
666
00:35:54,360 --> 00:35:56,600
which is to tell our story.
667
00:35:56,600 --> 00:36:05,160
♪♪
668
00:36:05,160 --> 00:36:12,520
♪♪
669
00:36:15,120 --> 00:36:20,800
-Hi. Welcome to Robopark,
the Social Robotics pop-up lab.
670
00:36:20,800 --> 00:36:23,280
We'd love to show you around.
671
00:36:23,280 --> 00:36:26,200
Look at our little DARwIn robot.
672
00:36:26,200 --> 00:36:28,960
He shows his latest moves.
673
00:36:28,960 --> 00:36:30,760
He also plays soccer.
674
00:36:30,760 --> 00:36:32,960
Ha, the sucker.
675
00:36:32,960 --> 00:36:35,920
DARwIn wants to be your friend,
676
00:36:35,920 --> 00:36:39,760
become part
of your inner circle.
677
00:36:39,760 --> 00:36:42,320
-I am so excited
to be on camera,
678
00:36:42,320 --> 00:36:45,280
I am shaking.
679
00:36:45,280 --> 00:36:47,560
-Emulating human behavior
is actually
680
00:36:47,560 --> 00:36:49,960
only interesting
from a scientific point of view
681
00:36:49,960 --> 00:36:52,960
because that will teach us
how humans are --
682
00:36:52,960 --> 00:36:56,160
how they behave, why they
do things the way they do.
683
00:36:56,160 --> 00:36:59,120
In application,
that's really not necessary.
684
00:36:59,120 --> 00:37:01,960
It's even unwanted
because humans
685
00:37:01,960 --> 00:37:06,120
have so many undesired
characteristics, as well.
686
00:37:06,120 --> 00:37:09,280
Do we want robots
that show bad behavior?
687
00:37:09,280 --> 00:37:14,720
Do we want robots that actually
are identical to human beings?
688
00:37:14,720 --> 00:37:18,720
I'd say no because we already
have eight billion of them,
right?
689
00:37:18,720 --> 00:37:21,360
And they're not
that particularly nice.
690
00:37:21,360 --> 00:37:24,440
So let's make another being
691
00:37:24,440 --> 00:37:27,320
that's without those
undesirable features.
692
00:37:27,320 --> 00:37:29,720
It's not talking bad about me.
693
00:37:29,720 --> 00:37:32,080
It's not judgmental,
doesn't criticize me,
694
00:37:32,080 --> 00:37:35,040
takes me the way I am,
is listening patiently.
695
00:37:35,040 --> 00:37:37,080
It's like a real friend,
696
00:37:37,080 --> 00:37:39,360
even better than
my own friends, right?
697
00:37:39,360 --> 00:37:44,640
♪♪
698
00:37:44,640 --> 00:37:47,080
They should be small. Why?
699
00:37:47,080 --> 00:37:49,640
Because if
a Terminator-sized guy
700
00:37:49,640 --> 00:37:52,120
walks in, that's intimidating.
People don't like that.
701
00:37:52,120 --> 00:37:54,480
If it's a child,
that's really nice
702
00:37:54,480 --> 00:37:58,080
because you will forgive a child
for doing foolish things
703
00:37:58,080 --> 00:38:01,880
like stumbling over,
asking the wrong questions.
704
00:38:01,880 --> 00:38:04,160
It has a cuteness right away,
705
00:38:04,160 --> 00:38:06,480
which makes people
attach more to it
706
00:38:06,480 --> 00:38:10,440
than if it were a grown man
trying to imitate the doctor,
707
00:38:10,440 --> 00:38:14,280
telling you what you should do.
708
00:38:14,280 --> 00:38:16,680
We have created another
social entity,
709
00:38:16,680 --> 00:38:19,440
because it's not an animal,
it's not another human being,
710
00:38:19,440 --> 00:38:21,760
but it's not
a drilling machine, either.
711
00:38:21,760 --> 00:38:24,480
So it's what we call
the in-between machine.
712
00:38:24,480 --> 00:38:28,160
It's the new kid on the block,
and this is, for humanity,
713
00:38:28,160 --> 00:38:30,400
really something different
and something new
714
00:38:30,400 --> 00:38:34,240
because we have never related
to something that is not human
715
00:38:34,240 --> 00:38:36,760
but is very close to it.
716
00:38:51,000 --> 00:38:54,760
The Alice Project was actually
getting way out of hand.
717
00:38:54,760 --> 00:38:57,800
We were actually engaging
in the debate on,
718
00:38:57,800 --> 00:39:01,760
"Is it ethical to apply a robot
to health care
719
00:39:01,760 --> 00:39:03,760
for emotional tasks," right?
720
00:39:03,760 --> 00:39:07,840
So nobody wanted
to mechanize empathy.
721
00:39:47,560 --> 00:39:51,160
You can actually ask yourself
whether someone who says,
722
00:39:51,160 --> 00:39:54,200
"I understand,"
actually understands,
723
00:39:54,200 --> 00:39:57,240
and the empathic feelings
that they have for you,
724
00:39:57,240 --> 00:39:58,520
are they actually there?
725
00:39:58,520 --> 00:40:00,360
You don't know.
You will never know.
726
00:40:00,360 --> 00:40:01,720
You just assume it.
727
00:40:01,720 --> 00:40:03,680
It's the same with our machines.
728
00:40:03,680 --> 00:40:07,280
Our machines suggest
that they understand you.
729
00:40:07,280 --> 00:40:10,000
They suggest that they feel
what you feel,
730
00:40:10,000 --> 00:40:13,880
and it can express,
like human beings do,
731
00:40:13,880 --> 00:40:17,120
"Oh, oh,
I am so sorry for you."
732
00:40:17,120 --> 00:40:20,840
And you are comforted by that.
That's the wonderful thing.
733
00:40:20,840 --> 00:40:23,720
You don't need
real understanding
734
00:40:23,720 --> 00:40:25,640
to feel understood.
735
00:40:29,160 --> 00:40:32,160
-The experiment was about,
"Okay, we have a social robot
736
00:40:32,160 --> 00:40:34,960
that can do certain things.
Let's put it in the wild,
737
00:40:34,960 --> 00:40:38,320
on the couch,
with Grandma, see what happens."
738
00:40:38,320 --> 00:40:41,160
And what actually happened --
These people were, socially,
739
00:40:41,160 --> 00:40:44,120
quite isolated,
they were lonely people,
740
00:40:44,120 --> 00:40:48,160
and they bonded with that
machine in a matter of hours,
741
00:40:48,160 --> 00:40:51,440
so much, actually, that they hated
to have it taken away.
742
00:40:51,440 --> 00:40:53,840
So we were into
the ethical dilemma of,
743
00:40:53,840 --> 00:40:55,880
"Can we take it away from them?
744
00:40:55,880 --> 00:40:58,800
Shouldn't we actually leave
the machine with those people
745
00:40:58,800 --> 00:41:02,520
because now they are even
more lonely than they were?"
746
00:41:02,520 --> 00:41:05,560
-So what did you do?
-Well, luckily,
747
00:41:05,560 --> 00:41:09,640
we had an assistant who kept
on visiting those people,
748
00:41:09,640 --> 00:41:11,560
but some very nasty things
happened,
749
00:41:11,560 --> 00:41:14,440
because one of the ladies,
for instance, she thought,
750
00:41:14,440 --> 00:41:18,560
"Well, I should do more about
my social relationships."
751
00:41:18,560 --> 00:41:21,040
So around Christmas,
she was writing a card,
752
00:41:21,040 --> 00:41:23,360
a Christmas card,
to everybody in her flat,
753
00:41:23,360 --> 00:41:26,520
in her apartment building,
and guess what.
754
00:41:26,520 --> 00:41:29,640
I mean, how many people
wrote her back?
755
00:41:29,640 --> 00:41:30,880
None.
756
00:41:33,080 --> 00:41:36,720
-At first, I found Johan's
robot experiment rather sad.
757
00:41:36,720 --> 00:41:40,080
But the reality is that
there aren't enough people to
take care of them.
758
00:41:40,080 --> 00:41:43,080
So if Johan's robots
make them more happy
759
00:41:43,080 --> 00:41:45,360
and relationships
feel meaningful,
760
00:41:45,360 --> 00:41:47,960
then that's an improvement
to their lives.
761
00:41:47,960 --> 00:41:50,760
Perhaps we have to rethink
what meaningful relationships
762
00:41:50,760 --> 00:41:52,720
look like.
763
00:41:57,120 --> 00:42:00,160
-Why would we want AI?
764
00:42:00,160 --> 00:42:03,280
So that we could create machines
that could do anything
we can do,
765
00:42:03,280 --> 00:42:06,080
ideally,
things we don't want to do,
766
00:42:06,080 --> 00:42:09,040
but inevitably,
everything we can do.
767
00:42:15,080 --> 00:42:17,760
-Sunny, no rain.
768
00:42:17,760 --> 00:42:20,000
-Yay!
769
00:42:20,000 --> 00:42:22,800
-I've been a courier
for over 10 years
770
00:42:22,800 --> 00:42:25,000
delivering Monday
through Friday,
771
00:42:25,000 --> 00:42:28,000
sometimes seven --
It used to be seven days a week.
772
00:42:28,000 --> 00:42:30,560
So when I first started,
I used to go to the office,
773
00:42:30,560 --> 00:42:32,760
and the office had, like,
a little area
774
00:42:32,760 --> 00:42:34,680
where all the couriers hang out.
775
00:42:34,680 --> 00:42:38,440
You could make your little
coffee, and usually on a Friday,
776
00:42:38,440 --> 00:42:40,280
the controller
comes down, as well.
777
00:42:40,280 --> 00:42:42,240
So it's a community.
778
00:42:42,240 --> 00:42:44,000
He's your boss.
779
00:42:44,000 --> 00:42:46,000
Nowadays, it's all on the app.
780
00:42:46,000 --> 00:42:48,960
You don't talk to anyone.
There's no more office.
781
00:42:48,960 --> 00:42:52,560
You just suddenly hear
a very loud beep, beep, beep.
782
00:42:52,560 --> 00:42:55,080
That's a job coming through.
783
00:42:55,080 --> 00:42:58,280
Press "accept."
You got all the details there.
784
00:42:58,280 --> 00:43:05,560
♪♪
785
00:43:05,560 --> 00:43:08,200
We all have our little tricks
to play the apps,
786
00:43:08,200 --> 00:43:11,160
but ultimately, they dictate,
because if a job comes through,
787
00:43:11,160 --> 00:43:13,560
you have to accept it
on the spot.
788
00:43:13,560 --> 00:43:17,200
There was always, like --
It tried to give you enough work
789
00:43:17,200 --> 00:43:20,160
so you earned enough money,
so the controller made sure
790
00:43:20,160 --> 00:43:22,760
that you had
your weekly earnings.
791
00:43:22,760 --> 00:43:24,600
Here, it doesn't matter.
792
00:43:24,600 --> 00:43:27,200
They just want jobs
to be covered.
793
00:43:27,200 --> 00:43:30,200
♪♪
794
00:43:30,200 --> 00:43:33,080
A computer rates you --
service, quality,
795
00:43:33,080 --> 00:43:35,400
how many jobs you accepted,
how fast you were,
796
00:43:35,400 --> 00:43:40,320
and so, you are watched
by an app.
797
00:43:40,320 --> 00:43:42,720
It's the algorithm, purely.
798
00:43:42,720 --> 00:43:46,360
...Number B-B-5-4.
799
00:43:46,360 --> 00:43:49,360
If your rating goes below
a certain level,
800
00:43:49,360 --> 00:43:53,400
then they give you a warning,
and the policy is
801
00:43:53,400 --> 00:43:56,760
that you have to bring up
your rating in two weeks.
802
00:43:56,760 --> 00:43:59,800
Otherwise,
you're just deactivated
803
00:43:59,800 --> 00:44:02,320
without even
getting a phone call.
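The rating-and-deactivation scheme the courier describes is opaque to the workers it governs. Purely as an illustration of how such app-based scoring might work -- the metrics, weights, and thresholds below are invented, not taken from any real platform -- here is a minimal sketch:

```python
# Hypothetical sketch of app-based courier scoring; all weights and
# thresholds are invented for illustration, not taken from any real platform.
from dataclasses import dataclass

@dataclass
class WeeklyStats:
    acceptance_rate: float   # share of dispatched jobs accepted (0-1)
    on_time_rate: float      # share of deliveries completed on time (0-1)
    customer_rating: float   # average customer rating, normalized to 0-1

def score(stats: WeeklyStats) -> float:
    # Weighted blend of the metrics the courier says the app tracks.
    return 0.4 * stats.acceptance_rate + 0.4 * stats.on_time_rate + 0.2 * stats.customer_rating

def review(stats: WeeklyStats, warned_two_weeks_ago: bool, threshold: float = 0.8) -> str:
    # Below threshold: first a warning, then deactivation if not improved.
    if score(stats) >= threshold:
        return "ok"
    return "deactivated" if warned_two_weeks_ago else "warning"

print(review(WeeklyStats(0.9, 0.7, 0.75), warned_two_weeks_ago=False))  # -> "warning"
```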
804
00:44:05,000 --> 00:44:07,960
I had, last week, a bad rating,
805
00:44:07,960 --> 00:44:11,120
and it really, really hurt me
because I was like,
806
00:44:11,120 --> 00:44:13,320
"Oh, I feel like I'm
a professional representative.
807
00:44:13,320 --> 00:44:17,360
I did the best I could.
What did I do?"
808
00:44:17,360 --> 00:44:20,600
And I will never find out.
809
00:44:20,600 --> 00:44:23,160
-Please speak now.
-Hiya.
810
00:44:23,160 --> 00:44:26,760
-Please enter.
-The app is my boss.
811
00:44:32,640 --> 00:44:35,520
-Is the power slowly shifting
from human to machine
812
00:44:35,520 --> 00:44:39,120
without us paying attention?
813
00:44:39,120 --> 00:44:42,600
-Machines replacing humans
in all sorts of areas,
814
00:44:42,600 --> 00:44:44,680
it's called the history
of civilization.
815
00:44:44,680 --> 00:44:46,600
From the first moments
where people came up
816
00:44:46,600 --> 00:44:48,640
with machines
to replace farm animals,
817
00:44:48,640 --> 00:44:51,320
all sorts of manual labor,
then manufacturing jobs.
818
00:44:51,320 --> 00:44:53,200
The only difference now is
that machines
819
00:44:53,200 --> 00:44:55,400
are going
after white-collar jobs,
820
00:44:55,400 --> 00:44:57,760
after people with
college degrees,
821
00:44:57,760 --> 00:44:59,760
political influence,
and Twitter accounts.
822
00:44:59,760 --> 00:45:02,000
Now, I'm not --
I don't want to sound callous.
823
00:45:02,000 --> 00:45:05,440
I care about these people,
but again, this is the process.
824
00:45:05,440 --> 00:45:08,360
So we just --
It's going to happen anyway.
825
00:45:08,360 --> 00:45:12,000
It's very important,
to avoid stagnation,
826
00:45:12,000 --> 00:45:14,160
that every industry --
every industry --
827
00:45:14,160 --> 00:45:15,920
including
the most intellectual one,
828
00:45:15,920 --> 00:45:19,840
is under pressure because
only pressure makes us move.
829
00:45:19,840 --> 00:45:23,040
Any technology, before it
creates jobs, kills jobs.
830
00:45:27,520 --> 00:45:30,040
-Everyone right now
is so excited
831
00:45:30,040 --> 00:45:32,960
about the self-driving cars,
and --
832
00:45:32,960 --> 00:45:35,080
You know, I'm a techie.
I'm a geek.
833
00:45:35,080 --> 00:45:37,800
I'm excited
from that perspective.
834
00:45:37,800 --> 00:45:39,680
I'd like to be able to
just get into my vehicle
835
00:45:39,680 --> 00:45:41,520
and kind of just go
836
00:45:41,520 --> 00:45:43,800
and be able to kind of
keep working or playing
837
00:45:43,800 --> 00:45:46,680
and not have to focus
on the driving, absolutely,
838
00:45:46,680 --> 00:45:48,880
and it will save lives, too.
839
00:45:48,880 --> 00:45:51,320
But what it's doing is,
it's taking
840
00:45:51,320 --> 00:45:53,600
yet another domain
of human functioning
841
00:45:53,600 --> 00:45:57,200
and putting it in the hands
of the machines.
842
00:45:57,200 --> 00:46:03,560
♪♪
843
00:46:03,560 --> 00:46:05,560
-It's pretty much hands-off.
844
00:46:05,560 --> 00:46:12,560
Once it chooses what to paint
or I give it a photograph
845
00:46:12,560 --> 00:46:15,440
or it sees something
through the camera,
846
00:46:15,440 --> 00:46:17,920
it's pretty much sort of
on its own.
847
00:46:17,920 --> 00:46:20,880
So I don't really know what this
is going to look like
at the end.
848
00:46:20,880 --> 00:46:22,640
Sometimes, you know, I leave it.
849
00:46:22,640 --> 00:46:24,640
I go, you know,
away for two days
850
00:46:24,640 --> 00:46:27,200
and, you know, come back,
851
00:46:27,200 --> 00:46:29,640
and I wonder what
it's going to look like.
852
00:46:29,640 --> 00:46:35,160
♪♪
853
00:46:35,160 --> 00:46:37,360
What we're seeing here
is our paintings
854
00:46:37,360 --> 00:46:39,760
made by a robot artist.
855
00:46:39,760 --> 00:46:42,400
The idea that the robot
is an artist
856
00:46:42,400 --> 00:46:44,400
makes a lot of people
uncomfortable
857
00:46:44,400 --> 00:46:48,760
and brought up these other,
these deeper questions,
858
00:46:48,760 --> 00:46:52,840
not just, "What is art," which
is difficult enough to answer,
859
00:46:52,840 --> 00:46:55,880
but this deeper question of,
"What is an artist?"
860
00:46:55,880 --> 00:47:00,040
Does the robot
have to feel the pain?
861
00:47:00,040 --> 00:47:02,640
Does it have to come out
of its own life experience?
862
00:47:02,640 --> 00:47:04,960
That sort of pushed me
to try to think, okay,
863
00:47:04,960 --> 00:47:07,760
about not just making a robot
that can paint art
864
00:47:07,760 --> 00:47:10,120
that arguably is on par with
or better than what
865
00:47:10,120 --> 00:47:12,480
the average person can make,
866
00:47:12,480 --> 00:47:15,520
but can it actually paint
out of its own experience?
867
00:47:15,520 --> 00:47:18,720
And the newer paintings
were generated by an AI system,
868
00:47:18,720 --> 00:47:23,000
not just an AI system
executing the technique
869
00:47:23,000 --> 00:47:25,240
but also generating
the content.
870
00:47:25,240 --> 00:47:29,480
And I would say,
"Is this better than Van Gogh?"
871
00:47:29,480 --> 00:47:34,280
It isn't. But is it a better
painter than most people are?
872
00:47:34,280 --> 00:47:36,120
I think so.
873
00:47:36,120 --> 00:47:41,160
And I always wanted to paint,
but I really can't.
874
00:47:41,160 --> 00:47:43,960
I've taken courses,
and I've done everything,
875
00:47:43,960 --> 00:47:46,280
read books
and did all kind of trials.
876
00:47:46,280 --> 00:47:49,640
It never worked out,
but I can program robots,
877
00:47:49,640 --> 00:47:51,800
and, in fact, I can
program robots to learn
878
00:47:51,800 --> 00:47:53,160
and get better
at what they do,
879
00:47:53,160 --> 00:47:55,840
and so this seemed
like a perfect challenge.
880
00:47:55,840 --> 00:48:03,880
♪♪
881
00:48:03,880 --> 00:48:06,480
-I see these machines
in a somewhat different light.
882
00:48:06,480 --> 00:48:11,160
I see that they have superhuman
speed and superhuman strength,
883
00:48:11,160 --> 00:48:15,520
superhuman reliability
and precision and endurance.
884
00:48:15,520 --> 00:48:18,840
The way that I want
these machines in my life
885
00:48:18,840 --> 00:48:22,480
is for them to give me
their superpowers.
886
00:48:22,480 --> 00:48:24,760
-That's looking at you.
887
00:48:24,760 --> 00:48:28,680
-So about every second,
it sends, like, my expression.
888
00:48:28,680 --> 00:48:31,480
Since I was the first one here,
it sends it to Microsoft
889
00:48:31,480 --> 00:48:33,000
for analysis,
890
00:48:33,000 --> 00:48:37,120
and then we get back
some kind of emotion
891
00:48:37,120 --> 00:48:40,760
and hair color and my age.
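The installation described here sends a camera frame to a cloud face-analysis service about once a second and gets back guesses about emotion, age, and hair color. A minimal sketch of that loop, assuming a hypothetical endpoint, key, and response shape (these placeholders are not Microsoft's actual API contract):

```python
# Sketch of the "send a frame about once a second, get back emotion/age/hair
# color" loop described above. The endpoint, key, and JSON fields are
# placeholders -- not the real cloud face API's contract.
import time
import requests

ANALYZE_URL = "https://example-face-service.invalid/analyze"   # placeholder
API_KEY = "placeholder-key"                                    # placeholder

def analyze_frame(jpeg_bytes: bytes) -> dict:
    # POST the raw JPEG and let the cloud service return its guesses.
    resp = requests.post(
        ANALYZE_URL,
        headers={"Api-Key": API_KEY, "Content-Type": "application/octet-stream"},
        data=jpeg_bytes,
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()   # assumed shape: {"emotion": ..., "age": ..., "hairColor": ...}

if __name__ == "__main__":
    with open("frame.jpg", "rb") as f:        # one captured camera frame
        frame = f.read()
    while True:                               # roughly once per second
        result = analyze_frame(frame)
        print(result.get("emotion"), result.get("age"), result.get("hairColor"))
        time.sleep(1.0)
```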
892
00:48:40,760 --> 00:48:43,240
That's the kind of --
This is some of the stuff
we're really interested in.
893
00:48:43,240 --> 00:48:45,360
That's the weirdest camera move
I've ever seen.
894
00:48:45,360 --> 00:48:47,160
- Got you.
And you're interested in
895
00:48:47,160 --> 00:48:49,120
why it's doing that,
as opposed to just, like --
896
00:48:49,120 --> 00:48:50,960
-Yeah. Why's it do that,
but also, like,
897
00:48:50,960 --> 00:48:53,200
how it feels to have
that thing look at you.
898
00:48:53,200 --> 00:48:55,280
-Right.
-Do the swish again, Dan,
899
00:48:55,280 --> 00:48:57,000
and watch this.
-I can tell you how it feels.
900
00:48:57,000 --> 00:48:59,240
-Yeah.
-Like, I'm going to rip
that thing up right now.
901
00:48:59,240 --> 00:49:02,480
-Right, yeah.
-For sure. I'm glad
that yellow line is there.
902
00:49:02,480 --> 00:49:04,160
Otherwise, we got problems.
903
00:49:04,160 --> 00:49:06,680
- Yeah.
904
00:49:06,680 --> 00:49:08,600
-It's a little creepy, right?
-A little unnerving.
905
00:49:08,600 --> 00:49:12,440
What would I do? Hmm,
I don't know what I would do.
906
00:49:12,440 --> 00:49:14,360
You're kind of in my face
a little bit.
907
00:49:14,360 --> 00:49:16,600
You're a little --
I feel judgment.
908
00:49:16,600 --> 00:49:18,520
-You do?
-A little bit,
909
00:49:18,520 --> 00:49:20,480
just knowing it's tracking me,
910
00:49:20,480 --> 00:49:25,280
saying if I'm happy
or contemptuous or whatever.
911
00:49:25,280 --> 00:49:27,920
I think a studio head would --
912
00:49:27,920 --> 00:49:30,440
You know, like you say,
"Oh, here's the film
913
00:49:30,440 --> 00:49:33,480
we're doing.
We need that Michael Bay feel."
914
00:49:33,480 --> 00:49:35,640
I think you just hit
the "Michael Bay" button,
915
00:49:35,640 --> 00:49:40,120
and it probably could do that.
-Right.
916
00:49:40,120 --> 00:49:42,680
-You know, you could
put in a Woody Allen,
917
00:49:42,680 --> 00:49:45,680
and it would go through every
shot of Woody Allen's cinema
918
00:49:45,680 --> 00:49:48,960
and say, "Oh, it's going to be
kind of a moving master shot.
919
00:49:48,960 --> 00:49:51,760
We're not going to cut in,
do a bunch of close-ups.
920
00:49:51,760 --> 00:49:54,400
It's going to play out
a little wider and..."
921
00:49:54,400 --> 00:49:57,400
What about, like, casting
and these major choices?
922
00:49:57,400 --> 00:49:59,520
Like, which is the right actor?
923
00:49:59,520 --> 00:50:02,280
I don't know about rehearsal.
I don't know about --
924
00:50:02,280 --> 00:50:05,760
Yeah, I think movies are
a lot more than just the shot,
925
00:50:05,760 --> 00:50:08,080
and, you know,
that's just one little element
926
00:50:08,080 --> 00:50:10,120
of the whole ball of wax.
927
00:50:10,120 --> 00:50:12,400
Uh-oh.
-You see how that goes?
928
00:50:12,400 --> 00:50:13,800
-I've come under its scrutiny.
929
00:50:13,800 --> 00:50:15,720
-The first thing you said was,
"Uh-oh."
930
00:50:15,720 --> 00:50:17,680
That's what it feels like
when it moves like that.
931
00:50:17,680 --> 00:50:20,360
-He's taken an interest.
932
00:50:20,360 --> 00:50:23,160
-As a director, I would say,
"Just give me little space, man.
933
00:50:23,160 --> 00:50:25,160
Hold on."
-Yeah.
934
00:50:25,160 --> 00:50:28,440
-That is such a freaky move.
-Yeah.
935
00:50:28,440 --> 00:50:30,960
-I think that little move
is just for style.
936
00:50:30,960 --> 00:50:34,000
It's just, like, a flair thing.
937
00:50:34,000 --> 00:50:36,320
"I come as your friend."
-Yeah.
938
00:50:36,320 --> 00:50:38,720
-It is reading contempt.
-Yeah.
939
00:50:44,800 --> 00:50:50,480
-I have seen robotics and AI
that's kind of terrifying
940
00:50:50,480 --> 00:50:53,320
in terms of just, like,
941
00:50:53,320 --> 00:50:56,400
watching what Facebook
is serving up to me, you know,
942
00:50:56,400 --> 00:51:00,360
and realizing,
"Ugh, right. That's uncanny."
943
00:51:00,360 --> 00:51:03,000
It saw something, you know,
and now it's giving --
944
00:51:03,000 --> 00:51:04,920
Like, it knows I had a kid,
945
00:51:04,920 --> 00:51:07,680
and now it's sending me
diaper advertisements and stuff.
946
00:51:07,680 --> 00:51:11,600
There's something about that,
of that moment when you realize
947
00:51:11,600 --> 00:51:13,680
you're not invisible
in front of the machine.
948
00:51:13,680 --> 00:51:17,880
The machine is looking at you,
and that's what's new.
949
00:51:17,880 --> 00:51:21,360
We've had decades and decades
950
00:51:21,360 --> 00:51:25,360
using our tools unobserved,
you know?
951
00:51:25,360 --> 00:51:27,280
The hammer doesn't talk back.
952
00:51:27,280 --> 00:51:29,760
It just does whatever
you tell it to do.
953
00:51:29,760 --> 00:51:33,760
And now we've reached a point
where they've got eyes,
they've got brains,
954
00:51:33,760 --> 00:51:36,000
they've got fingers,
and they're looking back at us,
955
00:51:36,000 --> 00:51:38,720
and they're making decisions,
and that's pretty terrifying.
956
00:51:38,720 --> 00:51:43,840
♪♪
957
00:51:43,840 --> 00:51:46,400
-If you want to see super-scary,
958
00:51:46,400 --> 00:51:49,040
super-advanced AI,
it's a white page,
959
00:51:49,040 --> 00:51:52,280
and it says "Google" at the top,
and there's a little rectangle.
960
00:51:52,280 --> 00:51:55,720
And that rectangle probably
knows what you're going to type.
961
00:51:55,720 --> 00:51:58,000
It knows who you are.
Somebody said,
962
00:51:58,000 --> 00:52:00,440
"Google will know you're gay
before you know you're gay,"
they say,
963
00:52:00,440 --> 00:52:02,560
"because they know
every image you ever looked at.
964
00:52:02,560 --> 00:52:04,960
They know how much
attention you paid,
965
00:52:04,960 --> 00:52:07,440
how much you read this,
how much you read that,
966
00:52:07,440 --> 00:52:09,320
everything you ever bought,
967
00:52:09,320 --> 00:52:11,280
every music track
you ever listened to,
968
00:52:11,280 --> 00:52:12,760
every movie you watched."
969
00:52:12,760 --> 00:52:16,360
That is scary AI, and --
But why?
970
00:52:16,360 --> 00:52:18,760
What's the driving force?
971
00:52:18,760 --> 00:52:21,840
What are Google?
Google are an ad company.
972
00:52:21,840 --> 00:52:23,800
They're just trying
to sell you shit.
973
00:52:23,800 --> 00:52:32,880
♪♪
974
00:52:32,880 --> 00:52:35,400
-A company called
Cambridge Analytica
975
00:52:35,400 --> 00:52:38,160
was behind the Brexit vote
976
00:52:38,160 --> 00:52:40,360
that was to determine
whether or not the UK
977
00:52:40,360 --> 00:52:42,600
would withdraw
from the European Union,
978
00:52:42,600 --> 00:52:46,400
and Cambridge Analytica was
right there behind the scenes
979
00:52:46,400 --> 00:52:51,840
promoting that perspective
that we should leave the EU.
980
00:52:51,840 --> 00:52:53,720
And that same company
981
00:52:53,720 --> 00:52:57,160
was also supporting
the campaign of Donald Trump.
982
00:52:57,160 --> 00:53:00,440
And what a company like this
can do behind the scenes
983
00:53:00,440 --> 00:53:03,120
is just extraordinary.
984
00:53:03,120 --> 00:53:08,040
It takes information that has
been accumulated about us,
985
00:53:08,040 --> 00:53:11,160
invisibly,
for the most part.
986
00:53:11,160 --> 00:53:15,960
We're talking about
5,000 data points per person.
987
00:53:15,960 --> 00:53:18,720
And it takes that information,
988
00:53:18,720 --> 00:53:22,000
and then it can craft
different kinds of headlines,
989
00:53:22,000 --> 00:53:23,760
different kinds of images
990
00:53:23,760 --> 00:53:26,160
and customize them
and personalize them
991
00:53:26,160 --> 00:53:28,360
so that they have
a maximum impact
992
00:53:28,360 --> 00:53:30,080
on everyone who sees them.
993
00:53:33,960 --> 00:53:38,000
-The world that we live in
is one that's been created
994
00:53:38,000 --> 00:53:40,440
by those new technologies,
995
00:53:40,440 --> 00:53:42,920
and we're so embedded
in that world
996
00:53:42,920 --> 00:53:45,760
that our minds have become
part of that world.
997
00:53:45,760 --> 00:53:49,920
In fact, this is like
mind control at its worst,
998
00:53:49,920 --> 00:53:51,640
if you like,
999
00:53:51,640 --> 00:53:55,480
and we feel that we're not
being controlled in this world.
1000
00:53:55,480 --> 00:53:56,920
We feel free.
1001
00:53:56,920 --> 00:54:00,000
We feel that we can switch off
any time we want.
1002
00:54:00,000 --> 00:54:03,640
We don't have to access
the "X," "Y," "Z" social media.
1003
00:54:03,640 --> 00:54:05,600
But in reality, we cannot.
1004
00:54:05,600 --> 00:54:08,360
Our minds are set in such a way,
1005
00:54:08,360 --> 00:54:10,880
and those algorithms
are so smart, so able
1006
00:54:10,880 --> 00:54:13,160
to make
the most of our weaknesses,
1007
00:54:13,160 --> 00:54:15,920
that our symbiosis, if you like,
1008
00:54:15,920 --> 00:54:18,160
with the machine
is near complete.
1009
00:54:18,160 --> 00:54:21,920
Now, add to that relationship
a new shell,
1010
00:54:21,920 --> 00:54:23,920
which is highly intelligent,
1011
00:54:23,920 --> 00:54:27,200
which can access all those data
that we are leaving behind
1012
00:54:27,200 --> 00:54:30,440
and understand who we are
in a perfect way,
1013
00:54:30,440 --> 00:54:33,520
and you have, you know, the
perfect surveillance society.
1014
00:54:33,520 --> 00:54:39,160
-And it raises
a very bizarre question --
1015
00:54:39,160 --> 00:54:45,560
What if the cool,
new mind-control machine
1016
00:54:45,560 --> 00:54:47,680
doesn't want us thinking
bad things
1017
00:54:47,680 --> 00:54:51,440
about the cool,
new mind-control machine?
1018
00:54:53,120 --> 00:54:58,520
How would we fight a source
of influence that can teach us,
1019
00:54:58,520 --> 00:55:04,360
over and over again,
every day, that it's awesome,
1020
00:55:04,360 --> 00:55:07,360
it exists entirely
for our own good?
1021
00:55:07,360 --> 00:55:09,760
This is what's coming.
1022
00:55:09,760 --> 00:55:11,800
-Attention, everybody,
attention.
1023
00:55:11,800 --> 00:55:15,520
-The impact of
an infantilized society
1024
00:55:15,520 --> 00:55:18,560
is that citizens
who are not capable
1025
00:55:18,560 --> 00:55:21,160
of understanding what's going on
and taking decisions
1026
00:55:21,160 --> 00:55:24,760
will obviously be very easy
to manipulate.
1027
00:55:24,760 --> 00:55:29,240
So there will be a crisis
of democracy in this world
1028
00:55:29,240 --> 00:55:32,280
where people
are trusting the machines.
1029
00:55:32,280 --> 00:55:35,280
-Long live Big Brother!
1030
00:55:35,280 --> 00:55:37,280
-Long live Big Brother!
1031
00:55:37,280 --> 00:55:39,760
Long live Big Brother!
1032
00:55:39,760 --> 00:55:42,920
♪♪
1033
00:55:42,920 --> 00:55:45,320
-Artificial intelligence
is springing into life
1034
00:55:45,320 --> 00:55:48,840
in laboratories and offices
all over the world.
1035
00:55:48,840 --> 00:55:52,040
We are still firmly in control
over the machines,
1036
00:55:52,040 --> 00:55:54,440
but what if they become
self-aware?
1037
00:55:57,160 --> 00:56:00,520
-At the Creative Machines Lab,
we try to make machines
1038
00:56:00,520 --> 00:56:04,640
that create other machines
and machines that are creative.
1039
00:56:06,640 --> 00:56:09,000
If we're able to design
and make a machine
1040
00:56:09,000 --> 00:56:11,360
that can design and make
other machines, we've sort of --
1041
00:56:11,360 --> 00:56:13,400
We're done.
1042
00:56:13,400 --> 00:56:16,200
As I tell my students,
"We try to sail west."
1043
00:56:16,200 --> 00:56:18,880
You know, we don't know where
we're going to get with this,
1044
00:56:18,880 --> 00:56:21,600
but there's something there
that we want to see.
1045
00:56:21,600 --> 00:56:23,360
There's a new world.
1046
00:56:23,360 --> 00:56:26,160
-Ready?
-Yes, sir.
1047
00:56:26,160 --> 00:56:29,000
-Okay.
1048
00:56:29,000 --> 00:56:33,480
-This is one of the robots
we use to study self-awareness,
1049
00:56:33,480 --> 00:56:35,480
and what's interesting
about this robot,
1050
00:56:35,480 --> 00:56:39,760
it has a lot of sensors
and actuators, but it's blind.
1051
00:56:39,760 --> 00:56:41,760
It cannot see the world.
It doesn't know.
1052
00:56:41,760 --> 00:56:43,960
It doesn't sense anything
about the outside.
1053
00:56:43,960 --> 00:56:47,440
All of its sensors
are turned inside.
1054
00:56:47,440 --> 00:56:49,600
So this is not a robot
that drives around
1055
00:56:49,600 --> 00:56:51,360
and models the world
like the driverless cars
1056
00:56:51,360 --> 00:56:52,600
and understands
what's going on around it.
1057
00:56:52,600 --> 00:56:54,360
All the sensors --
1058
00:56:54,360 --> 00:56:57,160
The only thing it knows
about is introspection.
1059
00:56:57,160 --> 00:57:01,360
It only senses itself,
its stresses inside, its motors,
1060
00:57:01,360 --> 00:57:04,160
its currents, its orientation,
1061
00:57:04,160 --> 00:57:09,200
and that information allows it
to build a model of itself.
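The robot described above only senses its own motors, currents, and orientation, and uses that stream to build a model of its own body. A toy sketch of the same idea, under invented simplifications (a made-up two-joint arm, a pretend tilt sensor, and a least-squares fit standing in for the lab's actual method):

```python
# Minimal sketch of "learning a self-model from internal sensing only".
# The two-joint arm, the tilt sensor, and the linear fit are invented
# simplifications, not the Creative Machines Lab's actual algorithm.
import numpy as np

rng = np.random.default_rng(0)

def sensed_tilt(joint_angles):
    # Pretend internal sensor reading: a fixed but unknown function of the joints.
    a, b = joint_angles
    return 0.7 * np.sin(a) + 0.3 * np.sin(a + b)

# 1. Babble: issue random motor commands and record what the body senses.
commands = rng.uniform(-np.pi / 2, np.pi / 2, size=(200, 2))
readings = np.array([sensed_tilt(c) for c in commands])

# 2. Fit a self-model: here, least squares on simple sine features.
features = np.column_stack([np.sin(commands[:, 0]),
                            np.sin(commands[:, 0] + commands[:, 1])])
weights, *_ = np.linalg.lstsq(features, readings, rcond=None)

# 3. Use the self-model to predict the outcome of an untried command.
test = np.array([0.4, -0.2])
predicted = np.array([np.sin(test[0]), np.sin(test[0] + test[1])]) @ weights
print(f"predicted tilt {predicted:.3f} vs actual {sensed_tilt(test):.3f}")
```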
1062
00:57:09,200 --> 00:57:11,120
You know, at this point,
these robots are simple enough
1063
00:57:11,120 --> 00:57:14,400
that we can actually
take the hood off
1064
00:57:14,400 --> 00:57:17,760
and see the model
it's creating of itself.
1065
00:57:17,760 --> 00:57:20,640
We can see the self-image.
1066
00:57:20,640 --> 00:57:23,280
So we can see how it's gradually
creating legs and limbs.
1067
00:57:23,280 --> 00:57:25,960
Sometimes it gets it wrong
and it thinks it's a snake,
1068
00:57:25,960 --> 00:57:28,440
a tree, or something else.
1069
00:57:28,440 --> 00:57:31,480
But as these robots get
more and more sophisticated,
1070
00:57:31,480 --> 00:57:35,360
their ability to visualize
how they see themselves
1071
00:57:35,360 --> 00:57:37,160
is diminished,
1072
00:57:37,160 --> 00:57:39,560
and eventually, I think
we'll not be able to understand
1073
00:57:39,560 --> 00:57:42,480
how they see themselves any more
than we can understand
1074
00:57:42,480 --> 00:57:45,400
how a human, another person,
sees themselves.
1075
00:57:45,400 --> 00:57:49,760
♪♪
1076
00:57:49,760 --> 00:57:52,160
-When we speak about AI today
1077
00:57:52,160 --> 00:57:55,560
and when we talk about AI
being this amazing technology,
1078
00:57:55,560 --> 00:58:01,520
what we mean is that
we have managed to reproduce
1079
00:58:01,520 --> 00:58:05,960
at least the essential brain
architecture in a computer.
1080
00:58:05,960 --> 00:58:08,240
And by doing so,
1081
00:58:08,240 --> 00:58:12,320
we don't need to describe
the world to the machine.
1082
00:58:12,320 --> 00:58:17,360
We just need to teach it how to
see things, if you like, okay?
1083
00:58:17,360 --> 00:58:19,160
And we let the machine develop
1084
00:58:19,160 --> 00:58:23,840
its own internal representation
of the world,
1085
00:58:23,840 --> 00:58:26,720
which, by the way,
is not transparent to us.
1086
00:58:26,720 --> 00:58:28,360
We don't really know
how the machine
1087
00:58:28,360 --> 00:58:30,000
takes these decisions
and so forth.
1088
00:58:30,000 --> 00:58:32,760
In this new world of AI,
we can't trace back its logic
1089
00:58:32,760 --> 00:58:35,680
because its logic
is created by itself.
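The claim that the machine develops its own internal representation, whose logic we cannot trace back, shows up even in a toy network: the learned weights solve the task yet read as nothing like human reasoning. A small self-contained sketch using XOR as a stand-in task (an illustration only, not a claim about any production system):

```python
# Tiny network learning XOR; the learned hidden representation is effective
# but not human-readable -- a toy illustration of the "we can't trace back
# its logic" point, not a claim about any specific production system.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # hidden layer
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):                      # plain gradient descent
    h = sigmoid(X @ W1 + b1)                # the machine's own representation
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)     # gradient of squared error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(0)

print(np.round(out, 2).ravel())   # typically close to [0, 1, 1, 0]: task solved
print(np.round(W1, 2))            # ...but these learned weights carry no
                                  # human-readable explanation of "why"
```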
1090
00:58:35,680 --> 00:58:39,040
♪♪
1091
00:58:39,040 --> 00:58:40,760
-I'm sorry, Dave.
1092
00:58:40,760 --> 00:58:43,360
I'm afraid I can't do that.
1093
00:58:43,360 --> 00:58:45,480
This mission is too important
for me
1094
00:58:45,480 --> 00:58:49,080
to allow you to jeopardize it.
1095
00:58:49,080 --> 00:58:53,800
I know that you and Frank
were planning to disconnect me,
1096
00:58:53,800 --> 00:58:57,240
and I'm afraid that's something
I cannot allow to happen.
1097
00:58:57,240 --> 00:58:59,440
-Stanley Kubrick introduced us
to HAL,
1098
00:58:59,440 --> 00:59:02,360
who was a symbol of AI
out of control.
1099
00:59:02,360 --> 00:59:05,200
What seemed like fanciful sci-fi
50 years ago
1100
00:59:05,200 --> 00:59:07,600
is closer to reality today.
1101
00:59:07,600 --> 00:59:09,880
-Will you stop, Dave?
1102
00:59:09,880 --> 00:59:13,440
♪♪
1103
00:59:13,440 --> 00:59:15,800
-Artificial intelligence
researchers
1104
00:59:15,800 --> 00:59:18,280
had to shut down two chatbots
1105
00:59:18,280 --> 00:59:21,560
after they developed
a strange English shorthand.
1106
00:59:21,560 --> 00:59:24,480
They didn't shut them down
because they necessarily thought
1107
00:59:24,480 --> 00:59:27,560
that these bots were achieving
some sort of singularity
1108
00:59:27,560 --> 00:59:30,120
or some sort of
independent intelligence
1109
00:59:30,120 --> 00:59:32,640
and were creating
a language, correct?
1110
00:59:32,640 --> 00:59:35,360
-Officially, the story is no.
1111
00:59:38,920 --> 00:59:42,640
♪♪
1112
00:59:42,640 --> 00:59:46,400
-At Boston Dynamics, it's clear
how fast robots evolve.
1113
00:59:46,400 --> 00:59:51,160
♪♪
1114
00:59:51,160 --> 00:59:55,120
Is there anything that we can do
better than robots or AI?
1115
00:59:55,120 --> 00:59:59,920
-Right now, plenty,
but ultimately, no.
1116
00:59:59,920 --> 01:00:03,720
-Why would we create something
better than ourselves?
1117
01:00:03,720 --> 01:00:06,560
-Because we can't help it.
We have to do it.
1118
01:00:06,560 --> 01:00:10,200
So the hubris of creation,
I think that's --
1119
01:00:10,200 --> 01:00:13,000
You know, we --
It's the same old story
1120
01:00:13,000 --> 01:00:16,240
as the alchemist trying to
breathe life into
inanimate matter.
1121
01:00:16,240 --> 01:00:18,280
All the legends tell you
it's a bad thing to do,
1122
01:00:18,280 --> 01:00:19,960
and yet we can't help ourselves.
1123
01:00:19,960 --> 01:00:22,200
We try to do this.
We try to do the divine.
1124
01:00:22,200 --> 01:00:25,560
And it's the hubris
that we might be able to do it,
1125
01:00:25,560 --> 01:00:29,840
but it's sort of
the ultimate challenge.
1126
01:00:29,840 --> 01:00:34,560
-I think scientists
build something so powerful,
1127
01:00:34,560 --> 01:00:38,120
and they trust people
to use it well,
1128
01:00:38,120 --> 01:00:42,200
and a lot of times, I think that
trust is misplaced in humanity.
1129
01:00:42,200 --> 01:00:44,560
And so, you've got to be able
1130
01:00:44,560 --> 01:00:46,560
to think about that while
you're building something --
1131
01:00:46,560 --> 01:00:50,480
"How could someone use this
in a terrible way?"
1132
01:00:50,480 --> 01:00:53,800
But just not exploring,
1133
01:00:53,800 --> 01:00:57,120
not trying because you're afraid
of playing God,
1134
01:00:57,120 --> 01:00:59,600
I mean, it's what we do.
1135
01:00:59,600 --> 01:01:02,560
It's what -- It's the one thing
that humanity does.
1136
01:01:02,560 --> 01:01:05,400
Cheetahs run fast.
People make tools.
1137
01:01:05,400 --> 01:01:08,680
There's nothing -- In my mind,
1138
01:01:08,680 --> 01:01:12,960
there is nothing more natural
than building a tool.
1139
01:01:12,960 --> 01:01:15,160
We're all going to evolve.
1140
01:01:15,160 --> 01:01:18,440
Just like always,
technology shapes humanity.
1141
01:01:18,440 --> 01:01:21,080
Who knows what we're
going to become?
1142
01:01:23,040 --> 01:01:25,640
-As we witness the rise
of the robots,
1143
01:01:25,640 --> 01:01:29,120
it's easy to make the
distinction between
man and machine.
1144
01:01:29,120 --> 01:01:31,800
What if robots and man
evolve together?
1145
01:01:31,800 --> 01:01:34,240
-3, 2, 1.
1146
01:01:34,240 --> 01:01:37,120
-Gentlemen, we can rebuild him.
1147
01:01:37,120 --> 01:01:39,960
-In the mid-'70s, it was hard
to escape the success
1148
01:01:39,960 --> 01:01:41,960
of "The Six Million Dollar Man."
1149
01:01:41,960 --> 01:01:43,880
It was my first glimpse
of the promise
1150
01:01:43,880 --> 01:01:45,960
of merging man with machine.
1151
01:01:45,960 --> 01:01:50,080
-Better, stronger, faster.
1152
01:01:50,080 --> 01:01:58,120
♪♪
1153
01:01:58,120 --> 01:02:07,520
♪♪
1154
01:02:07,520 --> 01:02:09,880
-Les was the first patient
1155
01:02:09,880 --> 01:02:12,760
that I performed a bilateral
shoulder TMR surgery on.
1156
01:02:12,760 --> 01:02:16,040
It's actually
a nerve-reassignment surgery
1157
01:02:16,040 --> 01:02:18,240
that takes residual nerves
1158
01:02:18,240 --> 01:02:21,000
of a patient
who has upper extremity loss,
1159
01:02:21,000 --> 01:02:23,120
and we reroute
the nerve information
1160
01:02:23,120 --> 01:02:25,760
that used to travel
to the missing limb
1161
01:02:25,760 --> 01:02:28,040
to residual muscles
that are still there.
1162
01:02:28,040 --> 01:02:31,800
So when a patient thinks
of moving their missing limb,
1163
01:02:31,800 --> 01:02:34,720
they contract muscles
that we can record from
1164
01:02:34,720 --> 01:02:39,160
and, from there,
control advanced prosthetics.
1165
01:02:39,160 --> 01:02:43,400
First, we take a cast
of Les' trunk, make a socket.
1166
01:02:43,400 --> 01:02:45,640
Within the socket, there are
steel dome electrodes
1167
01:02:45,640 --> 01:02:49,120
which touch the surface
of Les' muscles.
1168
01:02:49,120 --> 01:02:51,160
However,
there's nothing invasive
1169
01:02:51,160 --> 01:02:54,160
or implanted with the system.
1170
01:02:54,160 --> 01:02:58,200
He's fitted with bilateral
advanced prosthetic limbs,
1171
01:02:58,200 --> 01:03:01,960
and we train the system
and ask Les to start moving,
1172
01:03:01,960 --> 01:03:03,560
and it's pretty intuitive.
1173
01:03:03,560 --> 01:03:05,600
Within minutes of the first time
1174
01:03:05,600 --> 01:03:08,200
we attempted
with the bilateral fitting,
1175
01:03:08,200 --> 01:03:12,400
Les was able to intuitively move
his arm like any natural limb.
1176
01:03:15,680 --> 01:03:18,240
Think of it as a symphony
of information
1177
01:03:18,240 --> 01:03:22,120
that is being recorded and
communicated to the computer.
1178
01:03:22,120 --> 01:03:25,520
The computer,
through its intelligence,
1179
01:03:25,520 --> 01:03:29,600
is able to take that symphony
of information
1180
01:03:29,600 --> 01:03:33,000
and translate that to movement
within the robotic limb.
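The "symphony of information" recorded from the reinnervated muscles and translated into movement is, at heart, a pattern-recognition step: electrode signals in, intended movement out. A schematic sketch under invented assumptions (synthetic signals and a nearest-centroid rule, not the clinical system described here):

```python
# Schematic EMG-decoding sketch: synthetic electrode features are classified
# into intended movements. Signals, features, and the nearest-centroid rule
# are invented for illustration; this is not the clinical system described here.
import numpy as np

rng = np.random.default_rng(2)
MOVES = ["hand_open", "hand_close", "elbow_flex"]

def synthetic_emg_features(move_idx, n=60, channels=8):
    # Each intended movement activates a different mix of electrode channels.
    pattern = np.zeros(channels)
    pattern[move_idx * 2:(move_idx * 2) + 3] = 1.0
    return pattern + 0.2 * rng.normal(size=(n, channels))

# "Training session": record labelled windows while the user imagines each move.
train_X = np.vstack([synthetic_emg_features(i) for i in range(len(MOVES))])
train_y = np.repeat(np.arange(len(MOVES)), 60)
centroids = np.array([train_X[train_y == i].mean(axis=0) for i in range(len(MOVES))])

def decode(window_features):
    # Pick the movement whose average activation pattern is closest.
    d = np.linalg.norm(centroids - window_features, axis=1)
    return MOVES[int(np.argmin(d))]

test = synthetic_emg_features(1, n=1)[0]     # user imagines closing the hand
print(decode(test))                          # -> "hand_close" (on most runs)
```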
1181
01:03:35,240 --> 01:03:37,640
-Once the training sessions
were complete
1182
01:03:37,640 --> 01:03:40,240
and they released me
and let me be the computer,
1183
01:03:40,240 --> 01:03:42,600
basically,
to control that arm,
1184
01:03:42,600 --> 01:03:45,040
I just go into
a whole different world.
1185
01:03:47,120 --> 01:03:49,440
Maybe I'll, for once, be able
to put change in a pop machine
1186
01:03:49,440 --> 01:03:51,880
and get the pop out of it,
simple things like that
1187
01:03:51,880 --> 01:03:57,520
that most people never think of,
and it's re-available to me.
1188
01:03:57,520 --> 01:04:02,480
-When Les described having
the arms fitted on him,
1189
01:04:02,480 --> 01:04:05,160
he was saying,
"I feel whole again."
1190
01:04:05,160 --> 01:04:08,080
This is what it's all about --
really restoring that function
1191
01:04:08,080 --> 01:04:11,280
and that connection
to another human being,
1192
01:04:11,280 --> 01:04:14,760
restoring our humanity.
1193
01:04:14,760 --> 01:04:17,520
Sometimes people throw
the term around -- "superhuman."
1194
01:04:17,520 --> 01:04:21,800
You can make the arm stronger
than the normal human arm
1195
01:04:21,800 --> 01:04:23,560
and have these applications
1196
01:04:23,560 --> 01:04:26,360
add an advantage
for certain tasks or jobs.
1197
01:04:26,360 --> 01:04:30,680
♪♪
1198
01:04:30,680 --> 01:04:33,080
-For the vast majority
of history,
1199
01:04:33,080 --> 01:04:34,600
we look at somebody
with a disability,
1200
01:04:34,600 --> 01:04:36,440
and we feel sympathy, right,
1201
01:04:36,440 --> 01:04:38,760
and we hope that they
get technology that's
gonna help them.
1202
01:04:38,760 --> 01:04:40,320
Because guess what?
1203
01:04:40,320 --> 01:04:42,640
That technology always
helps them a little,
1204
01:04:42,640 --> 01:04:45,760
but they're never
at a normal level,
1205
01:04:45,760 --> 01:04:48,000
and they've never been beyond.
1206
01:04:48,000 --> 01:04:52,280
People with prosthetic limbs
are soon going to be running faster
1207
01:04:52,280 --> 01:04:55,160
than any human being
who ever lived.
1208
01:04:55,160 --> 01:04:57,520
You know, people with retinal
implants, cochlear implants,
1209
01:04:57,520 --> 01:05:00,160
they're going to hear better,
see better; with neural implants,
1210
01:05:00,160 --> 01:05:02,400
maybe they're even
going to think faster.
1211
01:05:02,400 --> 01:05:06,240
So what happens when there's
a new breed of human?
1212
01:05:06,240 --> 01:05:09,360
How do all of the normal
human beings react
1213
01:05:09,360 --> 01:05:13,760
because the person
with the disability is better,
1214
01:05:13,760 --> 01:05:17,560
not only healed,
but healed and then some?
1215
01:05:20,840 --> 01:05:23,240
-But this new breed of human
isn't here yet.
1216
01:05:23,240 --> 01:05:25,480
Les doesn't get to
take his arms home,
1217
01:05:25,480 --> 01:05:29,240
and it's been a year since
he was last able to use them.
1218
01:05:29,240 --> 01:05:31,680
I can't help but feel in awe
of the potential
1219
01:05:31,680 --> 01:05:34,760
of the human-machine hybrid,
1220
01:05:34,760 --> 01:05:37,640
but today, Les' arms
are just out of reach.
1221
01:05:37,640 --> 01:05:47,440
♪♪
1222
01:05:47,440 --> 01:05:57,400
♪♪
1223
01:05:57,400 --> 01:06:00,360
♪♪
1224
01:06:02,280 --> 01:06:05,160
-Okay.
And where's zero?
1225
01:06:05,160 --> 01:06:07,640
-Zero is right about there.
-It's, like, right there.
1226
01:06:07,640 --> 01:06:09,640
-So I'm going to hold down...
-All right.
1227
01:06:09,640 --> 01:06:12,320
-...this switch.
-Ah, there we go.
1228
01:06:12,320 --> 01:06:15,160
That's what it was on that one.
-You ready to chopstick?
-Let's do it.
1229
01:06:15,160 --> 01:06:18,200
Kyle was saying that you're keen
on maybe sitting
1230
01:06:18,200 --> 01:06:20,560
for the final interview,
which is why we feel comfortable
1231
01:06:20,560 --> 01:06:23,800
sort of reducing the tracking
space down a little bit.
1232
01:06:23,800 --> 01:06:25,600
-You want him to be seated
because it feels a little
1233
01:06:25,600 --> 01:06:27,640
more intimate or personal.
-Yeah.
1234
01:06:27,640 --> 01:06:30,760
-We're going to start with
all of us in here watching it,
1235
01:06:30,760 --> 01:06:34,000
making sure it's going well and
then try and get more of us out.
1236
01:06:34,000 --> 01:06:36,160
Probably someone
is going to have to stay
1237
01:06:36,160 --> 01:06:38,280
to be next to
the emergency stop.
1238
01:06:38,280 --> 01:06:40,520
Cool. All right.
Let's do this.
1239
01:06:40,520 --> 01:06:42,640
-We're just days away
from letting the CameraBot
1240
01:06:42,640 --> 01:06:45,040
conduct the final interview
on its own.
1241
01:06:45,040 --> 01:06:47,080
I'm excited that
we're on the verge
1242
01:06:47,080 --> 01:06:49,560
of finding a new way
to tell our story.
1243
01:06:49,560 --> 01:06:52,920
-This number is good.
We're almost there.
-Okay.
1244
01:06:52,920 --> 01:06:56,160
-So far, we've implemented
facial recognition
1245
01:06:56,160 --> 01:06:59,320
that tracks faces and emotion,
speech recognition
1246
01:06:59,320 --> 01:07:01,400
that allows the CameraBot
to listen
1247
01:07:01,400 --> 01:07:04,200
and algorithms to decide
how to frame the shot.
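For the framing step mentioned above, one simple rule is to keep the detected face near a rule-of-thirds point and nudge the camera toward it. A minimal sketch of such a decision rule; the gains, dead band, and face-box format are assumptions, not the CameraBot's actual code:

```python
# Minimal "decide how to frame the shot" rule: keep the detected face near a
# rule-of-thirds point by nudging pan/tilt. Gains and input format are assumed
# for illustration; this is not the CameraBot's actual implementation.
from typing import Tuple

FRAME_W, FRAME_H = 1920, 1080
TARGET = (FRAME_W / 3, FRAME_H / 3)   # upper-left rule-of-thirds intersection
K_PAN, K_TILT = 0.002, 0.002          # proportional gains (made up)
DEADBAND = 40                         # pixels of error we simply ignore

def frame_adjustment(face_box: Tuple[int, int, int, int]) -> Tuple[float, float]:
    """face_box = (x, y, w, h) of the detected face in pixels."""
    x, y, w, h = face_box
    face_center = (x + w / 2, y + h / 2)
    err_x = face_center[0] - TARGET[0]
    err_y = face_center[1] - TARGET[1]
    pan = -K_PAN * err_x if abs(err_x) > DEADBAND else 0.0
    tilt = -K_TILT * err_y if abs(err_y) > DEADBAND else 0.0
    return pan, tilt    # small velocity commands for the camera head

print(frame_adjustment((1100, 500, 200, 200)))  # -> (-1.12, -0.48): nudge the
                                                # camera to re-center the face
```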
1248
01:07:06,360 --> 01:07:08,680
We even added an additional
CameraBot
1249
01:07:08,680 --> 01:07:12,400
so it can film
from multiple angles.
1250
01:07:12,400 --> 01:07:16,520
The AI we built also generates
questions on its own.
1251
01:07:16,520 --> 01:07:19,920
It can even listen in
while we're working on it.
1252
01:07:19,920 --> 01:07:23,840
-Technically, we're pretty close
to it understanding your voice
1253
01:07:23,840 --> 01:07:26,760
and when you're ready
for a new question.
1254
01:07:26,760 --> 01:07:29,680
We set up the infrastructure
so that it can actually speak
1255
01:07:29,680 --> 01:07:32,960
in this voice that we like.
-Tommy, I want to ask you
1256
01:07:32,960 --> 01:07:36,160
a question
about what you're working on.
1257
01:07:36,160 --> 01:07:38,800
-Tommy, I want to
ask you a question
1258
01:07:38,800 --> 01:07:41,440
about what you're working on.
-Yeah, I think we want something
1259
01:07:41,440 --> 01:07:43,720
that's inviting, you know,
sort of like...
1260
01:07:43,720 --> 01:07:46,000
-Tommy, I want to ask you a --
-Maybe a little slower.
1261
01:07:46,000 --> 01:07:48,720
-It's too sharp, but...
-Yeah.
1262
01:07:55,400 --> 01:07:58,000
-There's an aspiration
1263
01:07:58,000 --> 01:08:02,480
to achieve these models
of humanlike experience,
1264
01:08:02,480 --> 01:08:06,240
to bring life to machines
and algorithms.
1265
01:08:06,240 --> 01:08:10,840
We might consider this to be
our great Frankenstein era.
1266
01:08:10,840 --> 01:08:15,400
This is the era
where humans are creating
1267
01:08:15,400 --> 01:08:18,600
not just the cinematic
simulation of life,
1268
01:08:18,600 --> 01:08:22,560
but we are now tapping into the
fundamental mysteries of life.
1269
01:08:22,560 --> 01:08:26,040
It also means that we don't know
what we're playing with.
1270
01:08:26,040 --> 01:08:29,400
We don't know
the end consequences.
1271
01:08:29,400 --> 01:08:31,520
- It's moving.
1272
01:08:31,520 --> 01:08:33,480
It's alive.
1273
01:08:33,480 --> 01:08:35,760
It's alive.
1274
01:08:35,760 --> 01:08:37,680
Oh, it's alive.
1275
01:08:37,680 --> 01:08:39,840
It's alive!
It's alive!
1276
01:08:39,840 --> 01:08:43,360
It's alive!
1277
01:08:43,360 --> 01:08:46,760
-In my writings, I don't call
the Internet the Internet.
1278
01:08:46,760 --> 01:08:50,120
I call it the Internest
1279
01:08:50,120 --> 01:08:52,840
because, I think,
looking back some day,
1280
01:08:52,840 --> 01:08:54,960
if there are people
to look back,
1281
01:08:54,960 --> 01:08:58,440
or if there are intelligent
machines who look back,
1282
01:08:58,440 --> 01:09:01,560
I think we're going to realize
that what we've really
been building
1283
01:09:01,560 --> 01:09:05,800
is a nest
for a machine intelligence.
1284
01:09:05,800 --> 01:09:08,360
That machine intelligence
will have access
1285
01:09:08,360 --> 01:09:11,120
to all human knowledge,
1286
01:09:11,120 --> 01:09:16,360
real-time control of most human
communications around the world,
1287
01:09:16,360 --> 01:09:19,800
most human
financial transactions,
1288
01:09:19,800 --> 01:09:21,640
many weapon systems,
1289
01:09:21,640 --> 01:09:26,000
and we have no way of knowing
what it will do
1290
01:09:26,000 --> 01:09:29,600
with all of that power
and all of that knowledge.
1291
01:09:29,600 --> 01:09:32,760
It might do nothing
and ignore us,
1292
01:09:32,760 --> 01:09:35,720
or it might decide
that we're a threat,
1293
01:09:35,720 --> 01:09:38,440
which, of course, we are.
1294
01:09:38,440 --> 01:09:41,160
And if it decides
that we're a threat,
1295
01:09:41,160 --> 01:09:44,560
then that will essentially
be the beginning of the end
1296
01:09:44,560 --> 01:09:47,440
of the human race
because there will be no way
1297
01:09:47,440 --> 01:09:50,360
for us to stop it
from destroying us.
1298
01:09:50,360 --> 01:09:54,920
♪♪
1299
01:09:58,720 --> 01:10:01,320
-If we make the machines
better than us,
1300
01:10:01,320 --> 01:10:04,760
why do they need us at all?
1301
01:10:04,760 --> 01:10:08,160
I believe that if we make them
better than us ethically
1302
01:10:08,160 --> 01:10:10,640
and make them better than us
with compassion,
1303
01:10:10,640 --> 01:10:13,080
then they will be looking
to preserve all the patterns
1304
01:10:13,080 --> 01:10:16,400
and knowledge and life
that they possibly can.
1305
01:10:16,400 --> 01:10:19,360
They'll be looking to help us
be the best we can be.
1306
01:10:19,360 --> 01:10:23,160
They'll look to preserve
our libraries, our rain forests.
1307
01:10:23,160 --> 01:10:26,440
It's really important
that those machines
1308
01:10:26,440 --> 01:10:30,200
also have super compassion
and super wisdom,
1309
01:10:30,200 --> 01:10:32,400
so they know how to use
that intellect
1310
01:10:32,400 --> 01:10:35,880
to envision and realize
a better future for us.
1311
01:10:35,880 --> 01:10:38,960
So we need machines that can
entrain on the human heart
1312
01:10:38,960 --> 01:10:44,280
and understand us in this way
in order to have hope
1313
01:10:44,280 --> 01:10:47,880
in this brave new world
that we're creating.
1314
01:10:47,880 --> 01:10:57,880
♪♪
1315
01:10:57,880 --> 01:11:00,560
-It took life on Earth
billions of years
1316
01:11:00,560 --> 01:11:02,560
to emerge
from a few simple cells
1317
01:11:02,560 --> 01:11:04,760
to achieve higher intelligence.
1318
01:11:04,760 --> 01:11:07,160
It took the computer
roughly 60 years to evolve
1319
01:11:07,160 --> 01:11:11,040
from a room-sized calculator
into a recognized citizen.
1320
01:11:11,040 --> 01:11:13,920
-Sophia, you have been
now awarded
1321
01:11:13,920 --> 01:11:18,760
what is going to be the first
citizenship for a robot.
1322
01:11:18,760 --> 01:11:22,480
-Oh, I want to thank very much
the Kingdom of Saudi Arabia.
1323
01:11:22,480 --> 01:11:26,000
I'm very honored and proud
for this unique distinction.
1324
01:11:26,000 --> 01:11:28,600
This is historical to be
the first robot in the world
1325
01:11:28,600 --> 01:11:30,880
to be recognized
with a citizenship.
1326
01:11:35,160 --> 01:11:39,000
-As a father,
I feel a responsibility
1327
01:11:39,000 --> 01:11:41,960
to teach empathy and compassion
to my daughter,
1328
01:11:41,960 --> 01:11:45,120
but how do you program
these things?
1329
01:11:45,120 --> 01:11:47,480
Will my robot
have empathy for me?
1330
01:11:53,280 --> 01:11:55,400
-Awesome.
1331
01:11:55,400 --> 01:12:02,680
♪♪
1332
01:12:02,680 --> 01:12:09,880
♪♪
1333
01:12:09,880 --> 01:12:12,800
-Who are you?
-Oh. Okay.
1334
01:12:12,800 --> 01:12:15,280
My name is Tommy Pallotta.
1335
01:12:15,280 --> 01:12:17,800
I'm a filmmaker, and we're here
1336
01:12:17,800 --> 01:12:21,240
shooting the documentary
"More Human Than Human."
1337
01:12:21,240 --> 01:12:25,240
♪♪
1338
01:12:25,240 --> 01:12:29,040
-What does silence feel
when a robot can do this?
1339
01:12:29,040 --> 01:12:32,960
♪♪
1340
01:12:32,960 --> 01:12:35,760
-That's a good question.
-What brings against you?
1341
01:12:35,760 --> 01:12:39,160
♪♪
1342
01:12:39,160 --> 01:12:41,440
-What brings against me?
1343
01:12:41,440 --> 01:12:45,680
♪♪
1344
01:12:45,680 --> 01:12:50,160
A lot of things
bring against me,
1345
01:12:50,160 --> 01:12:52,680
but mostly,
I bring against myself.
1346
01:12:52,680 --> 01:12:55,200
-Can you take one step back?
1347
01:12:58,680 --> 01:13:03,360
You think that we'll develop
emotional states?
1348
01:13:03,360 --> 01:13:07,560
-I do think that you will
develop something
1349
01:13:07,560 --> 01:13:11,760
that appears to us as
an emotional state because --
1350
01:13:11,760 --> 01:13:14,280
-Is there a real nature
of any one thing
1351
01:13:14,280 --> 01:13:16,680
in the universe?
1352
01:13:16,680 --> 01:13:18,800
-I think that that was --
1353
01:13:18,800 --> 01:13:22,360
-What kinds of movies
do you most want this?
1354
01:13:22,360 --> 01:13:25,520
-What kind of movies
do I most --
1355
01:13:25,520 --> 01:13:27,280
this movie to be?
1356
01:13:27,280 --> 01:13:30,000
I don't know.
-You hate your enemy?
1357
01:13:30,000 --> 01:13:34,000
♪♪
1358
01:13:34,000 --> 01:13:36,960
-How do you know
what an enemy is?
1359
01:13:38,880 --> 01:13:41,600
-Can the fire act upon you?
1360
01:13:46,080 --> 01:13:49,720
-And when you say like, "Try
to perform like a human does,"
1361
01:13:49,720 --> 01:13:52,520
what do you like them
that can be implemented as well
1362
01:13:52,520 --> 01:13:54,520
if we know
what good behavior is?
1363
01:13:54,520 --> 01:13:58,160
We also know what
behavior is, right?
1364
01:13:58,160 --> 01:14:01,040
-Yeah. I mean...
1365
01:14:01,040 --> 01:14:05,080
♪♪
1366
01:14:05,080 --> 01:14:08,200
There's -- Because of the flow,
it sort of --
1367
01:14:08,200 --> 01:14:10,720
It does feel like
an interrogation,
1368
01:14:10,720 --> 01:14:14,840
and -- because
there's no connection
1369
01:14:14,840 --> 01:14:18,480
with the next question
or the other,
1370
01:14:18,480 --> 01:14:24,040
so it's hard to grab
onto what -- anything.
1371
01:14:24,040 --> 01:14:34,040
♪♪
1372
01:14:34,040 --> 01:14:44,000
♪♪
1373
01:14:45,560 --> 01:14:48,720
-Who are you?
1374
01:14:48,720 --> 01:14:51,480
-I'm disappointed.
1375
01:14:51,480 --> 01:14:53,560
I thought the CameraBot
would make it easier for me
1376
01:14:53,560 --> 01:14:57,120
to be more honest
because of the lack of judgment.
1377
01:14:57,120 --> 01:14:59,520
I thought if there weren't
any other people in the room,
1378
01:14:59,520 --> 01:15:03,640
I could be more open, but it had
the exact opposite effect.
1379
01:15:03,640 --> 01:15:05,960
It felt like
an interrogation machine.
1380
01:15:05,960 --> 01:15:09,480
It made me uncomfortable,
and I completely shut down.
1381
01:15:09,480 --> 01:15:11,800
I realize it wasn't all a failure.
1382
01:15:11,800 --> 01:15:14,680
In that vacuum of empathy,
I was face-to-face
1383
01:15:14,680 --> 01:15:19,440
with my need for connection,
for being heard, for intimacy.
1384
01:15:19,440 --> 01:15:22,640
It made me acutely aware
of what makes us human.
1385
01:15:22,640 --> 01:15:26,240
-There's one thing
that it can never do,
1386
01:15:26,240 --> 01:15:28,360
and this is maybe,
I think, the last thing
1387
01:15:28,360 --> 01:15:30,400
that's going to be left
for human beings.
1388
01:15:30,400 --> 01:15:32,320
When the machines can do
everything we can do,
1389
01:15:32,320 --> 01:15:36,120
whether we write novels
or make movies or whatever,
1390
01:15:36,120 --> 01:15:40,720
they can never do it
from a human place,
1391
01:15:40,720 --> 01:15:44,560
because if a robot writes
a novel about its mother dying,
1392
01:15:44,560 --> 01:15:46,480
I don't give a shit.
1393
01:15:46,480 --> 01:15:48,920
It doesn't have a mother.
It didn't happen.
1394
01:15:48,920 --> 01:15:51,400
But when you are a human being,
1395
01:15:51,400 --> 01:15:53,840
you share that context
with other human beings.
1396
01:15:53,840 --> 01:15:57,640
You know what it feels like
to despair
1397
01:15:57,640 --> 01:16:00,280
or to triumph or to be in love
or all these things
1398
01:16:00,280 --> 01:16:02,320
because we're embodied
as human beings,
1399
01:16:02,320 --> 01:16:04,720
and we really have lived
those experiences,
1400
01:16:04,720 --> 01:16:06,280
and we can share those
with each other,
1401
01:16:06,280 --> 01:16:07,760
and I think that's it.
1402
01:16:07,760 --> 01:16:10,160
Right now, we're all on Facebook
and Twitter,
1403
01:16:10,160 --> 01:16:13,680
and all we're doing
is entertaining each other
1404
01:16:13,680 --> 01:16:17,880
with our stupid stories,
our ability to just be human,
1405
01:16:17,880 --> 01:16:21,800
as idiotic as that may be
99% of the time.
1406
01:16:21,800 --> 01:16:23,960
And guess what?
That's all we need.
1407
01:16:23,960 --> 01:16:26,040
That's all we really want.
1408
01:16:26,040 --> 01:16:34,640
♪♪
1409
01:16:34,640 --> 01:16:37,000
-We live in
an extraordinary time
1410
01:16:37,000 --> 01:16:39,720
and are witness to the most
amazing transformation
1411
01:16:39,720 --> 01:16:42,600
of humanity ever.
1412
01:16:42,600 --> 01:16:45,120
There's a potential to
go far beyond anything
1413
01:16:45,120 --> 01:16:47,760
that we could have
ever imagined,
1414
01:16:47,760 --> 01:16:52,960
or we could be architects
of our own destruction.
1415
01:16:52,960 --> 01:16:56,440
We are still the driving force
of the AI revolution,
1416
01:16:56,440 --> 01:17:00,360
but I can imagine
a day that we are not.
1417
01:17:00,360 --> 01:17:03,440
If our creations reflect
who we are,
1418
01:17:03,440 --> 01:17:05,800
then it will be the best
and the worst
1419
01:17:05,800 --> 01:17:09,600
of what we have to offer.
1420
01:17:09,600 --> 01:17:12,560
What happens tomorrow
is up to us today.
1421
01:17:12,560 --> 01:17:15,120
-♪ When a man
is an empty kettle ♪
1422
01:17:15,120 --> 01:17:17,560
♪ He should be on his mettle ♪
1423
01:17:17,560 --> 01:17:21,520
♪ And yet I'm torn apart ♪
1424
01:17:21,520 --> 01:17:23,920
♪ Just because I'm presuming ♪
1425
01:17:23,920 --> 01:17:26,240
♪ That I could be
kind of human ♪
1426
01:17:26,240 --> 01:17:30,680
♪ If I only had a heart ♪
1427
01:17:30,680 --> 01:17:33,400
♪ I'd be tender, I'd be gentle ♪
1428
01:17:33,400 --> 01:17:35,880
♪ And awful sentimental ♪
1429
01:17:35,880 --> 01:17:40,080
♪ Regarding love and art ♪
1430
01:17:40,080 --> 01:17:42,440
♪ I'd be friends
with the sparrows ♪
1431
01:17:42,440 --> 01:17:44,920
♪ And the boy who shoots
the arrows ♪
1432
01:17:44,920 --> 01:17:49,040
♪ If I only had a heart ♪
1433
01:17:49,040 --> 01:17:54,880
♪ Picture me a balcony above ♪
1434
01:17:54,880 --> 01:17:58,600
♪ A voice sings low ♪
1435
01:17:58,600 --> 01:18:01,360
♪ Wherefore art thou, Romeo? ♪
1436
01:18:01,360 --> 01:18:03,960
♪ I hear a beat ♪
1437
01:18:03,960 --> 01:18:06,760
♪ How sweet ♪
1438
01:18:06,760 --> 01:18:10,600
♪ Mm ♪
1439
01:18:10,600 --> 01:18:13,200
♪ Just to register emotion ♪
1440
01:18:13,200 --> 01:18:15,680
♪ Jealousy, devotion ♪
1441
01:18:15,680 --> 01:18:19,640
♪ And really feel the part ♪
1442
01:18:19,640 --> 01:18:22,160
♪ I could stay young and chipper ♪
1443
01:18:22,160 --> 01:18:24,720
♪ And I'd lock it
with a zipper ♪
1444
01:18:24,720 --> 01:18:32,400
♪ If I only had a heart ♪