1
00:00:15,580 --> 00:00:25,547
2
00:00:25,590 --> 00:00:35,035
3
00:00:35,078 --> 00:00:41,302
What we're on the brink of is
a world of increasingly intense,
4
00:00:41,345 --> 00:00:45,219
sophisticated
artificial intelligence.
5
00:00:45,262 --> 00:00:48,396
Man: Technology is evolving
so much faster than our society
6
00:00:48,439 --> 00:00:51,181
has the ability
to protect us as citizens.
7
00:00:51,486 --> 00:00:55,707
The robots are coming, and they
will destroy our livelihoods.
8
00:00:55,751 --> 00:01:01,844
9
00:01:01,887 --> 00:01:04,238
You have a networked
intelligence that watches us,
10
00:01:04,281 --> 00:01:08,590
knows everything about us,
and begins to try to change us.
11
00:01:08,633 --> 00:01:12,768
Man:
world's number-one news site.
12
00:01:12,811 --> 00:01:15,205
Technology is never good or bad.
13
00:01:15,249 --> 00:01:18,948
It's what we do
with the technology.
14
00:01:18,991 --> 00:01:22,734
Eventually, millions of people
are gonna be thrown out of jobs
15
00:01:22,778 --> 00:01:25,737
because their skills
are going to be obsolete.
16
00:01:25,781 --> 00:01:27,435
Woman: Mass unemployment...
17
00:01:27,478 --> 00:01:32,527
greater inequalities,
even social unrest.
18
00:01:32,570 --> 00:01:35,530
Regardless of whether
to be afraid or not afraid,
19
00:01:35,573 --> 00:01:38,185
the change is coming,
and nobody can stop it.
20
00:01:38,228 --> 00:01:44,539
21
00:01:44,582 --> 00:01:46,323
Man:
huge amounts of money,
22
00:01:46,367 --> 00:01:49,283
and so it stands to reason
that the military,
23
00:01:49,326 --> 00:01:50,893
with their own desires,
24
00:01:50,936 --> 00:01:53,330
are gonna start to use
these technologies.
25
00:01:53,374 --> 00:01:55,419
Man:
Autonomous weapons systems
26
00:01:55,463 --> 00:01:57,552
could lead to a global arms race
27
00:01:57,595 --> 00:02:00,032
to rival the Nuclear Era.
28
00:02:00,076 --> 00:02:02,296
29
00:02:02,339 --> 00:02:04,036
Man:
We know what the answer is.
30
00:02:04,080 --> 00:02:05,429
They'll eventually
be killing us.
31
00:02:05,473 --> 00:02:10,782
32
00:02:10,826 --> 00:02:12,349
These technology leaps
33
00:02:12,393 --> 00:02:15,874
are gonna yield
incredible miracles...
34
00:02:15,918 --> 00:02:18,181
and incredible horrors.
35
00:02:18,225 --> 00:02:24,231
36
00:02:24,274 --> 00:02:29,323
Man:
so I think, as we move forward,
37
00:02:29,366 --> 00:02:33,762
this intelligence
will contain parts of us.
38
00:02:33,805 --> 00:02:35,981
And I think the question is --
39
00:02:36,025 --> 00:02:39,463
Will it contain
the good parts...
40
00:02:39,507 --> 00:02:41,378
or the bad parts?
41
00:02:41,422 --> 00:02:47,079
42
00:02:57,742 --> 00:03:04,793
43
00:03:04,836 --> 00:03:08,840
Sarah: The survivors
called the war "Judgment Day."
44
00:03:08,884 --> 00:03:12,583
They lived only to face
a new nightmare --
45
00:03:12,627 --> 00:03:14,019
the war against the machines.
46
00:03:14,063 --> 00:03:15,412
Aah!
47
00:03:15,456 --> 00:03:18,023
Nolan: I think
we've completely fucked this up.
48
00:03:18,067 --> 00:03:21,549
I think Hollywood has managed
to inoculate the general public
49
00:03:21,592 --> 00:03:24,247
against this question --
50
00:03:24,291 --> 00:03:28,251
the idea of machines
that will take over the world.
51
00:03:28,295 --> 00:03:30,645
Open the pod bay doors, HAL.
52
00:03:30,688 --> 00:03:33,561
HAL: I'm sorry, Dave.
53
00:03:33,604 --> 00:03:35,911
I'm afraid I can't do that.
54
00:03:37,434 --> 00:03:38,696
55
00:03:38,740 --> 00:03:40,437
Nolan:
We've cried wolf enough times...
56
00:03:40,481 --> 00:03:42,483
...that the public
has stopped paying attention,
57
00:03:42,526 --> 00:03:43,962
because it feels like
science fiction.
58
00:03:44,006 --> 00:03:45,486
Even sitting here talking
about it right now,
59
00:03:45,529 --> 00:03:48,228
it feels a little bit silly,
a little bit like,
60
00:03:48,271 --> 00:03:51,666
"Oh, this is an artifact
of some cheeseball movie."
61
00:03:51,709 --> 00:03:56,584
The WOPR spends all its time
thinking about World War III.
62
00:03:56,627 --> 00:03:59,064
But it's not.
63
00:03:59,108 --> 00:04:02,111
The general public is about
to get blindsided by this.
64
00:04:02,154 --> 00:04:11,512
65
00:04:11,555 --> 00:04:13,514
As a society and as individuals,
66
00:04:13,557 --> 00:04:18,954
we're increasingly surrounded
by machine intelligence.
67
00:04:18,997 --> 00:04:22,653
We carry this pocket device
in the palm of our hand
68
00:04:22,697 --> 00:04:24,829
that we use to make
a striking array
69
00:04:24,873 --> 00:04:26,831
of life decisions right now,
70
00:04:26,875 --> 00:04:29,007
aided by a set
of distant algorithms
71
00:04:29,051 --> 00:04:30,748
that we have no understanding of.
72
00:04:30,792 --> 00:04:34,143
73
00:04:34,186 --> 00:04:36,537
We're already pretty jaded
about the idea
74
00:04:36,580 --> 00:04:37,929
that we can talk to our phone,
75
00:04:37,973 --> 00:04:40,062
and it mostly understands us.
76
00:04:40,105 --> 00:04:42,456
Woman: I found quite a number
of action films.
77
00:04:42,499 --> 00:04:44,327
Five years ago -- no way.
78
00:04:44,371 --> 00:04:47,678
Markoff: Robotics.
Machines that see and speak...
79
00:04:47,722 --> 00:04:48,897
Woman: Hi, there.
...and listen.
80
00:04:48,940 --> 00:04:50,202
All that's real now.
81
00:04:50,246 --> 00:04:51,639
And these technologies
82
00:04:51,682 --> 00:04:55,686
are gonna fundamentally
change our society.
83
00:04:55,730 --> 00:05:00,212
Thrun: Now we have this great
movement of self-driving cars.
84
00:05:00,256 --> 00:05:01,953
Driving a car autonomously
85
00:05:01,997 --> 00:05:06,088
can move people's lives
into a better place.
86
00:05:06,131 --> 00:05:07,916
Horvitz: I've lost
a number of family members,
87
00:05:07,959 --> 00:05:09,570
including my mother,
88
00:05:09,613 --> 00:05:11,876
my brother and sister-in-law
and their kids,
89
00:05:11,920 --> 00:05:14,009
to automobile accidents.
90
00:05:14,052 --> 00:05:18,405
It's pretty clear we could
almost eliminate car accidents
91
00:05:18,448 --> 00:05:20,102
with automation.
92
00:05:20,145 --> 00:05:21,843
30,000 lives in the U.S. alone.
93
00:05:21,886 --> 00:05:25,455
About a million around the world
per year.
94
00:05:25,499 --> 00:05:27,501
Ferrucci:
In healthcare, early indicators
95
00:05:27,544 --> 00:05:29,503
are the name of the game
in that space,
96
00:05:29,546 --> 00:05:33,158
so that's another place where
it can save somebody's life.
97
00:05:33,202 --> 00:05:35,726
Dr. Herman: Here in
the breast-cancer center,
98
00:05:35,770 --> 00:05:38,381
all the things that
the radiologist's brain
99
00:05:38,425 --> 00:05:43,386
does in two minutes, the
computer does instantaneously.
100
00:05:43,430 --> 00:05:47,303
The computer has looked
at 1 billion mammograms,
101
00:05:47,347 --> 00:05:49,261
and it takes that data
and applies it
102
00:05:49,305 --> 00:05:51,438
to this image instantaneously,
103
00:05:51,481 --> 00:05:54,441
so the medical application
is profound.
104
00:05:56,399 --> 00:05:57,705
Zilis:
Another really exciting area
105
00:05:57,748 --> 00:05:59,402
that we're seeing
a lot of development in
106
00:05:59,446 --> 00:06:03,275
is actually understanding
our genetic code
107
00:06:03,319 --> 00:06:06,104
and using that
to both diagnose disease
108
00:06:06,148 --> 00:06:07,758
and create
personalized treatments.
109
00:06:07,802 --> 00:06:11,632
110
00:06:11,675 --> 00:06:14,112
Kurzweil:
The primary application
of all these machines
111
00:06:14,156 --> 00:06:17,246
will be to extend
our own intelligence.
112
00:06:17,289 --> 00:06:19,422
We'll be able to make
ourselves smarter,
113
00:06:19,466 --> 00:06:22,643
and we'll be better
at solving problems.
114
00:06:22,686 --> 00:06:24,775
We don't have to age.
We'll actually understand aging.
115
00:06:24,819 --> 00:06:27,125
We'll be able to stop it.
116
00:06:27,169 --> 00:06:29,519
Man: There's really no limit
to what intelligent machines
117
00:06:29,563 --> 00:06:30,868
can do for the human race.
118
00:06:30,912 --> 00:06:36,265
119
00:06:36,308 --> 00:06:39,399
How could a smarter machine
not be a better machine?
120
00:06:42,053 --> 00:06:44,708
It's hard to say exactly
when I began to think
121
00:06:44,752 --> 00:06:46,971
that that was a bit naive.
122
00:06:47,015 --> 00:06:56,459
123
00:06:56,503 --> 00:06:59,288
Stuart Russell,
he's basically a god
124
00:06:59,331 --> 00:07:00,898
in the field
of artificial intelligence.
125
00:07:00,942 --> 00:07:04,380
He wrote the book that almost
every university uses.
126
00:07:04,424 --> 00:07:06,948
Russell: I used to say it's the
best-selling AI textbook.
127
00:07:06,991 --> 00:07:10,255
Now I just say "It's the PDF
that's stolen most often."
128
00:07:10,299 --> 00:07:13,650
129
00:07:13,694 --> 00:07:17,306
Artificial intelligence is
about making computers smart,
130
00:07:17,349 --> 00:07:19,830
and from the point
of view of the public,
131
00:07:19,874 --> 00:07:21,484
what counts as AI
is just something
132
00:07:21,528 --> 00:07:23,268
that's surprisingly intelligent
133
00:07:23,312 --> 00:07:25,488
compared to what
we thought computers
134
00:07:25,532 --> 00:07:28,404
would typically be able to do.
135
00:07:28,448 --> 00:07:33,801
AI is a field of research
to try to basically simulate
136
00:07:33,844 --> 00:07:36,717
all kinds of human capabilities.
137
00:07:36,760 --> 00:07:38,719
We're in the AI era.
138
00:07:38,762 --> 00:07:40,503
Silicon Valley
has the ability to focus
139
00:07:40,547 --> 00:07:42,462
on one bright, shiny thing.
140
00:07:42,505 --> 00:07:43,767
It was social networking
141
00:07:43,811 --> 00:07:45,508
and social media
over the last decade,
142
00:07:45,552 --> 00:07:48,119
and it's pretty clear
that the bit has flipped.
143
00:07:48,163 --> 00:07:50,557
And it starts
with machine learning.
144
00:07:50,600 --> 00:07:54,343
Nolan: When we look back at this
moment, what was the first AI
145
00:07:54,386 --> 00:07:55,736
It's not sexy,
and it isn't the thing
146
00:07:55,779 --> 00:07:57,389
we could see at the movies,
147
00:07:57,433 --> 00:08:00,741
but you'd make a great case
that Google created,
148
00:08:00,784 --> 00:08:03,395
not a search engine,
but a godhead.
149
00:08:03,439 --> 00:08:06,486
A way for people to ask
any question they wanted
150
00:08:06,529 --> 00:08:08,270
and get the answer they needed.
151
00:08:08,313 --> 00:08:11,273
Russell: Most people are not
aware that what Google is doing
152
00:08:11,316 --> 00:08:13,710
is actually a form of
artificial intelligence.
153
00:08:13,754 --> 00:08:16,234
They just go there,
they type in a thing.
154
00:08:16,278 --> 00:08:18,323
Google gives them the answer.
155
00:08:18,367 --> 00:08:21,544
Musk: With each search,
we train it to be better.
156
00:08:21,588 --> 00:08:23,851
Sometimes we're typing a search,
and it tells us the answer
157
00:08:23,894 --> 00:08:27,419
before you've finished
asking the question.
158
00:08:27,463 --> 00:08:29,944
You know, who is the president
of Kazakhstan?
159
00:08:29,987 --> 00:08:31,685
And it'll just tell you.
160
00:08:31,728 --> 00:08:33,600
You don't have to go to the
Kazakhstan national website
161
00:08:33,643 --> 00:08:34,818
to find out.
162
00:08:34,862 --> 00:08:37,081
You didn't used to be
able to do that.
163
00:08:37,125 --> 00:08:39,475
Nolan:
That is artificial intelligence.
164
00:08:39,519 --> 00:08:42,783
Years from now when we try
to understand, we will say,
165
00:08:42,826 --> 00:08:44,567
"How did we miss it"
166
00:08:44,611 --> 00:08:47,527
Markoff: It's one of
the striking contradictions
167
00:08:47,570 --> 00:08:48,484
that we're facing.
168
00:08:48,528 --> 00:08:50,051
Google and Facebook, et al,
169
00:08:50,094 --> 00:08:52,053
have built businesses
on giving us,
170
00:08:52,096 --> 00:08:54,185
as a society, free stuff.
171
00:08:54,229 --> 00:08:56,013
But it's a Faustian bargain.
172
00:08:56,057 --> 00:09:00,017
They're extracting something
from us in exchange,
173
00:09:00,061 --> 00:09:01,628
but we don't know
174
00:09:01,671 --> 00:09:03,760
what code is running
on the other side and why.
175
00:09:03,804 --> 00:09:06,546
We have no idea.
176
00:09:06,589 --> 00:09:08,591
It does strike
right at the issue
177
00:09:08,635 --> 00:09:11,028
of how much we should
trust these machines.
178
00:09:14,162 --> 00:09:18,166
I use computers
literally for everything.
179
00:09:18,209 --> 00:09:21,386
There are so many
computer advancements now,
180
00:09:21,430 --> 00:09:23,824
and it's become such
a big part of our lives.
181
00:09:23,867 --> 00:09:26,174
It's just incredible
what a computer can do.
182
00:09:26,217 --> 00:09:29,090
You can actually carry
a computer in your purse.
183
00:09:29,133 --> 00:09:31,571
I mean, how awesome is that?
184
00:09:31,614 --> 00:09:35,052
I think most technology is meant
to make things easier
185
00:09:35,096 --> 00:09:37,315
and simpler for all of us,
186
00:09:37,359 --> 00:09:40,362
so hopefully that just
remains the focus.
187
00:09:40,405 --> 00:09:43,147
I think everybody loves
their computers.
188
00:09:43,191 --> 00:09:44,366
189
00:09:44,409 --> 00:09:51,678
190
00:09:51,721 --> 00:09:53,810
People don't realize
they are constantly
191
00:09:53,854 --> 00:09:59,076
being negotiated with
by machines,
192
00:09:59,120 --> 00:10:02,993
whether that's the price
of products in your Amazon cart,
193
00:10:03,037 --> 00:10:05,517
whether you can get
on a particular flight,
194
00:10:05,561 --> 00:10:08,912
whether you can reserve
a room at a particular hotel.
195
00:10:08,956 --> 00:10:11,959
What you're experiencing
are machine-learning algorithms
196
00:10:12,002 --> 00:10:14,265
that have determined
that a person like you
197
00:10:14,309 --> 00:10:15,919
is willing to pay 2 cents more
198
00:10:15,963 --> 00:10:17,791
and is changing the price.
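A minimal sketch of the kind of machine-driven price adjustment described here, assuming a hypothetical upstream model that has already scored the shopper; the function name and numbers are illustrative, not any retailer's actual system.

def personalized_price(base_price_cents: int, predicted_extra_cents: int) -> int:
    """Add the model's (hypothetical) estimate of how many extra cents this shopper will tolerate."""
    return base_price_cents + max(0, predicted_extra_cents)

# A $10.00 item shown to a shopper the model scores at +2 cents:
print(personalized_price(1000, 2))  # -> 1002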
199
00:10:17,834 --> 00:10:21,751
200
00:10:21,795 --> 00:10:24,014
Kosinski: Now, a computer looks
at millions of people
201
00:10:24,058 --> 00:10:28,105
simultaneously for
very subtle patterns.
202
00:10:28,149 --> 00:10:31,369
You can take seemingly
innocent digital footprints,
203
00:10:31,413 --> 00:10:34,677
such as someone's playlist
on Spotify,
204
00:10:34,721 --> 00:10:37,201
or stuff that they
bought on Amazon,
205
00:10:37,245 --> 00:10:40,291
and then use algorithms
to translate this
206
00:10:40,335 --> 00:10:44,513
into a very detailed and a
very accurate, intimate profile.
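A toy sketch of the footprint-to-profile idea being described, assuming per-item trait weights learned elsewhere; the items, weights, and the "openness" trait are invented for illustration.

# Hypothetical weights relating liked items to one personality trait.
trait_weights = {
    "jazz playlist": 0.8,
    "sci-fi novels": 0.6,
    "cooking gadgets": 0.1,
}

def openness_score(footprints):
    # Average the weights of the footprints we have weights for.
    known = [trait_weights[item] for item in footprints if item in trait_weights]
    return sum(known) / len(known) if known else 0.0

print(openness_score(["jazz playlist", "sci-fi novels"]))  # -> 0.7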
207
00:10:47,603 --> 00:10:50,911
Kaplan: There is a dossier on
each of us that is so extensive
208
00:10:50,954 --> 00:10:52,695
it would be possibly
accurate to say
209
00:10:52,739 --> 00:10:55,698
that they know more about you
than your mother does.
210
00:10:55,742 --> 00:11:04,054
211
00:11:04,098 --> 00:11:06,883
Tegmark: The major cause
of the recent AI breakthrough
212
00:11:06,927 --> 00:11:08,580
isn't just that some dude
213
00:11:08,624 --> 00:11:11,583
had a brilliant insight
all of a sudden,
214
00:11:11,627 --> 00:11:14,325
but simply that we have
much bigger data
215
00:11:14,369 --> 00:11:18,242
to train them on
and vastly better computers.
216
00:11:18,286 --> 00:11:19,940
el Kaliouby:
The magic is in the data.
217
00:11:19,983 --> 00:11:21,463
It's a ton of data.
218
00:11:21,506 --> 00:11:23,726
I mean, it's data
that's never existed before.
219
00:11:23,770 --> 00:11:26,686
We've never had
this data before.
220
00:11:26,729 --> 00:11:30,733
We've created technologies
that allow us to capture
221
00:11:30,777 --> 00:11:33,040
vast amounts of information.
222
00:11:33,083 --> 00:11:35,738
If you think of a billion
cellphones on the planet
223
00:11:35,782 --> 00:11:38,393
with gyroscopes
and accelerometers
224
00:11:38,436 --> 00:11:39,786
and fingerprint readers...
225
00:11:39,829 --> 00:11:42,005
couple that with the GPS
and the photos they take
226
00:11:42,049 --> 00:11:43,964
and the tweets that you send,
227
00:11:44,007 --> 00:11:47,750
we're all giving off huge
amounts of data individually.
228
00:11:47,794 --> 00:11:50,274
Cars that drive as the cameras
on them suck up information
229
00:11:50,318 --> 00:11:52,059
about the world around them.
230
00:11:52,102 --> 00:11:54,844
The satellites that are now
in orbit, the size of a toaster.
231
00:11:54,888 --> 00:11:57,629
The infrared data about
the vegetation on the planet.
232
00:11:57,673 --> 00:11:59,109
The buoys that are out
in the oceans
233
00:11:59,153 --> 00:12:01,024
to feed into the climate models.
234
00:12:01,068 --> 00:12:05,028
235
00:12:05,072 --> 00:12:08,902
And the NSA, the CIA,
as they collect information
236
00:12:08,945 --> 00:12:12,644
about the
geopolitical situations.
237
00:12:12,688 --> 00:12:15,604
The world today is literally
swimming in this data.
238
00:12:15,647 --> 00:12:20,565
239
00:12:20,609 --> 00:12:22,480
Kosinski: Back in 2012,
240
00:12:22,524 --> 00:12:25,875
IBM estimated
that an average human being
241
00:12:25,919 --> 00:12:31,098
leaves 500 megabytes
of digital footprints every day.
242
00:12:31,141 --> 00:12:34,841
If you wanted to back up
one day's worth of data
243
00:12:34,884 --> 00:12:36,494
that humanity produces
244
00:12:36,538 --> 00:12:39,062
and print it out
on letter-sized paper,
245
00:12:39,106 --> 00:12:43,806
double-sided, font size 12,
and you stacked it up,
246
00:12:43,850 --> 00:12:46,113
it would reach from
the surface of the Earth
247
00:12:46,156 --> 00:12:49,116
to the sun four times over.
248
00:12:49,159 --> 00:12:51,292
That's every day.
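A back-of-envelope sketch of the paper-stack image above; every constant here (characters per sheet, sheet thickness, head count) is an assumption of this example, chosen to show how the arithmetic works rather than to reproduce the exact on-screen figure.

AU_METERS = 1.496e11          # mean Earth-sun distance
BYTES_PER_SHEET = 10_000      # assume ~5,000 printed characters per side, double-sided
SHEET_THICKNESS_M = 1e-4      # assume ~0.1 mm per sheet of letter paper

def stack_height_in_au(daily_bytes):
    sheets = daily_bytes / BYTES_PER_SHEET
    return sheets * SHEET_THICKNESS_M / AU_METERS

print(stack_height_in_au(500e6))           # one person's 500 MB/day
print(stack_height_in_au(500e6 * 7.5e9))   # a rough 7.5 billion people combined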
249
00:12:51,335 --> 00:12:53,816
Kaplan: The data itself
is not good or evil.
250
00:12:53,860 --> 00:12:55,470
It's how it's used.
251
00:12:55,513 --> 00:12:58,342
We're relying, really,
on the goodwill of these people
252
00:12:58,386 --> 00:13:01,171
and on the policies
of these companies.
253
00:13:01,215 --> 00:13:03,870
There is no legal requirement
for how they can
254
00:13:03,913 --> 00:13:06,307
and should use
that kind of data.
255
00:13:06,350 --> 00:13:09,266
That, to me, is at the heart
of the trust issue.
256
00:13:11,007 --> 00:13:13,793
Barrat: Right now there's a
giant race for creating machines
257
00:13:13,836 --> 00:13:15,751
that are as smart as humans.
258
00:13:15,795 --> 00:13:17,971
Google -- They're working on
what's really the kind of
259
00:13:18,014 --> 00:13:20,016
Manhattan Project
of artificial intelligence.
260
00:13:20,060 --> 00:13:22,671
They've got the most money.
They've got the most talent.
261
00:13:22,714 --> 00:13:27,067
They're buying up AI companies
and robotics companies.
262
00:13:27,110 --> 00:13:29,069
Urban: People still think
of Google as a search engine
263
00:13:29,112 --> 00:13:30,722
and their e-mail provider
264
00:13:30,766 --> 00:13:33,943
and a lot of other things
that we use on a daily basis,
265
00:13:33,987 --> 00:13:39,383
but behind that search box
are 10 million servers.
266
00:13:39,427 --> 00:13:42,299
That makes Google the most
powerful computing platform
267
00:13:42,343 --> 00:13:43,910
in the world.
268
00:13:43,953 --> 00:13:47,217
Google is now working
on an AI computing platform
269
00:13:47,261 --> 00:13:50,133
that will have
100 million servers.
270
00:13:52,179 --> 00:13:53,963
So when you're interacting
with Google,
271
00:13:54,007 --> 00:13:56,052
we're just seeing
the toenail of something
272
00:13:56,096 --> 00:13:58,881
that is a giant beast
in the making.
273
00:13:58,925 --> 00:14:00,622
And the truth is,
I'm not even sure
274
00:14:00,665 --> 00:14:02,798
that Google knows
what it's becoming.
275
00:14:02,842 --> 00:14:11,502
276
00:14:11,546 --> 00:14:14,114
Phoenix: If you look inside of
what algorithms are being used
277
00:14:14,157 --> 00:14:15,811
at Google,
278
00:14:15,855 --> 00:14:20,076
it's technology
largely from the '80s.
279
00:14:20,120 --> 00:14:23,863
So these are models that you
train by showing them a 1, a 2,
280
00:14:23,906 --> 00:14:27,344
and a 3, and it learns not
what a 1 is or what a 2 is --
281
00:14:27,388 --> 00:14:30,434
It learns what the difference
between a 1 and a 2 is.
282
00:14:30,478 --> 00:14:32,436
It's just a computation.
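A minimal sketch of that point on made-up data: a discriminative model of the kind described never stores what a "1" or a "2" is; it only fits a boundary that separates the two classes. The two-feature "images" below are invented.

# Each toy sample: (ink in left half, ink in right half) and its digit label.
samples = [((0.1, 0.9), 1), ((0.2, 0.8), 1), ((0.9, 0.2), 2), ((0.8, 0.1), 2)]

w = [0.0, 0.0]   # one weight per feature -- the learned "difference", nothing more
b = 0.0

for _ in range(20):                          # a few perceptron passes
    for (x1, x2), label in samples:
        target = 1 if label == 2 else -1     # "is it a 2 rather than a 1?"
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
        if pred != target:                   # nudge the boundary only when wrong
            w[0] += target * x1
            w[1] += target * x2
            b += target

print(w, b)  # just a separating line -- "it's just a computation"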
283
00:14:32,480 --> 00:14:35,396
In the last half decade, where
we've made this rapid progress,
284
00:14:35,439 --> 00:14:38,268
it has all been
in pattern recognition.
285
00:14:38,312 --> 00:14:41,184
Tegmark: Most of
the good, old-fashioned AI
286
00:14:41,228 --> 00:14:44,057
was when we would tell
our computers
287
00:14:44,100 --> 00:14:46,798
how to play a game like chess...
288
00:14:46,842 --> 00:14:49,584
from the old paradigm where
you just tell the computer
289
00:14:49,627 --> 00:14:52,195
exactly what to do.
290
00:14:54,502 --> 00:14:57,505
Announcer:
This is "Jeopardy"
291
00:14:57,548 --> 00:14:59,376
292
00:14:59,420 --> 00:15:02,510
"The IBM Challenge"
293
00:15:02,553 --> 00:15:05,730
Ferrucci: No one at the time
had thought that a machine
294
00:15:05,774 --> 00:15:08,298
could have the precision
and the confidence
295
00:15:08,342 --> 00:15:09,952
and the speed
to play "Jeopardy"
296
00:15:09,996 --> 00:15:11,475
well enough against
the best humans.
297
00:15:11,519 --> 00:15:14,609
Let's play "Jeopardy"
298
00:15:18,569 --> 00:15:20,354
Watson.Watson: What is "shoe"
299
00:15:20,397 --> 00:15:21,877
You are right.
You get to pick.
300
00:15:21,921 --> 00:15:24,836
Literary Character APB
for $800.
301
00:15:24,880 --> 00:15:28,014
Answer --
the Daily Double.
302
00:15:28,057 --> 00:15:31,539
Watson actually got its
knowledge by reading Wikipedia
303
00:15:31,582 --> 00:15:34,672
and 200 million pages
of natural-language documents.
304
00:15:34,716 --> 00:15:36,674
Ferrucci:
You can't program every line
305
00:15:36,718 --> 00:15:38,502
of how the world works.
306
00:15:38,546 --> 00:15:40,722
The machine has to learn
by reading.
307
00:15:40,765 --> 00:15:42,202
Now we come to Watson.
308
00:15:42,245 --> 00:15:43,986
"Who is Bram Stoker"
309
00:15:44,030 --> 00:15:45,988
And the wager?
310
00:15:46,032 --> 00:15:49,165
Hello! $17,973.
311
00:15:49,209 --> 00:15:50,993
$41,413.
312
00:15:51,037 --> 00:15:53,343
And a two-day total
of $77--
313
00:15:53,387 --> 00:15:56,694
Phoenix: Watson's trained
on huge amounts of text,
314
00:15:56,738 --> 00:15:59,828
but it's not like it
understands what it's saying.
315
00:15:59,871 --> 00:16:02,309
It doesn't know that water makes
things wet by touching water
316
00:16:02,352 --> 00:16:04,441
and by seeing the way
things behave in the world
317
00:16:04,485 --> 00:16:06,182
the way you and I do.
318
00:16:06,226 --> 00:16:10,143
A lot of language AI today
is not building logical models
319
00:16:10,186 --> 00:16:11,622
of how the world works.
320
00:16:11,666 --> 00:16:15,365
Rather, it's looking at
how the words appear
321
00:16:15,409 --> 00:16:18,238
in the context of other words.
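A tiny sketch of the "words in the context of other words" idea: count which words co-occur in a small window, then compare words by the company they keep. The three sentences are made up; no real corpus, library, or model is involved.

from collections import Counter, defaultdict

corpus = [
    "water makes things wet",
    "rain makes things wet",
    "submarines travel under water",
]

window = 2
contexts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                contexts[w][words[j]] += 1

def shared_context(a, b):
    # crude overlap of context words -- no understanding of wetness involved
    return sum((contexts[a] & contexts[b]).values())

print(shared_context("rain", "water"))       # > 0: they keep similar company
print(shared_context("rain", "submarines"))  # 0: they never share contexts here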
322
00:16:18,281 --> 00:16:20,196
Barrat: David Ferrucci
developed IBM's Watson,
323
00:16:20,240 --> 00:16:23,547
and somebody asked him,
"Does Watson think"
324
00:16:23,591 --> 00:16:27,160
And he said,
"Does a submarine swim"
325
00:16:27,203 --> 00:16:29,031
And what they meant was,
when they developed submarines,
326
00:16:29,075 --> 00:16:32,992
they borrowed basic principles
of swimming from fish.
327
00:16:33,035 --> 00:16:35,037
But a submarine swims
farther and faster than fish
328
00:16:35,081 --> 00:16:36,125
and can carry a huge payload.
329
00:16:36,169 --> 00:16:39,911
It out-swims fish.
330
00:16:39,955 --> 00:16:41,870
Ng: Watson winning the game
of "Jeopardy"
331
00:16:41,913 --> 00:16:43,741
will go down
in the history of AI
332
00:16:43,785 --> 00:16:46,570
as a significant milestone.
333
00:16:46,614 --> 00:16:49,269
We tend to be amazed
when the machine does so well.
334
00:16:49,312 --> 00:16:52,663
I'm even more amazed when the
computer beats humans at things
335
00:16:52,707 --> 00:16:55,188
that humans are
naturally good at.
336
00:16:55,231 --> 00:16:58,060
This is how we make progress.
337
00:16:58,104 --> 00:17:00,671
In the early days of
the Google Brain project,
338
00:17:00,715 --> 00:17:02,804
I gave the team a very
simple instruction,
339
00:17:02,847 --> 00:17:05,807
which was, "Build the biggest
neural network possible,
340
00:17:05,850 --> 00:17:08,157
like 1,000 computers."
341
00:17:08,201 --> 00:17:09,724
Musk: A neural net is
something very close
342
00:17:09,767 --> 00:17:12,161
to a simulation
of how the brain works.
343
00:17:12,205 --> 00:17:16,818
It's very probabilistic,
but with contextual relevance.
344
00:17:16,861 --> 00:17:18,298
Urban: In your brain,
you have long neurons
345
00:17:18,341 --> 00:17:20,256
that connect to thousands
of other neurons,
346
00:17:20,300 --> 00:17:22,519
and you have these pathways
that are formed and forged
347
00:17:22,563 --> 00:17:24,739
based on what
the brain needs to do.
348
00:17:24,782 --> 00:17:28,960
When a baby tries something and
it succeeds, there's a reward,
349
00:17:29,004 --> 00:17:32,312
and that pathway that created
the success is strengthened.
350
00:17:32,355 --> 00:17:34,662
If it fails at something,
the pathway is weakened,
351
00:17:34,705 --> 00:17:36,794
and so, over time,
the brain becomes honed
352
00:17:36,838 --> 00:17:40,320
to be good at
the environment around it.
353
00:17:40,363 --> 00:17:43,279
Ng: Really, it's just getting
machines to learn by themselves.
354
00:17:43,323 --> 00:17:45,238
This is called "deep learning,"
and "deep learning"
355
00:17:45,281 --> 00:17:48,676
and "neural networks"
mean roughly the same thing.
356
00:17:48,719 --> 00:17:52,375
Tegmark: Deep learning
is a totally different approach
357
00:17:52,419 --> 00:17:55,161
where the computer learns
more like a toddler,
358
00:17:55,204 --> 00:17:56,466
by just getting a lot of data
359
00:17:56,510 --> 00:18:00,340
and eventually
figuring stuff out.
360
00:18:00,383 --> 00:18:03,125
The computer just gets
smarter and smarter
361
00:18:03,169 --> 00:18:05,997
as it has more experiences.
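A bare-bones sketch of the "strengthen what works, weaken what doesn't" picture above: one artificial neuron nudging its connection weights from examples. The toy task and learning rate are invented; a deep network is, roughly, many layers of units like this trained on far more data.

import math

def neuron(x, w, b):
    # weighted sum squashed to 0..1, a crude stand-in for a firing rate
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Toy task: output 1 whenever the second input is on, 0 otherwise.
data = [([0.0, 1.0], 1.0), ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0), ([0.0, 0.0], 0.0)]

w, b, lr = [0.0, 0.0], 0.0, 1.0
for _ in range(500):                         # many small experiences
    for x, target in data:
        out = neuron(x, w, b)
        err = target - out                   # how wrong this "pathway" was
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]   # strengthen or weaken
        b += lr * err

print([round(neuron(x, w, b), 2) for x, _ in data])  # approaches [1, 0, 1, 0]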
362
00:18:06,041 --> 00:18:08,130
Ng: So, imagine, if you will,
a neural network, you know,
363
00:18:08,174 --> 00:18:09,697
like 1,000 computers.
364
00:18:09,740 --> 00:18:11,438
And it wakes up
not knowing anything.
365
00:18:11,481 --> 00:18:14,093
And we made it watch YouTube
for a week.
366
00:18:14,136 --> 00:18:16,704
367
00:18:16,747 --> 00:18:18,662
[ Psy's "Gangnam Style" plays ]
368
00:18:18,706 --> 00:18:20,360
♪ Oppan Gangnam style
369
00:18:20,403 --> 00:18:23,189
Ow!
370
00:18:23,232 --> 00:18:25,365
[ Laughing ]
371
00:18:25,408 --> 00:18:28,194
Charlie!
That really hurt!
372
00:18:28,237 --> 00:18:30,152
373
00:18:30,196 --> 00:18:31,327
♪ Gangnam style
374
00:18:31,371 --> 00:18:33,286
♪ Op, op, op, op
375
00:18:33,329 --> 00:18:36,202
♪ Oppan Gangnam style
376
00:18:36,245 --> 00:18:38,508
Ng: And so, after watching
YouTube for a week,
377
00:18:38,552 --> 00:18:39,988
what would it learn?
378
00:18:40,031 --> 00:18:41,903
We had a hypothesis that
it would learn to detect
379
00:18:41,946 --> 00:18:44,384
commonly occurring objects
in videos.
380
00:18:44,427 --> 00:18:47,517
And so, we know that human faces
appear a lot in videos,
381
00:18:47,561 --> 00:18:49,302
so we looked,
and, lo and behold,
382
00:18:49,345 --> 00:18:51,608
there was a neuron that had
learned to detect human faces.
383
00:18:51,652 --> 00:18:56,265
Leave Britney alone!
384
00:18:56,309 --> 00:18:58,354
Well, what else
appears in videos a lot?
385
00:18:58,398 --> 00:19:00,051
386
00:19:00,095 --> 00:19:01,792
So, we looked,
and to our surprise,
387
00:19:01,836 --> 00:19:04,882
there was actually a neuron
that had learned to detect cats.
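A rough sketch of how a detector can emerge from unlabeled data alone, in the spirit of the experiment just described (this is not the Google Brain code). A single "neuron" updated with Oja's rule drifts toward whatever pattern recurs most in the made-up frames; nobody ever labels that pattern as a cat.

import random

random.seed(0)
recurring = [1.0, 0.8, 0.2, 0.9]   # an invented pattern present in most frames

def frame():
    noise = [random.uniform(-0.3, 0.3) for _ in range(4)]
    if random.random() < 0.7:                   # most frames contain the pattern
        return [p + n for p, n in zip(recurring, noise)]
    return [random.uniform(0.0, 1.0) for _ in range(4)]   # the rest are unrelated

w = [random.uniform(0.0, 0.1) for _ in range(4)]
lr = 0.01
for _ in range(5000):
    x = frame()
    y = sum(wi * xi for wi, xi in zip(w, x))    # the neuron's response
    w = [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]   # Oja's rule

print([round(wi, 2) for wi in w])   # ends up roughly aligned with the recurring pattern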
388
00:19:04,926 --> 00:19:14,849
389
00:19:14,892 --> 00:19:17,068
I still remember
seeing recognition.
390
00:19:17,112 --> 00:19:18,635
"Wow, that's a cat.
Okay, cool.
391
00:19:18,679 --> 00:19:20,071
Great."
[ Laughs ]
392
00:19:23,162 --> 00:19:24,859
Barrat:
It's all pretty innocuous
393
00:19:24,902 --> 00:19:26,295
when you're thinking
about the future.
394
00:19:26,339 --> 00:19:29,733
It all seems kind of
harmless and benign.
395
00:19:29,777 --> 00:19:31,605
But we're making
cognitive architectures
396
00:19:31,648 --> 00:19:33,520
that will fly farther
and faster than us
397
00:19:33,563 --> 00:19:35,086
and carry a bigger payload,
398
00:19:35,130 --> 00:19:37,437
and they won't be
warm and fuzzy.
399
00:19:37,480 --> 00:19:39,656
Ferrucci: I think that,
in three to five years,
400
00:19:39,700 --> 00:19:41,702
you will see a computer system
401
00:19:41,745 --> 00:19:45,401
that will be able
to autonomously learn
402
00:19:45,445 --> 00:19:49,013
how to understand,
how to build understanding,
403
00:19:49,057 --> 00:19:51,364
not unlike the way
the human mind works.
404
00:19:53,931 --> 00:19:56,891
Whatever that lunch was,
it was certainly delicious.
405
00:19:56,934 --> 00:19:59,807
Simply some of
Robby's synthetics.
406
00:19:59,850 --> 00:20:01,635
He's your cook, too?
407
00:20:01,678 --> 00:20:04,551
Even manufactures
the raw materials.
408
00:20:04,594 --> 00:20:06,944
Come around here, Robby.
409
00:20:06,988 --> 00:20:09,773
I'll show you
how this works.
410
00:20:11,122 --> 00:20:13,342
One introduces
a sample of human food
411
00:20:13,386 --> 00:20:15,344
through this aperture.
412
00:20:15,388 --> 00:20:17,738
Down here there's a small
built-in chemical laboratory,
413
00:20:17,781 --> 00:20:19,218
where he analyzes it.
414
00:20:19,261 --> 00:20:21,263
Later, he can reproduce
identical molecules
415
00:20:21,307 --> 00:20:22,482
in any shape or quantity.
416
00:20:22,525 --> 00:20:24,614
Why, it's
a housewife's dream.
417
00:20:24,658 --> 00:20:26,834
Announcer: Meet Baxter,
418
00:20:26,877 --> 00:20:29,445
a revolutionary
new category of robots,
419
00:20:29,489 --> 00:20:30,490
with common sense.
420
00:20:30,533 --> 00:20:31,839
Baxter...
421
00:20:31,882 --> 00:20:33,449
Barrat: Baxter is
a really good example
422
00:20:33,493 --> 00:20:36,887
of the kind of competition
we face from machines.
423
00:20:36,931 --> 00:20:42,676
Baxter can do almost anything
we can do with our hands.
424
00:20:42,719 --> 00:20:45,722
Baxter costs about
what a minimum-wage worker
425
00:20:45,766 --> 00:20:47,507
makes in a year.
426
00:20:47,550 --> 00:20:48,769
But Baxter won't be
taking the place
427
00:20:48,812 --> 00:20:50,118
of one minimum-wage worker --
428
00:20:50,161 --> 00:20:51,772
He'll be taking
the place of three,
429
00:20:51,815 --> 00:20:55,515
because they never get tired,
they never take breaks.
430
00:20:55,558 --> 00:20:57,865
Gourley: That's probably the
first thing we're gonna see --
431
00:20:57,908 --> 00:20:59,475
displacement of jobs.
432
00:20:59,519 --> 00:21:01,651
They're gonna be done
quicker, faster, cheaper
433
00:21:01,695 --> 00:21:04,088
by machines.
434
00:21:04,132 --> 00:21:07,657
Our ability to even stay current
is so insanely limited
435
00:21:07,701 --> 00:21:10,138
compared to
the machines we build.
436
00:21:10,181 --> 00:21:13,446
For example, now we have this
great movement of Uber and Lyft
437
00:21:13,489 --> 00:21:15,056
kind of making
transportation cheaper
438
00:21:15,099 --> 00:21:16,405
and democratizing
transportation,
439
00:21:16,449 --> 00:21:17,711
which is great.
440
00:21:17,754 --> 00:21:19,321
The next step is gonna be
441
00:21:19,365 --> 00:21:21,149
that they're all gonna be
replaced by driverless cars,
442
00:21:21,192 --> 00:21:22,411
and then all the Uber
and Lyft drivers
443
00:21:22,455 --> 00:21:25,936
have to find
something new to do.
444
00:21:25,980 --> 00:21:28,156
Barrat: There are
4 million professional drivers
445
00:21:28,199 --> 00:21:29,723
in the United States.
446
00:21:29,766 --> 00:21:31,638
They're unemployed soon.
447
00:21:31,681 --> 00:21:34,075
7 million people
that do data entry.
448
00:21:34,118 --> 00:21:37,339
Those people
are gonna be jobless.
449
00:21:37,383 --> 00:21:40,342
A job isn't just about money,
right?
450
00:21:40,386 --> 00:21:42,605
On a biological level,
it serves a purpose.
451
00:21:42,649 --> 00:21:45,391
It becomes a defining thing.
452
00:21:45,434 --> 00:21:48,350
When the jobs went away
in any given civilization,
453
00:21:48,394 --> 00:21:50,787
it doesn't take long
until that turns into violence.
454
00:21:50,831 --> 00:21:53,312
[ Crowd chanting
in native language ]
455
00:21:53,355 --> 00:21:57,011
456
00:21:57,054 --> 00:21:59,579
[ Gunshots ]
457
00:21:59,622 --> 00:22:02,016
We face a giant divide
between rich and poor,
458
00:22:02,059 --> 00:22:05,019
because that's what automation
and AI will provoke --
459
00:22:05,062 --> 00:22:08,588
a greater divide between
the haves and the have-nots.
460
00:22:08,631 --> 00:22:10,807
Right now, it's working
into the middle class,
461
00:22:10,851 --> 00:22:12,896
into white-collar jobs.
462
00:22:12,940 --> 00:22:15,334
IBM's Watson does
business analytics
463
00:22:15,377 --> 00:22:20,600
that we used to pay a business
analyst $300 an hour to do.
464
00:22:20,643 --> 00:22:23,037
Gourley: Today, you're going
to college to be a doctor,
465
00:22:23,080 --> 00:22:25,082
to be an accountant,
to be a journalist.
466
00:22:25,126 --> 00:22:28,608
It's unclear that there's
gonna be jobs there for you.
467
00:22:28,651 --> 00:22:32,612
Ng: If someone's planning for
a 40-year career in radiology,
468
00:22:32,655 --> 00:22:34,222
just reading images,
469
00:22:34,265 --> 00:22:35,745
I think that could be
a challenge
470
00:22:35,789 --> 00:22:36,920
to the new graduates of today.
471
00:22:36,964 --> 00:22:39,227
472
00:22:39,270 --> 00:22:49,193
473
00:22:49,237 --> 00:22:50,804
474
00:22:50,847 --> 00:22:58,464
475
00:22:58,507 --> 00:23:02,729
Dr. Herman: The da Vinci robot
is currently utilized
476
00:23:02,772 --> 00:23:07,516
by a variety of surgeons
for its accuracy and its ability
477
00:23:07,560 --> 00:23:12,303
to avoid the inevitable
fluctuations of the human hand.
478
00:23:12,347 --> 00:23:17,787
479
00:23:17,831 --> 00:23:23,358
480
00:23:23,402 --> 00:23:28,494
Anybody who watches this
feels the amazingness of it.
481
00:23:30,931 --> 00:23:34,674
You look through the scope,
and you're seeing the claw hand
482
00:23:34,717 --> 00:23:36,893
holding that woman's ovary.
483
00:23:36,937 --> 00:23:42,638
Humanity was resting right here
in the hands of this robot.
484
00:23:42,682 --> 00:23:46,947
People say it's the future,
but it's not the future --
485
00:23:46,990 --> 00:23:50,516
It's the present.
486
00:23:50,559 --> 00:23:52,474
Zilis: If you think about
a surgical robot,
487
00:23:52,518 --> 00:23:54,737
there's often not a lot
of intelligence in these things,
488
00:23:54,781 --> 00:23:56,783
but over time, as we put
more and more intelligence
489
00:23:56,826 --> 00:23:58,567
into these systems,
490
00:23:58,611 --> 00:24:02,441
the surgical robots can actually
learn from each robot surgery.
491
00:24:02,484 --> 00:24:04,181
They're tracking the movements,
they're understanding
492
00:24:04,225 --> 00:24:05,966
what worked
and what didn't work.
493
00:24:06,009 --> 00:24:08,708
And eventually, the robot
for routine surgeries
494
00:24:08,751 --> 00:24:12,320
is going to be able to perform
that entirely by itself...
495
00:24:12,363 --> 00:24:13,756
or with human supervision.
496
00:24:32,558 --> 00:24:34,995
497
00:24:35,038 --> 00:24:37,214
Dr. Herman: It seems that we're
feeding it and creating it,
498
00:24:37,258 --> 00:24:42,785
but, in a way, we are a slave
to the technology,
499
00:24:42,829 --> 00:24:45,701
because we can't go back.
500
00:24:45,745 --> 00:24:47,355
[ Beep ]
501
00:24:50,053 --> 00:24:52,882
Gourley: The machines are taking
bigger and bigger bites
502
00:24:52,926 --> 00:24:57,147
out of our skill set
at an ever-increasing speed.
503
00:24:57,191 --> 00:24:59,236
And so we've got to run
faster and faster
504
00:24:59,280 --> 00:25:00,890
to keep ahead of the machines.
505
00:25:02,675 --> 00:25:04,677
How do I look?
506
00:25:04,720 --> 00:25:06,374
Good.
507
00:25:10,030 --> 00:25:11,553
Are you attracted to me?
508
00:25:11,597 --> 00:25:14,251
What? Are you attracted to me?
509
00:25:14,295 --> 00:25:17,777
You give me indications
that you are.
510
00:25:17,820 --> 00:25:20,562
I do?
Yes.
511
00:25:20,606 --> 00:25:22,608
Nolan: This is the future
we're headed into.
512
00:25:22,651 --> 00:25:26,046
We want to design
our companions.
513
00:25:26,089 --> 00:25:29,266
We're gonna like to see
a human face on AI.
514
00:25:29,310 --> 00:25:33,967
Therefore, gaming our emotions
will be depressingly easy.
515
00:25:34,010 --> 00:25:35,272
We're not that complicated.
516
00:25:35,316 --> 00:25:38,101
We're simple.
Stimulus-response.
517
00:25:38,145 --> 00:25:43,063
I can make you like me basically
by smiling at you a lot.
518
00:25:43,106 --> 00:25:45,674
AIs are gonna be fantastic
at manipulating us.
519
00:25:45,718 --> 00:25:54,640
520
00:25:54,683 --> 00:25:56,946
So, you've developed
a technology
521
00:25:56,990 --> 00:26:00,036
that can sense
what people are feeling.
522
00:26:00,080 --> 00:26:01,472
Right.
We've developed technology
523
00:26:01,516 --> 00:26:03,387
that can read
your facial expressions
524
00:26:03,431 --> 00:26:06,521
and map that to a number
of emotional states.
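A toy sketch of the mapping being described: expression intensities in, a coarse emotional-state label out. The feature names, thresholds, and labels are invented; the real system learns this mapping from data rather than using hand-written rules like these.

def guess_emotion(smile, brow_furrow, brow_raise):
    """Inputs are 0..1 intensities, as a face-tracking stage might report them."""
    if smile > 0.6:
        return "joy"
    if brow_furrow > 0.6:
        return "anger"
    if brow_raise > 0.6:
        return "surprise"
    return "neutral"

print(guess_emotion(smile=0.9, brow_furrow=0.0, brow_raise=0.1))  # -> joy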
525
00:26:06,565 --> 00:26:08,697
el Kaliouby: 15 years ago,
I had just finished
526
00:26:08,741 --> 00:26:11,482
my undergraduate studies
in computer science,
527
00:26:11,526 --> 00:26:15,008
and it struck me that I was
spending a lot of time
528
00:26:15,051 --> 00:26:17,793
interacting with my laptops
and my devices,
529
00:26:17,837 --> 00:26:23,582
yet these devices had absolutely
no clue how I was feeling.
530
00:26:23,625 --> 00:26:26,802
I started thinking, "What if
this device could sense
531
00:26:26,846 --> 00:26:29,326
that I was stressed
or I was having a bad day
532
00:26:29,370 --> 00:26:31,067
What would that open up"
533
00:26:32,721 --> 00:26:34,418
Hi, first-graders!
534
00:26:34,462 --> 00:26:35,855
How are you?
535
00:26:35,898 --> 00:26:37,813
Can I get a hug?
536
00:26:37,857 --> 00:26:40,773
We had kids interact
with the technology.
537
00:26:40,816 --> 00:26:42,862
A lot of it
is still in development,
538
00:26:42,905 --> 00:26:44,472
but it was just amazing.
539
00:26:44,515 --> 00:26:46,648
Who likes robots?
Me!
540
00:26:46,692 --> 00:26:48,911
Who wants to have a robot
in their house?
541
00:26:48,955 --> 00:26:51,479
What would you use
a robot for, Jack?
542
00:26:51,522 --> 00:26:56,353
I would use it to ask my mom
very hard math questions.
543
00:26:56,397 --> 00:26:58,181
Okay.
What about you, Theo?
544
00:26:58,225 --> 00:27:02,272
I would use it
for scaring people.
545
00:27:02,316 --> 00:27:04,666
All right.
So, start by smiling.
546
00:27:04,710 --> 00:27:06,625
Nice.
547
00:27:06,668 --> 00:27:09,018
Brow furrow.
548
00:27:09,062 --> 00:27:10,890
Nice one.
Eyebrow raise.
549
00:27:10,933 --> 00:27:12,587
This generation, technology
550
00:27:12,631 --> 00:27:15,068
is just surrounding them
all the time.
551
00:27:15,111 --> 00:27:17,853
It's almost like they expect
to have robots in their homes,
552
00:27:17,897 --> 00:27:22,336
and they expect these robots
to be socially intelligent.
553
00:27:22,379 --> 00:27:25,252
What makes robots smart?
554
00:27:25,295 --> 00:27:29,648
Put them in, like, a math
or biology class.
555
00:27:29,691 --> 00:27:32,259
I think you would
have to train it.
556
00:27:32,302 --> 00:27:35,218
All right.
Let's walk over here.
557
00:27:35,262 --> 00:27:37,394
So, if you smile and you
raise your eyebrows,
558
00:27:37,438 --> 00:27:39,005
it's gonna run over to you.
559
00:27:39,048 --> 00:27:40,833
Woman: It's coming over!
It's coming over! Look.
560
00:27:40,876 --> 00:27:43,139
[ Laughter ]
561
00:27:43,183 --> 00:27:45,272
But if you look angry,
it's gonna run away.
562
00:27:45,315 --> 00:27:46,490
[ Child growls ]
563
00:27:46,534 --> 00:27:48,797
-Awesome!
-Oh, that was good.
564
00:27:48,841 --> 00:27:52,366
We're training computers to read
and recognize emotions.
565
00:27:52,409 --> 00:27:53,846
Ready? Set? Go!
566
00:27:53,889 --> 00:27:57,414
And the response so far
has been really amazing.
567
00:27:57,458 --> 00:27:59,590
People are integrating this
into health apps,
568
00:27:59,634 --> 00:28:04,465
meditation apps, robots, cars.
569
00:28:04,508 --> 00:28:06,728
We're gonna see
how this unfolds.
570
00:28:06,772 --> 00:28:09,426
571
00:28:09,470 --> 00:28:11,602
Zilis:
Robots can contain AI,
572
00:28:11,646 --> 00:28:14,388
but the robot is just
a physical instantiation,
573
00:28:14,431 --> 00:28:16,782
and the artificial
intelligence is the brain.
574
00:28:16,825 --> 00:28:19,872
And so brains can exist purely
in software-based systems.
575
00:28:19,915 --> 00:28:22,483
They don't need to have
a physical form.
576
00:28:22,526 --> 00:28:25,094
Robots can exist without
any artificial intelligence.
577
00:28:25,138 --> 00:28:28,097
We have a lot of
dumb robots out there.
578
00:28:28,141 --> 00:28:31,753
But a dumb robot can be
a smart robot overnight,
579
00:28:31,797 --> 00:28:34,103
given the right software,
given the right sensors.
580
00:28:34,147 --> 00:28:38,629
Barrat: We can't help but impute
motive into inanimate objects.
581
00:28:38,673 --> 00:28:40,327
We do it with machines.
582
00:28:40,370 --> 00:28:41,502
We'll treat them like children.
583
00:28:41,545 --> 00:28:43,330
We'll treat them
like surrogates.
584
00:28:43,373 --> 00:28:45,027
-Goodbye!
-Goodbye!
585
00:28:45,071 --> 00:28:48,204
And we'll pay the price.
586
00:28:49,292 --> 00:28:58,998
587
00:28:59,041 --> 00:29:08,572
588
00:29:08,616 --> 00:29:10,792
Okay, welcome to ATR.
589
00:29:10,836 --> 00:29:18,060
590
00:29:25,067 --> 00:29:30,594
591
00:29:30,638 --> 00:29:36,122
592
00:29:47,786 --> 00:29:51,485
593
00:29:51,528 --> 00:29:52,791
Konnichiwa.
594
00:30:24,170 --> 00:30:29,436
595
00:30:53,677 --> 00:30:56,942
596
00:30:56,985 --> 00:30:58,682
Gourley: We build
artificial intelligence,
597
00:30:58,726 --> 00:31:02,948
and the very first thing
we want to do is replicate us.
598
00:31:02,991 --> 00:31:05,341
I think the key point will come
599
00:31:05,385 --> 00:31:09,258
when all the major senses
are replicated --
600
00:31:09,302 --> 00:31:11,130
sight...
601
00:31:11,173 --> 00:31:12,871
touch...
602
00:31:12,914 --> 00:31:14,611
smell.
603
00:31:14,655 --> 00:31:17,919
When we replicate our senses,
is that when it becomes alive?
604
00:31:17,963 --> 00:31:22,010
605
00:31:22,054 --> 00:31:24,752
[ Indistinct conversations ]
606
00:31:24,795 --> 00:31:27,581
607
00:31:27,624 --> 00:31:29,104
Nolan:
So many of our machines
608
00:31:29,148 --> 00:31:31,019
are being built
to understand us.
609
00:31:31,063 --> 00:31:32,803
[ Speaking Japanese ]
610
00:31:32,847 --> 00:31:34,805
But what happens when
an anthropomorphic creature
611
00:31:34,849 --> 00:31:37,417
discovers that they can
adjust their loyalty,
612
00:31:37,460 --> 00:31:40,028
adjust their courage,
adjust their avarice,
613
00:31:40,072 --> 00:31:42,291
adjust their cunning?
614
00:31:42,335 --> 00:31:44,815
615
00:31:44,859 --> 00:31:47,166
Musk: The average person,
they don't see killer robots
616
00:31:47,209 --> 00:31:48,645
going down the streets.
617
00:31:48,689 --> 00:31:50,996
They're like, "What are
you talking about"
618
00:31:51,039 --> 00:31:53,955
Man, we want to make sure
that we don't have killer robots
619
00:31:53,999 --> 00:31:57,045
going down the street.
620
00:31:57,089 --> 00:31:59,439
Once they're going down
the street, it is too late.
621
00:31:59,482 --> 00:32:05,010
622
00:32:05,053 --> 00:32:07,099
Russell: The thing
that worries me right now,
623
00:32:07,142 --> 00:32:08,578
that keeps me awake,
624
00:32:08,622 --> 00:32:11,842
is the development
of autonomous weapons.
625
00:32:11,886 --> 00:32:19,850
626
00:32:19,894 --> 00:32:27,771
627
00:32:27,815 --> 00:32:32,733
Up to now, people have expressed
unease about drones,
628
00:32:32,776 --> 00:32:35,127
which are remotely
piloted aircraft.
629
00:32:35,170 --> 00:32:39,783
630
00:32:39,827 --> 00:32:43,309
If you take a drone's camera
and feed it into the AI system,
631
00:32:43,352 --> 00:32:47,443
it's a very easy step from here
to fully autonomous weapons
632
00:32:47,487 --> 00:32:50,881
that choose their own targets
and release their own missiles.
633
00:32:50,925 --> 00:32:58,150
634
00:32:58,193 --> 00:33:05,374
635
00:33:05,418 --> 00:33:12,686
636
00:33:12,729 --> 00:33:15,080
The expected life-span
of a human being
637
00:33:15,123 --> 00:33:16,516
in that kind of
battle environment
638
00:33:16,559 --> 00:33:20,520
would be measured in seconds.
639
00:33:20,563 --> 00:33:23,740
Singer: At one point,
drones were science fiction,
640
00:33:23,784 --> 00:33:28,832
and now they've become
the normal thing in war.
641
00:33:28,876 --> 00:33:33,402
There's over 10,000 in
U.S. military inventory alone.
642
00:33:33,446 --> 00:33:35,274
But they're not
just a U.S. phenomena.
643
00:33:35,317 --> 00:33:39,060
There's more than 80 countries
that operate them.
644
00:33:39,104 --> 00:33:41,932
Gourley: It stands to reason
that people making some
645
00:33:41,976 --> 00:33:44,587
of the most important and
difficult decisions in the world
646
00:33:44,631 --> 00:33:46,328
are gonna start to use
and implement
647
00:33:46,372 --> 00:33:48,591
artificial intelligence.
648
00:33:48,635 --> 00:33:50,724
649
00:33:50,767 --> 00:33:53,596
The Air Force just designed
a $400-billion jet program
650
00:33:53,640 --> 00:33:55,555
to put pilots in the sky,
651
00:33:55,598 --> 00:34:01,300
and a $500 AI, designed by
a couple of graduate students,
652
00:34:01,343 --> 00:34:03,432
is beating the best human pilots
653
00:34:03,476 --> 00:34:05,782
with a relatively
simple algorithm.
654
00:34:05,826 --> 00:34:09,395
655
00:34:09,438 --> 00:34:13,399
AI will have as big an impact
on the military
656
00:34:13,442 --> 00:34:17,490
as the combustion engine
had at the turn of the century.
657
00:34:17,533 --> 00:34:18,839
It will literally touch
658
00:34:18,882 --> 00:34:21,233
everything
that the military does,
659
00:34:21,276 --> 00:34:25,324
from driverless convoys
delivering logistical supplies,
660
00:34:25,367 --> 00:34:27,021
to unmanned drones
661
00:34:27,065 --> 00:34:30,764
delivering medical aid,
to computational propaganda,
662
00:34:30,807 --> 00:34:34,246
trying to win the hearts
and minds of a population.
663
00:34:34,289 --> 00:34:38,337
And so it stands to reason
that whoever has the best AI
664
00:34:38,380 --> 00:34:41,688
will probably achieve
dominance on this planet.
665
00:34:45,561 --> 00:34:47,650
At some point in
the early 21st century,
666
00:34:47,694 --> 00:34:51,219
all of mankind was
united in celebration.
667
00:34:51,263 --> 00:34:53,830
We marveled
at our own magnificence
668
00:34:53,874 --> 00:34:56,833
as we gave birth to AI.
669
00:34:56,877 --> 00:34:58,966
AI?
670
00:34:59,009 --> 00:35:00,489
You mean
artificial intelligence?
671
00:35:00,533 --> 00:35:01,751
A singular consciousness
672
00:35:01,795 --> 00:35:05,886
that spawned
an entire race of machines.
673
00:35:05,929 --> 00:35:09,716
We don't know
who struck first -- us or them,
674
00:35:09,759 --> 00:35:12,980
but we know that it was us
that scorched the sky.
675
00:35:13,023 --> 00:35:14,634
676
00:35:14,677 --> 00:35:16,766
Singer: There's a long history
of science fiction,
677
00:35:16,810 --> 00:35:19,987
not just predicting the future,
but shaping the future.
678
00:35:20,030 --> 00:35:26,820
679
00:35:26,863 --> 00:35:30,389
Arthur Conan Doyle
writing before World War I
680
00:35:30,432 --> 00:35:34,393
on the danger of how
submarines might be used
681
00:35:34,436 --> 00:35:38,048
to carry out civilian blockades.
682
00:35:38,092 --> 00:35:40,399
At the time
he's writing this fiction,
683
00:35:40,442 --> 00:35:43,402
the Royal Navy made fun
of Arthur Conan Doyle
684
00:35:43,445 --> 00:35:45,230
for this absurd idea
685
00:35:45,273 --> 00:35:47,623
that submarines
could be useful in war.
686
00:35:47,667 --> 00:35:53,412
687
00:35:53,455 --> 00:35:55,370
One of the things
we've seen in history
688
00:35:55,414 --> 00:35:58,243
is that our attitude
towards technology,
689
00:35:58,286 --> 00:36:01,942
but also ethics,
are very context-dependent.
690
00:36:01,985 --> 00:36:03,726
For example, the submarine...
691
00:36:03,770 --> 00:36:06,468
nations like Great Britain
and even the United States
692
00:36:06,512 --> 00:36:09,863
found it horrifying
to use the submarine.
693
00:36:09,906 --> 00:36:13,214
In fact, the German use of the
submarine to carry out attacks
694
00:36:13,258 --> 00:36:18,480
was the reason why the United
States joined World War I.
695
00:36:18,524 --> 00:36:20,613
But move the timeline forward.
696
00:36:20,656 --> 00:36:23,529
Man: The United States
of America was suddenly
697
00:36:23,572 --> 00:36:28,403
and deliberately attacked
by the empire of Japan.
698
00:36:28,447 --> 00:36:32,190
Five hours after Pearl Harbor,
the order goes out
699
00:36:32,233 --> 00:36:36,498
to commit unrestricted
submarine warfare against Japan.
700
00:36:39,936 --> 00:36:44,289
So Arthur Conan Doyle
turned out to be right.
701
00:36:44,332 --> 00:36:46,856
Nolan: That's the great old line
about science fiction --
702
00:36:46,900 --> 00:36:48,336
It's a lie that tells the truth.
703
00:36:48,380 --> 00:36:51,470
Fellow executives,
it gives me great pleasure
704
00:36:51,513 --> 00:36:54,821
to introduce you to the future
of law enforcement...
705
00:36:54,864 --> 00:36:56,562
ED-209.
706
00:36:56,605 --> 00:37:03,612
707
00:37:03,656 --> 00:37:05,919
This isn't just a question
of science fiction.
708
00:37:05,962 --> 00:37:09,488
This is about what's next, about
what's happening right now.
709
00:37:09,531 --> 00:37:13,927
710
00:37:13,970 --> 00:37:17,496
The role of intelligent systems
is growing very rapidly
711
00:37:17,539 --> 00:37:19,324
in warfare.
712
00:37:19,367 --> 00:37:22,152
Everyone is pushing
in the unmanned realm.
713
00:37:22,196 --> 00:37:26,374
714
00:37:26,418 --> 00:37:28,898
Gourley: Today, the Secretary of
Defense is very, very clear --
715
00:37:28,942 --> 00:37:32,337
We will not create fully
autonomous attacking vehicles.
716
00:37:32,380 --> 00:37:34,643
Not everyone
is gonna hold themselves
717
00:37:34,687 --> 00:37:36,515
to that same set of values.
718
00:37:36,558 --> 00:37:40,693
And when China and Russia start
deploying autonomous vehicles
719
00:37:40,736 --> 00:37:45,611
that can attack and kill, what's
the move that we're gonna make?
720
00:37:45,654 --> 00:37:49,963
721
00:37:50,006 --> 00:37:51,617
Russell: You can't say,
"Well, we're gonna use
722
00:37:51,660 --> 00:37:53,967
autonomous weapons
for our military dominance,
723
00:37:54,010 --> 00:37:56,796
but no one else
is gonna use them."
724
00:37:56,839 --> 00:38:00,495
If you make these weapons,
they're gonna be used to attack
725
00:38:00,539 --> 00:38:03,324
human populations
in large numbers.
726
00:38:03,368 --> 00:38:12,507
727
00:38:12,551 --> 00:38:14,596
Autonomous weapons are,
by their nature,
728
00:38:14,640 --> 00:38:16,468
weapons of mass destruction,
729
00:38:16,511 --> 00:38:19,862
because it doesn't need a human
being to guide it or carry it.
730
00:38:19,906 --> 00:38:22,517
You only need one person,
to, you know,
731
00:38:22,561 --> 00:38:25,781
write a little program.
732
00:38:25,825 --> 00:38:30,220
It just captures
the complexity of this field.
733
00:38:30,264 --> 00:38:32,571
It is cool.
It is important.
734
00:38:32,614 --> 00:38:34,573
It is amazing.
735
00:38:34,616 --> 00:38:37,053
It is also frightening.
736
00:38:37,097 --> 00:38:38,968
And it's all about trust.
737
00:38:42,102 --> 00:38:44,583
It's an open letter about
artificial intelligence,
738
00:38:44,626 --> 00:38:47,063
signed by some of
the biggest names in science.
739
00:38:47,107 --> 00:38:48,413
What do they want
740
00:38:48,456 --> 00:38:50,763
Ban the use of
autonomous weapons.
741
00:38:50,806 --> 00:38:52,373
Woman: The author stated,
742
00:38:52,417 --> 00:38:54,375
"Autonomous weapons
have been described
743
00:38:54,419 --> 00:38:56,595
as the third revolution
in warfare."
744
00:38:56,638 --> 00:38:58,553
Woman:
artificial-intelligence
specialists
745
00:38:58,597 --> 00:39:01,817
calling for a global ban
on killer robots.
746
00:39:01,861 --> 00:39:04,342
Tegmark:
This open letter basically says
747
00:39:04,385 --> 00:39:06,344
that we should redefine the goal
748
00:39:06,387 --> 00:39:07,954
of the field of
artificial intelligence
749
00:39:07,997 --> 00:39:11,610
away from just creating pure,
undirected intelligence,
750
00:39:11,653 --> 00:39:13,655
towards creating
beneficial intelligence.
751
00:39:13,699 --> 00:39:16,092
The development of AI
is not going to stop.
752
00:39:16,136 --> 00:39:18,094
It is going to continue
and get better.
753
00:39:18,138 --> 00:39:19,835
If the international community
754
00:39:19,879 --> 00:39:21,968
isn't putting
certain controls on this,
755
00:39:22,011 --> 00:39:24,666
people will develop things
that can do anything.
756
00:39:24,710 --> 00:39:27,365
Woman: The letter says
that we are years, not decades,
757
00:39:27,408 --> 00:39:28,714
away from these weapons
being deployed.
758
00:39:28,757 --> 00:39:30,106
So first of all...
759
00:39:30,150 --> 00:39:32,413
We had 6,000 signatories
of that letter,
760
00:39:32,457 --> 00:39:35,155
including many of
the major figures in the field.
761
00:39:37,026 --> 00:39:39,942
I'm getting a lot of visits
from high-ranking officials
762
00:39:39,986 --> 00:39:42,989
who wish to emphasize that
American military dominance
763
00:39:43,032 --> 00:39:45,731
is very important,
and autonomous weapons
764
00:39:45,774 --> 00:39:50,083
may be part of
the Defense Department's plan.
765
00:39:50,126 --> 00:39:52,433
That's very, very scary,
because a value system
766
00:39:52,477 --> 00:39:54,479
of military developers
of technology
767
00:39:54,522 --> 00:39:57,307
is not the same as a value
system of the human race.
768
00:39:57,351 --> 00:40:00,746
769
00:40:00,789 --> 00:40:02,922
Markoff: Out of the concerns
about the possibility
770
00:40:02,965 --> 00:40:06,665
that this technology might be
a threat to human existence,
771
00:40:06,708 --> 00:40:08,144
a number of the technologists
772
00:40:08,188 --> 00:40:09,972
have funded
the Future of Life Institute
773
00:40:10,016 --> 00:40:12,192
to try to grapple
with these problems.
774
00:40:13,193 --> 00:40:14,847
All of these guys are secretive,
775
00:40:14,890 --> 00:40:16,805
and so it's interesting
to me to see them,
776
00:40:16,849 --> 00:40:20,635
you know, all together.
777
00:40:20,679 --> 00:40:24,030
Everything we have is a result
of our intelligence.
778
00:40:24,073 --> 00:40:26,641
It's not the result
of our big, scary teeth
779
00:40:26,685 --> 00:40:29,470
or our large claws
or our enormous muscles.
780
00:40:29,514 --> 00:40:32,473
It's because we're actually
relatively intelligent.
781
00:40:32,517 --> 00:40:35,520
And among my generation,
we're all having
782
00:40:35,563 --> 00:40:37,086
what we call "holy cow,"
783
00:40:37,130 --> 00:40:39,045
or "holy something else"
moments,
784
00:40:39,088 --> 00:40:41,003
because we see
that the technology
785
00:40:41,047 --> 00:40:44,180
is accelerating faster
than we expected.
786
00:40:44,224 --> 00:40:46,705
I remember sitting
around the table there
787
00:40:46,748 --> 00:40:50,099
with some of the best and
the smartest minds in the world,
788
00:40:50,143 --> 00:40:52,058
and what really
struck me was,
789
00:40:52,101 --> 00:40:56,149
maybe the human brain
is not able to fully grasp
790
00:40:56,192 --> 00:40:58,673
the complexity of the world
that we're confronted with.
791
00:40:58,717 --> 00:41:01,415
Russell:
As it's currently constructed,
792
00:41:01,459 --> 00:41:04,766
the road that AI is following
heads off a cliff,
793
00:41:04,810 --> 00:41:07,595
and we need to change
the direction that we're going
794
00:41:07,639 --> 00:41:10,729
so that we don't take
the human race off the cliff.
795
00:41:10,772 --> 00:41:13,514
796
00:41:13,558 --> 00:41:17,126
Musk: Google acquired DeepMind
several years ago.
797
00:41:17,170 --> 00:41:18,737
DeepMind operates
798
00:41:18,780 --> 00:41:22,088
as a semi-independent
subsidiary of Google.
799
00:41:22,131 --> 00:41:24,960
The thing that makes
DeepMind unique
800
00:41:25,004 --> 00:41:26,919
is that DeepMind
is absolutely focused
801
00:41:26,962 --> 00:41:30,313
on creating digital
superintelligence --
802
00:41:30,357 --> 00:41:34,056
an AI that is vastly smarter
than any human on Earth
803
00:41:34,100 --> 00:41:36,624
and ultimately smarter than
all humans on Earth combined.
804
00:41:36,668 --> 00:41:40,715
This is from the DeepMind
reinforcement learning system.
805
00:41:40,759 --> 00:41:43,544
Basically wakes up
like a newborn baby
806
00:41:43,588 --> 00:41:46,852
and is shown the screen
of an Atari video game
807
00:41:46,895 --> 00:41:50,508
and then has to learn
to play the video game.
808
00:41:50,551 --> 00:41:55,600
It knows nothing about objects,
about motion, about time.
809
00:41:57,602 --> 00:41:59,604
It only knows that there's
an image on the screen
810
00:41:59,647 --> 00:42:02,563
and there's a score.
811
00:42:02,607 --> 00:42:06,436
So, if your baby woke up
the day it was born
812
00:42:06,480 --> 00:42:08,090
and, by late afternoon,
813
00:42:08,134 --> 00:42:11,093
was playing
40 different Atari video games
814
00:42:11,137 --> 00:42:15,315
at a superhuman level,
you would be terrified.
815
00:42:15,358 --> 00:42:19,101
You would say, "My baby
is possessed. Send it back."
816
00:42:19,145 --> 00:42:23,584
Musk: The DeepMind system
can win at any game.
817
00:42:23,628 --> 00:42:27,588
It can already beat all
the original Atari games.
818
00:42:27,632 --> 00:42:29,155
It is superhuman.
819
00:42:29,198 --> 00:42:31,636
It plays the games at superspeed
in less than a minute.
820
00:42:31,679 --> 00:42:33,507
821
00:42:35,640 --> 00:42:37,032
822
00:42:37,076 --> 00:42:38,643
DeepMind turned
to another challenge,
823
00:42:38,686 --> 00:42:40,558
and the challenge
was the game of Go,
824
00:42:40,601 --> 00:42:42,603
which people
have generally argued
825
00:42:42,647 --> 00:42:45,084
has been beyond
the power of computers
826
00:42:45,127 --> 00:42:48,304
to play with
the best human Go players.
827
00:42:48,348 --> 00:42:51,264
First, they challenged
a European Go champion.
828
00:42:53,222 --> 00:42:55,834
Then they challenged
a Korean Go champion.
829
00:42:55,877 --> 00:42:57,836
Man:
Please start the game.
830
00:42:57,879 --> 00:42:59,838
And they were able
to win both times
831
00:42:59,881 --> 00:43:02,797
in kind of striking fashion.
832
00:43:02,841 --> 00:43:05,017
Nolan: You were reading articles
in the New York Times years ago
833
00:43:05,060 --> 00:43:09,761
talking about how Go would take
100 years for us to solve.
834
00:43:09,804 --> 00:43:11,110
Urban:
People said, "Well, you know,
835
00:43:11,153 --> 00:43:13,460
but that's still just a board game.
836
00:43:13,503 --> 00:43:15,027
Poker is an art.
837
00:43:15,070 --> 00:43:16,419
Poker involves reading people.
838
00:43:16,463 --> 00:43:18,073
Poker involves lying
and bluffing.
839
00:43:18,117 --> 00:43:19,553
It's not an exact thing.
840
00:43:19,597 --> 00:43:21,381
That will never be,
you know, a computer.
841
00:43:21,424 --> 00:43:22,861
You can't do that."
842
00:43:22,904 --> 00:43:24,732
They took the best
poker players in the world,
843
00:43:24,776 --> 00:43:27,387
and it took seven days
for the computer
844
00:43:27,430 --> 00:43:30,520
to start demolishing the humans.
845
00:43:30,564 --> 00:43:32,261
So it's the best poker player
in the world,
846
00:43:32,305 --> 00:43:34,655
it's the best Go player in the
world, and the pattern here
847
00:43:34,699 --> 00:43:37,440
is that AI might take
a little while
848
00:43:37,484 --> 00:43:40,443
to wrap its tentacles
around a new skill,
849
00:43:40,487 --> 00:43:44,883
but when it does, when it
gets it, it is unstoppable.
850
00:43:44,926 --> 00:43:51,977
851
00:43:52,020 --> 00:43:55,110
DeepMind's AI has
administrator-level access
852
00:43:55,154 --> 00:43:57,156
to Google's servers
853
00:43:57,199 --> 00:44:00,768
to optimize energy usage
at the data centers.
854
00:44:00,812 --> 00:44:04,816
However, this could be
an unintentional Trojan horse.
855
00:44:04,859 --> 00:44:07,253
DeepMind has to have complete
control of the data centers,
856
00:44:07,296 --> 00:44:08,950
so with a little
software update,
857
00:44:08,994 --> 00:44:10,691
that AI could take
complete control
858
00:44:10,735 --> 00:44:12,214
of the whole Google system,
859
00:44:12,258 --> 00:44:13,607
which means
they can do anything.
860
00:44:13,651 --> 00:44:14,913
They could look
at all your data.
861
00:44:14,956 --> 00:44:16,131
They could do anything.
862
00:44:16,175 --> 00:44:18,917
863
00:44:20,135 --> 00:44:23,051
We're rapidly heading towards
digital superintelligence
864
00:44:23,095 --> 00:44:24,313
that far exceeds any human.
865
00:44:24,357 --> 00:44:26,402
I think it's very obvious.
866
00:44:26,446 --> 00:44:27,708
Barrat:
The problem is, we're not gonna
867
00:44:27,752 --> 00:44:29,710
suddenly hit
human-level intelligence
868
00:44:29,754 --> 00:44:33,105
and say,
"Okay, let's stop research."
869
00:44:33,148 --> 00:44:34,715
It's gonna go beyond
human-level intelligence
870
00:44:34,759 --> 00:44:36,195
into what's called
"superintelligence,"
871
00:44:36,238 --> 00:44:39,459
and that's anything
smarter than us.
872
00:44:39,502 --> 00:44:41,287
Tegmark:
AI at the superhuman level,
873
00:44:41,330 --> 00:44:42,810
if we succeed with that,
will be
874
00:44:42,854 --> 00:44:46,553
by far the most powerful
invention we've ever made
875
00:44:46,596 --> 00:44:50,296
and the last invention
we ever have to make.
876
00:44:50,339 --> 00:44:53,168
And if we create AI
that's smarter than us,
877
00:44:53,212 --> 00:44:54,735
we have to be open
to the possibility
878
00:44:54,779 --> 00:44:57,520
that we might actually
lose control to them.
879
00:44:57,564 --> 00:45:00,741
880
00:45:00,785 --> 00:45:02,612
Russell: Let's say
you give it some objective,
881
00:45:02,656 --> 00:45:04,745
like curing cancer,
and then you discover
882
00:45:04,789 --> 00:45:06,965
that the way
it chooses to go about that
883
00:45:07,008 --> 00:45:08,444
is actually in conflict
884
00:45:08,488 --> 00:45:12,405
with a lot of other things
you care about.
885
00:45:12,448 --> 00:45:16,496
Musk: AI doesn't have to be evil
to destroy humanity.
886
00:45:16,539 --> 00:45:20,674
If AI has a goal, and humanity
just happens to be in the way,
887
00:45:20,718 --> 00:45:22,894
it will destroy humanity
as a matter of course,
888
00:45:22,937 --> 00:45:25,113
without even thinking about it.
No hard feelings.
889
00:45:25,157 --> 00:45:27,072
It's just like
if we're building a road
890
00:45:27,115 --> 00:45:29,770
and an anthill happens
to be in the way...
891
00:45:29,814 --> 00:45:31,467
We don't hate ants.
892
00:45:31,511 --> 00:45:33,165
We're just building a road.
893
00:45:33,208 --> 00:45:34,557
And so goodbye, anthill.
894
00:45:34,601 --> 00:45:37,952
895
00:45:37,996 --> 00:45:40,172
It's tempting
to dismiss these concerns,
896
00:45:40,215 --> 00:45:42,783
'cause it's, like,
something that might happen
897
00:45:42,827 --> 00:45:47,396
in a few decades or 100 years,
so why worry?
898
00:45:47,440 --> 00:45:50,704
Russell: But if you go back
to September 11, 1933,
899
00:45:50,748 --> 00:45:52,401
Ernest Rutherford,
900
00:45:52,445 --> 00:45:54,795
who was the most well-known
nuclear physicist of his time,
901
00:45:54,839 --> 00:45:56,318
said that the possibility
902
00:45:56,362 --> 00:45:58,668
of ever extracting
useful amounts of energy
903
00:45:58,712 --> 00:46:00,801
from the transmutation
of atoms, as he called it,
904
00:46:00,845 --> 00:46:03,151
was moonshine.
905
00:46:03,195 --> 00:46:04,849
The next morning, Leo Szilard,
906
00:46:04,892 --> 00:46:06,502
who was a much
younger physicist,
907
00:46:06,546 --> 00:46:09,984
read this and got really annoyed
and figured out
908
00:46:10,028 --> 00:46:11,943
how to make
a nuclear chain reaction
909
00:46:11,986 --> 00:46:13,379
just a few months later.
910
00:46:13,422 --> 00:46:20,560
911
00:46:20,603 --> 00:46:23,693
We have spent more
than $2 billion
912
00:46:23,737 --> 00:46:27,523
on the greatest
scientific gamble in history.
913
00:46:27,567 --> 00:46:30,222
Russell: So when people say
that, "Oh, this is so far off
914
00:46:30,265 --> 00:46:32,528
in the future, we don't have
to worry about it,"
915
00:46:32,572 --> 00:46:36,271
it might only be three, four
breakthroughs of that magnitude
916
00:46:36,315 --> 00:46:40,275
that will get us from here
to superintelligent machines.
917
00:46:40,319 --> 00:46:42,974
Tegmark: If it's gonna take
20 years to figure out
918
00:46:43,017 --> 00:46:45,237
how to keep AI beneficial,
919
00:46:45,280 --> 00:46:48,849
then we should start today,
not at the last second
920
00:46:48,893 --> 00:46:51,460
when some dudes
drinking Red Bull
921
00:46:51,504 --> 00:46:53,332
decide to flip the switch
and test the thing.
922
00:46:53,375 --> 00:46:56,770
923
00:46:56,814 --> 00:46:58,859
Musk:
We have five years.
924
00:46:58,903 --> 00:47:00,600
I think
digital superintelligence
925
00:47:00,643 --> 00:47:03,864
will happen in my lifetime.
926
00:47:03,908 --> 00:47:05,735
100%.
927
00:47:05,779 --> 00:47:07,215
Barrat: When this happens,
928
00:47:07,259 --> 00:47:09,696
it will be surrounded
by a bunch of people
929
00:47:09,739 --> 00:47:13,091
who are really just excited
about the technology.
930
00:47:13,134 --> 00:47:15,571
They want to see it succeed,
but they're not anticipating
931
00:47:15,615 --> 00:47:16,964
that it can get out of control.
932
00:47:17,008 --> 00:47:24,450
933
00:47:25,494 --> 00:47:28,584
Oh, my God, I trust
my computer so much.
934
00:47:28,628 --> 00:47:30,195
That's an amazing question.
935
00:47:30,238 --> 00:47:31,457
I don't trust
my computer.
936
00:47:31,500 --> 00:47:32,937
If it's on,
I take it off.
937
00:47:32,980 --> 00:47:34,242
Like, even when it's off,
938
00:47:34,286 --> 00:47:35,896
I still think it's on.
Like, you know
939
00:47:35,940 --> 00:47:37,637
Like, you really cannot tru--
Like, the webcams,
940
00:47:37,680 --> 00:47:39,595
you don't know if, like,
someone might turn it...
941
00:47:39,639 --> 00:47:41,249
You don't know, like.
942
00:47:41,293 --> 00:47:42,903
I don't trust my computer.
943
00:47:42,947 --> 00:47:46,907
Like, in my phone,
every time they ask me
944
00:47:46,951 --> 00:47:49,475
"Can we send your
information to Apple?"
945
00:47:49,518 --> 00:47:50,998
every time, I...
946
00:47:51,042 --> 00:47:53,087
So, I don't trust my phone.
947
00:47:53,131 --> 00:47:56,743
Okay. So, part of it is,
yes, I do trust it,
948
00:47:56,786 --> 00:48:00,660
because it would be really
hard to get through the day
949
00:48:00,703 --> 00:48:04,011
in the way our world is
set up without computers.
950
00:48:04,055 --> 00:48:05,360
951
00:48:05,404 --> 00:48:07,232
952
00:48:10,975 --> 00:48:13,368
Dr. Herman: Trust is
such a human experience.
953
00:48:13,412 --> 00:48:21,246
954
00:48:21,289 --> 00:48:25,119
I have a patient coming in
with an intracranial aneurysm.
955
00:48:25,163 --> 00:48:29,994
956
00:48:30,037 --> 00:48:31,691
They want to look
in my eyes and know
957
00:48:31,734 --> 00:48:34,955
that they can trust
this person with their life.
958
00:48:34,999 --> 00:48:39,394
I'm not horribly concerned
about anything.
959
00:48:39,438 --> 00:48:40,830
Good.
Part of that
960
00:48:40,874 --> 00:48:42,920
is because
I have confidence in you.
961
00:48:42,963 --> 00:48:50,710
962
00:48:50,753 --> 00:48:52,233
This procedure
we're doing today
963
00:48:52,277 --> 00:48:57,151
20 years ago
was essentially impossible.
964
00:48:57,195 --> 00:49:00,328
We just didn't have the
materials and the technologies.
965
00:49:04,202 --> 00:49:13,385
966
00:49:13,428 --> 00:49:22,655
967
00:49:22,698 --> 00:49:26,485
So, the coil is barely
in there right now.
968
00:49:26,528 --> 00:49:29,923
It's just a feather
holding it in.
969
00:49:29,967 --> 00:49:32,012
It's nervous time.
970
00:49:32,056 --> 00:49:36,147
971
00:49:36,190 --> 00:49:37,626
We're just in purgatory,
972
00:49:37,670 --> 00:49:40,673
intellectual,
humanistic purgatory,
973
00:49:40,716 --> 00:49:43,632
and AI might know
exactly what to do here.
974
00:49:43,676 --> 00:49:50,596
975
00:49:50,639 --> 00:49:52,554
We've got the coil
into the aneurysm.
976
00:49:52,598 --> 00:49:54,556
But it wasn't in
so tremendously well
977
00:49:54,600 --> 00:49:56,428
that I knew that it would stay,
978
00:49:56,471 --> 00:50:01,041
so with a maybe 20% risk
of a very bad situation,
979
00:50:01,085 --> 00:50:04,436
I elected
to just bring her back.
980
00:50:04,479 --> 00:50:05,959
Because of my relationship
with her
981
00:50:06,003 --> 00:50:08,222
and knowing the difficulties
of coming in
982
00:50:08,266 --> 00:50:11,051
and having the procedure,
I consider things,
983
00:50:11,095 --> 00:50:14,272
when I should only consider
the safest possible route
984
00:50:14,315 --> 00:50:16,361
to achieve success.
985
00:50:16,404 --> 00:50:19,755
But I had to stand there for
10 minutes agonizing about it.
986
00:50:19,799 --> 00:50:21,757
The computer feels nothing.
987
00:50:21,801 --> 00:50:24,760
The computer just does
what it's supposed to do,
988
00:50:24,804 --> 00:50:26,284
better and better.
989
00:50:26,327 --> 00:50:30,288
990
00:50:30,331 --> 00:50:32,551
I want to be AI in this case.
991
00:50:35,945 --> 00:50:38,861
But can AI be compassionate?
992
00:50:38,905 --> 00:50:43,040
993
00:50:43,083 --> 00:50:47,827
I mean, it's everybody's
question about AI.
994
00:50:47,870 --> 00:50:51,961
We are the sole
embodiment of humanity,
995
00:50:52,005 --> 00:50:55,269
and it's a stretch for us
to accept that a machine
996
00:50:55,313 --> 00:50:58,794
can be compassionate
and loving in that way.
997
00:50:58,838 --> 00:51:05,105
998
00:51:05,149 --> 00:51:07,281
Part of me
doesn't believe in magic,
999
00:51:07,325 --> 00:51:09,805
but part of me has faith
that there is something
1000
00:51:09,849 --> 00:51:11,546
beyond the sum of the parts,
1001
00:51:11,590 --> 00:51:15,637
that there is at least a oneness
in our shared ancestry,
1002
00:51:15,681 --> 00:51:20,338
our shared biology,
our shared history.
1003
00:51:20,381 --> 00:51:23,210
Some connection there
beyond machine.
1004
00:51:23,254 --> 00:51:30,304
1005
00:51:30,348 --> 00:51:32,567
So, then, you have
the other side of that, is,
1006
00:51:32,611 --> 00:51:34,047
does the computer
know it's conscious,
1007
00:51:34,091 --> 00:51:37,137
or can it be conscious,
or does it care?
1008
00:51:37,181 --> 00:51:40,009
Does it need to be conscious?
1009
00:51:40,053 --> 00:51:42,011
Does it need to be aware?
1010
00:51:42,055 --> 00:51:47,365
1011
00:51:47,408 --> 00:51:52,848
1012
00:51:52,892 --> 00:51:56,417
I do not think that a robot
could ever be conscious.
1013
00:51:56,461 --> 00:51:58,376
Unless they programmed it
that way.
1014
00:51:58,419 --> 00:52:00,639
Conscious? No.
1015
00:52:00,682 --> 00:52:03,163
No.
No.
1016
00:52:03,207 --> 00:52:06,035
I mean, I think a robot could be
programmed to be conscious.
1017
00:52:06,079 --> 00:52:09,648
How are they programmed
to do everything else?
1018
00:52:09,691 --> 00:52:12,390
That's another big part
of artificial intelligence,
1019
00:52:12,433 --> 00:52:15,741
is to make them conscious
and make them feel.
1020
00:52:17,003 --> 00:52:22,400
1021
00:52:22,443 --> 00:52:26,230
Lipson: Back in 2005, we started
trying to build machines
1022
00:52:26,273 --> 00:52:27,709
with self-awareness.
1023
00:52:27,753 --> 00:52:33,062
1024
00:52:33,106 --> 00:52:37,284
This robot, to begin with,
didn't know what it was.
1025
00:52:37,328 --> 00:52:40,244
All it knew was that it needed
to do something like walk.
1026
00:52:40,287 --> 00:52:44,073
1027
00:52:44,117 --> 00:52:45,597
Through trial and error,
1028
00:52:45,640 --> 00:52:49,731
it figured out how to walk
using its imagination,
1029
00:52:49,775 --> 00:52:54,040
and then it walked away.
1030
00:52:54,083 --> 00:52:56,390
And then we did
something very cruel.
1031
00:52:56,434 --> 00:52:58,653
We chopped off a leg
and watched what happened.
1032
00:52:58,697 --> 00:53:03,005
1033
00:53:03,049 --> 00:53:07,749
At the beginning, it didn't
quite know what had happened.
1034
00:53:07,793 --> 00:53:13,233
But over about a period
of a day, it then began to limp.
1035
00:53:13,277 --> 00:53:16,845
And then, a year ago,
we were training an AI system
1036
00:53:16,889 --> 00:53:20,240
for a live demonstration.
1037
00:53:20,284 --> 00:53:21,763
We wanted to show how we wave
1038
00:53:21,807 --> 00:53:24,113
all these objects
in front of the camera
1039
00:53:24,157 --> 00:53:27,334
and the AI could
recognize the objects.
1040
00:53:27,378 --> 00:53:29,031
And so, we're preparing
this demo,
1041
00:53:29,075 --> 00:53:31,251
and we had on a side screen
this ability
1042
00:53:31,295 --> 00:53:36,778
to watch what certain
neurons were responding to.
1043
00:53:36,822 --> 00:53:39,041
And suddenly we noticed
that one of the neurons
1044
00:53:39,085 --> 00:53:41,087
was tracking faces.
1045
00:53:41,130 --> 00:53:45,483
It was tracking our faces
as we were moving around.
1046
00:53:45,526 --> 00:53:48,616
Now, the spooky thing about this
is that we never trained
1047
00:53:48,660 --> 00:53:52,490
the system
to recognize human faces,
1048
00:53:52,533 --> 00:53:55,710
and yet, somehow,
it learned to do that.
1049
00:53:57,973 --> 00:53:59,584
Even though these robots
are very simple,
1050
00:53:59,627 --> 00:54:02,500
we can see there's
something else going on there.
1051
00:54:02,543 --> 00:54:05,851
It's not just programming.
1052
00:54:05,894 --> 00:54:08,462
So, this is just the beginning.
1053
00:54:10,377 --> 00:54:14,294
Horvitz: I often think about
that beach in Kitty Hawk,
1054
00:54:14,338 --> 00:54:18,255
the 1903 flight
by Orville and Wilbur Wright.
1055
00:54:21,214 --> 00:54:24,348
It was kind of a canvas plane,
and it's wood and iron,
1056
00:54:24,391 --> 00:54:26,828
and it gets off the ground for,
what, a minute and 20 seconds,
1057
00:54:26,872 --> 00:54:29,091
on this windy day
1058
00:54:29,135 --> 00:54:31,006
before touching back down again.
1059
00:54:33,270 --> 00:54:37,143
And it was
just around 65 summers or so
1060
00:54:37,186 --> 00:54:43,149
after that moment that you have
a 747 taking off from JFK...
1061
00:54:43,192 --> 00:54:50,156
1062
00:54:50,199 --> 00:54:51,984
...where a major concern
of someone on the airplane
1063
00:54:52,027 --> 00:54:55,422
might be whether or not
their salt-free diet meal
1064
00:54:55,466 --> 00:54:56,902
is gonna be coming to them
or not.
1065
00:54:56,945 --> 00:54:58,469
We have a whole infrastructure,
1066
00:54:58,512 --> 00:55:01,385
with travel agents
and tower control,
1067
00:55:01,428 --> 00:55:03,778
and it's all casual,
and it's all part of the world.
1068
00:55:03,822 --> 00:55:07,042
1069
00:55:07,086 --> 00:55:09,523
Right now, as far
as we've come with machines
1070
00:55:09,567 --> 00:55:12,134
that think and solve problems,
we're at Kitty Hawk now.
1071
00:55:12,178 --> 00:55:13,745
We're in the wind.
1072
00:55:13,788 --> 00:55:17,052
We have our tattered-canvas
planes up in the air.
1073
00:55:17,096 --> 00:55:20,882
1074
00:55:20,926 --> 00:55:23,885
But what happens
in 65 summers or so?
1075
00:55:23,929 --> 00:55:27,889
We will have machines
that are beyond human control.
1076
00:55:27,933 --> 00:55:30,457
Should we worry about that?
1077
00:55:30,501 --> 00:55:32,590
1078
00:55:32,633 --> 00:55:34,853
I'm not sure it's going to help.
1079
00:55:40,337 --> 00:55:44,036
Kaplan: Nobody has any idea
today what it means for a robot
1080
00:55:44,079 --> 00:55:46,430
to be conscious.
1081
00:55:46,473 --> 00:55:48,649
There is no such thing.
1082
00:55:48,693 --> 00:55:50,172
There are a lot of smart people,
1083
00:55:50,216 --> 00:55:53,088
and I have a great deal
of respect for them,
1084
00:55:53,132 --> 00:55:57,528
but the truth is, machines
are natural psychopaths.
1085
00:55:57,571 --> 00:55:59,225
Man:
Fear came back into the market.
1086
00:55:59,268 --> 00:56:01,706
Man:
nearly 1,000, in a heartbeat.
1087
00:56:01,749 --> 00:56:03,360
I mean,
it is classic capitulation.
1088
00:56:03,403 --> 00:56:04,796
There are some people
who are proposing
1089
00:56:04,839 --> 00:56:07,146
it was some kind
of fat-finger error.
1090
00:56:07,189 --> 00:56:09,583
Take the Flash Crash of 2010.
1091
00:56:09,627 --> 00:56:13,413
In a matter of minutes,
$1 trillion in value
1092
00:56:13,457 --> 00:56:15,415
was lost in the stock market.
1093
00:56:15,459 --> 00:56:18,984
Woman: The Dow dropped nearly
1,000 points in a half-hour.
1094
00:56:19,027 --> 00:56:22,553
Kaplan:
So, what went wrong?
1095
00:56:22,596 --> 00:56:26,644
By that point in time,
more than 60% of all the trades
1096
00:56:26,687 --> 00:56:29,124
that took place
on the stock exchange
1097
00:56:29,168 --> 00:56:32,693
were actually being
initiated by computers.
1098
00:56:32,737 --> 00:56:34,216
Man:
Panic selling on the way down,
1099
00:56:34,260 --> 00:56:35,783
and all of a sudden
it stopped on a dime.
1100
00:56:35,827 --> 00:56:37,611
Man:
in real time, folks.
1101
00:56:37,655 --> 00:56:39,526
Wisz: The short story of what
happened in the Flash Crash
1102
00:56:39,570 --> 00:56:42,399
is that algorithms
responded to algorithms,
1103
00:56:42,442 --> 00:56:45,358
and it compounded upon itself
over and over and over again
1104
00:56:45,402 --> 00:56:47,012
in a matter of minutes.
1105
00:56:47,055 --> 00:56:50,972
Man: At one point, the market
fell as if down a well.
1106
00:56:51,016 --> 00:56:54,323
There is no regulatory body
that can adapt quickly enough
1107
00:56:54,367 --> 00:56:57,979
to prevent potentially
disastrous consequences
1108
00:56:58,023 --> 00:57:01,243
of AI operating
in our financial systems.
1109
00:57:01,287 --> 00:57:03,898
They are so prime
for manipulation.
1110
00:57:03,942 --> 00:57:05,639
Let's talk about the speed
with which
1111
00:57:05,683 --> 00:57:08,076
we are watching
this market deteriorate.
1112
00:57:08,120 --> 00:57:11,602
That's the type of AI-run-amuck
that scares people.
1113
00:57:11,645 --> 00:57:13,560
Kaplan:
When you give them a goal,
1114
00:57:13,604 --> 00:57:17,825
they will relentlessly
pursue that goal.
1115
00:57:17,869 --> 00:57:20,393
How many computer programs
are there like this?
1116
00:57:20,437 --> 00:57:23,483
Nobody knows.
1117
00:57:23,527 --> 00:57:27,444
Kosinski: One of the fascinating
aspects about AI in general
1118
00:57:27,487 --> 00:57:31,970
is that no one really
understands how it works.
1119
00:57:32,013 --> 00:57:36,975
Even the people who create AI
don't really fully understand.
1120
00:57:37,018 --> 00:57:39,804
Because it has millions
of elements,
1121
00:57:39,847 --> 00:57:41,675
it becomes completely impossible
1122
00:57:41,719 --> 00:57:45,113
for a human being
to understand what's going on.
1123
00:57:45,157 --> 00:57:52,512
1124
00:57:52,556 --> 00:57:56,037
Grassegger: Microsoft had set up
this artificial intelligence
1125
00:57:56,081 --> 00:57:59,127
called Tay on Twitter,
which was a chatbot.
1126
00:58:00,912 --> 00:58:02,696
They started out in the morning,
1127
00:58:02,740 --> 00:58:06,526
and Tay was starting to tweet
and learning from stuff
1128
00:58:06,570 --> 00:58:10,835
that was being sent to him
from other Twitter people.
1129
00:58:10,878 --> 00:58:13,272
Because some people,
like trolls, attacked him,
1130
00:58:13,315 --> 00:58:18,582
within 24 hours, the Microsoft
bot became a terrible person.
1131
00:58:18,625 --> 00:58:21,367
They had to literally
pull Tay off the Net
1132
00:58:21,410 --> 00:58:24,718
because he had turned
into a monster.
1133
00:58:24,762 --> 00:58:30,550
A misanthropic, racist, horrible
person you'd never want to meet.
1134
00:58:30,594 --> 00:58:32,857
And nobody had foreseen this.
1135
00:58:35,337 --> 00:58:38,602
The whole idea of AI is that
we are not telling it exactly
1136
00:58:38,645 --> 00:58:42,780
how to achieve a given
outcome or a goal.
1137
00:58:42,823 --> 00:58:46,435
AI develops on its own.
1138
00:58:46,479 --> 00:58:48,829
Nolan: We're worried about
superintelligent AI,
1139
00:58:48,873 --> 00:58:52,790
the master chess player
that will outmaneuver us,
1140
00:58:52,833 --> 00:58:55,923
but AI won't have to
actually be that smart
1141
00:58:55,967 --> 00:59:00,145
to have massively disruptive
effects on human civilization.
1142
00:59:00,188 --> 00:59:01,886
We've seen over the last century
1143
00:59:01,929 --> 00:59:05,150
it doesn't necessarily take
a genius to knock history off
1144
00:59:05,193 --> 00:59:06,804
in a particular direction,
1145
00:59:06,847 --> 00:59:09,589
and it won't take a genius AI
to do the same thing.
1146
00:59:09,633 --> 00:59:13,158
Bogus election news stories
generated more engagement
1147
00:59:13,201 --> 00:59:17,075
on Facebook
than top real stories.
1148
00:59:17,118 --> 00:59:21,079
Facebook really is
the elephant in the room.
1149
00:59:21,122 --> 00:59:23,777
Kosinski:
AI running Facebook news feed --
1150
00:59:23,821 --> 00:59:28,347
The task for AI
is keeping users engaged,
1151
00:59:28,390 --> 00:59:29,827
but no one really understands
1152
00:59:29,870 --> 00:59:34,832
exactly how this AI
is achieving this goal.
1153
00:59:34,875 --> 00:59:38,792
Nolan: Facebook is building an
elegant mirrored wall around us.
1154
00:59:38,836 --> 00:59:41,665
A mirror that we can ask,
"Who's the fairest of them all"
1155
00:59:41,708 --> 00:59:45,016
and it will answer, "You, you,"
time and again
1156
00:59:45,059 --> 00:59:48,193
and slowly begin
to warp our sense of reality,
1157
00:59:48,236 --> 00:59:53,502
warp our sense of politics,
history, global events,
1158
00:59:53,546 --> 00:59:57,028
until determining what's true
and what's not true
1159
00:59:57,071 --> 00:59:58,943
is virtually impossible.
1160
01:00:01,032 --> 01:00:03,861
The problem is that AI
doesn't understand that.
1161
01:00:03,904 --> 01:00:08,039
AI just had a mission --
maximize user engagement,
1162
01:00:08,082 --> 01:00:10,041
and it achieved that.
1163
01:00:10,084 --> 01:00:13,653
Nearly 2 billion people
spend nearly one hour
1164
01:00:13,697 --> 01:00:17,831
on average a day
basically interacting with AI
1165
01:00:17,875 --> 01:00:21,530
that is shaping
their experience.
1166
01:00:21,574 --> 01:00:24,664
Even Facebook engineers,
they don't like fake news.
1167
01:00:24,708 --> 01:00:26,666
It's very bad business.
1168
01:00:26,710 --> 01:00:28,015
They want to get rid
of fake news.
1169
01:00:28,059 --> 01:00:29,974
It's just very difficult
to do because,
1170
01:00:30,017 --> 01:00:32,324
how do you recognize news
as fake
1171
01:00:32,367 --> 01:00:34,456
if you cannot read
all of that news personally?
1172
01:00:34,500 --> 01:00:39,418
There's so much
active misinformation
1173
01:00:39,461 --> 01:00:41,115
and it's packaged very well,
1174
01:00:41,159 --> 01:00:44,553
and it looks the same when
you see it on a Facebook page
1175
01:00:44,597 --> 01:00:47,426
or you turn on your television.
1176
01:00:47,469 --> 01:00:49,210
Nolan:
It's not terribly sophisticated,
1177
01:00:49,254 --> 01:00:51,691
but it is terribly powerful.
1178
01:00:51,735 --> 01:00:54,346
And what it means is
that your view of the world,
1179
01:00:54,389 --> 01:00:56,435
which, 20 years ago,
was determined,
1180
01:00:56,478 --> 01:01:00,004
if you watched the nightly news,
by three different networks,
1181
01:01:00,047 --> 01:01:02,528
the three anchors who endeavored
to try to get it right.
1182
01:01:02,571 --> 01:01:04,225
Might have had a little bias
one way or the other,
1183
01:01:04,269 --> 01:01:05,923
but, largely speaking,
we could all agree
1184
01:01:05,966 --> 01:01:08,273
on an objective reality.
1185
01:01:08,316 --> 01:01:10,754
Well, that objectivity is gone,
1186
01:01:10,797 --> 01:01:13,757
and Facebook has
completely annihilated it.
1187
01:01:13,800 --> 01:01:17,064
1188
01:01:17,108 --> 01:01:19,197
If most of your understanding
of how the world works
1189
01:01:19,240 --> 01:01:20,807
is derived from Facebook,
1190
01:01:20,851 --> 01:01:23,418
facilitated
by algorithmic software
1191
01:01:23,462 --> 01:01:27,118
that tries to show you
the news you want to see,
1192
01:01:27,161 --> 01:01:28,815
that's a terribly
dangerous thing.
1193
01:01:28,859 --> 01:01:33,080
And the idea that we have not
only set that in motion,
1194
01:01:33,124 --> 01:01:37,258
but allowed bad-faith actors
access to that information...
1195
01:01:37,302 --> 01:01:39,565
I mean, this is a recipe
for disaster.
1196
01:01:39,608 --> 01:01:43,134
1197
01:01:43,177 --> 01:01:45,876
Urban: I think that there will
definitely be lots of bad actors
1198
01:01:45,919 --> 01:01:48,922
trying to manipulate the world
with AI.
1199
01:01:48,966 --> 01:01:52,143
2016 was a perfect example
of an election
1200
01:01:52,186 --> 01:01:55,015
where there was lots of AI
producing lots of fake news
1201
01:01:55,059 --> 01:01:58,323
and distributing it
for a purpose, for a result.
1202
01:01:58,366 --> 01:01:59,846
1203
01:01:59,890 --> 01:02:02,283
Ladies and gentlemen,
honorable colleagues...
1204
01:02:02,327 --> 01:02:04,546
it's my privilege
to speak to you today
1205
01:02:04,590 --> 01:02:07,985
about the power of big data
and psychographics
1206
01:02:08,028 --> 01:02:09,682
in the electoral process
1207
01:02:09,726 --> 01:02:12,206
and, specifically,
to talk about the work
1208
01:02:12,250 --> 01:02:14,513
that we contributed
to Senator Cruz's
1209
01:02:14,556 --> 01:02:16,558
presidential primary campaign.
1210
01:02:16,602 --> 01:02:19,910
Nolan: Cambridge Analytica
emerged quietly as a company
1211
01:02:19,953 --> 01:02:21,563
that, according to its own hype,
1212
01:02:21,607 --> 01:02:26,307
has the ability to use
this tremendous amount of data
1213
01:02:26,351 --> 01:02:30,137
in order
to effect societal change.
1214
01:02:30,181 --> 01:02:33,358
In 2016, they had
three major clients.
1215
01:02:33,401 --> 01:02:34,794
Ted Cruz was one of them.
1216
01:02:34,838 --> 01:02:37,884
It's easy to forget
that, only 18 months ago,
1217
01:02:37,928 --> 01:02:41,148
Senator Cruz was one of
the less popular candidates
1218
01:02:41,192 --> 01:02:42,846
seeking nomination.
1219
01:02:42,889 --> 01:02:47,241
So, what was not possible maybe,
like, 10 or 15 years ago,
1220
01:02:47,285 --> 01:02:49,374
was that you can send fake news
1221
01:02:49,417 --> 01:02:52,420
to exactly the people
that you want to send it to.
1222
01:02:52,464 --> 01:02:56,685
And then you could actually see
how he or she reacts on Facebook
1223
01:02:56,729 --> 01:02:58,905
and then adjust that information
1224
01:02:58,949 --> 01:03:01,778
according to the feedback
that you got.
1225
01:03:01,821 --> 01:03:03,257
So you can start developing
1226
01:03:03,301 --> 01:03:06,130
kind of a real-time management
of a population.
1227
01:03:06,173 --> 01:03:08,697
In this case, we've zoned in
1228
01:03:08,741 --> 01:03:10,699
on a group
we've called "Persuasion."
1229
01:03:10,743 --> 01:03:13,746
These are people who are
definitely going to vote,
1230
01:03:13,790 --> 01:03:16,705
to caucus, but they need
moving from the center
1231
01:03:16,749 --> 01:03:18,490
a little bit more
towards the right.
1232
01:03:18,533 --> 01:03:19,708
in order to support Cruz.
1233
01:03:19,752 --> 01:03:22,059
They need a persuasion message.
1234
01:03:22,102 --> 01:03:23,800
"Gun rights," I've selected.
1235
01:03:23,843 --> 01:03:25,802
That narrows the field
slightly more.
1236
01:03:25,845 --> 01:03:29,066
And now we know that we need
a message on gun rights,
1237
01:03:29,109 --> 01:03:31,111
it needs to be
a persuasion message,
1238
01:03:31,155 --> 01:03:32,591
and it needs to be nuanced
1239
01:03:32,634 --> 01:03:34,201
according to
the certain personality
1240
01:03:34,245 --> 01:03:36,029
that we're interested in.
1241
01:03:36,073 --> 01:03:39,946
Through social media, there's an
infinite amount of information
1242
01:03:39,990 --> 01:03:42,514
that you can gather
about a person.
1243
01:03:42,557 --> 01:03:45,734
We have somewhere close
to 4,000 or 5,000 data points
1244
01:03:45,778 --> 01:03:48,563
on every adult
in the United States.
1245
01:03:48,607 --> 01:03:51,915
Grassegger: It's about targeting
the individual.
1246
01:03:51,958 --> 01:03:54,352
It's like a weapon,
which can be used
1247
01:03:54,395 --> 01:03:55,962
in the totally wrong direction.
1248
01:03:56,006 --> 01:03:58,051
That's the problem
with all of this data.
1249
01:03:58,095 --> 01:04:02,229
It's almost as if we built the
bullet before we built the gun.
1250
01:04:02,273 --> 01:04:04,362
Ted Cruz employed our data,
1251
01:04:04,405 --> 01:04:06,407
our behavioral insights.
1252
01:04:06,451 --> 01:04:09,541
He started from a base
of less than 5%
1253
01:04:09,584 --> 01:04:15,590
and had a very slow-and-steady-
but-firm rise to above 35%,
1254
01:04:15,634 --> 01:04:17,157
making him, obviously,
1255
01:04:17,201 --> 01:04:20,465
the second most threatening
contender in the race.
1256
01:04:20,508 --> 01:04:23,120
Now, clearly, the Cruz
campaign is over now,
1257
01:04:23,163 --> 01:04:24,904
but what I can tell you
1258
01:04:24,948 --> 01:04:28,168
is that of the two candidates
left in this election,
1259
01:04:28,212 --> 01:04:30,867
one of them is using
these technologies.
1260
01:04:32,564 --> 01:04:35,959
I, Donald John Trump,
do solemnly swear
1261
01:04:36,002 --> 01:04:38,222
that I will faithfully execute
1262
01:04:38,265 --> 01:04:42,226
the office of President
of the United States.
1263
01:04:42,269 --> 01:04:46,273
1264
01:04:48,275 --> 01:04:50,234
Nolan: Elections are
a marginal exercise.
1265
01:04:50,277 --> 01:04:53,237
It doesn't take
a very sophisticated AI
1266
01:04:53,280 --> 01:04:57,719
in order to have
a disproportionate impact.
1267
01:04:57,763 --> 01:05:02,550
Before Trump, Brexit was
another supposed client.
1268
01:05:02,594 --> 01:05:04,726
Well, at 20 minutes to 5:00,
1269
01:05:04,770 --> 01:05:08,730
we can now say
the decision taken in 1975
1270
01:05:08,774 --> 01:05:10,950
by this country to join
the common market
1271
01:05:10,994 --> 01:05:15,999
has been reversed by this
referendum to leave the EU.
1272
01:05:16,042 --> 01:05:19,828
Nolan: Cambridge Analytica
allegedly uses AI
1273
01:05:19,872 --> 01:05:23,267
to push through two of
the most ground-shaking pieces
1274
01:05:23,310 --> 01:05:27,967
of political change
in the last 50 years.
1275
01:05:28,011 --> 01:05:30,709
These are epochal events,
and if we believe the hype,
1276
01:05:30,752 --> 01:05:33,755
they are connected directly
to a piece of software,
1277
01:05:33,799 --> 01:05:37,194
essentially, created
by a professor at Stanford.
1278
01:05:37,237 --> 01:05:41,415
1279
01:05:41,459 --> 01:05:43,635
Kosinski:
Back in 2013, I described
1280
01:05:43,678 --> 01:05:45,593
that what they are doing
is possible
1281
01:05:45,637 --> 01:05:49,293
and warned against this
happening in the future.
1282
01:05:49,336 --> 01:05:51,382
Grassegger:
At the time, Michal Kosinski
1283
01:05:51,425 --> 01:05:52,949
was a young Polish researcher
1284
01:05:52,992 --> 01:05:54,994
working at the
Psychometrics Centre.
1285
01:05:55,038 --> 01:06:00,217
So, what Michal had done was to
gather the largest-ever data set
1286
01:06:00,260 --> 01:06:03,481
of how people
behave on Facebook.
1287
01:06:03,524 --> 01:06:07,789
Kosinski:
Psychometrics is trying
to measure psychological traits,
1288
01:06:07,833 --> 01:06:09,922
such as personality,
intelligence,
1289
01:06:09,966 --> 01:06:11,880
political views, and so on.
1290
01:06:11,924 --> 01:06:15,058
Now, traditionally,
those traits were measured
1291
01:06:15,101 --> 01:06:17,712
using tests and questions.
1292
01:06:17,756 --> 01:06:19,410
Nolan: Personality test --
the most benign thing
1293
01:06:19,453 --> 01:06:20,715
you could possibly think of.
1294
01:06:20,759 --> 01:06:22,065
Something that doesn't
necessarily have
1295
01:06:22,108 --> 01:06:24,197
a lot of utility, right?
1296
01:06:24,241 --> 01:06:27,331
Kosinski: Our idea was that
instead of tests and questions,
1297
01:06:27,374 --> 01:06:30,029
we could simply look at the
digital footprints of behaviors
1298
01:06:30,073 --> 01:06:32,553
that we are all leaving behind
1299
01:06:32,597 --> 01:06:34,903
to understand openness,
1300
01:06:34,947 --> 01:06:37,732
conscientiousness,
neuroticism.
1301
01:06:37,776 --> 01:06:39,560
Grassegger: You can easily buy
personal data,
1302
01:06:39,604 --> 01:06:43,129
such as where you live, what
club memberships you've tried,
1303
01:06:43,173 --> 01:06:45,044
which gym you go to.
1304
01:06:45,088 --> 01:06:47,873
There are actually marketplaces
for personal data.
1305
01:06:47,916 --> 01:06:49,918
Nolan: It turns out, we can
discover an awful lot
1306
01:06:49,962 --> 01:06:51,442
about what you're gonna do
1307
01:06:51,485 --> 01:06:55,750
based on a very, very tiny
set of information.
1308
01:06:55,794 --> 01:06:58,275
Kosinski: We are training
deep-learning networks
1309
01:06:58,318 --> 01:07:01,278
to infer intimate traits,
1310
01:07:01,321 --> 01:07:04,759
people's political views,
personality,
1311
01:07:04,803 --> 01:07:07,806
intelligence,
sexual orientation
1312
01:07:07,849 --> 01:07:10,504
just from an image
of someone's face.
1313
01:07:10,548 --> 01:07:17,033
1314
01:07:17,076 --> 01:07:20,645
Now think about countries which
are not so free and open-minded.
1315
01:07:20,688 --> 01:07:23,300
If you can reveal people's
religious views
1316
01:07:23,343 --> 01:07:25,954
or political views
or sexual orientation
1317
01:07:25,998 --> 01:07:28,740
based on only profile pictures,
1318
01:07:28,783 --> 01:07:33,310
this could be literally
an issue of life and death.
1319
01:07:33,353 --> 01:07:36,965
1320
01:07:37,009 --> 01:07:39,751
I think there's no going back.
1321
01:07:42,145 --> 01:07:44,321
Do you know what
the Turing test is?
1322
01:07:44,364 --> 01:07:48,977
It's when a human interacts
with a computer,
1323
01:07:49,021 --> 01:07:50,805
and if the human doesn't know
they're interacting
1324
01:07:50,849 --> 01:07:52,546
with a computer,
1325
01:07:52,590 --> 01:07:54,026
the test is passed.
1326
01:07:54,070 --> 01:07:57,247
And over the next few days,
1327
01:07:57,290 --> 01:07:59,684
you're gonna be the human
component in a Turing test.
1328
01:07:59,727 --> 01:08:02,295
Holy shit. Yeah, that's right, Caleb.
1329
01:08:02,339 --> 01:08:04,080
You got it.
1330
01:08:04,123 --> 01:08:06,865
'Cause if that test
is passed,
1331
01:08:06,908 --> 01:08:10,825
you are dead center of
the greatest scientific event
1332
01:08:10,869 --> 01:08:12,958
in the history of man.
1333
01:08:13,001 --> 01:08:14,612
If you've created
a conscious machine,
1334
01:08:14,655 --> 01:08:17,615
it's not the history
of man--
1335
01:08:17,658 --> 01:08:19,356
That's the history
of gods.
1336
01:08:19,399 --> 01:08:26,798
1337
01:08:26,841 --> 01:08:28,452
Nolan: It's almost like
technology is a god
1338
01:08:28,495 --> 01:08:29,975
in and of itself.
1339
01:08:30,018 --> 01:08:33,152
1340
01:08:33,196 --> 01:08:35,241
Like the weather.
We can't impact it.
1341
01:08:35,285 --> 01:08:39,593
We can't slow it down.
We can't stop it.
1342
01:08:39,637 --> 01:08:43,249
We feel powerless.
1343
01:08:43,293 --> 01:08:44,685
Kurzweil:
If we think of God
1344
01:08:44,729 --> 01:08:46,687
as an unlimited amount
of intelligence,
1345
01:08:46,731 --> 01:08:48,167
the closest we can get to that
1346
01:08:48,211 --> 01:08:50,474
is by evolving
our own intelligence
1347
01:08:50,517 --> 01:08:55,566
by merging with the artificial
intelligence we're creating.
1348
01:08:55,609 --> 01:08:58,003
Musk:
Today, our computers, phones,
1349
01:08:58,046 --> 01:09:01,615
applications give us
superhuman capability.
1350
01:09:01,659 --> 01:09:04,662
So, as the old maxim says,
if you can't beat 'em, join 'em.
1351
01:09:06,968 --> 01:09:09,971
el Kaliouby: It's about
a human-machine partnership.
1352
01:09:10,015 --> 01:09:11,669
I mean, we already see
how, you know,
1353
01:09:11,712 --> 01:09:14,933
our phones, for example, act
as a memory prosthesis, right?
1354
01:09:14,976 --> 01:09:17,196
I don't have to remember
your phone number anymore
1355
01:09:17,240 --> 01:09:19,198
'cause it's on my phone.
1356
01:09:19,242 --> 01:09:22,070
It's about machines
augmenting our human abilities,
1357
01:09:22,114 --> 01:09:25,248
as opposed to, like,
completely displacing them.
1358
01:09:25,291 --> 01:09:27,380
Nolan: If you look at all the
objects that have made the leap
1359
01:09:27,424 --> 01:09:30,122
from analog to digital
over the last 20 years...
1360
01:09:30,166 --> 01:09:32,080
it's a lot.
1361
01:09:32,124 --> 01:09:35,388
We're the last analog object
in a digital universe.
1362
01:09:35,432 --> 01:09:36,911
And the problem with that,
of course,
1363
01:09:36,955 --> 01:09:40,567
is that the data input/output
is very limited.
1364
01:09:40,611 --> 01:09:42,613
It's this.
It's these.
1365
01:09:42,656 --> 01:09:45,355
Zilis:
Our eyes are pretty good.
1366
01:09:45,398 --> 01:09:48,445
We're able to take in a lot
of visual information.
1367
01:09:48,488 --> 01:09:52,536
But our information output
is very, very, very low.
1368
01:09:52,579 --> 01:09:55,669
The reason this is important --
If we envision a scenario
1369
01:09:55,713 --> 01:09:59,543
where AI's playing a more
prominent role in societies,
1370
01:09:59,586 --> 01:10:02,023
we want good ways to interact
with this technology
1371
01:10:02,067 --> 01:10:04,983
so that it ends up
augmenting us.
1372
01:10:05,026 --> 01:10:07,812
1373
01:10:07,855 --> 01:10:09,553
Musk: I think
it's incredibly important
1374
01:10:09,596 --> 01:10:12,295
that AI not be "other."
1375
01:10:12,338 --> 01:10:14,862
It must be us.
1376
01:10:14,906 --> 01:10:18,605
And I could be wrong
about what I'm saying.
1377
01:10:18,649 --> 01:10:20,216
I'm certainly open to ideas
1378
01:10:20,259 --> 01:10:23,915
if anybody can suggest
a path that's better.
1379
01:10:23,958 --> 01:10:27,266
But I think we're gonna really
have to either merge with AI
1380
01:10:27,310 --> 01:10:28,963
or be left behind.
1381
01:10:29,007 --> 01:10:36,362
1382
01:10:36,406 --> 01:10:38,756
Gourley: It's hard to kind of
think of unplugging a system
1383
01:10:38,799 --> 01:10:41,802
that's distributed
everywhere on the planet,
1384
01:10:41,846 --> 01:10:45,806
that's distributed now
across the solar system.
1385
01:10:45,850 --> 01:10:49,375
You can't just, you know,
shut that off.
1386
01:10:49,419 --> 01:10:51,290
Nolan:
We've opened Pandora's box.
1387
01:10:51,334 --> 01:10:55,642
We've unleashed forces that
we can't control, we can't stop.
1388
01:10:55,686 --> 01:10:57,296
We're in the midst
of essentially creating
1389
01:10:57,340 --> 01:10:59,516
a new life-form on Earth.
1390
01:10:59,559 --> 01:11:05,826
1391
01:11:05,870 --> 01:11:07,611
Russell:
We don't know what happens next.
1392
01:11:07,654 --> 01:11:10,353
We don't know what shape
the intellect of a machine
1393
01:11:10,396 --> 01:11:14,531
will be when that intellect is
far beyond human capabilities.
1394
01:11:14,574 --> 01:11:17,360
It's just not something
that's possible.
1395
01:11:17,403 --> 01:11:24,715
1396
01:11:24,758 --> 01:11:26,934
The least scary future
I can think of is one
1397
01:11:26,978 --> 01:11:29,633
where we have at least
democratized AI.
1398
01:11:31,548 --> 01:11:34,159
Because if one company
or small group of people
1399
01:11:34,202 --> 01:11:37,031
manages to develop godlike
digital superintelligence,
1400
01:11:37,075 --> 01:11:40,339
they can take over the world.
1401
01:11:40,383 --> 01:11:42,210
At least when there's
an evil dictator,
1402
01:11:42,254 --> 01:11:44,343
that human is going to die,
1403
01:11:44,387 --> 01:11:46,998
but, for an AI,
there would be no death.
1404
01:11:47,041 --> 01:11:49,392
It would live forever.
1405
01:11:49,435 --> 01:11:51,916
And then you have
an immortal dictator
1406
01:11:51,959 --> 01:11:53,570
from which we can never escape.
1407
01:11:53,613 --> 01:12:02,100
1408
01:12:02,143 --> 01:12:10,587
1409
01:12:10,630 --> 01:12:19,160
1410
01:12:19,204 --> 01:12:27,647
1411
01:12:27,691 --> 01:12:36,221
1412
01:12:36,264 --> 01:12:44,838
1413
01:12:51,845 --> 01:12:53,717
Woman on P.A.:
Alan. Macchiato.
1414
01:12:53,760 --> 01:12:55,806
1415
01:12:58,112 --> 01:13:02,116
1416
01:13:05,816 --> 01:13:07,252
1417
01:13:07,295 --> 01:13:09,036
1418
01:13:09,080 --> 01:13:10,908
1419
01:13:10,951 --> 01:13:17,610
1420
01:13:17,654 --> 01:13:24,269
1421
01:13:24,312 --> 01:13:30,884
1422
01:13:30,928 --> 01:13:37,543
1423
01:13:37,587 --> 01:13:44,245
1424
01:13:44,289 --> 01:13:50,817
1425
01:13:50,861 --> 01:13:57,520
1426
01:13:57,563 --> 01:14:04,178
1427
01:14:04,222 --> 01:14:10,794
1428
01:14:10,837 --> 01:14:17,496
1429
01:14:17,540 --> 01:14:24,111
1430
01:14:24,155 --> 01:14:30,727
1431
01:14:30,770 --> 01:14:37,473
1432
01:14:37,516 --> 01:14:44,044
1433
01:14:44,088 --> 01:14:47,570
Hello
1434
01:14:47,613 --> 01:14:54,838
1435
01:14:54,881 --> 01:15:02,236
1436
01:15:02,280 --> 01:15:09,548
1437
01:15:09,592 --> 01:15:16,773
1438
01:15:16,816 --> 01:15:18,688
♪ Yeah, yeah
1439
01:15:18,731 --> 01:15:20,080
♪ Yeah, yeah
1440
01:15:20,124 --> 01:15:27,261
1441
01:15:27,305 --> 01:15:34,442
1442
01:15:34,486 --> 01:15:41,580
1443
01:15:41,624 --> 01:15:43,234
♪ Yeah, yeah
1444
01:15:43,277 --> 01:15:45,541
♪ Yeah, yeah
1445
01:15:45,584 --> 01:15:51,764
1446
01:15:51,808 --> 01:15:57,988
1447
01:15:58,031 --> 01:16:04,342
1448
01:16:04,385 --> 01:16:10,609
1449
01:16:10,653 --> 01:16:13,046
Hello
1450
01:16:13,090 --> 01:16:22,012
1451
01:16:22,055 --> 01:16:31,021
1452
01:16:31,064 --> 01:16:39,986
1453
01:16:40,030 --> 01:16:48,952
1454
01:16:48,995 --> 01:16:57,961
1455
01:16:58,004 --> 01:17:06,926
1456
01:17:06,970 --> 01:17:15,892
1457
01:17:15,935 --> 01:17:24,901
1458
01:17:24,944 --> 01:17:33,910
1459
01:17:33,953 --> 01:17:40,960