1
00:00:25,020 --> 00:00:28,000
- Intelligence is
the ability to understand.
2
00:00:28,000 --> 00:00:30,220
We passed on what we know to machines.
3
00:00:30,220 --> 00:00:31,760
- The rise of
artificial intelligence
4
00:00:31,760 --> 00:00:34,210
is happening fast, but some
fear the new technology
5
00:00:34,210 --> 00:00:36,960
might have more problems than anticipated.
6
00:00:36,960 --> 00:00:38,510
- We will not control it.
7
00:01:37,720 --> 00:01:40,290
- Artificially
intelligent algorithms are here,
8
00:01:40,290 --> 00:01:41,930
but this is only the beginning.
9
00:01:51,380 --> 00:01:54,850
- In the age of AI,
data is the new oil.
10
00:01:57,180 --> 00:01:59,230
- Today, Amazon,
Google and Facebook
11
00:01:59,230 --> 00:02:01,440
are richer and more
powerful than any companies
12
00:02:01,440 --> 00:02:03,840
that have ever existed
throughout human history.
13
00:02:05,000 --> 00:02:08,690
- A handful of people working at
a handful of technology companies
14
00:02:08,690 --> 00:02:11,360
steer what a billion
people are thinking today.
15
00:02:12,710 --> 00:02:16,840
- This technology is changing
what it means to be human.
16
00:03:11,870 --> 00:03:16,340
- Artificial intelligence is simply
non-biological intelligence.
17
00:03:17,310 --> 00:03:20,910
And intelligence itself is simply
the ability to accomplish goals.
18
00:03:23,440 --> 00:03:26,030
I'm convinced that AI
will ultimately be either
19
00:03:26,030 --> 00:03:29,780
the best thing ever to happen to humanity,
or the worst thing ever to happen.
20
00:03:31,240 --> 00:03:34,930
We can use it to solve all
of today's and tomorrow's
21
00:03:34,930 --> 00:03:40,620
greatest problems: cure diseases,
deal with climate change,
22
00:03:40,780 --> 00:03:42,840
lift everybody out of poverty.
23
00:03:44,380 --> 00:03:47,460
But, we could use exactly
the same technology
24
00:03:47,460 --> 00:03:51,780
to create a brutal global
dictatorship with unprecedented
25
00:03:51,780 --> 00:03:54,600
surveillance and inequality and suffering.
26
00:03:56,230 --> 00:03:59,260
That's why this is the most important
conversation of our time.
27
00:04:04,380 --> 00:04:07,590
- Artificial intelligence is everywhere
28
00:04:08,450 --> 00:04:11,500
because we now have thinking machines.
29
00:04:12,420 --> 00:04:15,840
If you go on social media or online,
30
00:04:15,840 --> 00:04:20,290
there's an artificial intelligence engine
that decides what to recommend.
31
00:04:21,290 --> 00:04:25,400
If you go on Facebook and you're just
scrolling through your friends' posts,
32
00:04:25,400 --> 00:04:29,100
there's an AI engine that's picking
which one to show you first
33
00:04:29,100 --> 00:04:30,840
and which one to bury.
34
00:04:30,840 --> 00:04:34,190
If you try to get insurance,
there is an AI engine
35
00:04:34,190 --> 00:04:36,270
trying to figure out how risky you are.
36
00:04:37,170 --> 00:04:40,710
And if you apply for a job,
it's quite possible
37
00:04:40,710 --> 00:04:43,430
that an AI engine looks at the resume.
38
00:04:51,350 --> 00:04:53,500
- We are made of data.
39
00:04:54,240 --> 00:04:59,520
Every one of us is made of data
in terms of how we behave,
40
00:04:59,770 --> 00:05:03,000
how we talk, how we love,
what we do every day.
41
00:05:05,410 --> 00:05:09,120
So, computer scientists are
developing deep learning
42
00:05:09,120 --> 00:05:14,120
algorithms that can learn
to identify, classify,
43
00:05:14,570 --> 00:05:18,680
and predict patterns within
massive amounts of data.
44
00:05:31,390 --> 00:05:35,190
We are facing a form of
precision surveillance,
45
00:05:35,190 --> 00:05:38,750
you could call it algorithmic surveillance,
46
00:05:38,750 --> 00:05:41,810
and it means that you
cannot go unrecognized.
47
00:05:43,120 --> 00:05:46,110
You are always under
the watch of algorithms.
48
00:05:51,070 --> 00:05:54,380
- Almost all the AI
development on the planet today
49
00:05:54,380 --> 00:05:57,050
is done by a handful of
big technology companies
50
00:05:57,050 --> 00:05:58,770
or by a few large governments.
51
00:06:01,930 --> 00:06:06,170
If we look at what AI is
mostly being developed for,
52
00:06:06,170 --> 00:06:11,170
I would say it's killing,
spying, and brainwashing.
53
00:06:11,570 --> 00:06:13,730
So, I mean, we have military AI,
54
00:06:13,730 --> 00:06:16,850
we have a whole surveillance
apparatus being built
55
00:06:16,850 --> 00:06:19,060
using AI by major governments,
56
00:06:19,060 --> 00:06:21,730
and we have an advertising
industry which is oriented
57
00:06:21,730 --> 00:06:25,870
toward recognizing what ads
to try to sell to someone.
58
00:06:29,080 --> 00:06:32,090
- We humans have come to
a fork in the road now.
59
00:06:33,530 --> 00:06:36,670
The AI we have today is very narrow.
60
00:06:37,820 --> 00:06:40,550
The holy grail of AI research
ever since the beginning
61
00:06:40,550 --> 00:06:43,300
is to make AI that can do
everything better than us,
62
00:06:44,660 --> 00:06:46,280
and we've basically built a God.
63
00:06:47,640 --> 00:06:49,930
It's going to revolutionize
life as we know it.
64
00:06:53,040 --> 00:06:55,760
It's incredibly important
to take a step back
65
00:06:55,760 --> 00:06:57,660
and think carefully about this.
66
00:06:59,380 --> 00:07:01,510
What sort of society do we want?
67
00:07:04,760 --> 00:07:07,430
- So, we're in this
historic transformation.
68
00:07:08,890 --> 00:07:11,260
Like we're raising this new creature.
69
00:07:11,260 --> 00:07:14,230
We have a new offspring of sorts.
70
00:07:16,120 --> 00:07:19,630
But just like actual offspring,
71
00:07:19,630 --> 00:07:22,990
you don't get to control
everything it's going to do.
72
00:07:57,000 --> 00:08:00,300
We are living at
this privileged moment where,
73
00:08:00,300 --> 00:08:05,270
for the first time, we
will see probably that AI
74
00:08:05,270 --> 00:08:08,860
is really going to outcompete
humans in many, many,
75
00:08:08,860 --> 00:08:10,460
if not all, important fields.
76
00:08:14,300 --> 00:08:16,450
- Everything is going to change.
77
00:08:16,450 --> 00:08:19,520
A new form of life is emerging.
78
00:08:45,000 --> 00:08:50,900
When I was a boy, I thought,
how can I maximize my impact?
79
00:08:52,050 --> 00:08:56,270
And then it was clear that
I have to build something
80
00:08:56,270 --> 00:09:00,050
that learns to become smarter than myself,
81
00:09:00,050 --> 00:09:01,680
such that I can retire,
82
00:09:01,680 --> 00:09:04,480
and the smarter thing
can further self-improve
83
00:09:04,480 --> 00:09:06,950
and solve all the problems
that I cannot solve.
84
00:09:11,180 --> 00:09:14,520
Multiplying that tiny
little bit of creativity
85
00:09:14,520 --> 00:09:16,370
that I have into infinity,
86
00:09:18,270 --> 00:09:20,780
and that's what has been
driving me since then.
87
00:09:39,220 --> 00:09:40,680
How am I trying to build
88
00:09:40,680 --> 00:09:43,320
a general purpose artificial intelligence?
89
00:09:45,540 --> 00:09:49,690
If you want to be intelligent,
you have to recognize speech,
90
00:09:49,690 --> 00:09:54,550
video and handwriting, and
faces, and all kinds of things,
91
00:09:54,550 --> 00:09:57,550
and there we have made a lot of progress.
92
00:09:59,290 --> 00:10:02,010
See, LSTM neural networks,
93
00:10:02,010 --> 00:10:05,890
which we developed in our labs
in Munich and in Switzerland,
94
00:10:05,890 --> 00:10:10,460
are now used for speech
recognition and translation,
95
00:10:10,460 --> 00:10:12,640
and video recognition.
96
00:10:12,640 --> 00:10:16,660
They are now in everybody's
smartphone, almost one billion
97
00:10:16,660 --> 00:10:20,660
iPhones and in over
two billion Android phones.
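[Editor's note: as a concrete illustration of the LSTM architecture mentioned above, here is a minimal sketch, assuming PyTorch and made-up sizes; it shows a recurrent network reading a sequence of feature frames and emitting per-step predictions, and is not the production systems running on phones.]

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a small LSTM of the kind described above,
# applied to a toy "speech" sequence. All sizes are assumptions.
lstm = nn.LSTM(input_size=40, hidden_size=128, num_layers=2, batch_first=True)
classifier = nn.Linear(128, 30)     # 30 = hypothetical label vocabulary

frames = torch.randn(8, 100, 40)    # batch of 8 utterances, 100 frames of 40 features each
outputs, (h_n, c_n) = lstm(frames)  # per-frame hidden states
scores = classifier(outputs)        # per-frame label scores
print(scores.shape)                 # torch.Size([8, 100, 30])
```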
98
00:10:21,990 --> 00:10:26,990
So, we are generating all
kinds of useful by-products
99
00:10:27,490 --> 00:10:29,040
on the way to the general goal.
100
00:10:40,000 --> 00:10:44,990
The main goal is an Artificial
General Intelligence,
101
00:10:44,990 --> 00:10:51,400
an AGI that can learn to improve
the learning algorithm itself.
102
00:10:52,960 --> 00:10:56,940
So, it basically can learn
to improve the way it learns
103
00:10:56,940 --> 00:11:00,940
and it can also recursively
improve the way it learns,
104
00:11:00,940 --> 00:11:04,460
the way it learns without
any limitations except for
105
00:11:04,460 --> 00:11:07,970
the basic fundamental
limitations of computability.
106
00:11:14,590 --> 00:11:17,880
One of my favorite
robots is this one here.
107
00:11:17,880 --> 00:11:21,380
We use this robot for our
studies of artificial curiosity,
108
00:11:22,640 --> 00:11:27,620
where we are trying to teach
this robot to teach itself.
109
00:11:32,870 --> 00:11:33,960
What is a baby doing?
110
00:11:33,960 --> 00:11:37,820
A baby is curiously exploring its world.
111
00:11:39,970 --> 00:11:42,210
That's how he learns how gravity works
112
00:11:42,210 --> 00:11:45,130
and how certain things topple, and so on.
113
00:11:46,700 --> 00:11:49,980
And as it learns to ask
questions about the world,
114
00:11:49,980 --> 00:11:52,640
and as it learns to
answer these questions,
115
00:11:52,640 --> 00:11:55,500
it becomes a more and more
general problem solver.
116
00:11:56,540 --> 00:11:59,160
And so, our artificial
systems are also learning
117
00:11:59,160 --> 00:12:03,320
to ask all kinds of
questions, not just slavishly
118
00:12:03,320 --> 00:12:07,010
try to answer the questions
given to them by humans.
119
00:12:10,880 --> 00:12:14,920
You have to give AI the freedom
to invent its own tasks.
120
00:12:17,830 --> 00:12:20,610
If you don't do that, it's not
going to become very smart.
121
00:12:22,160 --> 00:12:26,780
On the other hand, it's really hard
to predict what they are going to do.
122
00:12:49,560 --> 00:12:52,910
- I feel that technology
is a force of nature.
123
00:12:55,420 --> 00:13:00,020
I feel like there is a lot of similarity between
technology and biological evolution.
124
00:13:08,850 --> 00:13:10,140
Playing God.
125
00:13:13,470 --> 00:13:16,500
Scientists have been accused
of playing God for a while,
126
00:13:17,960 --> 00:13:22,410
but there is a real sense in
which we are creating something
127
00:13:23,380 --> 00:13:26,250
very different from anything
we've created so far.
128
00:13:49,790 --> 00:13:53,250
I was interested in the concept of
AI from a relatively early age.
129
00:13:55,080 --> 00:13:58,420
At some point, I got especially
interested in machine learning.
130
00:14:01,690 --> 00:14:03,520
What is experience?
131
00:14:03,520 --> 00:14:04,710
What is learning?
132
00:14:04,710 --> 00:14:05,720
What is thinking?
133
00:14:06,890 --> 00:14:08,150
How does the brain work?
134
00:14:10,140 --> 00:14:11,940
These questions are philosophical,
135
00:14:11,940 --> 00:14:15,190
but it looks like we can
come up with algorithms that
136
00:14:15,190 --> 00:14:18,530
both do useful things and help
us answer these questions.
137
00:14:20,000 --> 00:14:22,250
Like it's almost like applied philosophy.
138
00:14:48,060 --> 00:14:51,410
Artificial General Intelligence, AGI.
139
00:14:52,610 --> 00:14:57,210
A computer system that
can do any job or any task
140
00:14:57,210 --> 00:15:00,380
that a human does, but only better.
141
00:15:13,180 --> 00:15:15,810
Yeah, I mean, we definitely
will be able to create
142
00:15:17,340 --> 00:15:20,150
completely autonomous
beings with their own goals.
143
00:15:24,060 --> 00:15:25,630
And it will be very important,
144
00:15:25,630 --> 00:15:30,440
especially as these beings
become much smarter than humans,
145
00:15:30,440 --> 00:15:34,720
it's going to be important
to have these beings,
146
00:15:36,220 --> 00:15:39,330
that the goals of these beings
be aligned with our goals.
147
00:15:42,600 --> 00:15:45,340
That's what we're trying
to do at OpenAI.
148
00:15:45,340 --> 00:15:49,700
Be at the forefront of research
and steer the research,
149
00:15:49,700 --> 00:15:53,940
steer their initial conditions
so as to maximize the chance
150
00:15:53,940 --> 00:15:56,100
that the future will be good for humans.
151
00:16:11,660 --> 00:16:14,370
Now, AI is a great thing
because AI will solve
152
00:16:14,370 --> 00:16:16,440
all the problems that we have today.
153
00:16:19,080 --> 00:16:22,580
It will solve employment,
it will solve disease,
154
00:16:24,670 --> 00:16:26,560
it will solve poverty,
155
00:16:29,000 --> 00:16:31,360
but it will also create new problems.
156
00:16:34,580 --> 00:16:36,320
I think that...
157
00:16:40,490 --> 00:16:43,120
The problem of fake news
is going to be a thousand,
158
00:16:43,120 --> 00:16:44,360
a million times worse.
159
00:16:46,440 --> 00:16:48,820
Cyberattacks will
become much more extreme.
160
00:16:50,640 --> 00:16:53,470
You will have totally
automated AI weapons.
161
00:16:56,100 --> 00:16:59,980
I think AI has the potential to create
infinitely stable dictatorships.
162
00:17:05,820 --> 00:17:08,850
You're gonna see dramatically
more intelligent systems
163
00:17:08,850 --> 00:17:12,620
in 10 or 15 years from now,
and I think it's highly likely
164
00:17:12,620 --> 00:17:17,120
that those systems will have
completely astronomical impact on society.
165
00:17:19,480 --> 00:17:21,160
Will humans actually benefit?
166
00:17:22,860 --> 00:17:24,860
And who will benefit, who will not?
167
00:17:44,560 --> 00:17:47,660
- In 2012, IBM estimated that
168
00:17:47,660 --> 00:17:52,090
an average person is
generating 500 megabytes
169
00:17:52,090 --> 00:17:55,110
of digital footprints every single day.
170
00:17:55,110 --> 00:17:57,390
Imagine that you wanted
to back up one day's worth
171
00:17:57,390 --> 00:18:01,260
of data that humanity is
leaving behind, on paper.
172
00:18:01,260 --> 00:18:04,370
How tall would the stack
of paper be that contains
173
00:18:04,370 --> 00:18:07,820
just one day's worth of data
that humanity is producing?
174
00:18:09,610 --> 00:18:12,670
It's like from the earth
to the sun, four times over.
175
00:18:14,550 --> 00:18:20,340
In 2025, we'll be generating
62 gigabytes of data
176
00:18:20,640 --> 00:18:22,380
per person, per day.
177
00:18:36,970 --> 00:18:42,380
We're leaving a ton of digital footprints
while going through our lives.
178
00:18:44,790 --> 00:18:49,380
They provide computer algorithms
with a fairly good idea about who we are,
179
00:18:49,380 --> 00:18:52,280
what we want, what we are doing.
180
00:18:56,020 --> 00:18:59,320
In my work, I looked at different
types of digital footprints.
181
00:18:59,320 --> 00:19:02,650
I looked at Facebook likes,
I looked at language,
182
00:19:02,650 --> 00:19:06,890
credit card records, web browsing
histories, search records,
183
00:19:07,740 --> 00:19:12,010
and each time I found that if
you get enough of this data,
184
00:19:12,010 --> 00:19:14,370
you can accurately predict future behavior
185
00:19:14,370 --> 00:19:17,560
and reveal important intimate traits.
186
00:19:19,140 --> 00:19:21,270
This can be used in great ways,
187
00:19:21,270 --> 00:19:24,670
but it can also be used
to manipulate people.
188
00:19:29,910 --> 00:19:33,880
Facebook is delivering daily information
189
00:19:33,880 --> 00:19:36,460
to two billion people or more.
190
00:19:38,750 --> 00:19:42,840
If you slightly change the
functioning of the Facebook engine,
191
00:19:42,840 --> 00:19:46,500
you can move the opinions and hence,
192
00:19:46,500 --> 00:19:49,870
the votes of millions of people.
193
00:19:50,720 --> 00:19:52,680
- Brexit!
- When do we want it?
194
00:19:52,720 --> 00:19:54,140
- Now!
195
00:19:56,070 --> 00:19:58,690
- A politician
wouldn't be able to figure out
196
00:19:58,690 --> 00:20:02,830
which message each one of
his or her voters would like,
197
00:20:02,830 --> 00:20:06,010
but a computer can see
what political message
198
00:20:06,010 --> 00:20:08,920
would be particularly convincing for you.
199
00:20:11,320 --> 00:20:12,640
- Ladies and gentlemen,
200
00:20:12,640 --> 00:20:16,170
it's my privilege to speak
to you today about the power
201
00:20:16,170 --> 00:20:20,430
of big data and psychographics
in the electoral process.
202
00:20:20,430 --> 00:20:22,360
- Data firm Cambridge Analytica
203
00:20:22,360 --> 00:20:25,070
secretly harvested the
personal information
204
00:20:25,070 --> 00:20:28,320
of 50 million unsuspecting Facebook users.
205
00:20:28,320 --> 00:20:31,690
USA!
206
00:20:31,690 --> 00:20:35,500
- The data firm hired by Donald Trump's
presidential election campaign
207
00:20:35,870 --> 00:20:41,080
used secretly obtained information
to directly target potential American voters.
208
00:20:41,080 --> 00:20:42,940
- With that, they say they can predict
209
00:20:42,940 --> 00:20:46,760
the personality of every single
adult in the United States.
210
00:20:47,470 --> 00:20:49,460
- Tonight we're hearing
from Cambridge Analytica
211
00:20:49,460 --> 00:20:51,650
whistleblower, Christopher Wylie.
212
00:20:51,650 --> 00:20:54,990
- What we worked on was
data harvesting programs
213
00:20:54,990 --> 00:20:59,200
where we would pull data and run that
data through algorithms that could profile
214
00:20:59,200 --> 00:21:01,930
their personality traits and
other psychological attributes
215
00:21:01,930 --> 00:21:06,580
to exploit mental vulnerabilities
that our algorithms showed that they had.
216
00:21:15,500 --> 00:21:17,570
- Cambridge Analytica mentioned once
217
00:21:17,570 --> 00:21:20,460
or said that their models
were based on my work,
218
00:21:22,510 --> 00:21:25,140
but Cambridge Analytica is
just one of the hundreds
219
00:21:25,140 --> 00:21:30,040
of companies that are using
such methods to target voters.
220
00:21:32,130 --> 00:21:35,760
You know, I would be asked
questions by journalists such as,
221
00:21:35,760 --> 00:21:37,220
"So how do you feel about
222
00:21:38,810 --> 00:21:41,990
"electing Trump and supporting Brexit?"
223
00:21:41,990 --> 00:21:43,940
How do you answer such a question?
224
00:21:45,400 --> 00:21:51,740
I guess that I have to deal
with being blamed for all of it.
225
00:22:07,700 --> 00:22:12,480
- How tech started was
as a democratizing force,
226
00:22:12,480 --> 00:22:15,110
as a force for good, as
an ability for humans
227
00:22:15,110 --> 00:22:17,900
to interact with each
other without gatekeepers.
228
00:22:20,210 --> 00:22:25,110
There's never been a bigger experiment
in communications for the human race.
229
00:22:26,330 --> 00:22:29,060
What happens when everybody
gets to have their say?
230
00:22:29,780 --> 00:22:31,850
You would assume that it
would be for the better,
231
00:22:31,850 --> 00:22:34,540
that there would be more democracy,
there would be more discussion,
232
00:22:34,540 --> 00:22:37,660
there would be more tolerance,
but what's happened is that
233
00:22:37,660 --> 00:22:39,720
these systems have been hijacked.
234
00:22:41,300 --> 00:22:43,860
- We stand for connecting every person.
235
00:22:44,960 --> 00:22:46,620
For a global community.
236
00:22:47,780 --> 00:22:50,500
- One company,
Facebook, is responsible
237
00:22:50,500 --> 00:22:53,540
for the communications of
a lot of the human race.
238
00:22:55,650 --> 00:22:57,360
Same thing with Google.
239
00:22:57,360 --> 00:23:00,540
Everything you want to know about
the world comes from them.
240
00:23:01,540 --> 00:23:05,120
This is a global information economy
241
00:23:05,120 --> 00:23:07,830
that is controlled by a
small group of people.
242
00:23:14,400 --> 00:23:17,920
- The world's richest companies
are all technology companies.
243
00:23:18,770 --> 00:23:24,500
Google, Apple, Microsoft,
Amazon, Facebook.
244
00:23:25,880 --> 00:23:29,040
It's staggering how,
245
00:23:29,040 --> 00:23:31,770
in probably just 10 years,
246
00:23:31,770 --> 00:23:34,810
the entire corporate power structure
247
00:23:34,810 --> 00:23:39,460
is basically in the business
of trading electrons.
248
00:23:41,240 --> 00:23:47,060
These little bits and bytes
are really the new currency.
249
00:23:53,560 --> 00:23:55,800
- The way that data is monetized
250
00:23:55,800 --> 00:23:58,820
is happening all around us,
even if it's invisible to us.
251
00:24:00,960 --> 00:24:04,280
Google has every amount
of information available.
252
00:24:04,280 --> 00:24:07,110
They track people by their GPS location.
253
00:24:07,110 --> 00:24:10,080
They know exactly what your
search history has been.
254
00:24:10,080 --> 00:24:12,850
They know your political preferences.
255
00:24:12,850 --> 00:24:14,840
Your search history alone can tell you
256
00:24:14,840 --> 00:24:17,570
everything about an individual
from their health problems
257
00:24:17,570 --> 00:24:19,480
to their sexual preferences.
258
00:24:19,480 --> 00:24:21,880
So, Google's reach is unlimited.
259
00:24:28,750 --> 00:24:31,620
- So we've seen Google and Facebook
260
00:24:31,620 --> 00:24:34,150
rise into these large
surveillance machines
261
00:24:35,040 --> 00:24:38,200
and they're both actually ad brokers.
262
00:24:38,200 --> 00:24:42,040
It sounds really mundane,
but they're high tech ad brokers.
263
00:24:42,910 --> 00:24:46,080
And the reason they're so
profitable is that they're using
264
00:24:46,080 --> 00:24:49,750
artificial intelligence to
process all this data about you,
265
00:24:51,560 --> 00:24:54,660
and then to match you with the advertiser
266
00:24:54,660 --> 00:24:59,660
that wants to reach people
like you, for whatever message.
267
00:25:03,000 --> 00:25:07,880
- One of the problems with technology is
that it's been developed to be addictive.
268
00:25:07,880 --> 00:25:09,910
The way these companies
design these things
269
00:25:09,910 --> 00:25:12,010
is in order to pull you in and engage you.
270
00:25:13,180 --> 00:25:16,820
They want to become essentially
a slot machine of attention.
271
00:25:18,600 --> 00:25:21,770
So you're always paying attention,
you're always jacked into the matrix,
272
00:25:21,770 --> 00:25:23,520
you're always checking.
273
00:25:26,580 --> 00:25:30,450
- When somebody controls what you read,
they also control what you think.
274
00:25:32,180 --> 00:25:34,610
You get more of what you've
seen before and liked before,
275
00:25:34,610 --> 00:25:37,890
because this gives more traffic
and that gives more ads,
276
00:25:39,410 --> 00:25:43,160
but it also locks you
into your echo chamber.
277
00:25:43,160 --> 00:25:46,350
And this is what leads
to this polarization that we see today.
278
00:25:46,460 --> 00:25:49,300
Jair Bolsonaro!
279
00:25:49,970 --> 00:25:52,590
- Jair Bolsonaro,
Brazil's right-wing
280
00:25:52,590 --> 00:25:56,000
populist candidate sometimes
likened to Donald Trump,
281
00:25:56,000 --> 00:25:58,640
winning the presidency Sunday
night in that country's
282
00:25:58,640 --> 00:26:01,840
most polarizing election in decades.
283
00:26:01,840 --> 00:26:02,840
- Bolsonaro!
284
00:26:04,160 --> 00:26:09,550
- What we are seeing around the world
is upheaval and polarization and conflict
285
00:26:10,770 --> 00:26:15,000
that is partially pushed by algorithms
286
00:26:15,000 --> 00:26:19,070
that have figured out that
political extremes,
287
00:26:19,070 --> 00:26:22,180
tribalism, and sort of
shouting for your team,
288
00:26:22,180 --> 00:26:25,290
and feeling good about it, is engaging.
289
00:26:30,420 --> 00:26:33,150
- Social media may
be adding to the attention
290
00:26:33,150 --> 00:26:35,490
to hate crimes around the globe.
291
00:26:35,490 --> 00:26:38,120
- It's about how
people can become radicalized
292
00:26:38,120 --> 00:26:41,740
by living in the fever
swamps of the Internet.
293
00:26:41,740 --> 00:26:45,960
- So is this a key moment for the tech giants?
Are they now prepared to take responsibility
294
00:26:45,960 --> 00:26:48,840
as publishers for what
they share with the world?
295
00:26:49,790 --> 00:26:52,750
- If you deploy a
powerful potent technology
296
00:26:52,750 --> 00:26:56,670
at scale, and if you're talking
about Google and Facebook,
297
00:26:56,670 --> 00:26:59,530
you're deploying things
at scale of billions.
298
00:26:59,530 --> 00:27:02,770
If your artificial intelligence
is pushing polarization,
299
00:27:02,770 --> 00:27:05,090
you have global upheaval potentially.
300
00:27:05,720 --> 00:27:09,240
White lives matter!
301
00:27:09,320 --> 00:27:13,620
Black lives matter!
302
00:28:00,620 --> 00:28:04,040
- Artificial General Intelligence, AGI.
303
00:28:06,350 --> 00:28:08,320
Imagine your smartest friend,
304
00:28:09,600 --> 00:28:11,950
with 1,000 friends, just as smart,
305
00:28:14,680 --> 00:28:17,580
and then run them at 1,000
times faster than real time.
306
00:28:17,580 --> 00:28:19,940
So it means that in every day of our time,
307
00:28:19,940 --> 00:28:22,510
they will do three years of thinking.
308
00:28:22,510 --> 00:28:25,520
Can you imagine how much you could do
309
00:28:26,580 --> 00:28:31,480
if, for every day, you could
do three years' worth of work?
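[Editor's note: the arithmetic behind "three years of thinking" per day checks out; a minimal Python sketch using only the figures quoted above.]

```python
# Figures quoted above: copies of your smartest friend, each running 1,000x real time.
speedup = 1_000
subjective_days_per_real_day = 1 * speedup        # one agent: 1,000 days of thinking per day
subjective_years = subjective_days_per_real_day / 365
print(round(subjective_years, 1))                 # ~2.7, i.e. roughly "three years" per copy
```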
310
00:28:52,440 --> 00:28:55,720
It wouldn't be an unfair comparison to say
311
00:28:55,720 --> 00:28:59,780
that what we have right now
is even more exciting than
312
00:28:59,780 --> 00:29:02,510
the quantum physicists of
the early 20th century.
313
00:29:02,510 --> 00:29:04,240
They discovered nuclear power.
314
00:29:05,740 --> 00:29:08,380
I feel extremely lucky to
be taking part in this.
315
00:29:15,310 --> 00:29:18,820
Many machine learning experts,
who are very knowledgeable and experienced,
316
00:29:18,820 --> 00:29:20,820
have a lot of skepticism about AGI.
317
00:29:22,350 --> 00:29:25,900
About when it would happen,
and about whether it could happen at all.
318
00:29:31,460 --> 00:29:35,740
But right now, this is something that just
not that many people have realized yet.
319
00:29:36,500 --> 00:29:41,240
That the speed of computers,
for neural networks, for AI,
320
00:29:41,240 --> 00:29:45,240
is going to become maybe
100,000 times faster
321
00:29:45,240 --> 00:29:46,980
in a small number of years.
322
00:29:49,330 --> 00:29:52,100
The entire hardware
industry for a long time
323
00:29:52,100 --> 00:29:54,660
didn't really know what to do next,
324
00:29:55,660 --> 00:30:00,900
but with artificial neural networks,
now that they actually work,
325
00:30:00,900 --> 00:30:03,420
you have a reason to build huge computers.
326
00:30:04,530 --> 00:30:06,930
You can build a brain in
silicon, it's possible.
327
00:30:14,450 --> 00:30:18,710
The very first AGIs
will be basically very,
328
00:30:18,710 --> 00:30:22,850
very large data centers
packed with specialized
329
00:30:22,850 --> 00:30:25,660
neural network processors
working in parallel.
330
00:30:28,070 --> 00:30:30,920
A compact, hot, power-hungry package,
331
00:30:32,140 --> 00:30:35,540
consuming like 10 million
homes' worth of energy.
332
00:30:54,140 --> 00:30:55,640
A roast beef sandwich.
333
00:30:55,640 --> 00:30:58,290
Yeah, something slightly different.
334
00:30:58,290 --> 00:30:59,200
Just this once.
335
00:31:04,060 --> 00:31:06,260
Even the very first AGIs
336
00:31:06,260 --> 00:31:09,060
will be dramatically
more capable than humans.
337
00:31:10,970 --> 00:31:14,730
Humans will no longer be
economically useful for nearly any task.
338
00:31:16,200 --> 00:31:17,880
Why would you want to hire a human,
339
00:31:17,880 --> 00:31:21,980
if you could just get a computer that's going to
do it much better and much more cheaply?
340
00:31:28,900 --> 00:31:31,020
AGI is going to be like, without question,
341
00:31:31,880 --> 00:31:34,660
the most important
technology in the history
342
00:31:34,660 --> 00:31:36,600
of the planet by a huge margin.
343
00:31:39,410 --> 00:31:42,550
It's going to be bigger
than electricity, nuclear,
344
00:31:42,550 --> 00:31:44,140
and the Internet combined.
345
00:31:45,740 --> 00:31:47,620
In fact, you could say
that the whole purpose
346
00:31:47,620 --> 00:31:49,660
of all human science, the
purpose of computer science,
347
00:31:49,660 --> 00:31:52,550
the End Game, this is the
End Game, to build this.
348
00:31:52,550 --> 00:31:54,130
And it's going to be built.
349
00:31:54,130 --> 00:31:56,000
It's going to be a new life form.
350
00:31:56,000 --> 00:31:57,090
It's going to be...
351
00:31:59,190 --> 00:32:00,740
It's going to make us obsolete.
352
00:32:22,070 --> 00:32:24,520
- European manufacturers
know the Americans
353
00:32:24,520 --> 00:32:27,460
have invested heavily in
the necessary hardware.
354
00:32:27,460 --> 00:32:29,780
- Step into a
brave new world of power,
355
00:32:29,780 --> 00:32:31,630
performance and productivity.
356
00:32:32,360 --> 00:32:35,480
- All of the images you are
about to see on the large screen
357
00:32:35,480 --> 00:32:39,030
will be generated by
what's in that Macintosh.
358
00:32:39,760 --> 00:32:42,210
- It's my honor and
privilege to introduce to you
359
00:32:42,210 --> 00:32:44,700
the Windows 95 Development Team.
360
00:32:45,640 --> 00:32:49,040
- Human physical labor has
been mostly obsolete for
361
00:32:49,040 --> 00:32:50,620
getting on for a century.
362
00:32:51,390 --> 00:32:55,300
Routine human mental labor
is rapidly becoming obsolete
363
00:32:55,300 --> 00:32:58,980
and that's why we're seeing a lot of
the middle class jobs disappearing.
364
00:33:01,010 --> 00:33:02,340
- Every once in a while,
365
00:33:02,340 --> 00:33:06,370
a revolutionary product comes
along that changes everything.
366
00:33:06,370 --> 00:33:09,190
Today, Apple is reinventing the phone.
367
00:33:20,730 --> 00:33:24,270
- Machine intelligence
is already all around us.
368
00:33:24,270 --> 00:33:27,620
The list of things that we humans
can do better than machines
369
00:33:27,620 --> 00:33:29,600
is actually
shrinking pretty fast.
370
00:33:35,730 --> 00:33:37,400
- Driverless cars are great.
371
00:33:37,400 --> 00:33:40,010
They probably will reduce accidents.
372
00:33:40,010 --> 00:33:43,670
Except, alongside with
that, in the United States,
373
00:33:43,670 --> 00:33:46,410
you're going to lose 10 million jobs.
374
00:33:46,480 --> 00:33:49,980
What are you going to do with
10 million unemployed people?
375
00:33:54,900 --> 00:33:58,660
- The risk for social
conflict and tensions,
376
00:33:58,660 --> 00:34:02,080
if you exacerbate inequalities,
is very, very high.
377
00:34:11,560 --> 00:34:13,870
- AGI can, by definition,
378
00:34:13,870 --> 00:34:16,700
do all jobs better than we can do.
379
00:34:16,700 --> 00:34:18,610
People who are saying,
"There will always be jobs
380
00:34:18,610 --> 00:34:21,120
"that humans can do better
than machines," are simply
381
00:34:21,120 --> 00:34:24,180
betting against science and
saying there will never be AGI.
382
00:34:30,860 --> 00:34:33,630
- What we are seeing
now is like a train hurtling
383
00:34:33,630 --> 00:34:37,880
down a dark tunnel at
breakneck speed and it looks like
384
00:34:37,880 --> 00:34:39,660
we're asleep at the wheel.
385
00:35:29,170 --> 00:35:34,710
- A large fraction of the digital footprints
we're leaving behind are digital images.
386
00:35:35,690 --> 00:35:39,340
And specifically, what's really
interesting to me as a psychologist
387
00:35:39,340 --> 00:35:41,470
are digital images of our faces.
388
00:35:44,500 --> 00:35:47,260
Here you can see the difference
in the facial outline
389
00:35:47,260 --> 00:35:50,130
of an average gay and
an average straight face.
390
00:35:50,130 --> 00:35:52,660
And you can see that straight men
391
00:35:52,660 --> 00:35:55,380
have slightly broader jaws.
392
00:35:55,680 --> 00:36:00,580
Gay women have slightly larger jaws,
compared with straight women.
393
00:36:02,510 --> 00:36:05,670
Computer algorithms can
reveal our political views
394
00:36:05,670 --> 00:36:08,280
or sexual orientation, or intelligence,
395
00:36:08,280 --> 00:36:11,120
just based on the picture of our faces.
396
00:36:12,070 --> 00:36:16,760
Even a human brain can distinguish between
gay and straight men with some accuracy.
397
00:36:16,920 --> 00:36:21,700
Now it turns out that the computer
can do it with much higher accuracy.
398
00:36:21,700 --> 00:36:25,200
What you're seeing here is the accuracy of
399
00:36:25,200 --> 00:36:29,410
off-the-shelf facial recognition software.
400
00:36:29,410 --> 00:36:31,640
This is terrible news
401
00:36:31,640 --> 00:36:34,320
for gay men and women
all around the world.
402
00:36:34,320 --> 00:36:35,680
And not only gay men and women,
403
00:36:35,680 --> 00:36:38,300
because the same algorithms
can be used to detect other
404
00:36:38,300 --> 00:36:42,350
intimate traits: think being
a member of the opposition,
405
00:36:42,350 --> 00:36:45,090
or being a liberal, or being an atheist.
406
00:36:46,850 --> 00:36:50,070
Being an atheist is
also punishable by death
407
00:36:50,070 --> 00:36:52,610
in Saudi Arabia, for instance.
408
00:36:59,780 --> 00:37:04,440
My mission as an academic is to
warn people about the dangers of algorithms
409
00:37:04,440 --> 00:37:08,290
being able to reveal our intimate traits.
410
00:37:09,780 --> 00:37:14,060
The problem is that when
people receive bad news,
411
00:37:14,060 --> 00:37:16,260
they very often choose to dismiss it.
412
00:37:17,810 --> 00:37:22,250
Well, it's a bit scary when you start receiving
death threats from one day to the next,
413
00:37:22,250 --> 00:37:24,830
and I've received quite a
few death threats,
414
00:37:25,910 --> 00:37:30,870
but as a scientist, I have to
basically show what is possible.
415
00:37:33,590 --> 00:37:36,900
So what I'm really interested
in now is to try to see
416
00:37:36,900 --> 00:37:41,080
whether we can predict other
traits from people's faces.
417
00:37:46,270 --> 00:37:48,580
Now, if you can detect
depression from a face,
418
00:37:48,580 --> 00:37:53,580
or suicidal thoughts, maybe a CCTV system
419
00:37:53,690 --> 00:37:56,970
on the train station can save some lives.
420
00:37:59,130 --> 00:38:03,830
What if we could predict that someone
is more prone to commit a crime?
421
00:38:04,960 --> 00:38:07,150
You probably had a school counselor,
422
00:38:07,150 --> 00:38:10,380
a psychologist hired
there to identify children
423
00:38:10,380 --> 00:38:14,830
that potentially may have
some behavioral problems.
424
00:38:17,330 --> 00:38:20,080
So now imagine if you could
predict with high accuracy
425
00:38:20,080 --> 00:38:22,590
that someone is likely to
commit a crime in the future
426
00:38:22,590 --> 00:38:24,780
from the language use, from the face,
427
00:38:24,780 --> 00:38:27,580
from the facial expressions,
from the likes on Facebook.
428
00:38:32,310 --> 00:38:35,210
I'm not developing new
methods, I'm just describing
429
00:38:35,210 --> 00:38:38,890
something or testing something
in an academic environment.
430
00:38:40,590 --> 00:38:42,790
But there obviously is a chance that,
431
00:38:42,790 --> 00:38:47,790
while warning people against
risks of new technologies,
432
00:38:47,960 --> 00:38:50,560
I may also give some people new ideas.
433
00:39:11,140 --> 00:39:13,830
- We haven't yet seen
the future in terms of
434
00:39:13,830 --> 00:39:18,780
the ways in which the
new data-driven society
435
00:39:18,780 --> 00:39:21,380
is going to really evolve.
436
00:39:23,540 --> 00:39:26,740
The tech companies want
to get every possible bit
437
00:39:26,740 --> 00:39:30,170
of information that they
can collect on everyone
438
00:39:30,170 --> 00:39:31,690
to facilitate business.
439
00:39:33,470 --> 00:39:37,090
The police and the military
want to do the same thing
440
00:39:37,090 --> 00:39:38,830
to facilitate security.
441
00:39:41,960 --> 00:39:46,440
The interests that the two
have in common are immense,
442
00:39:46,440 --> 00:39:51,220
and so the extent of collaboration
between what you might
443
00:39:51,220 --> 00:39:56,820
call the Military-Tech Complex
is growing dramatically.
444
00:40:00,400 --> 00:40:03,010
- The CIA, for a very long time,
445
00:40:03,010 --> 00:40:06,120
has maintained a close
connection with Silicon Valley.
446
00:40:07,110 --> 00:40:10,270
Their venture capital
firm known as In-Q-Tel,
447
00:40:10,270 --> 00:40:13,870
makes seed investments in
start-up companies developing
448
00:40:13,870 --> 00:40:17,490
breakthrough technology that
the CIA hopes to deploy.
449
00:40:18,660 --> 00:40:22,100
Palantir, the big data analytics firm,
450
00:40:22,100 --> 00:40:25,100
one of their first seed
investments was from In-Q-Tel.
451
00:40:28,950 --> 00:40:31,780
- In-Q-Tel has struck gold in Palantir
452
00:40:31,780 --> 00:40:35,990
in helping to create a private vendor
453
00:40:35,990 --> 00:40:40,850
that has intelligence and
artificial intelligence
454
00:40:40,850 --> 00:40:44,460
capabilities that the government
can't even compete with.
455
00:40:46,370 --> 00:40:48,720
- Good evening, I'm Peter Thiel.
456
00:40:49,740 --> 00:40:54,210
I'm not a politician, but
neither is Donald Trump.
457
00:40:54,210 --> 00:40:58,620
He is a builder and it's
time to rebuild America.
458
00:41:01,280 --> 00:41:03,940
- Peter Thiel,
the founder of Palantir,
459
00:41:03,940 --> 00:41:06,680
was a Donald Trump transition advisor
460
00:41:06,680 --> 00:41:09,180
and a close friend and donor.
461
00:41:11,080 --> 00:41:13,700
Trump was elected largely on the promise
462
00:41:13,700 --> 00:41:17,560
to deport millions of immigrants.
463
00:41:17,560 --> 00:41:22,240
The only way you can do that
is with a lot of intelligence
464
00:41:22,240 --> 00:41:25,010
and that's where Palantir comes in.
465
00:41:28,680 --> 00:41:33,340
They ingest huge troves
of data, which include,
466
00:41:33,340 --> 00:41:37,370
where you live, where
you work, who you know,
467
00:41:37,370 --> 00:41:40,930
who your neighbors are,
who your family is,
468
00:41:40,930 --> 00:41:44,480
where you have visited, where you stay,
469
00:41:44,480 --> 00:41:46,320
your social media profile.
470
00:41:49,250 --> 00:41:53,240
Palantir gets all of that
and is remarkably good
471
00:41:53,240 --> 00:41:58,240
at structuring it in a way
that helps law enforcement,
472
00:41:58,460 --> 00:42:02,450
immigration authorities
or intelligence agencies
473
00:42:02,450 --> 00:42:06,210
of any kind, track you, find you,
474
00:42:06,210 --> 00:42:09,550
and learn everything there
is to know about you.
475
00:42:48,430 --> 00:42:51,040
- We're putting AI in
charge now of ever more
476
00:42:51,040 --> 00:42:53,860
important decisions that
affect people's lives.
477
00:42:54,810 --> 00:42:58,050
Old-school AI used to have
its intelligence programmed in
478
00:42:58,050 --> 00:43:01,420
by humans who understood
how it worked, but today,
479
00:43:01,420 --> 00:43:04,080
powerful AI systems have
just learned for themselves,
480
00:43:04,080 --> 00:43:07,480
and we have no clue really how they work,
481
00:43:07,480 --> 00:43:09,680
which makes it really hard to trust them.
482
00:43:14,270 --> 00:43:17,820
- This isn't some futuristic
technology, this is now.
483
00:43:19,470 --> 00:43:23,220
AI might help determine
where a fire department
484
00:43:23,220 --> 00:43:25,900
is built in a community
or where a school is built.
485
00:43:25,900 --> 00:43:28,380
It might decide whether you get bail,
486
00:43:28,380 --> 00:43:30,680
or whether you stay in jail.
487
00:43:30,680 --> 00:43:32,930
It might decide where the
police are going to be.
488
00:43:32,930 --> 00:43:37,140
It might decide whether you're going to be
under additional police scrutiny.
489
00:43:43,340 --> 00:43:46,140
- It's popular now in the US
to do predictive policing.
490
00:43:46,980 --> 00:43:50,900
So what they do is they use an algorithm
to figure out where crime will be,
491
00:43:51,760 --> 00:43:55,240
and they use that to tell where
we should send police officers.
492
00:43:56,430 --> 00:43:59,420
So that's based on a
measurement of crime rate.
493
00:44:00,640 --> 00:44:02,330
So we know that there is bias.
494
00:44:02,330 --> 00:44:05,090
Black people and Hispanic
people are pulled over,
495
00:44:05,090 --> 00:44:07,180
and stopped by the police
officers more frequently
496
00:44:07,180 --> 00:44:09,820
than white people are, so we
have this biased data going in,
497
00:44:09,820 --> 00:44:11,820
and then what happens
is you use that to say,
498
00:44:11,820 --> 00:44:13,640
"Here's where the cops should go."
499
00:44:13,640 --> 00:44:17,320
Well, the cops go to those neighborhoods,
and guess what they do, they arrest people.
500
00:44:17,680 --> 00:44:21,160
And then it feeds back
biased data into the system,
501
00:44:21,160 --> 00:44:23,060
and that's called a feedback loop.
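[Editor's note: the feedback loop can be made concrete with a toy simulation; this is an assumption-laden sketch, not a model of any real predictive-policing product.]

```python
# Toy sketch of the feedback loop described above.
# Two districts with IDENTICAL true crime rates; district A simply has more
# historical records because it was policed more heavily in the past.
true_rate = [1.0, 1.0]            # same underlying rate in both districts
recorded = [60.0, 40.0]           # biased historical data going in

for year in range(10):
    total = sum(recorded)
    patrols = [r / total for r in recorded]               # send officers where the records point
    # Arrests happen where the officers are, so that's what gets recorded next.
    recorded = [true_rate[i] * patrols[i] * 100 for i in range(2)]
    print(year, [round(p, 2) for p in patrols])

# The initial 60/40 skew never corrects itself, even though both districts are
# identical: the biased data keeps reproducing the biased deployment.
```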
502
00:44:35,990 --> 00:44:40,650
- Predictive policing
leads at the extremes
503
00:44:41,590 --> 00:44:45,910
to experts saying, "Show me your baby,
504
00:44:45,910 --> 00:44:48,860
"and I will tell you whether
she's going to be a criminal."
505
00:44:51,300 --> 00:44:55,780
Now that we can predict it,
we're going to then surveil
506
00:44:55,780 --> 00:45:01,100
those kids much more closely
and we're going to jump on them
507
00:45:01,100 --> 00:45:03,700
at the first sign of a problem.
508
00:45:03,700 --> 00:45:06,700
And that's going to make
for more effective policing.
509
00:45:07,000 --> 00:45:11,250
It does, but it's going to
make for a really grim society
510
00:45:11,250 --> 00:45:16,250
and it's dramatically reinforcing
existing injustices.
511
00:45:22,960 --> 00:45:27,810
- Imagine a world in which
networks of CCTV cameras,
512
00:45:27,810 --> 00:45:30,930
drone surveillance
cameras, have sophisticated
513
00:45:30,930 --> 00:45:34,440
face recognition technologies
and are connected
514
00:45:34,440 --> 00:45:37,020
to other government
surveillance databases.
515
00:45:38,080 --> 00:45:41,190
We will have the
technology in place to have
516
00:45:41,190 --> 00:45:45,690
all of our movements comprehensively
tracked and recorded.
517
00:45:47,680 --> 00:45:50,330
What that also means is that we will have
518
00:45:50,330 --> 00:45:52,940
created a surveillance time machine
519
00:45:52,940 --> 00:45:56,620
that will allow governments
and powerful corporations
520
00:45:56,620 --> 00:45:58,710
to essentially hit rewind on our lives.
521
00:45:58,710 --> 00:46:01,780
We might not be under any suspicion now
522
00:46:01,780 --> 00:46:03,860
and five years from now,
they might want to know
523
00:46:03,860 --> 00:46:08,100
more about us, and can
then recreate granularly
524
00:46:08,100 --> 00:46:10,260
everything we've done,
everyone we've seen,
525
00:46:10,260 --> 00:46:13,050
everyone we've been around
over that entire period.
526
00:46:15,820 --> 00:46:19,320
That's an extraordinary amount of power
527
00:46:19,320 --> 00:46:21,660
for us to cede to anyone.
528
00:46:22,910 --> 00:46:25,370
And it's a world that I
think has been difficult
529
00:46:25,370 --> 00:46:29,180
for people to imagine,
but we've already built
530
00:46:29,180 --> 00:46:31,660
the architecture to enable that.
531
00:47:07,140 --> 00:47:10,290
- I'm a political reporter
and I'm very interested
532
00:47:10,290 --> 00:47:14,670
in the ways powerful industries
use their political power
533
00:47:14,670 --> 00:47:17,180
to influence the public policy process.
534
00:47:21,000 --> 00:47:24,420
The large tech companies and
their lobbyists get together
535
00:47:24,420 --> 00:47:27,530
behind closed doors and
are able to craft policies
536
00:47:27,530 --> 00:47:29,340
that we all have to live under.
537
00:47:30,790 --> 00:47:35,780
That's true for surveillance policies,
for policies in terms of data collection,
538
00:47:35,780 --> 00:47:40,540
but also increasingly important when it
comes to military and foreign policy.
539
00:47:43,930 --> 00:47:50,220
Starting in 2016, the Defense Department
formed the Defense Innovation Board.
540
00:47:50,220 --> 00:47:54,010
That's a special body created
to bring top tech executives
541
00:47:54,010 --> 00:47:56,420
into closer contact with the military.
542
00:47:59,140 --> 00:48:01,750
Eric Schmidt, former chairman of Alphabet,
543
00:48:01,750 --> 00:48:03,380
the parent company of Google,
544
00:48:03,380 --> 00:48:07,240
became the chairman of the
Defense Innovation Board,
545
00:48:07,240 --> 00:48:09,930
and one of their first
priorities was to say,
546
00:48:09,930 --> 00:48:13,850
"We need more artificial intelligence
integrated into the military."
547
00:48:15,930 --> 00:48:19,190
- I've worked with a group
of volunteers over the
548
00:48:19,190 --> 00:48:22,150
last couple of years to take
a look at innovation in the
549
00:48:22,150 --> 00:48:26,780
overall military, and my summary
conclusion is that we have
550
00:48:26,780 --> 00:48:30,460
fantastic people who are
trapped in a very bad system.
551
00:48:33,290 --> 00:48:35,700
- From the Department of
Defense's perspective,
552
00:48:35,700 --> 00:48:37,700
where I really started
to get interested in it
553
00:48:37,700 --> 00:48:40,980
was when we started thinking
about Unmanned Systems and
554
00:48:40,980 --> 00:48:45,980
how robotic and unmanned systems
would start to change war.
555
00:48:46,160 --> 00:48:50,320
The smarter you made the
Unmanned Systems and robots,
556
00:48:50,320 --> 00:48:53,570
the more powerful you might
be able to make your military.
557
00:48:55,540 --> 00:48:57,240
- Under Secretary of Defense
558
00:48:57,240 --> 00:49:00,420
Robert Work put together
a major memo known as
559
00:49:00,420 --> 00:49:03,680
the Algorithmic Warfare
Cross-Functional Team,
560
00:49:03,680 --> 00:49:05,540
better known as Project Maven.
561
00:49:07,830 --> 00:49:10,760
Eric Schmidt gave a number of
speeches and media appearances
562
00:49:10,760 --> 00:49:14,310
where he said this effort
was designed to increase fuel
563
00:49:14,310 --> 00:49:18,240
efficiency in the Air Force,
to help with the logistics,
564
00:49:18,240 --> 00:49:21,300
but behind closed doors there
was another parallel effort.
565
00:49:26,710 --> 00:49:29,760
Late in 2017 as part of Project Maven,
566
00:49:29,760 --> 00:49:33,710
Google, Eric Schmidt's firm,
was tasked to secretly work
567
00:49:33,710 --> 00:49:36,100
on another part of Project Maven,
568
00:49:36,100 --> 00:49:41,480
and that was to take the vast
volumes of image data vacuumed up
569
00:49:41,480 --> 00:49:47,060
by drones operating in Iraq
and Afghanistan and to teach an AI
570
00:49:47,060 --> 00:49:49,690
to quickly identify
targets on the battlefield.
571
00:49:52,600 --> 00:49:58,470
- We have a sensor and the sensor
can do full motion video of an entire city.
572
00:49:58,470 --> 00:50:01,690
And we would have three
seven-person teams working
573
00:50:01,690 --> 00:50:06,360
constantly and they could
process 15% of the information.
574
00:50:06,360 --> 00:50:08,950
The other 85% of the
information was just left
575
00:50:08,950 --> 00:50:14,040
on the cutting room floor, so we said,
"Hey, AI and machine learning
576
00:50:14,040 --> 00:50:17,810
"would help us process
100% of the information."
577
00:50:25,380 --> 00:50:28,760
- Google has long had
the motto, "Don't be evil."
578
00:50:28,760 --> 00:50:30,630
They have created a public image
579
00:50:30,630 --> 00:50:35,080
that they are devoted
to public transparency.
580
00:50:35,080 --> 00:50:39,200
But for Google to slowly
transform into a defense contractor,
581
00:50:39,200 --> 00:50:41,720
they maintained
the utmost secrecy.
582
00:50:41,720 --> 00:50:45,280
And you had Google entering into
this contract with most of the employees,
583
00:50:45,280 --> 00:50:49,020
even employees who were working on
the program, completely left in the dark.
584
00:51:01,840 --> 00:51:05,050
- Usually within Google,
anyone in the company
585
00:51:05,050 --> 00:51:08,100
is allowed to know about any
other project that's happening
586
00:51:08,100 --> 00:51:09,800
in some other part of the company.
587
00:51:11,180 --> 00:51:14,300
With Project Maven, the fact
that it was kept secret,
588
00:51:14,300 --> 00:51:18,740
I think was alarming to people
because that's not the norm at Google.
589
00:51:20,620 --> 00:51:22,940
- When this story was first revealed,
590
00:51:22,940 --> 00:51:25,540
it set off a firestorm within Google.
591
00:51:25,540 --> 00:51:28,420
You had a number of employees
quitting in protest,
592
00:51:28,420 --> 00:51:31,900
others signing a petition
objecting to this work.
593
00:51:34,130 --> 00:51:37,680
- You have to really say,
"I don't want to be part of this anymore."
594
00:51:38,820 --> 00:51:41,750
There are companies
called defense contractors
595
00:51:41,750 --> 00:51:45,830
and Google should just not
be one of those companies
596
00:51:45,830 --> 00:51:50,040
because people need to trust
Google for Google to work.
597
00:51:52,070 --> 00:51:55,710
- Good morning and welcome to Google I/O.
598
00:51:57,310 --> 00:52:00,160
- We've seen emails that
show that Google simply
599
00:52:00,160 --> 00:52:03,350
continued to mislead their
employees that the drone
600
00:52:03,350 --> 00:52:07,390
targeting program was only a
minor effort that could at most
601
00:52:07,390 --> 00:52:11,170
be worth $9 million to the firm,
which is a drop in the bucket
602
00:52:11,170 --> 00:52:13,540
for a gigantic company like Google.
603
00:52:14,400 --> 00:52:17,300
But from internal emails that we obtained,
604
00:52:17,300 --> 00:52:21,530
Google was expecting Project
Maven would ramp up to as much
605
00:52:21,530 --> 00:52:26,530
as $250 million, and that this
entire effort would provide
606
00:52:26,630 --> 00:52:29,900
Google with Special Defense
Department certification to make
607
00:52:29,900 --> 00:52:32,660
them available for even
bigger defense contracts,
608
00:52:32,660 --> 00:52:34,980
some worth as much as $10 billion.
609
00:52:45,710 --> 00:52:49,750
The pressure for Google to
compete for military contracts
610
00:52:49,750 --> 00:52:53,780
has come at a time when its competitors
are also shifting their culture.
611
00:52:55,850 --> 00:53:00,070
Amazon, similarly pitching the
military and law enforcement.
612
00:53:00,070 --> 00:53:02,280
IBM and other leading firms,
613
00:53:02,280 --> 00:53:04,890
they're pitching law
enforcement and military.
614
00:53:05,840 --> 00:53:09,420
To stay competitive, Google
has slowly transformed.
615
00:53:15,150 --> 00:53:18,910
- The Defense Science
Board said of all of the
616
00:53:18,910 --> 00:53:22,170
technological advances that
are happening right now,
617
00:53:22,170 --> 00:53:26,780
the single most important thing
was artificial intelligence
618
00:53:26,780 --> 00:53:30,780
and the autonomous operations
that it would lead to.
619
00:53:30,780 --> 00:53:32,330
Are we investing enough?
620
00:53:37,810 --> 00:53:41,220
- Once we develop what are known as
621
00:53:41,220 --> 00:53:45,820
autonomous lethal weapons, in other words,
622
00:53:45,820 --> 00:53:51,510
weapons that are not controlled at all,
they are genuinely autonomous,
623
00:53:51,510 --> 00:53:54,000
you've only got to get
a president who says,
624
00:53:54,000 --> 00:53:56,410
"The hell with international law,
we've got these weapons.
625
00:53:56,410 --> 00:53:58,560
"We're going to do what
we want with them."
626
00:54:02,540 --> 00:54:03,960
- We're very close.
627
00:54:03,960 --> 00:54:06,340
When you have the hardware
already set up
628
00:54:06,340 --> 00:54:10,360
and all you have to do is flip a switch
to make it fully autonomous,
629
00:54:10,360 --> 00:54:13,240
what is there that's
stopping you from doing that?
630
00:54:14,670 --> 00:54:19,080
There's something really to be feared
in war at machine speed.
631
00:54:19,930 --> 00:54:24,370
What if you're a machine and you've run
millions and millions of different war scenarios
632
00:54:24,370 --> 00:54:28,060
and you have a team of drones and
you've delegated control to half of them,
633
00:54:28,060 --> 00:54:30,790
and you're collaborating in real time?
634
00:54:30,790 --> 00:54:35,600
What happens when that swarm of drones
is tasked with engaging a city?
635
00:54:37,490 --> 00:54:39,870
How will they take over that city?
636
00:54:39,870 --> 00:54:42,760
The answer is we won't
know until it happens.
637
00:54:50,020 --> 00:54:53,910
- We do not want an AI system to decide
638
00:54:53,910 --> 00:54:56,180
what human it would attack,
639
00:54:56,180 --> 00:54:59,500
but we're going up against
authoritarian competitors.
640
00:54:59,500 --> 00:55:02,450
So in my view, an authoritarian regime
641
00:55:02,450 --> 00:55:05,710
will have less problem
delegating authority
642
00:55:05,710 --> 00:55:08,460
to a machine to make lethal decisions.
643
00:55:09,410 --> 00:55:12,660
So how that plays out remains to be seen.
644
00:55:37,270 --> 00:55:41,020
- Almost the gift of AI
now is that it will force us
645
00:55:41,020 --> 00:55:44,410
collectively to think
through at a very basic level,
646
00:55:44,410 --> 00:55:46,310
what does it mean to be human?
647
00:55:48,620 --> 00:55:50,510
What do I do as a human better
648
00:55:50,510 --> 00:55:52,970
than a certain
super smart machine can do?
649
00:55:56,300 --> 00:56:01,000
First, we create our technology
and then it recreates us.
650
00:56:01,000 --> 00:56:05,630
We need to make sure that we
don't miss some of the things
651
00:56:05,630 --> 00:56:07,520
that make us so beautifully human.
652
00:56:11,840 --> 00:56:14,380
- Once we build intelligent machines,
653
00:56:14,380 --> 00:56:16,920
the philosophical vocabulary
we have available to think
654
00:56:16,920 --> 00:56:20,680
about ourselves as human
increasingly fails us.
655
00:56:23,160 --> 00:56:25,950
If I ask you to write up a
list of all the terms you have
656
00:56:25,950 --> 00:56:28,960
available to describe yourself as human,
657
00:56:28,960 --> 00:56:31,100
there are not so many terms.
658
00:56:31,100 --> 00:56:38,140
Culture, history, sociality,
maybe politics, civilization,
659
00:56:38,820 --> 00:56:45,090
subjectivity, all of these
terms are grounded in two positions:
660
00:56:45,670 --> 00:56:48,020
that humans are more than mere animals
661
00:56:49,000 --> 00:56:52,200
and that humans are
more than mere machines.
662
00:56:55,140 --> 00:56:59,290
But if machines truly
think, there is a large set
663
00:56:59,290 --> 00:57:03,520
of key philosophical questions
in which what is at stake is:
664
00:57:04,880 --> 00:57:09,080
Who are we? What is our place in the world?
What is the world? How is it structured?
665
00:57:09,080 --> 00:57:12,190
Do the categories that we have relied on
666
00:57:12,190 --> 00:57:14,510
do they still work? Were they wrong?
667
00:57:19,500 --> 00:57:22,220
- Many people think of intelligence
as something mysterious
668
00:57:22,220 --> 00:57:26,190
that can only exist inside of
biological organisms, like us,
669
00:57:26,190 --> 00:57:29,260
but intelligence is all
about information processing.
670
00:57:30,280 --> 00:57:32,230
It doesn't matter whether
the intelligence is processed
671
00:57:32,230 --> 00:57:35,940
by carbon atoms inside of
cells and brains, and people,
672
00:57:35,940 --> 00:57:38,020
or by silicon atoms in computers.
673
00:57:40,700 --> 00:57:43,320
Part of the success of
AI recently has come
674
00:57:43,320 --> 00:57:47,540
from stealing great ideas from evolution.
675
00:57:47,540 --> 00:57:49,320
We noticed that the brain, for example,
676
00:57:49,320 --> 00:57:52,920
has all these neurons inside
connected in complicated ways.
677
00:57:52,920 --> 00:57:55,660
So we stole that idea and abstracted it
678
00:57:55,660 --> 00:57:58,350
into artificial neural
networks in computers,
679
00:57:59,330 --> 00:58:02,800
and that's what has revolutionized
machine intelligence.
680
00:58:07,690 --> 00:58:10,300
If we one day get Artificial
General Intelligence,
681
00:58:10,300 --> 00:58:14,330
then by definition, AI can
also do the job of AI
682
00:58:14,330 --> 00:58:18,930
programming better, and that means
that further progress in making
683
00:58:18,930 --> 00:58:22,950
AI will be dominated not by
human programmers, but by AI.
684
00:58:25,450 --> 00:58:29,090
Recursively self-improving AI
could leave human intelligence
685
00:58:29,090 --> 00:58:32,990
far behind,
creating super intelligence.
686
00:58:34,970 --> 00:58:37,970
It's gonna be the last
invention we ever need to make,
687
00:58:37,970 --> 00:58:40,520
because it can then invent everything else
688
00:58:40,520 --> 00:58:42,010
much faster than we could.
689
00:59:54,040 --> 00:59:59,040
- There is a future that
we all need to talk about.
690
00:59:59,110 --> 01:00:02,430
Some of the fundamental
questions about the future
691
01:00:02,430 --> 01:00:06,280
of artificial intelligence,
not just where it's going,
692
01:00:06,280 --> 01:00:09,630
but what it means for society to go there.
693
01:00:10,930 --> 01:00:14,620
It is not what computers can do,
694
01:00:14,620 --> 01:00:17,590
but what computers should do.
695
01:00:17,590 --> 01:00:22,210
As the generation of people
that is bringing AI to the future,
696
01:00:22,210 --> 01:00:27,690
we are the generation that will
answer this question first and foremost.
697
01:00:34,570 --> 01:00:37,380
- We haven't created the
human-level thinking machine yet,
698
01:00:37,380 --> 01:00:39,410
but we get closer and closer.
699
01:00:41,100 --> 01:00:44,780
Maybe we'll get to human-level
AI five years from now,
700
01:00:44,780 --> 01:00:47,410
or maybe it'll take 50
or 100 years.
701
01:00:47,660 --> 01:00:51,540
It almost doesn't matter.
Like these are all really, really soon,
702
01:00:51,540 --> 01:00:56,060
in terms of the overall
history of humanity.
703
01:01:01,200 --> 01:01:02,180
Very nice.
704
01:01:15,750 --> 01:01:19,350
So, the AI field is
extremely international.
705
01:01:19,350 --> 01:01:23,790
China is up and coming and it's
starting to rival the US,
706
01:01:23,790 --> 01:01:26,940
Europe and Japan in terms of putting a lot
707
01:01:26,940 --> 01:01:29,770
of processing power behind AI
708
01:01:29,770 --> 01:01:33,120
and gathering a lot of
data to help AI learn.
709
01:01:35,960 --> 01:01:40,540
We have a young generation
of Chinese researchers now.
710
01:01:40,540 --> 01:01:43,910
Nobody knows where the next
revolution is going to come from.
711
01:01:50,120 --> 01:01:54,290
- China has always wanted to become
the superpower in the world.
712
01:01:56,120 --> 01:01:59,410
The Chinese government thinks AI
gives them the chance to become
713
01:01:59,410 --> 01:02:03,940
one of the most advanced countries,
technology-wise and business-wise.
714
01:02:04,190 --> 01:02:07,780
So the Chinese government looks at
this as a huge opportunity.
715
01:02:09,140 --> 01:02:13,670
Like they've raised a flag
and said, "That's a good field.
716
01:02:13,670 --> 01:02:16,300
"The companies should jump into it."
717
01:02:16,300 --> 01:02:18,300
Then China's commercial world
and companies say,
718
01:02:18,300 --> 01:02:20,800
"Okay, the government
raised a flag, that's good.
719
01:02:20,800 --> 01:02:22,500
"Let's put the money into it."
720
01:02:23,900 --> 01:02:28,090
Chinese tech giants like Baidu,
Tencent and Alibaba
721
01:02:28,090 --> 01:02:31,800
have put a lot of
investment into the AI field.
722
01:02:33,410 --> 01:02:36,740
So we see that China's AI
development is booming.
723
01:02:43,430 --> 01:02:47,370
- In China, everybody
has Alipay and WeChat Pay,
724
01:02:47,370 --> 01:02:49,500
so mobile payment is everywhere.
725
01:02:50,850 --> 01:02:54,850
And with that, they can
do a lot of AI analysis
726
01:02:54,850 --> 01:02:59,340
to know things like your spending
habits and your credit rating.
727
01:03:01,150 --> 01:03:06,100
Face recognition technology
is widely adopted in China,
728
01:03:06,100 --> 01:03:08,200
in airports and train stations.
729
01:03:09,040 --> 01:03:11,570
So, in the future, maybe
in just a few months,
730
01:03:11,570 --> 01:03:15,140
you won't need a paper
ticket to board a train.
731
01:03:15,140 --> 01:03:15,970
Only your face.
732
01:03:24,620 --> 01:03:29,690
- We have built the world's biggest platform
for facial recognition.
733
01:03:31,260 --> 01:03:37,300
We have 300,000 developers
using our platform.
734
01:03:38,330 --> 01:03:41,550
A lot of it is selfie camera apps
735
01:03:41,550 --> 01:03:43,930
that make you look more beautiful.
736
01:03:46,550 --> 01:03:49,780
There are millions and millions
of cameras in the world,
737
01:03:50,760 --> 01:03:54,880
each camera, from my point of view,
is a data generator.
738
01:03:58,820 --> 01:04:02,810
In a machine's eye, your face
changes into a set of features
739
01:04:02,810 --> 01:04:06,660
and it turns your face
into a paragraph of code.
740
01:04:07,720 --> 01:04:11,980
So we can detect how old you
are, if you're male or female,
741
01:04:11,980 --> 01:04:13,440
and your emotions.
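A purely illustrative sketch of what "turning a face into a paragraph of code" can mean in practice: detect a face, crop it, and reduce it to a fixed-length vector of numbers that downstream models could classify for age, gender or emotion. The bundled OpenCV Haar cascade is real; the random projection standing in for a trained embedding network, and the image path, are placeholders, and none of this is Megvii's actual system.

```python
# Illustrative only: face -> fixed-length feature vector (the "paragraph of code").
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_to_code(image_path: str) -> np.ndarray:
    img = cv2.imread(image_path)                            # hypothetical input image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    x, y, w, h = cascade.detectMultiScale(gray, 1.3, 5)[0]  # assumes one face is found
    face = cv2.resize(gray[y:y + h, x:x + w], (64, 64)).astype(float) / 255.0
    # Placeholder for a trained embedding network: a fixed random projection.
    projection = np.random.default_rng(42).normal(size=(64 * 64, 128))
    return face.ravel() @ projection                        # 128-dimensional face code

# code = face_to_code("passenger.jpg")  # hypothetical file name
# A real system would feed this vector into trained age/gender/emotion classifiers.
```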
742
01:04:16,640 --> 01:04:20,110
Shopping is about what kind
of thing you are looking at.
743
01:04:20,110 --> 01:04:25,650
We can track your eyeballs,
so if you are focusing on some product,
744
01:04:25,650 --> 01:04:28,690
we can track that so that we can know
745
01:04:28,690 --> 01:04:32,120
which kind of people like
which kind of product.
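A toy sketch of the aggregation being described: gaze or dwell events tagged with a detected demographic are rolled up into which kinds of people pay attention to which kinds of products. All event data and labels here are hypothetical.

```python
# Hypothetical gaze/attention events rolled up per demographic and product.
from collections import defaultdict

events = [  # (detected_demographic, product, seconds_of_attention)
    ("female_20s", "lipstick",   4.2),
    ("male_30s",   "headphones", 6.0),
    ("female_20s", "headphones", 1.1),
    ("male_30s",   "lipstick",   0.4),
]

attention = defaultdict(float)
for demographic, product, seconds in events:
    attention[(demographic, product)] += seconds

# Rank demographic/product pairs by accumulated attention.
for (demographic, product), seconds in sorted(attention.items(),
                                              key=lambda kv: -kv[1]):
    print(f"{demographic:12s} {product:12s} {seconds:4.1f}s")
```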
746
01:04:42,450 --> 01:04:45,820
Our mission is to create a platform
747
01:04:45,950 --> 01:04:50,300
that will enable millions of AI
developers in China.
748
01:04:51,280 --> 01:04:58,060
We study all the data we can get.
749
01:05:00,440 --> 01:05:04,350
Not just user profiles,
750
01:05:04,350 --> 01:05:08,380
but what you are doing at the moment,
751
01:05:09,460 --> 01:05:12,060
your geographical location.
752
01:05:15,380 --> 01:05:23,580
This platform will be so valuable that we don't
even worry about profit now,
753
01:05:23,580 --> 01:05:27,660
because it is definitely there.
754
01:05:29,660 --> 01:05:34,420
China's social credit system is
just one of the applications.
755
01:05:45,990 --> 01:05:48,040
- The Chinese government is using multiple
756
01:05:48,040 --> 01:05:50,810
different kinds of
technologies, whether it's AI,
757
01:05:50,810 --> 01:05:53,650
whether it's big data
platforms, facial recognition,
758
01:05:53,650 --> 01:05:57,030
voice recognition, essentially to monitor
759
01:05:57,030 --> 01:05:58,800
what the population is doing.
760
01:06:01,860 --> 01:06:04,240
I think the Chinese
government has made very clear
761
01:06:04,240 --> 01:06:09,240
its intent to gather massive
amounts of data about people
762
01:06:09,600 --> 01:06:13,300
to socially engineer a
dissent-free society.
763
01:06:16,040 --> 01:06:19,060
The logic behind the Chinese government's
764
01:06:19,060 --> 01:06:23,180
social credit system
is to take the idea of
765
01:06:23,180 --> 01:06:27,930
whether you are creditworthy
for a financial loan
766
01:06:27,930 --> 01:06:31,860
and add to it a very
political dimension to ask,
767
01:06:31,860 --> 01:06:34,270
"Are you a trustworthy human being?
768
01:06:36,530 --> 01:06:40,270
"What you've said online, have you ever been
critical of the authorities?
769
01:06:40,270 --> 01:06:42,190
"Do you have a criminal record?"
770
01:06:43,720 --> 01:06:47,340
And all that information is
packaged up together to rate
771
01:06:47,340 --> 01:06:51,820
you, so that if you have
performed well in their view,
772
01:06:51,820 --> 01:06:56,490
you'll have easier access to certain kinds
of state services or benefits.
773
01:06:57,790 --> 01:06:59,710
But if you haven't done very well,
774
01:06:59,710 --> 01:07:02,130
you are going to be
penalized or restricted.
775
01:07:05,940 --> 01:07:09,020
There's no way for people to
challenge those designations
776
01:07:09,020 --> 01:07:12,100
or, in some cases, even know that
they've been put in that category,
777
01:07:12,100 --> 01:07:17,990
and it's not until they try to access some kind
of state service or buy a plane ticket,
778
01:07:17,990 --> 01:07:20,460
or get a passport, or
enroll their kid in school,
779
01:07:20,460 --> 01:07:23,590
that they come to learn that they've
been labeled in this way,
780
01:07:23,590 --> 01:07:27,550
and that there are negative
consequences for them as a result.
781
01:07:44,970 --> 01:07:48,660
We've spent the better part
of the last one or two years
782
01:07:48,660 --> 01:07:53,180
looking at abuses of surveillance
technology across China,
783
01:07:53,180 --> 01:07:56,030
and a lot of that work
has taken us to Xinjiang,
784
01:07:57,580 --> 01:08:01,220
the northwestern region of
China whose population is more than half
785
01:08:01,220 --> 01:08:05,870
Turkic Muslims:
Uyghurs, Kazakhs and Hui.
786
01:08:08,430 --> 01:08:10,980
This is a region and a
population the Chinese government
787
01:08:10,980 --> 01:08:14,900
has long considered to be
politically suspect or disloyal.
788
01:08:17,580 --> 01:08:20,160
We came to find information
about what's called
789
01:08:20,160 --> 01:08:23,130
the Integrated Joint Operations Platform,
790
01:08:23,130 --> 01:08:27,190
which is a predictive policing
program and that's one
791
01:08:27,190 --> 01:08:30,650
of the programs that has been
spitting out lists of names
792
01:08:30,650 --> 01:08:33,560
of people to be subjected
to political re-education.
793
01:08:39,090 --> 01:08:42,770
A number of our interviewees
for the report we just released
794
01:08:42,770 --> 01:08:46,110
about the political education
camps in Xinjiang just
795
01:08:46,110 --> 01:08:50,380
painted an extraordinary
portrait of a surveillance state.
796
01:08:53,460 --> 01:08:56,380
A region awash in surveillance cameras
797
01:08:56,380 --> 01:09:00,600
for facial recognition purposes,
checkpoints, body scanners,
798
01:09:00,600 --> 01:09:04,030
QR codes outside people's homes.
799
01:09:05,660 --> 01:09:10,690
Yeah, it really is the stuff of dystopian
movies that we've all gone to and thought,
800
01:09:10,690 --> 01:09:13,160
"Wow, that would be a
creepy world to live in."
801
01:09:13,160 --> 01:09:16,510
Yeah, well, 13 million
Turkic Muslims in China
802
01:09:16,510 --> 01:09:18,530
are living in that reality right now.
803
01:09:30,840 --> 01:09:33,060
- The Intercept reports
that Google is planning to
804
01:09:33,060 --> 01:09:36,490
launch a censored version of
its search engine in China.
805
01:09:36,490 --> 01:09:38,730
- Google's search
for new markets leads it
806
01:09:38,730 --> 01:09:42,170
to China, despite Beijing's
rules on censorship.
807
01:09:42,170 --> 01:09:44,080
- Tell us more
about why you felt it was
808
01:09:44,080 --> 01:09:46,650
your ethical responsibility to resign,
809
01:09:46,650 --> 01:09:51,060
because you talk about being complicit in
censorship and oppression, and surveillance.
810
01:09:51,060 --> 01:09:54,220
- There is a Chinese joint
venture company that has to be
811
01:09:54,220 --> 01:09:56,230
set up for Google to operate in China.
812
01:09:56,230 --> 01:09:58,920
And the question is, to what
degree did they get to control
813
01:09:58,920 --> 01:10:01,970
the blacklist and to what
degree would they have just
814
01:10:01,970 --> 01:10:05,220
unfettered access to
surveilling Chinese citizens?
815
01:10:05,220 --> 01:10:07,470
And the fact that Google
refuses to respond
816
01:10:07,470 --> 01:10:09,250
to human rights organizations on this,
817
01:10:09,250 --> 01:10:12,060
I think should be extremely
disturbing to everyone.
818
01:10:16,640 --> 01:10:21,120
Due to my conviction that dissent is
fundamental to functioning democracies,
819
01:10:21,120 --> 01:10:25,560
I am forced to resign in order to avoid
contributing to or profiting from the erosion
820
01:10:25,560 --> 01:10:27,510
of protections for dissidents.
821
01:10:28,560 --> 01:10:30,920
The UN is currently
reporting that between
822
01:10:30,920 --> 01:10:34,190
200,000 and one million
Uyghurs have been disappeared
823
01:10:34,190 --> 01:10:36,520
into re-education camps.
824
01:10:36,520 --> 01:10:39,290
And there is a serious argument
that Google would be complicit
825
01:10:39,290 --> 01:10:42,460
should it launch a surveilled
version of search in China.
826
01:10:45,180 --> 01:10:51,380
Dragonfly is a project meant
to launch search in China under
827
01:10:51,390 --> 01:10:55,810
Chinese government regulations,
which include censoring
828
01:10:55,810 --> 01:10:59,510
sensitive content: basic
queries on human rights are blocked,
829
01:10:59,510 --> 01:11:03,210
information about political
representatives is blocked,
830
01:11:03,210 --> 01:11:07,080
information about student
protests is blocked.
831
01:11:07,080 --> 01:11:09,180
And that's one small part of it.
832
01:11:09,180 --> 01:11:12,250
Perhaps a deeper concern is
the surveillance side of this.
833
01:11:16,070 --> 01:11:19,580
When I raised the issue with my managers,
with my colleagues,
834
01:11:19,580 --> 01:11:23,370
there was a lot of concern, but everyone
just said, "I don't know anything."
835
01:11:27,760 --> 01:11:30,200
And then when there was a meeting finally,
836
01:11:30,200 --> 01:11:35,080
there was essentially no addressing of
the serious concerns associated with it.
837
01:11:37,010 --> 01:11:39,790
So then I filed my formal resignation,
838
01:11:39,790 --> 01:11:42,950
not just to my manager,
but I actually distributed it company-wide.
839
01:11:42,950 --> 01:11:45,440
And that's the letter
that I was reading from.
840
01:11:50,360 --> 01:11:55,960
Personally, I haven't slept well.
I've had pretty horrific headaches,
841
01:11:55,960 --> 01:11:58,520
and wake up in the middle of
the night just sweating.
842
01:12:00,640 --> 01:12:03,340
With that said, what I
found since speaking out
843
01:12:03,340 --> 01:12:07,930
is just how positive the global
response to this has been.
844
01:12:10,350 --> 01:12:13,910
Engineers should demand
to know what the uses
845
01:12:13,910 --> 01:12:16,280
of their technical contributions are
846
01:12:16,280 --> 01:12:19,340
and to have a seat at the table
in those ethical decisions.
847
01:12:27,020 --> 01:12:29,590
Most citizens don't really
understand what it means to be
848
01:12:29,590 --> 01:12:32,250
within a very large-scale
prescriptive technology.
849
01:12:33,720 --> 01:12:36,260
Where someone has already
pre-divided the work
850
01:12:36,260 --> 01:12:38,300
and all you know about
is your little piece,
851
01:12:38,300 --> 01:12:41,150
and almost certainly you don't
understand how it fits in.
852
01:12:44,630 --> 01:12:51,040
So, I think it's worth drawing the analogy
to physicists' work on the atomic bomb.
853
01:12:53,840 --> 01:12:56,610
In fact, that's actually
the community I came out of.
854
01:12:59,220 --> 01:13:03,460
I wasn't a nuclear scientist by any means,
but I was an applied mathematician
855
01:13:04,400 --> 01:13:09,400
and my PhD program was actually funded
to train people to work in weapons labs.
856
01:13:12,980 --> 01:13:17,100
One could certainly argue
that there is an existential threat
857
01:13:17,820 --> 01:13:22,060
and whoever is leading
in AI will lead militarily.
858
01:13:31,420 --> 01:13:34,250
- China fully expects to
pass the United States
859
01:13:34,250 --> 01:13:37,480
as the number one economy in
the world and it believes that
860
01:13:37,480 --> 01:13:42,450
AI will help it make that jump more
quickly and more dramatically.
861
01:13:43,900 --> 01:13:46,480
And they also see it as
a way to leapfrog
862
01:13:46,480 --> 01:13:49,820
the United States in
terms of military power.
863
01:13:57,580 --> 01:13:59,600
Their plan is very simple.
864
01:13:59,600 --> 01:14:03,620
We want to catch the United States
in these technologies by 2020,
865
01:14:03,620 --> 01:14:07,940
we want to surpass the United States
in these technologies by 2025,
866
01:14:07,940 --> 01:14:12,600
and we want to be the world leader in AI
and autonomous technologies by 2030.
867
01:14:15,110 --> 01:14:16,850
It is a national plan.
868
01:14:16,850 --> 01:14:21,790
It is backed up by at least
$150 billion in investments.
869
01:14:21,790 --> 01:14:25,060
So, this is definitely a race.
870
01:14:50,070 --> 01:14:52,070
- AI is a little bit like fire.
871
01:14:53,360 --> 01:14:56,270
Fire was invented 700,000 years ago,
872
01:14:57,480 --> 01:14:59,820
and it has its pros and cons.
873
01:15:01,960 --> 01:15:06,810
People realized you can use fire
to keep warm at night and to cook,
874
01:15:08,520 --> 01:15:14,420
but they also realized that you can
kill other people with it.
875
01:15:19,860 --> 01:15:24,300
Fire also has this
AI-like quality of growing
876
01:15:24,300 --> 01:15:27,400
into a wildfire without further human intervention,
877
01:15:30,530 --> 01:15:35,970
but the advantages outweigh
the disadvantages by so much
878
01:15:35,970 --> 01:15:39,760
that we are not going
to stop its development.
879
01:15:49,540 --> 01:15:51,180
Europe is waking up.
880
01:15:52,160 --> 01:15:58,390
Lots of companies in Europe
are realizing that the next wave of AI
881
01:15:58,730 --> 01:16:01,540
will be much bigger than the current wave.
882
01:16:03,310 --> 01:16:08,000
The next wave of AI will be about robots.
883
01:16:09,810 --> 01:16:15,450
All these machines that make
things, that produce stuff,
884
01:16:15,450 --> 01:16:19,370
that build other machines,
they are going to become smart.
885
01:16:26,660 --> 01:16:29,950
In the not-so-distant future,
we will have robots
886
01:16:29,950 --> 01:16:33,290
that we can teach like we teach kids.
887
01:16:35,640 --> 01:16:38,660
For example, I will talk to a
little robot and I will say,
888
01:16:39,970 --> 01:16:42,550
"Look here, robot, look here.
889
01:16:42,550 --> 01:16:44,620
"Let's assemble a smartphone.
890
01:16:44,620 --> 01:16:48,720
"We take this slab of plastic like that
and we take a screwdriver like that,
891
01:16:48,720 --> 01:16:52,310
"and now we screw in everything like this.
892
01:16:52,310 --> 01:16:54,310
"No, no, not like this.
893
01:16:54,310 --> 01:16:57,690
"Like this, look, robot, look, like this."
894
01:16:58,820 --> 01:17:01,890
And he will fail a couple
of times but rather quickly,
895
01:17:01,890 --> 01:17:05,860
he will learn to do the same thing
much better than I could do it.
896
01:17:06,710 --> 01:17:11,400
And then we stop the learning
and we make a million copies, and sell them.
897
01:17:32,410 --> 01:17:35,800
Regulation of AI sounds
like an attractive idea,
898
01:17:35,800 --> 01:17:38,210
but I don't think it's possible.
899
01:17:40,250 --> 01:17:42,720
One of the reasons why it won't work is
900
01:17:42,720 --> 01:17:46,360
the sheer curiosity of scientists.
901
01:17:47,260 --> 01:17:49,560
They don't give a damn for regulation.
902
01:17:52,640 --> 01:17:56,740
Military powers won't give a
damn for regulations, either.
903
01:17:56,740 --> 01:17:59,020
They will say,
"If we, the Americans don't do it,
904
01:17:59,020 --> 01:18:00,720
"then the Chinese will do it."
905
01:18:00,720 --> 01:18:03,150
And the Chinese will say,
"If we don't do it,
906
01:18:03,150 --> 01:18:04,810
"then the Russians will do it."
907
01:18:07,700 --> 01:18:10,930
No matter what kind of political
regulation is out there,
908
01:18:10,930 --> 01:18:14,660
all these military-industrial complexes,
909
01:18:14,660 --> 01:18:17,570
they will almost by
definition have to ignore that
910
01:18:18,670 --> 01:18:21,250
because they want to avoid falling behind.
911
01:18:26,030 --> 01:18:27,690
Welcome to Xinhua.
912
01:18:27,690 --> 01:18:30,260
I'm the world's first female
AI news anchor developed
913
01:18:30,260 --> 01:18:32,690
jointly by Xinhua and
search engine company Sogou.
914
01:18:33,050 --> 01:18:36,310
- A program developed by
the company OpenAI can write
915
01:18:36,310 --> 01:18:39,500
coherent and credible stories
just like human beings.
916
01:18:39,500 --> 01:18:41,520
- It's one small step for machine,
917
01:18:41,520 --> 01:18:44,330
one giant leap for machine kind.
918
01:18:44,330 --> 01:18:47,170
IBM's newest artificial
intelligence system took on
919
01:18:47,170 --> 01:18:51,570
experienced human debaters
and won a live debate.
920
01:18:51,570 --> 01:18:54,160
- Computer-generated
videos known as deep fakes
921
01:18:54,160 --> 01:18:57,980
are being used to put women's
faces on pornographic videos.
922
01:19:02,510 --> 01:19:06,450
- Artificial intelligence
evolves at a very crazy pace.
923
01:19:07,970 --> 01:19:10,180
You know, it's like progressing so fast.
924
01:19:10,180 --> 01:19:12,940
In some ways, we're only
at the beginning right now.
925
01:19:14,490 --> 01:19:17,620
You have so many potential
applications, it's a gold mine.
926
01:19:19,590 --> 01:19:23,450
Since 2012, when deep learning
became a big game changer
927
01:19:23,450 --> 01:19:25,370
in the computer vision community,
928
01:19:25,370 --> 01:19:28,730
we were one of the first to
actually adopt deep learning
929
01:19:28,730 --> 01:19:31,080
and apply it in the field
of computer graphics.
930
01:19:34,640 --> 01:19:39,800
A lot of our research is funded by
government, military and intelligence agencies.
931
01:19:43,950 --> 01:19:47,840
The way we create these photoreal mappings
932
01:19:47,840 --> 01:19:50,300
usually works like this:
we need two subjects,
933
01:19:50,300 --> 01:19:53,740
a source and a target, and
I can do a face replacement.
934
01:19:58,880 --> 01:20:03,700
One of the applications is, for example,
manipulating someone's face so it appears he is
935
01:20:03,700 --> 01:20:05,450
saying things that he never said.
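A conceptual sketch of the classic autoencoder-based face-swap idea behind such manipulations, not the speaker's actual pipeline: one shared encoder learns a common face representation, one decoder per identity reconstructs that identity, and a swap means encoding a frame of person A and decoding it with person B's decoder. PyTorch is assumed; image size, layer widths and training details are illustrative only.

```python
# Conceptual face-swap sketch: shared encoder, one decoder per identity.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 256),                          # latent "face code"
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity

# Training (sketch): reconstruct each identity through its own decoder, e.g.
#   loss_a = mse(decoder_a(encoder(faces_a)), faces_a)
#   loss_b = mse(decoder_b(encoder(faces_b)), faces_b)

# Swap at inference time: person A's pose and expression, person B's identity.
frame_a = torch.rand(1, 3, 64, 64)            # stand-in for a cropped source face
swapped = decoder_b(encoder(frame_a))
print(swapped.shape)                          # torch.Size([1, 3, 64, 64])
```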
936
01:20:08,920 --> 01:20:12,140
It can be used for creative
things, for funny content,
937
01:20:12,140 --> 01:20:16,110
but obviously, it can also
be used to simply
938
01:20:16,110 --> 01:20:18,660
manipulate videos and generate fake news.
939
01:20:21,050 --> 01:20:23,340
This can be very dangerous.
940
01:20:24,790 --> 01:20:29,300
If it gets into the wrong hands,
it can get out of control very quickly.
941
01:20:33,370 --> 01:20:35,750
- We're entering an era
in which our enemies can
942
01:20:35,750 --> 01:20:39,590
make it look like anyone is
saying anything at any point in time,
943
01:20:39,590 --> 01:20:41,830
even if they would
never say those things.
944
01:20:42,450 --> 01:20:46,970
Moving forward, we need to be more
vigilant with what we trust from the Internet.
945
01:20:46,970 --> 01:20:50,710
It may sound basic, but how
we move forward
946
01:20:50,710 --> 01:20:55,180
in the age of information
is going to be the difference
947
01:20:55,180 --> 01:20:58,160
between whether we survive
or whether we become
948
01:20:58,160 --> 01:21:00,270
some kind of fucked up dystopia.
949
01:22:31,070 --> 01:22:37,337
- One criticism that is frequently raised
against my work is this:
950
01:22:37,337 --> 01:22:43,610
"Hey, you know there were stupid ideas
in the past like phrenology or physiognomy.-
951
01:22:45,110 --> 01:22:49,210
- "There were people claiming
that you can read a character
952
01:22:49,210 --> 01:22:52,010
"of a person just based on their face."
953
01:22:53,820 --> 01:22:56,050
People would say, "This is rubbish.
954
01:22:56,050 --> 01:23:01,050
"We know it was just thinly
veiled racism and superstition."
955
01:23:05,300 --> 01:23:09,650
But the fact that someone
made a claim in the past
956
01:23:09,650 --> 01:23:14,650
and tried to support this
claim with invalid reasoning
957
01:23:14,960 --> 01:23:18,970
doesn't automatically
invalidate the claim.
958
01:23:23,790 --> 01:23:25,710
Of course, people should have a right
959
01:23:25,710 --> 01:23:31,020
to privacy when it comes to
sexual orientation or political views,
960
01:23:32,310 --> 01:23:35,600
but I'm also afraid that in
the current technological environment,
961
01:23:35,600 --> 01:23:38,220
this is essentially impossible.
962
01:23:42,640 --> 01:23:44,910
People should realize
there's no going back.
963
01:23:44,910 --> 01:23:48,330
There's no running away
from the algorithms.
964
01:23:51,120 --> 01:23:55,670
The sooner we accept the
inevitable and inconvenient truth
965
01:23:56,540 --> 01:23:59,430
that privacy is gone,
966
01:24:01,770 --> 01:24:05,380
the sooner we can actually
start thinking about
967
01:24:05,380 --> 01:24:07,440
how to make
sure that our societies
968
01:24:07,440 --> 01:24:11,660
are ready for the Post-Privacy Age.
969
01:24:35,080 --> 01:24:37,610
- While speaking about facial recognition,
970
01:24:37,610 --> 01:24:43,700
my thoughts sometimes go back to
a very dark era of our history.
971
01:24:45,490 --> 01:24:49,020
When people had to live in a system
972
01:24:49,020 --> 01:24:52,660
where some part of society was accepted
973
01:24:53,680 --> 01:24:56,900
and some part of society
was condemned to death.
974
01:25:01,670 --> 01:25:06,180
What would Mengele have done with
such an instrument in his hands?
975
01:25:10,740 --> 01:25:14,390
It would be very quick and
efficient for selection
976
01:25:18,390 --> 01:25:22,100
and this is the apocalyptic vision.
977
01:26:15,880 --> 01:26:19,870
- So in the near future,
the entire story of you
978
01:26:19,870 --> 01:26:25,340
will exist in a vast array of
connected databases of faces,
979
01:26:25,750 --> 01:26:28,390
genomes, behaviors and emotions.
980
01:26:30,950 --> 01:26:35,320
So, you will have a digital
avatar of yourself online,
981
01:26:35,320 --> 01:26:39,510
which records how well you
are doing as a citizen,
982
01:26:39,510 --> 01:26:41,910
what kind of relationship you have,
983
01:26:41,910 --> 01:26:45,890
what kind of political orientation
and sexual orientation you have.
984
01:26:50,020 --> 01:26:54,120
Based on all of that data,
those algorithms will be able to
985
01:26:54,120 --> 01:26:58,390
manipulate your behavior
with extreme precision,
986
01:26:58,390 --> 01:27:03,890
changing how we think and
probably in the future, how we feel.
987
01:27:25,660 --> 01:27:30,510
- The beliefs and desires of the
first AGIs will be extremely important.
988
01:27:32,730 --> 01:27:35,180
So, it's important to
program them correctly.
989
01:27:36,200 --> 01:27:37,850
I think that if this is not done,
990
01:27:38,810 --> 01:27:44,190
then the nature of evolution,
of natural selection, will favor
991
01:27:44,190 --> 01:27:48,340
those systems that prioritize their
own survival above all else.
992
01:27:51,860 --> 01:27:56,660
It's not that it's going to actively
hate humans and want to harm them,
993
01:27:58,780 --> 01:28:01,100
but it's just
going to be too powerful
994
01:28:01,400 --> 01:28:05,470
and I think a good analogy would be
the way humans treat animals.
995
01:28:06,970 --> 01:28:08,200
It's not that we hate animals.
996
01:28:08,260 --> 01:28:11,930
I think humans love animals
and have a lot of affection for them,
997
01:28:12,340 --> 01:28:17,610
but when the time comes to
build a highway between two cities,
998
01:28:17,610 --> 01:28:20,130
we are not asking
the animals for permission.
999
01:28:20,130 --> 01:28:23,340
We just do it because
it's important for us.
1000
01:28:24,250 --> 01:28:29,010
And I think by default, that's the kind of
relationship that's going to be between us
1001
01:28:29,010 --> 01:28:34,700
and AGIs which are truly autonomous
and operating on their own behalf.
1002
01:28:47,190 --> 01:28:51,400
If you have an arms-race
dynamic between multiple teams
1003
01:28:51,400 --> 01:28:54,140
trying to build the AGI first,
1004
01:28:54,140 --> 01:28:58,000
they will have less time to make sure
that the AGI that they build
1005
01:28:58,970 --> 01:29:00,520
will care deeply for humans.
1006
01:29:03,530 --> 01:29:06,540
Because the way I imagine it
is that there is an avalanche,
1007
01:29:06,540 --> 01:29:09,220
there is an avalanche
of AGI development.
1008
01:29:09,220 --> 01:29:12,300
Imagine it's a huge unstoppable force.
1009
01:29:15,780 --> 01:29:18,740
And I think it's pretty likely
the entire surface of the
1010
01:29:18,740 --> 01:29:22,070
Earth would be covered with
solar panels and data centers.
1011
01:29:26,230 --> 01:29:28,700
Given these kinds of concerns,
1012
01:29:28,700 --> 01:29:32,830
it will be important that
the AGI is somehow built
1013
01:29:32,830 --> 01:29:35,570
through cooperation
between multiple countries.
1014
01:29:38,060 --> 01:29:41,180
The future is going to be
good for the AIs, regardless.
1015
01:29:42,130 --> 01:29:45,020
It would be nice if it would
be good for humans as well.
1016
01:30:06,750 --> 01:30:10,580
- Is there a lot of responsibility
weighing on my shoulders?
1017
01:30:10,580 --> 01:30:11,800
Not really.
1018
01:30:13,420 --> 01:30:16,740
Was there a lot of
responsibility on the shoulders
1019
01:30:16,740 --> 01:30:19,100
of the parents of Einstein?
1020
01:30:20,190 --> 01:30:21,780
The parents somehow made him,
1021
01:30:21,780 --> 01:30:25,450
but they had no way of
predicting what he would do,
1022
01:30:25,450 --> 01:30:27,300
and how he would change the world.
1023
01:30:28,580 --> 01:30:32,540
And so, you can't really hold
them responsible for that.
1024
01:30:57,640 --> 01:31:00,240
So, I'm not a very human-centric person.
1025
01:31:01,840 --> 01:31:05,500
I think I'm a little stepping
stone in the evolution
1026
01:31:05,500 --> 01:31:08,060
of the Universe towards higher complexity.
1027
01:31:10,820 --> 01:31:14,860
But it's also clear to me that
I'm not the crown of creation
1028
01:31:14,860 --> 01:31:19,320
and that humankind as a whole
is not the crown of creation,
1029
01:31:21,140 --> 01:31:26,290
but we are setting the stage for something
that is bigger than us, that transcends us.
1030
01:31:28,980 --> 01:31:32,780
And then it will go out there in a way
where humans cannot follow
1031
01:31:32,780 --> 01:31:37,940
and transform the entire Universe,
or at least the reachable Universe.
1032
01:31:41,520 --> 01:31:46,520
So, I find beauty and awe in seeing myself
1033
01:31:46,640 --> 01:31:50,260
as part of this much grander theme.
1034
01:32:14,070 --> 01:32:16,030
- AI is inevitable.
1035
01:32:17,560 --> 01:32:23,370
We need to make sure we have
the necessary human regulation
1036
01:32:23,370 --> 01:32:27,940
to prevent the weaponization
of artificial intelligence.
1037
01:32:28,690 --> 01:32:32,000
We don't need any more weaponization
1038
01:32:32,000 --> 01:32:33,960
of such a powerful tool.
1039
01:32:37,100 --> 01:32:41,960
- One of the most critical things, I think,
is the need for international governance.
1040
01:32:43,920 --> 01:32:46,310
We have an imbalance of
power here because now
1041
01:32:46,310 --> 01:32:49,140
we have corporations with
more power, might and ability
1042
01:32:49,140 --> 01:32:51,100
than entire countries.
1043
01:32:51,100 --> 01:32:54,050
How do we make sure that people's
voices are getting heard?
1044
01:32:58,030 --> 01:32:59,930
- It can't be a law-free zone.
1045
01:32:59,930 --> 01:33:02,100
It can't be a rights-free zone.
1046
01:33:02,100 --> 01:33:05,870
We can't embrace all of these
wonderful new technologies
1047
01:33:05,870 --> 01:33:10,380
for the 21st century without
trying to bring with us
1048
01:33:10,380 --> 01:33:15,380
the package of human rights
that we fought so hard
1049
01:33:15,640 --> 01:33:19,190
to achieve, and that remains so fragile.
1050
01:33:28,680 --> 01:33:32,180
- AI isn't good and it isn't evil, either.
1051
01:33:32,180 --> 01:33:36,890
It's just going to amplify the desires
and goals of whoever controls it.
1052
01:33:36,890 --> 01:33:41,240
And AI today is under the control of
a very, very small group of people.
1053
01:33:44,430 --> 01:33:49,260
The most important question that we humans
have to ask ourselves at this point in history
1054
01:33:49,260 --> 01:33:51,380
requires no technical knowledge.
1055
01:33:51,590 --> 01:33:57,540
It's the question of what sort
of future society we want to create
1056
01:33:57,540 --> 01:33:59,740
with all this
technology we're making?
1057
01:34:01,410 --> 01:34:05,110
What do we want the role of
humans to be in this world?