1
00:00:10,649 --> 00:00:14,149
"You are my creator, but I am your master ..."
2
00:00:36,450 --> 00:00:38,650
What we are on the eve of
3
00:00:38,750 --> 00:00:42,689
is a world with increasingly powerful and sophisticated
4
00:00:43,890 --> 00:00:45,793
artificial intelligence.
5
00:00:46,040 --> 00:00:49,499
The technology is evolving much faster
6
00:00:49,500 --> 00:00:52,569
than our society's ability to protect us as citizens.
7
00:00:52,570 --> 00:00:55,249
The robots are coming and they will destroy
8
00:00:55,250 --> 00:00:56,818
our livelihood.
9
00:01:02,993 --> 00:01:05,479
You have a networked intelligence that watches us,
10
00:01:05,480 --> 00:01:09,480
knows everything about us, and starts trying to get ahead of us.
11
00:01:09,590 --> 00:01:12,664
Twitter has become the world's number one news website.
12
00:01:13,640 --> 00:01:16,179
Technology is not good or bad.
13
00:01:16,180 --> 00:01:18,263
It's what we do with the technology.
14
00:01:20,390 --> 00:01:23,439
In the end, millions of people will be forced from their jobs
15
00:01:23,440 --> 00:01:25,953
because their skills will be obsolete.
16
00:01:26,886 --> 00:01:28,619
Mass unemployment.
17
00:01:28,620 --> 00:01:31,857
Greater inequality, even social unrest.
18
00:01:33,520 --> 00:01:36,479
Whether we are afraid or not,
19
00:01:36,480 --> 00:01:39,307
change is coming and no one can stop it.
20
00:01:45,810 --> 00:01:47,699
We invest huge amounts of money
21
00:01:47,700 --> 00:01:49,473
and so it is logical that the military,
22
00:01:49,474 --> 00:01:53,439
with their own agendas, will start to use these technologies.
23
00:01:54,400 --> 00:01:55,809
Autonomous weapons systems
24
00:01:55,810 --> 00:01:57,879
would lead to a global arms race
25
00:01:57,880 --> 00:02:00,383
that would rival the nuclear age.
26
00:02:03,520 --> 00:02:06,729
So you know what the answer is, they will eventually kill us.
27
00:02:12,020 --> 00:02:14,349
These technological leaps will give us
28
00:02:14,350 --> 00:02:15,790
incredible miracles
29
00:02:16,830 --> 00:02:18,303
and incredible horrors.
30
00:02:25,540 --> 00:02:26,540
We created it,
31
00:02:28,320 --> 00:02:31,699
so I think as we move forward with this intelligence,
32
00:02:31,700 --> 00:02:33,293
it will contain parts of us.
33
00:02:35,150 --> 00:02:36,735
But I think the question is,
34
00:02:36,836 --> 00:02:38,590
will it contain the good parts
35
00:02:40,600 --> 00:02:41,800
or the bad parts?
36
00:02:49,100 --> 00:02:52,600
DO YOU TRUST THIS COMPUTER?
37
00:03:06,148 --> 00:03:07,971
The survivors called the war
38
00:03:07,972 --> 00:03:09,799
Judgment Day,
39
00:03:09,800 --> 00:03:12,373
they lived only to face a new nightmare,
40
00:03:13,720 --> 00:03:15,423
the war against the machines.
41
00:03:16,400 --> 00:03:18,431
I think we totally screwed up.
42
00:03:19,264 --> 00:03:21,529
I think Hollywood has managed to inoculate
43
00:03:21,530 --> 00:03:25,149
the general public against this issue,
44
00:03:25,150 --> 00:03:28,023
the idea that a machine will dominate the world.
45
00:03:29,630 --> 00:03:31,267
Open the pod bay doors, HAL.
46
00:03:33,200 --> 00:03:34,699
I'm sorry, Dave.
47
00:03:34,700 --> 00:03:36,530
I'm afraid I can't do that.
48
00:03:38,920 --> 00:03:39,939
HAL?
49
00:03:39,940 --> 00:03:41,569
The alarm has been raised without reason so often
50
00:03:41,570 --> 00:03:43,487
that the public has stopped paying attention
51
00:03:43,588 --> 00:03:45,109
because it sounds like science fiction,
52
00:03:45,110 --> 00:03:46,848
even sitting here talking about it.
53
00:03:46,849 --> 00:03:48,499
Even now, it seems a bit silly,
54
00:03:48,500 --> 00:03:52,018
a bit like, "oh, this is something out of a B movie".
55
00:03:52,660 --> 00:03:55,629
The WOPR spends all its time thinking
56
00:03:55,630 --> 00:03:57,312
about World War III.
57
00:03:57,770 --> 00:03:58,770
But it is not.
58
00:04:00,010 --> 00:04:03,574
The general public is about to be caught off guard by this.
59
00:04:12,510 --> 00:04:15,342
As individuals, we are increasingly surrounded
60
00:04:15,343 --> 00:04:18,283
by machine intelligence.
61
00:04:20,150 --> 00:04:23,609
We carry this device in the palm of our hand
62
00:04:23,610 --> 00:04:26,729
that we use to make an impressive range of life decisions,
63
00:04:27,560 --> 00:04:29,969
aided by a set of distant algorithms
64
00:04:29,970 --> 00:04:32,096
about which we understand nothing.
65
00:04:35,750 --> 00:04:37,599
We are already used to the idea
66
00:04:37,600 --> 00:04:39,453
that we can speak to our phone
67
00:04:39,454 --> 00:04:41,159
and most of the time it understands us.
68
00:04:41,160 --> 00:04:43,439
That was the stuff of action movies
69
00:04:43,440 --> 00:04:45,899
just five years ago.
70
00:04:45,900 --> 00:04:49,407
Robotic machines that see, speak and hear,
71
00:04:49,408 --> 00:04:52,709
all this is now real and these technologies
72
00:04:52,710 --> 00:04:55,163
will fundamentally change our society.
73
00:04:57,130 --> 00:05:00,176
Now we have this great movement of self-driving cars.
74
00:05:01,130 --> 00:05:04,219
Cars that drive autonomously can move people's lives
75
00:05:04,220 --> 00:05:05,433
to a better place.
76
00:05:07,050 --> 00:05:10,102
I lost several members of my family, including my mother,
77
00:05:10,103 --> 00:05:12,779
my brother and sister and their children
78
00:05:12,780 --> 00:05:14,190
to car accidents.
79
00:05:14,940 --> 00:05:16,179
It's pretty clear.
80
00:05:16,180 --> 00:05:20,180
We can almost eliminate car accidents with automation.
81
00:05:21,030 --> 00:05:23,079
30,000 lives in the US alone,
82
00:05:23,080 --> 00:05:25,993
about a million worldwide each year.
83
00:05:26,540 --> 00:05:30,399
In healthcare, early indicators are the name of the game in this space,
84
00:05:30,400 --> 00:05:34,029
so this is another place where you can save someone's life.
85
00:05:34,720 --> 00:05:37,616
Here in the breast cancer center,
86
00:05:38,017 --> 00:05:41,546
everything the radiologist's brain does in two minutes
87
00:05:41,547 --> 00:05:43,713
the computer does instantly.
88
00:05:44,710 --> 00:05:48,229
The computer has analyzed a billion mammograms
89
00:05:48,230 --> 00:05:51,825
and it took that data and applied it to this image instantly.
90
00:05:52,420 --> 00:05:54,850
So its medical application is profound.
91
00:05:57,140 --> 00:06:00,239
Another exciting area where we are seeing a lot of development
92
00:06:00,240 --> 00:06:03,666
is understanding our genetic code
93
00:06:03,960 --> 00:06:07,059
and using it both to diagnose diseases
94
00:06:07,060 --> 00:06:09,593
and to create customized treatments.
95
00:06:13,015 --> 00:06:15,112
The main application of all these machines
96
00:06:15,113 --> 00:06:17,643
will be to expand our own intelligence.
97
00:06:18,330 --> 00:06:20,346
We can become smarter,
98
00:06:20,347 --> 00:06:22,860
and be better at solving problems.
99
00:06:23,530 --> 00:06:24,763
We will not have to grow old,
100
00:06:24,764 --> 00:06:26,719
we will not have to go through aging.
101
00:06:26,720 --> 00:06:27,970
We will be able to stop it.
102
00:06:28,510 --> 00:06:31,029
There is really no limit to what intelligent machines
103
00:06:31,030 --> 00:06:32,863
can do for the human race.
104
00:06:37,571 --> 00:06:41,162
How could a more intelligent machine not be a better machine?
105
00:06:42,940 --> 00:06:45,669
It's hard to say exactly when I started thinking
106
00:06:45,670 --> 00:06:47,373
that that was a bit naive.
107
00:06:57,780 --> 00:07:00,289
Stuart Russell is basically a god
108
00:07:00,290 --> 00:07:01,929
in the field of artificial intelligence.
109
00:07:01,930 --> 00:07:05,309
He wrote the book that almost all universities use.
110
00:07:05,310 --> 00:07:08,159
I used to say it is the best-selling AI book.
111
00:07:08,160 --> 00:07:10,903
Now I say it is the most stolen PDF.
112
00:07:14,590 --> 00:07:16,617
Artificial intelligence is about making
113
00:07:16,718 --> 00:07:18,590
intelligent computers,
114
00:07:18,790 --> 00:07:21,699
and from the public's point of view, what counts as AI
115
00:07:21,700 --> 00:07:23,970
is something that is surprisingly clever
116
00:07:23,971 --> 00:07:26,231
compared to what we think computers
117
00:07:26,232 --> 00:07:27,833
are normally capable of doing.
118
00:07:30,341 --> 00:07:32,469
AI is a field of research
119
00:07:32,470 --> 00:07:34,729
which basically tries to simulate
120
00:07:34,730 --> 00:07:36,603
all kinds of human capabilities.
121
00:07:37,730 --> 00:07:39,669
We are in an era of AI.
122
00:07:39,670 --> 00:07:43,079
Silicon Valley has the ability to focus on one brilliant thing at a time.
123
00:07:43,140 --> 00:07:45,979
It was social networks and social media in the last decade,
124
00:07:45,980 --> 00:07:48,289
and it is quite clear that the tide has turned.
125
00:07:49,190 --> 00:07:51,759
It starts with machine learning.
126
00:07:51,760 --> 00:07:53,639
When we look back at this time,
127
00:07:53,640 --> 00:07:55,249
what was the first AI?
128
00:07:55,250 --> 00:07:58,199
It is not exciting, it is not the thing we see in the movies,
129
00:07:58,200 --> 00:08:00,219
but you could make a good case that
130
00:08:00,220 --> 00:08:02,479
Google did not create a search engine,
131
00:08:02,480 --> 00:08:03,880
but a deity.
132
00:08:04,280 --> 00:08:07,659
A way for people to ask any question they want
133
00:08:07,660 --> 00:08:09,409
and get the answer they need.
134
00:08:09,410 --> 00:08:11,799
Most people are not aware that
135
00:08:11,800 --> 00:08:14,889
what Google does is a form of artificial intelligence.
136
00:08:14,890 --> 00:08:17,289
They just go there, type in something
137
00:08:17,290 --> 00:08:19,249
and Google gives the answer.
138
00:08:19,250 --> 00:08:22,459
With each search, we train it to be better.
139
00:08:22,460 --> 00:08:25,049
Sometimes we type and the search gives us the answer
140
00:08:25,050 --> 00:08:27,593
before we finish asking the question.
141
00:08:28,330 --> 00:08:31,039
Who is the president of Kazakhstan?
142
00:08:31,040 --> 00:08:32,259
And it will tell you,
143
00:08:32,260 --> 00:08:35,719
you do not need to go to the website of Kazakhstan to find out.
144
00:08:35,810 --> 00:08:38,019
It never used to be able to do this.
145
00:08:38,020 --> 00:08:40,419
This is artificial intelligence.
146
00:08:40,420 --> 00:08:42,909
In a few years, when we try to understand,
147
00:08:42,910 --> 00:08:45,549
we will say, well, how did we miss it?
148
00:08:45,550 --> 00:08:48,239
It is one of those remarkable contradictions
149
00:08:48,240 --> 00:08:49,380
we are facing.
150
00:08:49,381 --> 00:08:51,209
Google and Facebook have absolutely
151
00:08:51,210 --> 00:08:53,251
built businesses by giving us, as a society,
152
00:08:53,252 --> 00:08:54,983
free stuff, at no charge,
153
00:08:54,984 --> 00:08:56,959
but this is a Faustian bargain.
154
00:08:56,960 --> 00:09:00,173
They are taking something from us in return.
155
00:09:01,310 --> 00:09:04,269
But we do not know what code is running on the other side,
156
00:09:04,270 --> 00:09:05,770
and we have no idea.
157
00:09:07,560 --> 00:09:11,559
This gets at the question of how much we should trust the machines.
158
00:09:15,100 --> 00:09:18,869
I literally use computers for everything.
159
00:09:18,870 --> 00:09:21,909
There are so many advances in computing now
160
00:09:22,010 --> 00:09:24,759
and it has become such a big part of our lives.
161
00:09:24,760 --> 00:09:27,269
It is simply amazing what a computer can do.
162
00:09:27,270 --> 00:09:30,039
You can actually carry a computer in your bag.
163
00:09:30,040 --> 00:09:31,593
I mean, this is incredible.
164
00:09:32,799 --> 00:09:34,447
I think most technologies
165
00:09:34,548 --> 00:09:37,609
are made to make things easier for all of us.
166
00:09:38,230 --> 00:09:40,983
So hopefully this will continue to be the focus.
167
00:09:41,932 --> 00:09:44,213
I think everyone loves their computers.
168
00:09:52,640 --> 00:09:54,441
People do not realize
169
00:09:54,442 --> 00:09:58,003
they are constantly being negotiated with by machines.
170
00:10:00,400 --> 00:10:03,979
Whether it's the price of the products in your Amazon shopping cart,
171
00:10:03,980 --> 00:10:06,519
whether you can get on a particular flight,
172
00:10:06,520 --> 00:10:09,369
whether you can book a room in a specific hotel,
173
00:10:09,370 --> 00:10:10,961
what you're experiencing
174
00:10:11,062 --> 00:10:12,889
are machine learning algorithms
175
00:10:12,890 --> 00:10:15,209
which determined that a person like you
176
00:10:15,210 --> 00:10:18,223
is willing to pay two cents more, and are changing the price.
177
00:10:22,690 --> 00:10:26,039
Now the computer looks at millions of people simultaneously
178
00:10:26,040 --> 00:10:28,369
for very subtle patterns.
179
00:10:29,070 --> 00:10:33,070
You can take seemingly innocent digital traces,
180
00:10:33,230 --> 00:10:35,669
such as someone's playlist on Spotify
181
00:10:35,670 --> 00:10:38,549
or things that they bought on Amazon,
182
00:10:38,550 --> 00:10:41,359
and then use algorithms to translate this
183
00:10:41,560 --> 00:10:44,993
into a very detailed, very accurate, intimate profile.
184
00:10:48,560 --> 00:10:51,029
There is a dossier on each of us,
185
00:10:51,030 --> 00:10:54,029
so extensive that it would be possible to say
186
00:10:54,130 --> 00:10:56,533
they know more about you than your mother.
187
00:11:05,610 --> 00:11:07,709
The main cause of the recent advancement of AI
188
00:11:07,710 --> 00:11:11,579
is not just that some guy suddenly had a brilliant insight,
189
00:11:11,580 --> 00:11:15,279
but simply because we have much more data
190
00:11:15,280 --> 00:11:18,113
to train them on, and immensely better computers.
191
00:11:19,370 --> 00:11:21,089
The magic is in the data.
192
00:11:21,090 --> 00:11:22,509
It's a ton of data.
193
00:11:22,510 --> 00:11:24,739
I mean, it is data that never existed before.
194
00:11:24,740 --> 00:11:26,483
We never had that data before.
195
00:11:27,790 --> 00:11:30,449
We create technologies that allow us
196
00:11:30,450 --> 00:11:33,173
to capture large amounts of information.
197
00:11:34,010 --> 00:11:36,629
If you think of a billion mobile phones on the planet
198
00:11:36,630 --> 00:11:39,519
with gyroscopes, accelerometers,
199
00:11:39,520 --> 00:11:42,069
fingerprint readers coupled with GPS
200
00:11:42,070 --> 00:11:44,466
and the photos they take and the tweets we send,
201
00:11:44,467 --> 00:11:46,296
we are all giving off enormous
202
00:11:46,397 --> 00:11:48,467
amounts of data individually.
203
00:11:48,640 --> 00:11:51,279
Cars drive by while their cameras suck up information
204
00:11:51,280 --> 00:11:52,706
about the world around them.
205
00:11:52,807 --> 00:11:55,969
Satellites the size of a toaster are now in orbit.
206
00:11:55,980 --> 00:11:58,719
Infrared imaging of the planet's vegetation.
207
00:11:58,720 --> 00:12:01,859
Buoys in the ocean feeding climate models.
208
00:12:06,225 --> 00:12:09,770
And the NSA and the CIA collect information
209
00:12:09,830 --> 00:12:11,843
on geopolitical situations.
210
00:12:13,560 --> 00:12:16,273
The world today is literally swimming in this data.
211
00:12:21,620 --> 00:12:24,339
Going back to 2012,
212
00:12:24,340 --> 00:12:27,519
IBM estimated that the average human leaves behind
213
00:12:27,520 --> 00:12:31,520
500 megabytes of digital traces every day.
214
00:12:32,060 --> 00:12:35,819
If you wanted to back up just one day of the data
215
00:12:35,820 --> 00:12:37,409
that mankind produces
216
00:12:37,410 --> 00:12:40,089
and you printed it on letter-size paper,
217
00:12:40,090 --> 00:12:42,809
double-sided, font size 12,
218
00:12:42,810 --> 00:12:46,363
it would stretch from the surface of the Earth to the sun ...
219
00:12:48,864 --> 00:12:50,264
four times.
220
00:12:50,350 --> 00:12:51,750
And that's every day.
221
00:12:52,300 --> 00:12:54,769
The data itself is not good or bad.
222
00:12:54,770 --> 00:12:56,549
It is how it is used.
223
00:12:56,550 --> 00:12:59,279
We are really relying on the goodwill of these people
224
00:12:59,280 --> 00:13:02,129
and the policies of these companies.
225
00:13:02,130 --> 00:13:04,509
There is no legal requirement for how they
226
00:13:04,510 --> 00:13:07,249
can or should use this data.
227
00:13:07,250 --> 00:13:09,813
That, to me, is the heart of the trust question.
228
00:13:12,100 --> 00:13:14,729
Right now there is a huge rush to create machines
229
00:13:14,730 --> 00:13:16,669
that are as intelligent as humans.
230
00:13:16,670 --> 00:13:18,969
Google is working on what is
231
00:13:18,970 --> 00:13:21,539
a kind of Manhattan Project of artificial intelligence.
232
00:13:21,540 --> 00:13:22,719
They have a lot of money.
233
00:13:22,720 --> 00:13:23,889
They have a lot of talent.
234
00:13:23,890 --> 00:13:26,590
They are buying AI companies and robotics companies.
235
00:13:27,930 --> 00:13:30,539
People still think that Google is a search engine,
236
00:13:30,540 --> 00:13:32,329
their e-mail provider,
237
00:13:32,330 --> 00:13:34,769
and many other things that we use daily,
238
00:13:34,870 --> 00:13:37,015
but behind this search box
239
00:13:37,116 --> 00:13:38,970
are 10 million servers.
240
00:13:40,630 --> 00:13:43,268
This makes Google the most powerful computing platform
241
00:13:43,269 --> 00:13:44,679
in the world.
242
00:13:44,880 --> 00:13:48,199
Google is now working on an AI computing platform
243
00:13:48,200 --> 00:13:50,653
that will have one hundred million servers.
244
00:13:52,430 --> 00:13:55,319
So when you're interacting with Google,
245
00:13:55,320 --> 00:13:58,699
you are seeing the fingernail of a giant beast in formation,
246
00:13:58,700 --> 00:14:00,479
and the truth is,
247
00:14:00,480 --> 00:14:04,013
I'm not sure if Google knows what it's becoming.
248
00:14:12,440 --> 00:14:15,699
If you look inside the algorithms being used in Google,
249
00:14:15,700 --> 00:14:19,129
the technology is largely from the '80s.
250
00:14:21,050 --> 00:14:23,492
So, these are models that you teach,
251
00:14:23,493 --> 00:14:25,293
showing it a one, a two, and a three,
252
00:14:25,394 --> 00:14:28,293
and it does not learn what a one is, or what a two is,
253
00:14:28,360 --> 00:14:30,689
it learns the difference between a one and a two.
254
00:14:31,590 --> 00:14:33,343
It's just a computer.
255
00:14:33,400 --> 00:14:36,439
In the last half decade, which has seen this rapid progress,
256
00:14:36,440 --> 00:14:38,657
everything has happened in pattern recognition.
257
00:14:38,658 --> 00:14:41,598
Most of the good old-fashioned AI
258
00:14:41,599 --> 00:14:44,999
was when we told our computers
259
00:14:45,020 --> 00:14:47,389
how to play a game like chess.
260
00:14:47,690 --> 00:14:50,559
The old paradigm, where you tell a computer
261
00:14:50,560 --> 00:14:51,893
exactly what to do.
262
00:14:55,585 --> 00:14:58,335
This is Jeopardy,
263
00:15:00,510 --> 00:15:02,263
The IBM Challenge!
264
00:15:03,950 --> 00:15:06,709
No one at the time thought that a machine
265
00:15:06,710 --> 00:15:09,069
could have the accuracy, reliability and speed
266
00:15:09,070 --> 00:15:11,339
to play Jeopardy well enough
267
00:15:11,340 --> 00:15:12,909
against the best specialists.
268
00:15:12,910 --> 00:15:13,960
Let's play Jeopardy.
269
00:15:16,340 --> 00:15:17,509
Four-letter word
270
00:15:17,510 --> 00:15:19,529
for the iron fitting on the hoof of a horse.
271
00:15:19,530 --> 00:15:20,562
Watson?
272
00:15:20,563 --> 00:15:21,596
What is shoe?
273
00:15:21,597 --> 00:15:23,319
You're right, you can choose.
274
00:15:23,320 --> 00:15:26,299
Literary Character APB for 800.
275
00:15:26,300 --> 00:15:27,950
The answer is the Daily Double.
276
00:15:29,140 --> 00:15:31,870
Watson actually got its knowledge by reading Wikipedia
277
00:15:31,871 --> 00:15:35,217
and 200 million pages of documents in natural language.
278
00:15:35,218 --> 00:15:39,218
You cannot program every line of how the world works.
279
00:15:39,470 --> 00:15:41,119
The machine has to learn by reading.
280
00:15:41,120 --> 00:15:45,120
We come to Watson. "Who is Bram Stoker?" And the wager?
281
00:15:47,207 --> 00:15:50,037
Hello, $17,973.
282
00:15:50,038 --> 00:15:53,143
$41,413 and a two-day total.
283
00:15:54,260 --> 00:15:57,649
Watson was trained on a lot of text,
284
00:15:57,650 --> 00:16:00,339
but it's not as if it understands what it is saying.
285
00:16:00,340 --> 00:16:02,509
It does not know that water makes things wet
286
00:16:03,710 --> 00:16:06,399
by seeing the way things behave in the world
287
00:16:06,400 --> 00:16:07,539
as you and I do.
288
00:16:07,540 --> 00:16:10,239
Much of today's language AI is not building
289
00:16:10,240 --> 00:16:13,029
logical models of how the world works,
290
00:16:13,030 --> 00:16:17,030
it is looking at how words appear in the context
291
00:16:17,080 --> 00:16:18,363
in other words.
292
00:16:19,220 --> 00:16:21,149
David Ferrucci developed IBM's Watson,
293
00:16:21,150 --> 00:16:24,489
and someone asked him, "Does Watson think?"
294
00:16:24,490 --> 00:16:26,290
And he said, "Does a submarine swim?"
295
00:16:28,090 --> 00:16:29,786
And what he meant was that when
296
00:16:29,887 --> 00:16:31,369
they developed the submarines,
297
00:16:31,370 --> 00:16:33,446
they borrowed basic principles
298
00:16:33,447 --> 00:16:34,579
from the swimming of fish,
299
00:16:34,580 --> 00:16:37,156
but a submarine swims much faster than a fish.
300
00:16:37,157 --> 00:16:39,623
Even with a huge mass, it outswims the fish.
301
00:16:40,810 --> 00:16:43,091
Watson winning the game of Jeopardy
302
00:16:43,092 --> 00:16:46,543
will enter AI history as a significant milestone.
303
00:16:46,550 --> 00:16:49,739
We tend to be surprised when a machine works so well.
304
00:16:49,797 --> 00:16:53,153
I'm even more surprised when a computer beats humans
305
00:16:53,154 --> 00:16:56,054
at things that humans are naturally good at.
306
00:16:56,090 --> 00:16:59,019
This is how we make progress.
307
00:16:59,020 --> 00:17:01,609
In the early days of the Google Brain project,
308
00:17:01,610 --> 00:17:03,709
I gave the team a very simple instruction
309
00:17:03,710 --> 00:17:06,559
which was to build the largest possible neural network,
310
00:17:06,560 --> 00:17:08,163
something like a thousand computers.
311
00:17:09,289 --> 00:17:11,739
A neural network is something very close to a simulation of
312
00:17:11,740 --> 00:17:13,099
how the brain works.
313
00:17:13,100 --> 00:17:14,869
It is very probabilistic,
314
00:17:14,870 --> 00:17:17,179
but with contextual relevance.
315
00:17:17,180 --> 00:17:19,839
In your brain you have long neurons that connect
316
00:17:19,840 --> 00:17:22,379
to thousands of other neurons and you have these paths
317
00:17:22,380 --> 00:17:23,909
which are formed and forged
318
00:17:23,910 --> 00:17:25,859
based on what the brain needs to do.
319
00:17:25,860 --> 00:17:27,929
When a baby tries something and succeeds,
320
00:17:27,930 --> 00:17:29,968
there is a reward
321
00:17:29,969 --> 00:17:33,049
and the path that created the success is strengthened.
322
00:17:33,250 --> 00:17:35,629
If you fail at something, the path is weakened,
323
00:17:35,630 --> 00:17:38,019
and so over time the brain becomes tuned
324
00:17:38,020 --> 00:17:39,870
to be good at the environment around it.
325
00:17:41,633 --> 00:17:44,019
Letting the machines learn for themselves
326
00:17:44,020 --> 00:17:45,749
is called deep learning,
327
00:17:45,750 --> 00:17:47,579
and deep learning and neural networks
328
00:17:47,580 --> 00:17:49,580
roughly mean the same thing.
329
00:17:49,830 --> 00:17:53,349
Deep learning is a totally different approach,
330
00:17:53,350 --> 00:17:56,289
where the computer learns more like a child
331
00:17:56,290 --> 00:17:59,824
getting a lot of data and eventually figuring things out.
332
00:18:01,280 --> 00:18:04,039
The computer gets smarter and smarter
333
00:18:04,040 --> 00:18:05,833
as it gains more experience.
334
00:18:06,960 --> 00:18:09,033
Let's imagine that you have a neural network,
335
00:18:09,034 --> 00:18:10,619
something like a thousand computers,
336
00:18:10,620 --> 00:18:12,389
and it does not know anything,
337
00:18:12,390 --> 00:18:15,010
and we make it watch YouTube for a week.
338
00:18:19,727 --> 00:18:21,451
Oppa Gangnam Style.
339
00:18:26,592 --> 00:18:28,525
Charlie, that hurt!
340
00:18:37,430 --> 00:18:39,849
And after watching YouTube for a week,
341
00:18:39,850 --> 00:18:40,930
what did it learn?
342
00:18:40,931 --> 00:18:42,989
We had a hunch it would learn
343
00:18:42,990 --> 00:18:45,841
to detect common objects in videos.
344
00:18:45,842 --> 00:18:48,749
We know that human faces appear a lot in videos,
345
00:18:48,750 --> 00:18:51,044
so we went to check, and lo and behold, there was a neuron
346
00:18:51,045 --> 00:18:53,549
that had learned to detect a human face.
347
00:18:53,550 --> 00:18:54,970
Leave Britney alone!
348
00:18:56,990 --> 00:18:59,495
What else appears a lot in videos?
349
00:19:00,700 --> 00:19:02,390
Then we found to our surprise
350
00:19:02,391 --> 00:19:03,945
that there was indeed a neuron
351
00:19:03,946 --> 00:19:07,211
that had learned to detect cats.
352
00:19:16,220 --> 00:19:18,939
I still remember seeing the recognition: wow, it's a cat.
353
00:19:18,940 --> 00:19:20,003
Cool.
354
00:19:24,570 --> 00:19:27,029
It's all very harmless when you think about the future,
355
00:19:27,030 --> 00:19:29,770
everything seems kind of harmless and benign,
356
00:19:30,260 --> 00:19:32,189
but we are creating cognitive architectures
357
00:19:32,190 --> 00:19:34,779
that will fly farther and faster than us
358
00:19:34,780 --> 00:19:36,279
and will have a greater payload,
359
00:19:36,280 --> 00:19:38,249
and they will not be warm and cute.
360
00:19:38,450 --> 00:19:40,569
I think that in three to five years,
361
00:19:40,570 --> 00:19:43,579
we will see computer systems
362
00:19:43,580 --> 00:19:46,619
that will be able to learn autonomously
363
00:19:46,720 --> 00:19:49,355
how to understand, how to build an understanding.
364
00:19:49,356 --> 00:19:51,883
It is not different from the way the human mind works.
365
00:19:55,160 --> 00:19:58,459
Whatever it was, lunch was certainly delicious.
366
00:19:58,460 --> 00:20:00,487
It was just the work of Robbie's synthetics.
367
00:20:00,488 --> 00:20:02,809
Is he also your cook?
368
00:20:02,810 --> 00:20:05,519
He even makes the raw materials.
369
00:20:05,520 --> 00:20:06,670
Come back, Robbie.
370
00:20:07,870 --> 00:20:10,363
I'll show you how it works.
371
00:20:12,000 --> 00:20:14,229
Someone introduces a sample of human food
372
00:20:14,230 --> 00:20:15,330
through this opening.
373
00:20:16,350 --> 00:20:18,897
Down here, there is a small built-in chemical laboratory
374
00:20:18,898 --> 00:20:20,209
where he analyzes the food.
375
00:20:20,210 --> 00:20:22,599
Later he can reproduce identical molecules
376
00:20:22,600 --> 00:20:24,109
in any form or quantity.
377
00:20:24,110 --> 00:20:25,943
The dream of a spy.
378
00:20:26,820 --> 00:20:28,869
Meet Baxter, a revolutionary new
379
00:20:28,870 --> 00:20:31,713
category of robot with common sense.
380
00:20:32,634 --> 00:20:35,019
Baxter is a good example of the type of competition
381
00:20:35,020 --> 00:20:37,066
we face from the machines.
382
00:20:38,480 --> 00:20:41,720
Baxter can do almost anything we can do with our hands.
383
00:20:43,620 --> 00:20:46,709
Baxter costs what a minimum wage worker
384
00:20:46,710 --> 00:20:47,710
earns in a year,
385
00:20:48,550 --> 00:20:50,144
but Baxter will not take the place of
386
00:20:50,245 --> 00:20:52,029
a minimum wage worker.
387
00:20:52,080 --> 00:20:54,609
He will take the place of three because he never gets tired,
388
00:20:54,610 --> 00:20:56,043
he never takes a break.
389
00:20:56,480 --> 00:20:58,799
This is probably the first thing we see.
390
00:20:58,800 --> 00:21:00,379
The displacement of jobs.
391
00:21:00,380 --> 00:21:02,079
The work will be done faster
392
00:21:02,080 --> 00:21:03,733
and at a lower cost by machines.
393
00:21:05,880 --> 00:21:07,809
Our ability to stay current
394
00:21:07,810 --> 00:21:09,034
is so insanely limited
395
00:21:09,035 --> 00:21:11,092
compared to the machines we build.
396
00:21:11,170 --> 00:21:13,489
For example, we now have this great movement
397
00:21:13,490 --> 00:21:15,759
of Uber and Lyft, who are making transportation
398
00:21:15,760 --> 00:21:18,879
cheaper and more democratic, which is great.
399
00:21:18,880 --> 00:21:20,839
The next step is they will be replaced
400
00:21:20,840 --> 00:21:22,619
by driverless cars and then
401
00:21:22,620 --> 00:21:24,309
all the Uber and Lyft drivers
402
00:21:24,310 --> 00:21:26,217
will have to find something new to do.
403
00:21:26,860 --> 00:21:29,569
There are four million professional drivers in the US,
404
00:21:29,570 --> 00:21:32,113
they will soon be unemployed.
405
00:21:32,530 --> 00:21:34,999
Seven million people make data entry,
406
00:21:35,000 --> 00:21:36,850
these people will be unemployed.
407
00:21:38,340 --> 00:21:40,342
A job is not just about money.
408
00:21:40,343 --> 00:21:44,339
At a biological level, it serves a purpose,
409
00:21:44,340 --> 00:21:46,349
it becomes a defining thing.
410
00:21:46,350 --> 00:21:49,249
When jobs go away in any civilization,
411
00:21:49,250 --> 00:21:52,527
it does not take long until it turns into violence.
412
00:22:00,340 --> 00:22:02,999
We face a giant divide between rich and poor
413
00:22:03,000 --> 00:22:05,759
because that's what automation and AI will cause,
414
00:22:05,760 --> 00:22:08,360
a greater division between those who have and those who have not.
415
00:22:09,590 --> 00:22:11,769
Now it's happening in the middle class,
416
00:22:11,770 --> 00:22:13,225
in white-collar jobs.
417
00:22:13,980 --> 00:22:16,854
IBM's Watson does business analysis that we usually pay
418
00:22:16,855 --> 00:22:20,393
$300 per hour for a business analyst to do.
419
00:22:21,550 --> 00:22:24,009
Today you go to college to be a doctor,
420
00:22:24,010 --> 00:22:26,039
to be an accountant, to be a journalist.
421
00:22:26,040 --> 00:22:29,233
It is not clear that there will be jobs for you.
422
00:22:30,400 --> 00:22:32,999
If someone plans a 40-year career in radiology,
423
00:22:33,000 --> 00:22:34,439
just reading images,
424
00:22:34,540 --> 00:22:36,439
I think this could be a challenge
425
00:22:36,840 --> 00:22:38,250
for new graduates today.
426
00:22:57,790 --> 00:23:00,190
Today we will perform a robotic procedure.
427
00:23:00,600 --> 00:23:04,349
The Da Vinci robot is currently used
428
00:23:04,450 --> 00:23:06,389
by a variety of surgeons
429
00:23:06,490 --> 00:23:08,550
for its accuracy and its ability
430
00:23:09,080 --> 00:23:12,680
to avoid the inevitable tremors of the human hand.
431
00:23:24,290 --> 00:23:27,400
Anyone who watches it can feel that it is amazing.
432
00:23:31,870 --> 00:23:33,319
You look through the panel,
433
00:23:33,320 --> 00:23:37,320
and you see the claw holding the woman's ovaries.
434
00:23:37,850 --> 00:23:41,763
Humanity right there resting in the hands of this robot.
435
00:23:43,610 --> 00:23:45,729
People say that it is the future,
436
00:23:45,730 --> 00:23:48,733
but it is not the future, it is the present.
437
00:23:51,550 --> 00:23:53,409
If you think of a surgical robot,
438
00:23:53,410 --> 00:23:55,899
often there is not much intelligence in such things,
439
00:23:55,900 --> 00:23:57,269
but over time,
440
00:23:57,270 --> 00:23:59,709
we put more and more intelligence into these systems.
441
00:23:59,710 --> 00:24:01,899
surgical robots can actually learn
442
00:24:01,900 --> 00:24:03,339
from each robotic surgery.
443
00:24:03,340 --> 00:24:05,109
They are following the movements.
444
00:24:05,110 --> 00:24:07,839
They are understanding what worked and what did not work
445
00:24:07,840 --> 00:24:10,499
and eventually, for routine surgeries,
446
00:24:10,500 --> 00:24:13,239
the robot will be able to perform them alone
447
00:24:13,240 --> 00:24:14,540
or with human supervision.
448
00:24:16,740 --> 00:24:19,946
Typically we do about 150 hysterectomy procedures,
449
00:24:20,340 --> 00:24:22,840
and now most of them are done robotically.
450
00:24:24,540 --> 00:24:27,040
I do maybe an open procedure per year.
451
00:24:27,740 --> 00:24:30,740
So, do I feel uncomfortable? I certainly do,
452
00:24:31,440 --> 00:24:33,940
because I no longer remember how to open patients up.
453
00:24:36,620 --> 00:24:38,749
We seem to be feeding it and nurturing it,
454
00:24:38,750 --> 00:24:42,750
but somehow we are slaves to technology
455
00:24:43,569 --> 00:24:45,810
because we cannot go back.
456
00:24:51,471 --> 00:24:53,831
The machines are taking bigger and bigger bites
457
00:24:53,832 --> 00:24:55,891
out of our skill set
458
00:24:55,992 --> 00:24:57,932
at an ever-increasing speed,
459
00:24:58,060 --> 00:25:00,159
and so we have to run faster and faster
460
00:25:00,160 --> 00:25:01,997
to stay ahead of the machines.
461
00:25:03,660 --> 00:25:04,660
How do I look?
462
00:25:05,758 --> 00:25:06,841
Pretty.
463
00:25:10,970 --> 00:25:12,539
Are you attracted to me?
464
00:25:12,540 --> 00:25:13,573
What?
465
00:25:13,574 --> 00:25:15,159
Are you attracted to me?
466
00:25:15,160 --> 00:25:17,160
You give me indications that you are.
467
00:25:18,870 --> 00:25:19,902
I do?
468
00:25:19,903 --> 00:25:20,903
Yes.
469
00:25:21,570 --> 00:25:23,679
This is the future we're heading toward.
470
00:25:23,680 --> 00:25:27,499
We want to design our companions,
471
00:25:27,500 --> 00:25:30,769
we like to see a human face in AI,
472
00:25:30,770 --> 00:25:34,770
so playing on our emotions will be depressingly easy.
473
00:25:35,180 --> 00:25:36,809
We're not that complicated.
474
00:25:36,810 --> 00:25:39,099
We are simple responses to stimuli.
475
00:25:39,100 --> 00:25:40,603
I can make you like me
476
00:25:40,704 --> 00:25:42,382
basically by smiling at you a lot.
477
00:25:44,020 --> 00:25:46,570
Yes, AIs will be fantastic at manipulating us.
478
00:25:55,852 --> 00:25:59,259
So you've developed a technology that can sense
479
00:25:59,260 --> 00:26:00,749
what people are feeling?
480
00:26:00,750 --> 00:26:03,079
Sure, we have developed a technology that can read
481
00:26:03,080 --> 00:26:05,249
your facial expressions and map them
482
00:26:05,250 --> 00:26:07,459
onto a number of emotional states.
483
00:26:07,460 --> 00:26:09,669
Fifteen years ago I had just finished
484
00:26:09,670 --> 00:26:12,409
my graduate studies in computer science,
485
00:26:12,410 --> 00:26:15,989
and it occurred to me that I was spending too much time
486
00:26:15,990 --> 00:26:18,699
interacting with my laptop, with my devices,
487
00:26:18,700 --> 00:26:21,689
even though these devices had absolutely no clue
488
00:26:21,690 --> 00:26:23,009
how I was feeling.
489
00:26:24,520 --> 00:26:27,264
I began to think, what if this device could sense
490
00:26:27,265 --> 00:26:30,219
that I was stressed or having a bad day,
491
00:26:30,220 --> 00:26:32,063
what could that make possible?
492
00:26:33,580 --> 00:26:36,819
Hello my first graders, how are you?
493
00:26:36,820 --> 00:26:37,820
Can I have a hug?
494
00:26:38,810 --> 00:26:41,709
We had children interacting with the technology.
495
00:26:41,710 --> 00:26:43,639
Much of this is still in development,
496
00:26:43,640 --> 00:26:45,399
but it was just fantastic.
497
00:26:45,400 --> 00:26:46,649
Who here likes robots?
498
00:26:48,107 --> 00:26:50,509
Who wants to have a robot at home?
499
00:26:50,510 --> 00:26:51,829
So, what would you use a robot for?
500
00:26:51,830 --> 00:26:55,197
I would use it to ask my mom
501
00:26:55,198 --> 00:26:57,239
difficult math questions.
502
00:26:57,240 --> 00:26:58,739
Nice, and you?
503
00:26:58,740 --> 00:27:00,857
I would use it to scare people.
504
00:27:03,300 --> 00:27:05,150
All right, so start with a smile.
505
00:27:06,290 --> 00:27:08,033
Well, a furrowed brow.
506
00:27:09,930 --> 00:27:12,039
Well, a raised eyebrow.
507
00:27:12,040 --> 00:27:15,639
This generation has technology around them all the time.
508
00:27:16,530 --> 00:27:19,229
It's almost as if they expect to have robots in their homes,
509
00:27:19,230 --> 00:27:22,690
and they expect these robots to be socially intelligent.
510
00:27:23,260 --> 00:27:25,333
What makes robots intelligent?
511
00:27:26,390 --> 00:27:29,333
Put them in a math or biology class.
512
00:27:30,640 --> 00:27:33,199
I think you'd have to train it.
513
00:27:33,200 --> 00:27:35,300
Okay, let's walk down here.
514
00:27:36,180 --> 00:27:38,739
So if you smile and raise your eyebrows,
515
00:27:38,740 --> 00:27:40,360
it will come to you,
516
00:27:44,070 --> 00:27:46,220
but if you look angry, it'll run away.
517
00:27:50,160 --> 00:27:52,069
We're selling technology that allows computers
518
00:27:52,070 --> 00:27:53,550
to read and recognize emotions
519
00:27:56,401 --> 00:27:58,689
and the response so far has been really amazing.
520
00:27:58,690 --> 00:28:01,499
People are integrating it into health applications,
521
00:28:01,500 --> 00:28:04,883
meditation apps, robots, cars.
522
00:28:05,560 --> 00:28:07,333
Let's see how this plays out.
523
00:28:10,370 --> 00:28:12,459
Robots may contain AI,
524
00:28:12,460 --> 00:28:15,279
but the robot is the physical representation,
525
00:28:15,280 --> 00:28:17,709
and artificial intelligence is the brain,
526
00:28:17,710 --> 00:28:19,585
and so the brain can exist purely
527
00:28:19,686 --> 00:28:21,239
in software-based systems.
528
00:28:21,240 --> 00:28:23,409
They need not have a physical form.
529
00:28:23,410 --> 00:28:26,089
Robots can exist without any artificial intelligence;
530
00:28:26,090 --> 00:28:28,110
we have a lot of dumb robots out there,
531
00:28:29,060 --> 00:28:32,649
but a dumb robot can become a smart robot overnight
532
00:28:32,650 --> 00:28:35,173
with the right software, with the right sensors.
533
00:28:36,300 --> 00:28:39,459
We can't help ascribing minds to inanimate objects.
534
00:28:39,460 --> 00:28:40,839
We do this with machines.
535
00:28:40,840 --> 00:28:42,449
We will treat them like children.
536
00:28:42,450 --> 00:28:44,050
We will treat them as substitutes
537
00:28:46,490 --> 00:28:48,373
and we will pay the price.
538
00:29:09,667 --> 00:29:13,667
Welcome to ATR.
539
00:29:21,067 --> 00:29:23,165
My purpose is to build an anthropomorphic robot
540
00:29:23,266 --> 00:29:25,467
that has anthropomorphic intentions and desires.
541
00:29:37,867 --> 00:29:39,867
The robot's name is Erica.
542
00:29:41,567 --> 00:29:45,567
Erica is the most advanced anthropomorphic robot in the world, I think.
543
00:29:46,867 --> 00:29:48,867
Erica can look at your face.
544
00:29:52,673 --> 00:29:53,973
Konnichiwa.
545
00:29:56,473 --> 00:29:59,473
Robots can be good partners for us,
546
00:29:59,573 --> 00:30:03,573
especially for the elderly, children, the disabled...
547
00:30:05,573 --> 00:30:09,573
When we talk to a robot, we have no social barriers, no social pressures.
548
00:30:11,073 --> 00:30:15,073
Eventually, everyone will accept the android as a friend, a partner.
549
00:30:18,973 --> 00:30:22,973
We implemented a simple desire: she wants to be recognized, and she wants to take risks.
550
00:30:31,473 --> 00:30:33,473
If a robot could have intentions and desires of its own,
551
00:30:33,574 --> 00:30:36,873
it could understand the intentions and desires of others.
552
00:30:37,973 --> 00:30:40,173
What kind of animal do you like?
553
00:30:40,373 --> 00:30:44,373
I like Shiba Inu dogs. They're very cute, aren't they?
554
00:30:45,873 --> 00:30:48,514
This means stronger relationships with people,
555
00:30:48,515 --> 00:30:50,673
and that they could enjoy each other ...
556
00:30:51,373 --> 00:30:54,873
This means, I'm not sure ... love one another?
557
00:30:57,950 --> 00:30:59,749
We built artificial intelligence,
558
00:30:59,750 --> 00:31:02,300
and the first thing we want to do is replicate ourselves.
559
00:31:03,930 --> 00:31:06,609
I think the key point will come when all
560
00:31:06,610 --> 00:31:09,163
the main senses are replicated:
561
00:31:10,230 --> 00:31:11,230
the vision,
562
00:31:12,060 --> 00:31:13,060
the touch,
563
00:31:13,990 --> 00:31:15,579
the smell.
564
00:31:15,580 --> 00:31:17,199
When we replicate our senses
565
00:31:17,200 --> 00:31:18,763
is that when it becomes alive?
566
00:31:28,580 --> 00:31:31,890
Many of our machines are being built to understand us,
567
00:31:33,760 --> 00:31:36,159
but what about an anthropomorphic creature
568
00:31:36,160 --> 00:31:38,509
that discovers it can adjust its loyalty,
569
00:31:38,510 --> 00:31:41,009
adjust its courage, adjust its greed,
570
00:31:41,010 --> 00:31:42,563
adjust its cunning?
571
00:31:46,060 --> 00:31:49,389
The average person doesn't see killer robots going down the street,
572
00:31:49,470 --> 00:31:51,929
so they're like, "What are you talking about?"
573
00:31:51,930 --> 00:31:54,159
Man, we want to make sure we will not have
574
00:31:54,160 --> 00:31:56,210
killer robots down the street.
575
00:31:58,000 --> 00:32:00,500
Once they go down the street, it will be too late.
576
00:32:06,180 --> 00:32:08,449
The thing that worries me now,
577
00:32:08,450 --> 00:32:10,739
that keeps me awake,
578
00:32:10,940 --> 00:32:12,803
is the development of autonomous weapons.
579
00:32:29,045 --> 00:32:31,509
So far people have expressed discomfort
580
00:32:31,510 --> 00:32:35,510
about drones, which are remotely piloted aircraft.
581
00:32:40,879 --> 00:32:44,399
If you take the camera of a drone and feed it into an AI system,
582
00:32:44,400 --> 00:32:46,949
it will be a very easy step to get
583
00:32:46,950 --> 00:32:49,769
to fully autonomous weapons that choose their own targets
584
00:32:49,770 --> 00:32:51,320
and release their own missiles.
585
00:33:13,630 --> 00:33:16,069
The life expectancy of a human being
586
00:33:16,070 --> 00:33:17,609
in this kind of battlefield environment
587
00:33:17,610 --> 00:33:18,838
will be measured in seconds.
588
00:33:21,600 --> 00:33:24,679
At one moment drones were science fiction,
589
00:33:24,680 --> 00:33:28,263
and now they have become a normal thing in war.
590
00:33:29,810 --> 00:33:33,410
More than 10,000 in the US military inventory alone,
591
00:33:33,600 --> 00:33:36,339
but they are not just a US phenomenon.
592
00:33:36,340 --> 00:33:38,790
There are over 80 countries that operate them.
593
00:33:39,980 --> 00:33:42,229
It seems logical that the people who make some of
594
00:33:42,230 --> 00:33:44,589
the most important and difficult decisions in the world
595
00:33:44,590 --> 00:33:47,639
will start to use and deploy artificial intelligence.
596
00:33:51,710 --> 00:33:54,187
The Air Force just designed a jet program,
597
00:33:54,188 --> 00:33:56,859
400 billion dollars, to put pilots in the sky,
598
00:33:57,380 --> 00:33:59,295
and a $500 AI designed by a couple of
599
00:33:59,396 --> 00:34:01,480
graduate students
600
00:34:02,190 --> 00:34:04,399
beat the best human pilots
601
00:34:04,400 --> 00:34:06,310
with a relatively simple algorithm.
602
00:34:10,740 --> 00:34:13,656
AI will have as big an impact on the military
603
00:34:13,657 --> 00:34:17,453
as the combustion engine had at the turn of the century.
604
00:34:18,630 --> 00:34:22,139
This will literally touch everything the military does,
605
00:34:22,140 --> 00:34:26,140
from driverless vehicles delivering logistical supplies,
606
00:34:26,210 --> 00:34:29,939
to unmanned drones that deliver medical supplies,
607
00:34:29,940 --> 00:34:32,859
to computational propaganda trying to win
608
00:34:32,860 --> 00:34:35,660
the hearts and minds of a population,
609
00:34:36,693 --> 00:34:39,199
and so it is logical that whoever has the best AI
610
00:34:39,300 --> 00:34:41,650
will probably achieve dominance on this planet.
611
00:34:46,610 --> 00:34:48,669
Sometime in the early twenty-first century,
612
00:34:48,670 --> 00:34:52,139
all mankind was united in celebration.
613
00:34:52,140 --> 00:34:55,240
We marveled at our own magnificence
614
00:34:55,241 --> 00:34:57,040
as we gave birth to AI.
615
00:34:57,710 --> 00:34:58,710
AI?
616
00:34:59,950 --> 00:35:01,729
You mean artificial intelligence?
617
00:35:01,730 --> 00:35:03,265
A singular consciousness that spawned
618
00:35:03,266 --> 00:35:05,953
an entire race of machines.
619
00:35:06,790 --> 00:35:10,639
We do not know who struck first, us or them,
620
00:35:10,640 --> 00:35:13,749
but we know that it was us who scorched the sky.
621
00:35:16,040 --> 00:35:18,029
There is a long history of science fiction,
622
00:35:18,030 --> 00:35:19,379
not only predicting the future,
623
00:35:19,480 --> 00:35:21,419
but shaping the future.
624
00:35:27,760 --> 00:35:31,389
Arthur Conan Doyle wrote before the First World War
625
00:35:31,390 --> 00:35:33,654
about the dangers of how submarines
626
00:35:33,655 --> 00:35:37,655
could be used to mount blockades against civilian shipping.
627
00:35:38,960 --> 00:35:41,349
At the time he wrote this fiction,
628
00:35:41,350 --> 00:35:44,349
the Royal Navy made fun of Arthur Conan Doyle
629
00:35:44,350 --> 00:35:46,947
for having this absurd idea that submarines
630
00:35:46,948 --> 00:35:48,750
could be useful in war.
631
00:35:54,840 --> 00:35:56,729
One of the things we see in history
632
00:35:56,730 --> 00:35:59,209
is that our attitudes toward technology,
633
00:35:59,210 --> 00:36:02,839
as well as ethics, depend heavily on context.
634
00:36:02,840 --> 00:36:06,399
For example, naval nations like Britain
635
00:36:06,400 --> 00:36:08,749
and even the United States
636
00:36:08,750 --> 00:36:10,799
found the use of the submarine horrifying.
637
00:36:10,800 --> 00:36:13,839
In fact, Germany's use of submarines to carry out attacks
638
00:36:13,840 --> 00:36:17,409
was the reason the United States joined World War I,
639
00:36:19,710 --> 00:36:22,470
but move the timeline forward.
640
00:36:22,471 --> 00:36:26,009
The United States of America was suddenly and deliberately
641
00:36:26,010 --> 00:36:29,010
attacked by the Empire of Japan.
642
00:36:29,370 --> 00:36:31,969
Five hours after Pearl Harbor,
643
00:36:31,970 --> 00:36:34,869
came the order for unrestricted
644
00:36:34,870 --> 00:36:36,753
submarine warfare against Japan.
645
00:36:40,870 --> 00:36:43,313
So Arthur Conan Doyle was right.
646
00:36:45,060 --> 00:36:47,799
That's the great old line about science fiction:
647
00:36:47,800 --> 00:36:49,719
it's a lie that tells the truth.
648
00:36:49,720 --> 00:36:52,519
Fellow executives, it gives me great pleasure
649
00:36:52,520 --> 00:36:55,277
to introduce you to the future of law enforcement,
650
00:36:55,278 --> 00:36:56,620
ED-209.
651
00:37:04,550 --> 00:37:06,889
This is not just a matter of science fiction.
652
00:37:06,890 --> 00:37:10,043
This is about what's next, about what's happening now.
653
00:37:14,870 --> 00:37:16,949
The role of intelligent systems
654
00:37:16,950 --> 00:37:19,263
is growing very rapidly in warfare.
655
00:37:20,270 --> 00:37:22,950
Everyone is moving into the unmanned realm.
656
00:37:27,490 --> 00:37:29,939
Today the Secretary of Defense was very clear
657
00:37:29,940 --> 00:37:33,679
that we will not create fully autonomous attack vehicles.
658
00:37:33,680 --> 00:37:37,129
Not everyone is going to stick to the same set of values
659
00:37:37,130 --> 00:37:41,050
and when China and Russia begin to deploy autonomous vehicles
660
00:37:41,700 --> 00:37:43,100
that can attack and kill,
661
00:37:44,250 --> 00:37:46,150
what position do we adopt?
662
00:37:51,040 --> 00:37:53,681
You can't say, well, we'll use autonomous weapons
663
00:37:53,682 --> 00:37:55,379
for our military dominance,
664
00:37:55,380 --> 00:37:57,769
but no one else will use them.
665
00:37:57,770 --> 00:37:59,843
If you make these weapons,
666
00:37:59,844 --> 00:38:01,532
they will be used to attack
667
00:38:01,633 --> 00:38:03,843
human populations in large numbers.
668
00:38:13,600 --> 00:38:15,619
Autonomous weapons are, by their nature,
669
00:38:15,620 --> 00:38:17,849
weapons of mass destruction,
670
00:38:17,850 --> 00:38:19,743
because they do not need a human being
671
00:38:19,744 --> 00:38:21,099
to guide them or carry them.
672
00:38:21,100 --> 00:38:22,819
You only need one person
673
00:38:22,900 --> 00:38:24,903
to write a small program.
674
00:38:26,740 --> 00:38:30,559
It just captures how cool the complexity of this field is.
675
00:38:32,150 --> 00:38:33,449
It's important.
676
00:38:33,450 --> 00:38:35,044
That's incredible.
677
00:38:35,045 --> 00:38:38,545
It's also scary and it's all about trust.
678
00:38:43,000 --> 00:38:45,519
It's an open letter on artificial intelligence
679
00:38:45,520 --> 00:38:48,169
signed by some of the biggest names in science.
680
00:38:48,170 --> 00:38:49,249
What do they want?
681
00:38:49,250 --> 00:38:51,749
To ban the use of autonomous weapons.
682
00:38:51,750 --> 00:38:53,246
The authors said...
683
00:38:53,247 --> 00:38:55,509
"The autonomous weapons are described as
684
00:38:55,510 --> 00:38:57,729
the third revolution in warfare."
685
00:38:57,730 --> 00:38:59,709
A thousand artificial intelligence experts
686
00:38:59,710 --> 00:39:02,613
are calling for a global ban on killer robots.
687
00:39:03,310 --> 00:39:06,039
This open letter basically says that we should redefine
688
00:39:06,040 --> 00:39:08,929
the goal of the field of artificial intelligence,
689
00:39:08,930 --> 00:39:12,579
away from just creating pure, undirected intelligence
690
00:39:12,580 --> 00:39:14,569
toward creating beneficial intelligence.
691
00:39:14,570 --> 00:39:17,009
The development of AI will not stop,
692
00:39:17,010 --> 00:39:19,409
it will continue and it will improve,
693
00:39:19,410 --> 00:39:22,299
if the international community does not put certain controls
694
00:39:22,300 --> 00:39:23,915
on the people who develop these things,
695
00:39:24,016 --> 00:39:25,579
they can do anything.
696
00:39:25,631 --> 00:39:27,899
The letter says we are years, not decades,
697
00:39:27,900 --> 00:39:30,249
away from these weapons being deployed.
698
00:39:30,750 --> 00:39:32,759
We had six thousand signatories to this letter,
699
00:39:32,760 --> 00:39:35,049
including many of the leading figures in the field.
700
00:39:37,970 --> 00:39:40,929
I'm getting a lot of visits from high-level officials
701
00:39:40,930 --> 00:39:42,800
who want to emphasize that the dominance of
702
00:39:42,901 --> 00:39:45,239
the US military is very important
703
00:39:45,240 --> 00:39:47,569
and that autonomous weapons could be part of
704
00:39:47,570 --> 00:39:49,893
the Department of Defense's plan.
705
00:39:51,000 --> 00:39:53,409
This is very scary because the value system
706
00:39:53,410 --> 00:39:55,809
of military technology developers
707
00:39:55,810 --> 00:39:58,160
is not the same as the value system of the human race.
708
00:40:01,750 --> 00:40:04,949
Amid concerns about the possibility that this technology
709
00:40:04,950 --> 00:40:07,209
could be a threat to human existence,
710
00:40:07,210 --> 00:40:10,109
some technologists fund the Future of Life Institute
711
00:40:10,680 --> 00:40:12,680
to try to deal with these problems.
712
00:40:14,060 --> 00:40:15,839
All these guys are quite private,
713
00:40:15,840 --> 00:40:19,129
so it is interesting for me to see them all together.
714
00:40:21,950 --> 00:40:24,699
Everything we have is the result of our intelligence,
715
00:40:24,700 --> 00:40:27,569
not the result of our big scary teeth,
716
00:40:27,570 --> 00:40:30,469
or our large claws or our huge muscles.
717
00:40:30,470 --> 00:40:33,419
It is because we are relatively intelligent
718
00:40:33,420 --> 00:40:36,926
and among my generation, we all have what we call
719
00:40:36,927 --> 00:40:39,979
"holy cow" or "holy something" moments,
720
00:40:39,980 --> 00:40:41,999
because we see that the technology
721
00:40:42,000 --> 00:40:44,593
is accelerating faster than expected.
722
00:40:45,680 --> 00:40:48,009
I remember sitting around the table
723
00:40:48,010 --> 00:40:50,809
with some of the best and brightest minds in the world
724
00:40:50,810 --> 00:40:53,049
and what really impressed me
725
00:40:53,050 --> 00:40:55,339
was that the human brain may not be able
726
00:40:55,340 --> 00:40:58,539
to fully understand the complexity of the world
727
00:40:58,540 --> 00:41:00,499
that we face.
728
00:41:00,700 --> 00:41:02,819
As currently constructed,
729
00:41:02,820 --> 00:41:05,769
the road that AI is following heads off a cliff,
730
00:41:05,770 --> 00:41:07,634
and we need to change the direction we're going
731
00:41:07,635 --> 00:41:10,469
so as not to send the human race off the cliff.
732
00:41:14,630 --> 00:41:16,940
Google acquired DeepMind several years ago.
733
00:41:19,210 --> 00:41:21,250
DeepMind operates as a
734
00:41:21,351 --> 00:41:22,906
semi-independent subsidiary of Google.
735
00:41:23,740 --> 00:41:26,010
What makes DeepMind unique
736
00:41:26,011 --> 00:41:28,121
is that DeepMind is absolutely focused
737
00:41:28,122 --> 00:41:31,169
on creating a digital superintelligence,
738
00:41:31,170 --> 00:41:34,470
an AI that is much smarter than any human on Earth
739
00:41:34,540 --> 00:41:35,789
and, ultimately,
740
00:41:35,790 --> 00:41:38,849
smarter than all humans on Earth combined.
741
00:41:38,850 --> 00:41:41,429
This is DeepMind's reinforcement learning system,
742
00:41:41,430 --> 00:41:43,969
which basically wakes up like a newborn baby
743
00:41:43,970 --> 00:41:46,379
that is shown the screen of an Atari video game
744
00:41:46,380 --> 00:41:49,219
and then has to learn to play the video game.
745
00:41:51,450 --> 00:41:55,450
It knows nothing about objects, about motion, about time;
746
00:41:58,490 --> 00:42:00,549
it only knows that there is an image on the screen
747
00:42:00,550 --> 00:42:01,550
and there is a score.
748
00:42:03,500 --> 00:42:06,239
So if your baby woke up the day it was born
749
00:42:06,640 --> 00:42:09,748
and by late afternoon was already playing
750
00:42:09,749 --> 00:42:13,749
40 different Atari video games at a superhuman level,
751
00:42:14,127 --> 00:42:16,209
you would be terrified.
752
00:42:16,210 --> 00:42:19,113
You'd say, "My baby is possessed, send it back."
753
00:42:20,040 --> 00:42:22,463
The DeepMind system can win at any game.
754
00:42:24,570 --> 00:42:27,323
It can beat all the original Atari games.
755
00:42:28,690 --> 00:42:30,189
It is superhuman,
756
00:42:30,190 --> 00:42:33,240
it plays at super speed, in less than a minute.
757
00:42:38,210 --> 00:42:39,986
DeepMind then turned to another challenge,
758
00:42:39,987 --> 00:42:41,589
and that challenge was the game of Go,
759
00:42:41,590 --> 00:42:44,460
which people generally argued
760
00:42:44,461 --> 00:42:46,609
was beyond the power of computers
761
00:42:46,610 --> 00:42:48,923
to play against the best human players.
762
00:42:49,200 --> 00:42:51,813
First they challenged the European Go champion.
763
00:42:54,110 --> 00:42:56,643
Then they challenged the Korean Go champion.
764
00:42:58,760 --> 00:43:00,959
And they were able to win both times
765
00:43:00,960 --> 00:43:02,688
in rather amazing style.
766
00:43:04,020 --> 00:43:06,379
You'd read articles in The New York Times years ago
767
00:43:06,380 --> 00:43:09,459
talking about how Go would take a hundred years
768
00:43:09,460 --> 00:43:10,669
for us to solve.
769
00:43:10,670 --> 00:43:12,303
People said ...
770
00:43:12,304 --> 00:43:14,200
"Well, but that's just a board.
771
00:43:14,480 --> 00:43:15,939
Poker is an art,
772
00:43:15,940 --> 00:43:17,729
poker involves reading people,
773
00:43:17,730 --> 00:43:19,159
it involves lying, bluffing.
774
00:43:19,160 --> 00:43:21,159
It's not an exact thing, and it will never be possible
775
00:43:21,160 --> 00:43:23,689
for a computer to do this."
776
00:43:23,690 --> 00:43:26,047
They took the world's best poker players
777
00:43:26,048 --> 00:43:28,359
and it took the computer seven days
778
00:43:28,360 --> 00:43:30,493
to begin demolishing the humans.
779
00:43:31,660 --> 00:43:33,639
So, the best poker player in the world,
780
00:43:33,640 --> 00:43:35,189
the best Go player in the world,
781
00:43:35,190 --> 00:43:37,799
and the pattern here is that AI may take a while
782
00:43:37,800 --> 00:43:40,559
to throw its tentacles around a new skill,
783
00:43:41,360 --> 00:43:44,349
but when it happens, when it does,
784
00:43:44,350 --> 00:43:46,007
it is unstoppable.
785
00:43:53,317 --> 00:43:56,139
DeepMind's AI has administrator-level access
786
00:43:56,140 --> 00:43:59,839
to Google's servers to optimize energy use
787
00:43:59,840 --> 00:44:01,709
in data centers.
788
00:44:01,710 --> 00:44:04,910
However, this could be an unintentional Trojan horse.
789
00:44:04,970 --> 00:44:07,706
DeepMind has to have complete control of the data centers,
790
00:44:07,707 --> 00:44:09,859
so, with a small software update,
791
00:44:09,860 --> 00:44:13,088
the AI could gain full control of the entire Google system,
792
00:44:13,170 --> 00:44:15,699
meaning that it could do anything.
793
00:44:15,700 --> 00:44:17,792
It could look at all of your data,
794
00:44:17,793 --> 00:44:19,473
it could do anything.
795
00:44:21,620 --> 00:44:24,259
We are rapidly heading toward digital superintelligence
796
00:44:24,260 --> 00:44:26,279
that will greatly exceed any human being.
797
00:44:26,280 --> 00:44:27,859
I think this is very obvious.
798
00:44:27,860 --> 00:44:29,667
The problem is that we don't suddenly
799
00:44:29,768 --> 00:44:32,319
reach human-level intelligence and say,
800
00:44:32,320 --> 00:44:34,493
"Okay, let's stop searching."
801
00:44:34,540 --> 00:44:36,589
It goes beyond human-level intelligence
802
00:44:36,590 --> 00:44:38,559
to what is called superintelligence,
803
00:44:38,560 --> 00:44:40,360
and that's anything smarter than us.
804
00:44:40,380 --> 00:44:43,159
Superhuman-level AI, if we succeed with it,
805
00:44:43,160 --> 00:44:46,799
would be by far the most powerful invention ever made
806
00:44:46,800 --> 00:44:49,059
and the last invention we ever make,
807
00:44:51,240 --> 00:44:53,969
and if we create AI that is more intelligent than us,
808
00:44:53,970 --> 00:44:56,269
we have to be open to the possibility
809
00:44:56,270 --> 00:44:58,420
we may actually lose control of it.
810
00:45:02,397 --> 00:45:04,695
Let's assume that you have given it a goal,
811
00:45:04,696 --> 00:45:06,693
such as curing cancer,
812
00:45:06,693 --> 00:45:08,859
and then you discover that the way it chose
813
00:45:08,860 --> 00:45:09,925
to handle it
814
00:45:09,926 --> 00:45:12,501
conflicts with many things that you care about.
815
00:45:13,930 --> 00:45:16,410
AI doesn't have to be evil to destroy humanity.
816
00:45:17,580 --> 00:45:18,988
If the AI has a goal
817
00:45:18,989 --> 00:45:21,580
and humanity just happens to be in its way,
818
00:45:21,610 --> 00:45:24,049
it will destroy humanity as a matter of course,
819
00:45:24,050 --> 00:45:25,183
without even thinking about it.
820
00:45:25,184 --> 00:45:26,329
No hard feelings.
821
00:45:26,330 --> 00:45:28,405
It is as if we were building a road
822
00:45:28,406 --> 00:45:30,629
and an anthill is in the way.
823
00:45:30,630 --> 00:45:34,069
We don't hate ants, we're just building a road,
824
00:45:34,070 --> 00:45:35,910
and so, goodbye, anthill.
825
00:45:38,870 --> 00:45:41,619
It is tempting to dismiss these concerns
826
00:45:41,620 --> 00:45:43,546
because they concern something that may happen
827
00:45:43,547 --> 00:45:46,049
decades or a hundred years from now,
828
00:45:46,050 --> 00:45:47,278
so why bother?
829
00:45:48,560 --> 00:45:51,569
But if you go back to September 11, 1933,
830
00:45:51,570 --> 00:45:53,392
Ernest Rutherford, who was the best-known
831
00:45:53,393 --> 00:45:55,049
nuclear physicist of his time,
832
00:45:56,080 --> 00:45:59,399
said that the possibility of ever extracting useful amounts
833
00:45:59,400 --> 00:46:02,199
of energy from the transmutation of atoms, as it was called,
834
00:46:02,200 --> 00:46:03,823
was a fantasy.
835
00:46:04,090 --> 00:46:06,279
The next morning, Leo Szilard,
836
00:46:06,280 --> 00:46:07,987
a much younger physicist,
837
00:46:08,088 --> 00:46:09,646
read this and became very troubled,
838
00:46:09,647 --> 00:46:12,849
and figured out how to make a nuclear chain reaction
839
00:46:12,850 --> 00:46:14,083
just a few months later.
840
00:46:21,490 --> 00:46:24,609
We spent more than two billion dollars on
841
00:46:24,610 --> 00:46:27,563
the greatest scientific gamble in history.
842
00:46:28,710 --> 00:46:30,169
So when people say,
843
00:46:30,170 --> 00:46:31,859
"Oh, that is so far in the future,
844
00:46:31,860 --> 00:46:33,718
we need not worry about it, "
845
00:46:33,719 --> 00:46:36,759
we may be only three or four breakthroughs of this magnitude
846
00:46:36,760 --> 00:46:38,459
away from getting from where we are
847
00:46:38,460 --> 00:46:40,051
to superintelligent machines.
848
00:46:41,260 --> 00:46:44,459
If it takes 20 years to figure out how to keep AI beneficial,
849
00:46:44,460 --> 00:46:47,183
then we must start today,
850
00:46:47,184 --> 00:46:50,984
not at the last second when some guys drinking Red Bull
851
00:46:51,180 --> 00:46:54,719
decide to flip the switch and test the thing.
852
00:46:57,740 --> 00:46:59,989
We have five years, I think,
853
00:46:59,990 --> 00:47:02,459
before digital superintelligence happens.
854
00:47:02,460 --> 00:47:05,733
Within my lifetime, one hundred percent.
855
00:47:07,360 --> 00:47:09,409
When this happens, it will be surrounded
856
00:47:09,410 --> 00:47:13,349
by a lot of people really excited about this technology.
857
00:47:14,090 --> 00:47:16,019
They will want to see it succeed,
858
00:47:16,020 --> 00:47:18,829
but they won't tell you that it can get out of control.
859
00:47:26,732 --> 00:47:29,509
Oh my God, I trust my computer so much.
860
00:47:29,510 --> 00:47:31,199
This is an amazing question.
861
00:47:31,200 --> 00:47:33,759
I don't trust my computer when it's on.
862
00:47:33,760 --> 00:47:34,849
I turn it off.
863
00:47:34,850 --> 00:47:37,582
Even with it off, I still think it's on;
864
00:47:37,583 --> 00:47:40,189
it's like with webcams, you know?
865
00:47:40,190 --> 00:47:42,029
You never know if someone could turn it on.
866
00:47:43,199 --> 00:47:44,569
I don't trust my computer,
867
00:47:44,570 --> 00:47:47,002
or my phone either;
868
00:47:47,003 --> 00:47:49,411
they're always asking me to send information
869
00:47:49,412 --> 00:47:50,803
to Apple, so...
870
00:47:52,550 --> 00:47:53,942
I don't trust my phone.
871
00:47:53,943 --> 00:47:56,739
Okay, so part of it is, yes,
872
00:47:56,740 --> 00:47:59,949
I trust it, because it would actually be very hard to get through the day,
873
00:47:59,950 --> 00:48:03,006
the way our world is set up, without computers.
874
00:48:11,840 --> 00:48:13,883
Trust is a very human experience.
875
00:48:22,240 --> 00:48:25,913
I have a patient with an intracranial aneurysm.
876
00:48:31,260 --> 00:48:33,280
They want to look into my eyes and know
877
00:48:33,281 --> 00:48:35,919
that they can trust that person with their life.
878
00:48:35,920 --> 00:48:39,920
I'm not terribly worried about anything.
879
00:48:41,270 --> 00:48:43,720
Part of that is because I have confidence in you.
880
00:48:51,660 --> 00:48:53,439
The procedure we're going to do today
881
00:48:53,440 --> 00:48:56,743
was essentially impossible 20 years ago.
882
00:48:58,100 --> 00:49:00,873
We just did not have the materials or technologies.
883
00:49:08,350 --> 00:49:10,850
Let's go to that corner!
884
00:49:16,550 --> 00:49:19,050
Could it be any more difficult? My God!
885
00:49:23,550 --> 00:49:26,573
The coil is almost there.
886
00:49:27,440 --> 00:49:29,503
It is just barely holding on.
887
00:49:30,920 --> 00:49:31,923
These are nervous times.
888
00:49:37,350 --> 00:49:38,619
We are in purgatory.
889
00:49:38,620 --> 00:49:41,849
The humanist's intellectual purgatory.
890
00:49:41,850 --> 00:49:44,032
An AI would know exactly what to do here.
891
00:49:51,540 --> 00:49:53,459
We have a coil in the aneurysm
892
00:49:53,460 --> 00:49:55,689
but it was not seated tremendously well,
893
00:49:55,690 --> 00:49:57,859
as I knew it should be,
894
00:49:57,860 --> 00:50:01,860
and a risk of perhaps 20% is a very bad situation,
895
00:50:02,030 --> 00:50:03,733
so I chose to bring it back.
896
00:50:05,380 --> 00:50:07,249
Because of my relationship with her
897
00:50:07,250 --> 00:50:09,939
and knowing the difficulties of going in and doing the procedure,
898
00:50:12,050 --> 00:50:14,732
I only considered the safest possible way
899
00:50:15,429 --> 00:50:17,319
to succeed,
900
00:50:17,320 --> 00:50:20,729
and I stayed there for 10 minutes agonizing over it.
901
00:50:20,730 --> 00:50:22,719
The computer does not feel anything.
902
00:50:22,720 --> 00:50:25,719
The computer just does what it should do,
903
00:50:25,720 --> 00:50:27,220
supposedly better and better.
904
00:50:31,250 --> 00:50:33,000
I want to have the AI in this case,
905
00:50:36,840 --> 00:50:39,673
but can the AI have compassion?
906
00:50:43,980 --> 00:50:46,933
It is the question that everyone asks about AI.
907
00:50:48,770 --> 00:50:52,150
We are the only personification of humanity
908
00:50:52,990 --> 00:50:55,818
and it is a remote possibility that we would accept that a machine
909
00:50:55,819 --> 00:50:59,527
can be compassionate and loving in that way.
910
00:51:06,090 --> 00:51:08,219
Part of me does not believe in magic
911
00:51:08,220 --> 00:51:10,899
but part of me has faith that there is something
912
00:51:10,900 --> 00:51:13,619
beyond the sum of the parts, even if only
913
00:51:13,620 --> 00:51:16,609
a unity in our shared ancestry,
914
00:51:16,610 --> 00:51:19,247
our shared biology, our shared history.
915
00:51:21,310 --> 00:51:23,690
Some connection beyond the machine.
916
00:51:31,432 --> 00:51:33,439
Then you have the other side of it, which is ...
917
00:51:33,440 --> 00:51:35,219
does the computer know that it is aware,
918
00:51:35,220 --> 00:51:37,373
or can it be conscious, or care?
919
00:51:38,070 --> 00:51:39,683
Does it need to be conscious?
920
00:51:40,930 --> 00:51:42,203
Does it need to be aware?
921
00:51:54,240 --> 00:51:57,449
I do not think a robot could be conscious.
922
00:51:57,450 --> 00:51:59,679
Unless they program it that way.
923
00:51:59,680 --> 00:52:01,719
Aware, no.
924
00:52:01,720 --> 00:52:02,720
No, no.
925
00:52:03,623 --> 00:52:07,029
I think a robot could be programmed to be aware.
926
00:52:07,064 --> 00:52:10,103
Just as they program everything else.
927
00:52:10,760 --> 00:52:13,289
This is another big part of their intelligence,
928
00:52:13,290 --> 00:52:16,313
making them aware and making them feel.
929
00:52:23,380 --> 00:52:27,159
Back in 2005 we started trying to build machines
930
00:52:27,160 --> 00:52:28,323
with self-awareness.
931
00:52:33,940 --> 00:52:36,573
This robot at first did not know what it was,
932
00:52:38,240 --> 00:52:41,040
all it knew was that it needed to do something, like walking.
933
00:52:45,460 --> 00:52:46,874
Through trial and error,
934
00:52:46,975 --> 00:52:50,060
it learned how to walk using its imagination,
935
00:52:50,110 --> 00:52:51,680
and then it walked away.
936
00:52:54,960 --> 00:52:57,299
And then we did something very cruel.
937
00:52:57,300 --> 00:53:00,463
We cut off one of its legs and watched what happened.
938
00:53:04,154 --> 00:53:08,154
At first it was not sure what had happened,
939
00:53:08,750 --> 00:53:10,546
but after about a day
940
00:53:10,647 --> 00:53:12,330
it then began to limp,
941
00:53:14,200 --> 00:53:15,919
and then, a year ago,
942
00:53:15,920 --> 00:53:19,733
we were training an AI system for a live demonstration.
943
00:53:21,200 --> 00:53:24,279
We wanted to show all kinds of objects in front of the camera
944
00:53:24,280 --> 00:53:27,053
and how AI could recognize these objects.
945
00:53:28,290 --> 00:53:30,699
And so we were preparing this demo
946
00:53:30,700 --> 00:53:33,349
and we had a shared screen where we could see
947
00:53:33,350 --> 00:53:37,350
which neurons were responding,
948
00:53:37,890 --> 00:53:39,929
and we suddenly noticed that one of the neurons
949
00:53:39,930 --> 00:53:41,999
was tracking faces.
950
00:53:42,000 --> 00:53:44,873
It was tracking our faces as we moved.
951
00:53:46,470 --> 00:53:49,619
Now, here is the scary thing: we had never trained
952
00:53:49,620 --> 00:53:52,340
the system to recognize human faces
953
00:53:53,470 --> 00:53:56,233
and yet, somehow, it learned to do that.
954
00:53:58,830 --> 00:54:00,759
Even though these robots are very simple,
955
00:54:00,760 --> 00:54:03,369
we can see that there is something else going on there.
956
00:54:03,370 --> 00:54:04,563
It is not just the program.
957
00:54:06,810 --> 00:54:08,310
So that's just the beginning.
958
00:54:11,320 --> 00:54:14,390
I often think about that beach in Kitty Hawk,
959
00:54:15,240 --> 00:54:18,790
the 1903 flight of Orville and Wilbur Wright,
960
00:54:22,300 --> 00:54:25,179
as their kind of contraption of canvas, wood and iron
961
00:54:25,180 --> 00:54:27,449
to get off the ground for 1 minute and 20 seconds,
962
00:54:27,450 --> 00:54:30,950
on a windy day before heading back down again,
963
00:54:34,160 --> 00:54:35,779
and about 65 summers
964
00:54:38,680 --> 00:54:40,460
after that time
965
00:54:40,570 --> 00:54:43,543
you have a 747 taking off from JFK.
966
00:54:51,320 --> 00:54:54,069
The biggest concern of someone on the plane might be
967
00:54:54,070 --> 00:54:56,419
whether their low-salt diet meal
968
00:54:56,420 --> 00:54:58,299
will be served or not,
969
00:54:58,300 --> 00:55:00,699
with a whole infrastructure of travel agents
970
00:55:00,700 --> 00:55:02,816
and control towers, and it's all casual,
971
00:55:02,817 --> 00:55:04,367
it's all part of the world.
972
00:55:07,980 --> 00:55:10,549
Well now, we have machines that think
973
00:55:10,550 --> 00:55:13,349
and solve problems, we are back to Kitty Hawk.
974
00:55:13,350 --> 00:55:17,329
We are in the wind, we have our tattered canvas planes in the air,
975
00:55:21,830 --> 00:55:24,839
but what will happen 65 or so summers from now,
976
00:55:24,840 --> 00:55:28,440
when we have machines that are beyond human control?
977
00:55:28,820 --> 00:55:30,184
Should we worry about it?
978
00:55:33,580 --> 00:55:35,383
I am not sure worrying will help.
979
00:55:41,180 --> 00:55:44,631
No one has any idea now what it means for a robot
980
00:55:44,632 --> 00:55:46,123
to be aware.
981
00:55:47,350 --> 00:55:48,500
There is no such thing.
982
00:55:49,780 --> 00:55:51,516
There are a lot of smart people
983
00:55:51,517 --> 00:55:53,999
and I have great respect for them,
984
00:55:54,000 --> 00:55:58,000
but the truth is that the machines are natural psychopaths.
985
00:55:58,760 --> 00:55:59,966
Fear returned to the market
986
00:55:59,967 --> 00:56:02,859
and it was down 800, nearly 1,000, in the blink of an eye.
987
00:56:02,860 --> 00:56:04,439
It is classic capitulation.
988
00:56:04,440 --> 00:56:06,539
There are some people who think
989
00:56:06,540 --> 00:56:08,569
that it was some sort of fat-finger error.
990
00:56:08,570 --> 00:56:10,579
Take the Flash Crash of 2010
991
00:56:10,680 --> 00:56:13,749
where in a matter of minutes a trillion dollars
992
00:56:14,450 --> 00:56:16,359
was lost in the stock market.
993
00:56:16,360 --> 00:56:19,179
The Dow Jones fell nearly a thousand points in half an hour.
994
00:56:19,920 --> 00:56:21,583
So what went wrong?
995
00:56:23,490 --> 00:56:26,949
At that time more than 60% of all trades
996
00:56:27,850 --> 00:56:30,849
carried out on the stock exchange
997
00:56:30,863 --> 00:56:33,923
were actually being executed by computers.
998
00:56:38,170 --> 00:56:40,899
A brief history of what happened in the Flash Crash
999
00:56:40,900 --> 00:56:43,259
is that algorithms responded to algorithms,
1000
00:56:43,260 --> 00:56:45,589
and they piled onto each other, faster and faster,
1001
00:56:45,590 --> 00:56:47,168
and all in a matter of minutes.
1002
00:56:47,369 --> 00:56:50,769
At one point, the market just fell as if down a well.
1003
00:56:51,920 --> 00:56:54,999
There is no governing body that can adapt quickly enough
1004
00:56:55,000 --> 00:56:58,143
to avoid the potentially disastrous consequences
1005
00:56:58,150 --> 00:57:00,950
of AI operating in our financial system.
1006
00:57:02,180 --> 00:57:04,879
They are just too fast to handle.
1007
00:57:04,880 --> 00:57:06,769
Let's talk about the speed with which
1008
00:57:06,770 --> 00:57:09,089
we saw this market deteriorate.
1009
00:57:09,090 --> 00:57:12,529
That's the kind of AI that drives people crazy.
1010
00:57:12,530 --> 00:57:14,509
When you give them a goal,
1011
00:57:14,510 --> 00:57:16,763
they relentlessly pursue that goal.
1012
00:57:18,810 --> 00:57:21,269
How many computer programs are like this?
1013
00:57:21,270 --> 00:57:22,393
Nobody knows.
1014
00:57:24,460 --> 00:57:28,460
One of the fascinating aspects of AI in general
1015
00:57:28,570 --> 00:57:32,123
is that no one really understands how it works.
1016
00:57:32,990 --> 00:57:36,413
Even people who create AI do not fully understand.
1017
00:57:37,940 --> 00:57:40,749
Because it has millions of elements,
1018
00:57:40,750 --> 00:57:43,839
it becomes quite impossible for a human being
1019
00:57:43,840 --> 00:57:46,473
to understand what is happening.
1020
00:57:53,460 --> 00:57:57,009
Microsoft has created this artificial intelligence
1021
00:57:57,010 --> 00:57:59,703
called Tay, a chatbot on Twitter.
1022
00:58:02,320 --> 00:58:04,119
They started it up in the morning,
1023
00:58:04,120 --> 00:58:07,449
and Tay was beginning to tweet and learn things
1024
00:58:07,450 --> 00:58:11,060
that were being sent to it by other people on Twitter.
1025
00:58:11,850 --> 00:58:14,259
Because some people made offensive attacks,
1026
00:58:14,260 --> 00:58:16,809
within 24 hours Microsoft's robot
1027
00:58:16,810 --> 00:58:18,633
had become a terrible person.
1028
00:58:19,570 --> 00:58:22,309
They literally had to take Tay off the network
1029
00:58:22,310 --> 00:58:24,203
because it had turned into a monster.
1030
00:58:25,670 --> 00:58:29,170
A misanthropic, racist, horrible person
1031
00:58:29,230 --> 00:58:31,253
you would never want to meet,
1032
00:58:31,254 --> 00:58:32,813
and no one predicted this.
1033
00:58:36,270 --> 00:58:38,479
The whole idea of AI is that we are not telling
1034
00:58:38,480 --> 00:58:41,530
it exactly how to achieve a particular result
1035
00:58:41,531 --> 00:58:42,980
or a goal.
1036
00:58:43,660 --> 00:58:45,373
AI develops on its own.
1037
00:58:47,370 --> 00:58:49,969
We are concerned with the super intelligent AI,
1038
00:58:49,970 --> 00:58:52,500
the chess master that will outplay us,
1039
00:58:53,700 --> 00:58:56,929
but an AI will not have to be all that smart
1040
00:58:56,930 --> 00:58:59,087
to have massively disruptive effects
1041
00:58:59,088 --> 00:59:00,730
on human civilization.
1042
00:59:01,440 --> 00:59:02,909
We saw in the last century,
1043
00:59:02,910 --> 00:59:05,185
someone does not have to be brilliant
1044
00:59:05,186 --> 00:59:07,529
to tilt history in a particular direction,
1045
00:59:07,530 --> 00:59:09,977
and you do not need a brilliant artificial intelligence
1046
00:59:09,978 --> 00:59:11,289
to do the same thing.
1047
00:59:11,790 --> 00:59:14,669
Fake election news stories generated more
1048
00:59:14,670 --> 00:59:18,270
engagement on Facebook than the top real stories.
1049
00:59:18,420 --> 00:59:20,720
Facebook is really the elephant in the room.
1050
00:59:22,020 --> 00:59:25,020
An AI runs the Facebook news feed.
1051
00:59:25,110 --> 00:59:28,209
The AI's task is to keep users engaged,
1052
00:59:29,310 --> 00:59:32,136
but no one really understands exactly
1053
00:59:32,137 --> 00:59:34,603
how this AI is achieving that goal.
1054
00:59:35,820 --> 00:59:38,608
Facebook is building an elegant mirrored wall
1055
00:59:38,609 --> 00:59:39,769
around us.
1056
00:59:39,770 --> 00:59:42,918
A mirror we can ask who is the fairest of them all,
1057
00:59:42,919 --> 00:59:45,283
and it will answer "you, you" again and again.
1058
00:59:46,271 --> 00:59:49,119
It slowly begins to distort our sense of reality,
1059
00:59:49,120 --> 00:59:51,433
to distort our sense of politics,
1060
00:59:51,534 --> 00:59:53,221
history, global events,
1061
00:59:54,410 --> 00:59:57,949
until determining what is true and what is not
1062
00:59:57,950 --> 00:59:59,403
is virtually impossible.
1063
01:00:02,010 --> 01:00:04,869
The problem is that the AI does not understand this.
1064
01:00:04,870 --> 01:00:07,053
The AI had only one task:
1065
01:00:07,154 --> 01:00:09,030
maximize user engagement,
1066
01:00:09,031 --> 01:00:10,929
and it achieved it.
1067
01:00:11,030 --> 01:00:13,449
Almost two billion people
1068
01:00:13,450 --> 01:00:16,189
spend on average nearly an hour a day
1069
01:00:16,190 --> 01:00:18,205
basically interacting with AI
1070
01:00:18,206 --> 01:00:20,303
that is shaping their experience.
1071
01:00:22,170 --> 01:00:25,749
Even the engineers at Facebook do not like fake news.
1072
01:00:25,750 --> 01:00:27,019
It is very bad for business.
1073
01:00:27,020 --> 01:00:28,939
They want to get rid of fake news.
1074
01:00:28,940 --> 01:00:31,279
It is very difficult to do, because how do you recognize
1075
01:00:31,280 --> 01:00:34,069
whether news is fake if you cannot read
1076
01:00:34,070 --> 01:00:35,719
all of this news yourself?
1077
01:00:35,720 --> 01:00:39,720
There is so much active misinformation,
1078
01:00:40,700 --> 01:00:42,049
and it is packaged very well,
1079
01:00:42,050 --> 01:00:44,529
and it looks the same when you see it on
1080
01:00:44,530 --> 01:00:47,929
a Facebook page or you turn on your television.
1081
01:00:48,330 --> 01:00:50,509
It is not terribly sophisticated,
1082
01:00:50,510 --> 01:00:52,629
but it is terribly powerful.
1083
01:00:52,630 --> 01:00:55,239
And what that means is that your view of the world,
1084
01:00:55,240 --> 01:00:57,699
which 20 years ago was determined by watching
1085
01:00:57,700 --> 01:01:00,609
the evening news on three different television networks.
1086
01:01:00,610 --> 01:01:02,976
The three anchors strove to get it right.
1087
01:01:02,977 --> 01:01:05,419
They might have had a small bias one time or another,
1088
01:01:05,420 --> 01:01:06,750
but roughly speaking,
1089
01:01:06,751 --> 01:01:09,024
they all agreed on an objective reality.
1090
01:01:09,092 --> 01:01:11,200
That objectivity is gone,
1091
01:01:11,201 --> 01:01:13,283
and Facebook has completely annihilated it.
1092
01:01:17,900 --> 01:01:20,639
If most of your understanding of how the world works
1093
01:01:20,640 --> 01:01:22,019
comes from Facebook,
1094
01:01:22,020 --> 01:01:24,665
facilitated by algorithmic software that attempts to show
1095
01:01:24,666 --> 01:01:27,373
the news you want to see,
1096
01:01:28,370 --> 01:01:30,226
this is a terribly dangerous thing
1097
01:01:30,227 --> 01:01:33,490
and the idea that we have not just set it in motion
1098
01:01:34,060 --> 01:01:37,739
but allowed bad-faith actors access to this information,
1099
01:01:38,190 --> 01:01:40,433
this is a recipe for disaster.
1100
01:01:44,340 --> 01:01:47,399
I think we will definitely have many bad actors
1101
01:01:47,400 --> 01:01:49,869
trying to manipulate the world with AI.
1102
01:01:49,870 --> 01:01:53,159
2016 was a perfect example of an election
1103
01:01:53,160 --> 01:01:55,509
where there was a great deal of AI-generated content,
1104
01:01:55,510 --> 01:01:58,141
many fake reports, distributed for a purpose,
1105
01:01:58,142 --> 01:01:59,883
for a result.
1106
01:02:00,800 --> 01:02:03,199
Ladies and gentlemen, honored colleagues,
1107
01:02:03,200 --> 01:02:05,469
I have the privilege to speak to you today
1108
01:02:05,470 --> 01:02:09,409
about the power of Big Data and psychographics in the electoral process
1109
01:02:10,790 --> 01:02:13,269
and specifically to talk about work
1110
01:02:13,270 --> 01:02:16,439
that we contributed to Senator Cruz's presidential campaign.
1111
01:02:17,470 --> 01:02:20,889
Cambridge Analytica emerged quietly as a company
1112
01:02:20,890 --> 01:02:22,911
that according to its own description
1113
01:02:22,912 --> 01:02:26,510
has the ability to use this huge amount of data
1114
01:02:27,230 --> 01:02:30,174
in order to effect social change.
1115
01:02:30,575 --> 01:02:34,529
In 2016 they had three major clients,
1116
01:02:34,530 --> 01:02:36,109
Ted Cruz was one of them.
1117
01:02:36,110 --> 01:02:38,809
It is easy to forget that just 18 months ago,
1118
01:02:38,810 --> 01:02:42,139
Senator Cruz was one of the least popular candidates
1119
01:02:42,140 --> 01:02:43,829
seeking the nomination.
1120
01:02:43,830 --> 01:02:47,429
In a way that was not possible maybe 10 or 15 years ago,
1121
01:02:48,220 --> 01:02:51,332
you can send false information just to the people
1122
01:02:51,333 --> 01:02:53,409
you want to send it to,
1123
01:02:53,410 --> 01:02:57,139
and then see how they react on Facebook,
1124
01:02:57,450 --> 01:02:59,190
and then adjust this information
1125
01:02:59,291 --> 01:03:02,109
according to the feedback you received
1126
01:03:03,110 --> 01:03:05,359
and thus be able to start developing something like
1127
01:03:05,360 --> 01:03:07,719
a real-time management of a population.
1128
01:03:07,720 --> 01:03:11,519
In this case, we separated out a group we call "persuasion."
1129
01:03:11,610 --> 01:03:15,609
These are people who are definitely going to turn out and vote,
1130
01:03:15,610 --> 01:03:17,290
but they need to move from the center
1131
01:03:17,291 --> 01:03:19,389
a little more to the right
1132
01:03:19,390 --> 01:03:20,679
in order to support Cruz.
1133
01:03:20,680 --> 01:03:23,029
They need a persuasive message.
1134
01:03:23,030 --> 01:03:26,229
I have selected gun rights to narrow the field a bit more,
1135
01:03:26,230 --> 01:03:29,804
and now we know we need a message about gun rights,
1136
01:03:29,806 --> 01:03:31,805
it needs to be a persuasion message,
1137
01:03:32,020 --> 01:03:34,879
and it needs to be nuanced according to the certain personality
1138
01:03:34,880 --> 01:03:36,869
we are interested in.
1139
01:03:36,870 --> 01:03:38,279
Through social media,
1140
01:03:38,280 --> 01:03:40,929
there is an infinite amount of information
1141
01:03:40,930 --> 01:03:43,469
you can gather about a person.
1142
01:03:43,470 --> 01:03:46,667
We have somewhere around 4 or 5 thousand data points
1143
01:03:46,750 --> 01:03:49,499
on every adult in the United States.
1144
01:03:49,500 --> 01:03:51,910
It is targeting the individual.
1145
01:03:52,900 --> 01:03:55,429
It's like a weapon that can be used
1146
01:03:55,430 --> 01:03:56,919
in a totally wrong direction.
1147
01:03:56,920 --> 01:03:58,989
That's the problem with all this data.
1148
01:03:58,990 --> 01:04:00,908
It's almost like we built the bullet
1149
01:04:00,909 --> 01:04:03,119
before we built the gun.
1150
01:04:03,120 --> 01:04:07,120
Ted Cruz employed our data, our behavioral insights.
1151
01:04:07,310 --> 01:04:10,513
He started from a base of less than 5%
1152
01:04:10,514 --> 01:04:14,514
and we saw a slow but firm and steady increase to above 35%,
1153
01:04:15,567 --> 01:04:19,151
making him the second most threatening candidate in the race.
1154
01:04:21,380 --> 01:04:23,857
Now, clearly, the Cruz campaign is over,
1155
01:04:23,858 --> 01:04:25,789
but what I can say is,
1156
01:04:25,990 --> 01:04:29,029
of the two candidates left in this election,
1157
01:04:29,130 --> 01:04:31,663
one of them is using these technologies.
1158
01:04:33,976 --> 01:04:36,909
I, Donald John Trump, do solemnly swear
1159
01:04:36,910 --> 01:04:39,699
that I will faithfully execute the office
1160
01:04:39,700 --> 01:04:42,681
of President of the United States.
1161
01:04:49,430 --> 01:04:51,518
Elections are decided at the margins.
1162
01:04:51,519 --> 01:04:54,419
It does not take a very sophisticated AI
1163
01:04:54,642 --> 01:04:57,670
to have a disproportionate impact.
1164
01:04:59,960 --> 01:05:03,479
Before Trump, Brexit was another alleged client.
1165
01:05:03,480 --> 01:05:06,669
Well, at 20 minutes to five, we can now say
1166
01:05:06,670 --> 01:05:10,068
that the decision taken in 1975 by this country
1167
01:05:10,069 --> 01:05:12,824
to join the Common Market has been reversed
1168
01:05:12,825 --> 01:05:16,825
by this referendum to leave the EU.
1169
01:05:18,140 --> 01:05:20,779
Cambridge Analytica allegedly used AI
1170
01:05:20,780 --> 01:05:23,009
to fuel two of the biggest tremors
1171
01:05:23,010 --> 01:05:26,510
of political change in the last 50 years.
1172
01:05:28,370 --> 01:05:31,609
These are epic events and, if you believe the hype,
1173
01:05:31,610 --> 01:05:34,139
they are directly connected to software
1174
01:05:34,140 --> 01:05:37,213
created by a professor at Stanford.
1175
01:05:42,330 --> 01:05:45,609
In 2013 I described that what they were doing was possible
1176
01:05:45,610 --> 01:05:49,173
and warned against this happening in the future.
1177
01:05:50,280 --> 01:05:53,789
Michal Kosinski at the time was a young Polish researcher
1178
01:05:53,790 --> 01:05:55,719
working at the Psychometrics Centre.
1179
01:05:55,890 --> 01:05:59,890
What Michal did was to bring together the largest data set ever assembled
1180
01:06:00,257 --> 01:06:03,329
on how people behave on Facebook.
1181
01:06:04,470 --> 01:06:07,522
Psychometrics attempts to measure the psychological traits
1182
01:06:09,223 --> 01:06:12,227
such as personality, intelligence, political views
1183
01:06:12,228 --> 01:06:13,419
and so on.
1184
01:06:13,420 --> 01:06:16,019
Traditionally, these characteristics were measured
1185
01:06:16,020 --> 01:06:17,903
using tests and questionnaires.
1186
01:06:18,240 --> 01:06:19,511
Personality tests,
1187
01:06:19,512 --> 01:06:21,549
the most benign thing you can think of.
1188
01:06:21,580 --> 01:06:24,659
Something that does not necessarily have much use, right?
1189
01:06:25,120 --> 01:06:27,909
Our idea was that instead of testing and questioning,
1190
01:06:27,910 --> 01:06:30,369
we could just look at the digital traces
1191
01:06:30,370 --> 01:06:32,570
of behavior that we are all leaving behind
1192
01:06:33,460 --> 01:06:35,545
to understand openness,
1193
01:06:35,646 --> 01:06:37,560
conscientiousness, neuroticism.
1194
01:06:38,700 --> 01:06:41,089
You can easily buy personal data,
1195
01:06:41,090 --> 01:06:44,245
like where you live, what associations you joined,
1196
01:06:44,250 --> 01:06:45,817
which gym you go to.
1197
01:06:45,818 --> 01:06:48,822
In fact there is a market for personal data.
1198
01:06:48,823 --> 01:06:51,869
It turns out that we can find out a lot about what you will do
1199
01:06:51,870 --> 01:06:54,929
based on a very small set of information.
1200
01:06:56,840 --> 01:06:59,689
We are training deep learning networks
1201
01:06:59,690 --> 01:07:01,719
to deduce intimate traits.
1202
01:07:02,220 --> 01:07:05,597
Political opinions, personality, intelligence
1203
01:07:06,790 --> 01:07:08,709
and sexual orientation,
1204
01:07:08,710 --> 01:07:10,830
all from an image of someone's face.
1205
01:07:17,930 --> 01:07:21,369
Now think of countries that are not as free and open-minded.
1206
01:07:21,550 --> 01:07:23,974
If you can reveal the religious views
1207
01:07:23,975 --> 01:07:27,001
or political views or sexual orientation of the people
1208
01:07:27,096 --> 01:07:29,719
just based on profile pictures,
1209
01:07:29,720 --> 01:07:33,720
it can literally be a matter of life or death.
1210
01:07:37,900 --> 01:07:39,537
I think there's no turning back.
1211
01:07:43,010 --> 01:07:44,910
Do you know what the Turing test is?
1212
01:07:46,460 --> 01:07:49,889
It is when a human interacts with a computer,
1213
01:07:49,890 --> 01:07:53,469
and if the human does not know they are interacting with a computer,
1214
01:07:53,470 --> 01:07:55,293
the test is passed.
1215
01:07:55,680 --> 01:07:57,120
And in the coming days
1216
01:07:58,210 --> 01:08:00,649
you will be the human component in the Turing test.
1217
01:08:00,670 --> 01:08:01,899
Damn it.
1218
01:08:01,900 --> 01:08:03,963
That's right, Caleb, you got it.
1219
01:08:05,010 --> 01:08:07,110
Because if that test is passed,
1220
01:08:08,472 --> 01:08:11,799
you will be right at the center of the greatest scientific event
1221
01:08:11,800 --> 01:08:13,379
in human history.
1222
01:08:13,880 --> 01:08:15,559
If you created a conscious machine,
1223
01:08:15,560 --> 01:08:17,113
it is not the history of man.
1224
01:08:18,560 --> 01:08:20,023
It is the history of gods.
1225
01:08:27,810 --> 01:08:30,610
It's almost as if technology were a god itself.
1226
01:08:34,150 --> 01:08:35,647
Like the weather,
1227
01:08:35,648 --> 01:08:37,079
we cannot impact it,
1228
01:08:37,080 --> 01:08:38,289
we cannot delay it,
1229
01:08:38,290 --> 01:08:39,290
we cannot stop it.
1230
01:08:40,550 --> 01:08:42,593
We feel powerless.
1231
01:08:44,670 --> 01:08:47,909
If we think of God, it is an unlimited amount of intelligence.
1232
01:08:47,981 --> 01:08:51,544
The closest we can get is to evolve our own intelligence,
1233
01:08:51,545 --> 01:08:55,217
fusing it with the artificial intelligence we are creating.
1234
01:08:56,510 --> 01:08:59,769
Today our computers, mobile phones, applications,
1235
01:08:59,770 --> 01:09:01,453
give us superhuman abilities.
1236
01:09:02,580 --> 01:09:03,961
As the old maxim says,
1237
01:09:03,962 --> 01:09:05,942
"If you can't beat them, join them."
1238
01:09:07,830 --> 01:09:10,889
It's about a man-machine partnership.
1239
01:09:10,890 --> 01:09:13,569
We already see it with our cell phones, for example.
1240
01:09:14,270 --> 01:09:16,279
It acts as a memory prosthesis, right?
1241
01:09:16,280 --> 01:09:18,959
I no longer have to remember your phone number
1242
01:09:18,960 --> 01:09:20,689
because it is on my cell phone.
1243
01:09:20,690 --> 01:09:23,289
It's about machines that increase our human abilities
1244
01:09:23,290 --> 01:09:25,590
as opposed to replacing them altogether.
1245
01:09:26,210 --> 01:09:28,399
If you look at all the objects
1246
01:09:28,400 --> 01:09:31,479
that have made the jump from analog to digital in the last 20 years,
1247
01:09:31,480 --> 01:09:32,480
it is a lot.
1248
01:09:33,010 --> 01:09:36,328
We are the last analog object in a digital universe,
1249
01:09:36,329 --> 01:09:37,769
and the problem with this, of course,
1250
01:09:37,770 --> 01:09:41,489
is that our input and output are very limited.
1251
01:09:41,490 --> 01:09:42,933
I mean, it is these.
1252
01:09:44,930 --> 01:09:46,230
Our eyes are very good.
1253
01:09:46,231 --> 01:09:48,730
We are able to absorb a lot of visual information.
1254
01:09:49,627 --> 01:09:53,479
But our information output is very low.
1255
01:09:53,480 --> 01:09:56,326
The reason this is important is if we imagine a scenario
1256
01:09:56,327 --> 01:09:59,718
where AI plays a more prominent role in society.
1257
01:09:59,719 --> 01:10:02,959
We want good ways to interact with this technology
1258
01:10:02,960 --> 01:10:05,513
so that it ends up improving us.
1259
01:10:09,726 --> 01:10:12,359
I think it's incredibly important that AI
1260
01:10:12,360 --> 01:10:14,883
not be an "other"; it should be us.
1261
01:10:15,810 --> 01:10:18,110
I may be wrong about what I'm saying,
1262
01:10:19,370 --> 01:10:23,299
and I'm open to ideas if anyone can suggest a better way,
1263
01:10:25,360 --> 01:10:27,839
but I think we really have to merge with AI
1264
01:10:27,840 --> 01:10:29,586
or we will be left behind.
1265
01:10:37,330 --> 01:10:39,709
It is hard to think of disconnecting a system
1266
01:10:39,710 --> 01:10:42,549
that is distributed across the planet.
1267
01:10:42,550 --> 01:10:45,558
It is now distributed throughout the solar system.
1268
01:10:46,590 --> 01:10:48,590
You cannot turn it off.
1269
01:10:50,610 --> 01:10:51,959
We have opened Pandora's box.
1270
01:10:51,960 --> 01:10:54,281
We have unleashed forces we cannot control,
1271
01:10:54,282 --> 01:10:55,963
we cannot stop.
1272
01:10:56,630 --> 01:11:00,089
We are in the middle of creating a new form of life on Earth.
1273
01:11:06,960 --> 01:11:08,819
We do not know what will happen next,
1274
01:11:08,820 --> 01:11:11,399
we do not know what the intellect of a machine will be like
1275
01:11:11,400 --> 01:11:15,339
when that intellect is far beyond human capabilities.
1276
01:11:15,490 --> 01:11:17,373
It is just not something that is possible to know.
1277
01:11:25,290 --> 01:11:27,629
The least frightening future I can imagine
1278
01:11:27,630 --> 01:11:30,080
is one in which we have at least democratized AI,
1279
01:11:32,480 --> 01:11:34,839
because if one company or a small group of people
1280
01:11:34,840 --> 01:11:38,199
manages to develop a godlike digital superintelligence,
1281
01:11:38,216 --> 01:11:39,764
they could rule the world.
1282
01:11:41,240 --> 01:11:43,359
At least when there is an evil dictator,
1283
01:11:43,360 --> 01:11:45,299
that human will die,
1284
01:11:45,300 --> 01:11:47,929
but for an AI there would be no death;
1285
01:11:47,930 --> 01:11:49,080
it would live forever,
1286
01:11:50,380 --> 01:11:52,849
and we would have an immortal dictator
1287
01:11:52,850 --> 01:11:54,667
from which we could never escape.
1288
01:13:20,067 --> 01:13:24,067
The pursuit of artificial intelligence is a multi-billion-dollar industry with almost no regulation.