1
00:00:35,838 --> 00:00:43,640
we're on the brink of is a world of
1
00:00:39,689 --> 00:00:47,009
increasingly intense sophisticated
1
00:00:43,640 --> 00:00:49,439
artificial intelligence technology is
1
00:00:47,009 --> 00:00:51,298
evolving so much faster than our society
1
00:00:49,439 --> 00:00:53,750
has the ability to protect us as
1
00:00:51,298 --> 00:00:53,750
citizens
1
00:00:57,079 --> 00:01:04,250
[Music]
1
00:01:01,990 --> 00:01:07,189
you have a networked intelligence that
1
00:01:04,250 --> 00:01:14,629
watches us knows everything about us and
1
00:01:07,189 --> 00:01:17,299
begins to try to change us technology is
1
00:01:14,629 --> 00:01:21,769
never good or bad it's what we do with
1
00:01:17,299 --> 00:01:22,969
the technology eventually millions of
1
00:01:21,769 --> 00:01:24,920
people are going to be thrown out of
1
00:01:22,969 --> 00:01:26,939
jobs because their skills are going to
1
00:01:24,920 --> 00:01:35,019
be obsolete
1
00:01:26,939 --> 00:01:37,329
unemployment regardless of whether to be
1
00:01:35,019 --> 00:01:40,039
afraid or not afraid the change is
1
00:01:37,329 --> 00:01:45,219
coming and nobody can stop it
1
00:01:40,040 --> 00:01:47,380
[Music]
1
00:01:45,219 --> 00:01:49,689
we've invested huge amounts of money and
1
00:01:47,379 --> 00:01:52,390
so it stands to reason that the military
1
00:01:49,689 --> 00:01:55,149
with their own desires are gonna start
1
00:01:52,390 --> 00:01:57,370
to use these technologies autonomous
1
00:01:55,150 --> 00:02:02,430
weapons systems would lead to a global
1
00:01:57,370 --> 00:02:02,430
arms race to rival the nuclear era
1
00:02:02,909 --> 00:02:08,360
we know what the answer is they'll
1
00:02:05,159 --> 00:02:08,360
eventually be killing us
1
00:02:11,189 --> 00:02:17,829
these technology leaps are going to
1
00:02:13,629 --> 00:02:19,129
yield incredible miracles and incredible
1
00:02:17,830 --> 00:02:24,030
Horrors
1
00:02:19,129 --> 00:02:24,030
[Music]
1
00:02:24,590 --> 00:02:32,280
we created it so I think as we move
1
00:02:29,370 --> 00:02:36,060
forward this intelligence will contain
1
00:02:32,280 --> 00:02:40,379
parts of us but I think the question is
1
00:02:36,060 --> 00:02:43,550
will it contain the good parts or the
1
00:02:40,379 --> 00:02:43,549
bad parts
1
00:03:01,509 --> 00:03:04,649
[Music]
1
00:03:05,068 --> 00:03:09,009
the survivors called the war Judgment
1
00:03:08,348 --> 00:03:11,348
Day
1
00:03:09,009 --> 00:03:13,378
they lived only to face a new
1
00:03:11,348 --> 00:03:13,378
nightmare
1
00:03:13,770 --> 00:03:19,560
against the machines I think we
1
00:03:16,919 --> 00:03:21,659
completely screwed us up I think
1
00:03:19,560 --> 00:03:25,289
Hollywood has managed to inoculate the
1
00:03:21,659 --> 00:03:27,780
general public against this question the
1
00:03:25,289 --> 00:03:30,810
idea of machines that will take over the
1
00:03:27,780 --> 00:03:35,939
world open the pod bay doors HAL
1
00:03:30,810 --> 00:03:41,250
oh I'm sorry Dave I'm afraid I can't do
1
00:03:35,939 --> 00:03:43,020
that we've cried wolf enough times
1
00:03:41,250 --> 00:03:44,699
that the public has stopped paying
1
00:03:43,020 --> 00:03:45,719
attention because it feels like science
1
00:03:44,699 --> 00:03:47,579
fiction even sitting here talking about
1
00:03:45,719 --> 00:03:50,189
it right now it feels a little bit silly
1
00:03:47,580 --> 00:03:53,370
a little bit like oh this is an artifact
1
00:03:50,189 --> 00:03:55,859
of some cheeseball movie the WOPR
1
00:03:53,370 --> 00:04:00,420
spends all its time thinking about World
1
00:03:55,860 --> 00:04:02,190
War three but it's not the general
1
00:04:00,419 --> 00:04:04,250
public is about to get blindsided by
1
00:04:02,189 --> 00:04:04,250
this
1
00:04:04,870 --> 00:04:07,960
[Music]
1
00:04:10,879 --> 00:04:17,279
as a society and as individuals we're
1
00:04:14,699 --> 00:04:21,930
increasingly surrounded by a machine
1
00:04:17,279 --> 00:04:24,299
intelligence we carry this pocket device
1
00:04:21,930 --> 00:04:26,670
in the palm of our hand that we use to
1
00:04:24,300 --> 00:04:29,340
make a striking array of life decisions
1
00:04:26,670 --> 00:04:33,140
right now aided by a set of distant
1
00:04:29,339 --> 00:04:33,139
algorithms we have no understanding of
1
00:04:34,730 --> 00:04:38,759
they're already pretty jaded about the
1
00:04:37,199 --> 00:04:43,620
idea that we can talk to our phone and
1
00:04:38,759 --> 00:04:45,469
it mostly understands us five years
1
00:04:43,620 --> 00:04:49,050
ago no way
1
00:04:45,470 --> 00:04:51,510
robotics machines that see and speak and
1
00:04:49,050 --> 00:04:53,490
listen all that's real now and these
1
00:04:51,509 --> 00:04:57,779
technologies are going to fundamentally
1
00:04:53,490 --> 00:05:00,439
change our society now we have this
1
00:04:57,779 --> 00:05:03,209
great movement of the self-driving cars
1
00:05:00,439 --> 00:05:07,139
driving a car autonomously can move
1
00:05:03,209 --> 00:05:08,819
people's lives into a better place I've
1
00:05:07,139 --> 00:05:11,219
lost a number of family members
1
00:05:08,819 --> 00:05:13,379
including my mother my brother and
1
00:05:11,220 --> 00:05:16,620
sister-in-law and their kids to fatal
1
00:05:13,379 --> 00:05:19,500
accidents it's pretty clear we can
1
00:05:16,620 --> 00:05:22,590
almost eliminate car accidents with
1
00:05:19,500 --> 00:05:24,149
automation 30,000 lives in the US alone
1
00:05:22,589 --> 00:05:26,099
about a million around the world per
1
00:05:24,149 --> 00:05:29,009
year
1
00:05:26,100 --> 00:05:31,230
in healthcare early indicators are the
1
00:05:29,009 --> 00:05:32,399
name of the game in that space so that's
1
00:05:31,230 --> 00:05:35,670
another place where it can save
1
00:05:32,399 --> 00:05:38,399
somebody's life here in the breast
1
00:05:35,670 --> 00:05:41,699
cancer Center all the things that the
1
00:05:38,399 --> 00:05:44,209
radiologist's brain does in two minutes a
1
00:05:41,699 --> 00:05:46,740
computer does instantaneously a
1
00:05:44,209 --> 00:05:49,409
computer has looked at 1 million
1
00:05:46,740 --> 00:05:51,780
mammograms and it takes that data and
1
00:05:49,410 --> 00:05:56,720
applies it to this image instantaneously
1
00:05:51,779 --> 00:05:59,219
so the medical application is profound
1
00:05:56,720 --> 00:06:00,660
another really exciting area that we're
1
00:05:59,220 --> 00:06:03,470
seeing a lot of development in is
1
00:06:00,660 --> 00:06:06,990
actually understanding our genetic code
1
00:06:03,470 --> 00:06:10,370
and using that to both diagnose disease
1
00:06:06,990 --> 00:06:10,370
and create personalized treatments
1
00:06:12,180 --> 00:06:16,350
the primary application of all these
1
00:06:14,370 --> 00:06:19,050
machines will be to extend our own
1
00:06:16,350 --> 00:06:21,120
intelligence if we were able to make
1
00:06:19,050 --> 00:06:24,329
ourselves smarter we will be better
1
00:06:21,120 --> 00:06:26,069
at solving problems we don't have to age
1
00:06:24,329 --> 00:06:29,099
we'll actually understand aging we'll be
1
00:06:26,069 --> 00:06:30,870
able to stop it there's really no limit
1
00:06:29,100 --> 00:06:33,260
to what intelligent machines can do for
1
00:06:30,870 --> 00:06:33,259
the human race
1
00:06:36,300 --> 00:06:43,629
how could a smarter machine not be a
1
00:06:39,160 --> 00:06:46,180
better machine it's hard to say exactly
1
00:06:43,629 --> 00:06:48,930
when I began to think that that was a
1
00:06:46,180 --> 00:06:48,930
bit naive
1
00:06:49,189 --> 00:07:00,430
[Music]
1
00:06:57,060 --> 00:07:02,019
Stuart Russell he's basically a God in
1
00:07:00,430 --> 00:07:03,788
the field of artificial intelligence he
1
00:07:02,019 --> 00:07:06,188
wrote the book that almost every
1
00:07:03,788 --> 00:07:08,620
University uses I used to say it's the
1
00:07:06,189 --> 00:07:11,360
best-selling AI textbook now I just say
1
00:07:08,620 --> 00:07:14,050
it's the PDF that's stolen most often
1
00:07:11,360 --> 00:07:17,060
[Music]
1
00:07:14,050 --> 00:07:19,759
artificial intelligence is about making
1
00:07:17,060 --> 00:07:21,680
computers smart and from the point of
1
00:07:19,759 --> 00:07:23,569
view of the public what counts as AI is
1
00:07:21,680 --> 00:07:25,430
just something that's surprisingly
1
00:07:23,569 --> 00:07:28,509
intelligent compared to what we thought
1
00:07:25,430 --> 00:07:33,019
computers would typically be able to do
1
00:07:28,509 --> 00:07:35,599
AI is a field of research to try to
1
00:07:33,019 --> 00:07:40,099
basically simulate all kinds of human
1
00:07:35,600 --> 00:07:41,600
capabilities we're in an AI era Silicon
1
00:07:40,100 --> 00:07:44,390
Valley has the ability to focus on one
1
00:07:41,600 --> 00:07:45,950
bright shiny thing it was social
1
00:07:44,389 --> 00:07:48,110
networking and social media over the last
1
00:07:45,949 --> 00:07:50,899
decade and it's pretty clear the bit has
1
00:07:48,110 --> 00:07:53,000
flipped and it starts with machine
1
00:07:50,899 --> 00:07:55,609
learning when we look back at this
1
00:07:53,000 --> 00:07:57,439
moment what was the first AI it's not
1
00:07:55,610 --> 00:07:59,330
sexy and it isn't the thing we see in
1
00:07:57,439 --> 00:08:02,600
the movies but you could make a great case
1
00:07:59,329 --> 00:08:06,409
that Google created a search engine
1
00:08:02,600 --> 00:08:08,030
that had a way for people to ask any
1
00:08:06,410 --> 00:08:10,010
question they wanted and get the answer
1
00:08:08,029 --> 00:08:12,769
they needed most people are not aware
1
00:08:10,009 --> 00:08:14,839
that what Google is doing is actually a
1
00:08:12,769 --> 00:08:16,939
form of artificial intelligence they
1
00:08:14,839 --> 00:08:19,669
just go there they type in a thing
1
00:08:16,939 --> 00:08:21,879
Google gives them the answer with each
1
00:08:19,670 --> 00:08:24,110
search we train it to be better
1
00:08:21,879 --> 00:08:25,759
sometimes we type in the search and it
1
00:08:24,110 --> 00:08:29,300
tells us the answer before you've finished
1
00:08:25,759 --> 00:08:31,519
asking the question you know who is the
1
00:08:29,300 --> 00:08:33,320
president of Kazakhstan and it'll just
1
00:08:31,519 --> 00:08:35,049
tell you you don't have to go to the
1
00:08:33,320 --> 00:08:38,210
Kazakhstan national website to find out
1
00:08:35,049 --> 00:08:40,789
didn't used to be able to do that that
1
00:08:38,210 --> 00:08:43,250
is artificial intelligence years from
1
00:08:40,789 --> 00:08:46,069
now when we try to understand we will
1
00:08:43,250 --> 00:08:48,710
say well how did we miss it it's one of
1
00:08:46,070 --> 00:08:51,200
these striking contradictions that we're
1
00:08:48,710 --> 00:08:53,150
facing Google and Facebook et al have
1
00:08:51,200 --> 00:08:55,970
built businesses on giving us as a
1
00:08:53,149 --> 00:08:58,519
society free stuff but it's a Faustian
1
00:08:55,970 --> 00:09:02,330
bargain they're extracting something
1
00:08:58,519 --> 00:09:04,189
from us in exchange but we don't know
1
00:09:02,330 --> 00:09:08,240
what code is running on the other side
1
00:09:04,190 --> 00:09:10,370
and why we have no idea it does strike
1
00:09:08,240 --> 00:09:12,740
right at the issue of how much we should
1
00:09:10,370 --> 00:09:18,278
trust these machines
1
00:09:12,740 --> 00:09:21,169
I use computers literally for everything
1
00:09:18,278 --> 00:09:23,870
there are so many computer advancements
1
00:09:21,169 --> 00:09:26,449
now and it's become such a big part of
1
00:09:23,870 --> 00:09:28,039
our lives it's just incredible what a
1
00:09:26,450 --> 00:09:30,470
computer can do you can actually carry a
1
00:09:28,039 --> 00:09:33,769
computer in your purse I mean how
1
00:09:30,470 --> 00:09:36,170
awesome is that I think most technology
1
00:09:33,769 --> 00:09:39,620
is meant to make things easier and
1
00:09:36,169 --> 00:09:42,559
simpler for all of us so hopefully it
1
00:09:39,620 --> 00:09:45,850
just remains the focus I think everybody
1
00:09:42,559 --> 00:09:45,849
loves their computers
1
00:09:52,129 --> 00:10:00,590
people don't realize they are constantly
1
00:09:54,769 --> 00:10:02,269
being negotiated with by machines whether
1
00:10:00,590 --> 00:10:04,879
that's the price of products in your
1
00:10:02,269 --> 00:10:07,069
Amazon cart whether you can get on a
1
00:10:04,879 --> 00:10:09,039
particular flight whether you can
1
00:10:07,070 --> 00:10:11,780
reserve a room at a particular hotel
1
00:10:09,039 --> 00:10:13,339
what you're experiencing are machine
1
00:10:11,779 --> 00:10:15,829
learning algorithms that have determined
1
00:10:13,340 --> 00:10:18,700
that a person like you is willing to pay
1
00:10:15,830 --> 00:10:21,829
two cents more and is changing the price
1
00:10:18,700 --> 00:10:21,829
[Music]
1
00:10:21,879 --> 00:10:28,179
now a computer looks at millions of people
1
00:10:25,100 --> 00:10:31,790
simultaneously for very subtle patterns
1
00:10:28,179 --> 00:10:34,669
you can take seemingly innocent digital
1
00:10:31,789 --> 00:10:37,069
footprints such as someone's playlist on
1
00:10:34,669 --> 00:10:40,069
Spotify or stuff that they bought on
1
00:10:37,070 --> 00:10:42,530
Amazon and then use algorithms to
1
00:10:40,070 --> 00:10:48,890
translate this into a very detailed and
1
00:10:42,529 --> 00:10:50,959
very accurate intimate profile there is
1
00:10:48,889 --> 00:10:53,389
the dossier on each of us that is so
1
00:10:50,960 --> 00:10:54,830
extensive it would be possibly accurate
1
00:10:53,389 --> 00:11:06,769
to say that they know more about you
1
00:10:54,830 --> 00:11:08,240
than your mother does a major cause of the
1
00:11:06,769 --> 00:11:11,059
recent AI breakthrough it isn't just
1
00:11:08,240 --> 00:11:13,759
that some dude had a brilliant insight
1
00:11:11,059 --> 00:11:16,279
all of a sudden but simply that we have
1
00:11:13,759 --> 00:11:20,059
much bigger data to train them on and
1
00:11:16,279 --> 00:11:22,279
vastly better computers the magic is in
1
00:11:20,059 --> 00:11:24,079
the data it's a ton of data
1
00:11:22,279 --> 00:11:27,169
I mean it's data that's never existed
1
00:11:24,080 --> 00:11:30,259
before we've never had this data before
1
00:11:27,169 --> 00:11:33,099
we've created technologies that allow us
1
00:11:30,259 --> 00:11:35,990
to capture vast amounts of information
1
00:11:33,100 --> 00:11:37,730
if you think of a billion cell phones on
1
00:11:35,990 --> 00:11:40,399
the planet with gyroscopes and
1
00:11:37,730 --> 00:11:42,620
accelerometers and fingerprint readers
1
00:11:40,399 --> 00:11:44,328
couple that with the GPS and the photos
1
00:11:42,620 --> 00:11:46,940
they take and the tweets that you send
1
00:11:44,328 --> 00:11:50,000
we're all giving off huge amounts of
1
00:11:46,940 --> 00:11:51,680
data individually cars that drive as the
1
00:11:50,000 --> 00:11:53,089
cameras on them suck up information
1
00:11:51,679 --> 00:11:54,649
about the world around them the
1
00:11:53,089 --> 00:11:56,959
satellites that are now in orbit the
1
00:11:54,649 --> 00:11:59,149
size of a toaster the infrared about the
1
00:11:56,958 --> 00:12:00,649
vegetation on the planet the buoys that
1
00:11:59,149 --> 00:12:03,429
are out in the oceans feeding into
1
00:12:00,649 --> 00:12:03,429
climate models
1
00:12:06,139 --> 00:12:10,850
and the NSA the CIA as they collect
1
00:12:09,230 --> 00:12:14,990
information about the geopolitical
1
00:12:10,850 --> 00:12:17,920
situations the world today is literally
1
00:12:14,990 --> 00:12:17,919
swimming in this data
1
00:12:20,889 --> 00:12:26,870
back in 2012 IBM estimated that an
1
00:12:25,519 --> 00:12:29,929
average human being
1
00:12:26,870 --> 00:12:33,409
leaves 500 megabytes of digital
1
00:12:29,929 --> 00:12:35,959
footprints every day if you wanted to
1
00:12:33,409 --> 00:12:38,179
back up only one day worth of data that
1
00:12:35,960 --> 00:12:41,389
humanity produces and you print it out
1
00:12:38,179 --> 00:12:45,079
on a letter size paper double-sided font
1
00:12:41,389 --> 00:12:47,149
size 12 and you stack it up it would
1
00:12:45,080 --> 00:12:51,399
reach from the surface of the earth to
1
00:12:47,149 --> 00:12:54,860
the Sun four times over this is every day
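A rough back-of-the-envelope check of that stacking claim; the printing assumptions below (characters per sheet, paper thickness) are mine, not IBM's:

```python
# Back-of-the-envelope check of the "stack of paper to the Sun" claim.
# All printing assumptions here are illustrative, not from the film.
people = 7.0e9                 # approximate 2012 world population
bytes_per_person = 500e6       # IBM's estimate: 500 MB of footprints per day
chars_per_sheet = 6000         # ~3,000 characters per side, double-sided, 12 pt
sheet_thickness_m = 1e-4       # ~0.1 mm per sheet of letter paper
earth_sun_m = 1.496e11         # one astronomical unit in meters

total_bytes = people * bytes_per_person       # ~3.5e18 bytes in one day
sheets = total_bytes / chars_per_sheet        # one printed character per byte
stack_m = sheets * sheet_thickness_m
print(f"stack height: {stack_m:.2e} m = {stack_m / earth_sun_m:.2f} AU")
```

Under these rough inputs the one-day stack already reaches a sizable fraction of the Earth-Sun distance, so modest changes to the assumptions put it in the several-AU range quoted above.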
1
00:12:51,399 --> 00:12:57,919
the data itself is not good or evil it's
1
00:12:54,860 --> 00:13:00,050
how it's used we're relying really on
1
00:12:57,919 --> 00:13:02,569
the goodwill of these people and on the
1
00:13:00,049 --> 00:13:05,000
policies of these companies there is no
1
00:13:02,570 --> 00:13:07,850
legal requirement for how they can and
1
00:13:05,000 --> 00:13:12,379
should use that kind of data that to me
1
00:13:07,850 --> 00:13:14,180
is at the heart of the trust issue right
1
00:13:12,379 --> 00:13:16,028
now there's a giant race for creating
1
00:13:14,179 --> 00:13:18,528
machines that are as smart as humans
1
00:13:16,028 --> 00:13:19,939
Google they're working on what's really
1
00:13:18,528 --> 00:13:21,200
the kind of Manhattan project of
1
00:13:19,940 --> 00:13:23,149
artificial intelligence they've got the
1
00:13:21,200 --> 00:13:25,250
most money they've got the most talent
1
00:13:23,149 --> 00:13:28,940
they're buying up AI companies and
1
00:13:25,250 --> 00:13:30,950
robotics companies people still think of
1
00:13:28,940 --> 00:13:32,870
Google as a search engine and their email
1
00:13:30,950 --> 00:13:35,930
provider and a lot of other things that
1
00:13:32,870 --> 00:13:40,789
we use on a daily basis but behind that
1
00:13:35,929 --> 00:13:42,620
search box are 10 million servers that
1
00:13:40,789 --> 00:13:45,469
makes Google the most powerful computing
1
00:13:42,620 --> 00:13:48,350
platform in the world Google is now
1
00:13:45,470 --> 00:13:52,269
working on an AI computing platform that
1
00:13:48,350 --> 00:13:52,269
will have a hundred million servers
1
00:13:52,350 --> 00:13:56,769
so when you're interacting with Google
1
00:13:54,879 --> 00:13:58,929
we're just seeing the toenail of
1
00:13:56,769 --> 00:14:01,269
something that is a giant beast in the
1
00:13:58,929 --> 00:14:02,828
making and the truth is I'm not even
1
00:14:01,269 --> 00:14:05,399
sure that Google knows what it's
1
00:14:02,828 --> 00:14:05,399
becoming
1
00:14:11,528 --> 00:14:17,568
if you look inside of what algorithms
1
00:14:14,269 --> 00:14:21,889
are being used at Google it's technology
1
00:14:17,568 --> 00:14:24,679
largely from the 80s so these are models
1
00:14:21,889 --> 00:14:27,649
that you train by showing them a 1 a 2
1
00:14:24,679 --> 00:14:28,969
and a 3 and it learns not what a 1 is or
1
00:14:27,649 --> 00:14:31,519
what a 2 is it learns what the
1
00:14:28,970 --> 00:14:34,579
difference between a 1 and a 2 is it's
1
00:14:31,519 --> 00:14:35,990
just a computation in the last half
1
00:14:34,578 --> 00:14:37,609
decade where we've made this rapid
1
00:14:35,990 --> 00:14:40,430
progress it has all been in pattern
1
00:14:37,610 --> 00:14:43,938
recognition most of the good old
1
00:14:40,429 --> 00:14:46,870
fashioned AI was when we would tell our
1
00:14:43,938 --> 00:14:49,818
computers how to play a game like chess
1
00:14:46,870 --> 00:14:53,558
from the old paradigm where you just
1
00:14:49,818 --> 00:14:53,558
tell the computer exactly what to do
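A minimal sketch of the contrast being drawn here: instead of telling the computer exactly what to do, the models described above are shown labeled examples and learn only what separates a "1" from a "2". The data and model below are stand-ins of my own, not anything from the film:

```python
# Logistic regression on synthetic stand-in data: the model never learns
# what a "1" is, only the boundary that separates "1"s from "2"s.
import numpy as np

rng = np.random.default_rng(0)
ones = rng.normal(loc=[1.0, 0.0], scale=0.3, size=(100, 2))  # features of "1"s
twos = rng.normal(loc=[0.0, 1.0], scale=0.3, size=(100, 2))  # features of "2"s
X = np.vstack([ones, twos])
y = np.array([0] * 100 + [1] * 100)

w, b = np.zeros(2), 0.0
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability of "2"
    w -= 0.1 * X.T @ (p - y) / len(y)       # gradient step on the weights
    b -= 0.1 * np.mean(p - y)               # gradient step on the bias

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print("training accuracy:", np.mean((p > 0.5) == y))
```

No rule about the shape of a digit is ever written down; the boundary is just a computation fit to examples, which is the pattern-recognition paradigm described above.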
1
00:14:57,299 --> 00:15:05,779
[Music]
1
00:14:59,620 --> 00:15:07,039
the IBM Jeopardy challenge no one at the time had
1
00:15:05,779 --> 00:15:09,470
thought that a machine could have the
1
00:15:07,039 --> 00:15:11,389
precision and the confidence and the
1
00:15:09,470 --> 00:15:13,370
speed to play jeopardy well enough
1
00:15:11,389 --> 00:15:17,720
against the best humans let's play
1
00:15:13,370 --> 00:15:19,460
jeopardy four-letter word for the iron
1
00:15:17,720 --> 00:15:22,070
fitting on the hoof of a horse
1
00:15:19,460 --> 00:15:24,710
Watson what is shoe you are right you
1
00:15:22,070 --> 00:15:28,570
get to pick literary character APB for
1
00:15:24,710 --> 00:15:30,740
800 answer the Daily Double
1
00:15:28,570 --> 00:15:33,620
Watson actually got its knowledge by
1
00:15:30,740 --> 00:15:36,080
reading Wikipedia and 200 million pages
1
00:15:33,620 --> 00:15:38,269
of natural language documents you can't
1
00:15:36,080 --> 00:15:38,830
program every line of how the world
1
00:15:38,269 --> 00:15:41,870
works
1
00:15:38,830 --> 00:15:44,780
the machine has to learn by reading now we
1
00:15:41,870 --> 00:15:49,299
come to Watson who is Bram Stoker and
1
00:15:44,779 --> 00:15:49,299
the wager hello
1
00:15:50,649 --> 00:15:58,429
$41,413 and a two-day total Watson's trained on
1
00:15:55,309 --> 00:16:00,859
huge amounts of text but it's not like
1
00:15:58,429 --> 00:16:02,269
it understands what it's saying it
1
00:16:00,860 --> 00:16:04,310
doesn't know that water makes things wet
1
00:16:02,269 --> 00:16:05,809
by touching water and by seeing the way
1
00:16:04,309 --> 00:16:08,989
things behave in the world the way you
1
00:16:05,809 --> 00:16:11,719
and I do a lot of language AI today is not
1
00:16:08,990 --> 00:16:14,419
building logical models of how the world
1
00:16:11,720 --> 00:16:17,360
works rather it's looking at how the
1
00:16:14,419 --> 00:16:20,659
words appear in the context of other
1
00:16:17,360 --> 00:16:22,310
words David Ferrucci developed ibm's
1
00:16:20,659 --> 00:16:25,219
watson and somebody asked him does
1
00:16:22,309 --> 00:16:28,789
watson think and he said does a
1
00:16:25,220 --> 00:16:30,110
submarine swim and what he meant was
1
00:16:28,789 --> 00:16:32,360
when they developed submarines they
1
00:16:30,110 --> 00:16:35,240
borrowed basic principles of swimming
1
00:16:32,360 --> 00:16:36,710
from fish but a submarine swims farther
1
00:16:35,240 --> 00:16:41,360
and faster than fish it can carry a huge
1
00:16:36,710 --> 00:16:43,100
payload and out swims fish Watson
1
00:16:41,360 --> 00:16:45,050
winning the game of Jeopardy will go
1
00:16:43,100 --> 00:16:48,019
down in the history of AI as the
1
00:16:45,049 --> 00:16:50,059
significant milestone we tend to be
1
00:16:48,019 --> 00:16:52,669
amazed when the Machine does so well I'm
1
00:16:50,059 --> 00:16:54,289
even more amazed when the computer beats
1
00:16:52,669 --> 00:16:56,809
humans at things that humans are
1
00:16:54,289 --> 00:16:58,589
naturally good at this is how we make
1
00:16:56,809 --> 00:17:00,989
progress
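The "words in the context of other words" idea mentioned a few lines back can be sketched very simply; the toy corpus and window size below are my own illustration, not how Watson actually works:

```python
# Count how often words co-occur within a small window. The resulting
# statistics associate "water" with "wet" without any model of the world.
from collections import Counter

corpus = "water makes things wet water flows wet things dry slowly".split()
window = 2
cooc = Counter()
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            cooc[(w, corpus[j])] += 1

print(cooc.most_common(5))  # (water, wet) shows up near the top
```

Nothing here knows that water makes things wet by touching it; the association is purely statistical, which is exactly the limitation being described.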
1
00:16:58,590 --> 00:17:03,028
in the early days of the Google brain
1
00:17:00,990 --> 00:17:05,068
project I gave the team a very simple
1
00:17:03,028 --> 00:17:07,439
instruction which was build the biggest
1
00:17:05,068 --> 00:17:10,500
neural network possible like a thousand
1
00:17:07,439 --> 00:17:11,759
computers a neural net is something very
1
00:17:10,500 --> 00:17:15,509
close to a simulation of how the brain
1
00:17:11,759 --> 00:17:18,420
works it's very probabilistic but with
1
00:17:15,509 --> 00:17:19,799
contextual relevance in your brain you
1
00:17:18,420 --> 00:17:21,539
have long neurons that connect to
1
00:17:19,799 --> 00:17:22,948
thousands of other neurons and you have
1
00:17:21,539 --> 00:17:24,899
these pathways that are formed and
1
00:17:22,949 --> 00:17:27,150
forged based on what the brain needs to
1
00:17:24,900 --> 00:17:29,690
do when a baby tries something and it
1
00:17:27,150 --> 00:17:32,100
succeeds there's a reward and that
1
00:17:29,690 --> 00:17:34,259
pathway that created the success is
1
00:17:32,099 --> 00:17:36,449
strengthened if it fails at something
1
00:17:34,259 --> 00:17:38,490
the pathway is weakened and so over time
1
00:17:36,450 --> 00:17:41,880
the brain becomes honed to be good at
1
00:17:38,490 --> 00:17:43,309
the environment around it really is just
1
00:17:41,880 --> 00:17:45,930
getting machines to learn by themselves
1
00:17:43,309 --> 00:17:47,129
this is called deep learning and deep
1
00:17:45,930 --> 00:17:50,720
learning and neural networks mean
1
00:17:47,130 --> 00:17:53,730
roughly the same thing deep learning is
1
00:17:50,720 --> 00:17:56,339
a totally different approach where the
1
00:17:53,730 --> 00:17:57,660
computer learns more like a toddler by
1
00:17:56,339 --> 00:18:01,379
just getting a lot of data and
1
00:17:57,660 --> 00:18:03,029
eventually figuring stuff out the
1
00:18:01,380 --> 00:18:07,620
computer just gets smarter and smarter
1
00:18:03,029 --> 00:18:09,299
as it has more experiences so imagine if
1
00:18:07,619 --> 00:18:11,489
you will the neural network it's like a
1
00:18:09,299 --> 00:18:13,349
thousand computers and it wakes up not
1
00:18:11,490 --> 00:18:16,640
knowing anything and we made it watch
1
00:18:13,349 --> 00:18:16,639
YouTube for a week
1
00:18:17,640 --> 00:18:26,910
[Music]
1
00:18:29,059 --> 00:18:39,769
[Music]
1
00:18:36,289 --> 00:18:41,869
and so after watching YouTube for a week
1
00:18:39,769 --> 00:18:43,519
what would it learn we had a hypothesis
1
00:18:41,869 --> 00:18:46,819
it would learn to detect commonly occurring
1
00:18:43,519 --> 00:18:49,129
objects in videos and so we know that
1
00:18:46,819 --> 00:18:50,359
human faces appear a lot in videos so we
1
00:18:49,130 --> 00:18:52,070
looked and lo and behold there was a
1
00:18:50,359 --> 00:18:58,849
neuron that had learned to detect human
1
00:18:52,069 --> 00:19:02,929
faces you know what else appears in videos a
1
00:18:58,849 --> 00:19:04,849
lot so we looked and to our surprise there was
1
00:19:02,930 --> 00:19:06,269
actually a neuron that had learned to
1
00:19:04,849 --> 00:19:15,189
detect cats
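A toy sketch of how such a "cat neuron" can emerge without labels; the data, template, and method below are my own simplification, not the Google Brain sparse-autoencoder setup. If one pattern recurs across many unlabeled frames, the dominant direction in the data aligns with that pattern, and the unit projecting onto it ends up firing for cats:

```python
# Unsupervised emergence of a "cat" detector from unlabeled data.
import numpy as np

rng = np.random.default_rng(1)
cat = rng.normal(size=64)        # hypothetical 8x8 "cat" template, flattened
cat /= np.linalg.norm(cat)

# Unlabeled "frames": the first half contain the cat pattern plus noise.
frames = rng.normal(scale=0.1, size=(2000, 64))
frames[:1000] += cat * rng.uniform(0.5, 1.5, size=(1000, 1))

# Power iteration finds the dominant direction in the data -- our "neuron".
w = rng.normal(size=64)
for _ in range(100):
    w = frames.T @ (frames @ w)
    w /= np.linalg.norm(w)
if w @ cat < 0:
    w = -w                       # fix the arbitrary sign of the direction

print("alignment with cat template:", w @ cat)                     # ~1.0
print("mean response, cat frames:  ", (frames[:1000] @ w).mean())  # ~1.0
print("mean response, other frames:", (frames[1000:] @ w).mean())  # ~0.0
```

No one told the system what a cat is; the detector falls out of the statistics of the data, which is the point of the story above.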
1
00:19:06,269 --> 00:19:18,230
[Music]
1
00:19:15,190 --> 00:19:24,920
that's the moment of recognition wow
1
00:19:18,230 --> 00:19:26,029
that's a cat okay cool great it's all
1
00:19:24,920 --> 00:19:26,660
pretty innocuous when you're thinking
1
00:19:26,029 --> 00:19:29,859
about the future
1
00:19:26,660 --> 00:19:32,540
it all seems kind of harmless and benign
1
00:19:29,859 --> 00:19:34,459
but we're making cognitive architectures
1
00:19:32,539 --> 00:19:36,230
that will fly farther and faster than us
1
00:19:34,460 --> 00:19:39,110
and carry a bigger payload and they
1
00:19:36,230 --> 00:19:41,089
won't be warm and fuzzy I think that in
1
00:19:39,109 --> 00:19:43,699
three to five years you will see a
1
00:19:41,089 --> 00:19:47,599
computer system that will be able to
1
00:19:43,700 --> 00:19:50,870
autonomously learn how to understand how
1
00:19:47,599 --> 00:19:53,500
to build understanding not unlike the
1
00:19:50,869 --> 00:19:53,500
way the human mind works
1
00:19:54,470 --> 00:19:59,850
whatever that lunch was it was certainly
1
00:19:56,940 --> 00:20:03,090
delicious simply a sample of Robbie's
1
00:19:59,849 --> 00:20:04,859
synthetics he's your cook too he even
1
00:20:03,089 --> 00:20:09,720
manufactures the raw materials come
1
00:20:04,859 --> 00:20:13,648
round here Robbie I'll show you how this
1
00:20:09,720 --> 00:20:16,919
works one introduces a sample of human
1
00:20:13,648 --> 00:20:18,449
food through this aperture down here
1
00:20:16,919 --> 00:20:20,429
there's a small built-in chemical
1
00:20:18,450 --> 00:20:22,590
laboratory where he analyzes it later he
1
00:20:20,429 --> 00:20:26,059
can reproduce identical molecules in
1
00:20:22,589 --> 00:20:29,490
any shape or quantity a housewife's dream
1
00:20:26,058 --> 00:20:32,339
meet Baxter a revolutionary new category
1
00:20:29,490 --> 00:20:34,589
of robots with common sense Baxter
1
00:20:32,339 --> 00:20:38,009
Baxter is a really good example of the
1
00:20:34,589 --> 00:20:40,349
kind of competition we face from machines
1
00:20:38,009 --> 00:20:45,150
Baxter can do almost anything we can do
1
00:20:40,349 --> 00:20:47,699
with our hands Baxter costs about what a
1
00:20:45,150 --> 00:20:49,860
minimum-wage worker makes in a year
1
00:20:47,700 --> 00:20:51,420
the Baxter won't be taking the place of
1
00:20:49,859 --> 00:20:52,979
one minimum-wage worker he'll be taking
1
00:20:51,420 --> 00:20:56,700
the place of three because he never
1
00:20:52,980 --> 00:20:57,839
gets tired he never takes breaks that's
1
00:20:56,700 --> 00:21:00,569
probably the first thing we're going to
1
00:20:57,839 --> 00:21:02,939
see displacement of jobs they're going
1
00:21:00,569 --> 00:21:06,839
to be done quicker faster cheaper by
1
00:21:02,940 --> 00:21:09,000
machines our ability to even stay
1
00:21:06,839 --> 00:21:12,059
current is so insanely limited compared
1
00:21:09,000 --> 00:21:14,130
to the machines we built for example now
1
00:21:12,059 --> 00:21:15,450
we have this great movement of uber and
1
00:21:14,130 --> 00:21:17,370
lyft are kind of making transportation
1
00:21:15,450 --> 00:21:19,799
cheaper and democratizing transportation
1
00:21:17,369 --> 00:21:20,909
which is great the next step is going to
1
00:21:19,799 --> 00:21:22,859
be that they're going to be replaced by
1
00:21:20,910 --> 00:21:24,029
driverless cars and then all the uber
1
00:21:22,859 --> 00:21:26,359
and lyft drivers have to find something
1
00:21:24,029 --> 00:21:26,359
new to do
1
00:21:26,380 --> 00:21:31,150
there are 4 million professional drivers
1
00:21:29,019 --> 00:21:34,139
in the United States they'll be unemployed
1
00:21:31,150 --> 00:21:37,900
soon 7 million people do data entry
1
00:21:34,140 --> 00:21:41,530
those people are going to be jobless
1
00:21:37,900 --> 00:21:43,500
a job isn't just about money right on a
1
00:21:41,529 --> 00:21:46,839
biological level it serves a purpose
1
00:21:43,500 --> 00:21:49,299
it becomes a defining thing when the jobs
1
00:21:46,839 --> 00:21:50,740
go away in any given civilization it
1
00:21:49,299 --> 00:21:53,159
doesn't take long until that turns into
1
00:21:50,740 --> 00:21:53,160
violence
1
00:22:00,569 --> 00:22:04,710
we face a giant divide between rich and
1
00:22:02,789 --> 00:22:06,750
poor because that's what automation and
1
00:22:04,710 --> 00:22:10,289
AI will provoke a greater divide between
1
00:22:06,750 --> 00:22:11,910
the haves and have-nots right now it's
1
00:22:10,289 --> 00:22:15,389
working into the middle class into
1
00:22:11,910 --> 00:22:17,490
white-collar jobs IBM's Watson does
1
00:22:15,390 --> 00:22:20,840
business analytics that we used to pay a
1
00:22:17,490 --> 00:22:24,029
business analyst $300 an hour to do
1
00:22:20,839 --> 00:22:25,500
today you go to college to be a doctor
1
00:22:24,029 --> 00:22:27,809
to be an accountant to be a journalist
1
00:22:25,500 --> 00:22:31,559
it's unclear that there's gonna be jobs
1
00:22:27,809 --> 00:22:33,839
there for you if someone's planning for
1
00:22:31,559 --> 00:22:35,879
a 40 year career in radiology just
1
00:22:33,839 --> 00:22:39,889
reading images I think that could be a
1
00:22:35,880 --> 00:22:39,890
challenge to the new drivers of today
1
00:22:56,339 --> 00:23:03,879
but today we live in a robotic age the
1
00:23:00,759 --> 00:23:07,509
da Vinci robot is currently utilized by
1
00:23:03,880 --> 00:23:11,340
a variety of surgeons for its accuracy and
1
00:23:07,509 --> 00:23:14,819
its ability to avoid the inevitable
1
00:23:11,339 --> 00:23:23,740
fluctuations of the human hand
1
00:23:14,819 --> 00:23:26,329
[Music]
1
00:23:23,740 --> 00:23:29,380
anybody who watches this feels the
1
00:23:26,329 --> 00:23:29,379
amazingness of it
1
00:23:31,039 --> 00:23:36,869
you look through the scope and you've
1
00:23:33,630 --> 00:23:39,570
seen the claw hand holding that woman's
1
00:23:36,869 --> 00:23:44,119
ovary humanity was resting right there
1
00:23:39,569 --> 00:23:47,000
in the hands of this robot people say
1
00:23:44,119 --> 00:23:52,139
it's the future but it's not the future
1
00:23:47,000 --> 00:23:54,329
it's the present if you think about a
1
00:23:52,140 --> 00:23:56,430
surgical robot there's often not a lot
1
00:23:54,329 --> 00:23:57,629
of intelligence in these things but over
1
00:23:56,430 --> 00:23:59,610
time as we put more and more
1
00:23:57,630 --> 00:24:01,680
intelligence into these systems the
1
00:23:59,609 --> 00:24:04,049
surgical robots can actually learn from
1
00:24:01,680 --> 00:24:05,370
each robot surgery they're tracking the
1
00:24:04,049 --> 00:24:06,629
movements they're understanding what
1
00:24:05,369 --> 00:24:09,209
worked and what didn't work and
1
00:24:06,630 --> 00:24:11,130
eventually the robot for routine
1
00:24:09,210 --> 00:24:13,860
surgeries is going to be able to perform
1
00:24:11,130 --> 00:24:16,440
that entirely by itself or with human
1
00:24:13,859 --> 00:24:18,659
supervision normally I do about a
1
00:24:16,440 --> 00:24:22,710
hundred fifty cases of hysterectomies
1
00:24:18,660 --> 00:24:26,250
a year and now most of them are done
1
00:24:22,710 --> 00:24:31,829
robotically I do maybe one open case a
1
00:24:26,250 --> 00:24:34,970
year so I do feel uncomfortable doing
1
00:24:31,829 --> 00:24:34,970
open cases anymore
1
00:24:35,490 --> 00:24:42,210
it seems that we're feeding it and
1
00:24:37,589 --> 00:24:47,629
creating it but in a way we are slaves to
1
00:24:42,210 --> 00:24:47,630
the technology because we can't go back
1
00:24:50,400 --> 00:24:55,960
the machines are taking bigger and
1
00:24:52,569 --> 00:24:58,629
bigger bites out of our skill set and
1
00:24:55,960 --> 00:25:00,400
at an ever-increasing speed and so we've
1
00:24:58,630 --> 00:25:11,770
got to run faster and faster to keep
1
00:25:00,400 --> 00:25:12,280
ahead of the machines are you attracted
1
00:25:11,769 --> 00:25:15,490
to me
1
00:25:12,279 --> 00:25:21,700
what? are you attracted to me you give me
1
00:25:15,490 --> 00:25:23,859
indications that you are I do? yes this
1
00:25:21,700 --> 00:25:27,580
is the future we're headed into it we
1
00:25:23,859 --> 00:25:29,369
want to design our companions we're
1
00:25:27,579 --> 00:25:32,339
gonna like to see a human face on the AI
1
00:25:29,369 --> 00:25:34,509
therefore gaming our emotions will be
1
00:25:32,339 --> 00:25:37,299
depressingly easy
1
00:25:34,509 --> 00:25:39,879
we're not that complicated simple
1
00:25:37,299 --> 00:25:44,139
stimulus response I can make you like me
1
00:25:39,880 --> 00:25:45,550
basically by smiling at you a lot yeah
1
00:25:44,140 --> 00:25:48,300
AIs are gonna be fantastic at
1
00:25:45,549 --> 00:25:48,299
manipulating us
1
00:25:49,000 --> 00:25:53,380
[Music]
1
00:25:54,809 --> 00:26:01,269
so you've developed a technology that
1
00:25:57,849 --> 00:26:03,039
can sense what people are feeling right
1
00:26:01,269 --> 00:26:05,079
we've developed technology that can read
1
00:26:03,039 --> 00:26:07,839
your facial expressions and map that to
1
00:26:05,079 --> 00:26:09,849
a number of emotional states fifteen
1
00:26:07,839 --> 00:26:11,500
years ago I had just finished my
1
00:26:09,849 --> 00:26:13,569
undergraduate studies in computer
1
00:26:11,500 --> 00:26:16,720
science and it struck me that I was
1
00:26:13,569 --> 00:26:19,089
spending a lot of time interacting with
1
00:26:16,720 --> 00:26:22,150
my laptops on my devices yet these
1
00:26:19,089 --> 00:26:26,289
devices had absolutely no clue how I was
1
00:26:22,150 --> 00:26:28,540
feeling I started thinking what if this
1
00:26:26,289 --> 00:26:30,730
device could sense that I was stressed
1
00:26:28,539 --> 00:26:38,950
or I was having a bad day what would
1
00:26:30,730 --> 00:26:41,319
that open up for you can I get a hug we
1
00:26:38,950 --> 00:26:43,960
had kids interact with the technology a
1
00:26:41,319 --> 00:26:55,119
lot of it is still in development but it
1
00:26:43,960 --> 00:27:01,299
was just amazing who likes robots my mom
1
00:26:55,119 --> 00:27:04,419
really hard math questions okay we're
1
00:27:01,299 --> 00:27:10,389
scaring people all right so start by
1
00:27:04,420 --> 00:27:12,820
smiling nice brow furrow nice one
1
00:27:10,390 --> 00:27:14,620
eyebrow raised this generation
1
00:27:12,819 --> 00:27:17,619
technology is just surrounding them all
1
00:27:14,619 --> 00:27:19,059
the time it's almost like they expect to
1
00:27:17,619 --> 00:27:21,099
have robots in their homes and they
1
00:27:19,059 --> 00:27:26,589
expect these robots to be socially
1
00:27:21,099 --> 00:27:30,059
intelligent what makes robots smart put
1
00:27:26,589 --> 00:27:33,490
them in like a math or biology class I
1
00:27:30,059 --> 00:27:37,359
think you would have to train all right
1
00:27:33,490 --> 00:27:39,039
let's walk over here so if you smile and
1
00:27:37,359 --> 00:27:41,789
you raise your eyebrows it's gonna run
1
00:27:39,039 --> 00:27:41,789
over to you
1
00:27:43,210 --> 00:27:52,009
but if you look angry it's gonna run
1
00:27:45,679 --> 00:27:56,720
away we're training computers to read and
1
00:27:52,009 --> 00:27:58,730
recognize emotions the response so far
1
00:27:56,720 --> 00:28:00,288
has been really amazing people are
1
00:27:58,730 --> 00:28:05,808
integrating this into health apps
1
00:28:00,288 --> 00:28:08,558
meditation apps robots cars we're gonna
1
00:28:05,808 --> 00:28:08,558
see how this unfolds
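A minimal sketch of the demo described above, with expression scores in and a robot action out; the field names and thresholds are hypothetical, not Affectiva's actual SDK:

```python
# Toy mapping from detected facial-expression scores to a robot action,
# mirroring the demo: approach a smiling face, flee an angry one.
from dataclasses import dataclass

@dataclass
class Expression:
    smile: float        # 0..1 confidence that the person is smiling
    brow_raise: float   # 0..1 confidence of raised eyebrows
    brow_furrow: float  # 0..1 confidence of a furrowed, angry brow

def robot_action(e: Expression) -> str:
    if e.brow_furrow > 0.6:
        return "run away"
    if e.smile > 0.6 and e.brow_raise > 0.4:
        return "run over to you"
    return "stay put"

print(robot_action(Expression(smile=0.9, brow_raise=0.7, brow_furrow=0.0)))
print(robot_action(Expression(smile=0.1, brow_raise=0.1, brow_furrow=0.9)))
```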
1
00:28:09,730 --> 00:28:15,528
robots can contain AI but the robot is
1
00:28:13,339 --> 00:28:17,418
just a physical instantiation and the
1
00:28:15,528 --> 00:28:19,819
artificial intelligence is the brain and
1
00:28:17,419 --> 00:28:21,710
so brains can exist purely in software
1
00:28:19,819 --> 00:28:24,470
based systems they don't need to have a
1
00:28:21,710 --> 00:28:26,569
physical form robots can exist without
1
00:28:24,470 --> 00:28:29,899
any artificial intelligence we have a
1
00:28:26,569 --> 00:28:32,358
lot of dumb robots out there but a dumb
1
00:28:29,898 --> 00:28:34,459
robot can be a smart robot overnight
1
00:28:32,358 --> 00:28:38,210
given the right software given the right
1
00:28:34,460 --> 00:28:40,009
sensors we can't help but impute motive
1
00:28:38,210 --> 00:28:42,200
into inanimate objects we do it with
1
00:28:40,009 --> 00:28:46,509
machines we'll treat them like children
1
00:28:42,200 --> 00:28:49,929
we'll treat them like surrogates and
1
00:28:46,509 --> 00:28:49,929
we'll pay the price
1
00:28:51,200 --> 00:28:54,298
[Music]
1
00:29:00,630 --> 00:29:10,950
[Music]
1
00:29:08,769 --> 00:29:17,338
you get welcome to that yeah
1
00:29:10,950 --> 00:29:17,338
[Music]
1
00:29:19,048 --> 00:29:24,190
my purpose is to have a more human-like
1
00:29:22,000 --> 00:29:27,240
robot which has human-like
1
00:29:24,190 --> 00:29:30,318
intentions and desires
1
00:29:27,240 --> 00:29:30,318
[Music]
1
00:29:36,309 --> 00:29:44,079
the name of the robot is Erica erica is
1
00:29:41,140 --> 00:29:47,410
the most advanced human-like robot in
1
00:29:44,079 --> 00:29:49,199
the world I think Erica can gaze
1
00:29:47,410 --> 00:29:52,460
at your face
1
00:29:49,200 --> 00:29:57,029
[Music]
1
00:29:52,460 --> 00:29:58,490
Erica has many properties and we expect
1
00:29:57,029 --> 00:30:01,049
her to be a conversation partner
1
00:29:58,490 --> 00:30:03,750
especially for the elderly and young
1
00:30:01,049 --> 00:30:06,329
children and handicapped people
1
00:30:03,750 --> 00:30:10,470
when we talk to the robot we don't feel
1
00:30:06,329 --> 00:30:14,689
the social barriers social pressures and
1
00:30:10,470 --> 00:30:18,390
finally everybody accepts the android as
1
00:30:14,690 --> 00:30:20,610
just our friend or partner we have
1
00:30:18,390 --> 00:30:23,280
implemented simple desires now she
1
00:30:20,609 --> 00:30:25,969
wants to be well recognized and she
1
00:30:23,279 --> 00:30:25,970
wants to rest
1
00:30:27,329 --> 00:30:31,808
[Music]
1
00:30:29,430 --> 00:30:32,620
if a robot could have an intention
1
00:30:31,808 --> 00:30:35,529
and desires
1
00:30:32,619 --> 00:30:46,000
the robot can understand other people's
1
00:30:35,529 --> 00:30:48,190
intentions and desires that is tied to
1
00:30:46,000 --> 00:30:49,859
relationships with the people and that
1
00:30:48,190 --> 00:30:53,440
means they like each other
1
00:30:49,859 --> 00:30:56,039
that means well I'm not sure whether they
1
00:30:53,440 --> 00:30:56,039
love each other
1
00:30:57,589 --> 00:31:00,859
we build artificial intelligence and
1
00:30:59,779 --> 00:31:03,470
the very first thing we want to do is
1
00:31:00,859 --> 00:31:06,589
replicate us
1
00:31:03,470 --> 00:31:11,319
I think the key point will come when all
1
00:31:06,589 --> 00:31:16,908
the major senses are replicated sight
1
00:31:11,319 --> 00:31:20,379
touch smell when we replicate our senses
1
00:31:16,909 --> 00:31:20,380
is that when it becomes alive
1
00:31:27,789 --> 00:31:34,579
so many of our machines are being built
1
00:31:30,559 --> 00:31:36,289
to understand us but what happens when
1
00:31:34,579 --> 00:31:38,210
an anthropomorphic creature discovers
1
00:31:36,289 --> 00:31:40,549
that they can adjust their loyalty
1
00:31:38,210 --> 00:31:46,460
adjust their courage adjust their
1
00:31:40,549 --> 00:31:48,379
avarice adjust their cunning the average
1
00:31:46,460 --> 00:31:49,910
person they don't see killer robots
1
00:31:48,380 --> 00:31:52,370
going down the streets they're like what
1
00:31:49,910 --> 00:31:54,350
are you talking about man
1
00:31:52,369 --> 00:31:58,219
we want to make sure we don't have
1
00:31:54,349 --> 00:31:59,599
killer robots going down the street once
1
00:31:58,220 --> 00:32:08,329
they're going down the street it is too
1
00:31:59,599 --> 00:32:11,449
late the thing that worries me right now
1
00:32:08,329 --> 00:32:14,409
that keeps me awake is the development
1
00:32:11,450 --> 00:32:14,410
of autonomous weapons
1
00:32:28,089 --> 00:32:34,669
up to now people have expressed unease
1
00:32:31,130 --> 00:32:42,170
about drones which are remotely piloted
1
00:32:34,670 --> 00:32:44,720
aircraft if you take a drone's camera
1
00:32:42,170 --> 00:32:47,990
feed it into the AI system it's a very
1
00:32:44,720 --> 00:32:49,640
easy step from here to fully autonomous
1
00:32:47,990 --> 00:32:53,170
weapons that choose their own targets
1
00:32:49,640 --> 00:32:53,170
release their own missiles
1
00:32:55,269 --> 00:32:58,389
[Music]
1
00:33:02,299 --> 00:33:05,368
[Applause]
1
00:33:12,720 --> 00:33:17,440
the expected lifespan of a human being
1
00:33:15,490 --> 00:33:20,460
in that kind of battlefield environment
1
00:33:17,440 --> 00:33:20,460
will be measured in seconds
1
00:33:20,640 --> 00:33:27,340
at one point drones were science fiction
1
00:33:24,009 --> 00:33:31,779
and now they've become the normal thing
1
00:33:27,339 --> 00:33:35,109
in war there's over 10,000 in the US
1
00:33:31,779 --> 00:33:36,759
military inventory alone but they're not
1
00:33:35,109 --> 00:33:40,449
just a u.s. phenomenon there's more than
1
00:33:36,759 --> 00:33:43,000
80 countries that operate them it stands
1
00:33:40,450 --> 00:33:44,200
to reason that people making some of the
1
00:33:43,000 --> 00:33:46,690
most important and difficult decisions
1
00:33:44,200 --> 00:33:50,250
in the world are gonna start to use and
1
00:33:46,690 --> 00:33:50,250
implement artificial intelligence
1
00:33:50,859 --> 00:33:54,709
the Air Force just designed a four
1
00:33:53,180 --> 00:33:58,930
hundred billion dollar jet program to
1
00:33:54,710 --> 00:34:01,579
put pilots in the sky and a $500 AI
1
00:33:58,930 --> 00:34:03,740
designed by a couple of graduate
1
00:34:01,579 --> 00:34:10,699
students is beating the best human pilots
1
00:34:03,740 --> 00:34:12,918
with a relatively simple algorithm AI
1
00:34:10,699 --> 00:34:16,460
will have as big an impact on the
1
00:34:12,918 --> 00:34:18,888
military as the combustion engine had at
1
00:34:16,460 --> 00:34:20,690
the turn of the century it will
1
00:34:18,889 --> 00:34:23,809
literally touch everything that the
1
00:34:20,690 --> 00:34:26,450
military does from driverless convoys
1
00:34:23,809 --> 00:34:29,210
delivering logistical supplies to
1
00:34:26,449 --> 00:34:32,329
unmanned drones delivering medical aid
1
00:34:29,210 --> 00:34:33,559
to computational propaganda trying to win
1
00:34:32,329 --> 00:34:38,059
the hearts and minds of the population
1
00:34:33,559 --> 00:34:40,279
and so it stands to reason that whoever
1
00:34:38,059 --> 00:34:47,179
has the best AI will probably achieve
1
00:34:40,280 --> 00:34:49,639
dominance on this planet at some point
1
00:34:47,179 --> 00:34:53,659
in the early 21st century all of mankind
1
00:34:49,639 --> 00:34:56,000
was united in celebration we marveled at
1
00:34:53,659 --> 00:34:58,809
our own magnificence as we gave birth to
1
00:34:56,000 --> 00:34:58,809
AI
1
00:34:59,829 --> 00:35:03,549
you mean artificial intelligence a
1
00:35:01,420 --> 00:35:07,720
singular consciousness that spawned an
1
00:35:03,550 --> 00:35:11,170
entire race of machines we don't know
1
00:35:07,719 --> 00:35:15,089
who struck first us or them but we know
1
00:35:11,170 --> 00:35:17,139
that it was us that scorched the sky
1
00:35:15,090 --> 00:35:18,970
there's a long history of science
1
00:35:17,139 --> 00:35:22,710
fiction not just predicting the future
1
00:35:18,969 --> 00:35:22,709
but shaping the future
1
00:35:27,130 --> 00:35:33,039
Arthur Conan Doyle writing before World
1
00:35:30,639 --> 00:35:35,889
War One warned of the danger of how
1
00:35:33,039 --> 00:35:39,730
submarines might be used to carry out
1
00:35:35,889 --> 00:35:42,940
civilian blockades at the time he's
1
00:35:39,730 --> 00:35:44,710
writing this fiction the Royal Navy made
1
00:35:42,940 --> 00:35:47,500
fun of Arthur Conan Doyle for this
1
00:35:44,710 --> 00:35:49,460
absurd idea that submarines could be
1
00:35:47,500 --> 00:35:52,539
useful in war
1
00:35:49,460 --> 00:35:52,539
[Music]
1
00:35:54,039 --> 00:35:58,550
one of the things we've seen in history
1
00:35:55,818 --> 00:36:02,029
is that our attitude towards technology
1
00:35:58,550 --> 00:36:02,660
but also ethics are very context
1
00:36:02,030 --> 00:36:05,510
dependent
1
00:36:02,659 --> 00:36:07,338
for example the submarine nations like
1
00:36:05,510 --> 00:36:09,800
Great Britain and even the United States
1
00:36:07,338 --> 00:36:12,949
found it horrifying to use the submarine
1
00:36:09,800 --> 00:36:15,318
in fact the Germans' use of the submarine
1
00:36:12,949 --> 00:36:18,879
to carry out attacks was the reason why
1
00:36:15,318 --> 00:36:22,400
the United States joined World War one
1
00:36:18,880 --> 00:36:24,559
but move the timeline forward the United
1
00:36:22,400 --> 00:36:27,920
States of America was suddenly and
1
00:36:24,559 --> 00:36:32,119
deliberately attacked by the Empire of
1
00:36:27,920 --> 00:36:34,880
Japan five hours after Pearl Harbor the
1
00:36:32,119 --> 00:36:41,059
order goes out to commit unrestricted
1
00:36:34,880 --> 00:36:42,890
submarine warfare against Japan so
1
00:36:41,059 --> 00:36:46,460
Arthur Conan Doyle turned out to be
1
00:36:42,889 --> 00:36:48,259
right that's the great old line
1
00:36:46,460 --> 00:36:51,440
about science fiction it's a lie that
1
00:36:48,260 --> 00:36:53,180
tells the truth fellow executives it
1
00:36:51,440 --> 00:36:55,309
gives me great pleasure to introduce you
1
00:36:53,179 --> 00:36:59,049
to the future of law enforcement
1
00:36:55,309 --> 00:36:59,050
ED-209
1
00:37:04,289 --> 00:37:09,029
this isn't just a question of science
1
00:37:06,059 --> 00:37:10,420
fiction this is about what's next about
1
00:37:09,030 --> 00:37:14,000
what's happening right now
1
00:37:10,420 --> 00:37:17,159
[Music]
1
00:37:14,000 --> 00:37:20,730
the role of intelligent systems is
1
00:37:17,159 --> 00:37:27,750
growing very rapidly in warfare everyone
1
00:37:20,730 --> 00:37:29,849
is pushing in the unmanned realm today
1
00:37:27,750 --> 00:37:31,860
Secretary of Defense is very very clear
1
00:37:29,849 --> 00:37:34,650
we will not create fully autonomous
1
00:37:31,860 --> 00:37:36,390
attacking vehicles not everyone is going
1
00:37:34,650 --> 00:37:38,700
to hold themselves to that same set of
1
00:37:36,389 --> 00:37:41,849
values and when China and Russia
1
00:37:38,699 --> 00:37:45,329
start deploying autonomous vehicles that
1
00:37:41,849 --> 00:37:51,989
can attack and kill what's the move that
1
00:37:45,329 --> 00:37:53,489
we're gonna make you can't say well
1
00:37:51,989 --> 00:37:55,229
we're going to use autonomous weapons
1
00:37:53,489 --> 00:37:58,049
for our military dominance but no
1
00:37:55,230 --> 00:37:59,849
one else is going to use them if you
1
00:37:58,050 --> 00:38:03,060
make these weapons they're going to be
1
00:37:59,849 --> 00:38:05,750
used to attack human populations in
1
00:38:03,059 --> 00:38:05,750
large numbers
1
00:38:07,110 --> 00:38:10,249
[Music]
1
00:38:13,159 --> 00:38:17,449
autonomous weapons are by their nature
1
00:38:15,650 --> 00:38:19,519
weapons of mass destruction because it
1
00:38:17,449 --> 00:38:23,058
doesn't need a human being to guide it
1
00:38:19,519 --> 00:38:25,900
or carry it you only need one person to
1
00:38:23,059 --> 00:38:29,780
you know write a little program
1
00:38:25,900 --> 00:38:33,588
it just shows the complexity of
1
00:38:29,780 --> 00:38:37,450
this field it is cool it is important it
1
00:38:33,588 --> 00:38:43,788
is amazing it is also frightening and
1
00:38:37,449 --> 00:38:46,068
it's all about trust it's an open letter
1
00:38:43,789 --> 00:38:47,900
about artificial intelligence signed by
1
00:38:46,068 --> 00:38:50,210
some of the biggest names in science
1
00:38:47,900 --> 00:38:52,548
what do they want ban the use of
1
00:38:50,210 --> 00:38:54,798
autonomous weapons the authors stated
1
00:38:52,548 --> 00:38:56,929
quote autonomous weapons have been
1
00:38:54,798 --> 00:38:59,088
described as the third revolution in
1
00:38:56,929 --> 00:39:01,250
warfare a thousand artificial intelligence
1
00:38:59,088 --> 00:39:05,000
specialists calling for a global ban on
1
00:39:01,250 --> 00:39:07,039
killer robots this open letter basically
1
00:39:05,000 --> 00:39:08,480
says that we should redefine the goal of
1
00:39:07,039 --> 00:39:11,390
the field of artificial intelligence
1
00:39:08,480 --> 00:39:13,880
away from just creating pure undirected
1
00:39:11,389 --> 00:39:15,739
intelligence towards creating beneficial
1
00:39:13,880 --> 00:39:17,568
intelligence the development of AI is
1
00:39:15,739 --> 00:39:19,368
not going to stop it is going to
1
00:39:17,568 --> 00:39:21,588
continue and get better if the
1
00:39:19,369 --> 00:39:23,420
international community isn't putting
1
00:39:21,588 --> 00:39:25,788
certain controls on this people will
1
00:39:23,420 --> 00:39:27,829
develop things that can do anything the
1
00:39:25,789 --> 00:39:29,960
letter says that we are years not
1
00:39:27,829 --> 00:39:31,910
decades away from these weapons being
1
00:39:29,960 --> 00:39:34,010
deployed so we had six thousand
1
00:39:31,909 --> 00:39:37,139
signatories of that letter including many
1
00:39:34,010 --> 00:39:39,390
of the major figures in the field
1
00:39:37,139 --> 00:39:41,489
I'm getting a lot of visits from
1
00:39:39,389 --> 00:39:43,980
high-ranking officials who wish to
1
00:39:41,489 --> 00:39:46,709
emphasize that American military dominance
1
00:39:43,980 --> 00:39:49,170
is very important and autonomous weapons
1
00:39:46,710 --> 00:39:52,528
may be part of the Defense Department's
1
00:39:49,170 --> 00:39:54,869
plan that's very very scary because a
1
00:39:52,528 --> 00:39:56,670
value system of military developers of
1
00:39:54,869 --> 00:40:02,160
Technology is not the same as a value
1
00:39:56,670 --> 00:40:04,349
system of the human race out of the
1
00:40:02,159 --> 00:40:06,389
concerns about the possibility that this
1
00:40:04,349 --> 00:40:07,140
technology might be a threat to human
1
00:40:06,389 --> 00:40:09,268
existence
1
00:40:07,139 --> 00:40:11,068
a number of the technologists have
1
00:40:09,268 --> 00:40:14,159
funded the Future of Life Institute to
1
00:40:11,068 --> 00:40:16,230
try to grapple with these problems all
1
00:40:14,159 --> 00:40:17,818
of these guys are secretive and so it's
1
00:40:16,230 --> 00:40:23,699
interesting to me to see them and you
1
00:40:17,818 --> 00:40:25,288
know all together everything we have is
1
00:40:23,699 --> 00:40:28,018
a result of our intelligence it's not
1
00:40:25,289 --> 00:40:30,630
the result of our big scary teeth or our
1
00:40:28,018 --> 00:40:32,338
large claws or our enormous muscles it's
1
00:40:30,630 --> 00:40:35,130
because we're actually relatively
1
00:40:32,338 --> 00:40:37,619
intelligent and among my generation
1
00:40:35,130 --> 00:40:39,720
we're all having what we call holy cow
1
00:40:37,619 --> 00:40:42,088
or something holy something else moments
1
00:40:39,719 --> 00:40:44,959
because we see that the technology is
1
00:40:42,088 --> 00:40:47,548
accelerating faster than we expected
1
00:40:44,960 --> 00:40:49,528
remember sitting around the table there
1
00:40:47,548 --> 00:40:51,809
with some of the best and the smartest
1
00:40:49,528 --> 00:40:54,480
minds in the world and what really
1
00:40:51,809 --> 00:40:57,930
struck me was maybe the human brain is
1
00:40:54,480 --> 00:40:59,400
not able to fully grasp the complexity
1
00:40:57,929 --> 00:41:02,969
of the world that we're confronted with
1
00:40:59,400 --> 00:41:05,009
as it's currently constructed the road
1
00:41:02,969 --> 00:41:07,980
that AI is following heads off a cliff
1
00:41:05,009 --> 00:41:09,539
and we need to change the direction that
1
00:41:07,980 --> 00:41:15,119
we're going so that we don't take the
1
00:41:09,539 --> 00:41:19,170
human race off the cliff Google acquired
1
00:41:15,119 --> 00:41:20,608
DeepMind several years ago DeepMind
1
00:41:19,170 --> 00:41:24,088
operates as a semi independent
1
00:41:20,608 --> 00:41:26,518
subsidiary of Google the thing that
1
00:41:24,088 --> 00:41:28,679
makes DeepMind unique is that DeepMind
1
00:41:26,518 --> 00:41:32,008
is absolutely focused on creating
1
00:41:28,679 --> 00:41:34,379
digital super intelligence an AI that is
1
00:41:32,009 --> 00:41:36,778
vastly smarter than any human on earth
1
00:41:34,380 --> 00:41:39,150
and ultimately smarter than all humans
1
00:41:36,778 --> 00:41:41,159
on earth combined this is the DeepMind
1
00:41:39,150 --> 00:41:44,130
reinforcement learning system it
1
00:41:41,159 --> 00:41:46,440
basically wakes up like a newborn baby
1
00:41:44,130 --> 00:41:48,960
and is shown the screen of an Atari
1
00:41:46,440 --> 00:41:50,548
video game and then has to learn to play
1
00:41:48,960 --> 00:41:54,369
the video game
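A tiny sketch of that learning setup: the agent is given nothing but a state (its "screen") and a score, and improves by trial and error, exactly as the lines below describe. This is tabular Q-learning on a made-up one-dimensional game of my own, far simpler than DeepMind's deep Q-network but the same idea of learning behavior from reward alone:

```python
# Tabular Q-learning on a toy 1-D game: walk right to reach the reward.
import random

N_STATES, ACTIONS = 6, [-1, +1]        # positions 0..5, move left or right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.3      # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for episode in range(500):
    s = 0                              # the "screen" starts at position 0
    while s != N_STATES - 1:
        # epsilon-greedy: usually exploit current estimates, sometimes explore
        a = (random.choice(ACTIONS) if random.random() < EPS
             else max(ACTIONS, key=lambda a: Q[(s, a)]))
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0     # the only signal: score
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2

# After training, the greedy policy moves right from every position.
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)])
```

It starts knowing nothing about the game's objects or rules and ends up with a perfect policy on this trivial world.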
1
00:41:50,548 --> 00:41:59,409
it knows nothing about objects about
1
00:41:54,369 --> 00:42:00,640
motion about time it only knows that
1
00:41:59,409 --> 00:42:06,129
there's an image on the screen and
1
00:42:00,639 --> 00:42:08,199
there's a score so if your baby woke up
1
00:42:06,130 --> 00:42:11,289
the day it was born and by late
1
00:42:08,199 --> 00:42:15,009
afternoon was playing 40 different Atari
1
00:42:11,289 --> 00:42:17,109
video games at a superhuman level you
1
00:42:15,009 --> 00:42:20,528
would be terrified you would say my baby
1
00:42:17,108 --> 00:42:24,788
is possessed send it back the DeepMind
1
00:42:20,528 --> 00:42:26,679
system can win at any game it can
1
00:42:24,789 --> 00:42:30,190
already beat all the original Atari
1
00:42:26,679 --> 00:42:32,078
games it is superhuman it plays the
1
00:42:30,190 --> 00:42:39,068
games at SuperSpeed in less than a
1
00:42:32,079 --> 00:42:40,690
minute DeepMind turned to another
1
00:42:39,068 --> 00:42:43,509
challenge and the challenge was the game
1
00:42:40,690 --> 00:42:46,059
of Go which people have generally argued
1
00:42:43,509 --> 00:42:48,509
has been beyond the power of computers
1
00:42:46,059 --> 00:42:51,130
to play with the best human go players
1
00:42:48,509 --> 00:42:55,659
first they challenged the european go
1
00:42:51,130 --> 00:43:00,099
champion then they challenged a korean
1
00:42:55,659 --> 00:43:03,118
go champion and they were able to win
1
00:43:00,099 --> 00:43:05,410
both times in kind of striking fashion
1
00:43:03,119 --> 00:43:07,539
there were literally articles in the New York Times
1
00:43:05,409 --> 00:43:10,348
years ago talking about how go would
1
00:43:07,539 --> 00:43:12,220
take a hundred years for us to solve
1
00:43:10,349 --> 00:43:16,180
people say well you know but that's
1
00:43:12,219 --> 00:43:18,038
still just a board game poker is an art poker
1
00:43:16,179 --> 00:43:20,139
involves reading people poker involves
1
00:43:18,039 --> 00:43:22,210
lying bluffing it's not an exact thing
1
00:43:20,139 --> 00:43:24,308
a computer will never you know
1
00:43:22,210 --> 00:43:26,650
be able to do that they took the best
1
00:43:24,309 --> 00:43:28,660
poker players in the world and it took
1
00:43:26,650 --> 00:43:32,650
seven days for the computer to start
1
00:43:28,659 --> 00:43:34,118
demolishing the humans so the best poker
1
00:43:32,650 --> 00:43:35,710
player in the world the best go player
1
00:43:34,119 --> 00:43:38,289
in the world and the pattern here is
1
00:43:35,710 --> 00:43:40,690
that AI might take a little while to
1
00:43:38,289 --> 00:43:44,799
wrap its tentacles around a new skill
1
00:43:40,690 --> 00:43:47,338
but when it does when it gets it it is
1
00:43:44,798 --> 00:43:47,338
unstoppable
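[The loop being described here is reinforcement learning: an agent that sees only an observation and a score, and improves by trial and error. As a rough illustration only, here is a minimal tabular Q-learning sketch on an invented ten-state stand-in for a game; every name and number in it is hypothetical, and DeepMind's published Atari agent (DQN) instead learned a deep network over raw pixels.]

```python
# A minimal sketch of the reward-driven loop described above: the agent
# knows nothing but the current state and the score it receives.
import random

N_STATES, N_ACTIONS = 10, 2          # hypothetical toy game, not real Atari

def step(state, action):
    """Toy dynamics: action 1 moves toward the goal state, which pays +1."""
    nxt = min(state + 1, N_STATES - 1) if action == 1 else max(state - 1, 0)
    return nxt, float(nxt == N_STATES - 1), nxt == N_STATES - 1

Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.1, 0.95, 0.1   # learning rate, discount, exploration

def pick(state):
    if random.random() < epsilon:            # explore occasionally
        return random.randrange(N_ACTIONS)
    best = max(Q[state])                     # otherwise act greedily,
    return random.choice(                    # breaking ties at random
        [a for a in range(N_ACTIONS) if Q[state][a] == best])

for episode in range(300):
    state = 0
    for t in range(200):                     # cap episode length
        action = pick(state)
        nxt, reward, done = step(state, action)
        # Q-learning update: nudge the estimate toward reward plus the
        # discounted value of the best action in the next state.
        Q[state][action] += alpha * (reward + gamma * max(Q[nxt])
                                     - Q[state][action])
        state = nxt
        if done:
            break

print("greedy action per state:",
      [Q[s].index(max(Q[s])) for s in range(N_STATES)])
```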
1
00:43:50,260 --> 00:43:55,510
[Music]
1
00:43:52,329 --> 00:43:58,869
DeepMind's AI has administrator level
1
00:43:55,510 --> 00:44:02,080
access to Google's servers to optimize
1
00:43:58,869 --> 00:44:04,929
energy usage at the data centers however
1
00:44:02,079 --> 00:44:06,909
this could be an unintentional Trojan
1
00:44:04,929 --> 00:44:08,559
horse DeepMind has to have complete
1
00:44:06,909 --> 00:44:10,420
control of the data centers so with a
1
00:44:08,559 --> 00:44:12,070
little software update that AI could
1
00:44:10,420 --> 00:44:13,809
take complete control of the whole
1
00:44:12,070 --> 00:44:15,760
Google system which means they can do
1
00:44:13,809 --> 00:44:22,539
anything take a look at all your data
1
00:44:15,760 --> 00:44:24,190
do anything we're rapidly headed
1
00:44:22,539 --> 00:44:25,690
towards digital super intelligence that
1
00:44:24,190 --> 00:44:28,480
far exceeds any human I think it's
1
00:44:25,690 --> 00:44:30,190
very obvious the problem is we don't
1
00:44:28,480 --> 00:44:32,079
really suddenly hit human level
1
00:44:30,190 --> 00:44:35,079
intelligence and say okay let's stop
1
00:44:32,079 --> 00:44:36,309
research it's gonna go beyond human
1
00:44:35,079 --> 00:44:37,659
level intelligence into what's called
1
00:44:36,309 --> 00:44:41,829
super intelligence and that's anything
1
00:44:37,659 --> 00:44:43,899
smarter than us AI at the superhuman
1
00:44:41,829 --> 00:44:46,989
level if we succeed with that it will be by
1
00:44:43,900 --> 00:44:48,940
far the most powerful invention we've
1
00:44:46,989 --> 00:44:50,729
ever made and the last invention we ever
1
00:44:48,940 --> 00:44:53,130
have to make
1
00:44:50,730 --> 00:44:55,740
and if we create AI that's smarter than
1
00:44:53,130 --> 00:44:57,900
us we have to be open to the possibility
1
00:44:55,739 --> 00:45:02,969
that we might actually lose control to
1
00:44:57,900 --> 00:45:04,829
them let's say you give it some
1
00:45:02,969 --> 00:45:07,409
objective like curing cancer and then
1
00:45:04,829 --> 00:45:09,329
you discover that the way it chooses to
1
00:45:07,409 --> 00:45:10,739
go about that is actually in conflict
1
00:45:09,329 --> 00:45:12,769
with a lot of other things you care
1
00:45:10,739 --> 00:45:12,769
about
1
00:45:12,858 --> 00:45:17,048
AI doesn't have to be evil to destroy
1
00:45:15,228 --> 00:45:19,808
humanity
1
00:45:17,048 --> 00:45:22,119
if AI has a goal and humanity just
1
00:45:19,809 --> 00:45:23,559
happens to be in the way it will destroy
1
00:45:22,119 --> 00:45:24,818
humanity as a matter of
1
00:45:23,559 --> 00:45:26,890
course without even thinking about it no
1
00:45:24,818 --> 00:45:29,438
hard feelings it's just like if we're
1
00:45:26,889 --> 00:45:31,629
building a road and an ant hill happens
1
00:45:29,438 --> 00:45:34,418
to be in the way we don't hate ants
1
00:45:31,630 --> 00:45:37,019
we're just building a road and so
1
00:45:34,418 --> 00:45:37,018
goodbye anthill
1
00:45:38,469 --> 00:45:43,269
it's tempting to dismiss these concerns
1
00:45:40,989 --> 00:45:46,299
because it's like something that might
1
00:45:43,269 --> 00:45:49,480
happen in a few decades or 100 years so
1
00:45:46,300 --> 00:45:51,280
why worry but if you go back to
1
00:45:49,480 --> 00:45:53,740
September 11th 1933
1
00:45:51,280 --> 00:45:56,200
Ernest Rutherford who was the most well
1
00:45:53,739 --> 00:45:58,598
known nuclear physicist of his time said
1
00:45:56,199 --> 00:45:59,980
that the possibility of ever extracting
1
00:45:58,599 --> 00:46:01,720
useful amounts of energy from the
1
00:45:59,980 --> 00:46:03,240
transmutation of atoms as he called it
1
00:46:01,719 --> 00:46:06,279
was moonshine
1
00:46:03,239 --> 00:46:08,469
the next morning Leo Szilard who was a much
1
00:46:06,280 --> 00:46:11,140
younger physicist read this and got
1
00:46:08,469 --> 00:46:13,328
really annoyed and figured out how to
1
00:46:11,139 --> 00:46:14,049
make a nuclear chain reaction just a few
1
00:46:13,329 --> 00:46:17,849
months later
1
00:46:14,050 --> 00:46:17,849
[Music]
1
00:46:20,659 --> 00:46:26,069
we have spent more than two billion
1
00:46:23,489 --> 00:46:29,639
dollars on the greatest scientific
1
00:46:26,070 --> 00:46:31,680
gamble in history so when people say
1
00:46:29,639 --> 00:46:32,940
that oh this is so far off in the future
1
00:46:31,679 --> 00:46:35,190
we don't have to worry about it
1
00:46:32,940 --> 00:46:37,349
there might only be three or four
1
00:46:35,190 --> 00:46:38,909
breakthroughs of that magnitude that
1
00:46:37,349 --> 00:46:41,940
will get us from here to super
1
00:46:38,909 --> 00:46:44,969
intelligent machines if it's gonna take
1
00:46:41,940 --> 00:46:47,909
20 years to figure out how to keep AI
1
00:46:44,969 --> 00:46:51,480
beneficial then we should start today
1
00:46:47,909 --> 00:46:53,159
not at the last second when some dudes
1
00:46:51,480 --> 00:46:58,769
drinking Red Bull decide to flip the
1
00:46:53,159 --> 00:47:01,529
switch and test the thing we have five
1
00:46:58,769 --> 00:47:05,219
years I think digital super intelligence
1
00:47:01,530 --> 00:47:06,830
will happen in my lifetime one hundred
1
00:47:05,219 --> 00:47:09,259
percent
1
00:47:06,829 --> 00:47:11,690
when this happens it will be surrounded
1
00:47:09,260 --> 00:47:14,450
by a bunch of people who are really just
1
00:47:11,690 --> 00:47:15,950
excited about the technology they want
1
00:47:14,449 --> 00:47:17,149
to see it succeed but they're not
1
00:47:15,949 --> 00:47:28,579
anticipating that it can get out of
1
00:47:17,150 --> 00:47:31,400
control oh my god I trust my computer so
1
00:47:28,579 --> 00:47:33,409
much that's an amazing question I don't
1
00:47:31,400 --> 00:47:35,869
trust my computer if it's on I take it
1
00:47:33,409 --> 00:47:37,279
off like even if it was off I still think
1
00:47:35,869 --> 00:47:38,839
it's on like you know like you really
1
00:47:37,280 --> 00:47:40,850
cannot just like the webcams you don't
1
00:47:38,840 --> 00:47:44,059
know like someone might turn it on I don't
1
00:47:40,849 --> 00:47:48,199
know like I don't trust my computer like
1
00:47:44,059 --> 00:47:50,480
and my phone every time they ask me we
1
00:47:48,199 --> 00:47:55,909
send your information to Apple every
1
00:47:50,480 --> 00:47:59,030
time I so trust my phone ok so part of
1
00:47:55,909 --> 00:48:00,440
it is yes I do trust it because it's
1
00:47:59,030 --> 00:48:02,420
really it would be really hard to get
1
00:48:00,440 --> 00:48:12,590
through the day with the way our world is
1
00:48:02,420 --> 00:48:15,909
set up without computers Trust is such a
1
00:48:12,590 --> 00:48:15,910
human experience
1
00:48:21,500 --> 00:48:27,460
I have a patient coming in with
1
00:48:24,289 --> 00:48:30,420
an intracranial aneurysm
1
00:48:27,460 --> 00:48:32,650
[Music]
1
00:48:30,420 --> 00:48:34,509
they want to look in my eyes and know
1
00:48:32,650 --> 00:48:38,369
that they can trust this person with
1
00:48:34,509 --> 00:48:41,889
their life I'm not horribly concerned
1
00:48:38,369 --> 00:48:45,479
about anything a good part of that is
1
00:48:41,889 --> 00:48:45,478
because I have confidence in you
1
00:48:51,190 --> 00:48:58,210
this procedure we're doing today 20
1
00:48:53,559 --> 00:48:59,650
years ago was essentially impossible we
1
00:48:58,210 --> 00:49:16,329
just didn't have the materials and the
1
00:48:59,650 --> 00:49:24,599
technologies could it be any more
1
00:49:16,329 --> 00:49:28,089
difficult thank God so the coil is
1
00:49:24,599 --> 00:49:31,329
barely in there right now it's just a
1
00:49:28,088 --> 00:49:33,690
feather holding it in it's a nervous
1
00:49:31,329 --> 00:49:33,690
time
1
00:49:36,480 --> 00:49:42,909
we're just in purgatory intellectual
1
00:49:39,340 --> 00:49:45,960
humanistic purgatory an AI might know
1
00:49:42,909 --> 00:49:45,960
exactly what to do here
1
00:49:50,670 --> 00:49:56,050
we got the coil into the aneurysm but it
1
00:49:53,769 --> 00:49:59,259
wasn't in tremendously well enough that I knew
1
00:49:56,050 --> 00:50:02,530
that it would stay so with a maybe 20%
1
00:49:59,260 --> 00:50:05,890
risk of a very bad situation I elected
1
00:50:02,530 --> 00:50:07,840
to just bring her back because of my
1
00:50:05,889 --> 00:50:09,819
relationship with her and knowing the
1
00:50:07,840 --> 00:50:12,309
difficulties of coming in and having the
1
00:50:09,820 --> 00:50:14,140
procedure I consider things when I
1
00:50:12,309 --> 00:50:17,679
should only consider the safest possible
1
00:50:14,139 --> 00:50:19,239
route to achieve success well I had to
1
00:50:17,679 --> 00:50:22,750
stand there for 10 minutes agonizing
1
00:50:19,239 --> 00:50:25,000
about it the computer feels nothing the
1
00:50:22,750 --> 00:50:25,480
computer just does what it's supposed to
1
00:50:25,000 --> 00:50:26,449
do
1
00:50:25,480 --> 00:50:28,949
better and better
1
00:50:26,449 --> 00:50:38,129
[Music]
1
00:50:28,949 --> 00:50:39,609
I want to be AI in this case but can AI
1
00:50:38,130 --> 00:50:43,530
be compassionate
1
00:50:39,610 --> 00:50:45,690
[Music]
1
00:50:43,530 --> 00:50:51,360
I mean it's everybody's question about
1
00:50:45,690 --> 00:50:54,420
AI we are the sole embodiment of
1
00:50:51,360 --> 00:50:56,579
humanity and it's a stretch for us to
1
00:50:54,420 --> 00:51:01,250
accept that a machine can be
1
00:50:56,579 --> 00:51:01,250
compassionate and loving in that way
1
00:51:01,469 --> 00:51:07,789
[Music]
1
00:51:05,329 --> 00:51:10,380
part of me doesn't believe in magic but
1
00:51:07,789 --> 00:51:12,659
part of me has faith that there is
1
00:51:10,380 --> 00:51:14,640
something beyond the sum of the parts if
1
00:51:12,659 --> 00:51:17,940
there is at least a oneness in our
1
00:51:14,639 --> 00:51:20,509
shared ancestry our shared biology our
1
00:51:17,940 --> 00:51:20,510
shared history
1
00:51:20,920 --> 00:51:28,999
some connection there beyond machine
1
00:51:23,920 --> 00:51:28,999
[Music]
1
00:51:30,869 --> 00:51:34,829
so then you have the other side of that
1
00:51:33,090 --> 00:51:37,250
is does the computer know it's conscious
1
00:51:34,829 --> 00:51:41,250
or can it be conscious or does it care
1
00:51:37,250 --> 00:51:43,969
does it need to be conscious does it
1
00:51:41,250 --> 00:51:43,969
need to be aware
1
00:51:44,369 --> 00:51:55,759
[Music]
1
00:51:52,250 --> 00:51:58,369
I do not think that a robot could ever
1
00:51:55,760 --> 00:51:59,230
be conscious unless they programmed it
1
00:51:58,369 --> 00:52:04,940
that way
1
00:51:59,230 --> 00:52:06,409
conscious no no no I mean I think a
1
00:52:04,940 --> 00:52:08,059
robot could be programmed to be
1
00:52:06,409 --> 00:52:11,629
conscious like how they program it to do
1
00:52:08,059 --> 00:52:13,670
everything else that's another big part
1
00:52:11,630 --> 00:52:17,920
of artificial intelligence is to make
1
00:52:13,670 --> 00:52:17,920
them conscious and make them feel
1
00:52:22,579 --> 00:52:29,720
back in 2005 we started trying to build
1
00:52:26,119 --> 00:52:29,720
machines with self-awareness
1
00:52:33,099 --> 00:52:39,230
this robot to begin with didn't know
1
00:52:35,750 --> 00:52:45,590
what it was all it knew was that it
1
00:52:39,230 --> 00:52:47,210
needed to do something like walk through
1
00:52:45,590 --> 00:52:51,110
trial and error and figure out how to
1
00:52:47,210 --> 00:52:56,150
walk using its imagination and then it
1
00:52:51,110 --> 00:52:58,610
walked away and then we did something
1
00:52:56,150 --> 00:53:01,269
very cruel we chopped off a leg and
1
00:52:58,610 --> 00:53:01,269
watched what happened
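[What "trial and error" means here can be sketched in a few lines: random search over gait parameters, keeping whichever candidate walks farthest. This is only a stand-in under invented assumptions; the actual robot built an internal self-model, and the "simulator" below is a made-up scoring function.]

```python
# A toy sketch of trial-and-error gait learning, not the robot's real method.
import random

def distance_travelled(gait):
    # Stand-in for physics: pretend the robot walks farthest when every
    # leg's phase parameter is near 0.5. Purely illustrative.
    return -sum((g - 0.5) ** 2 for g in gait)

best = [random.random() for _ in range(4)]    # one phase parameter per leg
for trial in range(200):
    candidate = [g + random.gauss(0, 0.1) for g in best]
    if distance_travelled(candidate) > distance_travelled(best):
        best = candidate                      # keep whatever walks farther

# After a leg is removed, re-running the same loop over the remaining
# parameters is what lets this kind of learner adapt its gait.
print("learned gait:", [round(g, 2) for g in best])
```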
1
00:53:03,360 --> 00:53:10,170
at the beginning it didn't quite know
1
00:53:05,969 --> 00:53:14,579
what had happened but over the period
1
00:53:10,170 --> 00:53:17,369
of a day it began to limp and then
1
00:53:14,579 --> 00:53:21,719
a year ago we were training an AI system
1
00:53:17,369 --> 00:53:24,179
for a live demonstration we wanted to
1
00:53:21,719 --> 00:53:25,889
show how we wave all these objects in
1
00:53:24,179 --> 00:53:28,679
front of the camera and the AI can
1
00:53:25,889 --> 00:53:31,139
recognize the objects and so we're
1
00:53:28,679 --> 00:53:33,029
preparing this demo and we had a side
1
00:53:31,139 --> 00:53:37,219
screen the ability to watch what
1
00:53:33,030 --> 00:53:39,360
certain neurons were responding to and
1
00:53:37,219 --> 00:53:42,209
suddenly we noticed that one of the
1
00:53:39,360 --> 00:53:44,340
neurons was tracking faces it was
1
00:53:42,210 --> 00:53:48,329
tracking our faces as we were moving
1
00:53:44,340 --> 00:53:50,820
around now the spooky thing about this
1
00:53:48,329 --> 00:53:54,630
is that we never trained the system to
1
00:53:50,820 --> 00:53:58,039
recognize human faces and yet somehow
1
00:53:54,630 --> 00:53:58,039
it learned to do that
1
00:53:58,079 --> 00:54:02,500
even though these robots are very simple
1
00:54:00,429 --> 00:54:07,299
we can see there's something else going on
1
00:54:02,500 --> 00:54:12,099
there it's not just programmed so this is
1
00:54:07,300 --> 00:54:17,050
just the beginning I often think about
1
00:54:12,099 --> 00:54:22,480
that beach in Kitty Hawk the 1903 flight
1
00:54:17,050 --> 00:54:24,550
by Orville and Wilbur Wright there's a
1
00:54:22,480 --> 00:54:26,650
kind of a canvas plane it's wood and
1
00:54:24,550 --> 00:54:28,450
iron and it gets off the ground for what
1
00:54:26,650 --> 00:54:31,269
a minute and 20 seconds and the winds of
1
00:54:28,449 --> 00:54:37,469
the day took him back down again
1
00:54:31,269 --> 00:54:41,829
and it was just around 65 summers or so
1
00:54:37,469 --> 00:54:43,559
after that moment that you have a 747
1
00:54:41,829 --> 00:54:47,139
taking off from JFK
1
00:54:43,559 --> 00:54:47,139
[Music]
1
00:54:50,500 --> 00:54:54,739
where the major concern of someone on the
1
00:54:52,639 --> 00:54:56,838
airplane might be whether or not their
1
00:54:54,739 --> 00:54:58,669
salt-free diet meal is going to be
1
00:54:56,838 --> 00:55:00,650
coming to them or not with a whole
1
00:54:58,670 --> 00:55:03,409
infrastructure with travel agents and
1
00:55:00,650 --> 00:55:08,900
tower control and it's all casual it's
1
00:55:03,409 --> 00:55:10,909
all part of the world right now as far
1
00:55:08,900 --> 00:55:13,309
as we've come with machines that think and
1
00:55:10,909 --> 00:55:16,009
solve problems we're at Kitty Hawk now
1
00:55:13,309 --> 00:55:18,019
we're in the wind we have our tattered
1
00:55:16,010 --> 00:55:21,119
canvas planes up in the air
1
00:55:18,019 --> 00:55:21,119
[Music]
1
00:55:21,269 --> 00:55:26,610
but what happens in 65 summers or so we
1
00:55:25,050 --> 00:55:32,870
will have machines that are beyond our
1
00:55:26,610 --> 00:55:37,220
control should we worry about that I'm
1
00:55:32,869 --> 00:55:37,219
not sure it's going to help
1
00:55:40,568 --> 00:55:47,808
nobody has any idea today what it means
1
00:55:43,880 --> 00:55:50,599
for a robot to be conscious there is no
1
00:55:47,809 --> 00:55:52,039
such thing there are a lot of smart
1
00:55:50,599 --> 00:55:55,150
people and I have a great deal of
1
00:55:52,039 --> 00:55:58,970
respect for them but the truth is
1
00:55:55,150 --> 00:56:00,950
machines are natural psychopaths fear
1
00:55:58,969 --> 00:56:02,598
came back into the market the Dow down eight
1
00:56:00,949 --> 00:56:04,548
hundred nearly a thousand in a heartbeat
1
00:56:02,599 --> 00:56:05,809
this is classic capitulation there
1
00:56:04,548 --> 00:56:08,420
are some people proposing there was
1
00:56:05,809 --> 00:56:11,150
some kind of fat finger error take the
1
00:56:08,420 --> 00:56:14,480
flash crash of 2010 in a matter of
1
00:56:11,150 --> 00:56:16,910
minutes a trillion dollars in value was
1
00:56:14,480 --> 00:56:19,179
lost in the stock market the Dow dropped
1
00:56:16,909 --> 00:56:24,469
nearly a thousand points in a half hour
1
00:56:19,179 --> 00:56:27,679
so what went wrong by that point in time
1
00:56:24,469 --> 00:56:30,199
more than 60% of all the trades that
1
00:56:27,679 --> 00:56:35,379
took place on the stock exchange were
1
00:56:30,199 --> 00:56:35,379
actually being initiated by computers
1
00:56:38,000 --> 00:56:42,289
the short story of what happened in the
1
00:56:39,739 --> 00:56:44,750
flash crash is that algorithms responded
1
00:56:42,289 --> 00:56:46,609
to algorithms and it compounded upon
1
00:56:44,750 --> 00:56:48,798
itself over and over and over again in a
1
00:56:46,608 --> 00:56:51,528
matter of minutes at one point the
1
00:56:48,798 --> 00:56:54,139
market fell as if down a well
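[As a toy illustration of "algorithms responding to algorithms": the sketch below uses invented stop-loss triggers and an invented external shock, nothing like real market microstructure, to show how each automated seller's trigger deepens the very drop that trips the next one.]

```python
# A toy feedback loop, not a model of actual high-frequency trading:
# each "algorithm" dumps stock once the price has fallen past its
# (hypothetical) trigger, and that selling deepens the fall.
price, drop_so_far = 100.0, 0.0
triggers = [0.5, 1.0, 1.5, 2.0, 3.0]     # invented percent-drop triggers
fired = [False] * len(triggers)

for tick in range(10):
    move = -0.6                           # small external shock each tick (%)
    for i, trig in enumerate(triggers):
        if not fired[i] and drop_so_far >= trig:
            fired[i] = True
            move -= 1.0                   # each bot's selling adds to the drop
    price *= 1 + move / 100
    drop_so_far -= move                   # move is negative, so this grows
    print(f"tick {tick}: price {price:.2f}")
```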
1
00:56:51,528 --> 00:56:55,789
there is no regulatory body that can
1
00:56:54,139 --> 00:56:58,548
adapt quickly enough to prevent
1
00:56:55,789 --> 00:57:01,609
potentially disastrous consequences of
1
00:56:58,548 --> 00:57:05,268
AI operating in our financial systems
1
00:57:01,608 --> 00:57:06,889
they are so primed for manipulation let's
1
00:57:05,268 --> 00:57:09,288
talk about the speed with which we are
1
00:57:06,889 --> 00:57:11,088
watching this market deteriorate that's
1
00:57:09,289 --> 00:57:14,660
the type of AI run amok that scares
1
00:57:11,088 --> 00:57:17,949
people when you give them a goal they
1
00:57:14,659 --> 00:57:20,179
will relentlessly pursue that goal
1
00:57:17,949 --> 00:57:20,869
how many computer programs are there
1
00:57:20,179 --> 00:57:25,789
like this
1
00:57:20,869 --> 00:57:29,659
nobody knows one of the fascinating
1
00:57:25,789 --> 00:57:32,259
aspects about AI in general is that no
1
00:57:29,659 --> 00:57:35,359
one really understands how it works
1
00:57:32,260 --> 00:57:39,380
even people who create AI don't really
1
00:57:35,360 --> 00:57:41,960
fully understand because it has millions
1
00:57:39,380 --> 00:57:43,700
of elements it becomes completely
1
00:57:41,960 --> 00:57:47,920
impossible for a human being to
1
00:57:43,699 --> 00:57:47,919
understand what's going on
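[One way to make "millions of elements" concrete: even a modest fully connected network over a small input image carries tens of millions of individually meaningless weights. The layer sizes below are illustrative, not any particular system's architecture.]

```python
# Back-of-envelope arithmetic for why "reading" a trained network fails:
# the parameter count of a plain fully connected net over an 84x84 image.
layers = [84 * 84, 2048, 2048, 512, 18]       # input pixels ... output units

params = sum(n_in * n_out + n_out             # weight matrix plus biases
             for n_in, n_out in zip(layers, layers[1:]))

print(f"{params:,} learned parameters")       # ~19.7 million for these sizes
```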
1
00:57:53,119 --> 00:57:58,409
Microsoft had set up this artificial
1
00:57:56,400 --> 00:58:03,300
intelligence called Tay on Twitter which
1
00:57:58,409 --> 00:58:05,869
was a chatbot it started out in the
1
00:58:03,300 --> 00:58:08,220
morning and Tay was starting to tweet and
1
00:58:05,869 --> 00:58:12,119
learning from stuff that was being sent
1
00:58:08,219 --> 00:58:14,039
to him from other Twitter people because
1
00:58:12,119 --> 00:58:17,130
some people like trolls attacked him
1
00:58:14,039 --> 00:58:18,840
within 24 hours the Microsoft bot became
1
00:58:17,130 --> 00:58:22,079
a terrible person
1
00:58:18,840 --> 00:58:23,820
they had to literally pull Tay off the
1
00:58:22,079 --> 00:58:29,400
net because he had turned into a monster
1
00:58:23,820 --> 00:58:32,430
a misanthropic racist horrible person you
1
00:58:29,400 --> 00:58:35,119
never want to meet and nobody had
1
00:58:32,429 --> 00:58:35,119
foreseen this
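[A hedged sketch of the failure mode, not of Microsoft's actual system: if a bot naively learns from every message it receives, whoever sends the most messages decides what it says.]

```python
# A toy "chatbot" that echoes whatever wording it has seen most often,
# so a coordinated group of trolls can steer it within minutes.
from collections import Counter

model = Counter()

def learn(message):
    model[message] += 1                    # naive update: trust every input

def reply():
    return model.most_common(1)[0][0]      # parrot the most frequent input

for msg in ["have a nice day"] * 5:        # ordinary users
    learn(msg)
for msg in ["something hateful"] * 50:     # a troll brigade outweighs them
    learn(msg)

print(reply())                             # the trolls now speak for the bot
```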
1
00:58:35,699 --> 00:58:40,739
the whole idea of AI is that we are not
1
00:58:38,309 --> 00:58:44,760
telling it exactly how to achieve a
1
00:58:40,739 --> 00:58:46,519
given outcome or a goal AI develops on
1
00:58:44,760 --> 00:58:49,920
its own
1
00:58:46,519 --> 00:58:51,568
we're worried about super intelligent AI
1
00:58:49,920 --> 00:58:54,990
the master chess player that will
1
00:58:51,568 --> 00:58:58,019
outmaneuver us but AI won't have to
1
00:58:54,989 --> 00:59:00,929
actually be that smart to have massively
1
00:58:58,019 --> 00:59:02,969
disruptive effects on human civilization
1
00:59:00,929 --> 00:59:04,289
as we've seen over the last century it
1
00:59:02,969 --> 00:59:06,629
doesn't necessarily take a genius to
1
00:59:04,289 --> 00:59:09,000
knock history off in a particular
1
00:59:06,630 --> 00:59:10,730
direction and it won't take a genius AI
1
00:59:09,000 --> 00:59:13,349
to do the same thing
1
00:59:10,730 --> 00:59:16,139
bogus election news stories generated
1
00:59:13,349 --> 00:59:19,440
more engagement on Facebook than top
1
00:59:16,139 --> 00:59:21,170
real stories Facebook really is the
1
00:59:19,440 --> 00:59:25,079
elephant in the room
1
00:59:21,170 --> 00:59:29,789
AI runs the Facebook newsfeed the task
1
00:59:25,079 --> 00:59:32,789
for AI is keeping users engaged but no
1
00:59:29,789 --> 00:59:36,630
one really understands exactly how this
1
00:59:32,789 --> 00:59:38,730
AI is achieving this goal Facebook is
1
00:59:36,630 --> 00:59:41,280
building an elegant mirrored wall around
1
00:59:38,730 --> 00:59:43,139
us a mirror that we can ask who's the
1
00:59:41,280 --> 00:59:46,950
fairest of them all and it will answer
1
00:59:43,139 --> 00:59:49,588
you you time and again it slowly begins
1
00:59:46,949 --> 00:59:53,608
to warp our sense of reality warp our
1
00:59:49,588 --> 00:59:56,699
sense of politics history global events
1
00:59:53,608 --> 00:59:59,730
until determining what's true and what's
1
00:59:56,699 --> 01:00:01,108
not true is virtually impossible
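[As a toy illustration of the engagement objective described here, and not Facebook's real ranking system: a feed that greedily sorts stories by each user's past click-through rate per topic will narrow toward whatever the user already agrees with. All topics and numbers below are invented.]

```python
# A toy engagement-maximizing ranker: show highest predicted clicks first.
clicks = {"politics_my_side": 0.9, "politics_other_side": 0.1,
          "sports": 0.4, "science": 0.3}   # hypothetical per-topic CTRs

stories = [("politics_my_side", "story A"), ("science", "story B"),
           ("politics_other_side", "story C"), ("sports", "story D")]

feed = sorted(stories, key=lambda s: clicks[s[0]], reverse=True)
for topic, story in feed:
    print(f"{story}  (predicted engagement {clicks[topic]:.0%})")
# Each click then nudges the winning topic's CTR higher, so the
# mirrored-wall effect described above compounds over time.
```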
1
00:59:59,730 --> 01:00:04,179
[Music]
1
01:00:01,108 --> 01:00:06,190
the problem is that AI doesn't
1
01:00:04,179 --> 01:00:09,789
understand the AI just had a mission
1
01:00:06,190 --> 01:00:13,030
maximize user engagement and it achieved
1
01:00:09,789 --> 01:00:16,480
that nearly two billion people spend
1
01:00:13,030 --> 01:00:20,109
nearly 1 hour on average a day basically
1
01:00:16,480 --> 01:00:23,650
interacting with AI that is shaping
1
01:00:20,108 --> 01:00:26,318
their experience even Facebook engineers
1
01:00:23,650 --> 01:00:28,660
they don't like fake news it's very bad
1
01:00:26,318 --> 01:00:30,190
business they want to get rid of fake
1
01:00:28,659 --> 01:00:32,440
news it's just very difficult to do
1
01:00:30,190 --> 01:00:34,420
because how do you recognize news is
1
01:00:32,440 --> 01:00:38,559
fake if you cannot read all of that
1
01:00:34,420 --> 01:00:41,530
news personally there's so much active
1
01:00:38,559 --> 01:00:44,290
misinformation and it's packaged very
1
01:00:41,530 --> 01:00:46,690
well and it looks the same when you see
1
01:00:44,289 --> 01:00:49,000
it on a Facebook page or you turn on
1
01:00:46,690 --> 01:00:51,159
your television it's not terribly
1
01:00:49,000 --> 01:00:54,068
sophisticated but it is terribly
1
01:00:51,159 --> 01:00:56,558
powerful and what it means is that your
1
01:00:54,068 --> 01:00:58,869
view of the world which 20 years ago was
1
01:00:56,559 --> 01:01:01,359
determined if you watched the nightly news
1
01:00:58,869 --> 01:01:03,039
by three different networks the three
1
01:01:01,358 --> 01:01:04,298
anchors who endeavored to try to get it
1
01:01:03,039 --> 01:01:05,679
right you might have had a little bias
1
01:01:04,298 --> 01:01:07,059
one way or the other but largely
1
01:01:05,679 --> 01:01:10,808
speaking we could all agree on an
1
01:01:07,059 --> 01:01:13,599
objective reality that objectivity is
1
01:01:10,809 --> 01:01:15,869
gone and Facebook has completely
1
01:01:13,599 --> 01:01:15,869
annihilated it
1
01:01:17,230 --> 01:01:21,699
if most of your understanding of how the
1
01:01:19,579 --> 01:01:24,529
world works is derived from Facebook
1
01:01:21,699 --> 01:01:26,899
facilitated by algorithmic software that
1
01:01:24,530 --> 01:01:29,660
tries to show you the news you want to
1
01:01:26,900 --> 01:01:32,119
see that's a terribly dangerous thing
1
01:01:29,659 --> 01:01:35,750
and the idea that we have not only set
1
01:01:32,119 --> 01:01:38,869
that in motion but allowed bad-faith
1
01:01:35,750 --> 01:01:45,139
actors access to that information this
1
01:01:38,869 --> 01:01:46,579
is a recipe for disaster I think there
1
01:01:45,139 --> 01:01:49,539
will definitely be lots of bad actors
1
01:01:46,579 --> 01:01:52,549
trying to manipulate the world with AI
1
01:01:49,539 --> 01:01:54,340
2016 was a perfect example of an
1
01:01:52,550 --> 01:01:56,180
election where there was lots of AI
1
01:01:54,340 --> 01:01:58,850
producing lots of fake news and
1
01:01:56,179 --> 01:02:02,449
distributing it for a purpose for a
1
01:01:58,849 --> 01:02:05,119
result ladies and gentlemen honourable
1
01:02:02,449 --> 01:02:07,279
colleagues it's my privilege to speak to
1
01:02:05,119 --> 01:02:09,559
you today about the power of big data
1
01:02:07,280 --> 01:02:12,410
and psychographics in the electoral
1
01:02:09,559 --> 01:02:15,049
process and specifically to talk about
1
01:02:12,409 --> 01:02:16,940
the work that we contributed to Senator
1
01:02:15,050 --> 01:02:20,360
Cruz's presidential primary campaign
1
01:02:16,940 --> 01:02:22,130
Cambridge Analytica emerged quietly as a
1
01:02:20,360 --> 01:02:24,890
company that according to its own hype
1
01:02:22,130 --> 01:02:28,010
has the ability to use this
1
01:02:24,889 --> 01:02:32,690
tremendous amount of data in order to
1
01:02:28,010 --> 01:02:35,420
effect societal change in 2016 they had
1
01:02:32,690 --> 01:02:37,970
three major clients Ted Cruz was one of
1
01:02:35,420 --> 01:02:40,940
them it's easy to forget that only 18
1
01:02:37,969 --> 01:02:42,409
months ago Senator Cruz was one of the
1
01:02:40,940 --> 01:02:45,619
less popular candidates seeking
1
01:02:42,409 --> 01:02:48,559
nomination so what was not possible
1
01:02:45,619 --> 01:02:51,170
maybe like 10 or 15 years ago was that
1
01:02:48,559 --> 01:02:53,539
you can send fake news to exactly the
1
01:02:51,170 --> 01:02:55,610
people that you want to send it to and
1
01:02:53,539 --> 01:02:58,489
then you could actually see how he or
1
01:02:55,610 --> 01:03:00,800
she reacts on Facebook and then adjust
1
01:02:58,489 --> 01:03:03,259
that information according to the
1
01:03:00,800 --> 01:03:05,390
feedback that you got and so you can
1
01:03:03,260 --> 01:03:08,360
start developing kind of a real-time
1
01:03:05,389 --> 01:03:10,519
management of a population in this case
1
01:03:08,360 --> 01:03:13,130
we've zoned in on a group we've called
1
01:03:10,519 --> 01:03:15,769
persuasion these are people who are
1
01:03:13,130 --> 01:03:17,599
definitely going to vote to caucus but
1
01:03:15,769 --> 01:03:19,340
they need moving from the center a
1
01:03:17,599 --> 01:03:20,989
little bit more towards the right in
1
01:03:19,340 --> 01:03:23,750
order to support Cruz they need a
1
01:03:20,989 --> 01:03:26,149
persuasion message gun rights I've
1
01:03:23,750 --> 01:03:27,079
selected that narrows the field slightly
1
01:03:26,150 --> 01:03:29,568
more and
1
01:03:27,079 --> 01:03:31,160
we know that we need a message on gun
1
01:03:29,568 --> 01:03:33,528
rights it needs to be a persuasion
1
01:03:31,159 --> 01:03:35,478
message and it needs to be nuanced
1
01:03:33,528 --> 01:03:37,518
according to the certain personality
1
01:03:35,478 --> 01:03:40,368
that we're interested in through social
1
01:03:37,518 --> 01:03:42,468
media there's an infinite amount of
1
01:03:40,369 --> 01:03:44,778
information that you can gather about a
1
01:03:42,469 --> 01:03:47,269
person we have somewhere close to four
1
01:03:44,778 --> 01:03:49,909
or five thousand data points on every
1
01:03:47,268 --> 01:03:53,268
adult in the United States it's about
1
01:03:49,909 --> 01:03:55,848
targeting the individual it's like a
1
01:03:53,268 --> 01:03:57,828
weapon which can be used in the totally
1
01:03:55,849 --> 01:03:59,900
wrong direction that's the problem with
1
01:03:57,829 --> 01:04:02,568
all of this data it's almost as if we
1
01:03:59,900 --> 01:04:05,690
built the bullet before we built the gun
1
01:04:02,568 --> 01:04:08,268
Ted Cruz employed our data our
1
01:04:05,690 --> 01:04:12,259
behavioral insights he started from a
1
01:04:08,268 --> 01:04:17,118
base of less than 5% and had a very slow
1
01:04:12,259 --> 01:04:18,978
and steady but firm rise to above 35%
1
01:04:17,119 --> 01:04:20,959
making him obviously the second most
1
01:04:18,978 --> 01:04:23,568
threatening contender in the race now
1
01:04:20,958 --> 01:04:26,389
clearly the Cruz campaign is over now
1
01:04:23,568 --> 01:04:28,338
but what I can tell you is that of the
1
01:04:26,389 --> 01:04:30,588
two candidates left in this
1
01:04:28,338 --> 01:04:36,228
election one of them is using these
1
01:04:30,588 --> 01:04:39,259
technologies I Donald Trump do solemnly
1
01:04:36,228 --> 01:04:43,778
swear that I will faithfully execute the
1
01:04:39,259 --> 01:04:43,778
office of President of the United States
1
01:04:44,260 --> 01:04:47,449
[Music]
1
01:04:48,958 --> 01:04:54,969
elections are a marginal exercise it doesn't
1
01:04:52,059 --> 01:05:00,339
take a very sophisticated AI in order to
1
01:04:54,969 --> 01:05:02,438
have a disproportionate impact before
1
01:05:00,338 --> 01:05:06,099
Trump Brexit was another supposed
1
01:05:02,438 --> 01:05:09,788
client well at 20 minutes to 5 we can
1
01:05:06,099 --> 01:05:11,439
now say the decision taken in 1975 by
1
01:05:09,789 --> 01:05:14,919
this country to join the Common Market
1
01:05:11,438 --> 01:05:18,998
has been reversed by this referendum to
1
01:05:14,918 --> 01:05:21,699
leave the EU Cambridge Analytica
1
01:05:18,998 --> 01:05:24,158
allegedly used AI to push through two of
1
01:05:21,699 --> 01:05:28,019
the most ground-shaking pieces of
1
01:05:24,159 --> 01:05:30,429
political change in the last 50 years
1
01:05:28,018 --> 01:05:32,379
these are epochal events and if we
1
01:05:30,429 --> 01:05:34,659
believe the hype they are connected
1
01:05:32,380 --> 01:05:37,119
directly to a piece of software
1
01:05:34,659 --> 01:05:39,400
essentially created by a professor at
1
01:05:37,119 --> 01:05:41,590
Stanford
1
01:05:39,400 --> 01:05:45,160
[Music]
1
01:05:41,590 --> 01:05:47,380
back in 2013 I described that what
1
01:05:45,159 --> 01:05:49,349
they're doing is possible and warned
1
01:05:47,380 --> 01:05:52,960
against this happening in the future
1
01:05:49,349 --> 01:05:54,549
at the time Michal Kosinski was a young
1
01:05:52,960 --> 01:05:57,070
Polish researcher working at the
1
01:05:54,550 --> 01:06:00,789
Psychometrics Centre so what Michal had
1
01:05:57,070 --> 01:06:04,200
done was to gather the largest-ever data
1
01:06:00,789 --> 01:06:07,349
set of how people behaved on Facebook
1
01:06:04,199 --> 01:06:09,789
psychometrics is trying to measure
1
01:06:07,349 --> 01:06:12,460
psychological traits such as personality
1
01:06:09,789 --> 01:06:15,489
intelligence political views and so on
1
01:06:12,460 --> 01:06:18,360
now traditionally those traits were
1
01:06:15,489 --> 01:06:20,439
measured using tests and questionnaires
1
01:06:18,360 --> 01:06:21,970
personality tests the most benign thing
1
01:06:20,440 --> 01:06:23,050
you could possibly think of something
1
01:06:21,969 --> 01:06:26,469
that doesn't necessarily have a lot of
1
01:06:23,050 --> 01:06:29,019
utility right our idea was that instead
1
01:06:26,469 --> 01:06:30,489
of tests and questionnaires we could simply
1
01:06:29,019 --> 01:06:32,610
look at the digital footprints of
1
01:06:30,489 --> 01:06:36,869
behaviors that we are all leaving behind
1
01:06:32,610 --> 01:06:40,120
to understand openness conscientiousness
1
01:06:36,869 --> 01:06:42,579
neuroticism you can easily buy personal
1
01:06:40,119 --> 01:06:44,769
data such as where you live what club
1
01:06:42,579 --> 01:06:47,529
memberships you've joined which gym you
1
01:06:44,769 --> 01:06:50,110
go to there are actually marketplaces
1
01:06:47,530 --> 01:06:51,519
for personal data it turns out we can
1
01:06:50,110 --> 01:06:54,490
discover an awful lot about what you're
1
01:06:51,519 --> 01:06:58,239
gonna do based on a very very tiny set
1
01:06:54,489 --> 01:07:01,149
of information we are training deep
1
01:06:58,239 --> 01:07:03,869
learning networks to infer intimate
1
01:07:01,150 --> 01:07:07,980
traits people's political views
1
01:07:03,869 --> 01:07:12,779
personality intelligence sexual orientation
1
01:07:07,980 --> 01:07:12,780
just from an image of someone's face
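[A toy sketch of the digital-footprints idea described a moment ago: a small logistic regression that predicts a binary trait from which pages someone has liked. The pages, data, and trait below are all invented; the published psychometrics work applied the same family of linear models to millions of real Facebook Likes.]

```python
# Predicting a hypothetical trait from invented "Like" data with a tiny
# hand-rolled logistic regression, trained by plain gradient descent.
import math

PAGES = ["page_a", "page_b", "page_c", "page_d"]   # hypothetical Likes

# Invented training set: (likes vector, trait label)
data = [([1, 0, 1, 0], 1), ([1, 1, 1, 0], 1), ([0, 1, 0, 1], 0),
        ([0, 0, 1, 1], 0), ([1, 0, 0, 0], 1), ([0, 1, 1, 1], 0)]

w, b = [0.0] * len(PAGES), 0.0

def predict(x):
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))          # probability of the trait

for _ in range(2000):
    for x, y in data:
        err = predict(x) - y               # gradient of log-loss
        b -= 0.1 * err
        w = [wi - 0.1 * err * xi for wi, xi in zip(w, x)]

print(predict([1, 0, 1, 0]))               # high: resembles label-1 users
print(predict([0, 1, 0, 1]))               # low: resembles label-0 users
```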
1
01:07:17,530 --> 01:07:22,120
now think about countries which are not
1
01:07:19,480 --> 01:07:24,400
so free and open-minded if you can
1
01:07:22,119 --> 01:07:26,099
reveal people's religious views or
1
01:07:24,400 --> 01:07:29,860
political views or sexual orientation
1
01:07:26,099 --> 01:07:32,909
based on only profile pictures this
1
01:07:29,860 --> 01:07:40,950
could be literally an issue of life and
1
01:07:32,909 --> 01:07:40,949
death I think there's no going back
1
01:07:42,150 --> 01:07:49,019
you know what the Turing test is it's
1
01:07:46,599 --> 01:07:51,190
when a human interacts with a computer
1
01:07:49,019 --> 01:07:54,099
and if the human doesn't know they're
1
01:07:51,190 --> 01:07:58,389
interacting with a computer the test is
1
01:07:54,099 --> 01:07:59,799
passed and over the next few days you're
1
01:07:58,389 --> 01:08:01,869
gonna be the human component in the
1
01:07:59,800 --> 01:08:05,289
Turing test holy
1
01:08:01,869 --> 01:08:07,049
that's right Caleb you got it because if
1
01:08:05,289 --> 01:08:10,809
that test is passed
1
01:08:07,050 --> 01:08:12,960
you are dead center of the greatest
1
01:08:10,809 --> 01:08:15,250
scientific event in the history of man
1
01:08:12,960 --> 01:08:18,850
if you've created a conscious machine
1
01:08:15,250 --> 01:08:21,539
it's not the history of man that's the
1
01:08:18,850 --> 01:08:21,539
history of gods
1
01:08:23,260 --> 01:08:29,509
[Music]
1
01:08:27,288 --> 01:08:31,929
it's almost like technology is a god in
1
01:08:29,509 --> 01:08:31,929
and of itself
1
01:08:33,779 --> 01:08:40,089
like the weather we can't impact it we
1
01:08:36,479 --> 01:08:45,639
can't slow it down we can't stop it
1
01:08:40,088 --> 01:08:47,920
we feel powerless if we think of God as
1
01:08:45,640 --> 01:08:49,500
an unlimited amount of intelligence the
1
01:08:47,920 --> 01:08:52,210
closest we can get to that is by
1
01:08:49,500 --> 01:08:54,338
evolving our own intelligence by merging
1
01:08:52,210 --> 01:08:58,619
with the artificial intelligence we're
1
01:08:54,338 --> 01:09:00,489
creating today our computers phones
1
01:08:58,619 --> 01:09:04,180
applications give us superhuman
1
01:09:00,489 --> 01:09:06,929
capability so as the old maxim says if
1
01:09:04,180 --> 01:09:06,930
you can't beat them join them
1
01:09:07,288 --> 01:09:12,809
it's about a human-machine partnership I
1
01:09:10,130 --> 01:09:14,849
mean we already see how you know our
1
01:09:12,809 --> 01:09:16,380
phones for example act as a memory
1
01:09:14,849 --> 01:09:17,969
prosthesis right I don't have to
1
01:09:16,380 --> 01:09:19,219
remember your phone number anymore
1
01:09:17,969 --> 01:09:22,288
because it's on my phone
1
01:09:19,219 --> 01:09:24,088
it's about machines augmenting our human
1
01:09:22,288 --> 01:09:26,939
abilities as opposed to like completely
1
01:09:24,088 --> 01:09:28,318
displacing them if you look at all the
1
01:09:26,939 --> 01:09:30,118
objects that have made the leap from
1
01:09:28,319 --> 01:09:34,739
analog to digital over the last 20 years
1
01:09:30,118 --> 01:09:37,139
it's a lot we're the last analog object
1
01:09:34,738 --> 01:09:38,338
in the digital universe and the problem
1
01:09:37,139 --> 01:09:41,940
with that of course is that the data
1
01:09:38,338 --> 01:09:45,778
input output is very limited it's this
1
01:09:41,939 --> 01:09:47,158
it's these our eyes are pretty good
1
01:09:45,779 --> 01:09:50,969
we're able to take in a lot of visual
1
01:09:47,158 --> 01:09:54,238
information but our information output
1
01:09:50,969 --> 01:09:56,130
is very very very low the reason this is
1
01:09:54,238 --> 01:09:58,678
important is if we envision a scenario
1
01:09:56,130 --> 01:10:01,349
where AI is playing a more prominent
1
01:09:58,679 --> 01:10:03,868
role in society we want good ways to
1
01:10:01,349 --> 01:10:07,250
interact with this technology so that it
1
01:10:03,868 --> 01:10:07,250
ends up augmenting us
1
01:10:09,130 --> 01:10:17,079
I think it's incredibly important that AI
1
01:10:10,840 --> 01:10:19,930
not be other it must be us and I could
1
01:10:17,079 --> 01:10:21,939
be wrong about what I'm saying I'm
1
01:10:19,930 --> 01:10:24,890
certainly open to ideas if anybody can
1
01:10:21,939 --> 01:10:26,629
suggest a path that's better
1
01:10:24,890 --> 01:10:31,030
but I think we're really gonna have to
1
01:10:26,630 --> 01:10:31,029
either merge with AI or be left behind
1
01:10:31,109 --> 01:10:39,049
[Music]
1
01:10:36,618 --> 01:10:40,880
it's hard to kind of think of unplugging
1
01:10:39,050 --> 01:10:43,779
a system that's distributed everywhere
1
01:10:40,880 --> 01:10:47,480
on the planet that's distributed now
1
01:10:43,779 --> 01:10:50,229
across the solar system you can't just
1
01:10:47,479 --> 01:10:50,229
you know shut that off
1
01:10:50,288 --> 01:10:54,279
we've opened Pandora's box we've
1
01:10:52,118 --> 01:10:57,368
unleashed forces that we can't control
1
01:10:54,279 --> 01:10:59,109
we can't stop we're in the midst of
1
01:10:57,368 --> 01:11:00,279
essentially creating a new life-form on
1
01:10:59,109 --> 01:11:04,188
earth
1
01:11:00,279 --> 01:11:04,188
[Music]
1
01:11:06,130 --> 01:11:10,969
we don't know what happens next we don't
1
01:11:08,929 --> 01:11:13,158
know what shape the intellect of a
1
01:11:10,969 --> 01:11:16,038
machine will be when that intellect is
1
01:11:13,158 --> 01:11:17,599
far beyond human capabilities it's just
1
01:11:16,038 --> 01:11:22,050
not something that's possible
1
01:11:17,600 --> 01:11:24,800
[Music]
1
01:11:22,050 --> 01:11:27,570
[Applause]
1
01:11:24,800 --> 01:11:29,369
the least scary future I can think of is
1
01:11:27,569 --> 01:11:34,380
one where we have at least democratized
1
01:11:29,369 --> 01:11:36,569
AI because if one company or small group
1
01:11:34,380 --> 01:11:38,220
of people manages to develop godlike
1
01:11:36,569 --> 01:11:39,609
digital super intelligence they could
1
01:11:38,220 --> 01:11:40,809
take over the world
1
01:11:39,609 --> 01:11:42,549
[Music]
1
01:11:40,809 --> 01:11:46,239
at least when there's an evil dictator
1
01:11:42,550 --> 01:11:48,340
that human is going to die but for an AI
1
01:11:46,238 --> 01:11:51,399
there would be no death it would live
1
01:11:48,340 --> 01:11:54,489
forever and then you'd have an immortal
1
01:11:51,399 --> 01:11:57,579
dictator from which we can never escape
1
01:11:54,489 --> 01:11:57,579
[Music]
1
01:12:10,930 --> 01:12:14,048
[Music]
1
01:12:19,479 --> 01:12:22,669
[Music]
1
01:12:28,310 --> 01:12:33,919
[Music]
1
01:12:36,270 --> 01:12:48,399
[Music]
1
01:13:12,380 --> 01:13:15,630
[Music]
1
01:13:24,590 --> 01:13:27,699
[Music]
1
01:13:30,310 --> 01:13:33,659
[Music]
1
01:13:37,140 --> 01:13:53,560
[Music]
1
01:14:01,899 --> 01:14:05,339
[Music]
1
01:14:12,720 --> 01:14:19,159
[Music]
1
01:14:22,909 --> 01:14:33,170
[Music]
1
01:14:31,949 --> 01:14:52,929
[Applause]
1
01:14:33,170 --> 01:14:52,929
[Music]
1
01:15:00,229 --> 01:15:08,469
[Music]
1
01:15:14,829 --> 01:15:31,640
[Music]
1
01:15:39,329 --> 01:15:44,238
[Music]
1
01:15:57,850 --> 01:16:29,590
[Applause]
1
01:15:58,930 --> 01:16:29,590
[Music]
1
01:16:34,359 --> 01:16:42,478
[Music]
1
01:16:47,000 --> 01:16:52,939
[Music]
1
01:16:59,350 --> 01:17:33,329
[Music]
1
01:17:32,430 --> 01:17:38,050
[Applause]
1
01:17:33,329 --> 01:17:38,050
[Music]
1
01:17:39,579 --> 01:17:45,409
[Applause]
1
01:17:42,189 --> 01:17:45,409
[Music]