1
00:00:35,838 --> 00:00:43,640
what we're on the brink of is a world of
1
00:00:39,689 --> 00:00:47,009
increasingly intense sophisticated
1
00:00:43,640 --> 00:00:49,439
artificial intelligence technology is
1
00:00:47,009 --> 00:00:51,298
evolving so much faster than our society
1
00:00:49,439 --> 00:00:53,750
has the ability to protect us as
1
00:00:51,298 --> 00:00:53,750
citizens
1
00:00:57,079 --> 00:01:04,250
[Music]
1
00:01:01,990 --> 00:01:07,189
you have a networked intelligence that
1
00:01:04,250 --> 00:01:14,629
watches us knows everything about us and
1
00:01:07,189 --> 00:01:17,299
begins to try to change us technology is
1
00:01:14,629 --> 00:01:21,769
never good or bad it's what we do with
1
00:01:17,299 --> 00:01:22,969
the technology eventually millions of
1
00:01:21,769 --> 00:01:24,920
people are going to be thrown out of
1
00:01:22,969 --> 00:01:26,939
jobs because their skills are going to
1
00:01:24,920 --> 00:01:35,019
be obsolete
1
00:01:26,939 --> 00:01:37,329
unemployment regardless of whether to
[Music]
1
00:02:24,590 --> 00:02:32,280
we created it so I think as we move
1
00:02:29,370 --> 00:02:36,060
forward this intelligence will contain
1
00:02:32,280 --> 00:02:40,379
parts of us but I think the question is
1
00:02:36,060 --> 00:02:43,550
will it contain the good parts or the
1
00:02:40,379 --> 00:02:43,549
bad parts
1
00:03:01,509 --> 00:03:04,649
[Music]
1
00:03:05,068 --> 00:03:09,009
the survivors called the war Judgment
1
00:03:08,348 --> 00:03:11,348
Day
1
00:03:09,009 --> 00:03:13,378
they lived only to face a new
1
00:03:11,348 --> 00:03:13,378
nightmare
1
00:03:13,770 --> 00:03:19,560
against the machines I think we
1
00:03:16,919 --> 00:03:21,659
completely messed us up I think
1
00:03:19,560 --> 00:03:25,289
Hollywood has managed to inoculate the
1
00:03:21,659 --> 00:03:27,780
general public against this question the
1
00:03:25,289 --> 00:03:30,810
idea of machines that will take over the
1
00:03:27,780 --> 00:03:35,939
public is about to get blindsided by
1
00:04:02,189 --> 00:04:04,250
this
1
00:04:04,870 --> 00:04:07,960
[Music]
1
00:04:10,879 --> 00:04:17,279
as a society and as individuals we're
1
00:04:14,699 --> 00:04:21,930
increasingly surrounded by a machine
1
00:04:17,279 --> 00:04:24,299
intelligence we carry this pocket device
1
00:04:21,930 --> 00:04:26,670
in the palm of our hand that we use to
1
00:04:24,300 --> 00:04:29,340
make a striking array of life decisions
1
00:04:26,670 --> 00:04:33,140
right now aided by a set of distant
1
00:04:29,339 --> 00:04:33,139
algorithms we have no understanding
1
00:04:34,730 --> 00:04:38,759
they're already pretty jaded about the
1
00:04:37,199 --> 00:04:43,620
idea that we can talk to our phone and
1
00:04:38,759 --> 00:04:45,469
that it mostly understands us five years
1
00:04:43,620 --> 00:04:49,050
almost eliminate car accidents with
1
00:05:19,500 --> 00:05:24,149
automation 30,000 lives in the US alone
1
00:05:22,589 --> 00:05:26,099
about a million around the world per
1
00:05:24,149 --> 00:05:29,009
year
1
00:05:26,100 --> 00:05:31,230
in healthcare early indicators are the
1
00:05:29,009 --> 00:05:32,399
name of the game in that space so that's
1
00:05:31,230 --> 00:05:35,670
another place where it can save
1
00:05:32,399 --> 00:05:38,399
somebody's life here in the breast
1
00:05:35,670 --> 00:05:41,699
cancer Center all the things that the
1
00:05:38,399 --> 00:05:44,209
radiologist brain does in two minutes
1
00:05:41,699 --> 00:05:46,740
a computer does instantaneously a
1
00:05:44,209 --> 00:05:49,409
computer has looked at 1 million
1
00:05:46,740 --> 00:05:51,780
we'll actually understand aging we'll be
1
00:06:26,069 --> 00:06:30,870
able to stop it there's really no limit
1
00:06:29,100 --> 00:06:33,260
to what intelligent machines can do for
1
00:06:30,870 --> 00:06:33,259
the human race
1
00:06:36,300 --> 00:06:43,629
how could a smarter machine not be a
1
00:06:39,160 --> 00:06:46,180
better machine it's hard to say exactly
1
00:06:43,629 --> 00:06:48,930
when I began to think that that was a
1
00:06:46,180 --> 00:06:48,930
bit naive
1
00:06:49,189 --> 00:07:00,430
[Music]
1
00:06:57,060 --> 00:07:02,019
Stuart Russell he's basically a God in
1
00:07:00,430 --> 00:07:03,788
the field of artificial intelligence he
1
00:07:02,019 --> 00:07:06,188
wrote the book that almost every
1
00:07:03,788 --> 00:07:08,620
University uses I used to say it's the
1
00:07:06,189 --> 00:07:11,360
bright shiny thing it was social
1
00:07:44,389 --> 00:07:48,110
networking and social media over the last
1
00:07:45,949 --> 00:07:50,899
decade and it's pretty clear the bit has
1
00:07:48,110 --> 00:07:53,000
flipped and it starts with machine
1
00:07:50,899 --> 00:07:55,609
learning when we look back at this
1
00:07:53,000 --> 00:07:57,439
moment what was the first AI it's not
1
00:07:55,610 --> 00:07:59,330
sexy and it isn't the thing we see in
1
00:07:57,439 --> 00:08:02,600
the movies but you'd make a great case
1
00:07:59,329 --> 00:08:06,409
that Google created a search engine but
1
00:08:02,600 --> 00:08:08,030
president of Kazakhstan and it'll just
1
00:08:31,519 --> 00:08:35,049
tell you you don't have to go to the
1
00:08:33,320 --> 00:08:38,210
Kazakhstan national website to find out
1
00:08:35,049 --> 00:08:40,789
you didn't use to be able to do that that
1
00:08:38,210 --> 00:08:43,250
is artificial intelligence years from
1
00:08:40,789 --> 00:08:46,069
now when we try to understand we will
1
00:08:43,250 --> 00:08:48,710
say well how did we miss it it's one of
1
00:08:46,070 --> 00:08:51,200
these striking contradictions that we're
1
00:08:48,710 --> 00:08:53,150
facing Google and Facebook et al have
1
00:08:51,200 --> 00:08:55,970
built businesses on giving us as a
1
00:08:53,149 --> 00:08:58,519
society
computer in your purse I mean how
1
00:09:30,470 --> 00:09:36,170
awesome is that I think most technology
1
00:09:33,769 --> 00:09:39,620
is meant to make things easier and
1
00:09:36,169 --> 00:09:42,559
simpler for all of us so hopefully that
1
00:09:39,620 --> 00:09:45,850
just remains the focus I think everybody
1
00:09:42,559 --> 00:09:45,849
loves their computers
1
00:09:52,129 --> 00:10:00,590
people don't realize they are constantly
1
00:09:54,769 --> 00:10:02,269
being negotiated with by machines whether
1
00:10:00,590 --> 00:10:04,879
that's the price of products in your
1
00:10:02,269 --> 00:10:07,069
Amazon cart whether you can get on a
1
00:10:04,879 --> 00:10:09,039
particular flight whether you can
1
00:10:07,070 --> 00:10:11,780
reserve a room at a particular hotel
1
00:10:09,039 --> 00:10:13,339
what you're experiencing are machine
1
00:10:11,779 --> 00:10:15,829
learning algorithms that
to say that they know more about you
1
00:10:54,830 --> 00:11:08,240
than your mother does the major cause of the
1
00:11:06,769 --> 00:11:11,059
recent AI breakthrough isn't just
1
00:11:08,240 --> 00:11:13,759
that some dude had a brilliant insight
1
00:11:11,059 --> 00:11:16,279
all of a sudden but simply that we have
1
00:11:13,759 --> 00:11:20,059
much bigger data to train them on and
1
00:11:16,279 --> 00:11:22,279
vastly better computers the magic is in
1
00:11:20,059 --> 00:11:24,079
the data it's a ton of data
1
00:11:22,279 --> 00:11:27,169
I mean it's data that's never existed
1
00:11:24,080 --> 00:11:30,259
before we've never had this data before
1
00:11:27,169 --> 00:11:33,099
we've created technologies that allow us
1
00:11:30,259 --> 00:11:35,990
to capture vast amounts of information
1
00:11:33,100 --> 00:11:37,730
if you think of a billion cell
information about the geopolitical
1
00:12:10,850 --> 00:12:17,920
situations the world today is literally
1
00:12:14,990 --> 00:12:17,919
swimming in this data
1
00:12:20,889 --> 00:12:26,870
back in 2012 IBM estimated that an
1
00:12:25,519 --> 00:12:29,929
average human being
1
00:12:26,870 --> 00:12:33,409
leaves 500 megabytes of digital
1
00:12:29,929 --> 00:12:35,959
footprints every day if you wanted to
1
00:12:33,409 --> 00:12:38,179
back up only one day's worth of data that
1
00:12:35,960 --> 00:12:41,389
humanity produces and you print it out
1
00:12:38,179 --> 00:12:45,079
on letter-size paper double-sided font
1
00:12:41,389 --> 00:12:47,149
size 12 and you stack it up it would
1
00:12:45,080 --> 00:12:51,399
reach from the surface of the earth to
1
00:12:47,149 --> 00:12:54,860
the Sun four times
artificial intelligence they've got the
1
00:13:21,200 --> 00:13:25,250
most money they've got the most talent
1
00:13:23,149 --> 00:13:28,940
they're buying up AI companies and
1
00:13:25,250 --> 00:13:30,950
robotics companies people still think of
1
00:13:28,940 --> 00:13:32,870
Google as a search engine and their email
1
00:13:30,950 --> 00:13:35,930
provider and a lot of other things that
1
00:13:32,870 --> 00:13:40,789
we use on a daily basis but behind that
1
00:13:35,929 --> 00:13:42,620
search box are 10 million servers that
1
00:13:40,789 --> 00:13:45,469
make Google the most powerful computing
1
00:13:42,620 --> 00:13:48,350
platform in the world Google is now
1
00:13:45,470 --> 00:13:52,269
difference between a 1 and a 2 is it's
1
00:14:31,519 --> 00:14:35,990
just a computation in the last half
1
00:14:34,578 --> 00:14:37,609
decade where we've made this rapid
1
00:14:35,990 --> 00:14:40,430
progress it has all been in pattern
1
00:14:37,610 --> 00:14:43,938
recognition most of the good old
1
00:14:40,429 --> 00:14:46,870
fashioned AI was when we would tell our
1
00:14:43,938 --> 00:14:49,818
computers how to play a game like chess
1
00:14:46,870 --> 00:14:53,558
from the old paradigm where you just
1
00:14:49,818 --> 00:14:53,558
tell the computer exactly what to do
1
00:14:57,299 --> 00:15:05,779
[Music]
1
00:14:59,620 --> 00:15:07,039
the Jeopardy challenge no one at the time had
1
00:15:05,779 --> 00:15:09,470
thought that a machine could have the
1
00:15:07,039 --> 00:15:11,389
the wager hello
1
00:15:50,649 --> 00:15:58,429
4:13 and a Tuesday Watson's trained on
1
00:15:55,309 --> 00:16:00,859
huge amounts of text but it's not like
1
00:15:58,429 --> 00:16:02,269
it understands what it's saying it
1
00:16:00,860 --> 00:16:04,310
doesn't know that water makes things wet
1
00:16:02,269 --> 00:16:05,809
by touching water and by seeing the way
1
00:16:04,309 --> 00:16:08,989
things behave in the world the way you
1
00:16:05,809 --> 00:16:11,719
and I do a lot of language AI today is not
1
00:16:08,990 --> 00:16:14,419
building logical models of how the world
1
00:16:11,720 --> 00:16:17,360
works rather it's looking at how the
1
00:16:14,419 --> 00:16:20,659
words appear in the context of other
1
00:16:17,360 --> 00:16:22,310
words
even more amazed when the computer beats
1
00:16:52,669 --> 00:16:56,809
humans at things that humans are
1
00:16:54,289 --> 00:16:58,589
naturally good at this is how we make
1
00:16:56,809 --> 00:17:00,989
progress
1
00:16:58,590 --> 00:17:03,028
in the early days of the Google brain
1
00:17:00,990 --> 00:17:05,068
project I gave the team a very simple
1
00:17:03,028 --> 00:17:07,439
instruction which was build the biggest
1
00:17:05,068 --> 00:17:10,500
neural network possible like a thousand
1
00:17:07,439 --> 00:17:11,759
computers a neural net is something very
1
00:17:10,500 --> 00:17:15,509
close to a simulation of how the brain
1
00:17:11,759 --> 00:17:18,420
learning and neural networks mean
1
00:17:47,130 --> 00:17:53,730
roughly the same thing deep learning is
1
00:17:50,720 --> 00:17:56,339
a totally different approach where the
1
00:17:53,730 --> 00:17:57,660
computer learns more like a toddler by
1
00:17:56,339 --> 00:18:01,379
just getting a lot of data and
1
00:17:57,660 --> 00:18:03,029
eventually figuring stuff out the
1
00:18:01,380 --> 00:18:07,620
computer just gets smarter and smarter
1
00:18:03,029 --> 00:18:09,299
as it has more experiences so imagine if
1
00:18:07,619 --> 00:18:11,489
you will the neural network were like a
1
00:18:09,299 --> 00:18:13,349
thousand computers and it wakes up not
1
00:18:11,490 --> 00:18:16,640
knowing
detect cats
1
00:19:06,269 --> 00:19:18,230
[Music]
1
00:19:15,190 --> 00:19:24,920
that's the cat face recognition wow
1
00:19:18,230 --> 00:19:26,029
that's a cat okay cool great it's all
1
00:19:24,920 --> 00:19:26,660
pretty innocuous when you're thinking
1
00:19:26,029 --> 00:19:29,859
about the future
1
00:19:26,660 --> 00:19:32,540
it all seems kind of harmless and benign
1
00:19:29,859 --> 00:19:34,459
but we're making cognitive architectures
1
00:19:32,539 --> 00:19:36,230
that will fly farther and faster than us
1
00:19:34,460 --> 00:19:39,110
and carry a bigger payload and they
1
00:19:36,230 --> 00:19:41,089
won't be warm and fuzzy I think that in
1
00:19:39,109 --> 00:19:43,699
three to five years you will see a
1
00:19:41,089 --> 00:19:47,599
computer system that will be able to
1
00:19:43,700 --> 00:19:50,870
autonomously
any shape or quantity as far as dream
1
00:20:26,058 --> 00:20:32,339
meet Baxter a revolutionary new category
1
00:20:29,490 --> 00:20:34,589
of robots with common sense Baxter
1
00:20:32,339 --> 00:20:38,009
Baxter is a really good example of the
1
00:20:34,589 --> 00:20:40,349
kind of competition we face from machines
1
00:20:38,009 --> 00:20:45,150
Baxter can do almost anything we can do
1
00:20:40,349 --> 00:20:47,699
with our hands Baxter costs about what a
1
00:20:45,150 --> 00:20:49,860
minimum-wage worker makes in a year
1
00:20:47,700 --> 00:20:51,420
but Baxter won't be taking the place of
1
00:20:49,859 --> 00:20:52,979
one minimum-wage worker he'll be taking
1
00:20:51,420 --> 00:20:56,700
the place of three because they never
1
00:20:52,980 --> 00:20:57,839
get tired they never take breaks that's
1
00:20:56,700 --> 00:21:00,569
and Lyft drivers had to find something
1
00:21:24,029 --> 00:21:26,359
new to do
1
00:21:26,380 --> 00:21:31,150
there are 4 million professional drivers
1
00:21:29,019 --> 00:21:34,139
in the United States they'll be unemployed
1
00:21:31,150 --> 00:21:37,900
soon 7 million people do data entry
1
00:21:34,140 --> 00:21:41,530
those people are going to be jobless
1
00:21:37,900 --> 00:21:43,500
a job isn't just about money right on a
1
00:21:41,529 --> 00:21:46,839
biological level it serves a purpose
1
00:21:43,500 --> 00:21:49,299
it becomes a defining thing when the jobs
1
00:21:46,839 --> 00:21:50,740
went away in any given civilization it
1
00:21:49,299 --> 00:21:53,159
doesn't take long
a 40 year career in radiology just
1
00:22:33,839 --> 00:22:39,889
reading images I think that could be a
1
00:22:35,880 --> 00:22:39,890
challenge to the new drivers of today
1
00:22:56,339 --> 00:23:03,879
but today we live in a robotic age the
1
00:23:00,759 --> 00:23:07,509
da Vinci robot is currently utilized by
1
00:23:03,880 --> 00:23:11,340
a variety of surgeons for its accuracy and
1
00:23:07,509 --> 00:23:14,819
its ability to avoid the inevitable
1
00:23:11,339 --> 00:23:23,740
fluctuations of the human hand
1
00:23:14,819 --> 00:23:26,329
[Music]
1
00:23:23,740 --> 00:23:29,380
anybody who watches this feels the
1
00:23:26,329 --> 00:23:29,379
amazingness of it
1
00:23:31,039 --> 00:23:36,869
you look through the scope and you've
1
00:23:33,630 --> 00:23:39,570
seen the claw hand holding that woman's
1
00:23:36,869 --> 00:23:44,119
ovary humanity was resting right there
1
00:23:39,569 --> 00:23:47,000
surgeries is going to be able to perform
1
00:24:11,130 --> 00:24:16,440
that entirely by itself or with human
1
00:24:13,859 --> 00:24:18,659
supervision normally I do about a
1
00:24:16,440 --> 00:24:22,710
hundred fifty cases of hysterectomies
1
00:24:18,660 --> 00:24:26,250
a year and now most of them are done
1
00:24:22,710 --> 00:24:31,829
robotically I do maybe one open case a
1
00:24:26,250 --> 00:24:34,970
year so do I feel comfortable doing
1
00:24:31,829 --> 00:24:34,970
open cases anymore
1
00:24:35,490 --> 00:24:42,210
it seems that we're feeding it and
1
00:24:37,589 --> 00:24:47,629
creating it but in a way we are slaves to
1
00:24:42,210 --> 00:24:47,630
the technology because
depressingly easy
1
00:25:34,509 --> 00:25:39,879
we're not that complicated simple
1
00:25:37,299 --> 00:25:44,139
stimulus response I can make you like me
1
00:25:39,880 --> 00:25:45,550
basically by smiling at you a lot yeah
1
00:25:44,140 --> 00:25:48,300
AIs are gonna be fantastic at
1
00:25:45,549 --> 00:25:48,299
manipulating us
1
00:25:49,000 --> 00:25:53,380
[Music]
1
00:25:54,809 --> 00:26:01,269
so you've developed a technology that
1
00:25:57,849 --> 00:26:03,039
can sense what people are feeling right
1
00:26:01,269 --> 00:26:05,079
we've developed technology that can read
1
00:26:03,039 --> 00:26:07,839
your facial expressions and map that to
1
00:26:05,079 --> 00:26:09,849
a number of emotional states fifteen
1
00:26:07,839 --> 00:26:11,500
years ago I had just finished my
1
00:26:09,849 --> 00:26:13,569
undergraduate studies
scaring people all right so start by
1
00:27:04,420 --> 00:27:12,820
smiling nice brow furrow nice one
1
00:27:10,390 --> 00:27:14,620
eyebrow raised this generation
1
00:27:12,819 --> 00:27:17,619
technology is just surrounding them all
1
00:27:14,619 --> 00:27:19,059
the time it's almost like they expect to
1
00:27:17,619 --> 00:27:21,099
have robots in their homes and they
1
00:27:19,059 --> 00:27:26,589
expect these robots to be socially
1
00:27:21,099 --> 00:27:30,059
intelligent what makes robots smart put
1
00:27:26,589 --> 00:27:33,490
them in like a math or biology class I
1
00:27:30,059 --> 00:27:37,359
think you would have to train all right
1
00:27:33,490 --> 00:27:39,039
let's walk over here so if you smile and
1
00:27:37,359 --> 00:27:41,789
you raise
any artificial intelligence we have a
1
00:28:26,569 --> 00:28:32,358
lot of dumb robots out there but a dumb
1
00:28:29,898 --> 00:28:34,459
robot can be a smart robot overnight
1
00:28:32,358 --> 00:28:38,210
given the right software given the right
1
00:28:34,460 --> 00:28:40,009
sensors we can't help but impute motive
1
00:28:38,210 --> 00:28:42,200
into inanimate objects we do it with
1
00:28:40,009 --> 00:28:46,509
machines we'll treat them like children
1
00:28:42,200 --> 00:28:49,929
we'll treat them like surrogates and
1
00:28:46,509 --> 00:28:49,929
we'll pay the price
1
00:28:51,200 --> 00:28:54,298
[Music]
1
00:29:00,630 --> 00:29:10,950
[Music]
1
00:29:08,769 --> 00:29:17,338
you get welcome to that yeah
1
00:29:10,950 --> 00:29:17,338
[Music]
1
00:29:19,048 --> 00:29:24,190
my purpose is to have a more human-like
1
00:29:22,000 --> 00:29:27,240
robot which has the human
wanted to arrest
1
00:30:27,329 --> 00:30:31,808
[Music]
1
00:30:29,430 --> 00:30:32,620
if a robot could have an intention
1
00:30:31,808 --> 00:30:35,529
desires
1
00:30:32,619 --> 00:30:46,000
the robot can understand other people's
1
00:30:35,529 --> 00:30:48,190
engagement desires that is tied to
1
00:30:46,000 --> 00:30:49,859
relationships with the people and that
1
00:30:48,190 --> 00:30:53,440
means they like each other
1
00:30:49,859 --> 00:30:56,039
that means well I'm not sure and not to
1
00:30:53,440 --> 00:30:56,039
rub each other
1
00:30:57,589 --> 00:31:00,859
we build artificial intelligence and
1
00:30:59,779 --> 00:31:03,470
the very first thing we want to do is
1
00:31:00,859 --> 00:31:06,589
replicate us
1
00:31:03,470 --> 00:31:11,319
I think the key point will come when all
1
00:31:06,589 --> 00:31:16,908
the major senses are replicated sight
1
00:31:11,319 --> 00:31:20,379
touch smell when we replicate our senses
1
00:31:16,909 --> 00:31:20,380
is that when it becomes alive
1
00:31:27,789 --> 00:31:34,579
so many of our machines are being
of autonomous weapons
1
00:32:28,089 --> 00:32:34,669
up to now people have expressed unease
1
00:32:31,130 --> 00:32:42,170
about drones which are remotely piloted
1
00:32:34,670 --> 00:32:44,720
aircraft if you take a drone's camera
1
00:32:42,170 --> 00:32:47,990
feed it into the AI system it's a very
1
00:32:44,720 --> 00:32:49,640
easy step from here to fully autonomous
1
00:32:47,990 --> 00:32:53,170
weapons that choose their own targets
1
00:32:49,640 --> 00:32:53,170
release their own missiles
1
00:32:55,269 --> 00:32:58,389
[Music]
1
00:33:02,299 --> 00:33:05,368
[Applause]
1
00:33:12,720 --> 00:33:17,440
the expected lifespan of a human being
1
00:33:15,490 --> 00:33:20,460
in that kind of battle environment
1
00:33:17,440 --> 00:33:20,460
will be measured in seconds
1
00:33:20,640 --> 00:33:27,340
at one point drones were science fiction
1
00:33:24,009 --> 00:33:31,779
and now they've become the normal thing
1
00:33:27,339 --> 00:33:35,109
in war
students as being the best human pilots
1
00:34:03,740 --> 00:34:12,918
with a relatively simple algorithm AI
1
00:34:10,699 --> 00:34:16,460
will have as big an impact on the
1
00:34:12,918 --> 00:34:18,888
military as the combustion engine had at
1
00:34:16,460 --> 00:34:20,690
the turn of the century that would
1
00:34:18,889 --> 00:34:23,809
literally touch everything that the
1
00:34:20,690 --> 00:34:26,450
military does from driverless convoys
1
00:34:23,809 --> 00:34:29,210
delivering logistical supplies to
1
00:34:26,449 --> 00:34:32,329
unmanned drones delivering medical aid
1
00:34:29,210 --> 00:34:33,559
to computational propaganda trying to win
1
00:34:32,329 --> 00:34:38,059
the hearts and minds of the population
1
00:34:33,559 --> 00:34:40,279
fiction not just predicting the future
1
00:35:18,969 --> 00:35:22,709
but shaping the future
1
00:35:27,130 --> 00:35:33,039
Arthur Conan Doyle writing before World
1
00:35:30,639 --> 00:35:35,889
War One warned of the danger of how
1
00:35:33,039 --> 00:35:39,730
submarines might be used to carry out
1
00:35:35,889 --> 00:35:42,940
civilian blockades at the time he's
1
00:35:39,730 --> 00:35:44,710
writing this fiction the Royal Navy made
1
00:35:42,940 --> 00:35:47,500
fun of Arthur Conan Doyle for this
1
00:35:44,710 --> 00:35:49,460
absurd idea that submarines could be
1
00:35:47,500 --> 00:35:52,539
useful in war
1
00:35:49,460 --> 00:35:52,539
[Music]
1
00:35:54,039 --> 00:35:58,550
one of the things we've seen in history
1
00:35:55,818 --> 00:36:02,029
is that our attitude towards technology
1
00:35:58,550 --> 00:36:02,660
order goes out to commit unrestricted
1
00:36:34,880 --> 00:36:42,890
submarine warfare against Japan so
1
00:36:41,059 --> 00:36:46,460
Arthur Conan Doyle turned out to be
1
00:36:42,889 --> 00:36:48,259
right that's the great old line
1
00:36:46,460 --> 00:36:51,440
about science fiction it's a lie that
1
00:36:48,260 --> 00:36:53,180
tells the truth fellow executives it
1
00:36:51,440 --> 00:36:55,309
gives me great pleasure to introduce you
1
00:36:53,179 --> 00:36:59,049
to the future of law enforcement
1
00:36:55,309 --> 00:36:59,050
ED-209
1
00:37:04,289 --> 00:37:09,029
this isn't just a question of science
1
00:37:06,059 --> 00:37:10,420
fiction this is about what's next about
1
00:37:09,030 --> 00:37:14,000
what's happening right now
1
00:37:10,420 --> 00:37:17,159
[Music]
1
00:37:14,000 --> 00:37:20,730
the role of intelligent systems is
1
00:37:17,159 --> 00:37:27,750
growing very rapidly in warfare everyone
1
00:37:20,730 --> 00:37:29,849
is pushing in the unmanned realm today
1
00:37:27,750 --> 00:37:31,860
Secretary of Defense is very very clear
1
00:37:29,849 --> 00:37:34,650
we will not create fully autonomous
1
00:37:31,860 --> 00:37:36,390
attacking vehicles not everyone is going
1
00:37:34,650 --> 00:37:38,700
to hold themselves to that same set of
1
00:37:36,389 --> 00:37:41,849
values and when China and Russia
1
00:37:38,699 --> 00:37:45,329
start deploying autonomous vehicles that
1
00:37:41,849 --> 00:37:51,989
can attack and kill what's the move that
1
00:37:45,329 --> 00:37:53,489
we're gonna make you can't say well
1
00:37:51,989 --> 00:37:55,229
we're going to use autonomous weapons
1
00:37:53,489 --> 00:37:58,049
for our military dominance but no
1
00:37:55,230 --> 00:37:59,849
one else is going to use them if you
1
00:37:58,050 --> 00:38:03,060
make these weapons they're going to be
1
00:37:59,849 --> 00:38:05,750
used to attack
what do they want ban the use of
1
00:38:50,210 --> 00:38:54,798
autonomous weapons the authors stated
1
00:38:52,548 --> 00:38:56,929
quote autonomous weapons have been
1
00:38:54,798 --> 00:38:59,088
described as the third revolution in
1
00:38:56,929 --> 00:39:01,250
warfare thousands of artificial intelligence
1
00:38:59,088 --> 00:39:05,000
specialists calling for a global ban on
1
00:39:01,250 --> 00:39:07,039
killer robots this open letter basically
1
00:39:05,000 --> 00:39:08,480
says that we should redefine the goal of
1
00:39:07,039 --> 00:39:11,390
the field of artificial intelligence
1
00:39:08,480 --> 00:39:13,880
away from just creating pure undirected
1
00:39:11,389 --> 00:39:15,739
intelligence towards creating beneficial
1
00:39:13,880 --> 00:39:17,568
intelligence the development of AI is
1
00:39:15,739 --> 00:39:19,368
not
is very important and autonomous weapons
1
00:39:46,710 --> 00:39:52,528
may be part of the Defense Department's
1
00:39:49,170 --> 00:39:54,869
plan that's very very scary because a
1
00:39:52,528 --> 00:39:56,670
value system of military developers of
1
00:39:54,869 --> 00:40:02,160
technology is not the same as a value
1
00:39:56,670 --> 00:40:04,349
system of the human race out of the
1
00:40:02,159 --> 00:40:06,389
concerns about the possibility that this
1
00:40:04,349 --> 00:40:07,140
technology might be a threat to human
1
00:40:06,389 --> 00:40:09,268
existence
1
00:40:07,139 --> 00:40:11,068
accelerating faster than we expected
1
00:40:44,960 --> 00:40:49,528
I remember sitting around the table there
1
00:40:47,548 --> 00:40:51,809
with some of the best and the smartest
1
00:40:49,528 --> 00:40:54,480
minds in the world and what really
1
00:40:51,809 --> 00:40:57,930
struck me was maybe the human brain is
1
00:40:54,480 --> 00:40:59,400
not able to fully grasp the complexity
1
00:40:57,929 --> 00:41:02,969
of the world that we're confronted with
1
00:40:59,400 --> 00:41:05,009
as it's currently constructed the road
1
00:41:02,969 --> 00:41:07,980
that AI is following heads off a cliff
1
00:41:05,009 --> 00:41:09,539
and we need to change the direction that
1
00:41:07,980 --> 00:41:15,119
we're going so that we don't take the
1
00:41:09,539 --> 00:41:19,170
human race off the cliff Google acquired
1
00:41:15,119 --> 00:41:20,608
DeepMind
the video game
1
00:41:50,548 --> 00:41:59,409
it knows nothing about objects about
1
00:41:54,369 --> 00:42:00,640
motion about time it only knows that
1
00:41:59,409 --> 00:42:06,129
there's an image on the screen and
1
00:42:00,639 --> 00:42:08,199
there's a score so if your baby woke up
1
00:42:06,130 --> 00:42:11,289
the day it was born and by late
1
00:42:08,199 --> 00:42:15,009
afternoon was playing 40 different Atari
1
00:42:11,289 --> 00:42:17,109
video games at a superhuman level you
1
00:42:15,009 --> 00:42:20,528
would be terrified you would say my baby
1
00:42:17,108 --> 00:42:24,788
is possessed send it back the DeepMind
1
00:42:20,528 --> 00:42:26,679
system can win at any game it can
1
00:42:24,789 --> 00:42:30,190
already beat all the original Atari
1
00:42:26,679 --> 00:42:32,078
games
both times in kind of striking fashion
1
00:43:03,119 --> 00:43:07,539
there were articles in the New York Times
1
00:43:05,409 --> 00:43:10,348
years ago talking about how Go would
1
00:43:07,539 --> 00:43:12,220
take a hundred years for us to solve
1
00:43:10,349 --> 00:43:16,180
people say well you know but that's
1
00:43:12,219 --> 00:43:18,038
still just a board game poker is an art poker
1
00:43:16,179 --> 00:43:20,139
involves reading people poker involves
1
00:43:18,039 --> 00:43:22,210
lying bluffing it's not an exact thing
1
00:43:20,139 --> 00:43:24,308
that will never be you know a computer
1
00:43:22,210 --> 00:43:26,650
this could be an unintentional Trojan
1
00:44:04,929 --> 00:44:08,559
horse DeepMind has to have complete
1
00:44:06,909 --> 00:44:10,420
control of the datacenters so with a
1
00:44:08,559 --> 00:44:12,070
little software update that AI could
1
00:44:10,420 --> 00:44:13,809
take complete control of the whole
1
00:44:12,070 --> 00:44:15,760
Google System which means they can do
1
00:44:13,809 --> 00:44:22,539
anything take a look at all your data
1
00:44:15,760 --> 00:44:24,190
do anything we're rapidly headed
1
00:44:22,539 --> 00:44:25,690
towards digital super intelligence that
1
00:44:24,190 --> 00:44:28,480
far exceeds any human I think it's
1
00:44:25,690 --> 00:44:30,190
very obvious the problem is we don't
1
00:44:28,480 --> 00:44:32,079
really suddenly hit human level
1
00:44:30,190 --> 00:44:35,079
intelligence and say okay let's
go about that is actually in conflict
1
00:45:09,329 --> 00:45:12,769
with a lot of other things you care
1
00:45:10,739 --> 00:45:12,769
about
1
00:45:12,858 --> 00:45:17,048
AI doesn't have to be evil to destroy
1
00:45:15,228 --> 00:45:19,808
humanity
1
00:45:17,048 --> 00:45:22,119
if AI has a goal and humanity just
1
00:45:19,809 --> 00:45:23,559
happens to be in the way it will destroy
1
00:45:22,119 --> 00:45:24,818
humanity as a matter of
1
00:45:23,559 --> 00:45:26,890
course without even thinking about it no
1
00:45:24,818 --> 00:45:29,438
hard feelings it's just like if we're
1
00:45:26,889 --> 00:45:31,629
building a road and an ant hill happens
1
00:45:29,438 --> 00:45:34,418
to be in the way we don't hate ants
1
00:45:31,630 --> 00:45:37,019
we're just building a road and so
1
00:45:34,418 --> 00:45:37,018
goodbye
really annoyed and figured out how to
1
00:46:11,139 --> 00:46:14,049
make a nuclear chain reaction just a few
1
00:46:13,329 --> 00:46:17,849
months later
1
00:46:14,050 --> 00:46:17,849
[Music]
1
00:46:20,659 --> 00:46:26,069
we have spent more than two billion
1
00:46:23,489 --> 00:46:29,639
dollars on the greatest scientific
1
00:46:26,070 --> 00:46:31,680
gamble in history so when people say
1
00:46:29,639 --> 00:46:32,940
that oh this is so far off in the future
1
00:46:31,679 --> 00:46:35,190
we don't have to worry about it
1
00:46:32,940 --> 00:46:37,349
there might only be three or four
1
00:46:35,190 --> 00:46:38,909
breakthroughs of that magnitude that
1
00:46:37,349 --> 00:46:41,940
will get us from here to super
1
00:46:38,909 --> 00:46:44,969
intelligent machines if it's gonna take
1
00:46:41,940 --> 00:46:47,909
much that's an amazing question I don't
1
00:47:31,400 --> 00:47:35,869
trust my computer if it's on I take it
1
00:47:33,409 --> 00:47:37,279
off like even if it was off I still think
1
00:47:35,869 --> 00:47:38,839
it's all like you know like you really
1
00:47:37,280 --> 00:47:40,850
cannot just like the webcams you don't
1
00:47:38,840 --> 00:47:44,059
know like someone might turn it on don't
1
00:47:40,849 --> 00:47:48,199
know like I don't trust my computer like
1
00:47:44,059 --> 00:47:50,480
in my phone every time they ask me we
1
00:47:48,199 --> 00:47:55,909
send your information to Apple every
1
00:47:50,480 --> 00:47:59,030
time I so trust my phone ok so part of
1
00:47:55,909 --> 00:48:00,440
it is yes I do trust it because it's
1
00:47:59,030 --> 00:48:02,420
just didn't have the materials and the
1
00:48:59,650 --> 00:49:24,599
technologies could it be any more
1
00:49:16,329 --> 00:49:28,089
difficult thank God so the coil is
1
00:49:24,599 --> 00:49:31,329
barely in there right now it's just a
1
00:49:28,088 --> 00:49:33,690
feather holding it in it's a nervous
1
00:49:31,329 --> 00:49:33,690
time
1
00:49:36,480 --> 00:49:42,909
we're just in purgatory intellectual
1
00:49:39,340 --> 00:49:45,960
humanistic purgatory an AI might know
1
00:49:42,909 --> 00:49:45,960
exactly what to do here
1
00:49:50,670 --> 00:49:56,050
we got the coil into the aneurysm but it
1
00:49:53,769 --> 00:49:59,259
wasn't in tremendously well that I knew
1
00:49:56,050 --> 00:50:02,530
be compassionate
1
00:50:39,610 --> 00:50:45,690
[Music]
1
00:50:43,530 --> 00:50:51,360
I mean it's everybody's question about
1
00:50:45,690 --> 00:50:54,420
AI we are the sole embodiment of
1
00:50:51,360 --> 00:50:56,579
humanity and it's a stretch for us to
1
00:50:54,420 --> 00:51:01,250
accept that a machine can be
1
00:50:56,579 --> 00:51:01,250
compassionate and loving in that way
1
00:51:01,469 --> 00:51:07,789
[Music]
1
00:51:05,329 --> 00:51:10,380
part of me doesn't believe in magic but
1
00:51:07,789 --> 00:51:12,659
part of me has faith that there is
1
00:51:10,380 --> 00:51:14,640
something beyond the sum of the parts if
1
00:51:12,659 --> 00:51:17,940
there is at least a oneness in our
1
00:51:14,639 --> 00:51:20,509
shared ancestry our shared biology our
1
00:51:17,940 --> 00:51:20,510
shared history
1
00:51:20,920 --> 00:51:28,999
some connection there
them conscious and make them feel
1
00:52:22,579 --> 00:52:29,720
back in 2005 we started trying to build
1
00:52:26,119 --> 00:52:29,720
machines with self-awareness
1
00:52:33,099 --> 00:52:39,230
this robot to begin with didn't know
1
00:52:35,750 --> 00:52:45,590
what it was all it knew was that it
1
00:52:39,230 --> 00:52:47,210
needed to do something like walk through
1
00:52:45,590 --> 00:52:51,110
trial and error and figure out how to
1
00:52:47,210 --> 00:52:56,150
walk using its imagination and then it
1
00:52:51,110 --> 00:52:58,610
walked away and then we did something
1
00:52:56,150 --> 00:53:01,269
very cruel we chopped off a leg and
1
00:52:58,610 --> 00:53:01,269
watched what happened
1
00:53:03,360 --> 00:53:10,170
at the beginning it didn't quite know
1
00:53:05,969 --> 00:53:14,579
what
tracking our faces as we were moving
1
00:53:44,340 --> 00:53:50,820
around now the spooky thing about this
1
00:53:48,329 --> 00:53:54,630
is that we never trained the system to
1
00:53:50,820 --> 00:53:58,039
recognize human faces and yet somehow
1
00:53:54,630 --> 00:53:58,039
it learned to do that
1
00:53:58,079 --> 00:54:02,500
even though these robots are very simple
1
00:54:00,429 --> 00:54:07,299
we can see there's something else going on
1
00:54:02,500 --> 00:54:12,099
there it's not just programmed so this is
1
00:54:07,300 --> 00:54:17,050
just the beginning I often think about
1
00:54:12,099 --> 00:54:22,480
that beach in Kitty Hawk the 1903 flight
1
00:54:17,050 --> 00:54:24,550
by
infrastructure with travel agents and
1
00:55:00,650 --> 00:55:08,900
tower control and it's all casual it's
1
00:55:03,409 --> 00:55:10,909
all part of the world right now as far
1
00:55:08,900 --> 00:55:13,309
as we've come with machines that think and
1
00:55:10,909 --> 00:55:16,009
solve problems we're at Kitty Hawk now
1
00:55:13,309 --> 00:55:18,019
we're in the wind we have our tattered
1
00:55:16,010 --> 00:55:21,119
canvas planes up in the air
1
00:55:18,019 --> 00:55:21,119
[Music]
1
00:55:21,269 --> 00:55:26,610
but what happens in 65 summers or so we
1
00:55:25,050 --> 00:55:32,870
will have machines that are beyond our
1
00:55:26,610 --> 00:55:37,220
control should we worry about that I'm
1
00:55:32,869 --> 00:55:37,219
not sure it's going to help
1
00:55:40,568 --> 00:55:47,808
nobody has any idea today what it means
1
00:55:43,880 --> 00:55:50,599
lost in the stock market the Dow dropped
1
00:56:16,909 --> 00:56:24,469
nearly a thousand points in a half hour
1
00:56:19,179 --> 00:56:27,679
so what went wrong by that point in time
1
00:56:24,469 --> 00:56:30,199
more than 60% of all the trades that
1
00:56:27,679 --> 00:56:35,379
took place on the stock exchange were
1
00:56:30,199 --> 00:56:35,379
actually being initiated by computers
1
00:56:38,000 --> 00:56:42,289
the short story what happened in the
1
00:56:39,739 --> 00:56:44,750
flash crash is that algorithms responded
1
00:56:42,289 --> 00:56:46,609
to algorithms and it compounded upon
1
00:56:44,750 --> 00:56:48,798
itself over and over and over again in a
1
00:56:46,608 --> 00:56:51,528
matter of minutes at one point the
1
00:56:48,798 --> 00:56:54,139
market
aspects about AI in general is that no
1
00:57:29,659 --> 00:57:35,359
one really understands how it works
1
00:57:32,260 --> 00:57:39,380
even people who create AI don't really
1
00:57:35,360 --> 00:57:41,960
fully understand because it has millions
1
00:57:39,380 --> 00:57:43,700
of elements it becomes completely
1
00:57:41,960 --> 00:57:47,920
impossible for a human being to
1
00:57:43,699 --> 00:57:47,919
understand what's going on
1
00:57:53,119 --> 00:57:58,409
Microsoft had set up this artificial
1
00:57:56,400 --> 00:58:03,300
intelligence called Tay on Twitter which
1
00:57:58,409 --> 00:58:05,869
was a chat bot they started out in the
1
00:58:03,300 --> 00:58:08,220
morning and Tay was starting to tweet and
1
00:58:05,869 --> 00:58:12,119
learning
the master chess player that will
1
00:58:51,568 --> 00:58:58,019
outmaneuver us but AI won't have to
1
00:58:54,989 --> 00:59:00,929
actually be that smart to have massively
1
00:58:58,019 --> 00:59:02,969
disruptive effects on human civilization
1
00:59:00,929 --> 00:59:04,289
we've seen over the last century it
1
00:59:02,969 --> 00:59:06,629
doesn't necessarily take a genius to
1
00:59:04,289 --> 00:59:09,000
knock history off in a particular
1
00:59:06,630 --> 00:59:10,730
direction and it won't take a genius AI
1
00:59:09,000 --> 00:59:13,349
to do the same thing
1
00:59:10,730 --> 00:59:16,139
bogus election news stories generated
1
00:59:13,349 --> 00:59:19,440
more engagement on Facebook than top
1
00:59:16,139 --> 00:59:21,170
real stories Facebook really is the
1
00:59:19,440 --> 00:59:25,079
elephant in the room
1
00:59:21,170 --> 00:59:29,789
maximize user engagement and it achieved
1
01:00:09,789 --> 01:00:16,480
that nearly two billion people spend
1
01:00:13,030 --> 01:00:20,109
nearly 1 hour on average a day basically
1
01:00:16,480 --> 01:00:23,650
interacting with AI that is shaping
1
01:00:20,108 --> 01:00:26,318
their experience even Facebook engineers
1
01:00:23,650 --> 01:00:28,660
they don't like fake news it's very bad
1
01:00:26,318 --> 01:00:30,190
business they want to get rid of fake
1
01:00:28,659 --> 01:00:32,440
news it's just very difficult to do
1
01:00:30,190 --> 01:00:34,420
because how do you recognize news is
1
01:00:32,440 --> 01:00:38,559
fake if you cannot read all of that
1
01:00:34,420 --> 01:00:41,530
news personally there's so much active
1
01:00:38,559 --> 01:00:44,290
misinformation and it's packaged very
1
01:00:41,530 --> 01:00:46,690
well and it
gone and Facebook is completely
1
01:01:13,599 --> 01:01:15,869
annihilated
1
01:01:17,230 --> 01:01:21,699
if most of your understanding of how the
1
01:01:19,579 --> 01:01:24,529
world works is derived from Facebook
1
01:01:21,699 --> 01:01:26,899
facilitated by algorithmic software that
1
01:01:24,530 --> 01:01:29,660
tries to show you the news you want to
1
01:01:26,900 --> 01:01:32,119
see that's a terribly dangerous thing
1
01:01:29,659 --> 01:01:35,750
and the idea that we have not only set
1
01:01:32,119 --> 01:01:38,869
that in motion but allowed bad-faith
1
01:01:35,750 --> 01:01:45,139
actors access to that information this
1
01:01:38,869 --> 01:01:46,579
is a recipe for disaster I think that it
1
01:01:45,139 --> 01:01:49,539
Cruz's presidential primary campaign
1
01:02:16,940 --> 01:02:22,130
Cambridge Analytica emerged quietly as a
1
01:02:20,360 --> 01:02:24,890
company that according to its own hype
1
01:02:22,130 --> 01:02:28,010
has the ability to use this
1
01:02:24,889 --> 01:02:32,690
tremendous amount of data in order to
1
01:02:28,010 --> 01:02:35,420
affect societal change in 2016 they had
1
01:02:32,690 --> 01:02:37,970
three major clients Ted Cruz was one of
1
01:02:35,420 --> 01:02:40,940
them it's easy to forget that only 18
1
01:02:37,969 --> 01:02:42,409
months ago Senator Cruz was one of the
1
01:02:40,940 --> 01:02:45,619
less
they need moving from the center a
1
01:03:17,599 --> 01:03:20,989
little bit more towards the right in
1
01:03:19,340 --> 01:03:23,750
order to support Cruz they need a
1
01:03:20,989 --> 01:03:26,149
persuasion message gun rights I've
1
01:03:23,750 --> 01:03:27,079
selected that narrows the field slightly
1
01:03:26,150 --> 01:03:29,568
more and
1
01:03:27,079 --> 01:03:31,160
we know that we need a message on gun
1
01:03:29,568 --> 01:03:33,528
rights it needs to be a persuasion
1
01:03:31,159 --> 01:03:35,478
message and it needs to be nuanced
1
01:03:33,528 --> 01:03:37,518
according to the certain personality
1
01:03:35,478 --> 01:03:40,368
that we're interested in through social
1
01:03:37,518 --> 01:03:42,468
media there's an infinite amount of
1
01:03:40,369 --> 01:03:44,778
and steady but firm rise to above 35%
1
01:04:17,119 --> 01:04:20,959
making him obviously the second most
1
01:04:18,978 --> 01:04:23,568
threatening contender in the race now
1
01:04:20,958 --> 01:04:26,389
clearly the Cruz campaign is over now
1
01:04:23,568 --> 01:04:28,338
but what I can tell you is that of the
1
01:04:26,389 --> 01:04:30,588
two candidates left in this
1
01:04:28,338 --> 01:04:36,228
election one of them is using these
1
01:04:30,588 --> 01:04:39,259
technologies Donald Trump do solemnly
1
01:04:36,228 --> 01:04:43,778
swear that I will faithfully execute the
1
01:04:39,259 --> 01:04:43,778
office of President of the United States
1
01:04:44,260 --> 01:04:47,449
[Music]
1
01:04:48,958 --> 01:04:54,969
elections are a marginal exercise
psychological traits such as personality
1
01:06:09,789 --> 01:06:15,489
intelligence political views and so on
1
01:06:12,460 --> 01:06:18,360
now traditionally those traits were
1
01:06:15,489 --> 01:06:20,439
measured using tests and questionnaires
1
01:06:18,360 --> 01:06:21,970
personality tests the most benign thing
1
01:06:20,440 --> 01:06:23,050
you could possibly think of something
1
01:06:21,969 --> 01:06:26,469
that doesn't necessarily have a lot of
1
01:06:23,050 --> 01:06:29,019
utility right our idea was that instead
1
01:06:26,469 --> 01:06:30,489
of tests and questionnaires we could simply
1
01:06:29,019 --> 01:06:32,610
look at the digital footprints of
1
01:06:30,489 --> 01:06:36,869
behaviors that we are all leaving behind
1
01:06:32,610 --> 01:06:40,120
to understand openness conscientiousness
1
01:06:36,869 --> 01:06:42,579
neuroticism
reveal people's religious views or
1
01:07:24,400 --> 01:07:29,860
political views or sexual orientation
1
01:07:26,099 --> 01:07:32,909
based on only profile pictures this
1
01:07:29,860 --> 01:07:40,950
could be literally an issue of life and
1
01:07:32,909 --> 01:07:40,949
death I think there's no going back
1
01:07:42,150 --> 01:07:49,019
you know what the Turing test is it's
1
01:07:46,599 --> 01:07:51,190
when a human interacts with a computer
1
01:07:49,019 --> 01:07:54,099
and if the human doesn't know they're
1
01:07:51,190 --> 01:07:58,389
interacting with a computer the test is
1
01:07:54,099 --> 01:07:59,799
passed and over the next few days you're
1
01:07:58,389 --> 01:08:01,869
gonna be the human component in the
1
01:07:59,800 --> 01:08:05,289
Turing test holy
1
01:08:01,869 --> 01:08:07,049
that's right Caleb
with the artificial intelligence we're
1
01:08:54,338 --> 01:09:00,489
creating today our computers phones
1
01:08:58,619 --> 01:09:04,180
applications give us superhuman
1
01:09:00,489 --> 01:09:06,929
capability so as the old maxim says if
1
01:09:04,180 --> 01:09:06,930
you can't beat them join them
1
01:09:07,288 --> 01:09:12,809
it's about a human machine partnership I
1
01:09:10,130 --> 01:09:14,849
mean we already see how you know our
1
01:09:12,809 --> 01:09:16,380
phones for example act as a memory
1
01:09:14,849 --> 01:09:17,969
prosthesis right I don't have to
1
01:09:16,380 --> 01:09:19,219
remember your phone number anymore
1
01:09:17,969 --> 01:09:22,288
because it's on my phone
1
01:09:19,219 --> 01:09:24,088
it's about machines augmenting
where AI is playing a more prominent
1
01:09:58,679 --> 01:10:03,868
role in societies we want good ways to
1
01:10:01,349 --> 01:10:07,250
interact with this technology so that it
1
01:10:03,868 --> 01:10:07,250
ends up augmenting us
1
01:10:09,130 --> 01:10:17,079
I think it's incredibly important that AI
1
01:10:10,840 --> 01:10:19,930
not be other it must be us and I could
1
01:10:17,079 --> 01:10:21,939
be wrong about what I'm saying I'm
1
01:10:19,930 --> 01:10:24,890
certainly open to ideas if anybody can
1
01:10:21,939 --> 01:10:26,629
suggest a path that's better
1
01:10:24,890 --> 01:10:31,030
but I think we're really gonna have to
1
01:10:26,630 --> 01:10:31,029
either merge with AI or be left behind
1
01:10:31,109 --> 01:10:39,049
[Music]
1
01:10:36,618 --> 01:10:40,880
it's hard to kind of think of unplugging
1
01:10:39,050 --> 01:10:43,779
a system that's
[Music]
1
01:11:22,050 --> 01:11:27,570
[Applause]
1
01:11:24,800 --> 01:11:29,369
the least scary future I can think of is
1
01:11:27,569 --> 01:11:34,380
one where we have at least democratized
1
01:11:29,369 --> 01:11:36,569
AI because if one company or small group
1
01:11:34,380 --> 01:11:38,220
of people manages to develop godlike
1
01:11:36,569 --> 01:11:39,609
digital super intelligence they could
1
01:11:38,220 --> 01:11:40,809
take over the world
1
01:11:39,609 --> 01:11:42,549
[Music]
1
01:11:40,809 --> 01:11:46,239
at least when there's an evil dictator
1
01:11:42,550 --> 01:11:48,340
that human is going to die but for an AI
1
01:11:46,238 --> 01:11:51,399
there would be no death it would live
1
01:11:48,340 --> 01:11:54,489
forever and then you'd have an immortal
1
01:11:51,399 --> 01:11:57,579
dictator from which we can never escape
1
01:11:54,489 --> 01:11:57,579
[Music]
1
01:12:10,930 --> 01:12:14,048
[Music]
1
01:12:19,479 --> 01:12:22,669
[Music]
1
01:12:28,310 --> 01:12:33,919
[Music]
1
01:12:36,270 --> 01:12:48,399
[Music]
1
01:13:12,380 --> 01:13:15,630
[Music]
1
01:13:17,529 --> 01:13:20,099
you
1
01:13:24,590 --> 01:13:27,699
[Music]
1
01:13:30,310 --> 01:13:33,659
[Music]
1
01:13:37,140 --> 01:13:53,560
[Music]
1
01:14:01,899 --> 01:14:05,339
[Music]
1
01:14:12,720 --> 01:14:19,159
[Music]
1
01:14:22,909 --> 01:14:33,170
[Music]
1
01:14:31,949 --> 01:14:52,929
[Applause]
1
01:14:33,170 --> 01:14:52,929
[Music]
1
01:15:00,229 --> 01:15:08,469
[Music]
1
01:15:14,829 --> 01:15:31,640
[Music]
1
01:15:39,329 --> 01:15:44,238
[Music]
1
01:15:57,850 --> 01:16:29,590
[Applause]
1
01:15:58,930 --> 01:16:29,590
[Music]
1
01:16:34,359 --> 01:16:42,478
[Music]
1
01:16:47,000 --> 01:16:52,939
[Music]
1
01:16:59,350 --> 01:17:33,329
[Music]
1
01:17:32,430 --> 01:17:38,050
[Applause]
1
01:17:33,329 --> 01:17:38,050
[Music]
1
01:17:39,579 --> 01:17:45,409
[Applause]
1
01:17:42,189 --> 01:17:45,409
[Music]