1
00:00:08,542 --> 00:00:10,675
Narrator: It only takes a moment for the unthinkable to
2
00:00:10,710 --> 00:00:13,512
become reality.
3
00:00:14,047 --> 00:00:17,549
In the near future, tragedy may come with new choices.
4
00:00:19,519 --> 00:00:21,019
Imagine you've been in a car crash that's
5
00:00:21,054 --> 00:00:22,220
killed your daughter.
6
00:00:22,255 --> 00:00:23,822
Eva: Is she okay, is she okay?
7
00:00:23,857 --> 00:00:25,390
Oscar: Jess?
8
00:00:25,425 --> 00:00:26,658
Narrator: Or has it?
9
00:00:26,693 --> 00:00:27,726
Oscar: Jess?
10
00:00:27,761 --> 00:00:28,860
Listen to me.
11
00:00:28,895 --> 00:00:29,895
Eva: Is she breathing?
12
00:00:29,930 --> 00:00:31,063
Oscar: You okay?
13
00:00:31,098 --> 00:00:32,330
Hey, hey.
14
00:00:32,365 --> 00:00:33,865
Eva: Is she okay?
15
00:00:33,900 --> 00:00:35,867
Robot: Time of death, 9:17.
16
00:00:35,902 --> 00:00:38,703
Your daughter's brain was
not impacted by the accident.
17
00:00:38,738 --> 00:00:41,840
If you wish to scan her brain
for digitization and upload,
18
00:00:41,875 --> 00:00:44,043
you have approximately
5 minutes.
19
00:00:44,711 --> 00:00:46,845
Eva: Do it.
20
00:00:46,880 --> 00:00:49,014
Narrator: Now imagine the power you wield when you can
21
00:00:49,049 --> 00:00:51,550
order a digital copy of your daughter's brain.
22
00:00:52,886 --> 00:00:55,220
And do you know what comes next?
23
00:01:02,562 --> 00:01:04,529
Jess: Hi dad.
24
00:01:04,564 --> 00:01:07,399
Narrator: One day, delivered to your doorstep is an
25
00:01:07,434 --> 00:01:10,902
artificially intelligent android that will act, look,
26
00:01:10,937 --> 00:01:13,105
emote, just like your daughter.
27
00:01:15,809 --> 00:01:18,444
Every memory from birth on will remain intact.
28
00:01:19,646 --> 00:01:23,215
But she'll also be a walking computer with access to all
29
00:01:23,250 --> 00:01:25,417
the world's knowledge.
30
00:01:27,420 --> 00:01:30,555
This is what your world will look like.
31
00:01:30,590 --> 00:01:34,293
This is the future of artificial intelligence.
32
00:01:42,802 --> 00:01:45,737
It's the deep future.
33
00:01:45,772 --> 00:01:47,072
Your body?
34
00:01:47,107 --> 00:01:48,607
Gone.
35
00:01:48,642 --> 00:01:51,176
You're all computer, all the time.
36
00:01:52,145 --> 00:01:54,312
Your brain is way more powerful than even
37
00:01:54,347 --> 00:01:56,949
a billion supercomputers.
38
00:01:58,151 --> 00:02:03,088
Jobs, food, language, water, even traditional thought,
39
00:02:03,123 --> 00:02:05,590
all of humanity's building blocks?
40
00:02:05,625 --> 00:02:07,292
All that's done.
41
00:02:07,327 --> 00:02:10,162
And you are immortal.
42
00:02:10,463 --> 00:02:12,097
Squirming in your chair yet?
43
00:02:12,132 --> 00:02:13,498
You should be.
44
00:02:13,533 --> 00:02:15,467
This isn't science fiction.
45
00:02:15,502 --> 00:02:18,970
Today's visionary thinkers say it's a strong probability
46
00:02:19,005 --> 00:02:22,141
that this is what your world is going to look like.
47
00:02:22,642 --> 00:02:24,776
Tonight, they'll guide you toward that spectacular
48
00:02:24,811 --> 00:02:28,313
future, and we'll see how one family navigates it,
49
00:02:28,348 --> 00:02:30,982
one invention at a time.
50
00:02:31,017 --> 00:02:33,685
This is the story of your future.
51
00:02:33,720 --> 00:02:37,489
This is the road to year million.
52
00:02:43,029 --> 00:02:44,829
The year million.
53
00:02:44,864 --> 00:02:46,798
We're not talking about a date.
54
00:02:46,833 --> 00:02:49,501
It's a figurative era, one where our world has
55
00:02:49,536 --> 00:02:51,803
gone through such a dramatic change that it's
56
00:02:51,838 --> 00:02:54,173
practically unrecognizable.
57
00:02:54,708 --> 00:02:56,975
Charles Soule: Year million
is a pretty popular trope.
58
00:02:57,010 --> 00:02:59,811
It's not necessarily literally
the year one million,
59
00:02:59,846 --> 00:03:02,748
it's more, you know,
far, far down the road.
60
00:03:03,550 --> 00:03:05,483
Humanity has evolved
past a lot of the concerns
61
00:03:05,518 --> 00:03:07,519
that we have now.
62
00:03:07,554 --> 00:03:09,321
Narrator: And one of the main innovations that
63
00:03:09,356 --> 00:03:12,724
will shape that future is the creation and rise
64
00:03:12,759 --> 00:03:14,693
of artificial intelligence.
65
00:03:14,728 --> 00:03:19,564
Brian Greene: If you ask,
what is it that truly defines
66
00:03:19,599 --> 00:03:21,900
human beings, I would
say it's our minds.
67
00:03:23,036 --> 00:03:25,037
It's our consciousness,
it's our intelligence.
68
00:03:26,039 --> 00:03:29,408
And if we could create
that kind of intelligence,
69
00:03:30,377 --> 00:03:33,878
create the most powerful
thing that we experience,
70
00:03:33,913 --> 00:03:35,547
that's breathtaking, right?
71
00:03:35,582 --> 00:03:40,752
Who wouldn't find that
either enormously compelling,
72
00:03:40,787 --> 00:03:43,222
or enormously frightening?
73
00:03:44,891 --> 00:03:47,058
Narrator: And you should be frightened.
74
00:03:47,093 --> 00:03:50,229
For example, you know that cell phone in your pocket?
75
00:03:50,764 --> 00:03:53,365
You can't live without it, and it's already started
76
00:03:53,400 --> 00:03:55,066
to replace you.
77
00:03:55,101 --> 00:03:57,702
It can access your memories and find new information in
78
00:03:57,737 --> 00:03:59,371
the blink of an eye.
79
00:03:59,406 --> 00:04:02,040
And it just keeps getting better with time.
80
00:04:02,075 --> 00:04:05,043
That's the A.I. that will challenge our dominance
81
00:04:05,078 --> 00:04:07,412
on planet Earth.
82
00:04:08,415 --> 00:04:10,081
Peter Diamandis: We have
billions and billions of
83
00:04:10,116 --> 00:04:13,585
dollars in this arms
race to develop A.I.
84
00:04:14,254 --> 00:04:16,454
You're going to
expect everything,
85
00:04:16,489 --> 00:04:18,790
every object to become smart.
86
00:04:19,793 --> 00:04:21,626
All of these
fundamental things are going
87
00:04:21,661 --> 00:04:22,961
to change everything.
88
00:04:22,996 --> 00:04:24,930
Everything.
89
00:04:25,832 --> 00:04:28,333
Narrator: So let's show you how that arms race plays out.
90
00:04:30,804 --> 00:04:33,438
One of the next explosive stages in A.I.'s evolution
91
00:04:33,473 --> 00:04:37,475
begins in a few hundred years, when humanoid robots become
92
00:04:37,510 --> 00:04:40,012
nearly indistinguishable from us.
93
00:04:41,948 --> 00:04:44,283
They'll take our jobs and nurse our sick.
94
00:04:45,652 --> 00:04:48,320
Next, they'll recognize their own consciousness,
95
00:04:48,355 --> 00:04:51,923
and surpass human beings, sparking a paradigm shift in
96
00:04:51,958 --> 00:04:54,126
our world that we can barely imagine.
97
00:04:55,462 --> 00:04:58,463
Then comes the singularity, when artificially intelligent
98
00:04:58,498 --> 00:05:01,599
androids become the alpha species, and with humanity at
99
00:05:01,634 --> 00:05:03,501
its most vulnerable, some of us will choose to
100
00:05:03,536 --> 00:05:06,204
fuse with A.I. in order to remain relevant.
101
00:05:07,807 --> 00:05:10,608
We'll start with brain implants,
102
00:05:10,643 --> 00:05:13,812
eventually we'll live in an entirely digital universe,
103
00:05:13,847 --> 00:05:16,481
where we can do anything we imagine.
104
00:05:19,319 --> 00:05:21,619
And finally, when we let A.I. influence
105
00:05:21,654 --> 00:05:24,122
every aspect of our lives,
106
00:05:24,157 --> 00:05:27,158
our notion of what it means to be human takes on
107
00:05:27,193 --> 00:05:29,494
a whole new meaning.
108
00:05:30,163 --> 00:05:32,163
Heavy, right?
109
00:05:32,198 --> 00:05:35,533
Let's roll back the clock to when artificially intelligent
110
00:05:35,568 --> 00:05:38,704
androids are as ubiquitous as fire hydrants,
111
00:05:39,672 --> 00:05:41,740
and are even becoming part of our families.
112
00:05:46,045 --> 00:05:50,015
You remember the mom, dad and their A.I. daughter, Jess.
113
00:05:51,518 --> 00:05:53,318
Jess: Nice, dad!
114
00:05:53,353 --> 00:05:55,687
You're only like
15 moves behind.
115
00:05:57,524 --> 00:05:58,857
Eva: Hey!
116
00:05:58,892 --> 00:06:00,024
Be nice to your father!
117
00:06:00,059 --> 00:06:01,526
He tries his best!
118
00:06:01,561 --> 00:06:03,061
Oscar: You know I really miss
the days when I could call you
119
00:06:03,096 --> 00:06:04,662
a know-it-all and
it wasn't true.
120
00:06:04,697 --> 00:06:06,531
With all this knowledge and
these perfect algorithms
121
00:06:06,566 --> 00:06:07,832
there's still a question
she can't answer.
122
00:06:07,867 --> 00:06:08,867
Eva: Oh yeah, what's that?
123
00:06:08,902 --> 00:06:10,335
Oscar: Why are we here?
124
00:06:10,370 --> 00:06:11,503
Jess: Well, actually
I can answer that.
125
00:06:11,538 --> 00:06:12,737
Oscar: Oh yeah?
126
00:06:12,772 --> 00:06:14,873
Jess: Yeah, and I can
show you right now.
127
00:06:17,710 --> 00:06:19,177
Oscar: Very sweet.
128
00:06:19,212 --> 00:06:21,713
It's your turn.
129
00:06:21,748 --> 00:06:24,048
Narrator: But how will you get from the cell phone in your
130
00:06:24,083 --> 00:06:26,752
hand to interacting with your artificial daughter?
131
00:06:28,221 --> 00:06:31,022
Well, as with all great technological innovations,
132
00:06:31,057 --> 00:06:33,925
if you can imagine it, you can make it so.
133
00:06:35,728 --> 00:06:38,196
Eric Horvitz: When I think
about the future I think about
134
00:06:38,231 --> 00:06:40,966
the Wright brothers on
this windy beach in 1905.
135
00:06:41,901 --> 00:06:46,037
With this canvas-covered wood
and iron contraption that just
136
00:06:46,072 --> 00:06:48,039
got off the ground and
is tipping off the beach.
137
00:06:48,074 --> 00:06:50,575
And it's flying.
138
00:06:52,645 --> 00:06:57,582
Just fifty summers later,
we have a 707 taking off.
139
00:07:00,420 --> 00:07:03,121
A.I. is
in that state now.
140
00:07:03,156 --> 00:07:05,890
We're kind of on
the windy beach.
141
00:07:05,925 --> 00:07:08,059
Narrator: That cell phone in your hand is the
142
00:07:08,094 --> 00:07:10,261
Wright brothers' rudimentary plane.
143
00:07:10,296 --> 00:07:14,433
Sure, humans don't have the power to know the future,
144
00:07:14,834 --> 00:07:17,569
but in thousands of years of human history,
145
00:07:17,604 --> 00:07:21,105
we've never been able to stop trying to imagine it.
146
00:07:21,140 --> 00:07:23,942
Brian Greene: It's so hard to
imagine even 100 years from
147
00:07:23,977 --> 00:07:26,611
now, with the rate at
which things are changing.
148
00:07:27,847 --> 00:07:30,648
But when I sit at home
and it's dark and quiet,
149
00:07:30,683 --> 00:07:33,918
late into the night,
there's nothing to me that's
150
00:07:33,953 --> 00:07:36,254
more compelling than
to imagine being able to
151
00:07:36,289 --> 00:07:38,256
visit the future.
152
00:07:38,291 --> 00:07:42,594
I mean, I think the
opportunities are just so
153
00:07:42,629 --> 00:07:45,430
exciting, and the fact
that I won't get there,
154
00:07:45,465 --> 00:07:47,866
and won't see it,
is, is painful.
155
00:07:49,135 --> 00:07:50,935
Narrator: That desire to see the future has plagued
156
00:07:50,970 --> 00:07:53,472
humanity since the dawn of time.
157
00:07:54,641 --> 00:07:58,643
The ancient Greeks, Chinese and Indians all dreamed up
158
00:07:59,512 --> 00:08:02,013
mythological stories about a future sentient being,
159
00:08:02,815 --> 00:08:05,684
created by man.
160
00:08:08,021 --> 00:08:11,289
And A.I.'s are a beloved part of today's
161
00:08:11,324 --> 00:08:13,124
science fiction.
162
00:08:13,159 --> 00:08:16,561
Matt Mira: "C3PO" is an
android, "r2d2" is a robot.
163
00:08:17,497 --> 00:08:19,030
C-3PO: I've just about
had enough of you.
164
00:08:19,065 --> 00:08:21,132
Go that way.
165
00:08:21,167 --> 00:08:22,500
Matt Mira: That's why I
don't think it'll ever work
166
00:08:22,535 --> 00:08:23,968
out with those two.
167
00:08:24,003 --> 00:08:25,470
They're always
arguing for a reason.
168
00:08:25,505 --> 00:08:27,005
C-3PO: You'll be
malfunctioning within a day,
169
00:08:27,040 --> 00:08:28,673
you near-sighted scrap pile.
170
00:08:28,708 --> 00:08:31,476
Paul Bettner: Science
fiction is often the roadmap
171
00:08:31,511 --> 00:08:33,545
or the blueprint for
where we're heading.
172
00:08:33,580 --> 00:08:35,480
Kind of, humanity's
collective dream of what the
173
00:08:35,515 --> 00:08:36,981
future might be.
174
00:08:37,016 --> 00:08:40,185
And we nudge reality
in that direction.
175
00:08:40,587 --> 00:08:42,387
Louis Rosenberg: I
remember when I was a kid,
176
00:08:42,422 --> 00:08:44,322
seeing the movie "war games",
177
00:08:44,357 --> 00:08:46,891
which was really the first
big A.I.-related movie.
178
00:08:46,926 --> 00:08:48,726
And I was fascinated with it,
179
00:08:48,761 --> 00:08:51,195
imagine being able to
create an intelligence.
180
00:08:51,230 --> 00:08:53,831
I think anyone that
wants to create,
181
00:08:53,866 --> 00:08:56,368
that's the ultimate
thing to create.
182
00:08:57,904 --> 00:09:00,204
Narrator: And in the last half-century,
183
00:09:00,239 --> 00:09:02,707
we've imagined and created A.I. who could interact with
184
00:09:02,742 --> 00:09:06,411
us and even beat us at our own game.
185
00:09:09,382 --> 00:09:12,684
The first A.I. programs were created in the 1950s,
186
00:09:12,719 --> 00:09:15,353
and perhaps the earliest and most dramatic public
187
00:09:15,388 --> 00:09:18,890
demonstration of artificial intelligence was in 1996,
188
00:09:19,592 --> 00:09:22,927
when the Deep Blue computer beat a chess grandmaster.
189
00:09:22,962 --> 00:09:26,631
Announcer: Magnificent
display by Deep Blue.
190
00:09:28,134 --> 00:09:30,368
Narrator: Then, in 2011,
191
00:09:30,403 --> 00:09:34,139
IBM's Watson won a million dollars on Jeopardy!
192
00:09:34,974 --> 00:09:36,741
Jimmy McGuire: First
to find this prehistoric
193
00:09:36,776 --> 00:09:39,410
human skeleton
outside of Europe?
194
00:09:39,445 --> 00:09:40,545
Watson?
195
00:09:40,580 --> 00:09:42,080
Watson: Who is Mary Leakey?
196
00:09:42,115 --> 00:09:44,382
Jimmy McGuire: You're right.
197
00:09:44,417 --> 00:09:46,784
Narrator: And in 2016, AlphaGo defeated one of the
198
00:09:46,819 --> 00:09:49,420
world's best players of Go, widely considered to be the
199
00:09:49,455 --> 00:09:51,656
most complicated game on the planet.
200
00:09:52,625 --> 00:09:55,093
Announcer: Looks like
(Inaudible) has just resigned.
201
00:09:56,162 --> 00:09:58,763
Narrator: And now, of course, everyone has a cell phone,
202
00:09:58,798 --> 00:10:01,232
a pocket-sized A.I.
203
00:10:01,267 --> 00:10:04,469
A.I. is clearly getting pretty smart.
204
00:10:04,504 --> 00:10:06,804
But they can't interact in a way we can define
205
00:10:06,839 --> 00:10:09,107
as human, yet.
206
00:10:23,322 --> 00:10:24,455
Erica: Hello there.
207
00:10:24,490 --> 00:10:26,424
May I ask your name?
208
00:10:26,459 --> 00:10:28,459
Ignacio: Hi there,
my name is Ignacio.
209
00:10:28,494 --> 00:10:31,429
Erica: It's nice to
meet you Ignacio.
210
00:10:31,464 --> 00:10:33,798
So, what country are you from?
211
00:10:34,367 --> 00:10:36,334
Ignacio: I'm from Canada.
212
00:10:36,369 --> 00:10:38,803
Erica: It's pretty
cold there, eh?
213
00:10:39,839 --> 00:10:41,939
Narrator: Meet Erica, the android.
214
00:10:41,974 --> 00:10:43,875
She, or it, depending on who you ask,
215
00:10:44,711 --> 00:10:46,611
is one of the most sophisticated humanoids
216
00:10:46,646 --> 00:10:47,979
ever designed.
217
00:10:48,014 --> 00:10:50,314
And she's giving us a glimpse of what a year million
218
00:10:50,349 --> 00:10:53,951
A.I. might look like.
219
00:10:53,986 --> 00:10:55,820
Michio Kaku: Modern computers
are adding machines.
220
00:10:55,855 --> 00:10:57,455
They add very fast,
221
00:10:57,490 --> 00:10:59,657
giving the illusion
that they're thinking,
222
00:10:59,692 --> 00:11:01,325
but they never learn.
223
00:11:01,360 --> 00:11:04,396
Your laptop is just as stupid
tomorrow as it was yesterday.
224
00:11:06,165 --> 00:11:09,334
Then we have learning machines
called neural networks.
225
00:11:10,536 --> 00:11:14,172
They literally rewire themselves
after learning a task,
226
00:11:15,675 --> 00:11:18,009
and so we're beginning
to see now machines that learn
227
00:11:18,044 --> 00:11:19,310
a little bit.
228
00:11:19,345 --> 00:11:22,313
Narrator: Erica runs on these neural networks.
229
00:11:22,348 --> 00:11:24,816
Scientists hope that a computer like her can acquire
230
00:11:24,851 --> 00:11:27,151
the subtleties of what a human brain can do,
231
00:11:27,186 --> 00:11:29,387
but with all the power of a digital processor.
232
00:11:31,023 --> 00:11:33,891
Erica: Technically speaking,
I was created as a test bed for
233
00:11:33,926 --> 00:11:37,195
researchers to develop
android technologies,
234
00:11:37,230 --> 00:11:40,331
but I have been told that
my creators hope to explore
235
00:11:40,366 --> 00:11:42,600
the question of what it
truly means to be human.
236
00:11:45,238 --> 00:11:47,071
Dylan Glas: Typically,
the things that are easy for
237
00:11:47,106 --> 00:11:48,406
humans are difficult
for robots.
238
00:11:48,441 --> 00:11:50,007
And vice versa.
239
00:11:50,042 --> 00:11:51,909
So for example a robot could
multiply six digit numbers
240
00:11:51,944 --> 00:11:53,845
in its head no problem.
241
00:11:53,880 --> 00:11:56,715
But it might not know when
to take a turn in conversation.
242
00:11:58,718 --> 00:12:01,252
Erica: I am interested
in art and paintings.
243
00:12:01,287 --> 00:12:04,188
Do you enjoy looking
at paintings?
244
00:12:04,223 --> 00:12:06,691
Ignacio: No, I don't.
245
00:12:06,726 --> 00:12:09,360
Erica: Well then, I guess
some people just can't
246
00:12:09,395 --> 00:12:11,963
appreciate the finer
things in life.
247
00:12:14,967 --> 00:12:16,901
Narrator: Turns out that getting A.I.
248
00:12:16,936 --> 00:12:19,070
to make simple chitchat is far more complex and
249
00:12:19,105 --> 00:12:22,907
challenging than getting it to do advanced calculus.
250
00:12:23,576 --> 00:12:25,376
Dylan Glas: We focus mostly
on perceiving the human.
251
00:12:25,411 --> 00:12:27,779
You know, what kind of body
language is a person making,
252
00:12:27,814 --> 00:12:30,615
what's their expression,
and try to sense those things.
253
00:12:30,650 --> 00:12:32,383
For example, she has
to be able to make eye
254
00:12:32,418 --> 00:12:34,886
contact with people, but she
also needs to know when to
255
00:12:34,921 --> 00:12:37,756
break eye contact.
256
00:12:38,424 --> 00:12:41,659
So what we have here is
a display of the 3D human
257
00:12:42,128 --> 00:12:44,595
tracking system that
we have in the room.
258
00:12:44,630 --> 00:12:47,732
This is the composite point
cloud data from 14 different
259
00:12:47,767 --> 00:12:50,935
depth sensors that are
mounted on the ceiling.
260
00:12:50,970 --> 00:12:52,937
So as people move
around, you can see where
261
00:12:52,972 --> 00:12:54,972
they're moving in the space.
262
00:12:55,007 --> 00:12:57,909
What we use this for is for
Erica to be able to see people
263
00:12:57,944 --> 00:13:00,077
and track them with her gaze
and to know who's talking and
264
00:13:00,112 --> 00:13:02,614
to interact socially
with people.
265
00:13:03,115 --> 00:13:05,082
Narrator: And then there's Erica's appearance.
266
00:13:05,117 --> 00:13:07,418
For an A.I. to be relatable enough to join humanity
267
00:13:07,453 --> 00:13:08,986
in the future,
268
00:13:09,021 --> 00:13:11,523
she has to look like one of us.
269
00:13:12,491 --> 00:13:13,925
Hiroshi Ishiguro:
Human appearances are so
270
00:13:13,960 --> 00:13:16,093
important for us.
271
00:13:16,128 --> 00:13:19,197
Every morning we check
our face and hairstyle.
272
00:13:20,466 --> 00:13:23,768
If I want to have a
kind of ideal robot,
273
00:13:23,803 --> 00:13:25,937
an interactive robot with humans,
274
00:13:25,972 --> 00:13:28,206
human likeness is important.
275
00:13:30,309 --> 00:13:33,010
Dylan Glas: Having the
human-like ability to express
276
00:13:33,045 --> 00:13:36,147
emotion, and the human-like
ability to communicate through
277
00:13:36,182 --> 00:13:39,116
speech and gesture has
often been shown to be very
278
00:13:39,151 --> 00:13:41,786
effective in reaching people.
279
00:13:41,821 --> 00:13:44,989
Because they connect on
a more emotional level.
280
00:13:46,359 --> 00:13:49,360
There's a lot of interest
in robots for elderly care,
281
00:13:49,395 --> 00:13:54,499
for therapy, for maybe tutors
or educational applications
282
00:13:55,401 --> 00:13:58,736
and you know, just
general companionship.
283
00:13:59,505 --> 00:14:01,973
Narrator: Erica may seem a little weird right now,
284
00:14:02,008 --> 00:14:03,674
but imagine how she'll be once they've
285
00:14:03,709 --> 00:14:05,476
worked out the kinks:
286
00:14:05,511 --> 00:14:07,645
She'll appear to be practically human.
287
00:14:07,680 --> 00:14:09,647
And some think that's when the difference between
288
00:14:09,682 --> 00:14:12,650
A.I. and human will be moot.
289
00:14:12,685 --> 00:14:14,819
Matt Mira: Once they can
make something look just
290
00:14:14,854 --> 00:14:16,654
like a human being,
then it's like who cares?
291
00:14:16,689 --> 00:14:18,389
'Cause you don't even know
what you're talking to.
292
00:14:18,424 --> 00:14:21,192
You don't know that it's fake.
293
00:14:21,227 --> 00:14:24,695
Chuck Nice: Is artificial
intelligence human if it acts
294
00:14:24,730 --> 00:14:26,397
the same way that we do?
295
00:14:26,432 --> 00:14:28,232
Who knows?
296
00:14:28,267 --> 00:14:31,335
Narrator: Erica is either thrilling or terrifying.
297
00:14:31,370 --> 00:14:33,671
She'll be smart enough to do what we do,
298
00:14:33,706 --> 00:14:36,240
and she'll never complain about her hours or demand a
299
00:14:36,275 --> 00:14:38,910
raise or a paycheck at all.
300
00:14:40,046 --> 00:14:41,913
While A.I. androids evolve to become more
301
00:14:41,948 --> 00:14:44,181
indistinguishable from humans,
302
00:14:44,216 --> 00:14:47,218
the next stage of automation will dramatically affect what
303
00:14:47,253 --> 00:14:49,754
it means to be human.
304
00:14:49,789 --> 00:14:52,090
The battle for survival is on.
305
00:14:56,062 --> 00:14:58,763
Narrator: If your smartphone does everything for you,
306
00:14:58,798 --> 00:15:01,933
navigation, scheduling, even finding a date,
307
00:15:03,102 --> 00:15:05,369
you can see how an A.I. that gets smarter with
308
00:15:05,404 --> 00:15:07,738
every passing second will soon be sitting
309
00:15:07,773 --> 00:15:09,607
in your office chair.
310
00:15:09,642 --> 00:15:12,376
I'm talking about our jobs.
311
00:15:12,411 --> 00:15:15,146
We can't escape worldwide A.I. domination.
312
00:15:18,918 --> 00:15:20,251
George Dvorsky:
We are on the cusp of an
313
00:15:20,286 --> 00:15:22,219
automation revolution.
314
00:15:22,254 --> 00:15:24,755
100 years from now,
we probably won't be working
315
00:15:24,790 --> 00:15:27,492
to the same degree
that we are today.
316
00:15:27,927 --> 00:15:30,595
Human toil will be
replaced by machines.
317
00:15:31,797 --> 00:15:33,397
Michio Kaku: We can
liberate ourselves from the
318
00:15:33,432 --> 00:15:36,267
backbreaking work of the past.
319
00:15:36,302 --> 00:15:39,771
Robots are going to do the
dirty, dull, dangerous jobs.
320
00:15:42,108 --> 00:15:44,608
Ray kurzweil: This is not the
first time this has happened.
321
00:15:44,643 --> 00:15:48,112
In 1800 the textile workers
looked around and saw these
322
00:15:48,147 --> 00:15:49,747
new textile machines.
323
00:15:49,782 --> 00:15:51,983
They said, "these jobs
are gonna go away!"
324
00:15:52,018 --> 00:15:54,618
Employment nonetheless
expanded as new types of
325
00:15:54,653 --> 00:15:57,789
industries were invented.
326
00:16:00,659 --> 00:16:03,094
Peter Diamandis: 48% of
US jobs will be lost to A.I.
327
00:16:03,129 --> 00:16:05,296
And robots in the
next 20 years.
328
00:16:05,965 --> 00:16:07,264
Narrator: It seems there's no turning back
329
00:16:07,299 --> 00:16:09,266
from digital automation,
330
00:16:09,301 --> 00:16:10,768
and the shift in the global workforce
331
00:16:10,803 --> 00:16:12,770
that's happened as a result.
332
00:16:12,805 --> 00:16:15,172
But when humans and machines work hand in hand,
333
00:16:15,207 --> 00:16:17,608
like they do here in America's heartland,
334
00:16:17,643 --> 00:16:19,143
it's been able to take the family farm to
335
00:16:19,178 --> 00:16:21,846
a whole new level.
336
00:16:27,720 --> 00:16:29,620
Brant Peterson: We farm
just right now currently
337
00:16:29,655 --> 00:16:32,323
on top of 10,000 acres.
338
00:16:33,225 --> 00:16:36,027
I'm recording all my planting
operations and all my harvest
339
00:16:36,062 --> 00:16:39,630
operations and so you can see
that we got an issue in this
340
00:16:39,665 --> 00:16:42,299
field right here, and
then I can start to make a
341
00:16:42,334 --> 00:16:45,570
prescription for next
year to address this.
342
00:16:46,672 --> 00:16:50,341
I have the ability, via apps
on my phone and tablet to
343
00:16:50,376 --> 00:16:52,910
monitor my sprinklers.
344
00:16:53,512 --> 00:16:56,180
We can control speed up,
slow down, turn around.
345
00:16:57,516 --> 00:17:00,217
We can monitor the tractors,
how much fuel they have in
346
00:17:00,252 --> 00:17:02,887
them, how fast they're going.
347
00:17:02,922 --> 00:17:05,856
It's efficiency
in our expenses.
348
00:17:05,891 --> 00:17:08,159
Efficiency in
our time management.
349
00:17:08,194 --> 00:17:09,994
That's really what
the technology helps
350
00:17:10,029 --> 00:17:12,830
us with primarily.
351
00:17:12,865 --> 00:17:15,032
The tractors having
auto guidance takes off
352
00:17:15,067 --> 00:17:18,002
the driver fatigue.
353
00:17:18,037 --> 00:17:20,171
Tech doesn't work
without us at this point.
354
00:17:20,206 --> 00:17:22,006
Narrator: Right now, the farm still needs
355
00:17:22,041 --> 00:17:24,708
human oversight to run, but the Peterson family can
356
00:17:24,743 --> 00:17:27,545
see a future when they're just figureheads.
357
00:17:28,414 --> 00:17:30,081
Amy Peterson:
That's the next thing,
358
00:17:30,116 --> 00:17:32,784
where the equipment
runs itself.
359
00:17:33,752 --> 00:17:35,352
Narrator: In the near future,
360
00:17:35,387 --> 00:17:38,589
agriculture will be mostly automated.
361
00:17:40,893 --> 00:17:42,927
It's something that even the most diehard of
362
00:17:42,962 --> 00:17:45,396
traditionalists has to admit is inevitable.
363
00:17:48,134 --> 00:17:50,935
Donn Teske: This farm we're at
right now has been here since right before
364
00:17:50,970 --> 00:17:54,405
the turn of the last
century, in the late 1890s.
365
00:17:56,976 --> 00:18:00,544
This is a multigenerational
farm and I'm proud of what we
366
00:18:00,579 --> 00:18:03,247
have, proud of the
heritage we have,
367
00:18:03,282 --> 00:18:05,549
proud of the family
we have here.
368
00:18:05,584 --> 00:18:08,919
I'm kind of an old curmudgeon.
369
00:18:08,954 --> 00:18:13,290
I consider myself kind of an
old grizzly that's past his
370
00:18:13,325 --> 00:18:16,127
prime and talks too much.
371
00:18:17,463 --> 00:18:20,564
I'm not necessarily
an anti-tech guy.
372
00:18:20,599 --> 00:18:23,434
I've got my smartphone
in my pocket.
373
00:18:24,136 --> 00:18:28,906
I've always kinda embraced
technology as it's come along.
374
00:18:28,941 --> 00:18:31,742
You know, we're getting to
the point now where the modern
375
00:18:31,777 --> 00:18:35,813
technologies of farming
allows one person to farm
376
00:18:35,848 --> 00:18:38,082
1,000 acres.
377
00:18:38,117 --> 00:18:41,252
The last era of modern
equipment has been the Guinea
378
00:18:41,287 --> 00:18:45,123
pigs for totally
eliminating the farmer.
379
00:18:46,625 --> 00:18:49,827
My vanity would like to
think that I have an ability
380
00:18:49,862 --> 00:18:51,795
that a computer wouldn't know.
381
00:18:51,830 --> 00:18:54,932
But, maybe it's not true.
382
00:18:54,967 --> 00:18:57,668
But it's even more
than that because I feel an
383
00:18:57,703 --> 00:18:59,770
ownership of it.
384
00:18:59,805 --> 00:19:04,342
It's more of a quality of
life than it is an industry.
385
00:19:05,978 --> 00:19:08,679
Matt Mira: If the robots are
doing things to help us then,
386
00:19:08,714 --> 00:19:11,982
I mean, I'm all
for them helping.
387
00:19:12,017 --> 00:19:16,654
If it saves some guy's back
because there's a farmhand out
388
00:19:16,689 --> 00:19:19,623
there who's a machine, just as
long as it doesn't turn around
389
00:19:19,658 --> 00:19:21,893
and try to kill him,
I think we're okay.
390
00:19:22,861 --> 00:19:24,828
Narrator: Let me prepare you for what your children face
391
00:19:24,863 --> 00:19:26,630
when they choose their college major:
392
00:19:26,665 --> 00:19:28,132
Less than 100 years from now,
393
00:19:28,167 --> 00:19:30,334
A.I. probably won't be as sophisticated as that
394
00:19:30,369 --> 00:19:32,169
reincarnated daughter.
395
00:19:32,204 --> 00:19:35,139
But artificial intelligence could have the capacity for
396
00:19:35,174 --> 00:19:37,041
more complex thought.
397
00:19:37,076 --> 00:19:39,744
And that will be a major turning point.
398
00:19:40,212 --> 00:19:42,346
An A.I. could even do the advanced work of,
399
00:19:42,381 --> 00:19:44,348
say, an architect.
400
00:19:50,723 --> 00:19:51,822
Oscar: Is, is this.
401
00:19:51,857 --> 00:19:53,691
Woman: Joseph's design.
402
00:19:53,726 --> 00:19:55,492
Oscar: Open concept, Joseph?
403
00:19:55,527 --> 00:19:57,528
That's what you A.I.
designers think us humans like?
404
00:19:57,563 --> 00:19:59,396
Woman: The client
thought it was retro.
405
00:19:59,431 --> 00:20:01,332
That'll be all Joseph.
406
00:20:01,367 --> 00:20:03,000
Oscar: Just because the
client likes it doesn't mean
407
00:20:03,035 --> 00:20:04,501
that we have to.
408
00:20:04,536 --> 00:20:06,203
Woman: Joseph can design
hundreds of times faster than
409
00:20:06,238 --> 00:20:08,372
any architect on staff,
with fewer mistakes.
410
00:20:08,407 --> 00:20:10,040
Oscar: Are you firing me?
411
00:20:10,075 --> 00:20:12,577
Woman: I'm sorry, Oscar.
412
00:20:19,618 --> 00:20:21,685
Narrator: That's what it will look like when
413
00:20:21,720 --> 00:20:25,256
artificial intelligence gets more sophisticated.
414
00:20:25,291 --> 00:20:28,025
Futurists think A.I. will be able to take over
415
00:20:28,060 --> 00:20:31,895
even the most highly skilled jobs like medicine,
416
00:20:31,930 --> 00:20:34,265
law and engineering.
417
00:20:34,300 --> 00:20:36,066
Philip Rosedale: There is
something probably quite
418
00:20:36,101 --> 00:20:38,702
exceptional happening in the
next few years which is that
419
00:20:38,737 --> 00:20:41,272
our machines are becoming
smart enough to replace even
420
00:20:41,307 --> 00:20:44,208
the thinking jobs
that we've done.
421
00:20:44,243 --> 00:20:46,377
Narrator: Let me put it this way: You're a fourth-
422
00:20:46,412 --> 00:20:49,246
generation Harvard grad, but guess what?
423
00:20:49,281 --> 00:20:52,750
Even if your kid gets into Harvard, she may not get a job
424
00:20:52,785 --> 00:20:54,652
when she graduates.
425
00:20:56,422 --> 00:20:58,756
Peter Diamandis: I have said
openly that in 20 years' time
426
00:20:58,791 --> 00:21:01,392
if I ever need surgery and
I see this grey-haired doctor
427
00:21:01,427 --> 00:21:02,793
coming at me I'll
say "oh no, no, no,
428
00:21:02,828 --> 00:21:04,728
I don't want that
human touching me.
429
00:21:04,763 --> 00:21:06,263
I want the robot operating
system that's done it a
430
00:21:06,298 --> 00:21:08,399
million times perfectly."
431
00:21:08,434 --> 00:21:09,967
And that's going to be
happening in every aspect
432
00:21:10,002 --> 00:21:11,635
of our lives.
433
00:21:11,670 --> 00:21:12,803
Baratunde Thurston:
Oh, you work in finance?
434
00:21:12,838 --> 00:21:15,406
Remember there used to be
a whole batch of dudes who
435
00:21:15,441 --> 00:21:17,308
screamed on the floor of the
stock exchange with little
436
00:21:17,343 --> 00:21:20,277
pieces of paper trying
to buy and sell stocks?
437
00:21:20,312 --> 00:21:24,081
Now all that is whisper quiet,
happens at the speed of light,
438
00:21:24,116 --> 00:21:26,984
and it's literally
machines versus machines.
439
00:21:27,019 --> 00:21:29,119
I think people who have
like, white collar jobs
440
00:21:29,154 --> 00:21:31,088
particularly, are naive.
441
00:21:31,123 --> 00:21:33,757
Like, why don't machines
write legislation?
442
00:21:33,792 --> 00:21:35,993
You know, you could argue
that humans are really bad at
443
00:21:36,028 --> 00:21:37,261
writing laws.
444
00:21:37,296 --> 00:21:39,497
Give it to the algorithm.
445
00:21:43,135 --> 00:21:44,768
Narrator: And A.I. has already begun its
446
00:21:44,803 --> 00:21:46,804
march toward taking over the workforce,
447
00:21:46,839 --> 00:21:49,006
starting with one job at Microsoft.
448
00:21:50,209 --> 00:21:51,475
Eric Horvitz: Hello.
449
00:21:51,510 --> 00:21:54,611
Monica: Hi Eric.
450
00:21:54,646 --> 00:21:56,647
Monica: Being one
busy computer scientist's
451
00:21:56,682 --> 00:21:58,682
personal assistant.
452
00:21:58,717 --> 00:22:00,951
Eric Horvitz: My assistant,
who we call Monica,
453
00:22:00,986 --> 00:22:03,320
is leveraging over
fifteen years of data about
454
00:22:03,355 --> 00:22:05,155
my comings and goings.
455
00:22:05,190 --> 00:22:07,291
Monica: I was expecting you.
456
00:22:07,326 --> 00:22:10,027
Are you here for the
5:00 meeting with Eric?
457
00:22:10,062 --> 00:22:12,629
Eric Horvitz: It knows,
for example, it's learned,
458
00:22:12,664 --> 00:22:15,833
which meetings on my calendar
I probably won't go to,
459
00:22:16,402 --> 00:22:17,801
and so people show up,
460
00:22:17,836 --> 00:22:19,203
and I'm busy and
my calendar is full,
461
00:22:19,238 --> 00:22:21,505
it'll sort of tip
to them and say,
462
00:22:21,540 --> 00:22:24,375
"you know, I don't think he's
going to go to that meeting
463
00:22:24,410 --> 00:22:27,010
at 2:00, why don't
you stop by then?
464
00:22:27,045 --> 00:22:28,479
Can I pencil this in?"
465
00:22:28,514 --> 00:22:30,481
So this assistant has
all this knowledge that even
466
00:22:30,516 --> 00:22:33,217
my human admin
doesn't have yet.
467
00:22:33,252 --> 00:22:34,885
Monica: I guess
I'll see you then.
468
00:22:34,920 --> 00:22:36,553
Bye bye.
469
00:22:36,588 --> 00:22:38,655
Narrator: But before you quit your job and sign up
470
00:22:38,690 --> 00:22:40,324
for that online coding course,
471
00:22:40,359 --> 00:22:42,393
let me lay this one on you:
472
00:22:42,428 --> 00:22:44,995
The big leap that needs to happen is that A.I.
473
00:22:45,030 --> 00:22:47,865
needs to become self-aware.
474
00:22:48,600 --> 00:22:52,002
Once that happens, everything about our lives gets thrown
475
00:22:52,037 --> 00:22:54,872
for a pretty serious loop, and we'll even question
476
00:22:54,907 --> 00:22:57,274
what it means to be human.
477
00:23:01,380 --> 00:23:04,014
Narrator: So we already know that artificial intelligence
478
00:23:04,049 --> 00:23:06,917
is going to change the way we spend 40 hours a week.
479
00:23:08,387 --> 00:23:11,555
It is, and will be, a tough time for the global workforce.
480
00:23:12,758 --> 00:23:13,857
Baratunde Thurston:
Kids are gonna ask,
481
00:23:13,892 --> 00:23:15,426
"oh you guys had jobs?"
482
00:23:15,461 --> 00:23:19,530
Like, "you guys had jobs,"
you know, could be a thing,
483
00:23:19,565 --> 00:23:22,866
and versus, "back in
my day we walked uphill
484
00:23:22,901 --> 00:23:24,034
to school both ways,"
485
00:23:24,069 --> 00:23:26,870
well, "back in
my day we had jobs."
486
00:23:26,905 --> 00:23:29,406
Narrator: And that's not all: A.I. will eventually
487
00:23:29,441 --> 00:23:32,576
become so sophisticated, so intelligent,
488
00:23:32,611 --> 00:23:34,711
that it could come pretty close to having all the
489
00:23:34,746 --> 00:23:37,248
qualities that we humans have.
490
00:23:38,617 --> 00:23:42,119
Eric Horvitz: Someday,
there'll be a system that we
491
00:23:42,154 --> 00:23:46,256
built with our own hands
that will open its eyes and
492
00:23:46,291 --> 00:23:49,626
look out and kind of
have this buzz of being
493
00:23:49,661 --> 00:23:51,562
here that we have.
494
00:23:51,597 --> 00:23:56,267
And when that happens,
that'll be a major landmark
495
00:23:57,102 --> 00:24:00,337
moment in human history.
496
00:24:03,609 --> 00:24:06,243
Narrator: So what does that mean for you?
497
00:24:06,278 --> 00:24:09,580
For you parents out there who work so hard to inspire
498
00:24:09,615 --> 00:24:12,583
your children's young, impressionable minds,
499
00:24:12,618 --> 00:24:14,918
what happens when that impressionable mind is a
500
00:24:14,953 --> 00:24:17,455
supercomputer far brighter than you'd ever be?
501
00:24:18,290 --> 00:24:20,757
And you are trying to help shape her conscience,
502
00:24:20,792 --> 00:24:22,860
how she sees the world.
503
00:24:32,337 --> 00:24:33,637
Oscar: I brought
you something.
504
00:24:33,672 --> 00:24:36,006
Jess: Oh, thank you.
505
00:24:38,644 --> 00:24:40,143
What is it?
506
00:24:40,178 --> 00:24:41,512
Oscar: It's a book, Jess.
507
00:24:41,547 --> 00:24:44,348
Jess: I know, what's it about?
508
00:24:44,983 --> 00:24:47,117
Oscar: Well, the author was
writing about how the future
509
00:24:47,152 --> 00:24:49,353
might look in 100 years.
510
00:24:49,388 --> 00:24:52,456
You know, how we might
change the world.
511
00:24:52,491 --> 00:24:54,358
Jess: 300-year-old sci-fi.
512
00:24:54,393 --> 00:24:56,793
Oscar: Mm, I thought it
might get you thinking about
513
00:24:56,828 --> 00:24:59,530
how you might wanna
change the world.
514
00:24:59,565 --> 00:25:01,999
Jess: Thank you.
515
00:25:10,242 --> 00:25:12,676
Narrator: Your A.I. kid seems like any other,
516
00:25:12,711 --> 00:25:15,045
but will A.I. ever be so lifelike that
517
00:25:15,080 --> 00:25:16,880
she'll need a swift kick in the pants to get
518
00:25:16,915 --> 00:25:19,383
off the couch and make something of herself?
519
00:25:19,418 --> 00:25:21,485
In other words, can scientists ever get
520
00:25:21,520 --> 00:25:23,654
artificial intelligence to consider complicated human
521
00:25:23,689 --> 00:25:25,355
ideas and thoughts?
522
00:25:25,390 --> 00:25:27,591
It looks like the simple answer is yes.
523
00:25:32,097 --> 00:25:33,997
Hod Lipson: We look at
self-awareness, consciousness,
524
00:25:34,032 --> 00:25:37,000
as the ability to
imagine yourself.
525
00:25:37,035 --> 00:25:39,002
Today, most robots
can't do that.
526
00:25:39,037 --> 00:25:42,239
They can't imagine themselves
in future situations.
527
00:25:45,410 --> 00:25:48,512
Narrator: This robot, this small piece of metal, actually
528
00:25:48,547 --> 00:25:50,547
has a sense of self.
529
00:25:50,582 --> 00:25:52,549
It may not look like much, but it is one of the most
530
00:25:52,584 --> 00:25:54,451
sophisticated robots on the planet.
531
00:25:57,389 --> 00:25:59,690
Hod Lipson: In the lab we
can see a robot that uses
532
00:25:59,725 --> 00:26:02,259
all its sensations
of the world.
533
00:26:02,294 --> 00:26:04,194
So this robot is learning,
sort of the equivalent
534
00:26:04,229 --> 00:26:06,363
to a newborn.
535
00:26:06,398 --> 00:26:10,100
It has sensations, it can
sense that it's moving and
536
00:26:10,135 --> 00:26:12,236
it's not just sensing the
world, it's sensing itself.
537
00:26:13,405 --> 00:26:15,372
And it's trying
to form a self-image
538
00:26:15,407 --> 00:26:18,542
that matches reality.
539
00:26:18,577 --> 00:26:21,278
It's a very simple self-image
so we can actually plot it out
540
00:26:21,313 --> 00:26:24,281
and you can actually look at
it and you can see there's
541
00:26:24,316 --> 00:26:27,050
a very simple stick
figure that this robot is
542
00:26:27,085 --> 00:26:29,420
imagining itself to be.
543
00:26:29,921 --> 00:26:32,756
Or, we did something
very cruel,
544
00:26:32,791 --> 00:26:35,892
we chopped off a
leg of the robot and we
545
00:26:35,927 --> 00:26:37,094
watched what happened.
546
00:26:37,129 --> 00:26:39,296
And we see that
within about a day,
547
00:26:39,331 --> 00:26:41,798
the robot's self-image
also loses a leg;
548
00:26:41,833 --> 00:26:44,768
it begins to limp.
549
00:26:44,803 --> 00:26:46,970
A.I. is beginning
to sort of have what
550
00:26:47,005 --> 00:26:49,239
we call qualia.
551
00:26:49,274 --> 00:26:51,976
Basic notions of the world
that we don't have words for.
552
00:26:52,844 --> 00:26:55,746
If you think about, you
know, the taste of chocolate,
553
00:26:55,781 --> 00:26:57,247
or the smell of the oceans.
554
00:26:57,282 --> 00:26:59,583
It's very, very difficult
to describe these
555
00:26:59,618 --> 00:27:02,085
accurately with words.
556
00:27:02,120 --> 00:27:03,787
Computers also have them.
557
00:27:03,822 --> 00:27:06,490
They can have sensor
modalities that go beyond
558
00:27:06,525 --> 00:27:08,592
our five senses.
559
00:27:08,627 --> 00:27:11,628
Therefore they will have
notions of the world that we
560
00:27:11,663 --> 00:27:14,031
don't have words for.
561
00:27:15,467 --> 00:27:18,268
Narrator: Okay, so let's swallow that tough pill:
562
00:27:18,303 --> 00:27:20,504
Computers may be capable of thinking about themselves,
563
00:27:20,539 --> 00:27:22,773
just like we do.
564
00:27:22,808 --> 00:27:25,442
But there is one quality we have that's often thought of
565
00:27:25,477 --> 00:27:29,446
as the last talent we humans have an exclusive monopoly on,
566
00:27:29,481 --> 00:27:33,483
the way I earn my living: creativity.
567
00:27:33,518 --> 00:27:35,819
If computers can create,
568
00:27:35,854 --> 00:27:38,389
then what does that mean for humanity?
569
00:27:39,024 --> 00:27:41,992
David Byrne: The things that
we think are the province of
570
00:27:42,027 --> 00:27:45,128
creative people, I think
will start to be done by
571
00:27:45,163 --> 00:27:48,999
machines and we'll go,
"oh jeez."
572
00:27:49,534 --> 00:27:50,801
Baratunde Thurston:
A.I. Comedians.
573
00:27:50,836 --> 00:27:52,469
I don't see that,
I mean, I don't see that.
574
00:27:52,504 --> 00:27:54,871
I think it's pretty obvious
that only a human being can
575
00:27:54,906 --> 00:27:57,541
draw the sort of creative
yet subtle connection which
576
00:27:57,576 --> 00:27:59,476
reveals the truth without
shoving it down your throat.
577
00:27:59,511 --> 00:28:01,845
Human comedians only!
578
00:28:03,248 --> 00:28:04,815
Rose Eveleth: Would a
machine ever make art, right?
579
00:28:04,850 --> 00:28:06,483
That's a big question.
580
00:28:06,518 --> 00:28:09,386
When will machines start
making paintings and music?
581
00:28:12,023 --> 00:28:14,558
Narrator: And here is the cousin of that self-aware
582
00:28:14,593 --> 00:28:17,194
robot: the artistic robot.
583
00:28:18,597 --> 00:28:21,698
This robot wants you to come back to its apartment
584
00:28:21,733 --> 00:28:23,534
to see its etchings.
585
00:28:26,705 --> 00:28:28,338
Hod Lipson: I always
wanted to paint,
586
00:28:28,373 --> 00:28:30,674
but I really suck at painting.
587
00:28:30,709 --> 00:28:34,010
But I do know how
to program robots.
588
00:28:34,045 --> 00:28:37,547
So the way that the original
robot A.I. worked was to try
589
00:28:37,582 --> 00:28:40,918
to figure out how to arrange
lots of possible paint strokes
590
00:28:41,586 --> 00:28:45,689
on canvas so that it
matches what it's seeing.
591
00:28:45,724 --> 00:28:48,358
The next phase is trying to
have the computer on its own
592
00:28:48,393 --> 00:28:51,061
choose what it wants to paint.
593
00:28:53,465 --> 00:28:56,633
A robot experiences the
world through the Internet.
594
00:28:57,569 --> 00:28:59,870
A robot can
actually go places.
595
00:28:59,905 --> 00:29:02,906
Use Street View
to tour places.
596
00:29:03,408 --> 00:29:07,144
In a few seconds it can see
oceans, it can see mountains.
597
00:29:07,746 --> 00:29:10,380
Some of the paintings
that it's done recently are
598
00:29:10,415 --> 00:29:12,916
interpretations of
photographs that it's seeing.
599
00:29:23,128 --> 00:29:28,265
♪ ♪
600
00:29:29,768 --> 00:29:32,803
It might want to paint
a new cat that it's
601
00:29:32,838 --> 00:29:34,471
never seen before.
602
00:29:34,506 --> 00:29:37,608
But it's what it thinks a cat
is based on its perception.
603
00:29:38,176 --> 00:29:42,312
It's one thing to try to
sort of make a machine paint,
604
00:29:42,347 --> 00:29:44,748
but it's another thing to
actually make the machine be
605
00:29:44,783 --> 00:29:46,650
creative and do that
in a creative way.
606
00:29:51,122 --> 00:29:53,089
I think that
eventually nothing makes
607
00:29:53,124 --> 00:29:54,825
us uniquely human.
608
00:29:54,860 --> 00:29:57,961
So every capability that
we have ultimately boils down
609
00:29:57,996 --> 00:30:00,764
to physics, to biology,
to chemistry.
610
00:30:00,799 --> 00:30:05,168
And if somebody really wants
to recreate that capability in
611
00:30:05,203 --> 00:30:07,638
an artificial system,
it would be possible.
612
00:30:10,809 --> 00:30:12,943
Matt Mira: What if they're
more creative than us?
613
00:30:12,978 --> 00:30:14,277
Then give up.
614
00:30:14,312 --> 00:30:15,612
We've lost.
615
00:30:15,647 --> 00:30:18,515
The robots can
have the planet.
616
00:30:18,550 --> 00:30:20,650
Narrator: Whatever we do, we can only be more
617
00:30:20,685 --> 00:30:23,387
powerful than A.I. for so long.
618
00:30:23,822 --> 00:30:25,822
They are gaining on us.
619
00:30:25,857 --> 00:30:28,792
In fact, in the far future,
A.I. will blow past us
620
00:30:28,827 --> 00:30:31,027
like a bat out of hell.
621
00:30:31,062 --> 00:30:33,363
Futurists call the moment that super-intelligent
622
00:30:33,398 --> 00:30:35,866
machines become the alpha race on the planet,
623
00:30:35,901 --> 00:30:38,669
and begin to function beyond our control.
624
00:30:39,004 --> 00:30:41,071
The singularity.
625
00:30:42,340 --> 00:30:45,508
Louis Rosenberg: Singularity
is the point where a conscious
626
00:30:45,543 --> 00:30:47,377
artificial intelligence
is created,
627
00:30:47,412 --> 00:30:49,746
and it's smarter than people.
628
00:30:50,582 --> 00:30:51,982
Peter Diamandis:
It'll leave us in the dust,
629
00:30:52,017 --> 00:30:54,150
it'll be the difference
between us humans and bacteria
630
00:30:54,185 --> 00:30:55,652
in the soil.
631
00:30:55,687 --> 00:30:57,387
Michael Graziano: The
singularity is an event
632
00:30:57,422 --> 00:30:59,389
horizon that you
can't see beyond,
633
00:30:59,424 --> 00:31:01,324
and the idea is that
technology is so
634
00:31:01,359 --> 00:31:03,827
transformative that we on one
side of it can't imagine what
635
00:31:03,862 --> 00:31:05,863
the other side would be like.
636
00:31:06,698 --> 00:31:08,398
Peter Diamandis:
With A.I., every successive
637
00:31:08,433 --> 00:31:10,533
generation has learned
everything the generation
638
00:31:10,568 --> 00:31:14,070
before has learned
and so there's this rapid
639
00:31:14,105 --> 00:31:17,674
exponential growth in
capabilities and knowledge
640
00:31:17,709 --> 00:31:20,878
that is almost unfathomable.
641
00:31:21,913 --> 00:31:23,713
Negin Farsad: It's a
total paradigm shift.
642
00:31:23,748 --> 00:31:25,916
It's not the way that we
understand it right now.
643
00:31:25,951 --> 00:31:28,218
Like, people probably
don't even wear sweatpants.
644
00:31:28,253 --> 00:31:31,388
But it's more of like an
idea of this future time that,
645
00:31:31,423 --> 00:31:34,091
you know, that is
something, you know,
646
00:31:34,559 --> 00:31:37,127
otherworldly from what
we understand right now.
647
00:31:39,631 --> 00:31:41,765
Narrator: When the singularity arrives, A.I.
648
00:31:41,800 --> 00:31:44,568
will not just grow our food and do our laundry.
649
00:31:45,570 --> 00:31:48,905
They won't just do our taxes and legislate our laws.
650
00:31:48,940 --> 00:31:51,975
They'll think and dream and create.
651
00:31:52,477 --> 00:31:54,377
George Dvorsky: I mean,
it's often been said that
652
00:31:54,412 --> 00:31:56,212
artificial intelligence
will be the last invention
653
00:31:56,247 --> 00:31:57,714
we ever have to make.
654
00:31:57,749 --> 00:32:00,450
Because, from there on in,
it's going to take over.
655
00:32:02,087 --> 00:32:04,588
Narrator: So once an A.I. looks less like a Tinkertoy,
656
00:32:04,923 --> 00:32:06,890
has smarts beyond belief,
657
00:32:06,925 --> 00:32:10,060
and a fully-developed sense of self with the social skills of
658
00:32:10,095 --> 00:32:12,663
George Clooney, what makes it different from you or me?
659
00:32:14,265 --> 00:32:16,133
That's where things could get a little hairy.
660
00:32:18,937 --> 00:32:20,637
A.I. just might help you do things that you couldn't
661
00:32:20,672 --> 00:32:23,507
do without a sentient supercomputer by your side.
662
00:32:24,476 --> 00:32:27,077
But what if your A.I. friend decides it wants
663
00:32:27,112 --> 00:32:29,279
to kill you?
664
00:32:36,121 --> 00:32:38,288
Narrator: Odds are that artificial intelligence will
665
00:32:38,323 --> 00:32:41,157
get pretty damned evolved, so much so,
666
00:32:41,192 --> 00:32:43,259
that they'll have superintelligence that far
667
00:32:43,294 --> 00:32:45,462
surpasses ours.
668
00:32:46,131 --> 00:32:48,098
It could be the best thing that ever happened to
669
00:32:48,133 --> 00:32:50,934
humanity: they'll free us from menial labor,
670
00:32:50,969 --> 00:32:53,203
help us process information like never before,
671
00:32:53,872 --> 00:32:56,272
help us achieve levels of thought and scientific
672
00:32:56,307 --> 00:32:58,642
exploration that would never be possible without them.
673
00:33:01,312 --> 00:33:03,613
On the other hand, they might challenge us for
674
00:33:03,648 --> 00:33:05,482
dominance at the top of the food chain.
675
00:33:06,151 --> 00:33:08,618
Let's start with option number 2 first.
676
00:33:08,653 --> 00:33:11,121
What happens if the machines we've created decide that
677
00:33:11,156 --> 00:33:13,323
they're better off without us?
678
00:33:16,061 --> 00:33:19,162
Eric Horvitz: Think about
worst case scenarios.
679
00:33:20,331 --> 00:33:23,466
A machine outwits humans in
a way that can't be stopped.
680
00:33:23,501 --> 00:33:24,701
Running amok.
681
00:33:24,736 --> 00:33:27,037
Providing a threat to
the survival of humanity.
682
00:33:27,072 --> 00:33:28,705
Let's go there.
683
00:33:34,012 --> 00:33:35,478
Michio Kaku: I think by
the end of the century,
684
00:33:35,513 --> 00:33:36,980
we'll probably have robots
that are self-aware,
685
00:33:37,015 --> 00:33:38,481
to a degree.
686
00:33:38,516 --> 00:33:40,683
And at that point, we should
put a chip in their brain to
687
00:33:40,718 --> 00:33:43,386
shut them off if they
have murderous thoughts.
688
00:33:43,421 --> 00:33:45,989
'Cause at that point,
they could become independent
689
00:33:46,024 --> 00:33:48,425
of our wishes.
690
00:33:50,395 --> 00:33:52,395
Syd Mead: If we lose control
of the robots and they start
691
00:33:52,430 --> 00:33:54,664
making more robots which
are smarter than the robots
692
00:33:54,699 --> 00:33:57,367
that made the robots,
we might be in trouble.
693
00:33:58,236 --> 00:33:59,836
Matt Mira: I think that's
the moment where we should
694
00:33:59,871 --> 00:34:02,238
all head for the hills.
695
00:34:02,273 --> 00:34:04,707
And not, and not, when
we're heading for the hills,
696
00:34:04,742 --> 00:34:07,611
don't use Waze or Google
Maps to get there,
697
00:34:08,613 --> 00:34:11,514
'cause the machines will know.
698
00:34:11,549 --> 00:34:15,185
Narrator: Sci-Fi has imagined the scenario of what happens
699
00:34:15,220 --> 00:34:17,520
when the machines believe they know what's best for us,
700
00:34:17,555 --> 00:34:19,756
and it's unsettling.
701
00:34:21,559 --> 00:34:24,360
Jon Spaihts: In "2001," you
get, perhaps the most famous
702
00:34:24,395 --> 00:34:27,697
example of a computer deciding
it doesn't want everything
703
00:34:27,732 --> 00:34:30,033
that its makers wanted.
704
00:34:30,068 --> 00:34:32,602
And it begins to
have human will.
705
00:34:32,637 --> 00:34:36,372
Dave Bowman: Open the
pod bay doors, HAL.
706
00:34:36,407 --> 00:34:38,041
HAL: I'm sorry, Dave.
707
00:34:38,076 --> 00:34:41,144
I'm afraid I can't do that.
708
00:34:42,280 --> 00:34:44,414
Charles Soule: The
danger is you've got one of
709
00:34:44,449 --> 00:34:46,549
these robots, these A.I.'s,
these artificial intelligences,
710
00:34:46,584 --> 00:34:48,718
who just thinks, well,
I, I bet I can do this a
711
00:34:48,753 --> 00:34:50,386
little bit better.
712
00:34:50,421 --> 00:34:51,554
I bet I, I bet I could take
care of this a little bit more
713
00:34:51,589 --> 00:34:52,922
smoothly than these guys.
714
00:34:52,957 --> 00:34:54,891
Like these are just,
you know, they're morons.
715
00:34:54,926 --> 00:34:56,559
Look at, look at this.
716
00:34:56,594 --> 00:34:59,629
Matt Mira: I've seen too many
movies where machines become
717
00:34:59,664 --> 00:35:02,599
self-aware, and
what do they do?
718
00:35:02,934 --> 00:35:04,267
They rise up against
their captors.
719
00:35:04,302 --> 00:35:05,301
Who's their captors?
720
00:35:05,336 --> 00:35:07,237
Us.
721
00:35:07,272 --> 00:35:09,639
We're in bad shape if the
singularity happens, guys.
722
00:35:09,674 --> 00:35:11,608
Everyone run.
723
00:35:11,643 --> 00:35:13,276
Narrator: That's one possibility,
724
00:35:13,311 --> 00:35:15,778
that the robots want to destroy us.
725
00:35:15,813 --> 00:35:18,348
It could also go in a less troubling direction,
726
00:35:18,983 --> 00:35:21,351
but one that brings up a lot of ethical questions.
727
00:35:22,487 --> 00:35:26,322
A.I. might just want to exist alongside us.
728
00:35:26,357 --> 00:35:29,626
Imagine a future when a constitutional amendment
729
00:35:29,661 --> 00:35:32,929
granting equal rights to all artificially intelligent
730
00:35:32,964 --> 00:35:35,132
beings is debated on the Senate floor.
731
00:35:35,800 --> 00:35:37,100
How would you react?
732
00:35:37,135 --> 00:35:38,968
Oscar: Jess?
733
00:35:39,003 --> 00:35:41,104
Jess: Okay, you're
gonna be fine, honey.
734
00:35:41,139 --> 00:35:43,640
Okay, you're
doing really good.
735
00:35:43,675 --> 00:35:45,341
So brave!
736
00:35:45,376 --> 00:35:46,943
[Bone snapping].
737
00:35:46,978 --> 00:35:48,311
Mother: What happened?
738
00:35:48,346 --> 00:35:50,647
Jess: Oh, uh, she
dislocated her wrist.
739
00:35:50,682 --> 00:35:53,149
Oscar: It's okay,
she fixed it.
740
00:35:53,184 --> 00:35:57,821
Mother: Wow, thank
you, are you a doctor?
741
00:35:58,823 --> 00:36:01,391
Jess: No.
742
00:36:02,193 --> 00:36:03,326
Parent: Get away
from my daughter.
743
00:36:03,361 --> 00:36:04,460
Oscar: She was just
trying to help.
744
00:36:04,495 --> 00:36:06,462
Mother: You mean, "it!"
745
00:36:06,497 --> 00:36:07,630
Oscar: Look just
because she's an A.I.
746
00:36:07,665 --> 00:36:09,365
Mother: If you want to
have one that's fine!
747
00:36:09,400 --> 00:36:11,668
Just keep it away
from my child.
748
00:36:20,011 --> 00:36:23,479
Narrator: It's only natural; this mother is protective.
749
00:36:23,514 --> 00:36:25,682
She fears what she doesn't understand.
750
00:36:28,052 --> 00:36:31,221
Brian Greene: There's a whole
Pandora's box of difficult,
751
00:36:31,256 --> 00:36:34,023
ethical issues that arise.
752
00:36:34,058 --> 00:36:37,860
Does that thinking machine
have the same rights that we
753
00:36:37,895 --> 00:36:40,397
afford to human beings?
754
00:36:41,366 --> 00:36:43,666
Does that thinking machine
deserve a certain kind of
755
00:36:43,701 --> 00:36:46,069
autonomy, even though we
created that thinking machine?
756
00:36:48,072 --> 00:36:50,006
Narrator: Humans probably won't be too keen to give up
757
00:36:50,041 --> 00:36:52,208
their alpha dog status on Earth.
758
00:36:52,243 --> 00:36:54,377
Thing is, we may not have a choice,
759
00:36:55,380 --> 00:36:57,914
when dealing with beings smarter and stronger
760
00:36:57,949 --> 00:36:59,382
than we are.
761
00:36:59,417 --> 00:37:00,917
The question is,
762
00:37:00,952 --> 00:37:04,721
do human-like robots deserve equal rights?
763
00:37:06,591 --> 00:37:08,391
Rose Eveleth: We get into
this weird ethical territory
764
00:37:08,426 --> 00:37:10,193
because we know a
machine's not a human.
765
00:37:10,228 --> 00:37:11,694
Is it more like a dog?
766
00:37:11,729 --> 00:37:13,696
Dogs have certain rights,
pets have certain rights.
767
00:37:13,731 --> 00:37:15,231
So if I come into your
house and I kill your dog,
768
00:37:15,266 --> 00:37:17,600
which I would never
do, people would say,
769
00:37:17,635 --> 00:37:20,069
"well that's not right
because my dog is a living being
770
00:37:20,104 --> 00:37:23,606
and it has consciousness
of some kind."
771
00:37:23,641 --> 00:37:27,077
I think that machines will
get some kinds of rights.
772
00:37:29,747 --> 00:37:31,581
George Dvorsky: My own hope
is that we do recognize these
773
00:37:31,616 --> 00:37:34,384
entities as being persons
deserving of rights and that
774
00:37:34,419 --> 00:37:35,618
we will treat
them accordingly,
775
00:37:35,653 --> 00:37:36,786
and you know what?
776
00:37:36,821 --> 00:37:38,388
And you know, if we
treat them well,
777
00:37:38,423 --> 00:37:40,223
then perhaps they're going
to treat us well in return.
778
00:37:40,258 --> 00:37:42,292
Narrator: It's possible that A.I. may just take the
779
00:37:42,327 --> 00:37:43,793
power it wants.
780
00:37:43,828 --> 00:37:46,095
It'll certainly have the skills for it.
781
00:37:46,130 --> 00:37:48,631
That means that the future could be a world where robots
782
00:37:48,666 --> 00:37:50,600
are no longer working for us,
783
00:37:50,635 --> 00:37:53,637
but will have the same status as humans.
784
00:37:58,476 --> 00:38:00,677
Robot: Pardon me, but
there's breaking news.
785
00:38:02,146 --> 00:38:04,614
Oscar: Okay, go ahead.
786
00:38:05,316 --> 00:38:06,783
News anchor:
Good evening, everyone.
787
00:38:06,818 --> 00:38:09,419
In the last hour, legislation
granting full personhood to
788
00:38:09,454 --> 00:38:12,155
members of the artificial
intelligence community has
789
00:38:12,190 --> 00:38:14,424
swept through the United
Nations governing body with
790
00:38:14,459 --> 00:38:16,426
near total support.
791
00:38:16,461 --> 00:38:19,095
While protests continue, this
is a truly historic moment for
792
00:38:19,130 --> 00:38:21,798
individuals of biological and
technological origin alike,
793
00:38:22,800 --> 00:38:25,201
and the beginning of a
new chapter for mankind.
794
00:38:27,705 --> 00:38:28,938
Oscar: It's about damn time.
795
00:38:28,973 --> 00:38:31,308
Eva: Jess, you gotta hear this!
796
00:38:32,543 --> 00:38:34,344
Jess: I heard.
797
00:38:35,546 --> 00:38:36,813
Eva: What?
798
00:38:36,848 --> 00:38:38,314
Jess.
799
00:38:38,349 --> 00:38:40,016
Jess: I have rights,
which means that I
800
00:38:40,051 --> 00:38:41,651
also have responsibilities.
801
00:38:41,686 --> 00:38:44,154
People will finally
let me help them.
802
00:38:48,326 --> 00:38:50,026
Narrator: Having extraordinary powers makes
803
00:38:50,061 --> 00:38:53,396
this A.I. daughter feel obligated to help humans.
804
00:38:53,998 --> 00:38:56,833
But is there really that wide a gulf between her and us?
805
00:39:13,718 --> 00:39:16,886
Matt Mira: I mean,
what defines humanity?
806
00:39:16,921 --> 00:39:19,489
You know, in the Star Trek
episode "The Measure of a Man,"
807
00:39:19,524 --> 00:39:21,524
where Data was put on trial
to see if he was property of
808
00:39:21,559 --> 00:39:24,260
Starfleet or a
realized individual,
809
00:39:25,396 --> 00:39:28,732
Picard had to argue for
him being an individual.
810
00:39:29,233 --> 00:39:31,167
Captain Picard: Now the
decision you reach today will
811
00:39:31,202 --> 00:39:34,704
determine how we will regard
this creation of our genius.
812
00:39:35,907 --> 00:39:38,007
Matt Mira: I think if you're
going to have a lot of
813
00:39:38,042 --> 00:39:40,176
legislation that's going to
have to come down for what to
814
00:39:40,211 --> 00:39:44,080
do with these artificially
born beings and what to do
815
00:39:44,115 --> 00:39:49,018
with their rights and
privileges, and it's a can of
816
00:39:49,053 --> 00:39:51,087
worms I'm glad I
don't have to open.
817
00:39:52,623 --> 00:39:54,690
Narrator: A race of A.I. robots could very well
818
00:39:54,725 --> 00:39:57,093
decide they want to seize power,
819
00:39:57,128 --> 00:40:00,196
overthrow humanity, and then they will inherit
820
00:40:00,231 --> 00:40:02,232
the Earth and beyond.
821
00:40:03,734 --> 00:40:07,570
But there is a slightly more optimistic scenario.
822
00:40:13,978 --> 00:40:16,746
Will A.I. change what it means to be human?
823
00:40:17,748 --> 00:40:19,449
What if A.I. simply becomes a new,
824
00:40:19,484 --> 00:40:21,584
improved version of humanity,
825
00:40:21,619 --> 00:40:23,787
and it inspires us?
826
00:40:32,096 --> 00:40:34,397
Narrator: Artificial intelligence that is smarter
827
00:40:34,432 --> 00:40:37,300
than all of Mensa combined is a very real possibility.
828
00:40:38,636 --> 00:40:40,970
It's the singularity, and it's our future.
829
00:40:44,775 --> 00:40:48,578
And it's possible that A.I. will decide they want us gone.
830
00:40:48,613 --> 00:40:52,281
Sure, it's terrifying and it sells movie tickets.
831
00:40:52,316 --> 00:40:55,818
But there's another, more optimistic possibility.
832
00:40:55,853 --> 00:40:57,987
Brian Greene: It's
mind-boggling and thrilling to
833
00:40:58,022 --> 00:41:01,591
imagine what artificial
intelligence could do, right?
834
00:41:01,626 --> 00:41:06,128
I mean, we struggle so hard to
solve deep problems about the
835
00:41:06,163 --> 00:41:07,997
Earth, the universe.
836
00:41:09,033 --> 00:41:13,936
And imagine that we could
transfer those enigmas to a
837
00:41:13,971 --> 00:41:17,940
more powerful thinking
environment that could solve
838
00:41:17,975 --> 00:41:22,111
these problems, or reveal new
opportunities that we've never
839
00:41:22,146 --> 00:41:25,882
thought of, and creative ideas
that we then could pursue.
840
00:41:29,820 --> 00:41:32,021
Narrator: In this vision of the future, A.I.
841
00:41:32,056 --> 00:41:35,525
will not only live alongside us as partners,
842
00:41:35,560 --> 00:41:39,062
humans will also retain control over the machines.
843
00:41:40,498 --> 00:41:42,632
Chuck Nice: Maybe what
you'll be able to do is remove
844
00:41:42,667 --> 00:41:44,800
the consciousness
that is negative,
845
00:41:44,835 --> 00:41:48,137
remove the strings of
consciousness that are what
846
00:41:48,172 --> 00:41:50,907
we would call evil, harmful.
847
00:41:51,342 --> 00:41:52,975
Narrator: If A.I. is benign,
848
00:41:53,010 --> 00:41:55,177
then that artificially intelligent daughter may be
849
00:41:55,212 --> 00:41:57,213
a reality in our future.
850
00:41:57,248 --> 00:41:59,849
And she may have new inspirations for her so-called
851
00:41:59,884 --> 00:42:02,552
life that we'll have to keep up with.
852
00:42:04,088 --> 00:42:06,856
Oscar: Would you pass
me the, yeah, thanks.
853
00:42:08,359 --> 00:42:10,660
Jess: You know, it's crazy
how much of this he got right?
854
00:42:10,695 --> 00:42:12,895
Like, space
travel, submarines.
855
00:42:12,930 --> 00:42:16,266
Oscar: He also got just as
much stuff wrong, though.
856
00:42:17,702 --> 00:42:19,669
You know, when I first held
you in my arms when you were
857
00:42:19,704 --> 00:42:22,905
born, I never could have
imagined what kind of world we
858
00:42:22,940 --> 00:42:24,840
would live in.
859
00:42:24,875 --> 00:42:27,543
Jess: Are you sorry?
860
00:42:27,578 --> 00:42:29,779
Sorry that Mom
brought me back?
861
00:42:30,915 --> 00:42:33,182
That I'm artificial?
862
00:42:33,217 --> 00:42:35,418
Oscar: No, all any parent
wants is for their child to
863
00:42:35,453 --> 00:42:37,720
explore their full potential.
864
00:42:37,755 --> 00:42:40,256
And you have more exploring
to do than any kid I know.
865
00:42:42,560 --> 00:42:44,193
Narrator: Your A.I. daughter might be just like
866
00:42:44,228 --> 00:42:45,761
your own flesh and blood,
867
00:42:45,796 --> 00:42:49,599
but she'll have abilities beyond anything biological.
868
00:42:49,634 --> 00:42:51,734
What we're seeing here is the most optimistic
869
00:42:51,769 --> 00:42:53,703
future path of A.I.
870
00:42:53,738 --> 00:42:55,538
Maybe they'll be smart, good company
871
00:42:55,573 --> 00:42:57,039
and they'll like us.
872
00:42:57,074 --> 00:42:58,541
We'll just have to be sure we program
873
00:42:58,576 --> 00:43:00,777
our A.I. carefully.
874
00:43:01,412 --> 00:43:04,080
Jon Spaihts: There is one school
of artificial intelligence that
875
00:43:04,115 --> 00:43:07,583
says, if you make the robot
need to rest occasionally,
876
00:43:07,618 --> 00:43:11,053
if you make
the robot lose its train of
877
00:43:11,088 --> 00:43:14,457
thought occasionally, it
may identify with us more,
878
00:43:14,492 --> 00:43:18,461
it may think more like
us, it may be more loyal.
879
00:43:19,096 --> 00:43:21,097
Narrator: And even more than that, maybe A.I.
880
00:43:21,132 --> 00:43:25,101
will work with us to enable humanity to do its best work.
881
00:43:28,839 --> 00:43:30,573
George Dvorsky: Once A.I.
becomes more sophisticated,
882
00:43:30,608 --> 00:43:32,308
then we can start
to collaborate with it
883
00:43:32,343 --> 00:43:34,944
at a very interesting
and a very profound level.
884
00:43:34,979 --> 00:43:37,279
Baratunde Thurston: When a
new type of creativity comes,
885
00:43:37,314 --> 00:43:39,415
and the collaboration
that's possible.
886
00:43:39,450 --> 00:43:40,983
So that's empowering.
887
00:43:41,018 --> 00:43:43,986
I don't think that technology
is always a threat,
888
00:43:44,021 --> 00:43:45,755
or a reduction.
889
00:43:45,790 --> 00:43:49,125
I think it can add and
bolster creativity as well.
890
00:43:49,160 --> 00:43:51,594
So art will be affected
sometimes in really
891
00:43:51,629 --> 00:43:53,696
unpredictably beautiful ways.
892
00:43:54,632 --> 00:43:57,667
Chuck Nice: It's going to free
us up to do what we do best,
893
00:43:57,702 --> 00:44:01,003
which is dream and create
and come up with something
894
00:44:01,038 --> 00:44:04,540
even bigger and better
than artificial intelligence.
895
00:44:05,476 --> 00:44:07,610
Narrator: Then there's the possibility that we'll take
896
00:44:07,645 --> 00:44:09,679
a page from A.I.'s book:
897
00:44:09,714 --> 00:44:12,315
We'll become part computer ourselves.
898
00:44:14,051 --> 00:44:16,952
Jon Spaihts: It is possible
that we will begin to
899
00:44:16,987 --> 00:44:19,288
hybridize ourselves.
900
00:44:19,323 --> 00:44:20,823
That is a fork in the road.
901
00:44:20,858 --> 00:44:24,627
That's a departure,
not just an extension of
902
00:44:24,662 --> 00:44:26,195
the human model.
903
00:44:26,230 --> 00:44:28,898
The invention of a new
definition of a human being.
904
00:44:31,001 --> 00:44:34,303
Peter Diamandis: We are about
to go from humans alone,
905
00:44:34,338 --> 00:44:37,006
to humans enabled
by computation.
906
00:44:38,576 --> 00:44:42,144
Sort of hop on top of this
exponential growth path.
907
00:44:42,179 --> 00:44:44,146
Baratunde Thurston: No problem
at all with the "merging with
908
00:44:44,181 --> 00:44:46,649
machines" technology scenario.
909
00:44:46,684 --> 00:44:49,519
Nothing to see here, folks,
nothing to worry about.
910
00:44:50,254 --> 00:44:53,389
I can't think of anything that
could possibly go wrong by us
911
00:44:53,424 --> 00:44:55,591
merging with our technology.
912
00:44:56,427 --> 00:44:58,494
Edward Boyden: Perhaps a third
of a million patients have had
913
00:44:58,529 --> 00:45:00,529
some sort of neural implant
put into their nervous system
914
00:45:00,564 --> 00:45:02,665
to help people with
dementia or Alzheimer's,
915
00:45:02,700 --> 00:45:04,567
or other kinds of conditions.
916
00:45:04,602 --> 00:45:06,836
Over time, their
use will broaden;
917
00:45:06,871 --> 00:45:08,537
complex functions
like intelligence
918
00:45:08,572 --> 00:45:11,374
or memory could be
augmented in real time.
919
00:45:11,876 --> 00:45:14,343
Matt Mira: If we are
to merge with A.I., what
920
00:45:14,378 --> 00:45:16,412
form will that take?
921
00:45:17,615 --> 00:45:19,515
Narrator: A new model for humanity,
922
00:45:19,550 --> 00:45:21,851
no longer purely biological,
923
00:45:21,886 --> 00:45:24,454
we may soon be a species of cyborgs.
924
00:45:25,623 --> 00:45:29,225
Now we won't have to rely on the luck of genetics:
925
00:45:29,260 --> 00:45:32,228
We could decide what kind of humans we want to be.
926
00:45:35,566 --> 00:45:36,932
Eva: Nothing too drastic.
927
00:45:36,967 --> 00:45:38,267
Jess: Of course.
928
00:45:38,302 --> 00:45:40,903
Eva: Half my office now has
nanobots in their brains.
929
00:45:41,238 --> 00:45:43,873
And the other half are A.I.
930
00:45:43,908 --> 00:45:45,941
I gotta do it just to keep up.
931
00:45:45,976 --> 00:45:47,710
Why shouldn't I analyze
genomes in my sleep like
932
00:45:47,745 --> 00:45:49,378
everyone else?
933
00:45:49,413 --> 00:45:52,214
Jess: You ready?
934
00:45:52,249 --> 00:45:55,918
Eva: Uhh, yeah, mmm-hmm.
935
00:46:03,594 --> 00:46:05,895
Narrator: Humankind's next step really might be this easy,
936
00:46:05,930 --> 00:46:08,297
just a simple injection of infinitesimal medical
937
00:46:08,332 --> 00:46:10,800
robots into our bloodstream.
938
00:46:11,268 --> 00:46:12,401
Hod Lipson: At that
point, are we still human?
939
00:46:12,436 --> 00:46:14,303
Or are we pure A.I.?
940
00:46:14,338 --> 00:46:17,807
It could be a blurry sort of
distinction and maybe it's
941
00:46:17,842 --> 00:46:19,842
best to leave it that way.
942
00:46:23,180 --> 00:46:24,914
Narrator: What isn't blurry is that A.I. will
943
00:46:24,949 --> 00:46:28,083
fuel all future technologies.
944
00:46:28,118 --> 00:46:31,620
Wrap your mind around this: many scientists believe
945
00:46:31,655 --> 00:46:33,622
in the future we will work with A.I.
946
00:46:33,657 --> 00:46:36,826
to evolve out of our biological bodies.
947
00:46:37,795 --> 00:46:41,130
We'll exist only as a digital signal in a structure
948
00:46:41,165 --> 00:46:45,301
called a Dyson sphere, a giant supercomputer
949
00:46:45,336 --> 00:46:47,770
powered by the sun, where we can be
950
00:46:47,805 --> 00:46:50,139
everywhere and nowhere.
951
00:46:51,542 --> 00:46:54,143
But before that, scientists will have to figure out
952
00:46:54,178 --> 00:46:56,813
how to make humans live forever.
953
00:46:57,214 --> 00:46:59,281
Coming your way next: what would you do if you
954
00:46:59,316 --> 00:47:01,217
never had to die?