3
00:00:18,251 --> 00:00:20,788
Even if we can
change human beings,
4
00:00:21,210 --> 00:00:23,160
in what direction
do we change them?
5
00:00:23,189 --> 00:00:26,329
Do we want to change them
this way or that way?
6
00:00:26,359 --> 00:00:30,205
This is an example
of the way in which
7
00:00:30,230 --> 00:00:35,270
technological advance impinges
on sociological necessity.
8
00:00:35,301 --> 00:00:43,301
♪
9
00:00:56,356 --> 00:00:58,199
What we need to ask
ourselves is,
10
00:00:58,224 --> 00:01:02,434
"Where does my individual
ability to control my life
11
00:01:02,462 --> 00:01:05,272
or to influence
the political process
12
00:01:05,298 --> 00:01:08,768
lie in relation to these
new forms of technology?"
13
00:01:08,802 --> 00:01:11,646
♪
14
00:01:11,671 --> 00:01:13,207
Government and politicians
15
00:01:13,239 --> 00:01:15,820
don't understand
what's happening.
16
00:01:15,108 --> 00:01:17,486
See, they don't even realize
this change is happening.
17
00:01:17,510 --> 00:01:18,750
♪
18
00:01:18,778 --> 00:01:21,816
It is very difficult,
not impossible,
19
00:01:21,848 --> 00:01:25,762
to predict what
the precise effects will be,
20
00:01:25,785 --> 00:01:30,700
and in--in many cases,
like with other technologies,
21
00:01:30,723 --> 00:01:33,260
we have to suck it and see.
22
00:01:33,293 --> 00:01:35,603
Who would have
predicted the internet?
23
00:01:35,628 --> 00:01:38,404
And I talk about this
matter as humanity 2.0
24
00:01:38,431 --> 00:01:40,274
'cause in a sense, this is
kind of where we're heading,
25
00:01:40,300 --> 00:01:42,746
to some kind of
new--new normal, as it were,
26
00:01:42,769 --> 00:01:45,340
of what it is
to be a human being.
27
00:01:45,371 --> 00:01:47,544
It's not a problem
28
00:01:47,574 --> 00:01:49,679
that we should dismiss
or underestimate.
29
00:01:49,709 --> 00:01:51,882
It's staggering
in its proportions.
30
00:01:52,112 --> 00:01:54,888
Ignorance and disbelief
at the same time.
31
00:01:55,115 --> 00:01:59,495
People don't believe that
change is happening this fast.
32
00:01:59,519 --> 00:02:00,862
That's the problem.
33
00:02:00,887 --> 00:02:06,360
♪
34
00:02:06,392 --> 00:02:08,668
This is a stone
35
00:02:08,695 --> 00:02:10,902
formed naturally
in the Earth's crust
36
00:02:11,131 --> 00:02:14,340
over millions of years
through pressure and heat.
37
00:02:14,367 --> 00:02:18,611
It was discovered in
the Olduvai Gorge in Tanzania.
38
00:02:18,638 --> 00:02:22,313
Dated around 2.5
million years BC,
39
00:02:22,342 --> 00:02:26,449
it is arguably one of the first
examples of technology.
40
00:02:26,479 --> 00:02:29,483
Stone tools were first adapted
for the use of cutting,
41
00:02:29,516 --> 00:02:34,226
scraping, or pounding
materials by Homo habilis,
42
00:02:34,254 --> 00:02:37,633
one of our earliest ancestors.
43
00:02:37,657 --> 00:02:39,659
Over one million years later,
44
00:02:39,692 --> 00:02:42,195
mankind made one of
the most significant
45
00:02:42,228 --> 00:02:45,266
of all technological
discoveries...
46
00:02:45,298 --> 00:02:46,572
Fire.
47
00:02:46,599 --> 00:02:48,476
The ability to control fire
48
00:02:48,501 --> 00:02:51,175
was a turning point
for human evolution.
49
00:02:51,204 --> 00:02:54,310
It kept us warm, allowed us
to see in the dark,
50
00:02:54,340 --> 00:02:56,286
and allowed us to cook food,
51
00:02:56,309 --> 00:02:59,552
which many scientists believe
was a huge contributor
52
00:02:59,579 --> 00:03:01,354
to the ascent of mind.
53
00:03:01,381 --> 00:03:02,917
♪
54
00:03:02,949 --> 00:03:07,261
Each age, each empire,
has brought with it
55
00:03:07,287 --> 00:03:11,292
the discovery and invention
of numerous technologies
56
00:03:11,324 --> 00:03:15,329
each in their own way
redesigning human life...
57
00:03:15,361 --> 00:03:17,602
♪
58
00:03:17,630 --> 00:03:20,304
...leading us to now...
59
00:03:20,333 --> 00:03:21,937
...modern-day society.
60
00:03:23,803 --> 00:03:27,341
We're now more advanced,
connected, knowledgeable,
61
00:03:27,373 --> 00:03:31,378
and resistant to disease
than ever before,
62
00:03:31,411 --> 00:03:33,322
and it is all due
to our ability
63
00:03:33,346 --> 00:03:36,919
to apply scientific knowledge
for practical purposes
64
00:03:36,950 --> 00:03:40,363
in a bid to maximize
efficiency.
65
00:03:40,386 --> 00:03:44,300
Just as the stone set us
on a path of transformation,
66
00:03:44,324 --> 00:03:46,463
the technologies
of the future
67
00:03:46,492 --> 00:03:49,473
may bring with them
a paradigm shift,
68
00:03:49,495 --> 00:03:54,274
changing two major features
of the human experience.
69
00:03:54,300 --> 00:03:56,541
Two things that have
defined our lives
70
00:03:56,569 --> 00:03:58,947
for as long as
we can remember.
71
00:03:58,972 --> 00:04:04,120
Two things that have always
been involuntary constants:
72
00:04:04,244 --> 00:04:07,282
trading our time
for sustenance
73
00:04:07,313 --> 00:04:09,884
and losing that time
through senescence.
74
00:04:09,916 --> 00:04:14,661
♪
75
00:04:19,392 --> 00:04:23,534
♪
76
00:04:23,563 --> 00:04:25,941
(indistinct chattering)
77
00:04:25,965 --> 00:04:29,276
♪
78
00:04:29,302 --> 00:04:30,747
(telephone ringing)
79
00:04:30,770 --> 00:04:36,150
♪
80
00:04:36,420 --> 00:04:38,545
(zapping)
81
00:04:38,578 --> 00:04:46,520
♪
82
00:04:46,552 --> 00:04:48,862
(laughing)
83
00:04:48,888 --> 00:04:56,888
♪
84
00:04:57,463 --> 00:04:59,704
The Industrial Revolution
effectively freed man
85
00:04:59,732 --> 00:05:01,405
from being a beast of burden.
86
00:05:01,434 --> 00:05:03,573
The computer revolution
will similarly free him
87
00:05:03,603 --> 00:05:06,150
from dull, repetitive routine.
88
00:05:06,390 --> 00:05:07,712
The computer
revolution is, however,
89
00:05:07,740 --> 00:05:09,981
perhaps better compared
with the Copernican
90
00:05:10,900 --> 00:05:12,460
or the Darwinian Revolution,
91
00:05:12,780 --> 00:05:15,525
both of which greatly changed
man's idea of himself
92
00:05:15,548 --> 00:05:17,585
and the world
in which he lives.
93
00:05:17,617 --> 00:05:21,360
In the space of 60 years,
we have landed on the moon,
94
00:05:21,387 --> 00:05:23,890
seen the rise of
computing power,
95
00:05:23,923 --> 00:05:25,596
mobile phones,
96
00:05:25,625 --> 00:05:27,730
the explosion
of the internet,
97
00:05:27,760 --> 00:05:31,333
and we have sequenced
the human genome.
98
00:05:31,364 --> 00:05:33,640
We took man to
the moon and back
99
00:05:33,666 --> 00:05:35,907
with four kilobytes of memory.
100
00:05:35,935 --> 00:05:40,782
The phone in your pocket
is at least 250,000 times
101
00:05:40,807 --> 00:05:43,140
more powerful than that.
102
00:05:43,420 --> 00:05:46,580
We are ever-increasingly
doing more with less.
103
00:05:46,612 --> 00:05:48,387
One of the things
that has been born
104
00:05:48,414 --> 00:05:50,951
out of this
technological revolution
105
00:05:50,983 --> 00:05:54,210
is the ability
to replace human workers
106
00:05:54,530 --> 00:05:56,966
with more efficient machines.
107
00:05:56,989 --> 00:05:58,866
This is largely
due to the speed
108
00:05:58,891 --> 00:06:02,429
at which we are advancing our
technological capabilities.
109
00:06:02,462 --> 00:06:04,499
(applause)
110
00:06:04,530 --> 00:06:08,740
Information technology grows
in an exponential manner.
111
00:06:08,768 --> 00:06:10,714
It's not linear.
112
00:06:10,737 --> 00:06:12,683
And our intuition is linear.
113
00:06:12,705 --> 00:06:14,150
When we walked
through the savanna
114
00:06:14,374 --> 00:06:16,115
a thousand years ago,
we made linear predictions
115
00:06:16,142 --> 00:06:18,190
where that animal would be
and that worked fine.
116
00:06:18,440 --> 00:06:20,422
It's hardwired in our brains,
117
00:06:20,446 --> 00:06:22,949
but the pace of
exponential growth
118
00:06:22,982 --> 00:06:26,623
is really what describes
information technologies,
119
00:06:26,652 --> 00:06:28,723
and it's not just computation.
120
00:06:28,755 --> 00:06:30,980
There's a big difference
between linear
121
00:06:30,123 --> 00:06:31,124
and exponential growth.
122
00:06:31,157 --> 00:06:33,831
If I take 30 steps linearly,
123
00:06:33,860 --> 00:06:36,898
one, two, three, four,
five, I get to 30.
124
00:06:36,929 --> 00:06:39,000
If I take 30 steps
exponentially,
125
00:06:39,310 --> 00:06:42,569
two, four, eight, sixteen,
I get to a billion.
126
00:06:42,602 --> 00:06:44,445
It makes a huge difference.
127
00:06:44,470 --> 00:06:46,780
And that really describes
information technology.
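
[A minimal sketch of the arithmetic being described here -- thirty unit steps versus thirty doublings, assuming nothing beyond the numbers quoted in the dialogue:]

```python
# Minimal sketch of "30 steps linearly" versus "30 steps exponentially".
linear_total = 30 * 1          # 1, 2, 3, ... -> 30
exponential_total = 2 ** 30    # 2, 4, 8, 16, ... -> 1,073,741,824

print(linear_total)       # 30
print(exponential_total)  # 1073741824 -- roughly a billion
```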
128
00:06:46,806 --> 00:06:48,843
When I was a student at MIT,
129
00:06:48,875 --> 00:06:51,480
we all shared one computer,
it took up a whole building.
130
00:06:51,770 --> 00:06:52,784
The computer in
your cell phone today
131
00:06:52,812 --> 00:06:55,793
is a million times cheaper,
a million times smaller,
132
00:06:55,815 --> 00:06:58,220
a thousand times
more powerful.
133
00:06:58,500 --> 00:07:00,894
That's a billionfold increase
in capability per dollar
134
00:07:00,920 --> 00:07:02,524
that we've actually
experienced
135
00:07:02,555 --> 00:07:04,910
since I was a student,
136
00:07:04,123 --> 00:07:07,593
and we're gonna do it again
in the next 25 years.
137
00:07:07,627 --> 00:07:10,540
Currently,
on an almost daily basis,
138
00:07:10,563 --> 00:07:13,169
new algorithms, programs,
139
00:07:13,199 --> 00:07:15,736
and feats in
mechanical engineering
140
00:07:15,768 --> 00:07:19,181
are getting closer and closer
to being a reliable
141
00:07:19,205 --> 00:07:24,211
and more cost-effective
alternative to a human worker.
142
00:07:24,444 --> 00:07:28,119
This process is
known as automation.
143
00:07:28,147 --> 00:07:30,923
This is not just about,
144
00:07:30,950 --> 00:07:32,827
you know, automation
where we expect it,
145
00:07:32,852 --> 00:07:34,229
which is in factories
146
00:07:34,454 --> 00:07:36,559
and among blue-collar
workers and so forth.
147
00:07:36,589 --> 00:07:39,690
It is coming
quite aggressively
148
00:07:39,910 --> 00:07:40,968
for people at much
higher skill levels,
149
00:07:40,993 --> 00:07:42,836
and that will only
grow in the future
150
00:07:42,862 --> 00:07:45,706
as we continue on
this exponential arc.
151
00:07:45,731 --> 00:07:48,678
This business that, you know,
not having to work very hard
152
00:07:48,701 --> 00:07:51,113
because machines are taking
care of things for you,
153
00:07:51,137 --> 00:07:53,140
I mean, you see this
also in the 19th century
154
00:07:53,390 --> 00:07:54,609
with the Industrial Revolution.
155
00:07:54,640 --> 00:07:57,621
And, in fact, I think
one of the problems
156
00:07:57,643 --> 00:07:59,145
with the Industrial Revolution,
157
00:07:59,178 --> 00:08:02,910
and this is where Marxism
got so much traction,
158
00:08:02,114 --> 00:08:05,254
was that machines
actually did render
159
00:08:05,485 --> 00:08:07,624
a lot of people
unemployed, okay?
160
00:08:07,653 --> 00:08:10,964
That already happened in
the 19th and 20th centuries.
161
00:08:10,990 --> 00:08:14,620
And it was only by
labor organizing itself
162
00:08:14,930 --> 00:08:15,265
that it was able
to kind of deal
163
00:08:15,495 --> 00:08:17,600
with the situation
intelligently
164
00:08:17,630 --> 00:08:19,507
because there was no
automatic, you might say,
165
00:08:19,532 --> 00:08:21,170
transition to something else.
166
00:08:21,200 --> 00:08:23,900
It was just, you know,
"We don't need you anymore.
167
00:08:23,350 --> 00:08:24,514
We have these more
efficient machines,
168
00:08:24,537 --> 00:08:25,982
and so now
we don't--you know,
169
00:08:26,500 --> 00:08:27,780
now you just have to find
work somewhere else."
170
00:08:27,807 --> 00:08:30,253
Automation clearly has been
happening for a long time,
171
00:08:30,276 --> 00:08:33,520
and it has,
you know, automated
172
00:08:33,790 --> 00:08:35,787
a lot of very laborious work
that we don't want to do,
173
00:08:35,815 --> 00:08:38,125
and that's gonna continue
to be the case in the future,
174
00:08:38,150 --> 00:08:41,529
but I do think that this time
is genuinely different.
175
00:08:41,554 --> 00:08:43,261
If we look at what's
happened historically,
176
00:08:43,289 --> 00:08:45,640
what we've seen
is that automation
177
00:08:45,910 --> 00:08:47,298
has primarily been
a mechanical phenomenon,
178
00:08:47,527 --> 00:08:49,564
and the classic
example of that
179
00:08:49,595 --> 00:08:50,869
is, of course, agriculture.
180
00:08:50,897 --> 00:08:52,535
I'm a farmer.
181
00:08:52,565 --> 00:08:54,602
Here's what mechanical
engineering has done
182
00:08:54,634 --> 00:08:56,580
for all of us who
work on the farms
183
00:08:56,602 --> 00:08:57,910
and for you, too.
184
00:08:57,937 --> 00:08:59,678
It used to be,
in the United States
185
00:08:59,705 --> 00:09:00,843
and in most
advanced countries,
186
00:09:00,873 --> 00:09:02,978
that most people
worked on farms.
187
00:09:03,900 --> 00:09:04,818
Now, almost no one
works on a farm.
188
00:09:04,844 --> 00:09:06,653
It's less than two percent.
189
00:09:06,679 --> 00:09:09,570
And, of course, as a result
of that, we're better off.
190
00:09:09,810 --> 00:09:12,221
We have, you know,
more comfortable jobs,
191
00:09:12,251 --> 00:09:13,787
food is cheaper,
192
00:09:13,819 --> 00:09:15,696
we have a much more
advanced society.
193
00:09:15,721 --> 00:09:17,826
The question is, "Can that
continue indefinitely?"
194
00:09:17,857 --> 00:09:19,630
And what we're
seeing this time
195
00:09:19,910 --> 00:09:21,196
is that things are
really quite different.
196
00:09:21,227 --> 00:09:23,200
If this keeps up,
197
00:09:23,290 --> 00:09:26,203
it won't be long before
machines will do everything.
198
00:09:26,232 --> 00:09:28,178
Nobody will have work.
199
00:09:28,200 --> 00:09:32,239
So as we move deeper
into the automated future,
200
00:09:32,271 --> 00:09:35,684
we will see different
stages take form.
201
00:09:35,708 --> 00:09:37,949
The first stage
that we're entering
202
00:09:37,977 --> 00:09:40,719
is the stage where
automated robots are working
203
00:09:40,746 --> 00:09:43,220
side by side with
people in factories.
204
00:09:43,490 --> 00:09:45,154
Some of those jobs
are slowly going away,
205
00:09:45,184 --> 00:09:48,757
but in the near future,
within two to three years,
206
00:09:48,788 --> 00:09:50,859
you're going to see
a huge percentage
207
00:09:50,890 --> 00:09:52,892
of those factory jobs
be replaced
208
00:09:52,925 --> 00:09:55,929
with automated systems
and automated robots.
209
00:09:55,962 --> 00:10:00,843
The next stage following
that is we could see
210
00:10:00,866 --> 00:10:04,109
up to a third
of jobs in America
211
00:10:04,136 --> 00:10:07,879
be replaced by 2025
212
00:10:07,907 --> 00:10:10,251
by robots
or automated systems.
213
00:10:10,276 --> 00:10:12,170
That's a huge
percentage of people
214
00:10:12,440 --> 00:10:13,682
that could be unemployed
215
00:10:13,713 --> 00:10:15,886
because of this
automated tsunami
216
00:10:15,915 --> 00:10:17,258
that's coming, basically.
217
00:10:17,283 --> 00:10:20,753
We have a colleague
here called Carl Frey
218
00:10:20,786 --> 00:10:24,859
who has put together
a list of jobs
219
00:10:24,890 --> 00:10:26,130
by their vulnerability
220
00:10:26,158 --> 00:10:29,162
to getting replaced
by automation,
221
00:10:29,195 --> 00:10:32,404
and the least vulnerable are
things like choreographers,
222
00:10:32,632 --> 00:10:35,909
managers, social workers.
223
00:10:35,935 --> 00:10:38,381
People who have people skills
224
00:10:38,404 --> 00:10:42,450
and who have creativity.
225
00:10:42,740 --> 00:10:50,740
♪
226
00:10:53,119 --> 00:10:55,190
One area that I look
a lot at is fast food.
227
00:10:55,221 --> 00:10:57,326
I mean, the fast food industry
is tremendously important
228
00:10:57,356 --> 00:10:58,858
in the American economy.
229
00:10:58,891 --> 00:11:01,872
If you look at the years
since recovery
230
00:11:01,894 --> 00:11:03,669
from the Great Recession,
231
00:11:03,696 --> 00:11:06,734
the majority of jobs,
somewhere around 60 percent,
232
00:11:06,766 --> 00:11:09,212
have been low-wage
service sector jobs.
233
00:11:09,235 --> 00:11:11,738
A lot of those have been
in areas like fast food,
234
00:11:11,771 --> 00:11:14,809
and yet, to me,
it seems almost inevitable
235
00:11:14,840 --> 00:11:17,446
that, ultimately,
fast food is gonna automate.
236
00:11:17,677 --> 00:11:20,283
There's a company right
here in San Francisco
237
00:11:20,312 --> 00:11:24,818
called "Momentum Machines"
which is actually working on
238
00:11:24,850 --> 00:11:26,921
a machine to automate
hamburger production,
239
00:11:26,952 --> 00:11:28,226
and it can crank out
240
00:11:28,254 --> 00:11:32,930
about 400 gourmet
hamburgers per hour,
241
00:11:32,958 --> 00:11:37,429
and they ultimately expect
to sort of roll that out
242
00:11:37,463 --> 00:11:40,842
not just in fast food
establishments,
243
00:11:40,866 --> 00:11:42,709
perhaps in convenience stores
244
00:11:42,735 --> 00:11:43,941
and maybe even
vending machines.
245
00:11:43,969 --> 00:11:45,312
It could be
all over the place.
246
00:11:45,337 --> 00:11:50,844
♪
247
00:11:50,876 --> 00:11:54,187
I can see manufacturing now
becoming completely automated.
248
00:11:54,213 --> 00:11:56,193
I can see, you know,
hundreds of millions
249
00:11:56,215 --> 00:11:57,888
of workers being
put out of jobs.
250
00:11:57,917 --> 00:11:59,760
That's almost certain
it's gonna happen.
251
00:11:59,785 --> 00:12:01,958
So you have lots of companies
right now that are automating
252
00:12:01,987 --> 00:12:04,240
their factories
and their warehouses.
253
00:12:04,560 --> 00:12:05,763
Amazon is a great example.
254
00:12:05,791 --> 00:12:08,294
They're using robots
to automate their systems.
255
00:12:08,327 --> 00:12:10,364
The robots actually
grab the products
256
00:12:10,396 --> 00:12:11,841
and bring the products
to the people
257
00:12:11,864 --> 00:12:14,435
who put those products
into the box.
258
00:12:14,467 --> 00:12:16,310
So there are still people
259
00:12:16,335 --> 00:12:18,440
within the factories
at Amazon,
260
00:12:18,471 --> 00:12:21,816
but in the near future,
those jobs may go away as well.
261
00:12:21,841 --> 00:12:24,412
There is a company here
in Silicon Valley
262
00:12:24,443 --> 00:12:26,116
called "Industrial Perception,"
263
00:12:26,145 --> 00:12:29,422
and they built a robot
that can approach
264
00:12:29,448 --> 00:12:32,827
a stack of boxes that are
sort of stacked haphazardly
265
00:12:32,852 --> 00:12:35,196
in some nonstandard way
266
00:12:35,221 --> 00:12:38,430
and visually, by looking at
that stack of boxes,
267
00:12:38,457 --> 00:12:40,130
figure out how
to move those boxes.
268
00:12:40,159 --> 00:12:42,161
And they built a machine
that ultimately
269
00:12:42,194 --> 00:12:45,937
will be able to move
perhaps one box every second,
270
00:12:45,965 --> 00:12:48,536
and that compares with
about three seconds
271
00:12:48,768 --> 00:12:51,840
for a human worker
who's very industrious.
272
00:12:51,871 --> 00:12:53,942
Okay, and this machine,
you can imagine,
273
00:12:53,973 --> 00:12:55,770
will work continuously.
274
00:12:55,107 --> 00:12:56,518
It's never gonna get injured,
275
00:12:56,542 --> 00:13:00,319
never file a workers'
compensation claim,
276
00:13:00,346 --> 00:13:02,417
and, yet, it's moving
into an area
277
00:13:02,448 --> 00:13:04,951
that, up until now, at least,
we would have said
278
00:13:04,984 --> 00:13:08,397
is really something that
is exclusively the provid--
279
00:13:08,420 --> 00:13:10,930
province of the human worker.
280
00:13:10,122 --> 00:13:14,537
I mean, it's this ability
to look at something
281
00:13:14,560 --> 00:13:17,268
and then based on
what you see,
282
00:13:17,296 --> 00:13:18,468
manipulate your environment.
283
00:13:18,497 --> 00:13:20,443
It's sort of the confluence
284
00:13:20,466 --> 00:13:23,379
of visual perception
and dexterity.
285
00:13:23,402 --> 00:13:27,111
♪
286
00:13:27,139 --> 00:13:29,346
We'll see self-driving
cars on the road
287
00:13:29,375 --> 00:13:31,116
within 10 or 15 years.
288
00:13:31,143 --> 00:13:32,850
Fifteen years from now,
we'll be debating
289
00:13:32,878 --> 00:13:35,552
whether we should even allow
human beings to be on--
290
00:13:35,581 --> 00:13:37,288
be on the road at all.
291
00:13:37,316 --> 00:13:39,887
Tesla says that by next year,
292
00:13:39,919 --> 00:13:44,459
that, you know, their cars
will be 90 percent automated.
293
00:13:44,490 --> 00:13:46,367
Which means that
the jobs of taxi drivers,
294
00:13:46,392 --> 00:13:48,531
truck drivers, goes away.
295
00:13:48,561 --> 00:13:51,906
Suddenly, we don't need
to own cars anymore.
296
00:13:51,931 --> 00:13:55,174
Humanity isn't ready for such
a basic change such as that.
297
00:13:55,201 --> 00:13:57,875
♪
298
00:13:57,903 --> 00:13:59,541
Call center jobs,
voice recognition
299
00:13:59,572 --> 00:14:01,609
is pretty sophisticated
these days.
300
00:14:01,841 --> 00:14:06,850
And you can imagine
replacing many kinds of,
301
00:14:06,111 --> 00:14:08,220
you know,
helplines and things.
302
00:14:08,470 --> 00:14:10,550
There's a company called
"lPsoft" that has created
303
00:14:10,583 --> 00:14:13,530
an intelligent
software system,
304
00:14:13,552 --> 00:14:16,396
an automated system,
called "Amelia."
305
00:14:16,422 --> 00:14:19,403
She can not only understand
what you're saying to her,
306
00:14:19,425 --> 00:14:22,650
she understands the context
of what you're saying,
307
00:14:22,940 --> 00:14:24,370
and she can learn
from her mistakes.
308
00:14:24,396 --> 00:14:26,569
This is a huge deal because
what we're going to see
309
00:14:26,599 --> 00:14:29,136
is all of the customer
service agent jobs,
310
00:14:29,168 --> 00:14:31,409
if she is successful,
if this software program,
311
00:14:31,437 --> 00:14:34,475
this automated software
program is successful,
312
00:14:34,506 --> 00:14:37,578
we could see all of
those jobs go away.
313
00:14:37,610 --> 00:14:39,647
These things tend
to go to marginal cost,
314
00:14:39,879 --> 00:14:43,486
and marginal cost is copying
software, which is nothing,
315
00:14:43,515 --> 00:14:45,495
and running it on a computer
316
00:14:45,517 --> 00:14:47,155
which will probably be
very cheap.
317
00:14:47,186 --> 00:14:51,134
♪
318
00:14:51,156 --> 00:14:55,263
Human doctors will be,
in some respect, pushed aside
319
00:14:55,294 --> 00:14:56,967
because machines
can do a better job
320
00:14:56,996 --> 00:14:58,634
of diagnosis than they can.
321
00:14:58,664 --> 00:15:00,610
Now will they have the empathy
that current doctors do?
322
00:15:00,633 --> 00:15:02,306
I don't know, but at least
they'll have the knowledge
323
00:15:02,334 --> 00:15:04,211
that our doctors do,
they'll be more advanced,
324
00:15:04,236 --> 00:15:06,450
so I can see
disruption in healthcare.
325
00:15:06,710 --> 00:15:08,574
The one that is likely to be
the biggest growth area,
326
00:15:08,607 --> 00:15:12,770
from an economic standpoint,
is the android companions
327
00:15:12,111 --> 00:15:15,320
to help elderly people,
okay, because there's--
328
00:15:15,347 --> 00:15:18,210
you know, given the rise
in elderly people
329
00:15:18,500 --> 00:15:21,224
over the next 20,
30 years, that it's--
330
00:15:21,253 --> 00:15:23,199
and it's unlikely there are
gonna be enough people
331
00:15:23,222 --> 00:15:25,395
going into
the nursing profession
332
00:15:25,424 --> 00:15:27,927
to actually serve them,
especially if we're thinking
333
00:15:27,960 --> 00:15:31,134
in terms of home-based care.
334
00:15:31,163 --> 00:15:34,700
The robot surgeon, I think,
is something that will happen
335
00:15:34,330 --> 00:15:36,707
in the not-too-distant future
336
00:15:36,936 --> 00:15:41,248
because a lot of that is
to do with manual dexterity
337
00:15:41,273 --> 00:15:46,279
and having the expertise
to recognize--
338
00:15:46,312 --> 00:15:50,283
to understand what you're
manipulating as a surgeon.
339
00:15:50,316 --> 00:15:53,559
Terrific amount of expertise
for a human to accumulate,
340
00:15:53,585 --> 00:15:55,531
but I can imagine that
we would be able to build
341
00:15:55,554 --> 00:15:58,364
something that is
a specialized robot surgeon
342
00:15:58,390 --> 00:16:00,700
that can carry out
particular operations
343
00:16:00,726 --> 00:16:03,104
such as a prostate operation.
344
00:16:03,128 --> 00:16:04,698
That's one that people
are working on right now,
345
00:16:04,730 --> 00:16:06,403
and I think
they're nearly there
346
00:16:06,432 --> 00:16:07,638
of being able to produce
347
00:16:07,666 --> 00:16:10,100
a reliable robot surgeon
that can do that.
348
00:16:10,350 --> 00:16:11,742
You might not want to submit
yourself to this thing,
349
00:16:11,971 --> 00:16:14,420
you might think, but in fact,
I think we'll be able to make
350
00:16:14,730 --> 00:16:17,316
a very reliable robot surgeon
to do that sort of thing.
351
00:16:17,343 --> 00:16:24,693
♪
352
00:16:24,717 --> 00:16:26,424
I can see
disruption in finance
353
00:16:26,452 --> 00:16:28,489
because they're moving
to digital currencies.
354
00:16:28,520 --> 00:16:31,626
And--and we're now
moving to crowdfunding
355
00:16:31,657 --> 00:16:35,230
and crowdbanking
and all these other advances.
356
00:16:35,260 --> 00:16:37,467
One of my favorites
is investment bankers.
357
00:16:37,496 --> 00:16:41,501
Artificial intelligence
already does more
358
00:16:41,533 --> 00:16:44,514
stock market trades
today than any human being.
359
00:16:44,536 --> 00:16:48,211
Lots of decisions like
decisions about mortgages
360
00:16:48,240 --> 00:16:50,652
and insurance, already
those things have been,
361
00:16:50,676 --> 00:16:53,589
you know, largely
taken over by programs,
362
00:16:53,612 --> 00:16:56,388
and I think that kind of trend
is only gonna continue.
363
00:16:56,415 --> 00:17:04,415
♪
364
00:17:08,560 --> 00:17:11,268
Every time there's
a technological change,
365
00:17:11,296 --> 00:17:13,367
it will, unfortunately, put
a lot of people out of work.
366
00:17:13,399 --> 00:17:15,208
It happened with
the cotton gin.
367
00:17:15,234 --> 00:17:18,738
It's happened with every
single technological change.
368
00:17:18,771 --> 00:17:22,446
So, sure, technology
destroys jobs,
369
00:17:22,474 --> 00:17:24,147
but it creates new ones.
370
00:17:24,176 --> 00:17:27,157
Moving from the age of work
that we're in now
371
00:17:27,179 --> 00:17:31,423
into the abundant,
ubiquitous automation age,
372
00:17:31,450 --> 00:17:33,828
that bridge that
we have to cross
373
00:17:34,530 --> 00:17:36,659
is gonna be a very
interesting time period.
374
00:17:36,688 --> 00:17:39,225
I think in the very beginning
of that time period,
375
00:17:39,258 --> 00:17:43,263
you're going to see automation
start to replace jobs,
376
00:17:43,295 --> 00:17:46,708
but those jobs will transfer
into other forms of work.
377
00:17:46,732 --> 00:17:50,770
So, for example, instead
of working in a factory,
378
00:17:50,102 --> 00:17:53,208
you will learn to code
and you will code the robots
379
00:17:53,238 --> 00:17:54,740
that are working
in the factory.
380
00:17:54,773 --> 00:17:58,840
When I was a young man
and I went for careers advice,
381
00:17:58,110 --> 00:18:00,112
I don't know what they
would have made of me
382
00:18:00,145 --> 00:18:04,355
asking for a job
as a webmaster.
383
00:18:04,383 --> 00:18:07,227
It didn't exist, there
wasn't a web at that time.
384
00:18:07,252 --> 00:18:10,324
And, right now, we have
over 200,000 vacancies
385
00:18:10,355 --> 00:18:12,835
for people who can
analyze big data.
386
00:18:12,858 --> 00:18:15,168
And we really do need people
387
00:18:15,194 --> 00:18:17,105
and mechanisms
for analyzing it
388
00:18:17,129 --> 00:18:20,838
and getting the most
information from that data,
389
00:18:20,866 --> 00:18:23,779
and that problem is only
gonna increase in the future.
390
00:18:23,802 --> 00:18:26,442
And I do think that there's
gonna be a lot of employment
391
00:18:26,472 --> 00:18:28,110
moving in that direction.
392
00:18:28,140 --> 00:18:33,180
♪
393
00:18:33,212 --> 00:18:36,318
The history of our country
proves that new inventions
394
00:18:36,348 --> 00:18:40,421
create thousands of jobs
for every one they displace.
395
00:18:40,452 --> 00:18:44,457
So it wasn't long before your
grandfather had a better job
396
00:18:44,490 --> 00:18:46,663
at more pay for less work.
397
00:18:46,692 --> 00:18:48,399
We're always offered
this solution
398
00:18:48,427 --> 00:18:50,703
of still more education,
still more training.
399
00:18:50,729 --> 00:18:52,402
If people lose
their routine job,
400
00:18:52,431 --> 00:18:53,808
then let's send them
back to school.
401
00:18:53,832 --> 00:18:55,709
They'll pick up
some new skills,
402
00:18:55,734 --> 00:18:57,236
they'll learn something new,
and then they'll be able
403
00:18:57,269 --> 00:18:59,875
to move into some
more rewarding career.
404
00:18:59,905 --> 00:19:02,249
That's not gonna operate
so well in the future
405
00:19:02,274 --> 00:19:03,617
where the machines are coming
406
00:19:03,642 --> 00:19:05,553
for those skilled
jobs as well.
407
00:19:05,577 --> 00:19:07,318
The fact is that machines
are really good at
408
00:19:07,346 --> 00:19:08,848
picking up skills
and doing all kinds
409
00:19:08,881 --> 00:19:11,191
of extraordinarily
complex things,
410
00:19:11,216 --> 00:19:12,559
so those jobs
aren't necessarily
411
00:19:12,584 --> 00:19:14,359
gonna be there either.
412
00:19:14,386 --> 00:19:17,265
And a second insight,
I think, is that historically,
413
00:19:17,289 --> 00:19:20,202
it's always been the case that
the vast majority of people
414
00:19:20,225 --> 00:19:22,171
have always done routine work.
415
00:19:22,194 --> 00:19:24,731
So even if people can
make that transition,
416
00:19:24,763 --> 00:19:26,674
if they can succeed
in going back to school
417
00:19:26,698 --> 00:19:29,201
and learning something new,
in percentage terms,
418
00:19:29,234 --> 00:19:31,271
those jobs don't constitute
419
00:19:31,303 --> 00:19:32,873
that much of the total
employment out there.
420
00:19:32,905 --> 00:19:34,748
I mean, most people are doing
these more routine things.
421
00:19:34,773 --> 00:19:37,185
So, you know, we're up
against a real problem
422
00:19:37,209 --> 00:19:39,883
that's probably gonna
require a political solution.
423
00:19:39,912 --> 00:19:44,725
It's probably going to
require direct redistribution.
424
00:19:44,750 --> 00:19:46,423
That's my take on it,
425
00:19:46,451 --> 00:19:49,796
and that's a staggering
political challenge,
426
00:19:49,821 --> 00:19:51,562
especially in
the United States.
427
00:19:51,590 --> 00:19:53,866
This would be fine
if we had generations
428
00:19:53,892 --> 00:19:57,738
to adapt to the change
so that the next generation
429
00:19:57,763 --> 00:19:59,208
could develop
a different lifestyle,
430
00:19:59,231 --> 00:20:00,676
different value system.
431
00:20:00,699 --> 00:20:02,679
The problem is that
all of this is happening
432
00:20:02,701 --> 00:20:03,679
within the same generation.
433
00:20:03,702 --> 00:20:05,875
Within a period of 15 years,
434
00:20:05,904 --> 00:20:09,442
we're gonna start wiping out
most of the jobs that we know.
435
00:20:09,474 --> 00:20:12,284
That's really what worries me.
436
00:20:12,311 --> 00:20:15,758
A term commonly used when
describing the trajectory
437
00:20:15,781 --> 00:20:18,990
of technological progress
and where it's leading us
438
00:20:19,218 --> 00:20:21,892
is the "technological
singularity."
439
00:20:21,920 --> 00:20:23,900
The term is
borrowed from physics
440
00:20:23,922 --> 00:20:27,961
to describe an event horizon
or a moment in space time
441
00:20:27,993 --> 00:20:30,269
that you cannot see beyond.
442
00:20:30,295 --> 00:20:32,502
We are currently
in the transistor era
443
00:20:32,531 --> 00:20:34,602
of information technology.
444
00:20:34,633 --> 00:20:39,480
In 1965, co-founder
of Intel, Gordon Moore,
445
00:20:39,504 --> 00:20:41,950
made the observation
that the processing power
446
00:20:41,974 --> 00:20:45,581
of computers doubles
every 18 months.
447
00:20:45,611 --> 00:20:48,285
The prediction that
this trend will continue
448
00:20:48,313 --> 00:20:50,725
is known as Moore's Law.
449
00:20:50,749 --> 00:20:51,989
When Intel created
450
00:20:52,170 --> 00:20:55,931
their first computer
processing unit in 1971,
451
00:20:55,954 --> 00:20:59,265
it had 2,300 transistors
452
00:20:59,291 --> 00:21:04,100
and had a processing speed
of 740 kilohertz.
453
00:21:04,290 --> 00:21:08,808
Today, a typical CPU has
over a billion transistors
454
00:21:08,834 --> 00:21:11,747
with an average speed
of two gigahertz.
455
00:21:11,770 --> 00:21:15,308
However, many predict
that by 2020,
456
00:21:15,340 --> 00:21:18,344
the miniaturization of
transistors and silicon chips
457
00:21:18,377 --> 00:21:20,323
will reach its limit,
458
00:21:20,345 --> 00:21:25,294
and Moore's Law will fizzle
out into a post-silicon era.
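
[A quick sketch of the doubling arithmetic behind the figures above, assuming only the quoted numbers -- 2,300 transistors in 1971, over a billion today -- and the 18-month doubling period stated in the narration:]

```python
import math

start_transistors = 2_300            # Intel's first processor, 1971
today_transistors = 1_000_000_000    # "over a billion" in a typical modern CPU

doublings = math.log2(today_transistors / start_transistors)
print(f"{doublings:.1f} doublings")                                # ~18.7 doublings
print(f"~{doublings * 1.5:.0f} years at one doubling every 18 months")  # ~28 years
```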
459
00:21:25,317 --> 00:21:27,422
Another way of
describing the term
460
00:21:27,452 --> 00:21:29,932
"technological singularity"
461
00:21:29,955 --> 00:21:32,458
is a time when
artificial intelligence
462
00:21:32,491 --> 00:21:36,337
surpasses human
intellectual capacity.
463
00:21:36,361 --> 00:21:37,897
But does this mean
that a computer
464
00:21:37,929 --> 00:21:39,875
can produce a new idea
465
00:21:39,898 --> 00:21:42,970
or make an original
contribution to knowledge?
466
00:21:43,100 --> 00:21:46,574
Artificial intelligence, AI,
is a longstanding project
467
00:21:46,605 --> 00:21:49,449
which has to do
with basically trying
468
00:21:49,474 --> 00:21:51,784
to use machines as a way
of trying to understand
469
00:21:51,810 --> 00:21:53,551
the nature of intelligence
470
00:21:53,578 --> 00:21:55,819
and, originally,
the idea was, in some sense,
471
00:21:55,847 --> 00:21:58,540
to manufacture
within machines
472
00:21:58,830 --> 00:22:00,723
something that could
simulate human intelligence.
473
00:22:00,752 --> 00:22:02,789
But I think now,
as the years have gone on,
474
00:22:02,821 --> 00:22:04,767
we now think in terms
of intelligence
475
00:22:04,790 --> 00:22:06,980
in a much more abstract way,
476
00:22:06,325 --> 00:22:10,340
so the ability to engage
in massive computations,
477
00:22:10,620 --> 00:22:11,803
right, where you can
end up making
478
00:22:11,830 --> 00:22:14,367
quite intelligent decisions
much more quickly
479
00:22:14,399 --> 00:22:15,810
than a human being can.
480
00:22:15,834 --> 00:22:17,745
So in this respect,
artificial intelligence
481
00:22:17,769 --> 00:22:19,749
in a sense is,
you might say,
482
00:22:19,771 --> 00:22:21,876
trying to go to the next
level of intelligence
483
00:22:21,907 --> 00:22:23,450
beyond the human.
484
00:22:23,750 --> 00:22:24,884
♪
485
00:22:24,910 --> 00:22:27,481
A proper AI could substitute
486
00:22:27,512 --> 00:22:32,180
for practically any human job
at some level of skill,
487
00:22:32,500 --> 00:22:34,894
so it's--it would be
488
00:22:34,920 --> 00:22:36,900
a completely
different situation.
489
00:22:36,922 --> 00:22:40,369
You can imagine any kind of
job could, in theory,
490
00:22:40,392 --> 00:22:45,680
be replaced by technology,
if you build human-level AI.
491
00:22:45,970 --> 00:22:49,443
Now that, of course, may
or may not be a good thing.
492
00:22:49,468 --> 00:22:51,880
You'd be able to,
for example, make robots
493
00:22:51,903 --> 00:22:54,420
that could do
all kinds of jobs
494
00:22:54,720 --> 00:22:56,109
that humans don't
necessarily want to do.
495
00:22:56,141 --> 00:22:59,145
There are the so-called
three D's jobs
496
00:22:59,378 --> 00:23:01,949
that are dirty,
dangerous, or dull,
497
00:23:01,980 --> 00:23:03,982
which humans might
not want to do,
498
00:23:04,150 --> 00:23:06,928
and yet, where you
might actually want
499
00:23:06,952 --> 00:23:08,522
a human level of intelligence
500
00:23:08,553 --> 00:23:10,499
to do the job well
or do the job properly.
501
00:23:10,522 --> 00:23:12,433
These are things
which are achievable.
502
00:23:12,457 --> 00:23:13,902
- Yeah.
- This isn't something...
503
00:23:13,925 --> 00:23:15,529
I don't think
it's science fiction.
504
00:23:15,560 --> 00:23:17,164
I think this is
entirely feasible
505
00:23:17,396 --> 00:23:19,876
that we could
build a computer
506
00:23:19,898 --> 00:23:22,435
which is vastly superhuman
507
00:23:22,467 --> 00:23:24,913
which is conscious,
which has emotions,
508
00:23:24,936 --> 00:23:27,473
which is, essentially,
a new species
509
00:23:27,506 --> 00:23:31,113
of self-aware intelligence
and conscious in every way
510
00:23:31,143 --> 00:23:34,560
and has got emotions
the same as you and I do.
511
00:23:34,790 --> 00:23:35,752
I don't see any
fundamental limits
512
00:23:35,781 --> 00:23:38,125
on what we can do,
and we already know enough
513
00:23:38,150 --> 00:23:41,620
about basic science
to start doing that now.
514
00:23:41,653 --> 00:23:44,998
So some people are
concerned about,
515
00:23:45,230 --> 00:23:48,493
you know, possible
risks of building AI
516
00:23:48,527 --> 00:23:51,804
and building something
that is very, very powerful
517
00:23:51,830 --> 00:23:54,106
where there are
unintended consequences
518
00:23:54,132 --> 00:23:57,443
of the thing that you've
built and where it might
519
00:23:57,469 --> 00:23:58,675
do things that
you can't predict
520
00:23:58,703 --> 00:24:00,808
that might be
extremely dangerous.
521
00:24:00,839 --> 00:24:02,648
So a so-called
"existential risk,"
522
00:24:02,674 --> 00:24:03,948
as some people call it.
523
00:24:03,975 --> 00:24:06,922
We are going to hand off
to our machines
524
00:24:06,945 --> 00:24:09,840
all the
multidimensional problems
525
00:24:09,114 --> 00:24:11,492
that we are incapable
of coping with.
526
00:24:11,516 --> 00:24:14,827
You and I can take
a problem with two or three
527
00:24:14,853 --> 00:24:17,697
or four or even seven inputs.
528
00:24:17,722 --> 00:24:18,826
But 300?
529
00:24:18,857 --> 00:24:21,235
A thousand, a million inputs?
530
00:24:21,460 --> 00:24:22,837
We're dead in the water.
531
00:24:22,861 --> 00:24:24,841
The machines can
cope with that.
532
00:24:24,863 --> 00:24:26,672
The advantage
that computers have
533
00:24:26,698 --> 00:24:29,577
is that they communicate
at gigabit speeds.
534
00:24:29,601 --> 00:24:31,137
They all network together.
535
00:24:31,169 --> 00:24:32,944
We talk in slow motion.
536
00:24:32,971 --> 00:24:37,780
So computers will achieve
this level of awareness
537
00:24:37,108 --> 00:24:39,110
probably in the next
20, 30, 40 years.
538
00:24:39,144 --> 00:24:42,682
It's not that it's
good or that it's evil.
539
00:24:42,714 --> 00:24:44,125
It's we're
probably good enough
540
00:24:44,149 --> 00:24:46,857
to not program an evil AI.
541
00:24:46,885 --> 00:24:50,526
It's that it's
lethally indifferent.
542
00:24:50,555 --> 00:24:52,159
If it has certain things
543
00:24:52,190 --> 00:24:55,865
that it's tasked
with accomplishing
544
00:24:55,894 --> 00:24:57,840
and humans are in the way.
545
00:24:57,863 --> 00:25:00,935
So there's this concern
that once we reach that moment
546
00:25:00,966 --> 00:25:03,242
where the computers
outperform us
547
00:25:03,268 --> 00:25:05,942
in ways that are
quite meaningful,
548
00:25:05,971 --> 00:25:09,282
that then they will somehow
be motivated to dispose of us
549
00:25:09,508 --> 00:25:12,114
or take over us
or something of this kind.
550
00:25:12,143 --> 00:25:13,781
I don't really believe that
551
00:25:13,812 --> 00:25:16,880
because these kinds
of developments,
552
00:25:16,114 --> 00:25:18,253
which probably are a little
farther off in the future
553
00:25:18,283 --> 00:25:20,695
than some of their
enthusiasts think,
554
00:25:20,719 --> 00:25:22,858
there will be time
for us to adapt,
555
00:25:22,888 --> 00:25:26,280
to come to terms with it,
to organize social systems
556
00:25:26,570 --> 00:25:28,300
that will enable us
to deal adequately
557
00:25:28,260 --> 00:25:30,970
with these new forms
of intelligence.
558
00:25:30,128 --> 00:25:31,971
So I don't thi--this is not
just gonna be something
559
00:25:31,997 --> 00:25:34,136
that's gonna happen
as a miracle tomorrow
560
00:25:34,165 --> 00:25:35,974
and then we'll be
taken by surprise.
561
00:25:36,100 --> 00:25:38,380
But I do think
the key thing here is
562
00:25:38,690 --> 00:25:41,710
that we need to treat
these futuristic things
563
00:25:41,740 --> 00:25:44,243
as not as far away
as people say they are.
564
00:25:44,276 --> 00:25:46,153
Just because they're
not likely to happen
565
00:25:46,177 --> 00:25:47,952
in 15 years, let's say,
566
00:25:47,979 --> 00:25:49,981
it doesn't mean they
won't happen in 50 years.
567
00:25:50,150 --> 00:25:54,200
It's gonna be of kind of
historical dimensions,
568
00:25:54,520 --> 00:25:56,191
and it's very hard
to predict, I think,
569
00:25:56,221 --> 00:26:00,601
whether it's gonna take us
in a utopian direction
570
00:26:00,625 --> 00:26:02,161
or in a dystopian direction
571
00:26:02,193 --> 00:26:04,935
or more likely
something in between,
572
00:26:04,963 --> 00:26:06,237
but just very different.
573
00:26:06,264 --> 00:26:08,175
Very hard to predict.
574
00:26:08,199 --> 00:26:10,839
You see, it's our job
to take raw materials,
575
00:26:10,869 --> 00:26:12,746
adapt them to useful forms,
576
00:26:12,771 --> 00:26:16,116
take natural forces,
harness them to do man's work.
577
00:26:16,141 --> 00:26:18,280
The automated systems
of the future
578
00:26:18,310 --> 00:26:21,951
are a natural process
of human innovation.
579
00:26:21,980 --> 00:26:26,326
It all comes back to the idea
of doing more with less.
580
00:26:26,351 --> 00:26:28,580
This process of innovation
581
00:26:28,860 --> 00:26:31,761
is driven not by
necessity, but desire,
582
00:26:31,790 --> 00:26:35,363
or to simply fill
a gap in the market.
583
00:26:35,594 --> 00:26:37,972
Farm owners didn't
really need to replace
584
00:26:37,996 --> 00:26:40,840
their workers with machines,
but they did so
585
00:26:40,865 --> 00:26:43,709
because they could
foresee the benefits.
586
00:26:43,735 --> 00:26:46,450
It's a natural
cycle of business.
587
00:26:46,710 --> 00:26:49,746
Doing more with less
leads to greater prosperity.
588
00:26:49,774 --> 00:26:51,310
♪
589
00:26:51,343 --> 00:26:53,118
The hope is that
we can adapt to this
590
00:26:53,144 --> 00:26:54,282
politically and socially.
591
00:26:54,312 --> 00:26:55,757
In order to do that,
592
00:26:55,780 --> 00:26:57,259
we have to begin
a conversation now.
593
00:26:57,282 --> 00:26:59,694
Remember that
we're up against
594
00:26:59,718 --> 00:27:02,164
an exponential
arc of progress.
595
00:27:02,187 --> 00:27:04,326
Things are gonna keep
moving faster and faster,
596
00:27:04,356 --> 00:27:06,802
so we need to start
talking about this now
597
00:27:06,825 --> 00:27:08,361
and we need to sort of
get the word out there
598
00:27:08,393 --> 00:27:09,838
so that people will realize
599
00:27:09,861 --> 00:27:12,205
that this problem
is coming at us,
600
00:27:12,230 --> 00:27:14,176
so that we can
begin to discuss
601
00:27:14,199 --> 00:27:16,760
viable political
solutions to this
602
00:27:16,101 --> 00:27:18,843
because, again,
I think it will require,
603
00:27:18,870 --> 00:27:20,372
ultimately,
a political choice.
604
00:27:20,405 --> 00:27:24,410
It's not something that
is gonna sort itself out
605
00:27:24,643 --> 00:27:27,647
by itself as a result
of the normal functioning
606
00:27:27,679 --> 00:27:29,249
of the market economy.
607
00:27:29,280 --> 00:27:31,282
It's something
that will require
608
00:27:31,316 --> 00:27:33,570
some sort of an intervention
609
00:27:33,840 --> 00:27:34,358
and, you know,
part of the problem is
610
00:27:34,386 --> 00:27:36,590
that in the United States,
611
00:27:36,870 --> 00:27:39,899
roughly half of the population
is very conservative
612
00:27:39,924 --> 00:27:42,131
and they really don't
believe in this idea
613
00:27:42,160 --> 00:27:44,800
of intervention
in the market.
614
00:27:44,829 --> 00:27:46,934
It's gonna be
a tough transition,
615
00:27:46,965 --> 00:27:49,200
and those that find
themselves out of jobs
616
00:27:49,340 --> 00:27:51,275
because a robot has taken it
617
00:27:51,302 --> 00:27:53,213
are gonna be
pretty pissed off.
618
00:27:53,238 --> 00:27:57,186
The effect of automation
on jobs and livelihood
619
00:27:57,208 --> 00:28:00,712
is going to be behind this
like the original Luddites.
620
00:28:00,745 --> 00:28:02,850
It wasn't--it wasn't
that they were against
621
00:28:02,881 --> 00:28:05,794
technological developments
in some ethereal sense.
622
00:28:05,817 --> 00:28:08,957
It was that this was
taking their damn jobs.
623
00:28:08,987 --> 00:28:11,831
I absolutely think there could
be a Neo-Luddite movement
624
00:28:11,856 --> 00:28:14,336
against the future,
against technology
625
00:28:14,359 --> 00:28:15,736
because they're gonna say,
626
00:28:15,760 --> 00:28:17,680
"Well, hey,
you're taking our jobs,
627
00:28:17,950 --> 00:28:18,472
you're taking
our livelihoods away.
628
00:28:18,697 --> 00:28:20,404
You're taking
everything away from us."
629
00:28:20,432 --> 00:28:22,810
But I think that's when
it's gonna be important
630
00:28:22,834 --> 00:28:25,974
that leaders and government
step in and say,
631
00:28:26,400 --> 00:28:27,278
"It may seem that way,
632
00:28:27,305 --> 00:28:29,216
but life is going to get
better for everyone."
633
00:28:29,240 --> 00:28:31,880
We're gonna have more time
to do things that we want,
634
00:28:31,910 --> 00:28:33,389
more vacations,
more passions.
635
00:28:33,411 --> 00:28:35,186
This is the modern world.
636
00:28:35,213 --> 00:28:38,387
We can create the utopia
that we've always dreamt of.
637
00:28:38,416 --> 00:28:41,260
Why are we saying,
"My job's not safe,"
638
00:28:41,286 --> 00:28:44,824
or, "Automation's going
to steal my jobs"?
639
00:28:44,856 --> 00:28:46,460
These are the--these
are the phrases
640
00:28:46,491 --> 00:28:48,368
that keep getting
pushed out there.
641
00:28:48,393 --> 00:28:50,660
They're negative phrases,
642
00:28:50,950 --> 00:28:53,304
and instead, it seems
that we would look at this,
643
00:28:53,331 --> 00:28:55,400
especially if someone has
been working in a factory
644
00:28:55,330 --> 00:28:56,273
their whole life,
645
00:28:56,301 --> 00:28:58,406
that they would look at
that system and say,
646
00:28:58,436 --> 00:29:02,430
"Thank goodness that this is
starting to be automated."
647
00:29:02,730 --> 00:29:04,530
I don't want anyone
to have to crawl
648
00:29:04,750 --> 00:29:07,215
into a hole in the ground
and pull up coal.
649
00:29:07,245 --> 00:29:09,782
No human being should
have to go do that.
650
00:29:09,814 --> 00:29:12,317
If you make an awful
lot of computers
651
00:29:12,350 --> 00:29:14,421
and a lot of robots,
and the computers can make
652
00:29:14,452 --> 00:29:16,193
those robots
very sophisticated
653
00:29:16,221 --> 00:29:18,980
and do lots of
sophisticated jobs,
654
00:29:18,123 --> 00:29:21,502
you could eliminate most of
the high-value physical jobs
655
00:29:21,526 --> 00:29:24,370
and also most of the
high-value intellectual jobs.
656
00:29:24,395 --> 00:29:26,466
What you're left with,
then, are those jobs
657
00:29:26,498 --> 00:29:28,205
where you have to be
a human being,
658
00:29:28,233 --> 00:29:30,873
so I find it quite
paradoxical in some ways
659
00:29:30,902 --> 00:29:32,904
that the more advanced
the technology becomes,
660
00:29:32,937 --> 00:29:35,440
the more it forces us
to become humans.
661
00:29:35,473 --> 00:29:36,975
So in some ways,
it's very good.
662
00:29:37,800 --> 00:29:40,114
It forces us to explore
what is humanity about?
663
00:29:40,145 --> 00:29:41,283
What are the
fundamentally important
664
00:29:41,312 --> 00:29:42,882
things about being a human?
665
00:29:42,914 --> 00:29:45,870
It's not being able to,
you know, flip a burger
666
00:29:45,116 --> 00:29:48,325
or, you know, carve
something intricately.
667
00:29:48,353 --> 00:29:49,832
A computer or a robot
668
00:29:49,854 --> 00:29:52,270
could do that far better
than a human being.
669
00:29:52,560 --> 00:29:54,332
One thing I've noticed, if
you talk to techno-optimists
670
00:29:54,359 --> 00:29:57,829
about the future of work
and how it's gonna unfold,
671
00:29:57,862 --> 00:30:00,274
very often they will
focus on this issue
672
00:30:00,298 --> 00:30:03,177
of how will we all be
fulfilled in the future?
673
00:30:03,201 --> 00:30:05,100
What will be
our purpose in life
674
00:30:05,360 --> 00:30:06,515
when we don't want to work?
675
00:30:06,538 --> 00:30:09,542
And, you know, you can
sort of posit this
676
00:30:09,574 --> 00:30:12,200
in terms of--there was
a guy named Maslow
677
00:30:12,430 --> 00:30:14,250
who came up with
a hierarchy of human needs,
678
00:30:14,279 --> 00:30:15,553
Maslow's pyramid.
679
00:30:15,580 --> 00:30:17,389
And at the base
of that pyramid
680
00:30:17,415 --> 00:30:20,555
are the foundational things
like food and shelter,
681
00:30:20,585 --> 00:30:22,326
and at the top of
that pyramid, of course,
682
00:30:22,353 --> 00:30:24,333
are all these intangible
things like, you know,
683
00:30:24,355 --> 00:30:25,925
a sense of purpose
in your life
684
00:30:25,957 --> 00:30:27,459
and fulfillment
and so forth.
685
00:30:27,492 --> 00:30:31,650
What you'll find among the most
techno-optimistic people
686
00:30:31,950 --> 00:30:32,574
is that they will
want to skip
687
00:30:32,597 --> 00:30:34,270
right over the base
of that pyramid
688
00:30:34,299 --> 00:30:36,609
and jump right to the top
and start talking about,
689
00:30:36,835 --> 00:30:38,508
"Oh, gosh, how are
we gonna," you know,
690
00:30:38,536 --> 00:30:39,947
"what's the meaning
of our life gonna be
691
00:30:39,971 --> 00:30:41,973
when we don't have to work?"
692
00:30:42,600 --> 00:30:45,317
But the reality is that
the base of that pyramid,
693
00:30:45,343 --> 00:30:47,584
food, shelter,
all the things that we need
694
00:30:47,612 --> 00:30:49,319
to have, you know,
a decent life,
695
00:30:49,347 --> 00:30:50,553
that's the elephant
in the room.
696
00:30:50,582 --> 00:30:54,290
That stuff costs real money.
697
00:30:54,520 --> 00:30:55,429
That stuff is gonna involve
698
00:30:55,453 --> 00:30:58,240
perhaps raising taxes
on a lot of the people
699
00:30:58,560 --> 00:30:59,467
that are doing
really well right now,
700
00:30:59,490 --> 00:31:00,969
and that's probably
part of the reason
701
00:31:00,992 --> 00:31:02,528
that they prefer
not to talk about it.
702
00:31:02,560 --> 00:31:05,234
So what do we do
with the 99 percent
703
00:31:05,263 --> 00:31:06,401
of the population
on this planet
704
00:31:06,431 --> 00:31:08,206
if they don't have jobs?
705
00:31:08,233 --> 00:31:11,112
The suggestion is
and the goal is
706
00:31:11,135 --> 00:31:12,944
to make this
an efficient system.
707
00:31:12,971 --> 00:31:15,383
You put automation
in the hands of everyone.
708
00:31:15,406 --> 00:31:17,477
In the near future,
we're going to see systems
709
00:31:17,508 --> 00:31:19,613
where we can
3D print our clothing,
710
00:31:19,644 --> 00:31:21,317
we can 3D print food.
711
00:31:21,346 --> 00:31:23,952
If you automate
these self-replicating
712
00:31:23,982 --> 00:31:27,156
industrial machines
to pull the raw materials
713
00:31:27,185 --> 00:31:29,620
and distribute
those raw materials
714
00:31:29,870 --> 00:31:31,431
to everyone who
has the capacity
715
00:31:31,456 --> 00:31:33,265
to 3D print their own house
716
00:31:33,291 --> 00:31:36,101
or 3D print
their own farm bot,
717
00:31:36,127 --> 00:31:40,405
you have literally
solved the equation
718
00:31:40,431 --> 00:31:43,105
of how do I automate my life
719
00:31:43,134 --> 00:31:45,444
and how do I automate
my basic necessities?
720
00:31:45,470 --> 00:31:47,347
If we had the political will
721
00:31:47,372 --> 00:31:49,283
to take all these
new technologies
722
00:31:49,307 --> 00:31:51,685
and the wealth
and abundance they create,
723
00:31:51,910 --> 00:31:54,356
and distribute these
across our society
724
00:31:54,379 --> 00:31:55,949
in the First World countries
725
00:31:55,980 --> 00:31:57,618
and also across
the whole world,
726
00:31:57,649 --> 00:31:59,629
then, of course, I mean,
the sky's the limit.
727
00:31:59,651 --> 00:32:01,562
We can solve all
kinds of problems.
728
00:32:01,586 --> 00:32:04,396
But we will have to have
the political will to do that,
729
00:32:04,422 --> 00:32:07,280
and I don't see a whole lot
of evidence for it right now.
730
00:32:07,580 --> 00:32:10,267
There really is enough
already for everybody,
731
00:32:10,295 --> 00:32:12,332
certainly to have
an adequate life,
732
00:32:12,363 --> 00:32:16,277
if not a life
of superabundance,
733
00:32:16,301 --> 00:32:19,305
so, you know, I don't think
that the introduction
734
00:32:19,337 --> 00:32:21,248
of more labor-saving devices
735
00:32:21,272 --> 00:32:25,490
or more is really gonna
make any difference in that.
736
00:32:25,760 --> 00:32:26,282
The reason there
are poor people
737
00:32:26,311 --> 00:32:27,722
is 'cause there's rich people.
738
00:32:27,946 --> 00:32:34,454
♪
739
00:32:34,485 --> 00:32:37,220
You're simultaneously
making a lot of people
740
00:32:37,550 --> 00:32:38,693
almost completely useless
741
00:32:38,723 --> 00:32:41,636
while generating
a lot more wealth
742
00:32:41,659 --> 00:32:43,696
and value than ever before.
743
00:32:43,728 --> 00:32:44,968
So I worry about this.
744
00:32:44,996 --> 00:32:46,236
I worry about the schism
745
00:32:46,264 --> 00:32:49,268
between the super rich
and the poor.
746
00:32:49,300 --> 00:32:53,900
The ultra rich,
if they're representative
747
00:32:53,370 --> 00:32:54,983
of some of the people
we've seen in Silicon Valley,
748
00:32:55,600 --> 00:32:56,246
I really, really worry
749
00:32:56,274 --> 00:32:58,254
because I wonder if
they really have a soul.
750
00:32:58,276 --> 00:33:01,723
I really--I wonder if
they really have an awareness
751
00:33:01,746 --> 00:33:04,192
of how regular people feel
752
00:33:04,215 --> 00:33:06,491
and if they share
the values of humanity.
753
00:33:06,517 --> 00:33:09,191
It really bothers me
that you have this ultra rich
754
00:33:09,220 --> 00:33:12,167
that is out of touch with
regular people, with humanity.
755
00:33:12,190 --> 00:33:14,363
This is being filmed
right now in San Francisco,
756
00:33:14,392 --> 00:33:17,601
which is by all accounts
one of the wealthiest cities
757
00:33:17,628 --> 00:33:19,733
and most advanced
cities in the world,
758
00:33:19,764 --> 00:33:21,505
and it's pretty much
ground zero
759
00:33:21,532 --> 00:33:24,172
for this
technological revolution,
760
00:33:24,202 --> 00:33:25,476
and, yet, as I came here,
761
00:33:25,503 --> 00:33:28,416
I almost tripped over
homeless people
762
00:33:28,439 --> 00:33:30,430
sleeping on the sidewalk.
763
00:33:30,740 --> 00:33:32,714
That is the reality
of today's economy
764
00:33:32,744 --> 00:33:33,779
and today's society.
765
00:33:34,120 --> 00:33:35,548
In a very real sense,
766
00:33:35,580 --> 00:33:38,754
we already live in
the economy of abundance,
767
00:33:38,783 --> 00:33:41,457
and yet we have not
solved this problem.
768
00:33:41,486 --> 00:33:44,920
I think the future
for the four billion
769
00:33:44,122 --> 00:33:46,796
poor people in the world
is actually a very good one.
770
00:33:47,250 --> 00:33:49,562
We've seen the amount of food
in the world, for example,
771
00:33:49,594 --> 00:33:51,801
has more than doubled
in the last 25 years.
772
00:33:52,300 --> 00:33:54,135
That's likely to continue.
773
00:33:54,165 --> 00:33:58,272
Worldwide, we're seeing
massive economic growth.
774
00:33:58,302 --> 00:34:01,613
That really means that people
in poor countries today
775
00:34:01,639 --> 00:34:04,510
will be much better off
in the future,
776
00:34:04,750 --> 00:34:06,112
so there will still be
some poor people,
777
00:34:06,144 --> 00:34:07,452
relatively speaking,
778
00:34:07,478 --> 00:34:09,617
but compared to
today's poor people,
779
00:34:09,647 --> 00:34:11,285
they'll be actually
quite well-off.
780
00:34:11,315 --> 00:34:14,353
I think this is
an amplifier for inequality.
781
00:34:14,385 --> 00:34:18,231
It's gonna make what we see
now much more amplified.
782
00:34:18,256 --> 00:34:19,599
The number of people
that are doing
783
00:34:19,624 --> 00:34:21,228
really well in the economy
784
00:34:21,259 --> 00:34:23,296
I think is likely
to continue to shrink.
785
00:34:23,327 --> 00:34:25,170
Those people
that are doing well
786
00:34:25,196 --> 00:34:29,144
will do extraordinarily well,
but for more and more people,
787
00:34:29,167 --> 00:34:30,703
they're simply gonna
find themselves
788
00:34:30,735 --> 00:34:32,840
in a position where they
don't have a lot to offer.
789
00:34:33,710 --> 00:34:34,675
They don't have
a marketable skill.
790
00:34:34,705 --> 00:34:38,118
They don't have a viable way
to really earn an income
791
00:34:38,142 --> 00:34:39,746
or, in particular,
middle-class income.
792
00:34:39,777 --> 00:34:41,347
We should value the fact
793
00:34:41,379 --> 00:34:43,586
that we can spend
more time doing human work
794
00:34:43,614 --> 00:34:47,357
and the robots will get on
and increase the economy.
795
00:34:47,385 --> 00:34:49,592
They'll be still taking
all the resources
796
00:34:49,620 --> 00:34:51,725
and converting them
into material goods
797
00:34:51,756 --> 00:34:54,794
at very low cost,
so the economy will expand,
798
00:34:54,826 --> 00:34:56,305
we'll be better off,
799
00:34:56,327 --> 00:34:57,863
and we can concentrate
on what matters.
800
00:34:58,960 --> 00:35:00,269
There's nothing
to worry about in there.
801
00:35:00,298 --> 00:35:02,335
A constant stream
of savings dollars
802
00:35:02,366 --> 00:35:06,280
must flow into big
and small business each year.
803
00:35:06,304 --> 00:35:08,875
These dollars help
to buy the land,
804
00:35:09,107 --> 00:35:13,556
the buildings,
the tools and equipment,
805
00:35:13,578 --> 00:35:15,615
and create new
job opportunities
806
00:35:15,646 --> 00:35:17,557
for our expanding population.
807
00:35:17,582 --> 00:35:21,291
♪
808
00:35:21,319 --> 00:35:22,764
We need consumers out there.
809
00:35:22,787 --> 00:35:25,563
We need people
who can actually buy
810
00:35:25,590 --> 00:35:28,298
the things that are
produced by the economy.
811
00:35:28,326 --> 00:35:30,203
If you look at the way
our economy works,
812
00:35:30,228 --> 00:35:32,765
ultimately, it's driven
by end consumption,
813
00:35:32,797 --> 00:35:35,710
and by that, I mean people
and to a limited extent,
814
00:35:35,733 --> 00:35:37,804
governments, who buy things
because they want them
815
00:35:37,835 --> 00:35:39,473
or they need them.
816
00:35:39,504 --> 00:35:41,142
You know, businesses
in our economy,
817
00:35:41,172 --> 00:35:43,209
they also buy
things, of course,
818
00:35:43,241 --> 00:35:45,482
but they do that in order
to produce something else,
819
00:35:45,510 --> 00:35:47,490
and one business may sell
to another business,
820
00:35:47,512 --> 00:35:49,856
but at the end of that,
at the end of that chain,
821
00:35:49,881 --> 00:35:52,919
there has to stand a consumer
or perhaps a government
822
00:35:53,151 --> 00:35:55,529
who buys that
product or service
823
00:35:55,553 --> 00:35:57,533
just because they want it
or they need it.
824
00:35:57,555 --> 00:35:58,932
So this is not the case
825
00:35:59,157 --> 00:36:00,864
that things can just
keep going like this
826
00:36:00,892 --> 00:36:03,668
and get more and more
unequal over time
827
00:36:03,694 --> 00:36:05,401
and everything
will still be fine.
828
00:36:05,429 --> 00:36:06,737
I think that it
won't be fine.
829
00:36:06,764 --> 00:36:09,870
It will actually have
an impact on our economy
830
00:36:09,901 --> 00:36:12,245
and on our economic growth.
831
00:36:12,270 --> 00:36:15,774
We need intelligent planning
because, you know,
832
00:36:15,806 --> 00:36:18,787
being unemployed is not
a positive thing in itself.
833
00:36:18,809 --> 00:36:20,482
There has to be some
kind of transition point
834
00:36:20,511 --> 00:36:22,422
to some other form
of life after that.
835
00:36:22,446 --> 00:36:24,289
And, again, at the moment,
836
00:36:24,315 --> 00:36:26,693
I really don't see enough
attention being paid to this,
837
00:36:26,717 --> 00:36:29,960
so we need to take this
future prospect seriously now.
838
00:36:30,188 --> 00:36:32,190
♪
839
00:36:32,223 --> 00:36:35,227
If we manage to adapt
to this expected wave
840
00:36:35,259 --> 00:36:39,571
of technological unemployment
both politically and socially,
841
00:36:39,597 --> 00:36:41,838
it's likely
to facilitate a time
842
00:36:41,866 --> 00:36:44,472
when work takes on
a different meaning
843
00:36:44,502 --> 00:36:46,539
and a new role in our lives.
844
00:36:46,571 --> 00:36:48,209
♪
845
00:36:48,239 --> 00:36:50,913
Ideas of how we should
approach our relationship
846
00:36:50,942 --> 00:36:54,446
with work have changed
throughout history.
847
00:36:54,478 --> 00:36:57,322
In ancient Greece,
Aristotle said,
848
00:36:57,348 --> 00:37:02,457
"A working paid job absorbs
and degrades the mind."
849
00:37:02,486 --> 00:37:05,228
I.E., if a person would
not willingly adopt
850
00:37:05,256 --> 00:37:06,758
their job for free,
851
00:37:06,791 --> 00:37:08,896
the argument can be made
that they have become
852
00:37:08,926 --> 00:37:11,463
absorbed and degraded by it,
853
00:37:11,495 --> 00:37:15,341
working purely out of
financial obligation.
854
00:37:15,366 --> 00:37:18,904
In 1844, Karl Marx
famously described
855
00:37:18,936 --> 00:37:22,884
the workers of society as
"alienated from their work
856
00:37:22,907 --> 00:37:26,684
and wholly saturated by it."
857
00:37:26,711 --> 00:37:28,782
He felt that most
work didn't allow
858
00:37:28,813 --> 00:37:31,453
an individual's
character to grow.
859
00:37:31,482 --> 00:37:33,860
He encouraged people
to find fulfillment
860
00:37:33,884 --> 00:37:36,387
and freedom in their work.
861
00:37:36,420 --> 00:37:39,594
During World War Two,
the ethos towards work
862
00:37:39,624 --> 00:37:41,661
was that it was
a patriotic duty
863
00:37:41,692 --> 00:37:44,935
in order to support
the war effort.
864
00:37:44,962 --> 00:37:48,535
To best understand our
current relationship with work
865
00:37:48,566 --> 00:37:51,672
and perhaps by extension
modern life itself,
866
00:37:51,702 --> 00:37:55,946
we can look to the writings
of a man called Guy Debord.
867
00:37:55,973 --> 00:38:01,286
Debord was a French Marxist
theorist and in 1967,
868
00:38:01,312 --> 00:38:04,418
he published a powerful
and influential critique
869
00:38:04,448 --> 00:38:07,861
on Western society entitled
870
00:38:07,885 --> 00:38:11,492
The Society of the Spectacle.
871
00:38:11,522 --> 00:38:14,469
He describes how workers
are ruled by commodities
872
00:38:14,492 --> 00:38:15,835
and that production
873
00:38:15,860 --> 00:38:19,307
is an inescapable
duty of the masses,
874
00:38:19,330 --> 00:38:21,571
such is the economic system
875
00:38:21,599 --> 00:38:24,671
that to work is to survive.
876
00:38:24,702 --> 00:38:27,460
"The capitalist economy,"
he says,
877
00:38:27,710 --> 00:38:31,383
"requires the vast majority
to take part as wage workers
878
00:38:31,409 --> 00:38:34,413
in the unending
pursuit of its ends.
879
00:38:34,445 --> 00:38:37,449
A requirement to which,
as everyone knows,
880
00:38:37,481 --> 00:38:40,485
one must either
submit or die."
881
00:38:40,518 --> 00:38:42,828
The assumption has
crept into our rhetoric
882
00:38:42,853 --> 00:38:44,423
and our understanding
that we live in
883
00:38:44,455 --> 00:38:46,457
a leisure society
to some extent.
884
00:38:46,490 --> 00:38:49,403
We have flexible working time.
885
00:38:49,427 --> 00:38:52,465
You hear the term a lot
"relative poverty"
886
00:38:52,496 --> 00:38:53,975
it's, again,
not absolute poverty,
887
00:38:53,998 --> 00:38:57,360
and all these kinds of ideas
suggest that, in fact,
888
00:38:57,680 --> 00:38:59,105
we should feel pretty
pleased with ourselves.
889
00:38:59,337 --> 00:39:01,374
We should both feel
quite leisured
890
00:39:01,405 --> 00:39:03,715
and we should feel less
in bondage to work
891
00:39:03,741 --> 00:39:05,880
than perhaps somebody
in the 19th century
892
00:39:05,910 --> 00:39:08,948
who was kind of shackled
to a machine in a factory.
893
00:39:08,979 --> 00:39:12,426
But, in fact,
we're very unhappy.
894
00:39:12,450 --> 00:39:14,828
It's irrelevant
how much you work
895
00:39:14,852 --> 00:39:16,832
in actual terms anymore.
896
00:39:16,854 --> 00:39:18,856
The way in which
the spectacle operates
897
00:39:18,889 --> 00:39:23,133
is to make of leisure
itself an adjunct to work.
898
00:39:23,361 --> 00:39:26,433
In other words, the idea
of not-working and working
899
00:39:26,464 --> 00:39:29,673
are in some sense
locked into an unholy
900
00:39:29,700 --> 00:39:32,374
and reciprocal relationship
with each other.
901
00:39:32,403 --> 00:39:34,405
You know, the fact
that you're not working
902
00:39:34,438 --> 00:39:35,940
is only because
you've been working,
903
00:39:35,973 --> 00:39:37,418
and the fact that
you're working
904
00:39:37,441 --> 00:39:39,512
is only so that
you can not-work.
905
00:39:39,543 --> 00:39:42,820
In other words,
so engrafted is that rubric
906
00:39:42,847 --> 00:39:44,155
in the way that
we approach life
907
00:39:44,382 --> 00:39:46,885
that--that we can
never be rid of it.
908
00:39:46,917 --> 00:39:50,626
Debord also observed
that as technology advances,
909
00:39:50,654 --> 00:39:52,964
production becomes
more efficient.
910
00:39:52,990 --> 00:39:56,665
Accordingly, the workers'
tasks invariably become
911
00:39:56,694 --> 00:39:59,470
more trivial and menial.
912
00:39:59,497 --> 00:40:02,876
It would seem that the more irrelevant
human labor becomes,
913
00:40:02,900 --> 00:40:06,507
the harder it is
to find fulfilling work.
914
00:40:06,537 --> 00:40:09,575
The truth of the matter is
that most people already,
915
00:40:09,607 --> 00:40:12,816
in Britain,
are doing useless jobs
916
00:40:12,843 --> 00:40:16,170
and have been for
generations, actually.
917
00:40:16,460 --> 00:40:18,788
Most jobs in management
are completely useless.
918
00:40:18,816 --> 00:40:21,920
They basically consist
in the rearrangement
919
00:40:21,118 --> 00:40:23,189
of information into
different patterns
920
00:40:23,421 --> 00:40:25,662
that are meant to take on
the semblance of meaning
921
00:40:25,689 --> 00:40:27,862
in the bureaucratic context.
922
00:40:27,892 --> 00:40:31,931
So most work is, in fact,
a waste of time already,
923
00:40:31,962 --> 00:40:34,602
and I think people
understand that intuitively.
924
00:40:34,632 --> 00:40:37,579
When I go into companies,
I often ask the question,
925
00:40:37,601 --> 00:40:38,875
"Why are you employing people?
926
00:40:38,903 --> 00:40:40,473
You could get monkeys
927
00:40:40,504 --> 00:40:42,848
or you could get robots
to do this job."
928
00:40:42,873 --> 00:40:44,790
The people are not
allowed to think.
929
00:40:44,108 --> 00:40:45,917
They are processing.
930
00:40:45,943 --> 00:40:48,820
They're just like a machine.
931
00:40:48,112 --> 00:40:51,559
They're being so hemmed in,
932
00:40:51,582 --> 00:40:55,530
they operate with an algorithm
and they just do it.
933
00:40:55,553 --> 00:40:58,898
We all have the need to find
meaning in our lives,
934
00:40:58,923 --> 00:41:02,234
and it is our professions
that define us.
935
00:41:02,460 --> 00:41:04,736
To work is
to provide a service
936
00:41:04,762 --> 00:41:06,764
either to yourself
or for others,
937
00:41:06,797 --> 00:41:09,835
but most of us would like
our work to be purposeful
938
00:41:09,867 --> 00:41:13,576
and contributory
to society in some way.
939
00:41:13,604 --> 00:41:15,811
It is an uncomfortable truth
940
00:41:15,840 --> 00:41:18,719
that with our present
model of economics
941
00:41:18,742 --> 00:41:22,918
not everyone is able
to monetize their passions.
942
00:41:22,947 --> 00:41:25,951
If any of this were
to come to fruition,
943
00:41:25,983 --> 00:41:29,192
if we learned to make
automation work for us,
944
00:41:29,220 --> 00:41:34,226
the question remains,
"What do we do with our days?"
945
00:41:34,258 --> 00:41:35,999
There's a good and a bad.
946
00:41:36,260 --> 00:41:38,700
The good is that the cost
of everything drops.
947
00:41:38,729 --> 00:41:41,730
We can solve some of
the basic problems of humanity
948
00:41:41,980 --> 00:41:45,103
like disease,
hunger, lodging.
949
00:41:45,135 --> 00:41:49,277
We can look after all the
basic needs of human beings.
950
00:41:49,507 --> 00:41:53,887
The dark side is that
automation takes jobs away,
951
00:41:53,911 --> 00:41:56,892
and the question is,
"What do we do for a living?"
952
00:41:56,914 --> 00:41:59,793
Some of us will
seek enlightenment
953
00:41:59,817 --> 00:42:02,161
and rise and will keep
learning and growing,
954
00:42:02,186 --> 00:42:04,257
but the majority of people
don't care about those things.
955
00:42:04,288 --> 00:42:07,997
Majority of people just want
to do, you know, grunt work.
956
00:42:08,250 --> 00:42:11,563
They want to socialize with
people as they do at work.
957
00:42:11,595 --> 00:42:15,650
Sennett wrote in his book,
The Corrosion of Character,
958
00:42:15,990 --> 00:42:18,460
that in late capitalism,
959
00:42:18,680 --> 00:42:21,208
one of the great
kind of supports
960
00:42:21,238 --> 00:42:24,845
for human interaction
and for human meaning
961
00:42:24,875 --> 00:42:27,754
is the longevity
of social relations
962
00:42:27,778 --> 00:42:30,315
and the interactions
in working environments
963
00:42:30,548 --> 00:42:32,824
and that if that's taken away,
964
00:42:32,850 --> 00:42:36,593
if what's required is to be
continually responsive
965
00:42:36,620 --> 00:42:40,227
and changing in
a precarious world,
966
00:42:40,257 --> 00:42:43,932
then people no longer
find the fulfillment
967
00:42:43,961 --> 00:42:46,567
or the substance in
what they're doing.
968
00:42:46,597 --> 00:42:50,204
There is an underlying desire
for people to do things,
969
00:42:50,234 --> 00:42:51,770
you know, you spoke
about the idea
970
00:42:51,802 --> 00:42:54,180
that people want to be
engaged creatively.
971
00:42:54,204 --> 00:42:55,649
They want to be
engaged, you know,
972
00:42:55,673 --> 00:42:58,984
go back to basic
Marxist ideas of praxis
973
00:42:59,900 --> 00:43:00,647
and right back to John Locke.
974
00:43:00,678 --> 00:43:02,680
They want to be engaged
in what Locke thought of
975
00:43:02,713 --> 00:43:04,283
as primary acquisition,
976
00:43:04,315 --> 00:43:07,159
mixing their labor,
either their creative thinking
977
00:43:07,184 --> 00:43:08,686
or their physical labor even,
978
00:43:08,719 --> 00:43:11,199
with the world
in order to transform it.
979
00:43:11,221 --> 00:43:13,667
They want to do that,
and that's a very basic
980
00:43:13,691 --> 00:43:15,967
human instinct to do that.
981
00:43:15,993 --> 00:43:19,990
And the idea of
a leisured class, as it were,
982
00:43:19,129 --> 00:43:23,271
a class who is not involved
in a praxis with the world,
983
00:43:23,300 --> 00:43:25,610
but is simply involved
in a passive way
984
00:43:25,636 --> 00:43:29,311
as a recipient of things is
actually repugnant to people.
985
00:43:29,340 --> 00:43:34,187
They would sooner work for
the man in a meaningless job
986
00:43:34,211 --> 00:43:36,782
and construct
a false ideology
987
00:43:36,814 --> 00:43:38,987
of involvement and engagement
988
00:43:39,160 --> 00:43:41,155
than they would
actually sit on their ass.
989
00:43:41,185 --> 00:43:42,892
We can't get away
from the fact that--
990
00:43:42,920 --> 00:43:45,730
that people work
because they have to.
991
00:43:45,756 --> 00:43:48,760
That's, you know, the primary
motivation for most people,
992
00:43:48,792 --> 00:43:50,294
that if you don't work,
993
00:43:50,327 --> 00:43:52,739
you're gonna be living
on the street, okay?
994
00:43:52,763 --> 00:43:55,730
Once we--if we ever
move into a future
995
00:43:55,990 --> 00:43:56,703
where that's not the case
996
00:43:56,734 --> 00:43:57,974
and people don't have
to worry about that,
997
00:43:58,200 --> 00:43:59,208
then we can begin to take on
998
00:43:59,236 --> 00:44:01,307
these more
philosophical questions
999
00:44:01,338 --> 00:44:05,411
of-of, you know, but we're
not at that point yet.
1000
00:44:05,643 --> 00:44:09,352
We can't pretend that
we are living in an age
1001
00:44:09,380 --> 00:44:14,386
where that necessity
for an income doesn't exist.
1002
00:44:14,418 --> 00:44:18,867
Douglas Rushkoff
stated in 2009,
1003
00:44:18,889 --> 00:44:22,427
"We all want paychecks
or at least money.
1004
00:44:22,660 --> 00:44:25,368
We want food,
shelter, clothing,
1005
00:44:25,396 --> 00:44:28,309
and all the things
that money buys us.
1006
00:44:28,332 --> 00:44:32,410
But do we all
really want jobs?"
1007
00:44:32,690 --> 00:44:35,949
According to the UN Food
and Agriculture Organization,
1008
00:44:35,973 --> 00:44:37,384
there is enough food produced
1009
00:44:37,408 --> 00:44:39,460
to provide everyone
in the world
1010
00:44:39,760 --> 00:44:44,185
with 2,720 kilocalories
per person per day.
1011
00:44:44,214 --> 00:44:46,251
♪
1012
00:45:03,300 --> 00:45:05,803
At this stage,
it's difficult to think of
1013
00:45:05,836 --> 00:45:08,214
other possible ways of life.
1014
00:45:08,238 --> 00:45:10,343
The need to earn a living
has been a part
1015
00:45:10,374 --> 00:45:12,980
of every cultural
narrative in history.
1016
00:45:13,100 --> 00:45:16,184
It's a precondition
of human life.
1017
00:45:16,213 --> 00:45:18,853
The challenge facing
the future of work
1018
00:45:18,882 --> 00:45:21,988
is politically unclear.
1019
00:45:22,190 --> 00:45:23,930
It is likely to require
1020
00:45:23,954 --> 00:45:26,867
not only a
redistribution of wealth,
1021
00:45:26,890 --> 00:45:30,303
but a redistribution
of the workload.
1022
00:45:30,327 --> 00:45:34,200
But will working less
mean living more?
1023
00:45:34,310 --> 00:45:35,772
♪
1024
00:45:35,799 --> 00:45:39,760
And is our fear of
becoming irrelevant
1025
00:45:39,103 --> 00:45:42,414
greater than
our fear of death?
1026
00:45:45,909 --> 00:45:53,909
♪
1027
00:46:17,274 --> 00:46:21,222
The process of physical aging
is known as senescence,
1028
00:46:21,245 --> 00:46:23,555
and none of us
are spared from it.
1029
00:46:23,781 --> 00:46:28,590
It remains to this day
an evolutionary enigma.
1030
00:46:28,850 --> 00:46:30,224
Our cells are programmed to wane
1031
00:46:30,254 --> 00:46:33,963
and our entire bodies
are fated to become frail.
1032
00:46:33,991 --> 00:46:36,471
It would seem that the laws
of nature would prefer it
1033
00:46:36,493 --> 00:46:38,439
if we dwindle and die.
1034
00:46:38,462 --> 00:46:42,239
♪
1035
00:46:42,266 --> 00:46:44,576
Negligible senescence, however,
1036
00:46:44,802 --> 00:46:47,942
is the lack
of symptoms of aging.
1037
00:46:47,971 --> 00:46:52,249
Negligibly senescent organisms
include certain species
1038
00:46:52,276 --> 00:46:55,520
of sturgeon, giant tortoise,
1039
00:46:55,780 --> 00:46:58,457
flatworm, clam, and tardigrade.
1040
00:47:00,830 --> 00:47:04,122
One species of jellyfish
called Turritopsis dohrnii
1041
00:47:04,154 --> 00:47:08,680
has even been observed
to be biologically immortal.
1042
00:47:08,910 --> 00:47:11,903
It has the capability
to reverse its biotic cycle
1043
00:47:11,929 --> 00:47:13,909
and revert back
to the polyp stage
1044
00:47:13,931 --> 00:47:16,377
at any point in its development.
1045
00:47:16,400 --> 00:47:19,313
There is only one thing
wrong with dying,
1046
00:47:19,336 --> 00:47:21,247
and that's doing it
when you don't want to.
1047
00:47:22,272 --> 00:47:24,252
Doing it when you do
want to is not a problem.
1048
00:47:24,274 --> 00:47:26,830
Now if you put that bargain
to anybody,
1049
00:47:26,109 --> 00:47:28,350
"Look, this is the deal:
1050
00:47:28,378 --> 00:47:30,483
You will die, but only
when you want to."
1051
00:47:30,514 --> 00:47:32,323
Who would not take that bargain?
1052
00:47:32,349 --> 00:47:36,855
In 2014, a team
of scientists at Harvard
1053
00:47:36,887 --> 00:47:39,333
were able to effectively
reverse the age
1054
00:47:39,356 --> 00:47:41,529
of an older mouse by treating it
1055
00:47:41,558 --> 00:47:43,560
with the blood
of a younger mouse
1056
00:47:43,594 --> 00:47:46,905
through a process
called parabiosis.
1057
00:47:47,931 --> 00:47:49,308
For the first time in history
1058
00:47:49,333 --> 00:47:51,643
it is deemed
scientifically possible
1059
00:47:51,869 --> 00:47:54,577
to gain control
over the aging process.
1060
00:47:54,605 --> 00:48:02,605
♪
1061
00:48:03,113 --> 00:48:05,593
Ultimately, when people
get the hang of the idea
1062
00:48:05,616 --> 00:48:07,892
that aging is a medical problem
1063
00:48:07,918 --> 00:48:10,228
and that everybody's got it,
1064
00:48:10,254 --> 00:48:13,929
then it's not going to be
the way it is today.
1065
00:48:19,363 --> 00:48:23,175
He thinks it's possible
that people will extend--
1066
00:48:23,200 --> 00:48:26,670
be able to extend their lifespan
by considerable amounts.
1067
00:48:26,904 --> 00:48:28,247
I think he's on record as saying
1068
00:48:28,272 --> 00:48:31,219
the first 1,000-year-old person
is already alive.
1069
00:48:31,241 --> 00:48:33,482
It's highly likely, in my view,
1070
00:48:33,510 --> 00:48:36,423
that people born today
or born 10 years ago
1071
00:48:36,446 --> 00:48:39,188
will actually be able to live
1072
00:48:39,216 --> 00:48:41,253
as long as they like,
so to speak,
1073
00:48:41,285 --> 00:48:46,394
without any risk of death
from the ill health of old age.
1074
00:48:46,423 --> 00:48:52,101
The way to apply comprehensive
maintenance to aging
1075
00:48:52,129 --> 00:48:53,699
is a divide
and conquer approach.
1076
00:48:53,931 --> 00:48:55,467
It is not a magic bullet.
1077
00:48:55,499 --> 00:48:57,445
It is not some single thing
that we can do,
1078
00:48:57,467 --> 00:49:00,380
let alone a single thing
that we could do just once.
1079
00:49:00,404 --> 00:49:03,578
Aging is
the lifelong accumulation
1080
00:49:03,607 --> 00:49:05,609
of damage to the body,
1081
00:49:05,642 --> 00:49:08,680
and that damage occurs
as an intrinsic,
1082
00:49:08,712 --> 00:49:10,714
unavoidable side effect
1083
00:49:10,948 --> 00:49:13,360
of the way the body
normally works.
1084
00:49:13,383 --> 00:49:15,210
Even though there
are many, many,
1085
00:49:15,520 --> 00:49:17,555
many different types
of damage at the molecular level
1086
00:49:17,587 --> 00:49:21,467
and the cellular level,
they can all be classified
1087
00:49:21,491 --> 00:49:23,971
into a very manageable
number of categories,
1088
00:49:23,994 --> 00:49:25,564
just seven major categories.
1089
00:49:25,595 --> 00:49:28,439
So now the bottom line,
what do we do about it?
1090
00:49:28,465 --> 00:49:31,742
How do we actually implement
the maintenance approach?
1091
00:49:31,969 --> 00:49:33,744
There are four
fundamental paradigms,
1092
00:49:33,971 --> 00:49:35,473
they all begin with "R."
1093
00:49:35,505 --> 00:49:37,109
They are called
replacement, removal,
1094
00:49:37,140 --> 00:49:38,676
repair, and reinforcement.
1095
00:49:38,709 --> 00:49:41,588
We've got particular ways
to do all these things.
1096
00:49:41,611 --> 00:49:44,683
Sometimes replacement,
sometimes simply elimination
1097
00:49:44,715 --> 00:49:48,754
of the superfluous material,
the garbage that's accumulated.
1098
00:49:48,986 --> 00:49:52,297
Sometimes repair
of the material.
1099
00:49:52,322 --> 00:49:54,563
Occasionally, in a couple
of cases, reinforcement--
1100
00:49:54,591 --> 00:49:57,128
that means making
the cell robust
1101
00:49:57,160 --> 00:49:59,504
so that the damage,
which would normally have caused
1102
00:49:59,529 --> 00:50:02,320
the pathology,
no longer does that.
1103
00:50:02,650 --> 00:50:05,137
I wanna talk about one thing
that we're doing in our lab,
1104
00:50:05,168 --> 00:50:07,341
which involved the number one
cause of death
1105
00:50:07,371 --> 00:50:09,510
in the western world,
cardiovascular disease
1106
00:50:09,539 --> 00:50:11,246
causes heart attacks
and strokes.
1107
00:50:11,274 --> 00:50:14,840
It all begins with these things
called foam cells,
1108
00:50:14,111 --> 00:50:16,785
which are originally
white blood cells.
1109
00:50:17,140 --> 00:50:20,621
They become poisoned
by toxins in the bloodstream.
1110
00:50:20,650 --> 00:50:23,494
The main toxin that's
responsible is known--
1111
00:50:23,520 --> 00:50:26,433
it's called 7-ketocholesterol,
that's this thing,
1112
00:50:26,456 --> 00:50:29,460
and we found some bacteria
that could eat it.
1113
00:50:29,493 --> 00:50:31,700
We then found out
how they eat it,
1114
00:50:31,728 --> 00:50:34,675
we found out the enzymes
that they use to break it down,
1115
00:50:34,698 --> 00:50:37,201
and we found out
how to modify those enzymes
1116
00:50:37,234 --> 00:50:39,510
so that they can go
into human cells,
1117
00:50:39,536 --> 00:50:42,608
go to the right place
in the cell that they're needed,
1118
00:50:42,639 --> 00:50:44,380
which is called the lysosome,
1119
00:50:44,408 --> 00:50:48,550
and actually do their job there,
and it actually works.
1120
00:50:48,578 --> 00:50:51,821
Cells are protected
from this toxic substance--
1121
00:50:52,490 --> 00:50:53,687
that's what these graphs
are showing.
1122
00:50:53,717 --> 00:50:56,129
So this is pretty good news.
1123
00:50:56,153 --> 00:50:58,724
The damage that accumulates
that eventually causes
1124
00:50:58,755 --> 00:51:01,759
the diseases
and disabilities of old age
1125
00:51:01,792 --> 00:51:04,363
is initially harmless.
1126
00:51:04,394 --> 00:51:07,603
The body is set up to tolerate
a certain amount of it.
1127
00:51:07,631 --> 00:51:10,271
That's critical,
because while the damage is
1128
00:51:10,300 --> 00:51:13,213
at that sub-pathological level,
1129
00:51:13,236 --> 00:51:15,341
it means that
it's not participating
1130
00:51:15,372 --> 00:51:17,249
in metabolism, so to speak.
1131
00:51:17,274 --> 00:51:20,380
It's not actually interacting
with the way the body works.
1132
00:51:20,410 --> 00:51:22,720
So medicines
that target that damage
1133
00:51:22,746 --> 00:51:25,283
are much, much less likely
1134
00:51:25,315 --> 00:51:28,228
to have unacceptable
side effects
1135
00:51:28,251 --> 00:51:31,198
than medicines that try
to manipulate the body
1136
00:51:31,221 --> 00:51:33,980
so as to stop the damage
from being created
1137
00:51:33,123 --> 00:51:34,295
in the first place.
1138
00:51:42,399 --> 00:51:48,350
It's unlikely, in fact,
by working on longevity per se,
1139
00:51:48,371 --> 00:51:50,248
that we will crack it.
1140
00:51:50,273 --> 00:51:53,379
It's going to be--
it seems to me more probable
1141
00:51:53,410 --> 00:51:57,756
that we will crack longevity
simply by getting rid of,
1142
00:51:57,781 --> 00:52:01,354
sequentially,
the prime causes of death.
1143
00:52:01,384 --> 00:52:06,390
I hear people talk about
living hundreds of years.
1144
00:52:06,423 --> 00:52:07,595
Inside, I'm like, yeah, right,
1145
00:52:07,624 --> 00:52:10,468
I mean, because
if you study the brain,
1146
00:52:10,494 --> 00:52:12,770
the dead end is the brain.
1147
00:52:12,796 --> 00:52:14,332
We all start developing
1148
00:52:14,364 --> 00:52:16,503
Alzheimer's pathology
at 40 years old.
1149
00:52:16,533 --> 00:52:18,706
It's not a matter of whether
you get Alzheimer's,
1150
00:52:18,735 --> 00:52:19,805
it's when.
1151
00:52:20,871 --> 00:52:25,718
It's when, and genetically,
1152
00:52:25,742 --> 00:52:27,813
we all have some predisposition
1153
00:52:27,844 --> 00:52:30,347
to when we're gonna
get this disease.
1154
00:52:30,380 --> 00:52:32,155
It's part of the program.
1155
00:52:32,182 --> 00:52:34,389
Let's fix that part
of the program
1156
00:52:34,417 --> 00:52:36,590
so we can live
past 90 years old
1157
00:52:36,620 --> 00:52:39,157
with an intact, working brain
1158
00:52:39,189 --> 00:52:41,430
to continue
the evolution of our mind.
1159
00:52:42,592 --> 00:52:47,871
That is number one in my book,
because here's a fact:
1160
00:52:47,898 --> 00:52:50,401
Life span's almost 80
right now on average.
1161
00:52:50,433 --> 00:52:54,279
By 85, half of people
will have Alzheimer's.
1162
00:52:54,304 --> 00:52:56,181
Do the math.
1163
00:52:56,206 --> 00:52:57,446
Seventy-four million
Baby Boomers
1164
00:52:57,474 --> 00:52:58,817
headed toward risk age.
1165
00:52:58,842 --> 00:53:01,482
Eighty-five, 50 percent
have Alzheimer's,
1166
00:53:01,511 --> 00:53:03,184
current life span's 80.
1167
00:53:03,213 --> 00:53:04,783
They're gonna be 85 pretty soon.
1168
00:53:04,814 --> 00:53:07,886
Half our population at 85's
gonna have this disease.
1169
00:53:07,918 --> 00:53:09,693
And then keep going
up to 90 and 100,
1170
00:53:09,719 --> 00:53:11,198
and it gets even worse.
1171
00:53:11,221 --> 00:53:13,326
This is enemy number one.
1172
00:53:13,356 --> 00:53:16,462
It's interesting,
just this week
1173
00:53:17,594 --> 00:53:20,234
it was discovered
that an Egyptian mummy
1174
00:53:21,264 --> 00:53:23,938
had died of cancer,
so even way back in those times,
1175
00:53:23,967 --> 00:53:25,605
cancer was around.
1176
00:53:25,635 --> 00:53:28,707
What seems to have happened is,
as we have lived longer,
1177
00:53:28,738 --> 00:53:31,878
the number of diseases
that pop up to kill us
1178
00:53:31,908 --> 00:53:34,411
starts to increase,
and the reality is
1179
00:53:34,444 --> 00:53:37,482
I think this is a sort
of whack-a-mole situation
1180
00:53:37,514 --> 00:53:41,360
as we beat cancer to death
and it disappears,
1181
00:53:41,384 --> 00:53:42,624
something else will pop up.
1182
00:53:42,652 --> 00:53:44,598
Cancer is a specific disease,
1183
00:53:44,621 --> 00:53:46,726
and every cancer
has a specific gene involved
1184
00:53:46,756 --> 00:53:48,463
together with lifestyle.
1185
00:53:48,491 --> 00:53:52,234
Alzheimer's, specific disease,
specific genetics.
1186
00:53:52,262 --> 00:53:55,266
I can go on and on--
diabetes, heart disease.
1187
00:53:55,298 --> 00:53:59,405
These are diseases,
and as you get older,
1188
00:53:59,436 --> 00:54:02,679
your susceptibility
to these diseases increases,
1189
00:54:02,706 --> 00:54:04,686
and your genetics will determine
1190
00:54:04,708 --> 00:54:07,450
whether you get them
and when you get them
1191
00:54:07,477 --> 00:54:08,717
given your lifespan.
1192
00:54:08,745 --> 00:54:10,486
That's not aging,
1193
00:54:10,513 --> 00:54:12,789
that's just living long enough
to be susceptible,
1194
00:54:12,816 --> 00:54:16,992
so we may very well eradicate,
in our fantasy world,
1195
00:54:17,200 --> 00:54:20,263
all the cancers and strokes
and heart disease
1196
00:54:20,290 --> 00:54:23,294
and diabetes and Alzheimer's
we get right now
1197
00:54:23,326 --> 00:54:26,535
by 80 or 90 years old,
and then what's gonna happen?
1198
00:54:26,563 --> 00:54:29,601
You live out to 110,
and guess what's gonna happen,
1199
00:54:29,633 --> 00:54:31,544
new, other genetic variants
1200
00:54:31,568 --> 00:54:33,605
suddenly rear
their ugly heads and say,
1201
00:54:33,637 --> 00:54:36,311
"Now we're gonna affect
whether you live to 110
1202
00:54:36,339 --> 00:54:38,120
without Alzheimer's
and heart disease
1203
00:54:38,410 --> 00:54:39,611
and cancer and diabetes."
1204
00:54:39,643 --> 00:54:41,418
And it'll go on
and go on and go on.
1205
00:54:43,380 --> 00:54:45,792
There will undoubtedly
be enormous challenges
1206
00:54:45,815 --> 00:54:48,796
concerning the biological
approach to longevity.
1207
00:54:49,819 --> 00:54:52,925
There could, however,
be an alternative route
1208
00:54:52,956 --> 00:54:54,936
to extreme longevity.
1209
00:54:56,426 --> 00:54:58,300
When people are
worried about death,
1210
00:54:58,610 --> 00:54:59,734
I guess the issue
is what is it
1211
00:54:59,763 --> 00:55:03,472
that they would like
to have stay alive, okay?
1212
00:55:03,500 --> 00:55:07,312
And I think that's often
very unclear what the answer is,
1213
00:55:07,337 --> 00:55:10,450
because if you look at somebody
like Ray Kurzweil, for example,
1214
00:55:10,730 --> 00:55:11,916
with his promises
of the singularity
1215
00:55:11,941 --> 00:55:13,784
and our merging
with machine intelligence
1216
00:55:13,810 --> 00:55:15,687
and then being able
to kind of have
1217
00:55:15,712 --> 00:55:17,316
this kind of
infinite consciousness
1218
00:55:17,347 --> 00:55:20,385
projected outward
into the Cosmos.
1219
00:55:20,417 --> 00:55:21,862
I don't think he's
imagining a human body
1220
00:55:21,885 --> 00:55:24,832
living forever, okay?
1221
00:55:24,854 --> 00:55:28,324
And if that's what we're
talking about is immortality,
1222
00:55:28,358 --> 00:55:30,599
what I kind of think
Kurzweil is talking about,
1223
00:55:30,627 --> 00:55:32,402
then I can see it.
1224
00:55:32,429 --> 00:55:35,967
I mean, I could see at least
as something to work toward.
1225
00:55:35,999 --> 00:55:40,380
In 2005, Google's
Director of Engineering,
1226
00:55:40,700 --> 00:55:42,641
Ray Kurzweil,
published a book
1227
00:55:42,672 --> 00:55:45,551
entitled
The Singularity Is Near:
1228
00:55:45,575 --> 00:55:48,749
When Humans Transcend Biology.
1229
00:55:48,778 --> 00:55:52,210
He predicts that by 2045,
1230
00:55:52,480 --> 00:55:54,551
it'll be possible
to upload our minds
1231
00:55:54,584 --> 00:55:57,622
into a computer,
effectively allowing us
1232
00:55:57,654 --> 00:55:59,725
to live indefinitely.
1233
00:56:14,637 --> 00:56:17,830
When people think
of death at this point,
1234
00:56:17,107 --> 00:56:19,417
they think of a body
going into a coffin,
1235
00:56:19,442 --> 00:56:21,718
and the coffin
going into the ground.
1236
00:56:21,745 --> 00:56:25,852
When in fact that age
of death is dying.
1237
00:56:25,882 --> 00:56:28,692
We are ending that stage.
1238
00:56:28,718 --> 00:56:30,425
We're entering a new stage
1239
00:56:30,453 --> 00:56:32,660
where we could
possibly upload consciousness
1240
00:56:32,689 --> 00:56:34,566
into a silicon substrate.
1241
00:56:34,591 --> 00:56:36,764
You know, a lot of these
science fiction ideas
1242
00:56:36,793 --> 00:56:39,637
from decades ago
are becoming real
1243
00:56:39,662 --> 00:56:42,600
and already people are
spending millions of pounds
1244
00:56:42,310 --> 00:56:44,841
on research today
to make this happen.
1245
00:56:44,868 --> 00:56:47,906
Nobody expects to do it
before 2040 at the earliest.
1246
00:56:47,937 --> 00:56:51,544
My guess is 2050,
it'll be a few rich people
1247
00:56:51,574 --> 00:56:54,612
and a few kings and queens
here and there and politicians.
1248
00:56:54,644 --> 00:56:58,910
By 2060-2070, it's
reasonably well-off people.
1249
00:56:58,114 --> 00:57:01,857
By 2075, pretty much anybody
could be immortal.
1250
00:57:01,885 --> 00:57:04,161
I'm not convinced that
uploading my consciousness
1251
00:57:04,187 --> 00:57:07,430
onto a computer is
a form of immortality.
1252
00:57:07,457 --> 00:57:09,767
I would like to live forever,
1253
00:57:09,793 --> 00:57:12,205
but I'm not sure that
I would like to live forever
1254
00:57:12,429 --> 00:57:17,970
as some digibytes of memory
in a computer.
1255
00:57:18,100 --> 00:57:19,674
I wouldn't call that
living forever.
1256
00:57:19,702 --> 00:57:21,511
There are things I wanna do
with my body
1257
00:57:21,538 --> 00:57:24,212
that I won't be able
to do in that computer.
1258
00:57:24,441 --> 00:57:26,614
Immortality is a question
that keeps arising
1259
00:57:26,643 --> 00:57:29,749
in the technology community,
and it's one which I think
1260
00:57:29,779 --> 00:57:33,192
is entirely feasible
in principle.
1261
00:57:33,216 --> 00:57:34,854
We won't actually
become immortal,
1262
00:57:34,884 --> 00:57:38,491
but what we will do is
we will get the technology
1263
00:57:38,521 --> 00:57:43,610
by around about 2050
to connect a human brain
1264
00:57:43,920 --> 00:57:46,972
to the machine world so well
that most of your thinking
1265
00:57:46,996 --> 00:57:49,330
is happening inside
the computer world,
1266
00:57:49,650 --> 00:57:50,874
inside the I.T.,
1267
00:57:50,900 --> 00:57:52,607
so your brain
is still being used,
1268
00:57:52,635 --> 00:57:55,582
but 99 percent of your thoughts,
99 percent of your memories
1269
00:57:55,605 --> 00:57:57,243
are actually out there
in the cloud
1270
00:57:57,474 --> 00:57:59,440
or whatever you wanna call it,
1271
00:57:59,750 --> 00:58:00,952
and only one percent
is inside your head.
1272
00:58:00,977 --> 00:58:02,718
So walk into work this morning,
1273
00:58:02,745 --> 00:58:05,555
you get hit by a bus,
it doesn't matter.
1274
00:58:05,582 --> 00:58:09,120
You just upload your mind
into an android
1275
00:58:09,152 --> 00:58:11,655
on Monday morning and carry on
as if nothing had happened.
1276
00:58:11,688 --> 00:58:13,565
The question with
that kind of technology
1277
00:58:13,590 --> 00:58:17,800
and the extension
of human capacity
1278
00:58:17,827 --> 00:58:23,400
and human life
through technology
1279
00:58:23,320 --> 00:58:26,700
is where does
the human, um, end,
1280
00:58:26,102 --> 00:58:27,877
and the technology begin?
1281
00:58:27,904 --> 00:58:31,545
If we upload our consciousness
into a robot,
1282
00:58:31,574 --> 00:58:35,954
a humanoid robot that has touch,
the ability to feel,
1283
00:58:35,979 --> 00:58:39,722
all of the sensorial inputs,
if they're the same,
1284
00:58:39,749 --> 00:58:43,600
there is a potential
of continuity, right?
1285
00:58:43,860 --> 00:58:47,228
So you can have the same
types of experiences
1286
00:58:47,257 --> 00:58:50,204
in that new substrate
that you have as a human being.
1287
00:58:50,226 --> 00:58:52,297
It's beyond a continuity issue,
1288
00:58:52,529 --> 00:58:55,874
it's an issue that you have
the ability to record
1289
00:58:55,899 --> 00:58:58,937
and recall without the content--
1290
00:58:58,968 --> 00:59:00,879
the content being
the sensations, images
1291
00:59:00,904 --> 00:59:03,544
and feelings and thoughts
you experienced your whole life
1292
00:59:03,573 --> 00:59:04,847
that have associated
with each other
1293
00:59:04,874 --> 00:59:06,251
through your neural network.
1294
00:59:06,276 --> 00:59:07,880
Where are they stored?
1295
00:59:07,911 --> 00:59:11,850
I don't think consciousness
and the brain
1296
00:59:11,114 --> 00:59:12,889
has anything to do
with any particular
1297
00:59:12,916 --> 00:59:14,862
individual region in the brain,
1298
00:59:14,884 --> 00:59:17,592
but it's something
that's all about--
1299
00:59:17,620 --> 00:59:19,896
its distributed organization.
1300
00:59:19,923 --> 00:59:21,596
I don't think there
are any mysteries.
1301
00:59:21,624 --> 00:59:24,298
There are no causal mysteries
in the brain,
1302
00:59:24,327 --> 00:59:29,606
and I think that
there's a perfectly--
1303
00:59:29,632 --> 00:59:32,440
a comprehensible, physical chain
1304
00:59:32,680 --> 00:59:34,742
of cause and effect
that goes from the things
1305
00:59:34,771 --> 00:59:37,217
that I see and hear around me
1306
00:59:37,240 --> 00:59:39,150
to the words
that come out of my mouth,
1307
00:59:39,420 --> 00:59:43,582
which would encompass
consciousness, I suppose.
1308
00:59:43,613 --> 00:59:44,887
But the things that--
1309
00:59:44,914 --> 00:59:46,757
the moment you say
something like that,
1310
00:59:46,783 --> 00:59:50,788
you're on the edge
of the philosophical precipice.
1311
00:59:50,820 --> 00:59:52,128
When you think about a machine,
1312
00:59:52,155 --> 00:59:55,340
the question is are you
simulating consciousness
1313
00:59:55,580 --> 00:59:57,231
or are you simulating cognition?
1314
00:59:58,828 --> 01:00:03,334
Cognition requires inputs
and reactions
1315
01:00:03,366 --> 01:00:04,811
that are associated
with each other
1316
01:00:04,834 --> 01:00:07,314
to create an output
and an outcome.
1317
01:00:07,337 --> 01:00:08,907
And you can program that all day
1318
01:00:08,938 --> 01:00:11,430
and you can make that
as sophisticated
1319
01:00:11,740 --> 01:00:12,781
and as information-dense
as you want,
1320
01:00:12,809 --> 01:00:15,756
almost to the point
that it mimics a real person.
1321
01:00:17,213 --> 01:00:19,716
But the question is will it
ever have the consciousness
1322
01:00:19,749 --> 01:00:21,695
that our species
with our genetics,
1323
01:00:21,718 --> 01:00:23,994
with our brain has.
1324
01:00:24,200 --> 01:00:27,365
No, a machine has
its own consciousness.
1325
01:00:27,390 --> 01:00:29,700
All you're doing
is programming it
1326
01:00:29,726 --> 01:00:33,173
to be cognitively responsive
the way you are.
1327
01:00:33,196 --> 01:00:35,301
I remember well,
when my father died,
1328
01:00:35,331 --> 01:00:39,643
asking my mother,
"if I could've captured
1329
01:00:39,669 --> 01:00:42,946
the very being of my father
in a machine,
1330
01:00:42,972 --> 01:00:44,747
and I could put him
in an android
1331
01:00:44,774 --> 01:00:47,414
that looked exactly like him,
had all the mannerisms,
1332
01:00:47,644 --> 01:00:50,955
and it was warm and it smelled
and it felt like him,
1333
01:00:50,980 --> 01:00:53,187
would you do it?"
and she said, "Absolutely not.
1334
01:00:53,216 --> 01:00:55,822
It wouldn't be your father,
it wouldn't be him."
1335
01:00:55,852 --> 01:00:57,798
I think that someday
you can upload
1336
01:00:57,820 --> 01:00:59,731
your current neural network...
1337
01:01:00,923 --> 01:01:02,994
but that's not you.
1338
01:01:03,260 --> 01:01:05,802
That's just your current
neuro map, right?
1339
01:01:05,828 --> 01:01:12,268
♪
1340
01:01:12,301 --> 01:01:14,372
As with any concept
that proposes
1341
01:01:14,404 --> 01:01:17,681
to change the natural
order of things,
1342
01:01:17,707 --> 01:01:22,213
the idea of extreme longevity
can be met with disbelief.
1343
01:01:23,746 --> 01:01:26,249
But there is currently
an international movement
1344
01:01:26,282 --> 01:01:29,286
called transhumanism
that is concerned
1345
01:01:29,318 --> 01:01:31,355
with fundamentally transforming
1346
01:01:31,387 --> 01:01:34,834
the human condition
by developing technologies
1347
01:01:34,857 --> 01:01:37,360
to greatly enhance human beings
1348
01:01:37,393 --> 01:01:42,467
in an intellectual, physical,
and psychological capacity.
1349
01:01:42,699 --> 01:01:45,305
I really want to just
simply live indefinitely,
1350
01:01:45,334 --> 01:01:48,213
and not have the Spectre
of Death hanging over me,
1351
01:01:48,237 --> 01:01:50,478
potentially
at any moment taking away
1352
01:01:50,707 --> 01:01:53,210
this thing
that we call existence.
1353
01:01:53,242 --> 01:01:55,449
So for me, that's
the primary goal
1354
01:01:55,478 --> 01:01:57,788
of the transhumanist movement.
1355
01:01:57,814 --> 01:02:00,852
Transhumanists believe
we should use technology
1356
01:02:00,883 --> 01:02:05,930
to overcome
our biological limitations.
1357
01:02:05,121 --> 01:02:06,429
What does that mean?
1358
01:02:06,456 --> 01:02:10,962
Well, very simplistically,
perhaps,
1359
01:02:10,993 --> 01:02:13,439
I think we should be aiming
for what one might call
1360
01:02:13,463 --> 01:02:16,000
a triple S civilization
1361
01:02:16,320 --> 01:02:20,139
of super intelligence,
super longevity,
1362
01:02:20,169 --> 01:02:21,876
and super happiness.
1363
01:02:21,904 --> 01:02:24,316
We have been evolving
through hundreds and hundreds
1364
01:02:24,340 --> 01:02:26,877
of thousands of years,
human beings,
1365
01:02:26,909 --> 01:02:29,389
and transhumanism
is the climax of that.
1366
01:02:29,412 --> 01:02:32,120
It's the result of
how we're going to get
1367
01:02:32,148 --> 01:02:37,290
to some kind of great future
where we are way beyond
1368
01:02:37,530 --> 01:02:38,794
what it means
to be a human being.
1369
01:02:38,821 --> 01:02:44,100
Unfortunately, organic robots
grow old and die,
1370
01:02:44,127 --> 01:02:48,872
and this isn't a choice,
it's completely involuntary.
1371
01:02:48,898 --> 01:02:51,174
120 years from now,
in the absence
1372
01:02:51,200 --> 01:02:54,477
of radical
biological interventions,
1373
01:02:54,504 --> 01:02:58,281
everyone listening
to this video will be dead,
1374
01:02:58,307 --> 01:03:00,548
and not beautifully so,
1375
01:03:00,777 --> 01:03:04,486
but one's last years tend
to be those of decrepitude,
1376
01:03:04,514 --> 01:03:08,520
frequently senility, infirmity,
1377
01:03:08,840 --> 01:03:13,193
and transhumanists don't accept
aging as inevitable.
1378
01:03:13,222 --> 01:03:15,828
There's no immutable
law of nature
1379
01:03:15,858 --> 01:03:18,930
that says that organic robots
must grow old.
1380
01:03:18,961 --> 01:03:21,567
After all, silicon robots,
they don't need to grow old.
1381
01:03:21,798 --> 01:03:25,268
Their parts can be
replaced and upgraded.
1382
01:03:25,301 --> 01:03:26,974
Our bodies are capable
of adjusting
1383
01:03:27,300 --> 01:03:28,914
in ways we've hardly dreamt of.
1384
01:03:30,173 --> 01:03:31,948
If we can only find the key.
1385
01:03:33,109 --> 01:03:35,953
I'm so close now, so very close.
1386
01:03:37,880 --> 01:03:40,156
- The key to what?
- To be able to replace
1387
01:03:40,183 --> 01:03:41,958
diseased and damaged parts
of the body
1388
01:03:41,984 --> 01:03:44,828
as easily as we replace
eye corneas now.
1389
01:03:45,588 --> 01:03:46,896
Can't be done.
1390
01:03:48,291 --> 01:03:49,463
It can be done!
1391
01:03:50,960 --> 01:03:53,907
The relationship
between life and death
1392
01:03:53,930 --> 01:03:58,970
and the role of technology
in forestalling death
1393
01:03:59,100 --> 01:04:02,915
creates death, in a way,
as a new kind of problem.
1394
01:04:02,939 --> 01:04:06,216
Death becomes something
that needs to be solved.
1395
01:04:06,242 --> 01:04:07,346
Why would it be good
to live forever?
1396
01:04:07,376 --> 01:04:08,582
'Cause if you have a shit day,
1397
01:04:08,611 --> 01:04:11,524
it dilutes the depression
1398
01:04:11,547 --> 01:04:14,323
within countless
other days, you know?
1399
01:04:14,350 --> 01:04:19,493
And all of these metrics
are about failing to exist
1400
01:04:19,522 --> 01:04:22,969
in the full light
of your own autonomy.
1401
01:04:22,992 --> 01:04:24,903
That's all they're about,
1402
01:04:24,927 --> 01:04:27,320
and the paradox
of your own autonomy,
1403
01:04:27,630 --> 01:04:29,634
which is you're
simultaneously completely free
1404
01:04:29,866 --> 01:04:32,540
and completely unfree
at the same time.
1405
01:04:32,568 --> 01:04:34,138
I have been to many conferences
1406
01:04:34,170 --> 01:04:36,446
where you got
the anti-transhumanist person
1407
01:04:36,472 --> 01:04:39,780
saying, "This is just
denial of death."
1408
01:04:39,108 --> 01:04:42,214
At the end of the day,
that's all it's about, right?
1409
01:04:42,245 --> 01:04:45,317
And it's a kind of--
the last hangover
1410
01:04:45,348 --> 01:04:46,884
of the Abrahamic religions,
1411
01:04:46,916 --> 01:04:48,190
this idea that
we're gonna, you know,
1412
01:04:48,217 --> 01:04:49,924
come back to God and all this,
1413
01:04:49,952 --> 01:04:52,398
and we're gonna realize
our God-like nature,
1414
01:04:52,421 --> 01:04:57,461
and this is really the last kind
of point for that.
1415
01:04:58,928 --> 01:05:00,373
I think there's
a lot of truth to that,
1416
01:05:00,396 --> 01:05:02,171
especially in terms
of the issues
1417
01:05:02,198 --> 01:05:03,939
we've been talking about
where everybody just seems
1418
01:05:03,966 --> 01:05:06,300
to just take for granted
that if you're given
1419
01:05:06,350 --> 01:05:08,641
the chance to live forever,
you'd live forever.
1420
01:05:08,671 --> 01:05:11,550
I think yes,
I think that that's true.
1421
01:05:11,574 --> 01:05:14,680
I don't think it's--
I think it's true,
1422
01:05:14,911 --> 01:05:16,219
I don't know
if it's as problematic
1423
01:05:16,245 --> 01:05:18,191
as people kind of claim it is.
1424
01:05:18,214 --> 01:05:19,591
In other words, that
there's something wrong
1425
01:05:19,615 --> 01:05:21,356
with having this fear of death
1426
01:05:21,384 --> 01:05:23,910
and wanting to live forever.
1427
01:05:23,119 --> 01:05:25,292
No, I think living forever is--
1428
01:05:25,321 --> 01:05:28,630
I think the question is what
are you doing with your time?
1429
01:05:28,900 --> 01:05:30,297
In what capacity
do you wanna live forever?
1430
01:05:30,326 --> 01:05:32,602
So I do think it makes
all the difference in the world
1431
01:05:32,628 --> 01:05:34,505
whether we're talking
about Kurzweil's way
1432
01:05:34,530 --> 01:05:36,305
or we're talking
about Aubrey de Grey's way.
1433
01:05:36,332 --> 01:05:38,141
The way the human
species operates
1434
01:05:38,167 --> 01:05:41,205
is that we're really never
fully ready for anything.
1435
01:05:41,237 --> 01:05:44,184
However, the prospect
of living indefinitely
1436
01:05:44,206 --> 01:05:48,245
is too promising
to turn down or to slow down
1437
01:05:48,277 --> 01:05:51,349
or to just not go after
at full speed.
1438
01:05:51,380 --> 01:05:53,451
By enabling us
to find technologies
1439
01:05:53,482 --> 01:05:55,519
to live indefinitely,
we're not making it
1440
01:05:55,551 --> 01:05:57,258
so that we're going
to live forever,
1441
01:05:57,286 --> 01:05:59,459
we're just making it
so we have that choice.
1442
01:05:59,488 --> 01:06:01,399
If people wanna
pull out of life
1443
01:06:01,424 --> 01:06:03,620
at some point down the future,
1444
01:06:03,920 --> 01:06:04,730
they're certainly
welcome to do that.
1445
01:06:04,961 --> 01:06:08,374
However, it's gonna be great
to eliminate death if we want,
1446
01:06:08,397 --> 01:06:10,468
because everyone
wants that choice.
1447
01:06:12,680 --> 01:06:14,378
There are other
socioeconomic repercussions
1448
01:06:14,403 --> 01:06:17,748
of living longer
that need to be considered.
1449
01:06:17,974 --> 01:06:20,477
The combination
of an aging population
1450
01:06:20,509 --> 01:06:23,388
and the escalating expenses
of healthcare,
1451
01:06:23,412 --> 01:06:26,291
social care, and retirement
is a problem
1452
01:06:26,315 --> 01:06:29,353
that already exists
the world over.
1453
01:06:29,385 --> 01:06:31,365
In the last century alone,
1454
01:06:31,387 --> 01:06:33,594
medicine has
massively contributed
1455
01:06:33,622 --> 01:06:36,680
to increased life expectancy.
1456
01:06:37,460 --> 01:06:40,407
According to the World
Health Organization,
1457
01:06:40,429 --> 01:06:43,501
the number of people
aged 60 years and over
1458
01:06:43,532 --> 01:06:48,380
is expected to increase
from 605 million today
1459
01:06:48,700 --> 01:06:52,382
to 2 billion by the year 2050.
1460
01:06:52,408 --> 01:06:55,252
As people live longer,
they become more susceptible
1461
01:06:55,277 --> 01:06:58,451
to noncommunicable diseases.
1462
01:06:58,481 --> 01:07:02,327
This becomes
enormously expensive.
1463
01:07:02,351 --> 01:07:08,199
Dementia alone costs the NHS
23 billion a year.
1464
01:07:08,224 --> 01:07:10,795
Currently, elderly non-workers
1465
01:07:11,270 --> 01:07:14,804
account for a vast portion
of our population
1466
01:07:15,310 --> 01:07:18,501
and a vast portion of our workforce
cares for them.
1467
01:07:19,668 --> 01:07:23,548
It is economically beneficial
to end aging.
1468
01:07:25,541 --> 01:07:28,545
Social life is organized
around people having--
1469
01:07:28,577 --> 01:07:32,150
occupying certain roles
at certain ages, right?
1470
01:07:32,181 --> 01:07:34,286
And you can already see
the kinds of problems
1471
01:07:34,316 --> 01:07:36,592
that are caused
to the welfare system
1472
01:07:36,619 --> 01:07:39,657
when people live substantially
beyond the age of 65,
1473
01:07:39,688 --> 01:07:45,700
because when the whole number 65
was selected by Bismarck
1474
01:07:45,940 --> 01:07:47,165
when he started the first
social security system
1475
01:07:47,196 --> 01:07:50,750
in Germany, the expectation was
that people would be living
1476
01:07:50,990 --> 01:07:52,670
two years beyond
the retirement age
1477
01:07:52,701 --> 01:07:54,442
to be able to get
the social security.
1478
01:07:54,470 --> 01:07:56,609
So it wasn't gonna
break the bank, okay?
1479
01:07:56,639 --> 01:07:59,449
Problem now is you've got people
who are living 20 years
1480
01:07:59,475 --> 01:08:02,810
or more beyond
the retirement age,
1481
01:08:02,111 --> 01:08:03,681
and that's unaffordable.
1482
01:08:03,712 --> 01:08:06,693
There's no question that,
within society as a whole,
1483
01:08:06,715 --> 01:08:10,857
there is an enormous tendency
to knee-jerk reactions
1484
01:08:11,870 --> 01:08:14,159
with regard to the problems
that might be created
1485
01:08:14,190 --> 01:08:17,171
if we were to eliminate aging.
1486
01:08:17,193 --> 01:08:19,298
There have been people
that said, you know,
1487
01:08:19,328 --> 01:08:21,808
"You'll be bored,
you won't have anything to do."
1488
01:08:21,831 --> 01:08:25,142
Speaking from a place
of a lifespan
1489
01:08:25,167 --> 01:08:27,169
that's 80 or 90 years old,
1490
01:08:27,203 --> 01:08:28,773
saying that we're
going to be bored
1491
01:08:28,804 --> 01:08:32,217
if we live to 150 years old
really is just invalid.
1492
01:08:32,241 --> 01:08:34,653
We have no idea what
we'll do with that time.
1493
01:08:34,677 --> 01:08:36,350
Part of
this transhumanism stuff,
1494
01:08:36,378 --> 01:08:39,450
where it gets some kind
of real policy traction,
1495
01:08:39,482 --> 01:08:43,396
is people who want us
not to live to be 1,000,
1496
01:08:43,419 --> 01:08:46,229
but maybe if we can
take that 20 years
1497
01:08:46,255 --> 01:08:49,532
that we're living longer now
than we did 100 years ago
1498
01:08:49,558 --> 01:08:51,697
and keep that productive.
1499
01:08:51,727 --> 01:08:53,468
So in other words,
if you could still be strong
1500
01:08:53,496 --> 01:08:56,875
and still be sharp
into your 70s and 80s,
1501
01:08:56,899 --> 01:08:59,709
and so not have to pull
any social security
1502
01:08:59,735 --> 01:09:02,773
until quite late in life
and then you'll be--
1503
01:09:02,805 --> 01:09:04,546
you'll have 20 extra years
1504
01:09:04,573 --> 01:09:07,144
where you're actually
contributing to the economy.
1505
01:09:07,176 --> 01:09:08,621
So one of the areas that
we're gonna have to think about
1506
01:09:08,644 --> 01:09:11,147
in the near future
if we do achieve
1507
01:09:11,180 --> 01:09:14,218
extreme longevity physically
1508
01:09:14,250 --> 01:09:17,356
is the idea of overpopulation.
1509
01:09:17,386 --> 01:09:20,526
This is a controversial idea,
of course,
1510
01:09:20,556 --> 01:09:24,766
and we may face a time period
where we have to say to people,
1511
01:09:24,793 --> 01:09:28,866
"You have to be licensed
to have more than one child."
1512
01:09:28,898 --> 01:09:31,708
The ideas around children,
I hope, will probably change
1513
01:09:31,734 --> 01:09:37,309
when people start to realize
that the values of children
1514
01:09:37,339 --> 01:09:39,615
need to be defined first
before we have them.
1515
01:09:39,642 --> 01:09:41,349
And that's not
something that we do.
1516
01:09:41,377 --> 01:09:44,358
We just have them,
and we don't define why
1517
01:09:44,380 --> 01:09:45,791
or for what purpose.
1518
01:09:45,814 --> 01:09:47,555
I'm not saying there has
to be a defined purpose,
1519
01:09:47,583 --> 01:09:50,223
but I'm saying that just
to continue our gene line
1520
01:09:50,252 --> 01:09:51,697
isn't the biggest reason.
1521
01:09:51,720 --> 01:09:53,427
At the moment, ultimately,
1522
01:09:53,455 --> 01:09:56,868
we see in any society
where fertility rate goes down
1523
01:09:56,892 --> 01:09:59,839
because of female prosperity
and emancipation
1524
01:09:59,862 --> 01:10:02,706
and education,
we also see the age
1525
01:10:02,731 --> 01:10:05,940
of the average childbirth
go up, right?
1526
01:10:05,968 --> 01:10:07,845
We see women having
their children later.
1527
01:10:07,870 --> 01:10:10,874
Now of course, at the moment,
there's a deadline for that,
1528
01:10:10,906 --> 01:10:12,749
but that's not going
to exist anymore,
1529
01:10:12,775 --> 01:10:14,686
because menopause
is part of aging.
1530
01:10:14,710 --> 01:10:17,350
So women who are choosing
to have their children
1531
01:10:17,379 --> 01:10:19,359
a bit later now,
1532
01:10:19,381 --> 01:10:21,258
it stands to reason
that a lot of them
1533
01:10:21,283 --> 01:10:22,819
are probably gonna choose
to have their children
1534
01:10:22,851 --> 01:10:24,922
a lot later and a lot later
and that, of course,
1535
01:10:24,954 --> 01:10:28,265
also has an enormous
depressive impact
1536
01:10:28,290 --> 01:10:31,396
on the trajectory
of global population.
1537
01:10:31,427 --> 01:10:32,872
If we actually said
to everybody, "Okay,
1538
01:10:32,895 --> 01:10:34,932
you're all now
gonna live for 1,000 years,
1539
01:10:34,964 --> 01:10:36,875
we could restructure society
1540
01:10:36,899 --> 01:10:38,901
so it's on
these 1,000-year cycles.
1541
01:10:38,934 --> 01:10:42,472
That's possible,
but the problem becomes
1542
01:10:42,504 --> 01:10:46,509
when you still allow people
to live the normal length
1543
01:10:46,542 --> 01:10:49,386
and you're also allowing
some people to live 1,000 years
1544
01:10:49,411 --> 01:10:52,170
then how do you compare
the value of the lives,
1545
01:10:52,248 --> 01:10:53,386
the amount of experience?
1546
01:10:53,415 --> 01:10:56,396
Supposing a 585-year-old guy
1547
01:10:56,418 --> 01:10:58,557
goes up for a job
against a 23-year-old.
1548
01:10:58,587 --> 01:11:00,328
How do you measure
the experience?
1549
01:11:00,356 --> 01:11:01,926
What, the old guy
always gets the job?
1550
01:11:01,957 --> 01:11:05,905
I mean, really, these kinds
of problems would arise
1551
01:11:05,928 --> 01:11:08,465
unless there was
some kind of legislation
1552
01:11:08,497 --> 01:11:11,340
about permissible
variation in age.
1553
01:11:11,267 --> 01:11:12,803
This is a bit of a conundrum
1554
01:11:12,835 --> 01:11:17,450
because we're all
expanding our lifespan,
1555
01:11:17,273 --> 01:11:20,379
and the question is,
would you like to live
1556
01:11:20,409 --> 01:11:22,753
for not 100 years but 200?
1557
01:11:22,778 --> 01:11:24,883
Would you choose
to if you could?
1558
01:11:24,913 --> 01:11:27,291
It would be very,
very difficult to say no.
1559
01:11:27,316 --> 01:11:30,786
The reality is the replacement
of human piece parts
1560
01:11:30,819 --> 01:11:33,561
is probably gonna take us
in that direction,
1561
01:11:33,589 --> 01:11:35,680
but it will be market driven,
1562
01:11:35,291 --> 01:11:37,737
and those people with the money
will be able to afford
1563
01:11:37,760 --> 01:11:39,865
to live a lot longer
than those people without.
1564
01:11:39,895 --> 01:11:41,806
Pretty much most of
the discovery these days
1565
01:11:41,830 --> 01:11:44,401
takes place in Western Europe
or the United States
1566
01:11:44,433 --> 01:11:49,314
or one or two other countries,
China, Singapore, and so on,
1567
01:11:49,338 --> 01:11:52,683
but if they're valuable enough--
and I don't mean monetarily--
1568
01:11:52,708 --> 01:11:55,985
if they're worth having,
then people extend them.
1569
01:11:56,110 --> 01:11:58,321
We have to start somewhere,
and I don't believe
1570
01:11:58,347 --> 01:12:00,418
in the dog-in-the-manger
attitude
1571
01:12:00,449 --> 01:12:01,689
that you don't
give it to anybody
1572
01:12:01,717 --> 01:12:03,754
until you can provide it
for everybody.
1573
01:12:03,786 --> 01:12:06,528
All technologies
are discontinuous.
1574
01:12:06,555 --> 01:12:09,920
There are people
at this very moment
1575
01:12:09,325 --> 01:12:11,320
who are walking four kilometers
1576
01:12:11,600 --> 01:12:14,640
to get a bucket of water
from a well.
1577
01:12:14,960 --> 01:12:17,430
You know, there are people
who are having cornea operations
1578
01:12:17,660 --> 01:12:20,809
that are done with a needle
where it's stuck in their eye
1579
01:12:20,836 --> 01:12:23,770
and their cornea is scraped out.
1580
01:12:23,105 --> 01:12:25,745
You know, so these ideas
of totalizing
1581
01:12:25,774 --> 01:12:28,618
utopian technological
intervention
1582
01:12:28,644 --> 01:12:31,454
are part of a discontinuous
technological world
1583
01:12:31,480 --> 01:12:34,825
and the world will always be
discontinuous technologically.
1584
01:12:34,850 --> 01:12:39,356
When a child has to drink
dirty water, cannot get food,
1585
01:12:39,388 --> 01:12:41,959
and is dying of starvation
and disease,
1586
01:12:41,990 --> 01:12:44,834
and the solution
is just a few dollars,
1587
01:12:44,860 --> 01:12:46,430
there's something badly wrong.
1588
01:12:46,462 --> 01:12:49,466
We need to fix those things.
1589
01:12:49,498 --> 01:12:53,503
The only way that
we could fix them in the past
1590
01:12:53,535 --> 01:12:55,537
would've been
at unbelievable cost
1591
01:12:55,571 --> 01:12:56,914
because of the limitation
1592
01:12:56,939 --> 01:12:59,112
of our industrial capacity
and capability.
1593
01:12:59,141 --> 01:13:00,643
Not anymore.
1594
01:13:05,981 --> 01:13:10,396
...not the one which is--
which stops us from aging.
1595
01:13:10,419 --> 01:13:15,164
In the last 20 years,
healthcare in Sub-Saharan Africa
1596
01:13:15,391 --> 01:13:16,836
has greatly improved.
1597
01:13:16,859 --> 01:13:17,929
♪
1598
01:13:17,960 --> 01:13:20,907
HIV prevalence has gone down.
1599
01:13:20,929 --> 01:13:23,876
Infant mortality rate
has gone down.
1600
01:13:23,899 --> 01:13:26,880
Immunization rates have gone up,
1601
01:13:26,902 --> 01:13:29,883
and the drug supply
in many areas has risen.
1602
01:13:29,905 --> 01:13:30,975
♪
1603
01:13:31,600 --> 01:13:33,816
However, healthcare
and medication
1604
01:13:33,842 --> 01:13:35,685
in developing countries
1605
01:13:35,711 --> 01:13:39,887
are not always affordable
or even readily available.
1606
01:13:39,915 --> 01:13:41,758
It is not just medicine.
1607
01:13:41,784 --> 01:13:44,196
According to the World
Health Organization,
1608
01:13:44,420 --> 01:13:47,697
over 700 million people
worldwide
1609
01:13:47,723 --> 01:13:51,603
do not have access
to clean drinking water.
1610
01:13:51,627 --> 01:13:54,506
We still live in an age
where over one billion people
1611
01:13:54,530 --> 01:13:56,771
live off less
than a dollar a day
1612
01:13:56,799 --> 01:13:59,473
and live in extreme poverty.
1613
01:13:59,501 --> 01:14:02,482
It is ultimately
not a scientific issue,
1614
01:14:02,504 --> 01:14:04,882
it is a geopolitical issue.
1615
01:15:09,238 --> 01:15:12,140
Now a lot of people,
a lot of philanthropists
1616
01:15:12,400 --> 01:15:16,853
are of the view that
the most important thing to do
1617
01:15:16,879 --> 01:15:20,691
is to address the trailing edge
of quality of life.
1618
01:15:20,716 --> 01:15:23,754
In other words, to help
the disadvantaged.
1619
01:15:23,785 --> 01:15:26,527
But some visionary
philanthropists
1620
01:15:26,555 --> 01:15:29,559
such as the ones that fund
SENS Research Foundation--
1621
01:15:29,591 --> 01:15:31,195
and the fact is
I agree with this,
1622
01:15:31,226 --> 01:15:33,206
and that's why I've put
most of my inheritance
1623
01:15:33,228 --> 01:15:35,834
into SENS Research
Foundation, too--
1624
01:15:35,864 --> 01:15:39,903
we feel that, actually,
in the long run,
1625
01:15:39,935 --> 01:15:43,542
you lose out if you focus
too exclusively
1626
01:15:43,572 --> 01:15:45,108
on the trailing edge.
1627
01:15:45,140 --> 01:15:48,280
You've got also to push forward
the leading edge,
1628
01:15:48,310 --> 01:15:51,757
so that in the long term,
everybody moves forward.
1629
01:16:43,999 --> 01:16:46,843
Due to a lack of funding
from governments,
1630
01:16:46,868 --> 01:16:51,180
anti-aging research is often
pushed into the private sector.
1631
01:16:51,206 --> 01:16:55,860
If we look at funding
for disease--
1632
01:16:55,110 --> 01:16:56,919
cancer, heart disease--
1633
01:16:56,945 --> 01:16:59,152
they get six billion,
eight billion, ten billion.
1634
01:16:59,181 --> 01:17:02,355
AIDS still gets two
to four billion.
1635
01:17:02,384 --> 01:17:04,694
Let's look
at Alzheimer's disease.
1636
01:17:04,720 --> 01:17:06,961
It's probably the most
important disease of aging,
1637
01:17:06,989 --> 01:17:10,664
the brain--it gets
under a half a billion
1638
01:17:10,692 --> 01:17:11,898
from the federal government,
1639
01:17:11,927 --> 01:17:14,660
and a lot of that
goes to programs
1640
01:17:14,960 --> 01:17:15,939
that are tied up
with pharma trials
1641
01:17:15,964 --> 01:17:18,103
where we don't really
see it in the labs,
1642
01:17:18,133 --> 01:17:20,841
so you're maybe down to two
or three hundred million.
1643
01:17:20,869 --> 01:17:24,214
It's not nearly enough
to really make a dent.
1644
01:17:24,239 --> 01:17:29,951
The question is, why when it
comes to Alzheimer's disease,
1645
01:17:29,978 --> 01:17:32,117
which is a problem
in the elderly
1646
01:17:32,147 --> 01:17:34,218
do we just see it as--
the federal government
1647
01:17:34,249 --> 01:17:36,388
seems to see it
as a red-haired stepchild.
1648
01:17:36,418 --> 01:17:39,331
They don't take care of it.
1649
01:17:39,354 --> 01:17:40,992
Some people say it's because--
1650
01:17:41,230 --> 01:17:42,195
people say, "Well,
it affects old people,
1651
01:17:42,224 --> 01:17:44,431
they lived their lives,
let them go."
1652
01:17:44,660 --> 01:17:47,140
No one wants to admit that,
but maybe subconsciously,
1653
01:17:47,162 --> 01:17:49,730
when Congress is
thinking about this,
1654
01:17:49,970 --> 01:17:50,974
that's at play, who knows?
1655
01:17:50,999 --> 01:17:53,912
Um, maybe it's
much more compelling
1656
01:17:53,935 --> 01:17:57,974
to wanna put money into diseases
that affect young people
1657
01:17:58,600 --> 01:18:00,111
who still have
their whole life to live
1658
01:18:00,142 --> 01:18:02,880
when they have AIDS
or breast cancer
1659
01:18:02,110 --> 01:18:04,784
or cancer that can strike
somebody at 30 or 40 years old.
1660
01:18:04,813 --> 01:18:07,157
Age might be part of it,
and even if it's something
1661
01:18:07,182 --> 01:18:09,162
where you'd say,
"No, it can't be that!"
1662
01:18:09,184 --> 01:18:10,959
you never know what's
happening subconsciously
1663
01:18:10,986 --> 01:18:12,795
in those who are
making the decisions.
1664
01:18:12,821 --> 01:18:14,391
Otherwise, it just
makes no sense at all.
1665
01:18:14,423 --> 01:18:16,960
I don't know how to explain it.
1666
01:18:16,124 --> 01:18:17,933
When you talk
to people about aging
1667
01:18:17,959 --> 01:18:20,963
and rejuvenation medicine,
you're talking about things
1668
01:18:20,996 --> 01:18:23,408
that they haven't put
in the same category
1669
01:18:23,432 --> 01:18:25,139
as things that they can fight.
1670
01:18:25,167 --> 01:18:26,305
They are willing to put money
1671
01:18:26,334 --> 01:18:28,280
towards solving cancer
and curing cancer.
1672
01:18:28,303 --> 01:18:30,840
It's something they might have
the potential of experiencing.
1673
01:18:30,872 --> 01:18:33,876
But the thing that's 100 percent
in terms of probability,
1674
01:18:33,909 --> 01:18:36,820
they haven't classified that
as in the same category
1675
01:18:36,111 --> 01:18:38,284
when actually it is
and actually it's more dramatic
1676
01:18:38,313 --> 01:18:40,793
because 100 percent
of people experience it.
1677
01:18:40,816 --> 01:18:43,922
You need to have
the will to be cured.
1678
01:18:43,952 --> 01:18:48,128
Beyond that, medical science
will play its part.
1679
01:18:48,156 --> 01:18:51,103
I think it's essentially a crime
1680
01:18:51,126 --> 01:18:53,732
to not support
life extension science,
1681
01:18:53,762 --> 01:18:55,469
because if you support
the other side,
1682
01:18:55,497 --> 01:18:58,137
you're an advocate
for killing someone.
1683
01:18:58,166 --> 01:19:02,808
When you actually support
a culture of death,
1684
01:19:02,838 --> 01:19:06,149
when you support
embracing death,
1685
01:19:06,174 --> 01:19:09,986
what you're really doing is
not supporting embracing life.
1686
01:19:10,110 --> 01:19:11,922
Everyone ought to be healthy,
1687
01:19:11,947 --> 01:19:14,484
however long ago they were born.
1688
01:19:14,516 --> 01:19:17,395
When someone says, "Oh, dear,
we shouldn't defeat aging,
1689
01:19:17,419 --> 01:19:19,296
we shouldn't work
to eliminate aging,"
1690
01:19:19,321 --> 01:19:22,970
what they're actually saying
is they're not in favor
1691
01:19:22,124 --> 01:19:24,934
of healthcare for the elderly,
or to be more precise,
1692
01:19:24,960 --> 01:19:27,133
what they're saying is
they're only in favor
1693
01:19:27,162 --> 01:19:28,971
of healthcare for the elderly
1694
01:19:28,997 --> 01:19:31,978
so long as it
doesn't work very well.
1695
01:19:32,000 --> 01:19:34,710
And I think that's fucked up.
1696
01:19:35,103 --> 01:19:37,777
In September 2013,
1697
01:19:37,806 --> 01:19:41,117
Google announced
the conception of Calico,
1698
01:19:41,143 --> 01:19:43,214
an independent biotech company
1699
01:19:43,245 --> 01:19:46,556
that remains to this day
a little mysterious.
1700
01:19:46,782 --> 01:19:51,940
Its aim is to tackle aging
and devise interventions
1701
01:19:51,119 --> 01:19:55,431
that enable people to lead
longer and healthier lives.
1702
01:19:55,457 --> 01:19:58,970
In September 2014,
1703
01:19:58,126 --> 01:20:01,730
the life extension company
announced it was partnering
1704
01:20:01,960 --> 01:20:04,908
with biopharmaceutical
giant AbbVie,
1705
01:20:04,933 --> 01:20:10,406
and made a $1.5 billion
investment into research.
1706
01:20:10,438 --> 01:20:12,247
I think one of
the biggest obstacles
1707
01:20:12,274 --> 01:20:14,811
that we have at the moment
to come to terms
1708
01:20:14,843 --> 01:20:17,187
with this future world
we're talking about
1709
01:20:17,212 --> 01:20:19,317
is a lot of people
who basically
1710
01:20:19,347 --> 01:20:21,953
don't want it to happen at all
1711
01:20:21,983 --> 01:20:25,294
and so are placing
all kinds of ethical
1712
01:20:25,320 --> 01:20:27,357
and institutional restrictions
1713
01:20:27,389 --> 01:20:29,266
on the development
of this stuff
1714
01:20:29,291 --> 01:20:31,293
so that it becomes difficult,
let's say, in universities
1715
01:20:31,326 --> 01:20:33,932
to experiment with certain
kinds of drugs, right?
1716
01:20:33,962 --> 01:20:36,135
To develop certain kinds
of machines maybe even,
1717
01:20:36,164 --> 01:20:38,440
and as a result,
all of that kind of research
1718
01:20:38,466 --> 01:20:41,106
ends up going into
either the private sector
1719
01:20:41,136 --> 01:20:43,980
or maybe underground, right?
1720
01:20:44,500 --> 01:20:46,349
Or going into some country
that's an ethics-free zone
1721
01:20:46,374 --> 01:20:48,479
like China
or someplace like that,
1722
01:20:48,510 --> 01:20:50,319
and I think that's
where the real problems
1723
01:20:50,345 --> 01:20:53,417
potentially lie, because
we really need to be doing,
1724
01:20:53,448 --> 01:20:55,223
you know, we need to be
developing this stuff,
1725
01:20:55,250 --> 01:20:57,321
but in the public eye, right?
1726
01:20:57,352 --> 01:21:00,424
So it should be done
by the mainstream authorities
1727
01:21:00,455 --> 01:21:02,196
so we can monitor
the consequences
1728
01:21:02,224 --> 01:21:03,464
as they're happening
and then be able
1729
01:21:03,491 --> 01:21:05,164
to take appropriate action.
1730
01:21:05,193 --> 01:21:06,536
But I'm afraid that
a lot of this stuff
1731
01:21:06,561 --> 01:21:10,800
is perhaps being driven outside
1732
01:21:10,310 --> 01:21:12,136
because of all the restrictions
that are placed on it.
1733
01:21:12,167 --> 01:21:13,908
That I think is very worrisome,
1734
01:21:13,935 --> 01:21:15,505
because then you can't
keep track of the results,
1735
01:21:15,537 --> 01:21:18,177
and you don't know
exactly what's happening.
1736
01:21:18,206 --> 01:21:20,277
And I think that's
a real problem already
1737
01:21:20,308 --> 01:21:22,219
with a lot of this
more futuristic stuff.
1738
01:21:24,412 --> 01:21:26,323
Arguably, the human condition
1739
01:21:26,348 --> 01:21:29,557
is defined by
our anxiety of death.
1740
01:21:29,584 --> 01:21:32,155
It's little wonder
that throughout history,
1741
01:21:32,187 --> 01:21:35,430
mankind has built
countless belief systems
1742
01:21:35,457 --> 01:21:38,336
in a bid to pacify
the fear of death
1743
01:21:38,360 --> 01:21:41,637
through the promise
of endless paradise.
1744
01:21:41,663 --> 01:21:45,475
Ultimately, death always wins.
1745
01:21:45,500 --> 01:21:50,279
It's not so much death
we fear, it's dying.
1746
01:21:50,305 --> 01:21:52,376
The relationship
between life and death
1747
01:21:52,407 --> 01:21:57,220
is often figured
in terms of immortality
1748
01:21:57,245 --> 01:21:59,191
and the quest for immortality.
1749
01:21:59,214 --> 01:22:01,490
There's a philosopher,
a brilliant philosopher
1750
01:22:01,516 --> 01:22:05,293
called Stephen Cave, who,
in his book Immortality,
1751
01:22:05,320 --> 01:22:09,359
argues that our fear of death
is the great driver
1752
01:22:09,391 --> 01:22:13,533
of all civilization,
of all human endeavor,
1753
01:22:13,561 --> 01:22:16,974
and he identifies
four different ways
1754
01:22:16,998 --> 01:22:20,411
in which people
seek immortality,
1755
01:22:20,435 --> 01:22:24,542
so firstly the idea
of extending life,
1756
01:22:24,572 --> 01:22:26,552
of living forever.
1757
01:22:26,574 --> 01:22:28,986
Secondly, the idea
of resurrection
1758
01:22:29,100 --> 01:22:33,322
so that we might come back
after death in some form.
1759
01:22:33,348 --> 01:22:36,955
Thirdly, the idea
of the immortality
1760
01:22:36,985 --> 01:22:41,161
of some part of ourselves
beyond the physical body.
1761
01:22:41,189 --> 01:22:44,363
So perhaps the immortality
of the soul, for example,
1762
01:22:44,392 --> 01:22:47,202
or living on in Heaven.
1763
01:22:47,228 --> 01:22:51,608
And finally the idea
of leaving a legacy.
1764
01:22:51,633 --> 01:22:54,477
I think that one
of life's challenges
1765
01:22:54,502 --> 01:22:56,573
really actually is
to come to terms
1766
01:22:56,604 --> 01:22:59,608
with our own finitude
and mortality
1767
01:22:59,641 --> 01:23:02,190
and human limitations,
1768
01:23:02,430 --> 01:23:05,456
and this is
an enormous challenge.
1769
01:23:05,480 --> 01:23:13,480
♪
1770
01:23:21,529 --> 01:23:26,350
Technology, a reflection
of our times.
1771
01:23:26,670 --> 01:23:31,278
Efficient, computerized,
with a sleek beauty all its own.
1772
01:23:31,306 --> 01:23:33,786
Technology is
the human imagination
1773
01:23:34,900 --> 01:23:36,182
converted into reality.
1774
01:23:36,211 --> 01:23:38,213
We are all interested
in the future,
1775
01:23:38,246 --> 01:23:40,453
for that is where you and I
are going to spend
1776
01:23:40,482 --> 01:23:42,519
the rest of our lives.
1777
01:23:42,550 --> 01:23:45,121
(high-pitched tone)
1778
01:23:45,153 --> 01:23:52,503
♪
1779
01:23:52,527 --> 01:23:54,564
It's impossible to say for sure
1780
01:23:54,596 --> 01:23:57,509
where these new technologies
will take us
1781
01:23:57,532 --> 01:24:01,571
and how we will prepare
to implement them into society.
1782
01:24:01,603 --> 01:24:04,447
It is likely that they will
affect the sensibilities
1783
01:24:04,472 --> 01:24:06,509
of global infrastructure.
1784
01:24:08,760 --> 01:24:12,218
There are always anxieties
surrounding new technologies,
1785
01:24:12,247 --> 01:24:14,591
and this time is no exception.
1786
01:24:14,616 --> 01:24:15,720
♪
1787
01:24:15,750 --> 01:24:18,128
I think people fear change,
1788
01:24:18,153 --> 01:24:21,100
and so the future represents
this enormous amount
1789
01:24:21,122 --> 01:24:23,500
of change that's coming at us.
1790
01:24:23,525 --> 01:24:25,368
I do think it's
overwhelming for people,
1791
01:24:25,393 --> 01:24:28,670
you know, they are
afraid to change
1792
01:24:28,696 --> 01:24:30,403
the paradigm that they live in,
1793
01:24:30,432 --> 01:24:33,208
and when we talk about
the future of work and death,
1794
01:24:33,234 --> 01:24:35,475
what we're really
talking about is changing
1795
01:24:35,503 --> 01:24:37,574
a paradigm
that has existed for us
1796
01:24:37,605 --> 01:24:38,811
as long as we can remember.
1797
01:24:38,840 --> 01:24:42,830
All of this kind
of scaremongering
1798
01:24:42,110 --> 01:24:46,149
about harm and risk
and stuff like that
1799
01:24:46,181 --> 01:24:50,425
really is--it's based on a kind
of psychological illusion,
1800
01:24:50,452 --> 01:24:54,628
namely that you imagine that you
kind of see the bad state
1801
01:24:54,656 --> 01:24:57,466
as a bad state when it happens,
1802
01:24:57,492 --> 01:25:00,132
whereas in fact,
what more likely happens
1803
01:25:00,161 --> 01:25:02,767
is that you kinda get adjusted
to the various changes
1804
01:25:02,797 --> 01:25:04,401
that are happening
in your environment
1805
01:25:04,432 --> 01:25:06,139
so that when you actually
do reach that state
1806
01:25:06,167 --> 01:25:09,808
we're talking about,
it'll seem normal.
1807
01:25:09,838 --> 01:25:12,114
And because, look,
when the automobile
1808
01:25:12,140 --> 01:25:14,484
was introduced in
the early 20th century,
1809
01:25:14,509 --> 01:25:16,511
people were saying
this is just gonna pump
1810
01:25:16,544 --> 01:25:18,387
a lot of smoke
into the atmosphere,
1811
01:25:18,413 --> 01:25:20,290
it's going to ruin
our contact with nature
1812
01:25:20,315 --> 01:25:21,885
'cause we'll be in
these enclosed vehicles,
1813
01:25:22,117 --> 01:25:23,494
we'll be going so fast,
1814
01:25:23,518 --> 01:25:25,293
we won't be able
to appreciate things,
1815
01:25:25,320 --> 01:25:27,391
there'll be congestion,
blah, blah, blah,
1816
01:25:27,422 --> 01:25:29,368
they were right!
1817
01:25:29,390 --> 01:25:30,630
They were right, but of course,
1818
01:25:30,658 --> 01:25:32,831
by the time you get
to that state
1819
01:25:32,861 --> 01:25:35,205
where the automobile
has had that impact,
1820
01:25:35,230 --> 01:25:36,732
it's also had all
this benefit as well,
1821
01:25:36,764 --> 01:25:39,210
and your whole life has been
kind of restructured around it.
1822
01:25:39,234 --> 01:25:42,147
Arguably, people who are using
1823
01:25:42,170 --> 01:25:47,518
or have been conceived
using in vitro fertilization
1824
01:25:47,542 --> 01:25:53,322
are cyborgs way before
they were ever even people.
1825
01:25:53,348 --> 01:25:56,386
Now that doesn't mean
that we understand kinship
1826
01:25:56,417 --> 01:25:58,226
in a radically different way.
1827
01:25:58,253 --> 01:25:59,857
Just look at
the Industrial Revolution!
1828
01:25:59,888 --> 01:26:03,426
Is anyone actually--
does anyone actually regret
1829
01:26:03,458 --> 01:26:05,301
that the Industrial Revolution
occurred?
1830
01:26:05,326 --> 01:26:07,602
No, it was fairly
turbulent, right?
1831
01:26:07,629 --> 01:26:10,576
You know, we did actually
have a little bit of strife
1832
01:26:10,598 --> 01:26:13,272
in the transition
from a pre-industrial world
1833
01:26:13,301 --> 01:26:14,575
to the world we know today.
1834
01:26:14,602 --> 01:26:16,604
But the fact is, we adapted.
1835
01:26:16,638 --> 01:26:18,914
The most important thing here
is to try to compare it
1836
01:26:18,940 --> 01:26:20,783
to something in the past.
1837
01:26:20,808 --> 01:26:24,517
Imagine we were--
it was 1914, 100 years back,
1838
01:26:24,546 --> 01:26:26,924
and I told you that
most people on the planet
1839
01:26:26,948 --> 01:26:28,518
would have the ability to have
1840
01:26:28,550 --> 01:26:30,461
this tiny cell phone screen
in front of them
1841
01:26:30,485 --> 01:26:33,432
and video conference with ten
of their friends all at once.
1842
01:26:33,454 --> 01:26:36,264
If it was 1914,
you would look at me and say,
1843
01:26:36,291 --> 01:26:38,271
"That's absurd,
this guy's insane."
1844
01:26:38,293 --> 01:26:40,603
However, it's
the sort of same concept
1845
01:26:40,628 --> 01:26:42,335
when I tell you now in 50 years
1846
01:26:42,363 --> 01:26:44,604
we're going to be digital
beings of ourselves,
1847
01:26:44,632 --> 01:26:46,270
it's not so far-fetched.
1848
01:26:46,301 --> 01:26:48,338
You have to look at it
in the historical context.
1849
01:26:48,369 --> 01:26:51,407
All concepts of
technological progress
1850
01:26:51,439 --> 01:26:55,581
in that way are linked
to post-enlightenment ideas
1851
01:26:55,610 --> 01:26:58,386
or non--they're linked
to the idea
1852
01:26:58,413 --> 01:27:01,553
of the arrow of time
being in free flight forward,
1853
01:27:01,583 --> 01:27:05,588
but they're also chiliastic,
they propose an end state.
1854
01:27:05,620 --> 01:27:06,894
They propose the end state,
1855
01:27:06,921 --> 01:27:08,662
and the end state
is the singularity,
1856
01:27:08,690 --> 01:27:11,330
but they propose it
as something desirable.
1857
01:27:11,359 --> 01:27:14,772
Now any kind of philosophy
like that, it's, you know,
1858
01:27:14,796 --> 01:27:18,505
jam tomorrow, jam yesterday,
but never jam today.
1859
01:27:18,533 --> 01:27:20,911
They're all philosophies
that are about
1860
01:27:20,935 --> 01:27:24,405
accept the shit you're in--
work, consume, die--
1861
01:27:24,439 --> 01:27:26,646
because there is something
better in the future,
1862
01:27:26,674 --> 01:27:28,585
or there's something
more innovative in the future.
1863
01:27:28,610 --> 01:27:30,920
There are good scenarios
and there are bad scenarios.
1864
01:27:30,945 --> 01:27:32,549
I don't know where we're headed.
1865
01:27:32,580 --> 01:27:35,891
I mean, I don't think
anyone really knows.
1866
01:27:35,917 --> 01:27:37,897
You know, if anyone
claims to know the future,
1867
01:27:37,919 --> 01:27:39,626
they're guessing,
they're extrapolating forward,
1868
01:27:39,654 --> 01:27:41,292
and we draw
some lines and curves
1869
01:27:41,322 --> 01:27:42,801
and see where
technology's gonna be.
1870
01:27:42,824 --> 01:27:44,633
What that means,
1871
01:27:44,659 --> 01:27:46,297
I don't think any of us
really understand.
1872
01:27:46,327 --> 01:27:48,967
Everyone assumes that
the future is going to be
1873
01:27:48,997 --> 01:27:50,670
dramatically different
from today
1874
01:27:50,698 --> 01:27:52,410
and that's absolutely true,
1875
01:27:52,267 --> 01:27:53,905
but it's also true
that the future will be
1876
01:27:53,935 --> 01:27:55,608
an extension of today's world.
1877
01:27:55,637 --> 01:27:58,481
The problems
that exist in today's world
1878
01:27:58,506 --> 01:28:01,385
are still gonna be
with us in the future.
1879
01:28:01,409 --> 01:28:03,389
Human nature is
not gonna change.
1880
01:28:03,411 --> 01:28:06,170
The end point
in all of this game
1881
01:28:06,470 --> 01:28:08,687
will become a bit of a moral
1882
01:28:08,716 --> 01:28:12,357
and an ethical question
for society
1883
01:28:12,387 --> 01:28:14,526
where decisions
will have to be made.
1884
01:28:15,657 --> 01:28:19,662
Like life itself, work
and death, for better or worse,
1885
01:28:19,694 --> 01:28:21,833
are two features
of the human experience
1886
01:28:21,863 --> 01:28:23,934
that are thrust upon us.
1887
01:28:25,400 --> 01:28:27,710
Whether or not
we define work and death
1888
01:28:27,735 --> 01:28:29,976
as problems in need of remedy,
1889
01:28:30,400 --> 01:28:33,850
human ingenuity is a progressive
and natural extension
1890
01:28:33,875 --> 01:28:35,855
of our own evolution.
1891
01:28:35,877 --> 01:28:43,694
♪
1892
01:28:43,718 --> 01:28:46,824
Advancing our technological
capabilities is a way
1893
01:28:46,854 --> 01:28:50,427
of dealing with our limitations
as human beings.
1894
01:28:50,458 --> 01:28:51,960
♪
1895
01:28:51,993 --> 01:28:53,666
Must we do something
1896
01:28:53,695 --> 01:28:56,403
just because we're capable
of doing something?
1897
01:28:56,431 --> 01:28:58,536
Or can we withhold
our hands and say
1898
01:28:58,566 --> 01:29:00,978
"No, this is not
a good thing to do"?
1899
01:29:01,200 --> 01:29:04,600
This is something
that the human species
1900
01:29:04,380 --> 01:29:05,813
must decide for itself.
1901
01:29:05,840 --> 01:29:08,616
You and I, we can't just
leave it to the scientists.
1902
01:29:08,643 --> 01:29:10,850
We have to know
what's going on and why!
1903
01:29:14,649 --> 01:29:17,649
♪