1
00:00:03,709 --> 00:00:05,931
[buzzing]
3
00:00:20,107 --> 00:00:22,834
Even if we can
change human beings,
4
00:00:22,836 --> 00:00:24,925
in what direction
do we change them?
5
00:00:24,927 --> 00:00:28,013
Do we want to change them
this way or that way?
6
00:00:28,015 --> 00:00:31,652
This is an example
of the way in which
7
00:00:31,654 --> 00:00:36,782
technological advance impinges
on sociological necessity.
8
00:00:37,415 --> 00:00:46,975
[music]
9
00:00:56,775 --> 00:00:58,402
What we need to ask
ourselves is,
10
00:00:58,404 --> 00:01:02,573
"Where does my individual
ability to control my life
11
00:01:02,575 --> 00:01:05,246
or to influence
the political process
12
00:01:05,248 --> 00:01:09,071
lie in relation to these
new forms of technology?"
13
00:01:09,287 --> 00:01:11,558
[music]
14
00:01:11,560 --> 00:01:12,754
Government and politicians
15
00:01:12,756 --> 00:01:14,173
don't understand
what's happening.
16
00:01:14,175 --> 00:01:17,610
See, they don't even realize
this change is happening.
17
00:01:18,295 --> 00:01:21,277
It is very difficult,
though not impossible,
18
00:01:21,279 --> 00:01:24,925
to predict what
the precise effects will be,
19
00:01:24,927 --> 00:01:29,709
and in many cases,
like with other technologies,
20
00:01:29,711 --> 00:01:31,868
we have to suck it and see.
21
00:01:32,074 --> 00:01:34,293
Who would have
predicted the internet?
22
00:01:34,295 --> 00:01:37,058
And I talk about this
matter as Humanity 2.0
23
00:01:37,060 --> 00:01:38,863
'cause in a sense,
this is where we're heading,
24
00:01:38,865 --> 00:01:41,354
to some kind of
new normal, as it were,
25
00:01:41,356 --> 00:01:43,587
of what it is
to be a human being.
26
00:01:43,670 --> 00:01:45,813
It's not a problem
27
00:01:45,815 --> 00:01:47,893
that we should dismiss
or underestimate.
28
00:01:47,895 --> 00:01:50,154
It's staggering
in its proportions.
29
00:01:50,156 --> 00:01:52,805
Ignorance and disbelief
at the same time.
30
00:01:52,818 --> 00:01:57,263
People don't believe that
change is happening this fast.
31
00:01:57,265 --> 00:01:58,845
That's the problem.
32
00:01:59,615 --> 00:02:03,769
[music]
33
00:02:03,771 --> 00:02:06,214
This is a stone
34
00:02:06,216 --> 00:02:08,133
formed naturally
in the Earth's crust
35
00:02:08,135 --> 00:02:11,455
over millions of years
through pressure and heat.
36
00:02:11,545 --> 00:02:15,733
It was discovered in
the Olduvai Gorge in Tanzania.
37
00:02:15,735 --> 00:02:18,933
Dated around 2.5
million years BC,
38
00:02:18,935 --> 00:02:23,015
it is arguably one of the first
examples of technology.
39
00:02:23,107 --> 00:02:25,941
Stone tools were first adapted
for cutting,
40
00:02:25,943 --> 00:02:30,436
scraping, or pounding
materials by Homo habilis,
41
00:02:30,438 --> 00:02:33,446
one of our earliest ancestors.
42
00:02:33,920 --> 00:02:35,941
Over one million years later,
43
00:02:35,943 --> 00:02:37,952
mankind made one of
the most significant
44
00:02:37,954 --> 00:02:40,876
of all technological
discoveries...
45
00:02:41,170 --> 00:02:42,457
Fire.
46
00:02:42,459 --> 00:02:44,194
The ability to control fire
47
00:02:44,196 --> 00:02:46,777
was a turning point
for human evolution.
48
00:02:46,779 --> 00:02:49,881
It kept us warm, allowed us
to see in the dark,
49
00:02:49,883 --> 00:02:51,582
and allowed us to cook food,
50
00:02:51,584 --> 00:02:54,944
which many scientists believe
was a huge contributor
51
00:02:54,946 --> 00:02:57,266
to the ascent of mind.
52
00:02:58,215 --> 00:03:02,191
Each age, each empire,
has brought with it
53
00:03:02,193 --> 00:03:05,923
the discovery and invention
of numerous technologies
54
00:03:06,177 --> 00:03:10,532
each in their own way
redesigning human life...
55
00:03:12,267 --> 00:03:14,305
...leading us to now...
56
00:03:14,715 --> 00:03:17,275
...modern-day society.
57
00:03:18,209 --> 00:03:21,573
We're now more advanced,
connected, knowledgeable,
58
00:03:21,575 --> 00:03:25,305
and resistant to disease
than ever before,
59
00:03:25,401 --> 00:03:27,082
and it is all due
to our ability
60
00:03:27,084 --> 00:03:30,858
to apply scientific knowledge
for practical purposes
61
00:03:30,860 --> 00:03:33,649
in a bid to maximize
efficiency.
62
00:03:33,943 --> 00:03:37,756
Just as the stone set us
on a path of transformation,
63
00:03:37,758 --> 00:03:39,853
the technologies
of the future
64
00:03:39,855 --> 00:03:42,655
may bring with them
a paradigm shift,
65
00:03:42,763 --> 00:03:47,173
changing two major features
of the human experience.
66
00:03:47,295 --> 00:03:49,569
Two things that have
defined our lives
67
00:03:49,571 --> 00:03:51,673
for as long as
we can remember.
68
00:03:51,810 --> 00:03:56,579
Two things that have always
been involuntary constants:
69
00:03:56,806 --> 00:03:59,733
trading our time
for sustenance
70
00:03:59,735 --> 00:04:02,673
and losing that time
through senescence.
71
00:04:03,335 --> 00:04:07,165
[music]
72
00:04:11,467 --> 00:04:15,245
[music]
73
00:04:15,247 --> 00:04:18,247
[indistinct chattering]
74
00:04:18,350 --> 00:04:20,969
[music]
75
00:04:20,971 --> 00:04:22,870
[telephone ringing]
76
00:04:22,872 --> 00:04:27,235
[music]
77
00:04:27,315 --> 00:04:29,833
[zapping]
78
00:04:29,835 --> 00:04:37,473
[music]
79
00:04:37,475 --> 00:04:39,993
[laughing]
80
00:04:39,995 --> 00:04:45,985
[music]
81
00:04:46,995 --> 00:04:50,175
The Industrial Revolution
effectively freed man
82
00:04:50,177 --> 00:04:51,793
from being a beast of burden.
83
00:04:51,795 --> 00:04:53,925
The computer revolution
will similarly free him
84
00:04:53,927 --> 00:04:55,955
from dull, repetitive routine.
85
00:04:56,177 --> 00:04:57,839
The computer
revolution is, however,
86
00:04:57,841 --> 00:04:59,988
perhaps better compared
with the Copernican
87
00:04:59,990 --> 00:05:01,993
or the Darwinian Revolution,
88
00:05:01,995 --> 00:05:05,264
both of which greatly changed
man's idea of himself
89
00:05:05,266 --> 00:05:07,089
and the world
in which he lives.
90
00:05:07,235 --> 00:05:10,722
In the space of 60 years,
we have landed on the moon,
91
00:05:10,724 --> 00:05:13,389
seen the rise of
computing power,
92
00:05:13,391 --> 00:05:14,816
mobile phones,
93
00:05:14,818 --> 00:05:16,738
the explosion
of the internet,
94
00:05:16,740 --> 00:05:20,095
and we have sequenced
the human genome.
95
00:05:20,341 --> 00:05:22,473
We took man to
the moon and back
96
00:05:22,475 --> 00:05:24,980
with four kilobytes of memory.
97
00:05:24,982 --> 00:05:29,506
The phone in your pocket
is at least 250,000 times
98
00:05:29,508 --> 00:05:31,670
more powerful than that.
99
00:05:31,763 --> 00:05:34,930
We are ever-increasingly
doing more with less.
100
00:05:35,084 --> 00:05:36,722
One of the things
that has been born
101
00:05:36,724 --> 00:05:39,363
out of this
technological revolution
102
00:05:39,365 --> 00:05:42,175
is the ability
to replace human workers
103
00:05:42,177 --> 00:05:44,954
with more efficient machines.
104
00:05:45,117 --> 00:05:46,881
This is largely
due to the speed
105
00:05:46,883 --> 00:05:50,410
at which we are advancing
our technological capabilities.
106
00:05:50,412 --> 00:05:52,144
[applause]
107
00:05:52,146 --> 00:05:56,384
Information technology grows
in an exponential manner.
108
00:05:56,388 --> 00:05:58,113
It's not linear.
109
00:05:58,115 --> 00:06:00,246
And our intuition is linear.
110
00:06:00,248 --> 00:06:01,506
When we walked
through the savanna
111
00:06:01,508 --> 00:06:03,554
a thousand years ago,
we made linear predictions
112
00:06:03,556 --> 00:06:05,467
of where that animal would be
and that worked fine.
113
00:06:05,469 --> 00:06:07,488
It's hardwired in our brains,
114
00:06:07,490 --> 00:06:09,973
but the pace of
exponential growth
115
00:06:09,975 --> 00:06:13,545
is really what describes
information technologies,
116
00:06:13,547 --> 00:06:15,470
and it's not just computation.
117
00:06:15,490 --> 00:06:17,069
There's a big difference
between linear
118
00:06:17,071 --> 00:06:18,256
and exponential growth.
119
00:06:18,258 --> 00:06:20,327
If I take 30 steps linearly,
120
00:06:20,329 --> 00:06:23,416
one, two, three, four,
five, I get to 30.
121
00:06:23,418 --> 00:06:25,453
If I take 30 steps
exponentially,
122
00:06:25,455 --> 00:06:28,639
two, four, eight, sixteen,
I get to a billion.
123
00:06:28,641 --> 00:06:30,453
It makes a huge difference.
124
00:06:30,455 --> 00:06:32,913
And that really describes
information technology.
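A minimal Python sketch of the arithmetic just quoted (illustrative only; the step count and the doubling come from the speaker's example):

```python
# 30 linear steps vs. 30 exponential (doubling) steps, as described above.
steps = 30

linear = steps            # one, two, three ... -> 30
exponential = 2 ** steps  # two, four, eight ... -> 1,073,741,824

print(f"{steps} linear steps:      {linear}")
print(f"{steps} exponential steps: {exponential:,}")  # roughly a billion
```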
125
00:06:32,915 --> 00:06:34,832
When I was a student at MIT,
126
00:06:34,834 --> 00:06:37,173
we all shared one computer,
it took up a whole building.
127
00:06:37,175 --> 00:06:38,493
The computer in
your cell phone today
128
00:06:38,495 --> 00:06:41,499
is a million times cheaper,
a million times smaller,
129
00:06:41,501 --> 00:06:43,668
a thousand times
more powerful.
130
00:06:43,670 --> 00:06:46,420
That's a billionfold increase
in capability per dollar
131
00:06:46,422 --> 00:06:47,858
that we've actually
experienced
132
00:06:47,860 --> 00:06:49,373
since I was a student,
133
00:06:49,375 --> 00:06:52,188
and we're gonna do it again
in the next 25 years.
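The "billionfold" figure is simply the product of the factors in the quote; a quick hedged sketch:

```python
# Multiplying the speaker's factors: a million times cheaper times
# a thousand times more powerful -> a billionfold gain per dollar.
cheaper = 1_000_000    # "a million times cheaper"
more_powerful = 1_000  # "a thousand times more powerful"

print(f"{cheaper * more_powerful:,}")  # 1,000,000,000
```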
134
00:06:52,638 --> 00:06:55,485
Currently,
on an almost daily basis,
135
00:06:55,501 --> 00:06:58,163
new algorithms, programs,
136
00:06:58,165 --> 00:07:00,644
and feats in
mechanical engineering
137
00:07:00,646 --> 00:07:03,853
are getting closer and closer
to being a reliable
138
00:07:03,855 --> 00:07:08,227
and more cost-effective
alternative to a human worker.
139
00:07:08,820 --> 00:07:11,907
This process is
known as automation.
140
00:07:12,528 --> 00:07:14,848
This is not just about,
141
00:07:15,259 --> 00:07:17,202
you know, automation
where we expect it,
142
00:07:17,204 --> 00:07:18,405
which is in factories
143
00:07:18,407 --> 00:07:20,530
and among blue-collar
workers and so forth.
144
00:07:20,532 --> 00:07:23,043
It is coming
quite aggressively
145
00:07:23,045 --> 00:07:24,866
for people at much
higher skill levels,
146
00:07:24,868 --> 00:07:26,718
and that will only
grow in the future
147
00:07:26,720 --> 00:07:29,266
as we continue on
this exponential arc.
148
00:07:29,357 --> 00:07:32,114
This business of, you know,
not having to work very hard
149
00:07:32,116 --> 00:07:34,466
because machines are taking
care of things for you,
150
00:07:34,468 --> 00:07:36,374
I mean, you see this
also in the 19th century
151
00:07:36,376 --> 00:07:37,879
with the Industrial Revolution.
152
00:07:37,881 --> 00:07:40,631
And, in fact, I think
one of the problems
153
00:07:40,633 --> 00:07:42,295
with the Industrial Revolution,
154
00:07:42,297 --> 00:07:45,035
and this is where Marxism
got so much traction,
155
00:07:45,037 --> 00:07:48,209
was that machines
actually did render
156
00:07:48,211 --> 00:07:50,293
a lot of people
unemployed, okay?
157
00:07:50,295 --> 00:07:53,298
That already happened in
the 19th and 20th centuries.
158
00:07:53,498 --> 00:07:56,459
And it was only by
labor organizing itself
159
00:07:56,461 --> 00:07:57,764
that it was able
to kind of deal
160
00:07:57,766 --> 00:07:59,563
with the situation
intelligently
161
00:07:59,845 --> 00:08:01,873
because there was no
automatic, you might say,
162
00:08:01,875 --> 00:08:03,248
transition to something else.
163
00:08:03,250 --> 00:08:05,069
It was just, you know,
"We don't need you anymore.
164
00:08:05,071 --> 00:08:06,577
We have these more
efficient machines,
165
00:08:06,579 --> 00:08:09,533
and so now you just have to find
work somewhere else."
166
00:08:09,535 --> 00:08:12,053
Automation clearly has been
happening for a long time,
167
00:08:12,055 --> 00:08:14,738
and it has,
you know, automated
168
00:08:14,740 --> 00:08:17,269
a lot of very laborious work
that we don't want to do,
169
00:08:17,271 --> 00:08:19,533
and that's gonna continue
to be the case in the future,
170
00:08:19,535 --> 00:08:22,493
but I do think that this time
is genuinely different.
171
00:08:22,552 --> 00:08:24,694
If we look at what's
happened historically,
172
00:08:24,696 --> 00:08:26,381
what we've seen
is that automation
173
00:08:26,383 --> 00:08:28,762
has primarily been
a mechanical phenomenon,
174
00:08:28,764 --> 00:08:30,365
and the classic
example of that
175
00:08:30,367 --> 00:08:31,780
is, of course, agriculture.
176
00:08:31,782 --> 00:08:33,165
I'm a farmer.
177
00:08:33,233 --> 00:08:35,280
Here's what mechanical
engineering has done
178
00:08:35,282 --> 00:08:37,246
for all of us who
work on the farms
179
00:08:37,248 --> 00:08:38,686
and for you, too.
180
00:08:38,688 --> 00:08:40,217
It used to be,
in the United States
181
00:08:40,219 --> 00:08:41,518
and in most
advanced countries,
182
00:08:41,520 --> 00:08:43,251
that most people
worked on farms.
183
00:08:43,508 --> 00:08:45,248
Now, almost no one
works on a farm.
184
00:08:45,250 --> 00:08:46,824
It's less than two percent.
185
00:08:46,826 --> 00:08:49,316
And, of course, as a result
of that, we're better off.
186
00:08:49,318 --> 00:08:52,475
We have, you know,
more comfortable jobs,
187
00:08:52,477 --> 00:08:53,710
food is cheaper,
188
00:08:53,712 --> 00:08:55,533
we have a much more
advanced society.
189
00:08:55,535 --> 00:08:57,788
The question is, "Can that
continue indefinitely?"
190
00:08:57,790 --> 00:08:58,943
And what we're
seeing this time
191
00:08:58,945 --> 00:09:01,123
is that things are
really quite different.
192
00:09:01,125 --> 00:09:02,493
If this keeps up,
193
00:09:02,495 --> 00:09:05,859
it won't be long before
machines will do everything.
194
00:09:05,861 --> 00:09:07,550
Nobody will have work.
195
00:09:07,552 --> 00:09:11,446
So as we move deeper
into the automated future,
196
00:09:11,541 --> 00:09:14,926
we will see different
stages take form.
197
00:09:14,928 --> 00:09:17,038
The first stage
that we're entering
198
00:09:17,040 --> 00:09:19,623
is the stage where
automated robots are working
199
00:09:19,625 --> 00:09:21,809
side by side with
people in factories.
200
00:09:21,811 --> 00:09:23,893
Some of those jobs
are slowly going away,
201
00:09:23,895 --> 00:09:27,178
but in the near future,
within two to three years,
202
00:09:27,180 --> 00:09:29,370
you're going to see
a huge percentage
203
00:09:29,372 --> 00:09:31,169
of those factory jobs
be replaced
204
00:09:31,171 --> 00:09:34,273
with automated systems
and automated robots.
205
00:09:34,275 --> 00:09:39,014
The next stage following
that is we could see
206
00:09:39,016 --> 00:09:42,178
up to a third
of jobs in America
207
00:09:42,180 --> 00:09:45,540
be replaced by 2025
208
00:09:45,631 --> 00:09:48,071
by robots
or automated systems.
209
00:09:48,074 --> 00:09:49,670
That's a huge
percentage of people
210
00:09:49,672 --> 00:09:51,187
that could be unemployed
211
00:09:51,189 --> 00:09:53,428
because of this
automated tsunami
212
00:09:53,430 --> 00:09:54,503
that's coming, basically.
213
00:09:54,505 --> 00:09:57,561
We have a colleague
here called Carl Frey
214
00:09:57,563 --> 00:10:01,936
who has put together
a list of jobs
215
00:10:01,938 --> 00:10:03,217
by their vulnerability
216
00:10:03,219 --> 00:10:06,040
to getting replaced
by automation,
217
00:10:06,185 --> 00:10:09,311
and the least vulnerable are
things like choreographers,
218
00:10:09,313 --> 00:10:12,282
managers, social workers.
219
00:10:12,564 --> 00:10:15,038
People who have people skills
220
00:10:15,040 --> 00:10:18,618
and who have creativity.
221
00:10:19,575 --> 00:10:26,173
[music]
222
00:10:29,136 --> 00:10:31,013
One area that I look
a lot at is fast food.
223
00:10:31,015 --> 00:10:33,311
I mean, the fast food industry
is tremendously important
224
00:10:33,313 --> 00:10:34,527
in the American economy.
225
00:10:34,529 --> 00:10:37,459
If you look at the years
since the recovery
226
00:10:37,461 --> 00:10:38,860
from the Great Recession,
227
00:10:39,138 --> 00:10:42,092
the majority of jobs,
somewhere around 60 percent,
228
00:10:42,094 --> 00:10:44,693
have been low-wage
service sector jobs.
229
00:10:44,695 --> 00:10:46,954
A lot of those have been
in areas like fast food,
230
00:10:46,956 --> 00:10:49,815
and yet, to me,
it seems almost inevitable
231
00:10:49,817 --> 00:10:52,556
that, ultimately,
fast food is gonna automate.
232
00:10:52,558 --> 00:10:55,152
There's a company right
here in San Francisco
233
00:10:55,154 --> 00:10:58,712
called "Momentum Machines"
which is actually working on
234
00:10:59,495 --> 00:11:01,545
a machine to automate
hamburger production,
235
00:11:01,547 --> 00:11:02,850
and it can crank out
236
00:11:02,852 --> 00:11:07,238
about 400 gourmet
hamburgers per hour,
237
00:11:07,240 --> 00:11:11,642
and they ultimately expect
to sort of roll that out
238
00:11:11,644 --> 00:11:14,858
not just in fast food
establishments,
239
00:11:14,860 --> 00:11:16,447
perhaps in convenience stores
240
00:11:16,449 --> 00:11:17,772
and maybe even
vending machines.
241
00:11:17,774 --> 00:11:19,399
It could be
all over the place.
242
00:11:20,295 --> 00:11:23,962
[music]
243
00:11:24,271 --> 00:11:27,727
I can see manufacturing now
becoming completely automated.
244
00:11:27,729 --> 00:11:29,561
I can see, you know,
hundreds of millions
245
00:11:29,563 --> 00:11:31,093
of workers being
put out of jobs.
246
00:11:31,095 --> 00:11:32,855
It's almost certain
that's gonna happen.
247
00:11:32,924 --> 00:11:35,084
So you have lots of companies
right now that are automating
248
00:11:35,086 --> 00:11:37,163
their factories
and their warehouses.
249
00:11:37,165 --> 00:11:38,554
Amazon is a great example.
250
00:11:38,556 --> 00:11:41,202
They're using robots
to automate their systems.
251
00:11:41,204 --> 00:11:43,295
The robots actually
grab the products
252
00:11:43,297 --> 00:11:44,569
and bring the products
to the people
253
00:11:44,571 --> 00:11:46,876
who put those products
into the box.
254
00:11:46,982 --> 00:11:48,967
So there are still people
255
00:11:48,969 --> 00:11:50,973
within the factories
at Amazon,
256
00:11:50,975 --> 00:11:53,743
but in the near future,
those jobs may go away as well.
257
00:11:54,084 --> 00:11:56,733
There is a company here
in Silicon Valley
258
00:11:56,735 --> 00:11:58,394
called "Industrial Perception,"
259
00:11:58,396 --> 00:12:01,373
and they built a robot
that can approach
260
00:12:01,375 --> 00:12:04,803
a stack of boxes that are
sort of stacked haphazardly
261
00:12:04,805 --> 00:12:06,973
in some nonstandard way
262
00:12:06,975 --> 00:12:10,077
and visually, by looking at
that stack of boxes,
263
00:12:10,079 --> 00:12:11,725
figure out how
to move those boxes.
264
00:12:11,727 --> 00:12:13,594
And they built a machine
that ultimately
265
00:12:13,596 --> 00:12:17,219
will be able to move
perhaps one box every second,
266
00:12:17,240 --> 00:12:19,967
and that compares with
about three seconds
267
00:12:19,969 --> 00:12:22,923
for a human worker
who's very industrious.
268
00:12:22,925 --> 00:12:24,998
Okay, and this machine,
you can imagine,
269
00:12:25,000 --> 00:12:26,170
will work continuously.
270
00:12:26,172 --> 00:12:27,533
It's never gonna get injured,
271
00:12:27,535 --> 00:12:30,548
never file a workers'
compensation claim,
272
00:12:30,755 --> 00:12:33,133
and, yet, it's moving
into an area
273
00:12:33,135 --> 00:12:35,373
that, up until now, at least,
we would have said
274
00:12:35,375 --> 00:12:38,467
is really something that
is exclusively
275
00:12:38,469 --> 00:12:40,373
the province
of the human worker.
276
00:12:40,375 --> 00:12:44,851
I mean, it's this ability
to look at something
277
00:12:44,853 --> 00:12:47,121
and then based on
what you see,
278
00:12:47,123 --> 00:12:48,584
manipulate your environment.
279
00:12:48,586 --> 00:12:50,342
It's sort of the confluence
280
00:12:50,344 --> 00:12:53,376
of visual perception
and dexterity.
281
00:12:53,811 --> 00:12:56,571
[music]
282
00:12:56,615 --> 00:12:58,928
We'll see self-driving
cars on the road
283
00:12:58,930 --> 00:13:00,493
within 10 or 15 years.
284
00:13:00,638 --> 00:13:02,202
Fifteen years from now,
we'll be debating
285
00:13:02,204 --> 00:13:04,761
whether we should even allow
human beings to be
286
00:13:04,763 --> 00:13:06,472
on the road at all.
287
00:13:06,474 --> 00:13:08,773
Tesla says that by next year,
288
00:13:08,775 --> 00:13:12,860
you know, their cars
will be 90 percent automated.
289
00:13:13,154 --> 00:13:15,280
Which means that
the jobs of taxi drivers,
290
00:13:15,282 --> 00:13:17,242
truck drivers, go away.
291
00:13:17,412 --> 00:13:20,253
Suddenly, we don't need
to own cars anymore.
292
00:13:20,255 --> 00:13:24,274
Humanity isn't ready for
a basic change such as that.
293
00:13:26,037 --> 00:13:27,850
Call center jobs,
voice recognition
294
00:13:27,852 --> 00:13:29,902
is pretty sophisticated
these days.
295
00:13:29,904 --> 00:13:34,123
And you can imagine
replacing many kinds of,
296
00:13:34,125 --> 00:13:35,813
you know,
helplines and things.
297
00:13:35,815 --> 00:13:38,264
There's a company called
"IPsoft" that has created
298
00:13:38,266 --> 00:13:41,533
an intelligent
software system,
299
00:13:41,535 --> 00:13:43,975
an automated system,
called "Amelia."
300
00:13:44,068 --> 00:13:46,813
She can not only understand
what you're saying to her,
301
00:13:46,815 --> 00:13:49,634
she understands the context
of what you're saying,
302
00:13:49,636 --> 00:13:51,533
and she can learn
from her mistakes.
303
00:13:51,535 --> 00:13:53,973
This is a huge deal because
what we're going to see
304
00:13:53,975 --> 00:13:56,213
is all of the customer
service agent jobs,
305
00:13:56,215 --> 00:13:58,634
if she is successful,
if this software program,
306
00:13:58,636 --> 00:14:01,327
this automated software
program is successful,
307
00:14:01,329 --> 00:14:04,339
we could see all of
those jobs go away.
308
00:14:04,341 --> 00:14:06,300
These things tend
to go to marginal cost,
309
00:14:06,302 --> 00:14:10,077
and marginal cost is copying
software, which is nothing,
310
00:14:10,079 --> 00:14:12,030
and running it on a computer
311
00:14:12,032 --> 00:14:13,641
which will probably be
very cheap.
312
00:14:14,201 --> 00:14:17,081
[music]
313
00:14:17,412 --> 00:14:21,311
Human doctors will be,
in some respect, pushed aside
314
00:14:21,313 --> 00:14:22,944
because machines
can do a better job
315
00:14:22,946 --> 00:14:24,474
of diagnosis than they can.
316
00:14:24,476 --> 00:14:26,381
Now will they have the empathy
that current doctors do?
317
00:14:26,383 --> 00:14:28,292
I don't know, but at least
they'll have the knowledge
318
00:14:28,294 --> 00:14:30,013
that our doctors do,
they'll be more advanced,
319
00:14:30,015 --> 00:14:31,493
so I can see
disruption in healthcare.
320
00:14:31,495 --> 00:14:33,994
The one that is likely to be
the biggest growth area,
321
00:14:33,996 --> 00:14:37,453
from an economic standpoint,
is the android companions
322
00:14:37,455 --> 00:14:40,655
to help elderly people,
okay, because there's...
323
00:14:40,657 --> 00:14:43,108
you know, given the rise
in elderly people
324
00:14:43,110 --> 00:14:46,133
over the next 20,
30 years, that it's...
325
00:14:46,135 --> 00:14:48,100
and it's unlikely there are
gonna be enough people
326
00:14:48,102 --> 00:14:49,785
going into
the nursing profession
327
00:14:49,787 --> 00:14:52,686
to actually serve them,
especially if we're thinking
328
00:14:52,688 --> 00:14:55,102
in terms of home-based care.
329
00:14:55,677 --> 00:14:58,452
The robot surgeon, I think,
is something that will happen
330
00:14:58,454 --> 00:15:01,053
in the not-too-distant future
331
00:15:01,055 --> 00:15:05,733
because a lot of that is
to do with manual dexterity
332
00:15:05,735 --> 00:15:10,053
and having the expertise
to recognize...
333
00:15:10,055 --> 00:15:13,933
to understand what you're
manipulating as a surgeon.
334
00:15:13,935 --> 00:15:17,253
Terrific amount of expertise
for a human to accumulate,
335
00:15:17,255 --> 00:15:19,173
but I can imagine that
we would be able to build
336
00:15:19,175 --> 00:15:21,973
something that is
a specialized robot surgeon
337
00:15:21,975 --> 00:15:24,092
that can carry out
particular operations
338
00:15:24,094 --> 00:15:26,173
such as a prostate operation.
339
00:15:26,175 --> 00:15:28,154
That's one that people
are working on right now,
340
00:15:28,156 --> 00:15:29,748
and I think
they're nearly at the point
341
00:15:29,750 --> 00:15:31,053
of being able to produce
342
00:15:31,055 --> 00:15:32,847
a reliable robot surgeon
that can do that.
343
00:15:32,849 --> 00:15:34,670
You might not want to submit
yourself to this thing,
344
00:15:34,672 --> 00:15:36,832
you might think, but in fact,
I think we'll be able to make
345
00:15:36,834 --> 00:15:40,134
a very reliable robot surgeon
to do that sort of thing.
346
00:15:41,095 --> 00:15:46,344
[music]
347
00:15:47,075 --> 00:15:48,754
I can see
disruption in finance
348
00:15:48,756 --> 00:15:50,769
because they're moving
to digital currencies.
349
00:15:50,771 --> 00:15:53,813
And we're now
moving to crowdfunding
350
00:15:53,815 --> 00:15:57,074
and crowdbanking
and all these other advances.
351
00:15:57,076 --> 00:15:59,334
One of my favorites
is investment bankers.
352
00:15:59,336 --> 00:16:03,213
Artificial intelligence
already does more
353
00:16:03,215 --> 00:16:06,074
stock market trades
today than any human being.
354
00:16:06,076 --> 00:16:09,773
Lots of decisions like
decisions about mortgages
355
00:16:09,775 --> 00:16:12,293
and insurance, already
those things have been,
356
00:16:12,295 --> 00:16:14,674
you know, largely
taken over by programs,
357
00:16:14,676 --> 00:16:17,634
and I think that kind of trend
is only gonna continue.
358
00:16:18,575 --> 00:16:23,985
[music]
359
00:16:29,146 --> 00:16:31,524
Every time there's
a technological change,
360
00:16:31,526 --> 00:16:33,741
it will, unfortunately, put
a lot of people out of work.
361
00:16:33,743 --> 00:16:35,234
It happened with
the cotton gin.
362
00:16:35,236 --> 00:16:38,883
It's happened with every
single technological change.
363
00:16:39,021 --> 00:16:42,553
So, sure, technology
destroys jobs,
364
00:16:42,555 --> 00:16:43,973
but it creates new ones.
365
00:16:43,975 --> 00:16:46,889
Moving from the age of work
that we're in now
366
00:16:46,891 --> 00:16:50,688
into the abundant,
ubiquitous automation age,
367
00:16:51,029 --> 00:16:53,459
that bridge that
we have to cross
368
00:16:53,461 --> 00:16:56,021
is gonna be a very
interesting time period.
369
00:16:56,087 --> 00:16:58,410
I think in the very beginning
of that time period,
370
00:16:58,412 --> 00:17:02,308
you're going to see automation
start to replace jobs,
371
00:17:02,310 --> 00:17:05,827
but those jobs will transfer
into other forms of work.
372
00:17:05,829 --> 00:17:08,769
So, for example, instead
of working in a factory,
373
00:17:08,771 --> 00:17:11,952
you will learn to code
and you will code the robots
374
00:17:11,954 --> 00:17:13,413
that are working
in the factory.
375
00:17:13,415 --> 00:17:16,533
When I was a young man
and I went for careers advice,
376
00:17:16,535 --> 00:17:18,555
I don't know what they
would have made of me
377
00:17:18,557 --> 00:17:22,362
asking for a job
as a webmaster.
378
00:17:22,415 --> 00:17:25,318
It didn't exist, there
wasn't a web at that time.
379
00:17:25,320 --> 00:17:28,485
And, right now, we have
over 200,000 vacancies
380
00:17:28,487 --> 00:17:30,883
for people who can
analyze big data.
381
00:17:30,885 --> 00:17:32,946
And we really do need people
382
00:17:32,948 --> 00:17:34,773
and mechanisms
for analyzing it
383
00:17:34,775 --> 00:17:38,456
and getting the most
information from that data,
384
00:17:38,578 --> 00:17:41,271
and that problem is only
gonna increase in the future.
385
00:17:41,273 --> 00:17:43,860
And I do think that there's
gonna be a lot of employment
386
00:17:43,862 --> 00:17:45,382
moving in that direction.
387
00:17:45,384 --> 00:17:49,264
[music]
388
00:17:50,210 --> 00:17:53,173
The history of our country
proves that new inventions
389
00:17:53,175 --> 00:17:56,933
create thousands of jobs
for every one they displace.
390
00:17:56,935 --> 00:18:01,063
So it wasn't long before your
grandfather had a better job
391
00:18:01,065 --> 00:18:03,232
at more pay for less work.
392
00:18:03,234 --> 00:18:05,093
We're always offered
this solution
393
00:18:05,095 --> 00:18:07,336
of still more education,
still more training.
394
00:18:07,338 --> 00:18:08,688
If people lose
their routine job,
395
00:18:08,690 --> 00:18:10,360
then let's send them
back to school.
396
00:18:10,362 --> 00:18:11,660
They'll pick up
some new skills,
397
00:18:11,662 --> 00:18:13,589
they'll learn something new,
and then they'll be able
398
00:18:13,591 --> 00:18:16,022
to move into some
more rewarding career.
399
00:18:16,024 --> 00:18:18,156
That's not gonna operate
so well in the future
400
00:18:18,158 --> 00:18:19,648
where the machines are coming
401
00:18:19,650 --> 00:18:21,420
for those skilled
jobs as well.
402
00:18:21,422 --> 00:18:22,985
The fact is that machines
are really good at
403
00:18:22,987 --> 00:18:24,625
picking up skills
and doing all kinds
404
00:18:24,627 --> 00:18:26,651
of extraordinarily
complex things,
405
00:18:26,656 --> 00:18:28,110
so those jobs
aren't necessarily
406
00:18:28,112 --> 00:18:29,653
gonna be there either.
407
00:18:29,655 --> 00:18:32,466
And a second insight,
I think, is that historically,
408
00:18:32,468 --> 00:18:35,391
it's always been the case that
the vast majority of people
409
00:18:35,393 --> 00:18:37,193
have always done routine work.
410
00:18:37,195 --> 00:18:40,013
So even if people can
make that transition,
411
00:18:40,015 --> 00:18:41,613
if they can succeed
in going back to school
412
00:18:41,615 --> 00:18:43,938
and learning something new,
in percentage terms,
413
00:18:43,940 --> 00:18:46,013
those jobs don't constitute
414
00:18:46,015 --> 00:18:47,554
that much of the total
employment out there.
415
00:18:47,556 --> 00:18:49,716
I mean, most people are doing
these more routine things.
416
00:18:49,718 --> 00:18:51,413
So, you know, we're up
against a real problem
417
00:18:51,415 --> 00:18:54,262
that's probably gonna
require a political solution.
418
00:18:54,264 --> 00:18:59,109
It's probably going to
require direct redistribution.
419
00:18:59,111 --> 00:19:00,281
That's my take on it,
420
00:19:00,283 --> 00:19:03,914
and that's a staggering
political challenge,
421
00:19:03,916 --> 00:19:05,455
especially in
the United States.
422
00:19:05,530 --> 00:19:07,773
This would be fine
if we had generations
423
00:19:07,775 --> 00:19:11,304
to adapt to the change
so that the next generation
424
00:19:11,306 --> 00:19:12,898
could develop
a different lifestyle,
425
00:19:12,900 --> 00:19:14,180
different value system.
426
00:19:14,233 --> 00:19:16,218
The problem is that
all of this is happening
427
00:19:16,220 --> 00:19:17,211
within the same generation.
428
00:19:17,213 --> 00:19:19,013
Within a period of 15 years,
429
00:19:19,015 --> 00:19:22,549
we're gonna start wiping out
most of the jobs that we know.
430
00:19:22,562 --> 00:19:24,744
That's really what worries me.
431
00:19:25,175 --> 00:19:28,533
A term commonly used when
describing the trajectory
432
00:19:28,535 --> 00:19:31,797
of technological progress
and where it's leading us
433
00:19:31,799 --> 00:19:34,676
is the "technological
singularity."
434
00:19:34,678 --> 00:19:36,551
The term is
borrowed from physics
435
00:19:36,553 --> 00:19:40,380
to describe an event horizon
or a moment in spacetime
436
00:19:40,382 --> 00:19:42,262
that you cannot see beyond.
437
00:19:42,335 --> 00:19:44,781
We are currently
in the transistor era
438
00:19:44,783 --> 00:19:46,663
of information technology.
439
00:19:46,763 --> 00:19:51,438
In 1965, co-founder
of Intel, Gordon Moore,
440
00:19:51,440 --> 00:19:53,984
made the observation
that the processing power
441
00:19:53,986 --> 00:19:57,174
of computers doubles
every 18 months.
442
00:19:57,323 --> 00:19:59,781
The prediction that
this trend will continue
443
00:19:59,783 --> 00:20:01,721
is known as Moore's Law.
444
00:20:02,139 --> 00:20:03,406
When Intel created
445
00:20:03,408 --> 00:20:07,150
their first central
processing unit in 1971,
446
00:20:07,311 --> 00:20:10,295
it had 2,300 transistors
447
00:20:10,297 --> 00:20:14,908
and had a processing speed
of 740 kilohertz.
448
00:20:14,975 --> 00:20:19,523
Today, a typical CPU has
over a billion transistors
449
00:20:19,525 --> 00:20:22,310
with an average speed
of two gigahertz.
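As a rough check of those figures, a short Python sketch (assuming the 18-month doubling period stated above and the quoted transistor counts):

```python
import math

# Doublings needed to grow from the 1971 chip's 2,300 transistors
# to a billion-transistor CPU, at one doubling every 18 months.
start, today = 2_300, 1_000_000_000

doublings = math.log2(today / start)  # ~18.7 doublings
years = doublings * 1.5               # 18 months per doubling -> ~28 years

print(f"{doublings:.1f} doublings, ~{years:.0f} years")
```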
450
00:20:22,312 --> 00:20:25,613
However, many predict
that by 2020,
451
00:20:25,615 --> 00:20:28,859
the miniaturization of
transistors and silicon chips
452
00:20:28,861 --> 00:20:30,221
will reach its limit,
453
00:20:30,255 --> 00:20:34,807
and Moore's Law will fizzle
out into a post-silicon era.
454
00:20:35,288 --> 00:20:37,328
Another way of
describing the term
455
00:20:37,330 --> 00:20:39,801
"technological singularity"
456
00:20:39,803 --> 00:20:42,373
is a time when
artificial intelligence
457
00:20:42,375 --> 00:20:45,603
surpasses human
intellectual capacity.
458
00:20:45,873 --> 00:20:47,539
But does this mean
that a computer
459
00:20:47,541 --> 00:20:49,203
can produce a new idea
460
00:20:49,205 --> 00:20:52,245
or make an original
contribution to knowledge?
461
00:20:52,255 --> 00:20:55,765
Artificial intelligence, AI,
is a longstanding project
462
00:20:55,767 --> 00:20:58,507
which has to do
with basically trying
463
00:20:58,509 --> 00:21:00,711
to use machines as a way
of understanding
464
00:21:00,713 --> 00:21:02,314
the nature of intelligence
465
00:21:02,495 --> 00:21:04,625
and, originally,
the idea was, in some sense,
466
00:21:04,627 --> 00:21:06,853
to manufacture
within machines
467
00:21:06,855 --> 00:21:09,343
something that could
simulate human intelligence.
468
00:21:09,345 --> 00:21:11,514
But I think now,
as the years have gone on,
469
00:21:11,516 --> 00:21:13,285
we now think in terms
of intelligence
470
00:21:13,287 --> 00:21:14,733
in a much more abstract way,
471
00:21:14,735 --> 00:21:18,228
so the ability to engage
in massive computations,
472
00:21:18,230 --> 00:21:20,013
right, where you can
end up making
473
00:21:20,015 --> 00:21:22,410
quite intelligent decisions
much more quickly
474
00:21:22,412 --> 00:21:23,730
than a human being can.
475
00:21:23,732 --> 00:21:25,707
So in this respect,
artificial intelligence
476
00:21:25,709 --> 00:21:27,533
in a sense is,
you might say,
477
00:21:27,535 --> 00:21:29,598
trying to go to the next
level of intelligence
478
00:21:29,600 --> 00:21:31,162
beyond the human.
479
00:21:32,589 --> 00:21:34,874
A proper AI could substitute
480
00:21:34,876 --> 00:21:39,394
for practically any human job
at some level of skill,
481
00:21:39,396 --> 00:21:42,181
so it's... it would be
482
00:21:42,183 --> 00:21:44,093
a completely
different situation.
483
00:21:44,095 --> 00:21:47,274
You can imagine any kind of
job could, in theory,
484
00:21:47,276 --> 00:21:51,576
be replaced by technology,
if you build human-level AI.
485
00:21:51,935 --> 00:21:55,553
Now that, of course, may
or may not be a good thing.
486
00:21:55,913 --> 00:21:58,590
You'd be able to,
for example, make robots
487
00:21:58,592 --> 00:22:00,582
that could do
all kinds of jobs
488
00:22:00,584 --> 00:22:02,637
that humans don't
necessarily want to do.
489
00:22:02,639 --> 00:22:05,473
There are the so-called
three D's jobs
490
00:22:05,475 --> 00:22:08,110
that are dirty,
dangerous, or dull,
491
00:22:08,112 --> 00:22:10,012
which humans might
not want to do,
492
00:22:10,014 --> 00:22:12,762
and yet, where you
might actually want
493
00:22:12,764 --> 00:22:14,269
a human level of intelligence
494
00:22:14,271 --> 00:22:15,973
to do the job well
or do the job properly.
495
00:22:15,975 --> 00:22:18,043
These are things
which are achievable.
496
00:22:18,045 --> 00:22:19,519
- Yeah.
- This isn't something...
497
00:22:19,521 --> 00:22:20,973
I don't think
it's science fiction.
498
00:22:20,975 --> 00:22:22,777
I think this is
entirely feasible
499
00:22:22,779 --> 00:22:25,160
that we could
build a computer
500
00:22:25,162 --> 00:22:27,153
which is vastly superhuman
501
00:22:27,155 --> 00:22:29,713
which is conscious,
which has emotions,
502
00:22:29,715 --> 00:22:32,399
which is, essentially,
a new species
503
00:22:32,401 --> 00:22:36,071
of self-aware intelligence
and conscious in every way
504
00:22:36,073 --> 00:22:38,393
and has got emotions
the same as you and I do.
505
00:22:38,395 --> 00:22:40,337
I don't see any
fundamental limits
506
00:22:40,339 --> 00:22:42,720
on what we can do,
and we already know enough
507
00:22:42,722 --> 00:22:45,884
about basic science
to start doing that now.
508
00:22:45,886 --> 00:22:48,715
So some people are
concerned about,
509
00:22:48,717 --> 00:22:52,681
you know, possible
risks of building AI
510
00:22:52,683 --> 00:22:55,209
and building something
that is very, very powerful
511
00:22:55,815 --> 00:22:58,017
where there are
unintended consequences
512
00:22:58,019 --> 00:23:00,813
of the thing that you've
built and where it might
513
00:23:00,815 --> 00:23:02,306
do things that
you can't predict
514
00:23:02,308 --> 00:23:04,360
that might be
extremely dangerous.
515
00:23:04,362 --> 00:23:06,233
So a so-called
"existential risk,"
516
00:23:06,235 --> 00:23:07,446
as some people call it.
517
00:23:07,448 --> 00:23:10,235
We are going to hand off
to our machines
518
00:23:10,323 --> 00:23:12,407
all the
multidimensional problems
519
00:23:12,409 --> 00:23:14,571
that we are incapable
of coping with.
520
00:23:14,573 --> 00:23:17,753
You and I can take
a problem with two or three
521
00:23:17,755 --> 00:23:20,665
or four or even seven inputs.
522
00:23:20,667 --> 00:23:21,813
But 300?
523
00:23:21,815 --> 00:23:24,073
A thousand, a million inputs?
524
00:23:24,075 --> 00:23:25,524
We're dead in the water.
525
00:23:25,526 --> 00:23:27,454
The machines can
cope with that.
526
00:23:27,456 --> 00:23:29,235
The advantage
that computers have
527
00:23:29,237 --> 00:23:31,987
is that they communicate
at gigabit speeds.
528
00:23:31,989 --> 00:23:33,728
They all network together.
529
00:23:33,730 --> 00:23:35,384
We talk in slow motion.
530
00:23:35,386 --> 00:23:39,235
So computers will achieve
this level of awareness
531
00:23:39,279 --> 00:23:41,173
probably in the next
20, 30, 40 years.
532
00:23:41,175 --> 00:23:44,475
It's not a question of whether
it's good or evil.
533
00:23:44,651 --> 00:23:46,082
It's we're
probably good enough
534
00:23:46,084 --> 00:23:48,315
to not program an evil AI.
535
00:23:48,386 --> 00:23:51,319
It's whether it's
lethally indifferent.
536
00:23:52,097 --> 00:23:53,798
If it has certain things
537
00:23:53,800 --> 00:23:57,168
that it's tasked
with accomplishing
538
00:23:57,170 --> 00:23:59,024
and humans are in the way.
539
00:23:59,026 --> 00:24:02,103
So there's this concern
that once we reach that moment
540
00:24:02,105 --> 00:24:04,233
where the computers
outperform us
541
00:24:04,235 --> 00:24:06,462
in ways that are
quite meaningful,
542
00:24:06,464 --> 00:24:10,142
that then they will somehow
be motivated to dispose of us
543
00:24:10,144 --> 00:24:12,556
or take over us
or something of this kind.
544
00:24:12,558 --> 00:24:14,454
I don't really believe that
545
00:24:14,456 --> 00:24:16,720
because these kinds
of developments,
546
00:24:16,722 --> 00:24:18,962
which probably are a little
farther off in the future
547
00:24:18,964 --> 00:24:20,833
than some of their
enthusiasts think,
548
00:24:20,835 --> 00:24:23,134
there will be time
for us to adapt,
549
00:24:23,136 --> 00:24:26,103
to come to terms with it,
to organize social systems
550
00:24:26,105 --> 00:24:28,056
that will enable us
to deal adequately
551
00:24:28,058 --> 00:24:30,087
with these new forms
of intelligence.
552
00:24:30,089 --> 00:24:31,813
So I don't think this is
just gonna be something
553
00:24:31,815 --> 00:24:34,055
that's gonna happen
as a miracle tomorrow
554
00:24:34,057 --> 00:24:35,704
and then we'll be
taken by surprise.
555
00:24:35,706 --> 00:24:37,751
But I do think
the key thing here is
556
00:24:37,753 --> 00:24:40,893
that we need to treat
these futuristic things
557
00:24:40,895 --> 00:24:43,653
as not being as far away
as people say they are.
558
00:24:43,655 --> 00:24:45,597
Just because they're
not likely to happen
559
00:24:45,599 --> 00:24:47,055
in 15 years, let's say,
560
00:24:47,057 --> 00:24:49,093
it doesn't mean they
won't happen in 50 years.
561
00:24:49,095 --> 00:24:52,973
It's gonna be of kind of
historical dimensions,
562
00:24:52,975 --> 00:24:54,893
and it's very hard
to predict, I think,
563
00:24:54,895 --> 00:24:59,386
whether it's gonna take us
in a utopian direction
564
00:24:59,388 --> 00:25:00,973
or in a dystopian direction
565
00:25:00,975 --> 00:25:03,621
or more likely
something in between,
566
00:25:03,623 --> 00:25:04,882
but just very different.
567
00:25:04,884 --> 00:25:06,244
Very hard to predict.
568
00:25:06,487 --> 00:25:09,074
You see, it's our job
to take raw materials,
569
00:25:09,076 --> 00:25:10,815
adapt them to useful forms,
570
00:25:11,018 --> 00:25:14,157
take natural forces,
harness them to do man's work.
571
00:25:14,159 --> 00:25:16,333
The automated systems
of the future
572
00:25:16,335 --> 00:25:19,533
are a natural process
of human innovation.
573
00:25:19,535 --> 00:25:23,740
It all comes back to the idea
of doing more with less.
574
00:25:24,167 --> 00:25:25,714
This process of innovation
575
00:25:25,716 --> 00:25:29,194
is driven not by
necessity, but by desire,
576
00:25:29,196 --> 00:25:32,076
or to simply fill
a gap in the market.
577
00:25:32,674 --> 00:25:35,093
Farm owners didn't
really need to replace
578
00:25:35,095 --> 00:25:37,893
their workers with machines,
but they did so
579
00:25:37,895 --> 00:25:40,558
because they could
foresee the benefits.
580
00:25:40,560 --> 00:25:42,954
It's a natural
cycle of business.
581
00:25:42,956 --> 00:25:46,920
Doing more with less
leads to greater prosperity.
582
00:25:48,096 --> 00:25:49,824
The hope is that
we can adapt to this
583
00:25:49,826 --> 00:25:51,199
politically and socially.
584
00:25:51,201 --> 00:25:52,089
In order to do that,
585
00:25:52,091 --> 00:25:53,832
we have to begin
a conversation now.
586
00:25:53,834 --> 00:25:55,748
Remember that
we're up against
587
00:25:55,940 --> 00:25:58,333
an exponential
arc of progress.
588
00:25:58,335 --> 00:26:00,573
Things are gonna keep
moving faster and faster,
589
00:26:00,575 --> 00:26:02,693
so we need to start
talking about this now
590
00:26:02,695 --> 00:26:04,535
and we need to sort of
get the word out there
591
00:26:04,537 --> 00:26:05,794
so that people will realize
592
00:26:05,796 --> 00:26:08,102
that this problem
is coming at us,
593
00:26:08,104 --> 00:26:10,034
so that we can
begin to discuss
594
00:26:10,036 --> 00:26:11,855
viable political
solutions to this
595
00:26:11,857 --> 00:26:14,436
because, again,
I think it will require,
596
00:26:14,438 --> 00:26:15,938
ultimately,
a political choice.
597
00:26:15,940 --> 00:26:19,787
It's not something that
is gonna sort itself out
598
00:26:19,854 --> 00:26:22,894
by itself as a result
of the normal functioning
599
00:26:22,896 --> 00:26:24,453
of the market economy.
600
00:26:24,455 --> 00:26:26,457
It's something
that will require
601
00:26:26,459 --> 00:26:28,093
some sort of an intervention
602
00:26:28,095 --> 00:26:29,628
and, you know,
part of the problem is
603
00:26:29,630 --> 00:26:30,774
that in the United States,
604
00:26:30,776 --> 00:26:34,287
roughly half of the population
is very conservative
605
00:26:34,346 --> 00:26:36,738
and they really don't
believe in this idea
606
00:26:36,740 --> 00:26:39,314
of intervention
in the market.
607
00:26:39,316 --> 00:26:41,333
It's gonna be
a tough transition,
608
00:26:41,335 --> 00:26:43,464
and those that find
themselves out of jobs
609
00:26:43,466 --> 00:26:45,589
because a robot has taken it
610
00:26:45,591 --> 00:26:47,394
are gonna be
pretty pissed off.
611
00:26:47,396 --> 00:26:51,095
The effect of automation
on jobs and livelihood
612
00:26:51,253 --> 00:26:54,344
is going to be behind this
like the original Luddites.
613
00:26:54,346 --> 00:26:56,587
It wasn't... it wasn't
that they were against
614
00:26:56,589 --> 00:26:59,474
technological developments
in some ethereal sense.
615
00:26:59,476 --> 00:27:02,356
It was that this was
taking their damn jobs.
616
00:27:02,448 --> 00:27:05,234
I absolutely think there could
be a Neo-Luddite movement
617
00:27:05,236 --> 00:27:07,714
against the future,
against technology
618
00:27:07,716 --> 00:27:09,027
because they're gonna say,
619
00:27:09,029 --> 00:27:10,293
"Well, hey,
you're taking our jobs,
620
00:27:10,295 --> 00:27:11,653
you're taking
our livelihoods away.
621
00:27:11,655 --> 00:27:13,413
You're taking
everything away from us."
622
00:27:13,415 --> 00:27:15,634
But I think that's when
it's gonna be important
623
00:27:15,636 --> 00:27:18,829
that leaders and government
step in and say,
624
00:27:18,831 --> 00:27:20,180
"It may seem that way,
625
00:27:20,182 --> 00:27:21,891
but life is going to get
better for everyone."
626
00:27:21,893 --> 00:27:24,457
We're gonna have more time
to do things that we want,
627
00:27:24,459 --> 00:27:26,152
more vacations,
more passions.
628
00:27:26,154 --> 00:27:27,462
This is the modern world.
629
00:27:27,464 --> 00:27:30,774
We can create the utopia
that we've always dreamt of.
630
00:27:30,776 --> 00:27:33,474
Why are we saying,
"My job's not safe,"
631
00:27:33,476 --> 00:27:36,853
or, "Automation's going
to steal my jobs"?
632
00:27:36,855 --> 00:27:38,573
These are the... these
are the phrases
633
00:27:38,575 --> 00:27:40,653
that keep getting
pushed out there.
634
00:27:40,655 --> 00:27:41,653
They're negative phrases,
635
00:27:41,655 --> 00:27:45,027
and instead, it seems
that we would look at this,
636
00:27:45,029 --> 00:27:46,837
especially if someone has
been working in a factory
637
00:27:46,839 --> 00:27:47,921
their whole life,
638
00:27:47,923 --> 00:27:49,969
that they would look at
that system and say,
639
00:27:49,971 --> 00:27:53,333
"Thank goodness that this is
starting to be automated."
640
00:27:53,335 --> 00:27:55,173
I don't want anyone
to have to crawl
641
00:27:55,175 --> 00:27:58,293
into a hole in the ground
and pull up coal.
642
00:27:58,295 --> 00:28:00,773
No human being should
have to go do that.
643
00:28:00,775 --> 00:28:03,133
If you make an awful
lot of computers
644
00:28:03,135 --> 00:28:05,493
and a lot of robots,
and the computers can make
645
00:28:05,495 --> 00:28:07,053
those robots
very sophisticated
646
00:28:07,055 --> 00:28:08,413
and do lots of
sophisticated jobs,
647
00:28:08,415 --> 00:28:11,893
you could eliminate most of
the high-value physical jobs
648
00:28:11,895 --> 00:28:14,653
and also most of the
high-value intellectual jobs.
649
00:28:14,655 --> 00:28:16,973
What you're left with,
then, are those jobs
650
00:28:16,975 --> 00:28:18,573
where you have to be
a human being,
651
00:28:18,575 --> 00:28:20,973
so I find it quite
paradoxical in some ways
652
00:28:20,975 --> 00:28:22,893
that the more advanced
the technology becomes,
653
00:28:22,895 --> 00:28:25,519
the more it forces us
to become humans.
654
00:28:25,521 --> 00:28:26,860
So in some ways,
it's very good.
655
00:28:26,862 --> 00:28:29,785
It forces us to explore:
what is humanity about?
656
00:28:29,787 --> 00:28:31,238
What are the
fundamentally important
657
00:28:31,240 --> 00:28:32,373
things about being a human?
658
00:28:32,375 --> 00:28:34,573
It's not being able to,
you know, flip a burger
659
00:28:34,575 --> 00:28:37,574
or, you know, carve
something intricately.
660
00:28:37,576 --> 00:28:39,154
A computer or a robot
661
00:28:39,156 --> 00:28:41,172
could do that far better
than a human being.
662
00:28:41,174 --> 00:28:43,477
One thing I've noticed, if
you talk to techno-optimists
663
00:28:43,479 --> 00:28:46,773
about the future of work
and how it's gonna unfold,
664
00:28:46,775 --> 00:28:49,234
very often they will
focus on this issue
665
00:28:49,236 --> 00:28:51,980
of how will we all be
fulfilled in the future?
666
00:28:51,982 --> 00:28:53,691
What will be
our purpose in life
667
00:28:53,693 --> 00:28:55,133
when we don't want to work?
668
00:28:55,255 --> 00:28:58,173
And, you know, you can
sort of posit this
669
00:28:58,175 --> 00:29:00,333
in terms of... there was
a guy named Maslow
670
00:29:00,335 --> 00:29:02,613
who came up with
a hierarchy of human needs,
671
00:29:02,615 --> 00:29:03,980
Maslow's pyramid.
672
00:29:03,982 --> 00:29:05,453
And at the base
of that pyramid
673
00:29:05,455 --> 00:29:08,707
are the foundational things
like food and shelter,
674
00:29:08,709 --> 00:29:10,413
and at the top of
that pyramid, of course,
675
00:29:10,415 --> 00:29:12,493
are all these intangible
things like, you know,
676
00:29:12,495 --> 00:29:13,964
a sense of purpose
in your life
677
00:29:13,966 --> 00:29:15,227
and fulfillment
and so forth.
678
00:29:15,229 --> 00:29:18,730
What you'll find among the most
techno-optimistic people
679
00:29:18,732 --> 00:29:20,314
is that they will
want to skip
680
00:29:20,316 --> 00:29:21,613
right over the base
of that pyramid
681
00:29:21,615 --> 00:29:24,173
and jump right to the top
and start talking about,
682
00:29:24,175 --> 00:29:25,853
"Oh, gosh, how are
we gonna," you know,
683
00:29:25,855 --> 00:29:27,474
"what's the meaning
of our life gonna be
684
00:29:27,476 --> 00:29:28,895
when we don't have to work?"
685
00:29:28,897 --> 00:29:32,392
But the reality is that
the base of that pyramid,
686
00:29:32,394 --> 00:29:34,769
food, shelter,
all the things that we need
687
00:29:34,771 --> 00:29:36,314
to have, you know,
a decent life,
688
00:29:36,316 --> 00:29:37,893
that's the elephant
in the room.
689
00:29:37,895 --> 00:29:40,709
That stuff costs real money.
690
00:29:40,776 --> 00:29:42,141
That stuff is gonna involve
691
00:29:42,143 --> 00:29:44,480
perhaps raising taxes
on a lot of the people
692
00:29:44,482 --> 00:29:45,954
that are doing
really well right now,
693
00:29:45,956 --> 00:29:47,335
and that's probably
part of the reason
694
00:29:47,337 --> 00:29:49,053
that they prefer
not to talk about it.
695
00:29:49,055 --> 00:29:51,464
So what do we do
with the 99 percent
696
00:29:51,466 --> 00:29:52,800
of the population
on this planet
697
00:29:52,802 --> 00:29:54,322
if they don't have jobs?
698
00:29:54,455 --> 00:29:57,128
The suggestion is
and the goal is
699
00:29:57,130 --> 00:29:58,733
to make this
an efficient system.
700
00:29:58,735 --> 00:30:01,352
You put automation
in the hands of everyone.
701
00:30:01,354 --> 00:30:03,324
In the near future,
we're going to see systems
702
00:30:03,326 --> 00:30:05,457
where we can
3D print our clothing,
703
00:30:05,459 --> 00:30:06,973
we can 3D print food.
704
00:30:06,975 --> 00:30:09,446
If you automate
these self-replicating
705
00:30:09,448 --> 00:30:12,550
industrial machines
to pull the raw materials
706
00:30:12,552 --> 00:30:14,386
and distribute
those raw materials
707
00:30:14,388 --> 00:30:16,582
to everyone who
has the capacity
708
00:30:16,584 --> 00:30:18,514
to 3D print their own house
709
00:30:18,516 --> 00:30:21,093
or 3D print
their own farm bot,
710
00:30:21,095 --> 00:30:25,282
you have literally
solved the equation
711
00:30:25,284 --> 00:30:27,693
of how do I automate my life
712
00:30:27,695 --> 00:30:30,125
and how do I automate
my basic necessities?
713
00:30:30,127 --> 00:30:32,013
If we had the political will
714
00:30:32,015 --> 00:30:33,733
to take all these
new technologies
715
00:30:33,735 --> 00:30:36,173
and the wealth
and abundance they create,
716
00:30:36,175 --> 00:30:39,013
and distribute these
across our society
717
00:30:39,015 --> 00:30:40,373
in the First World countries
718
00:30:40,375 --> 00:30:41,813
and also across
the whole world,
719
00:30:41,815 --> 00:30:43,853
then, of course, I mean,
the sky's the limit.
720
00:30:43,855 --> 00:30:45,563
We can solve all
kinds of problems.
721
00:30:45,565 --> 00:30:48,282
But we will have to have
the political will to do that,
722
00:30:48,284 --> 00:30:50,674
and I don't see a whole lot
of evidence for it right now.
723
00:30:50,676 --> 00:30:53,669
There really is enough
already for everybody,
724
00:30:53,932 --> 00:30:55,886
certainly to have
an adequate life,
725
00:30:55,888 --> 00:30:59,615
if not a life
of superabundance,
726
00:30:59,666 --> 00:31:02,636
so, you know, I don't think
that the introduction
727
00:31:02,638 --> 00:31:04,516
of more labor-saving devices
728
00:31:04,518 --> 00:31:07,941
or more is really gonna
make any difference in that.
729
00:31:07,943 --> 00:31:09,331
The reason there
are poor people
730
00:31:09,333 --> 00:31:10,912
is 'cause there's rich people.
731
00:31:11,248 --> 00:31:15,865
[music]
732
00:31:17,119 --> 00:31:19,519
You're simultaneously
making a lot of people
733
00:31:19,521 --> 00:31:21,314
almost completely useless
734
00:31:21,316 --> 00:31:23,956
while generating
a lot more wealth
735
00:31:23,958 --> 00:31:26,057
and value than ever before.
736
00:31:26,059 --> 00:31:28,413
So I worry about this.
I worry about the schism
737
00:31:28,415 --> 00:31:31,242
between the super rich
and the poor.
738
00:31:31,244 --> 00:31:34,688
The ultra rich,
if they're representative
739
00:31:34,690 --> 00:31:36,800
of some of the people
we've seen in Silicon Valley,
740
00:31:36,802 --> 00:31:38,013
I really, really worry
741
00:31:38,015 --> 00:31:39,922
because I wonder if
they really have a soul.
742
00:31:39,924 --> 00:31:43,352
I really... I wonder if
they really have an awareness
743
00:31:43,354 --> 00:31:45,566
of how regular people feel
744
00:31:45,568 --> 00:31:47,711
and if they share
the values of humanity.
745
00:31:47,713 --> 00:31:50,242
It really bothers me
that you have this ultra rich
746
00:31:50,244 --> 00:31:53,250
that is out of touch with
regular people, with humanity.
747
00:31:53,252 --> 00:31:55,375
This is being filmed
right now in San Francisco,
748
00:31:55,377 --> 00:31:58,514
which is by all accounts
one of the wealthiest cities
749
00:31:58,516 --> 00:32:00,613
and most advanced
cities in the world,
750
00:32:00,615 --> 00:32:02,314
and it's pretty much
ground zero
751
00:32:02,316 --> 00:32:04,662
for this
technological revolution,
752
00:32:04,690 --> 00:32:06,053
and, yet, as I came here,
753
00:32:06,055 --> 00:32:08,847
I almost tripped over
homeless people
754
00:32:08,849 --> 00:32:10,213
sleeping on the sidewalk.
755
00:32:10,215 --> 00:32:13,066
That is the reality
of today's economy
756
00:32:13,068 --> 00:32:14,277
and today's society.
757
00:32:14,279 --> 00:32:15,613
In a very real sense,
758
00:32:15,615 --> 00:32:18,794
we already live in
the economy of abundance,
759
00:32:18,796 --> 00:32:21,335
and yet we have not
solved this problem.
760
00:32:21,337 --> 00:32:23,774
I think the future
for the four billion
761
00:32:23,776 --> 00:32:26,532
poor people in the world
is actually a very good one.
762
00:32:26,534 --> 00:32:29,072
We've seen the amount of food
in the world, for example,
763
00:32:29,074 --> 00:32:31,425
has more than doubled
in the last 25 years.
764
00:32:31,427 --> 00:32:33,177
That's likely to continue.
765
00:32:33,455 --> 00:32:37,344
Worldwide, we're seeing
massive economic growth.
766
00:32:37,346 --> 00:32:40,722
That really means that people
in poor countries today
767
00:32:40,724 --> 00:32:42,874
will be much better off
in the future,
768
00:32:42,876 --> 00:32:45,144
so there will still be
some poor people,
769
00:32:45,146 --> 00:32:46,234
relatively speaking,
770
00:32:46,236 --> 00:32:48,354
but compared to
today's poor people,
771
00:32:48,356 --> 00:32:49,977
they'll be actually
quite well-off.
772
00:32:49,979 --> 00:32:52,930
I think this is
an amplifier for inequality.
773
00:32:52,932 --> 00:32:56,524
It's gonna make what we see
now much more amplified.
774
00:32:56,526 --> 00:32:58,066
The number of people
that are doing
775
00:32:58,068 --> 00:32:59,413
really well in the economy
776
00:32:59,415 --> 00:33:01,573
I think is likely
to continue to shrink.
777
00:33:01,575 --> 00:33:02,836
Those people
that are doing well
778
00:33:02,838 --> 00:33:07,086
will do extraordinarily well,
but for more and more people,
779
00:33:07,088 --> 00:33:08,532
they're simply gonna
find themselves
780
00:33:08,534 --> 00:33:10,852
in a position where they
don't have a lot to offer.
781
00:33:10,854 --> 00:33:12,367
They don't have
a marketable skill.
782
00:33:12,369 --> 00:33:15,594
They don't have a viable way
to really earn an income
783
00:33:15,596 --> 00:33:17,422
or, in particular,
middle-class income.
784
00:33:17,424 --> 00:33:18,696
We should value the fact
785
00:33:18,698 --> 00:33:20,992
that we can spend
more time doing human work
786
00:33:20,994 --> 00:33:24,533
and the robots will get on
and increase the economy.
787
00:33:24,535 --> 00:33:26,674
They'll still be taking
all the resources
788
00:33:26,676 --> 00:33:28,789
and converting them
into material goods
789
00:33:28,791 --> 00:33:31,956
at very low cost,
so the economy will expand,
790
00:33:31,958 --> 00:33:33,074
we'll be better off,
791
00:33:33,076 --> 00:33:34,754
and we can concentrate
on what matters.
792
00:33:34,756 --> 00:33:36,485
There's nothing
to worry about in there.
793
00:33:36,856 --> 00:33:38,875
A constant stream
of savings dollars
794
00:33:38,877 --> 00:33:42,310
must flow into big
and small business each year.
795
00:33:42,672 --> 00:33:45,226
These dollars help
to buy the land,
796
00:33:45,228 --> 00:33:49,750
the buildings,
the tools and equipment,
797
00:33:49,752 --> 00:33:51,683
and create new
job opportunities
798
00:33:51,685 --> 00:33:53,685
for our expanding population.
799
00:33:53,877 --> 00:33:57,037
[music]
800
00:33:57,096 --> 00:33:58,653
We need consumers out there.
801
00:33:58,655 --> 00:34:01,253
We need people
who can actually buy
802
00:34:01,255 --> 00:34:03,733
the things that are
produced by the economy.
803
00:34:03,735 --> 00:34:05,533
If you look at the way
our economy works,
804
00:34:05,535 --> 00:34:08,203
ultimately, it's driven
by end consumption,
805
00:34:08,205 --> 00:34:10,891
and by that, I mean people
and to a limited extent,
806
00:34:10,893 --> 00:34:13,066
governments, who buy things
because they want them
807
00:34:13,068 --> 00:34:14,268
or they need them.
808
00:34:14,575 --> 00:34:16,173
You know, businesses
in our economy,
809
00:34:16,175 --> 00:34:17,773
they also buy
things, of course,
810
00:34:17,775 --> 00:34:20,133
but they do that in order
to produce something else,
811
00:34:20,135 --> 00:34:21,973
and one business may sell
to another business,
812
00:34:21,975 --> 00:34:24,578
but at the end of that,
at the end of that chain,
813
00:34:24,580 --> 00:34:27,653
there has to stand a consumer
or perhaps a government
814
00:34:27,655 --> 00:34:29,907
who buys that
product or service
815
00:34:29,909 --> 00:34:31,792
just because they want it
or they need it.
816
00:34:31,794 --> 00:34:33,253
So this is not the case
817
00:34:33,255 --> 00:34:35,093
that things can just
keep going like this
818
00:34:35,095 --> 00:34:37,853
and get more and more
unequal over time
819
00:34:37,855 --> 00:34:39,613
and everything
will still be fine.
820
00:34:39,615 --> 00:34:40,733
I think that it
won't be fine.
821
00:34:40,735 --> 00:34:43,738
It will actually have
an impact on our economy
822
00:34:43,740 --> 00:34:45,646
and on our economic growth.
823
00:34:45,893 --> 00:34:49,407
We need intelligent planning
because, you know,
824
00:34:49,409 --> 00:34:52,314
being unemployed is not
a positive thing in itself.
825
00:34:52,316 --> 00:34:53,875
There has to be some
kind of transition point
826
00:34:53,877 --> 00:34:55,730
to some other form
of life after that.
827
00:34:55,732 --> 00:34:57,227
And again, at the moment,
828
00:34:57,229 --> 00:34:59,750
I really don't see enough
attention being paid to this,
829
00:34:59,752 --> 00:35:03,232
so we need to take this
future prospect seriously now.
830
00:35:05,010 --> 00:35:08,086
If we manage to adapt
to this expected wave
831
00:35:08,088 --> 00:35:12,174
of technological unemployment
both politically and socially,
832
00:35:12,176 --> 00:35:14,355
it's likely
to facilitate a time
833
00:35:14,357 --> 00:35:16,837
when work takes on
a different meaning
834
00:35:16,839 --> 00:35:19,310
and a new role in our lives.
835
00:35:20,323 --> 00:35:23,034
Ideas of how we should
approach our relationship
836
00:35:23,036 --> 00:35:25,896
with work have changed
throughout history.
837
00:35:26,495 --> 00:35:29,133
In ancient Greece,
Aristotle said,
838
00:35:29,135 --> 00:35:33,513
"A working paid job absorbs
and degrades the mind",
839
00:35:34,135 --> 00:35:36,683
i.e., if a person would
not willingly do
840
00:35:36,685 --> 00:35:38,293
their job for free,
841
00:35:38,295 --> 00:35:40,495
the argument can be made
that they have become
842
00:35:40,497 --> 00:35:42,742
absorbed and degraded by it,
843
00:35:42,744 --> 00:35:45,904
working purely out of
financial obligation.
844
00:35:46,455 --> 00:35:49,930
In 1844, Karl Marx
famously described
845
00:35:49,932 --> 00:35:53,745
the workers of society as
"alienated from their work
846
00:35:53,747 --> 00:35:56,615
and wholly saturated by it."
847
00:35:57,455 --> 00:35:59,413
He felt that most
work didn't allow
848
00:35:59,415 --> 00:36:01,810
an individual's
character to grow.
849
00:36:01,855 --> 00:36:04,183
He encouraged people
to find fulfillment
850
00:36:04,185 --> 00:36:06,052
and freedom in their work.
851
00:36:06,495 --> 00:36:09,652
During World War Two,
the ethos towards work
852
00:36:09,654 --> 00:36:11,853
was that it was
a patriotic duty
853
00:36:11,855 --> 00:36:14,591
to support
the war effort.
854
00:36:14,940 --> 00:36:18,314
To best understand our
current relationship with work
855
00:36:18,316 --> 00:36:21,266
and perhaps by extension
modern life itself,
856
00:36:21,268 --> 00:36:25,185
we can look to the writings
of a man called Guy Debord.
857
00:36:25,502 --> 00:36:30,413
Debord was a French Marxist
theorist and in 1967,
858
00:36:30,415 --> 00:36:33,488
he published a powerful
and influential critique
859
00:36:33,490 --> 00:36:36,823
on Western society entitled
860
00:36:36,825 --> 00:36:39,716
The Society of the Spectacle.
861
00:36:40,346 --> 00:36:43,448
He describes how workers
are ruled by commodities
862
00:36:43,450 --> 00:36:44,672
and that production
863
00:36:44,674 --> 00:36:47,669
is an inescapable
duty of the masses,
864
00:36:47,682 --> 00:36:49,925
such is the economic system
865
00:36:49,927 --> 00:36:52,841
that to work is to survive.
866
00:36:52,976 --> 00:36:55,305
"The capitalist economy,"
he says,
867
00:36:55,307 --> 00:36:59,152
"requires the vast majority
to take part as wage workers
868
00:36:59,154 --> 00:37:02,278
in the unending
pursuit of its ends.
869
00:37:02,280 --> 00:37:05,116
A requirement to which,
as everyone knows,
870
00:37:05,118 --> 00:37:07,662
one must either
submit or die."
871
00:37:08,175 --> 00:37:10,394
The assumption has
crept into our rhetoric
872
00:37:10,396 --> 00:37:11,761
and our understanding
that we live in
873
00:37:11,763 --> 00:37:13,800
a leisure society
to some extent.
874
00:37:13,802 --> 00:37:16,091
We have flexible working time.
875
00:37:16,471 --> 00:37:19,394
You hear the term a lot,
"relative poverty,"
876
00:37:19,396 --> 00:37:21,071
as against
absolute poverty,
877
00:37:21,073 --> 00:37:24,093
and all these kinds of ideas
suggest that, in fact,
878
00:37:24,095 --> 00:37:26,173
we should feel pretty
pleased with ourselves.
879
00:37:26,175 --> 00:37:28,154
We should both feel
quite leisured
880
00:37:28,156 --> 00:37:30,295
and we should feel less
in bondage to work
881
00:37:30,297 --> 00:37:32,571
than perhaps somebody
in the 19th century
882
00:37:32,573 --> 00:37:35,474
who was kind of shackled
to a machine in a factory.
883
00:37:35,476 --> 00:37:38,263
But, in fact,
we're very unhappy.
884
00:37:38,687 --> 00:37:41,175
It's irrelevant
how much you work
885
00:37:41,177 --> 00:37:42,925
in actual terms anymore.
886
00:37:42,927 --> 00:37:44,773
The way in which
the spectacle operates
887
00:37:44,775 --> 00:37:49,016
is to make of leisure
itself an adjunct to work.
888
00:37:49,018 --> 00:37:52,093
In other words, the idea
of not working and working
889
00:37:52,095 --> 00:37:55,154
are in some sense
locked into an unholy
890
00:37:55,156 --> 00:37:57,775
and reciprocal relationship
with each other.
891
00:37:57,777 --> 00:37:59,676
You know, the fact
that you're not working
892
00:37:59,678 --> 00:38:01,337
is only because
you've been working,
893
00:38:01,339 --> 00:38:02,571
and the fact that
you're working
894
00:38:02,573 --> 00:38:04,573
is only so that
you can not work.
895
00:38:04,575 --> 00:38:07,878
In other words,
so engrafted is that rubric
896
00:38:07,880 --> 00:38:09,344
in the way that
we approach life
897
00:38:09,346 --> 00:38:11,860
that... that we can
never be rid of it.
898
00:38:11,862 --> 00:38:15,370
Debord also observed
that as technology advances,
899
00:38:15,372 --> 00:38:17,586
production becomes
more efficient.
900
00:38:17,588 --> 00:38:21,167
Accordingly, the workers'
tasks invariably become
901
00:38:21,169 --> 00:38:23,318
more trivial and menial.
902
00:38:23,721 --> 00:38:27,066
It would seem that the more
irrelevant human labor becomes,
903
00:38:27,068 --> 00:38:30,404
the harder it is
to find fulfilling work.
904
00:38:30,565 --> 00:38:33,594
The truth of the matter is
that most people already,
905
00:38:33,596 --> 00:38:36,623
in Britain,
are doing useless jobs
906
00:38:36,674 --> 00:38:39,740
and have been for
generations, actually.
907
00:38:39,751 --> 00:38:42,333
Most jobs in management
are completely useless.
908
00:38:42,335 --> 00:38:44,693
They basically consist
in the rearrangement
909
00:38:44,695 --> 00:38:46,493
of information into
different patterns
910
00:38:46,495 --> 00:38:48,909
that are meant to take on
the semblance of meaning
911
00:38:48,911 --> 00:38:51,005
in the bureaucratic context.
912
00:38:51,049 --> 00:38:54,893
So most work is, in fact,
a waste of time already,
913
00:38:54,895 --> 00:38:57,461
and I think people
understand that intuitively.
914
00:38:57,463 --> 00:39:00,394
When I go into companies,
I often ask the question,
915
00:39:00,396 --> 00:39:01,667
"Why are you employing people?
916
00:39:01,669 --> 00:39:03,027
You could get monkeys
917
00:39:03,029 --> 00:39:05,367
or you could get robots
to do this job."
918
00:39:05,369 --> 00:39:06,839
The people are not
allowed to think.
919
00:39:06,841 --> 00:39:08,396
They are processing.
920
00:39:08,398 --> 00:39:10,573
They're just like a machine.
921
00:39:10,575 --> 00:39:13,533
They're being so hemmed in,
922
00:39:13,535 --> 00:39:17,443
they operate with an algorithm
and they just do it.
923
00:39:17,541 --> 00:39:20,874
We all have the need to find
meaning in our lives,
924
00:39:20,876 --> 00:39:24,123
and it is our professions
that define us.
925
00:39:24,210 --> 00:39:26,471
To work is
to provide a service
926
00:39:26,473 --> 00:39:28,414
either for yourself
or for others,
927
00:39:28,416 --> 00:39:31,230
but most of us would like
our work to be purposeful
928
00:39:31,232 --> 00:39:34,513
and contributory
to society in some way.
929
00:39:34,869 --> 00:39:37,053
It is an uncomfortable truth
930
00:39:37,055 --> 00:39:39,875
that with our present
model of economics
931
00:39:39,877 --> 00:39:43,474
not everyone is able
to monetize their passions.
932
00:39:43,895 --> 00:39:46,834
If any of this were
to come to fruition,
933
00:39:46,836 --> 00:39:49,961
if we learned to make
automation work for us,
934
00:39:49,963 --> 00:39:54,552
the question remains,
"What do we do with our days?"
935
00:39:54,830 --> 00:39:56,314
There's a good and a bad.
936
00:39:56,316 --> 00:39:58,956
The good is that the cost
of everything drops.
937
00:39:59,018 --> 00:40:01,589
We can solve some of
the basic problems of humanity
938
00:40:01,591 --> 00:40:04,435
like disease,
hunger, lodging.
939
00:40:05,237 --> 00:40:09,227
We can look after all the
basic needs of human beings.
940
00:40:09,229 --> 00:40:13,575
The dark side is that
automation takes jobs away,
941
00:40:13,627 --> 00:40:16,091
and the question is,
"What do we do for a living?"
942
00:40:16,449 --> 00:40:19,235
Some of us will
seek enlightenment
943
00:40:19,237 --> 00:40:21,539
and rise and will keep
learning and growing,
944
00:40:21,541 --> 00:40:23,774
but the majority of people
don't care about those things.
945
00:40:23,776 --> 00:40:27,141
The majority of people just want
to do, you know, grunt work.
946
00:40:27,143 --> 00:40:30,485
They want to socialize with
people as they do at work.
947
00:40:30,487 --> 00:40:34,050
Sennett wrote in his book,
The Corrosion of Character,
948
00:40:34,052 --> 00:40:36,792
that in late capitalism,
949
00:40:36,794 --> 00:40:39,754
one of the great
kind of supports
950
00:40:39,756 --> 00:40:43,394
for human interaction
and for human meaning
951
00:40:43,396 --> 00:40:45,894
is the longevity
of social relations
952
00:40:45,896 --> 00:40:48,617
and the interactions
in working environments
953
00:40:48,619 --> 00:40:50,828
and that if that's taken away,
954
00:40:50,830 --> 00:40:54,347
if what's required is to be
continually responsive
955
00:40:54,349 --> 00:40:57,560
and changing in
a precarious world,
956
00:40:58,135 --> 00:41:01,493
then people no longer
find the fulfillment
957
00:41:01,495 --> 00:41:04,000
or the substance in
what they're doing.
958
00:41:04,002 --> 00:41:07,667
There is an underlying desire
for people to do things,
959
00:41:07,669 --> 00:41:09,013
you know, you spoke
about the idea
960
00:41:09,015 --> 00:41:11,469
that people want to be
engaged creatively.
961
00:41:11,471 --> 00:41:12,893
They want to be
engaged, you know,
962
00:41:12,895 --> 00:41:16,058
go back to basic
Marxist ideas of praxis
963
00:41:16,060 --> 00:41:17,533
and right back to John Locke.
964
00:41:17,535 --> 00:41:19,503
They want to be engaged
in what Locke thought of
965
00:41:19,505 --> 00:41:21,175
as primary acquisition,
966
00:41:21,205 --> 00:41:24,027
mixing their labor,
either their creative thinking
967
00:41:24,029 --> 00:41:25,305
or their physical labor even,
968
00:41:25,307 --> 00:41:27,733
with the world
in order to transform it.
969
00:41:27,735 --> 00:41:30,034
They want to do that,
and that's a very basic
970
00:41:30,036 --> 00:41:32,266
human instinct to do that.
971
00:41:32,268 --> 00:41:35,446
And the idea of
a leisured class, as it were,
972
00:41:35,448 --> 00:41:39,413
a class who is not involved
in a praxis with the world,
973
00:41:39,415 --> 00:41:41,407
but is simply involved
in a passive way
974
00:41:41,409 --> 00:41:45,095
as a recipient of things is
actually repugnant to people.
975
00:41:45,151 --> 00:41:49,839
They would sooner work for
the man in a meaningless job
976
00:41:49,841 --> 00:41:52,292
and construct
a false ideology
977
00:41:52,294 --> 00:41:54,333
of involvement and engagement
978
00:41:54,335 --> 00:41:56,453
than they would
actually sit on their arse.
979
00:41:56,455 --> 00:41:59,029
We can't get away
from the fact that...
980
00:41:59,088 --> 00:42:00,844
that people work
because they have to.
981
00:42:00,846 --> 00:42:03,753
That's, you know, the primary
motivation for most people,
982
00:42:03,755 --> 00:42:05,173
that if you don't work,
983
00:42:05,175 --> 00:42:07,514
you're gonna be living
on the street, okay?
984
00:42:07,516 --> 00:42:09,909
Once we... if we ever
move into a future
985
00:42:09,911 --> 00:42:11,154
where that's not the case
986
00:42:11,156 --> 00:42:12,613
and people don't have
to worry about that,
987
00:42:12,615 --> 00:42:13,925
then we can begin to take on
988
00:42:13,927 --> 00:42:15,727
these more
philosophical questions
989
00:42:15,729 --> 00:42:19,773
of... of, you know, but we're
not at that point yet.
990
00:42:19,775 --> 00:42:23,594
We can't pretend that
we are living in an age
991
00:42:23,596 --> 00:42:28,084
where that necessity
for an income doesn't exist.
992
00:42:28,471 --> 00:42:32,554
Douglas Rushkoff
stated in 2009,
993
00:42:32,556 --> 00:42:36,255
"We all want paychecks
or at least money.
994
00:42:36,257 --> 00:42:38,916
We want food,
shelter, clothing,
995
00:42:38,918 --> 00:42:41,817
and all the things
that money buys us.
996
00:42:41,819 --> 00:42:45,099
But do we all
really want jobs?"
997
00:42:45,229 --> 00:42:49,050
According to the UN Food
and Agriculture Organization,
998
00:42:49,052 --> 00:42:50,693
there is enough food produced
999
00:42:50,695 --> 00:42:52,114
to provide everyone
in the world
1000
00:42:52,116 --> 00:42:57,505
with 2,720 kilocalories
per person per day.
1001
00:43:15,197 --> 00:43:17,667
At this stage,
it's difficult to think of
1002
00:43:17,669 --> 00:43:19,933
other possible ways of life.
1003
00:43:19,935 --> 00:43:22,053
The need to earn a living
has been a part
1004
00:43:22,055 --> 00:43:24,533
of every cultural
narrative in history.
1005
00:43:24,535 --> 00:43:27,653
It's a precondition
of human life.
1006
00:43:27,655 --> 00:43:30,081
The challenge facing
the future of work
1007
00:43:30,083 --> 00:43:32,786
is politically unclear.
1008
00:43:33,252 --> 00:43:35,003
It is likely to require
1009
00:43:35,005 --> 00:43:37,933
not only a
redistribution of wealth,
1010
00:43:37,935 --> 00:43:41,234
but a redistribution
of the workload.
1011
00:43:41,236 --> 00:43:45,505
But will working less
mean living more?
1012
00:43:46,335 --> 00:43:49,498
And is our fear of
becoming irrelevant
1013
00:43:49,500 --> 00:43:53,169
greater than
our fear of death?
1014
00:43:56,373 --> 00:44:05,493
[music]
1015
00:44:26,174 --> 00:44:29,850
The process of physical aging
is known as senescence,
1016
00:44:29,852 --> 00:44:32,216
and none of us
are spared from it.
1017
00:44:32,218 --> 00:44:36,435
It remains to this day
an evolutionary enigma.
1018
00:44:36,488 --> 00:44:38,609
Our cells are programmed to wane
1019
00:44:38,611 --> 00:44:41,849
and our entire bodies
are fated to become frail.
1020
00:44:42,088 --> 00:44:44,862
It would seem that the laws
of nature would prefer it
1021
00:44:44,864 --> 00:44:47,036
if we dwindle and die.
1022
00:44:47,120 --> 00:44:49,825
[music]
1023
00:44:50,094 --> 00:44:52,479
Negligible senescence, however,
1024
00:44:52,481 --> 00:44:55,361
is the lack
of symptoms of aging.
1025
00:44:55,424 --> 00:44:59,784
Negligibly senescent organisms
include certain species
1026
00:44:59,786 --> 00:45:02,335
of sturgeon, giant tortoise,
1027
00:45:02,337 --> 00:45:05,911
flatworm, clam, and tardigrade.
1028
00:45:06,975 --> 00:45:11,101
One species of jellyfish
called Turritopsis dohrnii
1029
00:45:11,103 --> 00:45:14,560
has even been observed
to be biologically immortal.
1030
00:45:14,728 --> 00:45:18,510
It has the capability
to reverse its life cycle
1031
00:45:18,512 --> 00:45:20,643
and revert back
to the polyp stage
1032
00:45:20,645 --> 00:45:22,889
at any point in its development.
1033
00:45:22,891 --> 00:45:25,529
There is only one thing
wrong with dying,
1034
00:45:25,531 --> 00:45:27,575
and that's doing it
when you don't want to.
1035
00:45:28,455 --> 00:45:30,432
Doing it when you do
want to is not a problem.
1036
00:45:30,434 --> 00:45:32,221
Now if you put that bargain
to anybody,
1037
00:45:32,223 --> 00:45:34,103
"Look, this is the deal:
1038
00:45:34,306 --> 00:45:36,360
You will die, but only
when you want to."
1039
00:45:36,362 --> 00:45:37,833
Who would not take that bargain?
1040
00:45:38,193 --> 00:45:42,455
In 2014, a team
of scientists at Harvard
1041
00:45:42,457 --> 00:45:44,979
were able to effectively
reverse the age
1042
00:45:44,981 --> 00:45:47,065
of an older mouse by treating it
1043
00:45:47,067 --> 00:45:48,976
with the blood
of a younger mouse
1044
00:45:48,978 --> 00:45:52,536
through a process
called parabiosis.
1045
00:45:52,931 --> 00:45:54,424
For the first time in history
1046
00:45:54,426 --> 00:45:56,698
it is deemed
scientifically possible
1047
00:45:56,700 --> 00:46:00,013
to gain control
over the aging process.
1048
00:46:00,535 --> 00:46:06,669
[music]
1049
00:46:07,538 --> 00:46:10,143
Ultimately, when people
get the hang of the idea
1050
00:46:10,145 --> 00:46:12,149
that aging is a medical problem
1051
00:46:12,151 --> 00:46:14,231
and that everybody's got it,
1052
00:46:14,484 --> 00:46:18,091
then it's not going to be
the way it is today.
1053
00:46:23,142 --> 00:46:26,875
He thinks it's possible
that people will extend...
1054
00:46:26,877 --> 00:46:30,218
be able to extend their lifespan
by considerable amounts.
1055
00:46:30,220 --> 00:46:31,733
I think he's on record as saying
1056
00:46:31,735 --> 00:46:34,518
the first 1,000-year-old person
is already alive.
1057
00:46:34,520 --> 00:46:36,814
It's highly likely, in my view,
1058
00:46:36,816 --> 00:46:39,510
that people born today
or born 10 years ago
1059
00:46:39,512 --> 00:46:42,299
will actually be able to live
1060
00:46:42,301 --> 00:46:44,072
as long as they like,
so to speak,
1061
00:46:44,074 --> 00:46:49,174
without any risk of death
from the ill health of old age.
1062
00:46:49,176 --> 00:46:54,549
The way to apply comprehensive
maintenance to aging
1063
00:46:54,551 --> 00:46:56,190
is a divide
and conquer approach.
1064
00:46:56,192 --> 00:46:57,853
It is not a magic bullet.
1065
00:46:57,855 --> 00:46:59,820
It is not some single thing
that we can do,
1066
00:46:59,822 --> 00:47:02,395
let alone a single thing
that we could do just once.
1067
00:47:02,572 --> 00:47:05,804
Aging is
the lifelong accumulation
1068
00:47:05,806 --> 00:47:07,686
of damage to the body,
1069
00:47:07,688 --> 00:47:10,614
and that damage occurs
as an intrinsic,
1070
00:47:10,674 --> 00:47:12,512
unavoidable side effect
1071
00:47:12,514 --> 00:47:15,071
of the way the body
normally works.
1072
00:47:15,073 --> 00:47:16,493
Even though there
are many, many,
1073
00:47:16,495 --> 00:47:19,109
many different types
of damage at the molecular level
1074
00:47:19,111 --> 00:47:22,773
and the cellular level,
they can all be classified
1075
00:47:22,775 --> 00:47:25,127
into a very manageable
number of categories,
1076
00:47:25,129 --> 00:47:26,813
just seven major categories.
1077
00:47:26,815 --> 00:47:29,585
So now the bottom line,
what do we do about it?
1078
00:47:29,587 --> 00:47:32,629
How do we actually implement
the maintenance approach?
1079
00:47:32,704 --> 00:47:34,658
There are four
fundamental paradigms;
1080
00:47:34,660 --> 00:47:35,994
they all begin with "R."
1081
00:47:35,996 --> 00:47:38,049
They are called
replacement, removal,
1082
00:47:38,051 --> 00:47:39,265
repair, and reinforcement.
1083
00:47:39,267 --> 00:47:42,053
We've got particular ways
to do all these things.
1084
00:47:42,055 --> 00:47:45,179
Sometimes replacement,
sometimes simply elimination
1085
00:47:45,181 --> 00:47:48,913
of the superfluous material,
the garbage that's accumulated.
1086
00:47:48,915 --> 00:47:51,814
Sometimes repair
of the material.
1087
00:47:51,816 --> 00:47:54,682
Occasionally, in a couple
of cases, reinforcement...
1088
00:47:54,684 --> 00:47:56,893
that means making
the cell robust
1089
00:47:56,895 --> 00:47:59,434
so that the damage,
which would normally have caused
1090
00:47:59,436 --> 00:48:01,436
the pathology
no longer does that.
1091
00:48:01,571 --> 00:48:04,268
I wanna talk about one thing
that we're doing in our lab,
1092
00:48:04,270 --> 00:48:06,804
which involves the number one
cause of death
1093
00:48:06,806 --> 00:48:08,971
in the western world:
cardiovascular disease
1094
00:48:08,973 --> 00:48:10,493
causes heart attacks
and strokes.
1095
00:48:10,495 --> 00:48:13,175
It all begins with these things
called foam cells,
1096
00:48:13,177 --> 00:48:16,007
which are originally
white blood cells.
1097
00:48:16,009 --> 00:48:19,453
They become poisoned
by toxins in the bloodstream.
1098
00:48:19,455 --> 00:48:22,374
The main toxin that's
responsible is known...
1099
00:48:22,376 --> 00:48:25,053
it's called 7-ketocholesterol,
that's this thing,
1100
00:48:25,055 --> 00:48:27,944
and we found some bacteria
that could eat it.
1101
00:48:27,946 --> 00:48:30,155
We then found out
how they eat it,
1102
00:48:30,157 --> 00:48:33,013
we found out the enzymes
that they use to break it down,
1103
00:48:33,015 --> 00:48:35,453
and we found out
how to modify those enzymes
1104
00:48:35,455 --> 00:48:37,679
so that they can go
into human cells,
1105
00:48:37,681 --> 00:48:40,705
go to the right place
in the cell where they're needed,
1106
00:48:40,707 --> 00:48:42,249
which is called the lysosome,
1107
00:48:42,251 --> 00:48:46,366
and actually do their job there,
and it actually works.
1108
00:48:46,368 --> 00:48:49,674
Cells are protected
from this toxic substance...
1109
00:48:49,676 --> 00:48:51,389
that's what these graphs
are showing.
1110
00:48:51,391 --> 00:48:53,270
So this is pretty good news.
1111
00:48:53,501 --> 00:48:56,241
The damage that accumulates
that eventually causes
1112
00:48:56,243 --> 00:48:59,046
the diseases
and disabilities of old age
1113
00:48:59,048 --> 00:49:01,415
is initially harmless.
1114
00:49:01,462 --> 00:49:04,614
The body is set up to tolerate
a certain amount of it.
1115
00:49:04,657 --> 00:49:07,221
That's critical,
because while the damage is
1116
00:49:07,223 --> 00:49:09,783
at that sub-pathological level,
1117
00:49:09,832 --> 00:49:11,869
it means that
it's not participating
1118
00:49:11,871 --> 00:49:13,444
in metabolism, so to speak.
1119
00:49:13,446 --> 00:49:16,692
It's not actually interacting
with the way the body works.
1120
00:49:16,860 --> 00:49:19,182
So medicines
that target that damage
1121
00:49:19,184 --> 00:49:21,650
are much, much less likely
1122
00:49:21,652 --> 00:49:24,295
to have unacceptable
side effects
1123
00:49:24,297 --> 00:49:27,253
than medicines that try
to manipulate the body
1124
00:49:27,255 --> 00:49:28,955
so as to stop the damage
from being created
1125
00:49:28,957 --> 00:49:30,410
in the first place.
1126
00:49:37,845 --> 00:49:43,504
It's unlikely, in fact,
by working on longevity per se,
1127
00:49:43,556 --> 00:49:45,413
that we will crack it.
1128
00:49:45,415 --> 00:49:48,374
It's going to be...
it seems to me more probable
1129
00:49:48,376 --> 00:49:52,460
that we will crack longevity
simply by getting rid of,
1130
00:49:52,462 --> 00:49:56,013
sequentially,
the prime causes of death.
1131
00:49:56,015 --> 00:50:00,986
I hear people talk about
living hundreds of years.
1132
00:50:00,988 --> 00:50:02,413
Inside, I'm like, yeah, right,
1133
00:50:02,415 --> 00:50:05,018
I mean, because
if you study the brain,
1134
00:50:05,020 --> 00:50:07,080
the dead end is the brain.
1135
00:50:07,082 --> 00:50:08,594
We all start developing
1136
00:50:08,596 --> 00:50:10,413
Alzheimer's pathology
at 40 years old.
1137
00:50:10,415 --> 00:50:12,893
It's not a matter of whether
you get Alzheimer's,
1138
00:50:12,895 --> 00:50:14,395
it's when.
1139
00:50:14,728 --> 00:50:19,666
It's when, and genetically,
1140
00:50:19,668 --> 00:50:21,687
we all have some predisposition
1141
00:50:21,689 --> 00:50:23,840
to when we're gonna
get this disease.
1142
00:50:23,860 --> 00:50:25,520
It's part of the program.
1143
00:50:25,522 --> 00:50:27,723
Let's fix that part
of the program
1144
00:50:27,741 --> 00:50:29,932
so we can live
past 90 years old
1145
00:50:29,934 --> 00:50:32,373
with an intact, working brain
1146
00:50:32,375 --> 00:50:34,848
to continue
the evolution of our mind.
1147
00:50:35,564 --> 00:50:40,693
That is number one in my book,
because here's a fact:
1148
00:50:40,695 --> 00:50:43,093
Life span's almost 80
right now on average.
1149
00:50:43,095 --> 00:50:46,731
By 85, half of people
will have Alzheimer's.
1150
00:50:46,840 --> 00:50:48,280
Do the math.
1151
00:50:48,431 --> 00:50:49,893
Seventy-four million
Baby Boomers
1152
00:50:49,895 --> 00:50:51,185
headed toward risk age.
1153
00:50:51,187 --> 00:50:53,838
Eighty-five, 50 percent
have Alzheimer's,
1154
00:50:53,840 --> 00:50:55,304
current life span's 80.
1155
00:50:55,306 --> 00:50:56,733
They're gonna be 85 pretty soon.
1156
00:50:56,735 --> 00:50:59,890
Half our population at 85's
gonna have this disease.
1157
00:50:59,892 --> 00:51:01,682
And then keep going
up to 90 and 100,
1158
00:51:01,684 --> 00:51:03,023
and it gets even worse.
1159
00:51:03,025 --> 00:51:05,025
This is enemy number one.
1160
00:51:05,029 --> 00:51:08,660
It's interesting,
just this week
1161
00:51:09,084 --> 00:51:12,184
it was discovered
that an Egyptian mummy
1162
00:51:12,632 --> 00:51:15,635
had died of cancer,
so even way back in those times,
1163
00:51:15,637 --> 00:51:16,549
cancer was around.
1164
00:51:16,551 --> 00:51:19,832
What seems to have happened is,
as we have lived longer,
1165
00:51:19,834 --> 00:51:23,002
the number of diseases
that pop up to kill us
1166
00:51:23,004 --> 00:51:25,335
starts to increase,
and the reality is
1167
00:51:25,337 --> 00:51:28,253
I think this is a sort
of whack-a-mole situation
1168
00:51:28,255 --> 00:51:32,104
as we beat cancer to death
and it disappears,
1169
00:51:32,106 --> 00:51:33,338
something else will pop up.
1170
00:51:33,340 --> 00:51:35,127
Cancer is a specific disease,
1171
00:51:35,129 --> 00:51:37,252
and every cancer
has a specific gene involved
1172
00:51:37,254 --> 00:51:38,693
together with lifestyle.
1173
00:51:38,695 --> 00:51:41,840
Alzheimer's, specific disease,
specific genetics.
1174
00:51:42,204 --> 00:51:44,918
I can go on and on...
diabetes, heart disease.
1175
00:51:45,287 --> 00:51:49,223
These are diseases,
and as you get older,
1176
00:51:49,274 --> 00:51:52,564
your susceptibility
to these diseases increases,
1177
00:51:52,566 --> 00:51:54,394
and your genetics will determine
1178
00:51:54,396 --> 00:51:57,025
whether you get them
and when you get them
1179
00:51:57,027 --> 00:51:58,288
given your lifespan.
1180
00:51:58,290 --> 00:51:59,549
That's not aging,
1181
00:51:59,551 --> 00:52:02,013
that's just living long enough
to be susceptible,
1182
00:52:02,015 --> 00:52:06,153
so we may very well eradicate,
in our fantasy world,
1183
00:52:06,215 --> 00:52:09,280
all the cancers and strokes
and heart disease
1184
00:52:09,282 --> 00:52:11,893
and diabetes and Alzheimer's
we get right now
1185
00:52:11,895 --> 00:52:15,179
by 80 or 90 years old,
and then what's gonna happen?
1186
00:52:15,181 --> 00:52:18,273
You live out to 110,
and guess what's gonna happen,
1187
00:52:18,275 --> 00:52:20,154
new, other genetic variants
1188
00:52:20,156 --> 00:52:22,093
suddenly rear
their ugly heads and say,
1189
00:52:22,095 --> 00:52:24,634
"Now we're gonna affect
whether you live to 110
1190
00:52:24,636 --> 00:52:26,510
without Alzheimer's
and heart disease
1191
00:52:26,512 --> 00:52:27,830
and cancer and diabetes."
1192
00:52:27,832 --> 00:52:30,106
And it'll go on
and go on and go on.
1193
00:52:31,375 --> 00:52:33,733
There will undoubtedly
be enormous challenges
1194
00:52:33,735 --> 00:52:37,356
concerning the biological
approach to longevity.
1195
00:52:37,657 --> 00:52:40,736
There could, however,
be an alternative route
1196
00:52:40,738 --> 00:52:43,270
to extreme longevity.
1197
00:52:43,892 --> 00:52:45,705
When people are
worried about death,
1198
00:52:45,707 --> 00:52:47,133
I guess the issue
is what is it
1199
00:52:47,135 --> 00:52:50,371
that they would like
to have stay alive, okay?
1200
00:52:50,720 --> 00:52:54,173
And I think that's often
very unclear what the answer is,
1201
00:52:54,175 --> 00:52:57,205
because if you look at somebody
like Ray Kurzweil, for example,
1202
00:52:57,207 --> 00:52:59,013
with his promises
of the singularity
1203
00:52:59,015 --> 00:53:00,893
and our merging
with machine intelligence
1204
00:53:00,895 --> 00:53:02,471
and then being able
to kind of have
1205
00:53:02,473 --> 00:53:03,986
this kind of
infinite consciousness
1206
00:53:03,988 --> 00:53:06,160
projected outward
into the Cosmos.
1207
00:53:06,806 --> 00:53:08,564
I don't think he's
imagining a human body
1208
00:53:08,566 --> 00:53:10,646
living forever, okay?
1209
00:53:11,181 --> 00:53:14,616
And if what we're
talking about is immortality,
1210
00:53:14,618 --> 00:53:16,773
what I kind of think
Kurzweil is talking about,
1211
00:53:16,775 --> 00:53:18,173
then I can see it.
1212
00:53:18,175 --> 00:53:21,754
I mean, I could see at least
as something to work toward.
1213
00:53:21,829 --> 00:53:25,885
In 2005, Google's
Director of Engineering,
1214
00:53:25,887 --> 00:53:28,299
Ray Kurzweil,
published a book
1215
00:53:28,301 --> 00:53:31,000
entitled
The Singularity Is Near:
1216
00:53:31,002 --> 00:53:34,163
When Humans Transcend Biology.
1217
00:53:34,165 --> 00:53:37,413
He predicts that by 2045,
1218
00:53:37,415 --> 00:53:39,689
it'll be possible
to upload our minds
1219
00:53:39,691 --> 00:53:42,822
into a computer,
effectively allowing us
1220
00:53:42,824 --> 00:53:45,059
to live indefinitely.
1221
00:53:58,924 --> 00:54:01,484
When people think
of death at this point,
1222
00:54:01,495 --> 00:54:03,634
they think of a body
going into a coffin,
1223
00:54:03,636 --> 00:54:05,645
and the coffin
going into the ground.
1224
00:54:05,762 --> 00:54:09,637
When in fact that age
of death is dying.
1225
00:54:09,829 --> 00:54:12,549
We are ending that stage.
1226
00:54:12,551 --> 00:54:14,133
We're entering a new stage
1227
00:54:14,135 --> 00:54:16,611
where we could
possibly upload consciousness
1228
00:54:16,613 --> 00:54:18,072
into a silicon substrate.
1229
00:54:18,074 --> 00:54:20,189
You know, a lot of these
science fiction ideas
1230
00:54:20,191 --> 00:54:22,937
from decades ago
are becoming real
1231
00:54:22,939 --> 00:54:25,424
and already people are
spending millions of pounds
1232
00:54:25,426 --> 00:54:28,015
on research today
to make this happen.
1233
00:54:28,017 --> 00:54:30,929
Nobody expects to do it
before 2040 at the earliest.
1234
00:54:30,931 --> 00:54:34,304
My guess is 2050,
it'll be a few rich people
1235
00:54:34,306 --> 00:54:37,241
and a few kings and queens
here and there and politicians.
1236
00:54:37,243 --> 00:54:40,693
By 2060-2070, it's
reasonably well-off people.
1237
00:54:40,695 --> 00:54:44,233
By 2075, pretty much anybody
could be immortal.
1238
00:54:44,235 --> 00:54:46,634
I'm not convinced that
uploading my consciousness
1239
00:54:46,636 --> 00:54:49,611
onto a computer is
a form of immortality.
1240
00:54:49,613 --> 00:54:51,827
I would like to live forever,
1241
00:54:51,829 --> 00:54:54,314
but I'm not sure that
I would like to live forever
1242
00:54:54,316 --> 00:54:59,813
as some digital bytes of memory
in a computer.
1243
00:54:59,815 --> 00:55:01,416
I wouldn't call that
living forever.
1244
00:55:01,418 --> 00:55:03,093
There are things I wanna do
with my body
1245
00:55:03,095 --> 00:55:05,813
that I won't be able
to do in that computer.
1246
00:55:05,815 --> 00:55:08,030
Immortality is a question
that keeps arising
1247
00:55:08,032 --> 00:55:11,096
in the technology community,
and it's one which I think
1248
00:55:11,098 --> 00:55:14,424
is entirely feasible
in principle.
1249
00:55:14,426 --> 00:55:16,015
We won't actually
become immortal,
1250
00:55:16,017 --> 00:55:19,385
but what we will do is
we will get the technology
1251
00:55:19,387 --> 00:55:23,874
by around about 2050
to connect a human brain
1252
00:55:23,876 --> 00:55:27,634
to the machine world so well
that most of your thinking
1253
00:55:27,636 --> 00:55:29,674
is happening inside
the computer world,
1254
00:55:29,676 --> 00:55:31,033
inside the I.T.,
1255
00:55:31,035 --> 00:55:33,015
so your brain
is still being used,
1256
00:55:33,017 --> 00:55:35,914
but 99 percent of your thoughts,
99 percent of your memories
1257
00:55:35,916 --> 00:55:37,627
are actually out there
in the cloud
1258
00:55:37,629 --> 00:55:39,166
or whatever you wanna call it,
1259
00:55:39,168 --> 00:55:41,119
and only one percent
is inside your head.
1260
00:55:41,121 --> 00:55:42,679
So walk into work this morning,
1261
00:55:42,681 --> 00:55:45,200
you get hit by a bus,
it doesn't matter.
1262
00:55:45,392 --> 00:55:48,928
You just upload your mind
into an android
1263
00:55:48,930 --> 00:55:51,181
on Monday morning and carry on
as if nothing had happened.
1264
00:55:51,183 --> 00:55:53,079
The question with
that kind of technology
1265
00:55:53,081 --> 00:55:57,100
and the extension
of human capacity
1266
00:55:57,102 --> 00:56:02,202
and human life
through technology
1267
00:56:02,204 --> 00:56:05,118
is where does
the human end,
1268
00:56:05,120 --> 00:56:06,773
and the technology begin?
1269
00:56:06,775 --> 00:56:10,194
If we upload our consciousness
into a robot,
1270
00:56:10,196 --> 00:56:14,608
a humanoid robot that has touch,
the ability to feel,
1271
00:56:14,610 --> 00:56:18,108
all of the sensorial inputs,
if they're the same,
1272
00:56:18,110 --> 00:56:21,533
there is a potential
of continuity, right?
1273
00:56:21,535 --> 00:56:25,553
So you can have the same
types of experiences
1274
00:56:25,555 --> 00:56:28,189
in that new substrate
that you have as a human being.
1275
00:56:28,191 --> 00:56:30,287
It's beyond a continuity issue,
1276
00:56:30,289 --> 00:56:33,790
it's an issue that you have
the ability to record
1277
00:56:33,792 --> 00:56:35,902
and recall without the content...
1278
00:56:36,461 --> 00:56:38,498
the content being
the sensations, images
1279
00:56:38,500 --> 00:56:40,928
and feelings and thoughts
you experienced your whole life
1280
00:56:40,930 --> 00:56:42,415
that have associated
with each other
1281
00:56:42,417 --> 00:56:43,658
through your neural network.
1282
00:56:43,660 --> 00:56:45,074
Where are they stored?
1283
00:56:45,076 --> 00:56:48,243
I don't think consciousness
and the brain
1284
00:56:48,245 --> 00:56:50,022
is anything to do
with any particular
1285
00:56:50,024 --> 00:56:51,873
individual region in the brain,
1286
00:56:51,875 --> 00:56:54,248
but it's something
that's all about...
1287
00:56:54,250 --> 00:56:56,633
its distributed organization.
1288
00:56:56,661 --> 00:56:58,293
I don't think there
are any mysteries.
1289
00:56:58,295 --> 00:57:00,769
There are no causal mysteries
in the brain,
1290
00:57:00,771 --> 00:57:04,969
and I think that
there's a perfectly...
1291
00:57:05,794 --> 00:57:08,574
a comprehensible, physical chain
1292
00:57:08,576 --> 00:57:11,030
of cause and effect
that goes from the things
1293
00:57:11,032 --> 00:57:13,413
that I see and hear around me
1294
00:57:13,415 --> 00:57:14,933
and the words
that come out of my mouth,
1295
00:57:14,935 --> 00:57:19,522
which would encompass
consciousness, I suppose.
1296
00:57:19,524 --> 00:57:20,537
But the things that...
1297
00:57:20,539 --> 00:57:22,333
the moment you say
something like that,
1298
00:57:22,335 --> 00:57:26,334
you're on the edge
of the philosophical precipice.
1299
00:57:26,336 --> 00:57:27,674
When you think about a machine,
1300
00:57:27,676 --> 00:57:30,352
the question is are you
simulating consciousness
1301
00:57:30,441 --> 00:57:33,561
or are you simulating cognition?
1302
00:57:34,015 --> 00:57:38,573
Cognition requires inputs
and reactions
1303
00:57:38,575 --> 00:57:39,853
that are associated
with each other
1304
00:57:39,855 --> 00:57:42,055
to create an output
and an outcome.
1305
00:57:42,240 --> 00:57:43,773
And you can program that all day
1306
00:57:43,775 --> 00:57:45,819
and you can make that
as sophisticated
1307
00:57:45,821 --> 00:57:47,386
and as information-dense
as you want,
1308
00:57:47,388 --> 00:57:50,532
almost to the point
that it mimics a real person.
1309
00:57:51,599 --> 00:57:54,154
But the question is will it
ever have the consciousness
1310
00:57:54,156 --> 00:57:56,022
that our species
with our genetics,
1311
00:57:56,024 --> 00:57:58,172
with our brain has?
1312
00:57:58,215 --> 00:58:01,573
No, a machine has
its own consciousness.
1313
00:58:01,575 --> 00:58:03,533
All you're doing
is programming it
1314
00:58:03,535 --> 00:58:07,011
to be cognitively responsive
the way you are.
1315
00:58:07,013 --> 00:58:09,173
I remember well,
when my father died,
1316
00:58:09,175 --> 00:58:13,093
asking my mother,
"If I could've captured
1317
00:58:13,095 --> 00:58:16,141
the very being of my father
in a machine,
1318
00:58:16,185 --> 00:58:17,773
and I could put him
in an android
1319
00:58:17,775 --> 00:58:20,533
that looked exactly like him,
had all the mannerisms,
1320
00:58:20,535 --> 00:58:23,893
and it was warm and it smelled
and it felt like him,
1321
00:58:23,895 --> 00:58:25,933
would you do it?"
and she said, "Absolutely not.
1322
00:58:25,935 --> 00:58:28,550
It wouldn't be your father,
it wouldn't be him."
1323
00:58:28,552 --> 00:58:30,647
I think that someday
you can upload
1324
00:58:30,649 --> 00:58:32,743
your current neural network...
1325
00:58:33,560 --> 00:58:35,376
but that's not you.
1326
00:58:35,528 --> 00:58:39,328
That's just your current
neural map, right?
1327
00:58:39,370 --> 00:58:43,515
[music]
1328
00:58:44,474 --> 00:58:46,634
As with any concept
that proposes
1329
00:58:46,636 --> 00:58:49,488
to change the natural
order of things,
1330
00:58:49,490 --> 00:58:54,672
the idea of extreme longevity
can be met with disbelief.
1331
00:58:55,357 --> 00:58:57,936
But there is currently
an international movement
1332
00:58:57,938 --> 00:59:00,889
called transhumanism
that is concerned
1333
00:59:00,891 --> 00:59:02,773
with fundamentally transforming
1334
00:59:02,775 --> 00:59:06,170
the human condition
by developing technologies
1335
00:59:06,172 --> 00:59:08,628
to greatly enhance human beings
1336
00:59:08,630 --> 00:59:13,324
in an intellectual, physical,
and psychological capacity.
1337
00:59:13,326 --> 00:59:16,183
I really want to just
simply live indefinitely,
1338
00:59:16,185 --> 00:59:19,053
and not have the spectre
of death hanging over me,
1339
00:59:19,055 --> 00:59:21,272
potentially
at any moment taking away
1340
00:59:21,274 --> 00:59:23,511
this thing
that we call existence.
1341
00:59:23,513 --> 00:59:25,780
So for me, that's
the primary goal
1342
00:59:25,782 --> 00:59:28,019
of the transhumanist movement.
1343
00:59:28,021 --> 00:59:31,133
Transhumanists believe
we should use technology
1344
00:59:31,135 --> 00:59:34,766
to overcome
our biological limitations.
1345
00:59:35,232 --> 00:59:36,358
What does that mean?
1346
00:59:36,360 --> 00:59:40,613
Well, very simplistically,
perhaps,
1347
00:59:40,615 --> 00:59:43,514
I think we should be aiming
for what one might call
1348
00:59:43,516 --> 00:59:45,634
a triple S civilization
1349
00:59:45,636 --> 00:59:49,647
of super intelligence,
super longevity,
1350
00:59:49,649 --> 00:59:51,081
and super happiness.
1351
00:59:51,083 --> 00:59:53,714
We have been evolving
through hundreds and hundreds
1352
00:59:53,716 --> 00:59:55,865
of thousands of years,
human beings,
1353
00:59:55,867 --> 00:59:58,339
and transhumanism
is the climax of that.
1354
00:59:58,341 --> 01:00:01,120
It's the result of
how we're going to get
1355
01:00:01,122 --> 01:00:05,874
to some kind of great future
where we are way beyond
1356
01:00:05,876 --> 01:00:07,253
what it means
to be a human being.
1357
01:00:07,255 --> 01:00:12,534
Unfortunately, organic robots
grow old and die,
1358
01:00:12,536 --> 01:00:17,069
and this isn't a choice,
it's completely involuntary.
1359
01:00:17,071 --> 01:00:19,381
120 years from now,
in the absence
1360
01:00:19,383 --> 01:00:22,493
of radical
biological interventions,
1361
01:00:22,495 --> 01:00:26,149
everyone listening
to this video will be dead,
1362
01:00:26,154 --> 01:00:28,472
and not beautifully so,
1363
01:00:28,474 --> 01:00:32,412
but one's last years tend
to be those of decrepitude,
1364
01:00:32,414 --> 01:00:34,912
frequently senility, infirmity,
1365
01:00:34,914 --> 01:00:40,240
and transhumanists don't accept
aging as inevitable.
1366
01:00:40,242 --> 01:00:42,853
There's no immutable
law of nature
1367
01:00:42,855 --> 01:00:45,894
that says that organic robots
must grow old.
1368
01:00:45,896 --> 01:00:48,613
After all, silicon robots,
they don't need to grow old.
1369
01:00:48,615 --> 01:00:51,235
Their parts can be
replaced and upgraded.
1370
01:00:51,990 --> 01:00:53,787
Our bodies are capable
of adjusting
1371
01:00:53,789 --> 01:00:56,024
in ways we've hardly dreamt of.
1372
01:00:56,736 --> 01:00:58,805
If we can only find the key.
1373
01:00:59,354 --> 01:01:02,469
I'm so close now, so very close.
1374
01:01:04,055 --> 01:01:06,354
- The key to what?
- To be able to replace
1375
01:01:06,356 --> 01:01:08,135
diseased and damaged parts
of the body
1376
01:01:08,137 --> 01:01:10,876
as easily as we replace
eye corneas now.
1377
01:01:11,402 --> 01:01:13,282
Can't be done.
1378
01:01:14,042 --> 01:01:15,882
It can be done!
1379
01:01:16,452 --> 01:01:18,946
The relationship
between life and death
1380
01:01:19,295 --> 01:01:23,930
and the role of technology
in forestalling death
1381
01:01:24,135 --> 01:01:27,626
creates death, in a way,
as a new kind of problem.
1382
01:01:28,001 --> 01:01:30,860
Death becomes something
that needs to be solved.
1383
01:01:30,950 --> 01:01:32,389
Why would it be good
to live forever?
1384
01:01:32,391 --> 01:01:33,810
'Cause if you have a shit day,
1385
01:01:33,812 --> 01:01:36,340
it dilutes the depression
1386
01:01:36,342 --> 01:01:38,721
within countless
other days, you know?
1387
01:01:38,723 --> 01:01:44,133
And all of these metrics
are about failing to exist
1388
01:01:44,135 --> 01:01:46,922
in the full light
of your own autonomy.
1389
01:01:47,247 --> 01:01:49,042
That's all they're about,
1390
01:01:49,044 --> 01:01:50,929
and the paradox
of your own autonomy,
1391
01:01:50,931 --> 01:01:53,750
which is you're
simultaneously completely free
1392
01:01:53,752 --> 01:01:56,553
and completely unfree
at the same time.
1393
01:01:56,555 --> 01:01:57,878
I have been to many conferences
1394
01:01:57,880 --> 01:02:00,417
where you got
the anti-transhumanist person
1395
01:02:00,419 --> 01:02:02,308
saying, "This is just
denial of death."
1396
01:02:02,310 --> 01:02:05,425
At the end of the day,
that's all it's about, right?
1397
01:02:05,427 --> 01:02:08,753
And it's a kind of...
the last hangover
1398
01:02:08,755 --> 01:02:10,206
of the Abrahamic religions,
1399
01:02:10,208 --> 01:02:11,557
this idea that
we're gonna, you know,
1400
01:02:11,559 --> 01:02:12,940
come back to God and all this,
1401
01:02:12,942 --> 01:02:15,094
and we're gonna realize
our God-like nature,
1402
01:02:15,096 --> 01:02:20,742
and this is really the last kind
of point for that.
1403
01:02:21,465 --> 01:02:22,995
I think there's
a lot of truth to that,
1404
01:02:22,997 --> 01:02:24,713
especially in terms
of the issues
1405
01:02:24,715 --> 01:02:26,583
we've been talking about
where everybody just seems
1406
01:02:26,585 --> 01:02:28,698
to take for granted
that if you're given
1407
01:02:28,700 --> 01:02:31,221
the chance to live forever,
you'd live forever.
1408
01:02:31,223 --> 01:02:33,873
I think yes,
I think that that's true.
1409
01:02:33,875 --> 01:02:37,134
I don't think it's...
I think it's true,
1410
01:02:37,136 --> 01:02:38,521
I don't know
if it's as problematic
1411
01:02:38,523 --> 01:02:40,325
as people kind of claim it is.
1412
01:02:40,327 --> 01:02:42,177
In other words, that
there's something wrong with
1413
01:02:42,179 --> 01:02:43,560
having this fear of death
1414
01:02:43,562 --> 01:02:44,807
and wanting to live forever.
1415
01:02:44,809 --> 01:02:46,913
No, I think living forever is...
1416
01:02:46,915 --> 01:02:49,661
I think the question is what
are you doing with your time?
1417
01:02:49,663 --> 01:02:51,953
In what capacity
do you wanna live forever?
1418
01:02:51,955 --> 01:02:54,323
So I do think it makes
all the difference in the world
1419
01:02:54,325 --> 01:02:56,026
whether we're talking
about Kurzweil's way
1420
01:02:56,028 --> 01:02:57,661
or we're talking
about Aubrey de Grey's way.
1421
01:02:57,663 --> 01:02:59,416
The way the human
species operates
1422
01:02:59,418 --> 01:03:02,339
is that we're really never
fully ready for anything.
1423
01:03:02,341 --> 01:03:05,160
However, the prospect
of living indefinitely
1424
01:03:05,162 --> 01:03:08,989
is too promising
to turn down or to slow down
1425
01:03:08,991 --> 01:03:12,080
or to just not go after
at full speed.
1426
01:03:12,082 --> 01:03:14,193
By enabling us
to find technologies
1427
01:03:14,195 --> 01:03:16,294
to live indefinitely,
we're not making it
1428
01:03:16,296 --> 01:03:17,835
so that we're going
to live forever,
1429
01:03:17,837 --> 01:03:19,924
we're just making it
so we have that choice.
1430
01:03:19,926 --> 01:03:21,935
If people wanna
pull out of life
1431
01:03:21,937 --> 01:03:23,232
at some point down the future,
1432
01:03:23,234 --> 01:03:24,971
they're certainly
welcome to do that.
1433
01:03:24,973 --> 01:03:28,489
However, it's gonna be great
to eliminate death if we want,
1434
01:03:28,491 --> 01:03:31,054
because everyone
wants that choice.
1435
01:03:31,494 --> 01:03:34,273
There are other
socioeconomic repercussions
1436
01:03:34,275 --> 01:03:37,234
of living longer
that need to be considered.
1437
01:03:37,418 --> 01:03:40,013
The combination
of an aging population
1438
01:03:40,015 --> 01:03:42,771
and the escalating expenses
of healthcare,
1439
01:03:42,773 --> 01:03:45,674
social care, and retirement
is a problem
1440
01:03:45,676 --> 01:03:48,393
that already exists
the world over.
1441
01:03:48,395 --> 01:03:50,349
In the last century alone,
1442
01:03:50,351 --> 01:03:52,739
medicine has
massively contributed
1443
01:03:52,741 --> 01:03:55,195
to increased life expectancy.
1444
01:03:56,333 --> 01:03:59,152
According to the World
Health Organization,
1445
01:03:59,154 --> 01:04:02,010
the number of people
aged 60 years and over
1446
01:04:02,012 --> 01:04:06,271
is expected to increase
from 605 million today
1447
01:04:06,273 --> 01:04:10,304
to 2 billion by the year 2050.
1448
01:04:10,548 --> 01:04:13,504
As people live longer,
they become more susceptible
1449
01:04:13,506 --> 01:04:16,325
to noncommunicable diseases.
1450
01:04:16,327 --> 01:04:19,851
This becomes
enormously expensive.
1451
01:04:20,093 --> 01:04:25,265
Dementia alone costs the NHS
£23 billion a year.
1452
01:04:25,614 --> 01:04:28,454
Currently, elderly non-workers
1453
01:04:28,456 --> 01:04:32,155
account for a vast portion
of our population
1454
01:04:32,157 --> 01:04:36,460
and a vast portion of our
workforce cares for them.
1455
01:04:36,614 --> 01:04:42,075
It is economically beneficial
to end aging.
1456
01:04:42,455 --> 01:04:45,356
Social life is organized
around people having...
1457
01:04:45,358 --> 01:04:48,570
occupying certain roles
at certain ages, right?
1458
01:04:48,659 --> 01:04:50,888
And you can already see
the kinds of problems
1459
01:04:50,890 --> 01:04:52,993
that are caused
to the welfare system
1460
01:04:52,995 --> 01:04:56,033
when people live substantially
beyond the age of 65,
1461
01:04:56,035 --> 01:05:01,073
because when the number 65
was selected by Bismarck
1462
01:05:01,075 --> 01:05:03,151
when he started the first
social security system,
1463
01:05:03,153 --> 01:05:06,021
in Germany, the expectation was
that people would be living
1464
01:05:06,023 --> 01:05:08,521
two years beyond
the retirement age
1465
01:05:08,523 --> 01:05:10,291
to be able to get
the social security.
1466
01:05:10,293 --> 01:05:12,354
So it wasn't gonna
break the bank, okay?
1467
01:05:12,356 --> 01:05:15,044
Problem now is you've got people
who are living 20 years
1468
01:05:15,046 --> 01:05:17,552
or more beyond
the retirement age,
1469
01:05:17,554 --> 01:05:18,932
and that's unaffordable.
1470
01:05:18,934 --> 01:05:21,873
There's no question that,
within society as a whole,
1471
01:05:21,875 --> 01:05:25,953
there is an enormous tendency
to knee-jerk reactions
1472
01:05:25,955 --> 01:05:29,021
with regard to the problems
that might be created
1473
01:05:29,023 --> 01:05:31,810
if we were to eliminate aging.
1474
01:05:31,812 --> 01:05:33,913
There have been people
that said, you know,
1475
01:05:33,915 --> 01:05:36,377
"You'll be bored,
you won't have anything to do."
1476
01:05:36,379 --> 01:05:39,716
Speaking from a place
of a lifespan
1477
01:05:39,718 --> 01:05:41,487
that's 80 or 90 years old,
1478
01:05:41,489 --> 01:05:43,054
saying that we're
going to be bored
1479
01:05:43,056 --> 01:05:46,433
if we live to 150 years old
really is just invalid.
1480
01:05:46,435 --> 01:05:48,771
We have no idea what
we'll do with that time.
1481
01:05:48,773 --> 01:05:50,372
Part of
this transhumanism stuff,
1482
01:05:50,374 --> 01:05:53,182
where it gets some kind
of real policy traction,
1483
01:05:53,184 --> 01:05:57,169
is people who want us
not to live to be 1,000,
1484
01:05:57,171 --> 01:05:59,799
but maybe if we can
take that 20 years
1485
01:05:59,801 --> 01:06:03,122
that we're living longer now
than we did 100 years ago
1486
01:06:03,124 --> 01:06:05,075
and keep that productive.
1487
01:06:05,077 --> 01:06:06,894
So in other words,
if you could still be strong
1488
01:06:06,896 --> 01:06:10,154
and still be sharp
into your 70s and 80s,
1489
01:06:10,156 --> 01:06:12,802
and so not have to pull
any social security
1490
01:06:12,804 --> 01:06:15,466
until quite late in life
and then you'll be...
1491
01:06:15,468 --> 01:06:17,254
you'll have 20 extra years
1492
01:06:17,256 --> 01:06:19,330
where you're actually
contributing to the economy.
1493
01:06:19,332 --> 01:06:21,325
So one of the areas that
we're gonna have to think about
1494
01:06:21,327 --> 01:06:23,661
in the near future
if we do achieve
1495
01:06:23,663 --> 01:06:26,655
extreme longevity physically
1496
01:06:26,657 --> 01:06:29,537
is the idea of overpopulation.
1497
01:06:29,561 --> 01:06:32,713
This is a controversial idea,
of course,
1498
01:06:32,715 --> 01:06:36,713
and we may face a time period
where we have to say to people,
1499
01:06:36,715 --> 01:06:40,654
"You have to be licensed
to have more than one child."
1500
01:06:40,656 --> 01:06:43,393
The ideas around children,
I hope, will probably change
1501
01:06:43,395 --> 01:06:48,226
when people start to realize
that the values of children
1502
01:06:48,707 --> 01:06:51,045
need to be defined first
before we have them.
1503
01:06:51,047 --> 01:06:52,706
And that's not
something that we do.
1504
01:06:52,708 --> 01:06:55,591
We just have them,
and we don't define why
1505
01:06:55,593 --> 01:06:56,919
or for what purpose.
1506
01:06:56,921 --> 01:06:58,698
I'm not saying there has
to be a defined purpose,
1507
01:06:58,700 --> 01:07:01,232
but I'm saying that just
to continue our gene line
1508
01:07:01,234 --> 01:07:02,552
isn't the biggest reason.
1509
01:07:02,554 --> 01:07:04,153
At the moment, ultimately,
1510
01:07:04,155 --> 01:07:07,713
we see in any society
where fertility rate goes down
1511
01:07:07,715 --> 01:07:10,753
because of female prosperity
and emancipation
1512
01:07:10,755 --> 01:07:13,334
and education,
we also see the age
1513
01:07:13,336 --> 01:07:16,435
of the average childbirth
go up, right?
1514
01:07:16,437 --> 01:07:18,198
We see women having
their children later.
1515
01:07:18,200 --> 01:07:21,052
Now of course, at the moment,
there's a deadline for that,
1516
01:07:21,054 --> 01:07:22,872
but that's not going
to exist anymore,
1517
01:07:22,874 --> 01:07:24,673
because menopause
is part of aging.
1518
01:07:24,675 --> 01:07:27,443
So women who are choosing
to have their children
1519
01:07:27,445 --> 01:07:28,849
a bit later now,
1520
01:07:28,851 --> 01:07:30,817
it stands to reason
that a lot of them
1521
01:07:30,819 --> 01:07:32,494
are probably gonna choose
to have their children
1522
01:07:32,496 --> 01:07:34,715
a lot later and a lot later
and that, of course,
1523
01:07:34,717 --> 01:07:37,734
also has an enormous
depressive impact
1524
01:07:37,736 --> 01:07:40,599
on the trajectory
of global population.
1525
01:07:40,601 --> 01:07:42,236
If we actually said
to everybody, "Okay,
1526
01:07:42,238 --> 01:07:44,145
you're all now
gonna live for 1,000 years,"
1527
01:07:44,147 --> 01:07:46,043
we could restructure society
1528
01:07:46,045 --> 01:07:47,957
so it's on
these 1,000-year cycles.
1529
01:07:47,959 --> 01:07:51,322
That's possible,
but the problem becomes
1530
01:07:51,324 --> 01:07:55,244
when you still allow people
to live the normal length
1531
01:07:55,246 --> 01:07:57,910
and you're also allowing
some people to live 1,000 years
1532
01:07:57,912 --> 01:08:00,653
then how do you compare
the value of the lives,
1533
01:08:00,655 --> 01:08:02,273
the amount of experience?
1534
01:08:02,275 --> 01:08:04,653
Supposing a 585-year-old guy
1535
01:08:04,655 --> 01:08:06,712
goes up for a job
against a 23-year-old.
1536
01:08:06,714 --> 01:08:08,340
How do you measure
the experience?
1537
01:08:08,342 --> 01:08:10,103
What, the old guy
always gets the job?
1538
01:08:10,105 --> 01:08:13,957
I mean, really, these kinds
of problems would arise
1539
01:08:13,959 --> 01:08:16,159
unless there was
some kind of legislation
1540
01:08:16,161 --> 01:08:18,814
about permissible
variation in age.
1541
01:08:18,816 --> 01:08:20,410
This is a bit of a conundrum
1542
01:08:20,412 --> 01:08:24,512
because we're all
extending our lifespans,
1543
01:08:24,514 --> 01:08:27,660
and the question is,
would you like to live
1544
01:08:27,662 --> 01:08:30,033
for not 100 years but 200?
1545
01:08:30,035 --> 01:08:32,132
Would you choose
to if you could?
1546
01:08:32,134 --> 01:08:34,153
It would be very,
very difficult to say no.
1547
01:08:34,155 --> 01:08:37,753
The reality is the replacement
of human piece parts
1548
01:08:37,755 --> 01:08:40,181
is probably gonna take us
in that direction,
1549
01:08:40,183 --> 01:08:41,873
but it will be market driven,
1550
01:08:41,875 --> 01:08:44,353
and those people with the money
will be able to afford
1551
01:08:44,355 --> 01:08:46,403
to live a lot longer
than those people without.
1552
01:08:46,405 --> 01:08:48,283
Pretty much most of
the discovery these days
1553
01:08:48,285 --> 01:08:50,736
takes place in Western Europe
or the United States
1554
01:08:50,738 --> 01:08:55,126
or one or two other countries,
China, Singapore, and so on,
1555
01:08:55,128 --> 01:08:58,767
but if they're valuable enough...
and I don't mean monetarily...
1556
01:08:58,769 --> 01:09:01,793
if they're worth having,
then people extend them.
1557
01:09:01,795 --> 01:09:04,020
We have to start somewhere,
and I don't believe
1558
01:09:04,022 --> 01:09:06,028
in the "dog
in the manger" attitude,
1559
01:09:06,030 --> 01:09:07,496
which is that you don't
give it to anybody
1560
01:09:07,498 --> 01:09:09,295
until you can provide it
for everybody.
1561
01:09:09,297 --> 01:09:11,836
All technologies
are discontinuous.
1562
01:09:11,838 --> 01:09:14,407
There are people
at this very moment
1563
01:09:14,409 --> 01:09:16,403
who are walking four kilometers
1564
01:09:16,405 --> 01:09:18,686
to get a bucket of water
from a well.
1565
01:09:18,863 --> 01:09:22,275
You know, there are people
who are having cornea operations
1566
01:09:22,277 --> 01:09:25,528
that are done with a needle
that's stuck in their eye
1567
01:09:25,530 --> 01:09:27,876
and their cornea is scraped out.
1568
01:09:27,878 --> 01:09:30,426
You know, so these ideas
of totalizing
1569
01:09:30,428 --> 01:09:33,068
utopian technological
intervention
1570
01:09:33,075 --> 01:09:35,790
are part of a discontinuous
technological world
1571
01:09:35,792 --> 01:09:38,817
and the world will always be
discontinuous technologically.
1572
01:09:38,819 --> 01:09:43,270
When a child has to drink
dirty water, cannot get food,
1573
01:09:43,272 --> 01:09:45,891
and is dying of starvation
and disease,
1574
01:09:45,893 --> 01:09:48,397
and the solution
is just a few dollars,
1575
01:09:48,559 --> 01:09:50,098
there's something badly wrong.
1576
01:09:50,100 --> 01:09:52,660
We need to fix those things.
1577
01:09:53,035 --> 01:09:56,926
The only way that
we could fix them in the past
1578
01:09:56,928 --> 01:09:59,059
would've been
at unbelievable cost
1579
01:09:59,061 --> 01:10:00,223
because of the limitation
1580
01:10:00,225 --> 01:10:02,496
of our industrial capacity
and capability.
1581
01:10:02,498 --> 01:10:04,498
Not anymore.
1582
01:10:08,869 --> 01:10:12,921
...not the one
which stops us from aging.
1583
01:10:13,124 --> 01:10:17,864
In the last 20 years,
healthcare in Sub-Saharan Africa
1584
01:10:17,866 --> 01:10:19,986
has greatly improved.
1585
01:10:20,392 --> 01:10:23,152
HIV prevalence has gone down.
1586
01:10:23,202 --> 01:10:26,173
Infant mortality rate
has gone down.
1587
01:10:26,175 --> 01:10:28,865
Immunization rates have gone up,
1588
01:10:28,867 --> 01:10:32,827
and the drug supply
in many areas has risen.
1589
01:10:32,871 --> 01:10:35,723
However, healthcare
and medication
1590
01:10:35,725 --> 01:10:37,353
in developing countries
1591
01:10:37,355 --> 01:10:41,296
is not always affordable
or even readily available.
1592
01:10:41,394 --> 01:10:43,188
It is not just medicine.
1593
01:10:43,190 --> 01:10:45,715
According to the World
Health Organization,
1594
01:10:45,717 --> 01:10:48,887
over 700 million people
worldwide
1595
01:10:48,889 --> 01:10:52,409
do not have access
to clean drinking water.
1596
01:10:52,515 --> 01:10:55,410
We still live in an age
where over one billion people
1597
01:10:55,412 --> 01:10:57,614
live on less
than a dollar a day
1598
01:10:57,616 --> 01:11:00,096
and live in extreme poverty.
1599
01:11:00,130 --> 01:11:03,106
It is ultimately
not a scientific issue,
1600
01:11:03,108 --> 01:11:05,803
it is a geopolitical issue.
1601
01:12:07,184 --> 01:12:09,744
Now a lot of people,
a lot of philanthropists
1602
01:12:09,814 --> 01:12:14,447
are of the view that
the most important thing to do
1603
01:12:14,449 --> 01:12:18,174
is to address the trailing edge
of quality of life.
1604
01:12:18,176 --> 01:12:20,938
In other words, to help
the disadvantaged.
1605
01:12:20,940 --> 01:12:23,687
But some visionary
philanthropists
1606
01:12:23,689 --> 01:12:26,525
such as the ones that fund
SENS Research Foundation...
1607
01:12:26,527 --> 01:12:28,313
and the fact is
I agree with this,
1608
01:12:28,315 --> 01:12:30,433
and that's why I've put
most of my inheritance
1609
01:12:30,435 --> 01:12:32,610
into SENS Research
Foundation, too...
1610
01:12:32,612 --> 01:12:36,564
we feel that, actually,
in the long run,
1611
01:12:36,566 --> 01:12:40,111
you lose out if you focus
too exclusively
1612
01:12:40,113 --> 01:12:41,508
on the trailing edge.
1613
01:12:41,510 --> 01:12:44,656
You've also got to push forward
the leading edge,
1614
01:12:44,658 --> 01:12:47,848
so that in the long term,
everybody moves forward.
1615
01:13:38,001 --> 01:13:40,633
Due to a lack of funding
from governments,
1616
01:13:40,635 --> 01:13:44,873
anti-aging research is often
pushed into the private sector.
1617
01:13:44,875 --> 01:13:48,851
If we look at funding
for disease...
1618
01:13:48,853 --> 01:13:50,474
cancer, heart disease...
1619
01:13:50,476 --> 01:13:52,814
they get six billion,
eight billion, ten billion.
1620
01:13:52,816 --> 01:13:55,557
AIDS still gets two
to four billion.
1621
01:13:55,753 --> 01:13:57,633
Let's look
at Alzheimer's disease.
1622
01:13:57,635 --> 01:13:59,982
It's probably the most
important disease of aging,
1623
01:13:59,984 --> 01:14:03,557
the brain... it gets
under half a billion
1624
01:14:03,608 --> 01:14:04,781
from the federal government,
1625
01:14:04,783 --> 01:14:07,000
and a lot of that
goes to programs
1626
01:14:07,002 --> 01:14:08,673
that are tied up
with pharma trials
1627
01:14:08,675 --> 01:14:10,755
where we don't really
see it in the labs,
1628
01:14:10,788 --> 01:14:13,205
so you're maybe down to two
or three hundred million.
1629
01:14:13,364 --> 01:14:16,353
It's not nearly enough
to really make a dent.
1630
01:14:16,569 --> 01:14:22,109
The question is, why, when it
comes to Alzheimer's disease,
1631
01:14:22,111 --> 01:14:24,093
which is a problem
in the elderly,
1632
01:14:24,095 --> 01:14:26,353
do we just see it as...
the federal government
1633
01:14:26,355 --> 01:14:28,375
seems to see it
as a red-headed stepchild.
1634
01:14:28,377 --> 01:14:30,236
They don't take care of it.
1635
01:14:31,077 --> 01:14:32,461
Some people say it's because...
1636
01:14:32,463 --> 01:14:34,023
people say, "Well,
it affects old people,
1637
01:14:34,025 --> 01:14:36,177
they lived their lives,
let them go."
1638
01:14:36,179 --> 01:14:38,648
No one wants to admit that,
but maybe subconsciously,
1639
01:14:38,650 --> 01:14:40,414
when Congress is
thinking about this,
1640
01:14:40,416 --> 01:14:42,096
that's at play, who knows?
1641
01:14:43,007 --> 01:14:45,033
Maybe it's
much more compelling
1642
01:14:45,035 --> 01:14:49,031
to wanna put money into diseases
that affect young people
1643
01:14:49,033 --> 01:14:50,974
who still have
their whole life to live
1644
01:14:50,976 --> 01:14:52,892
when they have AIDS
or breast cancer
1645
01:14:52,894 --> 01:14:55,481
or cancer that can strike
somebody at 30 or 40 years old.
1646
01:14:55,483 --> 01:14:57,953
Age might be part of it,
and even if it's something
1647
01:14:57,955 --> 01:14:59,758
where you'd say,
"No, it can't be that!"
1648
01:14:59,760 --> 01:15:01,499
you never know what's
happening subconsciously
1649
01:15:01,501 --> 01:15:03,160
in those who are
making the decisions.
1650
01:15:03,162 --> 01:15:04,805
Otherwise, it just
makes no sense at all.
1651
01:15:04,807 --> 01:15:06,349
I don't know how to explain it.
1652
01:15:06,351 --> 01:15:07,934
When you talk
to people about aging
1653
01:15:07,936 --> 01:15:11,113
and rejuvenation medicine,
you're talking about things
1654
01:15:11,115 --> 01:15:13,601
that they haven't put
in the same category
1655
01:15:13,603 --> 01:15:14,855
as things that they can fight.
1656
01:15:14,857 --> 01:15:16,148
They are willing to put money
1657
01:15:16,150 --> 01:15:18,036
towards solving cancer
and curing cancer.
1658
01:15:18,038 --> 01:15:20,458
It's something they might have
the potential of experiencing.
1659
01:15:20,460 --> 01:15:23,153
But the thing that's 100 percent
in terms of probability,
1660
01:15:23,155 --> 01:15:25,513
they haven't classified that
as being in the same category
1661
01:15:25,515 --> 01:15:27,812
when actually it is
and actually it's more dramatic
1662
01:15:27,814 --> 01:15:29,869
because 100 percent
of people experience it.
1663
01:15:29,912 --> 01:15:32,952
You need to have
the will to be cured.
1664
01:15:32,995 --> 01:15:37,062
Beyond that, medical science
will play its part.
1665
01:15:37,064 --> 01:15:39,945
I think it's essentially a crime
1666
01:15:39,947 --> 01:15:42,430
to not support
life extension science,
1667
01:15:42,432 --> 01:15:44,285
because if you support
the other side,
1668
01:15:44,287 --> 01:15:46,774
you're an advocate
for killing someone.
1669
01:15:46,776 --> 01:15:51,148
When you actually support
a culture of death,
1670
01:15:51,150 --> 01:15:54,364
when you support
embracing death,
1671
01:15:54,366 --> 01:15:58,055
what you're really doing is
not supporting embracing life.
1672
01:15:58,057 --> 01:15:59,916
Everyone ought to be healthy,
1673
01:15:59,918 --> 01:16:02,355
however long ago they were born.
1674
01:16:02,421 --> 01:16:05,333
When someone says, "Oh, dear,
we shouldn't defeat aging,
1675
01:16:05,335 --> 01:16:06,989
we shouldn't work
to eliminate aging,"
1676
01:16:06,991 --> 01:16:09,770
what they're actually saying
is they're not in favor
1677
01:16:09,772 --> 01:16:12,419
of healthcare for the elderly,
or to be more precise,
1678
01:16:12,421 --> 01:16:14,765
what they're saying is
they're only in favor
1679
01:16:14,767 --> 01:16:16,406
of healthcare for the elderly
1680
01:16:16,408 --> 01:16:18,893
so long as it
doesn't work very well.
1681
01:16:19,046 --> 01:16:21,752
And I think that's fucked up.
1682
01:16:21,999 --> 01:16:24,601
In September 2013,
1683
01:16:24,603 --> 01:16:27,747
Google announced
the creation of Calico,
1684
01:16:27,749 --> 01:16:29,984
an independent biotech company
1685
01:16:29,986 --> 01:16:32,846
that remains to this day
a little mysterious.
1686
01:16:33,155 --> 01:16:37,575
Its aim is to tackle aging
and devise interventions
1687
01:16:37,577 --> 01:16:41,692
that enable people to lead
longer and healthier lives.
1688
01:16:41,694 --> 01:16:44,153
In September 2014,
1689
01:16:44,155 --> 01:16:47,117
the life extension company
announced it was partnering
1690
01:16:47,119 --> 01:16:50,593
with biopharmaceutical
giant AbbVie,
1691
01:16:50,595 --> 01:16:55,971
and made a $1.5 billion
investment in research.
1692
01:16:56,038 --> 01:16:57,828
I think one of
the biggest obstacles
1693
01:16:57,830 --> 01:17:00,089
that we have at the moment
to come to terms
1694
01:17:00,091 --> 01:17:02,465
with this future world
we're talking about
1695
01:17:02,467 --> 01:17:04,646
is a lot of people
who basically
1696
01:17:04,648 --> 01:17:06,974
don't want it to happen at all
1697
01:17:06,976 --> 01:17:10,414
and so are placing
all kinds of ethical
1698
01:17:10,416 --> 01:17:12,255
and institutional restrictions
1699
01:17:12,257 --> 01:17:14,028
on the development
of this stuff
1700
01:17:14,030 --> 01:17:16,169
so that it becomes difficult,
let's say, in universities
1701
01:17:16,171 --> 01:17:18,570
to experiment with certain
kinds of drugs, right?
1702
01:17:18,572 --> 01:17:20,713
To develop certain kinds
of machines maybe even,
1703
01:17:20,715 --> 01:17:22,993
and as a result,
all of that kind of research
1704
01:17:22,995 --> 01:17:25,773
ends up going into
either the private sector
1705
01:17:25,775 --> 01:17:28,161
or maybe underground, right?
1706
01:17:28,163 --> 01:17:30,664
Or going into some country
that's an ethics-free zone
1707
01:17:30,666 --> 01:17:32,481
like China
or someplace like that,
1708
01:17:32,483 --> 01:17:34,309
and I think that's
where the real problems
1709
01:17:34,311 --> 01:17:37,286
potentially lie, because
we really need to be doing,
1710
01:17:37,288 --> 01:17:39,039
you know, we need to be
developing this stuff,
1711
01:17:39,041 --> 01:17:41,122
but in the public eye, right?
1712
01:17:41,124 --> 01:17:44,023
So it should be done
by the mainstream authorities
1713
01:17:44,025 --> 01:17:45,672
so we can monitor
the consequences
1714
01:17:45,674 --> 01:17:47,172
as they're happening
and then be able
1715
01:17:47,174 --> 01:17:48,534
to take appropriate action.
1716
01:17:48,536 --> 01:17:50,433
But I'm afraid that
a lot of this stuff
1717
01:17:50,435 --> 01:17:53,094
is perhaps being driven outside
1718
01:17:53,096 --> 01:17:55,359
because of all the restrictions
that are placed on it.
1719
01:17:55,361 --> 01:17:56,567
That I think is very worrisome,
1720
01:17:56,569 --> 01:17:58,374
because then you can't
keep track of the results,
1721
01:17:58,376 --> 01:18:00,955
and you don't know
exactly what's happening.
1722
01:18:00,957 --> 01:18:02,969
And I think that's
a real problem already
1723
01:18:02,971 --> 01:18:05,596
with a lot of this
more futuristic stuff.
1724
01:18:06,890 --> 01:18:08,945
Arguably, the human condition
1725
01:18:08,947 --> 01:18:11,940
is defined by
our anxiety about death.
1726
01:18:11,980 --> 01:18:14,390
It's little wonder
that throughout history,
1727
01:18:14,392 --> 01:18:17,605
mankind has built
countless belief systems
1728
01:18:17,607 --> 01:18:20,340
in a bid to pacify
the fear of death
1729
01:18:20,342 --> 01:18:23,479
through the promise
of endless paradise.
1730
01:18:23,629 --> 01:18:26,829
Ultimately, death always wins.
1731
01:18:27,075 --> 01:18:31,651
It's not so much death
we fear, it's dying.
1732
01:18:31,804 --> 01:18:34,015
The relationship
between life and death
1733
01:18:34,017 --> 01:18:38,294
is often figured
in terms of immortality
1734
01:18:38,296 --> 01:18:40,174
and the quest for immortality.
1735
01:18:40,233 --> 01:18:42,574
There's a philosopher,
a brilliant philosopher
1736
01:18:42,576 --> 01:18:46,107
called Stephen Cave, who,
in his book Immortality,
1737
01:18:46,109 --> 01:18:50,088
argues that our fear of death
is the great driver
1738
01:18:50,090 --> 01:18:53,927
of all civilization,
of all human endeavor,
1739
01:18:53,929 --> 01:18:57,461
and he identifies
four different ways
1740
01:18:57,463 --> 01:19:00,762
in which people
seek immortality,
1741
01:19:00,764 --> 01:19:04,820
so firstly the idea
of extending life,
1742
01:19:04,822 --> 01:19:06,462
of living forever.
1743
01:19:06,679 --> 01:19:08,719
Secondly, the idea
of resurrection
1744
01:19:08,734 --> 01:19:12,526
so that we might come back
after death in some form.
1745
01:19:13,079 --> 01:19:16,353
Thirdly, the idea
of the immortality
1746
01:19:16,355 --> 01:19:20,452
of some part of ourselves
beyond the physical body.
1747
01:19:20,454 --> 01:19:23,578
So perhaps the immortality
of the soul, for example,
1748
01:19:23,580 --> 01:19:26,190
or living on in Heaven.
1749
01:19:26,333 --> 01:19:30,369
And finally the idea
of leaving a legacy.
1750
01:19:30,638 --> 01:19:33,457
I think that one
of life's challenges
1751
01:19:33,459 --> 01:19:35,633
really is
to come to terms
1752
01:19:35,635 --> 01:19:38,836
with our own finitude
and mortality
1753
01:19:38,838 --> 01:19:40,577
and human limitations,
1754
01:19:40,579 --> 01:19:44,447
and this is
an enormous challenge.
1755
01:19:44,995 --> 01:19:53,835
[music]
1756
01:19:59,205 --> 01:20:03,285
Technology, a reflection
of our times.
1757
01:20:03,515 --> 01:20:08,570
Efficient, computerized,
with a sleek beauty all its own.
1758
01:20:08,577 --> 01:20:11,271
Technology is
the human imagination
1759
01:20:11,273 --> 01:20:13,153
converted into reality.
1760
01:20:13,270 --> 01:20:15,273
We are all interested
in the future,
1761
01:20:15,275 --> 01:20:17,771
for that is where you and I
are going to spend
1762
01:20:17,773 --> 01:20:20,013
the rest of our lives.
1763
01:20:20,520 --> 01:20:24,854
[high-pitched tone]
1764
01:20:24,856 --> 01:20:28,726
[music]
1765
01:20:28,918 --> 01:20:31,068
It's impossible to say for sure
1766
01:20:31,070 --> 01:20:33,689
where these new technologies
will take us
1767
01:20:33,691 --> 01:20:37,633
and how we will prepare
to integrate them into society.
1768
01:20:37,635 --> 01:20:40,630
It is likely that they will
affect the very fabric
1769
01:20:40,632 --> 01:20:43,312
of global infrastructure.
1770
01:20:43,731 --> 01:20:47,904
There are always anxieties
surrounding new technologies,
1771
01:20:47,906 --> 01:20:51,066
and this time is no exception.
1772
01:20:51,226 --> 01:20:53,026
I think people fear change,
1773
01:20:53,475 --> 01:20:56,388
and so the future represents
this enormous amount
1774
01:20:56,390 --> 01:20:58,539
of change that's coming at us.
1775
01:20:58,614 --> 01:21:00,560
I do think it's
overwhelming for people,
1776
01:21:00,562 --> 01:21:03,838
you know, they are
afraid to change
1777
01:21:03,840 --> 01:21:05,233
the paradigm that they live in,
1778
01:21:05,235 --> 01:21:08,127
and when we talk about
the future of work and death,
1779
01:21:08,129 --> 01:21:10,208
what we're really
talking about is changing
1780
01:21:10,210 --> 01:21:12,232
a paradigm
that has existed for us
1781
01:21:12,234 --> 01:21:13,432
as long as we can remember.
1782
01:21:13,434 --> 01:21:16,397
All of this kind
of scaremongering
1783
01:21:16,399 --> 01:21:19,820
about harm and risk
and stuff like that
1784
01:21:20,376 --> 01:21:24,616
really is... it's based on a kind
of psychological illusion,
1785
01:21:24,618 --> 01:21:28,658
namely that you imagine that you
kind of see the bad state
1786
01:21:28,660 --> 01:21:31,280
as a bad state when it happens,
1787
01:21:31,282 --> 01:21:33,611
whereas in fact,
what more likely happens
1788
01:21:33,613 --> 01:21:36,461
is that you kinda get adjusted
to the various changes
1789
01:21:36,463 --> 01:21:37,903
that are happening
in your environment
1790
01:21:37,905 --> 01:21:39,591
so that when you actually
do reach that state
1791
01:21:39,593 --> 01:21:42,494
we're talking about,
it'll seem normal.
1792
01:21:43,226 --> 01:21:45,325
And because, look,
when the automobile
1793
01:21:45,327 --> 01:21:47,666
was introduced in
the early 20th century,
1794
01:21:47,668 --> 01:21:49,607
people were saying
this is just gonna pump
1795
01:21:49,609 --> 01:21:51,228
a lot of smoke
into the atmosphere,
1796
01:21:51,230 --> 01:21:53,091
it's going to ruin
our contact with nature
1797
01:21:53,093 --> 01:21:55,101
'cause we'll be in
these enclosed vehicles,
1798
01:21:55,103 --> 01:21:56,313
we'll be going so fast,
1799
01:21:56,315 --> 01:21:57,664
we won't be able
to appreciate things,
1800
01:21:57,666 --> 01:22:00,109
there'll be congestion,
blah, blah, blah,
1801
01:22:00,111 --> 01:22:01,864
they were right!
1802
01:22:01,866 --> 01:22:03,226
They were right, but of course,
1803
01:22:03,228 --> 01:22:05,175
by the time you get
to that state
1804
01:22:05,177 --> 01:22:07,196
where the automobile
has had that impact,
1805
01:22:07,198 --> 01:22:09,020
it's also had all
this benefit as well,
1806
01:22:09,022 --> 01:22:11,328
and your whole life has been
kind of restructured around it.
1807
01:22:11,330 --> 01:22:13,993
Arguably, people who are using
1808
01:22:13,995 --> 01:22:19,233
or have been conceived
using in vitro fertilization
1809
01:22:19,235 --> 01:22:24,713
are cyborgs way before
they were ever even people.
1810
01:22:24,715 --> 01:22:27,851
Now that doesn't mean
that we understand kinship
1811
01:22:27,853 --> 01:22:29,500
in a radically different way.
1812
01:22:29,502 --> 01:22:31,193
Just look at
the Industrial Revolution!
1813
01:22:31,195 --> 01:22:34,562
Is anyone actually...
does anyone actually regret
1814
01:22:34,564 --> 01:22:36,437
that the Industrial Revolution
occurred?
1815
01:22:36,439 --> 01:22:38,520
No, it was fairly
turbulent, right?
1816
01:22:38,522 --> 01:22:41,414
You know, we did actually
have a little bit of strife
1817
01:22:41,416 --> 01:22:44,140
in the transition
from a pre-industrial world
1818
01:22:44,142 --> 01:22:45,336
to the world we know today.
1819
01:22:45,338 --> 01:22:47,200
But the fact is, we adapted.
1820
01:22:47,202 --> 01:22:49,687
The most important thing here
is to try to compare it
1821
01:22:49,689 --> 01:22:51,273
to something in the past.
1822
01:22:51,275 --> 01:22:54,793
Imagine we were...
it was 1914, 100 years back,
1823
01:22:54,795 --> 01:22:57,297
and I told you that
most people on the planet
1824
01:22:57,299 --> 01:22:58,673
would have the ability to have
1825
01:22:58,675 --> 01:23:00,325
this tiny cell phone screen
in front of them
1826
01:23:00,327 --> 01:23:03,313
and video conference with ten
of their friends all at once.
1827
01:23:03,315 --> 01:23:06,014
If it was 1914,
you would look at me and say,
1828
01:23:06,016 --> 01:23:07,937
"That's absurd,
this guy's insane."
1829
01:23:07,939 --> 01:23:10,336
However, it's
sort of the same concept
1830
01:23:10,338 --> 01:23:11,753
when I tell you now in 50 years
1831
01:23:11,755 --> 01:23:14,260
we're going to be digital
versions of ourselves,
1832
01:23:14,262 --> 01:23:15,553
it's not so far-fetched.
1833
01:23:15,555 --> 01:23:17,612
You have to look at it
in the historical context.
1834
01:23:17,614 --> 01:23:20,473
All concepts of
technological progress
1835
01:23:20,475 --> 01:23:24,713
in that way are linked
to post-Enlightenment ideas
1836
01:23:24,715 --> 01:23:27,250
or non... they're linked
to the idea
1837
01:23:27,252 --> 01:23:30,218
of the arrow of time
being in free flight forward,
1838
01:23:30,220 --> 01:23:34,028
but they're also chiliastic,
they propose an end state.
1839
01:23:34,213 --> 01:23:35,673
They propose the end state,
1840
01:23:35,675 --> 01:23:37,334
and the end state
is the singularity,
1841
01:23:37,336 --> 01:23:39,593
but they propose it
as something desirable.
1842
01:23:39,595 --> 01:23:42,993
Now any kind of philosophy
like that, it's, you know,
1843
01:23:42,995 --> 01:23:46,554
jam tomorrow, jam yesterday,
but never jam today.
1844
01:23:46,556 --> 01:23:48,833
They're all philosophies
that are about
1845
01:23:48,835 --> 01:23:52,153
accepting the shit you're in...
work, consume, die...
1846
01:23:52,155 --> 01:23:54,492
because there is something
better in the future,
1847
01:23:54,494 --> 01:23:56,242
or there's something
more innovative in the future.
1848
01:23:56,244 --> 01:23:58,760
There are good scenarios
and there are bad scenarios.
1849
01:23:58,762 --> 01:23:59,924
I don't know where we're headed.
1850
01:23:59,926 --> 01:24:03,294
I mean, I don't think
anyone really knows.
1851
01:24:03,296 --> 01:24:04,993
You know, if anyone
claims to know the future,
1852
01:24:04,995 --> 01:24:07,073
they're guessing,
they're extrapolating forward,
1853
01:24:07,075 --> 01:24:08,433
and we draw
some lines and curves
1854
01:24:08,435 --> 01:24:10,088
and see where
technology's gonna be.
1855
01:24:10,090 --> 01:24:11,390
What that means,
1856
01:24:11,392 --> 01:24:13,113
I don't think any of us
really understand.
1857
01:24:13,115 --> 01:24:16,153
Everyone assumes that
the future is going to be
1858
01:24:16,155 --> 01:24:17,635
dramatically different
from today
1859
01:24:17,637 --> 01:24:18,934
and that's absolutely true,
1860
01:24:18,936 --> 01:24:20,792
but it's also true
that the future will be
1861
01:24:20,794 --> 01:24:22,174
an extension of today's world.
1862
01:24:22,176 --> 01:24:24,713
The problems
that exist in today's world
1863
01:24:24,715 --> 01:24:27,247
are still gonna be
with us in the future.
1864
01:24:27,656 --> 01:24:29,555
Human nature is
not gonna change.
1865
01:24:29,557 --> 01:24:32,198
The end point
in all of this game
1866
01:24:32,200 --> 01:24:34,966
will become a bit of a moral
1867
01:24:35,075 --> 01:24:38,167
and an ethical question
for society
1868
01:24:38,169 --> 01:24:40,739
where decisions
will have to be made.
1869
01:24:41,355 --> 01:24:45,198
Like life itself, work
and death, for better or worse,
1870
01:24:45,200 --> 01:24:47,393
are two features
of the human experience
1871
01:24:47,395 --> 01:24:49,583
that are thrust upon us.
1872
01:24:50,494 --> 01:24:52,963
Whether or not
we define work and death
1873
01:24:52,965 --> 01:24:54,979
as problems in need of remedy,
1874
01:24:54,981 --> 01:24:59,042
human ingenuity is a progressive
and natural extension
1875
01:24:59,044 --> 01:25:01,599
of our own evolution.
1876
01:25:01,601 --> 01:25:07,333
[music]
1877
01:25:08,195 --> 01:25:11,523
Advancing our technological
capabilities is a way
1878
01:25:11,525 --> 01:25:15,309
of dealing with our limitations
as human beings.
1879
01:25:16,331 --> 01:25:17,734
Must we do something
1880
01:25:17,736 --> 01:25:20,374
just because we're capable
of doing something?
1881
01:25:20,428 --> 01:25:22,734
Or can we withhold
our hands and say
1882
01:25:22,736 --> 01:25:24,776
"No, this is not
a good thing to do"?
1883
01:25:24,778 --> 01:25:27,973
This is something
that the human species
1884
01:25:27,975 --> 01:25:29,393
must decide for itself.
1885
01:25:29,395 --> 01:25:32,090
You and I, we can't just
leave it to the scientists.
1886
01:25:32,092 --> 01:25:35,045
We have to know
what's going on and why!
1888
01:25:38,002 --> 01:25:43,722
[music]