1
00:00:08,920 --> 00:00:10,440
[Narrator] Modern man
is a miracle of evolution.
2
00:00:10,440 --> 00:00:15,440
[Dawkins] We now have
3
00:00:15,440 --> 00:00:17,600
a larger functioning brain
than has ever existed before,
4
00:00:17,600 --> 00:00:19,720
and that has enabled
us to develop things
5
00:00:19,720 --> 00:00:21,320
like language,
culture, engineering,
6
00:00:21,320 --> 00:00:24,560
technology, poetry,
mathematics, philosophy.
7
00:00:24,560 --> 00:00:28,440
Things which have
never been known before.
8
00:00:28,440 --> 00:00:31,480
[Narrator] All constantly
evolving, advancing.
9
00:00:31,480 --> 00:00:34,240
My primary scientific
and technical passion
10
00:00:34,240 --> 00:00:37,880
is the creation of
thinking machines
11
00:00:37,880 --> 00:00:40,720
that can think at
the human level,
12
00:00:40,720 --> 00:00:43,440
and ultimately beyond
the human level.
13
00:00:43,440 --> 00:00:47,040
[Narrator] Welcome
to the brave new world.
14
00:00:47,040 --> 00:00:50,960
We are moving toward a
technological singularity,
15
00:00:50,960 --> 00:00:53,600
a point at which there is
an intelligence explosion
16
00:00:53,600 --> 00:00:58,120
where an ordinary,
unenhanced human
17
00:00:58,120 --> 00:01:00,320
can no longer understand the
technological developments.
18
00:01:04,760 --> 00:01:05,880
[Indistinct radio chatter].
19
00:01:17,880 --> 00:01:22,120
[Narrator] The technologies
of the digital age
20
00:01:22,120 --> 00:01:24,920
have delivered unprecedented,
21
00:01:24,920 --> 00:01:26,200
perhaps boundless possibilities.
22
00:01:30,800 --> 00:01:35,480
We've reached a point
23
00:01:35,480 --> 00:01:37,280
at which man and
technology begin to merge.
24
00:01:37,280 --> 00:01:39,800
Tools become body parts, and the
human being is digitally optimized.
25
00:01:41,920 --> 00:01:45,280
The result is a controversial
vision: Super Sapiens.
26
00:02:01,800 --> 00:02:05,120
[Narrator] In the beginning,
there was a hell on earth.
27
00:02:11,400 --> 00:02:15,120
Water became the elixir of life.
28
00:02:15,120 --> 00:02:19,080
From the simplest of organisms,
29
00:02:19,080 --> 00:02:21,640
increasingly complex beings
evolved over billions of years.
30
00:02:21,640 --> 00:02:26,280
Life was changing

31
00:02:26,280 --> 00:02:28,280
and so was the environment.
32
00:02:28,280 --> 00:02:33,640
The only way to survive

33
00:02:33,640 --> 00:02:36,480
was to adapt. Then came
a pivotal point in evolution.
34
00:02:37,480 --> 00:02:40,880
The rise of humans.
35
00:02:44,760 --> 00:02:47,600
Cultural achievements
bear witness
36
00:02:47,600 --> 00:02:50,480
to ever-increasing
human abilities.
37
00:02:50,480 --> 00:02:54,160
Alongside the natural world,
an intellectual world arose.
38
00:02:54,160 --> 00:02:57,320
Man created art,
language, and writing.
39
00:02:58,640 --> 00:03:03,480
Societies emerged and changed,
40
00:03:03,480 --> 00:03:05,760
and technology and its
increasingly swift progress
41
00:03:05,760 --> 00:03:09,000
accelerated this process.
42
00:03:09,000 --> 00:03:11,800
Today we can look
into our deep past
43
00:03:11,800 --> 00:03:14,040
and gaze into
the distant future.
44
00:03:19,520 --> 00:03:23,400
Technology has reached a
stage where it's all-pervading.
45
00:03:23,400 --> 00:03:27,720
It informs us, entertains
us, runs our lives.
46
00:03:27,720 --> 00:03:31,080
But what about the brain
that created all the technology?
47
00:03:31,080 --> 00:03:35,480
Is it still developing?
48
00:03:35,480 --> 00:03:38,080
[Dawkins] I think
it's hard to claim
49
00:03:38,080 --> 00:03:41,840
that our brain is still evolving in terms
of hardware. In order for that to happen,
50
00:03:41,840 --> 00:03:43,440
it would be necessary
that the brainiest individuals
51
00:03:43,440 --> 00:03:45,240
were still surviving best
or having the most children.
52
00:03:45,240 --> 00:03:48,920
And that's clearly not true.
53
00:03:48,920 --> 00:03:51,280
What's happening
is cultural evolution.
54
00:03:51,280 --> 00:03:52,840
We've externalized
our mental evolution.
55
00:03:59,200 --> 00:04:04,000
[Narrator] Modern visionaries
56
00:04:04,000 --> 00:04:04,680
are attempting to make
sense of the dichotomy:
57
00:04:04,681 --> 00:04:07,561
Humans on the one hand,
58
00:04:07,560 --> 00:04:08,840
technology on the other.
59
00:04:11,880 --> 00:04:13,040
My name's Mikey Siegel.
60
00:04:13,040 --> 00:04:16,920
I have a background
as an engineer
61
00:04:16,920 --> 00:04:19,040
in computer
engineering and robotics.
62
00:04:19,040 --> 00:04:23,480
The big transition in my life
63
00:04:23,480 --> 00:04:25,280
has been where I
started to discover
64
00:04:25,280 --> 00:04:27,600
that there was another world.
65
00:04:31,960 --> 00:04:36,360
For as long as I can remember,
66
00:04:36,360 --> 00:04:39,520
I've been excited about technology and engineering.
Eventually I hit, like, the engineer's jackpot
67
00:04:39,520 --> 00:04:42,280
and got to do robotics at M.I.T.
68
00:04:42,280 --> 00:04:46,760
I graduated from M.I.T.,
69
00:04:46,760 --> 00:04:49,400
I got an amazing
job, well-paying,
70
00:04:49,400 --> 00:04:52,000
living with my friends
in San Francisco.
71
00:04:52,000 --> 00:04:54,440
And on paper, it
was the perfect life.
72
00:04:56,280 --> 00:05:00,360
But there was something wrong.
73
00:05:00,360 --> 00:05:04,440
It's almost as if
74
00:05:04,440 --> 00:05:06,040
I needed to get away
from the science and tech
75
00:05:06,040 --> 00:05:08,840
in order to have the perspective
76
00:05:08,840 --> 00:05:11,160
to actually see what it really
means and what it really is.
77
00:05:17,320 --> 00:05:20,600
[Narrator] Mikey Siegel spent some time in
India. There, he had a spiritual experience
78
00:05:20,600 --> 00:05:22,120
that inspired new ways
of thinking about the nexus
79
00:05:22,120 --> 00:05:24,360
between humans and machines.
80
00:05:28,520 --> 00:05:30,000
It's impossible to talk
about human beings
81
00:05:30,000 --> 00:05:31,920
without also talking
about technology.
82
00:05:31,920 --> 00:05:36,600
And so the question
83
00:05:36,600 --> 00:05:38,400
is not whether or not we
want technology to be here.
84
00:05:38,400 --> 00:05:40,280
The question is what
we want technology to be.
85
00:05:42,560 --> 00:05:48,560
[Indistinct chattering].
86
00:05:48,560 --> 00:05:50,680
[Narrator] Hooked on constant connectivity, we've
become inseparable from our digital devices.
87
00:05:50,680 --> 00:05:54,600
[Narrator] They've
become part of our lives,
88
00:05:54,600 --> 00:05:56,320
and now they're becoming
part of our bodies.
89
00:05:57,200 --> 00:06:01,720
[Narrator] For some people,
90
00:06:01,720 --> 00:06:02,960
artificial implants
are all the rage
91
00:06:02,960 --> 00:06:04,720
for letting the body
communicate with computers.
92
00:06:07,800 --> 00:06:11,720
[Indistinct chattering].
93
00:06:11,720 --> 00:06:13,960
[Narrator] At the Epicenter
laboratory in Stockholm,
94
00:06:13,960 --> 00:06:16,960
body hackers prepare
to make this happen.
95
00:06:16,960 --> 00:06:19,360
I think that the senses that
evolution has given us humans
96
00:06:20,160 --> 00:06:24,760
are quite lousy.
97
00:06:24,760 --> 00:06:27,080
I see the human
body as a platform
98
00:06:27,080 --> 00:06:30,200
which is a good beginning,
99
00:06:30,200 --> 00:06:31,280
and it's really up
to us to redesign it
100
00:06:31,280 --> 00:06:34,280
and to tweak it
101
00:06:34,280 --> 00:06:35,160
and to improve it
in all various ways,
102
00:06:35,161 --> 00:06:36,321
because it needs improvement.
103
00:06:39,800 --> 00:06:42,960
Human beings are not the
fastest runners on the planet,
104
00:06:42,960 --> 00:06:46,800
nor the longest-lived
organisms on the planet,
105
00:06:46,800 --> 00:06:50,800
nor can we jump higher
than any other being.
106
00:06:50,800 --> 00:06:53,520
We're just specific combinations
of atoms, molecules, and cells
107
00:06:53,520 --> 00:06:58,280
that happened to have evolved.
108
00:06:58,280 --> 00:07:00,360
Human nature is all about
self-transformation, I think.
109
00:07:00,360 --> 00:07:04,640
We all try to become
better people,
110
00:07:04,640 --> 00:07:06,360
although we have very different
views on what "better" means.
111
00:07:06,360 --> 00:07:10,520
I have a chip implant
here in my hand.
112
00:07:10,520 --> 00:07:12,960
I have my contact
details saved on my chip,
113
00:07:12,960 --> 00:07:16,600
so whenever I meet someone,
114
00:07:16,600 --> 00:07:19,040
they can scan my hand
115
00:07:19,040 --> 00:07:24,480
and they get my contacts
straight into their address book.
116
00:07:24,480 --> 00:07:26,760
Innovation has always been with humans, so
it's not really possible to stop technology.
117
00:07:26,760 --> 00:07:29,480
[Speaking foreign language].
118
00:07:29,480 --> 00:07:33,120
[Sandberg] However,

119
00:07:33,120 --> 00:07:36,160
you might aim at getting beneficial
technologies before dangerous technologies.
120
00:07:39,600 --> 00:07:40,760
[Narrator] Body hackers
and their chip implants
121
00:07:40,760 --> 00:07:43,760
attract growing interest
122
00:07:43,760 --> 00:07:44,800
at technology fairs
around the world.
123
00:07:44,800 --> 00:07:48,760
[Narrator] At
124
00:07:48,760 --> 00:07:51,600
the Pioneers Festival in
Vienna, the big names in the field
125
00:07:51,600 --> 00:07:52,920
try to convince the
audience of their visions.
126
00:07:52,920 --> 00:07:56,120
[Sjoblad] Are you
welcoming the pain?
127
00:07:56,120 --> 00:07:56,760
[Automated female
voice] Signal acquired.
128
00:07:56,761 --> 00:07:58,121
Connection established.
129
00:07:58,120 --> 00:08:00,000
"Transition to cyborg done."
130
00:08:00,000 --> 00:08:01,920
Oh, my god!
131
00:08:01,920 --> 00:08:04,800
That's awesome!
132
00:08:04,800 --> 00:08:06,320
[Sjoblad] We have a potential
133
00:08:06,320 --> 00:08:07,800
to be much more than
we are as humans,
134
00:08:09,440 --> 00:08:13,680
and still be humans.
135
00:08:13,680 --> 00:08:16,560
For example, by expanding

136
00:08:16,560 --> 00:08:18,920
our sensory universe.
137
00:08:18,920 --> 00:08:23,120
What if we could
really understand
138
00:08:23,120 --> 00:08:25,320
how another person feels?
139
00:08:25,320 --> 00:08:30,000
I think that would make us
140
00:08:30,000 --> 00:08:32,880
not only more intelligent, but also
more importantly, more ethical beings.
141
00:08:38,960 --> 00:08:40,040
So I think it's very useful to
have people experimenting
142
00:08:40,040 --> 00:08:44,360
and doing implants.
143
00:08:44,360 --> 00:08:46,080
One reason why it's so important
to have people experimenting,
144
00:08:46,080 --> 00:08:49,600
even maybe sometimes
unwisely or radically,
145
00:08:49,600 --> 00:08:52,360
is that we don't know what
could be a good outcome.
146
00:08:52,360 --> 00:08:55,920
We can't know that
until somebody tries it.
147
00:08:55,920 --> 00:08:58,040
Being a cyborg
is like the future...
148
00:08:58,040 --> 00:08:59,760
[Narrator] Learning by doing.
149
00:08:59,760 --> 00:09:01,480
But no experiment
is without risk.
150
00:09:01,480 --> 00:09:05,960
Maybe more, that
would be amazing!
151
00:09:05,960 --> 00:09:08,740
[Harris]. Well, when you look at how technology's being
designed, it is being designed with very little foresight.
152
00:09:08,760 --> 00:09:15,040
At the moment,
153
00:09:15,040 --> 00:09:16,560
I'm not so eager to have something implanted under my skin,
but I can imagine doing it if it became not only tempting,
154
00:09:16,560 --> 00:09:21,160
but essential.
155
00:09:21,160 --> 00:09:24,120
Why would anyone
want to send email
156
00:09:24,120 --> 00:09:27,160
when we can talk on the phone?
157
00:09:27,160 --> 00:09:28,920
It seems like a step backward,
158
00:09:28,920 --> 00:09:30,720
and yet all of us got sort
of thrown over the precipice
159
00:09:30,720 --> 00:09:34,040
and now are dealing with some
fundamental changes in our lives.
160
00:09:37,600 --> 00:09:41,480
Many of us have outsourced
our memory to the Internet,
161
00:09:41,480 --> 00:09:46,600
where it's just,
162
00:09:46,600 --> 00:09:48,040
rather than worry about whether or not you can remember
something, you know you can always just search for it.
163
00:09:48,040 --> 00:09:51,520
[Indistinct chattering].
164
00:09:51,520 --> 00:09:53,000
[Sjoblad] Our environment
has changed quite radically.
165
00:09:53,000 --> 00:09:57,720
Some figures suggest
166
00:09:57,720 --> 00:10:01,640
that we have about seven billion connected devices
on earth. About the same number as we have humans.
167
00:10:01,880 --> 00:10:06,720
But in just five years from now,
168
00:10:06,720 --> 00:10:08,120
we can expect to have 50
billion connected devices,
169
00:10:08,120 --> 00:10:12,600
so more than 10 for each of us.
170
00:10:12,600 --> 00:10:16,040
And we need to have
some smart way to interact
171
00:10:16,040 --> 00:10:18,520
with all these digital
things around us.
172
00:10:18,520 --> 00:10:21,600
[Narrator] The crucial question

173
00:10:21,600 --> 00:10:23,480
is how this
interaction should look.
174
00:10:23,480 --> 00:10:28,080
One of the controversies
raised by religious people
175
00:10:28,080 --> 00:10:30,480
is whether life is
intelligently designed.
176
00:10:30,480 --> 00:10:33,040
So far it has not been
intelligently designed.
177
00:10:33,040 --> 00:10:34,960
It's arisen by unconscious,
undesigned evolution.
178
00:10:35,440 --> 00:10:40,120
Now humans are in a position
179
00:10:40,120 --> 00:10:42,080
to intelligently design
the future of evolution.
180
00:10:42,080 --> 00:10:46,120
It could be very exciting
181
00:10:46,120 --> 00:10:47,520
if humans enhance
themselves in ways
182
00:10:47,520 --> 00:10:50,320
that make them
overall better people.
183
00:10:50,320 --> 00:10:52,640
But in order to do
this, we have to ask:
184
00:10:52,640 --> 00:10:55,000
What is the self?
185
00:10:55,000 --> 00:10:56,480
Will it really be a situation
186
00:10:56,480 --> 00:10:57,920
where I am still me
at the end of the day?
187
00:10:57,920 --> 00:11:02,000
[Indistinct chattering].
188
00:11:02,000 --> 00:11:04,360
[Narrator] The XTech
conference in San Francisco
189
00:11:04,360 --> 00:11:06,680
is one of the most important
hubs for innovations
190
00:11:06,680 --> 00:11:09,080
that connect technology
with the human brain.
191
00:11:09,080 --> 00:11:14,280
Mikey Siegel is on the lookout
192
00:11:14,280 --> 00:11:16,840
for the latest developments by inventors
who want to improve on human beings
193
00:11:16,840 --> 00:11:20,600
and their natural abilities,
194
00:11:20,600 --> 00:11:22,060
who are looking to
surpass their own limitations.
195
00:11:22,080 --> 00:11:28,360
The spectrum ranges
196
00:11:28,360 --> 00:11:32,360
from virtual reality game platforms to medical devices
that can alter the way the human brain functions.
197
00:11:35,880 --> 00:11:37,220
Harnessing the mind to control
objects is a popular concept.
198
00:11:37,240 --> 00:11:40,680
I will save you!
199
00:11:44,680 --> 00:11:46,400
[Narrator] But will alone
is often not enough.
200
00:11:46,400 --> 00:11:49,280
[Alarm blaring in game].
201
00:11:49,280 --> 00:11:50,160
Oh, no!
202
00:11:50,161 --> 00:11:52,401
[Narrator] Not yet.
203
00:11:52,400 --> 00:11:54,040
The conference offers a
great deal of opportunities
204
00:11:54,040 --> 00:11:56,960
for experiments and
new experiences.
205
00:11:56,960 --> 00:12:01,240
[Man] And the next
one we're doing here is
206
00:12:01,240 --> 00:12:03,440
the G.S.R. So this is the
conductivity of the skin.
207
00:12:03,440 --> 00:12:04,760
And that's going to measure
how intense an emotion

208
00:12:04,760 --> 00:12:06,480
you'll be having.
209
00:12:06,480 --> 00:12:08,240
[Siegel] We're good?
210
00:12:08,240 --> 00:12:09,600
Okay, I'm going to do my best
angry face, okay? Are you ready?
211
00:12:09,600 --> 00:12:11,480
All right.
212
00:12:11,480 --> 00:12:12,560
I don't want you to get
scared. [Man] It'll be okay.
213
00:12:12,560 --> 00:12:14,640
All right, stay calm.
214
00:12:14,640 --> 00:12:18,200
[Man] Go for it. If you don't
have a model of emotion,
215
00:12:18,200 --> 00:12:20,640
the only way you can get that is by asking people how
they feel. But that, in turn, changes the way they feel.
216
00:12:20,640 --> 00:12:23,560
So having a sensor
system like this
217
00:12:23,560 --> 00:12:24,960
allows you to look at
what is that unspoken,
218
00:12:24,960 --> 00:12:27,360
unbiased response to
environments, to surveys,
219
00:12:27,360 --> 00:12:30,600
or anything along those lines.
220
00:12:30,600 --> 00:12:31,920
The edge that I'm
really interested in
221
00:12:31,920 --> 00:12:34,120
is the edge of
transformative technology.
222
00:12:34,120 --> 00:12:36,840
How do you design new
tools and technologies
223
00:12:36,840 --> 00:12:39,640
that connect and interface
with the brain and the mind,
224
00:12:39,640 --> 00:12:42,240
but can take us to a
new place, to new depths?
225
00:12:42,240 --> 00:12:45,560
To the furthest reaches
of human flourishing.
226
00:12:45,560 --> 00:12:48,600
So tell me about this headset.
227
00:12:48,600 --> 00:12:49,880
How does this work?
228
00:12:49,880 --> 00:12:51,680
So this is a dry E.E.G.
229
00:12:51,680 --> 00:12:53,800
It's got little sensors
that look like this...
230
00:12:53,800 --> 00:12:56,560
[Siegel] So the special
thing about this is that
231
00:12:56,560 --> 00:12:57,080
it doesn't actually require
any special preparation...
232
00:12:57,081 --> 00:13:01,241
Exactly.
233
00:13:01,240 --> 00:13:03,240
To put this on.
234
00:13:03,240 --> 00:13:04,640
I mean, this could be the ticket to
actually making this kind of technology
235
00:13:04,640 --> 00:13:07,480
accessible to a large
number of people.
236
00:13:07,480 --> 00:13:10,200
That's the idea, and
what we do is we take this
237
00:13:10,200 --> 00:13:14,760
and we make sense of this.
238
00:13:14,760 --> 00:13:16,640
We take this sensor activity and we can go into your brain
and look at how activity inside your brain is changing,
239
00:13:16,640 --> 00:13:20,240
and that starts to become
much more powerful.
240
00:13:20,240 --> 00:13:22,240
[Narrator] The researchers
have devised a way
241
00:13:22,240 --> 00:13:24,320
to visualize a person's
emotional state,
242
00:13:24,320 --> 00:13:27,080
a technique that
may change lives.
243
00:13:27,080 --> 00:13:30,400
[Man] ...We're doing
that all in real time.
244
00:13:30,400 --> 00:13:32,000
We're going to see a
massive increase in the ability
245
00:13:32,000 --> 00:13:35,600
for locked-in individuals
to be able to communicate.
246
00:13:35,600 --> 00:13:39,040
At first, on a
very basic level...
247
00:13:39,040 --> 00:13:40,320
We can see it
here, on the phone.
248
00:13:40,320 --> 00:13:43,240
Oh, wow!
249
00:13:43,240 --> 00:13:45,280
So you're very serene and
contented right now. Well, I'm just...
250
00:13:45,280 --> 00:13:48,000
And relaxed.
251
00:13:48,000 --> 00:13:50,160
I just love E.E.G. so much.
252
00:13:50,160 --> 00:13:54,440
Well, this is just
the beginning.
253
00:13:54,440 --> 00:13:56,200
Everyone who works in
brain-computer interface,
254
00:13:56,200 --> 00:13:57,800
we all think about the
idea of being able to upload
255
00:13:57,800 --> 00:14:01,400
a consciousness onto a machine.
256
00:14:01,400 --> 00:14:03,320
Would that copy
actually be conscious?
257
00:14:03,320 --> 00:14:06,560
That depends
upon the larger issue
258
00:14:06,560 --> 00:14:08,680
of whether a machine is
capable of being conscious.
259
00:14:08,680 --> 00:14:12,200
And if all of a
sudden this phone,
260
00:14:12,200 --> 00:14:12,880
my phone, actually knows
what I'm feeling,
261
00:14:12,881 --> 00:14:17,481
on one hand,
262
00:14:17,480 --> 00:14:21,600
I could imagine this being the
greatest marketing tool of all time.
263
00:14:21,600 --> 00:14:22,840
But on the other hand, I could imagine this
being, like, the most sensitive, heartfelt,
264
00:14:22,840 --> 00:14:26,840
intuitive therapist...
265
00:14:26,840 --> 00:14:28,840
that I've ever come
in contact with,
266
00:14:28,840 --> 00:14:30,360
that is sensitive to my emotions
and perhaps helping to guide me
267
00:14:30,360 --> 00:14:34,080
towards a more balanced
and harmonious state.
268
00:14:34,080 --> 00:14:39,360
[Indistinct news
reports overlapping].
269
00:14:39,360 --> 00:14:44,160
[Narrator] For some, the new mind-machine nexus promises
endless possibilities. Critics warn of unchecked development.
270
00:14:44,160 --> 00:14:47,480
Marie Curie, a pioneer
of radioactivity research,
271
00:14:47,480 --> 00:14:49,620
could never have imagined
the consequences of her work.
272
00:14:49,640 --> 00:14:55,720
[Indistinct news reports overlapping].

273
00:14:55,720 --> 00:14:57,960
Although this example may seem
polemical, what would actually happen
274
00:14:57,960 --> 00:14:59,760
if machines develop
their own minds?
275
00:14:59,760 --> 00:15:02,160
Even influential figures like
Stephen Hawking and Elon Musk
276
00:15:02,160 --> 00:15:04,440
warn against this scenario.
277
00:15:08,360 --> 00:15:11,160
Mikey Siegel
imagines another future,
278
00:15:11,160 --> 00:15:14,200
where advanced
technology is used
279
00:15:14,200 --> 00:15:16,920
to elevate human
consciousness to a new level
280
00:15:16,920 --> 00:15:19,000
by plugging the
brain into a machine.
281
00:15:25,440 --> 00:15:30,240
[Siegel] And it felt like
the more I found my path,
282
00:15:30,240 --> 00:15:32,960
the less people had any
idea what I was talking about.
283
00:15:35,240 --> 00:15:38,560
What I'm suggesting is that
we are entering into a new age.
284
00:15:40,400 --> 00:15:45,080
What I would call
the experience age.
285
00:15:45,080 --> 00:15:48,800
There was a time where
your physical capacity
286
00:15:48,800 --> 00:15:51,640
was a measure of how
productive you could be.
287
00:15:51,640 --> 00:15:55,600
And then we moved
into the information age,
288
00:15:55,600 --> 00:15:57,520
where your intellect, your
mind, was the measure.
289
00:15:57,800 --> 00:16:02,680
And now, in the experience age,
290
00:16:02,680 --> 00:16:05,120
with technologies that can
become completely immersive
291
00:16:05,120 --> 00:16:06,920
and deeply impact our
experience of the world,
292
00:16:09,360 --> 00:16:12,760
all of a sudden, we're
playing in a new terrain.
293
00:16:17,280 --> 00:16:21,520
[Narrator]
Technological progress
294
00:16:21,520 --> 00:16:22,880
has always inspired
and empowered people
295
00:16:22,880 --> 00:16:25,000
to look for new possibilities
296
00:16:25,000 --> 00:16:26,280
to strike out in new directions.
297
00:16:32,760 --> 00:16:34,600
It brings dreams to
fruition and changes lives.
298
00:16:37,720 --> 00:16:41,880
Morning, guys.
299
00:16:41,880 --> 00:16:46,040
[Narrator] Steven Sanchez has relied on a wheelchair
ever since he broke a lumbar vertebra in a bike crash.
300
00:16:50,240 --> 00:16:53,200
A team from the University of California, Berkeley
has developed a special leg prosthesis
301
00:16:53,200 --> 00:16:55,960
to help people like Steven.
302
00:16:55,960 --> 00:16:59,280
The construction
is an exoskeleton,
303
00:16:59,280 --> 00:17:02,160
an outer support structure
304
00:17:02,160 --> 00:17:04,240
that's designed
to provide stability.
305
00:17:04,240 --> 00:17:06,800
The whole thing works
electro-mechanically
306
00:17:06,800 --> 00:17:09,040
and is computer-controlled.
307
00:17:09,040 --> 00:17:13,160
Steven has spent nearly half

308
00:17:13,160 --> 00:17:16,000
of his life in a wheelchair. What
happens next will define his future.
309
00:17:24,600 --> 00:17:27,960
[Narrator] He was told
he would never walk again.
310
00:17:38,680 --> 00:17:43,240
The design still has its limits.
311
00:17:43,240 --> 00:17:45,560
The batteries only
last a few hours
312
00:17:45,560 --> 00:17:47,120
and Steven cannot go upstairs.
313
00:17:50,600 --> 00:17:53,680
But for Steven,
314
00:17:53,680 --> 00:17:54,200
the freedom and the feeling
of being back on his feet
315
00:17:54,201 --> 00:17:57,561
is priceless.
316
00:17:57,560 --> 00:18:01,280
The next step

317
00:18:01,280 --> 00:18:02,240
is prostheses that aren't
controlled by computers,
318
00:18:02,241 --> 00:18:05,681
but by thoughts.
319
00:18:09,680 --> 00:18:12,280
Mikey Siegel
320
00:18:12,280 --> 00:18:16,280
searches for technology that may be useful
for his research at the XTech conference.
321
00:18:16,280 --> 00:18:18,880
It's the Ultracortex,
it's a 3D-printed...
322
00:18:18,880 --> 00:18:20,000
[Narrator] The Ultracortex is
a highly developed headset
323
00:18:20,000 --> 00:18:23,480
composed of biosensors.
324
00:18:23,480 --> 00:18:26,320
They measure
electrical body functions
325
00:18:26,320 --> 00:18:28,560
like brain activity
and heartbeat.
326
00:18:28,560 --> 00:18:30,200
We're the Lego of "I
want to be a cyborg."
327
00:18:34,160 --> 00:18:37,160
E.E.G. is a main focus,
328
00:18:37,160 --> 00:18:38,760
but with our technology
you can also measure E.M.G.,
329
00:18:38,760 --> 00:18:42,600
which is muscle movements.
330
00:18:42,600 --> 00:18:47,120
[Narrator] To
331
00:18:47,120 --> 00:18:48,080
develop their technology, the researchers
perform experiments on themselves,
332
00:18:48,081 --> 00:18:50,601
all in the name of science.
333
00:18:55,040 --> 00:18:57,120
[Russomanno] One of my recent
projects has been sending E.M.G.,
334
00:18:57,120 --> 00:19:01,560
or muscle signals,
335
00:19:01,560 --> 00:19:03,360
from the OpenBCI
board into a robotic hand.
336
00:19:08,520 --> 00:19:10,680
The project really just started out of interest
but over the course of the last six months
337
00:19:10,680 --> 00:19:11,600
there's been a huge
rise, a surge of energy
338
00:19:11,601 --> 00:19:14,721
in open-source prosthetics,
339
00:19:14,720 --> 00:19:16,600
and I just immediately
saw the value of OpenBCI
340
00:19:16,600 --> 00:19:18,880
as a high-channel-count system
to help augment the space.
341
00:19:25,640 --> 00:19:26,520
[Schneider] Well, the first step is to
help people who have medical problems.
342
00:19:26,521 --> 00:19:31,401
Moving into the future,
343
00:19:31,400 --> 00:19:35,440
we might imagine B.M.I.s as a form of
enhancement rather than just a form of therapy.
344
00:19:35,440 --> 00:19:38,800
What if you could get
artificial limbs and then,
345
00:19:38,800 --> 00:19:41,880
should they break, you
could just have them replaced?
346
00:19:41,880 --> 00:19:45,920
This is awesome.
347
00:19:45,920 --> 00:19:47,560
How long would it take you
348
00:19:47,560 --> 00:19:49,120
to sign up for those
kinds of enhancements,
349
00:19:49,120 --> 00:19:52,280
rather than just using
your biological limbs?
350
00:19:52,280 --> 00:19:56,040
[Russomanno] Regarding the
robotic arm that I've been working with,
351
00:19:56,040 --> 00:19:59,240
there's an element
of machine learning.
352
00:19:59,240 --> 00:20:01,200
What we're doing is very
simple machine learning,
353
00:20:01,200 --> 00:20:03,400
but the field of
machine learning
354
00:20:03,400 --> 00:20:05,560
is growing at an
exponential rate,
355
00:20:05,560 --> 00:20:07,880
and our ability to handle
and process the data
356
00:20:07,880 --> 00:20:11,520
that is getting generated
by systems like OpenBCI
357
00:20:11,520 --> 00:20:14,320
is only going to
get more powerful.
358
00:20:14,320 --> 00:20:15,640
So, David, I'd like to think
that when the robots take over,
359
00:20:15,640 --> 00:20:17,520
they're going to spare
the two of us, you know.
360
00:20:17,520 --> 00:20:20,840
Yeah, yeah.
361
00:20:20,840 --> 00:20:21,680
We'll still be useful.
362
00:20:21,681 --> 00:20:22,961
I like that theory.
363
00:20:24,440 --> 00:20:26,760
[Russomanno]. Until
recently, you know,
364
00:20:26,760 --> 00:20:28,860
E.E.G. has been a very expensive
device that one laboratory,
365
00:20:28,880 --> 00:20:32,800
one highly funded laboratory
at a top-tier school,
366
00:20:32,800 --> 00:20:36,720
has access to.
367
00:20:36,720 --> 00:20:38,160
But now all of a sudden,
in the last five to ten years,
368
00:20:38,160 --> 00:20:39,880
it's just blown wide
open, the entire space.
369
00:20:39,880 --> 00:20:43,520
Because we've got
100-dollar devices
370
00:20:43,520 --> 00:20:45,760
that are as good as the best
technology 10, 20 years ago.
371
00:20:48,480 --> 00:20:50,520
[Narrator] Technology
now feeds itself.
372
00:20:50,520 --> 00:20:52,040
High-tech progress enables
components to be manufactured
373
00:20:52,040 --> 00:20:55,840
smaller and cheaper,
374
00:20:55,840 --> 00:20:57,280
which in turn ensures
technological advance.
375
00:20:57,280 --> 00:21:00,800
A lot of people that
have an incredible vision
376
00:21:00,800 --> 00:21:03,640
all of a sudden have a
tool that they can afford.
377
00:21:03,640 --> 00:21:07,240
They have a tool
that's accessible
378
00:21:07,240 --> 00:21:09,240
and they have a community
379
00:21:09,240 --> 00:21:11,320
of people that are
also using this tool
380
00:21:11,320 --> 00:21:12,960
that they can get support
from and work with.
381
00:21:12,960 --> 00:21:17,200
[Narrator] The startup OpenBCI
382
00:21:17,200 --> 00:21:19,680
is an example of how
affordable technology
383
00:21:19,680 --> 00:21:22,200
is fanning the flames of
technological development.
384
00:21:22,200 --> 00:21:26,080
I think that the merger
of man and machine
385
00:21:26,080 --> 00:21:27,920
has kind of been going
on now for quite some time.
386
00:21:27,920 --> 00:21:31,160
It's no surprise that we're
talking about this here,
387
00:21:31,160 --> 00:21:33,480
that I have an opportunity to be
a part of something like openbci,
388
00:21:33,480 --> 00:21:35,560
where we see more and
more of these kinds of projects
389
00:21:37,200 --> 00:21:42,000
and examples of technology,
390
00:21:42,000 --> 00:21:44,680
digital, mechanical, and
biology, humans, animals,
391
00:21:46,880 --> 00:21:52,000
sort of coming together
at even closer connections.
392
00:21:52,000 --> 00:21:56,640
[Narrator] Joel Murphy is
preparing a spectacular experiment:
393
00:21:56,640 --> 00:22:00,600
The guinea pig is
neuroscientist David Putrino.
394
00:22:00,600 --> 00:22:04,880
[Murphy]. Are you
on the scalp now, too?
395
00:22:04,880 --> 00:22:06,440
Yeah, I can feel that
up against my scalp.
396
00:22:06,440 --> 00:22:07,640
All right, okay, this
is looking better.
397
00:22:07,640 --> 00:22:11,120
[Computer beeping slowly].
398
00:22:11,120 --> 00:22:13,920
All right, so blink your eyes a couple of times.
Okay, great, some nice eye blink artifacts.
399
00:22:13,920 --> 00:22:16,840
And fly this helicopter.
400
00:22:16,840 --> 00:22:20,000
Oh, just fly the helicopter?
401
00:22:20,000 --> 00:22:21,760
Is that all? [Murphy].
With your brain waves,
402
00:22:21,760 --> 00:22:25,120
with your brain waves,
okay? Sounds awesome.
403
00:22:25,120 --> 00:22:27,360
[Murphy]. This is gonna get awesome.
Okay, I can't wait. Let's do this.
404
00:22:27,360 --> 00:22:29,720
We're going to look simply at the
alpha wave coming off your occipital lobe.
405
00:22:29,720 --> 00:22:30,640
We've got an electrode
back here for measuring that.
406
00:22:30,641 --> 00:22:35,161
And when you close your eyes...
407
00:22:35,160 --> 00:22:40,240
We're always creating electrical potentials
from our brains, from our muscles.
408
00:22:40,240 --> 00:22:43,920
Our heart is a big electrical signal generator. And so
it depends on what you want to do with those signals.
409
00:22:43,920 --> 00:22:47,880
And one of the things
that you could do
410
00:22:47,880 --> 00:22:50,160
is interface them
with a machine,
411
00:22:50,160 --> 00:22:51,560
either to tell the
machine about your state,
412
00:22:51,560 --> 00:22:55,600
or to get feedback
from a machine
413
00:22:55,600 --> 00:22:59,640
that's measuring your state
414
00:22:59,640 --> 00:23:01,440
and distilling some
useful information out of it.
415
00:23:01,440 --> 00:23:04,920
So this is looking
pretty good...
416
00:23:04,920 --> 00:23:05,520
[Narrator] The
preparations are complete.
417
00:23:05,521 --> 00:23:07,281
Joel is satisfied.
418
00:23:07,280 --> 00:23:09,240
The test flight can begin.
419
00:23:09,240 --> 00:23:11,280
[Murphy]. Yeah, we're
seeing some good alpha.
420
00:23:11,280 --> 00:23:12,040
We're sending energy
to the helicopter.
421
00:23:12,041 --> 00:23:16,721
Indeed.
422
00:23:16,720 --> 00:23:19,200
[Narrator] The test
pilot tries to relax.
423
00:23:19,200 --> 00:23:20,840
[Computer continues
beeping slowly].
424
00:23:24,000 --> 00:23:26,640
[Helicopter whirring].
425
00:23:26,640 --> 00:23:28,680
[Murphy]. It's
really close, though.
426
00:23:28,680 --> 00:23:29,960
[Narrator] It seems
like things are working.
427
00:23:29,960 --> 00:23:32,560
But the helicopter
isn't taking off.
428
00:23:32,560 --> 00:23:34,840
[Murphy]. All right, you
can open your eyes now.
429
00:23:34,840 --> 00:23:37,040
[Computer beeping rapidly].
430
00:23:37,040 --> 00:23:38,960
Hang on, whoa.
431
00:23:38,960 --> 00:23:40,800
David, stop it!
432
00:23:40,800 --> 00:23:43,080
[Laughing].
433
00:23:43,080 --> 00:23:45,800
All right, I'm just gonna
make an adjustment.
434
00:23:45,800 --> 00:23:49,360
So I've just got to figure out what the best
sort of... [Putrino]. You're thresholding me now.
435
00:23:49,360 --> 00:23:49,920
[Murphy]. Yeah, well,
it's also this is super-fresh,
436
00:23:49,921 --> 00:23:52,841
super-fresh code.
437
00:23:52,840 --> 00:23:54,440
So it's like...
438
00:23:54,440 --> 00:23:56,440
It's my handicap.
439
00:23:56,440 --> 00:23:57,920
We have this capability
now of actually, you know,
440
00:23:57,920 --> 00:24:00,600
taking the analog
world inside of our head
441
00:24:00,600 --> 00:24:03,120
and turning it digital,
turning it binary,
442
00:24:03,120 --> 00:24:05,000
and then harnessing
technology, harnessing computers
443
00:24:05,000 --> 00:24:07,800
to better understand
consciousness,
444
00:24:07,800 --> 00:24:10,440
better understand, you know,
what it means to be human,
445
00:24:10,440 --> 00:24:12,080
and then maybe even
improve the human condition.
446
00:24:12,080 --> 00:24:16,280
[Narrator] And Joe Murphy
is not about to give up.
447
00:24:16,280 --> 00:24:19,080
[Putrino]. Looking good?
448
00:24:19,080 --> 00:24:20,560
And that looks really nice.
449
00:24:20,560 --> 00:24:21,840
There's the little
eye blinks artifacts,
450
00:24:21,840 --> 00:24:24,600
that's nice.
451
00:24:24,600 --> 00:24:26,800
[Narrator] Again, test
pilot David tries to relax.
452
00:24:26,800 --> 00:24:30,880
[Narrator] The
success is astounding.
453
00:24:30,880 --> 00:24:33,440
[Helicopter whirring].
454
00:24:33,440 --> 00:24:36,720
[Murphy]. Yeah,
455
00:24:36,720 --> 00:24:39,440
get it up there! [Putrino].
But what I find really exciting,
456
00:24:39,440 --> 00:24:41,040
the consumer market is actually
driving a lot of the innovation.
457
00:24:41,040 --> 00:24:44,560
We have these young
startups which are saying,
458
00:24:44,560 --> 00:24:47,600
"you know what? Science
is moving too slowly.
459
00:24:47,600 --> 00:24:49,840
Let's, let's crowd-source the
problem out to the community."
460
00:24:49,840 --> 00:24:54,320
[Murphy]. Okay, maybe
the problem is in hardware.
461
00:24:54,320 --> 00:24:56,360
"And let's get the
community thinking
462
00:24:56,360 --> 00:24:58,280
about how to solve
these problems.
463
00:24:58,280 --> 00:24:59,880
Let's get the community making
the next big technological steps
464
00:24:59,880 --> 00:25:03,600
and making huge innovations."
465
00:25:03,600 --> 00:25:05,000
And that's what I
think is really exciting.
466
00:25:05,000 --> 00:25:08,520
[Helicopter whirring].
467
00:25:08,520 --> 00:25:11,760
[Laughing].
468
00:25:11,760 --> 00:25:14,880
[Murphy]. It's okay, it's okay! At first, most people are
really psyched or turned on by the idea of interactivity
469
00:25:14,880 --> 00:25:17,040
and controlling robots
and mind-controlled drones.
470
00:25:17,040 --> 00:25:21,840
But what I'm more excited about
471
00:25:21,840 --> 00:25:23,880
is the capability to
give individuals access
472
00:25:23,880 --> 00:25:28,200
or further understanding of
the tools that already exist
473
00:25:28,200 --> 00:25:31,280
inside of their own heads.
474
00:25:31,280 --> 00:25:33,680
[Russomanno]. You
know, you have switches
475
00:25:33,680 --> 00:25:36,760
and the ability to alter these states
inside of your own head, if you know how.
476
00:25:40,080 --> 00:25:44,520
[Narrator] The enigmatic
relationship between mind and body
477
00:25:44,520 --> 00:25:46,760
is the focus of experiments
at the University of California
478
00:25:46,760 --> 00:25:50,520
in San Francisco.
479
00:25:50,520 --> 00:25:52,920
Researchers here
are investigating how
480
00:25:52,920 --> 00:25:55,360
physical activity influences
mental performance,
481
00:25:55,360 --> 00:25:59,320
and how this knowledge can
be used to enhance our lives.
482
00:25:59,320 --> 00:26:02,280
A.J., we're going to take a look
at your glass brain right now.
483
00:26:02,280 --> 00:26:06,840
Can you just do some activity
484
00:26:06,840 --> 00:26:11,760
so we can see some stuff
going on in there? So what you see
485
00:26:11,760 --> 00:26:12,880
here is real-time brain activity
data being visualized,
486
00:26:12,880 --> 00:26:17,160
literally with just a
two tenths of a second delay
487
00:26:17,160 --> 00:26:19,480
in it being
generated in his brain
488
00:26:19,480 --> 00:26:22,440
to being visualized
in the glass brain.
489
00:26:22,440 --> 00:26:26,560
So the promise of this
490
00:26:26,560 --> 00:26:30,280
is that we'll identify those patterns and
images that reflect optimal performance.
491
00:26:30,280 --> 00:26:32,360
And as you understand then
that machine can feed that back in
492
00:26:32,360 --> 00:26:34,280
and train them to that state.
493
00:26:34,280 --> 00:26:37,040
Exactly.
494
00:26:37,040 --> 00:26:39,200
All right, I'm gonna
get us started.
495
00:26:39,200 --> 00:26:41,760
[Narrator] The experiment is a
video game. However, in this case,
496
00:26:41,760 --> 00:26:44,000
the test player doesn't sit
comfortably at home on his sofa.
497
00:26:44,000 --> 00:26:45,240
Ready in 10 seconds.
498
00:26:45,240 --> 00:26:49,680
Five seconds.
499
00:26:49,680 --> 00:26:53,720
[Automated female voice].
Initiating cognitive workout sequence.
500
00:26:53,720 --> 00:26:55,160
Here we go.
501
00:26:57,480 --> 00:27:02,120
[Narrator] The experiment is
literally a mind/body workout.
502
00:27:02,120 --> 00:27:06,520
While the participant
exerts his body physically,
503
00:27:06,520 --> 00:27:08,520
he challenges his mind to
solve cognitive problems.
504
00:27:14,560 --> 00:27:17,640
And our hypothesis
here is that his outcomes,
505
00:27:17,640 --> 00:27:20,680
how his attention
abilities improve,
506
00:27:20,680 --> 00:27:22,640
will exceed someone that
played a version of this game
507
00:27:22,640 --> 00:27:24,800
just sitting down
at the computer.
508
00:27:29,160 --> 00:27:31,720
[Schneider] There's a lot more to the human
mind than we encounter in our everyday lives.
509
00:27:31,720 --> 00:27:33,880
So if we try to unleash
hidden potential in our brains,
510
00:27:34,920 --> 00:27:39,720
we could learn new
things about ourselves
511
00:27:39,720 --> 00:27:42,680
and perhaps improve
our level of consciousness.
512
00:27:42,680 --> 00:27:44,480
[Automated female
voice]. Entering next level.
513
00:27:49,600 --> 00:27:52,480
The other big technological advance
that we're trying to push to the next level
514
00:27:52,480 --> 00:27:54,680
is real-time brain
recordings during an activity,
515
00:27:54,680 --> 00:27:58,320
and also looking at how
areas are communicating
516
00:27:58,320 --> 00:28:00,920
with each other.
517
00:28:00,920 --> 00:28:02,000
So the holy grail being
a direct feedback loop,
518
00:28:02,000 --> 00:28:04,360
brain to machine to brain.
519
00:28:04,360 --> 00:28:07,120
That's exactly right.
520
00:28:07,120 --> 00:28:11,120
And we could do that during
many, many different activities.
521
00:28:11,120 --> 00:28:13,920
[Automated female voice].
Cognitive workout completed,
522
00:28:13,920 --> 00:28:17,840
prepare for next...
523
00:28:17,840 --> 00:28:19,320
We have a version of this where you meditate, and
we're recording brain activity and feeding that in
524
00:28:19,320 --> 00:28:22,320
so the meditative experience
525
00:28:22,320 --> 00:28:23,800
is responding to what's
going on in your brain.
526
00:28:23,800 --> 00:28:27,360
[Man] All right, our
signals look good.
527
00:28:27,360 --> 00:28:29,600
[Narrator] Mankind has been
seeking and using connections
528
00:28:29,600 --> 00:28:32,920
between mind and body
since time immemorial.
529
00:28:32,920 --> 00:28:36,320
Meditation, for instance, has
a positive influence on people.
530
00:28:36,320 --> 00:28:40,240
This has been scientifically
proven many times over.
531
00:28:40,240 --> 00:28:43,000
[Gazzaley] The ability to take
sometimes ancient practices
532
00:28:43,000 --> 00:28:46,520
and bring in technology so
that we could create algorithms,
533
00:28:46,520 --> 00:28:50,240
so that they're adaptive
and personalized,
534
00:28:50,240 --> 00:28:52,520
that's the really massive
win that we're seeing now.
535
00:28:52,520 --> 00:28:55,640
So there's some really
solid evidence coming out
536
00:28:55,640 --> 00:28:57,800
that these ancient practices
537
00:28:57,800 --> 00:28:59,560
can have very tangible
benefits on our biology.
538
00:28:59,560 --> 00:29:02,840
But if we can bring in
all of this technology,
539
00:29:02,840 --> 00:29:05,640
and get it into
people's hands in a way
540
00:29:05,640 --> 00:29:06,960
that they find much more
enjoyable and engaging,
541
00:29:06,960 --> 00:29:09,240
especially young people,
542
00:29:09,240 --> 00:29:13,080
these can be tools
to elevate our minds.
543
00:29:13,080 --> 00:29:15,200
And if we think about
that from this perspective,
544
00:29:15,200 --> 00:29:17,280
really the sky's the limit
on what we can accomplish.
545
00:29:19,960 --> 00:29:25,680
The point is,
546
00:29:25,680 --> 00:29:28,240
most of the effort has always been about capability,
the ability of the machine to do clever things,
547
00:29:28,240 --> 00:29:30,280
and very little has been
thinking about: Is it safe?
548
00:29:31,840 --> 00:29:35,200
Does it actually
produce good outcomes?
549
00:29:38,680 --> 00:29:40,960
[Narrator] But many researchers
550
00:29:40,960 --> 00:29:43,320
are well aware of the
risks of scientific progress.
551
00:29:43,320 --> 00:29:47,560
[Russomanno]. With respect to the idea
of the merger between man and machine,
552
00:29:47,560 --> 00:29:51,600
I think it's both very exciting,
553
00:29:51,600 --> 00:29:54,280
but also very frightening
at the same time.
554
00:29:54,280 --> 00:29:58,520
I think we have to be
very smart and cautious
555
00:29:58,520 --> 00:30:01,280
about how we
proceed into the future,
556
00:30:01,280 --> 00:30:04,480
and really approach the future
557
00:30:04,480 --> 00:30:07,760
while looking far
off into the horizon.
558
00:30:07,760 --> 00:30:11,160
We've made a lot of mistakes
559
00:30:11,160 --> 00:30:13,280
regarding data and
privacy and security
560
00:30:13,280 --> 00:30:14,640
with the emergence
of the Internet.
561
00:30:16,080 --> 00:30:21,160
And I think that we should
learn from those mistakes
562
00:30:21,160 --> 00:30:23,280
as we begin to connect
our brain to the Internet.
563
00:30:26,840 --> 00:30:29,120
[Sandberg]. In many ways,
if we could continue forever,
564
00:30:29,120 --> 00:30:31,440
by being very
cautious, very boring,
565
00:30:31,440 --> 00:30:33,200
uncreative, and never
doing anything fun,
566
00:30:33,200 --> 00:30:36,280
I think most people would say,
567
00:30:36,280 --> 00:30:37,200
"that's not the
right way of living.
568
00:30:37,201 --> 00:30:39,201
Some some risk is necessary."
569
00:30:44,000 --> 00:30:48,240
[Goertzel]. Imagine you
can modify your mind
570
00:30:48,240 --> 00:30:50,400
any way that you can imagine it
571
00:30:50,400 --> 00:30:52,880
and specify in
detail and program.
572
00:30:52,880 --> 00:30:55,200
Then you become
something massively smarter
573
00:30:55,200 --> 00:30:57,320
than what you can imagine now.
574
00:30:57,320 --> 00:31:00,720
You know,
575
00:31:00,720 --> 00:31:04,680
"I wish I could remember 1,000 things
in my mind at one time instead of just 10."
576
00:31:04,680 --> 00:31:06,120
"Okay, let's increase the
short-term memory buffer size," right?
577
00:31:06,120 --> 00:31:09,280
"I'd like to have a calculator
mathematica and Google
578
00:31:09,280 --> 00:31:12,840
accessible inside my
mind, by the power of thought,
579
00:31:12,840 --> 00:31:15,360
instead of having to interact
with these things by my fingers,
580
00:31:15,360 --> 00:31:17,840
which is very slow and awkward."
581
00:31:17,840 --> 00:31:19,440
"Okay."
582
00:31:22,160 --> 00:31:24,400
[Narrator] One goal of research
583
00:31:24,400 --> 00:31:25,520
is to enhance certain
abilities of the human brain.
584
00:31:25,520 --> 00:31:28,440
I take that, plug
it into this box...
585
00:31:28,440 --> 00:31:29,800
I meet my friend Michael,
586
00:31:29,800 --> 00:31:31,440
who's a neuroscientist
587
00:31:31,440 --> 00:31:32,600
that's been studying
how electrical activity
588
00:31:32,600 --> 00:31:36,120
can change the
way the brain works.
589
00:31:36,120 --> 00:31:37,120
Your body is going to conduct
electricity from the right side...
590
00:31:37,121 --> 00:31:39,721
[Narrator] Once again,
591
00:31:39,720 --> 00:31:44,840
Mikey Siegel plays the part of
the human guinea pig. [Siegel] So
592
00:31:44,840 --> 00:31:47,080
there's all these sensors
around this conference
593
00:31:47,080 --> 00:31:47,920
that are reading the
electrical activity of the brain.
594
00:31:47,921 --> 00:31:50,801
But what this is actually doing
595
00:31:50,800 --> 00:31:52,640
is essentially changing
596
00:31:52,640 --> 00:31:55,000
the electrical
activity of the brain,
597
00:31:55,000 --> 00:31:55,600
and you're targeting a
specific part of my brain...
598
00:31:55,601 --> 00:32:00,561
Yeah.
599
00:32:00,560 --> 00:32:02,920
That is going to have a certain effect, in
this case for visual processing. That's right.
600
00:32:02,920 --> 00:32:04,320
Exactly.
601
00:32:04,320 --> 00:32:05,600
[Siegel] Mmm, it's really cool.
602
00:32:05,600 --> 00:32:07,440
We can start the stimulation
when you're ready.
603
00:32:08,040 --> 00:32:11,960
I'm ready.
604
00:32:11,960 --> 00:32:14,800
All right.
605
00:32:14,800 --> 00:32:17,160
[Siegel] He wires me up with his latest gadget. And
immediately, I can feel the tingling on my head.
606
00:32:17,160 --> 00:32:19,920
I'm in your capable
hands here, Mike.
607
00:32:19,920 --> 00:32:22,040
And I can taste this
metallic taste in my mouth.
608
00:32:22,040 --> 00:32:26,280
And I also started to notice
609
00:32:26,280 --> 00:32:29,360
that my speech seemed a little weird.
I certainly wouldn't want to, you know,
610
00:32:29,360 --> 00:32:31,320
feel like that all
day every day.
611
00:32:31,320 --> 00:32:34,200
Yeah.
612
00:32:34,200 --> 00:32:34,560
Okay, so I have a
question for you, Mikey.
613
00:32:34,561 --> 00:32:37,521
Yeah.
614
00:32:37,520 --> 00:32:40,400
Do you remember the project that you
and I talked about putting in at the MoMA?
615
00:32:40,400 --> 00:32:41,960
Yeah.
616
00:32:41,960 --> 00:32:44,280
Tell us about it.
617
00:32:44,280 --> 00:32:45,720
Okay, well, I remember we
talked about putting a project...
618
00:32:45,720 --> 00:32:49,400
It was a year ago.
619
00:32:49,400 --> 00:32:53,360
Yeah, at this conference. One
year ago, we were downstairs. Yeah.
620
00:32:53,360 --> 00:32:55,960
I remember I had just had a fight with
my girlfriend. Oh, I didn't know that part.
621
00:32:55,960 --> 00:32:58,680
Anyway...
622
00:32:58,680 --> 00:33:00,300
And I'm trying to remember
what the details were.
623
00:33:00,320 --> 00:33:04,160
Some kind of installation.
624
00:33:04,160 --> 00:33:06,280
Yeah.
625
00:33:06,280 --> 00:33:07,320
I don't really
remember all the details.
626
00:33:07,320 --> 00:33:09,400
And I'm totally stumped.
627
00:33:09,400 --> 00:33:11,040
I can't remember the answer.
628
00:33:11,040 --> 00:33:15,040
And he explained to me
629
00:33:15,040 --> 00:33:16,880
that one of the effects of this electrical stimulation
is that it's in certain ways impairing my memory
630
00:33:16,880 --> 00:33:20,080
and my language processing.
631
00:33:20,080 --> 00:33:23,640
We are now trying to pump up...
632
00:33:23,640 --> 00:33:25,440
[Narrator] The aim of the experiment is
to significantly improve visual perception
633
00:33:25,440 --> 00:33:27,600
and the processing of
optical information in the brain.
634
00:33:28,400 --> 00:33:32,920
Do you still have your phone?
635
00:33:32,920 --> 00:33:37,000
Yeah.
636
00:33:37,000 --> 00:33:40,240
Let's have you do something hardcore visual.
So I pull up this game that I've been playing
637
00:33:40,240 --> 00:33:42,360
and a little frustrated with
for the past few months.
638
00:33:42,360 --> 00:33:46,120
[Narrator] Mikey starts
playing as he normally would.
639
00:33:46,120 --> 00:33:48,480
It actually makes
it more visual.
640
00:33:48,480 --> 00:33:50,760
[Narrator] Then quickly realizes
641
00:33:50,760 --> 00:33:51,920
his perception of
the game is different.
642
00:33:51,920 --> 00:33:54,280
It's sharper,
focused, more lucid.
643
00:33:54,280 --> 00:33:58,120
[Siegel] Within
about 10 minutes,
644
00:33:58,120 --> 00:34:00,240
I get a high score.
645
00:34:00,240 --> 00:34:02,800
Wow, I've never done...
646
00:34:02,800 --> 00:34:06,040
I've never beat
that level before.
647
00:34:06,040 --> 00:34:09,840
[Narrator] Mikey is
astounded by his new abilities.
648
00:34:09,840 --> 00:34:13,960
[Siegel] Wow.
649
00:34:13,960 --> 00:34:16,400
So I had...
650
00:34:16,400 --> 00:34:19,240
Before this day, I had never... I
don't think I've even gotten half as far.
651
00:34:19,240 --> 00:34:21,960
And then when I looked up
652
00:34:21,960 --> 00:34:25,280
and started just
looking around the room,
653
00:34:25,280 --> 00:34:28,320
I started to notice that
I could spot fine details,
654
00:34:28,320 --> 00:34:31,440
I could look out the window
and see the leaves on the trees.
655
00:34:31,440 --> 00:34:33,160
It was really a very
interesting experience.
656
00:34:37,520 --> 00:34:41,640
What I notice
657
00:34:41,640 --> 00:34:46,360
is this possibility to actually totally zone into what I'm
doing. It was a little hard with all the cameras around me.
658
00:34:46,360 --> 00:34:48,000
But when I hit it, it
was quite incredible.
659
00:34:48,000 --> 00:34:51,920
Mike, what did you
just do to my brain?
660
00:34:51,920 --> 00:34:53,960
Because I seem to have
extra-special abilities here.
661
00:34:56,520 --> 00:35:01,480
[Schneider] We may have an
opportunity to sculpt our brains
662
00:35:01,480 --> 00:35:03,360
and there may be a plurality
of different kinds of minds.
663
00:35:03,360 --> 00:35:07,800
One person may
have echolocation,
664
00:35:07,800 --> 00:35:09,560
another could have a
mobile Internet connection.
665
00:35:09,560 --> 00:35:12,160
Another could be
incredibly smart,
666
00:35:12,160 --> 00:35:15,400
could be a superintelligence.
667
00:35:15,400 --> 00:35:17,600
They may choose to
enhance themselves
668
00:35:17,600 --> 00:35:19,800
in ways that they feel
their employer wants.
669
00:35:19,800 --> 00:35:23,640
And we have to
have legal protection
670
00:35:23,640 --> 00:35:25,720
for those people who
decide not to enhance.
671
00:35:25,720 --> 00:35:29,080
So it was amazing to have this
experience with this technology.
672
00:35:29,080 --> 00:35:32,120
What if you actually took
multiple devices simultaneously,
673
00:35:32,120 --> 00:35:34,600
and you actually connected
them to a group of people?
674
00:35:37,680 --> 00:35:43,000
[Harris]. Well, if
we could merge two
675
00:35:43,000 --> 00:35:45,200
human brains, I think that would
produce a very different kind of mind.
676
00:35:45,200 --> 00:35:50,280
We know that to be the case,
677
00:35:50,280 --> 00:35:53,560
because people have had their brains divided
surgically to mitigate grand mal epileptic seizures.
678
00:35:53,560 --> 00:35:57,480
And you have two
minds at that point
679
00:35:57,480 --> 00:36:01,160
that are no longer
communicating with one another.
680
00:36:01,160 --> 00:36:05,600
In certain circumstances,
681
00:36:05,600 --> 00:36:09,200
they can get into a kind of tug of war with each other
so that literally the left hand is buttoning the shirt
682
00:36:09,200 --> 00:36:12,160
and the right hand
is unbuttoning it.
683
00:36:12,160 --> 00:36:14,240
But it's not in
principle different
684
00:36:14,240 --> 00:36:15,760
than uniting two
separate human beings,
685
00:36:17,680 --> 00:36:22,200
if we had the
technology to do that.
686
00:36:22,200 --> 00:36:26,480
[Indistinct news
reports overlapping].
687
00:36:26,480 --> 00:36:28,240
[Narrator] This is
not science fiction.
688
00:36:28,240 --> 00:36:31,040
There is a very real concern
689
00:36:31,040 --> 00:36:32,560
that unchecked
technological development
690
00:36:32,560 --> 00:36:34,760
will trigger an
unpredictable chain reaction.
691
00:36:34,760 --> 00:36:37,080
[Indistinct news
reports overlapping].
692
00:36:40,720 --> 00:36:42,160
Today, virtually anyone with
adequate programming knowledge
693
00:36:42,160 --> 00:36:44,840
can design body-brain
interfaces and tailor-made software
694
00:36:44,840 --> 00:36:48,520
without regulation.
695
00:36:48,520 --> 00:36:51,120
[Indistinct news
reports overlapping].
696
00:36:51,120 --> 00:36:53,680
What we have created today,
697
00:36:53,680 --> 00:36:55,240
in our technological landscape,
698
00:36:55,240 --> 00:36:58,600
is an expression of who and
what we are as a human culture.
699
00:37:07,120 --> 00:37:11,200
[Narrator] Culture entails,
among other things,
700
00:37:11,200 --> 00:37:14,800
interpersonal contact,
701
00:37:14,800 --> 00:37:16,400
and exchange of
information, thoughts, feelings.
702
00:37:16,400 --> 00:37:20,640
We share these
things with others,
703
00:37:20,640 --> 00:37:22,880
sometimes in a playful way.
704
00:37:22,880 --> 00:37:26,200
Technology is an
instrument for this,
705
00:37:26,200 --> 00:37:28,640
but because of
technological progress,
706
00:37:28,640 --> 00:37:30,400
much is at stake in the future.
707
00:37:30,400 --> 00:37:36,040
[Sandberg]. Losing a game
708
00:37:36,040 --> 00:37:38,720
is not necessarily a bad thing,
because the point of the game is the activity.
709
00:37:38,720 --> 00:37:41,960
What about losing the world?
710
00:37:41,960 --> 00:37:44,400
That's kind of different
711
00:37:44,400 --> 00:37:45,920
because in many ways,
712
00:37:45,920 --> 00:37:46,800
we actually care about
being in the world.
713
00:37:46,801 --> 00:37:49,641
But we do care about also
714
00:37:49,640 --> 00:37:51,840
kind of playing the
game of existence.
715
00:37:51,840 --> 00:37:54,240
There has been a number of
human species on this planet.
716
00:38:01,360 --> 00:38:03,840
All of these homo species have
lived and they have passed away.
717
00:38:03,960 --> 00:38:07,240
Who says that homo
sapiens are any different?
718
00:38:12,400 --> 00:38:14,360
It's not obvious to me that a
replacement of our species
719
00:38:14,360 --> 00:38:18,400
by our own
technological creation
720
00:38:18,400 --> 00:38:21,480
would necessarily
be a bad thing.
721
00:38:21,480 --> 00:38:25,800
[Narrator] The future of mankind
722
00:38:25,800 --> 00:38:27,800
seems inextricably connected
to technological progress.
723
00:38:27,800 --> 00:38:30,800
For some, this makes it all
the more imperative to explore
724
00:38:30,800 --> 00:38:34,360
and use every possibility.
725
00:38:34,360 --> 00:38:37,040
[Cell phone ringing].
726
00:38:37,040 --> 00:38:41,720
[Siegel] Hello?
727
00:38:41,720 --> 00:38:42,680
[Craig over speaker phone]. Hey,
how you doing? What you up to?
728
00:38:42,681 --> 00:38:45,601
Oh, hey, Craig.
729
00:38:45,600 --> 00:38:48,120
Yeah, I'm on my way to the hackathon right
now. I should be there in like five minutes.
730
00:38:48,120 --> 00:38:52,720
There should be probably
about 15 or 20 folks
731
00:38:52,720 --> 00:38:55,800
from the consciousness
hacking community
732
00:38:55,800 --> 00:38:58,000
that are going to be there.
733
00:38:58,000 --> 00:39:02,120
We've got some people
734
00:39:02,120 --> 00:39:04,600
that are bringing in some pretty big projects, actually.
There's someone coming who's been building a nightclub
735
00:39:04,600 --> 00:39:09,360
that interacts with your mind.
736
00:39:09,360 --> 00:39:12,080
When I started this
journey, I felt totally alone.
737
00:39:12,080 --> 00:39:15,880
But things began to
change very quickly.
738
00:39:15,880 --> 00:39:18,200
I started a small community
called consciousness hacking,
739
00:39:18,200 --> 00:39:20,600
which started as 10
people sitting around a table
740
00:39:20,600 --> 00:39:22,160
discussing these ideas.
741
00:39:25,440 --> 00:39:28,600
[Narrator] Mikey's group
was on the right track.
742
00:39:28,600 --> 00:39:30,920
Within a year they
had 100 members
743
00:39:30,920 --> 00:39:33,640
who regularly met in the
San Francisco bay area.
744
00:39:33,640 --> 00:39:36,280
A year later, there are about
20 other such groups worldwide,
745
00:39:36,280 --> 00:39:39,200
totalling over 10,000 members.
746
00:39:44,680 --> 00:39:48,920
Their goal is to explore
ways to use technology
747
00:39:48,920 --> 00:39:50,840
to expand human consciousness.
748
00:39:58,320 --> 00:40:02,040
Mikey Siegel seems to have
arrived at this destination.
749
00:40:02,040 --> 00:40:07,600
He and his group
750
00:40:07,600 --> 00:40:10,280
have found a home for their dreams,
thanks to a generous investor
751
00:40:10,280 --> 00:40:11,720
who believes in
their utopian vision.
752
00:40:17,200 --> 00:40:22,200
The co-hack home, the
consciousness hacking home,
753
00:40:22,200 --> 00:40:24,800
is the centre of activity for
this special group of people.
754
00:40:24,800 --> 00:40:29,760
Hey, good to see you.
755
00:40:29,760 --> 00:40:32,120
[Narrator] They see themselves
as a kind of modern commune.
756
00:40:32,120 --> 00:40:35,560
The members live
together and work together.
757
00:40:35,560 --> 00:40:40,880
At first glance,
758
00:40:40,880 --> 00:40:42,600
this is reminiscent of the communes of the late
1960s, and their quest to expand human consciousness.
759
00:40:42,600 --> 00:40:46,560
[Indistinct chattering].
760
00:40:46,560 --> 00:40:51,640
And did he tell you
that he's starting...
761
00:40:51,640 --> 00:40:52,640
[Narrator] Mikey Siegel now works at
the Stanford University School of Medicine.
762
00:40:52,641 --> 00:40:56,241
He's created his own curriculum
763
00:40:56,240 --> 00:40:57,800
for a new subject called
consciousness hacking.
764
00:40:57,800 --> 00:41:01,840
We've got some
fun projects going on.
765
00:41:01,840 --> 00:41:03,480
I have like a class on
technology and meditation.
766
00:41:03,480 --> 00:41:07,160
And then all the
way on the other side
767
00:41:07,160 --> 00:41:07,920
is a collaboration
with the d.school,
768
00:41:07,921 --> 00:41:10,961
the design school,
769
00:41:10,960 --> 00:41:15,640
where students actually form into
teams. And so it's like individualized...
770
00:41:15,640 --> 00:41:16,880
[Siegel] The design school are
getting closer with the business school
771
00:41:16,880 --> 00:41:18,240
and actually beginning
to develop a lab,
772
00:41:18,240 --> 00:41:19,960
or some kind of work
or collaboration space.
773
00:41:25,440 --> 00:41:27,440
[Narrator] The co-hack
home has several workspaces.
774
00:41:27,600 --> 00:41:32,480
Here, researchers
from different disciplines
775
00:41:32,480 --> 00:41:34,640
develop, plan, and construct
new types of electronics,
776
00:41:34,640 --> 00:41:37,400
new program codes,
and software systems.
777
00:41:39,320 --> 00:41:44,600
Their mission is to
create completely
778
00:41:44,600 --> 00:41:47,800
new technologies, and to use
those technologies to design devices
779
00:41:47,800 --> 00:41:50,640
that will enhance
human consciousness,
780
00:41:50,640 --> 00:41:53,640
elevate it to previously
unimaginable levels.
781
00:41:53,640 --> 00:41:56,680
[Siegel] Can you get
data off of this thing...?
782
00:41:56,680 --> 00:41:57,680
[Man] It doesn't give
us the data inherently.
783
00:41:57,681 --> 00:42:00,761
We'd have to hack into it.
784
00:42:00,760 --> 00:42:01,840
[Narrator] Mikey Siegel is not
only the founder of a new movement,
785
00:42:01,840 --> 00:42:05,040
he is also a pioneering
thinker and a visionary.
786
00:42:05,040 --> 00:42:10,320
[Siegel] One of
my main interests
787
00:42:10,320 --> 00:42:13,760
is not just how the individual can transform, but
how we can create interpersonal transformation,
788
00:42:13,760 --> 00:42:16,440
how we can shift the
relationship between people.
789
00:42:21,640 --> 00:42:23,120
And if we think about
it, this is nothing new.
790
00:42:23,120 --> 00:42:25,440
This has been at the
core of human culture
791
00:42:25,440 --> 00:42:28,840
for thousands of years.
792
00:42:28,840 --> 00:42:30,160
It's a defining quality
of human beings.
793
00:42:30,160 --> 00:42:35,480
If I were to ask you
794
00:42:35,480 --> 00:42:38,680
what the most meaningful and important
experience you've ever had with another person was,
795
00:42:38,680 --> 00:42:40,320
it probably wouldn't be
about an idea or a thought.
796
00:42:40,320 --> 00:42:43,160
It would be about a feeling.
797
00:42:45,080 --> 00:42:48,520
[Indistinct chattering].
798
00:42:49,400 --> 00:42:54,800
And my question is:
799
00:42:54,800 --> 00:42:57,520
How can we use technology to help
support this already innate capacity
800
00:42:58,360 --> 00:43:01,680
for deep, profound
human connection?
801
00:43:06,440 --> 00:43:08,800
[Narrator] Mikey's group
is constantly working
802
00:43:08,800 --> 00:43:10,560
on the development
and creation of new tools.
803
00:43:11,200 --> 00:43:15,920
This laid-back,
free-form workspace
804
00:43:15,920 --> 00:43:17,720
is a hub focused on
helping people to develop,
805
00:43:20,040 --> 00:43:23,440
using the most advanced
technologies available.
806
00:43:25,160 --> 00:43:29,600
Devices that
measure brainwaves...
807
00:43:29,600 --> 00:43:32,720
[Man 1]. It's a
collaboration between...
808
00:43:32,720 --> 00:43:35,400
[Man 2]. Having
the neuro and bio...
809
00:43:35,400 --> 00:43:36,760
[Narrator] Computers that
capture individual states of being.
810
00:43:36,760 --> 00:43:41,200
So let me activate
the EEG again.
811
00:43:41,200 --> 00:43:42,720
[Narrator] But the researchers
are thinking far beyond
812
00:43:42,720 --> 00:43:44,320
purely scientific and
technical contexts.
813
00:43:44,320 --> 00:43:48,480
The tilt axis...
814
00:43:48,480 --> 00:43:49,520
[Narrator] They want to create
the technology of the future
815
00:43:49,520 --> 00:43:52,640
to awaken people to
higher consciousness.
816
00:43:52,640 --> 00:43:55,120
[Indistinct chattering].
817
00:43:58,200 --> 00:44:01,280
We just got funding
818
00:44:01,280 --> 00:44:04,800
for a brand-new and very exciting
project, really an experiment,
819
00:44:04,800 --> 00:44:06,120
to see what happens when
we use new types of technology
820
00:44:06,120 --> 00:44:09,880
to support a deep
sense of connection
821
00:44:09,880 --> 00:44:12,560
between a group of people.
822
00:44:12,560 --> 00:44:15,600
What we do is
823
00:44:15,600 --> 00:44:18,840
we actually take special sensors that
monitor the brain and that monitor the heart.
824
00:44:21,640 --> 00:44:23,040
And we wire up
not just one person,
825
00:44:23,040 --> 00:44:24,600
but we wire up the
whole group together.
826
00:44:29,120 --> 00:44:30,880
And then a computer
system analyzes all the data
827
00:44:30,880 --> 00:44:34,680
coming from this group.
828
00:44:34,680 --> 00:44:37,000
And what it's doing is
it's actually comparing
829
00:44:37,000 --> 00:44:39,640
against a scientific
understanding
830
00:44:39,640 --> 00:44:42,120
of what it looks like in
the brain, and in the heart,
831
00:44:42,120 --> 00:44:46,720
when we fall into a very
natural sense of deep connection
832
00:44:46,720 --> 00:44:48,840
that is always available,
but not always accessible.
833
00:44:54,440 --> 00:44:56,800
We talk about feeling
synced up with a person,
834
00:44:56,800 --> 00:44:59,800
and sure, there's
slang around that.
835
00:44:59,800 --> 00:45:01,600
But this actually has
real scientific meaning.
836
00:45:06,120 --> 00:45:10,200
And our experiment
is to explore:
837
00:45:10,200 --> 00:45:12,600
Can we actually use this
special type of technology,
838
00:45:12,600 --> 00:45:16,720
not just to measure
the connection,
839
00:45:16,720 --> 00:45:18,000
but to actually
guide this group?
840
00:45:22,520 --> 00:45:25,560
And this is a new frontier.
841
00:45:25,560 --> 00:45:27,760
This possibility of what
happens when technology is used
842
00:45:30,480 --> 00:45:35,240
not just to connect
us through information
843
00:45:35,240 --> 00:45:39,320
is an incredible, profound
space of possibility.
844
00:45:39,320 --> 00:45:42,560
[Narrator] The expansion of human
consciousness through modern technology
845
00:45:42,560 --> 00:45:46,600
is still in its infancy.
846
00:45:46,600 --> 00:45:48,480
But scientists are
already hard at work
847
00:45:48,480 --> 00:45:51,440
studying the possible
consequences.
848
00:45:51,440 --> 00:45:54,000
You know, I think that in
terms of what we can expect
849
00:45:54,000 --> 00:45:57,000
from the next five to ten years,
850
00:45:57,000 --> 00:45:59,960
I get really excited about it
851
00:45:59,960 --> 00:46:01,840
because I think that we're
going to really start to develop
852
00:46:01,840 --> 00:46:04,400
what we can do with
brain computer interfaces.
853
00:46:04,400 --> 00:46:07,560
Would we have a
bifurcated society?
854
00:46:07,560 --> 00:46:09,380
Would people of such different
types even want to live together
855
00:46:09,400 --> 00:46:14,440
in the same civilization?
856
00:46:14,440 --> 00:46:15,960
This is only the beginning of
the kind of questions and issues
857
00:46:15,960 --> 00:46:18,440
which we'll need to address.
858
00:46:22,120 --> 00:46:24,680
[Goertzel] We are at the
transition between animals
859
00:46:24,680 --> 00:46:27,960
that cannot understand
what they are
860
00:46:27,960 --> 00:46:29,800
and reflect on their own
thinking and behaviour
861
00:46:29,800 --> 00:46:32,560
and the next stage
of intelligence,
862
00:46:32,560 --> 00:46:34,240
which is intelligent beings
that can be fully self-reflective,
863
00:46:34,240 --> 00:46:38,480
understand their
own minds and bodies,
864
00:46:38,480 --> 00:46:39,280
and repurpose and refactor
and improve themselves
865
00:46:39,281 --> 00:46:43,281
according to their goals.
866
00:46:43,280 --> 00:46:45,120
And a relatively small
number of evolutionary changes
867
00:46:45,120 --> 00:46:47,840
led to all the things that the
human brain has produced,
868
00:46:47,840 --> 00:46:50,720
modern technological
societies, complex organizations,
869
00:46:52,040 --> 00:46:56,280
scientific understanding
of the world.
870
00:46:56,280 --> 00:46:58,680
It just seems very plausible
871
00:46:58,680 --> 00:47:01,160
that some similar additional
tweaks on top of that,
872
00:47:01,160 --> 00:47:04,160
you know, whether by biologically
enhancing the human condition
873
00:47:04,160 --> 00:47:06,920
or creating AIs
that surpass us,
874
00:47:06,920 --> 00:47:09,080
could have an equal or much
greater impact on the world.
875
00:47:14,680 --> 00:47:16,240
The change that we
are exerting on the world
876
00:47:16,240 --> 00:47:20,080
is far more dramatic
877
00:47:20,080 --> 00:47:21,600
than the change that Homo
sapiens was exerting on the world
878
00:47:21,600 --> 00:47:24,720
when Neanderthal man
was watching it happen.
879
00:47:24,720 --> 00:47:29,080
Our biological bodies
were never selected
880
00:47:29,080 --> 00:47:31,280
to live in this breakneck,
changing world of technology,
881
00:47:32,400 --> 00:47:37,520
and who knows what
the future may bring?
882
00:47:37,520 --> 00:47:40,000
[Narrator] Technological progress
seems to be just a click away.
883
00:47:44,080 --> 00:47:47,560
[Distorted overlapping voices].
884
00:47:51,240 --> 00:47:53,360
[Narrator] But are we really
facing a glorious future?
885
00:47:58,760 --> 00:48:03,400
Mikey Siegel wants to connect,
886
00:48:03,400 --> 00:48:05,920
indeed, to reconcile,
mankind and technology.
887
00:48:05,920 --> 00:48:09,720
But even for him, many
questions remain unanswered.
888
00:48:09,720 --> 00:48:13,960
[Siegel] So where do
we want to go from here?
889
00:48:13,960 --> 00:48:17,440
We're about to see a
generation born into a world
890
00:48:17,440 --> 00:48:19,680
where every waking moment
will be mediated by technology.
891
00:48:22,200 --> 00:48:26,440
But can we really separate
who we are from what we build?
892
00:48:26,440 --> 00:48:30,520
Isn't technology just a mirror
893
00:48:30,520 --> 00:48:31,960
of who and what we
are as humanity?
894
00:48:36,640 --> 00:48:38,520
Perhaps the smartest
are not always the fittest.
895
00:48:44,400 --> 00:48:47,200
The greatest innovation of
humanity will not be a gadget.
896
00:48:48,960 --> 00:48:52,960
It will be the software
that runs our minds
897
00:48:52,960 --> 00:48:55,600
and our hearts.
898
00:48:55,600 --> 00:49:00,480
The greatest innovation
899
00:49:00,480 --> 00:49:02,320
of humanity will be our
inner operating system.