1
00:00:02,497 --> 00:00:04,565
We all spend our lives
2
00:00:04,567 --> 00:00:07,268
on a search
for something so close,
3
00:00:07,270 --> 00:00:11,038
yet always just out of reach.
4
00:00:11,040 --> 00:00:13,741
Some call it the ego.
5
00:00:13,743 --> 00:00:15,776
Others, the soul.
6
00:00:19,347 --> 00:00:22,516
Now modern science
is prying into our thoughts,
7
00:00:22,518 --> 00:00:25,719
our memories, and our dreams,
8
00:00:25,721 --> 00:00:28,923
and asking the profoundly
puzzling question,
9
00:00:28,925 --> 00:00:32,927
"What makes us who we are?"
10
00:00:36,398 --> 00:00:41,001
Space, time, life itself.
11
00:00:43,405 --> 00:00:47,975
The secrets of the cosmos
lie through the wormhole.
12
00:00:47,977 --> 00:00:51,977
Through the Wormhole 03x04
What makes us who we are? Original Air Date on June 20, 2012
13
00:00:52,002 --> 00:00:56,002
== sync, corrected by elderman ==
14
00:00:59,955 --> 00:01:04,558
What is it that makes me me?
15
00:01:04,560 --> 00:01:08,729
Or makes you you?
16
00:01:08,731 --> 00:01:11,365
Is it the things we know?
17
00:01:11,367 --> 00:01:14,468
The people and places
we've experienced?
18
00:01:14,470 --> 00:01:18,372
What makes me the same person
now as when I was 40?
19
00:01:18,374 --> 00:01:21,141
Or when I was 10?
20
00:01:21,143 --> 00:01:23,744
What is it
that gives each one of us
21
00:01:23,746 --> 00:01:26,413
our unique identity?
22
00:01:26,415 --> 00:01:30,384
Scientists are beginning to
tackle this profound puzzle.
23
00:01:30,386 --> 00:01:32,519
To do so, they must probe
24
00:01:32,521 --> 00:01:37,124
one of the last great frontiers
of our understanding --
25
00:01:37,126 --> 00:01:38,926
the brain.
26
00:01:38,928 --> 00:01:44,131
Every summer, I used to go
camping with the Boy Scouts.
27
00:01:44,133 --> 00:01:46,767
Each item of clothing
I took with me
28
00:01:46,769 --> 00:01:49,103
had to have my name
sewn onto it.
29
00:01:49,105 --> 00:01:50,437
"Morgan."
30
00:01:50,439 --> 00:01:54,508
It was a name I never really
liked when I was a boy.
31
00:01:54,510 --> 00:01:57,277
I wondered how my life
might be different
32
00:01:57,279 --> 00:02:03,217
if I had been called something
normal like Robert or John.
33
00:02:03,219 --> 00:02:04,718
Well, like it or not,
34
00:02:04,720 --> 00:02:07,554
the name I was given
framed my identity.
35
00:02:07,556 --> 00:02:11,525
It helped make me who I am.
36
00:02:11,527 --> 00:02:12,626
Happy Birthday!
37
00:02:12,628 --> 00:02:14,261
Look, he's walking!
38
00:02:15,297 --> 00:02:16,330
Show your mom.
39
00:02:16,332 --> 00:02:17,364
Bye!
40
00:02:17,366 --> 00:02:18,565
It's beautiful!
41
00:02:19,702 --> 00:02:21,635
Wish you were here.
42
00:02:24,439 --> 00:02:27,241
Alison Gopnik is on a search
43
00:02:27,243 --> 00:02:31,812
to discover how and when
we first understand who we are.
44
00:02:31,814 --> 00:02:34,081
She's a child psychologist
45
00:02:34,083 --> 00:02:36,817
at the University of California
at Berkeley.
46
00:02:36,819 --> 00:02:39,119
So, the process
of forging an identity,
47
00:02:39,121 --> 00:02:41,388
of figuring out
who it is that we are,
48
00:02:41,390 --> 00:02:43,557
that's a process that really
takes us our whole lifetime.
49
00:02:43,559 --> 00:02:46,660
But some of the most crucial
parts of that
50
00:02:46,662 --> 00:02:48,495
seem to be things
that we're learning
51
00:02:48,497 --> 00:02:50,664
in this very, very early period
of our lives.
52
00:02:50,666 --> 00:02:52,066
As adults,
53
00:02:52,068 --> 00:02:55,202
it's easy to take
our identities for granted.
54
00:02:55,204 --> 00:02:57,204
We accept that who we are now
55
00:02:57,206 --> 00:03:00,107
is the same as who we were
a minute ago.
56
00:03:00,109 --> 00:03:02,009
But Alison has discovered
57
00:03:02,011 --> 00:03:05,946
that identity is not so solid
for young children.
58
00:03:05,948 --> 00:03:07,481
They spend much of their time
59
00:03:07,483 --> 00:03:12,586
trying to figure out
just who they are.
60
00:03:12,588 --> 00:03:15,322
So, when kids are just doing the
everyday things that kids do,
61
00:03:15,324 --> 00:03:18,525
when they're playing
and exploring and pretending,
62
00:03:18,527 --> 00:03:20,928
what they're actually doing
is being involved
63
00:03:20,930 --> 00:03:23,964
in this great, existential,
philosophical research program.
64
00:03:23,966 --> 00:03:28,035
What it means to be a person.
65
00:03:28,037 --> 00:03:31,105
One of the first
milestones in this program
66
00:03:31,107 --> 00:03:33,774
is called "The mirror stage."
67
00:03:33,776 --> 00:03:36,877
It begins when a child
is first able to recognize
68
00:03:36,879 --> 00:03:39,747
his or her own reflection.
69
00:03:39,749 --> 00:03:41,348
The thing
about a mirror isn't just
70
00:03:41,350 --> 00:03:43,817
that you see a body and a face,
which is fascinating,
71
00:03:43,819 --> 00:03:45,052
but you connect it
72
00:03:45,054 --> 00:03:46,820
to your own kinesthetic feeling
of yourself --
73
00:03:46,822 --> 00:03:48,222
the way your own body feels.
74
00:03:48,224 --> 00:03:51,725
Alison uses a clever
experiment on her toddlers
75
00:03:51,727 --> 00:03:54,962
to detect which of them
have developed an awareness
76
00:03:54,964 --> 00:03:56,663
of their reflections.
77
00:03:56,665 --> 00:04:00,267
She places a smudge of blue ink
on their noses
78
00:04:00,269 --> 00:04:03,303
and tells them
to look in the mirror.
79
00:04:03,305 --> 00:04:05,572
There we go.
There he is.
80
00:04:05,574 --> 00:04:08,942
There he is.
Right there. Yeah.
81
00:04:08,944 --> 00:04:11,912
15-month-old John
is hardly fazed
82
00:04:11,914 --> 00:04:14,348
by the blue-nosed child
staring back at him
83
00:04:14,350 --> 00:04:16,350
because he's not able
to recognize
84
00:04:16,352 --> 00:04:19,419
that the child in the mirror
is him.
85
00:04:19,421 --> 00:04:20,654
They're interested in the fact
86
00:04:20,656 --> 00:04:22,389
that that baby in the mirror
has a spot,
87
00:04:22,391 --> 00:04:24,958
but they don't seem to
connect that to the fact
88
00:04:24,960 --> 00:04:27,594
that there's actually a spot
on their own noses.
89
00:04:27,596 --> 00:04:30,497
When Alison tries
the same test on Karen,
90
00:04:30,499 --> 00:04:32,933
who is just a few months older
than John,
91
00:04:32,935 --> 00:04:35,169
she does something
quite different.
92
00:04:35,171 --> 00:04:37,004
Look at that!
93
00:04:37,006 --> 00:04:39,606
There we go. Yeah!
94
00:04:39,608 --> 00:04:40,941
What is that?
95
00:04:40,943 --> 00:04:43,043
And now the baby
seemed to realize, "oh, yeah.
96
00:04:43,045 --> 00:04:44,278
"That person in the mirror,
97
00:04:44,280 --> 00:04:45,946
that's the same person
that I am."
98
00:04:45,948 --> 00:04:48,649
It is a big first step
99
00:04:48,651 --> 00:04:51,718
on a long journey
of self-discovery.
100
00:04:51,720 --> 00:04:54,388
But Alison's work has shown
that her subjects have
101
00:04:54,390 --> 00:04:59,059
a lot more philosophical
research ahead of them.
102
00:04:59,061 --> 00:05:02,763
This is 3-year-old Geneva.
103
00:05:02,765 --> 00:05:03,997
Alison tempts her
104
00:05:03,999 --> 00:05:06,900
with what looks like
a box filled with cookies,
105
00:05:06,902 --> 00:05:09,036
but Geneva will soon find out
106
00:05:09,038 --> 00:05:11,972
that she can't judge a box
by its cover.
107
00:05:11,974 --> 00:05:16,743
What do you think
is inside this box?
108
00:05:16,745 --> 00:05:17,945
Hmm.
109
00:05:17,947 --> 00:05:19,079
What do you think
this is?
110
00:05:20,882 --> 00:05:22,382
Good cookies.
111
00:05:22,384 --> 00:05:26,119
Good cookies.
Should we find out?
112
00:05:26,121 --> 00:05:30,424
Let's open the box
and find out what's inside.
113
00:05:31,459 --> 00:05:32,759
Markers.
114
00:05:32,761 --> 00:05:34,895
Markers!
There's markers inside.
115
00:05:34,897 --> 00:05:36,496
Hmm.
116
00:05:37,766 --> 00:05:42,402
When I first showed you this box
all closed up like this,
117
00:05:42,404 --> 00:05:45,372
what did you think
was inside it?
118
00:05:45,374 --> 00:05:46,506
Markers.
119
00:05:46,508 --> 00:05:47,808
Markers!
120
00:05:47,810 --> 00:05:50,143
Geneva cannot reconcile
121
00:05:50,145 --> 00:05:53,480
that she now knows
the box is filled with markers
122
00:05:53,482 --> 00:05:58,318
when she once thought
it was filled with cookies.
123
00:05:58,320 --> 00:06:02,122
The concept that there is a you
who is the same person
124
00:06:02,124 --> 00:06:04,691
even if your thoughts
have changed
125
00:06:04,693 --> 00:06:07,294
is not an understanding
you're born with.
126
00:06:07,296 --> 00:06:10,030
It is something
you come to learn.
127
00:06:10,032 --> 00:06:13,667
Alison tries the cookie box test
with 4-year-old Jim.
128
00:06:13,669 --> 00:06:16,103
I have a question
for you.
129
00:06:16,105 --> 00:06:17,337
What?
130
00:06:17,339 --> 00:06:21,441
When you first saw this box
all closed up like this,
131
00:06:21,443 --> 00:06:24,244
what did you think
was inside it?
132
00:06:24,246 --> 00:06:26,580
Cookies
with chocolate chips.
133
00:06:26,582 --> 00:06:29,716
Uh-huh.
And what's really inside it?
134
00:06:29,718 --> 00:06:31,385
Markers!
135
00:06:31,387 --> 00:06:33,553
Yeah! That's great!
136
00:06:33,555 --> 00:06:35,555
So, a very important part
of my identity
137
00:06:35,557 --> 00:06:36,957
is being able to say,
138
00:06:36,959 --> 00:06:40,527
"Well, when I was 16, I believed
different things than I do now,
139
00:06:40,529 --> 00:06:43,397
but I was me who believed
those different things."
140
00:06:43,399 --> 00:06:44,932
When children realize
141
00:06:44,934 --> 00:06:47,801
their identities can survive
any change in their beliefs,
142
00:06:47,803 --> 00:06:51,338
they stop forgetting the things
they don't believe anymore,
143
00:06:51,340 --> 00:06:52,806
and for the first time,
144
00:06:52,808 --> 00:06:57,511
they unlock the astonishing
power of human memory.
145
00:07:02,250 --> 00:07:05,719
In this very hall,
I used to perform in the choir.
146
00:07:05,721 --> 00:07:07,387
It takes me back.
147
00:07:07,389 --> 00:07:10,557
In this very hall,
I used to perform in the choir.
148
00:07:10,559 --> 00:07:14,861
My own memories
are so richly detailed,
149
00:07:14,863 --> 00:07:17,998
and I spent many hours
in this hall,
150
00:07:18,000 --> 00:07:19,766
rehearsing, practicing.
151
00:07:19,768 --> 00:07:22,235
It just takes me back.
152
00:07:27,142 --> 00:07:30,711
Neuroscientist Donna Addis
153
00:07:30,713 --> 00:07:34,614
is head of the memory lab
at the University of Auckland,
154
00:07:34,616 --> 00:07:37,184
where she investigates
how memories shape us.
155
00:07:37,186 --> 00:07:38,885
She does this by peering in
156
00:07:38,887 --> 00:07:42,055
to a horseshoe-shaped area
in the brain
157
00:07:42,057 --> 00:07:45,025
called the hippocampus.
158
00:07:45,027 --> 00:07:48,495
For the last 50 years,
scientists have known
159
00:07:48,497 --> 00:07:53,066
that the hippocampus is critical
to the storage of memory.
160
00:07:53,068 --> 00:07:57,738
We learned this
thanks to one man,
161
00:07:57,740 --> 00:08:00,240
known as "Patient H.M."
162
00:08:00,242 --> 00:08:04,111
He had severe epilepsy
and, in 1953,
163
00:08:04,113 --> 00:08:07,848
received a radical,
new treatment.
164
00:08:07,850 --> 00:08:11,184
His hippocampus and part
of his inner temporal lobes
165
00:08:11,186 --> 00:08:15,255
were cut out.
166
00:08:15,257 --> 00:08:17,024
After the surgery,
167
00:08:17,026 --> 00:08:21,962
H.M. never suffered
a bout of epilepsy again,
168
00:08:21,964 --> 00:08:25,298
but he had completely lost
169
00:08:25,300 --> 00:08:27,801
the ability to form
new memories.
170
00:08:27,803 --> 00:08:32,839
He said, "Each day is like
waking from a dream."
171
00:08:32,841 --> 00:08:36,910
He had lost his identity.
172
00:08:36,912 --> 00:08:42,682
His sense of who he was
was frozen at age 27.
173
00:08:42,684 --> 00:08:44,584
As days turned into decades,
174
00:08:44,586 --> 00:08:46,453
he could no longer recognize
175
00:08:46,455 --> 00:08:49,856
the man staring back at him
in the mirror.
176
00:08:49,858 --> 00:08:51,892
How much of who we are
177
00:08:51,894 --> 00:08:56,496
is built upon the memories
we make each and every day?
178
00:08:56,498 --> 00:09:00,333
Traditionally, memory research
has really focused on the past,
179
00:09:00,335 --> 00:09:03,637
and in the last few years,
researchers such as myself
180
00:09:03,639 --> 00:09:07,107
have been looking at the ability
to imagine the future
181
00:09:07,109 --> 00:09:10,343
and how memory might actually
play a role in that.
182
00:09:10,345 --> 00:09:13,513
Donna and her team
set up an experiment
183
00:09:13,515 --> 00:09:17,851
to determine just how
the hippocampus and our memories
184
00:09:17,853 --> 00:09:20,687
help us perceive
our future selves.
185
00:09:20,689 --> 00:09:21,922
So, what we do is
186
00:09:21,924 --> 00:09:23,723
we have subjects come
into the lab
187
00:09:23,725 --> 00:09:27,494
and we have them retrieve
around 100 memories.
188
00:09:27,496 --> 00:09:31,598
I remember going to visit
my cousin in Germany.
189
00:09:31,600 --> 00:09:32,899
We walked along the beach,
190
00:09:32,901 --> 00:09:34,668
and at the end,
there was a cave.
191
00:09:34,670 --> 00:09:37,971
And I gave my grandmother
a seashell for Christmas.
192
00:09:37,973 --> 00:09:39,473
And for each memory,
193
00:09:39,475 --> 00:09:42,342
they identify a person,
a place, and an object
194
00:09:42,344 --> 00:09:43,844
that might be important.
195
00:09:43,846 --> 00:09:45,345
-The Autobahn.
-My mother.
196
00:09:45,347 --> 00:09:46,546
-Two penguins.
-Boston.
197
00:09:46,548 --> 00:09:48,148
-My book bag.
-Auckland waterfront.
198
00:09:48,150 --> 00:09:49,649
-My girlfriend, Sarah.
-The hospital.
199
00:09:49,651 --> 00:09:51,351
-My partner, Wayne.
-Eastern Beach.
200
00:09:51,353 --> 00:09:52,986
-Amsterdam airport.
-A cave.
201
00:09:52,988 --> 00:09:57,023
A week later,
Donna brings her subjects back,
202
00:09:57,025 --> 00:09:59,493
places them in an fMRI machine,
203
00:09:59,495 --> 00:10:03,530
and shows them details
from their recalled memories.
204
00:10:03,532 --> 00:10:06,366
But she deliberately jumbles
the details.
205
00:10:06,368 --> 00:10:08,201
An object from one memory
206
00:10:08,203 --> 00:10:11,738
has been grouped with a place
from another memory
207
00:10:11,740 --> 00:10:14,174
and a person from yet another.
208
00:10:14,176 --> 00:10:17,277
Donna then asks them
to make a story
209
00:10:17,279 --> 00:10:19,779
out of these mixed up memories,
210
00:10:19,781 --> 00:10:22,983
to imagine something
that has not happened yet
211
00:10:22,985 --> 00:10:26,119
but potentially could.
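A minimal sketch of the recombination step may help make it concrete. The example details are drawn from the cues listed above; the rotation scheme and the code are illustrative assumptions, not the lab's actual stimulus software.

```python
# Minimal sketch: build novel cues by recombining the tagged details
# so each cue mixes a person, a place, and an object taken from
# three different memories (illustrative only).
memories = [
    {"person": "my mother", "place": "Boston", "object": "my book bag"},
    {"person": "my girlfriend, Sarah", "place": "Auckland waterfront", "object": "two penguins"},
    {"person": "my partner, Wayne", "place": "Eastern Beach", "object": "a seashell"},
]

def recombine(memories):
    """Rotate the lists so no cue keeps its own memory's place or object."""
    n = len(memories)
    return [
        {
            "person": memories[i]["person"],
            "place": memories[(i + 1) % n]["place"],
            "object": memories[(i + 2) % n]["object"],
        }
        for i in range(n)
    ]

for cue in recombine(memories):
    print(f"Imagine a future event with {cue['person']}, "
          f"at {cue['place']}, involving {cue['object']}.")
```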
212
00:10:26,121 --> 00:10:27,420
We ask them,
213
00:10:27,422 --> 00:10:30,157
for each person, place, object
that they're seeing now,
214
00:10:30,159 --> 00:10:33,160
to imagine a future event
that might happen to them
215
00:10:33,162 --> 00:10:35,262
within the next five years
or so.
216
00:10:35,264 --> 00:10:39,766
While the participants
are imagining their futures,
217
00:10:39,768 --> 00:10:42,302
Donna measures
their brain responses.
218
00:10:42,304 --> 00:10:43,436
To her surprise,
219
00:10:43,438 --> 00:10:45,872
the part of the brain
that is vital
220
00:10:45,874 --> 00:10:49,242
to storing memories of the past,
the hippocampus,
221
00:10:49,244 --> 00:10:52,112
is blazing with activity.
222
00:10:52,114 --> 00:10:55,348
The hippocampus is playing
a really important role,
223
00:10:55,350 --> 00:10:56,917
not only in remembering,
224
00:10:56,919 --> 00:11:00,387
but also allowing us to build
these future simulations.
225
00:11:00,389 --> 00:11:03,790
Memory is important
not only for the past,
226
00:11:03,792 --> 00:11:05,725
but also for the future,
227
00:11:05,727 --> 00:11:08,295
for building up that sense
of who we are.
228
00:11:10,431 --> 00:11:15,268
Our memories are crucial
to forming our identities,
229
00:11:15,270 --> 00:11:17,137
but one group of scientists
have discovered
230
00:11:17,139 --> 00:11:19,439
that our memories
can be manipulated
231
00:11:19,441 --> 00:11:21,808
without our even knowing it.
232
00:11:21,810 --> 00:11:25,045
Are we really
who we think we are?
233
00:11:27,170 --> 00:11:31,240
Our ability to remember
is truly remarkable.
234
00:11:31,242 --> 00:11:33,542
In the course of our lives,
235
00:11:33,544 --> 00:11:38,414
the average person will grasp
the meaning of 100,000 words,
236
00:11:38,416 --> 00:11:41,150
get to know around 1,700 people,
237
00:11:41,152 --> 00:11:44,653
and read over 1,000 books.
238
00:11:44,655 --> 00:11:48,490
From these vast mental stores
of experience
239
00:11:48,492 --> 00:11:51,193
we each build our own identity,
240
00:11:51,195 --> 00:11:55,064
a pattern of memories
that is uniquely ours.
241
00:11:55,066 --> 00:12:00,069
But what if those memories
could be rewritten?
242
00:12:00,071 --> 00:12:04,240
Could we change who we are?
243
00:12:08,678 --> 00:12:10,479
Neuroscientist Tali Sharot
244
00:12:10,481 --> 00:12:12,948
from University
College London
245
00:12:12,950 --> 00:12:14,283
and Micah Edelson
246
00:12:14,285 --> 00:12:17,686
from the Weizmann Institute
of Science in Israel
247
00:12:17,688 --> 00:12:20,389
love going to dinner parties.
248
00:12:20,391 --> 00:12:22,191
But they're not just having fun.
249
00:12:22,193 --> 00:12:24,059
They're doing it
for the sake of science.
250
00:12:24,061 --> 00:12:30,199
They study how social pressures
alter who we are.
251
00:12:30,201 --> 00:12:31,400
Well, imagine a situation
252
00:12:31,402 --> 00:12:33,302
that you're sitting
in a dinner party.
253
00:12:33,304 --> 00:12:36,705
You have a pretty good memory
of some situation that happened,
254
00:12:36,707 --> 00:12:40,609
and you actually remember it
happening in a certain way,
255
00:12:40,611 --> 00:12:42,878
but all of your friends
are telling you
256
00:12:42,880 --> 00:12:44,413
that you're actually wrong.
257
00:12:44,415 --> 00:12:46,081
They're saying
that something else happened.
258
00:12:46,083 --> 00:12:48,083
And Tali got the umbrella,
and she slid the umbrella
259
00:12:48,085 --> 00:12:49,885
- through the letterbox.
- Are you sure it was Tali?
260
00:12:49,887 --> 00:12:52,021
No, no, no.
I'm quite sure it wasn't me.
261
00:12:52,023 --> 00:12:54,089
No, it was definitely you.
You were definitely holding the umbrella.
262
00:12:54,091 --> 00:12:55,691
It was definitely Steve.
263
00:12:55,693 --> 00:12:57,393
I really thought it was Tali,
though.
264
00:12:57,395 --> 00:12:59,061
No, because
we were really impressed
265
00:12:59,063 --> 00:13:00,729
that Steve managed to
hook it around.
266
00:13:00,731 --> 00:13:02,298
Remember?
We had that big fanfare
267
00:13:02,300 --> 00:13:03,799
because we were
out in the rain.
268
00:13:03,801 --> 00:13:05,834
When someone changes
their memory
269
00:13:05,836 --> 00:13:08,938
so that it fits
other people's opinion,
270
00:13:08,940 --> 00:13:11,807
do we actually change
a signal in the brain
271
00:13:11,809 --> 00:13:14,677
that is representative
of the memory,
272
00:13:14,679 --> 00:13:17,479
or is it just that
we try to please other people?
273
00:13:17,481 --> 00:13:19,949
Social pressures
274
00:13:19,951 --> 00:13:21,917
can make us change the way
we tell stories.
275
00:13:21,919 --> 00:13:26,822
But can they also make us change
the stories we tell ourselves?
276
00:13:26,824 --> 00:13:29,291
Our actual memories?
277
00:13:29,293 --> 00:13:33,128
Tali and Micah set up
an experiment to find out,
278
00:13:33,130 --> 00:13:36,298
not in a restaurant,
but in a lab.
279
00:13:36,300 --> 00:13:38,834
They bring in
a group of volunteers
280
00:13:38,836 --> 00:13:40,569
to watch a film together.
281
00:13:40,571 --> 00:13:42,137
Afterwards,
282
00:13:42,139 --> 00:13:45,074
the volunteers answer
basic questions about the film
283
00:13:45,076 --> 00:13:47,242
to test what they remember.
284
00:13:49,546 --> 00:13:50,913
A few days later,
285
00:13:50,915 --> 00:13:53,716
the participants
take the same questionnaire
286
00:13:53,718 --> 00:13:55,217
inside a brain scanner,
287
00:13:55,219 --> 00:14:00,856
only this time, Micah and Tali
apply a social pressure.
288
00:14:00,858 --> 00:14:03,859
This time, they were exposed
to fake answers
289
00:14:03,861 --> 00:14:07,629
that were supposedly given
by their fellow group members.
290
00:14:07,631 --> 00:14:09,999
The group is led to believe
291
00:14:10,001 --> 00:14:12,234
that the others
who took the test
292
00:14:12,236 --> 00:14:16,372
remembered the character in
the film was not wearing a hat.
293
00:14:16,374 --> 00:14:20,175
And most people
changed their answers
294
00:14:20,177 --> 00:14:22,144
to go along with the crowd.
295
00:14:22,146 --> 00:14:23,679
In almost 70% of the cases,
296
00:14:23,681 --> 00:14:27,082
the participants conformed
and they gave a wrong answer,
297
00:14:27,084 --> 00:14:28,684
even though they were
initially pretty confident
298
00:14:28,686 --> 00:14:30,119
about the correct answer.
299
00:14:30,121 --> 00:14:35,057
But were they just outwardly
complying with a social norm,
300
00:14:35,059 --> 00:14:38,560
or did their memories
actually change?
301
00:14:40,463 --> 00:14:42,798
So, we test them again
a week after,
302
00:14:42,800 --> 00:14:46,468
once we've removed
the social pressure.
303
00:14:46,470 --> 00:14:48,003
And we assume that if they're
still making an error,
304
00:14:48,005 --> 00:14:51,573
that means that their memory
was actually changed.
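Micah's criterion can be written as a simple decision rule. Below is a minimal sketch with hypothetical trial data; it illustrates the logic only, not the study's real analysis code.

```python
# Sketch of the analysis logic described above (hypothetical data):
# conform under pressure + still wrong a week later -> persistent memory error
# conform under pressure + correct a week later     -> public compliance only
def classify(initial, under_pressure, after_week, correct):
    if initial == correct and under_pressure != correct:
        return ("persistent memory error"
                if after_week != correct else "public compliance")
    return "did not conform"

trials = [
    ("hat", "no hat", "no hat", "hat"),  # stayed wrong without pressure
    ("hat", "no hat", "hat",    "hat"),  # reverted once pressure was gone
    ("hat", "hat",    "hat",    "hat"),  # never conformed
]
for t in trials:
    print(classify(*t))
```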
305
00:14:51,575 --> 00:14:53,175
Tali and Micah found
306
00:14:53,177 --> 00:14:55,978
that most test subjects
stuck with the wrong answer
307
00:14:55,980 --> 00:14:57,913
even without peer pressure.
308
00:14:57,915 --> 00:15:02,518
The falsehood has taken root
in their brains.
309
00:15:02,520 --> 00:15:07,056
It actually causes
a long-lasting memory error,
310
00:15:07,058 --> 00:15:09,058
and using our brain data,
311
00:15:09,060 --> 00:15:10,826
we are able
to actually identify
312
00:15:10,828 --> 00:15:15,397
when such a long-lasting
memory error will occur.
313
00:15:15,399 --> 00:15:18,767
The brains of those people
who changed their memories
314
00:15:18,769 --> 00:15:21,703
showed high activity
not just in the hippocampus,
315
00:15:21,705 --> 00:15:23,272
where memories are stored,
316
00:15:23,274 --> 00:15:25,340
but also in a part of the brain
317
00:15:25,342 --> 00:15:28,944
that is connected with emotional
and social responses,
318
00:15:28,946 --> 00:15:30,312
the amygdala.
319
00:15:34,951 --> 00:15:36,485
A lot of what we know
320
00:15:36,487 --> 00:15:38,220
about the amygdala
and its function
321
00:15:38,222 --> 00:15:40,122
comes from
the animal world.
322
00:15:40,124 --> 00:15:43,292
So our reaction to anything
that's emotional --
323
00:15:43,294 --> 00:15:46,495
if we suddenly hear a noise
which is frightening
324
00:15:46,497 --> 00:15:49,731
or are processing emotional
expressions on one's face --
325
00:15:49,733 --> 00:15:52,468
the amygdala is crucial
for all of these functions.
326
00:15:52,470 --> 00:15:56,605
And it probably
helps us increase memory
327
00:15:56,607 --> 00:15:59,608
because, in a fearful
or emotional situation,
328
00:15:59,610 --> 00:16:02,144
the amygdala activation
is heightened
329
00:16:02,146 --> 00:16:03,779
and it also
increases activation
330
00:16:03,781 --> 00:16:05,881
in related structures
like the hippocampus.
331
00:16:07,351 --> 00:16:10,552
The amygdala is like
a bouncer at a nightclub.
332
00:16:10,554 --> 00:16:12,888
It decides which memories
333
00:16:12,890 --> 00:16:15,224
get to play a part
in shaping our life histories.
334
00:16:15,226 --> 00:16:18,994
The ones that carry emotional
weight are allowed in.
335
00:16:18,996 --> 00:16:23,565
The ones that don't are not.
336
00:16:23,567 --> 00:16:26,635
The participants
who changed their memories
337
00:16:26,637 --> 00:16:30,272
were emotionally affected
by the pressure to conform,
338
00:16:30,274 --> 00:16:32,975
setting their amygdalas ablaze,
339
00:16:32,977 --> 00:16:36,145
and the false memory snuck in.
340
00:16:36,147 --> 00:16:38,280
So, by looking
at amygdala activation,
341
00:16:38,282 --> 00:16:39,681
we can actually predict
342
00:16:39,683 --> 00:16:42,551
which memories are gonna
be changed for a very long time
343
00:16:42,553 --> 00:16:43,919
and which are not.
344
00:16:43,921 --> 00:16:45,787
You have to realize
345
00:16:45,789 --> 00:16:48,590
that memories are not
like a videotape.
346
00:16:48,592 --> 00:16:50,726
'Cause people
can be extremely confident
347
00:16:50,728 --> 00:16:53,061
that things happened
like they think they happened
348
00:16:53,063 --> 00:16:54,263
when they didn't.
349
00:16:54,265 --> 00:16:57,132
Our memories
are not just a record
350
00:16:57,134 --> 00:17:00,002
of the events that took place
in our lives.
351
00:17:00,004 --> 00:17:03,772
They are malleable and fallible.
352
00:17:03,774 --> 00:17:08,911
Our identities are created with
constant input from our society.
353
00:17:08,913 --> 00:17:11,547
No man is an island.
354
00:17:13,183 --> 00:17:15,984
But could we go a step further
355
00:17:15,986 --> 00:17:20,756
and deliberately re-engineer
someone's identity?
356
00:17:20,758 --> 00:17:22,191
To do that,
357
00:17:22,193 --> 00:17:25,260
you have to be able to peer into
a person's innermost thoughts,
358
00:17:25,262 --> 00:17:27,796
and believe it or not,
359
00:17:27,798 --> 00:17:31,733
that technology is already here.
360
00:17:36,846 --> 00:17:40,716
We are all actors,
to some extent.
361
00:17:41,827 --> 00:17:44,361
Who we appear to be can change
362
00:17:44,363 --> 00:17:48,298
depending on our mood
or the company we keep.
363
00:17:48,300 --> 00:17:50,267
But there is one time
364
00:17:50,269 --> 00:17:54,237
when who we really are
comes to the fore --
365
00:17:54,239 --> 00:17:56,740
when we dream.
366
00:17:56,742 --> 00:18:01,411
What if we could see our dreams
and study them?
367
00:18:01,413 --> 00:18:03,079
Could we know each other
368
00:18:03,081 --> 00:18:07,350
in a more profound way
than ever before?
369
00:18:08,352 --> 00:18:11,021
During an average life span,
370
00:18:11,023 --> 00:18:14,190
a human being spends
about six years dreaming.
371
00:18:14,192 --> 00:18:17,627
That's more than 52,000 hours
of imagery
372
00:18:17,629 --> 00:18:20,764
buzzing through
our unconscious brains.
373
00:18:25,704 --> 00:18:28,772
Computational neuroscientist
Yuki Kamitani
374
00:18:28,774 --> 00:18:31,474
believes one day
it will be possible
375
00:18:31,476 --> 00:18:34,911
to watch and record
what people are dreaming.
376
00:18:36,647 --> 00:18:38,081
When that happens,
377
00:18:38,083 --> 00:18:40,850
we will all
get to know ourselves
378
00:18:40,852 --> 00:18:42,252
on a much deeper level.
379
00:18:42,254 --> 00:18:45,689
I believe
that if we can reconstruct
380
00:18:45,691 --> 00:18:48,191
or decode the contents
of a dream,
381
00:18:48,193 --> 00:18:50,860
the identity is revealed.
382
00:18:50,862 --> 00:18:54,497
If we remember our dreams,
383
00:18:54,499 --> 00:18:58,301
it is often as a series
of emotionally charged images.
384
00:18:59,971 --> 00:19:02,672
In fact, scientists have found
385
00:19:02,674 --> 00:19:05,875
that the visual cortex
of a dreaming brain
386
00:19:05,877 --> 00:19:07,243
is highly active.
387
00:19:07,245 --> 00:19:10,647
Patterns of electrical activity
wash over it,
388
00:19:10,649 --> 00:19:12,482
which makes Yuki wonder,
389
00:19:12,484 --> 00:19:15,051
can we learn to read
those patterns
390
00:19:15,053 --> 00:19:18,121
and convert them
into images on a computer?
391
00:19:18,123 --> 00:19:23,660
Brain activity can be seen as
a code or an encrypted message
392
00:19:23,662 --> 00:19:27,197
about what's going on
in the visual world.
393
00:19:27,199 --> 00:19:31,234
The patterns of images
we make in our brains
394
00:19:31,236 --> 00:19:32,602
are highly distorted,
395
00:19:32,604 --> 00:19:35,338
in the same way
a pair of shattered glasses
396
00:19:35,340 --> 00:19:37,507
distorts our view of the world.
397
00:19:39,044 --> 00:19:40,510
But if we collected data
398
00:19:40,512 --> 00:19:43,880
on hundreds of images seen
through those shattered lenses,
399
00:19:43,882 --> 00:19:45,582
we could find a correspondence
400
00:19:45,584 --> 00:19:48,685
between the distorted images
and the real ones.
401
00:19:48,687 --> 00:19:50,653
It might take a while,
402
00:19:50,655 --> 00:19:54,591
but if we gave that job
to a powerful computer,
403
00:19:54,593 --> 00:19:58,928
it could decode the scrambled
images into recognizable ones.
404
00:19:58,930 --> 00:20:01,765
And this is how Yuki
tries to crack the code
405
00:20:01,767 --> 00:20:02,966
that turns images
406
00:20:02,968 --> 00:20:07,037
into patterns of activity
in the visual cortex.
407
00:20:07,039 --> 00:20:10,440
We measure the brain activity
of human subjects,
408
00:20:10,442 --> 00:20:13,510
and we let the subject
go into the scanner
409
00:20:13,512 --> 00:20:15,545
and scan their brain.
410
00:20:15,547 --> 00:20:17,013
And during that,
411
00:20:17,015 --> 00:20:19,616
we present some images
to the subject.
412
00:20:19,618 --> 00:20:22,819
Typically several hundred
or thousands of images
413
00:20:22,821 --> 00:20:24,254
in a single experiment.
414
00:20:24,256 --> 00:20:26,589
The images Yuki shows people
415
00:20:26,591 --> 00:20:28,358
are simple
black-and-white shapes --
416
00:20:28,360 --> 00:20:32,028
a square, a cross, a line.
417
00:20:32,030 --> 00:20:34,464
Using a powerful computer array,
418
00:20:34,466 --> 00:20:37,967
he records the precise pattern
of activity
419
00:20:37,969 --> 00:20:39,769
in the visual cortex.
420
00:20:39,771 --> 00:20:42,539
After multiple trials
with the same person,
421
00:20:42,541 --> 00:20:44,574
the computer learns
to distinguish
422
00:20:44,576 --> 00:20:46,743
the patterns triggered
by each image.
423
00:20:46,745 --> 00:20:48,078
In other words,
424
00:20:48,080 --> 00:20:51,481
the computer can judge
purely from the brain activity
425
00:20:51,483 --> 00:20:54,751
which of the shapes
the subject is looking at.
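The training step described here amounts to fitting a pattern classifier to cortical activity. Here is a minimal sketch of that idea, using synthetic "voxel" data and a scikit-learn model as stand-ins; the numbers and the model choice are assumptions, not Kamitani's actual pipeline.

```python
# Minimal sketch of the decoding idea: learn which shape a subject is
# viewing from the pattern of visual-cortex activity. Synthetic "voxel"
# data stands in for real fMRI recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
shapes = ["square", "cross", "line"]
n_trials, n_voxels = 300, 50

# Pretend each shape evokes its own characteristic (noisy) voxel pattern.
prototypes = {s: rng.normal(size=n_voxels) for s in shapes}
labels = rng.choice(shapes, size=n_trials)
activity = np.stack([prototypes[s] + 0.5 * rng.normal(size=n_voxels)
                     for s in labels])

# Train on most trials, then decode held-out trials from brain data alone.
decoder = LogisticRegression(max_iter=1000).fit(activity[:250], labels[:250])
print("decoded:", list(decoder.predict(activity[250:255])))
print("shown:  ", list(labels[250:255]))
```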
426
00:20:54,753 --> 00:20:57,420
And then Yuki does
something remarkable.
427
00:20:57,422 --> 00:21:01,458
He shows the subjects
brand-new images,
428
00:21:01,460 --> 00:21:04,127
images the computer
has never seen,
429
00:21:04,129 --> 00:21:07,263
and lets the computer
try to draw a picture
430
00:21:07,265 --> 00:21:10,133
of what the subjects are seeing.
431
00:21:10,135 --> 00:21:12,068
These are the images
432
00:21:12,070 --> 00:21:14,838
the computer reads
inside people's brains.
433
00:21:14,840 --> 00:21:20,110
And these are the images
they are actually looking at.
434
00:21:20,112 --> 00:21:21,678
This is the first time
435
00:21:21,680 --> 00:21:25,181
anyone has been able to know
what people are seeing
436
00:21:25,183 --> 00:21:27,550
purely by looking at
their brains.
437
00:21:27,552 --> 00:21:30,854
Looking at the visual cortex,
438
00:21:30,856 --> 00:21:35,291
we have just succeeded
in reconstructing seen images.
439
00:21:35,293 --> 00:21:39,095
We are now trying to reconstruct
imagined images
440
00:21:39,097 --> 00:21:42,866
or images in your dreams.
441
00:21:44,668 --> 00:21:46,836
Yuki's method, as yet,
442
00:21:46,838 --> 00:21:49,839
only works on pixelated
black-and-white images,
443
00:21:49,841 --> 00:21:52,509
but with a few more years
of refinement,
444
00:21:52,511 --> 00:21:55,778
Yuki believes we will be able
to record our dreams
445
00:21:55,780 --> 00:21:59,149
as full-color,
high-definition movies.
446
00:21:59,151 --> 00:22:03,153
And that would truly be
a window into our souls.
447
00:22:03,155 --> 00:22:06,389
Those, you know,
unconscious aspects of our mind
448
00:22:06,391 --> 00:22:10,160
define what we are
and what the identity is.
449
00:22:10,162 --> 00:22:15,031
So, I think if we can reveal
some dream contents
450
00:22:15,033 --> 00:22:18,034
which someone is not aware of,
451
00:22:18,036 --> 00:22:24,941
then that might reveal some
deep property of that person.
452
00:22:26,710 --> 00:22:28,578
Our true identities
453
00:22:28,580 --> 00:22:32,282
could soon be laid bare
for all to see,
454
00:22:32,284 --> 00:22:35,285
including the parts
we don't want seen,
455
00:22:35,287 --> 00:22:39,756
like our deepest-held secrets
and fantasies.
456
00:22:39,758 --> 00:22:43,059
But you may not have to worry.
457
00:22:43,061 --> 00:22:47,130
Because the power to edit
the contents of our minds
458
00:22:47,132 --> 00:22:49,532
is close at hand.
459
00:22:52,183 --> 00:22:56,319
Our brains
are filled with memories.
460
00:22:56,321 --> 00:22:58,655
Some of them bring us joy.
461
00:22:58,657 --> 00:23:02,058
Others make us wish
we could forget.
462
00:23:02,060 --> 00:23:04,327
Whether we like it or not,
463
00:23:04,329 --> 00:23:08,465
our memories shape
how we think and how we act.
464
00:23:08,467 --> 00:23:14,204
But now one group of researchers
thinks it has found a way
465
00:23:14,206 --> 00:23:20,510
to change memory
and perhaps change who we are.
466
00:23:23,915 --> 00:23:27,551
Can we deliberately change
our sense of identity?
467
00:23:27,553 --> 00:23:32,889
Neuroscientist André Fenton from
the State University of New York
468
00:23:32,891 --> 00:23:34,791
doesn't see why not.
469
00:23:34,793 --> 00:23:36,593
To him, the brain
470
00:23:36,595 --> 00:23:39,563
and its pathways of connections
between neurons
471
00:23:39,565 --> 00:23:42,766
are like the labyrinth
of streets in New York City --
472
00:23:42,768 --> 00:23:46,603
a maze he navigates
on his daily runs.
473
00:23:46,605 --> 00:23:48,839
And just like Manhattan traffic,
474
00:23:48,841 --> 00:23:51,408
conditions for the flow
of electricity around the brain
475
00:23:51,410 --> 00:23:55,212
are not the same on every route.
476
00:23:55,214 --> 00:23:57,113
If you experience something,
477
00:23:57,115 --> 00:23:59,683
there's been
an electrical activation
478
00:23:59,685 --> 00:24:01,218
somewhere in the brain
479
00:24:01,220 --> 00:24:03,119
that spreads through the brain,
480
00:24:03,121 --> 00:24:04,921
and that is your experience.
481
00:24:04,923 --> 00:24:06,356
As in a city,
482
00:24:06,358 --> 00:24:09,493
there are roads that connect one
district to another district,
483
00:24:09,495 --> 00:24:12,362
and those roads
can be very big boulevards
484
00:24:12,364 --> 00:24:14,164
that send a lot of traffic,
485
00:24:14,166 --> 00:24:18,435
or they can be small alleys that
send very specific information,
486
00:24:18,437 --> 00:24:21,838
but nonetheless,
not very rapidly or very easily.
487
00:24:21,840 --> 00:24:23,740
For years,
488
00:24:23,742 --> 00:24:26,243
scientists thought
the pathways in our brains
489
00:24:26,245 --> 00:24:29,346
were set in stone after we
matured from babies to adults.
490
00:24:29,348 --> 00:24:31,982
Alleys could not become wider.
491
00:24:31,984 --> 00:24:34,718
Highways
could not become narrower.
492
00:24:34,720 --> 00:24:36,653
But now it has become clear
493
00:24:36,655 --> 00:24:39,222
that the roads
in our adult brains
494
00:24:39,224 --> 00:24:41,391
are under constant construction.
495
00:24:41,393 --> 00:24:44,861
Every time we store
a new memory,
496
00:24:44,863 --> 00:24:48,498
electrical activity propagates
through millions of neurons.
497
00:24:48,500 --> 00:24:52,168
Just as André
is forced to find a new route
498
00:24:52,170 --> 00:24:54,204
if his pathway is blocked,
499
00:24:54,206 --> 00:24:57,207
our neural pathways
adjust themselves
500
00:24:57,209 --> 00:25:00,243
to process and record
new experiences.
501
00:25:00,245 --> 00:25:03,113
And so, what
neuroscientists understand is
502
00:25:03,115 --> 00:25:07,150
that there's a sufficient amount
of this plasticity
503
00:25:07,152 --> 00:25:08,552
throughout life,
504
00:25:08,554 --> 00:25:10,453
and that it is affected
505
00:25:10,455 --> 00:25:13,957
and modulated and controlled
by experience.
506
00:25:13,959 --> 00:25:17,127
Recently,
scientists have identified
507
00:25:17,129 --> 00:25:18,795
a molecule in the brain
508
00:25:18,797 --> 00:25:23,199
that jumps into action when
we are forming new memories.
509
00:25:23,201 --> 00:25:27,737
It is called PKMzeta.
510
00:25:27,739 --> 00:25:31,007
PKMzeta stands for
"protein kinase mzeta."
511
00:25:31,009 --> 00:25:32,475
It's my favorite molecule.
512
00:25:34,579 --> 00:25:40,216
When PKMzeta gets told
to deploy in a neuron,
513
00:25:40,218 --> 00:25:42,419
it gets told to do that
514
00:25:42,421 --> 00:25:45,555
on the basis
of a recent experience,
515
00:25:45,557 --> 00:25:49,059
and what it does is
it mediates efficient
516
00:25:49,061 --> 00:25:53,496
or increased efficiency
of neural transmission.
517
00:25:53,498 --> 00:25:56,566
When a memory
needs to navigate its way
518
00:25:56,568 --> 00:25:58,902
through the traffic
of our brains,
519
00:25:58,904 --> 00:26:01,438
PKMzeta clears the way,
520
00:26:01,440 --> 00:26:05,575
making sure the memory safely
reaches long-term storage.
521
00:26:05,577 --> 00:26:07,811
Those long-term memories,
522
00:26:07,813 --> 00:26:11,114
the ones that you form now
and you will keep forever,
523
00:26:11,116 --> 00:26:17,020
that kind of information storage
seems to be mediated by PKMzeta.
524
00:26:17,022 --> 00:26:19,689
But André knew of a chemical
525
00:26:19,691 --> 00:26:22,225
that could neutralize PKMzeta,
526
00:26:22,227 --> 00:26:26,229
called zeta inhibitory peptide,
or ZIP,
527
00:26:26,231 --> 00:26:31,134
and he wondered if he injected
it into a living brain,
528
00:26:31,136 --> 00:26:34,437
could he prevent it from forming
long-term memories?
529
00:26:34,439 --> 00:26:37,907
So, the logic
of the experiment we did
530
00:26:37,909 --> 00:26:39,209
is very straightforward.
531
00:26:39,211 --> 00:26:42,879
What you want to do is
produce a memory.
532
00:26:42,881 --> 00:26:45,915
A rat is in a rotating carousel,
533
00:26:45,917 --> 00:26:47,617
and the key here is that
534
00:26:47,619 --> 00:26:50,587
whenever it enters
that part of the floor,
535
00:26:50,589 --> 00:26:53,023
it becomes electrified.
536
00:26:53,025 --> 00:26:56,059
And so, they very quickly
and rapidly learn
537
00:26:56,061 --> 00:26:58,895
to stay away
from that part of the room.
538
00:26:58,897 --> 00:27:03,800
In this computer-generated
read-out of André's experiment,
539
00:27:03,802 --> 00:27:06,202
the rat runs around the carousel
540
00:27:06,204 --> 00:27:10,473
but consistently avoids the
triangular-shaped shock zone.
541
00:27:10,475 --> 00:27:15,278
30 days later, André puts
the rat back in the chamber
542
00:27:15,280 --> 00:27:18,248
and observes that it still
remembers to stay away.
543
00:27:18,250 --> 00:27:23,987
It has stored a new
long-term memory in its brain.
544
00:27:23,989 --> 00:27:27,891
But when André injects
the rat's hippocampus with ZIP,
545
00:27:27,893 --> 00:27:30,193
he sees something extraordinary.
546
00:27:30,195 --> 00:27:33,730
When the rat is put back
in the carousel one more time,
547
00:27:33,732 --> 00:27:36,032
it runs right over
the shock zone
548
00:27:36,034 --> 00:27:39,436
as though it had never
been shocked before.
549
00:27:39,438 --> 00:27:41,371
You could see
that the animal behaved
550
00:27:41,373 --> 00:27:43,073
more or less
like a naive animal,
551
00:27:43,075 --> 00:27:44,374
so it was very exciting.
552
00:27:44,376 --> 00:27:49,079
André has erased
a piece of the rat's memory.
553
00:27:49,081 --> 00:27:51,614
The ability to forget
people we have met,
554
00:27:51,616 --> 00:27:55,652
places we have been,
things we have done
555
00:27:55,654 --> 00:27:58,655
is now a pharmaceutical
possibility.
556
00:27:58,657 --> 00:28:02,492
But André can't see
inside his rat's brain,
557
00:28:02,494 --> 00:28:04,561
and so he cannot be sure
558
00:28:04,563 --> 00:28:08,164
how many memories
the ZIP molecule erased.
559
00:28:08,166 --> 00:28:10,300
As we begin to work out
560
00:28:10,302 --> 00:28:12,836
the synaptic organization
of memories,
561
00:28:12,838 --> 00:28:16,539
we'll then be in a position
to understand
562
00:28:16,541 --> 00:28:18,475
whether it's possible
563
00:28:18,477 --> 00:28:21,377
to actually make
selective manipulations
564
00:28:21,379 --> 00:28:23,012
of particular memories.
565
00:28:23,014 --> 00:28:25,982
We are always
going to be confronted
566
00:28:25,984 --> 00:28:29,552
with the possibility
of erasing all memories,
567
00:28:29,554 --> 00:28:34,057
which could never be
a good idea.
568
00:28:34,059 --> 00:28:38,528
Using ZIP
to erase a specific memory
569
00:28:38,530 --> 00:28:40,930
is still a ways off.
570
00:28:40,932 --> 00:28:45,401
But in Montreal, one doctor
has found another way.
571
00:28:45,403 --> 00:28:47,504
He is washing away
painful memories
572
00:28:47,506 --> 00:28:52,742
that make his patients prisoners
inside their own identities.
573
00:28:55,210 --> 00:28:58,712
Who we are depends on
where we have been,
574
00:28:58,714 --> 00:29:02,483
who we have loved,
who we have lost.
575
00:29:02,485 --> 00:29:06,653
For some of us, painful memories
can linger like an open wound.
576
00:29:06,655 --> 00:29:11,525
They can hold us back from
becoming who we want to become.
577
00:29:11,527 --> 00:29:16,463
Doctor Alain Brunet
is a psychologist
578
00:29:16,465 --> 00:29:19,666
at McGill University
in Montreal.
579
00:29:19,668 --> 00:29:21,935
He specializes
in treating people
580
00:29:21,937 --> 00:29:24,338
with post-traumatic stress
disorder.
581
00:29:24,340 --> 00:29:28,041
Alain himself has a deep
understanding for the condition.
582
00:29:29,512 --> 00:29:31,979
In 1989,
at the University of Montreal,
583
00:29:31,981 --> 00:29:35,849
a deranged man carried out
the worst mass shooting
584
00:29:35,851 --> 00:29:37,651
in Canadian history.
585
00:29:37,653 --> 00:29:39,153
Alain was on campus,
586
00:29:39,155 --> 00:29:42,523
studying for his master's degree
in psychology.
587
00:29:42,525 --> 00:29:44,725
He shot --
he went through the corridors.
588
00:29:44,727 --> 00:29:46,660
He shot 12 women,
589
00:29:46,662 --> 00:29:48,862
and eventually
there were 13 deaths.
590
00:29:50,900 --> 00:29:52,666
The crisis intervention
591
00:29:52,668 --> 00:29:56,170
that had been conducted after
this event was very poorly done,
592
00:29:56,172 --> 00:30:00,641
and many of us were left
with a bad taste in our mouth,
593
00:30:00,643 --> 00:30:03,577
and so, it did have
a profound effect on me
594
00:30:03,579 --> 00:30:07,014
and on what I decided to study.
595
00:30:09,450 --> 00:30:13,387
This horrific event
started Alain on a path
596
00:30:13,389 --> 00:30:16,089
that he is
still following today.
597
00:30:16,091 --> 00:30:19,293
He helps people
who suffer from PTSD
598
00:30:19,295 --> 00:30:25,165
get back a part of themselves
that seems to be lost.
599
00:30:25,167 --> 00:30:31,371
PTSD can be conceived
as a disorder of memory.
600
00:30:31,373 --> 00:30:32,706
Because in a sense,
601
00:30:32,708 --> 00:30:36,109
it's really about things
that you wish you'd forget.
602
00:30:36,111 --> 00:30:39,880
That memory has been burned
into your brain
603
00:30:39,882 --> 00:30:41,982
and is way too powerful,
604
00:30:41,984 --> 00:30:43,450
and it's making you fearful
605
00:30:43,452 --> 00:30:45,385
in situations
where you shouldn't.
606
00:30:47,956 --> 00:30:52,059
Memory is a little bit
like writing with ink.
607
00:30:52,061 --> 00:30:56,296
So, you can see
that the ink is still wet.
608
00:30:56,298 --> 00:31:00,534
If I use my fingers
and go over my writing,
609
00:31:00,536 --> 00:31:03,971
it will smear what I just wrote.
610
00:31:03,973 --> 00:31:06,573
And this is exactly like
the workings of memory.
611
00:31:06,575 --> 00:31:11,044
But when a memory
is emotionally powerful,
612
00:31:11,046 --> 00:31:12,846
proteins in the brain
613
00:31:12,848 --> 00:31:15,015
build connections
between neurons
614
00:31:15,017 --> 00:31:16,750
and the memory is transferred
615
00:31:16,752 --> 00:31:19,820
to a separate
long-term storage area.
616
00:31:19,822 --> 00:31:23,557
There, it leaves
a lasting impression.
617
00:31:23,559 --> 00:31:28,528
Once the ink is dry,
the memory is there for good.
618
00:31:28,530 --> 00:31:31,565
Of course,
it might fade with time,
619
00:31:31,567 --> 00:31:34,368
but that memory
will still be accessible.
620
00:31:34,370 --> 00:31:36,336
Many scientists believe
621
00:31:36,338 --> 00:31:38,572
that once the ink
of a memory is dry,
622
00:31:38,574 --> 00:31:42,643
it is fixed and indelible.
623
00:31:42,645 --> 00:31:48,548
But Alain believes that
every time we recall a memory,
624
00:31:48,550 --> 00:31:53,353
it is like we are creating a
brand-new memory all over again.
625
00:31:53,355 --> 00:31:55,122
When you recall a memory,
626
00:31:55,124 --> 00:31:56,556
it becomes active again,
627
00:31:56,558 --> 00:32:00,294
and it becomes buzzing
with electrical activity.
628
00:32:00,296 --> 00:32:02,429
It's really a little bit like
629
00:32:02,431 --> 00:32:07,034
if you were rewriting the word
"Rouge" again with fresh ink.
630
00:32:07,036 --> 00:32:10,837
The moment someone
recalls a painful memory,
631
00:32:10,839 --> 00:32:16,043
Alain believes he has
an opportunity to modify it.
632
00:32:16,045 --> 00:32:18,979
I've had a lot of traumatic
events happen in my life,
633
00:32:18,981 --> 00:32:23,917
which I was able to, you know,
work through and live through,
634
00:32:23,919 --> 00:32:25,919
but then the death
of my daughter --
635
00:32:25,921 --> 00:32:28,021
it was too much.
636
00:32:28,023 --> 00:32:31,658
I couldn't function.
I couldn't work any longer.
637
00:32:31,660 --> 00:32:33,894
I had absolutely lost who I was.
638
00:32:33,896 --> 00:32:35,862
There's no doubt about that.
639
00:32:35,864 --> 00:32:40,600
Lois Bouchet,
who has come to Alain for help,
640
00:32:40,602 --> 00:32:44,237
is in for an intense treatment.
641
00:32:44,239 --> 00:32:45,906
As a first step,
642
00:32:45,908 --> 00:32:50,811
he asks her to methodically
recall her painful memory
643
00:32:50,813 --> 00:32:55,248
by reading aloud a personal
account of the traumatic event.
644
00:32:55,250 --> 00:32:57,084
Okay.
645
00:32:57,086 --> 00:32:58,618
"I heard the doorbell
at 5:00 a.m.
646
00:32:58,620 --> 00:33:00,887
"I went to the door
in my nightgown,
647
00:33:00,889 --> 00:33:02,823
"thinking it was
my daughter.
648
00:33:02,825 --> 00:33:04,791
"When I saw
that it was the police,
649
00:33:04,793 --> 00:33:07,160
"I excused myself
to go get my housecoat on.
650
00:33:07,162 --> 00:33:10,664
"As I'm walking down the hallway
to the bedroom,
651
00:33:10,666 --> 00:33:13,700
they ask if anybody is
at home with me."
652
00:33:13,702 --> 00:33:15,168
While Lois reads,
653
00:33:15,170 --> 00:33:16,870
she's under the influence
654
00:33:16,872 --> 00:33:20,073
of a drug Alain administers
called Propranolol,
655
00:33:20,075 --> 00:33:24,845
a simple beta blocker
that reduces high blood pressure
656
00:33:24,847 --> 00:33:29,082
and has a well-known side effect
of slight memory loss.
657
00:33:29,084 --> 00:33:32,219
"I know that something
is terribly wrong.
658
00:33:32,221 --> 00:33:34,454
"I get a knot
in my stomach.
659
00:33:34,456 --> 00:33:36,923
"My heart
starts beating faster,
660
00:33:36,925 --> 00:33:40,293
"and I can feel myself
shaking inside.
661
00:33:40,295 --> 00:33:44,164
"When I come back
to the living room,
662
00:33:44,166 --> 00:33:48,702
"he tells me Nikki has been hit
by a truck on the 401
663
00:33:48,704 --> 00:33:50,670
"and my Nikki is dead.
664
00:33:50,672 --> 00:33:53,740
"All of a sudden,
I crouch down
665
00:33:53,742 --> 00:33:56,076
"and start to sob
uncontrollably.
666
00:33:56,078 --> 00:33:59,513
"The pain is incredible.
My chest hurts.
667
00:33:59,515 --> 00:34:02,115
I think, 'How can I make it
through this?'"
668
00:34:06,053 --> 00:34:09,256
They did that once a week
for six weeks,
669
00:34:09,258 --> 00:34:13,627
and then we tested them with
a battery of tests, interviews,
670
00:34:13,629 --> 00:34:17,597
and psychophysiological
measurement of their responding
671
00:34:17,599 --> 00:34:20,367
while they're listening
to an account of their trauma.
672
00:34:20,369 --> 00:34:22,636
After six weeks of treatment,
673
00:34:22,638 --> 00:34:25,472
70% of Alain's patients
674
00:34:25,474 --> 00:34:29,276
show hardly any signs
of PTSD.
675
00:34:29,278 --> 00:34:31,578
They could talk about the pain
676
00:34:31,580 --> 00:34:34,481
without being forced
to relive it.
677
00:34:34,483 --> 00:34:37,150
And that really blew our mind,
678
00:34:37,152 --> 00:34:42,389
because they had only received
one small dose of a medication,
679
00:34:42,391 --> 00:34:46,726
and those people had been
suffering from PTSD for decades.
680
00:34:46,728 --> 00:34:48,528
Alain's patients
681
00:34:48,530 --> 00:34:51,665
have written over
their traumatic memories.
682
00:34:51,667 --> 00:34:55,168
They have a second chance
to reclaim their lives
683
00:34:55,170 --> 00:34:57,838
and to reclaim a sense of self.
684
00:34:57,840 --> 00:35:02,442
As you carried on,
it got easier.
685
00:35:02,444 --> 00:35:04,110
You never forgot the feelings.
686
00:35:04,112 --> 00:35:06,513
Like, I'm always gonna be upset
about it.
687
00:35:06,515 --> 00:35:09,115
My daughter died.
That's never gonna go away.
688
00:35:09,117 --> 00:35:11,985
But now I can think about
what happened
689
00:35:11,987 --> 00:35:14,721
without feeling like
I'm going to lose my mind.
690
00:35:19,760 --> 00:35:21,495
With trauma,
691
00:35:21,497 --> 00:35:25,765
there will always be
a time before and a time after,
692
00:35:25,767 --> 00:35:30,737
but in my opinion,
people gain back their old self.
693
00:35:30,739 --> 00:35:36,743
Alain seems to have
found the fine-tuned tool
694
00:35:36,745 --> 00:35:40,213
that can target
specific memories.
695
00:35:40,215 --> 00:35:42,582
But even if
we can envision a time
696
00:35:42,584 --> 00:35:45,919
when our identities
can be transformed or restored,
697
00:35:45,921 --> 00:35:49,990
we still haven't grasped
the most fundamental aspect
698
00:35:49,992 --> 00:35:52,759
about what makes us who we are.
699
00:35:52,761 --> 00:35:55,061
What is it that makes our brains
700
00:35:55,063 --> 00:35:58,565
able to question who we are
in the first place?
701
00:35:58,567 --> 00:36:01,001
One man thinks
he has the answer.
702
00:36:01,003 --> 00:36:02,669
He's trying to re-create
703
00:36:02,671 --> 00:36:04,938
the essence
of what makes us us
704
00:36:04,940 --> 00:36:08,608
in pieces of silicon hardware.
705
00:36:11,831 --> 00:36:14,167
The core of who we are
706
00:36:14,168 --> 00:36:17,703
is something we carry with us
everywhere we go.
707
00:36:17,904 --> 00:36:20,204
It lives somewhere in the web
708
00:36:20,206 --> 00:36:23,841
of billions of neurons
in our brains.
709
00:36:23,843 --> 00:36:27,444
Now some scientists
are trying to discover
710
00:36:27,446 --> 00:36:31,182
if this biological network
can be replicated
711
00:36:31,184 --> 00:36:33,717
in silicon hardware,
712
00:36:33,719 --> 00:36:38,622
whether we can build a robot
that will ask itself,
713
00:36:38,624 --> 00:36:42,359
"Who am I?"
714
00:36:44,863 --> 00:36:46,931
Computer engineer Steve Furber
715
00:36:46,933 --> 00:36:48,966
from the University
of Manchester
716
00:36:48,968 --> 00:36:51,635
is on a quest to find out
717
00:36:51,637 --> 00:36:55,439
if a human identity
can be built.
718
00:36:55,441 --> 00:36:57,708
He is attempting to make
the first replica of the brain
719
00:36:57,710 --> 00:37:00,511
that works in real time.
720
00:37:00,513 --> 00:37:02,413
If he succeeds,
721
00:37:02,415 --> 00:37:05,482
he could unlock the secret
of what makes us who we are.
722
00:37:05,484 --> 00:37:08,719
I think the whole issue
of understanding the brain
723
00:37:08,721 --> 00:37:10,020
is fascinating.
724
00:37:10,022 --> 00:37:12,723
It's so central
to our existence.
725
00:37:12,725 --> 00:37:15,292
We're pretty sure that
our understanding of the brain
726
00:37:15,294 --> 00:37:17,061
is missing
some fundamental ideas,
727
00:37:17,063 --> 00:37:18,262
and one of these is
728
00:37:18,264 --> 00:37:20,297
how information is represented
in the brain.
729
00:37:20,299 --> 00:37:22,366
Steve believes
730
00:37:22,368 --> 00:37:26,170
there is a neural code
that runs our brains,
731
00:37:26,172 --> 00:37:31,108
that one code is responsible
for controlling multiple jobs --
732
00:37:31,110 --> 00:37:34,078
seeing, hearing,
learning language.
733
00:37:34,080 --> 00:37:37,548
It's just a matter
of finding out what the code is.
734
00:37:37,550 --> 00:37:40,017
He suspects
the best place to look
735
00:37:40,019 --> 00:37:44,121
is in the part of the brain that
is far more evolved in humans
736
00:37:44,123 --> 00:37:46,257
than in other species --
737
00:37:46,259 --> 00:37:51,028
the thin, wrinkly outer layer
called the neocortex.
738
00:37:51,030 --> 00:37:54,398
So, the neocortex is a very
interesting area of the brain
739
00:37:54,400 --> 00:37:57,101
because it's pretty much
the same at the back,
740
00:37:57,103 --> 00:37:59,803
where it's doing
low-level image processing,
741
00:37:59,805 --> 00:38:00,904
and at the front,
742
00:38:00,906 --> 00:38:03,307
where it's doing
high-level functions.
743
00:38:03,309 --> 00:38:05,609
So, if you're born
without sight,
744
00:38:05,611 --> 00:38:07,344
a lot of your visual cortex
745
00:38:07,346 --> 00:38:09,813
will be taken over
processing sound.
746
00:38:09,815 --> 00:38:13,384
And it's quite common that
people who don't have sight
747
00:38:13,386 --> 00:38:15,319
have much more acute hearing.
748
00:38:15,321 --> 00:38:17,288
So there must be something
in common
749
00:38:17,290 --> 00:38:19,456
about the algorithms
that are used there,
750
00:38:19,458 --> 00:38:21,392
if only we could see
what that was.
751
00:38:21,394 --> 00:38:23,027
Computer engineers
752
00:38:23,029 --> 00:38:26,363
have been trying to replicate
biological brains for decades,
753
00:38:26,365 --> 00:38:29,133
using standard
computer technology.
754
00:38:29,135 --> 00:38:34,238
But Steve believes they've been
going about it all wrong.
755
00:38:34,240 --> 00:38:36,874
In a conventional computer,
756
00:38:36,876 --> 00:38:40,010
data gets moved around
in large chunks.
757
00:38:40,012 --> 00:38:41,945
That would be like a chef
758
00:38:41,947 --> 00:38:45,683
dumping an entire dinner
and dessert into one pot
759
00:38:45,685 --> 00:38:48,786
and serving a pile
to one unfortunate customer.
760
00:38:51,022 --> 00:38:55,526
But the brain is more like
a cocktail party.
761
00:38:55,528 --> 00:38:59,296
Small bits of data
are passed around and shared.
762
00:38:59,298 --> 00:39:02,166
Before you know it,
connections are being made
763
00:39:02,168 --> 00:39:06,170
and a complex situation
is underway.
764
00:39:08,841 --> 00:39:14,445
This highly interconnected way
to arrange small packets of data
765
00:39:14,447 --> 00:39:17,014
is what Steve wants to replicate
766
00:39:17,016 --> 00:39:20,284
in a custom-designed
silicon circuit.
767
00:39:20,286 --> 00:39:24,822
He has created a brand-new type
of computer chip
768
00:39:24,824 --> 00:39:27,458
specifically engineered to mimic
769
00:39:27,460 --> 00:39:29,360
the way neurons work
in the brain.
770
00:39:29,362 --> 00:39:32,930
It is called the SpiNNaker chip.
771
00:39:34,999 --> 00:39:36,166
SpiNNaker is a contraction
772
00:39:36,168 --> 00:39:37,968
of spiking neural network
architecture.
773
00:39:37,970 --> 00:39:40,237
If you say it quickly enough,
it comes out like "SpiNNaker."
774
00:39:40,239 --> 00:39:43,006
The SpiNNaker chip is
a massively parallel computer
775
00:39:43,008 --> 00:39:46,910
designed to run models
of the brain in real time,
776
00:39:46,912 --> 00:39:49,646
which means that our model runs
at the same speed
777
00:39:49,648 --> 00:39:51,415
as the biology inside your head.
778
00:39:51,417 --> 00:39:54,084
Each one
of Steve's SpiNNaker chips
779
00:39:54,086 --> 00:39:59,123
can be programmed to replicate
the behavior of 16,000 neurons.
780
00:39:59,125 --> 00:40:00,824
That's only a tiny fraction
781
00:40:00,826 --> 00:40:03,861
of the 100 billion neurons
we have in our brain,
782
00:40:03,863 --> 00:40:06,263
but it is
a significant step beyond
783
00:40:06,265 --> 00:40:08,665
anything
that has been done before.
784
00:40:08,667 --> 00:40:12,770
Steve and a team from the
Technical University of Munich
785
00:40:12,772 --> 00:40:16,540
are now wiring these
brain-like chips to robots.
786
00:40:16,542 --> 00:40:19,543
This might look like
a remote-controlled toy,
787
00:40:19,545 --> 00:40:22,179
but it is not.
788
00:40:22,181 --> 00:40:27,217
It is controlling itself
by sensing the world around it.
789
00:40:27,219 --> 00:40:29,653
So, the robot is basically
following the line
790
00:40:29,655 --> 00:40:31,221
entirely under neural control.
791
00:40:31,223 --> 00:40:33,991
It has a vision sensor
on the front.
792
00:40:33,993 --> 00:40:38,295
The vision information is being
sent into the SpiNNaker card.
793
00:40:38,297 --> 00:40:41,265
The SpiNNaker card is executing
the real-time neural network,
794
00:40:41,267 --> 00:40:43,500
and then the outputs
from the SpiNNaker card
795
00:40:43,502 --> 00:40:45,869
are being sent back
via the laptop to the robot
796
00:40:45,871 --> 00:40:48,305
and controlling its movement.
797
00:40:48,307 --> 00:40:50,207
The brain is over there,
and the body is over here.
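The setup Steve describes is a closed sense-process-act loop: camera input goes to the board, the network's output drives the wheels. Below is a minimal sketch of such a loop; the function names are placeholders and the proportional steering rule is an assumption, not the real SpiNNaker software stack.

```python
# Illustrative sense -> neural processing -> act loop for the
# line-following robot described above (placeholder functions only).
import time

def read_vision_sensor():
    """Return the line's horizontal offset seen by the camera (stub)."""
    return 0.1  # pretend the line is slightly to the right

def spiking_network_step(offset):
    """Stand-in for the spiking network on the board: map the visual
    offset to left/right wheel speeds, steering toward the line."""
    turn = max(-1.0, min(1.0, 2.0 * offset))
    return 0.5 + 0.2 * turn, 0.5 - 0.2 * turn  # (left, right)

def drive(left, right):
    print(f"wheel speeds: left={left:.2f} right={right:.2f}")

for _ in range(3):  # a few real-time control cycles
    left, right = spiking_network_step(read_vision_sensor())
    drive(left, right)
    time.sleep(0.05)
```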
798
00:40:50,209 --> 00:40:52,810
The robot's SpiNNaker chip brain
799
00:40:52,812 --> 00:40:55,212
mimics the way
a real biological brain works.
800
00:40:55,214 --> 00:40:57,147
Just like a child,
801
00:40:57,149 --> 00:41:00,017
it interacts
with its environment
802
00:41:00,019 --> 00:41:04,755
and uses its physical body to
understand the world around it.
803
00:41:04,757 --> 00:41:08,992
The more it experiences,
the smarter it gets.
804
00:41:08,994 --> 00:41:11,562
Our current systems
have four chips on.
805
00:41:11,564 --> 00:41:14,765
They can model
about 50,000, 60,000 neurons.
806
00:41:14,767 --> 00:41:16,233
In a few months' time,
807
00:41:16,235 --> 00:41:19,403
we'll have boards
about 10 times bigger than that,
808
00:41:19,405 --> 00:41:22,473
and they'll be getting up
to the level of complexity
809
00:41:22,475 --> 00:41:25,008
of a honey bee,
which has 850,000 neurons.
810
00:41:25,010 --> 00:41:27,177
And then beyond that,
we'll build systems
811
00:41:27,179 --> 00:41:29,413
and get up to mammalian
brain sizes.
812
00:41:31,349 --> 00:41:34,952
The human brain
is a formidably complex system,
813
00:41:34,954 --> 00:41:38,856
and it would take millions more
SpiNNaker chips to build one,
814
00:41:38,858 --> 00:41:43,360
but Steve is confident
it is possible.
815
00:41:43,362 --> 00:41:46,430
If you had a model of the mind
running in a machine,
816
00:41:46,432 --> 00:41:49,867
I don't see why it shouldn't
behave in exactly the same way.
817
00:41:49,869 --> 00:41:52,769
The question of whether
machines modeling the brain
818
00:41:52,771 --> 00:41:55,906
may ultimately be capable
of supporting the imagination,
819
00:41:55,908 --> 00:41:57,474
dreams, and so on
820
00:41:57,476 --> 00:41:58,609
is a very hard question,
821
00:41:58,611 --> 00:42:00,477
but I don't see
any fundamental reason
822
00:42:00,479 --> 00:42:01,812
why we shouldn't expect that.
823
00:42:03,948 --> 00:42:06,250
Steve believes
824
00:42:06,252 --> 00:42:10,521
that human brains run
on simple algorithms,
825
00:42:10,523 --> 00:42:15,125
and what works for humans will
also work for his machines.
826
00:42:15,127 --> 00:42:18,161
The journey to forming
an identity begins
827
00:42:18,163 --> 00:42:21,331
when a body,
guided by networks of neurons,
828
00:42:21,333 --> 00:42:24,701
struggles to navigate its way
through the world.
829
00:42:24,703 --> 00:42:28,839
It learns, adapts, remembers,
830
00:42:28,841 --> 00:42:33,010
and eventually
becomes self-aware.
831
00:42:35,179 --> 00:42:38,682
What makes us who we are?
832
00:42:38,684 --> 00:42:42,085
Our identities
are built bit by bit
833
00:42:42,087 --> 00:42:46,790
from our memories, our dreams,
and our imaginations.
834
00:42:46,792 --> 00:42:50,394
No one's sense of self is fixed.
835
00:42:50,396 --> 00:42:54,298
Life is a journey
that makes us all unique,
836
00:42:54,300 --> 00:42:56,633
and discovering who we are
837
00:42:56,635 --> 00:42:59,937
is our greatest
and longest adventure.
838
00:43:00,096 --> 00:43:04,096
== sync, corrected by elderman ==