Through the Wormhole S03E04 -- English (HI) subtitle transcript for Through.the.Wormhole.S03E04.480p.WEB-DL.x264-mSD.HI

Freeman: We all spend our lives on a search for something so close, yet always just out of reach. Some call it the ego. Others, the soul. Now modern science is prying into our thoughts, our memories, and our dreams, and asking the profoundly puzzling question, "What makes us who we are?" Space, time, life itself. The secrets of the cosmos lie through the wormhole.

♪ Through the Wormhole 03x04 ♪ "What Makes Us Who We Are?" Original air date: June 20, 2012

Freeman: What is it that makes me me? Or makes you you? Is it the things we know? The people and places we've experienced? What makes me the same person now as when I was 40? Or when I was 10? What is it that gives each one of us our unique identity? Scientists are beginning to tackle this profound puzzle. To do so, they must probe one of the last great frontiers of our understanding -- the brain.

Every summer, I used to go camping with the Boy Scouts. Each item of clothing I took with me had to have my name sewn onto it. "Morgan." It was a name I never really liked when I was a boy. I wondered how my life might be different if I had been called something normal like Robert or John. Well, like it or not, the name I was given framed my identity. It helped make me who I am.

"Happy Birthday!" Man: Look, he's walking! [ Laughs ] "Show your mom." "Bye!" "It's beautiful!" [ Cheering ] "Wish you were here."

Freeman: Alison Gopnik is on a search to discover how and when we first understand who we are. She's a child psychologist at the University of California at Berkeley.

Gopnik: So, the process of forging an identity, of figuring out who it is that we are, that's a process that really takes us our whole lifetime.
But some of the most crucial parts of that seem to be things that we're learning in this very, very early period of our lives.

Freeman: As adults, it's easy to take our identities for granted. We accept that who we are now is the same as who we were a minute ago. But Alison has discovered that identity is not so solid for young children. They spend much of their time trying to figure out just who they are.

Gopnik: So, when kids are just doing the everyday things that kids do, when they're playing and exploring and pretending, what they're actually doing is being involved in this great, existential, philosophical research program. What it means to be a person.

Freeman: One of the first milestones in this program is called "the mirror stage." It begins when a child is first able to recognize his or her own reflection.

Gopnik: The thing about a mirror isn't just that you see a body and a face, which is fascinating, but you connect it to your own kinesthetic feeling of yourself -- the way your own body feels.

Freeman: Alison uses a clever experiment on her toddlers to detect which of them have developed an awareness of their reflections. She places a smudge of blue ink on their noses and tells them to look in the mirror.

Gopnik: There we go. There he is. There he is. Right there. Yeah.

Freeman: 15-month-old John is hardly fazed by the blue-nosed child staring back at him because he's not able to recognize that the child in the mirror is him.

Gopnik: They're interested in the fact that that baby in the mirror has a spot, but they don't seem to connect that to the fact that there's actually a spot on their own noses.

Freeman: When Alison tries the same test on Karen, who is just a few months older than John, she does something quite different.

Gopnik: Look at that! There we go. Yeah! What is that? And now the baby seemed to realize, "Oh, yeah. That person in the mirror, that's the same person that I am."
Freeman: It is a big first step on a long journey of self-discovery. But Alison's work has shown that her subjects have a lot more philosophical research ahead of them. This is 3-year-old Geneva. Alison tempts her with what looks like a box filled with cookies, but Geneva will soon find out that she can't judge a box by its cover.

Gopnik: What do you think is inside this box? Hmm. What do you think this is?

Geneva: Good cookies.

Gopnik: Good cookies. Should we find out? Let's open the box and find out what's inside.

Geneva: Markers.

Gopnik: Markers! There's markers inside. Hmm. [ Laughs ] When I first showed you this box all closed up like this, what did you think was inside it?

Geneva: Markers.

Gopnik: Markers!

Freeman: Geneva cannot reconcile that she now knows the box is filled with markers when she once thought it was filled with cookies. The concept that there is a you who is the same person even if your thoughts have changed is not an understanding you're born with. It is something you come to learn. Alison tries the cookie box test with 4-year-old Jim.

Gopnik: I have a question for you.

Jim: What?

Gopnik: When you first saw this box all closed up like this, what did you think was inside it?

Jim: Cookies with chocolate chips.

Gopnik: Uh-huh. And what's really inside it?

Jim: Markers!

Gopnik: Yeah! That's great! So, a very important part of my identity is being able to say, "Well, when I was 16, I believed different things than I do now, but I was me who believed those different things."

Freeman: When children realize their identities can survive any change in their beliefs, they stop forgetting the things they don't believe anymore, and for the first time, they unlock the astonishing power of human memory.

Addis: In this very hall, I used to perform in the choir. It takes me back. [ Echoing ] In this very hall, I used to perform in the choir.
My own memories are so richly detailed, and I spent many hours in this hall, rehearsing, practicing. It just takes me back.

[ Choir vocalizing ]

Freeman: Neuroscientist Donna Addis is head of the memory lab at the University of Auckland, where she investigates how memories shape us. She does this by peering into a horseshoe-shaped area in the brain called the hippocampus. For the last 50 years, scientists have known that the hippocampus is critical to the storage of memory. We learned this thanks to one man, known as "Patient H.M." He had severe epilepsy and, in 1953, received a radical new treatment. His hippocampus and part of his inner temporal lobes were cut out. After the surgery, H.M. never suffered a bout of epilepsy again, but he had completely lost the ability to form new memories. He said, "Each day is like waking from a dream." He had lost his identity. His sense of who he was was frozen at age 27. As days turned into decades, he could no longer recognize the man staring back at him in the mirror. How much of who we are is built upon the memories we make each and every day?

Addis: Traditionally, memory research has really focused on the past, and in the last few years, researchers such as myself have been looking at the ability to imagine the future and how memory might actually play a role in that.

Freeman: Donna and her team set up an experiment to determine just how the hippocampus and our memories help us perceive our future selves.

Addis: So, what we do is we have subjects come into the lab and we have them retrieve around 100 memories.

"I remember going to visit my cousin in Germany." "We walked along the beach, and at the end, there was a cave." "And I gave my grandmother a seashell for Christmas."

Addis: And for each memory, they identify a person, a place, and an object that might be important.

"The Autobahn." "My mother."
"Two penguins." "Boston." "My book bag." "Auckland waterfront." "My girlfriend, Sarah." "The hospital." "My partner, Wayne." "Eastern Beach." "Amsterdam airport." "A cave."

Freeman: A week later, Donna brings her subjects back, places them in an fMRI machine, and shows them details from their recalled memories. But she deliberately jumbles the details. An object from one memory has been grouped with a place from another memory and a person from yet another. Donna then asks them to make a story out of these mixed-up memories, to imagine something that has not happened yet but potentially could.

Addis: We ask them, for each person, place, object that they're seeing now, to imagine a future event that might happen to them within the next five years or so.

Freeman: While the participants are imagining their futures, Donna measures their brain responses. To her surprise, the part of the brain that is vital to storing memories of the past, the hippocampus, is blazing with activity.

Addis: The hippocampus is playing a really important role, not only in remembering, but also allowing us to build these future simulations. Memory is important not only for the past, but also for the future, for building up that sense of who we are.

Freeman: Our memories are crucial to forming our identities, but one group of scientists has discovered that our memories can be manipulated without our even knowing it. Are we really who we think we are?

Our ability to remember is truly remarkable. In the course of our lives, the average person will grasp the meaning of 100,000 words, get to know around 1,700 people, and read over 1,000 books. From these vast mental stores of experience we each build our own identity, a pattern of memories that is uniquely ours. But what if those memories could be rewritten? Could we change who we are?
[ Indistinct conversations ]

Freeman: Neuroscientist Tali Sharot from University College London and Micah Edelson from the Weizmann Institute of Science in Israel love going to dinner parties. But they're not just having fun. They're doing it for the sake of science. They study how social pressures alter who we are.

Well, imagine a situation that you're sitting in a dinner party. You have a pretty good memory of some situation that happened, and you actually remember it happening in a certain way, but all of your friends are telling you that you're actually wrong. They're saying that something else happened.

And Tali got the umbrella, and she slid the umbrella through the letterbox. Woman: Are you sure it was Tali? No, no, no. I'm quite sure it wasn't me. No, it was definitely you. You were definitely holding the umbrella. It was definitely Steve. I really thought it was Tali, though. No, because we were really impressed that Steve managed to hook it around. Remember? We had that big fanfare because we were out in the rain.

When someone changes their memory so that it fits other people's opinion, do we actually change a signal in the brain that is representative of the memory, or is it just that we try to please other people?

Freeman: Social pressures can make us change the way we tell stories. But can they also make us change the stories we tell ourselves? Our actual memories? Tali and Micah set up an experiment to find out, not in a restaurant, but in a lab. They bring in a group of volunteers to watch a film together. Afterwards, the volunteers answer basic questions about the film to test what they remember. A few days later, the participants take the same questionnaire inside a brain scanner, only this time, Micah and Tali apply social pressure.

This time, they were exposed to fake answers that were supposedly given by their fellow group members.
Freeman: The group is led to believe that the others who took the test remembered the character in the film was not wearing a hat. And most people changed their answers to go along with the crowd.

Almost 70% of the cases, the participants conformed and they gave a wrong answer, even though they were initially pretty confident about the correct answer.

But were they just outwardly complying with a social norm, or did their memories actually change? So, we test them again a week after, once we've removed the social pressure. And we assume that if they're still making an error, that means that their memory was actually changed.

Freeman: Tali and Micah found that most test subjects stuck with the wrong answer even without peer pressure. The falsehood has taken root in their brains.

It actually causes a long-lasting memory error, and using our brain data, we are able to actually identify when such a long-lasting memory error will occur.

The brains of those people who changed their memories showed high activity not just in the hippocampus, where memories are stored, but also in a part of the brain that is connected with emotional and social responses, the amygdala.

Sharot: A lot of what we know about the amygdala and its function comes from the animal world. So our reaction to anything that's emotional -- if we suddenly hear a noise which is frightening or of processing emotional expressions on one's face -- the amygdala is crucial for all of these functions.

Edelson: And it probably helps us increase memory because, in a fearful or emotional situation, the amygdala activation is heightened and it also increases activation in related structures like the hippocampus.

[ Growls ]

Freeman: The amygdala is like a bouncer at a nightclub. It decides which memories get to play a part in shaping our life histories. The ones that carry emotional weight are allowed in. The ones that don't are not.
The participants who changed their memories were emotionally affected by the pressure to conform, setting their amygdalas ablaze, and the false memory snuck in.

So, by looking at amygdala activation, we can actually predict which memories are gonna be changed for a very long time and which are not. You have to realize that memories are not like a videotape. 'Cause people can be extremely confident that things happened like they think they happened when they didn't.

Freeman: Our memories are not just a record of the events that took place in our lives. They are malleable and fallible. Our identities are created with constant input from our society. No man is an island. But could we go a step further and deliberately re-engineer someone's identity? To do that, you have to be able to peer into a person's innermost thoughts, and believe it or not, that technology is already here.

[ Beeping ]

We are all actors, to some extent. Who we appear to be can change depending on our mood or the company we keep. But there is one time when who we really are comes to the fore -- when we dream. What if we could see our dreams and study them? Could we know each other in a more profound way than ever before? During an average life span, a human being spends about six years dreaming. That's more than 52,000 hours of imagery buzzing through our unconscious brains.

[ Alarm buzzing ] [ Buzzing stops ]

Computational neuroscientist Yuki Kamitani believes one day it will be possible to watch and record what people are dreaming. When that happens, we will all get to know ourselves on a much deeper level.

Kamitani: I believe that if we can reconstruct or decode the contents of a dream, the identity is revealed.

Freeman: If we remember our dreams, it is often as a series of emotionally charged images.
In fact, scientists have found that the visual cortex of a dreaming brain is highly active. Patterns of electrical activity wash over it, which makes Yuki wonder, can we learn to read those patterns and convert them into images on a computer?

Kamitani: Brain activity can be seen as a code or an encrypted message about what's going on in the visual world.

Freeman: The patterns of images we make in our brains are highly distorted, in the same way a pair of shattered glasses distorts our view of the world. [ Glass shatters ] But if we collected data on hundreds of images seen through those shattered lenses, we could find a correspondence between the distorted images and the real ones. It might take a while, but if we gave that job to a powerful computer, it could decode the scrambled images into recognizable ones. And this is how Yuki tries to crack the code that turns images into patterns of activity in the visual cortex.

Kamitani: We measure the brain activity of human subjects, and we let the subject go into the scanner and scan their brain. And during that, we present some images to the subject. Typically several hundreds or thousands of images in single experiment.

Freeman: The images Yuki shows people are simple black-and-white shapes -- a square, a cross, a line. Using a powerful computer array, he records the precise pattern of activity in the visual cortex. After multiple trials with the same person, the computer learns to distinguish the patterns triggered by each image. In other words, the computer can judge purely from the brain activity which of the shapes the subject is looking at. And then Yuki does something remarkable. He shows the subjects brand-new images, images the computer has never seen, and lets the computer try to draw a picture of what the subjects are seeing.
These are the images the computer reads inside people's brains. And these are the images they are actually looking at. This is the first time anyone has been able to know what people are seeing purely by looking at their brains.

Kamitani: Looking at the visual cortex, we have just succeeded in reconstructing seen images. We are now trying to reconstruct imagined images or images in your dreams.

Freeman: Yuki's method, as yet, only works on pixelated black-and-white images, but with a few more years of refinement, Yuki believes we will be able to record our dreams as full-color, high-definition movies. And that would truly be a window into our souls.

Kamitani: Those, you know, unconscious aspects of our mind defines what we are and what the identity is. So, I think if we can reveal some dream contents which someone is not aware of, then that might reveal some deep property of that person.

Freeman: Our true identities could soon be laid bare for all to see, including the parts we don't want seen, like our deepest-held secrets and fantasies. But you may not have to worry. Because the power to edit the contents of our minds is close at hand.

Our brains are filled with memories. Some of them bring us joy. Others make us wish we could forget. Whether we like it or not, our memories shape how we think and how we act. But now one group of researchers thinks it has found a way to change memory and perhaps change who we are.

Can we deliberately change our sense of identity? Neuroscientist André Fenton from the State University of New York doesn't see why not. To him, the brain and its pathways of connections between neurons are like the labyrinth of streets in New York City -- a maze he navigates on his daily runs. And just like Manhattan traffic, conditions for the flow of electricity around the brain are not the same on every route.
Fenton: If you experience something, there's been an electrical activation somewhere in the brain that spreads through the brain, and that is your experience. As in a city, there are roads that connect one district to another district, and those roads can be very big boulevards that send a lot of traffic, or they can be small alleys that send very specific information, but nonetheless, not very rapidly or very easily.

Freeman: For years, scientists thought the pathways in our brains were set in stone after we matured from babies to adults. Alleys could not become wider. Highways could not become narrower. But now it has become clear that the roads in our adult brains are under constant construction. Every time we store a new memory, electrical activity propagates through millions of neurons. Just as André is forced to find a new route if his pathway is blocked, our neural pathways adjust themselves to process and record new experiences.

Fenton: And so, what neuroscientists understand is that there's a sufficient amount of this plasticity throughout life, and that it is affected and modulated and controlled by experience.

Freeman: Recently, scientists have identified a molecule in the brain that jumps into action when we are forming new memories. It is called PKMzeta.

Fenton: PKMzeta stands for "protein kinase M zeta." It's my favorite molecule. When PKMzeta gets told to deploy in a neuron, it gets told to do that on the basis of a recent experience, and what it does is it mediates efficient or increased efficiency of neural transmission.

Freeman: When a memory needs to navigate its way through the traffic of our brains, PKMzeta clears the way, making sure the memory safely reaches long-term storage.

Fenton: Those long-term memories, the ones that you form now and you will keep forever, that kind of information storage seems to be mediated by PKMzeta.
Freeman: But André knew of a chemical that could neutralize PKMzeta, called zeta inhibitory peptide, or ZIP, and he wondered whether, by injecting it into a living brain, he could prevent it from forming long-term memories.

Fenton: So, the logic of the experiment we did is very straightforward. What you want to do is produce a memory. A rat is in a rotating carousel, and the key here is that whenever it enters that part of the floor, it becomes electrified. And so, they very quickly and rapidly learn to stay away from that part of the room.

Freeman: In this computer-generated read-out of André's experiment, the rat runs around the carousel but consistently avoids the triangular-shaped shock zone. Thirty days later, André puts the rat back in the chamber and observes that it still remembers to stay away. It has stored a new long-term memory in its brain. But when André injects the rat's hippocampus with ZIP, he sees something extraordinary. When the rat is put back in the carousel one more time, it runs right over the shock zone as though it had never been shocked before.

Fenton: You could see that the animal behaved more or less like a naive animal, so it was very exciting.

Freeman: André has erased a piece of the rat's memory. The ability to forget people we have met, places we have been, things we have done is now a pharmaceutical possibility. But André can't see inside his rat's brain, and so he cannot be sure how many memories the ZIP molecule erased.

Fenton: As we begin to work out the synaptic organization of memories, we'll then be in a position to understand whether it's possible to actually make selective manipulations of particular memories. We are always going to be confronted with the possibility of erasing all memories, which could never be a good idea.

Freeman: Using ZIP to erase a specific memory is still a ways off. But in Montreal, one doctor has found another way.
He is washing away painful memories that make his patients prisoners inside their own identities.

Who we are depends on where we have been, who we have loved, who we have lost. For some of us, painful memories can linger like an open wound. They can hold us back from becoming who we want to become.

Doctor Alain Brunet is a psychologist at McGill University in Montreal. He specializes in treating people with post-traumatic stress disorder. Alain himself has a deep understanding of the condition. [ Gunshots ] In 1989, at the University of Montreal, a deranged man carried out the worst mass shooting in Canadian history. Alain was on campus, studying for his master's degree in psychology.

He shot -- he went through the corridors. He shot 12 women, and eventually there were 13 deaths.

[ Gunshots, women screaming ]

Brunet: The crisis intervention that had been conducted after this event was very poorly done, and many of us were left with a bad taste in our mouth, and so, it did have a profound effect on me and on what I decided to study.

Freeman: This horrific event started Alain on a path that he is still following today. He helps people who suffer from PTSD get back a part of themselves that seems to be lost.

Brunet: PTSD can be conceived as a disorder of memory. Because in a sense, it's really about things that you wish you'd forget. That memory has been burned into your brain and is way too powerful, and it's making you fearful in situations where you shouldn't. Memory is a little bit like writing with ink. So, you can see that the ink is still wet. If I use my fingers and go over my writing, it will smear what I just wrote. And this is exactly like the workings of memory.

Freeman: But when a memory is emotionally powerful, proteins in the brain build connections between neurons and the memory is transferred to a separate long-term storage area. There, it leaves a lasting impression.
Brunet: Once the ink is dry, the memory is there for good. Of course, it might fade with time, but that memory will still be accessible.

Freeman: Many scientists believe that once the ink of a memory is dry, it is fixed and indelible. But Alain believes that every time we recall a memory, it is like we are creating a brand-new memory all over again.

Brunet: When you recall a memory, it becomes active again, and it becomes buzzing with electrical activity. It's really a little bit like if you were rewriting the word "Rouge" again with fresh ink.

Freeman: The moment someone recalls a painful memory, Alain believes he has an opportunity to modify it.

Bouchet: I've had a lot of traumatic events happen in my life, which I was able to, you know, work through and live through, but then the death of my daughter -- it was too much. I couldn't function. I couldn't work any longer. I had absolutely lost who I was. There's no doubt about that.

Freeman: Lois Bouchet, who has come to Alain for help, is in for an intense treatment. As a first step, he asks her to methodically recall her painful memory by reading aloud a personal account of the traumatic event.

Bouchet: Okay. "I heard the doorbell at 5:00 a.m. I went to the door in my nightgown, thinking it was my daughter. When I saw that it was the police, I excused myself to go get my housecoat on. As I'm walking down the hallway to the bedroom, they ask if anybody is at home with me."

Freeman: While Lois reads, she's under the influence of a drug Alain administers called propranolol, a simple beta blocker that reduces high blood pressure and has a well-known side effect of slight memory loss.

Bouchet: "I know that something is terribly wrong. I get a knot in my stomach. My heart starts beating faster, and I can feel myself shaking inside. When I come back to the living room, he tells me Nikki has been hit by a truck on the 401 and my Nikki is dead.
"All of a sudden, I crouch down and start to sob uncontrollably. The pain is incredible. My chest hurts. I think, 'How can I make it through this?'"

Brunet: They did that once a week for six weeks, and then we tested them with a battery of tests, interviews, and psychophysiological measurement of their responding while they're listening to an account of their trauma.

Freeman: After six weeks of treatment, 70% of Alain's patients show hardly any PTSD symptoms. They could talk about the pain without being forced to relive it.

Brunet: And that really blew our mind, because they had only received one small dose of a medication, and those people had been suffering from PTSD for decades.

Freeman: Alain's patients have written over their traumatic memories. They have a second chance to reclaim their lives and to reclaim a sense of self.

Bouchet: As you carried on, it got easier. You never forgot the feelings. Like, I'm always gonna be upset about it. My daughter died. That's never gonna go away. But now I can think about what happened without feeling like I'm going to lose my mind.

Brunet: With trauma, there will always be a time before and a time after, but in my opinion, people gain back their old self.

Freeman: Alain seems to have found the fine-tuned tool that can target specific memories. But even if we can envision a time when our identities can be transformed or restored, we still haven't grasped the most fundamental aspect about what makes us who we are. What is it that makes our brains able to question who we are in the first place? One man thinks he has the answer. He's trying to re-create the essence of what makes us us in pieces of silicon hardware.

The core of who we are is something we carry with us everywhere we go. It lives somewhere in the web of billions of neurons in our brains.
Now some scientists are trying to discover if this biological network can be replicated in silicon hardware, whether we can build a robot that will ask itself, "Who am I?"

Computer engineer Steve Furber from the University of Manchester is on a quest to find out if a human identity can be built. He is attempting to make the first replica of the brain that works in real time. If he succeeds, he could unlock the secret of what makes us who we are.

Furber: I think the whole issue of understanding the brain is fascinating. It's so central to our existence. We're pretty sure that our understanding of the brain is missing some fundamental ideas, and one of these is how information is represented in the brain.

Freeman: Steve believes there is a neural code that runs our brains, that one code is responsible for controlling multiple jobs -- seeing, hearing, learning language. It's just a matter of finding out what the code is. He suspects the best place to look is in the part of the brain that is far more evolved in humans than in other species -- the thin, wrinkly, outer layer called the neocortex.

Furber: So, the neocortex is a very interesting area of the brain because it's pretty much the same at the back, where it's doing low-level image processing, and at the front, where it's doing high-level functions. So, if you're born without sight, a lot of your visual cortex will be taken over processing sound. And it's quite common that people who don't have sight have much more acute hearing. So there must be something in common about the algorithms that are used there, if only we could see what that was.

Freeman: Computer engineers have been trying to replicate biological brains for decades, using standard computer technology. But Steve believes they've been going about it all wrong. In a conventional computer, data gets moved around in large chunks.
That would be like a chef dumping an entire dinner and dessert into one pot and serving a pile to one unfortunate customer. But the brain is more like a cocktail party. Small bits of data are passed around and shared. Before you know it, connections are being made and a complex situation is underway. [ Munching ] This highly interconnected way to arrange small packets of data is what Steve wants to replicate in a custom-designed silicon circuit. He has created a brand-new type of computer chip specifically engineered to mimic the way neurons work in the brain. It is called the SpiNNaker chip.

Furber: SpiNNaker is a compression of spiking neural network architecture. If you say it quickly enough, it comes out like "SpiNNaker." The SpiNNaker chip is a massively parallel computer designed to run models of the brain in real time, which means that our model runs at the same speed as the biology inside your head.

Freeman: Each one of Steve's SpiNNaker chips can be programmed to replicate the behavior of 16,000 neurons. That's only a tiny fraction of the 100 billion neurons we have in our brain, but it is a significant step beyond anything that has been done before. Steve and a team from the Technical University of Munich are now wiring these brain-like chips to robots. This might look like a remote-controlled toy, but it is not. It is controlling itself by sensing the world around it.

So, the robot is basically following the line entirely under neural control. It has a vision sensor on the front. The vision information is being sent into the SpiNNaker card. The SpiNNaker card is executing the real-time neural network, and then the outputs from the SpiNNaker card are being sent back via the laptop to the robot and controlling its movement. The brain is over there, and the body is over here.

Freeman: The robot's SpiNNaker chip brain mimics the way a real biological brain works.
Just like a child, it interacts with its environment and uses its physical body to understand the world around it. The more it experiences, the smarter it gets.

Furber: Our current systems have four chips on. They can model about 50,000, 60,000 neurons. In a few months' time, we'll have boards about 10 times bigger than that, and they'll be getting up to the level of complexity of a honey bee, which has 850,000 neurons. And then beyond that, we'll build systems and get up to mammalian brain sizes.

Freeman: The human brain is a formidably complex system, and it would take millions more SpiNNaker chips to build one, but Steve is confident it is possible.

Furber: If you had a model of the mind running in a machine, I don't see why it shouldn't behave in exactly the same way. The question of whether machines modeling the brain may ultimately be capable of supporting the imagination, dreams, and so on is a very hard question, but I don't see any fundamental reason why we shouldn't expect that.

Freeman: Steve believes that human brains run on simple algorithms, and what works for humans will also work for his machines. The journey to forming an identity begins when a body, guided by networks of neurons, struggles to navigate its way through the world. It learns, adapts, remembers, and eventually becomes self-aware.

What makes us who we are? Our identities are built bit by bit from our memories, our dreams, and our imaginations. No one's sense of self is fixed. Life is a journey that makes us all unique, and discovering who we are is our greatest and longest adventure.
