Subtitles for Through.the.Wormhole.S03E07.720p.HDTV.x264-ORENJi.HI

Freeman: From the dawn of recorded history to the present day, humankind has struggled with its darker nature. We know that psychopaths can torture and kill without remorse, but what compels seemingly normal people to commit acts of cruelty and violence? Today, researchers are uncovering the hidden forces that inflame our inner demons, looking for ways to neutralize our deadliest urges and change human nature. Can we eliminate evil?

Space, time, life itself. The secrets of the cosmos lie through the wormhole.

♪ Through the Wormhole 03x07 ♪ Can we eliminate evil? Original air date: July 18, 2012
== sync, corrected by elderman ==

Few people consider themselves evil, yet evil seems an inescapable part of life. The mystery is -- why? For millennia, we blamed the devil -- a creature of darkness that made us do terrible things. Today, most Christians believe Satan is just a symbol. Psychologists and brain scientists have shown us that the evil we fear comes from within ourselves. Will it always be there, or can science find its roots and destroy it?

When I was about 9 years old, we moved back to Chicago. Being new in the neighborhood, I became the target of a bully. One day, I decided enough was enough. But as I watched him lying there on the ground, I found I just couldn't savor my unexpected victory. I wondered, what made this kid so mean? Are some people just born bad?

[ Cracking ] [ Grunts ]

Man: And cut! So, what I need you to do for the next one is really curl out a bit more. I need to see more pain in there.

Freeman: In Amsterdam, neuroscientist Christian Keysers is looking for the source of human cruelty.

Okay, so I really want to seem as I'm fighting, where...

Freeman: Christian investigates empathy -- our ability to identify and respond to what someone else thinks or feels.

Okay, that was good.
To find out how empathy works in our brains, Christian makes short films of painful acts to screen for test subjects.

We need very controlled stimuli, where we just see two or three seconds of pain, and we need to repeat many of them, which is why we need to make them ourselves. So our actors are typically our graduate students and postdocs, because they know what they're doing and they can take some pain.

Freeman: Christian screens his torture films in a theater unlike any on earth. Magnetic sensors inside this fMRI machine will peer deep into this man's brain, showing which areas are active when he experiences empathy.

Okay, so I'll give you this button box. What I want you to do is, each time, to rate what you felt in this particular trial.

First, Christian records what happens in the subject's brain when he sees someone else in pain.

[ Cracking ]

Then he measures what happens when the subject experiences pain first-hand. Now he compares the brain scans.

So, the emotional empathy we've been studying here, you would mainly see in parts of your brain that are not on the surface of your brain, but inside of the insula that's a little bit deeper here, and, really, in the midline between your two hemispheres. Okay, so what you're seeing here is basically in red -- the brain activity that happened while we were hitting the subjects in the scanner. And then here, you see two of the emotional-brain regions. They really add this feeling of unpleasantness to what you feel. So, they're telling you, kind of, "Ouch, I don't like that." And so what we're seeing here, in the bottom, is the brain activity that happens while the subject was watching somebody else's pain. All of these more emotional areas get reactivated, as if the subject had been feeling pain himself. Whenever you see the pain of somebody else, you will share it inside of yourself. The other person becomes part of yourself. The pain of others is not just something you see out there. It basically comes inside of you, and it becomes your pain, as well.
Freeman: After screening hundreds of people, Christian believes that empathy is hard-wired into nearly all of our brains, but it is not distributed equally. There is a curve of empathy. Some people are extremely empathetic, others feel almost nothing. Think of a romantic movie. Most of us get caught up in the emotions on screen.

But -- but why? This has to be the end. Goodbye.

[ Sobbing ]

Freeman: But for a few of us, it plays like this.

[ Ship horn blares ]

People with low empathy see and hear things differently because their brains work differently. Information flows through most brains like boats move along the canals of Amsterdam. But in some brains, that movement is impeded by narrow, or blocked, channels.

So, if you imagine that back there, you would have the visual-brain areas that see what happens to others, and down there, you would have the emotional areas that normally feel your pain. And we think that what makes the difference, basically, between a very empathic person and a less empathic person is just the size of the canal that brings the information from the visual part of your brain to the emotional part, in which you will share the pain of other people.

Freeman: And what of the monsters of our nightmares -- the psychopathic killers who look normal on the outside...

[ Camera shutter clicks ]

...but are twisted on the inside? It is often said that psychopaths have no empathy, and this lack of empathy makes them evil. They are able to torture and kill because they can't relate to other people. Christian disagrees.

Keysers: Well, I think the finding that surprised us most was actually the study on psychopaths. We went in there with a simple idea that evil people, like psychopaths, would just lack empathy, and what we actually saw in this study is that what makes them evil is more complex.
It's not that they lack the capacity for empathy -- they just don't use it spontaneously. But if they want to, because, for instance, it serves the purpose of fooling somebody into giving them all their money, then they're quite able to empathize and really get into people's minds.

[ Gong crashes ]

Freeman: So empathy is not everything. To keep from falling into evil, we also need a moral system to guide our behavior -- a code of conduct that helps us fit into society and act in a non-destructive way. Scientists Karen Wynn and Paul Bloom of Yale University believe that moral code may be written into us at birth.

That is a lot of duck you're fitting in your mouth.

I've been studying babies now for just a little over 20 years. The more that I see of them, the more complex they become. There is a lot going on in there, and it's far more rich and complex of a mental life than we had ever thought.

Bloom: By studying babies, you get to see humans before they're contaminated by culture, by television, by a lot of social interactions, by sex and romance. You get to see humans, in some sense, in their purest form, and you could ask, "What's our natures? Are we kind? Are we cruel? Are we morally intelligent? Can we tell good from evil?" And the work I'm doing here with my colleague suggests that very early on there's some fundamental moral sense -- some moral instinct that's present in all of us.

Freeman: How do you pose moral questions to a baby? Karen devised a kind of morality play for babies to watch and judge.

We show babies a little puppet show in which this one puppet is trying to open a box, and he's trying and he's trying and he just can't quite get it on his own, and another puppet comes along and grabs the other side of the box lid and helps him open it. They then see the little puppet. He's trying again to open the box, and a different puppet comes along and jumps on top of the box lid, slams it shut.
Oh!

Wynn: And so our question to the babies is, "Babies, do you have any different feelings towards these two characters -- towards the one who helped, in a nice fashion, open the box, and towards this other, who just really, quite rudely, slams it down and foils this guy's attempts to get into the box?" Which one do you like? And we find that, very reliably, babies, even as young as five and six months of age, will reach towards and reach for the helpful puppet.

That one! [ Laughing ] Okay, good job!

Freeman: Layla has chosen the good puppet. Between 80% and 95% of babies do. Paul and Karen believe that this is a sign that babies are drawn towards kindness and away from antisocial behavior. But if most of us are born good, why do some of us turn out so bad?

Bloom: Well, there's all sorts of ways in which our sense of good could get perverted. If you're brought up in a culture which teaches you to be dismissive of others, which rewards selfishness, which rewards bad behavior, your sense of empathy could be blunted. So we have this built-in morality, but it's fragile.

Freeman: Once we descend into darkness -- assault, rape, murder -- are we lost, or can the impulses that lead to evil be squelched? This man thinks so. He believes we can strengthen our brains and crush the evil within.

Deep inside every human being is an animal -- a creature whose only goal is survival. Most of us can contain the animal within, but we all know people who yield to their baser impulses. Sometimes their actions have terrible consequences. We blame these people for their evil acts, but do they really have a choice? After years of probing the human mind, neuroscientist David Eagleman of the Baylor College of Medicine has come to a startling conclusion -- with a little bad luck, we could all become monsters.
Along any axis you measure brains, whether that's empathy or intelligence or aggression, you find a big distribution. Not all brains are the same.

Freeman: You are your brain, and your brain is a delicate, highly complex apparatus. Injury or disease can alter its chemical balance and physical integrity, which can alter your personality.

If you were to damage your thumb in an accident, that wouldn't change you as a person, but if you damage an equivalently sized chunk of brain tissue, that can change your risk taking, your decision making, and even, perhaps, whether you become a murderer. In 1966, Charles Whitman climbed to the top of the tower on the U.T. Austin campus, and he indiscriminately shot 48 people. The only thing that matched the horror of this event was the unexpected nature of it. There was nothing in his history that would have predicted this sort of behavior. He was an engineering student. He worked as a bank teller. He lived with his wife and his mother-in-law. So what could explain this? Well, in his suicide note, he said, "When this is all over, I want an autopsy to be performed." And what they found in his brain was a tumor about the size of a walnut, and it was pressing on a region of the brain called the amygdala, which is involved in fear and aggression.

Freeman: The amygdala is the center of emotion -- the source of our primal desires. It is held in check by the frontal lobes and the temporal lobes -- the centers of self-control. We all have subconscious demons that we keep in check, but when the frontal or temporal lobes are compromised, startling behaviors can emerge -- behaviors we call evil. This is what probably happened to Charles Whitman. He sensed that something was wrong with his brain. He could no longer control his violent impulses.

And one of the battles that humans have to fight is short-term versus long-term decision making.
We have impulses that we want to gratify, and we have longer-term thinking that tries to squelch those impulses. Let's say that I'm considering throwing this brick through this window -- part of me maybe wants to do it, part of me feels it's an illegal act and I'll get caught and I'll get in trouble. And it's an arm wrestle between these different things. Some people are better at this than others.

[ Glass shatters, static ]

Freeman: David believes we can strengthen our willpower with a little workout. Together with neuroscientist Stephen LaConte, he is testing something called "the prefrontal gym." There are no treadmills here, just a scanner that lets people see how their brains respond when they flex the mental muscles that govern self-control.

You're going to hear some buzzing.

Today, David and Stephen are conducting their first-ever test on a criminal offender, a man whose cocaine addiction led him to steal from his friends and family.

Right now, what this gentleman is doing is he's looking at images of drug-use cues, and we're asking him to either enhance his craving to these cues or suppress them.

[ Beeping ]

When the addict sees images of drug use, his own craving for drugs spikes. The fMRI scanner sees this increased activity in the brain and displays it as a measurement on a bar. When the craving networks in his brain are revving high, the bar moves to the red, but when he fights his dangerous urges, he can push the bar back toward the blue. With this biofeedback, he's able to train his brain to resist his impulses.

He's doing great. I mean, he's actually...

Eventually, David and Stephen hope to take this technology to prisons to try to help criminals not repeat their mistakes.

The beauty about the prefrontal gym is that people are helping themselves. If they choose to strengthen their long-term decision making, this is the way they can do that.
It doesn't change anything about the person, it just gives them a better opportunity to make good long-term decisions.

Freeman: But there are some for whom this technique may never work -- psychopaths. They can pass for normal, but they are capable of terrifying acts of evil. Soon, a revolution in brain science may give us the tools to spot evil brains before they ever commit a crime.

Psychopaths can inflict physical and psychological harm on others without feeling a shred of remorse. They are the people most of us consider evil, and there are more of them than you might suspect -- up to 3% of the population. That's a lot of dangerous minds. What if one of them was yours? If something was wrong with your brain, how would you know? If Jim Fallon got a look inside your head, he could tell you. He has spent his career studying the anatomy of the brain, with an emphasis on psychopathic killers.

Fallon: Six years ago, two of my colleagues in psychiatry brought me a whole bunch of these scans. So we're doing PET scans but also some fMRIs, and about 3/4 of the way through, I notice a very definite pattern. And it turns out that these were scans of really bad killers -- serial killers, and very violent killers.

Freeman: Jim has identified the unique brain structure of psychopathic murderers.

Here are the areas of the brain -- amygdala, anterior temporal lobe, orbital cortex, medial prefrontal cortex, cingulate, back here to the hippocampus, back down -- see, it makes a big loop. These are the areas that are turned off in psychopaths.

Freeman: Our brain anatomy radically affects how we see the world.

How a normal person would see the world, it would be like driving around in this car. A normal person would be watching their speed. They would be putting themselves in other people's shoes. How fast would you go? What if you had kids here? And you'd be looking at people, they'd be looking at you, nothing to hide.
The world of the psychopathic mind is just quite different. It's like driving around in this dark car at night. Now I'm protected from people seeing who I really am. As a psychopath, one would look out and you'd see these forms walking around, and they're no longer people. And so, in this way, you know, the psychopath is able to use the night. That is, the night of not connecting with empathy and emotion with other people, but seeing them as objects to use and to, if they get in the way, just run them over.

Freeman: Jim estimates that at least 40 genes contribute to antisocial personality disorders and psychopathic brain patterns. These genes influence whether you're violent, narcissistic, or homicidal. So, if you have the genetics of a killer and the brain anatomy of a killer, are you destined to become a killer? For Jim Fallon, this question was about to become uncomfortably personal. Worried about Alzheimer's disease, he decided to run brain scans on his entire family. All of the tests came back fine -- except for one.

So I was comparing at that time all these brains of killers, and I had these sheets that I was analyzing on my desk, and I thought they had gotten mixed up. That is, I thought one of our family's patterns was mixed up with the murderers', 'cause it looked just like the murderers' brains. And, of course, it turned out to be my brain.

Freeman: Jim's brain showed the telltale psychopathic coldness around the amygdala and the orbital cortex.

Fallon: When I first saw this, I actually just kind of laughed. You know, I took it as like it was funny -- a little bit in denial. And it was a little confusing, but I thought I took it pretty well.

Freeman: Next, Jim analyzed his genetic profile and family history. He found that he had inherited dozens of high-risk genes and had ancestors who had been convicted of murder. Then he asked his family and friends if he showed psychopathic traits.
Fallon: They said, "Well, Jim, we've known all along you're a psychopath -- you just don't really hurt anybody. You play with everybody's head, you manipulate people, you're too competitive, you got to win everything." You know, all this stuff. They said, "But, you know, but you're funny, and, you know, you don't swing at people, you don't do any of that, so we just let it go, but everybody knows you're a psychopath."

Freeman: So which is the real Jim? The esteemed scientist and life of the party, or the dangerous man revealed by the brain scans?

Fallon: I kind of thought I really knew myself, and so I became very confident -- that I was interested in the brain, I was studying it, I felt confident in myself. When this happened, you know, when I was 60, that was a shock, actually, when I finally accepted that I wasn't who I thought I was.

Freeman: Jim discovered an unsettling truth, but he was left with a mystery. If he has the brain and genes of a killer, why isn't he a killer?

But given all the, you know, the genetic risk factors and how my brain is, you know, where it's kind of stuck, as it were, I look like I really dodged a bullet. And it was because my parents and my aunts and my uncles and my grandparents are the people who really kept me happy, that's for sure. Talk about nature/nurture -- that's when nature/nurture was really happening, in a very positive way.

Freeman: High-risk genes and unusual brain architecture do not automatically create killers. Childhood abuse seems to be a critical ingredient. A loving home helped Jim Fallon become a boisterous overachiever, not a dangerous psychopath. Not all psychopaths are violent criminals, but what do we do with the ones who are? Do we simply remove them from society and throw away the key? Perhaps not. Researchers are now pioneering a radical way to eliminate evil -- by literally zapping it out of your brain.

Some religions hold that man is a creature of evil.
We may struggle to follow the righteous path, but ultimately, we will fail. We are all sinners. But what if we could make people good?

In Zurich, Switzerland, Dr. Christian Ruff is blazing a trail on a controversial frontier of neuroscience -- changing the way people think and behave.

Ruff: Human behavior is quite unique in the animal world. In contrast to other animals, we don't just follow our self-interest, but we're able to actually control our behavior in line with social norms and rules -- pretty much like the rules of this card game that we're playing here. People are always tempted to break rules, to break laws, and the problem really is that if some people start doing this, if some people start breaking out of social norms, then very soon, chaos ensues.

[ Crowd screaming, siren wails ]

So it's quite important for society to put in place strong punishment threats -- to basically instill in people's heads the knowledge that if they violate certain norms, certain laws, then they will get punished.

Freeman: What do we do with people who cannot or will not follow the rules of the game? Christian's solution is to zap their brains with electricity. These players are linked together in an interactive video game. They all wear headbands designed to pass electrical current into the parts of their brains that control altruism, or concern for the well-being of others. With a targeted electrical pulse, Christian has found that he can make people, including himself, much more considerate.

Ruff: In the beginning, there's a slight tingling underneath the electrodes at the scalp for about 30 seconds or so, but afterwards I can't really feel it anymore, whether I'm being stimulated or not.

Freeman: Today, Christian and the volunteers are going to play a simple profit-sharing game. Each player is allotted a sum of money and decides how much to give an anonymous partner.
You have to decide on every trial what your opponent -- the other person -- will consider to be fair. Here in this example, for instance, I can decide now the white is what I keep and the black is what I give to the other person. So I keep a lot -- 70%. And, in this condition, the other player can now punish me. Oh, and that's what they did. They took everything away from me for being so selfish.

Freeman: At first, most of the players act selfishly, but then the electricity begins to flow. After five minutes of brain stimulation, Christian and the other players are now much more willing to compromise.

[ Beep ]

This is another punishment trial, actually, so I'm going to give a bit more now, actually, to the other person. And let's see whether they punish me or not. Oh, no, okay. So I get to keep what I basically chose for myself.

Freeman: The headbands have coerced the players into being nice. These behavioral changes are temporary. They persist for about 20 minutes after the stimulation stops, but Christian believes that repeated treatment will condition people to act kindly. Could this technology be used in jails and mental hospitals to suppress evil thoughts and turn criminals back into good people?

Ruff: I think we're definitely not at the point yet where we can employ these methods to make people who commit very selfish acts that harm others not commit them. But by understanding these brain processes and how we can affect them with brain stimulation, we might be getting there one day. It's definitely not too far away.

Freeman: Neuroscientist Jim Fallon agrees. In fact, he believes we already have the means to do it with drugs.

The big question is, can we control those behaviors that we consider evil in people? And the answer's probably "yes." It depends on how far you want to go. What one could do is just simply snort, intranasally -- up the nose -- different compounds.
And so let's say one has a problem with impulse control. Well, impulse control happens to be that area -- orbital cortex -- right above where the smell receptors are, so it's the first thing that's hit. So one could simply put in pieces of DNA that will be snorted in and concentrate at the orbital cortex that will increase those neurotransmitter systems that increase the function of the area and, therefore, inhibition. We could decide to do it so that everybody has their own cocktail of behavioral modification. It will only last a certain amount of time. This is something that society could do. It sounds a little wild, but it's completely doable.

Freeman: We may soon have the means to reshape damaged brains and stop violent behavior before it starts. But this neuroscientist thinks eliminating evil will take more than peering into the heads of criminals. We must also probe the minds of those who judge them.

Today, we tend to punish criminals more for the harm they inflict than for their evil intent. That's understandable. It is much easier to count bodies and bullet holes than determine what was going on in someone else's brain. But what if we could peer into criminal minds and judge them on the evil we find there? That day may be coming soon, when the true motives of not just criminals, but also the people who judge them, are laid bare.

As one of the few people in the world who is both a biologist and a lawyer, Owen Jones has a unique view of criminal justice.

Objection, Your Honor.

This mock courtroom is part of Owen's laboratory, a place where he explores what goes on in the minds of criminals, judges, and juries.

Jones: So, we're seeing a lot of increasing effort to bring neuroscience into the courtroom, for better or for worse. Sometimes, for example, criminal defendants may be bringing evidence of their own brain scans to try to avoid conviction altogether -- to say "I should not be held responsible."
Freeman: But Owen's focus is not so much on criminal brains as on the brains of the people who determine their guilt. And he's finding that the ways we judge evil behavior are severely flawed.

Jones: Our legal system requires jurors to be amateur mind readers. They're supposed to figure out not just who did it, but what was the mental state of the person who did it?

Do you remember the police coming to your house later that night?

Yeah. They woke me up about 3:00 a.m.

Freeman: Owen has found that jurors are not good at distinguishing the gray areas of criminal intent. Emotional circumstances will bias their decisions. Say two men drive home drunk from the bar.

[ Tires squealing ]

One hits a tree. The other hits a tree... and the little girl in front of the tree. The first man will get a light sentence. The man who killed the girl will go to prison. The question is, for how long? Owen has found that jurors are likely to give this man the stiffest possible sentence.

Jones: Jurors have a tendency to think that the driver had a higher level of intent -- a knowing level of intent, instead of a reckless level of intent, for example -- than he actually did.

Freeman: In other words, even though the two drivers had exactly the same level of intent, jurors will believe the driver who hit the girl was more evil than the other driver. When emotions dominate, judgments are harsh. At other times, jurors will shut off their emotions completely and inexplicably excuse murderous intent.

Jones: Suppose, for example, that I want to poison my friend Amy, causing her death -- and I believe her to be very allergic to poppy seeds. I sprinkle poppy seeds liberally, and I serve it to her. Unbeknownst to me, Amy's not allergic to poppy seeds, and so she does not die. But let's vary the circumstances a little bit.
Suppose that, although she's not allergic to poppy seeds, Amy is very allergic to peanuts, and unbeknownst to me, who wants to kill her, the chef in the kitchen puts peanuts on her salad. If she dies as a consequence, a lot of people will start to think, "Wait a second, I shouldn't be punished for attempting to murder her because I didn't actually cause the harm that befell her." So in a way, the fact that somebody else caused her death operates as a shield to my liability and punishment.

Freeman: Once again, the criminal has been judged on results, not on his intentions. Owen suspects there is a neurological explanation for this. To find out what happens in the brain when we try to gauge levels of evil, Owen has put judges and jurors into brain scanners and presented them with criminal scenarios like these. He first discovered significant activity in a region called the dorsolateral prefrontal cortex. It governs analysis and cognition.

This part of the brain seems to be doing a lot of the heavy lifting in deciding whether or not to punish someone at all.

Freeman: But once the prefrontal cortex decides to punish, another part of the brain decides how much -- the amygdala, which governs our emotions.

The punishment decision is a product of two very different regions -- one highly analytic, one more emotional that is setting a punishment amount -- that are separately deployed but yet jointly involved in yielding the punishment decision.

Freeman: Balancing the emotional and analytic parts of the brain is the magic trick required of every judge and jury. If jurors have reduced function in either area, their punishment decisions could be flawed.

Jones: Research like this may enable us to de-bias decisions really focused on those aspects of a person's behavior that we want to take most into account when setting degrees of culpability.
Freeman: Owen hopes his findings will eventually lead to fairer sentencing -- that we will eventually have a legal system that only imprisons people who truly want to do harm to others... rather than those who simply made tragic errors. But even if we improve the way we judge criminals, it will not be the endgame in the struggle to eliminate evil. Because, as history bears witness, sometimes entire societies lose their moral compass. How do we stop the evil that poisons whole nations?

We know that evil can twist and bend solitary minds, but there's another form of evil that infects whole societies. It compels ordinary people to support genocidal regimes and economies based on slavery. What makes societies turn bad? Can we stop it from happening?

Karen Wynn's experiments at Yale show that even babies have a sense of good and evil and seem to prefer goodness. But Karen runs another experiment that is far less comforting. It shows that the human tendency to identify with groups and discriminate against those not in "our" group starts very young.

In this study, we present the baby with two food choices -- graham crackers and, say, green beans. Then, we bring babies into our experimental room, and they're introduced to two puppets. And each of the puppets gets a choice between graham crackers and green beans.

Mmm! Yum! I like graham crackers!

The other puppet shows the opposite preferences.

Ew! Yuck! I don't like graham crackers.

What we find, quite reliably, is that babies tend to choose the puppet who expressed the same tastes as they themselves did.

Which one do you like?

That one!

Okay, good job! Does he get a hug? [ Laughs ]

Freeman: Babies not only prefer puppets that agree with them, they also like to see puppets that don't agree with them get punished.

Hi.

The puppet who doesn't like graham crackers has become an outsider -- not part of the baby's group.
744 00:37:30,527 --> 00:37:33,261 My heart felt sad when I got that result 745 00:37:33,263 --> 00:37:37,031 because it did tell me that this preference for similarity 746 00:37:37,033 --> 00:37:38,433 that we're observing 747 00:37:38,435 --> 00:37:41,035 in babies under a year of age 748 00:37:41,037 --> 00:37:45,539 isn't just a trivial, superficial, fleeting thing, 749 00:37:45,541 --> 00:37:47,474 but it is having consequences 750 00:37:47,476 --> 00:37:50,010 across their psychological terrain, 751 00:37:50,012 --> 00:37:53,413 in terms of how they think about these characters, 752 00:37:53,415 --> 00:37:56,983 what they expect of them, their perceptions of them, 753 00:37:56,985 --> 00:38:01,154 and also how they want them to be treated in the social world. 754 00:38:01,156 --> 00:38:05,125 Freeman: Karen believes our brains are built to care more 755 00:38:05,127 --> 00:38:07,961 about people close to us, in our group, 756 00:38:07,963 --> 00:38:10,197 than those further away. 757 00:38:10,199 --> 00:38:13,633 And we segregate ourselves along lines drawn as simply 758 00:38:13,635 --> 00:38:16,436 as whether or not you like graham crackers 759 00:38:16,438 --> 00:38:19,239 or have the same color skin. 760 00:38:19,241 --> 00:38:23,911 Sometimes this can have very bad consequences. 761 00:38:23,913 --> 00:38:26,213 [ Shouting in German ] 762 00:38:26,215 --> 00:38:28,314 [ Crowd cheers ] 763 00:38:34,788 --> 00:38:38,291 Freeman: Steve Pinker is an experimental psychologist 764 00:38:38,293 --> 00:38:40,826 and cognitive scientist at Harvard. 765 00:38:40,828 --> 00:38:44,030 Pinker: A lot of the worst atrocities in history 766 00:38:44,032 --> 00:38:45,765 came about when one group 767 00:38:45,767 --> 00:38:48,134 dehumanized or demonized another. 768 00:38:48,136 --> 00:38:50,636 They may have thought that they were subhuman, 769 00:38:50,638 --> 00:38:53,472 that they were like vermin, like rats or cockroaches. 770 00:38:53,474 --> 00:38:55,407 A lot of moral progress might come 771 00:38:55,409 --> 00:38:56,875 when we change our mind-set, 772 00:38:56,877 --> 00:38:59,178 and instead of dividing people into groups, 773 00:38:59,180 --> 00:39:01,346 think of the species as a group. 774 00:39:01,348 --> 00:39:04,350 Think of the world as being one big village 775 00:39:04,352 --> 00:39:06,685 and everyone is part of our tribe. 776 00:39:06,687 --> 00:39:10,155 Freeman: Steve thinks this change in thinking, 777 00:39:10,157 --> 00:39:12,892 from identifying with small groups 778 00:39:12,894 --> 00:39:17,196 to belonging to one inclusive society, is slowly happening. 779 00:39:17,198 --> 00:39:21,835 The most obvious effect has been a dramatic decline in violence. 780 00:39:21,837 --> 00:39:24,939 Pinker: When I tell people that violence has been in decline 781 00:39:24,941 --> 00:39:26,273 for long stretches of time 782 00:39:26,275 --> 00:39:29,477 and that we're probably living in the most peaceful era 783 00:39:29,479 --> 00:39:32,447 in the history of our species, they think I'm nuts. 784 00:39:32,449 --> 00:39:35,984 So I had to make the case with a book that was 800 pages long, 785 00:39:35,986 --> 00:39:39,587 with graph after graph and statistic after statistic, 786 00:39:39,589 --> 00:39:42,090 just to prove the point to people. 
787 00:39:47,497 --> 00:39:49,132 For example, not far from where I'm standing, 788 00:39:49,134 --> 00:39:50,600 there's an area of Boston 789 00:39:50,602 --> 00:39:52,968 that used to be called the "Combat Zone" 790 00:39:52,970 --> 00:39:55,003 because there were so many murders 791 00:39:55,005 --> 00:39:56,572 and stabbings and muggings. 792 00:39:56,574 --> 00:39:59,741 Now it's being re-colonized by young urban professionals. 793 00:39:59,743 --> 00:40:02,844 Or a few hundred years ago, there were wars going on, 794 00:40:02,846 --> 00:40:04,279 not far from here, 795 00:40:04,281 --> 00:40:07,382 that involved enemies like Canada and Britain and France. 796 00:40:07,384 --> 00:40:09,952 Now the idea of a war with those countries 797 00:40:09,954 --> 00:40:11,586 would seem like a bad joke. 798 00:40:11,588 --> 00:40:12,588 300 years ago, 799 00:40:12,590 --> 00:40:14,556 I might have been burned at the stake 800 00:40:14,558 --> 00:40:16,158 for beliefs that I hold today, 801 00:40:16,160 --> 00:40:17,827 and a hundred years before that, 802 00:40:17,829 --> 00:40:20,363 I might have had my head cut off with a hatchet 803 00:40:20,365 --> 00:40:21,498 in the Indian wars. 804 00:40:21,500 --> 00:40:24,300 Freeman: It is difficult to overstate 805 00:40:24,302 --> 00:40:27,271 just how violent and cruel the world was 806 00:40:27,273 --> 00:40:30,408 for much of the history of the human race. 807 00:40:30,410 --> 00:40:34,712 Once, slavery was legal everywhere in the world. 808 00:40:34,714 --> 00:40:39,350 Now it is officially illegal everywhere in the world. 809 00:40:39,352 --> 00:40:42,387 War and murder were daily facts of life. 810 00:40:42,389 --> 00:40:46,090 Now, for most people, they are exceptional events. 811 00:40:46,092 --> 00:40:48,893 Steve attributes most of this change 812 00:40:48,895 --> 00:40:52,697 to the increased role of government and the rule of law. 813 00:40:52,699 --> 00:40:54,099 But he also thinks 814 00:40:54,101 --> 00:40:57,402 people today are sharper than their ancestors. 815 00:40:57,404 --> 00:40:58,837 Pinker: You might wonder, 816 00:40:58,839 --> 00:41:02,007 are people getting nicer because they're getting smarter? 817 00:41:02,009 --> 00:41:04,743 And, believe it or not, the answer is, maybe yes. 818 00:41:04,745 --> 00:41:08,915 I.Q. scores have been increasing throughout the 20th Century 819 00:41:08,917 --> 00:41:10,617 and all over the world. 820 00:41:10,619 --> 00:41:12,452 No one knows exactly why, 821 00:41:12,454 --> 00:41:16,322 but it's probably a combination of increased schooling 822 00:41:16,324 --> 00:41:19,091 and a trickle down of technological 823 00:41:19,093 --> 00:41:23,029 and analytic concepts from science into everyday life. 824 00:41:23,031 --> 00:41:26,398 But, as a result, it's not farfetched to think 825 00:41:26,400 --> 00:41:29,901 that people could see the benefits of cooperation 826 00:41:29,903 --> 00:41:32,770 and see the downsides of violence more 827 00:41:32,772 --> 00:41:36,107 as they start to intellectualize their lives. 828 00:41:36,109 --> 00:41:40,212 Freeman: Mass communication and mass transportation 829 00:41:40,214 --> 00:41:43,716 are breaking down the barriers between us, 830 00:41:43,718 --> 00:41:46,686 and so is our increasing knowledge 831 00:41:46,688 --> 00:41:48,989 of how the human mind works. 832 00:41:48,991 --> 00:41:53,861 The more we learn, the more we see the humanity within us all, 833 00:41:53,863 --> 00:41:56,363 even those we think of as evil. 
834
00:41:59,334 --> 00:42:04,938
A world without a trace of evil will remain a fantasy,

835
00:42:04,940 --> 00:42:08,375
but the better we understand the brain,

836
00:42:08,377 --> 00:42:10,644
the better able we are to identify

837
00:42:10,646 --> 00:42:12,780
the most dangerous among us

838
00:42:12,782 --> 00:42:16,383
and stop them before they do serious harm.

839
00:42:16,385 --> 00:42:18,819
We may never eliminate evil,

840
00:42:18,821 --> 00:42:22,323
but perhaps we can contain it

841
00:42:22,325 --> 00:42:26,027
and reduce the damage it does to our lives.

842
00:42:26,692 --> 00:42:30,692
== sync, corrected by elderman ==
