English transcript for Year.Million.S01E04.Mind.Meld.1080p.WEB-DL.DD5.1.H.264-NTb (subtitle track 3)

NARRATOR: Imagine deep in the future, you and your loved ones have carved out time to take in a concert. But this isn't just your average jam session. Let me take you behind the scenes. This is what's actually happening. You and all your fellow concert-goers, your brains are hardwired together. Still not getting it? For the price of admission, when all of your brains are connected, you can be the performer, the audience, the orchestra, the vibrations in the air itself. You hear a melody, it sparks an emotion, a memory. Down the rabbit hole you go. Imagine literally teleporting yourself to that moment in time and actually living the experience. Welcome to your future, in the hive mind.

NARRATOR: It's the deep future. Your body, gone. You're all computer, all the time. Your brain is way more powerful than even a billion supercomputers. Jobs, food, language, water, even traditional thought, all of humanity's building blocks, all that's done. And you are immortal. Squirming in your chair yet? You should be. This isn't science fiction. Today's visionary thinkers say it's a strong probability that this is what your world is going to look like. Tonight, they'll guide you toward that spectacular future, and we'll see how one family navigates it, one invention at a time. This is the story of your future. This is the road to Year Million.

MICHAEL GRAZIANO: Human beings are the cooperative species. I mean, that's what made us so successful. There's no other species on Earth that can come together in such groups, instantly intuit what everyone else is thinking, and cooperate on large-scale projects.

NARRATOR: That's right, folks, communication is our superpower, whether it's through the spoken or the written word.
NARRATOR: Ever since that first Homo sapiens turned a primal grunt into an actual word, humans have used word-based language to make us the alpha species on this planet.

CHUCK NICE: Communication is going exactly where it's been going since the first primates started talking. Men will say, 'I don't want to talk about it,' and women will say, 'Why not?' Okay, okay, I had to do it. I'm sorry. I just had to do it.

NARRATOR: We forgive you, Chuck, but he does have a point. While word-based language may be humanity's superpower, when communication breaks down, it is also our kryptonite. Let's go back to scripture.

CHARLES SOULE: The Tower of Babel is one of those proto-myths in human society, suggesting that there was a time when humans all spoke the same language. They decided to get together and do the most incredible thing that they could, which was to build a tower so high that it would reach all the way to heaven. And so they start building this thing and it gets really tall. But then God looks over the edge of heaven and says, wait a minute, this is not what I want to have happen. And so he does something clever, because he is God. He makes it so that none of those people who are building the tower together speak the same language anymore. And the minute they stop being able to speak the same language, they can't work together anymore. And so the Tower of Babel falls, and never is able to be built again. I think it's such a brilliant explanation of the way that human beings think. If you could just talk to somebody and make sure that you were understood in a clear way, I think we'd be able to work together in a really beautiful way. I think that'd be incredible.

NARRATOR: You know where this is headed, right?

PETER DIAMANDIS: We're about to see this explosion in the way we communicate, and it's in these next 20 or 30 years that we are really plugging the brain into the Internet.
NARRATOR: We're headed to a future of pure, seamless, unadulterated communication that will enable levels of cooperation and intelligence that will make the Tower of Babel look like a Lego set. And the communication revolution has already begun. Billions are spent every year on communication apps. Twitter, emojis, and Google Translate are all breaking down language barriers. But these are just the first baby steps in the evolution of communication. Flash forward a few thousand years and traditional word-based language will be ancient history. We'll be communicating effortlessly and at the speed of light, and that will seismically transform the very nature of our existence.

This is how we'll do it. First stage: telepathy. Using tiny nanochips implanted in our brains and connected to an ultra-high-speed Internet, we will finally realize the dream of actual brain-to-brain communication. Opening our brains to one another will be a transformational moment in human communication, as well as the end of privacy as we know it.

But when we go there, we will be ready for the next step: swarm intelligence. Combining our high-speed connectivity with our brain-to-brain communication, we'll pool our diverse outlooks to exponentially boost our intelligence and work together in swarms to solve problems in groups that we never could alone.

We'll need it when we come face to face with alien intelligence. Figuring out how to communicate will require all of our ingenuity, and will only be possible because of our communication revolution.

And when we've mastered swarm intelligence and become super-intelligent beings, the final step in human communication will be when we merge our minds into a single consciousness. Like that super-trippy concert we just witnessed. We'll evolve beyond individuality and shed our very sense of self.
NARRATOR: And when we've united humanity into an enormous superintelligence, eliminating the barriers between us, then we can finally build on our limitless imagination, and everything will be possible. So, what will our Tower of Babel look like in the future? Stay with us and find out. But first, let's roll back the clock to witness the first stage, when we humans are just beginning to embrace telepathy.

[muffled chatter, laughter]

NARRATOR: This is what your future dinner party might look and sound like.

[muffled speech]

NARRATOR: Oops, I forgot. Since our brains really do think faster than we speak, let me slow it down so you can hear. Our brains work faster than our mouths.

EVA: Mm, it's good, it's really good.

MAN: Jess, your mom's a lightweight.

NARRATOR: How bizarre will it be when our mouths are used solely for eating and breathing? In the future, a dinner party will take place in total silence, because words as we know them will disappear. Tiny chips in our brains will enable us to communicate telepathically via the Internet.

WOMAN: So, Johnny, did you hear about the tsunami in Morocco?

WOMAN: The family that I saw today were really interesting.

WOMAN: The project on Europa, it's amazing.

NARRATOR: Some people, like Oscar here, will resist the tech implants, because they also come with a risk of hacking, but we'll get into that later.

OSCAR: So, how was your day?

MAN: You still haven't upgraded to telepathy?

NARRATOR: So how do we get to that telepathic dinner party in the future from where we are today? The key will be to find a common language that can connect all of humanity. I wonder where we'd find something like that.

NICE: I will say this, and I say it without compunction and with a great deal of confidence: the Internet is our universal language. It's already there.
NICE: And you would think that we would now have this incredible exchange of ideas and exciting means of transferring information, but instead what do we do? We send emojis that talk to us in little faces. Why? 'Cause they're cute and you can understand them.

NARRATOR: Call them cute or really irritating, the truth is there's no denying the emoji is part of a new grammar connecting people all around the world in a way never before seen in history. And it's just the beginning.

RAY KURZWEIL: We're going to connect our neocortex to a synthetic neocortex in the cloud. And I'm thinking that it will be a hybrid of our biological brains with the non-biological extension in the cloud.

NARRATOR: When our brains are directly linked to the cloud, then the dream of telepathic communication will finally be realized.

THOMAS WEBSTER: It's actually not so far-fetched, I think, if you think about it. If you're able to create nano-implants or nano-sensors to put in the brain of one person, and then put them in the brain of another person, that's a way to communicate that we have never really thought about in society so far.

ANNALEE NEWITZ: There are experiments now where we have brain-computer interfaces, which really does suggest that something like telepathy could exist.

BARATUNDE THURSTON: Oh, that's so dangerous. Oh, man, we're going to have so many broken relationships. And look at what people tweet. That takes effort. You have to launch an app, open up the compose window, tap out a message, and press send. And people still say things they regret, and lose their jobs over it and get divorced over it. Thinking to communication? That's real sloppy. That's going to be a hot mess.

NICE: You know, it's funny, because here's how telepathy is awesome: when you're the only person who has it.
[laughs]

NICE: When that's like your superpower and you're going around reading everybody's mind, but nobody can read yours. That's when telepathy is great.

NARRATOR: That's probably not going to be how it works. Telepathy will be accessible to everyone. But telepathy will definitely require some adjustments, and there will be growing pains. It won't be all sitting around the campfire singing Kumbaya.

N.K. JEMISIN: Lord help us, what if something like a Twitter mob existed for the mind? No, I can't think of anything more horrific. But if we could control it, if it was really just another way of connecting, like AOL, but instead of 'You've got mail,' it's 'You've got thoughts.'

THURSTON: On the other hand, it could lead to a more subtle set of interactions, because you would feel the weight not just of your words, but of your thoughts.

NARRATOR: That will be amazing. Think of it: instantaneous, immersive, empathic communication. Humanity will never be the same again. But it's not going to be easy. Wireless devices are manufactured to conform to an agreed-upon standard, like Bluetooth or Wi-Fi. But telepathy involves human beings, and we can't agree on anything.

ANDERS SANDBERG: The fundamental problem of brain-to-brain communication is that every brain is unique. When I think about the concept of a mountain, I might envision something like the Matterhorn, because I saw that as a child. And my neurons firing when I think 'mountain' might be very different from your neurons that fire. So I need to find a mapping, so that when my mountain neurons fire in a particular pattern, we can activate the right ones. That's a very tough machine learning problem.
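Sandberg's mapping problem has a concrete statistical shape. Here is a minimal sketch, assuming (hypothetically) that we could record paired activity vectors from two brains while both think about the same set of concepts; it fits a linear translation from one activity space to the other with ridge regression. Every name, dimension, and noise level below is invented for the demo.

```python
# Toy version of the brain-to-brain "mapping" problem described above.
# Assumption: paired recordings exist for a set of shared concepts.
import numpy as np

rng = np.random.default_rng(0)

n_concepts, dim_a, dim_b = 200, 64, 80   # shared concepts; "neurons" per brain

# Hypothetical paired data: row i is each brain's pattern for concept i.
brain_a = rng.normal(size=(n_concepts, dim_a))
hidden_map = rng.normal(size=(dim_a, dim_b))   # stands in for real physiology
brain_b = brain_a @ hidden_map + 0.1 * rng.normal(size=(n_concepts, dim_b))

# Ridge regression, closed form: W = (X'X + lam*I)^-1 X'Y.
lam = 1.0
W = np.linalg.solve(brain_a.T @ brain_a + lam * np.eye(dim_a),
                    brain_a.T @ brain_b)

# "Send a thought": translate a new pattern from brain A into brain B's code.
new_thought_a = rng.normal(size=dim_a)
predicted_b = new_thought_a @ W
print(predicted_b.shape)   # (80,), a pattern in brain B's coordinate system
```

Real neural decoding is far messier than a linear map, but this is the shape of the problem Sandberg names: learning a translation between two private codes from shared experiences.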
NARRATOR: And it's not just going to be a tough problem for machines to learn. It's also going to be a big adjustment for humans as well. Not everyone is going to be an early adopter. There will be those who don't want their innermost thoughts accessible to others. Or those worried about hacking, like Oscar, the patriarch of our future family, who hasn't yet bought into this newfangled technology.

JESS: Hey, did Sajani tell you their news?

EVA: No, what's up?

NARRATOR: Oscar is old school, but how long can he hold out when everyone around him is communicating telepathically?

JESS: Damon wants to marry another woman.

EVA: What? Does he want a divorce?

JESS: No, no, not at all.

OSCAR: What?

JESS: Apparently, they met someone they both like and...

OSCAR: What is it?

EVA: Nothing.

JESS: Nothing.

OSCAR: Okay, that's it.

JESS: Dad.

NARRATOR: At some point he might have to give in, just to join the conversation.

OSCAR: Telepathy upgrade, dosage Oscar. So, what's for dessert?

NARRATOR: Telepathy is on its way. When we connect our minds together, it will be a singular moment in human history, sending human intelligence into overdrive; a first step on our path back to that Tower of Babel. But it won't come without a sacrifice. Opening our innermost thoughts to each other will begin to blur the differences between us, and will be the beginning of the end of one of our most cherished possessions: our privacy.

NARRATOR: In the future we're headed to a world where Internet-enabled telepathy will connect us all brain-to-brain. This is going to catapult humanity forward on our journey to that future Tower of Babel. Just imagine the implications.

GEORGE DVORSKY: You know, just with the flick of a thought, you could start to engage in a conversation with someone miles away, and not even have to talk. That's going to change a lot of things in terms of the level of engagement that we have with others, and even the sense of intimacy that we will have with others.

DAVID BYRNE: Ooh, that's a risky place to go, I think.
BYRNE: Sometimes it might benefit both parties if you withhold a little bit of information and give it some time.

BRIAN BENDIS: Even as we're sitting here talking, you're thinking other things about me, right? You're lost in the little bald spot. Would you want to share that?

NARRATOR: Bald spot? What bald spot? I totally didn't even notice. Okay, yes, it's true, I was thinking about the bald spot. He has a point. There are definitely thoughts I would prefer not to share with the world. But in this newly hyper-connected world where we're communicating mind-to-mind, we may not have a choice.

DIAMANDIS: In 2010, we had 1.8 billion people connected. By 2020, 2025, that number is expected to grow to all 8 billion humans on the planet. So imagine a time in the near future where every single person on the planet is connected to the Internet. These 8 billion people with a megabit connection now have access at a brand-new level.

MICHIO KAKU: The Internet will evolve into brain-net. That is, we'll send emotions, memories, feelings. And that's going to change everything.

NARRATOR: So if the signal is coming from within our brain, how do we police what we unconsciously send? Our deepest thoughts and feelings could be hacked and littered across the Internet. Who will have access to these extremely personal and private parts of ourselves? How does that change how we communicate?

MORGAN MARQUIS-BOIRE: Privacy is at the heart of many treasured parts of the human experience, like love and family. It is a basic right of humans. The worry, of course, about the erosion of privacy is that it changes the nature of who we are as humans, and the nature of our relationships with others, in ways that are not positive. We become afraid to think certain thoughts, because we know that we're constantly being watched.

NARRATOR: You're worried about the government monitoring your search history?
NARRATOR: Well, in the future it won't just be your search history, it will be your entire history. And that raises a serious question.

ROSE EVELETH: Are we going to have privacy in the future? No. It's just going to be a different conversation.

THURSTON: We've been sold convenience and efficiency as a tradeoff for letting what is essentially surveillance, at scale, into our lives. So we get free information online by giving up our information online. And if every choice you make is mediated by an algorithm, from what you eat, to who you love, to where you go to lunch, then that's kind of a destruction of self, and a destruction of independence and free will, that gets very philosophical at that point.

EVELETH: In the most dystopian version of this, you're being surveilled all the time.

TREVOR PAGLEN: When you put that picture on Facebook, it becomes part of a body of data attached to that specific person and to the people around them.

NARRATOR: And that's just what's going on today.

ADAM HARVEY: You begin to feel very watched when you know how powerful computer vision is, in terms of extracting knowledge and building a narrative.

NARRATOR: In the future, when we communicate telepathically, technology might stop tracking key words or faces, and start tracking your thoughts and your feelings.

THURSTON: That's pretty terrifying.

PAGLEN: It's a question of freedom, and it's a question of rights.

NARRATOR: We risk becoming a kind of surveillance state in the future. Some say we already are. And as we spend more and more of our time online, watching and learning on the Internet, the Internet is also watching and learning about us. It's a two-way street. And that could be a frightening proposition.

[music]

PAGLEN: For a number of years in the studio, we've been developing tools to work with machine vision.

NARRATOR: Trevor Paglen is an artist in San Francisco, and he created this performance piece with the famed Kronos Quartet. It looks like a normal concert, right?
NARRATOR: Except it's anything but. These musicians are being watched, and not just by the audience. Cameras project the performers onto the screen and, using algorithms, interpret what they're seeing.

PAGLEN: You start to see a very sharp contrast between how you, as a human audience, are perceiving the performance, and how these machinic forms of seeing are perceiving the performance.

NARRATOR: As you can tell, it's not always completely accurate. But using facial recognition software, Trevor is using this performance to demonstrate how computers watch and record us. And just how easy would it be for computers to build a digital profile of us, based on algorithms that may or may not have our best interests in mind?
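The machine seeing that Paglen stages and Harvey warns about is already commodity software. As a minimal sketch, the snippet below runs OpenCV's stock Haar-cascade face detector over a single image; the filename is a hypothetical stand-in for one frame from a camera.

```python
# Minimal face detection with OpenCV's bundled Haar cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("concert.jpg")               # hypothetical camera frame
assert frame is not None, "supply a real image file"
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # the detector works on grayscale

# Each detection is a bounding box: (x, y, width, height).
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("concert_annotated.jpg", frame)
print(f"found {len(faces)} face(s)")
```

Going from boxes like these to the digital profile the narrator describes is just a matter of adding recognition and a database, which is exactly the step the interviewees find troubling.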
PAGLEN: My concern is that this very intimate quantification of everyday life adds up to an extremely conformist society. It's very easy to imagine a future in which you put a picture of yourself drinking a beer on Facebook, and that automatically translates into an increase in your car insurance.

NARRATOR: That's bad, but it could get much worse.

KAKU: Let's say a crime is committed, and the CIA or the FBI wants to have access to everyone who has an inclination to do something like the crime that was just committed. It scans a database of everybody, and then just prints out the names. And you could be totally innocent and your name could be picked out.

NARRATOR: You might be thinking of Edward Snowden right now. You can see how, unchecked, this kind of access and power is wide open to abuse.

NEWITZ: How do we make sure that this kind of technology isn't being used to categorize people unfairly? It leads right into issues around racial profiling, and it leads into issues around other kinds of profiling as well.

NARRATOR: And what about identity theft? Sure, it's a problem today, but when the Internet is part of your brain, hackers might not just be able to steal your data, they might also be able to hack your mind!

MARQUIS-BOIRE: Like anyone who's sat in a plane and worried about the hundreds of thousands of lines of code that keep them floating in the air, the exploitability of our digital lives is an ever-pressing concern in the back of my mind.

NARRATOR: These are serious issues with major repercussions that our newly connected society is going to have to grapple with, because the Internet isn't going anywhere.

BRYAN JOHNSON: In the history of the human race, we have never stopped the development of a technology. No matter how dangerous it is, we have never been able to stop it.

NARRATOR: The trend of human history is greater and greater connection. There's no turning back. We have opened Pandora's box.

DIAMANDIS: I see us going, over the next 30 or 40 years at the outside, from individuals, me and you, to a meta-intelligence, where 8 billion people are plugged in through the cloud, knowing each other's thoughts and feelings, and becoming conscious at a brand-new level.

NARRATOR: Our current notion of privacy will be ancient history. But that is a sacrifice we'll have to make, because we will gain so much more in our new open society.

SANDBERG: One could imagine a world where thoughts are flowing freely between different minds, but different minds are solving problems or looking at things from a different perspective.

NARRATOR: When we master telepathy and redefine privacy, we'll be on our way to that shining tower of the future. What comes next? A revolutionary form of communication that will unlock the collective power of our supercharged brains. Nothing can stop us when we launch human cooperation into overdrive with swarm intelligence.

NARRATOR: They say two heads are better than one. Well, in the future we're not going to settle for just two. Try 200, 2,000, 2 million!
NARRATOR: Powered by high-speed Internet connected directly to our brains, we'll all be communicating telepathically and working together at the speed of light. And that's going to blow the walls off what humanity is capable of. And whom does the hyper-connected, telepathic society of Year Million have to thank for their superpower? That's right: bees. Welcome to swarm intelligence.

['Flight of the Bumblebee' playing]

LOUIS ROSENBERG: Every year, bees go out and have to find a new home. And so what they do is they form a swarm. And that swarm will negotiate and find the best possible site among all the options. And what's amazing is that an individual bee can't conceive of the problem of finding the best possible site. But when they work together as a swarm, they converge on that best answer.

EVELETH: You might know a ton about Chinese geography, which I don't know anything about. And I might know a ton about krill, and you might not know anything about that. I actually do know a lot about krill. And then together, we're really good at Jeopardy, or whatever it is, you know. And that's kind of the idea, right?

THURSTON: Connectivity breeds connection. I think that there's something real powerful there, if thought becomes communication.

ROSENBERG: 'Cause ultimately, it's about collecting input from diverse groups.
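Rosenberg's bee story has a simple statistical core: many unreliable estimates, pooled, can beat any single estimate. The toy simulation below is an illustration only, not the bees' actual negotiation algorithm, and every number in it is made up.

```python
# Toy "swarm" demo: noisy individual judges vs. the pooled group.
import numpy as np

rng = np.random.default_rng(42)

true_quality = np.array([0.3, 0.8, 0.5, 0.6])   # four candidate nest sites
n_bees = 500

# Each bee sees every site through heavy noise, so alone it is unreliable.
estimates = true_quality + rng.normal(scale=1.0, size=(n_bees, 4))

solo_picks = estimates.argmax(axis=1)
solo_accuracy = (solo_picks == true_quality.argmax()).mean()

# The swarm pools all estimates before deciding.
swarm_pick = estimates.mean(axis=0).argmax()

print(f"a lone bee picks the best site {solo_accuracy:.0%} of the time")
print(f"swarm's pick: site {swarm_pick} (true best: {true_quality.argmax()})")
```

With 500 noisy judges, the averaged estimate almost always lands on the true best site even though each individual is barely better than chance, which is the "converge on that best answer" effect Rosenberg describes.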
NARRATOR: For animals, the key to swarm intelligence is rapid communication within the group. One of the things that's held humans back from tapping the full potential of swarm intelligence is our traditional word-based language. Powerful as it is, it's just too slow.

JOHNSON: Right now, we communicate at something like 40 to 60 bits per second via voice. But our brains can process information much faster.

ROSENBERG: If we're trying to solve problems and we work together as a system, we should find solutions to problems. Over time, as technology becomes more seamless, people will just think, and they'll think together as a system, they'll think together as a swarm, and they'll converge on answers that optimize the satisfaction of the whole population.

SANDBERG: Once we figure out the science of deliberately swarming people, we're going to unleash a tremendous form of collective intelligence.

NARRATOR: Swarm intelligence could well be the only way in the far future that we can compete with artificial intelligence. When we combine our collective brain power, it will be like millions of incredibly powerful computers uniting to solve the world's most pressing problems. Like how to house refugees whose homes have been destroyed by war, or by the effects of an increasingly warming planet.

[muffled voices]

NARRATOR: This is what swarm intelligence may look like in the future. Jess is telepathically swarming with other people around the world, trying to come up with a solution to a refugee crisis. But where are my manners? Let me slow this down again so your minds can process it. Really, this conversation happened in the blink of an eye.

MAN: Another tsunami in less than a month.

WOMAN: Not to mention the hurricanes in North America.

WOMAN: And the drought in Central Asia.

MAN: Climate change is wreaking havoc in our cities.

WOMAN: Millions of people have been displaced.

JESS: We have to do something to help them.

MAN: Can we stabilize the climate?

WOMAN: Eventually, yes, but in the meantime?

JESS: These people need homes. What can we do?

WOMAN: We redesign major cities.

MAN: Relocate to other planets.

WOMAN: Too much time.

JESS: There has to be an inexpensive and quick solution.

NARRATOR: Working together as a swarm, they're able to come up with a creative and fast solution to save the planet.

MAN: We need mobility.

JESS: Mobile.
MAN: Inexpensive.

WOMAN: Clean energy.

JESS: That's it!

NARRATOR: This is the future of communication and cooperation.

JEMISIN: We've got the potential to harness a tremendously democratizing force. A planet-wide e-democracy. Which would be awesome, if we can manage to do it in a way that's safe.

NARRATOR: Swarms of doctors could find cures for diseases faster than they could alone. Swarms of engineers could invent machines and build structures no individual could imagine. The bigger and more connected the swarm, the more powerful it could be. But like anything powerful, there is a dark side to the swarm.

NEWITZ: As we all know, big groups of people sometimes get together and, you know, do really dumb things.

NARRATOR: The Internet is already full of hackers and trolls. Now imagine a global swarm of snooping hackers, connected directly to your brain.

AMY WEBB: As with everything, there is the technology, and then there is the way that we use the technology.

MARQUIS-BOIRE: Technology acts as a power amplifier, right? It is neither inherently good nor bad. It's simply a tool that amplifies the desires of the individual or the institution.

WEBB: Part of our obligation as humans is to inject ourselves into the process.

THURSTON: We have to learn from the mistakes we've made with past technologies and with past human interaction.

NARRATOR: Swarm intelligence is powerful, and we'll have to make sure that we use our massive new intelligence to unite humanity, not to oppress others. But the true test of our elevated ability to cooperate will come when we encounter something other than ourselves. That's right: aliens.

KAKU: Let's say one day we're scanning the heavens, and we pick up a regular message. Not random noise, but a regular message.

[humming the tune from Close Encounters of the Third Kind]

SOULE: Something like that, right?

KAKU: That, of course, is going to be earthshaking.

NARRATOR: It certainly will be. How will we communicate with them? Understanding what extraterrestrials are doing in the skies above us will be a major test of our newfound communication skills. It may be the difference between survival and extinction.
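What would make a signal read as a "regular message" rather than "random noise"? One classic criterion: a repeating component concentrates its power at one frequency and shows up as a sharp spectral peak, while noise spreads flat. The sketch below is a toy with entirely invented numbers, not a real SETI pipeline; it buries a faint tone in noise and recovers it with an FFT.

```python
# Toy "regular message vs. random noise" check using a power spectrum.
import numpy as np

rng = np.random.default_rng(7)

fs = 1000                       # samples per second
t = np.arange(0, 2.0, 1 / fs)   # two seconds of "sky"

noise = rng.normal(scale=1.0, size=t.size)
beacon = 0.3 * np.sin(2 * np.pi * 123.0 * t)   # faint repeating tone (made up)
recording = noise + beacon

# Noise spreads its power across all bins; a regular tone piles up in one.
spectrum = np.abs(np.fft.rfft(recording)) ** 2
freqs = np.fft.rfftfreq(recording.size, 1 / fs)

peak_hz = freqs[spectrum.argmax()]
contrast = spectrum.max() / np.median(spectrum)

print(f"strongest component near {peak_hz:.1f} Hz, ~{contrast:.0f}x the noise floor")
```

Even though the tone is invisible in the raw waveform, the spectrum flags it immediately, which is the intuition behind Kaku's regular message standing out against the background.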
[music]

NARRATOR: As we trip the light fantastic down the path to Year Million, toward a time when we can build our very own tower to the heavens, communication will be completely redefined. But let's be clear about what we mean when we talk about the Year Million. We're not really talking about a specific year. It's our way of describing a future so different, so transformative, that it's just a glimmer on the horizon of our imagination. And only the boldest thinkers are able to see where we're headed. We just might make some astounding discoveries along the way.

KAKU: I'm going to stick my neck out and say that we will probably make contact with an extraterrestrial civilization. That's going to be one of the greatest turning points in human history. Every single historical account of the evolution of our species will have to take into account the fact that we have finally made contact with another intelligent life-form.

NARRATOR: And when that day finally arrives, the question is, what then?

EVELETH: You want to make sure that, like, they don't want to kill you. Very quickly, as quickly as you can, you want to make sure that they are not trying to kill and eat you, right?

NARRATOR: Yes, that would be tops on the list, I would imagine. Assuming we get past that, what next? It could be the greatest moment in human history, or not. After all, first contact could be 'Arrival' or 'Independence Day.'

THURSTON: I loved 'Arrival.' It's about feelings and communication.
EVELETH: 'Arrival' is a great example of being patient, trying to actually communicate, thinking about things scientifically, and not rushing into anything. When we're looking at these species, or whatever aliens come, that is a better method than just trying to blow them up.

NEGIN FARSAD: I like the idea of aliens being prettier than what we've thought of them. You know, we've kind of made them ugly over the last several decades. There's no need for that. They can actually be quite gorgeous, you know what I mean?

NARRATOR: That's a good point, Negin, but whatever they look like, let's just assume for the sake of argument that first contact with extraterrestrials goes more in the direction of 'Arrival.' Then our greatest challenge, what all of our supercharged intelligence will need to figure out, is how to communicate with them. But it's not going to be easy. Without an alien dictionary, where do we even begin?

KAKU: There are three features that we think intelligent alien life will have. First of all is vision, some kind of stereo vision, the vision of a hunter. Second is a thumb, a grappling instrument, a tentacle of some sort. And third, a language by which you can hand down information from generation to generation. But they're not going to communicate using American English, and they're not going to have subject, verb, predicate, the way we construct sentences.

NICE: I hope, I can only hope, that they all look something like Salma Hayek. That would be really good.

NARRATOR: We're talking about communication here, Chuck, let's stick to the subject.

EVELETH: What are the linguistics of this? What does this actually look like? When they're painting these weird circles, and we're using these weird lines and sticks, how do we figure out how to communicate with them?

NARRATOR: How we figure out the aliens' language will make or break us, and scientists are already working on it. How, you might ask? Well, by taking a page from Dr. Dolittle's book and starting right here with the animals on Earth.
NARRATOR: Dolphins, to be specific. At the National Aquarium in Maryland, Dr. Diana Reiss and her team are studying dolphins, and how one day we might be able to not just understand them, but communicate with them.

DIANA REISS: I got really interested in working with dolphins particularly because they were so different from us. These animals are truly non-terrestrials in every sense of the word.

ANA HOCEVAR: We're trying to understand how we could communicate with a completely different species that is as close to an alien as you can get, for a human.

NARRATOR: Ninety-five million years ago, dolphins and primates parted ways on the evolutionary chain. But like us, dolphins have big brains and a sophisticated social intelligence. So, as far as working with any other animal on the planet goes, there is none better suited to being a test case for learning to speak to aliens than dolphins.

REISS: What we did was create an underwater keyboard. It's like a big iPhone. And if you touch it, something happens. I want to give us this interface, a window where we can exchange things.

HOCEVAR: It really opens the door to understanding their vocalizations.

NARRATOR: Of course! An interspecies iPhone. Dr. Reiss and her team are hopeful that this technology will be a platform on which humans and dolphins can one day learn to understand one another.

REISS: Wouldn't it be amazing if, when they hit a key, it translated to English? You could hear that and you could respond.

NARRATOR: Amazingly, their efforts are already being rewarded. The dolphins have already figured out that if they touch the screen with their beaks, they get a reaction.

MATT MIRA: What are dolphins going to be talking about? 'Yeah, water's kind of warm today, huh?' 'Yep.'

SOULE: I hope that that's basically what they're saying. 'These fish are great. I love to swim. Let's jump.'

NARRATOR: Or maybe they're discussing dolphin politics and dolphin philosophy. We just don't know, but one day we might.
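As a rough sketch of how an experiment like Reiss's underwater keyboard works, the toy below maps each key to an object and a distinct whistle, and logs every touch with a timestamp so researchers can look for patterns afterward. The key names, whistle frequencies, and log format here are all invented, not the lab's actual setup.

```python
# Toy model of an "underwater keyboard" session logger.
import time

KEYBOARD = {
    "ball": {"whistle_hz": 8000},    # hypothetical key -> whistle mapping
    "hoop": {"whistle_hz": 9500},
    "rope": {"whistle_hz": 11000},
}

touch_log = []

def on_touch(key: str) -> None:
    """Play the key's whistle, present the object, and record the touch."""
    item = KEYBOARD[key]
    print(f"playing {item['whistle_hz']} Hz whistle and presenting '{key}'")
    touch_log.append((time.time(), key))

# Simulated session: a dolphin noses the same key twice, then another.
for key in ["ball", "ball", "hoop"]:
    on_touch(key)

print(f"{len(touch_log)} touches logged for later analysis")
```

The scientific payload is the log: consistent, non-random key use over many sessions is what would suggest the animals have learned the key-object-sound associations.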
783 00:36:15,940 --> 00:36:19,576 REISS: In a way, the touchscreen is a true window in itself, 784 00:36:19,611 --> 00:36:21,377 into the minds of these animals. 785 00:36:21,412 --> 00:36:25,081 I think technology can make what's invisible to us 786 00:36:25,116 --> 00:36:26,516 perhaps more visible; 787 00:36:26,551 --> 00:36:29,686 what's inaudible to us more audible. 788 00:36:29,721 --> 00:36:31,421 MARCELO MAGNASCO: If we were to succeed, 789 00:36:31,456 --> 00:36:34,057 we would succeed in actually 790 00:36:34,092 --> 00:36:36,859 not being alone in the universe anymore, 791 00:36:36,894 --> 00:36:38,995 which is a pretty sweet thought. 792 00:36:39,030 --> 00:36:40,597 NARRATOR: And that's why 793 00:36:40,632 --> 00:36:44,534 the Search for Extraterrestrial Intelligence, or SETI Institute, 794 00:36:44,569 --> 00:36:46,269 has been tracking Dr. Reiss's work with dolphins. 795 00:36:46,304 --> 00:36:49,439 The challenges she faces are the same ones they anticipate 796 00:36:49,474 --> 00:36:52,508 they'll face when aliens show up. 797 00:36:52,543 --> 00:36:53,843 We'll need a plan. 798 00:36:53,878 --> 00:36:57,547 And this technology just may be the answer. 799 00:36:57,582 --> 00:37:00,450 REISS: The technology that we're developing in these projects 800 00:37:00,485 --> 00:37:03,186 will enable us to see better, to hear better, 801 00:37:03,221 --> 00:37:07,590 to understand better, and to empathize more and care more, 802 00:37:07,625 --> 00:37:09,058 once we have that knowledge. 803 00:37:09,093 --> 00:37:12,962 You know, with knowledge comes great responsibility. 804 00:37:12,997 --> 00:37:14,264 NARRATOR: It certainly does. 805 00:37:14,299 --> 00:37:16,399 I couldn't have said it better myself. 806 00:37:16,434 --> 00:37:19,402 THURSTON: There is a potential for a shared connection 807 00:37:19,437 --> 00:37:21,404 and a shared mindset. 808 00:37:21,439 --> 00:37:23,539 FARSAD: With an interspecies Internet, 809 00:37:23,574 --> 00:37:25,041 we're sort of headed 810 00:37:25,076 --> 00:37:29,612 toward this kind of universal one-language scenario. 811 00:37:29,647 --> 00:37:33,750 So if you've ever wanted to talk to a walrus, 812 00:37:33,785 --> 00:37:36,753 in the future, you could do that. 813 00:37:36,788 --> 00:37:40,056 THURSTON: That could be pretty magical. 814 00:37:40,091 --> 00:37:41,591 NARRATOR: Pretty magical indeed. 815 00:37:41,626 --> 00:37:44,127 In the future, we may not agree with dolphins or aliens, 816 00:37:44,162 --> 00:37:46,663 or even walruses, on anything. 817 00:37:46,698 --> 00:37:48,665 But as long as we're communicating, 818 00:37:48,700 --> 00:37:52,235 there is always an opportunity for greater understanding, 819 00:37:52,270 --> 00:37:54,671 greater empathy, and greater connection. 820 00:37:54,706 --> 00:37:57,507 And as we travel deeper and deeper into the future, 821 00:37:57,542 --> 00:37:59,409 we'll need to master communication 822 00:37:59,444 --> 00:38:01,077 with all sentient beings 823 00:38:01,112 --> 00:38:06,182 if we ever hope to build our own Tower of Babel. 824 00:38:06,217 --> 00:38:07,684 But we're not there yet. 825 00:38:07,719 --> 00:38:10,019 We're going to connect on a whole other level. 826 00:38:10,054 --> 00:38:12,622 If you thought losing your privacy was a big deal, 827 00:38:12,657 --> 00:38:16,459 the final step is going to be a big pill to swallow.
828 00:38:16,494 --> 00:38:18,661 DVORSKY: You're starting to lose the individual, 829 00:38:18,696 --> 00:38:19,862 and you're starting to now gain 830 00:38:19,897 --> 00:38:22,165 in this kind of massive collectivity, 831 00:38:22,200 --> 00:38:24,867 this entity that kind of maybe even thinks 832 00:38:24,902 --> 00:38:28,171 and has impulses and tendencies toward a certain direction. 833 00:38:28,206 --> 00:38:30,440 The sum of its intelligence would hopefully be greater 834 00:38:30,475 --> 00:38:32,809 than the sum of its parts. 835 00:38:32,844 --> 00:38:34,844 THURSTON: That's going to change politics, right? 836 00:38:34,879 --> 00:38:37,180 That's going to change relationships. 837 00:38:37,215 --> 00:38:39,615 Could you merge minds? 838 00:38:39,650 --> 00:38:41,551 NARRATOR: Oh, yes, we can, and we will. 839 00:38:41,586 --> 00:38:44,020 In the Year Million era we'll take the final plunge 840 00:38:44,055 --> 00:38:47,357 and shed our ego, our sense of self, our individuality, 841 00:38:47,392 --> 00:38:49,792 and join together in a single consciousness. 842 00:38:49,827 --> 00:38:51,728 It's a high-bandwidth blending of our minds 843 00:38:51,763 --> 00:38:53,730 that creates its own super intelligence, 844 00:38:53,765 --> 00:38:57,667 a consciousness of which each person is just one small part. 845 00:38:57,702 --> 00:39:01,271 Say hello to your future in the hive mind. 846 00:39:10,114 --> 00:39:11,881 NARRATOR: We're almost at the end of our journey 847 00:39:11,916 --> 00:39:13,249 to Year Million 848 00:39:13,284 --> 00:39:17,720 and that future version of a Tower of Babel to the heavens. 849 00:39:17,755 --> 00:39:20,223 And the final step in our communication evolution 850 00:39:20,258 --> 00:39:21,891 is a doozy. 851 00:39:21,926 --> 00:39:23,159 Yep, you got it. 852 00:39:23,194 --> 00:39:25,194 In the Year Million we're not just communicating 853 00:39:25,229 --> 00:39:26,996 telepathically mind-to-mind, 854 00:39:27,031 --> 00:39:29,599 using our hyper-connectivity to swarm together 855 00:39:29,634 --> 00:39:32,201 and increase our intelligence a thousand-fold. 856 00:39:32,236 --> 00:39:36,172 The final step will be when we finally shed our sense of self 857 00:39:36,207 --> 00:39:37,740 and individuality 858 00:39:37,775 --> 00:39:42,912 and merge our minds together into a single consciousness. 859 00:39:45,950 --> 00:39:47,750 DVORSKY: As we become progressively interconnected 860 00:39:47,785 --> 00:39:49,051 with each other, 861 00:39:49,086 --> 00:39:51,421 the lines that separate one brain from another brain 862 00:39:51,456 --> 00:39:53,423 will become increasingly blurred. 863 00:39:53,458 --> 00:39:54,891 And if you can imagine, you know, 864 00:39:54,926 --> 00:39:57,360 hundreds, if not thousands, of individuals 865 00:39:57,395 --> 00:39:59,061 interlinked in this way, 866 00:39:59,096 --> 00:40:02,565 you're going to have what's referred to as the hive mind. 867 00:40:02,600 --> 00:40:04,901 NARRATOR: Whoa, the hive mind. 868 00:40:04,936 --> 00:40:06,369 It sounds scary, 869 00:40:06,404 --> 00:40:09,739 and it just might be, and we'll get to that in a minute. 870 00:40:09,774 --> 00:40:13,242 But it could also be incredibly empowering.
871 00:40:13,277 --> 00:40:14,877 NEWITZ: Basically like what we do 872 00:40:14,912 --> 00:40:16,646 when we create a computer cluster, 873 00:40:16,681 --> 00:40:18,047 putting all the computers together 874 00:40:18,082 --> 00:40:19,415 and having them work together 875 00:40:19,450 --> 00:40:22,485 allows them to tackle bigger and tougher problems. 876 00:40:22,520 --> 00:40:23,920 NARRATOR: You might say to yourself 877 00:40:23,955 --> 00:40:25,621 that sounds a lot like the swarm intelligence 878 00:40:25,656 --> 00:40:28,057 we talked about, but they are very different. 879 00:40:28,092 --> 00:40:30,626 Swarm intelligence is decentralized. 880 00:40:30,661 --> 00:40:33,629 We will still have our own identity, a sense of self. 881 00:40:33,664 --> 00:40:35,965 Hive mind is a paradigm shift. 882 00:40:36,000 --> 00:40:38,901 We will let go of our egos and come together 883 00:40:38,936 --> 00:40:41,237 to create a centralized consciousness. 884 00:40:41,272 --> 00:40:43,272 When you connect to the hive mind, 885 00:40:43,307 --> 00:40:46,309 you will share everything. 886 00:40:49,981 --> 00:40:52,815 NEWITZ: It could be a really great experience at a concert, 887 00:40:52,850 --> 00:40:54,951 where like all of us are feeling the same thing 888 00:40:54,986 --> 00:40:58,454 as we listen to an awesome guitar solo. 889 00:40:58,489 --> 00:41:01,757 NARRATOR: Exactly, like a concert. 890 00:41:01,792 --> 00:41:04,594 Speaking of which, remember that concert we saw earlier? 891 00:41:04,629 --> 00:41:07,330 Our future family has become part of the hive mind, 892 00:41:07,365 --> 00:41:11,133 and is experiencing a concert through a single consciousness. 893 00:41:11,168 --> 00:41:12,468 BRIAN GREENE: If all those minds 894 00:41:12,503 --> 00:41:16,038 are in some digital computer environment 895 00:41:16,073 --> 00:41:18,841 that allows them to interface in a more profound way 896 00:41:18,876 --> 00:41:21,944 than the biological means that we have at our disposal now, 897 00:41:21,979 --> 00:41:24,647 I can't help but think that that would be a greater level 898 00:41:24,682 --> 00:41:27,383 of collective communication. 899 00:41:27,418 --> 00:41:28,618 NARRATOR: It would be. 900 00:41:28,653 --> 00:41:30,553 And that's why they can be any part 901 00:41:30,588 --> 00:41:32,522 of this little concert they desire. 902 00:41:32,557 --> 00:41:33,890 Plugged in like they are, 903 00:41:33,925 --> 00:41:36,292 their minds will create a larger intelligence, 904 00:41:36,327 --> 00:41:37,960 a larger consciousness. 905 00:41:37,995 --> 00:41:40,463 Our whole idea of what it means to be human 906 00:41:40,498 --> 00:41:42,131 will change drastically, 907 00:41:42,166 --> 00:41:45,501 because when we let go of the ego, the self-preservation, 908 00:41:45,536 --> 00:41:48,004 the competition involved with having a sense of self 909 00:41:48,039 --> 00:41:51,674 and come together as one, well, then everything is possible. 910 00:41:51,709 --> 00:41:53,476 GREENE: There is something very powerful 911 00:41:53,511 --> 00:41:56,312 about all minds working together. 912 00:41:56,347 --> 00:41:59,081 Maybe we'll find, in this domain, 913 00:41:59,116 --> 00:42:01,717 that the notion of decision 914 00:42:01,752 --> 00:42:05,988 is not where we find our individual footprint.
915 00:42:06,023 --> 00:42:09,492 Maybe our individual footprint comes with an outlook 916 00:42:09,527 --> 00:42:12,662 or a perspective that we hold dear, 917 00:42:12,697 --> 00:42:16,332 and only we as individuals are aware of it, or know it. 918 00:42:16,367 --> 00:42:19,602 Maybe that will be enough, perhaps. 919 00:42:19,637 --> 00:42:22,171 NEWITZ: I definitely think it will radically impact 920 00:42:22,206 --> 00:42:25,575 what we think of as a self. 921 00:42:25,610 --> 00:42:27,076 NARRATOR: Take a moment, breathe. 922 00:42:27,111 --> 00:42:29,745 I know, it's a big idea to wrap your head around. 923 00:42:29,780 --> 00:42:31,414 Everything we think makes us human 924 00:42:31,449 --> 00:42:33,583 is tied to our sense of self. 925 00:42:33,618 --> 00:42:36,018 Hive mind won't come without sacrifice. 926 00:42:36,053 --> 00:42:38,220 It will take a complete redefinition 927 00:42:38,255 --> 00:42:39,555 of what it means to be human. 928 00:42:39,590 --> 00:42:43,593 The question is, will it be worth it? 929 00:42:43,628 --> 00:42:45,027 FARSAD: What feels dangerous about the hive mind 930 00:42:45,062 --> 00:42:47,129 is that we'll all, like, just know the same things 931 00:42:47,164 --> 00:42:49,599 and then we won't have anything to talk about. 932 00:42:49,634 --> 00:42:53,102 It'll be so sad, because like the whole point of life 933 00:42:53,137 --> 00:42:55,037 is to just like hang out with your friends. 934 00:42:55,072 --> 00:42:57,640 It's like, talk some smack, you know what I mean, 935 00:42:57,675 --> 00:43:00,276 and if you all already know the thing, you know, 936 00:43:00,311 --> 00:43:01,611 there's no smack to talk, 937 00:43:01,646 --> 00:43:03,779 and that would be very frustrating. 938 00:43:03,814 --> 00:43:05,581 NARRATOR: Frustrating indeed. 939 00:43:05,616 --> 00:43:08,651 What's life without some good old-fashioned trash talk? 940 00:43:08,686 --> 00:43:10,620 Well, we just might find out. 941 00:43:10,655 --> 00:43:12,488 But what about our autonomy? 942 00:43:12,523 --> 00:43:14,557 When we mingle our minds, what happens? 943 00:43:14,592 --> 00:43:16,158 Are we still you and me? 944 00:43:16,193 --> 00:43:18,394 Or do we become the same thing? 945 00:43:18,429 --> 00:43:23,332 NEWITZ: Any technology that we use to mingle our minds 946 00:43:23,367 --> 00:43:26,402 could have good or bad effects. 947 00:43:26,437 --> 00:43:28,738 So you want to be able to step back 948 00:43:28,773 --> 00:43:31,807 and be a little bit skeptical of any kind of groupthink. 949 00:43:31,842 --> 00:43:33,476 MARQUIS-BOIRE: There is the worry 950 00:43:33,511 --> 00:43:38,014 that this connected hive mind can be used in sinister ways, 951 00:43:38,049 --> 00:43:40,182 depending on who's in charge of it. 952 00:43:40,217 --> 00:43:42,184 NARRATOR: That's right, when you're part of the hive mind, 953 00:43:42,219 --> 00:43:44,754 your mind, at least in the traditional sense, 954 00:43:44,789 --> 00:43:46,689 might not be yours alone. 955 00:43:46,724 --> 00:43:49,825 KAKU: You're like a worker bee in a gigantic hive. 956 00:43:49,860 --> 00:43:54,263 You have no individuality whatsoever. 957 00:43:54,298 --> 00:43:56,532 MIRA: You know, if you think about the Borg from Star Trek, 958 00:43:56,567 --> 00:43:57,767 the Borg is the hive mind. 959 00:43:57,802 --> 00:43:59,535 The Borg are a collective, 960 00:43:59,570 --> 00:44:03,205 so they think as a whole and not as an individual. 
961 00:44:03,240 --> 00:44:07,309 And in many ways individuality is what makes us feel human. 962 00:44:07,344 --> 00:44:12,314 To have that stripped away and become part of a hive collective 963 00:44:12,349 --> 00:44:15,117 is one of the more terrifying things you could do. 964 00:44:15,152 --> 00:44:16,952 It's like joining a cult. 965 00:44:16,987 --> 00:44:20,222 NEWITZ: The perfect way to get a zombie army would be 966 00:44:20,257 --> 00:44:21,857 to string all their brains together, 967 00:44:21,892 --> 00:44:24,226 hook them up to somebody who really knows what they're doing, 968 00:44:24,261 --> 00:44:25,728 and just blast their brains 969 00:44:25,763 --> 00:44:28,497 with like whatever information they want to give them. 970 00:44:28,532 --> 00:44:31,300 You know, now you must do this labor 971 00:44:31,335 --> 00:44:34,203 in order to exalt the great one. 972 00:44:34,238 --> 00:44:36,005 NARRATOR: That doesn't sound good at all. 973 00:44:36,040 --> 00:44:37,173 What if I want out? 974 00:44:37,208 --> 00:44:40,576 Is there some sort of hive mind eject button? 975 00:44:40,611 --> 00:44:42,044 DVORSKY: One would hope, for example, 976 00:44:42,079 --> 00:44:44,013 that you could perhaps pull out of the hive mind, 977 00:44:44,048 --> 00:44:46,248 that you could remove yourself from the grid. 978 00:44:46,283 --> 00:44:47,750 We struggle with this today; 979 00:44:47,785 --> 00:44:49,885 we turn off our phones or go into airplane mode, 980 00:44:49,920 --> 00:44:51,654 and we feel like we're naked somehow, 981 00:44:51,689 --> 00:44:53,355 or that somehow we're disconnected from the world. 982 00:44:53,390 --> 00:44:55,391 Imagine how terrifying or disconcerting it would be 983 00:44:55,426 --> 00:44:58,728 in the future if, after engaging in a hive mind, 984 00:44:58,763 --> 00:45:01,363 we suddenly pulled ourselves out of it. 985 00:45:01,398 --> 00:45:02,665 NARRATOR: The hive mind sounds like 986 00:45:02,700 --> 00:45:04,600 it could be a deeply oppressive place, 987 00:45:04,635 --> 00:45:07,403 like North Korea but on steroids. 988 00:45:07,438 --> 00:45:09,238 That's the worst-case scenario. 989 00:45:09,273 --> 00:45:12,742 KAKU: But another possibility is that it is freedom, 990 00:45:12,777 --> 00:45:14,076 a world of enlightenment, 991 00:45:14,111 --> 00:45:17,179 a world of knowledge and prosperity. 992 00:45:17,214 --> 00:45:19,248 NARRATOR: When we're all joined together as one, 993 00:45:19,283 --> 00:45:23,085 could we finally eliminate conflict, wars and suffering? 994 00:45:23,120 --> 00:45:25,621 THURSTON: And if you and I are the same, 995 00:45:25,656 --> 00:45:28,891 then when I hurt you, I literally hurt myself. 996 00:45:28,926 --> 00:45:33,062 That changes war, that changes anger, that changes love. 997 00:45:33,097 --> 00:45:37,099 'Cause when I love you, I love myself. 998 00:45:37,134 --> 00:45:38,400 ROSENBERG: We could evolve into something 999 00:45:38,435 --> 00:45:39,969 that we can't even conceive, 1000 00:45:40,004 --> 00:45:42,972 into a different type of creature. 1001 00:45:43,007 --> 00:45:44,874 This super-organism. 1002 00:45:44,909 --> 00:45:47,076 NARRATOR: Might the hive mind even be necessary 1003 00:45:47,111 --> 00:45:48,477 for our survival? 1004 00:45:48,512 --> 00:45:50,312 I mean, we've come this far alone. 1005 00:45:50,347 --> 00:45:51,814 But you know the old saying: 1006 00:45:51,849 --> 00:45:54,984 united we stand, divided we fall.
1007 00:45:55,019 --> 00:45:58,087 GREENE: To my mind, the only way that we survive 1008 00:45:58,122 --> 00:45:59,255 into the far future 1009 00:45:59,290 --> 00:46:02,591 is to bring us all together in some manner 1010 00:46:02,626 --> 00:46:05,928 that leverages the whole collective consciousness 1011 00:46:05,963 --> 00:46:10,332 in a way that's more powerful than the individual minds alone. 1012 00:46:10,367 --> 00:46:11,734 THURSTON: I think that's exciting. 1013 00:46:11,769 --> 00:46:15,237 I think it's weird, though. 1014 00:46:15,272 --> 00:46:18,107 SANDBERG: I imagine it as a vast coral reef. 1015 00:46:18,142 --> 00:46:21,343 An explosion of new forms, new kinds of minds, 1016 00:46:21,378 --> 00:46:23,612 new kinds of consciousness. 1017 00:46:23,647 --> 00:46:25,748 I can't imagine any of the details, 1018 00:46:25,783 --> 00:46:28,017 because I think most of them would be beyond 1019 00:46:28,052 --> 00:46:29,919 my puny human brain. 1020 00:46:29,954 --> 00:46:32,855 Just like an ant cannot understand a city, 1021 00:46:32,890 --> 00:46:36,125 we cannot understand Year Million. 1022 00:46:36,160 --> 00:46:40,696 But we can see something there, beyond the clouds. 1023 00:46:40,731 --> 00:46:42,464 NARRATOR: That is the future we're barreling toward 1024 00:46:42,499 --> 00:46:43,933 in Year Million. 1025 00:46:43,968 --> 00:46:46,035 When we are one with animals, extraterrestrials, 1026 00:46:46,070 --> 00:46:47,870 and most importantly, each other, 1027 00:46:47,905 --> 00:46:51,373 then beyond the clouds may be exactly where we find ourselves. 1028 00:46:51,408 --> 00:46:55,678 That's right, we'll be building a new future, our own tower, 1029 00:46:55,713 --> 00:46:58,113 perhaps in a far-off galaxy, 1030 00:46:58,148 --> 00:46:59,215 a gleaming testament 1031 00:46:59,250 --> 00:47:01,817 to our brilliant ingenuity and creativity. 1032 00:47:01,852 --> 00:47:03,352 We're headed for the stars. 1033 00:47:03,387 --> 00:47:04,720 And that will be possible 1034 00:47:04,755 --> 00:47:06,856 because of the coming communication revolution 1035 00:47:06,891 --> 00:47:08,624 that will take human intelligence 1036 00:47:08,659 --> 00:47:11,160 into the stratosphere.
