English subtitles for The.Great.Hack.2019.NF.WEB-DL.DDP5.1.x264-NTG

1 00:00:14,305 --> 00:00:16,850 [wind blowing softly] 2 00:00:33,867 --> 00:00:36,453 [soft, indistinct chatter] 3 00:00:58,391 --> 00:01:00,161 [man 1] Ms. Kaiser, you seem to have traveled a long way 4 00:01:00,185 --> 00:01:02,896 from an idealistic intern in Barack Obama's campaign 5 00:01:02,979 --> 00:01:06,483 to working for an organization that keeps pretty unsavory company. 6 00:01:09,611 --> 00:01:11,291 Didn't that make you uncomfortable at all? 7 00:01:17,911 --> 00:01:20,789 [man 2] You referred to having two sets of business cards. 8 00:01:21,873 --> 00:01:23,083 Who did you work for? 9 00:01:30,298 --> 00:01:33,218 [explosions booming] 10 00:01:40,225 --> 00:01:41,745 [man 1] Don't take this the wrong way. 11 00:01:42,936 --> 00:01:44,646 In your life, have you ever worked for 12 00:01:45,897 --> 00:01:49,567 or provided information to any country's intelligence agency? 13 00:02:03,790 --> 00:02:05,458 [man] Hi. A small coffee, please? 14 00:02:05,959 --> 00:02:07,418 - Uh, $2.25. - Great. 15 00:02:13,716 --> 00:02:14,717 All right. 16 00:02:17,554 --> 00:02:21,224 - [phone chimes] - [man] Who has seen an advertisement 17 00:02:21,808 --> 00:02:23,476 that has convinced you 18 00:02:23,560 --> 00:02:27,480 that your microphone is listening to your conversations? 19 00:02:29,274 --> 00:02:31,317 [class laughing] 20 00:02:32,110 --> 00:02:34,654 [man] It's hard for us to imagine how else it could work, 21 00:02:35,113 --> 00:02:38,950 but what's happening is that your behavior 22 00:02:39,033 --> 00:02:41,119 is being accurately predicted. 23 00:02:41,452 --> 00:02:45,957 So, the ads that seem uncannily accurate, 24 00:02:46,457 --> 00:02:48,668 that have to be eavesdropping on us, 25 00:02:49,377 --> 00:02:51,796 are more likely to be evidence 26 00:02:52,005 --> 00:02:53,756 that the targeting works, 27 00:02:53,923 --> 00:02:55,967 and that it predicts our behavior. 28 00:02:56,801 --> 00:03:00,930 Maybe it's because I grew up with the Internet as a reality. 29 00:03:01,014 --> 00:03:04,267 The ads don't bother me all that much. 30 00:03:04,517 --> 00:03:06,895 When does it turn sour? 31 00:03:18,656 --> 00:03:20,992 [electronic chiming] 32 00:03:21,534 --> 00:03:22,911 [phone chimes, vibrates] 33 00:03:23,203 --> 00:03:26,581 [female voice over PA] This is a Brooklyn-bound Q express train. 34 00:03:26,998 --> 00:03:30,043 - The next stop is Canal Street. - [phone vibrates] 35 00:03:30,126 --> 00:03:31,127 [phone vibrates] 36 00:03:33,588 --> 00:03:35,924 [Carroll] It began with the dream of a connected world. 37 00:03:38,468 --> 00:03:41,930 A space where everyone could share each other's experiences 38 00:03:42,347 --> 00:03:43,973 and feel less alone. 39 00:03:46,142 --> 00:03:49,604 It wasn't long before this world became our matchmaker, 40 00:03:51,981 --> 00:03:55,068 instant fact-checker, personal entertainer, 41 00:03:55,318 --> 00:03:58,863 guardian of our memories, even our therapist. 42 00:04:00,240 --> 00:04:01,783 [speaks indistinctly to students] 43 00:04:01,866 --> 00:04:04,827 [Carroll] I was teaching digital media and developing apps. 44 00:04:05,495 --> 00:04:08,831 So, I knew that the data from our online activity 45 00:04:08,915 --> 00:04:10,333 wasn't just evaporating. 46 00:04:12,877 --> 00:04:15,546 And as I dug deeper, I realized... 
47 00:04:18,258 --> 00:04:20,843 these digital traces of ourselves 48 00:04:21,302 --> 00:04:25,556 are being mined into a trillion-dollar-a-year industry. 49 00:04:29,519 --> 00:04:31,646 We are now the commodity. 50 00:04:33,523 --> 00:04:35,191 But we were so in love 51 00:04:35,525 --> 00:04:38,111 with the gift of this free connectivity... 52 00:04:39,153 --> 00:04:42,490 that no one bothered to read the terms and conditions. 53 00:04:49,247 --> 00:04:52,500 - [overlapping voices] - [digital noise] 54 00:04:57,297 --> 00:04:59,215 [Carroll] All of my interactions, 55 00:04:59,757 --> 00:05:03,052 my credit card swipes, web searches, 56 00:05:03,219 --> 00:05:05,763 locations, my likes, 57 00:05:08,266 --> 00:05:12,687 they're all collected in real time and attached to my identity, 58 00:05:13,229 --> 00:05:17,817 giving any buyer direct access to my emotional pulse. 59 00:05:21,779 --> 00:05:25,950 Armed with this knowledge, they compete for my attention, 60 00:05:27,285 --> 00:05:33,624 feeding me a steady stream of content built for and seen only by me. 61 00:05:37,920 --> 00:05:41,132 And this is true for each and every one of us. 62 00:05:43,801 --> 00:05:44,886 What I like, 63 00:05:46,637 --> 00:05:47,764 what I fear, 64 00:05:48,723 --> 00:05:50,224 what gets my attention, 65 00:05:51,476 --> 00:05:55,813 what my boundaries are, and what it takes to cross them. 66 00:05:58,232 --> 00:05:59,859 [overlapping voices] 67 00:05:59,942 --> 00:06:01,527 [woman 1] Go back to Washington. 68 00:06:01,778 --> 00:06:03,946 [woman 2] Crooked Hillary tells lots of lies. 69 00:06:04,030 --> 00:06:06,699 The stock market's gonna crash. I mean, this'll cause a civil war. 70 00:06:08,117 --> 00:06:11,204 [Carroll] We saw the fallout of our filtered realities 71 00:06:11,287 --> 00:06:12,497 in the 2016 election. 72 00:06:12,580 --> 00:06:14,980 [woman 3] ...you were not offended when Donald Trump said it! 73 00:06:15,666 --> 00:06:18,419 [overlapping shouts and jeers] 74 00:06:19,837 --> 00:06:22,423 - [man 1] Get the fuck out! - [angry shouts overlapping] 75 00:06:22,507 --> 00:06:26,386 [Carroll] The real world became a deeply divided wreckage site. 76 00:06:26,469 --> 00:06:30,515 - [protesters chanting indistinctly] - [overlapping shouts] 77 00:06:30,640 --> 00:06:33,518 [man 2] Fuck those dirty beaners! Build the wall! 78 00:06:34,060 --> 00:06:35,060 [man 3] Whoo! 79 00:06:35,395 --> 00:06:38,106 [men shouting] Fight! 80 00:06:38,523 --> 00:06:42,610 [Carroll] How did the dream of the connected world tear us apart? 81 00:06:48,449 --> 00:06:50,076 [static crackles] 82 00:07:03,673 --> 00:07:06,801 [Carroll] My daughter is eight, and my son is four. 83 00:07:07,885 --> 00:07:13,683 Uh, every app is carefully scrutinized before ins... being installed, and... 84 00:07:14,100 --> 00:07:17,895 And, like, now, I'm the dad who reads the privacy policy and says, 85 00:07:19,188 --> 00:07:22,275 "No, you see here? They read your messages. 86 00:07:22,817 --> 00:07:25,027 Are you okay with that?" [chuckles] 87 00:07:27,613 --> 00:07:31,242 That's, like, the new way I'm gonna be an annoying parent. 88 00:07:33,578 --> 00:07:35,788 - Hey. - [kids chattering] 89 00:07:36,456 --> 00:07:38,499 [Carroll] I've been concerned for a long time 90 00:07:38,583 --> 00:07:42,545 about how the misuse of our data and information 91 00:07:42,628 --> 00:07:44,589 could affect my children's future. 
92 00:07:46,757 --> 00:07:50,386 But it wasn't until after the 2016 election that I realized 93 00:07:50,678 --> 00:07:52,805 it had already happened on our watch. 94 00:07:55,725 --> 00:07:57,643 It was really, like, a feeling of, like... 95 00:07:57,727 --> 00:07:58,727 [takes a deep breath] 96 00:07:58,728 --> 00:08:01,314 ...the worst-case scenario has happened with technology. 97 00:08:03,941 --> 00:08:05,109 Hmm. 98 00:08:05,359 --> 00:08:07,653 I became obsessed with finding answers. 99 00:08:09,739 --> 00:08:11,782 And the question I kept asking myself was: 100 00:08:13,117 --> 00:08:14,785 Who was feeding us fear? 101 00:08:15,328 --> 00:08:16,412 And how? 102 00:08:21,417 --> 00:08:25,671 This was our Project Alamo, where the digital arm 103 00:08:25,755 --> 00:08:28,007 of the Trump campaign operation was held. 104 00:08:29,383 --> 00:08:32,011 When Project Alamo was at its peak, 105 00:08:33,012 --> 00:08:35,681 they were spending one million dollars a day 106 00:08:36,390 --> 00:08:38,184 on Facebook ads. 107 00:08:39,352 --> 00:08:42,188 We had the Facebook, and YouTube, and Google people. 108 00:08:42,271 --> 00:08:44,065 They would kind of congregate here. 109 00:08:44,315 --> 00:08:47,235 I mean, they were basically our hands-on partners 110 00:08:47,318 --> 00:08:50,279 as far as, you know, being able to utilize the platform 111 00:08:50,363 --> 00:08:52,073 as effectively as possible. 112 00:08:52,365 --> 00:08:53,783 But what we also learned 113 00:08:53,866 --> 00:08:56,244 is that a company called Cambridge Analytica... 114 00:08:59,330 --> 00:09:02,708 was also working on Project Alamo. 115 00:09:04,168 --> 00:09:06,045 Cambridge Analytica was here. 116 00:09:06,254 --> 00:09:09,840 And this is kind of the brain of... of, you know, the data. 117 00:09:09,924 --> 00:09:12,677 - [over video] This was the data center. - [man] Right. 118 00:09:13,219 --> 00:09:15,405 "We gotta target this state. We gotta target that state." 119 00:09:15,429 --> 00:09:17,491 - So, within that... - [man] How would they know that? 120 00:09:17,515 --> 00:09:20,195 - How would they know that... - That's... That's their secret sauce. 121 00:09:27,066 --> 00:09:28,734 - [computer chimes] - [mouse clicks] 122 00:09:28,943 --> 00:09:31,445 - [Carroll] Paul-Olivier? - [Paul-Olivier] I'm there. 123 00:09:31,904 --> 00:09:34,740 Okay. Let me just, uh, set up my screen. 124 00:09:36,200 --> 00:09:39,787 [Carroll] I connected with a mathematician based out of Switzerland 125 00:09:39,870 --> 00:09:41,539 named Paul-Olivier Dehaye. 126 00:09:42,123 --> 00:09:45,334 [Dehaye] I've been looking at Cambridge Analytica for over a year 127 00:09:45,626 --> 00:09:48,087 and I think there's more to be found. 128 00:09:49,589 --> 00:09:51,591 [Carroll] Both Paul and I understood that 129 00:09:51,674 --> 00:09:54,510 in order to send people personalized messages, 130 00:09:54,594 --> 00:09:55,886 you need people's data. 131 00:09:57,763 --> 00:10:02,143 And Cambridge Analytica claimed to have 5,000 data points 132 00:10:02,226 --> 00:10:04,186 on every American voter. 133 00:10:07,815 --> 00:10:08,983 But it was invisible. 134 00:10:10,943 --> 00:10:15,239 And so the question is, how do you make the invisible visible? 135 00:10:19,827 --> 00:10:21,746 [sighs] That's the hardest part. Um... 
136 00:10:25,333 --> 00:10:27,710 - [marker squeaks] - [Carroll] Paul-Olivier Dehaye had 137 00:10:27,793 --> 00:10:28,919 a hypothesis, 138 00:10:29,170 --> 00:10:32,882 and the hypothesis was that US voter data 139 00:10:33,007 --> 00:10:36,218 was processed by Cambridge Analytica's parent company 140 00:10:36,302 --> 00:10:37,553 in Great Britain. 141 00:10:43,768 --> 00:10:45,436 And if it was true, 142 00:10:45,519 --> 00:10:49,273 I could use a British lawyer to force Cambridge Analytica 143 00:10:49,357 --> 00:10:50,650 to give me my data. 144 00:10:54,445 --> 00:10:57,657 [man] I think the beauty of David's case is it encapsulates 145 00:10:57,990 --> 00:11:02,828 why data rights should be considered just fundamental rights, simple rights. 146 00:11:02,912 --> 00:11:04,705 Because all he wants to know 147 00:11:05,122 --> 00:11:06,540 is what did you do? 148 00:11:08,042 --> 00:11:12,213 And if David finds out the data beneath his profile, 149 00:11:13,130 --> 00:11:15,883 you'll start to be able to connect the dots in various ways 150 00:11:16,509 --> 00:11:20,054 with Facebook and Cambridge Analytica and Trump and Brexit 151 00:11:20,137 --> 00:11:23,891 and all these loosely-connected entities. 152 00:11:24,517 --> 00:11:26,519 Because you get to see inside the beast, 153 00:11:26,852 --> 00:11:28,604 you get to see inside the system. 154 00:11:39,573 --> 00:11:42,576 [man] I used to be the COO and CFO 155 00:11:42,660 --> 00:11:45,496 of the Cambridge Analytica, or SCL, Group. 156 00:11:46,831 --> 00:11:50,167 If you spoke to most people that worked at Cambridge Analytica, 157 00:11:50,626 --> 00:11:52,128 they would say the same thing. 158 00:11:52,211 --> 00:11:55,715 It was, uh... an environment of great innovation. 159 00:11:57,675 --> 00:11:59,927 Hello, my name is Alexander Nix. 160 00:12:00,010 --> 00:12:01,804 I'm CEO of Cambridge Analytica, 161 00:12:02,096 --> 00:12:05,182 the world's leading data-driven communications company. 162 00:12:05,599 --> 00:12:08,394 From Mad Men of old to Math Men of today, 163 00:12:08,477 --> 00:12:10,479 expert data scientists whose insight 164 00:12:10,563 --> 00:12:13,399 can tell you far more about audiences that you want to reach 165 00:12:13,482 --> 00:12:14,900 and how to reach them. 166 00:12:18,654 --> 00:12:23,033 [Wheatland] Alexander Nix was very focused on building a strong elections business. 167 00:12:23,868 --> 00:12:28,247 And then the Obama campaign very successfully used data 168 00:12:28,622 --> 00:12:31,000 and digital communications, 169 00:12:31,250 --> 00:12:34,587 which created a market opportunity to provide a service 170 00:12:34,670 --> 00:12:36,756 to Republican politics in the US. 171 00:12:36,839 --> 00:12:39,383 [crowd cheering] 172 00:12:39,467 --> 00:12:42,762 [Cruz] God bless the great state of Iowa. 173 00:12:42,845 --> 00:12:44,472 [crowd cheering] 174 00:12:44,555 --> 00:12:47,641 [Wheatland] Ted Cruz went from the lowest rated candidate 175 00:12:47,725 --> 00:12:48,934 in the primaries 176 00:12:49,059 --> 00:12:54,064 to being the last man standing before Trump got the nomination. 177 00:12:54,148 --> 00:12:55,441 [crowd members cheer] 178 00:12:55,524 --> 00:12:57,234 Let me first of all say... 179 00:12:58,861 --> 00:13:01,864 - to God be the glory. - [crowd cheering] 180 00:13:06,285 --> 00:13:08,513 [reporter] Everyone said Ted Cruz had this amazing ground game, 181 00:13:08,537 --> 00:13:10,790 and now we know who came up with all of it. 
182 00:13:10,873 --> 00:13:14,126 Joining me now, Alexander Nix, CEO of Cambridge Analytica, 183 00:13:14,210 --> 00:13:15,419 the company behind it all. 184 00:13:15,503 --> 00:13:18,464 It's fascinating, Alexander, to look at all of the work 185 00:13:18,547 --> 00:13:20,007 that goes into the ground game. 186 00:13:20,090 --> 00:13:22,051 Have any of the other candidates called you? 187 00:13:22,635 --> 00:13:23,969 Well, um... 188 00:13:31,185 --> 00:13:34,188 [audience applauding] 189 00:13:35,731 --> 00:13:37,942 It's my privilege to speak to you today 190 00:13:38,025 --> 00:13:41,737 about the power of big data and psychographics. 191 00:13:42,530 --> 00:13:45,950 When Cambridge Analytica joined the Trump campaign, 192 00:13:46,033 --> 00:13:48,118 we were an attractive proposition. 193 00:13:48,744 --> 00:13:52,331 We'd just spent 14 months working on the Ted Cruz campaign, 194 00:13:52,414 --> 00:13:56,669 and had collected a huge amount of voter data and research, 195 00:13:56,752 --> 00:13:59,338 which we were able to hand over to the Trump team. 196 00:14:01,382 --> 00:14:04,510 By having hundreds and hundreds of thousands of Americans 197 00:14:04,593 --> 00:14:05,593 undertake this survey, 198 00:14:07,179 --> 00:14:09,515 we were able to form a model, 199 00:14:09,598 --> 00:14:12,977 where we have somewhere close to four or five thousand data points 200 00:14:13,060 --> 00:14:16,981 we can use to predict the personality of every adult in the United States. 201 00:14:17,565 --> 00:14:20,192 Because it's personality that drives behavior, 202 00:14:20,276 --> 00:14:23,362 and behavior that obviously influences how you vote. 203 00:14:24,613 --> 00:14:26,615 We could then start to target people 204 00:14:26,699 --> 00:14:29,451 with highly-targeted digital video content. 205 00:14:30,244 --> 00:14:33,539 [man 1] Secretary Clinton said there was nothing marked classified on her emails 206 00:14:33,622 --> 00:14:35,541 either sent or received. Was that true? 207 00:14:36,959 --> 00:14:39,837 [Trump] Our movement is about replacing 208 00:14:39,920 --> 00:14:44,049 a failed and corrupt political establishment. 209 00:14:44,133 --> 00:14:46,886 Why aren't I 50 points ahead, you might ask? 210 00:14:46,969 --> 00:14:49,513 [man 2] Do you really need to ask? 211 00:14:49,597 --> 00:14:53,309 [Clinton coughing] 212 00:14:57,730 --> 00:15:01,734 - [crowd cheering] - [rousing orchestral theme plays] 213 00:15:02,735 --> 00:15:04,445 A night that will go down in history, 214 00:15:04,528 --> 00:15:07,698 a stunning upset as Donald Trump triumphs over Hillary Clinton, 215 00:15:07,781 --> 00:15:11,368 defying the polls, the pundits, and the political class once again, 216 00:15:11,452 --> 00:15:14,163 this time elected president of the United States. 217 00:15:16,624 --> 00:15:18,626 [orchestral theme continues] 218 00:15:24,590 --> 00:15:28,802 [crowd cheering] 219 00:15:31,889 --> 00:15:34,308 [crowd chanting] USA! USA! 220 00:15:34,391 --> 00:15:36,435 - Thank you. - [chanting continues] 221 00:15:36,518 --> 00:15:38,437 [Trump] Thank you very much, everybody. 222 00:15:46,528 --> 00:15:50,032 [Nix] If there's one singular takeaway from this event, 223 00:15:50,115 --> 00:15:52,952 that is that these sorts of technologies 224 00:15:53,035 --> 00:15:54,662 can make a huge difference 225 00:15:54,745 --> 00:15:58,248 and will continue to do so for many years to come. 226 00:15:58,958 --> 00:16:00,084 Thank you. 
227 00:16:00,167 --> 00:16:02,711 [audience applauding] 228 00:16:07,466 --> 00:16:10,302 [Wheatland] After the election, it was really exciting. 229 00:16:10,386 --> 00:16:11,386 [cheering] 230 00:16:12,638 --> 00:16:15,516 [Wheatland] We could see the path to being a billion-dollar company. 231 00:16:16,517 --> 00:16:19,937 We were on top of the world. Or at least we thought we were. 232 00:16:37,788 --> 00:16:39,707 [Cadwalladr] This is the exciting box. 233 00:16:45,045 --> 00:16:47,548 I've been investigating Cambridge Analytica 234 00:16:48,632 --> 00:16:54,888 and how that ties to the Brexit campaign to leave the European Union. 235 00:16:57,391 --> 00:17:01,103 And this has been my full-time, 12-hours-a-day, 236 00:17:01,186 --> 00:17:05,649 seven-days-a-week kind of obsession, I would say, since then. 237 00:17:06,567 --> 00:17:07,901 It's been all-consuming. 238 00:17:08,610 --> 00:17:09,778 [sighs] 239 00:17:12,281 --> 00:17:15,492 When I first started looking into this whole web of links 240 00:17:15,576 --> 00:17:18,954 between Cambridge Analytica and Brexit... 241 00:17:20,122 --> 00:17:22,416 I emailed Andy Wigmore, 242 00:17:22,708 --> 00:17:26,754 who is an associate of Nigel Farage. 243 00:17:27,087 --> 00:17:32,634 And Nigel Farage is a very central figure in the Brexit campaign. 244 00:17:34,470 --> 00:17:36,680 I sort of said, "Oh, can we go for a coffee? 245 00:17:37,306 --> 00:17:40,851 I'm really interested in technology and campaigning." 246 00:17:41,977 --> 00:17:45,272 And then he just sort of like, he just like laid it all out. 247 00:17:45,355 --> 00:17:47,357 [upbeat jazz playing] 248 00:17:51,236 --> 00:17:53,280 [Cadwalladr] It was just after the Inauguration. 249 00:17:53,697 --> 00:17:57,034 So, Andy, he was just like sort of, like, showing me all the photos on his phone. 250 00:17:57,117 --> 00:17:58,869 "This is the inauguration." 251 00:18:00,913 --> 00:18:02,414 Then, "Oh, it's such a laugh! 252 00:18:02,664 --> 00:18:04,583 We had such a good time. Oh, Donald..." 253 00:18:06,543 --> 00:18:09,588 And I was like, "How did the introduction to Cambridge Analytica come out?" 254 00:18:10,923 --> 00:18:14,301 He was like, "You know, it's 'cause Nigel. Nigel's friends with Steve Bannon." 255 00:18:14,593 --> 00:18:17,805 - Ladies and gentlemen, Steve Bannon! - [audience cheers] 256 00:18:18,263 --> 00:18:22,101 [Cadwalladr] Steve Bannon headed the campaign for Trump. 257 00:18:23,102 --> 00:18:26,772 He's also the Vice President of Cambridge Analytica. 258 00:18:27,940 --> 00:18:31,068 So he's like, "Yeah, there's this bunch of billionaires in the States. 259 00:18:31,360 --> 00:18:33,445 We've all got the same aims, 260 00:18:33,946 --> 00:18:38,283 and Brexit was the petri dish for Trump." 261 00:18:38,367 --> 00:18:41,411 For most of my life, America is the leader. 262 00:18:41,703 --> 00:18:44,998 Now, I would like to think, in my own little way, 263 00:18:45,499 --> 00:18:47,292 that what we did with Brexit 264 00:18:47,751 --> 00:18:51,380 was the beginning of what is gonna turn out to be 265 00:18:51,463 --> 00:18:53,674 a global revolution 266 00:18:53,757 --> 00:18:56,176 and that Trump's victory is a part of that. 
267 00:18:56,260 --> 00:18:58,137 [people cheering, applauding] 268 00:19:01,557 --> 00:19:05,644 [scoffs] Anyway, and then he told me all sorts of other stuff 269 00:19:05,727 --> 00:19:08,605 about, you know, how they used artificial intelligence, 270 00:19:08,689 --> 00:19:12,651 you know, how they were mining details from Facebook 271 00:19:12,734 --> 00:19:14,278 and, um... 272 00:19:14,361 --> 00:19:18,115 And he was like... And he was like, "It's creepy, Carole!" He said, 273 00:19:18,198 --> 00:19:20,450 "The amount of information you can get on people... 274 00:19:20,534 --> 00:19:22,369 People just give it to you!" 275 00:19:22,452 --> 00:19:24,132 And he sort of said, "It's just so creepy!" 276 00:19:26,039 --> 00:19:28,041 So, I just kind of kept going. 277 00:19:29,751 --> 00:19:31,044 The Brexit work. 278 00:19:32,462 --> 00:19:37,134 I'd started tracking down all these Cambridge Analytica ex-employees. 279 00:19:40,470 --> 00:19:44,892 And, eventually, I got one guy who was prepared to talk to me. 280 00:19:46,476 --> 00:19:47,519 Chris Wylie. 281 00:19:51,190 --> 00:19:54,735 We had this first telephone call, which was insane. 282 00:19:54,818 --> 00:19:57,654 It was about eight hours long. And... 283 00:19:58,655 --> 00:20:00,324 [mimics explosion sound] 284 00:20:05,704 --> 00:20:08,373 My name is Christopher Wylie, I'm a data scientist 285 00:20:08,457 --> 00:20:10,500 and I helped set up Cambridge Analytica. 286 00:20:12,044 --> 00:20:15,297 It's incorrect to call Cambridge Analytica 287 00:20:15,380 --> 00:20:19,551 a purely sort of data science company or an algorithm, you know, company. 288 00:20:19,676 --> 00:20:22,930 You know, it is a full-service propaganda machine. 289 00:20:26,516 --> 00:20:29,561 [interviewer] You were an investor in Cambridge Analytica. 290 00:20:29,645 --> 00:20:32,624 - I helped put the company together. - [interviewer] And... Yes, you did. And... 291 00:20:32,648 --> 00:20:34,733 And gave it... And gave it that amazing name. 292 00:20:35,192 --> 00:20:38,111 [Wylie] Steve Bannon was the editor of Breitbart. 293 00:20:39,613 --> 00:20:42,366 He follows this idea of the Breitbart doctrine, 294 00:20:42,449 --> 00:20:46,370 which is that, if you want to fundamentally change society, 295 00:20:46,495 --> 00:20:48,121 you first have to break it. 296 00:20:48,455 --> 00:20:49,915 And it's only when you break it 297 00:20:49,998 --> 00:20:54,711 is when you can remold the pieces into your vision of a new society. 298 00:20:58,257 --> 00:20:59,925 This was the weapon 299 00:21:00,008 --> 00:21:03,178 that Steve Bannon wanted to build to fight his culture war. 300 00:21:04,179 --> 00:21:05,847 And we could build them for him. 301 00:21:06,431 --> 00:21:09,393 But I needed to figure out a way of getting data, 302 00:21:09,476 --> 00:21:12,938 and so I went to these Cambridge University profs 303 00:21:13,021 --> 00:21:14,606 and asked, "What do you think?" 304 00:21:28,328 --> 00:21:33,750 Kogan offered us apps on Facebook that were given special permission 305 00:21:33,959 --> 00:21:39,756 to harvest data not from just the person who used the app or joined the app, 306 00:21:41,049 --> 00:21:45,053 but also it would then go into their entire friend network 307 00:21:45,637 --> 00:21:48,724 and pull out all of the friends' data as well. 308 00:21:50,475 --> 00:21:53,061 If you were a friend of somebody who used the app, 309 00:21:53,145 --> 00:21:56,148 you would have no idea that I just pulled all of your data. 
310 00:22:00,777 --> 00:22:03,780 We took things like status updates, likes, 311 00:22:03,864 --> 00:22:06,116 in some cases, private messages. 312 00:22:08,660 --> 00:22:10,954 We wouldn't just be targeting you as a voter, 313 00:22:11,038 --> 00:22:14,333 we'd be targeting you as a personality. 314 00:22:16,460 --> 00:22:20,672 We would only need to touch a couple hundred thousand people 315 00:22:20,756 --> 00:22:23,342 to build a psychological profile 316 00:22:23,425 --> 00:22:28,013 of each voter in all of the United States. 317 00:22:31,975 --> 00:22:33,244 [Cadwalladr] And people had no idea 318 00:22:33,268 --> 00:22:35,354 that their data was being taken in this way? 319 00:22:36,313 --> 00:22:37,314 [Wylie] No. 320 00:22:42,486 --> 00:22:44,363 [Cadwalladr] You didn't ever stop and think, 321 00:22:44,446 --> 00:22:47,866 "Actually, this is people's personal information, 322 00:22:47,949 --> 00:22:52,579 and we're taking it, and we're using it in ways that they don't understand"? 323 00:22:53,288 --> 00:22:54,288 [Wylie] No. 324 00:22:56,375 --> 00:23:00,670 Throughout history, you have examples of grossly unethical experiments. 325 00:23:01,129 --> 00:23:02,631 [Cadwalladr] Is that what this was? 326 00:23:03,382 --> 00:23:07,636 [Wylie] I think that, yes, it was a grossly unethical experiment. 327 00:23:08,970 --> 00:23:12,182 You are playing with the psychology of an entire country 328 00:23:12,265 --> 00:23:14,434 without their consent or awareness. 329 00:23:15,769 --> 00:23:17,729 And not only are you, like, 330 00:23:17,813 --> 00:23:20,190 playing with the psychology of an entire nation, 331 00:23:20,273 --> 00:23:22,377 you're playing with the psychology of an entire nation 332 00:23:22,401 --> 00:23:24,111 in the context of the democratic process. 333 00:23:24,194 --> 00:23:27,280 [news anchor speaks indistinctly nearby] 334 00:23:31,034 --> 00:23:33,578 [Carroll] The revelations have started to spill out. 335 00:23:34,663 --> 00:23:40,460 We're now not just threatening to do things, but we're actually doing it. 336 00:23:42,671 --> 00:23:45,590 - [anchor] Okay, we're ready, guys. - [man] One, two, three... 337 00:23:45,966 --> 00:23:48,093 We turn now to the burgeoning scandal 338 00:23:48,176 --> 00:23:51,054 around voter-profiling company Cambridge Analytica. 339 00:23:51,388 --> 00:23:54,724 David Carroll filed a legal challenge in Britain 340 00:23:54,808 --> 00:23:57,352 asking the court to force Cambridge Analytica 341 00:23:57,436 --> 00:24:00,564 to turn over all the data it harvested on him. 342 00:24:00,814 --> 00:24:02,816 Explain what you are demanding. 343 00:24:03,275 --> 00:24:05,402 Uh, full disclosure, so... 344 00:24:06,945 --> 00:24:08,780 where did they get our data, 345 00:24:08,864 --> 00:24:12,284 how did they process it, who did they share it with, 346 00:24:12,576 --> 00:24:14,828 and do we have a right to opt out? 347 00:24:17,080 --> 00:24:18,331 [man] Cambridge Analytica says 348 00:24:18,415 --> 00:24:19,541 it's got 5,000 data points 349 00:24:19,624 --> 00:24:21,668 on many, many millions of people out there. 350 00:24:21,751 --> 00:24:22,836 [Carroll] That's right. 351 00:24:22,919 --> 00:24:26,298 When people can actually see the extent of the surveillance, 352 00:24:26,381 --> 00:24:28,633 I think they're going to be shocked. 353 00:24:30,218 --> 00:24:33,430 We don't work with Facebook data. We don't have Facebook data. 354 00:24:33,513 --> 00:24:36,766 Uh, we do use Facebook as a platform, uh, to advertise. 
355 00:24:37,184 --> 00:24:38,351 Mr. Nix, Channel 4 News. 356 00:24:38,560 --> 00:24:40,687 Did you mislead Parliament over the Facebook issue? 357 00:24:40,770 --> 00:24:43,315 - Absolutely not. - [reporter continues indistinctly] 358 00:24:44,149 --> 00:24:47,736 [Carroll] It's crazy that I have to mount a year-long, 359 00:24:47,819 --> 00:24:51,865 super risky legal challenge in another country 360 00:24:51,990 --> 00:24:54,159 to get my voter profile. 361 00:24:54,326 --> 00:24:57,537 David, don't stop, don't relent. 362 00:24:57,621 --> 00:24:58,872 [Carroll chuckles] 363 00:24:58,955 --> 00:25:00,957 - Keep going. Good. - I'm gonna do it, don't worry. 364 00:25:01,041 --> 00:25:02,876 - [woman] Don't sleep! - [Carroll laughs] 365 00:25:05,670 --> 00:25:08,423 Facebook's down 6.35%. 366 00:25:08,715 --> 00:25:10,717 That's 120 billion dollars. 367 00:25:11,092 --> 00:25:12,219 This is huge. 368 00:25:14,387 --> 00:25:17,265 [reporter 1] Officers working for the UK Information Commissioner 369 00:25:17,349 --> 00:25:20,727 are searching the headquarters of Cambridge Analytica, in London. 370 00:25:20,810 --> 00:25:22,872 [reporter 2] They're inside, they're looking at computers, 371 00:25:22,896 --> 00:25:24,523 they're looking for documents. 372 00:25:25,524 --> 00:25:29,486 [reporter 3] Facebook knew about that data collection over two years ago 373 00:25:29,569 --> 00:25:32,572 but did not go public until three days ago. 374 00:25:33,365 --> 00:25:34,407 Really, Facebook? 375 00:25:34,491 --> 00:25:36,326 You forgot to mention that 50 million people 376 00:25:36,409 --> 00:25:37,595 had their private data breached, 377 00:25:37,619 --> 00:25:40,455 but every time it's my uncle's friend's sister's dog's birthday, 378 00:25:40,539 --> 00:25:41,873 I get a notification? 379 00:25:41,957 --> 00:25:44,167 [audience laughs] 380 00:25:48,129 --> 00:25:50,298 [reporter 4] You are taking on a giant, 381 00:25:50,382 --> 00:25:52,425 a Goliath of big data marketing. 382 00:25:53,343 --> 00:25:55,262 How hopeful are you of succeeding? 383 00:25:55,720 --> 00:26:00,850 - [woman makes indistinct introduction] - [audience applauds] 384 00:26:02,018 --> 00:26:04,688 [Carroll] People don't want to admit that propaganda works. 385 00:26:05,480 --> 00:26:09,859 Because to admit it means confronting our own susceptibilities, 386 00:26:10,193 --> 00:26:12,320 horrific lack of privacy, 387 00:26:12,612 --> 00:26:14,281 and hopeless dependency 388 00:26:14,364 --> 00:26:17,742 on tech platforms ruining our democracies 389 00:26:17,826 --> 00:26:19,619 on various attack surfaces. 390 00:26:20,370 --> 00:26:23,248 Join the struggle to help get our data back. 391 00:26:23,999 --> 00:26:26,585 [applause, cheering] 392 00:26:37,345 --> 00:26:41,891 Welcome to our inquiry into disinformation and fake news. 393 00:26:41,975 --> 00:26:45,520 I'd like to welcome Christopher Wylie and Paul-Olivier Dehaye, 394 00:26:45,604 --> 00:26:48,231 uh, to the committee to give evidence this morning. 395 00:26:49,190 --> 00:26:51,276 [O'Hara] Have you or anybody else made any assessment 396 00:26:51,359 --> 00:26:53,296 of actually whether any of this made much difference 397 00:26:53,320 --> 00:26:56,364 to the final outcome of the EU Referendum? 398 00:26:59,618 --> 00:27:03,580 When... When you're caught in the Olympics doping, right, 399 00:27:03,788 --> 00:27:08,668 there's not a debate about how much illegal drug you took. Right? 
400 00:27:08,752 --> 00:27:10,045 Or, "Well, 401 00:27:10,128 --> 00:27:11,808 he probably would've come in first anyway," 402 00:27:11,880 --> 00:27:14,758 or, you know, "He only took half of the amount," or... 403 00:27:14,841 --> 00:27:17,719 Doesn't matter. If you're caught cheating, you lose your medal. Right? 404 00:27:17,802 --> 00:27:18,928 Because... 405 00:27:19,304 --> 00:27:23,308 [stammers]...if we allow cheating in our democratic process, 406 00:27:23,683 --> 00:27:24,600 what about next time? 407 00:27:24,601 --> 00:27:28,146 What about the time after that? Right? You shouldn't win by cheating. 408 00:27:29,648 --> 00:27:33,026 A lot of people will say, and I'll say, um, 409 00:27:33,735 --> 00:27:36,464 that given that you're someone who worked very closely with these people, 410 00:27:36,488 --> 00:27:37,656 uh, for a period of time, 411 00:27:37,739 --> 00:27:40,325 why have you decided to speak out against it 412 00:27:40,408 --> 00:27:42,928 and give evidence against people who used to be your colleagues? 413 00:27:43,286 --> 00:27:46,623 It's a process of coming to terms with what you've created 414 00:27:46,706 --> 00:27:49,751 and the impact that that... that... that has had. 415 00:27:49,834 --> 00:27:53,963 Um, I am incredibly remorseful for my... my role in setting it up. 416 00:27:54,214 --> 00:27:59,094 But there's been a lot of attention on me because I'm sort of... I've become the... 417 00:27:59,427 --> 00:28:02,347 uh, you know, the face of it, because I'm the one that's... 418 00:28:02,514 --> 00:28:04,557 come forward and put my name to it. 419 00:28:04,683 --> 00:28:06,363 But someone else that you should be calling 420 00:28:06,434 --> 00:28:07,894 to the committee is Brittany Kaiser. 421 00:28:08,478 --> 00:28:09,813 Who's Brittany Kaiser? 422 00:28:27,789 --> 00:28:32,085 I'm not that interested in standing up for powerful white men anymore 423 00:28:32,168 --> 00:28:35,505 who obviously don't have everybody's best interests at heart. 424 00:28:39,384 --> 00:28:41,469 [reporter 1] Brittany Kaiser, once a key player 425 00:28:41,553 --> 00:28:43,513 inside Cambridge Analytica, 426 00:28:43,596 --> 00:28:45,765 casting herself as a whistle-blower. 427 00:28:46,433 --> 00:28:49,269 [reporter 2] Until three weeks ago, Brittany Kaiser, a top exec there, 428 00:28:49,352 --> 00:28:52,981 she had a key to Steve Bannon's townhouse in Washington. 429 00:28:53,064 --> 00:28:56,151 She spoke at CPAC in 2016, along with Kellyanne Conway, 430 00:28:56,234 --> 00:28:58,653 spent election night at the Trump victory party 431 00:28:58,737 --> 00:29:01,364 with mega-donor Rebekah Mercer. 432 00:29:02,532 --> 00:29:04,743 [reporter 1] Miss Kaiser was also closely involved 433 00:29:04,826 --> 00:29:07,537 with millionaire Brexit supporter Arron Banks 434 00:29:07,620 --> 00:29:09,706 and his Leave. EU campaign. 435 00:29:18,173 --> 00:29:20,067 [reporter 3] She's raising some interesting things. 436 00:29:20,091 --> 00:29:21,444 Why is she talking now, do you think? 437 00:29:21,468 --> 00:29:23,470 [man] Well, she only gave us part of the picture. 438 00:29:23,553 --> 00:29:27,056 She's talking to investigators, and so we'll know the full picture 439 00:29:27,140 --> 00:29:28,475 at some point later... 440 00:29:37,942 --> 00:29:39,944 [soft jazz playing] 441 00:29:51,456 --> 00:29:53,666 [Kaiser] I have evidence 442 00:29:53,750 --> 00:29:58,171 that the Brexit campaigns and the Trump campaign 443 00:29:58,254 --> 00:30:00,423 could've been conducted illegally. 
444 00:30:03,343 --> 00:30:07,514 And so, for my own safety, I don't need geolocation of where this is. 445 00:30:07,680 --> 00:30:09,474 Just me sitting here... 446 00:30:11,434 --> 00:30:13,937 the person trying to overthrow two administrations 447 00:30:14,020 --> 00:30:16,856 and all of the most powerful companies in the world, 448 00:30:16,940 --> 00:30:18,858 all at once. 449 00:30:18,942 --> 00:30:20,527 [chuckles] 450 00:30:20,610 --> 00:30:25,240 With one disjointed but hopefully-soon-seamless narrative. 451 00:30:25,323 --> 00:30:26,366 [chuckles] 452 00:30:27,784 --> 00:30:31,204 The wealthiest companies are technology companies. 453 00:30:31,913 --> 00:30:34,999 Google, Facebook, Amazon, Tesla. 454 00:30:35,500 --> 00:30:37,710 And the reason why these companies 455 00:30:37,794 --> 00:30:40,547 are the most powerful companies in the world 456 00:30:40,630 --> 00:30:45,343 is because, last year, data surpassed oil in its value. 457 00:30:45,802 --> 00:30:48,388 Data is the most valuable asset on Earth. 458 00:30:49,430 --> 00:30:52,517 And these companies are valuable 459 00:30:52,600 --> 00:30:56,271 because they have been exploiting people's assets. 460 00:30:57,939 --> 00:31:00,817 It wasn't until one of my friends reached out to me 461 00:31:00,900 --> 00:31:03,945 to ask was I going to be all right 462 00:31:04,028 --> 00:31:07,740 with the way that my story would be seen in history. 463 00:31:08,867 --> 00:31:11,452 And I thought, "No. 464 00:31:12,537 --> 00:31:14,414 I'm not okay, actually." [chuckles] 465 00:31:14,497 --> 00:31:18,042 And there's probably a lot of information that I could give 466 00:31:18,126 --> 00:31:22,171 that would be helpful to making things okay, possibly. 467 00:31:42,859 --> 00:31:44,611 [Hilder] I'm a political technologist 468 00:31:44,694 --> 00:31:47,989 who tries to shine a big light 469 00:31:48,072 --> 00:31:50,408 on how data's been used and abused. 470 00:31:52,535 --> 00:31:55,038 It's a moment where people have that visceral sense. 471 00:31:55,121 --> 00:31:56,748 There is, you know, 472 00:31:57,332 --> 00:31:59,918 that there's something wrong here, uh, that we need to fix. 473 00:32:00,919 --> 00:32:05,506 And so, I've dropped pretty much everything I was doing 474 00:32:05,590 --> 00:32:07,967 to work on this with Brittany Kaiser. 475 00:32:10,345 --> 00:32:13,306 I went and found her and met her, 476 00:32:13,389 --> 00:32:15,642 and she was very forthcoming 477 00:32:16,267 --> 00:32:20,063 in a way which made me think, [chuckles] "There's a lot here." 478 00:32:20,563 --> 00:32:24,025 [applause, scattered cheers] 479 00:32:26,861 --> 00:32:30,657 What we really need to be understanding is people's levers of persuasion. 480 00:32:30,740 --> 00:32:34,118 How are we actually going to message voters so that they can under... 481 00:32:34,202 --> 00:32:37,497 [man] Tina and I met with Brittany Kaiser. 482 00:32:37,580 --> 00:32:40,249 [Kaiser] We look very unlike any other political 483 00:32:40,333 --> 00:32:42,001 and communications firm, so... 484 00:32:42,085 --> 00:32:44,045 [woman] Do you work both sides of the aisle? 485 00:32:44,128 --> 00:32:46,523 Uh, no, we only work for the Republicans in the United States. 486 00:32:46,547 --> 00:32:47,547 [woman] Okay. 487 00:32:48,341 --> 00:32:49,676 And in Britain? 488 00:32:49,842 --> 00:32:53,304 [Kaiser] Well, actually, right now we're working on the Brexit campaign. 489 00:32:54,555 --> 00:32:57,767 At Leave. 
EU, we're going to be running a large-scale research 490 00:32:57,850 --> 00:33:00,103 throughout the nation to really understand 491 00:33:00,186 --> 00:33:03,481 why people are interested in staying in or out of the EU. 492 00:33:03,564 --> 00:33:07,652 And the answers to that will help inform our policy and our communications, 493 00:33:07,735 --> 00:33:10,321 to make sure that we turn out more first-time voters, 494 00:33:10,405 --> 00:33:14,200 more unregistered voters, more apathetic voters than ever before. 495 00:33:31,009 --> 00:33:33,928 [Hilder] I think we now have the foundations laid for... 496 00:33:34,554 --> 00:33:38,057 her to share what is some reasonably explosive materials 497 00:33:38,224 --> 00:33:39,642 that we've been finding. 498 00:33:40,226 --> 00:33:46,024 Uh, and, uh... her inbox and her hard drive, 499 00:33:46,733 --> 00:33:50,945 uh, really are a treasure trove of... uh, sketchy information. 500 00:33:53,906 --> 00:33:56,075 And we're still just scratching the surface. 501 00:33:59,454 --> 00:34:02,540 [dance music playing] 502 00:34:21,768 --> 00:34:24,520 [Hilder] Tell us about the first meeting you had in Trump Tower. 503 00:34:24,812 --> 00:34:27,065 In November 2015, 504 00:34:27,148 --> 00:34:32,153 I went with Alexander Nix to go see Corey Lewandowski, 505 00:34:32,236 --> 00:34:34,238 who was the campaign manager at the time. 506 00:34:34,489 --> 00:34:38,951 And I asked Corey, why could this place possibly look so familiar? 507 00:34:39,118 --> 00:34:42,413 And he said, "This is the set of The Apprentice. 508 00:34:42,789 --> 00:34:45,041 That's probably why you recognize it." And... 509 00:34:45,541 --> 00:34:47,376 I was kind of shocked, you know? 510 00:34:48,127 --> 00:34:52,048 The Trump campaign HQ is a reality TV set. 511 00:34:52,131 --> 00:34:53,883 [Hilder] Yes. It is. 512 00:34:57,762 --> 00:35:00,765 And the idea of a company... 513 00:35:01,099 --> 00:35:04,143 conducting large-scale analysis of a population... 514 00:35:04,227 --> 00:35:05,227 Mm-hmm. 515 00:35:05,269 --> 00:35:08,564 ...and then identifying the triggers that people have 516 00:35:08,648 --> 00:35:12,068 in terms of what's gonna move them from one state to another state, 517 00:35:12,151 --> 00:35:15,113 that feels very challenging 518 00:35:15,196 --> 00:35:18,324 to the individual's sense of autonomy and freedom... 519 00:35:18,407 --> 00:35:19,492 - Mm-hmm. - ...uh... 520 00:35:19,575 --> 00:35:22,203 and to the idea of democracy. 521 00:35:22,662 --> 00:35:23,704 Doesn't it? 522 00:35:24,455 --> 00:35:28,000 I don't know. Um... I would challenge that. 523 00:35:28,668 --> 00:35:31,420 What this strategy is mostly meant to do 524 00:35:31,504 --> 00:35:34,590 is to identify people who are still considering 525 00:35:34,674 --> 00:35:36,384 - many different options... - Yes. 526 00:35:36,467 --> 00:35:41,264 ...and educate them on some of the options that are out there, 527 00:35:41,347 --> 00:35:42,765 and if they're on the fence, 528 00:35:42,849 --> 00:35:45,893 then they can be persuaded to go one way or the other. 529 00:35:45,977 --> 00:35:48,855 - Yes, they can. - Uh, again, that is their own choice. 530 00:35:48,938 --> 00:35:50,398 - But a lot of the times... - Is it? 531 00:35:50,481 --> 00:35:53,201 - ...these are individuals that... - [Hilder] Is it their own choice? 532 00:35:54,569 --> 00:35:56,689 In the end, they're the ones that go to the ballot box 533 00:35:56,737 --> 00:35:58,990 - and make their ch... decision. - [Hilder] Yeah. 
534 00:35:59,365 --> 00:36:02,368 I mean, I'm asking you these questions as Brittany Kaiser. 535 00:36:02,451 --> 00:36:03,536 - [Kaiser] I know. - Right? 536 00:36:03,619 --> 00:36:07,248 I'm not asking you these questions as Cambridge Analytica or SCL, 537 00:36:07,331 --> 00:36:10,334 because that's not who you are anymore. Right? 538 00:36:10,626 --> 00:36:12,545 I get it. I get it. But... 539 00:36:12,628 --> 00:36:14,748 [Hilder stammers] And do you think Cambridge Analytica 540 00:36:14,797 --> 00:36:17,925 was ever involved in the contravention of people's human rights? 541 00:36:18,009 --> 00:36:19,010 [Kaiser] No. 542 00:36:20,636 --> 00:36:25,766 But, [chuckles] again, I start to question a lot of things the more I hear. 543 00:36:25,850 --> 00:36:26,800 [Hilder] Yeah. 544 00:36:26,809 --> 00:36:29,770 [Kaiser] I mean, I had spent my entire career before that 545 00:36:29,854 --> 00:36:31,939 working for human rights. 546 00:36:33,900 --> 00:36:34,900 [Hilder] Okay. 547 00:36:35,318 --> 00:36:37,111 Let's go back to that. [chuckles] 548 00:36:38,196 --> 00:36:41,407 It wasn't that long ago. Just a decade. 549 00:36:43,159 --> 00:36:45,077 - [Hilder] It wasn't that long ago. - Yeah. 550 00:36:50,541 --> 00:36:54,170 [Kaiser] I had worked in elections since I was 14 or 15. 551 00:36:55,922 --> 00:36:59,842 I told my cousin I applied to intern on the Obama campaign. 552 00:37:00,092 --> 00:37:01,302 She was like, "Oh, my God. 553 00:37:01,385 --> 00:37:03,721 You better get that internship, or I'll die." 554 00:37:05,014 --> 00:37:07,767 I was part of the team running Obama's Facebook. 555 00:37:09,310 --> 00:37:14,065 We invented the way social media is used to communicate with voters. 556 00:37:19,820 --> 00:37:22,990 I then spent several years working on human rights 557 00:37:23,074 --> 00:37:24,742 and international relations, 558 00:37:25,576 --> 00:37:27,328 first for Amnesty International, 559 00:37:27,411 --> 00:37:31,123 then lobbying at the United Nations and European Parliament. 560 00:37:34,710 --> 00:37:38,214 And I used to always say I love human rights campaigning, 561 00:37:38,923 --> 00:37:41,968 but sometimes I feel like I'm banging my head against a brick wall 562 00:37:42,051 --> 00:37:44,387 because I can't see the results of what I'm doing. 563 00:37:44,470 --> 00:37:46,889 I don't know if I'm literally just wasting my time. 564 00:37:49,433 --> 00:37:52,144 And that's where I was when I met Alexander Nix. 565 00:37:53,896 --> 00:37:56,524 - [low chatter] - [cutlery clinking] 566 00:37:56,607 --> 00:37:59,487 [Kaiser] Friends of ours thought it would be a good joke to introduce us. 567 00:38:00,486 --> 00:38:04,240 He was very interested in learning more about my experience with the Democrats. 568 00:38:04,573 --> 00:38:06,367 He gave me his card and said, 569 00:38:06,450 --> 00:38:09,161 "Let me get you drunk and steal your secrets." 570 00:38:12,248 --> 00:38:16,294 And in December 2014, he offered me a job. 571 00:38:21,215 --> 00:38:24,927 Coming across a company where you could actually see your impact 572 00:38:25,011 --> 00:38:27,346 was really exciting for me. 573 00:38:33,894 --> 00:38:36,564 I got a little more conservative or posh 574 00:38:36,647 --> 00:38:41,110 in terms of the way that I dressed and the way that I spoke, 575 00:38:41,944 --> 00:38:46,532 and doing things like going on shooting at the weekends 576 00:38:46,615 --> 00:38:47,992 and stuff like that. 
577 00:38:48,409 --> 00:38:52,538 It's just very different to what I would normally spend my time doing. 578 00:38:59,962 --> 00:39:01,881 [Hilder] Must've been a hell of an adventure. 579 00:39:02,214 --> 00:39:05,343 It was really interesting. I strapped on my cowboy boots, 580 00:39:05,426 --> 00:39:08,554 got into character, got my NRA membership. 581 00:39:08,721 --> 00:39:11,140 - Yeah, you joined the NRA, right? - I did, yeah. 582 00:39:11,223 --> 00:39:13,059 Just to understand how these people think, 583 00:39:13,142 --> 00:39:14,142 - like... - Uh-huh. 584 00:39:14,268 --> 00:39:15,686 I don't want to use guns. 585 00:39:15,770 --> 00:39:18,356 - I'm not really interested in guns at all. - Yeah. 586 00:39:18,439 --> 00:39:20,316 I felt like I was getting to know... 587 00:39:21,233 --> 00:39:24,236 people that I used to disagree with a lot, 588 00:39:24,320 --> 00:39:27,281 like my grandparents, my aunts, uncles, cousins. 589 00:39:28,532 --> 00:39:32,620 [Hilder] So this wasn't just an outfit that you put on, and it felt important? 590 00:39:32,870 --> 00:39:35,498 It was important. It is important. 591 00:39:35,581 --> 00:39:37,833 - [Hilder] Yeah. - I feel like the main problem 592 00:39:38,000 --> 00:39:39,210 in US politics 593 00:39:39,293 --> 00:39:43,130 is that people are so polarized that they can't understand each other, 594 00:39:43,214 --> 00:39:46,092 and therefore they can't work together, and therefore nothing gets done. 595 00:39:47,468 --> 00:39:51,138 - [PA chimes] - [woman speaking indistinctly on PA] 596 00:39:59,647 --> 00:40:03,651 I am about to draft some questions for a senator 597 00:40:03,734 --> 00:40:07,029 who will be able to ask them to Mark Zuckerberg 598 00:40:07,113 --> 00:40:10,950 in the Senate Judiciary hearing on Tuesday. 599 00:40:11,283 --> 00:40:14,829 "How much of Facebook's revenue 600 00:40:15,996 --> 00:40:19,375 comes directly from the monetization 601 00:40:20,000 --> 00:40:23,045 of users' personal data?" 602 00:40:24,713 --> 00:40:26,590 [chortles] All of it! 603 00:40:27,967 --> 00:40:29,552 Exactly. 604 00:40:30,386 --> 00:40:33,013 The reality is that Facebook knows more about this 605 00:40:33,097 --> 00:40:34,974 than pretty much anyone in the world 606 00:40:35,057 --> 00:40:39,937 because Facebook is the best platform on which to run experiments. 607 00:40:40,020 --> 00:40:41,939 - Yeah, it is. Um, it... - [laughing] 608 00:40:42,022 --> 00:40:44,525 And it actually always gets you the best engagement rates. 609 00:40:44,608 --> 00:40:46,735 We always spend the majority amount of money 610 00:40:46,819 --> 00:40:49,613 on any commercial or political campaign in Facebook. 611 00:40:50,156 --> 00:40:51,866 Always gets the majority of the ad budget. 612 00:40:51,949 --> 00:40:53,617 - It does, it does. - Yep. 613 00:40:55,411 --> 00:40:58,956 [Hilder] There is at least the possibility that the American public 614 00:40:59,039 --> 00:41:01,876 and publics in other countries have been experimented on. 615 00:41:07,423 --> 00:41:09,484 [Kaiser] Remember those Facebook quizzes that we used 616 00:41:09,508 --> 00:41:12,678 to form personality models for all voters in the US? 617 00:41:15,556 --> 00:41:19,351 The truth is, we didn't target every American voter equally. 618 00:41:20,519 --> 00:41:22,396 The bulk of our resources 619 00:41:22,480 --> 00:41:26,108 went into targeting those whose minds we thought we could change. 620 00:41:26,901 --> 00:41:29,028 We called them "the persuadables." 
621 00:41:31,238 --> 00:41:32,907 They're everywhere in the country, 622 00:41:32,990 --> 00:41:36,619 but the persuadables that mattered were the ones in swing states 623 00:41:36,702 --> 00:41:40,873 like Michigan, Wisconsin, Pennsylvania, and Florida. 624 00:41:44,001 --> 00:41:47,922 Now, each of these states were broken down by precinct. 625 00:41:49,215 --> 00:41:53,093 So, you can say there are 22,000 persuadable voters 626 00:41:53,302 --> 00:41:54,845 in this precinct, 627 00:41:55,971 --> 00:41:59,725 and if we target enough persuadable people in the right precincts, 628 00:41:59,850 --> 00:42:03,479 then those states would turn red instead of blue. 629 00:42:04,980 --> 00:42:07,942 Our creative team designed personalized content 630 00:42:08,025 --> 00:42:09,693 to trigger those individuals. 631 00:42:09,777 --> 00:42:12,029 Terrorists love porous borders. 632 00:42:12,112 --> 00:42:15,324 Widespread gaps in border security allow terrorists... 633 00:42:15,407 --> 00:42:19,745 [Kaiser] We bombarded them through blogs, websites, articles, videos, ads, 634 00:42:19,828 --> 00:42:21,664 every platform you can imagine. 635 00:42:22,122 --> 00:42:24,959 Until they saw the world the way we wanted them to. 636 00:42:27,920 --> 00:42:29,088 [emoji growls] 637 00:42:29,296 --> 00:42:31,590 [Kaiser] Until they voted for our candidate. 638 00:42:33,384 --> 00:42:35,010 It's like a boomerang. 639 00:42:35,553 --> 00:42:37,137 You send your data out, 640 00:42:38,138 --> 00:42:39,807 it gets analyzed, 641 00:42:40,224 --> 00:42:43,727 and it comes back at you as targeted messaging 642 00:42:44,311 --> 00:42:46,146 to change your behavior. 643 00:42:53,070 --> 00:42:55,072 [indistinct chatter] 644 00:43:00,744 --> 00:43:02,371 [indistinct announcement over PA] 645 00:43:02,454 --> 00:43:05,374 DCMS Committee announced the future witnesses 646 00:43:05,457 --> 00:43:07,209 - for a fake news inquiry. - Yes. 647 00:43:07,668 --> 00:43:09,712 - There you are. You're... - Me. 648 00:43:09,795 --> 00:43:13,090 - [Hilder] You're the day before Alexander. - [Kaiser] The former CEO. 649 00:43:13,966 --> 00:43:15,634 He's going the day after me. 650 00:43:16,844 --> 00:43:19,847 Yes. Is it... Is it all feeling a bit real? 651 00:43:19,930 --> 00:43:21,557 [chuckling] It's really intense. 652 00:43:22,266 --> 00:43:23,642 - [sniffles] - [Hilder] It's real. 653 00:43:23,976 --> 00:43:26,437 - And it's big. - [Kaiser sighs] 654 00:43:39,533 --> 00:43:42,578 [Cadwalladr] The first time I wrote about Cambridge Analytica, 655 00:43:43,037 --> 00:43:45,497 it was December 2016. 656 00:43:47,291 --> 00:43:49,501 I said that they'd worked for the Trump campaign 657 00:43:49,585 --> 00:43:51,587 and for the Brexit campaign. 658 00:43:54,465 --> 00:43:56,305 And I started getting letters from them saying, 659 00:43:56,383 --> 00:43:58,218 "We never worked for the Leave campaign." 660 00:44:02,097 --> 00:44:04,725 And this was baffling because on Leave. EU's website, 661 00:44:04,808 --> 00:44:07,019 it said, "We hired Cambridge Analytica." 662 00:44:08,896 --> 00:44:11,482 There were statements from Alexander Nix about how they worked 663 00:44:11,565 --> 00:44:12,650 for the Leave campaign. 664 00:44:13,484 --> 00:44:17,112 Yeah, I'm afraid we don't talk about that campaign. At all. 665 00:44:17,571 --> 00:44:19,365 [laughs] 666 00:44:19,448 --> 00:44:21,075 [woman] You didn't? Or you did? 667 00:44:21,158 --> 00:44:22,451 No, no, we don't discuss it. 668 00:44:22,534 --> 00:44:24,134 - [woman] Okay. Not at all. 
- [Nix] Yeah. 669 00:44:24,703 --> 00:44:27,831 [Cadwalladr] And that was when I discovered this video 670 00:44:27,915 --> 00:44:30,125 of Leave. EU's press launch. 671 00:44:32,628 --> 00:44:35,798 And I was like, well, there, look, it's Brittany Kaiser! 672 00:44:35,881 --> 00:44:38,425 She works for Cambridge Analytica. 673 00:44:38,509 --> 00:44:40,552 She's at the press launch 674 00:44:40,636 --> 00:44:43,389 talking about all the clever things 675 00:44:43,472 --> 00:44:46,350 that they're going to do with data for the Leave Campaign. 676 00:44:48,102 --> 00:44:50,979 Like, what the fuck? 677 00:44:51,063 --> 00:44:54,817 How can you carry on denying it? This is nuts! 678 00:44:54,900 --> 00:44:57,945 [Russian national anthem playing] 679 00:44:58,779 --> 00:45:00,539 [Cadwalladr] And it was exactly the same time 680 00:45:00,614 --> 00:45:02,408 that Leave. EU started posting 681 00:45:02,491 --> 00:45:04,243 the horrible videos of me. 682 00:45:04,326 --> 00:45:05,929 [woman panting] I've gotta get out of here! 683 00:45:05,953 --> 00:45:08,747 [Cadwalladr] There was a spoof video of a scene from Airplane! 684 00:45:08,831 --> 00:45:11,434 - [stewardess] Get a hold of yourself! - [man] Please, let me handle this. 685 00:45:11,458 --> 00:45:14,604 [Cadwalladr] There was like a whole stream of people going, "Don't be so hysterical!" 686 00:45:14,628 --> 00:45:15,628 And, like, hitting her. 687 00:45:15,671 --> 00:45:17,589 Go back to your seat! I'll take care of this. 688 00:45:17,673 --> 00:45:19,717 [Cadwalladr] It was like, "Calm down! Calm down!" 689 00:45:19,800 --> 00:45:22,278 [makes slapping sound] And, you know, so a whole line of people, 690 00:45:22,302 --> 00:45:24,638 and then the last person's carrying a gun. 691 00:45:26,390 --> 00:45:27,975 And the whole thing they had, 692 00:45:28,058 --> 00:45:30,936 it was set to the music from the Russian national anthem. 693 00:45:31,019 --> 00:45:33,564 [anthem continues playing] 694 00:45:35,065 --> 00:45:36,400 Ugh. 695 00:45:38,110 --> 00:45:41,488 The day after that Leave. EU video was put out, 696 00:45:41,572 --> 00:45:45,701 the editor of another news organization that I was going to do a report for 697 00:45:46,201 --> 00:45:47,536 took me for lunch and said, 698 00:45:47,619 --> 00:45:50,579 "Actually, we think it's too much of a risk having you present the report." 699 00:45:53,792 --> 00:45:58,046 It is this sort of visceral thing of living with this disinformation 700 00:45:58,130 --> 00:46:00,924 and this propaganda every single day. 701 00:46:01,216 --> 00:46:04,887 And feeling the effects of it and knowing that it does work, 702 00:46:04,970 --> 00:46:07,973 it does have an impact in real life, whether people believe that or not. 703 00:46:13,437 --> 00:46:15,439 You know, I'm used to just writing stories. 704 00:46:15,522 --> 00:46:18,025 You write it, and then you go on to the next subject. 705 00:46:18,108 --> 00:46:21,445 I'm a feature writer, that's what I did. But I was just like, "They've lied." 706 00:46:21,528 --> 00:46:24,168 And they're lying about something which is actually really massive. 707 00:46:24,239 --> 00:46:26,533 'Cause it's, you know, the... 708 00:46:27,075 --> 00:46:29,953 rest of the future of our country. 709 00:46:30,662 --> 00:46:33,749 - [reporter 1] What do you think, Nigel? - [reporters murmuring indistinctly] 710 00:46:35,584 --> 00:46:39,755 [Cadwalladr] In the referendum, most people had very fixed views. 
711 00:46:40,672 --> 00:46:44,218 But there was a tiny sliver of people who didn't. 712 00:46:44,301 --> 00:46:46,345 These were "the persuadables." 713 00:46:46,678 --> 00:46:50,057 It was all about finding these very few people 714 00:46:50,140 --> 00:46:52,810 and then bombarding them with ads. 715 00:46:55,020 --> 00:46:58,649 This is the thing which was invisible to all of us. 716 00:47:00,442 --> 00:47:06,448 Let June the 23rd go down in our history as our independence day! 717 00:47:06,532 --> 00:47:08,867 [loud cheering] 718 00:47:10,494 --> 00:47:13,705 [man 1] The British people have spoken, and the answer is, "We're out." 719 00:47:13,789 --> 00:47:15,082 [woman 1] For good or for ill, 720 00:47:15,165 --> 00:47:18,043 this decision will define our politics for years to come. 721 00:47:18,126 --> 00:47:22,256 [woman 2] This great country has made a terrible mistake. 722 00:47:22,339 --> 00:47:24,174 [man 2] It's an earthquake that has happened. 723 00:47:24,258 --> 00:47:27,553 And what happens after earthquakes? We wait to see. 724 00:47:27,886 --> 00:47:30,013 [woman 3] People weren't agreed on what Leave meant. 725 00:47:30,097 --> 00:47:32,992 - [man 3] It's simple: leave. Full stop. - [woman 3] Was no manifesto for Leave. 726 00:47:33,016 --> 00:47:34,643 But there is no "leave, full stop..." 727 00:47:34,726 --> 00:47:37,604 [crowd chanting] Brexit! Brexit! Brexit! 728 00:47:38,730 --> 00:47:40,232 [PA chimes] 729 00:47:40,315 --> 00:47:41,995 [woman over PA] In the interest of safety, 730 00:47:42,067 --> 00:47:45,487 parents are advised not to carry children on baggage trolleys 731 00:47:45,571 --> 00:47:47,573 or allow them to play on the escalators. 732 00:47:47,656 --> 00:47:49,575 - [Kaiser] Hi, Mama! - [mother on phone] Hey! 733 00:47:50,784 --> 00:47:52,744 I'm through, I'm through, I'm through, yeah. 734 00:47:52,828 --> 00:47:56,748 So, I managed to get into the United Kingdom with no issues, 735 00:47:56,832 --> 00:47:59,251 which is really fantastic. 736 00:47:59,585 --> 00:48:02,379 [mother] Well, I just want you to mentally be okay with this, 737 00:48:02,462 --> 00:48:06,091 'cause what you're doing is a monumental undertaking. 738 00:48:06,174 --> 00:48:07,133 I know. 739 00:48:07,134 --> 00:48:08,719 And I still fear for your life. 740 00:48:08,802 --> 00:48:09,803 Yeah. 741 00:48:09,887 --> 00:48:12,097 With the powerful people that are involved... 742 00:48:12,639 --> 00:48:13,807 [Kaiser] Yeah, I know. 743 00:48:13,891 --> 00:48:15,976 You just have to be careful all the time. 744 00:48:16,059 --> 00:48:17,853 [Kaiser] I know, but I can't keep quiet 745 00:48:17,936 --> 00:48:20,147 just because it'll make powerful people mad. 746 00:48:20,230 --> 00:48:23,066 I know. I know, I know, I know. I know. 747 00:48:23,734 --> 00:48:26,111 I totally get that, you know. 748 00:48:26,862 --> 00:48:30,908 Somebody's always got to bring down these jerks. 749 00:48:31,074 --> 00:48:32,200 Exactly. 750 00:48:32,284 --> 00:48:34,953 So, you know, anyway. 751 00:48:35,203 --> 00:48:37,039 When I have time off next month, 752 00:48:37,122 --> 00:48:40,667 I've gotta go and put deposits down on electric and gas. 753 00:48:41,126 --> 00:48:44,379 I don't have $1,000 right now, so I'll have to wait. 754 00:48:44,463 --> 00:48:47,382 [Kaiser] Well, I could... I could pay for it. That means... 755 00:48:47,466 --> 00:48:49,885 Oh, don't worry, I don't need it right now. 756 00:48:51,720 --> 00:48:53,472 All right, honey, you stay healthy. 
757 00:48:53,555 --> 00:48:54,472 [Kaiser] Love you. 758 00:48:54,473 --> 00:48:55,599 Be safe, honey. I love you. 759 00:48:55,682 --> 00:48:56,599 [Kaiser] I love you. Bye-bye. 760 00:48:56,600 --> 00:48:57,809 Bye-bye, baby. 761 00:48:59,603 --> 00:49:00,604 [Kaiser sighs] 762 00:49:07,945 --> 00:49:09,821 [Kaiser] I'm really happy to be back, 763 00:49:09,905 --> 00:49:14,368 but I don't think I can really do much going out in public while I'm here. 764 00:49:17,162 --> 00:49:18,664 Last time I left, 765 00:49:18,747 --> 00:49:23,627 I was in a very difficult situation with a lot of my friends. 766 00:49:24,294 --> 00:49:25,934 - [Kaiser] So fire door number one? - Yes. 767 00:49:26,004 --> 00:49:27,673 - [man murmurs indistinctly] - What? 768 00:49:29,883 --> 00:49:33,345 [Kaiser] So many people were so angry that I was working on the Brexit campaign, 769 00:49:33,428 --> 00:49:36,682 so angry I continued to work for a company 770 00:49:36,765 --> 00:49:39,601 that supported people like Ted Cruz and Donald Trump. 771 00:49:42,312 --> 00:49:45,649 And there's still this whole group of people that are wondering, 772 00:49:45,732 --> 00:49:49,361 am I taking the high road, or am I doing something to protect myself? 773 00:49:56,368 --> 00:49:58,137 [reporter 1] More news on Facebook over the weekend, 774 00:49:58,161 --> 00:50:00,580 as Mark Zuckerberg prepares to testify before Congress 775 00:50:00,664 --> 00:50:01,957 tomorrow and Wednesday. 776 00:50:02,040 --> 00:50:03,851 Earlier this morning, he announced some new measures 777 00:50:03,875 --> 00:50:06,670 in the company's efforts to prevent interference in elections... 778 00:50:11,049 --> 00:50:13,051 [low chatter] 779 00:50:18,056 --> 00:50:20,684 [Kaiser gasps] It's today's FT. 780 00:50:20,767 --> 00:50:23,729 My name's at the top of the front page of the FT. 781 00:50:23,812 --> 00:50:24,855 Oh, shit. 782 00:50:25,856 --> 00:50:27,566 [chuckles] There it is. 783 00:50:29,234 --> 00:50:31,111 "Zuckerberg braced for Congress grilling. 784 00:50:31,194 --> 00:50:34,573 Facebook chief will admit that the social network did not do enough 785 00:50:34,656 --> 00:50:37,117 to stop its tools being used for harm." 786 00:50:37,826 --> 00:50:41,538 [Carroll] "Facebook should pay its two billion users for their personal data. 787 00:50:41,788 --> 00:50:45,834 The big tech companies are evolving into digital kleptocracies. 788 00:50:46,084 --> 00:50:47,294 Yesterday." 789 00:50:49,171 --> 00:50:50,797 [clicks tongue] Um... 790 00:50:51,590 --> 00:50:54,301 I sort of missed the paragraph of, like, 791 00:50:55,343 --> 00:50:57,554 "I helped build this monster 792 00:50:58,472 --> 00:50:59,472 that... 793 00:51:01,767 --> 00:51:05,395 wreaked havoc upon the world and will take decades to recover from, 794 00:51:06,396 --> 00:51:10,025 and I feel really bad about that." I don't see that here. 795 00:51:12,194 --> 00:51:14,071 [Kaiser] The data wars have begun. 796 00:51:15,572 --> 00:51:17,365 [camera shutters clicking] 797 00:51:24,498 --> 00:51:28,627 [Carroll] I mean, this is a company that is a superstate, 798 00:51:28,919 --> 00:51:33,131 and the only nation that has jurisdiction over it is ours. 799 00:51:33,215 --> 00:51:36,384 [gavel slams on video feed] 800 00:51:40,347 --> 00:51:44,810 [Grassley] The Committees on the Judiciary and Commerce, Science and Transportation 801 00:51:44,893 --> 00:51:46,228 will come to order. 
802 00:51:48,271 --> 00:51:51,024 [Zuckerberg] Chairman Grassley and members of the committee: 803 00:51:52,192 --> 00:51:57,364 My top priority has always been our social mission of connecting people, 804 00:51:57,447 --> 00:52:00,158 building community, and bringing the world closer together. 805 00:52:01,243 --> 00:52:04,538 But it's clear now that we didn't do enough to prevent these tools 806 00:52:04,621 --> 00:52:06,206 from being used for harm as well. 807 00:52:06,706 --> 00:52:09,417 Before I talk about the steps we're taking to address them, 808 00:52:09,501 --> 00:52:10,981 I want to talk about how we got here. 809 00:52:11,545 --> 00:52:14,172 When we first contacted Cambridge Analytica, 810 00:52:14,256 --> 00:52:16,466 they told us that they had deleted the data. 811 00:52:16,842 --> 00:52:17,842 About a month ago, 812 00:52:17,884 --> 00:52:20,303 we heard new reports that suggested that wasn't true. 813 00:52:21,054 --> 00:52:25,225 So, we're getting to the bottom of exactly what Cambridge Analytica did. 814 00:52:25,767 --> 00:52:28,019 [Kaiser] Blame it on me, Mark. Go for it. 815 00:52:28,103 --> 00:52:30,689 ...to address this and to prevent it from happening again. 816 00:52:31,481 --> 00:52:35,318 Thank you for having me here today, and I'm ready to take your questions. 817 00:52:36,444 --> 00:52:40,907 Well, Mr. Zuckerberg, during the 2016 campaign, 818 00:52:41,032 --> 00:52:44,286 Cambridge Analytica worked with the Trump campaign 819 00:52:44,369 --> 00:52:47,789 to refine tactics under Project Alamo. 820 00:52:47,873 --> 00:52:50,292 Were Facebook employees involved in that? 821 00:52:51,585 --> 00:52:54,480 Senator, I don't know that our employees were involved with Cambridge Analytica. 822 00:52:54,504 --> 00:52:56,464 - Yes, they were. - Whoa! 823 00:52:56,548 --> 00:52:57,591 [man] Oh, my God. 824 00:52:57,674 --> 00:53:00,468 The Republican team in DC was. I met them. 825 00:53:00,677 --> 00:53:03,680 ...although I know that we did help out the Trump campaign overall 826 00:53:03,763 --> 00:53:06,600 in sales support in the same way that we do with other campaigns. 827 00:53:06,683 --> 00:53:08,560 So, they may have been involved 828 00:53:08,643 --> 00:53:11,271 and all working together during that time period? 829 00:53:11,354 --> 00:53:14,107 Maybe that's something your investigation will find out. 830 00:53:14,191 --> 00:53:16,818 Senator, I can certainly have my team get back to you 831 00:53:16,902 --> 00:53:19,946 on any specifics there that I don't know sitting here today. 832 00:53:20,030 --> 00:53:22,782 Oh, my God. This is the whole point of the hearing, you... 833 00:53:22,866 --> 00:53:25,493 - [Cantwell] Know what I'm talking about? - No, I do not. 834 00:53:25,577 --> 00:53:26,577 [Cantwell] Okay. 835 00:53:27,996 --> 00:53:29,122 [man] It can go to you. 836 00:53:29,789 --> 00:53:33,585 Do you think the 87 million users, do you consider them victims? 837 00:53:34,628 --> 00:53:36,463 Uh, Senator, I think... 838 00:53:36,796 --> 00:53:38,048 Uh... 839 00:53:38,215 --> 00:53:42,010 Yes. I mean, they... they did not want their information to be 840 00:53:42,219 --> 00:53:46,014 sold to Cambridge Analytica by a developer. And... And... 841 00:53:46,348 --> 00:53:47,515 that happened. 842 00:53:47,682 --> 00:53:49,309 And it happened on our watch. 
843 00:53:49,434 --> 00:53:50,936 So even though we didn't do it, 844 00:53:51,019 --> 00:53:53,563 I think we have a responsibility to be able to prevent that 845 00:53:53,647 --> 00:53:55,190 and be able to take action sooner. 846 00:53:55,690 --> 00:53:57,734 One of the steps that we need to take now 847 00:53:57,817 --> 00:54:00,946 is go do a full audit of all of Cambridge Analytica's systems 848 00:54:01,029 --> 00:54:04,449 to understand what they're doing, whether they still have any data... 849 00:54:04,532 --> 00:54:08,411 [man 1] Obviously, Facebook has been done considerable reputational damage 850 00:54:08,495 --> 00:54:10,956 by its association with Cambridge Analytica. 851 00:54:11,039 --> 00:54:13,541 [woman 1] ...process by which Cambridge Analytica... 852 00:54:13,625 --> 00:54:15,228 [woman 2] ...relates to Cambridge Analytica... 853 00:54:15,252 --> 00:54:18,255 [man 2] The recent stories about Cambridge Analytica 854 00:54:18,338 --> 00:54:19,941 - and data mining... - [hearing continues indistinctly] 855 00:54:19,965 --> 00:54:23,176 [man] Look at how many people are posting Zuckerberg... 856 00:54:23,260 --> 00:54:25,321 - [Kaiser] Everyone's watching this. - [man] Oh, my God. 857 00:54:25,345 --> 00:54:27,281 - [Kaiser] And the whole thing just says... - [man] Wow. 858 00:54:27,305 --> 00:54:29,307 [Kaiser]...Cambridge, Cambridge, Cambridge. 859 00:54:29,391 --> 00:54:31,059 [no audible dialogue] 860 00:54:31,851 --> 00:54:33,687 [Kaiser] I never thought everyone in the world 861 00:54:33,770 --> 00:54:36,064 would know who Cambridge Analytica was. 862 00:54:36,147 --> 00:54:38,942 [Zuckerberg hearing continues indistinctly] 863 00:54:46,741 --> 00:54:49,536 [Wheatland] I learnt many things from that period. 864 00:54:50,161 --> 00:54:53,707 I learnt, for example, that when you're in a PR crisis, 865 00:54:53,790 --> 00:54:56,835 the one thing that you can't hire is a PR crisis company. 866 00:54:57,168 --> 00:54:59,337 We spoke to... 867 00:55:00,630 --> 00:55:05,010 [sighs]...tens of PR crisis companies that listened intently, 868 00:55:05,093 --> 00:55:07,220 went away to think about it, and came back and said, 869 00:55:07,304 --> 00:55:10,557 "Sorry, we can't associate ourselves with your brand." [chuckles] 870 00:55:10,640 --> 00:55:15,395 Um... And actually, I thought that's what they were there, um, for. 871 00:55:15,770 --> 00:55:19,858 And so it became impossible to, um... to get a voice. 872 00:55:26,072 --> 00:55:29,075 The reason that we've called this news conference today 873 00:55:29,159 --> 00:55:33,913 is to begin to counter some of the unfounded allegations 874 00:55:33,997 --> 00:55:38,168 and, frankly, the torrent of ill-informed and inaccurate speculation. 875 00:55:47,135 --> 00:55:49,929 Do you think what was done was illegal? 876 00:55:50,013 --> 00:55:51,765 We think it is probably illegal 877 00:55:51,848 --> 00:55:54,517 according to UK law, and that's what we're challenging. 878 00:55:54,601 --> 00:55:56,936 We do have a statement from Cambridge Analytica. 879 00:55:57,020 --> 00:55:58,438 Cambridge said, 880 00:55:58,521 --> 00:56:01,775 "David Carroll has no more right to submit this request 881 00:56:01,858 --> 00:56:05,737 than a member of the Taliban sitting in a cave in Afghanistan." 882 00:56:08,615 --> 00:56:10,867 [Wheatland] And it came like a tsunami. 883 00:56:11,743 --> 00:56:16,331 There were 35,000 media stories per day. 
884 00:56:19,459 --> 00:56:22,087 They wanted to discredit Trump, 885 00:56:22,420 --> 00:56:24,839 they wanted to discredit Brexit, 886 00:56:24,923 --> 00:56:26,841 and we were the vehicle for doing it. 887 00:56:27,634 --> 00:56:31,012 Do you feel that you have skewed democracy? 888 00:56:31,388 --> 00:56:36,351 By providing campaign services to a candidate who'd been fairly nominated 889 00:56:36,434 --> 00:56:39,646 as the Republican representative of the United States? 890 00:56:40,105 --> 00:56:41,523 How is that possible? 891 00:56:43,191 --> 00:56:46,027 [Wheatland] Cambridge Analytica became responsible 892 00:56:46,111 --> 00:56:49,864 for pretty much everything that was wrong in the world. 893 00:56:55,078 --> 00:56:57,622 Are you saying that Cambridge Analytica lies? 894 00:56:57,705 --> 00:57:00,417 They knowingly misrepresent the truth. 895 00:57:00,583 --> 00:57:02,168 What's your proof of that? 896 00:57:02,836 --> 00:57:04,087 I was there. 897 00:57:05,547 --> 00:57:08,133 [Wheatland] Chris Wylie spoke with great authority 898 00:57:08,216 --> 00:57:11,094 about what had gone on in Cambridge Analytica and SCL 899 00:57:11,177 --> 00:57:14,222 during 2015 and 2016, 900 00:57:14,305 --> 00:57:17,517 at a time when he was never there. 901 00:57:18,560 --> 00:57:22,939 He had worked for the company for nine months, left in 2014. 902 00:57:23,231 --> 00:57:28,153 He then went out and pitched the Trump campaign, 903 00:57:28,778 --> 00:57:31,281 and lost to us. 904 00:57:34,242 --> 00:57:37,328 Chris Wylie set out to kill the company. 905 00:57:39,998 --> 00:57:41,332 [man] And what about Brittany? 906 00:57:46,963 --> 00:57:48,965 I don't know what Brittany was doing. 907 00:57:54,387 --> 00:57:56,264 Brittany was someone that... 908 00:57:57,015 --> 00:58:00,185 I thought was a friend, I know Alexander thought was a friend. 909 00:58:01,436 --> 00:58:02,562 But, you know, 910 00:58:04,105 --> 00:58:06,441 when the world gets turned upside down, 911 00:58:07,901 --> 00:58:11,738 people behave in different ways. 912 00:58:12,864 --> 00:58:15,450 Maybe even they don't understand 913 00:58:16,284 --> 00:58:18,495 why they're doing what they're doing at the time. 914 00:58:24,959 --> 00:58:28,713 [Hilder] I would strongly recommend that we start doing testimony prep. 915 00:58:30,131 --> 00:58:32,300 Uh, go through the emails together, 916 00:58:32,383 --> 00:58:35,720 maybe go through other stuff and hash out what's there. 917 00:58:37,430 --> 00:58:39,557 [Kaiser] Oh, my God. I have my entire calendar. 918 00:58:41,226 --> 00:58:43,144 [whispers] Shit. Shit. 919 00:58:43,228 --> 00:58:45,989 - [Hilder] You have your entire calendar? - I have my entire calendar. 920 00:58:46,022 --> 00:58:47,273 Oh, my God. 921 00:58:47,524 --> 00:58:48,650 [Hilder] Downloaded? 922 00:58:48,733 --> 00:58:52,195 [Kaiser] I didn't think that that was going to still link. [gasps] 923 00:58:52,278 --> 00:58:53,363 That's amazing. 924 00:58:53,446 --> 00:58:56,050 I can actually do an entire timeline of everything, if that's the case. 925 00:58:56,074 --> 00:58:57,234 [Hilder] A timeline is great. 926 00:58:57,700 --> 00:58:59,160 [whispers] Fuck, look at this. 927 00:58:59,744 --> 00:59:02,580 [chuckles] September 2015. 928 00:59:04,749 --> 00:59:06,626 US Chamber of Commerce... 929 00:59:07,126 --> 00:59:08,211 Meeting at... 930 00:59:11,172 --> 00:59:13,883 Leave.EU. That was fun. Run-through. 931 00:59:13,967 --> 00:59:15,009 [Hilder] It's all here. 
932 00:59:15,093 --> 00:59:17,762 I know exactly when everything happened. 933 00:59:17,845 --> 00:59:19,472 Always. Forever. 934 00:59:23,393 --> 00:59:25,186 [Kaiser] I didn't realize how much I had. 935 00:59:25,687 --> 00:59:27,021 I've got much more than that, 936 00:59:27,105 --> 00:59:30,108 it's just those are the things I forwarded that I think are worthwhile. 937 00:59:33,444 --> 00:59:34,444 [elevator bell dings] 938 00:59:36,197 --> 00:59:38,783 [Kaiser] Did you get the chance to look through all of that? 939 00:59:38,866 --> 00:59:41,666 [Hilder] I went through most of it. I'll look through more of it today. 940 00:59:44,664 --> 00:59:47,417 [Kaiser] Let me get out one of the pitches. Um... 941 00:59:47,667 --> 00:59:49,919 - [keyboard clacking] - "CA Political." 942 00:59:52,171 --> 00:59:53,339 What is this? 943 00:59:54,257 --> 00:59:55,592 This looks mental. 944 00:59:57,343 --> 01:00:02,849 This is a list of, like, the main sources of data. 945 01:00:04,017 --> 01:00:08,104 And look, it's got that fucking Facebook data set of 30 million individuals. 946 01:00:08,187 --> 01:00:09,856 It just says it in there! [exclaims] 947 01:00:09,939 --> 01:00:11,608 What the fuck! 948 01:00:12,108 --> 01:00:13,401 [whispers] Oh, my God. 949 01:00:14,068 --> 01:00:16,738 - [Hilder] February... - [Kaiser] Created February 4th, 2016! 950 01:00:16,821 --> 01:00:19,866 That was after we said to Facebook that we deleted that shit! 951 01:00:20,992 --> 01:00:23,036 This is the 30 million individuals 952 01:00:23,119 --> 01:00:26,205 that we got their data through Professor Kogan. 953 01:00:26,289 --> 01:00:27,498 That's that. 954 01:00:28,041 --> 01:00:31,794 And it admits to it right here, "Our data makes us different," 955 01:00:32,712 --> 01:00:36,174 because we're scraping people's profiles, and other people are not. 956 01:00:37,967 --> 01:00:38,967 Fuck. 957 01:00:45,475 --> 01:00:47,977 [chuckles] I forgot about this stuff, you know? 958 01:00:48,853 --> 01:00:50,772 There's just so much stuff. 959 01:01:02,241 --> 01:01:04,243 [Nix] ...we seek to do as a firm. 960 01:01:04,327 --> 01:01:06,788 We are a behavior change agency. 961 01:01:08,039 --> 01:01:11,125 The holy grail of communications is 962 01:01:11,209 --> 01:01:13,294 when you can start to change behavior. 963 01:01:15,171 --> 01:01:18,633 Uh, Trinidad. This is a great, interesting case history 964 01:01:18,716 --> 01:01:20,218 of how we look at problems. 965 01:01:21,094 --> 01:01:23,346 [folk music playing] 966 01:01:23,680 --> 01:01:25,682 [Nix] There are two main political parties, 967 01:01:25,890 --> 01:01:27,642 one for the blacks and one for the Indians. 968 01:01:28,059 --> 01:01:29,727 And you know, they screw each other. 969 01:01:30,228 --> 01:01:33,064 So, we were working for the Indians. 970 01:01:34,399 --> 01:01:37,985 We went to the client and we said, "We want to target the youth." 971 01:01:38,152 --> 01:01:42,365 And we try and increase apathy. 972 01:01:44,075 --> 01:01:45,910 The campaign had to be non-political, 973 01:01:45,993 --> 01:01:47,704 because the kids don't care about politics. 974 01:01:47,787 --> 01:01:50,748 It had to be reactive, because they're lazy. 975 01:01:51,582 --> 01:01:54,961 So we came up with this campaign, which was all about: 976 01:01:55,044 --> 01:01:57,171 Be part of the gang. Do something cool. 977 01:01:57,255 --> 01:01:58,423 Be part of a movement. 978 01:01:58,715 --> 01:02:00,967 And it was called the "Do So!" campaign. 
979 01:02:01,384 --> 01:02:03,024 [Kaiser] It means "I'm not going to vote." 980 01:02:03,219 --> 01:02:04,887 [Nix] "Do so! Don't vote." 981 01:02:04,971 --> 01:02:06,973 [crowd cheering] 982 01:02:07,974 --> 01:02:09,726 [man] The salute of resistance 983 01:02:09,851 --> 01:02:12,729 that is known to all across Trinidad and Tobago. 984 01:02:13,396 --> 01:02:15,481 Do So! Do So! 985 01:02:16,399 --> 01:02:18,484 - Do So! - [crowd continues cheering] 986 01:02:18,776 --> 01:02:21,988 [Nix] It's a sign of resistance against, not the government, 987 01:02:22,071 --> 01:02:24,574 against politics and voting. 988 01:02:24,949 --> 01:02:27,034 - ♪ Run with it, run with it ♪ - ♪ Run, run ♪ 989 01:02:27,118 --> 01:02:28,762 - ♪ Run with it, run with it ♪ - ♪ Run, run ♪ 990 01:02:28,786 --> 01:02:31,164 [Nix] They're making their own YouTube videos. 991 01:02:31,247 --> 01:02:34,667 This is the prime minister's house that's being graffitied. 992 01:02:34,751 --> 01:02:36,502 It was carnage. 993 01:02:38,296 --> 01:02:41,048 [Nix] We knew that when it came to voting, 994 01:02:41,215 --> 01:02:44,177 all the Afro-Caribbean kids wouldn't vote, 995 01:02:44,260 --> 01:02:45,261 because they Do So! 996 01:02:45,344 --> 01:02:48,973 But all the Indian kids would do what their parents told them to do, 997 01:02:49,223 --> 01:02:50,808 which is go out and vote. 998 01:02:51,434 --> 01:02:53,269 They had a lot of fun doing this, 999 01:02:53,352 --> 01:02:56,397 but they're not gonna go against their parents' will. 1000 01:02:56,939 --> 01:02:58,024 [fireworks crackling] 1001 01:02:58,107 --> 01:03:00,526 [overlapping shouts] 1002 01:03:00,610 --> 01:03:05,072 Thank God for the guidance that has brought us here to this victory. 1003 01:03:05,156 --> 01:03:07,492 Thank you. Thank you, God. 1004 01:03:07,575 --> 01:03:12,663 [Nix] And the difference in 18-to 35-year-old turnout was like 40%. 1005 01:03:13,372 --> 01:03:16,209 And that swung the election about 6%, 1006 01:03:16,292 --> 01:03:19,253 which was all we needed in an election that's very close. 1007 01:03:22,256 --> 01:03:28,179 We now undertake ten national campaigns for prime minister or president each year. 1008 01:03:28,805 --> 01:03:30,389 Malaysia, we're working in. 1009 01:03:30,473 --> 01:03:34,602 [Kaiser] We did Lithuania, Romania, Kenya, Ghana. 1010 01:03:35,186 --> 01:03:37,266 - [Nix] So quite a few this year. - [Kaiser] Nigeria. 1011 01:03:37,563 --> 01:03:40,650 - [man] The Brexit campaign? - [Nix] Oh, and the Brexit campaign, yeah. 1012 01:03:41,067 --> 01:03:42,902 But we don't talk about that. 1013 01:03:42,985 --> 01:03:44,278 [Kaiser] Oops, we won! 1014 01:03:44,362 --> 01:03:45,738 [laughter] 1015 01:03:50,243 --> 01:03:52,912 [man] Do you worry at all that she might let you down? 1016 01:03:54,539 --> 01:03:55,873 [sighs] 1017 01:03:56,541 --> 01:03:58,251 Look, um... 1018 01:04:01,254 --> 01:04:02,463 [sighs] 1019 01:04:06,843 --> 01:04:08,135 That is a good question. 1020 01:04:10,429 --> 01:04:15,184 I know already that she is a complicated person, uh, 1021 01:04:15,268 --> 01:04:16,435 who has... 1022 01:04:18,521 --> 01:04:20,773 you know, done some complicated things. 1023 01:04:21,524 --> 01:04:22,608 Uh... 1024 01:04:23,734 --> 01:04:26,571 I believe in redemption. 1025 01:04:26,946 --> 01:04:28,239 [chuckles] 1026 01:04:28,948 --> 01:04:35,371 Uh, individual redemption and collective... uh, social redemption. 1027 01:04:35,454 --> 01:04:39,041 Uh, I'm an idealist. 
I think we can fix stuff that's broken, 1028 01:04:39,333 --> 01:04:42,962 uh, and, at the same time, 1029 01:04:43,045 --> 01:04:46,591 I am a realist about the fact that you can't fix everything. Um... 1030 01:04:46,716 --> 01:04:49,176 You know, some things get broken and stay broken. 1031 01:04:54,682 --> 01:04:57,018 Good morning, welcome to this further session 1032 01:04:57,101 --> 01:05:00,104 of the Digital, Culture, Media and Sport Select Committee. 1033 01:05:00,187 --> 01:05:02,773 Very pleased to welcome Brittany Kaiser to give evidence to 1034 01:05:02,857 --> 01:05:04,442 the committee this morning. 1035 01:05:05,276 --> 01:05:09,739 Now, there was a contact between Facebook and Cambridge Analytica 1036 01:05:09,822 --> 01:05:14,994 about the use of data in, I think it was 2015, from memory. 1037 01:05:15,202 --> 01:05:18,205 Did you know about that at the time? 1038 01:05:18,414 --> 01:05:21,876 Uh, so Facebook had announced to all of their clients 1039 01:05:21,959 --> 01:05:25,296 that they were going to close their clients' access to this data, 1040 01:05:25,963 --> 01:05:27,673 so we agreed to delete it, 1041 01:05:27,757 --> 01:05:31,010 but in March 2016, 1042 01:05:31,093 --> 01:05:33,304 you know, six or eight weeks after 1043 01:05:33,387 --> 01:05:37,141 our chief data officer said that those data sets were deleted, 1044 01:05:37,224 --> 01:05:40,227 I have an email from one of our senior data scientists 1045 01:05:40,311 --> 01:05:44,941 that said that we were actually using Facebook Like data in our modeling. 1046 01:05:45,024 --> 01:05:46,525 Ooh. 1047 01:05:46,609 --> 01:05:48,903 - [Kaiser] So that seems strange to me. - Uh-oh. 1048 01:05:48,986 --> 01:05:51,131 [Kaiser] If we had deleted all of the Facebook data sets, 1049 01:05:51,155 --> 01:05:53,491 how we were still using that for modeling in March. 1050 01:05:54,700 --> 01:05:57,596 Ms. Kaiser, you seem to have traveled a long way from an idealistic intern 1051 01:05:57,620 --> 01:05:59,288 in Barack Obama's campaign, 1052 01:05:59,372 --> 01:06:03,376 uh, to working for a company that keeps pretty unsavory company, 1053 01:06:03,459 --> 01:06:07,213 uh, in wishing to make pitches to far-right political parties. 1054 01:06:07,296 --> 01:06:09,799 - Mm-hmm. - [man] Didn't that make you uncomfortable? 1055 01:06:09,966 --> 01:06:10,966 Uh, yes. 1056 01:06:11,717 --> 01:06:15,846 I would say questioning the ethics of it is correct, definitely. 1057 01:06:15,930 --> 01:06:19,892 But I have been offered introductions to clients 1058 01:06:19,976 --> 01:06:21,686 that I refused to meet with before, 1059 01:06:21,769 --> 01:06:25,940 um, such as the Alternative for Germany and Marine Le Pen's campaign. 1060 01:06:26,023 --> 01:06:29,235 I refused to even get on a phone call with them. 1061 01:06:30,361 --> 01:06:31,696 But not UKIP? 1062 01:06:31,988 --> 01:06:33,572 Not UKIP, no. 1063 01:06:35,116 --> 01:06:38,869 Um, did you appear and give a presentation to the launch of Leave.EU? 1064 01:06:38,953 --> 01:06:40,121 Yes, I did. 1065 01:06:40,204 --> 01:06:42,999 [Matheson] Um, you must've been a bit, um, disappointed then 1066 01:06:43,082 --> 01:06:45,501 when you subsequently didn't do any work for them. 1067 01:06:45,793 --> 01:06:48,838 And we didn't do any further work for them after that day, yes. 1068 01:06:48,921 --> 01:06:50,273 [Matheson] So you had done some work. 1069 01:06:50,297 --> 01:06:52,633 What was the nature of the work that you had done so far?
1070 01:06:52,717 --> 01:06:53,968 We had taken receipt 1071 01:06:54,051 --> 01:06:57,388 of UK Independence Party data and the survey data. 1072 01:06:57,471 --> 01:07:03,352 She's, like, contradicting Nix a lot, what he said previously. 1073 01:07:03,436 --> 01:07:05,956 So, I think you've been quite clear, as far as you're concerned, 1074 01:07:06,022 --> 01:07:10,026 you were working on the campaign, but just not being paid for it? 1075 01:07:10,109 --> 01:07:12,069 - Mm-hmm. - [Collins] You're pretty clear on that. 1076 01:07:12,862 --> 01:07:14,739 Just to clarify, for our benefit, 1077 01:07:14,822 --> 01:07:16,615 to be effective in this space, 1078 01:07:16,699 --> 01:07:18,951 how big a kind of working set do you need 1079 01:07:19,035 --> 01:07:22,163 to be able to then use that to create the basis 1080 01:07:22,246 --> 01:07:24,790 for targeting the whole country in terms of voting? 1081 01:07:25,082 --> 01:07:27,185 I'm not a data scientist, so I wouldn't be able to say 1082 01:07:27,209 --> 01:07:29,712 the minimum number of data points that you would require, 1083 01:07:29,795 --> 01:07:32,673 uh, but I do know that their targeting tool 1084 01:07:32,757 --> 01:07:36,594 used to be export-controlled by the British government, 1085 01:07:36,677 --> 01:07:39,930 so that would mean that the methodology was considered a weapon. 1086 01:07:40,598 --> 01:07:43,267 Um, weapons-grade communications tactics. 1087 01:07:43,350 --> 01:07:46,896 What you're saying is that the proposal was for Leave.EU to use what you call 1088 01:07:46,979 --> 01:07:50,858 weapons-grade communications techniques against the UK population? 1089 01:07:51,942 --> 01:07:53,903 - [Kaiser] Yes, sir. - It's crazy. 1090 01:07:54,236 --> 01:07:56,989 [O'Hara] Uh, I just want to get your perspective as well. 1091 01:07:57,073 --> 01:07:59,158 What do you actually think the legislators should do 1092 01:07:59,241 --> 01:08:01,368 in order to better protect people's data? 1093 01:08:01,911 --> 01:08:05,956 [Kaiser] Well, I'm very glad that you asked that. Think about it right now. 1094 01:08:06,040 --> 01:08:08,459 The sole worth of Google and Facebook 1095 01:08:08,542 --> 01:08:12,046 is the fact that they own, and possess, and hold, and use 1096 01:08:12,129 --> 01:08:14,249 the personal data of people from all around the world. 1097 01:08:14,965 --> 01:08:17,218 So I think that the best way to move forward 1098 01:08:17,301 --> 01:08:21,180 are for people to really possess their data like their property. 1099 01:08:21,847 --> 01:08:23,825 - [O'Hara] Thank you, Chair. - [Collins] Thank you. 1100 01:08:23,849 --> 01:08:26,602 Um, I think that concludes the questions from us today. 1101 01:08:26,685 --> 01:08:28,005 Just before we close the session, 1102 01:08:28,062 --> 01:08:30,981 I just have to make a short announcement about Alexander Nix. 1103 01:08:31,065 --> 01:08:34,193 He's now not able to give evidence to the Committee tomorrow 1104 01:08:34,276 --> 01:08:37,321 as a consequence of him having been served with an information notice 1105 01:08:37,404 --> 01:08:39,244 and being subject to the criminal investigation 1106 01:08:39,323 --> 01:08:41,367 by the Information Commissioner's Office. 1107 01:08:41,450 --> 01:08:44,036 And I hope we'll be able to update people 1108 01:08:44,120 --> 01:08:46,413 about that early next week. Thank you very much. 1109 01:08:47,081 --> 01:08:48,081 Thank you. 1110 01:08:48,749 --> 01:08:49,750 [Carroll] Shit!
1111 01:08:51,001 --> 01:08:53,045 [woman] The proceeding has ended. 1112 01:08:53,129 --> 01:08:54,129 Yes, it has! 1113 01:08:56,090 --> 01:08:57,091 [laughs] 1114 01:09:04,807 --> 01:09:06,809 [indistinct chatter] 1115 01:09:08,477 --> 01:09:10,479 [upbeat music playing over speakers] 1116 01:09:11,772 --> 01:09:13,732 I just got a text from Alexander. 1117 01:09:15,860 --> 01:09:17,486 [chuckles] Alexander Nix. 1118 01:09:20,447 --> 01:09:21,615 [man] What did he say? 1119 01:09:22,283 --> 01:09:25,619 "Well done, Britt. Looked quite tough, and you did okay." 1120 01:09:26,453 --> 01:09:29,540 With a winky face little emoji. 1121 01:09:31,167 --> 01:09:33,794 It makes me kind of sad. You know what I mean? 1122 01:09:33,878 --> 01:09:36,088 Like, it's not like he spent three and a half years 1123 01:09:36,172 --> 01:09:37,923 being an asshole to me. He didn't. 1124 01:09:38,174 --> 01:09:40,443 [Hilder] He spent three and a half years being nice to you 1125 01:09:40,467 --> 01:09:42,261 to get you to do what he wanted you to do. 1126 01:09:42,344 --> 01:09:46,473 Yeah. But he is rather fun. [laughs] 1127 01:09:46,557 --> 01:09:48,243 - [phone vibrates] - [Hilder] I know, I know. 1128 01:09:48,267 --> 01:09:49,310 [Carroll] Hey, Justin. 1129 01:09:49,810 --> 01:09:52,146 [Justin over phone] Hey, what's up? What did you think? 1130 01:09:52,646 --> 01:09:55,441 I'm processing it. There were a lot of revelations. 1131 01:09:57,484 --> 01:10:00,988 SCL has to file its defense at the end of the month, 1132 01:10:01,614 --> 01:10:04,200 so, it'll be really interesting to see, like, 1133 01:10:04,283 --> 01:10:07,328 what they think they can do, especially after this. 1134 01:10:07,995 --> 01:10:09,205 [Justin] That's right. 1135 01:10:10,206 --> 01:10:11,290 [Carroll] I mean, shit. 1136 01:10:11,373 --> 01:10:15,753 She said that basically psychographics should be classified as a weapon. 1137 01:10:20,216 --> 01:10:23,719 It seems like Kaiser has some moral compass in her. 1138 01:10:26,805 --> 01:10:30,392 But, so many times, she knew that she was 1139 01:10:30,476 --> 01:10:32,061 in a dark world 1140 01:10:32,353 --> 01:10:33,896 and didn't step away. 1141 01:10:33,979 --> 01:10:37,191 And they... they got... got her on that a couple of times. 1142 01:10:37,274 --> 01:10:38,274 Yeah. 1143 01:10:42,154 --> 01:10:45,157 [interviewer] You did work for a man who, upon meeting you, 1144 01:10:45,241 --> 01:10:46,927 said to you, you know, "Let me get you drunk 1145 01:10:46,951 --> 01:10:48,452 and steal your secrets." 1146 01:10:48,869 --> 01:10:51,789 You knew the kind of company that you were working for. 1147 01:10:51,997 --> 01:10:54,208 I don't know. I guess I trusted him. 1148 01:10:54,500 --> 01:10:56,377 I worked for him for three and a half years. 1149 01:10:56,460 --> 01:10:59,171 He was a friend and mentor. I mean... 1150 01:10:59,463 --> 01:11:01,465 He actually just sent me a text, 1151 01:11:01,548 --> 01:11:05,135 although I haven't spoken to him in, I don't know, at least over a month. 1152 01:11:05,219 --> 01:11:06,696 [interviewer] So, he was watching you. 1153 01:11:06,720 --> 01:11:08,720 - He watched, yes. - [interviewer] What did he say? 1154 01:11:09,056 --> 01:11:12,393 He said that it looked pretty tough but that I did a good job. 1155 01:11:12,476 --> 01:11:14,520 - [interviewer] Did you reply? - No. 1156 01:11:14,728 --> 01:11:15,854 [interviewer] Will you? 1157 01:11:15,938 --> 01:11:18,482 No, I don't think that's appropriate at this time. 
1158 01:11:18,565 --> 01:11:21,110 [interviewer] So, there's no friendship there? 1159 01:11:21,777 --> 01:11:27,658 Well, I now question, you know, how much of a friendship it actually was. 1160 01:11:49,722 --> 01:11:52,141 [Cadwalladr] The thing which I give Brittany credit for... 1161 01:11:52,433 --> 01:11:56,520 it's really amazed me how many people are just keeping their mouths shut. 1162 01:12:01,066 --> 01:12:03,485 I mean, it was a jaw-dropping moment 1163 01:12:03,569 --> 01:12:08,866 when Brittany said these are classified as weapons-grade technology. 1164 01:12:09,199 --> 01:12:11,994 And it was actually illegal to use those 1165 01:12:12,077 --> 01:12:14,455 without the permission of the British government. 1166 01:12:18,125 --> 01:12:19,710 It's psyops. 1167 01:12:20,961 --> 01:12:24,214 Psyops is psychological operations. 1168 01:12:24,298 --> 01:12:27,801 And it's a... it's a term that the military uses 1169 01:12:27,885 --> 01:12:32,348 to describe what you do in warfare which isn't warfare. 1170 01:12:32,431 --> 01:12:34,350 So, essentially, you know, 1171 01:12:34,433 --> 01:12:36,602 in a place like Afghanistan, you've got a choice. 1172 01:12:36,685 --> 01:12:38,562 You either bomb the shit out of a village 1173 01:12:38,854 --> 01:12:41,398 or you try and use other techniques 1174 01:12:41,482 --> 01:12:44,943 to persuade them that actually, "The Taliban's not very good, 1175 01:12:45,027 --> 01:12:46,862 and you'd be much better off without them." 1176 01:12:51,033 --> 01:12:54,203 SCL started out as a military contractor. 1177 01:12:54,286 --> 01:12:55,454 SCL Defense. 1178 01:12:57,498 --> 01:13:00,417 [Nix] We have a fairly substantial defense business. 1179 01:13:01,627 --> 01:13:03,837 We actually train the British Army, the British Navy, 1180 01:13:03,921 --> 01:13:05,464 the U.S. Army, U.S. Special Forces. 1181 01:13:05,547 --> 01:13:09,009 We train NATO, the CIA, State Department, Pentagon. 1182 01:13:09,718 --> 01:13:15,099 It's using research to influence behavior of hostile audiences. 1183 01:13:15,766 --> 01:13:20,646 You know, how do you persuade 14-to 30-year-old Muslim boys 1184 01:13:20,729 --> 01:13:22,356 not to join Al-Qaeda? 1185 01:13:23,440 --> 01:13:24,942 Essentially communication warfare. 1186 01:13:25,025 --> 01:13:26,985 - [man] Allahu Akbar! - [all] Allahu Akbar! 1187 01:13:27,486 --> 01:13:30,948 [Cadwalladr] They'd worked in Afghanistan, they'd worked in Iraq, 1188 01:13:31,031 --> 01:13:34,576 they'd worked in various places in Eastern Europe. 1189 01:13:34,993 --> 01:13:37,663 But the real game changer 1190 01:13:37,746 --> 01:13:42,918 was they started using information warfare in elections. 1191 01:13:44,044 --> 01:13:47,714 [Nix] There's a lot of overlap, because it's all the same methodology. 1192 01:13:50,843 --> 01:13:55,139 [Cadwalladr] All of the campaigns which Cambridge Analytica/SCL did 1193 01:13:55,222 --> 01:13:57,599 for the developing world, 1194 01:13:57,808 --> 01:14:02,020 it was all about practicing some new technology or trick. 1195 01:14:02,354 --> 01:14:04,106 How to persuade people, 1196 01:14:04,189 --> 01:14:07,943 how to suppress turnout, or how to increase turnout. 1197 01:14:10,654 --> 01:14:11,654 And then it's like, 1198 01:14:11,697 --> 01:14:15,075 "Okay, now we've got the hang of it, let's use it in Britain and America." 1199 01:14:24,751 --> 01:14:25,752 [Nix] 1200 01:14:37,014 --> 01:14:38,557 [overlapping restaurant chatter] 1201 01:14:42,269 --> 01:14:43,269 [man] Yes. 
1202 01:15:19,306 --> 01:15:22,643 ...and expands, but with no branding, 1203 01:15:22,893 --> 01:15:25,646 so it's unattributable, untrackable. 1204 01:15:26,396 --> 01:15:28,565 - [applause] - And my view... 1205 01:15:29,441 --> 01:15:32,569 is that if you can't run your own house, 1206 01:15:32,778 --> 01:15:35,447 - you certainly can't run the White House. - [crowd cheering] 1207 01:15:35,531 --> 01:15:36,990 - Can't do it. - [cheering] 1208 01:15:38,158 --> 01:15:41,620 [Trump] Crooked Hillary, right? Crooked. She's crooked as you can be. 1209 01:15:41,703 --> 01:15:43,413 [crowd chanting] Lock her up! Lock her up! 1210 01:15:43,497 --> 01:15:45,415 [man] Yep, that's right, lock her up! 1211 01:15:45,499 --> 01:15:47,251 Lock her up! Lock her up! 1212 01:15:47,584 --> 01:15:49,920 Lock her up! Lock her up! 1213 01:15:50,003 --> 01:15:51,838 [crowd continues shouting] 1214 01:15:51,922 --> 01:15:54,424 Let's defeat her in November. 1215 01:16:08,105 --> 01:16:10,482 [man] What was it like for you to watch 1216 01:16:10,566 --> 01:16:12,526 the Channel 4 undercover video? 1217 01:16:14,069 --> 01:16:15,612 Nobody recognized it. 1218 01:16:18,407 --> 01:16:20,701 When we watched that video... 1219 01:16:21,118 --> 01:16:25,872 I watched it in the New York office with, um, all the staff there. 1220 01:16:25,956 --> 01:16:27,791 And we knew it was coming out. 1221 01:16:28,000 --> 01:16:30,377 And I think everybody was... 1222 01:16:31,962 --> 01:16:33,255 in a state of shock. 1223 01:16:35,424 --> 01:16:40,095 Everybody walked away from the screen in silence back to their desks. 1224 01:16:47,978 --> 01:16:51,398 [reporter 1] Tonight, an undercover interview by Channel 4 News in London 1225 01:16:51,481 --> 01:16:52,983 shows Cambridge executives, 1226 01:16:53,066 --> 01:16:55,485 including CEO Alexander Nix, 1227 01:16:55,569 --> 01:16:58,030 boasting about the company's role in Trump's win. 1228 01:16:58,488 --> 01:17:01,533 This series of undercover interviews by Channel 4 News 1229 01:17:01,617 --> 01:17:03,076 also caught Nix on tape 1230 01:17:03,160 --> 01:17:06,038 talking about potential bribery and entrapment. 1231 01:17:08,832 --> 01:17:10,334 [man] I don't understand. 1232 01:17:14,588 --> 01:17:16,798 [reporter 1] Mr. Nix, can I ask you what your message is 1233 01:17:16,882 --> 01:17:19,217 to Cambridge Analytica employees today? 1234 01:17:22,220 --> 01:17:24,780 [reporter 2] We've just got a statement from Cambridge Analytica. 1235 01:17:25,932 --> 01:17:29,686 Alexander Nix has been suspended with immediate effect. 1236 01:17:29,770 --> 01:17:34,066 The company accused of harvesting the data of more than 87 million Facebook users 1237 01:17:34,149 --> 01:17:35,484 says it is shutting down. 1238 01:17:35,817 --> 01:17:40,489 The company says it intends to file for bankruptcy in the US and the UK. 1239 01:17:42,324 --> 01:17:44,576 [reporter] Critics believe Cambridge Analytica 1240 01:17:44,660 --> 01:17:47,496 and SCL Elections may be shutting operations 1241 01:17:47,579 --> 01:17:51,583 to limit or restrict the ability of the authority's investigations 1242 01:17:51,667 --> 01:17:53,794 and also to get rid of evidence. 1243 01:18:02,678 --> 01:18:04,364 [Wheatland] The Cambridge Analytica scandal, 1244 01:18:04,388 --> 01:18:05,972 is it now the Facebook scandal? 1245 01:18:08,100 --> 01:18:10,185 I mean, this is not about one company. 1246 01:18:11,436 --> 01:18:16,608 This technology is going on unabated and will continue to go on. 
1247 01:18:17,943 --> 01:18:19,653 But Cambridge Analytica's gone. 1248 01:18:20,696 --> 01:18:23,532 In some senses, I feel that, um... 1249 01:18:25,867 --> 01:18:30,038 that because of the way that this technology is moving so fast, 1250 01:18:30,414 --> 01:18:34,710 and because people don't really understand it, 1251 01:18:34,793 --> 01:18:37,170 and because there's a lot of concerns about it, 1252 01:18:37,254 --> 01:18:40,215 there was always going to be a Cambridge Analytica. 1253 01:18:40,632 --> 01:18:43,218 It just sucks for me it was Cambridge Analytica. 1254 01:18:55,814 --> 01:18:57,733 [Cadwalladr] After we dealt with the threats 1255 01:18:57,816 --> 01:19:00,527 from Cambridge Analytica over the course of a year, 1256 01:19:01,027 --> 01:19:03,363 then the thing which made our heads explode 1257 01:19:03,447 --> 01:19:06,283 was the day before publication, when we got a letter from Facebook. 1258 01:19:06,992 --> 01:19:10,245 Yeah, it felt like an attempt to... to cow us into submission. 1259 01:19:10,328 --> 01:19:11,747 It didn't feel like a sort of... 1260 01:19:11,830 --> 01:19:14,499 To me, it didn't feel like a legitimate response... 1261 01:19:14,583 --> 01:19:18,462 And you sort of go, you know, why is a great, big organization like you 1262 01:19:18,545 --> 01:19:20,130 using UK lawyers? 1263 01:19:20,213 --> 01:19:22,382 And again, a very aggressive threat 1264 01:19:22,466 --> 01:19:24,318 for which, actually, they then... Didn't they apologize? 1265 01:19:24,342 --> 01:19:26,178 Yes, they said it was not their finest hour. 1266 01:19:26,261 --> 01:19:27,888 - [all laugh] - [Cadwalladr] Yeah. 1267 01:19:27,971 --> 01:19:29,473 And up until that point, 1268 01:19:29,556 --> 01:19:31,808 it was like the tech giants were still, like, 1269 01:19:31,892 --> 01:19:33,727 the nice guys who wear hoodies, 1270 01:19:33,810 --> 01:19:35,562 - who connected the world. - [Phillips] Yep. 1271 01:19:35,645 --> 01:19:37,898 And there was a shift away 1272 01:19:37,981 --> 01:19:39,983 from big tech being good 1273 01:19:40,066 --> 01:19:43,945 to saying well, actually, we do need to start asking questions 1274 01:19:44,029 --> 01:19:46,364 about this and what it is. 1275 01:19:50,243 --> 01:19:52,245 [Cadwalladr] Cambridge Analytica is gone, 1276 01:19:52,829 --> 01:19:58,502 but it's really important to understand that the Cambridge Analytica story 1277 01:19:58,668 --> 01:20:02,923 actually points to this much bigger, more worrying story 1278 01:20:03,757 --> 01:20:08,637 which is that our personal data is out there and being used against us 1279 01:20:08,720 --> 01:20:10,931 in ways we don't understand. 1280 01:20:14,643 --> 01:20:16,520 And if David gets his data back, 1281 01:20:16,812 --> 01:20:19,856 we can hopefully start getting some answers. 1282 01:20:22,734 --> 01:20:26,780 [Carroll] The deadline is today for SCL to comply with the law 1283 01:20:26,863 --> 01:20:28,615 and give me my data. 1284 01:20:30,659 --> 01:20:35,497 We're at the precipice of evasion or accountability. 1285 01:20:38,583 --> 01:20:40,210 Carole tweeted, 1286 01:20:40,293 --> 01:20:43,421 "Prof. Carroll also giving evidence to European Parliament today 1287 01:20:43,505 --> 01:20:47,634 on day of deadline for Cambridge Analytica to turn over his data to him. 1288 01:20:47,717 --> 01:20:51,388 If it fails to do so, it becomes a matter for criminal proceedings." 1289 01:20:55,934 --> 01:20:58,728 [Carroll] "Hey Ravi, have you heard anything?" 1290 01:21:00,272 --> 01:21:01,356 "Not yet." 
1291 01:21:07,445 --> 01:21:11,783 I've been waiting to hear from my lawyer, and we have heard nothing. 1292 01:21:11,867 --> 01:21:14,244 And so they have not respected the regulator. 1293 01:21:14,327 --> 01:21:16,788 They are not respecting the law. 1294 01:21:17,372 --> 01:21:19,666 So now that this is becoming a criminal matter, 1295 01:21:19,749 --> 01:21:21,626 we are now in uncharted waters. 1296 01:21:22,711 --> 01:21:26,298 And I will continue to pursue it 1297 01:21:26,381 --> 01:21:30,051 because their model has the potential to affect a population 1298 01:21:30,135 --> 01:21:32,596 even if it's just a tiny slice of the population, 1299 01:21:32,679 --> 01:21:34,264 because in the United States, 1300 01:21:34,347 --> 01:21:39,436 only about 70,000 voters in three states decided the election. 1301 01:21:42,188 --> 01:21:45,275 Thank you very much, um, Professor Carroll. Um... 1302 01:21:45,609 --> 01:21:46,818 Mr. Batten. 1303 01:21:47,277 --> 01:21:51,281 [Batten] My question is for Carole Cadwalladr from The Guardian. 1304 01:21:51,364 --> 01:21:56,077 Is The Guardian's stand on this a purely politically partisan one 1305 01:21:56,161 --> 01:22:00,040 in its own intention to assist in any way that it can 1306 01:22:00,123 --> 01:22:03,251 to reverse and overturn the result of the referendum? 1307 01:22:05,420 --> 01:22:10,258 This is not a partisan issue, I cannot say that more strongly. 1308 01:22:10,342 --> 01:22:14,304 This is about the integrity of our democracy. 1309 01:22:14,387 --> 01:22:17,057 It's about our national sovereignty. 1310 01:22:17,599 --> 01:22:21,061 And I would think that you would have an interest in that also. 1311 01:22:21,144 --> 01:22:23,146 [pounding applause] 1312 01:22:24,147 --> 01:22:28,234 I think that we desperately need more information, 1313 01:22:28,652 --> 01:22:31,154 because we don't know how people were targeted 1314 01:22:31,237 --> 01:22:33,782 and we don't know what data that was based upon. 1315 01:22:34,157 --> 01:22:40,956 One thing we do know is that Facebook has been obstructive in its efforts 1316 01:22:41,039 --> 01:22:43,708 to help the British Parliament investigate this matter. 1317 01:22:44,084 --> 01:22:48,755 Uh, really, really, really, you've got to, like, look higher 1318 01:22:48,838 --> 01:22:50,966 and really see the bigger issue here 1319 01:22:51,049 --> 01:22:53,635 and the bigger picture and the bigger risks to us all. 1320 01:22:55,637 --> 01:22:57,973 [applause] 1321 01:23:04,771 --> 01:23:07,983 [reporter] Roger, even if, uh, Facebook hasn't broken any laws, 1322 01:23:08,066 --> 01:23:12,320 have they broken a sort of moral trust that they have with their consumers? 1323 01:23:12,404 --> 01:23:14,197 Well, I... They have with me. 1324 01:23:14,280 --> 01:23:18,743 I mean, I spent three months, starting in October 2016, trying to say, 1325 01:23:18,827 --> 01:23:22,455 "Guys, I think you're killing democracy, and you're gonna kill your business." 1326 01:23:22,539 --> 01:23:24,833 - Hi, how are you? I'm Roger. - Hi, how are you? 1327 01:23:24,916 --> 01:23:26,811 - [McNamee] Pleasure to meet you. - Pleasure to meet you. 1328 01:23:26,835 --> 01:23:31,131 Facebook is designed to monopolize attention. 1329 01:23:31,589 --> 01:23:34,509 Just taking all of the basic tricks of propaganda, 1330 01:23:34,592 --> 01:23:37,053 marrying them to the tricks of casino gambling. 1331 01:23:37,137 --> 01:23:39,014 You know, slot machines and the like.
1332 01:23:39,264 --> 01:23:43,977 And basically playing on instincts, 1333 01:23:44,394 --> 01:23:47,772 and fear and anger are the two most dependable ways of doing that. 1334 01:23:47,897 --> 01:23:50,483 And so, they created a set of tools 1335 01:23:50,567 --> 01:23:56,197 to allow advertisers to exploit that emotional audience 1336 01:23:57,073 --> 01:24:00,368 with individual-level targeting, right? 1337 01:24:00,452 --> 01:24:05,623 There's 2.1 billion people, each with their own reality. 1338 01:24:05,915 --> 01:24:08,251 And once everybody has their own reality, 1339 01:24:08,334 --> 01:24:11,463 - it's relatively easy to manipulate them. - Mmm. Yeah. 1340 01:24:11,546 --> 01:24:16,926 And the other thing about this is, they know that it's killing me... 1341 01:24:17,218 --> 01:24:18,678 - [Kaiser] Yeah. - ...to be critical 1342 01:24:18,762 --> 01:24:20,597 of what I've viewed as my baby. 1343 01:24:20,680 --> 01:24:24,809 It is a lot easier to just sort of say, "I'm not gonna think about it." 1344 01:24:24,893 --> 01:24:25,893 [Hilder] Yes. 1345 01:24:25,977 --> 01:24:27,645 - But... - [Kaiser] Yeah. 1346 01:24:28,229 --> 01:24:31,191 ...you get tested in your life a few times, right? And... 1347 01:24:31,483 --> 01:24:33,151 for me, this was one of those moments. 1348 01:24:33,234 --> 01:24:35,004 I was either gonna stand up and do something about this, 1349 01:24:35,028 --> 01:24:37,781 or I wasn't gonna stand up and do anything about anything. Right? 1350 01:24:37,864 --> 01:24:40,533 Because my fingerprints are on this thing. 1351 01:24:41,034 --> 01:24:43,495 - [Kaiser] I know. - I mean, I felt really guilty. 1352 01:24:44,621 --> 01:24:47,332 And I just want to be able to... 1353 01:24:48,458 --> 01:24:49,876 sleep at night. 1354 01:25:09,354 --> 01:25:12,774 [Hilder] One of the things that I was really struck by was... 1355 01:25:13,066 --> 01:25:15,318 what happened with you 1356 01:25:15,401 --> 01:25:17,487 and the Obama people and the Hillary people. 1357 01:25:18,863 --> 01:25:22,325 [Kaiser] Uh, none of them ever wanted to offer to pay me. 1358 01:25:23,493 --> 01:25:27,497 And, um, when your family loses all their money 1359 01:25:27,580 --> 01:25:30,375 and loses their family home 1360 01:25:30,458 --> 01:25:33,002 and your father, who's the main breadwinner, 1361 01:25:33,086 --> 01:25:35,588 has brain surgery and can never work again, 1362 01:25:36,214 --> 01:25:38,925 you have to work for people that pay you. 1363 01:25:40,718 --> 01:25:42,220 [thunder rumbles] 1364 01:25:42,303 --> 01:25:45,265 [Hilder] Your family lost their money in 2008? 1365 01:25:45,682 --> 01:25:49,519 Um, yeah, but it took a while for it all to really fall apart. 1366 01:25:49,602 --> 01:25:50,854 - [man] Yeah. - Um... 1367 01:25:52,147 --> 01:25:57,026 We lost our family home in 2014, when I started working for Cambridge. 1368 01:26:09,497 --> 01:26:12,375 [reporter 1] Alexander Nix appears before Parliament's Media Committee 1369 01:26:12,458 --> 01:26:14,460 after previously refusing to testify 1370 01:26:14,544 --> 01:26:17,338 due to law enforcement investigations into the firm. 1371 01:26:19,924 --> 01:26:22,260 [overlapping questions] 1372 01:26:22,343 --> 01:26:23,887 - Hi, Jo, how are you? - Hello. 1373 01:26:27,223 --> 01:26:29,058 [Carroll] The last time I was in London, 1374 01:26:29,142 --> 01:26:33,188 I remember considering challenging SCL 1375 01:26:34,522 --> 01:26:38,651 and running through my head, like, how scary it was. 
1376 01:26:40,195 --> 01:26:42,572 The Committee's very grateful, uh, to Alexander Nix 1377 01:26:42,655 --> 01:26:45,408 for agreeing to come back in front of the committee today 1378 01:26:45,491 --> 01:26:47,243 to answer our questions... 1379 01:26:47,327 --> 01:26:49,120 [Carroll] Now, to be back here and, uh... 1380 01:26:49,204 --> 01:26:52,457 these guys are down for the count and... 1381 01:26:52,540 --> 01:26:54,876 the villain is up against the wall. 1382 01:26:56,336 --> 01:26:59,505 [chuckles] You know, does he have any allies left in the world 1383 01:27:00,298 --> 01:27:02,634 or has everybody turned against him? 1384 01:27:04,385 --> 01:27:05,720 [sighs] Right. 1385 01:27:06,763 --> 01:27:09,849 I'd like to make a few short clarifications. 1386 01:27:09,933 --> 01:27:12,560 Um, these will only take a few minutes, 1387 01:27:12,644 --> 01:27:17,273 uh, but it is important to be able to frame, uh, my answers. 1388 01:27:17,357 --> 01:27:18,358 He's so nervous. 1389 01:27:18,441 --> 01:27:20,461 Mr. Nix, I'd be grateful if you'd start with the committee's questions 1390 01:27:20,485 --> 01:27:22,087 and then see how we go through the hearing. 1391 01:27:22,111 --> 01:27:26,449 Ordinarily, uh, I would respect that, but these aren't ordinary circumstances, 1392 01:27:26,532 --> 01:27:30,203 and so, if I may, I'd like to start with a very brief statement 1393 01:27:30,286 --> 01:27:31,663 just to set out my position. 1394 01:27:31,746 --> 01:27:34,332 I would rather take this on a question by question basis 1395 01:27:34,415 --> 01:27:37,126 rather than being dealt with as a statement at the beginning. 1396 01:27:37,377 --> 01:27:39,587 Mr. Collins, you'll have plenty of opportunity, 1397 01:27:39,671 --> 01:27:43,174 as will all the Committee, to ask me as many questions as you want, 1398 01:27:43,258 --> 01:27:45,134 but I have to insist on... 1399 01:27:45,218 --> 01:27:48,614 - How could you possibly start like that... - [Collins] It's not your place to insist. 1400 01:27:48,638 --> 01:27:51,724 "I accept that some of my answers could have been clearer..." 1401 01:27:51,808 --> 01:27:54,560 [Collins] So, instead, you're just reading out the statement. 1402 01:27:54,644 --> 01:27:58,189 - Can you answer the first question... - Why is he doing that? 1403 01:27:59,816 --> 01:28:02,456 - Could you repeat your first question? - [Collins] Yes, thank you. 1404 01:28:02,527 --> 01:28:05,238 You did pitch to work on the Referendum, 1405 01:28:05,321 --> 01:28:09,284 and I don't want to dwell on Leave.EU because you've made your position clear. 1406 01:28:09,534 --> 01:28:12,912 We're really scratching around here, Mr. Farrelly. Um... 1407 01:28:13,329 --> 01:28:17,417 We've been working, or I've been working with this company for 15 years. Um... 1408 01:28:17,500 --> 01:28:20,086 We've never undertaken an election in the UK. 1409 01:28:20,253 --> 01:28:21,462 [Collins] Well, I was... 1410 01:28:21,546 --> 01:28:23,732 - I hope I wasn't scratching around. - That is not true. 1411 01:28:23,756 --> 01:28:26,759 [Collins] I was comparing what you told us with the evidence 1412 01:28:26,843 --> 01:28:28,344 that's subsequently emerged, 1413 01:28:28,428 --> 01:28:30,468 - and you clearly felt... - [phone chimes, vibrates] 1414 01:28:30,513 --> 01:28:33,641 ...that the work that you've done, uh... [stammers, continues indistinctly] 1415 01:28:33,725 --> 01:28:37,186 [Kaiser] So, I got an email from Carole. 1416 01:28:37,270 --> 01:28:40,315 She knows that I met Julian Assange in February.
1417 01:28:40,398 --> 01:28:42,025 [chuckles] Um... 1418 01:28:42,108 --> 01:28:47,155 And she knows that I donated to WikiLeaks at some point in Bitcoin. 1419 01:28:49,073 --> 01:28:52,076 If she prints something about it today, it's going to make... 1420 01:28:53,202 --> 01:28:56,372 my conversations with my own government really difficult. 1421 01:28:56,914 --> 01:28:59,059 [Collins] Um, it came up in Brittany Kaiser's evidence, 1422 01:28:59,083 --> 01:29:02,628 because you spoke to us about, uh, Julian Assange the last time you came, 1423 01:29:02,712 --> 01:29:06,049 saying that you made an attempt to gain access to the emails 1424 01:29:06,132 --> 01:29:08,092 that Julian Assange had, um, 1425 01:29:08,176 --> 01:29:11,554 the Hillary Clinton emails, in order to benefit your client, 1426 01:29:11,637 --> 01:29:12,554 the Trump campaign. 1427 01:29:12,555 --> 01:29:15,099 Well, these were very contentious emails, potentially... 1428 01:29:15,183 --> 01:29:17,602 - [Collins] Yeah. - ...and we wanted to understand... 1429 01:29:17,685 --> 01:29:19,812 as did every journalist 1430 01:29:19,896 --> 01:29:22,565 and, I would say, most political consultants 1431 01:29:22,648 --> 01:29:25,109 on both sides of the aisle in the United States, 1432 01:29:25,193 --> 01:29:26,778 um, what was contained in them. 1433 01:29:26,861 --> 01:29:32,492 [stammers] I don't think that curiosity is indicative of anything nefarious. 1434 01:29:32,575 --> 01:29:36,245 - [hearing continues indistinctly] - Oh, my God. Carole published the article. 1435 01:29:38,122 --> 01:29:40,666 I didn't discuss the US election! 1436 01:29:40,750 --> 01:29:43,086 - Oh, my God, this is insane! - [phone ringing] 1437 01:29:43,169 --> 01:29:44,337 Paul! 1438 01:29:44,420 --> 01:29:46,964 - [Paul] Hello, how are you! - Paul! 1439 01:29:47,507 --> 01:29:50,051 I have said to you, it's all coming out, 1440 01:29:50,134 --> 01:29:51,469 and the question is how. 1441 01:29:51,552 --> 01:29:54,347 I didn't conspire to leak Hillary's emails, 1442 01:29:54,430 --> 01:29:58,184 and I have nothing... [clears throat] ...to do with Russia, so... 1443 01:29:58,267 --> 01:29:59,267 [Paul] Yes. 1444 01:29:59,435 --> 01:30:01,020 The fact is... 1445 01:30:01,104 --> 01:30:02,104 [clears throat] 1446 01:30:02,522 --> 01:30:05,233 - ...it looks like I did both. - Does it look like you did both? 1447 01:30:05,316 --> 01:30:08,694 If I wasn't me, I would say yes, that's what it looks like! 1448 01:30:08,778 --> 01:30:10,446 - [chuckles nervously] - [Paul laughs] 1449 01:30:11,155 --> 01:30:12,698 That's why I'm freaking out. 1450 01:30:12,782 --> 01:30:15,535 There's gonna be so many people that literally never believe me. 1451 01:30:15,952 --> 01:30:18,496 I will die with people still not believing me. 1452 01:30:18,579 --> 01:30:21,374 - Uh, that is possible. - [chuckles] 1453 01:30:21,624 --> 01:30:24,043 - That is definitely possible. [laughs] - I know! 1454 01:30:24,794 --> 01:30:26,170 Agh! [slams table] 1455 01:30:27,380 --> 01:30:28,423 All right... 1456 01:30:29,006 --> 01:30:31,446 [Kaiser sniffles] I think I need to get the fuck out of here. 1457 01:30:34,887 --> 01:30:37,807 [O'Hara] From where I'm sitting, since you've come here today, 1458 01:30:37,890 --> 01:30:41,686 you have attempted to paint yourself as the victim here, 1459 01:30:41,769 --> 01:30:46,566 though, by no stretch of the imagination can you be seen as a victim. 1460 01:30:46,858 --> 01:30:50,194 Surely you can see that you are not the victim here. 
1461 01:30:51,028 --> 01:30:52,864 [Nix] What if I was the victim? 1462 01:30:52,947 --> 01:30:56,868 What happens if, as some of these investigations are concluded, 1463 01:30:56,951 --> 01:30:59,787 people realize that actually we were simply... 1464 01:31:00,371 --> 01:31:04,917 the guys who were, uh, perceived to have contributed 1465 01:31:05,001 --> 01:31:06,711 to the Trump campaign 1466 01:31:06,794 --> 01:31:10,923 and were wrongly accredited with being the architects of Brexit 1467 01:31:11,340 --> 01:31:15,636 and as a result of the polarizing nature of those two political campaigns, 1468 01:31:15,720 --> 01:31:18,723 the global liberal media took umbrage 1469 01:31:18,806 --> 01:31:21,684 and decided to put us in their crosshairs 1470 01:31:21,767 --> 01:31:26,147 and launch a coordinated attack on us as a company 1471 01:31:26,230 --> 01:31:29,317 in order to destroy our reputations and our business, 1472 01:31:29,400 --> 01:31:34,405 and all of this was underpinned by a stream of allegations, 1473 01:31:34,489 --> 01:31:37,867 unfounded, groundless allegations that came from Mr. Wylie, 1474 01:31:37,950 --> 01:31:41,287 who gave the media the ammunition that they needed... 1475 01:31:41,662 --> 01:31:43,164 that they wanted, 1476 01:31:43,247 --> 01:31:46,417 to be able to attack us for something that, in the case of Brexit, 1477 01:31:46,501 --> 01:31:47,502 we simply didn't do. 1478 01:31:47,585 --> 01:31:49,504 [O'Hara] So you are the victim in all of this. 1479 01:31:50,796 --> 01:31:54,717 Well, if you're sitting where I am right now, you'd probably feel... 1480 01:31:54,800 --> 01:31:56,177 uh... [stammers] 1481 01:31:56,260 --> 01:31:57,470 ...quite victimized. 1482 01:31:57,845 --> 01:31:59,514 Where the fuck is my passport? 1483 01:32:00,348 --> 01:32:01,348 [exhales] 1484 01:32:01,766 --> 01:32:03,809 Not having a good day right now. 1485 01:32:04,352 --> 01:32:05,645 Did I put it somewhere else? 1486 01:32:08,731 --> 01:32:09,857 [sighs] Oh, my God. 1487 01:32:10,149 --> 01:32:12,568 I've never put it there before in my life. 1488 01:32:13,110 --> 01:32:15,613 Not that I'm thinking straight today, so... 1489 01:32:17,615 --> 01:32:18,741 [sighs] 1490 01:32:19,992 --> 01:32:22,662 I'm flustered. Sorry, guys. 1491 01:32:27,416 --> 01:32:29,460 Coco Mademoiselle makes me feel better. 1492 01:32:30,503 --> 01:32:32,171 At least I smell good. 1493 01:32:38,678 --> 01:32:41,639 I have no idea what's gonna happen in the next coming days. 1494 01:32:43,015 --> 01:32:46,352 I literally came back here because I wanted to be cooperative, 1495 01:32:47,562 --> 01:32:48,938 I want to help. 1496 01:32:49,438 --> 01:32:51,440 [low, indistinct radio chatter] 1497 01:33:15,047 --> 01:33:17,192 [reporter] Today, The Guardian newspaper in Britain reports 1498 01:33:17,216 --> 01:33:19,468 that a senior executive at Cambridge Analytica 1499 01:33:19,552 --> 01:33:22,305 met with Julian Assange from WikiLeaks, 1500 01:33:22,555 --> 01:33:25,933 which is the entity that distributed the documents that Russia had stolen. 1501 01:33:27,935 --> 01:33:31,022 She says they discussed the US election. 1502 01:33:49,498 --> 01:33:52,585 [Kaiser] The Mueller investigation called when I booked my flight 1503 01:33:52,668 --> 01:33:55,087 and they decided to issue a subpoena. 1504 01:33:56,380 --> 01:34:00,635 We were talking to them in a very, like, friendly, cooperative way 1505 01:34:00,718 --> 01:34:05,097 and then Carole's article completely changed the way that they see me. 
1506 01:34:06,557 --> 01:34:09,518 And, yeah, I... 1507 01:34:10,144 --> 01:34:12,438 worked at Cambridge Analytica 1508 01:34:12,521 --> 01:34:15,399 while they had Facebook data sets. 1509 01:34:16,400 --> 01:34:18,110 And, you know, I... 1510 01:34:19,612 --> 01:34:23,074 went to Russia one time while I worked for Cambridge. 1511 01:34:23,157 --> 01:34:25,743 I visited Julian Assange while I worked for Cambridge. 1512 01:34:25,951 --> 01:34:27,787 I once donated to WikiLeaks. 1513 01:34:27,870 --> 01:34:32,416 I pitched the Trump campaign and wrote the first contract. 1514 01:34:33,167 --> 01:34:36,170 Like, all of these things make it look like I am... 1515 01:34:36,504 --> 01:34:40,049 at the center of some big, crazy thing. 1516 01:34:41,258 --> 01:34:44,178 And I see that, and I can't argue with that. 1517 01:34:46,055 --> 01:34:49,392 I might need to rethink the way that I've been doing things 1518 01:34:49,475 --> 01:34:50,810 for the past few years. 1519 01:35:05,032 --> 01:35:08,661 [Cadwalladr] This is a story which we haven't published yet 1520 01:35:08,744 --> 01:35:11,288 talking about all the investigations 1521 01:35:11,372 --> 01:35:14,333 which have been kicked off in Britain and the US 1522 01:35:14,417 --> 01:35:16,168 since the story came out. 1523 01:35:16,252 --> 01:35:19,630 So, there's an investigation by the FBI, 1524 01:35:19,714 --> 01:35:22,675 by the US, the SEC, 1525 01:35:22,758 --> 01:35:24,260 by the Department of Justice, 1526 01:35:24,343 --> 01:35:25,928 by Robert Mueller, 1527 01:35:26,011 --> 01:35:28,556 and by the Senate Intelligence Committee, 1528 01:35:28,639 --> 01:35:31,016 the Judiciary Committee, the House Intelligence Committee. 1529 01:35:31,100 --> 01:35:33,060 And then these are all the ones which are going on 1530 01:35:33,102 --> 01:35:34,770 which are connected in Britain. 1531 01:35:38,274 --> 01:35:41,277 Parliament spent 18 months investigating. 1532 01:35:42,403 --> 01:35:44,572 They called in all these witnesses. 1533 01:35:48,284 --> 01:35:52,037 And at the end of it, their report says very clearly, 1534 01:35:52,121 --> 01:35:54,623 "Our electoral laws are not fit for purpose." 1535 01:35:56,417 --> 01:36:00,421 We literally cannot have a free and fair election in this country. 1536 01:36:01,881 --> 01:36:04,759 And we can't have it because of Facebook, 1537 01:36:04,925 --> 01:36:08,846 because of the tech giants who are still completely unaccountable. 1538 01:36:13,058 --> 01:36:15,478 It sounds, like, quite apocalyptic. 1539 01:36:15,644 --> 01:36:19,815 But it does feel like we are entering into a whole new era. 1540 01:36:20,858 --> 01:36:24,945 We can see that authoritarian governments are on the rise. 1541 01:36:25,362 --> 01:36:31,243 And they're all using these politics of hate and fear on Facebook. 1542 01:36:33,370 --> 01:36:34,765 - Look at Brazil. - [crowd cheering] 1543 01:36:34,789 --> 01:36:38,501 [Cadwalladr] There's this right-wing extremist 1544 01:36:38,584 --> 01:36:39,919 who's been elected. 1545 01:36:40,002 --> 01:36:44,048 And we know that WhatsApp, which is a part of Facebook, 1546 01:36:44,131 --> 01:36:49,929 was really clearly implicated in the dissemination of fake news there. 1547 01:36:51,263 --> 01:36:53,474 And look at what happened in Myanmar. 1548 01:36:54,350 --> 01:36:56,727 There is evidence that Facebook was used 1549 01:36:56,811 --> 01:36:58,604 to incite racial hatred 1550 01:36:58,687 --> 01:37:00,731 which caused a genocide. 
1551 01:37:07,488 --> 01:37:13,327 We also know that the Russian government was using Facebook's tools in the US. 1552 01:37:17,081 --> 01:37:24,046 There's evidence that Russian intelligence created fake Black Lives Matter memes. 1553 01:37:24,755 --> 01:37:28,509 And when people clicked on them, they were taken to pages 1554 01:37:28,592 --> 01:37:32,388 where they were actually invited to protests 1555 01:37:32,471 --> 01:37:35,724 that were organized by the Russian government. 1556 01:37:35,808 --> 01:37:37,911 - [crowd chants] Justice! Now! - [man] When do we want it? 1557 01:37:37,935 --> 01:37:40,080 [Cadwalladr] At the same time, they were setting up pages 1558 01:37:40,104 --> 01:37:44,191 targeting adversary groups, like Blue Lives Matter. 1559 01:37:45,818 --> 01:37:48,696 It's about stoking fear and hate 1560 01:37:48,779 --> 01:37:51,657 to turn the country against itself. 1561 01:37:52,700 --> 01:37:54,493 Divide and conquer. 1562 01:37:54,910 --> 01:37:57,246 [overlapping shouts] 1563 01:37:57,329 --> 01:37:58,998 White power! 1564 01:37:59,456 --> 01:38:01,333 Fascist and proud! 1565 01:38:01,417 --> 01:38:04,378 [overlapping protests] 1566 01:38:04,461 --> 01:38:07,965 [chanting] Fuck Donald Trump! Fuck Donald Trump! 1567 01:38:08,883 --> 01:38:12,136 [Cadwalladr] These platforms which were created to connect us 1568 01:38:12,553 --> 01:38:14,847 have now been weaponized. 1569 01:38:16,765 --> 01:38:20,227 And it's impossible to know what is what 1570 01:38:20,311 --> 01:38:24,315 because it's happening on exactly the same platforms 1571 01:38:24,398 --> 01:38:27,985 that we chat to our friends or share baby photos. 1572 01:38:30,404 --> 01:38:32,489 Nothing is what it seems. 1573 01:38:45,127 --> 01:38:47,567 - [Kaiser] Hi, how are you? - [man] Doing wonderful, yourself? 1574 01:38:47,671 --> 01:38:48,923 I'm all right. 1575 01:38:49,006 --> 01:38:55,012 Um, I stayed here last week and I checked in a suitcase and two bags. 1576 01:38:55,095 --> 01:38:58,474 And I had to go to the airport and just, like, left. 1577 01:38:58,557 --> 01:39:00,434 So my bags have been here for a week. 1578 01:39:06,941 --> 01:39:08,984 [reporter] My guest, Carole Cadwalladr, 1579 01:39:09,068 --> 01:39:12,363 writes for the British newspapers The Observer and The Guardian. 1580 01:39:12,863 --> 01:39:15,282 Can you just say a little bit more about the Facebook data? 1581 01:39:15,824 --> 01:39:20,871 [Cadwalladr] This thing of the data, so how Americans were targeted, 1582 01:39:20,955 --> 01:39:22,998 and what they were targeted with, 1583 01:39:23,290 --> 01:39:27,544 is a sort of key part of Mueller's investigation. 1584 01:39:30,839 --> 01:39:34,718 [Kaiser] I am headed to Washington, DC, 1585 01:39:34,969 --> 01:39:38,430 for my testimony for the Mueller investigation. 1586 01:39:38,514 --> 01:39:40,516 [indistinct radio chatter] 1587 01:39:41,767 --> 01:39:45,270 [Kaiser] I definitely didn't think that while we're sitting there 1588 01:39:45,354 --> 01:39:47,898 counting votes on our data screen 1589 01:39:47,982 --> 01:39:50,401 that some of those votes 1590 01:39:50,484 --> 01:39:56,281 were made by people who had seen fake news stories 1591 01:39:56,365 --> 01:39:59,451 paid for by Russia on their Facebook page. 1592 01:40:01,120 --> 01:40:02,413 Maybe I wanted to believe 1593 01:40:02,496 --> 01:40:05,749 that Cambridge Analytica was just the best. 1594 01:40:07,376 --> 01:40:09,586 It's a convenient story to believe. 
1595 01:40:21,515 --> 01:40:23,517 [indistinct announcement over PA] 1596 01:40:25,060 --> 01:40:28,731 ...service down to our nation's capital, Washington Reagan DC airport. 1597 01:40:35,195 --> 01:40:37,656 [Kaiser] I don't think it's possible to shed any of this. 1598 01:40:40,075 --> 01:40:42,911 You can't really put something like this behind you. 1599 01:40:58,510 --> 01:41:01,805 "Youth engagement, persuasion... 1600 01:41:03,140 --> 01:41:04,516 apathy." 1601 01:41:04,600 --> 01:41:05,920 [Nix] Malaysia, we're working in. 1602 01:41:05,976 --> 01:41:10,105 [Kaiser] We did Lithuania, Romania, Kenya, Ghana. 1603 01:41:10,689 --> 01:41:12,816 [Nix] Oh, and the Brexit campaign, yeah. 1604 01:41:12,900 --> 01:41:14,693 But we don't talk about that. 1605 01:41:14,777 --> 01:41:17,821 - [Kaiser] Oops, we won! - [laughter on recording] 1606 01:41:18,655 --> 01:41:20,949 Listening to this now, it just sounds like... 1607 01:41:21,784 --> 01:41:25,913 a criminal admitting to everything he’s done wrong around the world. 1608 01:41:27,498 --> 01:41:28,540 You know? 1609 01:41:29,291 --> 01:41:32,002 I'm just there, nervously laughing along with him, 1610 01:41:32,086 --> 01:41:33,170 letting it happen. 1611 01:41:34,630 --> 01:41:35,631 [chuckles] 1612 01:41:41,386 --> 01:41:46,058 As I said, it's the opposite of what I've worked my whole life to do. 1613 01:41:47,601 --> 01:41:48,727 So... 1614 01:41:51,814 --> 01:41:55,192 it makes me angry at myself that I could sit through a meeting like that... 1615 01:41:55,943 --> 01:41:58,153 and not quit directly afterwards, 1616 01:41:59,321 --> 01:42:00,572 basically. 1617 01:42:04,243 --> 01:42:05,619 What was I doing? 1618 01:42:06,411 --> 01:42:08,664 [reporter] What investigators have you been talking to? 1619 01:42:09,623 --> 01:42:13,710 I'm currently working to be as helpful as possible 1620 01:42:13,794 --> 01:42:17,422 to any government investigations where I can provide assistance, 1621 01:42:17,506 --> 01:42:20,092 but I can't comment on that right now while they're ongoing. 1622 01:42:28,725 --> 01:42:31,270 [woman over PA] At this time, you may use your cellular service. 1623 01:42:31,353 --> 01:42:35,023 However, larger electronic devices must remain stowed. 1624 01:42:35,232 --> 01:42:37,109 [Hilder] Brittany made mistakes. 1625 01:42:38,819 --> 01:42:41,155 But I think it was very brave of her 1626 01:42:41,238 --> 01:42:44,199 to come out and then to keep cooperating 1627 01:42:44,283 --> 01:42:45,826 and not to walk away. 1628 01:42:48,620 --> 01:42:51,665 She is one of two people 1629 01:42:51,748 --> 01:42:56,378 who has blown the whistle in any serious way on Cambridge Analytica. 1630 01:43:00,465 --> 01:43:01,967 We're all responsible. 1631 01:43:04,928 --> 01:43:07,806 So the question is, what do we do with that responsibility? 1632 01:43:08,348 --> 01:43:09,683 Can we embrace it? 1633 01:43:12,394 --> 01:43:14,521 [Kaiser] Happy that it's finally happening 1634 01:43:14,605 --> 01:43:18,275 so that I can just tell people what happened and get everything... 1635 01:43:19,943 --> 01:43:20,944 on record 1636 01:43:22,154 --> 01:43:25,199 for this government, my government. [chuckles] 1637 01:43:47,095 --> 01:43:49,181 [reporter] You remember, though, Cambridge Analytica. 1638 01:43:49,431 --> 01:43:51,391 Its big claim in 2016 1639 01:43:51,475 --> 01:43:53,477 was that it had access to voter data 1640 01:43:53,685 --> 01:43:56,688 on all of the people voting in the US election. 
1641 01:43:58,273 --> 01:44:02,402 Well, just one of the 157 million people who voted in that election, 1642 01:44:02,486 --> 01:44:06,365 a man called David Carroll, asked them a very simple question: 1643 01:44:06,990 --> 01:44:09,826 "Can I see the data you have on me?" 1644 01:44:10,661 --> 01:44:12,621 And they refused to give it to him. 1645 01:44:14,248 --> 01:44:18,335 But crucially, today, Cambridge Analytica pled guilty 1646 01:44:18,418 --> 01:44:22,422 at Hendon Magistrates' Court for failing to comply with the ICO notice. 1647 01:44:40,482 --> 01:44:43,277 [Carroll] The Cambridge Analytica case is behind me now. 1648 01:44:43,860 --> 01:44:46,738 They pleaded guilty for not giving me my data, 1649 01:44:47,739 --> 01:44:50,242 and I'll probably never get it back. 1650 01:44:53,120 --> 01:44:55,330 By the time my daughter is 18, 1651 01:44:55,414 --> 01:44:58,750 she'll have 70,000 data points defining her, 1652 01:44:58,917 --> 01:45:01,295 and currently she has no rights, 1653 01:45:01,628 --> 01:45:03,839 no control over that at all. 1654 01:45:06,216 --> 01:45:07,551 But the battle continues. 1655 01:45:08,176 --> 01:45:10,887 [applause] 1656 01:45:13,432 --> 01:45:15,100 I don't have to tell you 1657 01:45:15,183 --> 01:45:17,102 that there is this dark undertow 1658 01:45:17,185 --> 01:45:19,563 which is connecting us all globally. 1659 01:45:19,646 --> 01:45:23,567 And it is flowing via the technology platforms. 1660 01:45:24,026 --> 01:45:26,236 And that is why I am here 1661 01:45:26,320 --> 01:45:31,450 to address you directly, the Gods of Silicon Valley. 1662 01:45:31,533 --> 01:45:33,535 [audience cheers and applauds] 1663 01:45:35,245 --> 01:45:36,997 Mark Zuckerberg, 1664 01:45:38,206 --> 01:45:40,000 and Sheryl Sandberg, 1665 01:45:40,083 --> 01:45:43,170 and Larry Page, and Sergey Brin, 1666 01:45:43,253 --> 01:45:44,671 and Jack Dorsey. 1667 01:45:46,381 --> 01:45:49,301 Because you set out to connect people 1668 01:45:49,801 --> 01:45:51,803 and you are refusing to acknowledge 1669 01:45:51,887 --> 01:45:55,682 that this same technology is now driving us apart. 1670 01:45:56,767 --> 01:45:59,478 And what you don't seem to understand 1671 01:45:59,561 --> 01:46:03,357 is that this is bigger than you, and it's bigger than any of us. 1672 01:46:03,440 --> 01:46:08,779 And it is not about left or right, or leave or remain, or Trump or not. 1673 01:46:09,363 --> 01:46:11,281 It's about whether it's actually possible 1674 01:46:11,365 --> 01:46:13,742 to have a free and fair election ever again. 1675 01:46:14,368 --> 01:46:18,038 And so my question to you is: Is this what you want? 1676 01:46:19,247 --> 01:46:22,209 Is this how you want history to remember you? 1677 01:46:23,293 --> 01:46:26,922 As the handmaidens to authoritarianism? 1678 01:46:27,464 --> 01:46:31,510 And my question to everybody else is, is this what we want? 1679 01:46:31,968 --> 01:46:36,181 To sit back and play with our phones as this darkness falls? 1680 01:46:41,770 --> 01:46:44,272 Who is logged into Facebook right now? 1681 01:46:46,274 --> 01:46:47,359 Almost everybody. 1682 01:46:49,069 --> 01:46:50,779 So, as individuals, 1683 01:46:50,862 --> 01:46:55,534 we can limit the flood of data that we're leaking all over the place. 1684 01:46:55,617 --> 01:46:59,496 But there's no silver bullet. There's no way to go off the grid. 1685 01:46:59,579 --> 01:47:02,207 So, you have to understand 1686 01:47:02,874 --> 01:47:05,877 how your data is affecting your life. 
1687 01:47:07,087 --> 01:47:10,841 Our dignity as humans is at stake. 1688 01:47:11,091 --> 01:47:13,844 [overlapping shouts of protest] 1689 01:47:20,434 --> 01:47:22,314 [Carroll] But the hardest part in all of this... 1690 01:47:22,394 --> 01:47:25,355 [Trump] Got a lot of rough people in those caravans. They are not a... 1691 01:47:25,439 --> 01:47:27,899 [Carroll] ...is that these wreckage sites... 1692 01:47:27,983 --> 01:47:31,069 - [people screaming] - ...and crippling divisions... 1693 01:47:33,989 --> 01:47:37,742 begin with the manipulation of one individual. 1694 01:47:39,369 --> 01:47:40,537 Then another. 1695 01:47:42,164 --> 01:47:43,248 And another. 1696 01:47:47,794 --> 01:47:50,213 So, I can't help but ask myself: 1697 01:47:52,132 --> 01:47:53,758 Can I be manipulated? 1698 01:47:57,387 --> 01:47:58,388 Can you?
