All language subtitles for The.Social.Dilemma.2020.720p.WEBRip.x264.AAC-[YTS.MX].hi
1
00:00:02,000 --> 00:00:07,000
Downloaded from YTS.MX
2
00:00:08,000 --> 00:00:13,000
Official YIFY movies site:
YTS.MX
3
00:00:15,849 --> 00:00:17,934
[ominous instrumental music playing]
4
00:00:31,114 --> 00:00:34,659
[interviewer] Why don't you go ahead
and sit down, and see if you can get comfortable.
5
00:00:37,579 --> 00:00:39,789
-You good? Okay.
-Yeah. [exhales]
6
00:00:39,914 --> 00:00:42,125
-[interviewer] Um...
-[cell phone vibrating]
7
00:00:43,043 --> 00:00:44,794
[crew member] Take one, marker.
8
00:00:46,796 --> 00:00:48,798
[interviewer] Want to start off
by introducing yourself?
9
00:00:48,882 --> 00:00:49,799
[crew member coughs]
10
00:00:50,467 --> 00:00:53,344
Hello, world. Bailey. Take three.
11
00:00:53,970 --> 00:00:56,347
-[interviewer] You good?
-This is the worst part, man.
12
00:00:56,890 --> 00:00:59,517
[chuckles] I don't like this.
13
00:00:59,851 --> 00:01:02,228
I worked at Facebook in 2011 and 2012.
14
00:01:02,312 --> 00:01:05,190
I was one of the really early
employees at Instagram.
15
00:01:05,273 --> 00:01:08,693
[man 1] I worked at, uh, Google, uh, YouTube.
16
00:01:08,777 --> 00:01:11,696
[woman] Apple, Google, Twitter, Palm.
17
00:01:12,739 --> 00:01:15,533
I helped start Mozilla Labs
and moved over to the Firefox side.
18
00:01:15,617 --> 00:01:18,119
-[interviewer] Are we rolling? Everybody?
-[crew members respond]
19
00:01:18,203 --> 00:01:19,162
[interviewer] Great.
20
00:01:21,206 --> 00:01:22,624
[man 2] I worked at Twitter.
21
00:01:23,041 --> 00:01:23,917
My last job there
22
00:01:24,000 --> 00:01:26,169
was senior vice president of engineering.
23
00:01:27,337 --> 00:01:29,255
-[man 3] I was the president of Pinterest.
-[sips]
24
00:01:29,339 --> 00:01:32,717
Before that, um,
I was... the director of monetization
25
00:01:32,801 --> 00:01:34,260
at Facebook for five years.
26
00:01:34,344 --> 00:01:37,972
While at Twitter, I spent a number of years
running their developer platform,
27
00:01:38,056 --> 00:01:40,225
and then I became
head of consumer product.
28
00:01:40,308 --> 00:01:44,270
I was the co-inventor of Google Drive,
Gmail Chat,
29
00:01:44,354 --> 00:01:46,689
Facebook Pages,
and the Facebook like button.
30
00:01:47,440 --> 00:01:50,777
Yeah. That's... That's why I spent,
like, eight months
31
00:01:50,860 --> 00:01:52,779
talking back and forth with lawyers.
32
00:01:54,072 --> 00:01:55,406
It freaks me out.
33
00:01:58,409 --> 00:01:59,702
[man 2] When I was there,
34
00:01:59,786 --> 00:02:02,914
I always felt like, fundamentally,
it was a force for good.
35
00:02:03,414 --> 00:02:05,375
I don't know if I feel that way anymore.
36
00:02:05,458 --> 00:02:10,588
I left Google in June 2017 over ethical concerns.
37
00:02:10,672 --> 00:02:14,134
And... and not just at Google
but in the industry at large.
38
00:02:14,217 --> 00:02:15,385
I'm very concerned.
39
00:02:16,636 --> 00:02:17,679
I'm very concerned.
40
00:02:19,097 --> 00:02:21,808
It's easy today to lose sight of the fact
41
00:02:21,891 --> 00:02:27,814
that these tools have actually created
some wonderful things in the world.
42
00:02:27,897 --> 00:02:31,943
They've reunited lost family members.
They've found organ donors.
43
00:02:32,026 --> 00:02:36,573
I mean, there were meaningful,
systemic changes happening
44
00:02:36,656 --> 00:02:39,159
around the world because of these platforms
45
00:02:39,242 --> 00:02:40,285
that were positive!
46
00:02:40,827 --> 00:02:44,539
I think we were naive
about the flip side of that coin.
47
00:02:45,540 --> 00:02:48,585
Yeah, these things, you release them,
and they take on a life of their own.
48
00:02:48,668 --> 00:02:52,005
And how they're used is very different
from what you expected.
49
00:02:52,088 --> 00:02:56,509
Nobody, I deeply believe,
ever intended any of these consequences.
50
00:02:56,593 --> 00:02:59,554
There's no bad guy.
No, not at all.
51
00:03:01,598 --> 00:03:03,975
[interviewer] So, then,
what... what is the problem?
52
00:03:09,147 --> 00:03:11,482
[interviewer] Is there a problem,
and what is the problem?
53
00:03:12,108 --> 00:03:13,026
[swallows]
54
00:03:17,614 --> 00:03:19,991
[clicks tongue] Yeah, it's hard
to give a single, succinct...
55
00:03:20,074 --> 00:03:22,118
I'm trying to touch
on many different problems.
56
00:03:22,535 --> 00:03:23,953
[interviewer] What is the problem?
57
00:03:24,621 --> 00:03:25,914
[clicks tongue, chuckles]
58
00:03:27,916 --> 00:03:29,500
[birds chirping]
59
00:03:31,169 --> 00:03:32,670
[dog barking in the distance]
60
00:03:33,463 --> 00:03:35,340
[reporter 1]
Despite facing mounting criticism,
61
00:03:35,423 --> 00:03:37,675
the so-called big tech names are getting bigger.
62
00:03:37,759 --> 00:03:40,929
The entire tech industry
is under a new level of scrutiny.
63
00:03:41,012 --> 00:03:43,806
And a new study sheds light on the link
64
00:03:43,890 --> 00:03:46,142
between mental health
and social media use.
65
00:03:46,226 --> 00:03:48,686
[on TV]
...here to talk about the latest research...
66
00:03:48,770 --> 00:03:51,397
[Tucker Carlson] ...going on that gets no coverage.
67
00:03:51,481 --> 00:03:54,108
Millions of Americans are hopelessly addicted
68
00:03:54,192 --> 00:03:56,319
to their electronic devices.
69
00:03:56,402 --> 00:03:57,987
[reporter 2] It's exacerbated by the fact
70
00:03:58,071 --> 00:04:00,698
that you can literally isolate yourself now
71
00:04:00,782 --> 00:04:02,742
in a bubble, thanks to our technology.
72
00:04:02,825 --> 00:04:04,577
Fake news is getting more advanced
73
00:04:04,661 --> 00:04:06,788
and threatening
societies around the world.
74
00:04:06,871 --> 00:04:10,250
We weren't expecting any of this when we created Twitter 12 years ago.
75
00:04:10,333 --> 00:04:12,502
White House officials say
they have no reason to believe
76
00:04:12,585 --> 00:04:14,754
Russian cyberattacks will stop.
77
00:04:14,837 --> 00:04:18,132
YouTube is being forced to focus
on cleaning up the site.
78
00:04:18,216 --> 00:04:21,552
[reporter 3] TikTok, if you talk to anyone out there...
79
00:04:21,636 --> 00:04:24,013
[on TV] ...there's no chance they'll delete this thing...
80
00:04:24,097 --> 00:04:26,224
Hey, Isla,
can you set the table, please?
81
00:04:26,307 --> 00:04:28,601
[reporter 4] There's a question about social media
82
00:04:28,685 --> 00:04:29,978
making your child depressed.
83
00:04:30,061 --> 00:04:32,105
[mom] Isla,
can you set the table, please?
84
00:04:32,188 --> 00:04:35,316
[reporter 5] These cosmetic procedures are getting so popular with teens,
85
00:04:35,400 --> 00:04:37,902
plastic surgeons have coined
a new syndrome for it,
86
00:04:37,986 --> 00:04:40,822
"स्नैपचैट डिस्मॉर्फिया,"
सर्जरी के इच्छुक युवा रोगियों के साथ
87
00:04:40,905 --> 00:04:43,741
so they can look more like they do
in filtered selfies.
88
00:04:43,825 --> 00:04:45,910
Still don't see why you let her
get that thing.
89
00:04:45,994 --> 00:04:47,412
What was I supposed to do?
90
00:04:47,495 --> 00:04:49,580
I mean,
every other kid in her class had one.
91
00:04:50,164 --> 00:04:51,165
She's only 11.
92
00:04:51,249 --> 00:04:52,959
Cass, nobody's forcing you to get one.
93
00:04:53,042 --> 00:04:55,086
You can hold out
as long as you want.
94
00:04:55,169 --> 00:04:59,340
Hey, I'm connected without a cell phone,
okay? I'm on the Internet right now.
95
00:04:59,424 --> 00:05:03,094
Besides, it's not even real connection.
It's just a load of sh--
96
00:05:03,177 --> 00:05:05,013
Surveillance capitalism has come to shape
97
00:05:05,096 --> 00:05:07,765
our politics and culture
in ways many people don't perceive.
98
00:05:07,849 --> 00:05:10,101
[reporter 6]
ISIS inspired followers online,
99
00:05:10,184 --> 00:05:12,812
and now white supremacists are doing the same.
100
00:05:12,895 --> 00:05:14,147
Recently in India,
101
00:05:14,230 --> 00:05:17,442
Internet lynch mobs have claimed the lives
of a dozen people, including these five...
102
00:05:17,525 --> 00:05:20,361
[reporter 7] It's not just fake news; it's fake news with consequences.
103
00:05:20,445 --> 00:05:24,073
[reporter 8] How do you handle an epidemic in the age of fake news?
104
00:05:24,157 --> 00:05:26,993
Can you get coronavirus
from eating Chinese food?
105
00:05:27,535 --> 00:05:32,540
We have moved from the information age
into the disinformation age.
106
00:05:32,623 --> 00:05:34,667
Our democracy is under assault.
107
00:05:34,751 --> 00:05:36,919
[man 4] What I said was,
"I think the tools
108
00:05:37,003 --> 00:05:39,005
that have been created today are starting
109
00:05:39,088 --> 00:05:41,799
to erode the social fabric of how society works."
110
00:05:41,883 --> 00:05:44,427
[ominous instrumental music continues]
111
00:05:55,980 --> 00:05:58,483
-[music fades]
-[indistinct chatter]
112
00:05:58,566 --> 00:05:59,442
[crew member] Okay.
113
00:06:00,151 --> 00:06:03,446
[stage manager] Aza does
the welcome remarks. We play the video.
114
00:06:04,197 --> 00:06:07,325
और फिर, "देवियों और सज्जनों,
ट्रिस्टन हैरिस।"
115
00:06:07,408 --> 00:06:08,868
-Right.
-[stage manager] Great.
116
00:06:08,951 --> 00:06:12,038
So, I come out, and...
117
00:06:13,831 --> 00:06:17,126
basically say, "Thank you all for coming."
Um...
118
00:06:17,919 --> 00:06:22,048
So, today, I want to talk about
a new agenda for technology.
119
00:06:22,131 --> 00:06:25,468
And why we want to do that
is because if you ask people,
120
00:06:25,551 --> 00:06:27,804
"टेक इंडस्ट्री में
अभी क्या गलत है ?"
121
00:06:28,262 --> 00:06:31,641
there's a cacophony of grievances
and scandals,
122
00:06:31,724 --> 00:06:33,893
और "उन्होंने हमारा डेटा चुरा लिया।"
और तकनीक की लत है।
123
00:06:33,976 --> 00:06:35,978
And there's fake news.
And there's polarization
124
00:06:36,062 --> 00:06:37,855
and some elections
that are getting hacked.
125
00:06:38,189 --> 00:06:41,609
But is there something
beneath all these problems
126
00:06:41,692 --> 00:06:44,612
that's causing all these things
to happen at once?
127
00:06:44,821 --> 00:06:46,364
[stage manager continues speaking]
128
00:06:46,447 --> 00:06:48,408
-Does that sound good?
-Very good. Yeah.
129
00:06:49,033 --> 00:06:49,992
Um... [sighs]
130
00:06:50,743 --> 00:06:52,954
I'm just trying to...
Like, I want people to see...
131
00:06:53,037 --> 00:06:55,123
like, there's a problem
happening in the tech industry,
132
00:06:55,206 --> 00:06:56,707
and it doesn't have a name,
133
00:06:56,791 --> 00:07:00,211
and it has to do with one source,
like, one...
134
00:07:00,795 --> 00:07:03,589
[ominous instrumental music playing]
135
00:07:05,091 --> 00:07:09,387
[Tristan] When you look around you,
it feels like the world is going crazy.
136
00:07:12,765 --> 00:07:15,309
You have to ask yourself, like,
"Is this normal?
137
00:07:16,102 --> 00:07:18,771
Or have we all fallen
under some kind of spell?"
138
00:07:27,989 --> 00:07:30,491
I want more people to understand
how this works
139
00:07:30,575 --> 00:07:34,036
because it shouldn't be something
that only the tech industry knows.
140
00:07:34,120 --> 00:07:36,247
It should be something
that everybody knows.
141
00:07:36,330 --> 00:07:38,708
[bag zips]
142
00:07:41,419 --> 00:07:42,378
[softly] Bye.
143
00:07:43,629 --> 00:07:44,881
[guard] Here you go, sir.
144
00:07:47,383 --> 00:07:48,676
-[staffer] Hello!
-[Tristan] Hi.
145
00:07:48,759 --> 00:07:50,678
-Tristan. Nice to meet you.
-It's Tris-tan, right?
146
00:07:50,761 --> 00:07:51,721
-Yeah.
-Awesome. Cool.
147
00:07:53,181 --> 00:07:55,933
[presenter] Tristan Harris
is a former design ethicist at Google
148
00:07:56,017 --> 00:07:59,395
and has been called the closest thing
Silicon Valley has to a conscience.
149
00:07:59,479 --> 00:08:00,730
[reporter] He's asking tech
150
00:08:00,813 --> 00:08:04,192
to bring what he calls
"ethical design" to its products.
151
00:08:04,275 --> 00:08:06,903
[Anderson Cooper] It's rare for a tech insider to be so blunt,
152
00:08:06,986 --> 00:08:10,114
but Tristan Harris believes someone needs to be.
153
00:08:11,324 --> 00:08:12,700
[Tristan] When I was at Google,
154
00:08:12,783 --> 00:08:16,037
I was on the Gmail team,
and I just started burning out
155
00:08:16,120 --> 00:08:18,372
'cause we were having
so many conversations about...
156
00:08:19,457 --> 00:08:23,169
you know, what the inbox should look like
and what color it should be, and...
157
00:08:23,252 --> 00:08:25,880
And I know, personally,
I'm addicted to e-mail,
158
00:08:26,297 --> 00:08:27,632
and I found it fascinating
159
00:08:27,715 --> 00:08:31,511
there was no one at Gmail
working on making it less addictive.
160
00:08:31,969 --> 00:08:34,514
And I was like,
"Is anyone else thinking about this?
161
00:08:34,597 --> 00:08:36,390
I haven't talked to anyone about this."
162
00:08:36,849 --> 00:08:39,685
-And I was feeling this frustration...
-[sighs]
163
00:08:39,769 --> 00:08:41,229
...with the tech industry, overall,
164
00:08:41,312 --> 00:08:43,147
that we had, like, lost our way.
165
00:08:43,231 --> 00:08:46,442
-[ominous instrumental music playing]
-[message alerts chiming]
166
00:08:46,817 --> 00:08:49,820
[Tristan] You know, I really struggled
to try and figure out
167
00:08:49,904 --> 00:08:52,573
how, from the inside, we could change it.
168
00:08:52,907 --> 00:08:55,117
[energetic piano music playing]
169
00:08:55,201 --> 00:08:58,120
[Tristan] And that was when I decided
to make a presentation,
170
00:08:58,204 --> 00:08:59,497
kind of like a call to arms.
171
00:09:00,998 --> 00:09:04,961
Every day, I went home and worked on it
for a couple hours every single night.
172
00:09:05,044 --> 00:09:06,087
[typing]
173
00:09:06,170 --> 00:09:08,548
[Tristan] It basically just said,
you know,
174
00:09:08,631 --> 00:09:11,884
never before in history have 50 designers--
175
00:09:12,426 --> 00:09:15,263
20- to 35-year-old white guys in California--
176
00:09:15,888 --> 00:09:19,725
made decisions that would have an impact on two billion people.
177
00:09:21,018 --> 00:09:24,438
Two billion people will have thoughts
they didn't intend to have
178
00:09:24,522 --> 00:09:28,401
because a designer at Google said,
"This is how notifications work
179
00:09:28,484 --> 00:09:30,778
on that screen that you
wake up to in the morning."
180
00:09:31,195 --> 00:09:35,283
And we have a moral responsibility,
as Google, to solve this problem.
181
00:09:36,075 --> 00:09:37,743
And I sent this presentation
182
00:09:37,827 --> 00:09:41,789
to about 15, 20 of my closest colleagues at Google,
183
00:09:41,872 --> 00:09:44,959
and I was very nervous about it.
I wasn't sure how it was gonna land.
184
00:09:46,460 --> 00:09:48,045
When I went to work the next day,
185
00:09:48,129 --> 00:09:50,464
most of the laptops
had the presentation open.
186
00:09:52,133 --> 00:09:54,552
Later that day, there were, like,
400 simultaneous viewers,
187
00:09:54,635 --> 00:09:56,053
so it just kept growing and growing.
188
00:09:56,137 --> 00:10:00,266
I got e-mails from all around the company.
I mean, people in every department saying,
189
00:10:00,349 --> 00:10:02,852
"मैं पूरी तरह सहमत हूँ।"
"मैं इसे अपने बच्चों को प्रभावित करते हुए देखता हूं।"
190
00:10:02,935 --> 00:10:04,979
"मैं इसे
अपने आसपास के लोगों को प्रभावित करता देख रहा हूं।"
191
00:10:05,062 --> 00:10:06,939
"हमें इस बारे में कुछ करना होगा।"
192
00:10:07,481 --> 00:10:10,818
It felt like I was launching
a revolution or something like that.
193
00:10:11,861 --> 00:10:15,197
Later, I found out that Larry Page
had been notified about this presentation
194
00:10:15,281 --> 00:10:17,908
-in three separate meetings that day.
-[indistinct chatter]
195
00:10:17,992 --> 00:10:20,286
[Tristan] And so, it created
this kind of cultural moment
196
00:10:20,870 --> 00:10:24,415
-that Google needed to take seriously.
-[whooshing]
197
00:10:26,000 --> 00:10:28,878
-[Tristan] And then... nothing.
-[whooshing fades]
198
00:10:32,673 --> 00:10:34,216
[message alerts chiming]
199
00:10:34,300 --> 00:10:36,135
[Tim] Everyone in 2006...
200
00:10:37,219 --> 00:10:39,221
including all of us at Facebook,
201
00:10:39,305 --> 00:10:43,392
just had total admiration for Google
and what Google had built,
202
00:10:43,476 --> 00:10:47,396
which was this incredibly useful service
203
00:10:47,480 --> 00:10:51,442
that did, far as we could tell,
lots of goodness for the world,
204
00:10:51,525 --> 00:10:54,695
and they built
this parallel money machine.
205
00:10:55,404 --> 00:11:00,034
We had such envy for that,
and it seemed so elegant to us...
206
00:11:00,826 --> 00:11:02,161
and so perfect.
207
00:11:02,953 --> 00:11:05,289
Facebook had been around
for about two years,
208
00:11:05,373 --> 00:11:08,376
um, and I was hired to come in
and figure out
209
00:11:08,459 --> 00:11:10,586
what the business model was gonna be
for the company.
210
00:11:10,670 --> 00:11:13,422
I was the director of monetization.
The point was, like,
211
00:11:13,506 --> 00:11:17,051
"You're the person who's gonna figure out
how this thing monetizes."
212
00:11:17,134 --> 00:11:19,804
And there were a lot of people
who did a lot of the work,
213
00:11:19,887 --> 00:11:25,476
but I was clearly one of the people
who was pointing towards...
214
00:11:26,769 --> 00:11:28,562
"Well, we have to make money, A...
215
00:11:29,313 --> 00:11:33,651
and I think this advertising model
is probably the most elegant way.
216
00:11:36,278 --> 00:11:38,280
[bright instrumental music playing]
217
00:11:42,243 --> 00:11:44,370
Uh-oh. What's this video Mom just sent us?
218
00:11:44,453 --> 00:11:46,747
Oh, that's from a talk show,
but that's pretty good.
219
00:11:46,831 --> 00:11:47,873
Guy's kind of a genius.
220
00:11:47,957 --> 00:11:50,584
He's talking all about deleting
social media, which you gotta do.
221
00:11:50,668 --> 00:11:52,878
I might have to start blocking
her e-mails.
222
00:11:52,962 --> 00:11:54,880
I don't even know
what she's talking about, man.
223
00:11:54,964 --> 00:11:56,090
She's worse than I am.
224
00:11:56,173 --> 00:11:58,509
-No, she only uses it for recipes.
-Right, and work.
225
00:11:58,592 --> 00:12:00,553
-And workout videos.
-[guy] And to check up on us.
226
00:12:00,636 --> 00:12:03,055
And everyone else she's ever met
in her entire life.
227
00:12:04,932 --> 00:12:07,893
If you are scrolling through your social media feed
228
00:12:07,977 --> 00:12:11,731
while you're watchin' us, you need to put the damn phone down and listen up
229
00:12:11,814 --> 00:12:14,817
'cause our next guest has written
an incredible book
230
00:12:14,900 --> 00:12:18,112
about how much it's wrecking our lives.
231
00:12:18,195 --> 00:12:19,447
Please welcome author
232
00:12:19,530 --> 00:12:23,951
of Ten Arguments for Deleting Your Social Media Accounts Right Now...
233
00:12:24,034 --> 00:12:26,287
-[Sunny Hostin] Uh-huh.
-...Jaron Lanier.
234
00:12:26,370 --> 00:12:27,913
[cohosts speaking indistinctly]
235
00:12:27,997 --> 00:12:31,834
[Jaron] Companies like Google and Facebook
are some of the wealthiest
236
00:12:31,917 --> 00:12:33,544
and most successful of all time.
237
00:12:33,711 --> 00:12:36,839
Uh, they have relatively few employees.
238
00:12:36,922 --> 00:12:41,427
They just have this giant computer
that rakes in money, right? Uh...
239
00:12:41,510 --> 00:12:42,970
Now, what are they being paid for?
240
00:12:43,053 --> 00:12:45,222
[chuckles]
That's a really important question.
241
00:12:47,308 --> 00:12:50,311
[Roger] So, I've been an investor
in technology for 35 years.
242
00:12:51,020 --> 00:12:54,356
The first 50 years of Silicon Valley,
the industry made products--
243
00:12:54,440 --> 00:12:55,566
hardware, software--
244
00:12:55,649 --> 00:12:58,402
sold 'em to customers.
Nice, simple business.
245
00:12:58,486 --> 00:13:01,447
For the last ten years,
the biggest companies in Silicon Valley
246
00:13:01,530 --> 00:13:03,866
have been in the business
of selling their users.
247
00:13:03,949 --> 00:13:05,910
It's a little even trite to say now,
248
00:13:05,993 --> 00:13:09,205
but... because we don't pay
for the products that we use,
249
00:13:09,288 --> 00:13:12,166
advertisers pay
for the products that we use.
250
00:13:12,249 --> 00:13:14,210
Advertisers are the customers.
251
00:13:14,710 --> 00:13:16,086
We're the thing being sold.
252
00:13:16,170 --> 00:13:17,630
The classic saying is:
253
00:13:17,713 --> 00:13:21,592
"If you're not paying for the product,
then you are the product."
254
00:13:23,385 --> 00:13:27,223
A lot of people think, you know,
"Oh, well, Google's just a search box,
255
00:13:27,306 --> 00:13:29,850
and Facebook's just a place to see
what my friends are doing
256
00:13:29,934 --> 00:13:31,101
and see their photos."
257
00:13:31,185 --> 00:13:35,481
But what they don't realize
is they're competing for your attention.
258
00:13:36,524 --> 00:13:41,111
So, you know, Facebook, Snapchat,
Twitter, Instagram, YouTube,
259
00:13:41,195 --> 00:13:45,699
companies like this, their business model
is to keep people engaged on the screen.
260
00:13:46,283 --> 00:13:49,578
Let's figure out how to get
as much of this person's attention
261
00:13:49,662 --> 00:13:50,955
as we possibly can.
262
00:13:51,455 --> 00:13:53,374
How much time can we get you to spend?
263
00:13:53,874 --> 00:13:56,669
How much of your life can we get you
to give to us?
264
00:13:58,629 --> 00:14:01,090
[Justin] When you think about
how some of these companies work,
265
00:14:01,173 --> 00:14:02,424
it starts to make sense.
266
00:14:03,050 --> 00:14:06,095
There are all these services
on the Internet that we think of as free,
267
00:14:06,178 --> 00:14:09,473
but they're not free.
They're paid for by advertisers.
268
00:14:09,557 --> 00:14:11,559
Why do advertisers pay those companies?
269
00:14:11,642 --> 00:14:14,687
They pay in exchange for showing their ads
to us.
270
00:14:14,770 --> 00:14:18,357
We're the product. Our attention
is the product being sold to advertisers.
271
00:14:18,816 --> 00:14:20,442
That's a little too simplistic.
272
00:14:20,860 --> 00:14:23,654
It's the gradual, slight,
imperceptible change
273
00:14:23,737 --> 00:14:26,574
in your own behavior and perception
that is the product.
274
00:14:27,658 --> 00:14:30,244
And that is the product.
It's the only possible product.
275
00:14:30,327 --> 00:14:34,081
There's nothing else on the table
that could possibly be called the product.
276
00:14:34,164 --> 00:14:37,001
That's the only thing there is
for them to make money from.
277
00:14:37,668 --> 00:14:39,253
Changing what you do,
278
00:14:39,336 --> 00:14:41,714
how you think, who you are.
279
00:14:42,631 --> 00:14:45,301
It's a gradual change. It's slight.
280
00:14:45,384 --> 00:14:48,971
If you can go to somebody and you say,
"Give me $10 million,
281
00:14:49,054 --> 00:14:54,310
and I will change the world one percent
in the direction you want it to change..."
282
00:14:54,852 --> 00:14:58,188
It's the world! That can be incredible,
and that's worth a lot of money.
283
00:14:59,315 --> 00:15:00,149
Okay.
284
00:15:00,691 --> 00:15:04,570
[Shoshana] This is what every business
has always dreamt of:
285
00:15:04,653 --> 00:15:10,910
to have a guarantee that if it places
an ad, it will be successful.
286
00:15:11,327 --> 00:15:12,786
That's their business.
287
00:15:12,870 --> 00:15:14,413
They sell certainty.
288
00:15:14,997 --> 00:15:17,625
In order to be successful
in that business,
289
00:15:17,708 --> 00:15:19,793
you have to have great predictions.
290
00:15:20,085 --> 00:15:24,173
Great predictions begin
with one imperative:
291
00:15:25,215 --> 00:15:26,926
you need a lot of data.
292
00:15:29,136 --> 00:15:31,305
Many people call this
surveillance capitalism,
293
00:15:31,639 --> 00:15:34,350
capitalism profiting
off of the infinite tracking
294
00:15:34,433 --> 00:15:38,062
of everywhere everyone goes
by large technology companies
295
00:15:38,145 --> 00:15:40,356
whose business model is to make sure
296
00:15:40,439 --> 00:15:42,858
that advertisers are as successful
as possible.
297
00:15:42,942 --> 00:15:45,569
This is a new kind of marketplace now.
298
00:15:45,653 --> 00:15:48,072
It's a marketplace
that never existed before.
299
00:15:48,822 --> 00:15:55,371
And it's a marketplace
that trades exclusively in human futures.
300
00:15:56,080 --> 00:16:01,585
Just like there are markets that trade
in pork belly futures or oil futures.
301
00:16:02,127 --> 00:16:07,591
We now have markets
that trade in human futures at scale,
302
00:16:08,175 --> 00:16:13,472
and those markets have produced
the trillions of dollars
303
00:16:14,014 --> 00:16:19,269
that have made the Internet companies
the richest companies
304
00:16:19,353 --> 00:16:22,356
in the history of humanity.
305
00:16:23,357 --> 00:16:25,359
[indistinct chatter]
306
00:16:27,361 --> 00:16:30,990
[Jeff] What I want people to know
is that everything they're doing online
307
00:16:31,073 --> 00:16:34,326
is being watched, is being tracked,
is being measured.
308
00:16:35,035 --> 00:16:39,623
Every single action you take
is carefully monitored and recorded.
309
00:16:39,707 --> 00:16:43,836
Exactly what image you stop and look at,
for how long you look at it.
310
00:16:43,919 --> 00:16:45,796
Oh, yeah, seriously,
for how long you look at it.
311
00:16:45,879 --> 00:16:47,881
[monitors beeping]
312
00:16:50,509 --> 00:16:52,219
[Tristan] They know when people are lonely.
313
00:16:52,302 --> 00:16:53,804
They know when people are depressed.
314
00:16:53,887 --> 00:16:57,099
They know when people are looking at photos of your ex-romantic partners.
315
00:16:57,182 --> 00:17:00,853
They know what you're doing late at night. They know the entire thing.
316
00:17:01,270 --> 00:17:03,230
Whether you're an introvert or an extrovert,
317
00:17:03,313 --> 00:17:06,817
or what kind of neuroses you have, what your personality type is like.
318
00:17:08,193 --> 00:17:11,613
[Shoshana] They have more information
about us
319
00:17:11,697 --> 00:17:14,324
than has ever been imagined
in human history.
320
00:17:14,950 --> 00:17:16,368
It is unprecedented.
321
00:17:18,579 --> 00:17:22,791
And so, all of this data that we're...
that we're just pouring out all the time
322
00:17:22,875 --> 00:17:26,754
is being fed into these systems
that have almost no human supervision
323
00:17:27,463 --> 00:17:30,883
and that are making better and better
and better and better predictions
324
00:17:30,966 --> 00:17:33,552
about what we're gonna do
and... and who we are.
325
00:17:33,635 --> 00:17:35,637
[indistinct chatter]
326
00:17:36,305 --> 00:17:39,349
[Aza] People have the misconception
it's our data being sold.
327
00:17:40,350 --> 00:17:43,187
It's not in Facebook's business interest
to give up the data.
328
00:17:45,522 --> 00:17:47,107
What do they do with that data?
329
00:17:49,401 --> 00:17:50,986
[console whirring]
330
00:17:51,070 --> 00:17:54,490
[Aza] They build models
that predict our actions,
331
00:17:54,573 --> 00:17:57,618
and whoever has the best model wins.
332
00:18:02,706 --> 00:18:04,041
His scrolling speed is slowing.
333
00:18:04,124 --> 00:18:06,085
Nearing the end
of his average session length.
334
00:18:06,168 --> 00:18:07,002
Decreasing ad load.
335
00:18:07,086 --> 00:18:08,337
Pull back on friends and family.
336
00:18:09,588 --> 00:18:11,340
[Tristan] On the other side of the screen,
337
00:18:11,423 --> 00:18:15,469
it's almost as if they had
this avatar voodoo doll-like model of us.
338
00:18:16,845 --> 00:18:18,180
All of the things we've ever done,
339
00:18:18,263 --> 00:18:19,473
all the clicks we've ever made,
340
00:18:19,556 --> 00:18:21,642
all the videos we've watched,
all the likes,
341
00:18:21,725 --> 00:18:25,354
that all gets brought back into building
a more and more accurate model.
342
00:18:25,896 --> 00:18:27,481
The model, once you have it,
343
00:18:27,564 --> 00:18:29,858
you can predict the kinds of things
that person does.
344
00:18:29,942 --> 00:18:31,777
Right, let me just test.
345
00:18:32,569 --> 00:18:34,988
[Tristan] Where you'll go.
I can predict what kind of videos
346
00:18:35,072 --> 00:18:36,115
will keep you watching.
347
00:18:36,198 --> 00:18:39,159
I can predict what kinds of emotions tend
to trigger you.
348
00:18:39,243 --> 00:18:40,410
[blue AI] Yes, perfect.
349
00:18:41,578 --> 00:18:43,372
The most epic fails of the year.
350
00:18:46,125 --> 00:18:47,543
-[crowd groans on video]
-[whooshes]
351
00:18:48,627 --> 00:18:51,088
-Perfect. That worked.
-Following with another video.
352
00:18:51,171 --> 00:18:54,049
Beautiful. Let's squeeze in a sneaker ad
before it starts.
353
00:18:56,426 --> 00:18:58,178
[Tristan] At a lot
of technology companies,
354
00:18:58,262 --> 00:18:59,721
there's three main goals.
355
00:18:59,805 --> 00:19:01,348
There's the engagement goal:
356
00:19:01,431 --> 00:19:03,684
to drive up your usage,
to keep you scrolling.
357
00:19:04,601 --> 00:19:06,145
There's the growth goal:
358
00:19:06,228 --> 00:19:08,689
to keep you coming back
and inviting as many friends
359
00:19:08,772 --> 00:19:10,816
and getting them to invite more friends.
360
00:19:11,650 --> 00:19:13,152
And then there's the advertising goal:
361
00:19:13,235 --> 00:19:14,987
to make sure that,
as all that's happening,
362
00:19:15,070 --> 00:19:17,406
we're making as much money as possible
from advertising.
363
00:19:18,115 --> 00:19:19,158
[console beeps]
364
00:19:19,241 --> 00:19:21,994
Each of these goals are powered
by algorithms
365
00:19:22,077 --> 00:19:24,454
whose job is to figure out
what to show you
366
00:19:24,538 --> 00:19:26,165
to keep those numbers going up.
367
00:19:26,623 --> 00:19:29,918
We often talked about, at Facebook,
this idea
368
00:19:30,002 --> 00:19:34,006
of being able to just dial that as needed.
369
00:19:34,673 --> 00:19:38,594
And, you know, we talked
about having Mark have those dials.
370
00:19:41,305 --> 00:19:44,474
"Hey, I want more users in Korea today."
371
00:19:45,684 --> 00:19:46,602
"Turn the dial."
372
00:19:47,436 --> 00:19:49,188
"Let's dial up the ads a little bit."
373
00:19:49,980 --> 00:19:51,899
"Dial up monetization, just slightly."
374
00:19:52,858 --> 00:19:55,444
And so, that happ--
375
00:19:55,527 --> 00:19:59,239
I mean, at all of these companies,
there is that level of precision.
376
00:19:59,990 --> 00:20:02,409
-Dude, how--
-I don't know how I didn't get carded.
377
00:20:02,492 --> 00:20:05,704
-That ref just, like, sucked or something.
-You got literally all the way...
378
00:20:05,787 --> 00:20:07,956
-That's Rebecca. Go talk to her.
-I know who it is.
379
00:20:08,040 --> 00:20:10,834
-Dude, yo, go talk to her.
-[guy] I'm workin' on it.
380
00:20:10,918 --> 00:20:14,171
His calendar says he's on a break
right now. We should be live.
381
00:20:14,755 --> 00:20:16,465
[sighs] Want me to nudge him?
382
00:20:17,132 --> 00:20:18,050
Yeah, nudge away.
383
00:20:18,133 --> 00:20:19,092
[console beeps]
384
00:20:21,637 --> 00:20:24,181
"Your friend Tyler just joined.
Say hi with a wave."
385
00:20:26,016 --> 00:20:27,184
[Engagement AI] Come on, Ben.
386
00:20:27,267 --> 00:20:29,311
Send a wave. [sighs]
387
00:20:29,394 --> 00:20:32,606
-You're not... Go talk to her, dude.
-[phone vibrates, chimes]
388
00:20:33,857 --> 00:20:35,484
-[Ben sighs]
-[cell phone chimes]
389
00:20:36,902 --> 00:20:37,986
[console beeps]
390
00:20:38,070 --> 00:20:40,447
New link! All right, we're on. [exhales]
391
00:20:40,948 --> 00:20:46,078
Follow that up with a post
from User 079044238820, Rebecca.
392
00:20:46,161 --> 00:20:49,790
Good idea. GPS coordinates indicate
that they're in close proximity.
393
00:20:55,921 --> 00:20:57,172
He's primed for an ad.
394
00:20:57,631 --> 00:20:58,632
Auction time.
395
00:21:00,133 --> 00:21:02,803
Sold! To Deep Fade hair wax.
396
00:21:03,387 --> 00:21:07,933
We had 468 interested bidders. We sold Ben
at 3.262 cents for an impression.
397
00:21:08,850 --> 00:21:10,852
[melancholy piano music playing]
398
00:21:14,147 --> 00:21:15,065
[Ben sighs]
399
00:21:17,109 --> 00:21:18,735
[Jaron] We've created a world
400
00:21:18,819 --> 00:21:21,530
in which online connection
has become primary,
401
00:21:22,072 --> 00:21:23,907
especially for younger generations.
402
00:21:23,991 --> 00:21:28,328
And yet, in that world,
any time two people connect,
403
00:21:29,162 --> 00:21:33,250
the only way it's financed
is through a sneaky third person
404
00:21:33,333 --> 00:21:35,627
who's paying to manipulate
those two people.
405
00:21:36,128 --> 00:21:39,381
So, we've created
an entire global generation of people
406
00:21:39,464 --> 00:21:44,011
who are raised within a context
where the very meaning of communication,
407
00:21:44,094 --> 00:21:47,431
the very meaning of culture,
is manipulation.
408
00:21:47,514 --> 00:21:49,641
We've put deceit and sneakiness
409
00:21:49,725 --> 00:21:52,311
at the absolute center
of everything we do.
410
00:22:05,615 --> 00:22:07,242
-[interviewer] Grab the...
-[Tristan] Okay.
411
00:22:07,326 --> 00:22:09,286
-Where's it help to hold it?
-[interviewer] Great.
412
00:22:09,369 --> 00:22:10,787
-[Tristan] Here?
-[interviewer] Yeah.
413
00:22:10,871 --> 00:22:13,832
How does this come across on camera
if I were to do, like, this move--
414
00:22:13,915 --> 00:22:15,542
-[interviewer] We can--
-[blows] Like that?
415
00:22:15,625 --> 00:22:16,918
-[interviewer laughs] What?
-Yeah.
416
00:22:17,002 --> 00:22:19,004
-[interviewer] Do that again.
-Exactly. Yeah. [blows]
417
00:22:19,087 --> 00:22:20,589
Yeah. No, it's probably not...
418
00:22:20,672 --> 00:22:21,965
Like... yeah.
419
00:22:22,466 --> 00:22:23,884
I mean, this one is less...
420
00:22:29,681 --> 00:22:33,268
[interviewer laughs] Larissa's, like,
actually freaking out over here.
421
00:22:34,728 --> 00:22:35,562
Is that good?
422
00:22:35,645 --> 00:22:37,773
[instrumental music playing]
423
00:22:37,856 --> 00:22:41,068
[Tristan] I was, like, five years old
when I learned how to do magic.
424
00:22:41,151 --> 00:22:45,781
And I could fool adults,
fully-grown adults with, like, PhDs.
425
00:22:55,040 --> 00:22:57,709
Magicians were almost like
the first neuroscientists
426
00:22:57,793 --> 00:22:58,960
and psychologists.
427
00:22:59,044 --> 00:23:02,005
Like, they were the ones
who first understood
428
00:23:02,089 --> 00:23:03,382
how people's minds work.
429
00:23:04,216 --> 00:23:07,677
They just, in real time, are testing
lots and lots of stuff on people.
430
00:23:09,137 --> 00:23:11,139
A magician understands something,
431
00:23:11,223 --> 00:23:14,017
some part of your mind
that we're not aware of.
432
00:23:14,101 --> 00:23:15,936
That's what makes the illusion work.
433
00:23:16,019 --> 00:23:20,607
Doctors, lawyers, people who know
how to build 747s or nuclear missiles,
434
00:23:20,690 --> 00:23:24,361
they don't know more about
how their own mind is vulnerable.
435
00:23:24,444 --> 00:23:26,113
That's a separate discipline.
436
00:23:26,571 --> 00:23:28,990
And it's a discipline
that applies to all human beings.
437
00:23:30,909 --> 00:23:34,079
From that perspective, you can have
a very different understanding
438
00:23:34,162 --> 00:23:35,580
of what technology is doing.
439
00:23:36,873 --> 00:23:39,584
When I was
at the Stanford Persuasive Technology Lab,
440
00:23:39,668 --> 00:23:41,044
this is what we learned.
441
00:23:41,628 --> 00:23:43,463
How could you use everything we know
442
00:23:43,547 --> 00:23:45,882
about the psychology
of what persuades people
443
00:23:45,966 --> 00:23:48,385
and build that into technology?
444
00:23:48,468 --> 00:23:50,887
Now, many of you in the audience
are geniuses already.
445
00:23:50,971 --> 00:23:55,851
I think that's true, but my goal is
to turn you into a behavior-change genius.
446
00:23:56,852 --> 00:24:01,148
There are many prominent Silicon Valley
figures who went through that class--
447
00:24:01,231 --> 00:24:05,485
key growth figures at Facebook and Uber
and... and other companies--
448
00:24:05,569 --> 00:24:09,197
and learned how to make technology
more persuasive,
449
00:24:09,614 --> 00:24:10,782
Tristan being one.
450
00:24:12,284 --> 00:24:14,619
[Tristan] Persuasive technology
is just sort of design
451
00:24:14,703 --> 00:24:16,580
intentionally applied to the extreme,
452
00:24:16,663 --> 00:24:18,874
where we really want to modify
someone's behavior.
453
00:24:18,957 --> 00:24:20,542
We want them to take this action.
454
00:24:20,625 --> 00:24:23,336
We want them to keep doing this
with their finger.
455
00:24:23,420 --> 00:24:26,256
You pull down and you refresh,
it's gonna be a new thing at the top.
456
00:24:26,339 --> 00:24:28,508
Pull down and refresh again, it's new.
Every single time.
457
00:24:28,592 --> 00:24:33,722
Which, in psychology, we call
a positive intermittent reinforcement.
458
00:24:33,805 --> 00:24:37,142
You don't know when you're gonna get it
or if you're gonna get something,
459
00:24:37,225 --> 00:24:40,061
which operates just like the slot machines
in Vegas.
460
00:24:40,145 --> 00:24:42,230
It's not enough
that you use the product consciously,
461
00:24:42,314 --> 00:24:44,024
I wanna dig down deeper
into the brain stem
462
00:24:44,107 --> 00:24:45,817
and implant, inside of you,
463
00:24:45,901 --> 00:24:47,652
an unconscious habit
464
00:24:47,736 --> 00:24:50,864
so that you are being programmed
at a deeper level.
465
00:24:50,947 --> 00:24:52,115
You don't even realize it.
466
00:24:52,532 --> 00:24:54,034
[teacher] A man, James Marshall...
467
00:24:54,117 --> 00:24:56,286
[Tristan] Every time you see it there
on the counter,
468
00:24:56,369 --> 00:24:59,789
and you just look at it,
and you know if you reach over,
469
00:24:59,873 --> 00:25:01,333
it just might have something for you,
470
00:25:01,416 --> 00:25:03,877
so you play that slot machine
to see what you got, right?
471
00:25:03,960 --> 00:25:06,046
That's not by accident.
That's a design technique.
472
00:25:06,129 --> 00:25:08,632
[teacher] He brings a golden nugget
to an officer
473
00:25:09,841 --> 00:25:11,301
in the army in San Francisco.
474
00:25:12,219 --> 00:25:15,388
Mind you, the... the population
of San Francisco was only...
475
00:25:15,472 --> 00:25:17,432
[Jeff]
Another example is photo tagging.
476
00:25:17,516 --> 00:25:19,643
-[teacher] The secret didn't last.
-[phone vibrates]
477
00:25:19,726 --> 00:25:21,186
[Jeff] So, if you get an e-mail
478
00:25:21,269 --> 00:25:24,064
that says your friend just tagged you
in a photo,
479
00:25:24,147 --> 00:25:28,568
of course you're going to click
on that e-mail and look at the photo.
480
00:25:29,152 --> 00:25:31,821
It's not something
you can just decide to ignore.
481
00:25:32,364 --> 00:25:34,157
This is deep-seated, like,
482
00:25:34,241 --> 00:25:36,326
human personality
that they're tapping into.
483
00:25:36,409 --> 00:25:38,078
What you should be asking yourself is:
484
00:25:38,161 --> 00:25:40,288
"Why doesn't that e-mail contain
the photo in it?
485
00:25:40,372 --> 00:25:42,457
It would be a lot easier
to see the photo."
486
00:25:42,541 --> 00:25:45,919
When Facebook found that feature,
they just dialed the hell out of that
487
00:25:46,002 --> 00:25:48,505
because they said, "This is gonna be
a great way to grow activity.
488
00:25:48,588 --> 00:25:51,091
Let's just get people tagging each other
in photos all day long."
489
00:25:51,174 --> 00:25:53,176
[upbeat techno music playing]
490
00:25:57,889 --> 00:25:58,890
[cell phone chimes]
491
00:25:59,349 --> 00:26:00,475
He commented.
492
00:26:00,559 --> 00:26:01,434
[Growth AI] Nice.
493
00:26:01,935 --> 00:26:04,688
Okay, Rebecca received it,
and she is responding.
494
00:26:04,771 --> 00:26:07,566
All right, let Ben know that she's typing
so we don't lose him.
495
00:26:07,649 --> 00:26:08,733
Activating ellipsis.
496
00:26:09,776 --> 00:26:11,945
[teacher continues speaking indistinctly]
497
00:26:13,697 --> 00:26:15,865
[tense instrumental music playing]
498
00:26:19,953 --> 00:26:21,329
Great, she posted.
499
00:26:21,454 --> 00:26:24,249
He's commenting on her comment
about his comment on her post.
500
00:26:25,041 --> 00:26:26,418
Hold on, he stopped typing.
501
00:26:26,751 --> 00:26:27,752
Let's autofill.
502
00:26:28,420 --> 00:26:30,005
Emojis. He loves emojis.
503
00:26:33,842 --> 00:26:34,676
He went with fire.
504
00:26:34,759 --> 00:26:36,803
[clicks tongue, sighs]
I was rootin' for eggplant.
505
00:26:38,597 --> 00:26:42,726
[Tristan] There's an entire discipline
and field called "growth hacking."
506
00:26:42,809 --> 00:26:47,147
Teams of engineers
whose job is to hack people's psychology
507
00:26:47,230 --> 00:26:48,565
so they can get more growth.
508
00:26:48,648 --> 00:26:50,984
They can get more user sign-ups,
more engagement.
509
00:26:51,067 --> 00:26:52,861
They can get you to invite more people.
510
00:26:52,944 --> 00:26:55,989
After all the testing, all the iterating,
all of this stuff,
511
00:26:56,072 --> 00:26:57,907
you know the single biggest thing
we realized?
512
00:26:57,991 --> 00:27:00,702
Get any individual to seven friends
in ten days.
513
00:27:01,953 --> 00:27:02,787
That was it.
514
00:27:02,871 --> 00:27:05,498
Chamath was the head of growth at Facebook
early on,
515
00:27:05,582 --> 00:27:08,251
and he's very well known
in the tech industry
516
00:27:08,335 --> 00:27:11,004
for pioneering a lot of the growth tactics
517
00:27:11,087 --> 00:27:14,758
that were used to grow Facebook
at incredible speed.
518
00:27:14,841 --> 00:27:18,553
And those growth tactics have then become
the standard playbook for Silicon Valley.
519
00:27:18,637 --> 00:27:21,222
They were used at Uber
and at a bunch of other companies.
520
00:27:21,306 --> 00:27:27,062
One of the things that he pioneered
was the use of scientific A/B testing
521
00:27:27,145 --> 00:27:28,480
of small feature changes.
522
00:27:29,022 --> 00:27:30,940
Companies like Google and Facebook
523
00:27:31,024 --> 00:27:34,569
would roll out
lots of little, tiny experiments
524
00:27:34,653 --> 00:27:36,821
that they were constantly doing on users.
525
00:27:36,905 --> 00:27:39,866
And over time,
by running these constant experiments,
526
00:27:39,949 --> 00:27:43,036
you... you develop the most optimal way
527
00:27:43,119 --> 00:27:45,288
to get users to do
what you want them to do.
528
00:27:45,372 --> 00:27:46,790
It's... It's manipulation.
529
00:27:47,332 --> 00:27:49,459
[interviewer]
Uh, you're making me feel like a lab rat.
530
00:27:49,834 --> 00:27:51,920
You are a lab rat. We're all lab rats.
531
00:27:52,545 --> 00:27:55,548
And it's not like we're lab rats
for developing a cure for cancer.
532
00:27:55,632 --> 00:27:58,134
It's not like they're trying
to benefit us.
533
00:27:58,218 --> 00:28:01,680
Right? We're just zombies,
and they want us to look at more ads
534
00:28:01,763 --> 00:28:03,181
so they can make more money.
535
00:28:03,556 --> 00:28:05,266
[Shoshana] Facebook conducted
536
00:28:05,350 --> 00:28:08,228
what they called
"massive-scale contagion experiments."
537
00:28:08,311 --> 00:28:09,145
Okay.
538
00:28:09,229 --> 00:28:13,066
[Shoshana] How do we use subliminal cues
on the Facebook pages
539
00:28:13,400 --> 00:28:17,654
to get more people to go vote
in the midterm elections?
540
00:28:17,987 --> 00:28:20,824
And they discovered
that they were able to do that.
541
00:28:20,907 --> 00:28:24,160
One thing they concluded
is that we now know
542
00:28:24,744 --> 00:28:28,915
we can affect real-world behavior
and emotions
543
00:28:28,998 --> 00:28:32,877
without ever triggering
the user's awareness.
544
00:28:33,378 --> 00:28:37,382
They are completely clueless.
545
00:28:38,049 --> 00:28:41,970
We're pointing these engines of AI
back at ourselves
546
00:28:42,053 --> 00:28:46,224
to reverse-engineer what elicits responses
from us.
547
00:28:47,100 --> 00:28:49,561
Almost like you're stimulating nerve cells
on a spider
548
00:28:49,644 --> 00:28:51,479
to see what causes its legs to respond.
549
00:28:51,938 --> 00:28:53,940
So, it really is
this kind of prison experiment
550
00:28:54,023 --> 00:28:56,735
where we're just, you know,
roping people into the matrix,
551
00:28:56,818 --> 00:29:00,572
and we're just harvesting all this money
and... and data from all their activity
552
00:29:00,655 --> 00:29:01,489
to profit from.
553
00:29:01,573 --> 00:29:03,450
And we're not even aware
that it's happening.
554
00:29:04,117 --> 00:29:07,912
So, we want to psychologically figure out
how to manipulate you as fast as possible
555
00:29:07,996 --> 00:29:10,081
and then give you back that dopamine hit.
556
00:29:10,165 --> 00:29:12,375
We did that brilliantly at Facebook.
557
00:29:12,625 --> 00:29:14,919
Instagram has done it.
WhatsApp has done it.
558
00:29:15,003 --> 00:29:17,380
You know, Snapchat has done it.
Twitter has done it.
559
00:29:17,464 --> 00:29:19,424
I mean, it's exactly the kind of thing
560
00:29:19,507 --> 00:29:22,427
that a... that a hacker like myself
would come up with
561
00:29:22,510 --> 00:29:27,015
because you're exploiting a vulnerability
in... in human psychology.
562
00:29:27,807 --> 00:29:29,726
[chuckles] And I just...
I think that we...
563
00:29:29,809 --> 00:29:33,438
you know, the inventors, creators...
564
00:29:33,980 --> 00:29:37,317
uh, you know, and it's me, it's Mark,
it's the...
565
00:29:37,400 --> 00:29:40,403
you know, Kevin Systrom at Instagram...
It's all of these people...
566
00:29:40,487 --> 00:29:46,451
um, understood this consciously,
and we did it anyway.
567
00:29:50,580 --> 00:29:53,750
No one got upset when bicycles showed up.
568
00:29:55,043 --> 00:29:58,004
Right? Like, if everyone's starting
to go around on bicycles,
569
00:29:58,087 --> 00:30:00,924
no one said,
"Oh, my God, we've just ruined society.
570
00:30:01,007 --> 00:30:03,051
[chuckles]
Like, bicycles are affecting people.
571
00:30:03,134 --> 00:30:05,303
They're pulling people
away from their kids.
572
00:30:05,386 --> 00:30:08,723
They're ruining the fabric of democracy.
People can't tell what's true."
573
00:30:08,807 --> 00:30:11,476
Like, we never said any of that stuff
about a bicycle.
574
00:30:12,769 --> 00:30:16,147
If something is a tool,
it genuinely is just sitting there,
575
00:30:16,731 --> 00:30:18,733
waiting patiently.
576
00:30:19,317 --> 00:30:22,821
If something is not a tool,
it's demanding things from you.
577
00:30:22,904 --> 00:30:26,533
It's seducing you. It's manipulating you.
It wants things from you.
578
00:30:26,950 --> 00:30:30,495
And we've moved away from having
a tools-based technology environment
579
00:30:31,037 --> 00:30:34,499
to an addiction- and manipulation-based
technology environment.
580
00:30:34,582 --> 00:30:35,708
That's what's changed.
581
00:30:35,792 --> 00:30:39,420
Social media isn't a tool
that's just waiting to be used.
582
00:30:39,504 --> 00:30:43,466
It has its own goals,
and it has its own means of pursuing them
583
00:30:43,550 --> 00:30:45,677
by using your psychology against you.
584
00:30:45,760 --> 00:30:47,762
[ominous instrumental music playing]
585
00:30:57,564 --> 00:31:00,567
[Tim] Rewind a few years ago,
I was the...
586
00:31:00,650 --> 00:31:02,318
I was the president of Pinterest.
587
00:31:03,152 --> 00:31:05,113
I was coming home,
588
00:31:05,196 --> 00:31:08,366
and I couldn't get off my phone
once I got home,
589
00:31:08,449 --> 00:31:12,161
despite having two young kids
who needed my love and attention.
590
00:31:12,245 --> 00:31:15,748
I was in the pantry, you know,
typing away on an e-mail
591
00:31:15,832 --> 00:31:17,542
or sometimes looking at Pinterest.
592
00:31:18,001 --> 00:31:19,627
I thought, "God, this is classic irony.
593
00:31:19,711 --> 00:31:22,046
I am going to work during the day
594
00:31:22,130 --> 00:31:26,426
and building something
that then I am falling prey to."
595
00:31:26,509 --> 00:31:30,096
And I couldn't... I mean, some
of those moments, I couldn't help myself.
596
00:31:30,179 --> 00:31:31,848
-[notification chimes]
-[woman gasps]
597
00:31:32,307 --> 00:31:36,102
The one
that I'm... I'm most prone to is Twitter.
598
00:31:36,185 --> 00:31:38,021
Uh, used to be Reddit.
599
00:31:38,104 --> 00:31:42,859
I actually had to write myself software
to break my addiction to reading Reddit.
600
00:31:42,942 --> 00:31:44,903
-[notifications chime]
-[slot machines whir]
601
00:31:45,403 --> 00:31:47,780
I'm probably most addicted to my e-mail.
602
00:31:47,864 --> 00:31:49,866
I mean, really. I mean, I... I feel it.
603
00:31:49,949 --> 00:31:51,409
-[notifications chime]
-[woman gasps]
604
00:31:51,492 --> 00:31:52,493
[electricity crackles]
605
00:31:52,577 --> 00:31:54,954
Well, I mean, it's sort-- it's interesting
606
00:31:55,038 --> 00:31:58,166
that knowing what was going on
behind the curtain,
607
00:31:58,249 --> 00:32:01,628
I still wasn't able to control my usage.
608
00:32:01,711 --> 00:32:03,046
So, that's a little scary.
609
00:32:03,630 --> 00:32:07,050
Even knowing how these tricks work,
I'm still susceptible to them.
610
00:32:07,133 --> 00:32:09,886
I'll still pick up the phone,
and 20 minutes will disappear.
611
00:32:09,969 --> 00:32:11,387
[notifications chime]
612
00:32:11,471 --> 00:32:12,722
-[fluid rushes]
-[woman gasps]
613
00:32:12,805 --> 00:32:15,725
Do you check your smartphone
before you pee in the morning
614
00:32:15,808 --> 00:32:17,477
or while you're peeing in the morning?
615
00:32:17,560 --> 00:32:19,479
'Cause those are the only two choices.
616
00:32:19,562 --> 00:32:23,274
I tried through willpower,
just pure willpower...
617
00:32:23,358 --> 00:32:26,903
"I'll put down my phone, I'll leave
my phone in the car when I get home."
618
00:32:26,986 --> 00:32:30,573
I think I told myself a thousand times,
a thousand different days,
619
00:32:30,657 --> 00:32:32,617
"I am not gonna bring my phone
to the bedroom,"
620
00:32:32,700 --> 00:32:34,535
and then 9:00 p.m. rolls around.
621
00:32:34,619 --> 00:32:37,121
"Well, I wanna bring my phone
in the bedroom."
622
00:32:37,205 --> 00:32:39,290
[takes a deep breath]
And so, that was sort of...
623
00:32:39,374 --> 00:32:41,125
Willpower was kind of attempt one,
624
00:32:41,209 --> 00:32:44,295
and then attempt two was,
you know, brute force.
625
00:32:44,379 --> 00:32:48,091
[announcer] Introducing the Kitchen Safe.
The Kitchen Safe is a revolutionary,
626
00:32:48,174 --> 00:32:51,678
new, time-locking container
that helps you fight temptation.
627
00:32:51,761 --> 00:32:56,724
All David has to do is place those temptations in the Kitchen Safe.
628
00:32:57,392 --> 00:33:00,395
Next, he rotates the dial to set the timer.
629
00:33:01,479 --> 00:33:04,232
And, finally, he presses the dial to activate the lock.
630
00:33:04,315 --> 00:33:05,525
The Kitchen Safe is great...
631
00:33:05,608 --> 00:33:06,776
We have that, don't we?
632
00:33:06,859 --> 00:33:08,653
...video games, credit cards, and cell phones.
633
00:33:08,736 --> 00:33:09,654
Yeah, we do.
634
00:33:09,737 --> 00:33:12,407
[announcer] Once the Kitchen Safe is locked, it cannot be opened
635
00:33:12,490 --> 00:33:13,866
until the timer reaches zero.
636
00:33:13,950 --> 00:33:15,618
[Anna] So, here's the thing.
637
00:33:15,702 --> 00:33:17,537
Social media is a drug.
638
00:33:17,620 --> 00:33:20,873
I mean,
we have a basic biological imperative
639
00:33:20,957 --> 00:33:23,084
to connect with other people.
640
00:33:23,167 --> 00:33:28,214
That directly affects the release
of dopamine in the reward pathway.
641
00:33:28,297 --> 00:33:32,552
Millions of years of evolution, um,
are behind that system
642
00:33:32,635 --> 00:33:35,596
to get us to come together
and live in communities,
643
00:33:35,680 --> 00:33:38,016
to find mates, to propagate our species.
644
00:33:38,099 --> 00:33:41,853
So, there's no doubt
that a vehicle like social media,
645
00:33:41,936 --> 00:33:45,690
which optimizes this connection
between people,
646
00:33:45,773 --> 00:33:48,568
is going to have the potential
for addiction.
647
00:33:52,071 --> 00:33:54,115
-Mmm! [laughs]
-Dad, stop!
648
00:33:55,450 --> 00:33:58,453
I have, like, 1,000 more snips
to send before dinner.
649
00:33:58,536 --> 00:34:00,788
-[dad] Snips?
-I don't know what a snip is.
650
00:34:00,872 --> 00:34:03,207
-Mm, that smells good, baby.
-All right. Thank you.
651
00:34:03,291 --> 00:34:05,877
I was, um, thinking we could use
all five senses
652
00:34:05,960 --> 00:34:07,712
to enjoy our dinner tonight.
653
00:34:07,795 --> 00:34:11,382
So, I decided that we're not gonna have
any cell phones at the table tonight.
654
00:34:11,466 --> 00:34:13,301
So, turn 'em in.
655
00:34:13,801 --> 00:34:14,802
-Really?
-[mom] Yep.
656
00:34:15,928 --> 00:34:18,056
-All right.
-Thank you. Ben?
657
00:34:18,139 --> 00:34:20,433
-Okay.
-Mom, the phone pirate. [scoffs]
658
00:34:21,100 --> 00:34:21,934
-Got it.
-Mom!
659
00:34:22,518 --> 00:34:26,147
So, they will be safe in here
until after dinner...
660
00:34:27,273 --> 00:34:30,651
-and everyone can just chill out.
-[safe whirs]
661
00:34:30,735 --> 00:34:31,569
Okay?
662
00:34:40,828 --> 00:34:41,704
[Cass sighs]
663
00:34:45,708 --> 00:34:47,043
[notification chimes]
664
00:34:47,418 --> 00:34:49,253
-Can I just see who it is?
-No.
665
00:34:54,759 --> 00:34:56,969
Just gonna go get another fork.
666
00:34:58,304 --> 00:34:59,263
Thank you.
667
00:35:04,727 --> 00:35:06,771
Honey, you can't open that.
668
00:35:06,854 --> 00:35:09,315
I locked it for an hour,
so just leave it alone.
669
00:35:11,192 --> 00:35:13,361
So, what should we talk about?
670
00:35:13,444 --> 00:35:14,695
Well, we could talk
671
00:35:14,779 --> 00:35:17,615
about the, uh, Extreme Center wackos
I drove by today.
672
00:35:17,698 --> 00:35:18,825
-[mom] Please, Frank.
-What?
673
00:35:18,908 --> 00:35:20,785
[mom] I don't wanna talk about politics.
674
00:35:20,868 --> 00:35:23,538
-What's wrong with the Extreme Center?
-See? He doesn't even get it.
675
00:35:23,621 --> 00:35:24,622
It depends on who you ask.
676
00:35:24,705 --> 00:35:26,624
It's like asking,
"What's wrong with propaganda?"
677
00:35:26,707 --> 00:35:28,376
-[safe smashes]
-[mom and Frank scream]
678
00:35:28,709 --> 00:35:29,710
[Frank] Isla!
679
00:35:32,797 --> 00:35:33,756
Oh, my God.
680
00:35:36,425 --> 00:35:38,553
-[sighs] Do you want me to...
-[mom] Yeah.
681
00:35:41,973 --> 00:35:43,933
[Anna] I... I'm worried about my kids.
682
00:35:44,016 --> 00:35:46,686
And if you have kids,
I'm worried about your kids.
683
00:35:46,769 --> 00:35:50,189
Armed with all the knowledge that I have
and all of the experience,
684
00:35:50,273 --> 00:35:52,108
I am fighting my kids about the time
685
00:35:52,191 --> 00:35:54,443
that they spend on phones
and on the computer.
686
00:35:54,527 --> 00:35:58,197
I will say to my son, "How many hours do
you think you're spending on your phone?"
687
00:35:58,281 --> 00:36:01,075
He'll be like, "It's, like, half an hour.
It's half an hour, tops."
688
00:36:01,159 --> 00:36:04,829
I'd say upwards hour, hour and a half.
689
00:36:04,912 --> 00:36:06,789
I looked at his screen report
a couple weeks ago.
690
00:36:06,873 --> 00:36:08,708
-Three hours and 45 minutes.
-[James] That...
691
00:36:11,377 --> 00:36:13,588
I don't think that's...
No. Per day, on average?
692
00:36:13,671 --> 00:36:15,506
-Yeah.
-Should I go get it right now?
693
00:36:15,590 --> 00:36:19,177
There's not a day that goes by
that I don't remind my kids
694
00:36:19,260 --> 00:36:21,762
about the pleasure-pain balance,
695
00:36:21,846 --> 00:36:24,390
about dopamine deficit states,
696
00:36:24,473 --> 00:36:26,267
about the risk of addiction.
697
00:36:26,350 --> 00:36:27,310
[Mary] Moment of truth.
698
00:36:27,935 --> 00:36:29,687
Two hours, 50 minutes per day.
699
00:36:29,770 --> 00:36:31,772
-Let's see.
-Actually, I've been using a lot today.
700
00:36:31,856 --> 00:36:33,357
-Last seven days.
-That's probably why.
701
00:36:33,441 --> 00:36:37,361
Instagram, six hours, 13 minutes.
Okay, so my Instagram's worse.
702
00:36:39,572 --> 00:36:41,991
My screen's completely shattered.
703
00:36:42,200 --> 00:36:43,201
Thanks, Cass.
704
00:36:44,410 --> 00:36:45,995
What do you mean, "Thanks, Cass"?
705
00:36:46,078 --> 00:36:49,040
You keep freaking Mom out about our phones
when it's not really a problem.
706
00:36:49,373 --> 00:36:51,167
We don't need our phones to eat dinner!
707
00:36:51,250 --> 00:36:53,878
I get what you're saying.
It's just not that big a deal. It's not.
708
00:36:56,047 --> 00:36:58,382
If it's not that big a deal,
don't use it for a week.
709
00:36:59,634 --> 00:37:00,593
[Ben sighs]
710
00:37:01,135 --> 00:37:06,349
Yeah. Yeah, actually, if you can put
that thing away for, like, a whole week...
711
00:37:07,725 --> 00:37:09,518
I will buy you a new screen.
712
00:37:10,978 --> 00:37:12,897
-Like, starting now?
-[mom] Starting now.
713
00:37:15,149 --> 00:37:16,859
-Okay. You got a deal.
-[mom] Okay.
714
00:37:16,943 --> 00:37:19,111
Okay, you gotta leave it here, though,
buddy.
715
00:37:19,862 --> 00:37:21,364
All right, I'm plugging it in.
716
00:37:22,531 --> 00:37:25,076
Let the record show... I'm backing away.
717
00:37:25,159 --> 00:37:25,993
Okay.
718
00:37:27,787 --> 00:37:29,413
-You're on the clock.
-[Ben] One week.
719
00:37:29,497 --> 00:37:30,331
Oh, my...
720
00:37:31,457 --> 00:37:32,416
Think he can do it?
721
00:37:33,000 --> 00:37:34,252
I don't know. We'll see.
722
00:37:35,002 --> 00:37:36,128
Just eat, okay?
723
00:37:44,220 --> 00:37:45,263
Good family dinner!
724
00:37:47,682 --> 00:37:49,809
[Tristan] These technology products
were not designed
725
00:37:49,892 --> 00:37:53,896
by child psychologists who are trying
to protect and nurture children.
726
00:37:53,980 --> 00:37:56,148
They were just designing
to make these algorithms
727
00:37:56,232 --> 00:37:58,734
that were really good at recommending
the next video to you
728
00:37:58,818 --> 00:38:02,321
or really good at getting you
to take a photo with a filter on it.
729
00:38:15,710 --> 00:38:16,669
[cell phone chimes]
730
00:38:16,752 --> 00:38:18,879
[Tristan] It's not just
that it's controlling
731
00:38:18,963 --> 00:38:20,548
where they spend their attention.
732
00:38:21,173 --> 00:38:26,304
Especially social media starts to dig
deeper and deeper down into the brain stem
733
00:38:26,387 --> 00:38:29,765
and take over kids' sense of self-worth
and identity.
734
00:38:41,736 --> 00:38:42,903
[notifications chiming]
735
00:38:52,371 --> 00:38:56,208
[Tristan] We evolved to care about
whether other people in our tribe...
736
00:38:56,751 --> 00:38:59,128
think well of us or not
'cause it matters.
737
00:38:59,837 --> 00:39:04,550
But were we evolved to be aware
of what 10,000 people think of us?
738
00:39:04,633 --> 00:39:08,763
We were not evolved
to have social approval being dosed to us
739
00:39:08,846 --> 00:39:10,348
every five minutes.
740
00:39:10,431 --> 00:39:13,142
That was not at all what we were built
to experience.
741
00:39:15,394 --> 00:39:19,982
[Chamath] We curate our lives
around this perceived sense of perfection
742
00:39:20,733 --> 00:39:23,527
because we get rewarded
in these short-term signals--
743
00:39:23,611 --> 00:39:25,154
hearts, likes, thumbs-up--
744
00:39:25,237 --> 00:39:28,407
and we conflate that with value,
and we conflate it with truth.
745
00:39:29,825 --> 00:39:33,120
And instead, what it really is
is fake, brittle popularity...
746
00:39:33,913 --> 00:39:37,458
that's short-term and that leaves you
even more, and admit it,
747
00:39:37,541 --> 00:39:39,919
vacant and empty before you did it.
748
00:39:41,295 --> 00:39:43,381
Because then it forces you
into this vicious cycle
749
00:39:43,464 --> 00:39:47,176
where you're like, "What's the next thing
I need to do now? 'Cause I need it back."
750
00:39:48,260 --> 00:39:50,846
Think about that compounded
by two billion people,
751
00:39:50,930 --> 00:39:54,767
and then think about how people react then
to the perceptions of others.
752
00:39:54,850 --> 00:39:56,435
It's just a... It's really bad.
753
00:39:56,977 --> 00:39:58,229
It's really, really bad.
754
00:40:00,856 --> 00:40:03,484
[Jonathan] There has been
a gigantic increase
755
00:40:03,567 --> 00:40:06,529
in depression and anxiety
for American teenagers
756
00:40:06,612 --> 00:40:10,950
which began right around...
between 2011 and 2013.
757
00:40:11,033 --> 00:40:15,371
The number of teenage girls out of 100,000
in this country
758
00:40:15,454 --> 00:40:17,123
who were admitted to a hospital every year
759
00:40:17,206 --> 00:40:19,917
because they cut themselves
or otherwise harmed themselves,
760
00:40:20,000 --> 00:40:23,921
that number was pretty stable
until around 2010, 2011,
761
00:40:24,004 --> 00:40:25,756
and then it begins going way up.
762
00:40:28,759 --> 00:40:32,513
It's up 62 percent for older teen girls.
763
00:40:33,848 --> 00:40:38,310
It's up 189 percent for the preteen girls.
That's nearly triple.
764
00:40:40,312 --> 00:40:43,524
Even more horrifying,
we see the same pattern with suicide.
765
00:40:44,775 --> 00:40:47,570
The older teen girls, 15 to 19 years old,
766
00:40:47,653 --> 00:40:49,196
they're up 70 percent,
767
00:40:49,280 --> 00:40:51,699
compared to the first decade
of this century.
768
00:40:52,158 --> 00:40:55,077
The preteen girls,
who have very low rates to begin with,
769
00:40:55,161 --> 00:40:57,663
they are up 151 percent.
770
00:40:58,831 --> 00:41:01,709
And that pattern points to social media.
771
00:41:04,044 --> 00:41:07,214
Gen Z, the kids born after 1996 or so,
772
00:41:07,298 --> 00:41:10,342
those kids are the first generation
in history
773
00:41:10,426 --> 00:41:12,636
that got on social media in middle school.
774
00:41:12,720 --> 00:41:14,722
[thunder rumbling in distance]
775
00:41:15,890 --> 00:41:17,600
[Jonathan] How do they spend their time?
776
00:41:19,727 --> 00:41:22,730
They come home from school,
and they're on their devices.
777
00:41:24,315 --> 00:41:29,195
A whole generation is more anxious,
more fragile, more depressed.
778
00:41:29,320 --> 00:41:30,529
-[thunder rumbles]
-[Isla gasps]
779
00:41:30,613 --> 00:41:33,282
[Jonathan] They're much less comfortable
taking risks.
780
00:41:34,325 --> 00:41:37,536
The rates at which they get
driver's licenses have been dropping.
781
00:41:38,954 --> 00:41:41,081
The number
who have ever gone out on a date
782
00:41:41,165 --> 00:41:44,251
or had any kind of romantic interaction
is dropping rapidly.
783
00:41:47,505 --> 00:41:49,715
This is a real change in a generation.
784
00:41:53,177 --> 00:41:57,306
And remember, for every one of these,
for every hospital admission,
785
00:41:57,389 --> 00:42:00,267
there's a family that is traumatized
and horrified.
786
00:42:00,351 --> 00:42:02,353
"My God, what is happening to our kids?"
787
00:42:08,734 --> 00:42:09,693
[Isla sighs]
788
00:42:19,411 --> 00:42:21,413
[Tim] It's plain as day to me.
789
00:42:22,873 --> 00:42:28,128
These services are killing people...
and causing people to kill themselves.
790
00:42:29,088 --> 00:42:33,300
I don't know any parent who says, "Yeah,
I really want my kids to be growing up
791
00:42:33,384 --> 00:42:36,887
feeling manipulated by tech designers, uh,
792
00:42:36,971 --> 00:42:39,723
manipulating their attention,
making it impossible to do their homework,
793
00:42:39,807 --> 00:42:42,560
making them compare themselves
to unrealistic standards of beauty."
794
00:42:42,643 --> 00:42:44,687
Like, no one wants that. [chuckles]
795
00:42:45,104 --> 00:42:46,355
No one does.
796
00:42:46,438 --> 00:42:48,482
We... We used to have these protections.
797
00:42:48,566 --> 00:42:50,943
When children watched
Saturday morning cartoons,
798
00:42:51,026 --> 00:42:52,778
we cared about protecting children.
799
00:42:52,861 --> 00:42:56,574
We would say, "You can't advertise
to these age children in these ways."
800
00:42:57,366 --> 00:42:58,784
But then you take YouTube for Kids,
801
00:42:58,867 --> 00:43:02,454
and it gobbles up that entire portion
of the attention economy,
802
00:43:02,538 --> 00:43:04,915
and now all kids are exposed
to YouTube for Kids.
803
00:43:04,999 --> 00:43:07,710
And all those protections
and all those regulations are gone.
804
00:43:08,210 --> 00:43:10,212
[tense instrumental music playing]
805
00:43:18,304 --> 00:43:22,141
[Tristan] We're training and conditioning
a whole new generation of people...
806
00:43:23,434 --> 00:43:29,148
that when we are uncomfortable or lonely
or uncertain or afraid,
807
00:43:29,231 --> 00:43:31,775
we have a digital pacifier for ourselves
808
00:43:32,234 --> 00:43:36,488
that is kind of atrophying our own ability
to deal with that.
809
00:43:53,881 --> 00:43:55,674
[Tristan] Photoshop didn't have
1,000 engineers
810
00:43:55,758 --> 00:43:58,969
on the other side of the screen,
using notifications, using your friends,
811
00:43:59,053 --> 00:44:02,431
using AI to predict what's gonna
perfectly addict you, or hook you,
812
00:44:02,514 --> 00:44:04,516
or manipulate you, or allow advertisers
813
00:44:04,600 --> 00:44:08,437
to test 60,000 variations
of text or colors to figure out
814
00:44:08,520 --> 00:44:11,065
what's the perfect manipulation
of your mind.
815
00:44:11,148 --> 00:44:14,985
This is a totally new species
of power and influence.
816
00:44:16,070 --> 00:44:19,156
I... I would say, again, the methods used
817
00:44:19,239 --> 00:44:22,785
to play on people's ability
to be addicted or to be influenced
818
00:44:22,868 --> 00:44:25,204
may be different this time,
and they probably are different.
819
00:44:25,287 --> 00:44:28,749
They were different when newspapers
came in and the printing press came in,
820
00:44:28,832 --> 00:44:31,835
and they were different
when television came in,
821
00:44:31,919 --> 00:44:34,004
and you had three major networks and...
822
00:44:34,463 --> 00:44:36,423
-At the time.
-At the time. That's what I'm saying.
823
00:44:36,507 --> 00:44:38,384
But I'm saying the idea
that there's a new level
824
00:44:38,467 --> 00:44:42,054
and that new level has happened
so many times before.
825
00:44:42,137 --> 00:44:45,099
I mean, this is just the latest new level
that we've seen.
826
00:44:45,182 --> 00:44:48,727
There's this narrative that, you know,
"We'll just adapt to it.
827
00:44:48,811 --> 00:44:51,188
We'll learn how to live
with these devices,
828
00:44:51,271 --> 00:44:53,732
just like we've learned how to live
with everything else."
829
00:44:53,816 --> 00:44:56,694
And what this misses
is there's something distinctly new here.
830
00:44:57,486 --> 00:45:00,155
Perhaps the most dangerous piece
of all this is the fact
831
00:45:00,239 --> 00:45:04,410
that it's driven by technology
that's advancing exponentially.
832
00:45:05,869 --> 00:45:09,081
Roughly, if you say from, like,
the 1960s to today,
833
00:45:09,873 --> 00:45:12,960
processing power has gone up
about a trillion times.
834
00:45:13,794 --> 00:45:18,340
Nothing else that we have has improved
at anything near that rate.
835
00:45:18,424 --> 00:45:22,177
Like, cars are, you know,
roughly twice as fast.
836
00:45:22,261 --> 00:45:25,013
And almost everything else is negligible.
837
00:45:25,347 --> 00:45:27,182
And perhaps most importantly,
838
00:45:27,266 --> 00:45:31,353
our human-- our physiology,
our brains have evolved not at all.
839
00:45:37,401 --> 00:45:41,488
[Tristan] Human beings, at a mind and body
and sort of physical level,
840
00:45:41,947 --> 00:45:43,866
are not gonna fundamentally change.
841
00:45:44,825 --> 00:45:45,868
[indistinct chatter]
842
00:45:47,035 --> 00:45:48,954
[chuckling] I know, but they...
843
00:45:49,037 --> 00:45:51,623
[continues speaking indistinctly]
844
00:45:53,584 --> 00:45:54,752
[camera shutter clicks]
845
00:45:56,837 --> 00:46:00,924
[Tristan] We can do genetic engineering
and develop new kinds of human beings,
846
00:46:01,008 --> 00:46:05,220
but realistically speaking,
you're living inside of hardware, a brain,
847
00:46:05,304 --> 00:46:07,222
that was, like, millions of years old,
848
00:46:07,306 --> 00:46:10,559
and then there's this screen, and then
on the opposite side of the screen,
849
00:46:10,642 --> 00:46:13,562
there's these thousands of engineers
and supercomputers
850
00:46:13,645 --> 00:46:16,106
that have goals that are different
than your goals,
851
00:46:16,190 --> 00:46:19,693
and so, who's gonna win in that game?
Who's gonna win?
852
00:46:25,699 --> 00:46:26,617
How are we losing?
853
00:46:27,159 --> 00:46:29,828
-I don't know.
-Where is he? This is not normal.
854
00:46:29,912 --> 00:46:32,080
Did I overwhelm him
with friends and family content?
855
00:46:32,164 --> 00:46:34,082
-Probably.
-Well, maybe it was all the ads.
856
00:46:34,166 --> 00:46:37,795
No. Something's very wrong.
Let's switch to resurrection mode.
857
00:46:39,713 --> 00:46:44,051
[Tristan] When you think of AI,
you know, an AI's gonna ruin the world,
858
00:46:44,134 --> 00:46:47,221
and you see, like, a Terminator,
and you see Arnold Schwarzenegger.
859
00:46:47,638 --> 00:46:48,680
I'll be back.
860
00:46:48,764 --> 00:46:50,933
[Tristan] You see drones,
and you think, like,
861
00:46:51,016 --> 00:46:52,684
"Oh, we're gonna kill people with AI."
862
00:46:53,644 --> 00:46:59,817
And what people miss is that AI
already runs today's world right now.
863
00:46:59,900 --> 00:47:03,237
Even talking about "an AI"
is just a metaphor.
864
00:47:03,320 --> 00:47:09,451
At these companies like... like Google,
there's just massive, massive rooms,
865
00:47:10,327 --> 00:47:13,121
some of them underground,
some of them underwater,
866
00:47:13,205 --> 00:47:14,498
of just computers.
867
00:47:14,581 --> 00:47:17,835
Tons and tons of computers,
as far as the eye can see.
868
00:47:18,460 --> 00:47:20,504
They're deeply interconnected
with each other
869
00:47:20,587 --> 00:47:22,923
and running
extremely complicated programs,
870
00:47:23,006 --> 00:47:26,009
sending information back and forth
between each other all the time.
871
00:47:26,802 --> 00:47:28,595
And they'll be running
many different programs,
872
00:47:28,679 --> 00:47:31,014
many different products
on those same machines.
873
00:47:31,348 --> 00:47:33,684
Some of those things could be described
as simple algorithms,
874
00:47:33,767 --> 00:47:35,227
some could be described as algorithms
875
00:47:35,310 --> 00:47:37,521
that are so complicated,
you would call them intelligence.
876
00:47:39,022 --> 00:47:39,982
[crew member sighs]
877
00:47:40,065 --> 00:47:42,568
[Cathy]
I like to say that algorithms are opinions
878
00:47:42,651 --> 00:47:43,777
embedded in code...
879
00:47:45,070 --> 00:47:47,656
and that algorithms are not objective.
880
00:47:48,365 --> 00:47:51,577
Algorithms are optimized
to some definition of success.
881
00:47:52,244 --> 00:47:53,370
So, if you can imagine,
882
00:47:53,453 --> 00:47:57,124
if a... if a commercial enterprise builds
an algorithm
883
00:47:57,207 --> 00:47:59,293
to their definition of success,
884
00:47:59,835 --> 00:48:01,211
it's a commercial interest.
885
00:48:01,587 --> 00:48:02,671
It's usually profit.
886
00:48:03,130 --> 00:48:07,384
You are giving the computer
the goal state, "I want this outcome,"
887
00:48:07,467 --> 00:48:10,262
and then the computer itself is learning
how to do it.
888
00:48:10,345 --> 00:48:12,598
That's where the term "machine learning"
comes from.
889
00:48:12,681 --> 00:48:14,850
And so, every day, it gets slightly better
890
00:48:14,933 --> 00:48:16,977
at picking the right posts
in the right order
891
00:48:17,060 --> 00:48:19,438
so that you spend longer and longer
in that product.
892
00:48:19,521 --> 00:48:22,232
And no one really understands
what they're doing
893
00:48:22,316 --> 00:48:23,901
in order to achieve that goal.
894
00:48:23,984 --> 00:48:28,238
The algorithm has a mind of its own,
so even though a person writes it,
895
00:48:28,906 --> 00:48:30,657
it's written in a way
896
00:48:30,741 --> 00:48:35,037
that you kind of build the machine,
and then the machine changes itself.
897
00:48:35,120 --> 00:48:37,873
There's only a handful of people
at these companies,
898
00:48:37,956 --> 00:48:40,000
at Facebook and Twitter
and other companies...
899
00:48:40,083 --> 00:48:43,795
There's only a few people who understand
how those systems work,
900
00:48:43,879 --> 00:48:46,715
and even they don't necessarily
fully understand
901
00:48:46,798 --> 00:48:49,551
what's gonna happen
with a particular piece of content.
902
00:48:49,968 --> 00:48:55,474
So, as humans, we've almost lost control
over these systems.
903
00:48:55,891 --> 00:48:59,603
Because they're controlling, you know,
the information that we see,
904
00:48:59,686 --> 00:49:02,189
they're controlling us more
than we're controlling them.
905
00:49:02,522 --> 00:49:04,733
-[console whirs]
-[Growth AI] Cross-referencing him
906
00:49:04,816 --> 00:49:07,319
against comparables
in his geographic zone.
907
00:49:07,402 --> 00:49:09,571
His psychometric doppelgangers.
908
00:49:09,655 --> 00:49:13,700
There are 13,694 people
behaving just like him in his region.
909
00:49:13,784 --> 00:49:16,370
-What's trending with them?
-We need something actually good
910
00:49:16,453 --> 00:49:17,704
for a proper resurrection,
911
00:49:17,788 --> 00:49:19,957
given that the typical stuff
isn't working.
912
00:49:20,040 --> 00:49:21,875
Not even that cute girl from school.
913
00:49:22,334 --> 00:49:25,253
My analysis shows that going political
with Extreme Center content
914
00:49:25,337 --> 00:49:28,256
has a 62.3 percent chance
of long-term engagement.
915
00:49:28,340 --> 00:49:29,299
That's not bad.
916
00:49:29,383 --> 00:49:32,010
[sighs] It's not good enough to lead with.
917
00:49:32,302 --> 00:49:35,305
Okay, okay, so we've tried notifying him
about tagged photos,
918
00:49:35,389 --> 00:49:39,017
invitations, current events,
even a direct message from Rebecca.
919
00:49:39,101 --> 00:49:42,813
But what about User 01265923010?
920
00:49:42,896 --> 00:49:44,648
Yeah, Ben loved all of her posts.
921
00:49:44,731 --> 00:49:47,776
For months and, like,
literally all of them, and then nothing.
922
00:49:47,859 --> 00:49:50,445
I calculate a 92.3 percent chance
of resurrection
923
00:49:50,529 --> 00:49:52,030
with a notification about Ana.
924
00:49:56,535 --> 00:49:57,494
And her new friend.
925
00:49:59,621 --> 00:50:01,623
[eerie instrumental music playing]
926
00:50:10,590 --> 00:50:11,675
[cell phone vibrates]
927
00:50:25,689 --> 00:50:27,441
[Ben] Oh, you gotta be kiddin' me.
928
00:50:32,404 --> 00:50:33,613
Uh... [sighs]
929
00:50:35,657 --> 00:50:36,616
Okay.
930
00:50:38,869 --> 00:50:40,996
-What?
-[fanfare plays, fireworks pop]
931
00:50:41,413 --> 00:50:42,789
[claps] Bam! We're back!
932
00:50:42,873 --> 00:50:44,374
Let's get back to making money, boys.
933
00:50:44,458 --> 00:50:46,334
Yes, and connecting Ben
with the entire world.
934
00:50:46,418 --> 00:50:49,087
I'm giving him access
to all the information he might like.
935
00:50:49,755 --> 00:50:53,717
Hey, do you guys ever wonder if, you know,
like, the feed is good for Ben?
936
00:50:57,095 --> 00:50:58,430
-No.
-No. [chuckles slightly]
937
00:51:00,307 --> 00:51:03,268
-[chuckles softly]
-["I Put a Spell on You" playing]
938
00:51:17,491 --> 00:51:19,076
♪ I put a spell on you ♪
939
00:51:25,040 --> 00:51:26,374
♪ 'Cause you're mine ♪
940
00:51:28,627 --> 00:51:32,089
[vocalizing] ♪ Ah! ♪
941
00:51:34,508 --> 00:51:36,593
♪ You better stop the things you do ♪
942
00:51:41,181 --> 00:51:42,265
♪ I ain't lyin' ♪
943
00:51:44,976 --> 00:51:46,686
♪ No, I ain't lyin' ♪
944
00:51:49,981 --> 00:51:51,817
♪ You know I can't stand it ♪
945
00:51:53,026 --> 00:51:54,611
♪ You're runnin' around ♪
946
00:51:55,612 --> 00:51:57,239
♪ You know better, Daddy ♪
947
00:51:58,782 --> 00:52:02,077
♪ I can't stand it ♪
♪ 'Cause you put me down ♪
948
00:52:03,286 --> 00:52:04,121
♪ Yeah, yeah ♪
949
00:52:06,456 --> 00:52:08,375
♪ I put a spell on you ♪
950
00:52:12,379 --> 00:52:14,840
♪ Because you're mine ♪
951
00:52:18,718 --> 00:52:19,845
♪ You're mine ♪
952
00:52:20,929 --> 00:52:24,349
[Roger] So, imagine you're on Facebook...
953
00:52:24,766 --> 00:52:29,312
and you're effectively playing
against this artificial intelligence
954
00:52:29,396 --> 00:52:31,314
that knows everything about you,
955
00:52:31,398 --> 00:52:34,568
can anticipate your next move,
and you know literally nothing about it,
956
00:52:34,651 --> 00:52:37,404
except that there are cat videos
and birthdays on it.
957
00:52:37,821 --> 00:52:39,656
That's not a fair fight.
958
00:52:41,575 --> 00:52:43,869
Ben and Jerry, it's time to go, bud!
959
00:52:48,039 --> 00:52:48,874
[sighs]
960
00:52:51,126 --> 00:52:51,960
Ben?
961
00:53:01,011 --> 00:53:02,137
[knocks lightly on door]
962
00:53:02,679 --> 00:53:04,723
-[Cass] Ben.
-[Ben] Mm.
963
00:53:05,182 --> 00:53:06,057
Come on.
964
00:53:07,225 --> 00:53:08,351
School time. [claps]
965
00:53:08,435 --> 00:53:09,269
Let's go.
966
00:53:12,189 --> 00:53:13,148
[Ben sighs]
967
00:53:25,118 --> 00:53:27,120
[excited chatter]
968
00:53:31,374 --> 00:53:33,627
-[tech] How you doing today?
-Oh, I'm... I'm nervous.
969
00:53:33,710 --> 00:53:35,003
-Are ya?
-Yeah. [chuckles]
970
00:53:37,380 --> 00:53:39,132
[Tristan]
We were all looking for the moment
971
00:53:39,216 --> 00:53:42,969
when technology would overwhelm
human strengths and intelligence.
972
00:53:43,053 --> 00:53:47,015
When is it gonna cross the singularity,
replace our jobs, be smarter than humans?
973
00:53:48,141 --> 00:53:50,101
But there's this much earlier moment...
974
00:53:50,977 --> 00:53:55,315
when technology exceeds
and overwhelms human weaknesses.
975
00:53:57,484 --> 00:54:01,780
This point being crossed
is at the root of addiction,
976
00:54:02,113 --> 00:54:04,741
polarization, radicalization,
outrage-ification,
977
00:54:04,824 --> 00:54:06,368
vanity-ification, the entire thing.
978
00:54:07,702 --> 00:54:09,913
This is overpowering human nature,
979
00:54:10,538 --> 00:54:13,500
and this is checkmate on humanity.
980
00:54:20,131 --> 00:54:21,883
-[sighs deeply]
-[door opens]
981
00:54:30,558 --> 00:54:31,851
I'm sorry. [sighs]
982
00:54:37,607 --> 00:54:39,609
-[seat belt clicks]
-[engine starts]
983
00:54:41,736 --> 00:54:44,656
[Jaron] One of the ways
I try to get people to understand
984
00:54:45,198 --> 00:54:49,828
just how wrong feeds from places
like Facebook are
985
00:54:49,911 --> 00:54:51,454
is to think about the Wikipedia.
986
00:54:52,956 --> 00:54:56,209
When you go to a page, you're seeing
the same thing as other people.
987
00:54:56,584 --> 00:55:00,297
So, it's one of the few things online
that we at least hold in common.
988
00:55:00,380 --> 00:55:03,425
Now, just imagine for a second
that Wikipedia said,
989
00:55:03,508 --> 00:55:07,178
"We're gonna give each person
a different customized definition,
990
00:55:07,262 --> 00:55:09,472
and we're gonna be paid by people
for that."
991
00:55:09,556 --> 00:55:13,435
So, Wikipedia would be spying on you.
Wikipedia would calculate,
992
00:55:13,518 --> 00:55:17,188
"What's the thing I can do
to get this person to change a little bit
993
00:55:17,272 --> 00:55:19,899
on behalf of some commercial interest?"
Right?
994
00:55:19,983 --> 00:55:21,818
And then it would change the entry.
995
00:55:22,444 --> 00:55:24,738
Can you imagine that?
Well, you should be able to,
996
00:55:24,821 --> 00:55:26,823
'cause that's exactly what's happening
on Facebook.
997
00:55:26,906 --> 00:55:28,992
It's exactly what's happening
in your YouTube feed.
998
00:55:29,075 --> 00:55:31,786
When you go to Google and type in
"Climate change is,"
999
00:55:31,870 --> 00:55:34,998
you're going to see different results
depending on where you live.
1000
00:55:36,166 --> 00:55:38,460
In certain cities,
you're gonna see it autocomplete
1001
00:55:38,543 --> 00:55:40,462
with "climate change is a hoax."
1002
00:55:40,545 --> 00:55:42,088
In other cases, you're gonna see
1003
00:55:42,172 --> 00:55:44,841
"climate change is causing the destruction
of nature."
1004
00:55:44,924 --> 00:55:48,428
And that's a function not
of what the truth is about climate change,
1005
00:55:48,511 --> 00:55:51,097
but about
where you happen to be Googling from
1006
00:55:51,181 --> 00:55:54,100
and the particular things
Google knows about your interests.
1007
00:55:54,851 --> 00:55:58,021
Even two friends
who are so close to each other,
1008
00:55:58,104 --> 00:56:00,190
who have almost the exact same set
of friends,
1009
00:56:00,273 --> 00:56:02,817
they think, you know,
"I'm going to news feeds on Facebook.
1010
00:56:02,901 --> 00:56:05,403
I'll see the exact same set of updates."
1011
00:56:05,487 --> 00:56:06,738
But it's not like that at all.
1012
00:56:06,821 --> 00:56:08,448
They see completely different worlds
1013
00:56:08,531 --> 00:56:10,575
because they're based
on these computers calculating
1014
00:56:10,658 --> 00:56:12,035
what's perfect for each of them.
1015
00:56:12,118 --> 00:56:14,245
[whistling over monitor]
1016
00:56:14,329 --> 00:56:18,416
[Roger] The way to think about it
is it's 2.7 billion Truman Shows.
1017
00:56:18,500 --> 00:56:21,294
Each person has their own reality,
with their own...
1018
00:56:22,670 --> 00:56:23,671
facts.
1019
00:56:23,755 --> 00:56:27,008
Why do you think that, uh, Truman has never come close
1020
00:56:27,092 --> 00:56:30,095
to discovering the true nature of his world until now?
1021
00:56:31,054 --> 00:56:34,140
We accept the reality of the world
with which we're presented.
1022
00:56:34,224 --> 00:56:35,141
It's as simple as that.
1023
00:56:36,476 --> 00:56:41,064
Over time, you have the false sense
that everyone agrees with you,
1024
00:56:41,147 --> 00:56:44,067
because everyone in your news feed
sounds just like you.
1025
00:56:44,567 --> 00:56:49,072
And that once you're in that state,
it turns out you're easily manipulated,
1026
00:56:49,155 --> 00:56:51,741
the same way you would be manipulated
by a magician.
1027
00:56:51,825 --> 00:56:55,370
A magician shows you a card trick
and says, "Pick a card, any card."
1028
00:56:55,453 --> 00:56:58,164
What you don't realize
was that they've done a set-up,
1029
00:56:58,456 --> 00:57:00,583
so you pick the card
they want you to pick.
1030
00:57:00,667 --> 00:57:03,169
And that's how Facebook works.
Facebook sits there and says,
1031
00:57:03,253 --> 00:57:06,172
"Hey, you pick your friends.
You pick the links that you follow."
1032
00:57:06,256 --> 00:57:08,716
But that's all nonsense.
It's just like the magician.
1033
00:57:08,800 --> 00:57:11,302
Facebook is in charge of your news feed.
1034
00:57:11,386 --> 00:57:14,514
We all simply are operating
on a different set of facts.
1035
00:57:14,597 --> 00:57:16,474
When that happens at scale,
1036
00:57:16,558 --> 00:57:20,645
you're no longer able to reckon with
or even consume information
1037
00:57:20,728 --> 00:57:23,690
that contradicts with that world view
that you've created.
1038
00:57:23,773 --> 00:57:26,443
That means we aren't actually being
objective,
1039
00:57:26,526 --> 00:57:28,319
constructive individuals. [chuckles]
1040
00:57:28,403 --> 00:57:32,449
[crowd chanting] Open up your eyes,
don't believe the lies! Open up...
1041
00:57:32,532 --> 00:57:34,701
[Justin] And then you look
over at the other side,
1042
00:57:35,243 --> 00:57:38,746
and you start to think,
"How can those people be so stupid?
1043
00:57:38,830 --> 00:57:42,125
Look at all of this information
that I'm constantly seeing.
1044
00:57:42,208 --> 00:57:44,627
How are they not seeing
that same information?"
1045
00:57:44,711 --> 00:57:47,297
And the answer is, "They're not seeing
that same information."
1046
00:57:47,380 --> 00:57:50,800
[crowd continues chanting]
Open up your eyes, don't believe the lies!
1047
00:57:50,884 --> 00:57:52,010
[shouting indistinctly]
1048
00:57:52,093 --> 00:57:55,472
-[interviewer] What are Republicans like?
-People that don't have a clue.
1049
00:57:55,555 --> 00:57:58,933
The Democrat Party is a crime syndicate,
not a real political party.
1050
00:57:59,017 --> 00:58:03,188
A huge new Pew Research Center study
of 10,000 American adults
1051
00:58:03,271 --> 00:58:05,315
finds us more divided than ever,
1052
00:58:05,398 --> 00:58:09,152
with personal and political polarization
at a 20-year high.
1053
00:58:11,738 --> 00:58:14,199
[pundit] You have
more than a third of Republicans saying
1054
00:58:14,282 --> 00:58:16,826
the Democratic Party is a threat
to the nation,
1055
00:58:16,910 --> 00:58:20,580
more than a quarter of Democrats saying
the same thing about the Republicans.
1056
00:58:20,663 --> 00:58:22,499
So many of the problems
that we're discussing,
1057
00:58:22,582 --> 00:58:24,417
like, around political polarization
1058
00:58:24,501 --> 00:58:28,046
exist in spades on cable television.
1059
00:58:28,129 --> 00:58:31,007
The media has this exact same problem,
1060
00:58:31,090 --> 00:58:33,343
where their business model, by and large,
1061
00:58:33,426 --> 00:58:35,762
is that they're selling our attention
to advertisers.
1062
00:58:35,845 --> 00:58:38,890
And the Internet is just a new,
even more efficient way to do that.
1063
00:58:40,141 --> 00:58:44,145
[Guillaume] At YouTube, I was working
on YouTube recommendations.
1064
00:58:44,229 --> 00:58:47,148
It worries me that an algorithm
that I worked on
1065
00:58:47,232 --> 00:58:50,401
is actually increasing polarization
in society.
1066
00:58:50,485 --> 00:58:53,112
But from the point of view of watch time,
1067
00:58:53,196 --> 00:58:57,617
this polarization is extremely efficient
at keeping people online.
1068
00:58:58,785 --> 00:59:00,870
The only reason these teachers are teaching this stuff
1069
00:59:00,954 --> 00:59:02,288
is 'cause they're getting paid to.
1070
00:59:02,372 --> 00:59:04,374
-It's absolutely absurd.
-[Cass] Hey, Benji.
1071
00:59:04,916 --> 00:59:06,292
No soccer practice today?
1072
00:59:06,376 --> 00:59:08,795
Oh, there is. I'm just catching up
on some news stuff.
1073
00:59:08,878 --> 00:59:11,506
[vlogger] Do research. Anything that sways from the Extreme Center--
1074
00:59:11,589 --> 00:59:14,008
Wouldn't exactly call the stuff
that you're watching news.
1075
00:59:15,552 --> 00:59:18,846
You're always talking about how messed up
everything is. So are they.
1076
00:59:19,305 --> 00:59:21,140
But that stuff is just propaganda.
1077
00:59:21,224 --> 00:59:24,060
[vlogger] Neither is true. It's all about what makes sense.
1078
00:59:24,769 --> 00:59:26,938
Ben, I'm serious.
That stuff is bad for you.
1079
00:59:27,021 --> 00:59:29,232
-You should go to soccer practice.
-[Ben] Mm.
1080
00:59:31,109 --> 00:59:31,943
[Cass sighs]
1081
00:59:35,154 --> 00:59:37,490
I share this stuff because I care.
1082
00:59:37,574 --> 00:59:41,077
I care that you are being misled, and it's not okay. All right?
1083
00:59:41,160 --> 00:59:43,121
[Guillaume] People think
the algorithm is designed
1084
00:59:43,204 --> 00:59:46,833
to give them what they really want,
only it's not.
1085
00:59:46,916 --> 00:59:52,589
The algorithm is actually trying to find
a few rabbit holes that are very powerful,
1086
00:59:52,672 --> 00:59:56,217
trying to find which rabbit hole
is the closest to your interest.
1087
00:59:56,301 --> 00:59:59,262
And then if you start watching
one of those videos,
1088
00:59:59,846 --> 01:00:02,223
then it will recommend it
over and over again.
1089
01:00:02,682 --> 01:00:04,934
It's not like anybody wants this
to happen.
1090
01:00:05,018 --> 01:00:07,812
It's just that this is
what the recommendation system is doing.
1091
01:00:07,895 --> 01:00:10,815
So much so that Kyrie Irving,
the famous basketball player,
1092
01:00:11,065 --> 01:00:14,235
uh, said he believed the Earth was flat,
and he apologized later
1093
01:00:14,319 --> 01:00:16,154
because he blamed it
on a YouTube rabbit hole.
1094
01:00:16,487 --> 01:00:18,656
You know, like,
you click the YouTube link
1095
01:00:18,740 --> 01:00:21,534
and it goes, like,
how deep the rabbit hole goes.
1096
01:00:21,618 --> 01:00:23,369
When he later came on to NPR to say,
1097
01:00:23,453 --> 01:00:25,955
"I'm sorry for believing this.
I didn't want to mislead people,"
1098
01:00:26,039 --> 01:00:28,291
a bunch of students in a classroom
were interviewed saying,
1099
01:00:28,374 --> 01:00:29,667
"The round-Earthers got to him."
1100
01:00:29,751 --> 01:00:30,960
[audience chuckles]
1101
01:00:31,044 --> 01:00:33,963
The flat-Earth conspiracy theory
was recommended
1102
01:00:34,047 --> 01:00:37,634
hundreds of millions of times
by the algorithm.
1103
01:00:37,717 --> 01:00:43,890
It's easy to think that it's just
a few stupid people who get convinced,
1104
01:00:43,973 --> 01:00:46,893
but the algorithm is getting smarter
and smarter every day.
1105
01:00:46,976 --> 01:00:50,188
So, today, they are convincing the people
that the Earth is flat,
1106
01:00:50,271 --> 01:00:53,983
but tomorrow, they will be convincing you
of something that's false.
1107
01:00:54,317 --> 01:00:57,820
[reporter] On November 7th, the hashtag "Pizzagate" was born.
1108
01:00:57,904 --> 01:00:59,197
[Renée] Pizzagate...
1109
01:01:00,114 --> 01:01:01,449
[clicks tongue] Oh, boy.
1110
01:01:01,532 --> 01:01:02,533
Uh... [laughs]
1111
01:01:03,159 --> 01:01:06,913
I still am not 100 percent sure
how this originally came about,
1112
01:01:06,996 --> 01:01:12,377
but the idea that ordering a pizza
meant ordering a trafficked person.
1113
01:01:12,460 --> 01:01:15,046
As the groups got bigger on Facebook,
1114
01:01:15,129 --> 01:01:19,967
Facebook's recommendation engine
started suggesting to regular users
1115
01:01:20,051 --> 01:01:21,761
that they join Pizzagate groups.
1116
01:01:21,844 --> 01:01:27,392
So, if a user was, for example,
anti-vaccine or believed in chemtrails
1117
01:01:27,475 --> 01:01:30,645
or had indicated to Facebook's algorithms
in some way
1118
01:01:30,728 --> 01:01:33,398
that they were prone to belief
in conspiracy theories,
1119
01:01:33,481 --> 01:01:36,859
Facebook's recommendation engine
would serve them Pizzagate groups.
1120
01:01:36,943 --> 01:01:41,072
Eventually, this culminated in
a man showing up with a gun,
1121
01:01:41,155 --> 01:01:44,617
deciding that he was gonna go liberate
the children from the basement
1122
01:01:44,701 --> 01:01:46,911
of the pizza place
that did not have a basement.
1123
01:01:46,994 --> 01:01:48,538
[officer 1] What were you doing?
1124
01:01:48,871 --> 01:01:50,498
[man] Making sure
there was nothing there.
1125
01:01:50,581 --> 01:01:52,458
-[officer 1] Regarding?
-[man] Pedophile ring.
1126
01:01:52,542 --> 01:01:54,293
-[officer 1] What?
-[man] Pedophile ring.
1127
01:01:54,377 --> 01:01:55,962
[officer 2] He's talking about Pizzagate.
1128
01:01:56,045 --> 01:02:00,216
This is an example of a conspiracy theory
1129
01:02:00,299 --> 01:02:03,678
that was propagated
across all social networks.
1130
01:02:03,761 --> 01:02:06,097
The social network's
own recommendation engine
1131
01:02:06,180 --> 01:02:07,974
is voluntarily serving this up to people
1132
01:02:08,057 --> 01:02:10,643
who had never searched
for the term "Pizzagate" in their life.
1133
01:02:12,437 --> 01:02:14,439
[Tristan] There's a study, an MIT study,
1134
01:02:14,522 --> 01:02:19,819
that fake news on Twitter spreads
six times faster than true news.
1135
01:02:19,902 --> 01:02:21,863
What is that world gonna look like
1136
01:02:21,946 --> 01:02:24,741
when one has a six-times advantage
to the other one?
1137
01:02:25,283 --> 01:02:27,660
You can imagine
these things are sort of like...
1138
01:02:27,744 --> 01:02:31,706
they... they tilt the floor
of... of human behavior.
1139
01:02:31,789 --> 01:02:34,709
They make some behavior harder
and some easier.
1140
01:02:34,792 --> 01:02:37,420
And you're always free
to walk up the hill,
1141
01:02:37,503 --> 01:02:38,796
but fewer people do,
1142
01:02:38,880 --> 01:02:43,092
and so, at scale, at society's scale,
you really are just tilting the floor
1143
01:02:43,176 --> 01:02:45,970
and changing what billions of people think
and do.
1144
01:02:46,053 --> 01:02:52,018
We've created a system
that biases towards false information.
1145
01:02:52,643 --> 01:02:54,437
Not because we want to,
1146
01:02:54,520 --> 01:02:58,816
but because false information makes
the companies more money
1147
01:02:59,400 --> 01:03:01,319
than the truth. The truth is boring.
1148
01:03:01,986 --> 01:03:04,489
It's a disinformation-for-profit
business model.
1149
01:03:04,906 --> 01:03:08,159
You make money the more you allow
unregulated messages
1150
01:03:08,701 --> 01:03:11,287
to reach anyone for the best price.
1151
01:03:11,662 --> 01:03:13,956
Because climate change? Yeah.
1152
01:03:14,040 --> 01:03:16,751
It's a hoax. Yeah, it's real. That's the point.
1153
01:03:16,834 --> 01:03:20,046
The more they talk about it and the more they divide us,
1154
01:03:20,129 --> 01:03:22,423
the more they have the power, the more...
1155
01:03:22,507 --> 01:03:25,468
[Tristan] Facebook has trillions
of these news feed posts.
1156
01:03:26,552 --> 01:03:29,180
They can't know what's real
or what's true...
1157
01:03:29,972 --> 01:03:33,726
which is why this conversation
is so critical right now.
1158
01:03:33,810 --> 01:03:37,021
[reporter 1] It's not just COVID-19 that's spreading fast.
1159
01:03:37,104 --> 01:03:40,191
There's a flow of misinformation online about the virus.
1160
01:03:40,274 --> 01:03:41,818
[reporter 2] The notion drinking water
1161
01:03:41,901 --> 01:03:43,694
will flush coronavirus from your system
1162
01:03:43,778 --> 01:03:47,490
is one of several myths about the virus circulating on social media.
1163
01:03:47,573 --> 01:03:50,451
[automated voice] The government planned this event, created the virus,
1164
01:03:50,535 --> 01:03:53,621
and had a simulation of how the countries would react.
1165
01:03:53,955 --> 01:03:55,581
Coronavirus is a... a hoax.
1166
01:03:56,165 --> 01:03:57,959
[man] SARS, coronavirus.
1167
01:03:58,376 --> 01:04:01,045
And look at when it was made. 2018.
1168
01:04:01,128 --> 01:04:03,798
I think the US government started
this shit.
1169
01:04:04,215 --> 01:04:09,095
Nobody is sick. Nobody is sick.
Nobody knows anybody who's sick.
1170
01:04:09,512 --> 01:04:13,015
Maybe the government is using
the coronavirus as an excuse
1171
01:04:13,099 --> 01:04:15,643
to get everyone to stay inside
because something else is happening.
1172
01:04:15,726 --> 01:04:18,020
Coronavirus is not killing people,
1173
01:04:18,104 --> 01:04:20,940
it's the 5G radiation
that they're pumping out.
1174
01:04:21,023 --> 01:04:22,525
[crowd shouting]
1175
01:04:22,608 --> 01:04:24,944
[Tristan]
We're being bombarded with rumors.
1176
01:04:25,403 --> 01:04:28,823
People are blowing up actual physical cell phone towers.
1177
01:04:28,906 --> 01:04:32,201
We see Russia and China spreading rumors and conspiracy theories.
1178
01:04:32,285 --> 01:04:35,246
[reporter 3] This morning, panic and protest in Ukraine as...
1179
01:04:35,329 --> 01:04:38,916
[Tristan] People have no idea what's true, and now it's a matter of life and death.
1180
01:04:39,876 --> 01:04:42,628
[woman] Those sources that are spreading
coronavirus misinformation
1181
01:04:42,712 --> 01:04:45,798
have amassed
something like 52 million engagements.
1182
01:04:45,882 --> 01:04:50,094
You're saying that silver solution
would be effective.
1183
01:04:50,177 --> 01:04:54,140
Well, let's say it hasn't been tested
on this strain of the coronavirus, but...
1184
01:04:54,223 --> 01:04:57,226
[Tristan] What we're seeing with COVID is just an extreme version
1185
01:04:57,310 --> 01:05:00,521
of what's happening across our information ecosystem.
1186
01:05:00,938 --> 01:05:05,026
Social media amplifies exponential gossip and exponential hearsay
1187
01:05:05,109 --> 01:05:07,111
to the point that we don't know what's true,
1188
01:05:07,194 --> 01:05:08,946
no matter what issue we care about.
1189
01:05:15,161 --> 01:05:16,579
[teacher] He discovers this.
1190
01:05:16,662 --> 01:05:18,664
[continues lecturing indistinctly]
1191
01:05:19,874 --> 01:05:21,292
[Rebecca whispers] Ben.
1192
01:05:26,130 --> 01:05:28,257
-Are you still on the team?
-[Ben] Mm-hmm.
1193
01:05:30,384 --> 01:05:32,678
[Rebecca] Okay, well,
I'm gonna get a snack before practice
1194
01:05:32,762 --> 01:05:34,430
if you... wanna come.
1195
01:05:35,640 --> 01:05:36,515
[Ben] Hm?
1196
01:05:36,974 --> 01:05:38,601
[Rebecca] You know, never mind.
1197
01:05:38,684 --> 01:05:40,686
[footsteps fading]
1198
01:05:45,066 --> 01:05:47,526
[vlogger] Nine out of ten people are dissatisfied right now.
1199
01:05:47,610 --> 01:05:50,613
The EC is like any political movement in history, when you think about it.
1200
01:05:50,696 --> 01:05:54,492
We are standing up, and we are... we are standing up to this noise.
1201
01:05:54,575 --> 01:05:57,036
You are my people. I trust you guys.
1202
01:05:59,246 --> 01:06:02,583
-The Extreme Center content is brilliant.
-He absolutely loves it.
1203
01:06:02,667 --> 01:06:03,626
Running an auction.
1204
01:06:04,627 --> 01:06:08,547
840 bidders. He sold for 4.35 cents
to a weapons manufacturer.
1205
01:06:08,631 --> 01:06:10,800
Let's promote some of these events.
1206
01:06:10,883 --> 01:06:13,511
Upcoming rallies in his geographic zone
later this week.
1207
01:06:13,594 --> 01:06:15,179
I've got a new vlogger lined up, too.
1208
01:06:15,262 --> 01:06:16,263
[chuckles]
1209
01:06:17,765 --> 01:06:22,979
And... and, honestly, I'm telling you, I'm willing to do whatever it takes.
1210
01:06:23,062 --> 01:06:24,939
And I mean whatever.
1211
01:06:32,154 --> 01:06:33,197
-Subscribe...
-[Cass] Ben?
1212
01:06:33,280 --> 01:06:35,908
...and also come back because I'm telling you, yo...
1213
01:06:35,992 --> 01:06:38,869
-[knocking on door]
-...I got some real big things comin'.
1214
01:06:38,953 --> 01:06:40,162
Some real big things.
1215
01:06:40,788 --> 01:06:45,292
[Roger] One of the problems with Facebook
is that, as a tool of persuasion,
1216
01:06:45,793 --> 01:06:47,920
it may be the greatest thing ever created.
1217
01:06:48,004 --> 01:06:52,508
Now, imagine what that means in the hands
of a dictator or an authoritarian.
1218
01:06:53,718 --> 01:06:57,638
If you want to control the population
of your country,
1219
01:06:57,722 --> 01:07:01,308
there has never been a tool
as effective as Facebook.
1220
01:07:04,937 --> 01:07:07,398
[Cynthia]
Some of the most troubling implications
1221
01:07:07,481 --> 01:07:10,985
of governments and other bad actors
weaponizing social media,
1222
01:07:11,235 --> 01:07:13,612
um, is that it has led
to real, offline harm.
1223
01:07:13,696 --> 01:07:15,072
I think the most prominent example
1224
01:07:15,156 --> 01:07:17,658
that's gotten a lot of press
is what's happened in Myanmar.
1225
01:07:19,243 --> 01:07:21,203
In Myanmar,
when people think of the Internet,
1226
01:07:21,287 --> 01:07:22,913
what they are thinking about is Facebook.
1227
01:07:22,997 --> 01:07:25,916
And what often happens is
when people buy their cell phone,
1228
01:07:26,000 --> 01:07:29,920
the cell phone shop owner will actually
preload Facebook on there for them
1229
01:07:30,004 --> 01:07:31,505
and open an account for them.
1230
01:07:31,589 --> 01:07:34,884
And so when people get their phone,
the first thing they open
1231
01:07:34,967 --> 01:07:37,595
and the only thing they know how to open
is Facebook.
1232
01:07:38,179 --> 01:07:41,891
Well, a new bombshell investigation
exposes Facebook's growing struggle
1233
01:07:41,974 --> 01:07:43,809
to tackle hate speech in Myanmar.
1234
01:07:43,893 --> 01:07:46,020
[crowd shouting]
1235
01:07:46,103 --> 01:07:49,190
Facebook really gave the military
and other bad actors
1236
01:07:49,273 --> 01:07:51,776
a new way to manipulate public opinion
1237
01:07:51,859 --> 01:07:55,529
and to help incite violence
against the Rohingya Muslims
1238
01:07:55,613 --> 01:07:57,406
that included mass killings,
1239
01:07:58,115 --> 01:07:59,867
burning of entire villages,
1240
01:07:59,950 --> 01:08:03,704
mass rape, and other serious crimes
against humanity
1241
01:08:03,788 --> 01:08:04,955
that have now led
1242
01:08:05,039 --> 01:08:08,209
to 700,000Rohingya Muslims
having to flee the country.
1243
01:08:11,170 --> 01:08:14,799
It's not
that highly motivated propagandists
1244
01:08:14,882 --> 01:08:16,550
haven't existed before.
1245
01:08:16,634 --> 01:08:19,762
It's that the platforms make it possible
1246
01:08:19,845 --> 01:08:23,724
to spread manipulative narratives
with phenomenal ease,
1247
01:08:23,808 --> 01:08:25,434
and without very much money.
1248
01:08:25,518 --> 01:08:27,812
If I want to manipulate an election,
1249
01:08:27,895 --> 01:08:30,564
I can now go into
a conspiracy theory group on Facebook,
1250
01:08:30,648 --> 01:08:32,233
and I can find 100 people
1251
01:08:32,316 --> 01:08:34,443
who believe
that the Earth is completely flat
1252
01:08:34,860 --> 01:08:37,780
and think it's all this conspiracy theory
that we landed on the moon,
1253
01:08:37,863 --> 01:08:41,450
and I can tell Facebook,
"Give me 1,000 users who look like that."
1254
01:08:42,118 --> 01:08:46,080
Facebook will happily send me
thousands of users that look like them
1255
01:08:46,163 --> 01:08:49,250
that I can now hit
with more conspiracy theories.
1256
01:08:50,376 --> 01:08:53,087
-[button clicks]
-Sold for 3.4 cents an impression.
1257
01:08:53,379 --> 01:08:56,382
-New EC video to promote.
-[Advertising AI] Another ad teed up.
1258
01:08:58,509 --> 01:09:00,928
[Justin] Algorithms
and manipulative politicians
1259
01:09:01,011 --> 01:09:02,138
are becoming so expert
1260
01:09:02,221 --> 01:09:04,056
at learning how to trigger us,
1261
01:09:04,140 --> 01:09:08,352
getting so good at creating fake news
that we absorb as if it were reality,
1262
01:09:08,435 --> 01:09:10,813
and confusing us into believing
those lies.
1263
01:09:10,896 --> 01:09:12,606
It's as though we have
less and less control
1264
01:09:12,690 --> 01:09:14,150
over who we are and what we believe.
1265
01:09:14,233 --> 01:09:16,235
[ominous instrumental music playing]
1266
01:09:31,375 --> 01:09:32,835
[vlogger] ...so they can pick sides.
1267
01:09:32,918 --> 01:09:34,879
There's lies here,and there's lies over there.
1268
01:09:34,962 --> 01:09:36,338
So they can keep the power,
1269
01:09:36,422 --> 01:09:39,967
-so they can control everything.
-[police siren blaring]
1270
01:09:40,050 --> 01:09:42,553
[vlogger] They can control our minds,
1271
01:09:42,636 --> 01:09:46,390
-so that they can keep their secrets.
-[crowd chanting]
1272
01:09:48,517 --> 01:09:50,895
[Tristan] Imagine a world
where no one believes anything true.
1273
01:09:52,897 --> 01:09:55,649
Everyone believes
the government's lying to them.
1274
01:09:56,317 --> 01:09:58,444
Everything is a conspiracy theory.
1275
01:09:58,527 --> 01:10:01,197
"I shouldn't trust anyone.
I hate the other side."
1276
01:10:01,280 --> 01:10:02,698
That's where all this is heading.
1277
01:10:02,781 --> 01:10:06,160
The political earthquakes in Europe
continue to rumble.
1278
01:10:06,243 --> 01:10:08,412
This time, in Italy and Spain.
1279
01:10:08,495 --> 01:10:11,999
[reporter] Overall, Europe's traditional, centrist coalition lost its majority
1280
01:10:12,082 --> 01:10:15,002
while far right and far left populist parties made gains.
1281
01:10:15,085 --> 01:10:16,086
[man shouts]
1282
01:10:16,170 --> 01:10:17,504
[crowd chanting]
1283
01:10:19,757 --> 01:10:20,591
Back up.
1284
01:10:21,300 --> 01:10:22,509
-[radio beeps]
-Okay, let's go.
1285
01:10:24,845 --> 01:10:26,847
[police siren wailing]
1286
01:10:28,390 --> 01:10:31,268
[reporter] These accounts were deliberately, specifically attempting
1287
01:10:31,352 --> 01:10:34,355
-to sow political discord in Hong Kong.
-[crowd shouting]
1288
01:10:36,440 --> 01:10:37,399
[sighs]
1289
01:10:38,609 --> 01:10:40,361
-All right, Ben.
-[car doors lock]
1290
01:10:42,863 --> 01:10:45,032
What does it look like to be a country
1291
01:10:45,115 --> 01:10:48,410
whose entire diet is Facebook
and social media?
1292
01:10:48,953 --> 01:10:50,871
Democracy crumbled quickly.
1293
01:10:50,955 --> 01:10:51,830
Six months.
1294
01:10:51,914 --> 01:10:53,791
[reporter 1] After that chaos in Chicago,
1295
01:10:53,874 --> 01:10:57,086
violent clashes between protesters and supporters...
1296
01:10:58,003 --> 01:11:01,632
[reporter 2] Democracy is facing a crisis of confidence.
1297
01:11:01,715 --> 01:11:04,343
What we're seeing is a global assault
on democracy.
1298
01:11:04,426 --> 01:11:05,427
[crowd shouting]
1299
01:11:05,511 --> 01:11:07,930
[Renée] Most of the countries
that are targeted are countries
1300
01:11:08,013 --> 01:11:09,723
that run democratic elections.
1301
01:11:10,641 --> 01:11:12,518
[Tristan] This is happening at scale.
1302
01:11:12,601 --> 01:11:15,562
By state actors,
by people with millions of dollars saying,
1303
01:11:15,646 --> 01:11:18,524
"I wanna destabilize Kenya.
I wanna destabilize Cameroon.
1304
01:11:18,607 --> 01:11:20,651
Oh, Angola? That only costs this much."
1305
01:11:20,734 --> 01:11:23,362
[reporter] An extraordinary election took place Sunday in Brazil.
1306
01:11:23,445 --> 01:11:25,823
With a campaign that's been powered
by social media.
1307
01:11:25,906 --> 01:11:29,702
[crowd chanting in Portuguese]
1308
01:11:31,036 --> 01:11:33,956
[Tristan] We in the tech industry
have created the tools
1309
01:11:34,039 --> 01:11:37,418
to destabilize
and erode the fabric of society
1310
01:11:37,501 --> 01:11:40,254
in every country, all at once, everywhere.
1311
01:11:40,337 --> 01:11:44,508
You have this in Germany, Spain, France,
Brazil, Australia.
1312
01:11:44,591 --> 01:11:47,261
Some of the most "developed nations"
in the world
1313
01:11:47,344 --> 01:11:49,221
are now imploding on each other,
1314
01:11:49,305 --> 01:11:50,931
and what do they have in common?
1315
01:11:51,974 --> 01:11:52,975
Knowing what you know now,
1316
01:11:53,058 --> 01:11:56,312
do you believe Facebook impacted
the results of the 2016 election?
1317
01:11:56,770 --> 01:11:58,814
[Mark Zuckerberg]
Oh, that's... that is hard.
1318
01:11:58,897 --> 01:12:00,691
You know, it's... the...
1319
01:12:01,275 --> 01:12:04,653
the reality is, well, there
were so many different forces at play.
1320
01:12:04,737 --> 01:12:07,865
Representatives from Facebook, Twitter,
and Google are back on Capitol Hill
1321
01:12:07,948 --> 01:12:09,450
for a second day of testimony
1322
01:12:09,533 --> 01:12:12,578
about Russia's interference
in the 2016 election.
1323
01:12:12,661 --> 01:12:17,291
The manipulation
by third parties is not a hack.
1324
01:12:18,500 --> 01:12:21,462
Right? The Russians didn't hack Facebook.
1325
01:12:21,545 --> 01:12:24,965
What they did was they used the tools
that Facebook created
1326
01:12:25,049 --> 01:12:27,843
for legitimate advertisers
and legitimate users,
1327
01:12:27,926 --> 01:12:30,346
and they applied it
to a nefarious purpose.
1328
01:12:32,014 --> 01:12:34,391
[Tristan]
It's like remote-control warfare.
1329
01:12:34,475 --> 01:12:36,602
One country can manipulate another one
1330
01:12:36,685 --> 01:12:39,229
without actually invading
its physical borders.
1331
01:12:39,605 --> 01:12:42,232
[reporter 1] We're seeing violent images. It appears to be a dumpster
1332
01:12:42,316 --> 01:12:43,317
being pushed around...
1333
01:12:43,400 --> 01:12:46,028
[Tristan] But it wasn't
about who you wanted to vote for.
1334
01:12:46,362 --> 01:12:50,574
It was about sowing total chaos
and division in society.
1335
01:12:50,657 --> 01:12:53,035
[reporter 2] Now, this was in Huntington Beach. A march...
1336
01:12:53,118 --> 01:12:54,870
[Tristan] It's about making two sides
1337
01:12:54,953 --> 01:12:56,413
who couldn't hear each other anymore,
1338
01:12:56,497 --> 01:12:58,123
who didn't want to hear each other
anymore,
1339
01:12:58,207 --> 01:12:59,875
who didn't trust each other anymore.
1340
01:12:59,958 --> 01:13:03,212
[reporter 3] This is a city where hatred was laid bare
1341
01:13:03,295 --> 01:13:05,464
and transformed into racial violence.
1342
01:13:05,547 --> 01:13:07,549
[crowd shouting]
1343
01:13:09,009 --> 01:13:11,178
[indistinct shouting]
1344
01:13:12,471 --> 01:13:14,014
[men grunting]
1345
01:13:17,851 --> 01:13:20,062
[police siren blaring]
1346
01:13:20,145 --> 01:13:20,979
[Cass] Ben!
1347
01:13:21,605 --> 01:13:22,439
Cassandra!
1348
01:13:22,981 --> 01:13:23,816
-Cass!
-Ben!
1349
01:13:23,899 --> 01:13:25,484
[officer 1] Come here! Come here!
1350
01:13:27,486 --> 01:13:31,156
Arms up. Arms up.
Get down on your knees. Now, down.
1351
01:13:31,240 --> 01:13:32,491
[crowd continues shouting]
1352
01:13:36,120 --> 01:13:37,204
-[officer 2] Calm--
-Ben!
1353
01:13:37,287 --> 01:13:38,664
[officer 2] Hey! Hands up!
1354
01:13:39,623 --> 01:13:41,750
Turn around. On the ground. On the ground!
1355
01:13:43,710 --> 01:13:46,463
-[crowd echoing]
-[melancholy piano music playing]
1356
01:13:51,969 --> 01:13:54,388
[siren continues wailing]
1357
01:13:56,723 --> 01:14:00,018
[Tristan] Do we want this system for sale
to the highest bidder?
1358
01:14:01,437 --> 01:14:05,399
For democracy to be completely for sale,
where you can reach any mind you want,
1359
01:14:05,482 --> 01:14:09,069
target a lie to that specific population,
and create culture wars?
1360
01:14:09,236 --> 01:14:10,237
Do we want that?
1361
01:14:14,700 --> 01:14:16,577
[Marco Rubio] We are a nation of people...
1362
01:14:16,952 --> 01:14:18,871
that no longer speak to each other.
1363
01:14:19,872 --> 01:14:23,000
We are a nation of people
who have stopped being friends with people
1364
01:14:23,083 --> 01:14:25,461
because of who they voted for
in the last election.
1365
01:14:25,878 --> 01:14:28,422
We are a nation of people
who have isolated ourselves
1366
01:14:28,505 --> 01:14:30,966
to only watch channels
that tell us that we're right.
1367
01:14:32,259 --> 01:14:36,597
My message here today is that tribalism
is ruining us.
1368
01:14:37,347 --> 01:14:39,183
It is tearing our country apart.
1369
01:14:40,267 --> 01:14:42,811
It is no way for sane adults to act.
1370
01:14:43,187 --> 01:14:45,314
If everyone's entitled to their own facts,
1371
01:14:45,397 --> 01:14:49,401
there's really no need for compromise,
no need for people to come together.
1372
01:14:49,485 --> 01:14:51,695
In fact, there's really no need
for people to interact.
1373
01:14:52,321 --> 01:14:53,530
We need to have...
1374
01:14:53,989 --> 01:14:58,410
some shared understanding of reality.
Otherwise, we aren't a country.
1375
01:14:58,952 --> 01:15:02,998
So, uh, long-term, the solution here is
to build more AI tools
1376
01:15:03,081 --> 01:15:08,128
that find patterns of people using
the services that no real person would do.
1377
01:15:08,212 --> 01:15:11,840
We are allowing the technologists
to frame this as a problem
1378
01:15:11,924 --> 01:15:13,884
that they're equipped to solve.
1379
01:15:15,135 --> 01:15:16,470
That is... That's a lie.
1380
01:15:17,679 --> 01:15:20,724
People talk about AI
as if it will know truth.
1381
01:15:21,683 --> 01:15:23,685
AI's not gonna solve these problems.
1382
01:15:24,269 --> 01:15:27,189
AI cannot solve the problem of fake news.
1383
01:15:28,649 --> 01:15:31,026
Google doesn't have the option of saying,
1384
01:15:31,109 --> 01:15:36,240
"Oh, is this conspiracy? Is this truth?"
Because they don't know what truth is.
1385
01:15:36,782 --> 01:15:37,783
They don't have a...
1386
01:15:37,908 --> 01:15:40,827
They don't have a proxy for truth
that's better than a click.
1387
01:15:41,870 --> 01:15:45,123
If we don't agree on what is true
1388
01:15:45,207 --> 01:15:47,584
or that there is such a thing as truth,
1389
01:15:48,293 --> 01:15:49,294
we're toast.
1390
01:15:49,753 --> 01:15:52,089
This is the problem
beneath other problems
1391
01:15:52,172 --> 01:15:54,424
because if we can't agree on what's true,
1392
01:15:55,092 --> 01:15:57,803
then we can't navigate
out of any of our problems.
1393
01:15:57,886 --> 01:16:00,806
-[ominous instrumental music playing]
-[console droning]
1394
01:16:05,435 --> 01:16:07,729
[Growth AI] We should suggest
Flat Earth Football Club.
1395
01:16:07,813 --> 01:16:10,566
[Engagement AI] Don't show him
sports updates. He doesn't engage.
1396
01:16:11,483 --> 01:16:14,027
[AIs speaking indistinctly]
1397
01:16:15,696 --> 01:16:17,698
[music swells]
1398
01:16:39,886 --> 01:16:42,764
[Jaron] A lot of people in Silicon Valley
subscribe to some kind of theory
1399
01:16:42,848 --> 01:16:45,142
that we're building
some global super brain,
1400
01:16:45,309 --> 01:16:48,020
and all of our users
are just interchangeable little neurons,
1401
01:16:48,103 --> 01:16:49,563
no one of which is important.
1402
01:16:50,230 --> 01:16:53,150
And it subjugates people
into this weird role
1403
01:16:53,233 --> 01:16:56,069
where you're just, like,
this little computing element
1404
01:16:56,153 --> 01:16:58,905
that we're programming
through our behavior manipulation
1405
01:16:58,989 --> 01:17:02,367
for the service of this giant brain,
and you don't matter.
1406
01:17:02,451 --> 01:17:04,911
You're not gonna get paid.
You're not gonna get acknowledged.
1407
01:17:04,995 --> 01:17:06,455
You don't have self-determination.
1408
01:17:06,538 --> 01:17:09,416
We'll sneakily just manipulate you
because you're a computing node,
1409
01:17:09,499 --> 01:17:12,336
so we need to program you 'cause that's
what you do with computing nodes.
1410
01:17:14,504 --> 01:17:16,506
[reflective instrumental music playing]
1411
01:17:20,093 --> 01:17:21,845
Oh, man. [sighs]
1412
01:17:21,928 --> 01:17:25,390
[Tristan] When you think about technology
and it being an existential threat,
1413
01:17:25,474 --> 01:17:28,060
you know, that's a big claim, and...
1414
01:17:29,603 --> 01:17:33,982
it's easy to then, in your mind, think,
"Okay, so, there I am with the phone...
1415
01:17:35,609 --> 01:17:37,235
scrolling, clicking, using it.
1416
01:17:37,319 --> 01:17:39,196
Like, where's the existential threat?
1417
01:17:40,280 --> 01:17:41,615
Okay, there's the supercomputer.
1418
01:17:41,698 --> 01:17:43,950
The other side of the screen,
pointed at my brain,
1419
01:17:44,409 --> 01:17:47,537
got me to watch one more video.
Where's the existential threat?"
1420
01:17:47,621 --> 01:17:49,623
[indistinct chatter]
1421
01:17:54,252 --> 01:17:57,631
[Tristan] It's not
about the technology
1422
01:17:57,714 --> 01:17:59,341
being the existential threat.
1423
01:18:03,679 --> 01:18:06,264
It's the technology's ability
1424
01:18:06,348 --> 01:18:09,476
to bring out the worst in society...
[chuckles]
1425
01:18:09,559 --> 01:18:13,522
...and the worst in society
being the existential threat.
1426
01:18:18,819 --> 01:18:20,570
If technology creates...
1427
01:18:21,697 --> 01:18:23,115
mass chaos,
1428
01:18:23,198 --> 01:18:24,533
outrage, incivility,
1429
01:18:24,616 --> 01:18:26,326
lack of trust in each other,
1430
01:18:27,452 --> 01:18:30,414
loneliness, alienation, more polarization,
1431
01:18:30,706 --> 01:18:33,333
more election hacking, more populism,
1432
01:18:33,917 --> 01:18:36,962
more distraction and inability
to focus on the real issues...
1433
01:18:37,963 --> 01:18:39,715
that's just society. [scoffs]
1434
01:18:40,340 --> 01:18:46,388
And now society
is incapable of healing itself
1435
01:18:46,471 --> 01:18:48,515
and just devolving into a kind of chaos.
1436
01:18:51,977 --> 01:18:54,938
This affects everyone,
even if you don't use these products.
1437
01:18:55,397 --> 01:18:57,524
These things have become
digital Frankensteins
1438
01:18:57,607 --> 01:19:00,068
that are terraforming the world
in their image,
1439
01:19:00,152 --> 01:19:01,862
whether it's the mental health of children
1440
01:19:01,945 --> 01:19:04,489
or our politics
and our political discourse,
1441
01:19:04,573 --> 01:19:07,492
without taking responsibility
for taking over the public square.
1442
01:19:07,576 --> 01:19:10,579
-So, again, it comes back to--
-And who do you think's responsible?
1443
01:19:10,662 --> 01:19:13,582
I think we have
to have the platforms be responsible
1444
01:19:13,665 --> 01:19:15,584
for when they take over
election advertising,
1445
01:19:15,667 --> 01:19:17,794
they're responsible
for protecting elections.
1446
01:19:17,878 --> 01:19:20,380
When they take over mental health of kids
or Saturday morning,
1447
01:19:20,464 --> 01:19:22,841
they're responsible
for protecting Saturday morning.
1448
01:19:23,592 --> 01:19:27,929
The race to keep people's attention
isn't going away.
1449
01:19:28,388 --> 01:19:31,850
Our technology's gonna become
more integrated into our lives, not less.
1450
01:19:31,933 --> 01:19:34,895
The AIs are gonna get better at predicting
what keeps us on the screen,
1451
01:19:34,978 --> 01:19:37,105
not worse at predicting
what keeps us on the screen.
1452
01:19:38,940 --> 01:19:42,027
I... I am 62 years old,
1453
01:19:42,110 --> 01:19:44,821
getting older every minute,
the more this conversation goes on...
1454
01:19:44,905 --> 01:19:48,033
-[crowd chuckles]
-...but... but I will tell you that, um...
1455
01:19:48,700 --> 01:19:52,370
I'm probably gonna be dead and gone,
and I'll probably be thankful for it,
1456
01:19:52,454 --> 01:19:54,331
when all this shit comes to fruition.
1457
01:19:54,790 --> 01:19:59,586
Because... Because I think
that this scares me to death.
1458
01:20:00,754 --> 01:20:03,048
Do... Do you...
Do you see it the same way?
1459
01:20:03,548 --> 01:20:06,885
Or am I overreacting to a situation
that I don't know enough about?
1460
01:20:09,805 --> 01:20:11,598
[interviewer]
What are you most worried about?
1461
01:20:13,850 --> 01:20:18,480
[sighs] I think,
in the... in the shortest time horizon...
1462
01:20:19,523 --> 01:20:20,524
civil war.
1463
01:20:24,444 --> 01:20:29,908
If we go down the current status quo
for, let's say, another 20 years...
1464
01:20:31,117 --> 01:20:34,579
we probably destroy our civilization
through willful ignorance.
1465
01:20:34,663 --> 01:20:37,958
We probably fail to meet the challenge
of climate change.
1466
01:20:38,041 --> 01:20:42,087
We probably degrade
the world's democracies
1467
01:20:42,170 --> 01:20:46,132
so that they fall into some sort
of bizarre autocratic dysfunction.
1468
01:20:46,216 --> 01:20:48,426
We probably ruin the global economy.
1469
01:20:48,760 --> 01:20:52,264
Uh, we probably, um, don't survive.
1470
01:20:52,347 --> 01:20:54,808
You know,
I... I really do view it as existential.
1471
01:20:54,891 --> 01:20:56,893
[helicopter blades whirring]
1472
01:21:02,524 --> 01:21:04,985
[Tristan]
Is this the last generation of people
1473
01:21:05,068 --> 01:21:08,488
that are gonna know what it was like
before this illusion took place?
1474
01:21:11,074 --> 01:21:14,578
Like, how do you wake up from the matrix
when you don't know you're in the matrix?
1475
01:21:14,661 --> 01:21:16,538
[ominous instrumental music playing]
1476
01:21:27,382 --> 01:21:30,635
[Tristan] A lot of what we're saying
sounds like it's just this...
1477
01:21:31,511 --> 01:21:33,680
one-sided doom and gloom.
1478
01:21:33,763 --> 01:21:36,808
Like, "Oh, my God,
technology's just ruining the world
1479
01:21:36,892 --> 01:21:38,059
and it's ruining kids,"
1480
01:21:38,143 --> 01:21:40,061
and it's like... "No." [chuckles]
1481
01:21:40,228 --> 01:21:44,065
It's confusing
because it's simultaneous utopia...
1482
01:21:44,608 --> 01:21:45,567
and dystopia.
1483
01:21:45,942 --> 01:21:50,447
Like, I could hit a button on my phone,
and a car shows up in 30 seconds,
1484
01:21:50,530 --> 01:21:52,699
and I can go exactly where I need to go.
1485
01:21:52,782 --> 01:21:55,660
That is magic. That's amazing.
1486
01:21:56,161 --> 01:21:57,662
When we were making the like button,
1487
01:21:57,746 --> 01:22:01,499
our entire motivation was, "Can we spread
positivity and love in the world?"
1488
01:22:01,583 --> 01:22:05,003
The idea that, fast-forward to today,
and teens would be getting depressed
1489
01:22:05,086 --> 01:22:06,421
when they don't have enough likes,
1490
01:22:06,504 --> 01:22:08,632
or it could be leading
to political polarization
1491
01:22:08,715 --> 01:22:09,883
was nowhere on our radar.
1492
01:22:09,966 --> 01:22:12,135
I don't think these guys set out
to be evil.
1493
01:22:13,511 --> 01:22:15,764
It's just the business model
that has a problem.
1494
01:22:15,847 --> 01:22:20,226
You could shut down the service
and destroy whatever it is--
1495
01:22:20,310 --> 01:22:24,522
$20 billion of shareholder value--
and get sued and...
1496
01:22:24,606 --> 01:22:27,108
But you can't, in practice,
put the genie back in the bottle.
1497
01:22:27,192 --> 01:22:30,403
You can make some tweaks,
but at the end of the day,
1498
01:22:30,487 --> 01:22:34,032
you've gotta grow revenue and usage,
quarter over quarter. It's...
1499
01:22:34,658 --> 01:22:37,535
The bigger it gets,
the harder it is for anyone to change.
1500
01:22:38,495 --> 01:22:43,458
What I see is a bunch of people
who are trapped by a business model,
1501
01:22:43,541 --> 01:22:46,169
an economic incentive,
and shareholder pressure
1502
01:22:46,252 --> 01:22:48,922
that makes it almost impossible
to do something else.
1503
01:22:49,005 --> 01:22:50,924
I think we need to accept that it's okay
1504
01:22:51,007 --> 01:22:53,176
for companies to be focused
on making money.
1505
01:22:53,259 --> 01:22:55,637
What's not okay
is when there's no regulation, no rules,
1506
01:22:55,720 --> 01:22:56,888
and no competition,
1507
01:22:56,972 --> 01:23:00,850
and the companies are acting
as sort of de facto governments.
1508
01:23:00,934 --> 01:23:03,353
And then they're saying,
"Well, we can regulate ourselves."
1509
01:23:03,436 --> 01:23:05,981
I mean, that's just a lie.
That's just ridiculous.
1510
01:23:06,064 --> 01:23:08,650
Financial incentives kind of run
the world,
1511
01:23:08,733 --> 01:23:12,529
so any solution to this problem
1512
01:23:12,612 --> 01:23:15,573
has to realign the financial incentives.
1513
01:23:16,074 --> 01:23:18,785
There's no fiscal reason
for these companies to change.
1514
01:23:18,868 --> 01:23:21,329
And that is why I think
we need regulation.
1515
01:23:21,413 --> 01:23:24,290
The phone company
has tons of sensitive data about you,
1516
01:23:24,374 --> 01:23:27,544
and we have a lot of laws that make sure
they don't do the wrong things.
1517
01:23:27,627 --> 01:23:31,506
We have almost no laws
around digital privacy, for example.
1518
01:23:31,589 --> 01:23:34,426
We could tax data collection
and processing
1519
01:23:34,509 --> 01:23:37,554
the same way that you, for example,
pay your water bill
1520
01:23:37,637 --> 01:23:39,723
by monitoring the amount of water
that you use.
1521
01:23:39,806 --> 01:23:43,226
You tax these companies on the data assets
that they have.
1522
01:23:43,309 --> 01:23:44,769
It gives them a fiscal reason
1523
01:23:44,853 --> 01:23:47,856
to not acquire every piece of data
on the planet.
1524
01:23:47,939 --> 01:23:50,567
The law runs way behind on these things,
1525
01:23:50,650 --> 01:23:55,864
but what I know is the current situation
exists not for the protection of users,
1526
01:23:55,947 --> 01:23:58,700
but for the protection
of the rights and privileges
1527
01:23:58,783 --> 01:24:01,453
of these gigantic,
incredibly wealthy companies.
1528
01:24:02,245 --> 01:24:05,832
Are we always gonna defer to the richest,
most powerful people?
1529
01:24:05,915 --> 01:24:07,417
Or are we ever gonna say,
1530
01:24:07,959 --> 01:24:12,047
"You know, there are times
when there is a national interest.
1531
01:24:12,130 --> 01:24:15,592
There are times
when the interests of people, of users,
1532
01:24:15,675 --> 01:24:17,385
is actually more important
1533
01:24:18,011 --> 01:24:21,473
than the profits of somebody
who's already a billionaire"?
1534
01:24:21,556 --> 01:24:26,603
These markets undermine democracy,
and they undermine freedom,
1535
01:24:26,686 --> 01:24:28,521
and they should be outlawed.
1536
01:24:29,147 --> 01:24:31,816
This is not a radical proposal.
1537
01:24:31,900 --> 01:24:34,194
There are other markets that we outlaw.
1538
01:24:34,277 --> 01:24:36,988
We outlaw markets in human organs.
1539
01:24:37,072 --> 01:24:39,491
We outlaw markets in human slaves.
1540
01:24:39,949 --> 01:24:44,037
Because they have
inevitable destructive consequences.
1541
01:24:44,537 --> 01:24:45,830
We live in a world
1542
01:24:45,914 --> 01:24:50,001
in which a tree is worth more,
financially, dead than alive,
1543
01:24:50,085 --> 01:24:53,838
in a world in which a whale
is worth more dead than alive.
1544
01:24:53,922 --> 01:24:56,341
For so long as our economy works
in that way
1545
01:24:56,424 --> 01:24:58,134
and corporations go unregulated,
1546
01:24:58,218 --> 01:25:00,678
they're going to continue
to destroy trees,
1547
01:25:00,762 --> 01:25:01,763
to kill whales,
1548
01:25:01,846 --> 01:25:06,101
to mine the earth, and to continue
to pull oil out of the ground,
1549
01:25:06,184 --> 01:25:08,394
even though we know
it is destroying the planet
1550
01:25:08,478 --> 01:25:12,148
and we know that it's going to leave
a worse world for future generations.
1551
01:25:12,232 --> 01:25:13,858
This is short-term thinking
1552
01:25:13,942 --> 01:25:16,694
based on this religion of profit
at all costs,
1553
01:25:16,778 --> 01:25:20,156
as if somehow, magically, each corporation
acting in its selfish interest
1554
01:25:20,240 --> 01:25:21,950
is going to produce the best result.
1555
01:25:22,033 --> 01:25:24,494
This has been affecting the environment
for a long time.
1556
01:25:24,577 --> 01:25:27,288
What's frightening,
and what hopefully is the last straw
1557
01:25:27,372 --> 01:25:29,207
that will make us wake up
as a civilization
1558
01:25:29,290 --> 01:25:31,709
to how flawed this theory has been
in the first place
1559
01:25:31,793 --> 01:25:35,004
is to see that now we're the tree,
we're the whale.
1560
01:25:35,088 --> 01:25:37,048
Our attention can be mined.
1561
01:25:37,132 --> 01:25:39,134
We are more profitable to a corporation
1562
01:25:39,217 --> 01:25:41,594
if we're spending time
staring at a screen,
1563
01:25:41,678 --> 01:25:42,971
staring at an ad,
1564
01:25:43,054 --> 01:25:45,890
than if we're spending that time
living our life in a rich way.
1565
01:25:45,974 --> 01:25:47,559
And so, we're seeing the results of that.
1566
01:25:47,642 --> 01:25:50,687
We're seeing corporations using
powerful artificial intelligence
1567
01:25:50,770 --> 01:25:53,648
to outsmart us and figure out
how to pull our attention
1568
01:25:53,731 --> 01:25:55,358
toward the things they want us to look at,
1569
01:25:55,441 --> 01:25:57,277
rather than the things
that are most consistent
1570
01:25:57,360 --> 01:25:59,237
with our goals and our values
and our lives.
1571
01:25:59,320 --> 01:26:01,322
[static crackles]
1572
01:26:02,991 --> 01:26:04,450
[crowd cheering]
1573
01:26:05,535 --> 01:26:06,911
[Steve Jobs] What a computer is,
1574
01:26:06,995 --> 01:26:10,290
is it's the most remarkable tool
that we've ever come up with.
1575
01:26:11,124 --> 01:26:13,877
And it's the equivalent of a bicycle
for our minds.
1576
01:26:15,628 --> 01:26:20,091
The idea of humane technology,
that's where Silicon Valley got its start.
1577
01:26:21,050 --> 01:26:25,722
And we've lost sight of it
because it became the cool thing to do,
1578
01:26:25,805 --> 01:26:27,265
as opposed to the right thing to do.
1579
01:26:27,348 --> 01:26:29,726
The Internet was, like,
a weird, wacky place.
1580
01:26:29,809 --> 01:26:31,394
It was experimental.
1581
01:26:31,477 --> 01:26:34,731
Creative things happened on the Internet,
and certainly, they do still,
1582
01:26:34,814 --> 01:26:38,610
but, like, it just feels like this,
like, giant mall. [chuckles]
1583
01:26:38,693 --> 01:26:42,071
You know, it's just like, "God,
there's gotta be...
1584
01:26:42,155 --> 01:26:44,157
there's gotta be more to it than that."
1585
01:26:44,991 --> 01:26:45,992
[man typing]
1586
01:26:46,659 --> 01:26:48,411
[Bailey] I guess I'm just an optimist.
1587
01:26:48,494 --> 01:26:52,040
'Cause I think we can change
what social media looks like and means.
1588
01:26:54,083 --> 01:26:56,711
[Justin] The way the technology works
is not a law of physics.
1589
01:26:56,794 --> 01:26:57,921
It is not set in stone.
1590
01:26:58,004 --> 01:27:02,175
These are choices that human beings
like myself have been making.
1591
01:27:02,759 --> 01:27:05,345
And human beings can change
those technologies.
1592
01:27:06,971 --> 01:27:09,974
[Tristan] And the question now is
whether or not we're willing to admit
1593
01:27:10,475 --> 01:27:15,438
that those bad outcomes are coming
directly as a product of our work.
1594
01:27:21,027 --> 01:27:24,864
It's that we built these things,
and we have a responsibility to change it.
1595
01:27:28,409 --> 01:27:30,411
[static crackling]
1596
01:27:37,085 --> 01:27:38,711
[Tristan] The attention extraction model
1597
01:27:38,795 --> 01:27:42,298
is not how we want to treat
human beings.
1598
01:27:45,343 --> 01:27:48,137
[distorted] Is it just me or...
1599
01:27:49,722 --> 01:27:51,099
[distorted] Poor sucker.
1600
01:27:51,516 --> 01:27:53,226
[Tristan] The fabric of a healthy society
1601
01:27:53,309 --> 01:27:56,145
depends on us getting off
this corrosive business model.
1602
01:27:56,938 --> 01:27:58,064
[console beeps]
1603
01:27:58,147 --> 01:28:00,149
[gentle instrumental music playing]
1604
01:28:01,526 --> 01:28:04,612
[console whirs, grows quiet]
1605
01:28:04,696 --> 01:28:08,157
[Tristan] We can demand
that these products be designed humanely.
1606
01:28:09,409 --> 01:28:13,121
We can demand to not be treated
as an extractable resource.
1607
01:28:15,164 --> 01:28:18,334
The intention could be:
"How do we make the world better?"
1608
01:28:20,336 --> 01:28:21,504
[Jaron] Throughout history,
1609
01:28:21,587 --> 01:28:23,798
every single time
something's gotten better,
1610
01:28:23,881 --> 01:28:26,342
it's because somebody has come along
to say,
1611
01:28:26,426 --> 01:28:28,428
"This is stupid. We can do better."
[laughs]
1612
01:28:29,178 --> 01:28:32,557
Like, it's the critics
that drive improvement.
1613
01:28:33,141 --> 01:28:35,393
It's the critics
who are the true optimists.
1614
01:28:37,020 --> 01:28:39,147
[sighs] Hello.
1615
01:28:42,984 --> 01:28:44,277
[sighs] Um...
1616
01:28:46,195 --> 01:28:47,697
I mean, it seems kind of crazy, right?
1617
01:28:47,780 --> 01:28:51,534
It's like the fundamental way
that this stuff is designed...
1618
01:28:52,994 --> 01:28:55,163
isn't going in a good direction.
[chuckles]
1619
01:28:55,246 --> 01:28:56,873
Like, the entire thing.
1620
01:28:56,956 --> 01:29:00,626
So, it sounds crazy to say
we need to change all that,
1621
01:29:01,169 --> 01:29:02,670
but that's what we need to do.
1622
01:29:04,297 --> 01:29:05,923
[interviewer] Think we're gonna get there?
1623
01:29:07,383 --> 01:29:08,301
We have to.
1624
01:29:14,515 --> 01:29:16,476
[tense instrumental music playing]
1625
01:29:20,646 --> 01:29:24,942
[interviewer] Um,
it seems like you're very optimistic.
1626
01:29:26,194 --> 01:29:27,570
-Is that how I sound?
-[crew laughs]
1627
01:29:27,653 --> 01:29:28,905
[interviewer] Yeah, I mean...
1628
01:29:28,988 --> 01:29:31,449
I can't believe you keep saying that,
because I'm like, "Really?
1629
01:29:31,532 --> 01:29:33,409
I feel like we're headed toward dystopia.
1630
01:29:33,493 --> 01:29:35,328
I feel like we're on the fast track
to dystopia,
1631
01:29:35,411 --> 01:29:37,830
and it's gonna take a miracle
to get us out of it."
1632
01:29:37,914 --> 01:29:40,291
And that miracle is, of course,
collective will.
1633
01:29:41,000 --> 01:29:44,587
I am optimistic
that we're going to figure it out,
1634
01:29:44,670 --> 01:29:47,048
but I think it's gonna take a long time.
1635
01:29:47,131 --> 01:29:50,385
Because not everybody recognizes
that this is a problem.
1636
01:29:50,468 --> 01:29:55,890
I think one of the big failures
in technology today
1637
01:29:55,973 --> 01:29:58,643
is a real failure of leadership,
1638
01:29:58,726 --> 01:30:01,979
of, like, people coming out
and having these open conversations
1639
01:30:02,063 --> 01:30:05,900
about things that... not just
what went well, but what isn't perfect
1640
01:30:05,983 --> 01:30:08,194
so that someone can come in
and build something new.
1641
01:30:08,277 --> 01:30:10,321
At the end of the day, you know,
1642
01:30:10,405 --> 01:30:14,617
this machine isn't gonna turn around
until there's massive public pressure.
1643
01:30:14,700 --> 01:30:18,329
By having these conversations
and... and voicing your opinion,
1644
01:30:18,413 --> 01:30:21,082
in some cases
through these very technologies,
1645
01:30:21,165 --> 01:30:24,252
we can start to change the tide.
We can start to change the conversation.
1646
01:30:24,335 --> 01:30:27,004
It might sound strange,
but it's my world. It's my community.
1647
01:30:27,088 --> 01:30:29,632
I don't hate them. I don't wanna do
any harm to Google or Facebook.
1648
01:30:29,715 --> 01:30:32,885
I just want to reform them
so they don't destroy the world. You know?
1649
01:30:32,969 --> 01:30:35,513
I've uninstalled a ton of apps
from my phone
1650
01:30:35,596 --> 01:30:37,723
that I felt were just wasting my time.
1651
01:30:37,807 --> 01:30:40,685
All the social media apps,
all the news apps,
1652
01:30:40,768 --> 01:30:42,520
and I've turned off notifications
1653
01:30:42,603 --> 01:30:45,815
on anything that was vibrating my leg
with information
1654
01:30:45,898 --> 01:30:48,943
that wasn't timely and important to me
right now.
1655
01:30:49,026 --> 01:30:51,279
It's for the same reason
I don't keep cookies in my pocket.
1656
01:30:51,362 --> 01:30:53,197
Reduce the number of notifications
you get.
1657
01:30:53,281 --> 01:30:54,449
Turn off notifications.
1658
01:30:54,532 --> 01:30:55,950
Turning off all notifications.
1659
01:30:56,033 --> 01:30:58,536
I'm not using Google anymore,
I'm using Qwant,
1660
01:30:58,619 --> 01:31:01,497
which doesn't store your search history.
1661
01:31:01,581 --> 01:31:04,459
Never accept a video recommended to you
on YouTube.
1662
01:31:04,542 --> 01:31:07,003
Always choose.
That's another way to fight.
1663
01:31:07,086 --> 01:31:12,133
There are tons of Chrome extensions
that remove recommendations.
1664
01:31:12,216 --> 01:31:15,178
[interviewer] You're recommending
something to undo what you made.
1665
01:31:15,261 --> 01:31:16,554
[laughing] Yep.
1666
01:31:16,929 --> 01:31:21,642
Before you share, fact-check,
consider the source, do that extra Google.
1667
01:31:21,726 --> 01:31:25,104
If it seems like it's something designed
to really push your emotional buttons,
1668
01:31:25,188 --> 01:31:26,314
like, it probably is.
1669
01:31:26,397 --> 01:31:29,025
Essentially, you vote with your clicks.
1670
01:31:29,108 --> 01:31:30,359
If you click on clickbait,
1671
01:31:30,443 --> 01:31:33,779
you're creating a financial incentive
that perpetuates this existing system.
1672
01:31:33,863 --> 01:31:36,949
Make sure that you get
lots of different kinds of information
1673
01:31:37,033 --> 01:31:37,909
in your own life.
1674
01:31:37,992 --> 01:31:40,995
I follow people on Twitter
that I disagree with
1675
01:31:41,078 --> 01:31:44,207
because I want to be exposed
to different points of view.
1676
01:31:44,665 --> 01:31:46,584
Notice that many people
in the tech industry
1677
01:31:46,667 --> 01:31:49,045
don't give these devices
to their own children.
1678
01:31:49,128 --> 01:31:51,047
My kids don't use social media at all.
1679
01:31:51,839 --> 01:31:53,549
[interviewer] Is that a rule,
or is that a...
1680
01:31:53,633 --> 01:31:54,509
That's a rule.
1681
01:31:55,092 --> 01:31:57,845
We are zealots about it.
1682
01:31:57,929 --> 01:31:59,222
We're... We're crazy.
1683
01:31:59,305 --> 01:32:05,603
And we don't let our kids have
really any screen time.
1684
01:32:05,686 --> 01:32:08,564
I've worked out
what I think are three simple rules, um,
1685
01:32:08,648 --> 01:32:12,610
that make life a lot easier for families
and that are justified by the research.
1686
01:32:12,693 --> 01:32:15,571
So, the first rule is
all devices out of the bedroom
1687
01:32:15,655 --> 01:32:17,281
at a fixed time every night.
1688
01:32:17,365 --> 01:32:20,535
Whatever the time is, half an hour
before bedtime, all devices out.
1689
01:32:20,618 --> 01:32:24,038
The second rule is no social media
until high school.
1690
01:32:24,121 --> 01:32:26,374
Personally, I think the age should be 16.
1691
01:32:26,457 --> 01:32:28,960
Middle school's hard enough.
Keep it out until high school.
1692
01:32:29,043 --> 01:32:32,964
And the third rule is
work out a time budget with your kid.
1693
01:32:33,047 --> 01:32:34,757
And if you talk with them and say,
1694
01:32:34,840 --> 01:32:37,927
"Well, how many hours a day
do you wanna spend on your device?
1695
01:32:38,010 --> 01:32:39,637
What do you think is a good amount?"
1696
01:32:39,720 --> 01:32:41,597
they'll often say
something pretty reasonable.
1697
01:32:42,056 --> 01:32:44,642
Well, look, I know perfectly well
1698
01:32:44,725 --> 01:32:48,563
that I'm not gonna get everybody
to delete their social media accounts,
1699
01:32:48,646 --> 01:32:50,439
but I think I can get a few.
1700
01:32:50,523 --> 01:32:54,402
And just getting a few people
to delete their accounts matters a lot,
1701
01:32:54,485 --> 01:32:58,406
and the reason why is that that creates
the space for a conversation
1702
01:32:58,489 --> 01:33:00,908
because I want there to be enough people
out in the society
1703
01:33:00,992 --> 01:33:05,204
who are free of the manipulation engines
to have a societal conversation
1704
01:33:05,288 --> 01:33:07,540
that isn't bounded
by the manipulation engines.
1705
01:33:07,623 --> 01:33:10,126
So, do it! Get out of the system.
1706
01:33:10,209 --> 01:33:12,503
Yeah, delete. Get off the stupid stuff.
1707
01:33:13,546 --> 01:33:16,507
The world's beautiful.
Look. Look, it's great out there.
1708
01:33:17,258 --> 01:33:18,384
[laughs]
1709
01:33:21,971 --> 01:33:24,432
-[birds singing]
-[children playing and shouting]