In the Age of AI (full film) | FRONTLINE
1
00:00:16,630 --> 00:00:17,930
NARRATOR: Tonight--
2
00:00:17,930 --> 00:00:20,470
The race to become an A.I. superpower is on...
3
00:00:20,470 --> 00:00:22,670
NARRATOR: The politics of artificial intelligence...
4
00:00:22,670 --> 00:00:24,530
There will be a Chinese tech sector
5
00:00:24,530 --> 00:00:26,200
and there will be an American tech sector.
6
00:00:26,200 --> 00:00:27,730
NARRATOR: The new tech war.
7
00:00:27,730 --> 00:00:30,200
The more data, the better the A.I. works.
8
00:00:30,200 --> 00:00:33,930
So in the age of A.I., where data is the new oil,
9
00:00:33,930 --> 00:00:36,170
China is the new Saudi Arabia.
10
00:00:36,170 --> 00:00:37,930
NARRATOR: The future of work...
11
00:00:37,930 --> 00:00:40,100
When I increase productivity through automation,
12
00:00:40,100 --> 00:00:42,070
jobs go away.
13
00:00:42,070 --> 00:00:46,100
I believe about 50% of jobs will be somewhat
14
00:00:46,100 --> 00:00:50,000
or extremely threatened by A.I. in the next 15 years or so.
15
00:00:50,000 --> 00:00:52,630
NARRATOR: A.I. and corporate surveillance...
16
00:00:52,630 --> 00:00:55,500
We thought that we were searching Google.
17
00:00:55,500 --> 00:00:57,930
We had no idea that Google was searching us.
18
00:00:57,930 --> 00:01:00,170
NARRATOR: And the threat to democracy.
19
00:01:00,170 --> 00:01:02,600
China is on its way to building
20
00:01:02,600 --> 00:01:04,070
a total surveillance state.
21
00:01:04,070 --> 00:01:06,130
NARRATOR: Tonight on "Frontline"...
22
00:01:06,130 --> 00:01:09,300
It has pervaded so many elements of everyday life.
23
00:01:09,300 --> 00:01:11,930
How do we make it transparent and accountable?
24
00:01:11,930 --> 00:01:13,770
NARRATOR: ..."In the Age of A.I."
25
00:01:16,530 --> 00:01:21,200
♪ ♪
26
00:01:34,900 --> 00:01:37,970
♪ ♪
27
00:01:42,700 --> 00:01:46,330
NARRATOR: This is the world's most complex board game.
28
00:01:48,070 --> 00:01:51,530
There are more possible moves in the game of Go
29
00:01:51,530 --> 00:01:55,800
than there are atoms in the universe.
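
A quick sanity check of that claim, in Python, using the crude bound of 3^361 board configurations (each of Go's 361 points can be empty, black, or white) and the common estimate of roughly 10^80 atoms in the observable universe:

    # Rough bound: Go board configurations vs. atoms in the universe.
    board_points = 19 * 19                  # standard 19x19 Go board
    configurations = 3 ** board_points      # each point: empty, black, or white
    atoms_estimate = 10 ** 80               # common observable-universe estimate

    print(len(str(configurations)) - 1)     # 172: configurations is about 10^172
    print(configurations > atoms_estimate)  # True, by over 90 orders of magnitude
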
30
00:01:55,800 --> 00:02:01,700
Legend has it that in 2300 BCE, Emperor Yao devised it
31
00:02:01,700 --> 00:02:08,000
to teach his son discipline, concentration, and balance.
32
00:02:08,000 --> 00:02:12,570
And, over 4,000 years later, this ancient Chinese game
33
00:02:12,570 --> 00:02:17,370
would signal the start of a new industrial age.
34
00:02:17,370 --> 00:02:18,430
♪ ♪
35
00:02:25,930 --> 00:02:31,400
It was 2016, in Seoul, South Korea.
36
00:02:31,400 --> 00:02:35,170
Can machines overtake human intelligence?
37
00:02:35,170 --> 00:02:37,730
A breakthrough moment when the world champion
38
00:02:37,730 --> 00:02:40,630
of the Asian board game Go takes on an A.I. program
39
00:02:40,630 --> 00:02:42,530
developed by Google.
40
00:02:42,530 --> 00:02:49,430
(speaking Korean):
41
00:02:55,300 --> 00:02:56,770
In countries where it's very popular,
42
00:02:56,770 --> 00:03:00,570
like China and Japan and, and South Korea, to them,
43
00:03:00,570 --> 00:03:02,100
Go is not just a game, right?
44
00:03:02,100 --> 00:03:04,000
It's, like, how you learn strategy.
45
00:03:04,000 --> 00:03:07,500
It has an almost spiritual component.
46
00:03:07,500 --> 00:03:09,400
You know, if you talk to South Koreans, right,
47
00:03:09,400 --> 00:03:11,500
and Lee Sedol is the world's greatest Go player,
48
00:03:11,500 --> 00:03:13,730
he's a national hero in South Korea.
49
00:03:13,730 --> 00:03:18,630
They were sure that Lee Sedol would beat AlphaGo hands down.
50
00:03:18,630 --> 00:03:23,030
♪ ♪
51
00:03:23,030 --> 00:03:26,130
NARRATOR: Google's AlphaGo was a computer program that,
52
00:03:26,130 --> 00:03:28,870
starting with the rules of Go
53
00:03:28,870 --> 00:03:31,330
and a database of historical games,
54
00:03:31,330 --> 00:03:34,630
had been designed to teach itself.
55
00:03:34,630 --> 00:03:38,700
I was one of the commentators at the Lee Sedol games.
56
00:03:38,700 --> 00:03:42,700
And yes, it was watched by tens of millions of people.
57
00:03:42,700 --> 00:03:44,300
(man speaking Korean)
58
00:03:44,300 --> 00:03:46,630
NARRATOR: Throughout East Asia,
59
00:03:46,630 --> 00:03:48,400
this was seen as a sports spectacle
60
00:03:48,400 --> 00:03:49,800
with national pride at stake.
61
00:03:49,800 --> 00:03:51,030
Wow, that was a player guess.
62
00:03:51,030 --> 00:03:53,530
NARRATOR: But much more was in play.
63
00:03:53,530 --> 00:03:55,730
This was the public unveiling
64
00:03:55,730 --> 00:03:57,830
of a form of artificial intelligence
65
00:03:57,830 --> 00:04:00,400
called deep learning,
66
00:04:00,400 --> 00:04:03,400
that mimics the neural networks of the human brain.
67
00:04:03,400 --> 00:04:05,430
So what happens with machine learning,
68
00:04:05,430 --> 00:04:08,300
or artificial intelligence--initially with AlphaGo--
69
00:04:08,300 --> 00:04:12,130
is that the machine is fed all kinds of Go games,
70
00:04:12,130 --> 00:04:15,530
and then it studies them, learns from them,
71
00:04:15,530 --> 00:04:17,830
and figures out its own moves.
72
00:04:17,829 --> 00:04:19,699
And because it's an A.I. system--
73
00:04:19,700 --> 00:04:21,570
it's not just following instructions,
74
00:04:21,570 --> 00:04:23,930
it's figuring out its own instructions--
75
00:04:23,930 --> 00:04:26,930
it comes up with moves that humans hadn't thought of before.
76
00:04:26,930 --> 00:04:31,130
So, it studies games that humans have played, it knows the rules,
77
00:04:31,130 --> 00:04:36,130
and then it comes up with creative moves.
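
The mechanism behind that description is a network of adjustable weights, nudged example by example until its outputs match the data. Below is a minimal NumPy sketch of that learning loop on a toy XOR task; it shows only the shape of the idea, as AlphaGo's actual networks and self-play training are enormously larger:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR: the "games" to learn from

    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)     # layer 1 weights
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)     # layer 2 weights
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(5000):
        h = np.tanh(X @ W1 + b1)                      # hidden activations
        p = sigmoid(h @ W2 + b2)                      # network's predictions
        g = p - y                                     # error signal (cross-entropy gradient)
        gh = (g @ W2.T) * (1 - h ** 2)                # backpropagate to hidden layer
        W2 -= 0.1 * (h.T @ g); b2 -= 0.1 * g.sum(0)
        W1 -= 0.1 * (X.T @ gh); b1 -= 0.1 * gh.sum(0)

    print(p.round(2))  # should approach [[0],[1],[1],[0]]: the rule was learned, not programmed
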
78
00:04:36,130 --> 00:04:38,030
(woman speaking Korean)
79
00:04:39,600 --> 00:04:42,370
(speaking Korean):
80
00:04:42,370 --> 00:04:44,800
That's a very... that's a very surprising move.
81
00:04:44,800 --> 00:04:47,670
I thought it was a mistake.
82
00:04:47,670 --> 00:04:51,470
NARRATOR: Game two, move 37.
83
00:04:51,470 --> 00:04:54,200
That move 37 was a move that humans could not fathom,
84
00:04:54,200 --> 00:04:57,070
but yet it ended up being brilliant
85
00:04:57,070 --> 00:05:00,470
and woke people up to say,
86
00:05:00,470 --> 00:05:03,100
"Wow, after thousandsof years of playing,
87
00:05:03,100 --> 00:05:06,330
we never thought about making a move like that."
88
00:05:06,330 --> 00:05:08,370
Oh, he resigned.
89
00:05:08,370 --> 00:05:12,300
It looks like... Lee Sedol has just resigned, actually.
90
00:05:12,300 --> 00:05:13,830
Yeah! Yes.
91
00:05:13,830 --> 00:05:15,530
NARRATOR: In the end, the scientists watched
92
00:05:15,530 --> 00:05:18,200
their algorithms win four of the games.
93
00:05:18,200 --> 00:05:20,470
Lee Sedol took one.
94
00:05:20,470 --> 00:05:22,330
What happened with Go, first and foremost,
95
00:05:22,330 --> 00:05:25,830
was a huge victory for DeepMind and for A.I., right?
96
00:05:25,830 --> 00:05:28,170
It wasn't that the computers beat the humans,
97
00:05:28,170 --> 00:05:31,970
it was that, you know, one type of intelligence beat another.
98
00:05:31,970 --> 00:05:34,230
NARRATOR: Artificial intelligence had proven
99
00:05:34,230 --> 00:05:36,770
it could marshal a vast amount of data,
100
00:05:36,770 --> 00:05:40,300
beyond anything any human could handle,
101
00:05:40,300 --> 00:05:44,400
and use it to teach itself how to predict an outcome.
102
00:05:44,400 --> 00:05:48,400
The commercial implications were enormous.
103
00:05:48,400 --> 00:05:51,670
While AlphaGo is a, is a toy game,
104
00:05:51,670 --> 00:05:57,530
but its success and its waking everyone up, I think,
105
00:05:57,530 --> 00:06:03,770
is, is going to be remembered as the pivotal moment
106
00:06:03,770 --> 00:06:07,230
where A.I. became mature
107
00:06:07,230 --> 00:06:09,100
and everybody jumped on the bandwagon.
108
00:06:09,100 --> 00:06:10,570
♪ ♪
109
00:06:10,570 --> 00:06:14,270
NARRATOR: This is about the consequences of that defeat.
110
00:06:14,270 --> 00:06:16,270
(man speaking local language)
111
00:06:16,270 --> 00:06:19,870
How the A.I. algorithms are ushering in a new age
112
00:06:19,870 --> 00:06:24,200
of great potential and prosperity,
113
00:06:24,200 --> 00:06:29,170
but an age that will also deepen inequality, challenge democracy,
114
00:06:29,170 --> 00:06:35,200
and divide the world into two A.I. superpowers.
115
00:06:35,200 --> 00:06:39,130
Tonight, five stories about how artificial intelligence
116
00:06:39,130 --> 00:06:40,930
is changing our world.
117
00:06:40,930 --> 00:06:43,930
♪ ♪
118
00:06:51,800 --> 00:06:56,330
China has decided to chase the A.I. future.
119
00:06:56,330 --> 00:06:58,770
The difference between the internet mindset
120
00:06:58,770 --> 00:07:00,830
and the A.I. mindset...
121
00:07:00,830 --> 00:07:04,700
NARRATOR: A future made and embraced by a new generation.
122
00:07:07,000 --> 00:07:10,770
Well, it's hard not to feel the kind of immense energy,
123
00:07:10,770 --> 00:07:15,570
and also the obvious fact of the demographics.
124
00:07:15,570 --> 00:07:18,770
They're mostly very young people,
125
00:07:18,770 --> 00:07:22,830
so that this clearly is technology which is being
126
00:07:22,830 --> 00:07:26,030
generated by a whole new generation.
127
00:07:26,030 --> 00:07:27,800
NARRATOR: Orville Schell is one of
128
00:07:27,800 --> 00:07:30,100
America's foremost China scholars.
129
00:07:30,100 --> 00:07:31,730
(speaking Mandarin)
130
00:07:31,730 --> 00:07:34,830
NARRATOR: He first came here 45 years ago.
131
00:07:34,830 --> 00:07:38,270
When I, when I first came here, in 1975,
132
00:07:38,270 --> 00:07:40,770
Chairman Mao was still alive,
133
00:07:40,770 --> 00:07:43,300
the Cultural Revolution was coming on,
134
00:07:43,300 --> 00:07:47,830
and there wasn't a single whiff of anything
135
00:07:47,830 --> 00:07:49,170
of what you see here.
136
00:07:49,170 --> 00:07:50,770
It was unimaginable.
137
00:07:50,770 --> 00:07:54,500
In fact, in those years, one very much thought,
138
00:07:54,500 --> 00:08:00,530
"This is the way China is, thisis the way it's going to be."
139
00:08:00,530 --> 00:08:02,570
And the fact that it has gone through
140
00:08:02,570 --> 00:08:06,330
so many different changes since is quite extraordinary.
141
00:08:06,330 --> 00:08:08,270
(man giving instructions)
142
00:08:08,270 --> 00:08:11,770
NARRATOR: This extraordinary progress goes back
143
00:08:11,770 --> 00:08:14,370
to that game of Go.
144
00:08:14,370 --> 00:08:16,830
I think that the government recognized
145
00:08:16,830 --> 00:08:18,300
that this was a sort of critical thing for the future,
146
00:08:18,300 --> 00:08:20,270
and, "We need to catch upin this," that, you know,
147
00:08:20,270 --> 00:08:22,900
"We cannot have a foreigncompany showing us up
148
00:08:22,900 --> 00:08:24,300
at our own game.
149
00:08:24,300 --> 00:08:25,730
And this is going to be something that is going to be
150
00:08:25,730 --> 00:08:27,100
critically important in the future."
151
00:08:27,100 --> 00:08:29,230
So, you know, we called it the Sputnik moment for,
152
00:08:29,230 --> 00:08:31,000
for the Chinese government--
153
00:08:31,000 --> 00:08:33,970
the Chinese government kind of woke up.
154
00:08:33,969 --> 00:08:36,599
(translated): As we often say in China,
155
00:08:36,599 --> 00:08:39,699
"The beginning is the mostdifficult part."
156
00:08:39,700 --> 00:08:42,630
NARRATOR: In 2017, Xi Jinping announced
157
00:08:42,630 --> 00:08:44,570
the government's bold new plans
158
00:08:44,570 --> 00:08:47,570
to an audience of foreign diplomats.
159
00:08:47,570 --> 00:08:51,000
China would catch up with the U.S. in artificial intelligence
160
00:08:51,000 --> 00:08:55,170
by 2025 and lead the world by 2030.
161
00:08:55,170 --> 00:08:57,500
(translated): ...and intensified cooperation
162
00:08:57,500 --> 00:09:00,270
in frontier areas such as digital economy,
163
00:09:00,270 --> 00:09:02,930
artificial intelligence, nanotechnology,
164
00:09:02,930 --> 00:09:05,230
and quantum computing.
165
00:09:05,230 --> 00:09:08,730
♪ ♪
166
00:09:11,530 --> 00:09:15,200
NARRATOR: Today, China leads the world in e-commerce.
167
00:09:18,370 --> 00:09:22,070
Drones deliver to rural villages.
168
00:09:22,070 --> 00:09:25,070
And a society that bypassed credit cards
169
00:09:25,070 --> 00:09:28,030
now shops in stores without cashiers,
170
00:09:28,030 --> 00:09:33,200
where the currency is facial recognition.
171
00:09:33,200 --> 00:09:36,230
No country has ever moved that fast.
172
00:09:36,230 --> 00:09:38,730
And in a short two-and-a-half years,
173
00:09:38,730 --> 00:09:43,400
China's A.I. implementation really went from a minimal amount
174
00:09:43,400 --> 00:09:47,230
to probably about 17 or 18 unicorns,
175
00:09:47,230 --> 00:09:50,000
that is, billion-dollar companies, in A.I. today.
176
00:09:50,000 --> 00:09:55,130
And that, that progress is, is hard to believe.
177
00:09:55,130 --> 00:09:57,830
NARRATOR: The progress was powered by a new generation
178
00:09:57,830 --> 00:10:01,870
of ambitious young techs pouring out of Chinese universities,
179
00:10:01,870 --> 00:10:05,570
competing with each other for new ideas,
180
00:10:05,570 --> 00:10:11,630
and financed by a new cadre of Chinese venture capitalists.
181
00:10:11,630 --> 00:10:13,600
This is Sinovation,
182
00:10:13,600 --> 00:10:17,100
created by U.S.-educated A.I. scientist and businessman
183
00:10:17,100 --> 00:10:19,000
Kai-Fu Lee.
184
00:10:19,000 --> 00:10:24,170
These unicorns-- we've got one, two, three, four, five,
185
00:10:24,170 --> 00:10:27,300
six, in the general A.I. area.
186
00:10:27,300 --> 00:10:29,630
And unicorn means a billion-dollar company,
187
00:10:29,630 --> 00:10:33,870
a company whose valuation or market capitalization
188
00:10:33,870 --> 00:10:36,870
is at $1 billion or higher.
189
00:10:36,870 --> 00:10:42,830
I think we put two unicorns to show $5 billion or higher.
190
00:10:42,830 --> 00:10:45,300
NARRATOR: Kai-Fu Lee was born in Taiwan.
191
00:10:45,300 --> 00:10:48,530
His parents sent him to high school in Tennessee.
192
00:10:48,530 --> 00:10:51,270
His PhD thesis at Carnegie Mellon
193
00:10:51,270 --> 00:10:53,900
was on computer speech recognition,
194
00:10:53,900 --> 00:10:55,570
which took him to Apple.
195
00:10:55,570 --> 00:10:57,900
Well, reality is a step closer to science fiction,
196
00:10:57,900 --> 00:11:00,770
with Apple Computer's newly developed program...
197
00:11:00,770 --> 00:11:03,830
NARRATOR: And at 31, an early measure of fame.
198
00:11:03,830 --> 00:11:06,100
Kai-Fu Lee, the inventor of Apple's
199
00:11:06,100 --> 00:11:07,530
speech-recognition technology.
200
00:11:07,530 --> 00:11:10,400
Casper, copy this to MacWrite II.
201
00:11:10,400 --> 00:11:12,730
Casper, paste.
202
00:11:12,730 --> 00:11:15,600
Casper, 72-point italic outline.
203
00:11:15,600 --> 00:11:18,970
NARRATOR: He would move on to Microsoft Research in Asia
204
00:11:18,970 --> 00:11:21,430
and become the head of Google China.
205
00:11:21,430 --> 00:11:26,530
Ten years ago, he started Sinovation in Beijing,
206
00:11:26,530 --> 00:11:30,700
and began looking for promising startups and A.I. talent.
207
00:11:30,700 --> 00:11:33,500
So, the Chinese entrepreneurial companies
208
00:11:33,500 --> 00:11:35,500
started as copycats.
209
00:11:35,500 --> 00:11:39,570
But over the last 15 years, China has developed its own form
210
00:11:39,570 --> 00:11:45,100
of entrepreneurship, and that entrepreneurship is described
211
00:11:45,100 --> 00:11:50,130
as tenacious, very fast, winner-take-all,
212
00:11:50,130 --> 00:11:52,930
and incredible work ethic.
213
00:11:52,930 --> 00:11:57,430
I would say these few thousand Chinese top entrepreneurs,
214
00:11:57,430 --> 00:11:59,230
they could take on any entrepreneur
215
00:11:59,230 --> 00:12:01,400
anywhere in the world.
216
00:12:01,400 --> 00:12:04,170
NARRATOR: Entrepreneurs like Cao Xudong,
217
00:12:04,170 --> 00:12:10,100
the 33-year-old C.E.O. of a new startup called Momenta.
218
00:12:10,100 --> 00:12:12,700
This is a ring road around Beijing.
219
00:12:12,700 --> 00:12:15,470
The car is driving itself.
220
00:12:15,470 --> 00:12:18,730
♪ ♪
221
00:12:21,230 --> 00:12:24,470
You see, another cutting, another cutting-in.
222
00:12:24,470 --> 00:12:26,670
Another cut-in, yeah, yeah.
223
00:12:26,670 --> 00:12:29,470
NARRATOR: Cao has no doubt about the inevitability
224
00:12:29,470 --> 00:12:33,430
of autonomous vehicles.
225
00:12:33,430 --> 00:12:39,130
Just like AlphaGo can beat the human player in, in Go,
226
00:12:39,130 --> 00:12:43,570
I think the machine will definitely surpass
227
00:12:43,570 --> 00:12:47,370
the human driver, in the end.
228
00:12:47,370 --> 00:12:48,730
NARRATOR: Recently, there have been cautions
229
00:12:48,730 --> 00:12:53,530
about how soon autonomous vehicles will be deployed,
230
00:12:53,530 --> 00:12:55,700
but Cao and his team are confident
231
00:12:55,700 --> 00:12:58,730
they're in for the long haul.
232
00:12:58,730 --> 00:13:01,030
U.S. will be the first to deploy,
233
00:13:01,030 --> 00:13:03,830
but China may be the first to popularize.
234
00:13:03,830 --> 00:13:05,270
It is 50-50 right now.
235
00:13:05,270 --> 00:13:07,000
U.S. is ahead in technology.
236
00:13:07,000 --> 00:13:10,030
China has a larger market, and the Chinese government
237
00:13:10,030 --> 00:13:12,870
is helping with infrastructure efforts--
238
00:13:12,870 --> 00:13:16,100
for example, building a new city the size of Chicago
239
00:13:16,100 --> 00:13:18,670
with autonomous driving enabled,
240
00:13:18,670 --> 00:13:21,700
and also a new highway that has sensors built in
241
00:13:21,700 --> 00:13:24,230
to help autonomous vehicles be safer.
242
00:13:24,230 --> 00:13:27,470
NARRATOR: Their early investors included
243
00:13:27,470 --> 00:13:29,470
Mercedes-Benz.
244
00:13:29,470 --> 00:13:33,430
I feel very lucky and very inspired
245
00:13:33,430 --> 00:13:38,000
and very excited that we're living in this era.
246
00:13:38,000 --> 00:13:40,800
♪ ♪
247
00:13:40,800 --> 00:13:42,630
NARRATOR: Life in China is largely conducted
248
00:13:42,630 --> 00:13:45,000
on smartphones.
249
00:13:45,000 --> 00:13:48,270
A billion people use WeChat, the equivalent of Facebook,
250
00:13:48,270 --> 00:13:51,030
Messenger, and PayPal, and much more,
251
00:13:51,030 --> 00:13:54,200
combined into just one super-app.
252
00:13:54,200 --> 00:13:55,870
And there are many more.
253
00:13:55,870 --> 00:14:00,070
China is the best place for A.I. implementation today,
254
00:14:00,070 --> 00:14:04,230
because of the vast amount of data that's available in China.
255
00:14:04,230 --> 00:14:07,670
China has a lot more users than any other country,
256
00:14:07,670 --> 00:14:10,700
three to four times more than the U.S.
257
00:14:10,700 --> 00:14:14,900
There are 50 times more mobile payments than the U.S.
258
00:14:14,900 --> 00:14:17,300
There are ten times more food deliveries,
259
00:14:17,300 --> 00:14:21,030
which serve as data to learn more about user behavior
260
00:14:21,030 --> 00:14:22,870
than the U.S.
261
00:14:22,870 --> 00:14:26,570
300 times more shared bicycle rides,
262
00:14:26,570 --> 00:14:30,400
and each shared bicycle ride has all kinds of sensors
263
00:14:30,400 --> 00:14:32,730
submitting data up to the cloud.
264
00:14:32,730 --> 00:14:36,230
We're talking about maybe ten times more data than the U.S.,
265
00:14:36,230 --> 00:14:41,230
and A.I. is basically run on data and fueled by data.
266
00:14:41,230 --> 00:14:44,400
The more data, the better the A.I. works,
267
00:14:44,400 --> 00:14:47,500
more importantly than how brilliant the researcher is
268
00:14:47,500 --> 00:14:49,000
working on the problem.
269
00:14:49,000 --> 00:14:54,100
So, in the age of A.I., where data is the new oil,
270
00:14:54,100 --> 00:14:57,370
China is the new Saudi Arabia.
271
00:14:57,370 --> 00:14:59,570
NARRATOR: And access to all that data
272
00:14:59,570 --> 00:15:02,870
means that the deep-learning algorithm can quickly predict
273
00:15:02,870 --> 00:15:05,370
behavior, like the creditworthiness of someone
274
00:15:05,370 --> 00:15:06,870
wanting a short-term loan.
275
00:15:06,870 --> 00:15:09,270
Here is our application.
276
00:15:09,270 --> 00:15:13,900
And customers can choose how much money they want to borrow
277
00:15:13,900 --> 00:15:16,830
and how long they want to borrow,
278
00:15:16,830 --> 00:15:21,130
and they can input their data here.
279
00:15:21,130 --> 00:15:27,530
And after, after that, you can just borrow very quickly.
280
00:15:27,530 --> 00:15:31,030
NARRATOR: The C.E.O. shows us how quickly you can get a loan.
281
00:15:31,030 --> 00:15:33,100
It is, it is done.
282
00:15:33,100 --> 00:15:35,430
NARRATOR: It takes an average of eight seconds.
283
00:15:35,430 --> 00:15:38,400
It has passed to banks. Wow.
284
00:15:38,400 --> 00:15:40,170
NARRATOR: In the eight seconds,
285
00:15:40,170 --> 00:15:42,930
the algorithm has assessed 5,000 personal features
286
00:15:42,930 --> 00:15:44,630
from all your data.
287
00:15:44,630 --> 00:15:50,500
5,000 features that are related to delinquency,
288
00:15:50,500 --> 00:15:57,800
when maybe the banks only use a few, maybe, maybe ten features
289
00:15:57,800 --> 00:16:02,170
when they are doing their risk assessment.
290
00:16:02,170 --> 00:16:03,630
NARRATOR: Processing millions of transactions,
291
00:16:03,630 --> 00:16:06,930
it'll dig up features that would never be apparent
292
00:16:06,930 --> 00:16:11,870
to a human loan officer, like how confidently you type
293
00:16:11,870 --> 00:16:15,600
your loan application, or, surprisingly,
294
00:16:15,600 --> 00:16:18,670
if you keep your cell phone battery charged.
295
00:16:18,670 --> 00:16:21,300
It's very interesting, the battery of the phone
296
00:16:21,300 --> 00:16:24,170
is related to their delinquency rate.
297
00:16:24,170 --> 00:16:26,530
Someone who has a much lower battery,
298
00:16:26,530 --> 00:16:31,400
they are much more dangerous than others.
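
The lender's model itself is proprietary; in machine-learning terms, what the C.E.O. describes is a classifier trained on past repayment outcomes over thousands of behavioral signals. A hedged sketch of that idea with synthetic data and two stand-in features from the interview (every name and number below is illustrative, not the company's actual model):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 10_000
    typing_confidence = rng.uniform(0, 1, n)   # hypothetical: steadiness filling the form
    battery_level = rng.uniform(0, 1, n)       # hypothetical: average phone charge
    X = np.column_stack([typing_confidence, battery_level])

    # Synthetic ground truth echoing the interview: lower battery, higher risk.
    p_default = 1 / (1 + np.exp(2 * typing_confidence + 3 * battery_level - 2))
    y = rng.uniform(size=n) < p_default

    model = LogisticRegression().fit(X, y)
    applicant = np.array([[0.8, 0.15]])          # confident typist, low battery
    print(model.predict_proba(applicant)[0, 1])  # estimated probability of default
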
299
00:16:31,400 --> 00:16:34,370
It's probably unfathomable to an American
300
00:16:34,370 --> 00:16:39,600
how a country can dramatically evolve itself
301
00:16:39,600 --> 00:16:43,470
from a copycat laggard to, all of a sudden,
302
00:16:43,470 --> 00:16:48,030
to nearly as good as the U.S. in technology.
303
00:16:48,030 --> 00:16:50,300
NARRATOR: Like this facial-recognition startup
304
00:16:50,300 --> 00:16:51,800
he invested in.
305
00:16:51,800 --> 00:16:56,470
Megvii was started by three young graduates in 2011.
306
00:16:56,470 --> 00:17:00,700
It's now a world leader in using A.I. to identify people.
307
00:17:03,530 --> 00:17:05,000
It's pretty fast.
308
00:17:05,000 --> 00:17:07,530
For example, on the mobile device,
309
00:17:07,530 --> 00:17:10,670
we have timed the facial-recognition speed.
310
00:17:10,670 --> 00:17:13,830
It's actually less than 100 milliseconds.
311
00:17:13,829 --> 00:17:15,829
So, that's very, very fast.
312
00:17:15,829 --> 00:17:19,969
So 0.1 second that we can, we will be able to recognize you,
313
00:17:19,970 --> 00:17:24,200
even on a mobile device.
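
Megvii has not published its pipeline, but commercial face recognition generally maps each face image to a compact numeric "embedding" and compares embeddings, so matching against a large database reduces to fast vector arithmetic; that is why sub-100-millisecond recognition is plausible even on a phone. A sketch with made-up vectors standing in for a real embedding network:

    import numpy as np

    rng = np.random.default_rng(7)
    database = rng.normal(size=(50_000, 128))   # one 128-d embedding per enrolled face
    database /= np.linalg.norm(database, axis=1, keepdims=True)

    probe = database[4242] + rng.normal(scale=0.03, size=128)  # noisy new photo of person 4242
    probe /= np.linalg.norm(probe)

    scores = database @ probe         # cosine similarity to every enrolled face at once
    best = int(np.argmax(scores))
    print(best, scores[best] > 0.8)   # 4242 True: declared a match above a threshold
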
314
00:17:24,199 --> 00:17:26,299
NARRATOR: The company claims the system is better
315
00:17:26,300 --> 00:17:30,170
than any human at identifying people in its database.
316
00:17:30,170 --> 00:17:33,770
And for those who aren't, it can describe them.
317
00:17:33,770 --> 00:17:36,530
Like our director--what he's wearing,
318
00:17:36,530 --> 00:17:42,230
and a good guess at his age, missing it by only a few months.
319
00:17:42,230 --> 00:17:46,970
We are the first one to really take facial recognition
320
00:17:46,970 --> 00:17:50,570
to commercial quality.
321
00:17:50,570 --> 00:17:52,070
NARRATOR: That's why in Beijing today,
322
00:17:52,070 --> 00:17:57,630
you can pay for your KFC with a smile.
323
00:17:57,630 --> 00:17:59,070
You know, it's not so surprising,
324
00:17:59,070 --> 00:18:01,230
we've seen Chinese companies catching up to the U.S.
325
00:18:01,230 --> 00:18:02,630
in technology for a long time.
326
00:18:02,630 --> 00:18:05,200
And so, if particular effort and attention is paid
327
00:18:05,200 --> 00:18:07,700
in a specific sector, it's not so surprising
328
00:18:07,700 --> 00:18:09,600
that they would surpass the rest of the world.
329
00:18:09,600 --> 00:18:12,000
And facial recognition is one of the, really the first places
330
00:18:12,000 --> 00:18:15,300
we've seen that start to happen.
331
00:18:15,300 --> 00:18:18,230
NARRATOR: It's a technology prized by the government,
332
00:18:18,230 --> 00:18:23,370
like this program in Shenzhen to discourage jaywalking.
333
00:18:23,370 --> 00:18:27,400
Offenders are shamed in public--and with facial recognition,
334
00:18:27,400 --> 00:18:31,330
can be instantly fined.
335
00:18:31,330 --> 00:18:34,570
Critics warn that the government and some private companies
336
00:18:34,570 --> 00:18:37,170
have been building a national database
337
00:18:37,170 --> 00:18:41,330
from dozens of experimental social-credit programs.
338
00:18:41,330 --> 00:18:43,500
The government wants to integrate
339
00:18:43,500 --> 00:18:48,670
all these individual behaviors, or corporations' records,
340
00:18:48,670 --> 00:18:55,670
into some kind of metrics and compute out a single number
341
00:18:55,670 --> 00:18:59,030
or set of numbers associated with an individual,
342
00:18:59,030 --> 00:19:04,670
a citizen, and using that, to implement an incentive
343
00:19:04,670 --> 00:19:06,130
or punishment system.
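
No unified national score exists, as the film goes on to note, and the pilot programs differ widely. Purely as a hypothetical illustration of the mechanism he describes (recorded behaviors rolled up into one number that gates rewards and punishments), a sketch in which every field, weight, and threshold is invented:

    # Hypothetical illustration only; weights, fields, and thresholds are invented.
    WEIGHTS = {"paid_bills_on_time": +10, "charity_donation": +8,
               "jaywalking_citation": -5, "missed_loan_payment": -20}

    def social_credit_score(events, base=1000):
        return base + sum(WEIGHTS.get(e, 0) for e in events)

    score = social_credit_score(
        ["paid_bills_on_time", "jaywalking_citation", "missed_loan_payment"])
    if score < 990:
        print(score, "-> sanction, e.g. a travel ban")
    else:
        print(score, "-> reward, e.g. discounted bus fare")
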
344
00:19:06,130 --> 00:19:07,400
NARRATOR: A high social-credit number
345
00:19:07,400 --> 00:19:11,100
can be rewarded with discounts on bus fares.
346
00:19:11,100 --> 00:19:15,800
A low number can lead to a travel ban.
347
00:19:15,800 --> 00:19:18,430
Some say it's very popular with a Chinese public
348
00:19:18,430 --> 00:19:21,500
that wants to punish bad behavior.
349
00:19:21,500 --> 00:19:25,070
Others see a future that rewards party loyalty
350
00:19:25,070 --> 00:19:28,570
and silences criticism.
351
00:19:28,570 --> 00:19:32,970
Right now, there is no final system being implemented.
352
00:19:32,970 --> 00:19:41,070
And from those experiments, we already see the possibility
353
00:19:41,070 --> 00:19:44,400
of what this social-credit system can do to individuals.
354
00:19:44,400 --> 00:19:48,400
It's very powerful--Orwellian-like--
355
00:19:48,400 --> 00:19:56,170
and it's extremely troublesome in terms of civil liberty.
356
00:19:56,170 --> 00:19:58,270
NARRATOR: Every evening in Shanghai,
357
00:19:58,270 --> 00:20:01,270
ever-present cameras record the crowds
358
00:20:01,270 --> 00:20:03,430
as they surge down to the Bund,
359
00:20:03,430 --> 00:20:07,530
the promenade along the banks of the Huangpu River.
360
00:20:07,530 --> 00:20:10,830
Once the great trading houses of Europe came here to do business
361
00:20:10,830 --> 00:20:12,570
with the Middle Kingdom.
362
00:20:12,570 --> 00:20:15,600
In the last century, they were all shut down
363
00:20:15,600 --> 00:20:18,200
by Mao's revolution.
364
00:20:18,200 --> 00:20:20,530
But now, in the age of A.I.,
365
00:20:20,530 --> 00:20:22,670
people come here to take in a spectacle
366
00:20:22,670 --> 00:20:26,100
that reflects China's remarkable progress.
367
00:20:26,100 --> 00:20:28,630
(spectators gasp)
368
00:20:28,630 --> 00:20:32,070
And illuminates the great political paradox of capitalism
369
00:20:32,070 --> 00:20:37,200
taken root in the communist state.
370
00:20:37,200 --> 00:20:40,800
People have called it market Leninism,
371
00:20:40,800 --> 00:20:43,570
authoritarian capitalism.
372
00:20:43,570 --> 00:20:46,970
We are watching a kind of a Petri dish
373
00:20:46,970 --> 00:20:54,270
in which an experiment of, you know, extraordinary importance
374
00:20:54,270 --> 00:20:55,970
to the world is being carried out.
375
00:20:55,970 --> 00:20:59,030
Whether you can combine these things
376
00:20:59,030 --> 00:21:02,170
and get something that's more powerful,
377
00:21:02,170 --> 00:21:04,870
that's coherent, that's durable in the world.
378
00:21:04,870 --> 00:21:07,600
Whether you can bring together a one-party state
379
00:21:07,600 --> 00:21:12,370
with an innovative sector, both economically
380
00:21:12,370 --> 00:21:14,400
and technologically innovative,
381
00:21:14,400 --> 00:21:20,700
and that's something we thought could not coexist.
382
00:21:20,700 --> 00:21:23,070
NARRATOR: As China reinvents itself,
383
00:21:23,070 --> 00:21:25,170
it has set its sights on leading the world
384
00:21:25,170 --> 00:21:29,170
in artificial intelligence by 2030.
385
00:21:29,170 --> 00:21:32,230
But that means taking on the world's most innovative
386
00:21:32,230 --> 00:21:34,100
A.I. culture.
387
00:21:34,100 --> 00:21:37,600
♪ ♪
388
00:21:46,900 --> 00:21:49,930
On an interstate in the U.S. Southwest,
389
00:21:49,930 --> 00:21:52,830
artificial intelligence is at work solving the problem
390
00:21:52,830 --> 00:21:56,030
that's become emblematic of the new age,
391
00:21:56,030 --> 00:21:58,800
replacing a human driver.
392
00:21:58,800 --> 00:22:04,370
♪ ♪
393
00:22:04,370 --> 00:22:08,730
This is the company's C.E.O., 24-year-old Alex Rodrigues.
394
00:22:11,630 --> 00:22:13,800
The more things we build successfully,
395
00:22:13,800 --> 00:22:15,970
the less people ask questions
396
00:22:15,970 --> 00:22:18,870
about how old you are when you have working trucks.
397
00:22:18,870 --> 00:22:21,800
NARRATOR: And this is what he's built.
398
00:22:21,800 --> 00:22:24,670
Commercial goods are being driven from California
399
00:22:24,670 --> 00:22:29,170
to Arizona on Interstate 10.
400
00:22:29,170 --> 00:22:34,270
There is a driver in the cab, but he's not driving.
401
00:22:34,270 --> 00:22:40,630
It's a path set by a C.E.O. with an unusual CV.
402
00:22:40,630 --> 00:22:42,930
Are we ready, Henry?
403
00:22:42,930 --> 00:22:47,730
The aim is to score these pucks into the scoring area.
404
00:22:47,730 --> 00:22:51,400
So I, I did competitive robotics starting when I was 11,
405
00:22:51,400 --> 00:22:53,130
and I took it very, very seriously.
406
00:22:53,130 --> 00:22:55,900
To, to give you a sense, I won the Robotics World Championships
407
00:22:55,900 --> 00:22:57,830
for the first time when I was 13.
408
00:22:57,830 --> 00:22:59,430
I've been to worlds seven times
409
00:22:59,430 --> 00:23:02,200
between the ages of 13 and 20-ish.
410
00:23:02,200 --> 00:23:04,330
I eventually founded a team,
411
00:23:04,330 --> 00:23:07,000
did a lot of work at a very high competitive level.
412
00:23:07,000 --> 00:23:08,470
Things looking pretty good.
413
00:23:08,470 --> 00:23:10,930
NARRATOR: This was a prototype of sorts,
414
00:23:10,930 --> 00:23:15,130
from which he has built his multi-million-dollar company.
415
00:23:15,130 --> 00:23:18,100
I hadn't built a robot in a while, wanted to get back to it,
416
00:23:18,100 --> 00:23:21,030
and felt that this was by far the most exciting piece
417
00:23:21,030 --> 00:23:22,930
of robotics technology that was up and coming.
418
00:23:22,930 --> 00:23:25,170
A lot of people told us we wouldn't be able to build it.
419
00:23:25,170 --> 00:23:28,570
But I knew roughly the techniques that you would use.
420
00:23:28,570 --> 00:23:30,370
And I was pretty confident that if you put them together,
421
00:23:30,370 --> 00:23:32,330
you would get something that worked.
422
00:23:32,330 --> 00:23:35,900
Took the summer off, built in my parents' garage a golf cart
423
00:23:35,900 --> 00:23:40,470
that could drive itself.
424
00:23:40,470 --> 00:23:42,430
NARRATOR: That golf cart got the attention
425
00:23:42,430 --> 00:23:45,400
of Silicon Valley, and the first of several rounds
426
00:23:45,400 --> 00:23:47,570
of venture capital.
427
00:23:47,570 --> 00:23:50,670
He formed a team and then decided the business opportunity
428
00:23:50,670 --> 00:23:53,700
was in self-driving trucks.
429
00:23:53,700 --> 00:23:56,470
He says there's also a human benefit.
430
00:23:56,470 --> 00:23:58,630
If we can build a truck that's ten times safer
431
00:23:58,630 --> 00:24:02,770
than a human driver, then not much else actually matters.
432
00:24:02,770 --> 00:24:05,770
When we talk to regulators, especially,
433
00:24:05,770 --> 00:24:08,930
everyone agrees that the only way that we're going to get
434
00:24:08,930 --> 00:24:11,770
to zero highway deaths, which is everyone's objective,
435
00:24:11,770 --> 00:24:13,800
is to use self-driving.
436
00:24:13,800 --> 00:24:17,030
And so, I'm sure you've heard the statistic,
437
00:24:17,030 --> 00:24:19,230
more than 90% of all crashes
438
00:24:19,230 --> 00:24:20,870
have a human driver as the cause.
439
00:24:20,870 --> 00:24:24,230
So if you want to solve traffic fatalities,
440
00:24:24,230 --> 00:24:28,170
which, in my opinion, are the single biggest tragedy
441
00:24:28,170 --> 00:24:30,970
that happens year after year in the United States,
442
00:24:30,970 --> 00:24:33,800
this is the only solution.
443
00:24:33,800 --> 00:24:36,230
NARRATOR: It's an ambitious goal,
444
00:24:36,230 --> 00:24:38,430
but only possible because of the recent breakthroughs
445
00:24:38,430 --> 00:24:40,170
in deep learning.
446
00:24:40,170 --> 00:24:42,300
Artificial intelligence is one of those key pieces
447
00:24:42,300 --> 00:24:46,530
that has made it possible now to do driverless vehicles
448
00:24:46,530 --> 00:24:49,070
where it wasn't possible ten years ago,
449
00:24:49,070 --> 00:24:53,870
particularly in the ability to see and understand scenes.
450
00:24:53,870 --> 00:24:57,130
A lot of people don't know this, but it's remarkably hard
451
00:24:57,130 --> 00:24:58,870
for computers, until very, very recently,
452
00:24:58,870 --> 00:25:02,800
to do even the most basic visual tasks,
453
00:25:02,800 --> 00:25:04,530
like seeing a picture of a person
454
00:25:04,530 --> 00:25:06,070
and knowing that it's a person.
455
00:25:06,070 --> 00:25:09,270
And we've made gigantic strides with artificial intelligence
456
00:25:09,270 --> 00:25:11,530
in being able to see and understand tasks,
457
00:25:11,530 --> 00:25:14,000
and that's obviously fundamental to being able to understand
458
00:25:14,000 --> 00:25:15,770
the world around you with the sensors that,
459
00:25:15,770 --> 00:25:19,870
that you have available.
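
"Seeing" in this sense is built up from very small operations. The sketch below slides a hand-written 3x3 edge filter over a synthetic image, the most basic of the visual tasks he describes; a driving network learns millions of such filters from labeled examples instead of being handed them:

    import numpy as np

    img = np.zeros((16, 16))            # a synthetic "image": dark background...
    img[4:12, 4:12] = 1.0               # ...with one bright square

    k = np.array([[-1, 0, 1],           # hand-written vertical-edge (Sobel) kernel;
                  [-2, 0, 2],           # deep nets learn their kernels from data
                  [-1, 0, 1]], dtype=float)

    edges = np.zeros((14, 14))
    for i in range(14):
        for j in range(14):
            edges[i, j] = np.sum(img[i:i+3, j:j+3] * k)   # one convolution step

    print(np.argwhere(np.abs(edges) > 2)[:4])   # strong responses sit on the square's sides
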
460
00:25:19,870 --> 00:25:21,270
NARRATOR: That's now possible
461
00:25:21,270 --> 00:25:23,970
because of the algorithms written by Yoshua Bengio
462
00:25:23,970 --> 00:25:28,070
and a small group of scientists.
463
00:25:28,070 --> 00:25:30,630
There are many aspects of the world
464
00:25:30,630 --> 00:25:34,200
which we can't explain with words.
465
00:25:34,200 --> 00:25:36,400
And that part of our knowledge is actually
466
00:25:36,400 --> 00:25:39,170
probably the majority of it.
467
00:25:39,170 --> 00:25:41,400
So, like, the stuff we can communicate verbally
468
00:25:41,400 --> 00:25:43,370
is the tip of the iceberg.
469
00:25:43,370 --> 00:25:48,600
And so to get at the bottom of the iceberg, the solution was,
470
00:25:48,600 --> 00:25:53,000
the computers have to acquire that knowledge by themselves
471
00:25:53,000 --> 00:25:54,500
from data, from examples.
472
00:25:54,500 --> 00:25:58,400
Just like children learn, most not from their teachers,
473
00:25:58,400 --> 00:26:01,370
but from interacting with the world,
474
00:26:01,370 --> 00:26:03,500
and playing around, and, and trying things
475
00:26:03,500 --> 00:26:05,570
and seeing what works and what doesn't work.
476
00:26:05,570 --> 00:26:07,870
NARRATOR: This is an early demonstration.
477
00:26:07,870 --> 00:26:12,470
In 2013, DeepMind scientists set a machine-learning program
478
00:26:12,470 --> 00:26:16,070
on the Atari video game Breakout.
479
00:26:16,070 --> 00:26:19,870
The computer was only told the goal-- to win the game.
480
00:26:19,870 --> 00:26:24,230
After 100 games, it learned to use the bat at the bottom
481
00:26:24,230 --> 00:26:27,770
to hit the ball and break the bricks at the top.
482
00:26:27,770 --> 00:26:33,030
After 300, it could do that better than a human player.
483
00:26:33,030 --> 00:26:37,970
After 500 games, it came up with a creative way to win the game--
484
00:26:37,970 --> 00:26:40,730
by digging a tunnel on the side
485
00:26:40,730 --> 00:26:42,270
and sending the ball around the top
486
00:26:42,270 --> 00:26:44,730
to break many bricks with one hit.
487
00:26:44,730 --> 00:26:48,170
That was deep learning.
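
The Breakout demo is reinforcement learning: the program receives only the score and discovers tactics by trial and error. A minimal tabular Q-learning sketch on a toy corridor stands in for Atari below; DeepMind's agent replaced this lookup table with a deep network, but the learning rule is the same family:

    import random

    # Corridor of 6 cells; start at cell 0, reward only for reaching cell 5.
    N, ALPHA, GAMMA, EPS = 6, 0.5, 0.9, 0.2
    Q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}   # value of stepping left/right

    random.seed(1)
    for _ in range(500):                                   # 500 practice "games"
        s = 0
        while s != N - 1:
            a = random.choice((-1, 1)) if random.random() < EPS \
                else max((-1, 1), key=lambda m: Q[(s, m)])
            s2 = min(max(s + a, 0), N - 1)
            r = 1.0 if s2 == N - 1 else 0.0                # told only the goal
            Q[(s, a)] += ALPHA * (r + GAMMA * max(Q[(s2, -1)], Q[(s2, 1)]) - Q[(s, a)])
            s = s2

    print([max((-1, 1), key=lambda m: Q[(s, m)]) for s in range(N - 1)])  # all 1: head right
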
488
00:26:48,170 --> 00:26:50,630
That's the A.I. program based on learning,
489
00:26:50,630 --> 00:26:52,430
really, that has been so successful
490
00:26:52,430 --> 00:26:54,870
in the last few years and has...
491
00:26:54,870 --> 00:26:57,430
It wasn't clear ten years ago that it would work,
492
00:26:57,430 --> 00:27:00,600
but it has completely changed the map
493
00:27:00,600 --> 00:27:06,570
and is now used in almost every sector of society.
494
00:27:06,570 --> 00:27:08,970
Even the best and brightest among us,
495
00:27:08,970 --> 00:27:11,000
we just don't have enough compute power
496
00:27:11,000 --> 00:27:13,530
inside of our heads.
497
00:27:13,530 --> 00:27:16,000
NARRATOR: Amy Webb is a professor at N.Y.U.
498
00:27:16,000 --> 00:27:19,970
and founder of the Future Today Institute.
499
00:27:19,970 --> 00:27:26,270
As A.I. progresses, the great promise is that they...
500
00:27:26,270 --> 00:27:30,700
they, these, these machines, alongside of us,
501
00:27:30,700 --> 00:27:34,330
are able to think and imagine and see things
502
00:27:34,330 --> 00:27:36,730
in ways that we never have before,
503
00:27:36,730 --> 00:27:40,370
which means that maybe we have some kind of new,
504
00:27:40,370 --> 00:27:45,330
weird, seemingly implausible solution to climate change.
505
00:27:45,330 --> 00:27:49,530
Maybe we have some radically different approach
506
00:27:49,530 --> 00:27:52,930
to dealing with incurable cancers.
507
00:27:52,930 --> 00:27:58,330
The real practical and wonderful promise is that machines help us
508
00:27:58,330 --> 00:28:02,170
be more creative, and, using that creativity,
509
00:28:02,170 --> 00:28:06,430
we get to terrific solutions.
510
00:28:06,430 --> 00:28:09,500
NARRATOR: Solutions that could come unexpectedly
511
00:28:09,500 --> 00:28:11,770
to urgent problems.
512
00:28:11,770 --> 00:28:13,700
It's going to change the face of breast cancer.
513
00:28:13,700 --> 00:28:16,870
Right now, 40,000 women in the U.S. alone
514
00:28:16,870 --> 00:28:19,670
die from breast cancer every single year.
515
00:28:19,670 --> 00:28:21,870
NARRATOR: Dr. Connie Lehman is head
516
00:28:21,870 --> 00:28:23,230
of the breast imaging center
517
00:28:23,230 --> 00:28:26,470
at Massachusetts General Hospital in Boston.
518
00:28:26,470 --> 00:28:28,930
We've become so complacent about it,
519
00:28:28,930 --> 00:28:31,070
we almost don't think it can really be changed.
520
00:28:31,070 --> 00:28:33,470
We, we somehow think we should put all of our energy
521
00:28:33,470 --> 00:28:36,670
into chemotherapies to save women
522
00:28:36,670 --> 00:28:38,370
with metastatic breast cancer,
523
00:28:38,370 --> 00:28:41,430
and yet, you know, when we find it early, we cure it,
524
00:28:41,430 --> 00:28:44,730
and we cure it without having the ravages to the body
525
00:28:44,730 --> 00:28:46,830
when we diagnose it late.
526
00:28:46,830 --> 00:28:51,700
This shows the progression of a small, small spot from one year
527
00:28:51,700 --> 00:28:54,530
to the next, and then to the diagnosis
528
00:28:54,530 --> 00:28:57,730
of the small cancer here.
529
00:28:57,730 --> 00:28:59,830
NARRATOR: This is what happened when a woman
530
00:28:59,830 --> 00:29:02,100
who had been diagnosed with breast cancer
531
00:29:02,100 --> 00:29:04,170
started to ask questions
532
00:29:04,170 --> 00:29:07,370
about why it couldn't have been diagnosed earlier.
533
00:29:07,370 --> 00:29:10,170
It really brings a lot of anxiety,
534
00:29:10,170 --> 00:29:12,270
and you're asking the questions, you know,
535
00:29:12,270 --> 00:29:13,530
"Am I going to survive?
536
00:29:13,530 --> 00:29:15,200
What's going to happen to my son?"
537
00:29:15,200 --> 00:29:19,230
And I start asking other questions.
538
00:29:19,230 --> 00:29:21,700
NARRATOR: She was used to asking questions.
539
00:29:21,700 --> 00:29:24,970
At M.I.T.'s artificial-intelligence lab,
540
00:29:24,970 --> 00:29:27,970
Professor Regina Barzilay uses deep learning
541
00:29:27,970 --> 00:29:31,100
to teach the computer to understand language,
542
00:29:31,100 --> 00:29:34,270
as well as read text and data.
543
00:29:34,270 --> 00:29:37,600
I was really surprised that the very basic question
544
00:29:37,600 --> 00:29:39,970
that I ask my physicians,
545
00:29:39,970 --> 00:29:43,330
who were really excellent physicians here at MGH,
546
00:29:43,330 --> 00:29:47,030
they couldn't give me answers that I was looking for.
547
00:29:47,030 --> 00:29:50,770
NARRATOR: She was convinced that if you analyze enough data,
548
00:29:50,770 --> 00:29:53,530
from mammograms to diagnostic notes,
549
00:29:53,530 --> 00:29:56,800
the computer could predict early-stage conditions.
550
00:29:56,800 --> 00:30:02,830
If we fast-forward from 2012 to '13 to 2014,
551
00:30:02,830 --> 00:30:05,900
we then see when Regina was diagnosed,
552
00:30:05,900 --> 00:30:10,170
because of this spot on her mammogram.
553
00:30:10,170 --> 00:30:14,830
Is it possible, with more elegant computer applications,
554
00:30:14,830 --> 00:30:19,100
that we might have identified this spot the year before,
555
00:30:19,100 --> 00:30:21,200
or even back here?
556
00:30:21,200 --> 00:30:22,830
So, those are standard prediction problems
557
00:30:22,830 --> 00:30:26,870
in machine learning-- there is nothing special about them.
558
00:30:26,870 --> 00:30:29,900
And to my big surprise, none of the technologies
559
00:30:29,900 --> 00:30:33,130
that we are developing at M.I.T.,
560
00:30:33,130 --> 00:30:38,730
even in the most simple form, penetrates the hospital.
561
00:30:38,730 --> 00:30:41,670
NARRATOR: Regina and Connie began the slow process
562
00:30:41,670 --> 00:30:45,070
of getting access to thousands of mammograms and records
563
00:30:45,070 --> 00:30:46,770
from MGH's breast-imaging program.
564
00:30:49,470 --> 00:30:53,130
So, our first foray was just to take all of the patients
565
00:30:53,130 --> 00:30:56,130
we had at MGH during a period of time,
566
00:30:56,130 --> 00:30:58,530
who had had breast surgery for a certain type
567
00:30:58,530 --> 00:31:00,430
of high-risk lesion.
568
00:31:00,430 --> 00:31:03,700
And we found that most of them didn't really need the surgery.
569
00:31:03,700 --> 00:31:05,070
They didn't have cancer.
570
00:31:05,070 --> 00:31:07,670
But about ten percent did have cancer.
571
00:31:07,670 --> 00:31:10,570
With Regina's techniques in deep learning
572
00:31:10,570 --> 00:31:13,370
and machine learning, we were able to predict the women
573
00:31:13,370 --> 00:31:15,600
that truly needed the surgery and separate out
574
00:31:15,600 --> 00:31:19,470
those that really could avoid the unnecessary surgery.
575
00:31:19,470 --> 00:31:23,030
What the machine can do, it can take hundreds of thousands
576
00:31:23,030 --> 00:31:25,730
of images where the outcome is known
577
00:31:25,730 --> 00:31:30,700
and learn, based on how, you know, pixels are distributed,
578
00:31:30,700 --> 00:31:35,170
what are the very unique patterns that correlate highly
579
00:31:35,170 --> 00:31:38,370
with future occurrence of the disease.
580
00:31:38,370 --> 00:31:40,900
So, instead of using human capacity
581
00:31:40,900 --> 00:31:44,770
to kind of recognize pattern, formalize pattern--
582
00:31:44,770 --> 00:31:48,700
which is inherently limited by our cognitive capacity
583
00:31:48,700 --> 00:31:50,800
and how much we can see and remember--
584
00:31:50,800 --> 00:31:53,700
we're providing the machine with a lot of data
585
00:31:53,700 --> 00:31:57,630
and making it learn this prediction.
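
Neither the MGH data nor the model itself is public; in machine-learning terms, though, the setup she describes is standard supervised learning: images whose outcomes are known go in, a learned risk estimate comes out. A toy sketch with synthetic stand-in "images" and scikit-learn, not the clinical system:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    X = rng.normal(size=(n, 16 * 16))   # stand-ins for mammogram pixels
    y = rng.integers(0, 2, n)           # outcome, known in hindsight
    X[y == 1, :40] += 0.25              # faint pixel pattern correlated with disease

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(model.score(X_te, y_te))      # clearly above 0.5: the pattern was found, not programmed
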
586
00:31:57,630 --> 00:32:02,370
So, we are using technology not only to be better
587
00:32:02,370 --> 00:32:04,770
at assessing the breast density,
588
00:32:04,770 --> 00:32:07,200
but to get more to the point of what we're trying to predict.
589
00:32:07,200 --> 00:32:10,930
"Does this woman havea cancer now,
590
00:32:10,930 --> 00:32:13,170
and will she develop a cancer in five years?"
591
00:32:13,170 --> 00:32:16,770
And that's, again, where the artificial intelligence,
592
00:32:16,770 --> 00:32:18,700
machine and deep learning can really help us
593
00:32:18,700 --> 00:32:20,770
and our patients.
594
00:32:20,770 --> 00:32:22,830
NARRATOR: In the age of A.I.,
595
00:32:22,830 --> 00:32:26,330
the algorithms are transporting us into a universe
596
00:32:26,330 --> 00:32:29,970
of vast potential and transforming almost every aspect
597
00:32:29,970 --> 00:32:34,200
of human endeavor and experience.
598
00:32:34,200 --> 00:32:38,000
Andrew McAfee is a research scientist at M.I.T.
599
00:32:38,000 --> 00:32:42,000
who co-authored "The Second Machine Age."
600
00:32:42,000 --> 00:32:45,070
The great compliment that a songwriter gives another one is,
601
00:32:45,070 --> 00:32:46,600
"Gosh, I wish I had writtenthat one."
602
00:32:46,600 --> 00:32:49,100
The great compliment a geek gives another one is,
603
00:32:49,100 --> 00:32:50,900
"Wow, I wish I had drawnthat graph."
604
00:32:50,900 --> 00:32:53,630
So, I wish I had drawn this graph.
605
00:32:53,630 --> 00:32:55,500
NARRATOR: The graph uses a formula
606
00:32:55,500 --> 00:32:59,400
to show human development and growth since 2000 BCE.
607
00:32:59,400 --> 00:33:01,570
The state of human civilization
608
00:33:01,570 --> 00:33:04,970
is not very advanced, and it's not getting better
609
00:33:04,970 --> 00:33:07,130
very quickly at all, and this is true for thousands
610
00:33:07,130 --> 00:33:08,970
and thousands of years.
611
00:33:08,970 --> 00:33:12,470
When we, when we formed empires and empires got overturned,
612
00:33:12,470 --> 00:33:16,530
when we tried democracy, when we invented zero
613
00:33:16,530 --> 00:33:19,630
and mathematics and fundamental discoveries about the universe,
614
00:33:19,630 --> 00:33:21,400
big deal.
615
00:33:21,400 --> 00:33:23,300
It just, the numbers don't change very much.
616
00:33:23,300 --> 00:33:26,900
What's weird is that the numbers change essentially in the blink
617
00:33:26,900 --> 00:33:28,370
of an eye at one point in time.
618
00:33:28,370 --> 00:33:32,030
And it goes from really horizontal, unchanging,
619
00:33:32,030 --> 00:33:36,600
uninteresting, to, holy Toledo, crazy vertical.
620
00:33:36,600 --> 00:33:39,200
And then the question is, what on Earth happened
621
00:33:39,200 --> 00:33:40,570
to cause that change?
622
00:33:40,570 --> 00:33:42,770
And the answer is the Industrial Revolution.
623
00:33:42,770 --> 00:33:44,800
There were other things that happened,
624
00:33:44,800 --> 00:33:46,830
but really what fundamentally happened is
625
00:33:46,830 --> 00:33:49,530
we overcame the limitations of our muscle power.
626
00:33:49,530 --> 00:33:52,400
Something equally interesting is happening right now.
627
00:33:52,400 --> 00:33:55,330
We are overcoming the limitations of our minds.
628
00:33:55,330 --> 00:33:56,930
We're not getting rid of them,
629
00:33:56,930 --> 00:33:58,970
we're not making them unnecessary,
630
00:33:58,970 --> 00:34:02,500
but, holy cow, can we leverage them and amplify them now.
631
00:34:02,500 --> 00:34:04,170
You have to be a huge pessimist
632
00:34:04,170 --> 00:34:06,730
not to find that profoundly good news.
633
00:34:06,730 --> 00:34:09,370
I really do think the world has entered a new era.
634
00:34:09,370 --> 00:34:12,830
Artificial intelligence holds so much promise,
635
00:34:12,830 --> 00:34:15,730
but it's going to reshape every aspect of the economy,
636
00:34:15,730 --> 00:34:17,370
so many aspects of our lives.
637
00:34:17,370 --> 00:34:20,770
Because A.I. is a little bit like electricity.
638
00:34:20,770 --> 00:34:22,670
Everybody's going to use it.
639
00:34:22,669 --> 00:34:26,399
Every company is going to be incorporating A.I.,
640
00:34:26,399 --> 00:34:28,299
integrating it into what they do,
641
00:34:28,300 --> 00:34:29,630
governments are going to be using it,
642
00:34:29,629 --> 00:34:33,599
nonprofit organizations are going to be using it.
643
00:34:33,600 --> 00:34:37,200
It's going to create all kinds of benefits
644
00:34:37,200 --> 00:34:41,070
in ways large and small, and challenges for us, as well.
645
00:34:41,070 --> 00:34:44,730
NARRATOR: The challenges, the benefits--
646
00:34:44,730 --> 00:34:47,000
the autonomous truck represents both
647
00:34:47,000 --> 00:34:50,070
as it maneuvers into the marketplace.
648
00:34:50,070 --> 00:34:53,070
The engineers are confident that, in spite of questions
649
00:34:53,070 --> 00:34:55,370
about when this will happen,
650
00:34:55,370 --> 00:34:57,330
they can get it working safely sooner
651
00:34:57,330 --> 00:34:58,770
than most people realize.
652
00:34:58,770 --> 00:35:02,130
I think that you will see the first vehicles operating
653
00:35:02,130 --> 00:35:05,570
with no one inside them moving freight in the next few years,
654
00:35:05,570 --> 00:35:07,700
and then you're going to see that expanding to more freight,
655
00:35:07,700 --> 00:35:11,030
more geographies, more weather over time as,
656
00:35:11,030 --> 00:35:12,530
as that capability builds up.
657
00:35:12,530 --> 00:35:16,600
We're talking, like, less than half a decade.
658
00:35:16,600 --> 00:35:19,370
NARRATOR: He already has a Fortune 500 company
659
00:35:19,370 --> 00:35:23,830
as a client, shipping appliances across the Southwest.
660
00:35:23,830 --> 00:35:27,330
He says the sales pitch is straightforward.
661
00:35:27,330 --> 00:35:30,070
They spend hundreds of millions of dollars a year
662
00:35:30,070 --> 00:35:31,670
shipping parts around the country.
663
00:35:31,670 --> 00:35:34,100
We can cut that cost in half.
664
00:35:34,100 --> 00:35:36,930
And they're really excited to be able to start working with us,
665
00:35:36,930 --> 00:35:39,800
both because of the potential,
666
00:35:39,800 --> 00:35:42,100
the potential savings from deploying self-driving,
667
00:35:42,100 --> 00:35:44,470
and also because of all the operational efficiencies
668
00:35:44,470 --> 00:35:47,830
that they see, the biggest one being able to operate
669
00:35:47,830 --> 00:35:49,800
24 hours a day.
670
00:35:49,800 --> 00:35:51,970
So, right now, human drivers are limited to 11 hours
671
00:35:51,970 --> 00:35:55,470
by federal law, and a driverless truck
672
00:35:55,470 --> 00:35:57,000
obviously wouldn't have that limitation.
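
That "24 hours a day" figure is what makes the earlier cost-cutting pitch concrete: removing the 11-hour federal driving limit roughly doubles how much freight one truck can move per day, before even counting the driver's wages. As arithmetic:

    human_hours = 11          # federal hours-of-service limit cited above
    driverless_hours = 24
    print(driverless_hours / human_hours)   # ~2.18x the daily driving time
    # Roughly half as many trucks for the same freight, plus no driver wages:
    # that is the shape of the cost-in-half claim.
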
673
00:35:57,000 --> 00:36:02,530
♪ ♪
674
00:36:02,530 --> 00:36:05,330
NARRATOR: The idea of a driverless truck comes up often
675
00:36:05,330 --> 00:36:11,430
in discussions about artificial intelligence.
676
00:36:11,430 --> 00:36:14,800
Steve Viscelli is a sociologist who drove a truck
677
00:36:14,800 --> 00:36:20,330
while researching his book "The Big Rig" about the industry.
678
00:36:20,330 --> 00:36:23,000
This is one of the most remarkable stories
679
00:36:23,000 --> 00:36:25,830
in, in U.S. labor history, I think,
680
00:36:25,830 --> 00:36:30,400
is, you know, the decline of, of unionized trucking.
681
00:36:30,400 --> 00:36:33,600
The industry was deregulated in 1980,
682
00:36:33,600 --> 00:36:37,400
and at that time, you know, truck drivers were earning
683
00:36:37,400 --> 00:36:41,400
the equivalent of over $100,000 in today's dollars.
684
00:36:41,400 --> 00:36:45,500
And today the typical truck driver will earn
685
00:36:45,500 --> 00:36:50,370
a little over $40,000 a year.
686
00:36:50,370 --> 00:36:52,630
And I think it's an important part
687
00:36:52,630 --> 00:36:54,230
of the automation story, right?
688
00:36:54,230 --> 00:36:56,900
Why are they so afraid of automation?
689
00:36:56,900 --> 00:37:00,670
Because we've had four decades of rising inequality in wages.
690
00:37:00,670 --> 00:37:03,330
And if anybody is going to take it on the chin
691
00:37:03,330 --> 00:37:05,330
from automation in the trucking industry,
692
00:37:05,330 --> 00:37:07,630
the, the first in line is going to be the driver,
693
00:37:07,630 --> 00:37:12,300
without a doubt.
694
00:37:12,300 --> 00:37:14,730
NARRATOR: For his research, Viscelli tracked down truckers
695
00:37:14,730 --> 00:37:17,600
and their families, like Shawn and Hope Cumbee
696
00:37:17,600 --> 00:37:19,530
of Beaverton, Michigan. Hi.
697
00:37:19,530 --> 00:37:20,870
Hey, Hope, I'm Steve Viscelli.
698
00:37:20,870 --> 00:37:21,870
Hi, Steve, nice to meet you. Come on in.
699
00:37:21,870 --> 00:37:24,800
Great to meet you, too, thanks.
700
00:37:24,800 --> 00:37:26,430
NARRATOR: And their son Charlie.
701
00:37:26,430 --> 00:37:31,730
This is Daddy, me, Daddy, and Mommy.
702
00:37:31,730 --> 00:37:34,230
NARRATOR: But Daddy's not here.
703
00:37:34,230 --> 00:37:38,900
Shawn Cumbee's truck has broken down in Tennessee.
704
00:37:38,900 --> 00:37:43,470
Hope, who drove a truck herself, knows the business well.
705
00:37:43,470 --> 00:37:46,870
We made $150,000, right, in a year.
706
00:37:46,870 --> 00:37:48,070
That sounds great, right?
707
00:37:48,070 --> 00:37:50,400
That's, like, good money.
708
00:37:50,400 --> 00:37:53,870
We paid $100,000 in fuel, okay?
709
00:37:53,870 --> 00:37:57,030
So, right there, now I made $50,000.
710
00:37:57,030 --> 00:37:59,030
But I didn't really, because, you know,
711
00:37:59,030 --> 00:38:00,600
you get an oil change every month,
712
00:38:00,600 --> 00:38:02,200
so that's $300 a month.
713
00:38:02,200 --> 00:38:04,170
You still have to do all the maintenance.
714
00:38:04,170 --> 00:38:06,500
We had a motor blow out, right?
715
00:38:06,500 --> 00:38:09,170
$13,000. Right?
716
00:38:09,170 --> 00:38:11,800
I know, I mean, I choke up a little just thinking about it,
717
00:38:11,800 --> 00:38:13,770
because it was...
718
00:38:13,770 --> 00:38:17,470
And it was 13,000, and we were off work for two weeks.
719
00:38:17,470 --> 00:38:19,670
So, by the end of the year, with that $150,000,
720
00:38:19,670 --> 00:38:22,670
by the end of the year, we'd made about 20...
721
00:38:22,670 --> 00:38:26,030
About $22,000.
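
Restating Hope Cumbee's arithmetic with the figures she gives; the gap between the itemized costs and the roughly $22,000 she lands on is the rest of the maintenance and the two unpaid weeks she mentions but doesn't itemize:

    revenue = 150_000
    fuel = 100_000
    oil_changes = 300 * 12          # "$300 a month"
    blown_motor = 13_000

    after_itemized = revenue - fuel - oil_changes - blown_motor
    print(after_itemized)           # 33400, before the costs she doesn't itemize
    print(after_itemized - 22_000)  # ~11400 absorbed by other maintenance and downtime
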
722
00:38:26,030 --> 00:38:28,400
NARRATOR: In a truck stop in Tennessee,
723
00:38:28,400 --> 00:38:31,500
Shawn has been sidelined waiting for a new part.
724
00:38:31,500 --> 00:38:35,300
The garage owner is letting him stay in the truck to save money.
725
00:38:37,870 --> 00:38:39,770
Hi, baby.
726
00:38:39,770 --> 00:38:41,330
(on phone): Hey, how's it going?
727
00:38:41,330 --> 00:38:42,730
It's going. Chunky-butt!
728
00:38:42,730 --> 00:38:44,600
Hi, Daddy! Hi, Chunky-butt.
729
00:38:44,600 --> 00:38:47,300
What're you doing? (talking inaudibly)
730
00:38:47,300 --> 00:38:49,600
Believe it or not, I do it because I love it.
731
00:38:49,600 --> 00:38:51,330
I mean, you know, it's in the blood.
732
00:38:51,330 --> 00:38:52,900
Third-generation driver.
733
00:38:52,900 --> 00:38:55,230
And my granddaddy told me a long time ago,
734
00:38:55,230 --> 00:38:58,630
when I was probably 11, 12 years old, probably,
735
00:38:58,630 --> 00:39:01,500
he said, "The world meets nobodyhalfway.
736
00:39:01,500 --> 00:39:02,930
Nobody."
737
00:39:02,930 --> 00:39:07,030
He said, "If you want it,you have to earn it."
738
00:39:07,030 --> 00:39:09,870
And that's what I do every day.
739
00:39:09,870 --> 00:39:11,330
I live by that creed.
740
00:39:11,330 --> 00:39:16,100
And I've lived by that since it was told to me.
741
00:39:16,100 --> 00:39:18,300
So, if you're down for a week in a truck,
742
00:39:18,300 --> 00:39:19,870
you still have to pay your bills.
743
00:39:19,870 --> 00:39:22,100
I have enough money in my checking account at all times
744
00:39:22,100 --> 00:39:23,470
to pay a month's worth of bills.
745
00:39:23,470 --> 00:39:25,070
That does not include my food.
746
00:39:25,070 --> 00:39:27,630
That doesn't include field trips for my son's school.
747
00:39:27,630 --> 00:39:31,700
My son and I just went to our yearly doctor appointment.
748
00:39:31,700 --> 00:39:36,270
I took, I took money out of my son's piggy bank to pay for it,
749
00:39:36,270 --> 00:39:40,600
because it's not... it's not scheduled in.
750
00:39:40,600 --> 00:39:43,430
It's, it's not something that you can, you know, afford.
751
00:39:43,430 --> 00:39:45,500
I mean, like, when...
752
00:39:45,500 --> 00:39:46,900
(sighs): Sorry.
753
00:39:46,900 --> 00:39:48,970
It's okay.
754
00:39:48,970 --> 00:39:52,600
♪ ♪
755
00:39:57,230 --> 00:39:59,170
Have you guys ever talked about self-driving trucks?
756
00:39:59,170 --> 00:40:00,500
Is he...
757
00:40:00,500 --> 00:40:03,130
(laughing): So, kind of.
758
00:40:03,130 --> 00:40:05,830
Um, I asked him once, you know.
759
00:40:05,830 --> 00:40:07,230
And he laughed so hard.
760
00:40:07,230 --> 00:40:10,330
He said, "No way will theyever have a truck
761
00:40:10,330 --> 00:40:12,970
that can drive itself."
762
00:40:12,970 --> 00:40:15,230
It's kind of interesting when you think about it, you know,
763
00:40:15,230 --> 00:40:17,730
they're putting all this new technology into things,
764
00:40:17,730 --> 00:40:19,570
but, you know, it's still man-made.
765
00:40:19,570 --> 00:40:22,970
And man, you know, does make mistakes.
766
00:40:22,970 --> 00:40:26,170
I really don't see it being a problem with the industry,
767
00:40:26,170 --> 00:40:28,770
'cause, one, you still got to have a driver in it,
768
00:40:28,770 --> 00:40:30,330
because I don't see it doing city.
769
00:40:30,330 --> 00:40:32,600
I don't see it doing, you know, main things.
770
00:40:32,600 --> 00:40:34,700
I don't see it backing into a dock.
771
00:40:34,700 --> 00:40:37,870
I don't see the automation part, you know, doing...
772
00:40:37,870 --> 00:40:39,900
maybe the box-trailer side, you know, I can see that,
773
00:40:39,900 --> 00:40:41,400
but not stuff like I do.
774
00:40:41,400 --> 00:40:44,830
So, I ain't really worried about the automation of trucks.
775
00:40:44,830 --> 00:40:46,230
How near of a future is it?
776
00:40:46,230 --> 00:40:49,300
Yeah, self-driving, um...
777
00:40:49,300 --> 00:40:52,600
So, some, you know, some companies are already operating.
778
00:40:52,600 --> 00:40:56,170
Embark, for instance, is one that has been doing
779
00:40:56,170 --> 00:40:59,030
driverless trucks on the interstate.
780
00:40:59,030 --> 00:41:01,930
And what's called exit-to-exit self-driving.
781
00:41:01,930 --> 00:41:04,830
And they're currently running real freight.
782
00:41:04,830 --> 00:41:07,530
Really? Yeah, on I-10.
783
00:41:07,530 --> 00:41:10,530
♪ ♪
784
00:41:10,530 --> 00:41:15,170
(on P.A.): Shower guest 100, your shower is now ready.
785
00:41:15,170 --> 00:41:18,430
NARRATOR: Over time, it has become harder and harder
786
00:41:18,430 --> 00:41:21,230
for veteran independent drivers like the Cumbees
787
00:41:21,230 --> 00:41:23,070
to make a living.
788
00:41:23,070 --> 00:41:25,070
They've been replaced by younger,
789
00:41:25,070 --> 00:41:28,200
less experienced drivers.
790
00:41:28,200 --> 00:41:32,630
So, the, the trucking industry's $740 billion a year,
791
00:41:32,630 --> 00:41:34,770
and, again, in, in many of these operations,
792
00:41:34,770 --> 00:41:37,470
labor's a third of that cost.
793
00:41:37,470 --> 00:41:40,500
By my estimate, I, you know, I think we're in the range
794
00:41:40,500 --> 00:41:42,970
of 300,000 or so jobs in the foreseeable future
795
00:41:42,970 --> 00:41:47,930
that could be automated to some significant extent.
796
00:41:47,930 --> 00:41:50,630
♪ ♪
797
00:41:50,630 --> 00:41:53,530
(groans)
798
00:41:53,530 --> 00:41:57,070
♪ ♪
799
00:42:03,000 --> 00:42:06,130
NARRATOR: The A.I. future was built with great optimism
800
00:42:06,130 --> 00:42:09,100
out here in the West.
801
00:42:09,100 --> 00:42:12,630
In 2018, many of the people who invented it
802
00:42:12,630 --> 00:42:16,170
gathered in San Francisco to celebrate the 25th anniversary
803
00:42:16,170 --> 00:42:18,700
of the industry magazine.
804
00:42:18,700 --> 00:42:22,300
Howdy, welcome to WIRED25.
805
00:42:22,300 --> 00:42:24,200
NARRATOR: It is a celebration, for sure,
806
00:42:24,200 --> 00:42:27,070
but there's also a growing sense of caution
807
00:42:27,070 --> 00:42:28,670
and even skepticism.
808
00:42:31,130 --> 00:42:33,330
We're having a really good weekend here.
809
00:42:33,330 --> 00:42:37,030
NARRATOR: Nick Thompson is editor-in-chief of "Wired."
810
00:42:37,030 --> 00:42:40,030
When it started, it was very much a magazine
811
00:42:40,030 --> 00:42:44,100
about what's coming and why you should be excited about it.
812
00:42:44,100 --> 00:42:47,730
Optimism was the defining feature of "Wired"
813
00:42:47,730 --> 00:42:49,400
for many, many years.
814
00:42:49,400 --> 00:42:53,130
Or, as our slogan used to be, "Change Is Good."
815
00:42:53,130 --> 00:42:55,070
And over time, it shifted a little bit.
816
00:42:55,070 --> 00:42:59,170
And now it's more, "We love technology,
817
00:42:59,170 --> 00:43:00,630
but let's look at some of the big issues,
818
00:43:00,630 --> 00:43:03,400
and let's look at some of them critically,
819
00:43:03,400 --> 00:43:05,730
and let's look at the way algorithms are changing
820
00:43:05,730 --> 00:43:07,930
the way we behave, for good and for ill."
821
00:43:07,930 --> 00:43:12,030
So, the whole nature of "Wired" has gone from a champion
822
00:43:12,030 --> 00:43:14,830
of technological change to more of an observer
823
00:43:14,830 --> 00:43:16,700
of technological change.
824
00:43:16,700 --> 00:43:18,570
So, um, before we start...
825
00:43:18,570 --> 00:43:20,530
NARRATOR: There are 25 speakers,
826
00:43:20,530 --> 00:43:23,700
all named as icons of the last 25 years
827
00:43:23,700 --> 00:43:25,500
of technological progress.
828
00:43:25,500 --> 00:43:27,770
So, why is Apple so secretive?
829
00:43:27,770 --> 00:43:29,470
(chuckling)
830
00:43:29,470 --> 00:43:31,630
NARRATOR: Jony Ive, who designed Apple's iPhone.
831
00:43:31,630 --> 00:43:34,300
It would be bizarre not to be.
832
00:43:34,300 --> 00:43:36,670
There's this question of, like,
833
00:43:36,670 --> 00:43:39,000
what are we doing here in this life, in this reality?
834
00:43:39,000 --> 00:43:43,170
NARRATOR: Jaron Lanier, who pioneered virtual reality.
835
00:43:43,170 --> 00:43:46,500
And Jeff Bezos, the founder of Amazon.
836
00:43:46,500 --> 00:43:47,870
Amazon was a garage startup.
837
00:43:47,870 --> 00:43:49,370
Now it's a very large company.
838
00:43:49,370 --> 00:43:50,570
Two kids in a dorm...
839
00:43:50,570 --> 00:43:52,070
NARRATOR: His message is,
840
00:43:52,070 --> 00:43:54,730
"All will be wellin the new world."
841
00:43:54,730 --> 00:43:58,470
I guess, first of all, I remain incredibly optimistic
842
00:43:58,470 --> 00:43:59,630
about technology,
843
00:43:59,630 --> 00:44:01,830
and technologies always are two-sided.
844
00:44:01,830 --> 00:44:03,230
But that's not new.
845
00:44:03,230 --> 00:44:05,400
That's always been the case.
846
00:44:05,400 --> 00:44:07,830
And, and we will figure it out.
847
00:44:07,830 --> 00:44:10,570
The last thing we would ever want to do is stop the progress
848
00:44:10,570 --> 00:44:16,630
of new technologies, even when they are dual-use.
849
00:44:16,630 --> 00:44:19,800
NARRATOR: But, says Thompson, beneath the surface,
850
00:44:19,800 --> 00:44:22,530
there's a worry most of them don't like to talk about.
851
00:44:22,530 --> 00:44:26,630
There are some people in Silicon Valley who believe that,
852
00:44:26,630 --> 00:44:29,900
"You just have to trustthe technology.
853
00:44:29,900 --> 00:44:32,870
Throughout history, there's been a complicated relationship
854
00:44:32,870 --> 00:44:34,470
between humans and machines,
855
00:44:34,470 --> 00:44:36,770
we've always worried about machines,
856
00:44:36,770 --> 00:44:38,130
and it's always been fine.
857
00:44:38,130 --> 00:44:41,000
And we don't know how A.I. will change the labor force,
858
00:44:41,000 --> 00:44:42,300
but it will be okay."
859
00:44:42,300 --> 00:44:44,070
So, that argument exists.
860
00:44:44,070 --> 00:44:45,700
There's another argument,
861
00:44:45,700 --> 00:44:48,170
which is what I think most of them believe deep down,
862
00:44:48,170 --> 00:44:51,100
which is, "This is different.
863
00:44:51,100 --> 00:44:52,930
We're going to have labor-force disruption
864
00:44:52,930 --> 00:44:55,030
like we've never seen before.
865
00:44:55,030 --> 00:44:59,370
And if that happens, will they blame us?"
866
00:44:59,370 --> 00:45:02,600
NARRATOR: There is, however, one of the WIRED25 icons
867
00:45:02,600 --> 00:45:05,800
willing to take on the issue.
868
00:45:05,800 --> 00:45:09,470
Onstage, Kai-Fu Lee dispenses with one common fear.
869
00:45:09,470 --> 00:45:11,670
Well, I think there are so many myths out there.
870
00:45:11,670 --> 00:45:14,530
I think one, one myth is that
871
00:45:14,530 --> 00:45:17,570
because A.I. is so good at a single task,
872
00:45:17,570 --> 00:45:21,600
that one day we'll wake up, and we'll all be enslaved
873
00:45:21,600 --> 00:45:24,100
or forced to plug our brains to the A.I.
874
00:45:24,100 --> 00:45:28,800
But it is nowhere close to displacing humans.
875
00:45:28,800 --> 00:45:32,130
NARRATOR: But in interviews around the event and beyond,
876
00:45:32,130 --> 00:45:37,430
he takes a decidedly contrarian position on A.I. and job loss.
877
00:45:37,430 --> 00:45:41,270
The A.I. giants want to paint the rosier picture
878
00:45:41,270 --> 00:45:43,500
because they're happily making money.
879
00:45:43,500 --> 00:45:47,330
So, I think they prefer not to talk about the negative side.
880
00:45:47,330 --> 00:45:53,070
I believe about 50% of jobs will be
881
00:45:53,070 --> 00:45:56,900
somewhat or extremely threatened by A.I.
882
00:45:56,900 --> 00:46:00,500
in the next 15 years or so.
883
00:46:00,500 --> 00:46:02,570
NARRATOR: Kai-Fu Lee also makes a great deal
884
00:46:02,570 --> 00:46:04,900
of money from A.I.
885
00:46:04,900 --> 00:46:06,800
What separates him from most of his colleagues
886
00:46:06,800 --> 00:46:09,930
is that he's frank about its downside.
887
00:46:09,930 --> 00:46:13,900
Yes, yes, we, we've made about 40 investments in A.I.
888
00:46:13,900 --> 00:46:16,930
I think, based on these 40 investments,
889
00:46:16,930 --> 00:46:20,000
most of them are not impacting human jobs.
890
00:46:20,000 --> 00:46:21,970
They're creating value, making high margins,
891
00:46:21,970 --> 00:46:24,300
inventing a new model.
892
00:46:24,300 --> 00:46:27,730
But I could list seven or eight
893
00:46:27,730 --> 00:46:32,670
that would lead to a very clear displacement of human jobs.
894
00:46:32,670 --> 00:46:34,370
NARRATOR: He says that A.I. is coming,
895
00:46:34,370 --> 00:46:36,470
whether we like it or not.
896
00:46:36,470 --> 00:46:38,300
And he wants to warn society
897
00:46:38,300 --> 00:46:41,030
about what he sees as inevitable.
898
00:46:41,030 --> 00:46:43,600
You have a view which I think is different than many others,
899
00:46:43,600 --> 00:46:48,670
which is that A.I. is not going to take blue-collar jobs
900
00:46:48,670 --> 00:46:51,230
so quickly, but is actually going to take white-collar jobs.
901
00:46:51,230 --> 00:46:53,770
Yeah. Well, both will happen.
902
00:46:53,770 --> 00:46:57,000
A.I. will be, at the same time, a replacement for blue-collar,
903
00:46:57,000 --> 00:47:00,630
white-collar jobs, and be a great symbiotic tool
904
00:47:00,630 --> 00:47:03,630
for doctors, lawyers, and you, for example.
905
00:47:03,630 --> 00:47:05,700
But the white-collar jobs are easier to take,
906
00:47:05,700 --> 00:47:10,030
because they're a pure quantitative analytical process.
907
00:47:10,030 --> 00:47:15,370
Let's say reporters, traders, telemarketing,
908
00:47:15,370 --> 00:47:17,270
telesales, customer service...
909
00:47:17,270 --> 00:47:18,730
Analysts?
910
00:47:18,730 --> 00:47:23,170
Analysts, yes, these can all be replaced just by a software.
911
00:47:23,170 --> 00:47:26,330
To do blue-collar, some of the work requires, you know,
912
00:47:26,330 --> 00:47:30,030
hand-eye coordination, things that machines are not yet
913
00:47:30,030 --> 00:47:32,300
good enough to do.
914
00:47:32,300 --> 00:47:36,400
Today, there are many people who are ringing the alarm,
915
00:47:36,400 --> 00:47:37,600
"Oh, my God, what are we goingto do?
916
00:47:37,600 --> 00:47:39,830
Half the jobs are going away."
917
00:47:39,830 --> 00:47:43,430
I believe that's true, but here's the missing fact.
918
00:47:43,430 --> 00:47:46,400
I've done the research on this, and if you go back 20, 30,
919
00:47:46,400 --> 00:47:50,930
or 40 years ago, you will find that 50% of the jobs
920
00:47:50,930 --> 00:47:54,400
that people performed back then are gone today.
921
00:47:54,400 --> 00:47:56,900
You know, where are all the telephone operators,
922
00:47:56,900 --> 00:48:00,600
bowling-pin setters, elevator operators?
923
00:48:00,600 --> 00:48:04,270
You used to have seas of secretaries in corporations
924
00:48:04,270 --> 00:48:06,070
that have now been eliminated-- travel agents.
925
00:48:06,070 --> 00:48:08,770
You can just go through field after field after field.
926
00:48:08,770 --> 00:48:12,100
That same pattern has recurred many times throughout history,
927
00:48:12,100 --> 00:48:14,230
with each new wave of automation.
928
00:48:14,230 --> 00:48:20,270
But I would argue that history is only trustable
929
00:48:20,270 --> 00:48:24,670
if it is multiple repetitions of similar events,
930
00:48:24,670 --> 00:48:28,670
not once-in-a-blue-moon occurrence.
931
00:48:28,670 --> 00:48:33,070
So, over the history of many tech inventions,
932
00:48:33,070 --> 00:48:34,770
most are small things.
933
00:48:34,770 --> 00:48:41,330
Only maybe three are at the magnitude of A.I. revolution--
934
00:48:41,330 --> 00:48:44,730
the steam, steam engine, electricity,
935
00:48:44,730 --> 00:48:46,570
and the computer revolution.
936
00:48:46,570 --> 00:48:48,970
I'd say everything else is too small.
937
00:48:48,970 --> 00:48:52,670
And the reason I think it might be something brand-new
938
00:48:52,670 --> 00:48:58,930
is that A.I. is fundamentally replacing our cognitive process
939
00:48:58,930 --> 00:49:03,670
in doing a job in its significant entirety,
940
00:49:03,670 --> 00:49:06,400
and it can do it dramatically better.
941
00:49:06,400 --> 00:49:08,570
NARRATOR: This argument about job loss
942
00:49:08,570 --> 00:49:11,470
in the age of A.I. was ignited six years ago
943
00:49:11,470 --> 00:49:15,830
amid the gargoyles and spires of Oxford University.
944
00:49:15,830 --> 00:49:19,970
Two researchers had been poring through U.S. labor statistics,
945
00:49:19,970 --> 00:49:25,270
identifying jobs that could be vulnerable to A.I. automation.
946
00:49:25,270 --> 00:49:27,300
Well, vulnerable to automation,
947
00:49:27,300 --> 00:49:30,730
in the context that we discussed five years ago now,
948
00:49:30,730 --> 00:49:34,430
essentially meant that those jobs are potentially automatable
949
00:49:34,430 --> 00:49:36,900
over an unspecified number of years.
950
00:49:36,900 --> 00:49:41,530
And the figure we came up with was 47%.
951
00:49:41,530 --> 00:49:43,330
NARRATOR: 47%.
952
00:49:43,330 --> 00:49:46,470
That number quickly traveled the world in headlines
953
00:49:46,470 --> 00:49:47,830
and news bulletins.
954
00:49:47,830 --> 00:49:51,030
But authors Carl Frey and Michael Osborne
955
00:49:51,030 --> 00:49:52,770
offered a caution.
956
00:49:52,770 --> 00:49:57,670
They can't predict how many jobs will be lost, or how quickly.
957
00:49:57,670 --> 00:50:02,430
But Frey believes that there are lessons in history.
958
00:50:02,430 --> 00:50:04,830
And what worries me the most is that there is actually
959
00:50:04,830 --> 00:50:08,830
one episode that looks quite familiar to today,
960
00:50:08,830 --> 00:50:12,270
which is the British Industrial Revolution,
961
00:50:12,270 --> 00:50:16,400
where wages didn't grow for nine decades,
962
00:50:16,400 --> 00:50:20,530
and a lot of people actually saw living standards decline
963
00:50:20,530 --> 00:50:23,870
as technology progressed.
964
00:50:23,870 --> 00:50:25,630
♪ ♪
965
00:50:25,630 --> 00:50:28,370
NARRATOR: Saginaw, Michigan, knows about decline
966
00:50:28,370 --> 00:50:31,170
in living standards.
967
00:50:31,170 --> 00:50:34,900
Harry Cripps, an auto worker and a local union president,
968
00:50:34,900 --> 00:50:40,730
has witnessed what 40 years of automation can do to a town.
969
00:50:40,730 --> 00:50:43,470
You know, we're one of the cities in the country that,
970
00:50:43,470 --> 00:50:47,170
I think we were left behind in this recovery.
971
00:50:47,170 --> 00:50:51,670
And I just... I don't know how we get on the bandwagon now.
972
00:50:54,770 --> 00:50:57,030
NARRATOR: Once, this was the U.A.W. hall
973
00:50:57,030 --> 00:50:59,230
for one local union.
974
00:50:59,230 --> 00:51:03,670
Now, with falling membership, it's shared by five locals.
975
00:51:03,670 --> 00:51:05,730
Rudy didn't get his shift.
976
00:51:05,730 --> 00:51:07,330
NARRATOR: This day, it's the center
977
00:51:07,330 --> 00:51:09,570
for a Christmas food drive.
978
00:51:09,570 --> 00:51:12,030
Even in a growth economy,
979
00:51:12,030 --> 00:51:14,830
unemployment here is near six percent.
980
00:51:14,830 --> 00:51:18,930
Poverty in Saginaw is over 30%.
981
00:51:21,830 --> 00:51:25,130
Our factory has about 1.9 million square feet.
982
00:51:25,130 --> 00:51:29,100
Back in the '70s, that 1.9 million square feet
983
00:51:29,100 --> 00:51:32,330
had about 7,500 U.A.W. automotive workers
984
00:51:32,330 --> 00:51:34,300
making middle-class wage with decent benefits
985
00:51:34,300 --> 00:51:36,770
and able to send their kids to college and do all the things
986
00:51:36,770 --> 00:51:39,000
that the middle-class family should be able to do.
987
00:51:39,000 --> 00:51:42,270
Our factory today, with automation,
988
00:51:42,270 --> 00:51:46,300
would probably be about 700 United Auto Workers.
989
00:51:46,300 --> 00:51:50,130
That's a dramatic change.
990
00:51:50,130 --> 00:51:52,230
Lot of union brothers used to work there, buddy.
991
00:51:52,230 --> 00:51:55,130
The TRW plant, that was unfortunate.
992
00:51:55,130 --> 00:51:57,830
Delphi... looks like they're starting to tear it down now.
993
00:51:57,830 --> 00:51:59,300
Wow.
994
00:51:59,300 --> 00:52:02,770
Automation is, is definitely taking away a lot of jobs.
995
00:52:02,770 --> 00:52:05,530
Robots, I don't know how they buy cars,
996
00:52:05,530 --> 00:52:07,300
I don't know how they buy sandwiches,
997
00:52:07,300 --> 00:52:09,100
I don't know how they go to the grocery store.
998
00:52:09,100 --> 00:52:11,430
They definitely don't pay taxes, which serves the infrastructure.
999
00:52:11,430 --> 00:52:15,300
So, you don't have the sheriffs and the police and the firemen,
1000
00:52:15,300 --> 00:52:18,830
and anybody else that supports the city is gone,
1001
00:52:18,830 --> 00:52:19,900
'cause there's no tax base.
1002
00:52:19,900 --> 00:52:23,770
Robots don't pay taxes.
1003
00:52:23,770 --> 00:52:25,900
NARRATOR: The average personal income in Saginaw
1004
00:52:25,900 --> 00:52:29,570
is $16,000 a year.
1005
00:52:29,570 --> 00:52:32,600
A lot of the families that I work with here in the community,
1006
00:52:32,600 --> 00:52:33,830
both parents are working.
1007
00:52:33,830 --> 00:52:35,470
They're working two jobs.
1008
00:52:35,470 --> 00:52:38,370
Mainly, it's the wages, you know,
1009
00:52:38,370 --> 00:52:43,270
people not making a decent wage to be able to support a family.
1010
00:52:43,270 --> 00:52:46,930
Like, back in the day, my dad even worked at the plant.
1011
00:52:46,930 --> 00:52:49,300
My mom stayed home, raised the children.
1012
00:52:49,300 --> 00:52:52,000
And that give us the opportunity to put food on the table,
1013
00:52:52,000 --> 00:52:53,370
and things of that nature.
1014
00:52:53,370 --> 00:52:56,000
And, and them times are gone.
1015
00:52:56,000 --> 00:52:57,930
If you look at this graph of what's been happening
1016
00:52:57,930 --> 00:52:59,670
to America since the end of World War II,
1017
00:52:59,670 --> 00:53:03,000
you see a line for our productivity,
1018
00:53:03,000 --> 00:53:05,730
and our productivity gets better over time.
1019
00:53:05,730 --> 00:53:08,830
It used to be the case that our pay, our income,
1020
00:53:08,830 --> 00:53:12,700
would increase in lockstep with those productivity increases.
1021
00:53:12,700 --> 00:53:17,570
The weird part about this graph is how the income has decoupled,
1022
00:53:17,570 --> 00:53:21,900
is not going up the same way that productivity is anymore.
1023
00:53:21,900 --> 00:53:24,170
NARRATOR: As automation has taken over,
1024
00:53:24,170 --> 00:53:27,770
workers are either laid off or left with less-skilled jobs
1025
00:53:27,770 --> 00:53:31,400
for less pay, while productivity goes up.
1026
00:53:31,400 --> 00:53:33,100
There are still plenty of factories in America.
1027
00:53:33,100 --> 00:53:35,430
We are a manufacturing powerhouse,
1028
00:53:35,430 --> 00:53:37,670
but if you go walk around an American factory,
1029
00:53:37,670 --> 00:53:40,070
you do not see long lines of people
1030
00:53:40,070 --> 00:53:42,470
doing repetitive manual labor.
1031
00:53:42,470 --> 00:53:44,600
You see a whole lot of automation.
1032
00:53:44,600 --> 00:53:46,230
If you go upstairs in that factory
1033
00:53:46,230 --> 00:53:47,830
and look at the payroll department,
1034
00:53:47,830 --> 00:53:51,130
you see one or two people looking into a screen all day.
1035
00:53:51,130 --> 00:53:53,800
So, the activity is still there,
1036
00:53:53,800 --> 00:53:56,000
but the number of jobs is very, very low,
1037
00:53:56,000 --> 00:53:58,330
because of automation and tech progress.
1038
00:53:58,330 --> 00:54:01,130
Now, dealing with that challenge,
1039
00:54:01,130 --> 00:54:02,900
and figuring out what the next generation
1040
00:54:02,900 --> 00:54:05,700
of the American middle class should be doing,
1041
00:54:05,700 --> 00:54:07,700
is a really important challenge,
1042
00:54:07,700 --> 00:54:10,530
because I am pretty confident that we are never again
1043
00:54:10,530 --> 00:54:13,330
going to have this large, stable, prosperous
1044
00:54:13,330 --> 00:54:15,730
middle class doing routine work.
1045
00:54:15,730 --> 00:54:19,430
♪ ♪
1046
00:54:19,430 --> 00:54:21,970
NARRATOR: Evidence of how A.I. is likely to bring
1047
00:54:21,970 --> 00:54:25,530
accelerated change to the U.S. workforce can be found
1048
00:54:25,530 --> 00:54:27,970
not far from Saginaw.
1049
00:54:27,970 --> 00:54:29,600
This is the U.S. headquarters
1050
00:54:29,600 --> 00:54:34,070
for one of the world's largest builders of industrial robots,
1051
00:54:34,070 --> 00:54:38,030
a Japanese-owned company called Fanuc Robotics.
1052
00:54:38,030 --> 00:54:41,230
We've been producing robots for well over 35 years.
1053
00:54:41,230 --> 00:54:42,770
And you can imagine, over the years,
1054
00:54:42,770 --> 00:54:45,330
they've changed quite a bit.
1055
00:54:45,330 --> 00:54:48,230
We're utilizing the artificial intelligence
1056
00:54:48,230 --> 00:54:49,800
to really make the robots easier to use
1057
00:54:49,800 --> 00:54:54,400
and be able to handle a broader spectrum of opportunities.
1058
00:54:54,400 --> 00:54:57,770
We see a huge growth potential in robotics.
1059
00:54:57,770 --> 00:55:00,330
And we see that growth potential as being, really,
1060
00:55:00,330 --> 00:55:03,230
there's 90% of the market left.
1061
00:55:03,230 --> 00:55:05,230
NARRATOR: The industry says optimistically
1062
00:55:05,230 --> 00:55:09,270
that with that growth, they can create more jobs.
1063
00:55:09,270 --> 00:55:11,630
Even if there were five people on a job,
1064
00:55:11,630 --> 00:55:12,870
and we reduced that down to two people,
1065
00:55:12,870 --> 00:55:15,800
because we automated some level of it,
1066
00:55:15,800 --> 00:55:18,570
we might produce two times more parts than we did before,
1067
00:55:18,570 --> 00:55:20,170
because we automated it.
1068
00:55:20,170 --> 00:55:26,430
So now, there might be the need for two more fork-truck drivers,
1069
00:55:26,430 --> 00:55:29,900
or two more quality-inspection personnel.
1070
00:55:29,900 --> 00:55:31,870
So, although we reduce some of the people,
1071
00:55:31,870 --> 00:55:36,100
we grow in other areas as we produce more things.
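[Editor's note: the industry's example and the union's objection below can be put side by side with simple arithmetic. A minimal sketch; the head counts and doubled output are taken from the example as stated, and the "best case" of all four support roles materializing is an assumption.]

```python
# Fanuc's illustrative example: 5 workers become 2 after automation,
# output doubles, and up to 4 support roles may be added (assumption:
# all of them materialize, the best case for the industry's argument).
workers_before, output_before = 5, 1.0
direct_after, new_support = 2, 4
output_after = 2 * output_before

print(output_before / workers_before)                # 0.20 units per worker before
print(output_after / (direct_after + new_support))   # ~0.33 units per worker after
# Head count roughly recovers only if the support jobs appear;
# output per worker rises either way, which is the union's point below.
```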
1072
00:55:36,100 --> 00:55:41,070
When I increase productivity through automation, I lose jobs.
1073
00:55:41,070 --> 00:55:42,370
Jobs go away.
1074
00:55:42,370 --> 00:55:45,170
And I don't care what the robot manufacturers say,
1075
00:55:45,170 --> 00:55:47,830
you aren't replacing those ten production people
1076
00:55:47,830 --> 00:55:51,570
that that robot is now doing that job, with ten people.
1077
00:55:51,570 --> 00:55:54,830
You can increase productivity to a level to stay competitive
1078
00:55:54,830 --> 00:55:58,970
with the global market-- that's what they're trying to do.
1079
00:55:58,970 --> 00:56:00,530
♪ ♪
1080
00:56:00,530 --> 00:56:02,900
NARRATOR: In the popular telling,
1081
00:56:02,900 --> 00:56:06,800
blame for widespread job loss has been aimed overseas,
1082
00:56:06,800 --> 00:56:08,900
at what's called offshoring.
1083
00:56:08,900 --> 00:56:11,200
We want to keep our factories here,
1084
00:56:11,200 --> 00:56:13,100
we want to keep our manufacturing here.
1085
00:56:13,100 --> 00:56:17,470
We don't want them moving to China, to Mexico, to Japan,
1086
00:56:17,470 --> 00:56:21,630
to India, to Vietnam.
1087
00:56:21,630 --> 00:56:23,770
NARRATOR: But it turns out most of the job loss
1088
00:56:23,770 --> 00:56:26,370
isn't because of offshoring.
1089
00:56:26,370 --> 00:56:27,700
There's been offshoring.
1090
00:56:27,700 --> 00:56:32,300
And I think offshoring is responsible for maybe 20%
1091
00:56:32,300 --> 00:56:34,000
of the jobs that have been lost.
1092
00:56:34,000 --> 00:56:36,270
I would say most of the jobs that have been lost,
1093
00:56:36,270 --> 00:56:38,830
despite what most Americans think, was due to automation
1094
00:56:38,830 --> 00:56:41,830
or productivity growth.
1095
00:56:41,830 --> 00:56:43,570
NARRATOR: Mike Hicks is an economist
1096
00:56:43,570 --> 00:56:46,600
at Ball State University in Muncie, Indiana.
1097
00:56:46,600 --> 00:56:50,300
He and sociologist Emily Wornell have been documenting
1098
00:56:50,300 --> 00:56:52,670
employment trends in Middle America.
1099
00:56:52,670 --> 00:56:57,130
Hicks says that automation has been a mostly silent job killer,
1100
00:56:57,130 --> 00:56:59,200
lowering the standard of living.
1101
00:56:59,200 --> 00:57:02,400
So, in the last 15 years, the standard of living has dropped
1102
00:57:02,400 --> 00:57:04,600
by 15, ten to 15 percent.
1103
00:57:04,600 --> 00:57:07,100
So, that's unusual in a developed world.
1104
00:57:07,100 --> 00:57:08,600
A one-year decline is a recession.
1105
00:57:08,600 --> 00:57:12,470
A 15-year decline gives an entirely different sense
1106
00:57:12,470 --> 00:57:14,830
about the prospects of a community.
1107
00:57:14,830 --> 00:57:18,500
And so that is common from the Canadian border
1108
00:57:18,500 --> 00:57:20,970
to the Gulf of Mexico
1109
00:57:20,970 --> 00:57:23,300
in the middle swath of the United States.
1110
00:57:23,300 --> 00:57:26,130
This is something we're gonna do for you guys.
1111
00:57:26,130 --> 00:57:30,730
These were left over from our suggestion drive that we did,
1112
00:57:30,730 --> 00:57:32,200
and we're going to give them each two.
1113
00:57:32,200 --> 00:57:33,300
That is awesome. I mean,
1114
00:57:33,300 --> 00:57:35,070
that is going to go a long ways, right?
1115
00:57:35,070 --> 00:57:37,070
I mean, that'll really help that family out during the holidays.
1116
00:57:37,070 --> 00:57:39,800
Yes, well, with the kids home from school,
1117
00:57:39,800 --> 00:57:41,430
the families have three meals a day that they got
1118
00:57:41,430 --> 00:57:43,170
to put on the table.
1119
00:57:43,170 --> 00:57:45,130
So, it's going to make a big difference.
1120
00:57:45,130 --> 00:57:47,130
So, thank you, guys. You're welcome.
1121
00:57:47,130 --> 00:57:48,830
This is wonderful. Let them know Merry Christmas
1122
00:57:48,830 --> 00:57:50,370
on behalf of us here at the local, okay?
1123
00:57:50,370 --> 00:57:52,930
Absolutely, you guys are just, just amazing, thank you.
1124
00:57:52,930 --> 00:57:56,270
And please, tell, tell all the workers how grateful
1125
00:57:56,270 --> 00:57:57,900
these families will be. We will.
1126
00:57:57,900 --> 00:58:00,870
I mean, this is not a small problem.
1127
00:58:00,870 --> 00:58:02,700
The need is so great.
1128
00:58:02,700 --> 00:58:05,830
And I can tell you that it's all races,
1129
00:58:05,830 --> 00:58:08,070
it's all income classes
1130
00:58:08,070 --> 00:58:09,700
that you might think someone might be from.
1131
00:58:09,700 --> 00:58:11,900
But I can tell you that when you see it,
1132
00:58:11,900 --> 00:58:15,000
and you deliver this type of gift to somebody
1133
00:58:15,000 --> 00:58:18,600
who is in need, just the gratitude that they show you
1134
00:58:18,600 --> 00:58:22,470
is incredible.
1135
00:58:22,470 --> 00:58:26,470
We actually know that people are at greater risk of mortality
1136
00:58:26,470 --> 00:58:30,130
for over 20 years after they lose their job due to,
1137
00:58:30,130 --> 00:58:32,670
due to no fault of their own, so something like automation
1138
00:58:32,670 --> 00:58:34,770
or offshoring.
1139
00:58:34,770 --> 00:58:36,970
They're at higher risk for cardiovascular disease,
1140
00:58:36,970 --> 00:58:42,500
they're at higher risk for depression and suicide.
1141
00:58:42,500 --> 00:58:44,630
But then with the intergenerational impacts,
1142
00:58:44,630 --> 00:58:48,230
we also see their children are more likely--
1143
00:58:48,230 --> 00:58:50,300
children of parents who have lost their job
1144
00:58:50,300 --> 00:58:53,670
due to automation-- are more likely to repeat a grade,
1145
00:58:53,670 --> 00:58:55,570
they're more likely to drop out of school,
1146
00:58:55,570 --> 00:58:57,700
they're more likely to be suspended from school,
1147
00:58:57,700 --> 00:58:59,470
and they have lower educational attainment
1148
00:58:59,470 --> 00:59:03,200
over their entire lifetimes.
1149
00:59:03,200 --> 00:59:06,200
It's the future of this, not the past, that scares me.
1150
00:59:06,200 --> 00:59:08,700
Because I think we're in the early decades
1151
00:59:08,700 --> 00:59:11,170
of what is a multi-decade adjustment period.
1152
00:59:11,170 --> 00:59:14,000
♪ ♪
1153
00:59:14,000 --> 00:59:18,170
NARRATOR: The world is being re-imagined.
1154
00:59:18,170 --> 00:59:20,370
This is a supermarket.
1155
00:59:20,370 --> 00:59:24,800
Robots, guided by A.I., pack everything from soap powder
1156
00:59:24,800 --> 00:59:29,530
to cantaloupes for online consumers.
1157
00:59:29,530 --> 00:59:31,600
Machines that pick groceries,
1158
00:59:31,600 --> 00:59:35,170
machines that can also read reports, learn routines,
1159
00:59:35,170 --> 00:59:38,730
and comprehend are reaching deep into factories,
1160
00:59:38,730 --> 00:59:41,870
stores, and offices.
1161
00:59:41,870 --> 00:59:43,800
At a college in Goshen, Indiana,
1162
00:59:43,800 --> 00:59:47,030
a group of local business and political leaders come together
1163
00:59:47,030 --> 00:59:52,830
to try to understand the impact of A.I. and the new machines.
1164
00:59:52,830 --> 00:59:54,870
Molly Kinder studies the future of work
1165
00:59:54,870 --> 00:59:56,470
at a Washington think tank.
1166
00:59:56,470 --> 00:59:58,970
How many people have gone into a fast-food restaurant
1167
00:59:58,970 --> 01:00:01,370
and done a self-ordering?
1168
01:00:01,370 --> 01:00:02,530
Anyone, yes?
1169
01:00:02,530 --> 01:00:04,400
Panera, for instance, is doing this.
1170
01:00:04,400 --> 01:00:08,270
Cashier was my first job, and in, in, where I live,
1171
01:00:08,270 --> 01:00:10,830
in Washington, DC, it's actually the number-one occupation
1172
01:00:10,830 --> 01:00:12,300
for the greater DC region.
1173
01:00:12,300 --> 01:00:14,670
There are millions of people who work in cashier positions.
1174
01:00:14,670 --> 01:00:17,000
This is not a futuristic challenge,
1175
01:00:17,000 --> 01:00:19,800
this is something that's happening sooner than we think.
1176
01:00:19,800 --> 01:00:24,770
In the popular discussions about robots and automation and work,
1177
01:00:24,770 --> 01:00:28,600
almost every image is of a man on a factory floor
1178
01:00:28,600 --> 01:00:29,770
or a truck driver.
1179
01:00:29,770 --> 01:00:32,900
And yet, in our data, when we looked,
1180
01:00:32,900 --> 01:00:35,900
women disproportionately hold the jobs that today
1181
01:00:35,900 --> 01:00:37,900
are at highest risk of automation.
1182
01:00:37,900 --> 01:00:40,800
And that's not really being talked about,
1183
01:00:40,800 --> 01:00:43,700
and that's in part because women are over-represented
1184
01:00:43,700 --> 01:00:45,570
in some of these marginalized occupations,
1185
01:00:45,570 --> 01:00:48,230
like a cashier or a fast-food worker.
1186
01:00:48,230 --> 01:00:53,670
And also in large numbers in clerical jobs in offices--
1187
01:00:53,670 --> 01:00:57,400
HR departments, payroll, finance,
1188
01:00:57,400 --> 01:01:00,900
a lot of that is more routine processing information,
1189
01:01:00,900 --> 01:01:03,530
processing paper, transferring data.
1190
01:01:03,530 --> 01:01:08,000
That has huge potential for automation.
1191
01:01:08,000 --> 01:01:11,000
A.I. is going to do some of that, software,
1192
01:01:11,000 --> 01:01:12,900
robots are going to do some of that.
1193
01:01:12,900 --> 01:01:14,830
So how many people are still working
1194
01:01:14,830 --> 01:01:16,300
as switchboard operators?
1195
01:01:16,300 --> 01:01:18,170
Probably none in this country.
1196
01:01:18,170 --> 01:01:20,470
NARRATOR: The workplace of the future will demand
1197
01:01:20,470 --> 01:01:24,230
different skills, and gaining them, says Molly Kinder,
1198
01:01:24,230 --> 01:01:26,300
will depend on who can afford them.
1199
01:01:26,300 --> 01:01:28,570
I mean it's not a good situation in the United States.
1200
01:01:28,570 --> 01:01:30,330
There's been some excellent research that says
1201
01:01:30,330 --> 01:01:32,800
that half of Americans couldn't afford
1202
01:01:32,800 --> 01:01:35,300
a $400 unexpected expense.
1203
01:01:35,300 --> 01:01:38,630
And if you want to get to $1,000, there's even less.
1204
01:01:38,630 --> 01:01:41,270
So imagine you're going to go out without a month's pay,
1205
01:01:41,270 --> 01:01:43,330
two months' pay, a year.
1206
01:01:43,330 --> 01:01:47,030
Imagine you want to put savings toward a course
1207
01:01:47,030 --> 01:01:49,670
to, to redevelop your career.
1208
01:01:49,670 --> 01:01:52,330
People can't afford to take time off of work.
1209
01:01:52,330 --> 01:01:56,600
They don't have a cushion, so this lack of economic stability,
1210
01:01:56,600 --> 01:01:59,500
married with the disruptions in people's careers,
1211
01:01:59,500 --> 01:02:01,230
is a really toxic mix.
1212
01:02:01,230 --> 01:02:03,630
(blowing whistle)
1213
01:02:03,630 --> 01:02:05,530
NARRATOR: The new machines will penetrate every sector
1214
01:02:05,530 --> 01:02:08,600
of the economy: from insurance companies
1215
01:02:08,600 --> 01:02:11,130
to human resource departments;
1216
01:02:11,130 --> 01:02:14,030
from law firms to the trading floors of Wall Street.
1217
01:02:14,030 --> 01:02:15,470
Wall Street's going through it,
1218
01:02:15,470 --> 01:02:16,970
but every industry is going through it.
1219
01:02:16,970 --> 01:02:19,630
Every company is looking at all of the disruptive technologies,
1220
01:02:19,630 --> 01:02:23,630
could be robotics or drones or blockchain.
1221
01:02:23,630 --> 01:02:27,130
And whatever it is, every company's using everything
1222
01:02:27,130 --> 01:02:29,570
that's developed, everything that's disruptive,
1223
01:02:29,570 --> 01:02:32,370
in thinking about, "How do I apply that to my business
1224
01:02:32,370 --> 01:02:35,000
to make myself more efficient?"
1225
01:02:35,000 --> 01:02:37,400
And what efficiency means is, mostly,
1226
01:02:37,400 --> 01:02:40,670
"How do I do thiswith fewer workers?"
1227
01:02:43,900 --> 01:02:47,330
And I do think that when we look at some of the studies
1228
01:02:47,330 --> 01:02:50,700
about opportunity in this country,
1229
01:02:50,700 --> 01:02:53,030
and the inequality of opportunity,
1230
01:02:53,030 --> 01:02:55,830
the likelihood that you won't be able to advance
1231
01:02:55,830 --> 01:02:59,300
from where your parents were, I think that's, that's,
1232
01:02:59,300 --> 01:03:02,000
is very serious and gets to the heart of the way
1233
01:03:02,000 --> 01:03:06,430
we like to think of America as the land of opportunity.
1234
01:03:06,430 --> 01:03:08,970
NARRATOR: Inequality has been rising in America.
1235
01:03:08,970 --> 01:03:13,270
It used to be the top 1% of earners-- here in red--
1236
01:03:13,270 --> 01:03:16,670
owned a relatively small portion of the country's wealth.
1237
01:03:16,670 --> 01:03:20,100
Middle and lower earners-- in blue-- had the largest share.
1238
01:03:20,100 --> 01:03:24,970
Then, 15 years ago, the lines crossed.
1239
01:03:24,970 --> 01:03:29,500
And inequality has been increasing ever since.
1240
01:03:29,500 --> 01:03:31,830
There's many factors that are driving inequality today,
1241
01:03:31,830 --> 01:03:33,330
and unfortunately, artificial intelligence--
1242
01:03:33,330 --> 01:03:38,270
without being thoughtful about it--
1243
01:03:38,270 --> 01:03:41,430
is a driver for increased inequality
1244
01:03:41,430 --> 01:03:43,900
because it's a form of automation,
1245
01:03:43,900 --> 01:03:47,000
and automation is the substitution of capital
1246
01:03:47,000 --> 01:03:49,070
for labor.
1247
01:03:49,070 --> 01:03:52,800
And when you do that, the people with the capital win.
1248
01:03:52,800 --> 01:03:55,800
So Karl Marx was right,
1249
01:03:55,800 --> 01:03:58,100
it's a struggle between capital and labor,
1250
01:03:58,100 --> 01:03:59,600
and with artificial intelligence,
1251
01:03:59,600 --> 01:04:02,830
we're putting our finger on the scale on the side of capital,
1252
01:04:02,830 --> 01:04:05,770
and how we wish to distribute the benefits,
1253
01:04:05,770 --> 01:04:07,230
the economic benefits,
1254
01:04:07,230 --> 01:04:09,130
that that will create is going to be a major
1255
01:04:09,130 --> 01:04:13,430
moral consideration for society over the next several decades.
1256
01:04:13,430 --> 01:04:19,370
This is really an outgrowth of the increasing gaps
1257
01:04:19,370 --> 01:04:23,600
of haves and have-nots-- the wealthy getting wealthier,
1258
01:04:23,600 --> 01:04:24,870
the poor getting poorer.
1259
01:04:24,870 --> 01:04:28,400
It may not be specifically related to A.I.,
1260
01:04:28,400 --> 01:04:30,770
but as... but A.I. will exacerbate that.
1261
01:04:30,770 --> 01:04:36,430
And that, I think, will tear the society apart,
1262
01:04:36,430 --> 01:04:38,930
because the rich will have just too much,
1263
01:04:38,930 --> 01:04:44,200
and those who are have-nots will have perhaps very little way
1264
01:04:44,200 --> 01:04:46,700
of digging themselves out of the hole.
1265
01:04:46,700 --> 01:04:50,800
And with A.I. making its impact, it, it'll be worse, I think.
1266
01:04:50,800 --> 01:04:56,170
♪ ♪
1267
01:04:56,170 --> 01:05:01,870
(crowd cheering and applauding)
1268
01:05:01,870 --> 01:05:05,000
(speaking on P.A.)
1269
01:05:05,000 --> 01:05:08,830
I'm here today for one main reason.
1270
01:05:08,830 --> 01:05:12,630
To say thank you to Ohio.
1271
01:05:12,630 --> 01:05:17,400
(crowd cheering and applauding)
1272
01:05:17,400 --> 01:05:20,300
I think the Trump vote was a protest.
1273
01:05:20,300 --> 01:05:22,030
I mean that for whatever reason,
1274
01:05:22,030 --> 01:05:25,000
whatever the hot button was that, you know,
1275
01:05:25,000 --> 01:05:28,800
that really hit home with these Americans who voted for him
1276
01:05:28,800 --> 01:05:30,800
were, it was a protest vote.
1277
01:05:30,800 --> 01:05:34,530
They didn't like the direction things were going.
1278
01:05:34,530 --> 01:05:38,270
(crowd booing and shouting)
1279
01:05:39,170 --> 01:05:40,700
I'm scared.
1280
01:05:40,700 --> 01:05:42,900
I'm gonna be quite honest with you, I worry about the future
1281
01:05:42,900 --> 01:05:47,100
of not just this country, but the, the entire globe.
1282
01:05:47,100 --> 01:05:51,100
If we continue to go in an automated system,
1283
01:05:51,100 --> 01:05:52,730
what are we going to do?
1284
01:05:52,730 --> 01:05:54,870
Now I've got a group of people at the top
1285
01:05:54,870 --> 01:05:57,170
that are making all the money and I don't have anybody
1286
01:05:57,170 --> 01:06:00,070
in the middle that can support a family.
1287
01:06:00,070 --> 01:06:05,330
So do we have to go to the point where we crash to come back?
1288
01:06:05,330 --> 01:06:06,630
And in this case,
1289
01:06:06,630 --> 01:06:08,030
the automation's already gonna be there,
1290
01:06:08,030 --> 01:06:09,630
so I don't know how you come back.
1291
01:06:09,630 --> 01:06:11,730
I'm really worried about where this,
1292
01:06:11,730 --> 01:06:13,700
where this leads us in the future.
1293
01:06:13,700 --> 01:06:17,000
♪ ♪
1294
01:06:27,200 --> 01:06:28,730
NARRATOR: The future is largely being shaped
1295
01:06:28,730 --> 01:06:32,370
by a few hugely successful tech companies.
1296
01:06:32,370 --> 01:06:35,700
They're constantly buying up successful smaller companies
1297
01:06:35,700 --> 01:06:37,900
and recruiting talent.
1298
01:06:37,900 --> 01:06:39,830
Between the U.S. and China,
1299
01:06:39,830 --> 01:06:42,800
they employ a great majority of the leading A.I. researchers
1300
01:06:42,800 --> 01:06:46,070
and scientists.
1301
01:06:46,070 --> 01:06:48,370
In the course of amassing such power,
1302
01:06:48,370 --> 01:06:51,930
they've also become among the richest companies in the world.
1303
01:06:51,930 --> 01:06:58,130
A.I. really is the ultimate tool of wealth creation.
1304
01:06:58,130 --> 01:07:03,730
Think about the massive data that, you know, Facebook has
1305
01:07:03,730 --> 01:07:08,270
on user preferences, and how it can very smartly target
1306
01:07:08,270 --> 01:07:10,400
an ad that you might buy something
1307
01:07:10,400 --> 01:07:16,430
and get a much bigger cut that a smaller company couldn't do.
1308
01:07:16,430 --> 01:07:18,970
Same with Google, same with Amazon.
1309
01:07:18,970 --> 01:07:23,300
So it's... A.I. is a set of tools
1310
01:07:23,300 --> 01:07:26,500
that helps you maximize an objective function,
1311
01:07:26,500 --> 01:07:32,200
and that objective function initially will simply be,
1312
01:07:32,200 --> 01:07:34,200
make more money.
1313
01:07:34,200 --> 01:07:36,400
NARRATOR: And it is how these companies make that money,
1314
01:07:36,400 --> 01:07:41,230
and how their algorithms reach deeper and deeper into our work,
1315
01:07:41,230 --> 01:07:42,870
our daily lives, and our democracy,
1316
01:07:42,870 --> 01:07:47,870
that makes many people increasingly uncomfortable.
1317
01:07:47,870 --> 01:07:52,330
Pedro Domingos wrote the book "The Master Algorithm."
1318
01:07:52,330 --> 01:07:55,470
Everywhere you go, you generate a cloud of data.
1319
01:07:55,470 --> 01:07:58,500
You're trailing data, everything that you do is producing data.
1320
01:07:58,500 --> 01:07:59,900
And then there are computers looking at that data
1321
01:07:59,900 --> 01:08:02,970
that are learning, and these computers are essentially
1322
01:08:02,970 --> 01:08:05,100
trying to serve you better.
1323
01:08:05,100 --> 01:08:07,270
They're trying to personalize things to you.
1324
01:08:07,270 --> 01:08:08,900
They're trying to adapt the world to you.
1325
01:08:08,900 --> 01:08:10,800
So on the one hand, this is great,
1326
01:08:10,800 --> 01:08:12,430
because the world will get adapted to you
1327
01:08:12,430 --> 01:08:15,900
without you even having to explicitly adapt it.
1328
01:08:15,900 --> 01:08:18,600
There's also a danger, because the entities in the companies
1329
01:08:18,600 --> 01:08:20,100
that are in control of those algorithms
1330
01:08:20,100 --> 01:08:22,000
don't necessarily have the same goals as you,
1331
01:08:22,000 --> 01:08:24,800
and this is where I think people need to be aware of
1332
01:08:24,799 --> 01:08:29,999
what's going on, so they can have more control over it.
1333
01:08:30,000 --> 01:08:31,630
You know, we came into this new world thinking
1334
01:08:31,630 --> 01:08:35,530
that we were users of social media.
1335
01:08:35,529 --> 01:08:37,799
It didn't occur to us that social media
1336
01:08:37,799 --> 01:08:40,429
was actually using us.
1337
01:08:40,430 --> 01:08:43,830
We thought that we were searching Google.
1338
01:08:43,830 --> 01:08:48,700
We had no idea that Google was searching us.
1339
01:08:48,700 --> 01:08:50,800
NARRATOR: Shoshana Zuboff is a Harvard Business School
1340
01:08:50,799 --> 01:08:52,869
professor emerita.
1341
01:08:52,870 --> 01:08:55,930
In 1988, she wrote a definitive book called
1342
01:08:55,930 --> 01:08:58,370
"In the Age ofthe Smart Machine."
1343
01:08:58,370 --> 01:09:01,970
For the last seven years, she has worked on a new book,
1344
01:09:01,970 --> 01:09:04,730
making the case that we have now entered a new phase
1345
01:09:04,730 --> 01:09:09,970
of the economy, which she calls "surveillance capitalism."
1346
01:09:09,970 --> 01:09:16,330
So, famously, industrial capitalism claimed nature.
1347
01:09:16,330 --> 01:09:20,300
Innocent rivers, and meadows, and forests, and so forth,
1348
01:09:20,299 --> 01:09:25,129
for the market dynamic to be reborn as real estate,
1349
01:09:25,130 --> 01:09:27,970
as land that could be sold and purchased.
1350
01:09:27,970 --> 01:09:32,230
Industrial capitalism claimed work for the market dynamic
1351
01:09:32,230 --> 01:09:35,170
to reborn, to be reborn as labor
1352
01:09:35,170 --> 01:09:38,700
that could be sold and purchased.
1353
01:09:38,700 --> 01:09:40,830
Now, here comes surveillance capitalism,
1354
01:09:40,830 --> 01:09:47,700
following this pattern, but with a dark and startling twist.
1355
01:09:47,700 --> 01:09:51,700
What surveillance capitalism claims is private,
1356
01:09:51,700 --> 01:09:53,930
human experience.
1357
01:09:53,930 --> 01:09:58,970
Private, human experience is claimed as a free source
1358
01:09:58,970 --> 01:10:05,800
of raw material, fabricated into predictions of human behavior.
1359
01:10:05,800 --> 01:10:09,370
And it turns out that there are a lot of businesses
1360
01:10:09,370 --> 01:10:14,270
that really want to know what we will do now, soon, and later.
1361
01:10:17,800 --> 01:10:19,430
NARRATOR: Like most people,
1362
01:10:19,430 --> 01:10:21,470
Alastair Mactaggart had no idea
1363
01:10:21,470 --> 01:10:23,600
about this new surveillance business,
1364
01:10:23,600 --> 01:10:27,370
until one evening in 2015.
1365
01:10:27,370 --> 01:10:30,230
I had a conversation with a fellow who's an engineer,
1366
01:10:30,230 --> 01:10:33,870
and I was just talking to him one night at a,
1367
01:10:33,870 --> 01:10:35,270
you know, a dinner, at a cocktail party.
1368
01:10:35,270 --> 01:10:37,470
And I... there had been something in the press that day
1369
01:10:37,470 --> 01:10:39,900
about privacy in the paper, and I remember asking him--
1370
01:10:39,900 --> 01:10:41,870
he worked for Google-- "What's the big deal about all,
1371
01:10:41,870 --> 01:10:44,270
why are people so worked up about it?"
1372
01:10:44,270 --> 01:10:45,670
And I thought it was gonna be one of those conversations,
1373
01:10:45,670 --> 01:10:49,330
like, with, you know, if you ever ask an airline pilot,
1374
01:10:49,330 --> 01:10:50,570
"Should I be worried aboutflying?"
1375
01:10:50,570 --> 01:10:52,070
and they say, "Oh, the most dangerous part
1376
01:10:52,070 --> 01:10:55,030
is coming to the airport, you know, in the car."
1377
01:10:55,030 --> 01:10:57,730
And he said, "Oh, you'd behorrified
1378
01:10:57,730 --> 01:10:59,900
if you knew how much we knew about you."
1379
01:10:59,900 --> 01:11:02,100
And I remember that kind of stuck in my head,
1380
01:11:02,100 --> 01:11:04,530
because it was not what I expected.
1381
01:11:04,530 --> 01:11:08,400
NARRATOR: That question would change his life.
1382
01:11:08,400 --> 01:11:09,930
A successful California real estate developer,
1383
01:11:09,930 --> 01:11:15,730
Mactaggart began researching the new business model.
1384
01:11:15,730 --> 01:11:17,870
What I've learned since is that their entire business
1385
01:11:17,870 --> 01:11:20,630
is learning as much about you as they can.
1386
01:11:20,630 --> 01:11:21,970
Everything about your thoughts, and your desires,
1387
01:11:21,970 --> 01:11:25,730
and your dreams, and who your friends are,
1388
01:11:25,730 --> 01:11:27,700
and what you're thinking, what your private thoughts are.
1389
01:11:27,700 --> 01:11:29,770
And with that, that's true power.
1390
01:11:29,770 --> 01:11:33,430
And so, I think... I didn't know that at the time.
1391
01:11:33,430 --> 01:11:35,470
That their entire business is basically mining
1392
01:11:35,470 --> 01:11:37,200
the data of your life.
1393
01:11:37,200 --> 01:11:39,070
♪ ♪
1394
01:11:39,070 --> 01:11:43,200
NARRATOR: Shoshana Zuboff had been doing her own research.
1395
01:11:43,200 --> 01:11:45,970
You know, I'd been reading and reading and reading.
1396
01:11:45,970 --> 01:11:48,370
From patents, to transcripts of earnings calls,
1397
01:11:48,370 --> 01:11:50,430
research reports.
1398
01:11:50,430 --> 01:11:52,130
And, you know, just literally everything,
1399
01:11:52,130 --> 01:11:56,730
for years and years and years.
1400
01:11:56,730 --> 01:11:57,930
NARRATOR: Her studies included the early days
1401
01:11:57,930 --> 01:12:00,400
of Google, started in 1998
1402
01:12:00,400 --> 01:12:02,570
by two young Stanford grad students,
1403
01:12:02,570 --> 01:12:06,330
Sergey Brin and Larry Page.
1404
01:12:06,330 --> 01:12:10,000
In the beginning, they had no clear business model.
1405
01:12:10,000 --> 01:12:13,900
Their unofficial motto was, "Don't Be Evil."
1406
01:12:13,900 --> 01:12:16,270
Right from the start, the founders,
1407
01:12:16,270 --> 01:12:20,170
Larry Page and Sergey Brin, they had been very public
1408
01:12:20,170 --> 01:12:26,200
about their antipathy toward advertising.
1409
01:12:26,200 --> 01:12:31,300
Advertising would distort the internet
1410
01:12:31,300 --> 01:12:37,800
and it would distort and disfigure the, the purity
1411
01:12:37,800 --> 01:12:41,700
of any search engine, including their own.
1412
01:12:41,700 --> 01:12:43,070
Once in love with e-commerce,
1413
01:12:43,070 --> 01:12:46,300
Wall Street has turned its back on the dotcoms.
1414
01:12:46,300 --> 01:12:49,400
NARRATOR: Then came the dotcom crash of the early 2000s.
1415
01:12:49,400 --> 01:12:51,400
...has left hundreds of unprofitable internet companies
1416
01:12:51,400 --> 01:12:54,830
begging for love and money.
1417
01:12:54,830 --> 01:12:56,730
NARRATOR: While Google had rapidly become the default
1418
01:12:56,730 --> 01:12:58,830
search engine for tens of millions of users,
1419
01:12:58,830 --> 01:13:04,070
their investors were pressuring them to make more money.
1420
01:13:04,070 --> 01:13:06,100
Without a new business model,
1421
01:13:06,100 --> 01:13:10,530
the founders knew that the young company was in danger.
1422
01:13:10,530 --> 01:13:14,470
In this state of emergency, the founders decided,
1423
01:13:14,470 --> 01:13:19,300
"We've simply got to find a wayto save this company."
1424
01:13:19,300 --> 01:13:25,630
And so, parallel to this were another set of discoveries,
1425
01:13:25,630 --> 01:13:30,970
where it turns out that whenever we search or whenever we browse,
1426
01:13:30,970 --> 01:13:35,030
we're leaving behind traces-- digital traces--
1427
01:13:35,030 --> 01:13:37,230
of our behavior.
1428
01:13:37,230 --> 01:13:39,330
And those traces, back in these days,
1429
01:13:39,330 --> 01:13:43,500
were called digital exhaust.
1430
01:13:43,500 --> 01:13:45,400
NARRATOR: They realized how valuable this data could be
1431
01:13:45,400 --> 01:13:47,700
by applying machine learning algorithms
1432
01:13:47,700 --> 01:13:52,570
to predict users' interests.
1433
01:13:52,570 --> 01:13:54,670
What happened was, they decided to turn
1434
01:13:54,670 --> 01:13:57,630
to those data logs in a systematic way,
1435
01:13:57,630 --> 01:14:01,670
and to begin to use these surplus data
1436
01:14:01,670 --> 01:14:06,970
as a way to come up with fine-grained predictions
1437
01:14:06,970 --> 01:14:11,330
of what a user would click on, what kind of ad
1438
01:14:11,330 --> 01:14:14,230
a user would click on.
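[Editor's note: at its core, what Zuboff describes is a supervised click-prediction model trained on logged behavior. A minimal, hypothetical sketch using scikit-learn; the feature names and numbers are invented for illustration and merely stand in for the "digital exhaust" described above, not any company's actual system.]

```python
# Hypothetical sketch of click-through prediction from behavioral logs.
# Features and values are invented; real systems use far richer signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [queries_today, pages_viewed, seconds_on_site, past_ad_clicks]
X = np.array([
    [12, 40, 600, 3],
    [2, 5, 60, 0],
    [8, 22, 300, 1],
    [1, 3, 30, 0],
])
y = np.array([1, 0, 1, 0])  # 1 = clicked this kind of ad before

model = LogisticRegression().fit(X, y)
new_user = np.array([[6, 18, 240, 1]])
print(model.predict_proba(new_user)[0, 1])  # predicted click probability
```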
1439
01:14:14,230 --> 01:14:18,700
And inside Google, they started seeing these revenues
1440
01:14:18,700 --> 01:14:22,830
pile up at a startling rate.
1441
01:14:22,830 --> 01:14:26,000
They realized that they had to keep it secret.
1442
01:14:26,000 --> 01:14:28,930
They didn't want anyone to know how much money they were making,
1443
01:14:28,930 --> 01:14:31,500
or how they were making it.
1444
01:14:31,500 --> 01:14:35,700
Because users had no idea that these extra-behavioral data
1445
01:14:35,700 --> 01:14:39,070
that told so much about them, you know, was just out there,
1446
01:14:39,070 --> 01:14:43,700
and now it was being used to predict their future.
1447
01:14:43,700 --> 01:14:46,100
NARRATOR: When Google's I.P.O. took place
1448
01:14:46,100 --> 01:14:47,170
just a few years later,
1449
01:14:47,170 --> 01:14:49,700
the company had a market capitalization
1450
01:14:49,700 --> 01:14:53,230
of around $23 billion.
1451
01:14:53,230 --> 01:14:56,000
Google's stock was now as valuable as General Motors.
1452
01:14:56,000 --> 01:14:59,100
♪ ♪
1453
01:14:59,100 --> 01:15:02,330
And it was only when Google went public in 2004
1454
01:15:02,330 --> 01:15:05,600
that the numbers were released.
1455
01:15:05,600 --> 01:15:10,600
And it's at that point that we learn that between the year 2000
1456
01:15:10,600 --> 01:15:14,500
and the year 2004, Google's revenue line increased
1457
01:15:14,500 --> 01:15:20,400
by 3,590%.
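[Editor's note: for scale, an increase of 3,590% means the 2004 revenue was roughly 36.9 times the 2000 figure, a quick check:]

```python
# An increase of 3,590% means final = initial * (1 + 3590/100).
factor = 1 + 3590 / 100
print(factor)  # 36.9 -- revenue ~36.9x its 2000 level
```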
1458
01:15:20,400 --> 01:15:22,470
Let's talk a little about information, and search,
1459
01:15:22,470 --> 01:15:24,630
and how people consume it.
1460
01:15:24,630 --> 01:15:27,230
NARRATOR: By 2010, the C.E.O. of Google, Eric Schmidt,
1461
01:15:27,230 --> 01:15:29,570
would tell "The Atlantic"magazine...
1462
01:15:29,570 --> 01:15:33,230
...is, we don't need you to type at all.
1463
01:15:33,230 --> 01:15:35,930
Because we know where you are, with your permission,
1464
01:15:35,930 --> 01:15:39,630
we know where you've been, with your permission.
1465
01:15:39,630 --> 01:15:41,700
We can more or less guess what you're thinking about.
1466
01:15:41,700 --> 01:15:44,200
(audience laughing) Now, is that over the line?
1467
01:15:44,200 --> 01:15:45,800
NARRATOR: Eric Schmidt and Google declined
1468
01:15:45,800 --> 01:15:49,370
to be interviewed for this program.
1469
01:15:49,370 --> 01:15:52,670
Google's new business model for predicting users' profiles
1470
01:15:52,670 --> 01:15:58,100
had migrated to other companies, particularly Facebook.
1471
01:15:58,100 --> 01:16:00,130
Roger McNamee was an early investor
1472
01:16:00,130 --> 01:16:02,330
and adviser to Facebook.
1473
01:16:02,330 --> 01:16:05,900
He's now a critic, and wrote a book about the company.
1474
01:16:05,900 --> 01:16:08,930
He says he's concerned about how widely companies like Facebook
1475
01:16:08,930 --> 01:16:11,770
and Google have been casting the net for data.
1476
01:16:11,770 --> 01:16:13,530
And then they realized, "Wait a minute,
1477
01:16:13,530 --> 01:16:16,470
there's all this data in the economy we don't have."
1478
01:16:16,470 --> 01:16:18,700
So they went to credit card processors,
1479
01:16:18,700 --> 01:16:20,430
and credit rating services,
1480
01:16:20,430 --> 01:16:23,030
and said, "We wantto buy your data."
1481
01:16:23,030 --> 01:16:25,100
They go to health and wellness apps and say,
1482
01:16:25,100 --> 01:16:26,600
"Hey, you got women'smenstrual cycles?
1483
01:16:26,600 --> 01:16:28,530
We want all that stuff."
1484
01:16:28,530 --> 01:16:30,830
Why are they doing that?
1485
01:16:30,830 --> 01:16:34,430
They're doing that because behavioral prediction
1486
01:16:34,430 --> 01:16:38,270
is about taking uncertainty out of life.
1487
01:16:38,270 --> 01:16:40,670
Advertising and marketing are all about uncertainty--
1488
01:16:40,670 --> 01:16:43,530
you never really know who's going to buy your product.
1489
01:16:43,530 --> 01:16:45,500
Until now.
1490
01:16:45,500 --> 01:16:49,870
We have to recognize that we gave technology a place
1491
01:16:49,870 --> 01:16:55,700
in our lives that it had not earned.
1492
01:16:55,700 --> 01:17:00,530
That essentially, because technology always made things
1493
01:17:00,530 --> 01:17:03,530
better in the '50s, '60s, '70s, '80s, and '90s,
1494
01:17:03,530 --> 01:17:07,030
we developed a sense of inevitability
1495
01:17:07,030 --> 01:17:10,000
that it will always make things better.
1496
01:17:10,000 --> 01:17:13,630
We developed a trust, and the industry earned good will
1497
01:17:13,630 --> 01:17:20,330
that Facebook and Google have cashed in.
1498
01:17:20,330 --> 01:17:23,500
NARRATOR: The model is simply this: provide a free service--
1499
01:17:23,500 --> 01:17:26,630
like Facebook-- and in exchange, you collect the data
1500
01:17:26,630 --> 01:17:28,870
of the millions who use it.
1501
01:17:28,870 --> 01:17:31,800
♪ ♪
1502
01:17:31,800 --> 01:17:37,470
And every sliver of information is valuable.
1503
01:17:37,470 --> 01:17:41,370
It's not just what you post, it's that you post.
1504
01:17:41,370 --> 01:17:44,800
It's not just that you make plans to see your friends later.
1505
01:17:44,800 --> 01:17:47,470
It's whether you say, "I'll see you later,"
1506
01:17:47,470 --> 01:17:51,000
or, "I'll see you at 6:45."
1507
01:17:51,000 --> 01:17:54,030
It's not just that you talk about the things
1508
01:17:54,030 --> 01:17:56,330
that you have to do today.
1509
01:17:56,330 --> 01:17:59,230
It's whether you simply rattle them off in a,
1510
01:17:59,230 --> 01:18:04,500
in a rambling paragraph, or list them as bullet points.
1511
01:18:04,500 --> 01:18:09,030
All of these tiny signals are the behavioral surplus
1512
01:18:09,030 --> 01:18:13,630
that turns out to have immense predictive value.
1513
01:18:13,630 --> 01:18:16,300
NARRATOR: In 2010, Facebook experimented
1514
01:18:16,300 --> 01:18:19,070
with A.I.'s predictive powers in what they called
1515
01:18:19,070 --> 01:18:21,830
a "social contagion" experiment.
1516
01:18:21,830 --> 01:18:25,470
They wanted to see if, through online messaging,
1517
01:18:25,470 --> 01:18:30,070
they could influence real-world behavior.
1518
01:18:30,070 --> 01:18:32,670
The aim was to get more people to the polls
1519
01:18:32,670 --> 01:18:34,530
in the 2010 midterm elections.
1520
01:18:34,530 --> 01:18:38,000
Cleveland, I need you to keep on fighting.
1521
01:18:38,000 --> 01:18:41,030
I need you to keep on believing.
1522
01:18:41,030 --> 01:18:42,530
NARRATOR: They offered 61 million users
1523
01:18:42,530 --> 01:18:45,470
an "I voted" button togetherwith faces of friends
1524
01:18:45,470 --> 01:18:47,330
who had voted.
1525
01:18:47,330 --> 01:18:52,000
A subset of users received just the button.
1526
01:18:52,000 --> 01:18:56,130
In the end, they claimed to have nudged 340,000 people to vote.
1527
01:19:00,270 --> 01:19:03,170
They would conduct other "massive contagion" experiments.
1528
01:19:03,170 --> 01:19:06,600
Among them, one showing that by adjusting their feeds,
1529
01:19:06,600 --> 01:19:12,000
they could make users happy or sad.
1530
01:19:12,000 --> 01:19:13,300
When they went to write up these findings,
1531
01:19:13,300 --> 01:19:16,270
they boasted about two things.
1532
01:19:16,270 --> 01:19:19,770
One was, "Oh, my goodness.
1533
01:19:19,770 --> 01:19:24,770
Now we know that we can use cues in the online environment
1534
01:19:24,770 --> 01:19:28,830
to change real-world behavior.
1535
01:19:28,830 --> 01:19:31,770
That's big news."
1536
01:19:31,770 --> 01:19:35,600
The second thing that they understood, and they celebrated,
1537
01:19:35,600 --> 01:19:39,030
was that, "We can do this in away that bypasses
1538
01:19:39,030 --> 01:19:43,370
the users' awareness."
1539
01:19:43,370 --> 01:19:47,500
Private corporations have built a corporate surveillance
1540
01:19:47,500 --> 01:19:52,370
state without our awareness or permission.
1541
01:19:52,370 --> 01:19:55,230
And the systems necessary to make it work
1542
01:19:55,230 --> 01:19:58,430
are getting a lot better, specifically with what are known
1543
01:19:58,430 --> 01:20:01,500
as internet-of-things smart appliances, you know,
1544
01:20:01,500 --> 01:20:04,430
powered by the Alexa voice recognition system,
1545
01:20:04,430 --> 01:20:06,870
or the Google Home system.
1546
01:20:06,870 --> 01:20:09,700
Okay, Google, play the morning playlist.
1547
01:20:09,700 --> 01:20:12,200
Okay, playing morning playlist.
1548
01:20:12,200 --> 01:20:14,300
♪ ♪
1549
01:20:14,300 --> 01:20:16,370
Okay, Google, play music in all rooms.
1550
01:20:16,370 --> 01:20:18,030
♪ ♪
1551
01:20:18,030 --> 01:20:21,000
And those will put the surveillance in places
1552
01:20:21,000 --> 01:20:22,270
we've never had it before--
1553
01:20:22,270 --> 01:20:24,800
living rooms, kitchens, bedrooms.
1554
01:20:24,800 --> 01:20:27,300
And I find all of that terrifying.
1555
01:20:27,300 --> 01:20:29,630
Okay, Google, I'm listening.
1556
01:20:29,630 --> 01:20:31,400
NARRATOR: The companies say they're not using the data
1557
01:20:31,400 --> 01:20:36,770
to target ads, but to help A.I. improve the user experience.
1558
01:20:36,770 --> 01:20:40,030
Alexa, turn on the fan.
1559
01:20:40,030 --> 01:20:41,500
(fan clicks on)
1560
01:20:41,500 --> 01:20:42,670
Okay.
1561
01:20:42,670 --> 01:20:43,830
NARRATOR: Meanwhile, they are researching
1562
01:20:43,830 --> 01:20:45,930
and applying for patents
1563
01:20:45,930 --> 01:20:48,900
to expand their reach into homes and lives.
1564
01:20:48,900 --> 01:20:51,230
Alexa, take a video.
1565
01:20:51,230 --> 01:20:52,670
(camera chirps)
1566
01:20:52,670 --> 01:20:54,570
The more and more that you use spoken interfaces--
1567
01:20:54,570 --> 01:20:57,800
so smart speakers-- they're being trained
1568
01:20:57,800 --> 01:21:00,770
not just to recognize who you are,
1569
01:21:00,770 --> 01:21:03,970
but they're starting to take baselines
1570
01:21:03,970 --> 01:21:09,770
and comparing changes over time.
1571
01:21:09,770 --> 01:21:12,970
So does your cadence increase or decrease?
1572
01:21:12,970 --> 01:21:15,600
Are you sneezing while you're talking?
1573
01:21:15,600 --> 01:21:18,730
Is your voice a little wobbly?
1574
01:21:18,730 --> 01:21:21,570
The purpose of doing this is to understand
1575
01:21:21,570 --> 01:21:24,430
more about you in real time.
1576
01:21:24,430 --> 01:21:27,830
So that a system could make inferences, perhaps,
1577
01:21:27,830 --> 01:21:30,600
like, do you have a cold?
1578
01:21:30,600 --> 01:21:33,370
Are you in a manic phase?
1579
01:21:33,370 --> 01:21:35,100
Are you feeling depressed?
1580
01:21:35,100 --> 01:21:38,700
So that is an extraordinary amount of information
1581
01:21:38,700 --> 01:21:41,670
that can be gleaned by you simply waking up
1582
01:21:41,670 --> 01:21:45,330
and asking your smart speaker, "What's the weather today?"
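
[Editor's note: a minimal, hypothetical Python sketch of the baseline comparison described above. It is not from the film; the feature names and thresholds are invented for illustration only.]

    from dataclasses import dataclass

    @dataclass
    class VoiceFeatures:
        cadence_wpm: float   # speaking rate, words per minute
        pitch_jitter: float  # 0..1, "wobble" in the voice
        sneeze_events: int   # sneezes detected in the utterance

    def drift_report(baseline, sample):
        """Flag notable departures from this speaker's stored baseline."""
        flags = []
        if sample.cadence_wpm > baseline.cadence_wpm * 1.3:
            flags.append("cadence well above baseline")
        elif sample.cadence_wpm < baseline.cadence_wpm * 0.7:
            flags.append("cadence well below baseline")
        if sample.pitch_jitter > baseline.pitch_jitter + 0.2:
            flags.append("voice wobble above baseline")
        if sample.sneeze_events > 0:
            flags.append("sneezing while talking")
        return flags

    # One "What's the weather today?" utterance against the baseline:
    baseline = VoiceFeatures(cadence_wpm=150, pitch_jitter=0.05, sneeze_events=0)
    today = VoiceFeatures(cadence_wpm=100, pitch_jitter=0.30, sneeze_events=1)
    print(drift_report(baseline, today))
    # ['cadence well below baseline', 'voice wobble above baseline',
    #  'sneezing while talking']

[A downstream system could then map such flags to guesses like "has a cold" or "feeling depressed," which is the inference step the speaker describes.]
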
1583
01:21:45,330 --> 01:21:47,430
Alexa, what's the weather for tonight?
1584
01:21:47,430 --> 01:21:50,630
Currently, in Pasadena, it's 58 degrees with cloudy skies.
1585
01:21:50,630 --> 01:21:52,900
Inside it is, then.
1586
01:21:52,900 --> 01:21:54,700
Dinner!
1587
01:21:54,700 --> 01:21:57,630
The point is that this is the same
1588
01:21:57,630 --> 01:22:01,800
micro-behavioral targeting that is directed
1589
01:22:01,800 --> 01:22:08,670
toward individuals based on intimate, detailed understanding
1590
01:22:08,670 --> 01:22:11,000
of personalities.
1591
01:22:11,000 --> 01:22:15,600
So this is precisely what Cambridge Analytica did,
1592
01:22:15,600 --> 01:22:19,300
simply pivoting from the advertisers
1593
01:22:19,300 --> 01:22:23,500
to the political outcomes.
1594
01:22:23,500 --> 01:22:26,170
NARRATOR: The Cambridge Analytica scandal of 2018
1595
01:22:26,170 --> 01:22:29,970
engulfed Facebook, forcing Mark Zuckerberg to appear
1596
01:22:29,970 --> 01:22:32,900
before Congress to explain how the data
1597
01:22:32,900 --> 01:22:36,300
of up to 87 million Facebook users had been harvested
1598
01:22:36,300 --> 01:22:42,870
by a political consulting company based in the U.K.
1599
01:22:42,870 --> 01:22:45,500
The purpose was to target and manipulate voters
1600
01:22:45,500 --> 01:22:48,170
in the 2016 presidential campaign,
1601
01:22:48,170 --> 01:22:51,800
as well as the Brexit referendum.
1602
01:22:51,800 --> 01:22:53,870
Cambridge Analytica had been largely funded
1603
01:22:53,870 --> 01:22:58,830
by conservative hedge fund billionaire Robert Mercer.
1604
01:22:58,830 --> 01:23:02,370
And now we know that any billionaire with enough money,
1605
01:23:02,370 --> 01:23:04,200
who can buy the data,
1606
01:23:04,200 --> 01:23:07,230
buy the machine intelligence capabilities,
1607
01:23:07,230 --> 01:23:10,700
buy the skilled data scientists,
1608
01:23:10,700 --> 01:23:16,330
you know, they too can commandeer the public,
1609
01:23:16,330 --> 01:23:23,370
and infect and infiltrate and upend our democracy
1610
01:23:23,370 --> 01:23:27,770
with the same methodologies that surveillance capitalism
1611
01:23:27,770 --> 01:23:32,270
uses every single day.
1612
01:23:32,270 --> 01:23:35,070
We didn't take a broad enough view of our responsibility,
1613
01:23:35,070 --> 01:23:37,230
and that was a big mistake.
1614
01:23:37,230 --> 01:23:40,830
And it was my mistake, and I'm sorry.
1615
01:23:40,830 --> 01:23:41,770
NARRATOR: Zuckerberg has apologized
1616
01:23:41,770 --> 01:23:44,370
for numerous violations of privacy,
1617
01:23:44,370 --> 01:23:47,130
and his company was recently fined $5 billion
1618
01:23:47,130 --> 01:23:50,300
by the Federal Trade Commission.
1619
01:23:50,300 --> 01:23:53,230
He has said Facebook will now make data protection a priority,
1620
01:23:53,230 --> 01:23:56,800
and the company has suspended tens of thousands
1621
01:23:56,800 --> 01:23:59,400
of third-party apps from its platform
1622
01:23:59,400 --> 01:24:02,930
as a result of an internal investigation.
1623
01:24:02,930 --> 01:24:06,870
You know, I wish I could say that after Cambridge Analytica,
1624
01:24:06,870 --> 01:24:09,030
we've learned our lesson and that everything will be much
1625
01:24:09,030 --> 01:24:12,930
better after that, but I'm afraid the opposite is true.
1626
01:24:12,930 --> 01:24:14,900
In some ways, Cambridge Analytica was using tools
1627
01:24:14,900 --> 01:24:16,830
that were ten years old.
1628
01:24:16,830 --> 01:24:18,570
It was really, in some ways, old-school,
1629
01:24:18,570 --> 01:24:20,670
first-wave data science.
1630
01:24:20,670 --> 01:24:22,270
What we're looking at now, with current tools
1631
01:24:22,270 --> 01:24:26,270
and machine learning, is that the ability for manipulation,
1632
01:24:26,270 --> 01:24:28,930
both in terms of elections and opinions,
1633
01:24:28,930 --> 01:24:31,700
but more broadly, just how information travels,
1634
01:24:31,700 --> 01:24:34,630
that is a much bigger problem,
1635
01:24:34,630 --> 01:24:36,200
and certainly much more serious than what we faced
1636
01:24:36,200 --> 01:24:40,070
with Cambridge Analytica.
1637
01:24:40,070 --> 01:24:43,670
NARRATOR: A.I. pioneer Yoshua Bengio also has concerns
1638
01:24:43,670 --> 01:24:48,470
about how his algorithms are being used.
1639
01:24:48,470 --> 01:24:51,600
So the A.I.s are tools.
1640
01:24:51,600 --> 01:24:56,330
And they will serve the people who control those tools.
1641
01:24:56,330 --> 01:25:01,700
If those people's interests go against the, the values
1642
01:25:01,700 --> 01:25:04,670
of democracy, then democracy is in danger.
1643
01:25:04,670 --> 01:25:10,430
So I believe that scientists who contribute to science,
1644
01:25:10,430 --> 01:25:14,670
when that science can or will have an impact on society,
1645
01:25:14,670 --> 01:25:17,670
those scientists have a responsibility.
1646
01:25:17,670 --> 01:25:19,700
It's a little bit like the physicists of,
1647
01:25:19,700 --> 01:25:21,800
around the Second World War,
1648
01:25:21,800 --> 01:25:25,100
who rose up to tell the governments,
1649
01:25:25,100 --> 01:25:29,130
"Wait, nuclear powercan be dangerous
1650
01:25:29,130 --> 01:25:31,930
and nuclear war can be really, really destructive."
1651
01:25:31,930 --> 01:25:36,400
And today, the equivalent of a physicist of the '40s and '50s
1652
01:25:36,400 --> 01:25:38,730
and '60s are, are the computer scientists
1653
01:25:38,730 --> 01:25:41,430
who are doing machine learning and A.I.
1654
01:25:41,430 --> 01:25:45,000
♪ ♪
1655
01:25:45,000 --> 01:25:46,300
NARRATOR: One person who wanted to do something
1656
01:25:46,300 --> 01:25:49,330
about the dangers was not a computer scientist,
1657
01:25:49,330 --> 01:25:53,330
but an ordinary citizen.
1658
01:25:53,330 --> 01:25:55,600
Alastair Mactaggart was alarmed.
1659
01:25:55,600 --> 01:25:58,800
Voting is, for me, the most alarming one.
1660
01:25:58,800 --> 01:26:00,570
If less than 100,000 votes separated
1661
01:26:00,570 --> 01:26:03,330
the last two candidates in the last presidential election,
1662
01:26:03,330 --> 01:26:06,900
in three states...
1663
01:26:06,900 --> 01:26:10,430
NARRATOR: He began a solitary campaign.
1664
01:26:10,430 --> 01:26:12,100
We're talking about convincing a relatively tiny
1665
01:26:12,100 --> 01:26:14,900
fraction of the voters in a very...
1666
01:26:14,900 --> 01:26:17,700
in a handful of states to either come out and vote
1667
01:26:17,700 --> 01:26:18,970
or stay home.
1668
01:26:18,970 --> 01:26:21,200
And remember, these companies know everybody intimately.
1669
01:26:21,200 --> 01:26:24,470
They know who's a racist, who's a misogynist,
1670
01:26:24,470 --> 01:26:26,770
who's a homophobe, who's a conspiracy theorist.
1671
01:26:26,770 --> 01:26:28,770
They know the lazy people and the gullible people.
1672
01:26:28,770 --> 01:26:31,130
They have access to the greatest trove of personal information
1673
01:26:31,130 --> 01:26:32,670
that's ever been assembled.
1674
01:26:32,670 --> 01:26:35,370
They have the world's best data scientists.
1675
01:26:35,370 --> 01:26:37,430
And they have essentially a frictionless way
1676
01:26:37,430 --> 01:26:39,670
of communicating with you.
1677
01:26:39,670 --> 01:26:43,070
This is power.
1678
01:26:43,070 --> 01:26:44,800
NARRATOR: Mactaggart started a signature drive
1679
01:26:44,800 --> 01:26:47,000
for a California ballot initiative,
1680
01:26:47,000 --> 01:26:51,230
for a law to give consumers control of their digital data.
1681
01:26:51,230 --> 01:26:54,670
In all, he would spend $4 million of his own money
1682
01:26:54,670 --> 01:26:58,600
in an effort to rein in the goliaths of Silicon Valley.
1683
01:26:58,600 --> 01:27:02,570
Google, Facebook, AT&T,and Comcast
1684
01:27:02,570 --> 01:27:06,170
all opposed his initiative.
1685
01:27:06,170 --> 01:27:09,200
I'll tell you, I was scared. Fear.
1686
01:27:09,200 --> 01:27:12,000
Fear of looking like a world-class idiot.
1687
01:27:12,000 --> 01:27:14,930
The market cap of all the firms arrayed against me were,
1688
01:27:14,930 --> 01:27:19,800
was over $6 trillion.
1689
01:27:19,800 --> 01:27:21,600
NARRATOR: He needed 500,000 signatures
1690
01:27:21,600 --> 01:27:25,070
to get his initiative on the ballot.
1691
01:27:25,070 --> 01:27:27,370
He got well over 600,000.
1692
01:27:27,370 --> 01:27:33,530
Polls showed 80% approval for a privacy law.
1693
01:27:33,530 --> 01:27:37,670
That made the politicians in Sacramento pay attention.
1694
01:27:37,670 --> 01:27:40,030
So Mactaggart decided that because he was holding
1695
01:27:40,030 --> 01:27:44,100
a strong hand, it was worth negotiating with them.
1696
01:27:44,100 --> 01:27:46,530
And if AB-375 passes by tomorrow
1697
01:27:46,530 --> 01:27:48,230
and is signed into law by the governor,
1698
01:27:48,230 --> 01:27:49,770
we will withdraw the initiative.
1699
01:27:49,770 --> 01:27:51,270
Our deadline to do so is tomorrow at 5:00.
1700
01:27:51,270 --> 01:27:53,470
NARRATOR: At the very last moment,
1701
01:27:53,470 --> 01:27:55,600
a new law was rushed to the floor of the state house.
1702
01:27:55,600 --> 01:27:57,470
Everyone take their seats, please.
1703
01:27:57,470 --> 01:28:01,800
Mr. Secretary, please call the roll.
1704
01:28:01,800 --> 01:28:05,900
The voting starts. Alan, aye.
1705
01:28:05,900 --> 01:28:07,570
And the first guy, I think, was a Republican,
1706
01:28:07,570 --> 01:28:08,870
and he voted for it.
1707
01:28:08,870 --> 01:28:10,600
And everybody had said the Republicans won't vote for it
1708
01:28:10,600 --> 01:28:11,670
because it has this private right of action,
1709
01:28:11,670 --> 01:28:13,830
where consumers can sue.
1710
01:28:13,830 --> 01:28:15,530
And the guy in the Senate, he calls the name.
1711
01:28:15,530 --> 01:28:16,770
Aye, Roth.
1712
01:28:16,770 --> 01:28:17,830
Aye, Skinner.
1713
01:28:17,830 --> 01:28:19,000
Aye, Stern.
1714
01:28:19,000 --> 01:28:20,800
Aye, Stone.
1715
01:28:20,800 --> 01:28:23,600
You can see down below, and everyone went green,
1716
01:28:23,600 --> 01:28:26,270
and then it passed unanimously.
1717
01:28:26,270 --> 01:28:29,770
Ayes 36; noes zero. The measure passes.
1718
01:28:29,770 --> 01:28:32,200
Immediate transmittal to the...
1719
01:28:32,200 --> 01:28:34,530
So I was blown away.
1720
01:28:34,530 --> 01:28:36,500
It was, it was a day I will never forget.
1721
01:28:41,770 --> 01:28:43,630
So in January, next year, you as a California resident
1722
01:28:43,630 --> 01:28:45,900
will have the right to go to any company and say,
1723
01:28:45,900 --> 01:28:47,270
"What have you collected on mein the last 12 years...
1724
01:28:47,270 --> 01:28:48,700
12 months?
1725
01:28:48,700 --> 01:28:51,370
What of my personal information do you have?"
1726
01:28:51,370 --> 01:28:52,300
So that's the first right.
1727
01:28:52,300 --> 01:28:54,200
It's right of... we call that the right to know.
1728
01:28:54,200 --> 01:28:56,230
The second is the right to say no.
1729
01:28:56,230 --> 01:28:59,030
And that's the right to go to any company and click a button,
1730
01:28:59,030 --> 01:29:00,930
on any page where they're collecting your information,
1731
01:29:00,930 --> 01:29:03,200
and say, "Do not sellmy information."
1732
01:29:03,200 --> 01:29:06,430
More importantly, we require that they honor
1733
01:29:06,430 --> 01:29:09,370
what's called a third-party opt-out.
1734
01:29:09,370 --> 01:29:11,130
You will click once in your browser,
1735
01:29:11,130 --> 01:29:13,830
"Don't sell my information,"
1736
01:29:13,830 --> 01:29:18,230
and it will then send the signal to every single website
1737
01:29:18,230 --> 01:29:21,400
that you visit: "Don't sellthis person's information."
1738
01:29:21,400 --> 01:29:22,970
And that's gonna have a huge impact on the spread
1739
01:29:22,970 --> 01:29:25,530
of your information across the internet.
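
[Editor's note: a minimal, hypothetical Python sketch of the kind of browser-level opt-out signal Mactaggart describes. It is not from the film; the "Sec-GPC" header name follows the later Global Privacy Control proposal, and treating it as the exact signal meant here is an assumption for illustration.]

    import urllib.request

    def fetch_with_opt_out(url):
        """Request a page while signaling 'do not sell my information'."""
        req = urllib.request.Request(url, headers={"Sec-GPC": "1"})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    # A compliant site would read the header on every request and
    # suppress third-party data sales for this visitor, e.g.:
    #   if request.headers.get("Sec-GPC") == "1":
    #       skip_third_party_data_sale()

[The design point is that the user states the preference once, and the browser repeats it to every site automatically, rather than the user clicking an opt-out on each page.]
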
1740
01:29:25,530 --> 01:29:27,900
NARRATOR: The tech companies had been publicly cautious,
1741
01:29:27,900 --> 01:29:31,500
but privately alarmed about regulation.
1742
01:29:31,500 --> 01:29:34,400
Then one tech giant came on board in support
1743
01:29:34,400 --> 01:29:37,000
of Mactaggart's efforts.
1744
01:29:37,000 --> 01:29:39,670
I find the reaction among other tech companies to,
1745
01:29:39,670 --> 01:29:42,730
at this point, be pretty much all over the place.
1746
01:29:42,730 --> 01:29:45,970
Some people are saying, "You're right to raise this.
1747
01:29:45,970 --> 01:29:47,430
These are good ideas."
1748
01:29:47,430 --> 01:29:49,100
Some people say, "We're not surethese are good ideas,
1749
01:29:49,100 --> 01:29:50,870
but you're right to raise it,"
1750
01:29:50,870 --> 01:29:54,300
and some people are saying, "We don't want regulation."
1751
01:29:54,300 --> 01:29:56,970
And so, you know, we have conversations with people
1752
01:29:56,970 --> 01:30:00,030
where we point out that the auto industry is better
1753
01:30:00,030 --> 01:30:03,130
because there are safety standards.
1754
01:30:03,130 --> 01:30:05,430
Pharmaceuticals, even food products,
1755
01:30:05,430 --> 01:30:08,200
all of these industries are better because the public
1756
01:30:08,200 --> 01:30:11,000
has confidence in the products,
1757
01:30:11,000 --> 01:30:14,930
in part because of a mixture of responsible companies
1758
01:30:14,930 --> 01:30:19,030
and responsible regulation.
1759
01:30:19,030 --> 01:30:21,400
NARRATOR: But the lobbyists for big tech have been working
1760
01:30:21,400 --> 01:30:24,270
the corridors in Washington.
1761
01:30:24,270 --> 01:30:26,300
They're looking for a more lenient
1762
01:30:26,300 --> 01:30:29,670
national privacy standard, one that could perhaps override
1763
01:30:29,670 --> 01:30:33,300
the California law and others like it.
1764
01:30:33,300 --> 01:30:34,570
But while hearings are held,
1765
01:30:34,570 --> 01:30:37,170
and anti-trust legislation threatened,
1766
01:30:37,170 --> 01:30:40,530
the problem is that A.I. has already spread so far
1767
01:30:40,530 --> 01:30:43,630
into our lives and work.
1768
01:30:43,630 --> 01:30:46,100
Well, it's in healthcare, it's in education,
1769
01:30:46,100 --> 01:30:48,670
it's in criminal justice, it's in the experience
1770
01:30:48,670 --> 01:30:51,230
of shopping as you walk down the street.
1771
01:30:51,230 --> 01:30:54,700
It has pervaded so many elements of everyday life,
1772
01:30:54,700 --> 01:30:57,370
and in a way that, in many cases, is completely opaque
1773
01:30:57,370 --> 01:30:59,230
to people.
1774
01:30:59,230 --> 01:31:00,830
While we can see a phone and look at it and we know that
1775
01:31:00,830 --> 01:31:02,970
there's some A.I. technology behind it,
1776
01:31:02,970 --> 01:31:05,200
many of us don't know that when we go for a job interview
1777
01:31:05,200 --> 01:31:07,030
and we sit down and we have a conversation,
1778
01:31:07,030 --> 01:31:09,970
that we're being filmed, and that our micro-expressions
1779
01:31:09,970 --> 01:31:12,670
are being analyzed by hiring companies.
1780
01:31:12,670 --> 01:31:14,700
Or that if you're in the criminal justice system,
1781
01:31:14,700 --> 01:31:16,670
that there are risk assessment algorithms
1782
01:31:16,670 --> 01:31:18,830
that are deciding your "risk number,"
1783
01:31:18,830 --> 01:31:22,530
which could determine whether or not you receive bail.
1784
01:31:22,530 --> 01:31:24,770
These are systems which, in many cases, are hidden
1785
01:31:24,770 --> 01:31:28,400
in the back end of our sort of social institutions.
1786
01:31:28,400 --> 01:31:29,400
And so, one of the big challenges we have is,
1787
01:31:29,400 --> 01:31:31,970
how do we make that more apparent?
1788
01:31:31,970 --> 01:31:32,830
How do we make it transparent?
1789
01:31:32,830 --> 01:31:36,700
And how do we make it accountable?
1790
01:31:36,700 --> 01:31:39,830
For a very long time, we have felt like, as humans,
1791
01:31:39,830 --> 01:31:43,130
as Americans, we have full agency
1792
01:31:43,130 --> 01:31:48,830
in determining our own futures-- what we read, what we see,
1793
01:31:48,830 --> 01:31:50,330
we're in charge.
1794
01:31:50,330 --> 01:31:53,070
What Cambridge Analytica taught us,
1795
01:31:53,070 --> 01:31:55,770
and what Facebook continues to teach us,
1796
01:31:55,770 --> 01:31:58,830
is that we don't have agency.
1797
01:31:58,830 --> 01:32:00,470
We're not in charge.
1798
01:32:00,470 --> 01:32:05,330
These are machines that are automating some of our skills,
1799
01:32:05,330 --> 01:32:09,330
but have made decisions about who...
1800
01:32:09,330 --> 01:32:12,630
Who we are.
1801
01:32:12,630 --> 01:32:16,300
And they're using that information to tell others
1802
01:32:16,300 --> 01:32:19,570
the story of us.
1803
01:32:19,570 --> 01:32:22,130
♪ ♪
1804
01:32:32,970 --> 01:32:35,470
NARRATOR: In China, in the age of A.I.,
1805
01:32:35,470 --> 01:32:38,130
there's no doubt about who is in charge.
1806
01:32:38,130 --> 01:32:41,370
In an authoritarian state, social stability
1807
01:32:41,370 --> 01:32:43,770
is the watchword of the government.
1808
01:32:43,770 --> 01:32:47,930
(whistle blowing)
1809
01:32:47,930 --> 01:32:51,070
And artificial intelligence has increased its ability to scan
1810
01:32:51,070 --> 01:32:54,430
the country for signs of unrest.
1811
01:32:54,430 --> 01:32:57,470
(whistle blowing)
1812
01:32:57,470 --> 01:33:00,430
It's been projected that over 600 million cameras
1813
01:33:00,430 --> 01:33:04,700
will be deployed by 2020.
1814
01:33:04,700 --> 01:33:07,730
Here, they may be used to discourage jaywalking.
1815
01:33:07,730 --> 01:33:10,400
But they also serve to remind people
1816
01:33:10,400 --> 01:33:14,830
that the state is watching.
1817
01:33:14,830 --> 01:33:18,030
And now, there is a project called Sharp Eyes,
1818
01:33:18,030 --> 01:33:22,670
which is putting cameras on every major street
1819
01:33:22,670 --> 01:33:29,730
and the corner of every village in China-- meaning everywhere.
1820
01:33:29,730 --> 01:33:33,530
Matching with the most advanced artificial intelligence
1821
01:33:33,530 --> 01:33:36,830
algorithm, which they can actually use this data,
1822
01:33:36,830 --> 01:33:39,470
real-time data, to pick up a face or pick up an action.
1823
01:33:39,470 --> 01:33:42,330
♪ ♪
1824
01:33:42,330 --> 01:33:44,370
NARRATOR: Frequent security expos feature companies
1825
01:33:44,370 --> 01:33:48,530
like Megvii and its facial-recognition technology.
1826
01:33:48,530 --> 01:33:51,970
They show off cameras with A.I. that can track cars,
1827
01:33:51,970 --> 01:33:54,800
and identify individuals by face,
1828
01:33:54,800 --> 01:33:58,270
or just by the way they walk.
1829
01:33:58,270 --> 01:34:02,130
The place is just filled with these screens where you can see
1830
01:34:02,130 --> 01:34:04,530
the computers are actually reading people's faces
1831
01:34:04,530 --> 01:34:07,630
and trying to digest that data, and basically track
1832
01:34:07,630 --> 01:34:09,700
and identify who each person is.
1833
01:34:09,700 --> 01:34:11,600
And it's incredible to see so many,
1834
01:34:11,600 --> 01:34:12,870
because just two or three years ago,
1835
01:34:12,870 --> 01:34:14,700
we hardly saw that kind of thing.
1836
01:34:14,700 --> 01:34:16,700
So, a big part of it is government spending.
1837
01:34:16,700 --> 01:34:18,370
And so the technology's really taken off,
1838
01:34:18,370 --> 01:34:21,530
and a lot of companies have started to sort of glom onto
1839
01:34:21,530 --> 01:34:25,700
this idea that this is the future.
1840
01:34:25,700 --> 01:34:29,470
China is on its way to building
1841
01:34:29,470 --> 01:34:32,330
a total surveillance state.
1842
01:34:32,330 --> 01:34:33,830
NARRATOR: And this is the test lab
1843
01:34:33,830 --> 01:34:36,370
for the surveillance state.
1844
01:34:36,370 --> 01:34:40,170
Here, in the far northwest of China,
1845
01:34:40,170 --> 01:34:41,830
is the autonomous region of Xinjiang.
1846
01:34:41,830 --> 01:34:45,170
Of the 25 million people who live here,
1847
01:34:45,170 --> 01:34:48,170
almost half are a Muslim Turkic-speaking people
1848
01:34:48,170 --> 01:34:52,400
called the Uighurs.
1849
01:34:52,400 --> 01:34:53,900
(people shouting)
1850
01:34:53,900 --> 01:34:57,670
In 2009, tensions with local Han Chinese led to protests
1851
01:34:57,670 --> 01:35:01,300
and then riots in the capital, Urumqi.
1852
01:35:01,300 --> 01:35:04,200
(people shouting, guns firing)
1853
01:35:04,200 --> 01:35:05,670
(people shouting)
1854
01:35:08,200 --> 01:35:11,200
As the conflict has grown, the authorities have brought in
1855
01:35:11,200 --> 01:35:13,670
more police, and deployed extensive
1856
01:35:13,670 --> 01:35:17,530
surveillance technology.
1857
01:35:17,530 --> 01:35:20,700
That data feeds an A.I. system that the government claims
1858
01:35:20,700 --> 01:35:24,300
can predict individuals prone to "terrorism"
1859
01:35:24,300 --> 01:35:27,570
and detect those in need of "re-education"
1860
01:35:27,570 --> 01:35:30,730
in scores of recently built camps.
1861
01:35:30,730 --> 01:35:35,700
It is a campaign that has alarmed human rights groups.
1862
01:35:35,700 --> 01:35:39,100
Chinese authorities are, without any legal basis,
1863
01:35:39,100 --> 01:35:42,770
arbitrarily detaining up to a million Turkic Muslims
1864
01:35:42,770 --> 01:35:44,800
simply on the basis of their identity.
1865
01:35:44,800 --> 01:35:49,230
But even outside the facilities in which these people
1866
01:35:49,230 --> 01:35:51,300
are being held, most of the population there
1867
01:35:51,300 --> 01:35:53,470
is being subjected to extraordinary levels
1868
01:35:53,470 --> 01:35:58,470
of high-tech surveillance such that almost no aspect of life
1869
01:35:58,470 --> 01:36:01,100
anymore, you know, takes place outside
1870
01:36:01,100 --> 01:36:02,600
the state's line of sight.
1871
01:36:02,600 --> 01:36:06,230
And so the kinds of behavior that's now being monitored--
1872
01:36:06,230 --> 01:36:07,770
you know, which language do you speak at home,
1873
01:36:07,770 --> 01:36:09,530
whether you're talking to your relatives
1874
01:36:09,530 --> 01:36:13,330
in other countries, how often you pray--
1875
01:36:13,330 --> 01:36:16,230
that information is now being hoovered up
1876
01:36:16,230 --> 01:36:19,230
and used to decide whether people should be subjected
1877
01:36:19,230 --> 01:36:21,800
to political re-education in these camps.
1878
01:36:21,800 --> 01:36:24,570
NARRATOR: There have been reports of torture
1879
01:36:24,570 --> 01:36:27,000
and deaths in the camps.
1880
01:36:27,000 --> 01:36:28,800
And for Uighurs on the outside,
1881
01:36:28,800 --> 01:36:31,670
Xinjiang has already been described
1882
01:36:31,670 --> 01:36:34,770
as an "open-air prison."
1883
01:36:34,770 --> 01:36:36,930
Trying to have a normal life as a Uighur
1884
01:36:36,930 --> 01:36:40,430
is impossible both inside and outside of China.
1885
01:36:40,430 --> 01:36:43,530
Just imagine, while you're on your way to work,
1886
01:36:43,530 --> 01:36:47,600
police subject you to an I.D. scan,
1887
01:36:47,600 --> 01:36:51,770
forcing you to lift your chin, while machines take your photo
1888
01:36:51,770 --> 01:36:54,970
and wait... you wait until you find out if you can go.
1889
01:36:54,970 --> 01:36:59,100
Imagine police take your phone and run a data scan,
1890
01:36:59,100 --> 01:37:02,500
and force you to install compulsory software
1891
01:37:02,500 --> 01:37:07,630
allowing your phone calls and messages to be monitored.
1892
01:37:07,630 --> 01:37:09,730
NARRATOR: Nury Turkel, a lawyer and a prominent
1893
01:37:09,730 --> 01:37:14,470
Uighur activist, addresses a demonstration in Washington, DC.
1894
01:37:14,470 --> 01:37:18,700
Many among the Uighur diaspora have lost all contact
1895
01:37:18,700 --> 01:37:21,030
with their families back home.
1896
01:37:21,030 --> 01:37:26,100
Turkel warns that this dystopian deployment of new technology
1897
01:37:26,100 --> 01:37:29,330
is a demonstration project for authoritarian regimes
1898
01:37:29,330 --> 01:37:31,430
around the world.
1899
01:37:31,430 --> 01:37:35,430
They have bar codes on people's home doors
1900
01:37:35,430 --> 01:37:39,730
to identify what kind of citizen he is.
1901
01:37:39,730 --> 01:37:42,670
What we're talking about is a collective punishment
1902
01:37:42,670 --> 01:37:45,100
of an ethnic group.
1903
01:37:45,100 --> 01:37:48,200
Not only that, the Chinese government has been promoting
1904
01:37:48,200 --> 01:37:53,100
its methods, its technology, it is...
1905
01:37:53,100 --> 01:37:58,630
to other countries, namely Pakistan, Venezuela, Sudan,
1906
01:37:58,630 --> 01:38:04,400
and others to utilize, to squelch political resentment
1907
01:38:04,400 --> 01:38:07,970
or prevent a political upheaval in their various societies.
1908
01:38:07,970 --> 01:38:10,430
♪ ♪
1909
01:38:10,430 --> 01:38:13,500
NARRATOR: China has a grand scheme to spread its technology
1910
01:38:13,500 --> 01:38:15,470
and influence around the world.
1911
01:38:15,470 --> 01:38:19,770
Launched in 2013, it started along the old Silk Road
1912
01:38:19,770 --> 01:38:23,270
out of Xinjiang, and now goes far beyond.
1913
01:38:23,270 --> 01:38:29,570
It's called "the Belt and RoadInitiative."
1914
01:38:29,570 --> 01:38:31,370
So effectively what the Belt and Road
1915
01:38:31,370 --> 01:38:35,630
is is China's attempt to, via spending and investment,
1916
01:38:35,630 --> 01:38:37,700
project its influence all over the world.
1917
01:38:37,700 --> 01:38:39,830
And we've seen, you know, massive infrastructure projects
1918
01:38:39,830 --> 01:38:43,300
going in in places like Pakistan, in, in Venezuela,
1919
01:38:43,300 --> 01:38:45,330
in Ecuador, in Bolivia--
1920
01:38:45,330 --> 01:38:47,400
you know, all over the world, Argentina,
1921
01:38:47,400 --> 01:38:49,630
in America's backyard, in Africa.
1922
01:38:49,630 --> 01:38:51,530
Africa's been a huge place.
1923
01:38:51,530 --> 01:38:54,230
And what the Belt and Road ultimately does is, it attempts
1924
01:38:54,230 --> 01:38:56,400
to kind of create political leverage
1925
01:38:56,400 --> 01:39:00,070
for the Chinese spending campaign all over the globe.
1926
01:39:00,070 --> 01:39:03,600
NARRATOR: Like Xi Jinping's 2018 visit to Senegal,
1927
01:39:03,600 --> 01:39:06,700
where Chinese contractors had just built a new stadium,
1928
01:39:06,700 --> 01:39:10,970
arranged loans for a new infrastructure development,
1929
01:39:10,970 --> 01:39:13,270
and, said the Foreign Ministry,
1930
01:39:13,270 --> 01:39:16,370
there would be help "maintaining social stability."
1931
01:39:16,370 --> 01:39:19,470
As China comes into these countries and provides
1932
01:39:19,470 --> 01:39:21,770
these loans, what you end up with is Chinese technology
1933
01:39:21,770 --> 01:39:24,430
being sold and built out by, you know, by Chinese companies
1934
01:39:24,430 --> 01:39:26,370
in these countries.
1935
01:39:26,370 --> 01:39:27,500
We've started to see it already in terms
1936
01:39:27,500 --> 01:39:29,170
of surveillance systems.
1937
01:39:29,170 --> 01:39:31,170
Not the kind of high-level A.I. stuff yet, but, you know,
1938
01:39:31,170 --> 01:39:32,600
lower-level, camera-based, you know,
1939
01:39:32,600 --> 01:39:36,670
manual sort of observation-type things all over.
1940
01:39:36,670 --> 01:39:38,200
You know, you see it in Cambodia, you see it in Ecuador,
1941
01:39:38,200 --> 01:39:39,770
you see it in Venezuela.
1942
01:39:39,770 --> 01:39:42,570
And what they do is, they sell a dam, sell some other stuff,
1943
01:39:42,570 --> 01:39:44,100
and they say, "You know,by the way, we can give you
1944
01:39:44,100 --> 01:39:46,730
these camera systems for your emergency response.
1945
01:39:46,730 --> 01:39:49,000
And it'll cost you $300 million,
1946
01:39:49,000 --> 01:39:50,500
and we'll build a ton of cameras,
1947
01:39:50,500 --> 01:39:52,800
and we'll build you a kind of, you know, a main center
1948
01:39:52,800 --> 01:39:55,100
where you have police who can watch these cameras."
1949
01:39:55,100 --> 01:39:57,870
And that's going in all over the world already.
1950
01:39:57,870 --> 01:40:03,570
♪ ♪
1951
01:40:03,570 --> 01:40:06,600
There are 58 countries that are starting to plug in
1952
01:40:06,600 --> 01:40:10,230
to China's vision of artificial intelligence.
1953
01:40:10,230 --> 01:40:15,300
Which means effectively that China is in the process
1954
01:40:15,300 --> 01:40:17,770
of raising a bamboo curtain.
1955
01:40:17,770 --> 01:40:20,770
One that does not need to...
1956
01:40:20,770 --> 01:40:24,130
One that is sort of all-encompassing,
1957
01:40:24,130 --> 01:40:26,700
that has shared resources,
1958
01:40:26,700 --> 01:40:28,600
shared telecommunications systems,
1959
01:40:28,600 --> 01:40:31,970
shared infrastructure, shared digital systems--
1960
01:40:31,970 --> 01:40:35,400
even shared mobile-phone technologies--
1961
01:40:35,400 --> 01:40:38,700
that is, that is quickly going up all around the world
1962
01:40:38,700 --> 01:40:41,700
to the exclusion of us in the West.
1963
01:40:41,700 --> 01:40:43,170
Well, one of the things I worry about the most
1964
01:40:43,170 --> 01:40:45,130
is that the world is gonna split in two,
1965
01:40:45,130 --> 01:40:47,230
and that there will be a Chinese tech sector
1966
01:40:47,230 --> 01:40:48,970
and there will be an American tech sector.
1967
01:40:48,970 --> 01:40:51,830
And countries will effectively get to choose
1968
01:40:51,830 --> 01:40:53,170
which one they want.
1969
01:40:53,170 --> 01:40:55,770
It'll be kind of like the Cold War, where you decide,
1970
01:40:55,770 --> 01:40:57,970
"Oh, are we gonna alignwith the Soviet Union
1971
01:40:57,970 --> 01:40:59,570
or are we gonna align with the United States?"
1972
01:40:59,570 --> 01:41:02,200
And the Third World gets to choose this or that.
1973
01:41:02,200 --> 01:41:06,000
And that's not a world that's good for anybody.
1974
01:41:06,000 --> 01:41:09,130
The markets in Asia and the U.S. falling sharply
1975
01:41:09,130 --> 01:41:11,470
on news that a top Chinese executive
1976
01:41:11,470 --> 01:41:13,100
has been arrested in Canada.
1977
01:41:13,100 --> 01:41:14,300
Her name is Sabrina Meng.
1978
01:41:14,300 --> 01:41:19,530
She is the CFO of the Chinese telecom Huawei.
1979
01:41:19,530 --> 01:41:21,270
NARRATOR: News of the dramatic arrest of an important
1980
01:41:21,270 --> 01:41:24,530
Huawei executive was ostensibly about the company
1981
01:41:24,530 --> 01:41:26,430
doing business with Iran.
1982
01:41:26,430 --> 01:41:29,600
But it seemed to be more about American distrust
1983
01:41:29,600 --> 01:41:32,600
of the company's technology.
1984
01:41:32,600 --> 01:41:33,900
From its headquarters in southern China--
1985
01:41:33,900 --> 01:41:38,930
designed to look like fanciful European capitals--
1986
01:41:38,930 --> 01:41:41,570
Huawei is the second-biggest seller of smartphones,
1987
01:41:41,570 --> 01:41:45,630
and the world leader in building 5G networks,
1988
01:41:45,630 --> 01:41:50,970
the high-speed backbone for the age of A.I.
1989
01:41:50,970 --> 01:41:53,070
Huawei's C.E.O., a former officer
1990
01:41:53,070 --> 01:41:54,970
in the People's Liberation Army,
1991
01:41:54,970 --> 01:41:57,830
was defiant about the American actions.
1992
01:41:57,830 --> 01:41:59,470
(speaking Mandarin)
1993
01:41:59,470 --> 01:42:02,600
(translated): There's no way the U.S. can crush us.
1994
01:42:02,600 --> 01:42:08,500
The world needs Huawei because we are more advanced.
1995
01:42:08,500 --> 01:42:12,900
If the lights go out in the West, the East will still shine.
1996
01:42:12,900 --> 01:42:16,270
And if the North goes dark, then there is still the South.
1997
01:42:16,270 --> 01:42:19,730
America doesn't represent the world.
1998
01:42:19,730 --> 01:42:22,270
NARRATOR: The U.S. government fears that as Huawei supplies
1999
01:42:22,270 --> 01:42:26,400
countries around the world with 5G,
2000
01:42:26,400 --> 01:42:28,670
the Chinese government could have back-door access
2001
01:42:28,670 --> 01:42:30,700
to their equipment.
2002
01:42:30,700 --> 01:42:34,400
Recently, the C.E.O. promised complete transparency
2003
01:42:34,400 --> 01:42:36,700
into the company's software,
2004
01:42:36,700 --> 01:42:39,470
but U.S. authorities are not convinced.
2005
01:42:39,470 --> 01:42:44,530
Nothing in China exists free and clear of the party-state.
2006
01:42:44,530 --> 01:42:48,730
Those companies can only exist and prosper
2007
01:42:48,730 --> 01:42:51,030
at the sufferance of the party.
2008
01:42:51,030 --> 01:42:55,030
And it's made very explicit that when the party needs them,
2009
01:42:55,030 --> 01:42:58,900
they either have to respond or they will be dethroned.
2010
01:42:58,900 --> 01:43:03,770
So this is the challenge with a company like Huawei.
2011
01:43:03,770 --> 01:43:08,900
So Huawei, Ren Zhengfei, the head of Huawei, he can say,
2012
01:43:08,900 --> 01:43:12,000
"Well, we... we're just aprivate company and we just...
2013
01:43:12,000 --> 01:43:15,470
We don't take orders from the Communist Party."
2014
01:43:15,470 --> 01:43:18,370
Well, maybe they haven't yet.
2015
01:43:18,370 --> 01:43:20,870
But what the Pentagon sees,
2016
01:43:20,870 --> 01:43:23,100
the National Intelligence Council sees,
2017
01:43:23,100 --> 01:43:27,070
and what the FBI sees is, "Well, maybe not yet."
2018
01:43:27,070 --> 01:43:30,200
But when the call comes,
2019
01:43:30,200 --> 01:43:35,430
everybody knows what the company's response will be.
2020
01:43:35,430 --> 01:43:37,000
NARRATOR: The U.S. Commerce Department
2021
01:43:37,000 --> 01:43:39,400
has recently blacklisted eight companies
2022
01:43:39,400 --> 01:43:42,870
for doing business with government agencies in Xinjiang,
2023
01:43:42,870 --> 01:43:45,370
claiming they are aiding in the "repression"
2024
01:43:45,370 --> 01:43:49,300
of the Muslim minority.
2025
01:43:49,300 --> 01:43:52,270
Among the companies is Megvii.
2026
01:43:52,270 --> 01:43:55,170
They have strongly objected to the blacklist,
2027
01:43:55,170 --> 01:43:57,630
saying that it's "amisunderstanding of our company
2028
01:43:57,630 --> 01:44:01,500
and our technology."
2029
01:44:01,500 --> 01:44:04,430
♪ ♪
2030
01:44:04,430 --> 01:44:07,370
President Xi has increased his authoritarian grip
2031
01:44:07,370 --> 01:44:11,070
on the country.
2032
01:44:11,070 --> 01:44:14,530
In 2018, he had the Chinese constitution changed
2033
01:44:14,530 --> 01:44:20,070
so that he could be president for life.
2034
01:44:20,070 --> 01:44:21,370
If you had asked me 20 years ago,
2035
01:44:21,370 --> 01:44:23,230
"What will happen to China?",I would've said,
2036
01:44:23,230 --> 01:44:27,170
"Well, over time, the GreatFirewall will break down.
2037
01:44:27,170 --> 01:44:29,770
Of course, people will get access to social media,
2038
01:44:29,770 --> 01:44:31,800
they'll get access to Google...
2039
01:44:31,800 --> 01:44:35,700
Eventually, it'll become a much more democratic place,
2040
01:44:35,700 --> 01:44:38,370
with free expression and lots of Western values."
2041
01:44:38,370 --> 01:44:41,870
And the last time I checked, that has not happened.
2042
01:44:41,870 --> 01:44:46,600
In fact, technology's become a tool of control.
2043
01:44:46,600 --> 01:44:48,570
And as China has gone through this amazing period of growth
2044
01:44:48,570 --> 01:44:51,870
and wealth and openness in certain ways,
2045
01:44:51,870 --> 01:44:53,430
there has not been the democratic transformation
2046
01:44:53,430 --> 01:44:55,330
that I thought.
2047
01:44:55,330 --> 01:44:57,570
And it may turn out that, in fact,
2048
01:44:57,570 --> 01:45:00,600
technology is a better tool for authoritarian governments
2049
01:45:00,600 --> 01:45:02,570
than it is for democratic governments.
2050
01:45:02,570 --> 01:45:04,900
NARRATOR: To dominate the world in A.I.,
2051
01:45:04,900 --> 01:45:08,000
President Xi is depending on Chinese tech
2052
01:45:08,000 --> 01:45:11,330
to lead the way.
2053
01:45:11,330 --> 01:45:13,030
While companies like Baidu, Alibaba,
2054
01:45:13,030 --> 01:45:17,870
and Tencent are growing more powerful and competitive,
2055
01:45:17,870 --> 01:45:20,500
they're also beginning to have difficulty accessing
2056
01:45:20,500 --> 01:45:24,930
American technology, and are racing to develop their own.
2057
01:45:27,600 --> 01:45:31,200
With a continuing trade war and growing distrust,
2058
01:45:31,200 --> 01:45:33,630
the longtime argument for engagement
2059
01:45:33,630 --> 01:45:38,230
between the two countries has been losing ground.
2060
01:45:38,230 --> 01:45:42,100
I've seen more and more of my colleagues move
2061
01:45:42,100 --> 01:45:44,270
from a position where they thought,
2062
01:45:44,270 --> 01:45:47,330
"Well, if we just keep engagingChina,
2063
01:45:47,330 --> 01:45:50,800
the lines between the two countries
2064
01:45:50,800 --> 01:45:52,800
will slowly converge."
2065
01:45:52,800 --> 01:45:56,570
You know, whether it's in economics, technology, politics.
2066
01:45:56,570 --> 01:45:58,370
And the transformation is
2067
01:45:58,370 --> 01:46:01,230
where they now think they're diverging.
2068
01:46:01,230 --> 01:46:05,000
So, in other words, the whole idea of engagement
2069
01:46:05,000 --> 01:46:07,430
is coming under question.
2070
01:46:07,430 --> 01:46:15,600
And that's cast an entirely different light on technology,
2071
01:46:15,600 --> 01:46:18,800
because if you're diverging and you're heading into a world
2072
01:46:18,800 --> 01:46:23,900
of antagonism-- you know, conflict, possibly,
2073
01:46:23,900 --> 01:46:25,930
then suddenly, technology is something
2074
01:46:25,930 --> 01:46:27,930
that you don't want to share.
2075
01:46:27,930 --> 01:46:30,870
You want to sequester it,
2076
01:46:30,870 --> 01:46:34,300
to protect your own national interest.
2077
01:46:34,300 --> 01:46:38,130
And I think the tipping-point moment we are at now,
2078
01:46:38,130 --> 01:46:41,130
which is what is casting the whole question of things
2079
01:46:41,130 --> 01:46:45,130
like artificial intelligence and technological innovation
2080
01:46:45,130 --> 01:46:47,470
into a completely different framework,
2081
01:46:47,470 --> 01:46:51,700
is that if in fact China and the U.S. are in some way
2082
01:46:51,700 --> 01:46:54,730
fundamentally antagonistic to each other,
2083
01:46:54,730 --> 01:46:59,900
then we're in a completely different world.
2084
01:46:59,900 --> 01:47:05,670
NARRATOR: In the age of A.I., a new reality is emerging.
2085
01:47:05,670 --> 01:47:07,600
That with so much accumulated investment
2086
01:47:07,600 --> 01:47:11,800
and intellectual power, the world is already dominated
2087
01:47:11,800 --> 01:47:16,200
by just two A.I. superpowers.
2088
01:47:16,200 --> 01:47:22,130
That's the premise of a new book written by Kai-Fu Lee.
2089
01:47:22,130 --> 01:47:23,500
Hi, I'm Kai-Fu.
2090
01:47:23,500 --> 01:47:25,600
Hi, Dr. Lee, so nice to meet you.
2091
01:47:25,600 --> 01:47:26,670
Really nice to meet you.
2092
01:47:26,670 --> 01:47:28,230
Look at all these dog ears.
2093
01:47:28,230 --> 01:47:30,000
I love, I love that. You like that?
2094
01:47:30,000 --> 01:47:31,970
But I... but I don't like that you didn't buy the book,
2095
01:47:31,970 --> 01:47:33,470
you... you borrowed it.
2096
01:47:33,470 --> 01:47:35,570
I couldn't find it! Oh, really?
2097
01:47:35,570 --> 01:47:36,900
Yeah! And, and you...
2098
01:47:36,900 --> 01:47:39,130
you're coming to my talk? Of course!
2099
01:47:39,130 --> 01:47:40,500
Oh, hi. I did my homework,
2100
01:47:40,500 --> 01:47:41,600
I'm telling you.
2101
01:47:41,600 --> 01:47:42,600
Oh, my goodness, thank you.
2102
01:47:42,600 --> 01:47:44,670
Laurie, can you get this gentleman a book?
2103
01:47:44,670 --> 01:47:46,430
(people talking in background)
2104
01:47:46,430 --> 01:47:47,730
NARRATOR: In his book and in life,
2105
01:47:47,730 --> 01:47:50,830
the computer scientist-cum-venture capitalist
2106
01:47:50,830 --> 01:47:52,230
walks a careful path.
2107
01:47:52,230 --> 01:47:56,370
Criticism of the Chinese government is avoided,
2108
01:47:56,370 --> 01:47:58,700
while capitalist success is celebrated.
2109
01:47:58,700 --> 01:48:00,800
I'm studying electrical engineering.
2110
01:48:00,800 --> 01:48:03,230
Sure, send me a resume. Okay, thanks.
2111
01:48:03,230 --> 01:48:06,230
NARRATOR: Now, with the rise of the two superpowers,
2112
01:48:06,230 --> 01:48:09,400
he wants to warn the world of what's coming.
2113
01:48:09,400 --> 01:48:11,600
Are you the new leaders?
2114
01:48:11,600 --> 01:48:14,230
If we're not the new leaders, we're pretty close.
2115
01:48:14,230 --> 01:48:15,800
(laughs)
2116
01:48:15,800 --> 01:48:18,030
Thank you very much. Thanks.
2117
01:48:18,030 --> 01:48:20,630
NARRATOR: "Never," he writes,"has the potential
2118
01:48:20,630 --> 01:48:22,800
for human flourishing been higher
2119
01:48:22,800 --> 01:48:26,230
or the stakes of failure greater."
2120
01:48:26,230 --> 01:48:27,700
♪ ♪
2121
01:48:27,700 --> 01:48:32,070
So if one has to say who's ahead, I would say today,
2122
01:48:32,070 --> 01:48:34,670
China is quickly catching up.
2123
01:48:34,670 --> 01:48:38,970
China actually began its big push
2124
01:48:38,970 --> 01:48:42,370
in A.I. only two-and-a-half years ago,
2125
01:48:42,370 --> 01:48:46,700
when the AlphaGo-Lee Sedol match became the Sputnik moment.
2126
01:48:46,700 --> 01:48:49,870
NARRATOR: He says he believes that the two A.I. superpowers
2127
01:48:49,870 --> 01:48:52,700
should lead the way and work together
2128
01:48:52,700 --> 01:48:55,270
to make A.I. a force for good.
2129
01:48:55,270 --> 01:48:58,400
If we do, we may have a chance of getting it right.
2130
01:48:58,400 --> 01:49:00,730
If we do a very good job in the next 20 years,
2131
01:49:00,730 --> 01:49:04,100
A.I. will be viewed as an age of enlightenment.
2132
01:49:04,100 --> 01:49:08,370
Our children and their children will see A.I. as serendipity.
2133
01:49:08,370 --> 01:49:13,800
That A.I. is here to liberate us from having to do routine jobs,
2134
01:49:13,800 --> 01:49:15,830
and push us to do what we love,
2135
01:49:15,830 --> 01:49:19,530
and push us to think what it means to be human.
2136
01:49:19,530 --> 01:49:23,600
NARRATOR: But what if humans mishandle this new power?
2137
01:49:23,600 --> 01:49:25,930
Kai-Fu Lee understands the stakes.
2138
01:49:25,930 --> 01:49:28,270
After all, he invested early in Megvii,
2139
01:49:28,270 --> 01:49:33,030
which is now on the U.S. blacklist.
2140
01:49:33,030 --> 01:49:35,630
He says he's reduced his stake and doesn't speak
2141
01:49:35,630 --> 01:49:38,070
for the company.
2142
01:49:38,070 --> 01:49:40,370
Asked about the government using A.I.
2143
01:49:40,370 --> 01:49:44,570
for social control, he chose his words carefully.
2144
01:49:44,570 --> 01:49:49,630
Um... A.I. is a technology that can be used
2145
01:49:49,630 --> 01:49:52,230
for good and for evil.
2146
01:49:52,230 --> 01:50:00,700
So how... how do governments limit themselves in,
2147
01:50:00,700 --> 01:50:04,670
on the one hand, using this A.I. technology
2148
01:50:04,670 --> 01:50:07,970
and the database to maintain a safe environment
2149
01:50:07,970 --> 01:50:11,400
for its citizens, but, but not encroach
2150
01:50:11,400 --> 01:50:14,370
on an individual's rights and privacy?
2151
01:50:14,370 --> 01:50:17,530
That, I think, is also a tricky issue, I think,
2152
01:50:17,530 --> 01:50:19,200
for, for every country.
2153
01:50:19,200 --> 01:50:22,170
I think for... I think every country will be tempted
2154
01:50:22,170 --> 01:50:26,030
to use A.I. probably beyond the limits
2155
01:50:26,030 --> 01:50:29,930
to which you and I would like the government to use it.
2156
01:50:35,370 --> 01:50:40,970
♪ ♪
2157
01:50:40,970 --> 01:50:43,030
NARRATOR: Emperor Yao devised the game of Go
2158
01:50:43,030 --> 01:50:48,770
to teach his son discipline, concentration, and balance.
2159
01:50:48,770 --> 01:50:52,630
Over 4,000 years later, in the age of A.I.,
2160
01:50:52,630 --> 01:50:56,230
those words still resonate with one of its architects.
2161
01:50:56,230 --> 01:50:58,330
♪ ♪
2162
01:50:58,330 --> 01:51:02,270
So A.I. can be used in many ways that are very beneficial
2163
01:51:02,270 --> 01:51:03,700
for society.
2164
01:51:03,700 --> 01:51:08,230
But the current use of A.I. isn't necessarily aligned
2165
01:51:08,230 --> 01:51:11,630
with the goals of building a better society,
2166
01:51:11,630 --> 01:51:12,900
unfortunately.
2167
01:51:12,900 --> 01:51:16,570
But, but we could change that.
2168
01:51:16,570 --> 01:51:19,570
NARRATOR: In 2016, a game of Go gave us a glimpse
2169
01:51:19,570 --> 01:51:24,970
of the future of artificial intelligence.
2170
01:51:24,970 --> 01:51:27,500
Since then, it has become clear that we will need
2171
01:51:27,500 --> 01:51:32,900
a careful strategy to harness this new and awesome power.
2172
01:51:35,800 --> 01:51:38,600
I, I do think that democracy is threatened by the progress
2173
01:51:38,600 --> 01:51:42,300
of these tools unless we improve our social norms
2174
01:51:42,300 --> 01:51:46,430
and we increase the collective wisdom
2175
01:51:46,430 --> 01:51:51,830
at the planet level to, to deal with that increased power.
2176
01:51:51,830 --> 01:51:57,600
I'm hoping that my concerns are unfounded,
2177
01:51:57,600 --> 01:51:59,900
but the stakes are so high
2178
01:51:59,900 --> 01:52:06,600
that I don't think we should take these concerns lightly.
2179
01:52:06,600 --> 01:52:11,930
I don't think we can play with those possibilities and just...
2180
01:52:11,930 --> 01:52:17,030
race ahead without thinking about the potential outcomes.
2181
01:52:17,030 --> 01:52:20,870
♪ ♪
2182
01:52:27,330 --> 01:52:31,100
Go to pbs.org/frontline for more of the impact
2183
01:52:31,100 --> 01:52:32,870
of A.I. on jobs.
2184
01:52:32,870 --> 01:52:37,730
I believe about fifty percent of jobs will be somewhat
2185
01:52:37,730 --> 01:52:41,230
or extremely threatened by A.I. in the next 15 years or so.
2186
01:52:41,230 --> 01:52:43,530
And a look at the potential for racial bias
2187
01:52:43,530 --> 01:52:45,200
in this technology.
2188
01:52:45,200 --> 01:52:47,000
We've had issues with bias, with discrimination,
2189
01:52:47,000 --> 01:52:48,670
with poor system design, with errors.
2190
01:52:48,670 --> 01:52:51,500
Connect to the "Frontline"community on Facebook
2191
01:52:51,500 --> 01:52:54,570
and Twitter, and watch anytime on the PBS Video app
2192
01:52:54,570 --> 01:52:56,500
or pbs.org/frontline.
2193
01:52:58,130 --> 01:53:02,070
♪ ♪
2194
01:53:25,000 --> 01:53:26,800
For more on this and other "Frontline" programs,
2195
01:53:26,800 --> 01:53:30,100
visit our website at pbs.org/frontline.
2196
01:53:34,930 --> 01:53:37,400
♪ ♪
2197
01:53:40,300 --> 01:53:43,530
To order "Frontline's""In the Age of A.I." on DVD,
2198
01:53:43,530 --> 01:53:48,830
visit ShopPBS or call 1-800-PLAY-PBS.
2199
01:53:48,830 --> 01:53:52,570
This program is also available on Amazon Prime Video.
2200
01:53:57,870 --> 01:54:01,170
♪ ♪