1
00:00:16,041 --> 00:00:19,083
♪ ♪
2
00:00:20,542 --> 00:00:23,583
(rattling)
3
00:00:28,125 --> 00:00:30,291
(Artificial Intelligence
voice speaking)
4
00:00:33,417 --> 00:00:36,208
We're riding in the back
of an autonomous truck.
5
00:00:36,291 --> 00:00:39,500
It's about to take a left turn
into oncoming traffic.
6
00:00:39,583 --> 00:00:41,083
So, if their sensors--
7
00:00:41,166 --> 00:00:43,291
(AI voice speaks)
8
00:00:46,166 --> 00:00:48,750
Andavolu:
It's accelerating on its own.
It's braking on its own.
9
00:00:48,834 --> 00:00:50,458
It's steering on its own.
10
00:00:51,542 --> 00:00:53,208
Trucking in
the United States is, like,
11
00:00:53,291 --> 00:00:55,375
a $700-billion-a-year industry,
12
00:00:55,458 --> 00:00:58,083
and it employs
1.8 million people.
13
00:00:58,166 --> 00:00:59,458
Thus far, it's been pretty
14
00:00:59,542 --> 00:01:02,291
immune to the changes of
globalization and technology,
15
00:01:02,375 --> 00:01:05,542
but that's about to change
with technology like this.
16
00:01:05,625 --> 00:01:07,750
(AI voice speaking)
17
00:01:08,917 --> 00:01:11,583
♪ ♪
18
00:01:11,667 --> 00:01:13,083
Andavolu:
There's an operator here,
19
00:01:13,166 --> 00:01:14,667
there's a safety engineer here,
20
00:01:14,750 --> 00:01:15,875
'cause they're
still experimenting.
21
00:01:15,959 --> 00:01:17,291
It's still kind
of in development.
22
00:01:17,375 --> 00:01:19,166
(AI voice speaks)
23
00:01:19,792 --> 00:01:22,000
Andavolu:
Have you put your hand
on the wheel at any point?
24
00:01:22,083 --> 00:01:24,208
(operator speaks)
25
00:01:25,041 --> 00:01:27,125
Andavolu:
This is like a pretty complex
traffic situation.
26
00:01:27,208 --> 00:01:29,750
There are cars merging.
There are cars passing us.
27
00:01:29,834 --> 00:01:33,625
This truck is changing lanes
to go around slow traffic,
28
00:01:33,709 --> 00:01:35,917
anticipating when
people are stopping.
29
00:01:36,458 --> 00:01:38,583
And honestly, I just, like,
kind of can't believe it.
30
00:01:38,667 --> 00:01:40,583
It's... It's driving itself,
31
00:01:40,667 --> 00:01:42,500
and it's doing
a pretty good job.
32
00:01:42,583 --> 00:01:44,083
(AI voice speaks)
33
00:01:44,166 --> 00:01:47,750
Operator:
Here we go, making our
right turn into our driveway.
34
00:01:47,834 --> 00:01:51,417
♪ ♪
35
00:01:51,500 --> 00:01:55,583
Chuck Price:
We recognize that this is
a highly disruptive technology,
36
00:01:55,667 --> 00:01:57,750
on the order of
10 million people,
37
00:01:57,834 --> 00:02:01,750
and displacing
rapidly that many people
38
00:02:01,834 --> 00:02:04,583
-would have a dramatic
societal impact.
-Mm-hmm.
39
00:02:04,667 --> 00:02:07,625
We certainly
don't want to see that.
We're not targeting that.
40
00:02:07,709 --> 00:02:10,458
We're focused
on relieving a shortage.
41
00:02:10,542 --> 00:02:14,291
But what we're hoping
is that there will be
a natural evolution
42
00:02:14,375 --> 00:02:16,959
into the jobs of the future,
just as there has been
43
00:02:17,041 --> 00:02:19,166
in every other
technological change.
44
00:02:19,250 --> 00:02:24,166
-What do you tell a trucker?
-We try not to tell truckers
things. We try to listen.
45
00:02:24,250 --> 00:02:27,083
♪ ♪
46
00:02:30,667 --> 00:02:32,458
(sizzling)
47
00:02:32,542 --> 00:02:35,542
(patrons chattering)
48
00:02:35,625 --> 00:02:37,041
This is Chuck.
49
00:02:37,125 --> 00:02:39,792
-Hello, Chuck.
-Chuck runs a company that is
50
00:02:39,875 --> 00:02:43,041
developing
and is gonna produce
self-driving trucks,
51
00:02:43,125 --> 00:02:46,083
that are autonomous,
that do it themselves.
52
00:02:46,166 --> 00:02:48,250
-You guys are truck drivers.
-Don Schrader: Yes, sir.
53
00:02:48,333 --> 00:02:50,333
How do you feel about it?
54
00:02:50,417 --> 00:02:51,834
If the truck doesn't
have a driver in it,
55
00:02:51,917 --> 00:02:54,667
is somebody at headquarters
sitting behind a monitor?
56
00:02:54,750 --> 00:02:56,000
Schrader:
No, there's
a driver in there.
57
00:02:56,083 --> 00:02:58,208
No. In the future,
58
00:02:58,291 --> 00:03:01,041
when there isn't, you know,
once it's fully tested...
59
00:03:01,125 --> 00:03:03,041
-Oh.
-...and we don't need
to have a driver in it.
60
00:03:03,125 --> 00:03:05,834
-Oh, driver assist?
-No, it's not driver assist.
61
00:03:05,917 --> 00:03:08,709
It is purely self-driving.
62
00:03:08,792 --> 00:03:10,166
So, like a GPS basically?
63
00:03:10,250 --> 00:03:12,500
You're going from
point A to point B?
64
00:03:12,583 --> 00:03:14,375
There's GPS in it.
There's a lot of tech,
65
00:03:14,458 --> 00:03:16,625
-but GPS is a part of it.
-Okay.
66
00:03:17,500 --> 00:03:19,667
Andavolu:
He took me into one
of his trucks today.
67
00:03:19,750 --> 00:03:22,458
We did a 90-minute run
up and down on I-10.
68
00:03:22,542 --> 00:03:24,959
Traffic was merging in.
It was changing lanes.
69
00:03:25,041 --> 00:03:27,291
-It was braking, it was
accelerating. It was like--
-Schrader: Really?
70
00:03:27,375 --> 00:03:29,458
So, does it do really good
when the cars cut you off?
71
00:03:29,542 --> 00:03:32,375
A car cut us off!
And you know what?
It didn't slam on the brakes.
72
00:03:32,458 --> 00:03:34,500
The truck just
kind of kept rolling.
73
00:03:34,583 --> 00:03:37,166
Schrader:
How is your equipment
handling the high winds?
74
00:03:37,250 --> 00:03:40,041
-We're actually surprisingly
good in high winds.
-Really?
75
00:03:40,125 --> 00:03:41,750
How much weight
in the trailer?
76
00:03:41,834 --> 00:03:43,250
We've gone empty and loaded.
77
00:03:43,333 --> 00:03:44,875
-Empty and loaded?
-Yes, sir.
78
00:03:44,959 --> 00:03:47,500
We have very complex
control algorithms,
79
00:03:47,583 --> 00:03:49,625
and now we hold it
with such precision that
80
00:03:49,709 --> 00:03:51,375
it is perfectly straight.
81
00:03:51,458 --> 00:03:53,208
Yeah.
It's tough.
82
00:03:53,291 --> 00:03:55,208
Andavolu:
You seem pretty impressed.
83
00:03:55,291 --> 00:03:56,834
Actually, yeah, I am.
84
00:03:56,917 --> 00:03:59,417
I wasn't all for it,
but, I mean, it's...
85
00:03:59,500 --> 00:04:01,208
I gotta,
I gotta see this.
86
00:04:01,291 --> 00:04:03,917
♪ ♪
87
00:04:09,125 --> 00:04:11,291
Price:
That's laser.
That's lidar.
88
00:04:11,375 --> 00:04:14,917
It's a laser,
like a laser radar.
89
00:04:15,000 --> 00:04:17,166
Andavolu:
What do you love
about driving a truck?
90
00:04:17,250 --> 00:04:18,583
Oh, gosh. (chuckles)
91
00:04:18,667 --> 00:04:21,458
I love to drive.
For me, it's a privilege.
92
00:04:22,458 --> 00:04:24,083
It's a privilege
to get out there,
93
00:04:24,166 --> 00:04:25,875
get behind the wheel
of 80,000 pounds,
94
00:04:25,959 --> 00:04:29,083
drive that thing down the road,
knowing, hey, you know what?
95
00:04:29,166 --> 00:04:32,083
-I can do this,
you can't do it.
-No, I can't.
96
00:04:32,166 --> 00:04:34,041
Schrader:
And I know
I'm providing the US.
97
00:04:34,125 --> 00:04:35,291
I'm providing
the world with whatever
98
00:04:35,375 --> 00:04:37,041
I got in the back
of my freight.
99
00:04:37,125 --> 00:04:39,792
-I deliver your clothes,
your food that you're eating.
-Mm-hmm.
100
00:04:39,875 --> 00:04:41,375
A lot of people
don't see that.
101
00:04:41,458 --> 00:04:44,500
And it's a good
feeling as a driver,
as a human being.
102
00:04:45,500 --> 00:04:47,333
Schrader:
You really gotta have faith
103
00:04:47,417 --> 00:04:50,417
to rely on, is
this gonna kill me or...
104
00:04:50,500 --> 00:04:53,041
Yep, and we'll have
to do a lot of testing
105
00:04:53,125 --> 00:04:54,458
-to prove that.
-Wow.
106
00:04:54,542 --> 00:04:55,875
All right.
107
00:04:55,959 --> 00:04:58,125
That's definitely different.
108
00:04:58,208 --> 00:05:01,959
Over the next year,
we're building a fleet
of 200 trucks,
109
00:05:02,041 --> 00:05:04,291
and we're gonna be operating
them day and night,
110
00:05:04,375 --> 00:05:06,458
just to validate it,
to prove it.
111
00:05:06,542 --> 00:05:07,959
We have to prove to you.
112
00:05:08,041 --> 00:05:11,792
We have to prove
to the regulators,
to the states.
113
00:05:11,875 --> 00:05:14,208
We have to prove
to ourselves.
114
00:05:14,291 --> 00:05:16,500
Schrader:
When you asked me
what would I do
if I didn't drive?
115
00:05:16,583 --> 00:05:17,959
I can't honestly answer that
116
00:05:18,041 --> 00:05:20,208
'cause I really don't know
what I would do.
117
00:05:20,291 --> 00:05:22,041
I'd... I'd be scared.
118
00:05:26,041 --> 00:05:27,417
♪ ♪
119
00:05:27,500 --> 00:05:29,291
(beeping)
120
00:05:29,375 --> 00:05:31,542
(engine rumbling)
121
00:05:32,709 --> 00:05:35,417
♪ ♪
122
00:05:44,375 --> 00:05:46,667
♪ ♪
123
00:05:46,750 --> 00:05:50,333
Andavolu:
It's one of the first questions
that every kid gets asked.
124
00:05:50,417 --> 00:05:52,750
What do you wanna be
when you grow up?
125
00:05:52,834 --> 00:05:54,375
Previous generations
grew up confident
126
00:05:54,458 --> 00:05:57,500
that no matter the answer,
they'd be something.
127
00:05:57,583 --> 00:05:59,583
That's actually not
so certain anymore.
128
00:05:59,667 --> 00:06:01,792
Automation already
affects most jobs,
129
00:06:01,875 --> 00:06:03,166
but the pace of change
130
00:06:03,250 --> 00:06:06,667
and the sheer capabilities
of artificial intelligence
131
00:06:06,750 --> 00:06:10,166
are revolutionizing
our relationship to work,
132
00:06:10,250 --> 00:06:12,917
something economists are
paying close attention to.
133
00:06:13,959 --> 00:06:16,709
Technologies are tools.
They don't decide what we do.
134
00:06:16,792 --> 00:06:18,291
We decide what we do
with those tools.
135
00:06:18,375 --> 00:06:21,583
If we have more powerful tools,
by definition,
136
00:06:21,667 --> 00:06:25,125
we have more power
to change the world
than we ever had before.
137
00:06:25,208 --> 00:06:28,709
Andrew McAfee:
It took about 600 years
138
00:06:28,792 --> 00:06:30,750
for global
average incomes
139
00:06:30,834 --> 00:06:32,709
-to increase by 50%.
-Wow.
140
00:06:32,792 --> 00:06:35,250
McAfee: Here we are,
from 1988 to now,
141
00:06:35,333 --> 00:06:38,709
almost 50% or more increases
across the board of humanity.
142
00:06:38,792 --> 00:06:41,500
This is because
of economic freedom
143
00:06:41,583 --> 00:06:43,083
and because of
technological progress.
144
00:06:43,166 --> 00:06:44,959
It just expands
the possibilities,
145
00:06:45,041 --> 00:06:46,709
and therefore,
expands our income
146
00:06:46,792 --> 00:06:49,792
so much more quickly
than we've ever seen before.
147
00:06:49,875 --> 00:06:53,500
Joseph Stiglitz: The problem is
that as the economic pie's
gotten bigger,
148
00:06:53,583 --> 00:06:56,375
not everyone has shared.
149
00:06:56,458 --> 00:06:58,792
Wages at the bottom
today are the same,
150
00:06:58,875 --> 00:07:02,417
-adjusted for inflation,
as they were 60 years ago.
-Mm-hmm.
151
00:07:02,500 --> 00:07:04,291
So, all that growth
152
00:07:04,375 --> 00:07:08,041
didn't go down to the people
at the bottom.
153
00:07:08,125 --> 00:07:10,417
Andavolu:
The future of work has
even caught the attention
154
00:07:10,500 --> 00:07:11,917
of experts like Richard Haass,
155
00:07:12,000 --> 00:07:14,041
and the Council
on Foreign Relations,
156
00:07:14,125 --> 00:07:15,667
who've authored
a report on the threats
157
00:07:15,750 --> 00:07:18,500
it poses
to geopolitical stability.
158
00:07:18,583 --> 00:07:20,834
Millions of jobs are
beginning to disappear.
159
00:07:20,917 --> 00:07:22,834
You've got now
a whole new generation,
160
00:07:22,917 --> 00:07:24,333
whether it's
artificial intelligence,
161
00:07:24,417 --> 00:07:26,125
robotics or autonomous
162
00:07:26,208 --> 00:07:28,667
or driverless vehicles
that are coming along,
163
00:07:28,750 --> 00:07:32,000
that will displace millions
of workers in this country,
164
00:07:32,083 --> 00:07:34,417
in the United States,
but also around the world.
165
00:07:34,500 --> 00:07:37,667
And then suddenly,
these technologies come along,
166
00:07:37,750 --> 00:07:41,083
and they destroy
our existing relationships.
167
00:07:41,166 --> 00:07:43,291
The stakes for us
as individuals are enormous,
168
00:07:43,375 --> 00:07:46,333
and unless
we can replace all that,
what's gonna come of us?
169
00:07:47,208 --> 00:07:51,166
Now, you drop
a few drops of blood
in the shark tank...
170
00:07:51,250 --> 00:07:52,792
-Krishna: Which is AI.
-...which is AI,
171
00:07:52,875 --> 00:07:54,291
-and machine learning.
-Machine learning.
172
00:07:54,375 --> 00:07:56,000
Goolsbee:
And we're gonna
shut down factories,
173
00:07:56,083 --> 00:07:57,667
we're gonna replace
truck drivers.
174
00:07:57,750 --> 00:08:00,417
What if everything
in the country's owned
by three people
175
00:08:00,500 --> 00:08:02,291
who are the ones
who invented the robots...
176
00:08:02,375 --> 00:08:05,375
-Jeff Bezos. Mark Zuckerberg.
-...and we're all begging them
for a crumb.
177
00:08:05,458 --> 00:08:07,959
What can I eat today? Please,
will you hand me something?
178
00:08:08,041 --> 00:08:09,792
You can see why
people get upset.
179
00:08:09,875 --> 00:08:12,500
Even thinking about it,
your blood is gonna boil.
180
00:08:13,041 --> 00:08:19,125
The discontent that is evident
in the Brexit vote in 2016,
181
00:08:19,208 --> 00:08:21,500
the vote for Trump in 2016.
182
00:08:21,583 --> 00:08:24,333
What is very clear,
there's a lot of discontent.
183
00:08:24,417 --> 00:08:27,291
A lot of people have not
been doing very well.
184
00:08:27,375 --> 00:08:28,917
It isn't clear
how long we have
185
00:08:29,000 --> 00:08:32,917
before the political system
comes under enormous stresses.
186
00:08:33,417 --> 00:08:37,000
We as a society have not even
begun to have a sustained
187
00:08:37,083 --> 00:08:39,166
or comprehensive
national conversation.
188
00:08:39,250 --> 00:08:41,000
And what worries me is
189
00:08:41,083 --> 00:08:44,792
by the time we really get
around to dealing with this,
it's gonna be too late.
190
00:08:44,875 --> 00:08:46,333
(beeps)
191
00:08:46,417 --> 00:08:48,291
(sizzling)
192
00:08:48,375 --> 00:08:49,875
♪ ♪
193
00:08:49,959 --> 00:08:52,583
(indistinct chatter)
194
00:08:52,667 --> 00:08:56,000
Andavolu:
Over 3.5 million people
work in fast food.
195
00:08:56,083 --> 00:08:57,834
It's one of the easiest
jobs to get,
196
00:08:57,917 --> 00:09:00,458
and a good first step
on the ladder up to a job
197
00:09:00,542 --> 00:09:02,500
with better pay
and less grease.
198
00:09:02,583 --> 00:09:05,250
But at Southern
California's CaliBurger,
199
00:09:05,333 --> 00:09:08,625
Gianna Toboni found out even
this first step is in jeopardy.
200
00:09:09,542 --> 00:09:11,875
Our vision is that we want
to take the restaurant industry,
201
00:09:11,959 --> 00:09:13,875
and more broadly
the retail industry,
202
00:09:13,959 --> 00:09:16,709
and make it operate
more like the internet.
203
00:09:16,792 --> 00:09:19,125
We wanna automate
as much as we can,
204
00:09:19,208 --> 00:09:21,375
and allow merchants
that operate
205
00:09:21,458 --> 00:09:23,875
brick-and-mortar businesses
to see their customers
206
00:09:23,959 --> 00:09:26,083
in the same way that Amazon
sees their customers.
207
00:09:26,166 --> 00:09:29,333
-There you go. So, you're done.
-That was 10 seconds, probably.
208
00:09:33,083 --> 00:09:34,208
(sizzling)
209
00:09:34,291 --> 00:09:37,417
♪ ♪
210
00:09:38,750 --> 00:09:41,542
Miller:
This was our first robot
to work on the grill.
211
00:09:41,625 --> 00:09:43,542
The entire fleet of robots
that we deploy
212
00:09:43,625 --> 00:09:46,083
learns from the training that
takes place here in Pasadena.
213
00:09:46,166 --> 00:09:49,500
So, unlike humans, we teach
one robot and you perfect it,
214
00:09:49,583 --> 00:09:50,959
and then you can
deploy that software
215
00:09:51,041 --> 00:09:53,166
to all the robots in the field.
216
00:09:54,750 --> 00:09:58,166
What are the advantages of
automating a business like this?
217
00:09:58,250 --> 00:10:01,583
For a restaurant chain,
consistency is critical
to success.
218
00:10:01,667 --> 00:10:03,041
Right? Critical to scale.
219
00:10:03,125 --> 00:10:05,250
These restaurants will be
safer when you automate them.
220
00:10:05,333 --> 00:10:07,917
Humans touching the food less.
221
00:10:08,000 --> 00:10:09,542
Labor costs is a big
issue right now, right?
222
00:10:09,625 --> 00:10:11,917
It's not just
rising minimum wage,
but it's turnover.
223
00:10:12,000 --> 00:10:14,208
They come in,
they get trained,
and then they leave
224
00:10:14,291 --> 00:10:16,917
to go drive an Uber
or do something else.
225
00:10:17,000 --> 00:10:19,333
I notice you still have
some employees back here.
226
00:10:19,417 --> 00:10:21,417
-It's not just Flippy
running the whole show.
-That's right. Yeah.
227
00:10:21,500 --> 00:10:23,000
Miller:
So, we currently
think about this
228
00:10:23,083 --> 00:10:25,083
as a cobotic
working arrangement,
229
00:10:25,166 --> 00:10:27,959
where we have the robot
automating certain tasks
230
00:10:28,041 --> 00:10:29,875
that humans don't
necessarily like to do.
231
00:10:29,959 --> 00:10:31,875
But we still need people
232
00:10:31,959 --> 00:10:33,667
in the kitchen managing
the robotic systems
233
00:10:33,750 --> 00:10:36,291
and working side-by-side
with the robot to do things
234
00:10:36,375 --> 00:10:38,333
that it's not possible
to automate at this
point in time.
235
00:10:38,417 --> 00:10:40,333
Toboni:
Natalie, how do you
like working here?
236
00:10:40,417 --> 00:10:42,834
I really like it.
It's something different.
237
00:10:42,917 --> 00:10:45,208
It takes a little bit
of getting used to,
238
00:10:45,291 --> 00:10:47,917
-but I really do
like it. Yeah.
-Yeah. Cool.
239
00:10:49,208 --> 00:10:51,750
(whirring)
240
00:10:51,834 --> 00:10:54,875
♪ ♪
241
00:10:56,041 --> 00:10:59,000
(whirring)
242
00:11:01,083 --> 00:11:04,417
♪ ♪
243
00:11:06,792 --> 00:11:08,458
Andavolu:
"Cobotics" is a new word,
244
00:11:08,542 --> 00:11:10,458
but the idea
has been around forever:
245
00:11:10,542 --> 00:11:14,166
Let's integrate
new tools into old tasks
and do them faster.
246
00:11:14,250 --> 00:11:16,583
(beeping, whirring)
247
00:11:18,250 --> 00:11:20,834
At Amazon's robotic-enabled
fulfillment centers,
248
00:11:20,917 --> 00:11:23,875
thousands of newly
developed AI-powered cobots
249
00:11:23,959 --> 00:11:27,333
are rebuilding how man
and machine work together.
250
00:11:27,417 --> 00:11:29,083
We actually started introducing
251
00:11:29,166 --> 00:11:31,709
robotics in around 2012.
252
00:11:32,208 --> 00:11:33,792
-So just, like,
seven years ago?
-Yep.
253
00:11:33,875 --> 00:11:37,667
Brady:
Since then, we have created
almost 300,000 jobs.
254
00:11:37,750 --> 00:11:40,000
-Just in
fulfillment centers like this?
-In fulfillment centers
255
00:11:40,083 --> 00:11:42,125
across the Amazon workforce.
256
00:11:42,792 --> 00:11:46,792
-We have this first example
of human-machine collaboration.
-Uh-huh.
257
00:11:47,375 --> 00:11:49,667
These are mobile shelves
that are drive units
258
00:11:49,750 --> 00:11:52,250
of the little orange robots
that you see below?
-Andavolu: Yeah.
259
00:11:52,333 --> 00:11:56,041
Brady:
You can move those at will,
any shelf, at any time.
260
00:11:56,125 --> 00:11:57,750
And at the right time,
like magic,
261
00:11:57,834 --> 00:11:59,542
that universal station,
it's gonna say,
262
00:11:59,625 --> 00:12:02,208
-"Hey, I think
that object is right here."
-Andavolu: Wow.
263
00:12:02,291 --> 00:12:04,709
Brady:
Now, she is gonna do
a pick operation,
264
00:12:04,792 --> 00:12:07,417
that's scanned,
and if it passes that bar,
it's the right object.
265
00:12:07,500 --> 00:12:10,417
-Look at them go!
These shelves are crazy!
-Brady: It's awesome.
266
00:12:11,041 --> 00:12:13,208
Andavolu:
But it's interesting.
So, who is sort of...
267
00:12:13,291 --> 00:12:15,959
Who's working
for who here? 'Cause, like,
the robots are coming to her.
268
00:12:16,041 --> 00:12:17,834
She's taking the stuff out
and putting it in,
269
00:12:17,917 --> 00:12:20,208
but then she has to say
to the robot, "Oh yeah,
it's actually it."
270
00:12:20,291 --> 00:12:23,208
-Is it on her time, or is
it on the robot's time?
-(laughing): I love it.
271
00:12:23,291 --> 00:12:25,333
It's... I love that question,
it's great.
272
00:12:25,417 --> 00:12:28,125
-It is a symphony of humans
and machines working together.
-Uh-huh.
273
00:12:28,208 --> 00:12:32,125
You put them both together in
order to create a better system.
274
00:12:32,208 --> 00:12:34,500
Is there a day where
there's gonna be a robot
275
00:12:34,583 --> 00:12:37,166
who can pick
and look through things
just as good as she can?
276
00:12:37,250 --> 00:12:38,959
And is that a day
that you're planning for?
277
00:12:39,041 --> 00:12:40,333
Are you already
planning for it?
278
00:12:40,417 --> 00:12:42,458
Brady:
Humans are amazing
at problem-solving.
279
00:12:42,542 --> 00:12:44,667
Humans are amazing
at generalization.
280
00:12:44,750 --> 00:12:47,458
Humans have high value,
high judgment, right?
281
00:12:47,542 --> 00:12:51,291
Why would we ever want
to separate that away
from our machines?
282
00:12:51,375 --> 00:12:53,875
We actually want to make
that more cohesive.
283
00:12:53,959 --> 00:12:55,625
♪ ♪
284
00:12:55,709 --> 00:12:58,959
-What's cool is that Amazon
is growing big time, right?
-Yeah.
285
00:12:59,041 --> 00:13:00,417
And it's creating
a lot of jobs.
286
00:13:00,500 --> 00:13:02,542
It's one of the biggest
job creators in the world.
287
00:13:02,625 --> 00:13:06,166
So, with automation working
hand in hand with people,
288
00:13:06,250 --> 00:13:07,792
is it making
jobs better?
289
00:13:07,875 --> 00:13:10,917
I think it is making it better.
First of all, our associates,
290
00:13:11,000 --> 00:13:14,333
-they choose to come and work
in our fulfillment centers.
-They choose to be here, yeah.
291
00:13:14,417 --> 00:13:17,166
And we're really proud
of the wage benefit
292
00:13:17,250 --> 00:13:19,041
that we are offering
our associates.
293
00:13:19,125 --> 00:13:20,959
-It's a $15 minimum...
-Uh-huh.
294
00:13:21,041 --> 00:13:23,291
...that we instituted this year.
We're really proud of that.
295
00:13:23,375 --> 00:13:25,959
They are the reason
that we're so successful
296
00:13:26,041 --> 00:13:27,709
inside our fulfillment centers.
297
00:13:30,583 --> 00:13:31,875
Andavolu:
This is Jav.
298
00:13:31,959 --> 00:13:34,792
He's 23 and at the very
beginning of his career.
299
00:13:34,875 --> 00:13:36,959
He dropped out of college
for financial reasons,
300
00:13:37,041 --> 00:13:38,625
then left a job
at an elementary school
301
00:13:38,709 --> 00:13:41,333
to become
an Amazon associate
because it paid better.
302
00:13:41,417 --> 00:13:46,250
He still works there,
which is why he asked us
not to use his last name.
303
00:13:46,333 --> 00:13:47,667
When I heard we was
working with robots,
304
00:13:47,750 --> 00:13:49,500
-I thought
the idea was cool.
-Uh-huh.
305
00:13:49,583 --> 00:13:53,917
It's something
I only imagined coming
out of a sci-fi movie.
306
00:13:56,458 --> 00:13:59,041
But, um,
I guess for the most part,
307
00:13:59,125 --> 00:14:01,375
I dislike
the stowing process.
308
00:14:01,458 --> 00:14:03,291
Stowing,
so what's that exactly?
309
00:14:03,375 --> 00:14:08,500
Just imagine just standing
there, just... all day (laughs),
310
00:14:08,583 --> 00:14:10,041
-for those 10 hours.
-Yeah.
311
00:14:10,125 --> 00:14:13,333
You know, you feel like
you have to move fast.
312
00:14:13,417 --> 00:14:16,500
You have to do right
by the robots, you know?
313
00:14:16,583 --> 00:14:19,792
Do the robots work for you,
or do you work for the robots?
314
00:14:19,875 --> 00:14:21,500
Wow, that's
a good question.
315
00:14:21,583 --> 00:14:23,750
I feel like I work
for the robots. (laughs)
316
00:14:25,417 --> 00:14:29,000
At Amazon, data seems
to be like this huge thing.
317
00:14:29,083 --> 00:14:32,166
Like, do they track
your data as a human?
318
00:14:32,250 --> 00:14:33,625
They track everyone.
(chuckles)
319
00:14:33,709 --> 00:14:36,208
How many products are you
actually moving in a single day?
320
00:14:36,291 --> 00:14:39,333
I think the highest
I've ever stowed was...
321
00:14:40,125 --> 00:14:42,458
-2,300 units. Yeah.
-Wow.
322
00:14:43,250 --> 00:14:46,375
You feel like the robots
you're working with. (laughs)
323
00:14:46,458 --> 00:14:48,709
♪ ♪
324
00:14:48,792 --> 00:14:49,625
Oh!
325
00:14:50,208 --> 00:14:52,041
Andavolu:
Something you wanna
keep doing for a while?
326
00:14:52,125 --> 00:14:54,000
You wanna stick
around with it?
327
00:14:54,083 --> 00:14:55,542
-What, Amazon? No.
-Yeah.
328
00:14:55,625 --> 00:14:56,959
(laughter)
329
00:14:57,041 --> 00:14:59,333
-That was quick, right?
-(laughing)
330
00:15:00,750 --> 00:15:02,458
(panting)
331
00:15:02,542 --> 00:15:04,959
What agency
do you have
332
00:15:05,041 --> 00:15:06,875
when you step
into that building?
333
00:15:06,959 --> 00:15:10,125
What am I gonna eat
for lunch? (laughs)
334
00:15:11,250 --> 00:15:12,375
But it's funny
'cause it's like,
335
00:15:12,458 --> 00:15:16,000
I'm hearing from
people at Amazon that
336
00:15:16,083 --> 00:15:18,750
human creativity
and problem-solving
337
00:15:18,834 --> 00:15:21,542
is still something
that they value.
338
00:15:23,542 --> 00:15:26,792
-You heard that
from someone?
-I did.
339
00:15:28,417 --> 00:15:29,875
I don't know.
340
00:15:29,959 --> 00:15:33,750
I haven't been put in a position
where I can like, you know,
341
00:15:33,834 --> 00:15:35,583
be creative.
342
00:15:35,667 --> 00:15:37,458
I'm pretty sure
a lot of people haven't.
343
00:15:37,542 --> 00:15:39,917
♪ ♪
344
00:15:40,000 --> 00:15:42,917
I know that one day,
I would like to get a career
345
00:15:43,000 --> 00:15:45,417
that I don't feel like
I need a vacation from.
346
00:15:45,500 --> 00:15:48,417
It's the whole thing
of being useful,
you know, like,
347
00:15:48,500 --> 00:15:50,083
that's part of being a human.
348
00:15:50,166 --> 00:15:53,834
We all have to feel useful
for something, you know?
349
00:15:55,166 --> 00:15:57,750
-Andavolu:
Do you feel replaceable?
-Jav: Yeah, of course.
350
00:15:57,834 --> 00:16:01,625
I know that if I get fired,
there'll be another person
in my place...
351
00:16:01,709 --> 00:16:04,458
-ASAP, so...
-(indistinct bus announcement)
352
00:16:04,542 --> 00:16:07,542
Andavolu: Do you think
they're gonna try to automate
you out of your job?
353
00:16:09,625 --> 00:16:11,125
Jav:
Yeah, 'cause...
354
00:16:11,208 --> 00:16:13,291
they wouldn't have
to worry about
355
00:16:13,375 --> 00:16:16,333
people being injured
on the job, you know?
356
00:16:16,417 --> 00:16:18,750
So, if they could
replace us with robots,
357
00:16:18,834 --> 00:16:21,333
I think it could be done.
358
00:16:21,417 --> 00:16:23,041
(whirring)
359
00:16:23,125 --> 00:16:25,041
Andavolu:
From 2015 to 2017,
360
00:16:25,125 --> 00:16:28,500
Amazon held competitions where
college teams designed robots
361
00:16:28,583 --> 00:16:31,417
to do more or less
what Jav does all day:
362
00:16:31,500 --> 00:16:33,542
single out objects, grab them,
363
00:16:33,625 --> 00:16:35,792
then stow them
in a specific place.
364
00:16:35,875 --> 00:16:37,083
See how I can collapse the hand,
365
00:16:37,166 --> 00:16:39,458
so I can get it
into where it is?
366
00:16:39,542 --> 00:16:41,792
A robotic hand,
just I haven't seen anything
367
00:16:41,875 --> 00:16:43,291
with that sort of dexterity.
368
00:16:43,375 --> 00:16:44,667
Andavolu:
That's Tye Brady,
369
00:16:44,750 --> 00:16:46,917
the same Amazon
exec we met earlier.
370
00:16:47,000 --> 00:16:50,166
Sure, he hasn't seen a robot
perform that task yet,
371
00:16:50,250 --> 00:16:53,083
but that's exactly why
it's the holy grail
372
00:16:53,166 --> 00:16:56,333
for making robots as
physically versatile as humans.
373
00:16:56,417 --> 00:16:59,625
♪ ♪
374
00:17:00,917 --> 00:17:03,083
-Can I... Will it hurt my hand?
-Lillian Chin: No.
375
00:17:03,166 --> 00:17:06,500
-(whirs)
-Andavolu (stammers): Hi.
Yeah, that's not bad.
376
00:17:06,583 --> 00:17:08,166
Yeah, that's actually
kind of gentle.
377
00:17:08,250 --> 00:17:11,375
So, we've spent, I don't know,
however many hundreds of PhDs
378
00:17:11,458 --> 00:17:14,166
and decades trying to make
robots smarter at grasping.
379
00:17:14,250 --> 00:17:15,667
We're starting
to get there with
380
00:17:15,750 --> 00:17:18,166
-artificial intelligence
and neural networks...
-Uh-huh.
381
00:17:18,250 --> 00:17:19,875
...but even still,
it's still very early
382
00:17:19,959 --> 00:17:21,667
-in terms of our
ability to grasp.
-Right.
383
00:17:21,750 --> 00:17:24,709
Right? And the Amazon
picking challenge is
a perfect example.
384
00:17:24,792 --> 00:17:26,625
A bunch of minds working
on it for a long time,
385
00:17:26,709 --> 00:17:28,166
and we still haven't
figured out
386
00:17:28,250 --> 00:17:30,375
how to just pick
things from a bin.
387
00:17:30,458 --> 00:17:32,041
And that's
a multi-billion dollar,
388
00:17:32,125 --> 00:17:34,542
potentially trillion dollar
value proposition.
389
00:17:34,625 --> 00:17:36,208
What's so
hard about it?
390
00:17:36,291 --> 00:17:37,458
Computers are
smart, right?
391
00:17:37,542 --> 00:17:39,000
What we think is hard
is very different
392
00:17:39,083 --> 00:17:40,709
from what computers
think is hard.
393
00:17:40,792 --> 00:17:43,208
So, we think that being
a chess grandmaster is
a hard challenge,
394
00:17:43,291 --> 00:17:45,625
but the computer can just go
through all the possibilities.
395
00:17:45,709 --> 00:17:48,792
Whereas this, there's
infinite possibilities
to grab that apple.
396
00:17:48,875 --> 00:17:51,375
What happens
when we crack
the grasping problem?
397
00:17:51,458 --> 00:17:54,041
-A lot fewer
Amazon employees.
-(laughing)
398
00:17:54,125 --> 00:17:55,750
(whirring)
399
00:17:57,208 --> 00:17:59,583
Andavolu:
This March, MIT
and Harvard debuted
400
00:17:59,667 --> 00:18:01,542
a new concept for a grabber,
401
00:18:01,625 --> 00:18:04,875
saying Amazon's just the kind
of company that could use it.
402
00:18:05,000 --> 00:18:08,166
Amazon already
sucks up
nearly 50%
403
00:18:08,250 --> 00:18:10,375
of America's
e-commerce
transactions,
404
00:18:10,458 --> 00:18:14,083
and makes a dollar for every
20 spent by American shoppers.
405
00:18:14,166 --> 00:18:17,417
There's a reason it's valued
at nearly a trillion dollars.
406
00:18:17,500 --> 00:18:19,667
But Amazon and its
tech peers, like Apple,
407
00:18:19,750 --> 00:18:22,458
Google, and Facebook,
employ far fewer people
408
00:18:22,542 --> 00:18:25,375
than the richest
companies of
previous eras.
409
00:18:25,458 --> 00:18:27,709
So, even if robots aren't
helping you find the right size
410
00:18:27,792 --> 00:18:29,583
at a brick-and-mortar
Gap quite yet,
411
00:18:29,667 --> 00:18:32,834
that doesn't mean
they aren't eroding
the future of retail work.
412
00:18:32,917 --> 00:18:36,083
-Goolsbee: I spent
most of my childhood
in Southern California.
-Mm-hmm.
413
00:18:36,166 --> 00:18:39,834
The Whitwood Mall
and La Puente Mall were the...
414
00:18:39,917 --> 00:18:42,291
the center of social life.
415
00:18:42,375 --> 00:18:45,083
Andavolu:
Austan Goolsbee is
a professor of economics
416
00:18:45,166 --> 00:18:47,709
and chaired the Council of Economic
Advisers under President Obama.
417
00:18:47,792 --> 00:18:52,250
-Retail was, kind of,
often an entry-level job...
-Mm-hmm.
418
00:18:52,333 --> 00:18:55,125
...with probably 16 million,
15 million people
419
00:18:55,208 --> 00:18:57,458
-in the United States
working retail.
-Yeah.
420
00:18:57,542 --> 00:18:59,000
And this technology,
421
00:18:59,083 --> 00:19:01,041
-if you wanna think
of it as that...
-Yeah, it's interesting.
422
00:19:01,125 --> 00:19:03,125
...replaced a different
kind of retail.
423
00:19:03,208 --> 00:19:05,792
Mm-hmm. So, could you see
a mall like this as
424
00:19:05,875 --> 00:19:07,959
an example of kind of
creative destruction?
425
00:19:08,041 --> 00:19:10,667
I mean, you can see
the destruction.
426
00:19:11,667 --> 00:19:14,083
How could you make
a living doing that?
427
00:19:14,166 --> 00:19:16,125
You know,
the Hat World.
428
00:19:16,208 --> 00:19:17,417
Hat World.
Well, there's two.
429
00:19:17,500 --> 00:19:18,959
There's Hat World
and Hat Station!
430
00:19:19,041 --> 00:19:21,417
-You know, they're competing
against each other.
-Yeah.
431
00:19:21,500 --> 00:19:22,458
And now,
432
00:19:22,542 --> 00:19:25,000
-it almost seems quaint.
-Mm-hmm.
433
00:19:25,083 --> 00:19:29,667
From the '80s and the '90s,
and into the 2000s,
434
00:19:29,750 --> 00:19:33,458
if you had, say,
a college degree,
435
00:19:33,542 --> 00:19:35,417
the technology
has been great.
436
00:19:35,500 --> 00:19:38,208
-Mm-hmm.
-And it's allowed you
to increase your income a lot.
437
00:19:38,291 --> 00:19:41,166
-That's right, yeah.
-If you're the financial guy,
438
00:19:41,250 --> 00:19:44,667
the number of deals you can do
has expanded exponentially
439
00:19:44,750 --> 00:19:47,667
as the computing power
has gone up.
440
00:19:47,750 --> 00:19:52,208
Those same technologies
have come at the expense
441
00:19:52,291 --> 00:19:55,834
-of expensive physical labor.
-Mm-hmm.
442
00:19:55,917 --> 00:19:58,208
That's the first thing
they tried to replace.
443
00:19:58,291 --> 00:20:01,500
♪ ♪
444
00:20:01,583 --> 00:20:04,959
Andavolu:
One virtue of technology
is that it's impersonal.
445
00:20:05,041 --> 00:20:07,000
It's an equal
opportunity disruptor.
446
00:20:07,083 --> 00:20:10,792
So, even as automation
and AI hit lower-wage jobs,
447
00:20:10,875 --> 00:20:13,583
they're coming for
higher-wage jobs too.
448
00:20:13,667 --> 00:20:17,250
And the people
at this MIT conference
are pretty excited about it.
449
00:20:17,333 --> 00:20:20,041
(applause)
450
00:20:20,125 --> 00:20:21,875
Eric Schmidt:
The biggest, I think, focus
451
00:20:21,959 --> 00:20:24,208
for a while is gonna be
AI acceleration.
452
00:20:24,291 --> 00:20:27,041
Basically, can we use
machine learning and AI
453
00:20:27,125 --> 00:20:28,875
in fields that have
not had it so far?
454
00:20:28,959 --> 00:20:30,250
What do we have to do?
Let's fund them,
455
00:20:30,333 --> 00:20:31,834
let's hire the people,
and so forth,
456
00:20:31,917 --> 00:20:34,583
to bring those tools
and techniques to science.
457
00:20:36,458 --> 00:20:38,083
McAfee:
Most of us walk around
458
00:20:38,166 --> 00:20:40,917
with this implicit rule
of thumb in our heads
459
00:20:41,000 --> 00:20:44,041
about how we should divide up
all the work that needs
to get done
460
00:20:44,125 --> 00:20:45,583
between human beings
and machines.
461
00:20:45,667 --> 00:20:48,625
It says, look, the machines
are better than us
at arithmetic.
462
00:20:48,709 --> 00:20:50,625
They're better
at transaction processing.
463
00:20:50,709 --> 00:20:52,375
They're better
at record keeping.
464
00:20:52,458 --> 00:20:55,417
They're better at all this
low-level detail stuff
than we are.
465
00:20:55,500 --> 00:20:58,166
Awesome. Give all that work
to the machines.
466
00:20:58,250 --> 00:21:00,750
Let the human beings
do the judgment jobs,
467
00:21:00,834 --> 00:21:03,083
the communication jobs,
the pattern-matching jobs.
468
00:21:03,166 --> 00:21:05,250
When I think about
the progress that we're seeing
469
00:21:05,333 --> 00:21:07,750
with AI and machine learning
right now,
470
00:21:07,834 --> 00:21:10,333
that progress is
calling into question
471
00:21:10,417 --> 00:21:13,250
that rule of thumb
in a really profound way.
472
00:21:13,333 --> 00:21:14,834
-Right.
-Because
what we're seeing
473
00:21:14,917 --> 00:21:16,500
over and over is
that the computers
474
00:21:16,583 --> 00:21:18,375
are better at pattern
matching than we are,
475
00:21:18,458 --> 00:21:19,917
even the expert human beings,
476
00:21:20,000 --> 00:21:22,208
and actually they've
got better judgment.
477
00:21:22,291 --> 00:21:24,333
♪ ♪
478
00:21:24,417 --> 00:21:25,750
Andavolu:
Judgment calls
are basically
479
00:21:25,834 --> 00:21:27,917
all we do
at the office
every day.
480
00:21:28,000 --> 00:21:31,458
We take the facts at hand,
run them through
past experiences,
481
00:21:31,542 --> 00:21:34,041
give them a gut check,
and then execute.
482
00:21:34,125 --> 00:21:39,291
And these days,
we're offloading judgment calls
to computers all the time.
483
00:21:39,375 --> 00:21:41,000
Whether it's to take the subway,
484
00:21:41,083 --> 00:21:43,375
take the streets,
or take the highway,
485
00:21:43,458 --> 00:21:46,458
or what to binge-watch next.
486
00:21:46,542 --> 00:21:48,542
And what's behind
these decision-making tools
487
00:21:48,625 --> 00:21:51,208
is a technology called
machine learning.
488
00:21:51,291 --> 00:21:53,041
Some machine learning
relies on programmers
489
00:21:53,125 --> 00:21:55,625
presetting the rules
of the game, like chess.
490
00:21:55,709 --> 00:21:57,667
Newswoman:
As world champion
Garry Kasparov
491
00:21:57,750 --> 00:21:59,250
walked away from the match,
492
00:21:59,333 --> 00:22:01,875
never looking back at the
computer that just beat him.
493
00:22:01,959 --> 00:22:04,291
Others utilize what's
called neural networks
494
00:22:04,375 --> 00:22:07,667
to figure out the rules
for themselves, like a baby.
495
00:22:07,750 --> 00:22:10,166
This is called deep learning.
496
00:22:10,250 --> 00:22:13,667
However it's done, programmers
use other recent innovations,
497
00:22:13,750 --> 00:22:15,875
like natural
language processing,
498
00:22:15,959 --> 00:22:18,458
image recognition
and speech recognition
499
00:22:18,542 --> 00:22:22,250
to take the messy world
as we know it, and shove it
into the machine.
500
00:22:22,333 --> 00:22:26,417
And the machine
can process more data,
more dimensions of data,
501
00:22:26,500 --> 00:22:29,500
more outcomes from the past,
more everything,
502
00:22:29,583 --> 00:22:32,875
and could even go from helping
make judgments in real time
503
00:22:32,959 --> 00:22:35,500
to making predictions
about the future.
504
00:22:35,583 --> 00:22:40,834
With vast amounts of data
available in the legal,
medical, and financial world,
505
00:22:40,917 --> 00:22:45,458
and the tools to shove
them all into the computer,
the machines are coming.
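To make the narration's distinction concrete, here is a minimal Python sketch, an illustration of the general idea rather than any system shown in the episode. The first function has its rule preset by a programmer; the second block learns the same rule from labeled examples, the way a neural network would.

    import numpy as np

    # Preset rule: a programmer writes the decision logic in advance,
    # the way early chess programs had their rules hand-coded.
    def rule_based(x0, x1):
        return x0 - x1 > 0

    # Learned rule: a tiny logistic-regression "neuron" infers the same
    # boundary purely from labeled examples.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))              # 500 examples, 2 features
    y = (X[:, 0] - X[:, 1] > 0).astype(float)  # the pattern to be learned

    w, b = np.zeros(2), 0.0
    for _ in range(2000):                      # plain gradient descent
        p = 1.0 / (1.0 + np.exp(-(X @ w + b))) # predicted probabilities
        w -= 0.1 * X.T @ (p - y) / len(y)      # gradient of the log-loss
        b -= 0.1 * (p - y).mean()

    print("learned weights:", w)  # roughly equal and opposite, like the preset rule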
506
00:22:45,542 --> 00:22:47,250
♪ ♪
507
00:22:51,834 --> 00:22:54,083
Gianna visited a tech
company called LawGeex
508
00:22:54,166 --> 00:22:56,500
to see whether their new
machine-learning software
509
00:22:56,583 --> 00:22:59,041
could put lawyers
on the chopping block.
510
00:22:59,125 --> 00:23:01,041
We've built an AI engine
511
00:23:01,125 --> 00:23:04,458
that was trained
after reviewing
512
00:23:04,542 --> 00:23:05,875
many, many, many
different contracts.
513
00:23:05,959 --> 00:23:09,667
-How many? Wow.
-Tens of thousands,
even more.
514
00:23:09,750 --> 00:23:11,083
We decided to focus on
515
00:23:11,166 --> 00:23:13,709
automating the review
and approval of contracts.
516
00:23:13,792 --> 00:23:17,166
So, simplifying and making
that process faster.
517
00:23:17,250 --> 00:23:19,792
The actual analysis
that happens on the back end
518
00:23:19,875 --> 00:23:21,000
takes a couple of seconds.
519
00:23:21,083 --> 00:23:23,291
The work is actually
going through the report
520
00:23:23,375 --> 00:23:24,750
that the system generates,
521
00:23:24,834 --> 00:23:27,542
and then fixing whatever
issue that is found.
522
00:23:27,625 --> 00:23:30,000
So, the system has flagged
all of these items for you.
523
00:23:30,083 --> 00:23:31,250
-Bechor: Exactly.
-Toboni: Okay.
524
00:23:31,333 --> 00:23:32,458
Some of them are marked red,
525
00:23:32,542 --> 00:23:34,500
meaning that they
don't match my policy,
526
00:23:34,583 --> 00:23:36,125
and some of them
are marked green,
527
00:23:36,208 --> 00:23:37,875
meaning that they
do match my policy.
528
00:23:37,959 --> 00:23:40,917
It gives me all of
the guidelines about what
529
00:23:41,000 --> 00:23:43,834
the problem means, and also what
I need to do in order to fix it,
530
00:23:43,917 --> 00:23:45,917
but then I fix the problem.
531
00:23:46,000 --> 00:23:49,375
So, didn't spell-check
start doing this in the '90s?
532
00:23:49,458 --> 00:23:50,875
-Okay, that's...
-(both laugh)
533
00:23:50,959 --> 00:23:52,458
-Like, how is this different?
-Yeah.
534
00:23:52,542 --> 00:23:54,709
The way LawGeex works
is very, very different.
535
00:23:54,792 --> 00:23:56,583
It actually looks at the text,
536
00:23:56,667 --> 00:23:58,709
understands the meaning
behind the text.
537
00:23:58,792 --> 00:24:00,417
-Oh, wow.
-Very similar
538
00:24:00,500 --> 00:24:02,709
to how a human lawyer
would review it, right?
539
00:24:02,792 --> 00:24:05,917
The only difference is
that with the AI system,
540
00:24:06,000 --> 00:24:08,000
it never forgets,
541
00:24:08,083 --> 00:24:09,709
it doesn't get tired,
542
00:24:09,792 --> 00:24:11,709
and it doesn't need
to drink coffee.
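As an illustration of the review workflow Bechor describes, a toy version might look like the Python below. It is only a keyword matcher built around the $15,000-penalty clause planted later in this segment; the actual LawGeex engine, as he says, analyzes the meaning of the text with trained models.

    # Toy contract review: flag clauses red or green against a policy list.
    POLICY = {
        "penalty": "RED: penalty clauses do not match policy",
        "indemnif": "RED: indemnification needs legal review",
        "confidential": "GREEN: confidentiality language expected in an NDA",
    }

    def review(contract_text):
        report = []
        for clause in contract_text.split("."):
            for term, verdict in POLICY.items():
                if term in clause.lower():
                    report.append((verdict, clause.strip()))
        return report

    nda = ("Each party shall keep all disclosed information confidential. "
           "The recipient will pay a penalty of $15,000 per event.")
    for verdict, clause in review(nda):
        print(verdict, "->", clause)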
543
00:24:11,792 --> 00:24:13,083
-Hey. Gianna.
-Tunji.
544
00:24:13,166 --> 00:24:14,500
-Nice to meet you.
-How's it going?
545
00:24:15,083 --> 00:24:16,583
-You ready for this?
-Yeah, I think so.
546
00:24:16,667 --> 00:24:18,917
-Let's do this.
-All right. I feel
like John Henry.
547
00:24:19,000 --> 00:24:20,208
(laughs)
548
00:24:20,291 --> 00:24:22,750
So, Tunji, you're going up
against this AI system
549
00:24:22,834 --> 00:24:25,875
with Noory to spot legal
issues in two NDAs.
550
00:24:25,959 --> 00:24:29,375
We're rating you guys
on both speed and accuracy.
551
00:24:29,458 --> 00:24:31,583
And because this isn't
a commercial for LawGeex,
552
00:24:31,667 --> 00:24:33,041
I need you to try your hardest
553
00:24:33,125 --> 00:24:35,125
-to beat this computer.
-I'm on it.
554
00:24:35,208 --> 00:24:37,333
All righty. On your marks,
555
00:24:37,417 --> 00:24:38,792
get set,
556
00:24:38,875 --> 00:24:39,625
go!
557
00:24:39,709 --> 00:24:41,917
♪ ♪
558
00:24:47,083 --> 00:24:48,709
(clicking)
559
00:24:52,208 --> 00:24:54,792
(quietly):
So, we snuck a little phrase
into this contract.
560
00:24:54,875 --> 00:24:57,333
It says, "In case of any
breach by the recipient
561
00:24:57,417 --> 00:24:58,792
"of any obligations
under this agreement,
562
00:24:58,875 --> 00:25:01,458
"the recipient will
pay Vice News a penalty
563
00:25:01,542 --> 00:25:03,875
of $15,000 per event."
564
00:25:03,959 --> 00:25:05,166
So, we're gonna see if Tunji
565
00:25:05,250 --> 00:25:07,625
and the AI system can find it.
566
00:25:08,250 --> 00:25:10,917
♪ ♪
567
00:25:15,583 --> 00:25:19,041
♪ ♪
568
00:25:19,834 --> 00:25:22,500
Poor Tunji's just still
working away over here.
569
00:25:22,583 --> 00:25:24,250
So far, it's taken
him more than
570
00:25:24,333 --> 00:25:26,333
double the time that
it took the computer.
571
00:25:26,417 --> 00:25:29,542
And meanwhile,
Noory and I just got a coffee.
572
00:25:29,625 --> 00:25:31,000
He's having
a meeting right now.
573
00:25:31,083 --> 00:25:34,875
Pretty clear just how much
time this technology saves.
574
00:25:37,458 --> 00:25:38,625
Done.
575
00:25:40,417 --> 00:25:41,750
Toboni:
Okay, guys.
576
00:25:41,834 --> 00:25:44,041
-The results are in.
-Tunji: Okay.
577
00:25:44,125 --> 00:25:48,834
LawGeex:
95% on the first NDA.
578
00:25:48,917 --> 00:25:51,000
Tunji, 85%.
579
00:25:51,083 --> 00:25:52,709
Second NDA,
580
00:25:52,792 --> 00:25:54,875
95% for the computer,
581
00:25:54,959 --> 00:25:56,875
83% for Tunji.
582
00:25:56,959 --> 00:25:58,375
You don't seem disappointed.
583
00:25:58,458 --> 00:26:00,709
I wasn't disappointed
when the iPhone came out
584
00:26:00,792 --> 00:26:02,000
and I could do more things
585
00:26:02,083 --> 00:26:03,291
with this new piece
of technology.
586
00:26:03,375 --> 00:26:05,083
So, this is exciting to me.
587
00:26:05,166 --> 00:26:07,041
So, Noory,
did the computer catch
588
00:26:07,125 --> 00:26:09,375
the phrase we put
in there about
589
00:26:09,458 --> 00:26:10,875
Vice News?
590
00:26:10,959 --> 00:26:12,083
Yeah, the computer
caught it,
591
00:26:12,166 --> 00:26:14,041
and I was able
to strike it out.
592
00:26:14,125 --> 00:26:15,542
Tunji, did you catch it?
593
00:26:15,625 --> 00:26:18,333
I straight up missed it.
I didn't see it at all.
594
00:26:18,417 --> 00:26:20,041
We take cash, check,
595
00:26:20,125 --> 00:26:22,417
whatever you have.
(laughs) Just kidding.
596
00:26:22,500 --> 00:26:25,750
So, McKinsey says that
22% of a lawyer's job,
597
00:26:25,834 --> 00:26:27,750
35% of
a paralegal's job,
598
00:26:27,834 --> 00:26:30,417
can now, today,
be automated.
599
00:26:30,500 --> 00:26:32,959
So then, what happens?
These jobs don't go away?
600
00:26:33,041 --> 00:26:34,500
People just take
longer lunch breaks
601
00:26:34,583 --> 00:26:36,125
or take on more
clients or what?
602
00:26:36,208 --> 00:26:38,208
We're kidding ourselves
if we think that
603
00:26:38,291 --> 00:26:39,709
things are not
going to change.
604
00:26:39,792 --> 00:26:42,500
But similar to pilots
with the autopilot,
605
00:26:42,583 --> 00:26:44,792
it's not like we don't
need pilots anymore.
606
00:26:44,875 --> 00:26:46,709
Could this technology
pass a bar?
607
00:26:46,792 --> 00:26:47,583
No.
608
00:26:47,667 --> 00:26:49,417
Okay, so we still need lawyers
609
00:26:49,500 --> 00:26:51,667
to be signing off
on these legal documents,
610
00:26:51,750 --> 00:26:53,542
even if they're not doing
the nitty-gritty of them.
611
00:26:53,625 --> 00:26:55,458
Absolutely. Defining the policy,
612
00:26:55,542 --> 00:26:58,500
serving as escalation points,
handling negotiation,
613
00:26:58,583 --> 00:27:01,458
and then also handling more
complex contract categories.
614
00:27:01,542 --> 00:27:04,125
-Mm-hmm.
-You have a new
generation of lawyers
615
00:27:04,208 --> 00:27:05,750
that are much
more tech-savvy.
616
00:27:05,834 --> 00:27:08,792
The ones that can actually
leverage technology
617
00:27:08,875 --> 00:27:12,917
are the ones that
manage to prosper.
618
00:27:13,000 --> 00:27:17,458
Andavolu: Unlike the law,
medicine has always been
intertwined with technology.
619
00:27:17,542 --> 00:27:20,125
But it's relationship with
robotics isn't just graceful,
620
00:27:20,208 --> 00:27:22,083
it's miraculous.
621
00:27:22,166 --> 00:27:23,875
Toboni:
What is the most
groundbreaking thing
622
00:27:23,959 --> 00:27:26,166
about how far we've come
with robotic surgery?
623
00:27:26,250 --> 00:27:28,792
So, robotic surgery
allows us to really
624
00:27:28,875 --> 00:27:31,417
treat tissue
in a more delicate way...
625
00:27:31,500 --> 00:27:32,792
(whirring)
626
00:27:32,875 --> 00:27:35,208
...allow us to be much more
efficient in suturing,
627
00:27:35,291 --> 00:27:36,458
decreasing bleeding.
628
00:27:36,542 --> 00:27:39,083
We are improving outcomes,
629
00:27:39,166 --> 00:27:40,792
shortening hospital stay,
630
00:27:40,875 --> 00:27:43,709
and that's why we're using
more and more now.
631
00:27:43,792 --> 00:27:45,667
So, the robot is
enabling surgeons,
632
00:27:45,750 --> 00:27:48,125
and by enabling
surgeons is giving
633
00:27:48,208 --> 00:27:50,583
access to more patients
to minimal invasive surgery.
634
00:27:51,333 --> 00:27:52,667
Horgan:
Okay, we're good to go.
635
00:27:52,750 --> 00:27:54,792
♪ ♪
636
00:27:54,875 --> 00:27:55,875
Toboni:
That's great, right?
637
00:27:55,959 --> 00:27:57,500
'Cause one of the huge problems
638
00:27:57,583 --> 00:28:00,667
in our health care system is
that not enough patients are
639
00:28:00,750 --> 00:28:02,834
getting seen when
they need to be seen.
640
00:28:02,917 --> 00:28:05,041
Not only that, not every
patient get the same care.
641
00:28:05,125 --> 00:28:07,500
You may have great
surgeons in one area,
642
00:28:07,583 --> 00:28:09,834
but not in other areas
with the same experience.
643
00:28:09,917 --> 00:28:12,458
The robot is flattening that.
644
00:28:12,542 --> 00:28:13,834
-We're ready?
-Man: Ready.
645
00:28:13,917 --> 00:28:17,500
♪ ♪
646
00:28:17,583 --> 00:28:19,625
Horgan:
I think that the biggest
groundbreaking part
647
00:28:19,709 --> 00:28:23,417
is the fact that I am
operating on a console,
648
00:28:23,500 --> 00:28:25,667
and the system is
on the patient.
649
00:28:25,750 --> 00:28:27,583
We have all the degrees
of articulation
650
00:28:27,667 --> 00:28:28,875
you would have in your hand
651
00:28:28,959 --> 00:28:30,667
inside the abdomen.
652
00:28:30,750 --> 00:28:32,375
That's revolutionary.
653
00:28:32,458 --> 00:28:34,792
Man:
Geez. That's close.
654
00:28:36,417 --> 00:28:37,542
Horgan:
Okay!
655
00:28:37,625 --> 00:28:40,709
I'm gonna go out
and have my espresso.
656
00:28:40,792 --> 00:28:44,542
Toboni:
The way robotic
surgery is evolving,
do you see any jobs,
657
00:28:44,625 --> 00:28:48,208
like the job of a technician
or a physician's assistant
going away?
658
00:28:48,291 --> 00:28:50,166
No, but what we have seen is
the opposite--
659
00:28:50,250 --> 00:28:52,709
is them being
much more involved.
660
00:28:52,792 --> 00:28:54,250
Okay, stapler.
661
00:28:54,333 --> 00:28:55,959
Assistant:
Loading up the seam guard.
662
00:28:56,041 --> 00:28:58,208
Horgan:
You need a nurse that knows
how to move it around.
663
00:28:58,291 --> 00:29:01,083
You need a scrub tech
that knows how to load
the instruments,
664
00:29:01,166 --> 00:29:03,166
how to clean the camera,
and how to move the arm,
665
00:29:03,250 --> 00:29:04,917
-how to undock.
-Toboni: Right.
666
00:29:05,583 --> 00:29:07,000
Horgan:
I like it a lot there.
What do you think?
667
00:29:07,083 --> 00:29:09,083
-Assistant: Yeah, looks good.
-Horgan: Only good?
668
00:29:09,166 --> 00:29:11,166
-Assistant: Perfect.
Unbelievable.
-Horgan: Eh?
669
00:29:11,250 --> 00:29:12,875
Horgan:
Look at that.
670
00:29:12,959 --> 00:29:15,166
Toboni:
How do you think
automation will
671
00:29:15,250 --> 00:29:17,166
evolve in this area?
672
00:29:17,250 --> 00:29:20,041
The anatomy changes
from patient to patient,
673
00:29:20,125 --> 00:29:21,792
but with machine learning,
674
00:29:21,875 --> 00:29:24,834
the system will be able
to recognize different tissues.
675
00:29:24,917 --> 00:29:28,083
-Mm.
-And I'm sure
that in the next year,
we'll see the system
676
00:29:28,166 --> 00:29:31,583
-telling you that's cancer,
that's not cancer.
-Wow.
677
00:29:31,667 --> 00:29:33,709
And that's where
the benefits are gonna be.
678
00:29:34,375 --> 00:29:36,834
♪ ♪
679
00:29:36,917 --> 00:29:38,792
Andavolu:
Finance has been cashing
in on the benefits
680
00:29:38,875 --> 00:29:40,417
of machine learning for years,
681
00:29:40,500 --> 00:29:41,959
hiring so many programmers
682
00:29:42,041 --> 00:29:45,083
that it's virtually
indistinguishable from tech.
683
00:29:45,166 --> 00:29:47,250
Michael Moynihan
visited Goldman Sachs
684
00:29:47,333 --> 00:29:50,417
to see how they're leveraging
AI's predictive capabilities
685
00:29:50,500 --> 00:29:52,250
to make more money.
686
00:29:52,333 --> 00:29:55,125
How has your job changed,
687
00:29:55,208 --> 00:29:56,959
and how has this
industry changed?
688
00:29:57,041 --> 00:30:00,166
Over time, there's been
a lot of automation.
689
00:30:00,250 --> 00:30:02,291
I think it lends itself
naturally to trading, right?
690
00:30:02,375 --> 00:30:04,166
If I'm trading Google,
691
00:30:04,250 --> 00:30:05,625
if I have to make
a decision whether
692
00:30:05,709 --> 00:30:07,500
I'm going to buy it
or sell it at a certain price,
693
00:30:07,583 --> 00:30:10,792
there are hundreds of variables
that go into that decision.
694
00:30:10,875 --> 00:30:13,041
And you can code an algorithm
695
00:30:13,125 --> 00:30:15,458
to assess all those variables,
696
00:30:15,542 --> 00:30:17,125
and when you swing
that bat 1,000 times,
697
00:30:17,208 --> 00:30:19,500
it's going to do it more
efficiently than a human would.
698
00:30:19,583 --> 00:30:23,959
-(bell ringing)
-So, when you look on the floor
of the New York Stock Exchange,
699
00:30:24,041 --> 00:30:26,458
there aren't a lot of
traders down there anymore.
700
00:30:26,542 --> 00:30:29,458
Moynihan:
If I went down there today,
what would I see?
701
00:30:29,542 --> 00:30:31,291
Levine:
You wouldn't see
a lot of people.
702
00:30:31,375 --> 00:30:33,083
You might see them clustered.
703
00:30:33,166 --> 00:30:35,000
Moynihan:
Little tiny clusters
of people?
704
00:30:35,083 --> 00:30:37,083
Levine:
It'll be a cluster.
705
00:30:37,166 --> 00:30:39,542
Moynihan:
So, to be in this industry now,
706
00:30:39,625 --> 00:30:41,959
do you need to understand
707
00:30:42,041 --> 00:30:45,000
lines of code and what they
do and how to produce it?
708
00:30:45,083 --> 00:30:46,750
Sinead Strain:
I'd say more and more, yes.
709
00:30:46,834 --> 00:30:49,083
If we look at this firm,
710
00:30:49,166 --> 00:30:52,291
the number of engineers,
technologists that we have here,
711
00:30:52,375 --> 00:30:54,834
we're probably the largest
division within the firm.
712
00:30:54,917 --> 00:30:56,875
♪ ♪
713
00:30:56,959 --> 00:30:58,208
So, you have
a math background?
714
00:30:58,291 --> 00:30:59,917
Yeah, math
and computer science.
715
00:31:00,000 --> 00:31:02,375
-But yours is economics
and computer science?
-Yes.
716
00:31:02,458 --> 00:31:04,166
Moynihan:
But you learned this on the fly?
717
00:31:04,250 --> 00:31:05,542
Jameson Schriber:
Yeah, in school or here.
718
00:31:05,625 --> 00:31:08,166
More likely, most of
the things were picked up here.
719
00:31:08,250 --> 00:31:10,333
Can I see like,
you know, an algorithm,
720
00:31:10,417 --> 00:31:12,291
give me some
sense of code?
721
00:31:12,375 --> 00:31:14,166
Phillips:
I'll show you a really
simple example.
722
00:31:14,250 --> 00:31:17,208
So, in this case, I'm gonna
run a volatility function,
723
00:31:17,291 --> 00:31:20,375
essentially an algorithm,
and this is showing me now,
724
00:31:20,458 --> 00:31:22,208
on a 22-day rolling basis,
725
00:31:22,291 --> 00:31:24,709
what's the volatility
level of the S&P 500.
726
00:31:24,792 --> 00:31:26,875
Essentially, we would
write code to do that.
727
00:31:26,959 --> 00:31:28,375
So, in this library,
728
00:31:28,458 --> 00:31:30,875
this is the volatility function.
It's actually fairly simple.
729
00:31:30,959 --> 00:31:32,709
-But, we--
-Wait, I'm sorry.
730
00:31:32,792 --> 00:31:34,291
That's fairly
simple?
731
00:31:34,375 --> 00:31:35,500
This is all documentation.
732
00:31:35,583 --> 00:31:37,250
This is actually a fairly
simple algorithm.
733
00:31:37,333 --> 00:31:40,125
It's essentially the code
that's generating
what you're seeing.
734
00:31:41,083 --> 00:31:43,083
Looks terribly
complicated to me.
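The computation Phillips runs, a 22-day rolling volatility of daily returns, is less forbidding than it looks on screen. Here is a generic pandas sketch, with synthetic prices standing in for an S&P 500 feed and no claim to resemble Goldman's internal library:

    import numpy as np
    import pandas as pd

    # Synthetic daily closes stand in for a real S&P 500 price feed.
    rng = np.random.default_rng(1)
    prices = pd.Series(3000 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))))

    returns = prices.pct_change()          # day-over-day returns
    # 22-day rolling standard deviation, annualized (~252 trading days).
    volatility = returns.rolling(22).std() * np.sqrt(252)

    print(volatility.dropna().tail())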
735
00:31:43,166 --> 00:31:45,166
People like
you guys still
736
00:31:45,250 --> 00:31:47,208
need to exist to create
these things, right?
737
00:31:47,291 --> 00:31:49,125
I mean, are they
self-sustaining?
738
00:31:49,208 --> 00:31:51,834
Or can you write
yourself out of a job?
739
00:31:51,917 --> 00:31:54,500
-I think we're... Yeah.
-I think that would be hard.
740
00:31:54,583 --> 00:31:56,291
♪ ♪
741
00:31:56,375 --> 00:31:58,208
Moynihan:
Formerly a tech CEO,
742
00:31:58,291 --> 00:32:00,458
Marty Chavez is
now global co-head
743
00:32:00,542 --> 00:32:03,041
of Goldman's
securities division.
744
00:32:03,125 --> 00:32:05,333
It strikes me, obviously,
that this is an industry
745
00:32:05,417 --> 00:32:08,583
that has been on
the forefront of using
746
00:32:08,667 --> 00:32:11,417
AI, computers, to...
747
00:32:11,500 --> 00:32:13,917
you know,
make big decisions
and make a lot of money.
748
00:32:14,000 --> 00:32:17,542
What has that done
to the kind of, you know,
749
00:32:17,625 --> 00:32:20,917
the job market within--
even within this company?
750
00:32:21,000 --> 00:32:26,542
It's creating new jobs that
couldn't have existed before.
751
00:32:26,625 --> 00:32:28,542
Whole new businesses
752
00:32:28,625 --> 00:32:31,041
now exist for us
and out in the world
753
00:32:31,125 --> 00:32:32,917
that wouldn't
have been possible
754
00:32:33,000 --> 00:32:34,750
without the technologies
755
00:32:34,834 --> 00:32:36,500
that have arisen
over the past few years,
756
00:32:36,583 --> 00:32:38,959
whether they're machine
learning, cloud services,
757
00:32:39,041 --> 00:32:40,250
open-source, right?
758
00:32:40,333 --> 00:32:43,125
All of those
activities go into,
759
00:32:43,208 --> 00:32:45,709
for instance,
our new consumer
760
00:32:45,792 --> 00:32:48,041
lending and deposit
taking activity.
761
00:32:51,542 --> 00:32:53,500
You know the counterargument
to this, don't you?
762
00:32:53,583 --> 00:32:56,083
These are jobs that are being
created for smart people,
763
00:32:56,166 --> 00:32:58,667
-educated people,
rich people.
-Mm-hmm.
764
00:32:58,750 --> 00:33:01,625
Other people out there,
who are being made
redundant by robots,
765
00:33:01,709 --> 00:33:03,542
don't have the skills.
766
00:33:03,625 --> 00:33:06,500
-It's only for
a rarefied few.
-Mm.
767
00:33:06,583 --> 00:33:08,250
How do you respond to that?
768
00:33:08,333 --> 00:33:10,500
Technological change is
disruptive,
769
00:33:10,583 --> 00:33:12,250
and there's
a lot of pain,
770
00:33:12,333 --> 00:33:15,625
and it's something
that we must concern
ourselves with.
771
00:33:15,709 --> 00:33:18,000
What do you do during
the disruption,
772
00:33:18,083 --> 00:33:23,291
which has been continuous
since the Agricultural
and Industrial Revolutions,
773
00:33:23,375 --> 00:33:25,709
and I expect it will continue
774
00:33:25,792 --> 00:33:28,667
and will accelerate
in all likelihood.
775
00:33:28,750 --> 00:33:31,959
And so, sitting back
and complaining about it,
776
00:33:32,041 --> 00:33:34,125
sitting back and doing
nothing about it,
777
00:33:34,208 --> 00:33:35,792
don't seem to be options.
778
00:33:35,875 --> 00:33:37,667
At the same time,
I don't think the answer is
779
00:33:37,750 --> 00:33:41,375
-to stop the progression
of technology.
-Mm.
780
00:33:41,458 --> 00:33:43,000
Haven't seen that work.
781
00:33:43,083 --> 00:33:46,125
(brakes screeching)
782
00:33:47,583 --> 00:33:51,208
(PA announcement):
This is a Queens-bound
M local train.
783
00:33:51,291 --> 00:33:54,041
The next stop is Myrtle Avenue.
784
00:33:55,542 --> 00:33:58,542
It's worth noting at this
point that even if AI tools
785
00:33:58,625 --> 00:34:01,834
are better at making decisions,
high-level judgment jobs
786
00:34:01,917 --> 00:34:04,375
aren't about to be automated
away any time soon.
787
00:34:04,458 --> 00:34:06,709
And let's be clear,
the people who are
788
00:34:06,792 --> 00:34:09,834
most excited about
AI's creative potential
789
00:34:09,917 --> 00:34:11,709
aren't the ones
getting replaced.
790
00:34:11,792 --> 00:34:15,208
So the question becomes,
as this disruption accelerates,
791
00:34:15,291 --> 00:34:18,792
how do you benefit from it
if you're not already rich,
792
00:34:18,875 --> 00:34:21,750
or white, or male,
or have a spot at the top?
793
00:34:22,875 --> 00:34:24,709
Let's put it this way.
794
00:34:24,792 --> 00:34:27,834
It's a lot easier for all
of us to get directions,
795
00:34:27,917 --> 00:34:29,625
but it's becoming
increasingly hard
796
00:34:29,709 --> 00:34:33,125
for most people to find
their way to a stable career.
797
00:34:33,208 --> 00:34:35,291
And there are not a lot
of people whining about it.
798
00:34:35,375 --> 00:34:38,125
There are a lot of people
who are racing to catch up.
799
00:34:39,834 --> 00:34:42,583
♪ ♪
800
00:34:42,667 --> 00:34:44,500
We visited a class
at Per Scholas,
801
00:34:44,583 --> 00:34:46,750
a non-profit that
skills up people
802
00:34:46,834 --> 00:34:48,083
in New York and other cities
803
00:34:48,166 --> 00:34:50,041
around the country for free.
804
00:34:50,125 --> 00:34:51,792
(indistinct chatter)
805
00:34:55,166 --> 00:34:57,500
(chatter continuing)
806
00:35:00,333 --> 00:35:03,834
Student:
I have had just about every
terrible job you can imagine.
807
00:35:03,917 --> 00:35:06,917
Fast food, grocery store,
stock clerk.
808
00:35:07,000 --> 00:35:10,792
I was a supervisor
at a retail store
while I was in college.
809
00:35:10,875 --> 00:35:13,625
I was a mechanic.
I worked in the industry
for over 10 years.
810
00:35:13,709 --> 00:35:15,291
Then I went
to substitute teaching.
811
00:35:15,375 --> 00:35:17,834
Then I went into solar,
doing sales.
812
00:35:17,917 --> 00:35:21,166
And then I was a writer,
and then I became
an English teacher.
813
00:35:21,250 --> 00:35:22,583
Reservation sales agent.
814
00:35:22,667 --> 00:35:24,375
Customer service,
marketing.
815
00:35:24,458 --> 00:35:27,083
Now I am here
at Per Scholas.
816
00:35:27,166 --> 00:35:29,959
-I should do that
in CodePen, right?
-Yeah.
817
00:35:30,041 --> 00:35:32,917
The job market is really
shifting towards a gig economy.
818
00:35:33,000 --> 00:35:35,083
What do you have
that you can work for yourself,
819
00:35:35,166 --> 00:35:36,667
or work for someone else?
820
00:35:36,750 --> 00:35:38,667
A year ago,
I would've told you
that I was gonna go
821
00:35:38,750 --> 00:35:42,125
to grad school and get a PhD
and all that good stuff.
822
00:35:42,208 --> 00:35:44,291
But the reality is
when I graduated,
823
00:35:44,375 --> 00:35:45,709
and I was looking for jobs,
824
00:35:45,792 --> 00:35:47,291
one of the main things
that kept popping up
825
00:35:47,375 --> 00:35:49,375
was software engineer,
coding, tech.
826
00:35:49,458 --> 00:35:52,542
Do you think
you would've made the move
if this place wasn't free?
827
00:35:53,291 --> 00:35:54,291
No. (laughs)
828
00:35:54,375 --> 00:35:56,208
Because just
graduated college,
829
00:35:56,291 --> 00:35:57,959
so, you know, Sallie Mae
is still knocking on my door.
830
00:35:58,041 --> 00:35:59,417
-They're knocking
on my door right now.
-(knocking on door)
831
00:35:59,500 --> 00:36:02,083
-Is that them?
-You see?
832
00:36:02,166 --> 00:36:04,667
-(laughter)
-I just said their name,
and here they are.
833
00:36:04,750 --> 00:36:06,667
When I put
the comma, right?
834
00:36:06,750 --> 00:36:09,667
And I put the P, does
that mean parent and P?
835
00:36:09,750 --> 00:36:11,291
-I'm a mom of three.
-Mm-hmm.
836
00:36:11,375 --> 00:36:14,208
So, my last job, I was working
at a busy call center,
837
00:36:14,291 --> 00:36:17,166
and I was in a place where
I felt, you know, undervalued.
838
00:36:17,250 --> 00:36:19,166
Robots was gonna come
and take my job,
839
00:36:19,250 --> 00:36:20,667
and I would've
been without a job,
840
00:36:20,750 --> 00:36:24,000
and what would I've been able
to pass on to my children?
841
00:36:24,083 --> 00:36:25,291
"Hi, how are you doing?"
842
00:36:25,375 --> 00:36:27,333
With this,
I can give them a skill.
843
00:36:27,417 --> 00:36:30,333
They can be in a better place
in the next 10, 20 years,
844
00:36:30,417 --> 00:36:33,667
opposed to how long it took
for me to figure this out.
845
00:36:33,750 --> 00:36:37,125
Do you think you're at
a disadvantage as far as
the job market itself?
846
00:36:37,208 --> 00:36:40,125
You look at the tech industry,
and it's like,
847
00:36:40,208 --> 00:36:42,333
other than South Asians,
East Asians,
848
00:36:42,417 --> 00:36:44,583
it's like, there are not
a lot of people of color.
849
00:36:44,667 --> 00:36:46,250
Yeah, I wanted
to touch on that.
850
00:36:46,333 --> 00:36:48,375
This is actually why
I decided to specifically
851
00:36:48,458 --> 00:36:50,166
go into coding because
this is one where,
852
00:36:50,250 --> 00:36:52,208
at the end of
the day, it's just
853
00:36:52,291 --> 00:36:55,792
how good your
applications are,
your codes are.
854
00:36:55,875 --> 00:36:57,834
Growing up, I used to think
this was just magic,
855
00:36:57,917 --> 00:37:02,458
or like it's done
by all the smart kids
in California, you know?
856
00:37:02,542 --> 00:37:04,750
So, I wanna prove
that it can be done
857
00:37:04,834 --> 00:37:08,375
by some kid in Queens
who just wanted to do it.
858
00:37:08,458 --> 00:37:11,000
(indistinct chatter)
859
00:37:11,083 --> 00:37:13,000
Kelly Richardson:
The perfect world is
860
00:37:13,083 --> 00:37:15,583
that we have enough investment
861
00:37:15,667 --> 00:37:17,709
that we could grow to meet
862
00:37:17,792 --> 00:37:20,041
-both the size of the demand
and the size of the supply.
-Mm-hmm.
863
00:37:20,125 --> 00:37:24,291
Richardson:
We have more employer
partners willing to hire
than we have graduates for.
864
00:37:24,375 --> 00:37:25,917
We have more students
or applicants
865
00:37:26,000 --> 00:37:28,500
-applying to Per Scholas
than we have spots for.
-Right.
866
00:37:28,583 --> 00:37:30,750
The constraint to our
growth is resources.
867
00:37:30,834 --> 00:37:34,000
The domestic investment
in workforce retraining
868
00:37:34,083 --> 00:37:36,875
-is so small, and...
-Mm.
869
00:37:36,959 --> 00:37:38,834
...the impact automation
is going to have
870
00:37:38,917 --> 00:37:40,583
-is not going to be equitable.
-Mm-hmm.
871
00:37:40,667 --> 00:37:42,792
That it's largely
people of color,
872
00:37:42,875 --> 00:37:46,000
largely women who are
in the current low-wage
873
00:37:46,083 --> 00:37:48,000
occupations that are
gonna be displaced.
874
00:37:48,083 --> 00:37:50,041
There really should be
some critical thinking
875
00:37:50,125 --> 00:37:52,375
and some action that
legislators are taking
876
00:37:52,458 --> 00:37:54,291
to invest in
programs like this.
877
00:37:54,375 --> 00:37:56,583
For decades, the federal
government has repeatedly
878
00:37:56,667 --> 00:37:58,583
taken action
to fund reskilling.
879
00:37:58,667 --> 00:38:01,792
I am proud today
to sign into law
880
00:38:01,875 --> 00:38:03,625
the Job Training
Partnership Act,
881
00:38:03,709 --> 00:38:07,125
a program that looks to
the future instead of the past.
882
00:38:07,208 --> 00:38:10,417
Giving all Americans
the tools they need
to learn for a lifetime
883
00:38:10,500 --> 00:38:14,917
is critical to our ability
to continue to grow.
884
00:38:15,000 --> 00:38:16,959
So, the bill I'm about to sign
will give communities
885
00:38:17,041 --> 00:38:19,959
more certainty to invest
in job training programs
for the long run.
886
00:38:20,041 --> 00:38:21,291
Andavolu:
But in today's dollars,
887
00:38:21,375 --> 00:38:24,417
that funding
has fallen
for years.
888
00:38:24,500 --> 00:38:25,667
President Trump campaigned
889
00:38:25,750 --> 00:38:27,959
on bringing jobs back
to American workers.
890
00:38:28,041 --> 00:38:32,125
A Trump administration
will stop the jobs
from leaving America.
891
00:38:32,208 --> 00:38:35,166
Andavolu:
And he signed an executive
order enacting what he calls
892
00:38:35,250 --> 00:38:38,041
the White House's Pledge
to America's Workers,
893
00:38:38,125 --> 00:38:40,417
installing his
advisor and daughter,
894
00:38:40,500 --> 00:38:42,375
Ivanka Trump,
to lead the charge.
895
00:38:42,458 --> 00:38:46,834
It's one of my favorite
words, reskilling. Reskilling.
896
00:38:46,917 --> 00:38:49,667
We're calling upon government
and the private sector
897
00:38:49,750 --> 00:38:51,709
to equip our
students and workers
898
00:38:51,792 --> 00:38:55,792
with the skills they need
to thrive in the modern economy.
899
00:38:55,875 --> 00:38:58,625
♪ ♪
900
00:39:02,750 --> 00:39:06,417
(indistinct PA announcement)
901
00:39:10,250 --> 00:39:12,083
(horn beeps)
902
00:39:12,166 --> 00:39:13,834
Andavolu:
The Trump
administration's strategy
903
00:39:13,917 --> 00:39:16,125
on reskilling America's
workforce is a lot
904
00:39:16,208 --> 00:39:19,542
like its strategy for other
big problems America is facing.
905
00:39:19,625 --> 00:39:21,500
Rather than increasing
public investment,
906
00:39:21,583 --> 00:39:25,333
they prefer to see private
industry fill the void.
907
00:39:25,417 --> 00:39:27,250
For the past nine months,
the Trump administration
908
00:39:27,333 --> 00:39:30,542
has been twisting the arms
of CEOs to promise funding
909
00:39:30,625 --> 00:39:33,917
for worker education
and training.
910
00:39:34,000 --> 00:39:37,166
And so far, more than 200
companies have signed on,
911
00:39:37,250 --> 00:39:38,875
with Toyota being the latest.
912
00:39:38,959 --> 00:39:41,166
Ivanka:
Toyota signed our pledge
913
00:39:41,250 --> 00:39:44,166
to America's workers for 100,000
914
00:39:44,250 --> 00:39:48,041
enhanced career opportunities,
apprenticeship, um...
915
00:39:48,125 --> 00:39:49,875
-new jobs...
-New development.
916
00:39:49,959 --> 00:39:51,792
New development,
workforce training.
917
00:39:51,875 --> 00:39:54,834
I think James was
a little bit inspired.
918
00:39:54,917 --> 00:39:57,834
He's just increased that
commitment to 200,000,
919
00:39:57,917 --> 00:40:00,375
-which is unbelievably exciting.
-(cheering)
920
00:40:00,458 --> 00:40:02,750
Bevin:
The American worker is great,
921
00:40:02,834 --> 00:40:04,750
but the Kentucky worker
922
00:40:04,834 --> 00:40:07,333
-is greater still.
-(cheering)
923
00:40:07,417 --> 00:40:09,709
(applause)
924
00:40:09,792 --> 00:40:12,041
Does government have
a role? Absolutely.
925
00:40:12,125 --> 00:40:15,542
But is it to define what
the workforce looks like? No.
926
00:40:15,625 --> 00:40:17,792
If the state were
to develop a program
927
00:40:17,875 --> 00:40:19,542
for Toyota's workers
of the future,
928
00:40:19,625 --> 00:40:21,917
it would be a failure.
I'm telling you, straight up.
929
00:40:22,000 --> 00:40:23,709
Toyota knows what Toyota needs.
930
00:40:23,792 --> 00:40:27,458
But the risk, perhaps,
is a reliance on a company
931
00:40:27,542 --> 00:40:30,333
that isn't owned by you and me,
like the government is.
932
00:40:30,417 --> 00:40:32,709
It's owned by shareholders.
Is there a problem with having
933
00:40:32,792 --> 00:40:34,625
a private response
to a public problem?
934
00:40:34,709 --> 00:40:37,750
Who has more of a vested
interest in getting this right?
935
00:40:37,834 --> 00:40:40,959
The government
or the private company?
The private company does.
936
00:40:41,041 --> 00:40:44,583
It is in the best interest of
Toyota or any other company
937
00:40:44,667 --> 00:40:47,125
to train the best
quality people,
938
00:40:47,208 --> 00:40:48,917
pay them as much as possible,
939
00:40:49,000 --> 00:40:51,709
give them the standard of
living and the quality of life
940
00:40:51,792 --> 00:40:53,208
that makes them wanna come
941
00:40:53,291 --> 00:40:56,583
and retire 20,
30, 40 years later
942
00:40:56,667 --> 00:40:58,166
from the very same company.
943
00:40:58,250 --> 00:41:00,792
It's in the company's
interest until it isn't.
944
00:41:00,875 --> 00:41:02,792
And the thing about a company
is, like, we don't elect
945
00:41:02,875 --> 00:41:05,792
their executives,
but we elect our
government officials.
946
00:41:05,875 --> 00:41:08,458
But the idea that we're
gonna rely on government
947
00:41:08,542 --> 00:41:11,000
and "elected people,"
to come up with rules
948
00:41:11,083 --> 00:41:13,500
for training people
for jobs that you will
949
00:41:13,583 --> 00:41:16,917
volitionally want to buy
the products of is crazy.
950
00:41:17,000 --> 00:41:18,291
♪ ♪
951
00:41:18,375 --> 00:41:19,792
Andavolu:
Under the pledge,
952
00:41:19,875 --> 00:41:22,166
they're promising 200,000
new opportunities.
953
00:41:22,250 --> 00:41:23,959
They're promising to reskill
954
00:41:24,041 --> 00:41:26,625
and retrain 200,000 people.
955
00:41:26,709 --> 00:41:28,125
They have the capacity
to do that.
956
00:41:28,208 --> 00:41:29,750
What happens
if they break the promise?
957
00:41:29,834 --> 00:41:31,500
Again, they're
gonna do their--
958
00:41:31,583 --> 00:41:33,417
Can you mandate
the government does it?
959
00:41:33,500 --> 00:41:37,166
If they don't do it,
it's not like we're gonna
not reelect their CEO.
960
00:41:37,250 --> 00:41:40,458
They don't do it, they might
have some bad publicity--
961
00:41:40,542 --> 00:41:42,458
You think it won't
affect their CEO?
962
00:41:42,542 --> 00:41:44,291
Here's what happens.
If they don't do it,
963
00:41:44,375 --> 00:41:46,500
if they're not making
these investments,
964
00:41:46,583 --> 00:41:48,542
there's not a chance
that they survive.
965
00:41:48,625 --> 00:41:50,333
(applause)
966
00:41:50,417 --> 00:41:53,917
♪ ♪
967
00:42:01,709 --> 00:42:02,834
(indistinct chatter)
968
00:42:02,917 --> 00:42:04,291
Andavolu:
This is Landon and Tim,
969
00:42:04,375 --> 00:42:07,333
two buddies who work together
at the Toyota factory.
970
00:42:07,417 --> 00:42:11,834
Tim's just retired,
but Landon sees himself
working at Toyota for decades.
971
00:42:11,917 --> 00:42:13,959
It must be kind of nice
to at least know that
972
00:42:14,041 --> 00:42:16,542
at a high level,
they're thinking about
973
00:42:16,625 --> 00:42:18,500
your jobs, the things
that you guys do,
974
00:42:18,583 --> 00:42:19,875
the opportunities
that you guys have.
975
00:42:19,959 --> 00:42:21,500
Does it strike you that way?
976
00:42:21,583 --> 00:42:23,166
Tim Smith:
With Ivanka,
977
00:42:23,250 --> 00:42:25,458
I just looked
and said okay, it's PR.
978
00:42:26,458 --> 00:42:28,291
Do people want to be reskilled?
979
00:42:28,375 --> 00:42:30,834
If their job depended on it,
980
00:42:30,917 --> 00:42:32,291
and they know it's coming,
981
00:42:32,375 --> 00:42:34,500
I would say sure. Yes.
982
00:42:34,583 --> 00:42:37,875
It would be depending on
your personal circumstances,
983
00:42:37,959 --> 00:42:39,083
that could be very hard
984
00:42:39,166 --> 00:42:42,917
because if you work,
you know, a full shift
985
00:42:43,000 --> 00:42:45,083
and you have a family,
986
00:42:45,166 --> 00:42:49,458
you may only have an hour
or two of free time a day,
987
00:42:49,542 --> 00:42:52,917
are you supposed to go drive
to a training facility
988
00:42:53,000 --> 00:42:57,250
and spend a few hours
a day there before
you go do your shift,
989
00:42:57,333 --> 00:42:59,166
-you work your shift?
-Mm-hmm.
990
00:42:59,250 --> 00:43:01,625
I don't think very many
people would do that.
991
00:43:02,291 --> 00:43:04,375
Do you guys like your job,
or do you like working there?
992
00:43:04,458 --> 00:43:06,166
(chuckles)
993
00:43:06,834 --> 00:43:08,583
-You go, Timmy.
-(laughing)
994
00:43:08,667 --> 00:43:10,834
-I actually love my job.
-Yeah.
995
00:43:10,917 --> 00:43:13,125
Um, you know,
because I've always
996
00:43:13,208 --> 00:43:15,250
loved working on cars,
997
00:43:15,333 --> 00:43:18,125
so my job was pretty
much up my alley.
998
00:43:19,208 --> 00:43:21,291
-There are parts
of my job that I like.
-Mm-hmm.
999
00:43:21,375 --> 00:43:23,750
It's, you know, you're
creating something.
1000
00:43:23,834 --> 00:43:25,250
There's no way it could exist
1001
00:43:25,333 --> 00:43:26,750
without someone
putting it together.
1002
00:43:26,834 --> 00:43:28,792
(laughing)
1003
00:43:28,875 --> 00:43:30,792
-Exactly.
-And so...
1004
00:43:30,875 --> 00:43:33,417
-Yeah, I have a wife
and two daughters.
-Right.
1005
00:43:33,500 --> 00:43:36,125
-I wanna spend as much
time with them as I can.
-Mm-hmm.
1006
00:43:36,208 --> 00:43:39,083
They're young and they're not
gonna stay young for long,
1007
00:43:39,166 --> 00:43:43,250
and I hate missing the time
I miss with them at work
already.
1008
00:43:43,333 --> 00:43:45,875
I just wanna spend as much time
as I can with them.
1009
00:43:45,959 --> 00:43:49,417
I wanna retire.
That's the... that's the goal.
1010
00:43:49,500 --> 00:43:51,333
Andavolu:
For people in
the workforce today,
1011
00:43:51,417 --> 00:43:55,500
reskilling boils down
to doing more work
just to keep up.
1012
00:43:55,583 --> 00:43:58,625
To Andrew Yang, a former
job creation specialist,
1013
00:43:58,709 --> 00:44:00,875
and now a long-shot
presidential candidate,
1014
00:44:00,959 --> 00:44:03,834
the spotlight on reskilling
hides a larger imbalance
1015
00:44:03,917 --> 00:44:05,458
between the goals of workers
1016
00:44:05,542 --> 00:44:07,458
and the goals of
their employers.
1017
00:44:07,542 --> 00:44:09,542
We are so brainwashed
by the market that otherwise
1018
00:44:09,625 --> 00:44:12,125
intelligent, well-meaning
people will legitimately say,
1019
00:44:12,208 --> 00:44:14,125
"We should retrain
the coal miners to be coders."
1020
00:44:14,208 --> 00:44:15,083
Yep.
1021
00:44:15,166 --> 00:44:16,500
Yang:
We are trained to think
1022
00:44:16,583 --> 00:44:19,125
that we have no value unless
1023
00:44:19,208 --> 00:44:23,291
the market says that
there's a need for what we do.
1024
00:44:23,375 --> 00:44:26,792
And so, if coal miners
now have zero value,
then the thought process is,
1025
00:44:26,875 --> 00:44:31,208
"Oh, we have to turn them
into something that does have
value. What has value? Coders!"
1026
00:44:31,291 --> 00:44:32,959
And then, 12 years from now,
1027
00:44:33,041 --> 00:44:35,500
-AI's gonna be able to do
basic coding anyway.
-Sure.
1028
00:44:35,583 --> 00:44:38,375
So, this is a race
we will not win.
1029
00:44:38,458 --> 00:44:41,125
The goalposts are gonna
move the whole time on us.
1030
00:44:41,208 --> 00:44:43,250
What are the solutions?
What are you proposing?
1031
00:44:43,333 --> 00:44:44,875
We start issuing a dividend
1032
00:44:44,959 --> 00:44:47,750
to all American adults,
starting at age 18,
1033
00:44:47,834 --> 00:44:49,417
where everyone gets
$1,000 a month.
1034
00:44:49,500 --> 00:44:51,834
So, basically
a universal basic income.
1035
00:44:51,917 --> 00:44:54,458
-Yes. We've rebranded it
the Freedom Dividend...
-Okay.
1036
00:44:54,542 --> 00:44:57,667
...because it tests much
better with conservatives
with the word freedom in it.
1037
00:44:57,750 --> 00:44:59,166
(laughing)
1038
00:44:59,250 --> 00:45:01,333
And it's not a basic income,
it's a dividend.
1039
00:45:01,417 --> 00:45:03,834
Which is to say the economy
is at a surplus,
1040
00:45:03,917 --> 00:45:05,834
so everyone deserves
a piece of the pie.
1041
00:45:05,917 --> 00:45:09,458
Yeah, no. All of us are owners
and shareholders of the richest
1042
00:45:09,542 --> 00:45:11,709
society in the history of
the world that can easily afford
1043
00:45:11,792 --> 00:45:13,542
a dividend of
$1,000 per adult.
1044
00:45:13,625 --> 00:45:17,041
People need meaning, structure,
purpose, fulfillment.
1045
00:45:17,125 --> 00:45:20,959
And that is the generational
challenge that faces us.
1046
00:45:21,041 --> 00:45:22,709
It's not like the Freedom
Dividend, giving everyone
1047
00:45:22,792 --> 00:45:25,208
$1,000 a month, solves
that challenge. It does not.
1048
00:45:25,291 --> 00:45:27,375
-But what it does is it--
-Buys us time?
1049
00:45:27,458 --> 00:45:29,917
It buys us time and
also channels resources
1050
00:45:30,000 --> 00:45:32,542
into the pursuit of
meeting that challenge.
1051
00:45:32,625 --> 00:45:35,917
It ends up
supercharging our ability
1052
00:45:36,000 --> 00:45:39,750
to address what
we should be doing.
1053
00:45:39,834 --> 00:45:43,500
Andavolu:
Universal basic income, once
a marginal political fantasy,
1054
00:45:43,583 --> 00:45:47,542
has been embraced by more
than a few rabid capitalists
in recent years.
1055
00:45:47,625 --> 00:45:50,208
We should explore ideas
like universal basic income
1056
00:45:50,291 --> 00:45:53,250
to make sure that everyone has
a cushion to try new ideas.
1057
00:45:53,333 --> 00:45:54,917
It's free money for everybody.
1058
00:45:55,000 --> 00:45:58,375
Enough to pay
for your basic needs:
food, shelter, education.
1059
00:45:58,458 --> 00:45:59,959
I don't think we're
gonna have a choice.
1060
00:46:00,041 --> 00:46:01,166
It will come about one day.
1061
00:46:01,250 --> 00:46:02,750
-And I think--
-Out of necessity?
1062
00:46:02,834 --> 00:46:04,959
Out of necessity, and...
1063
00:46:05,041 --> 00:46:08,792
and I think that cities
should experiment.
1064
00:46:08,875 --> 00:46:12,208
Andavolu:
Michael Moynihan went
to one city that already is.
1065
00:46:12,291 --> 00:46:13,792
-Moynihan: Mayor.
-Good morning.
1066
00:46:13,875 --> 00:46:15,667
Michael Moynihan.
How are you?
1067
00:46:15,750 --> 00:46:18,500
Moynihan:
Stockton's mayor won election
at just 26 years old,
1068
00:46:18,583 --> 00:46:21,959
taking office five years after
the city declared bankruptcy.
1069
00:46:22,041 --> 00:46:23,083
So, you grew
up here?
1070
00:46:23,166 --> 00:46:24,166
-Born and raised.
-Born and raised?
1071
00:46:24,250 --> 00:46:25,750
-This is home.
-And you left
1072
00:46:25,834 --> 00:46:27,083
for a brief period
to go to Stanford?
1073
00:46:27,166 --> 00:46:29,417
Left for four years,
I came right back.
1074
00:46:29,500 --> 00:46:31,083
Moynihan:
With funding
from Silicon Valley,
1075
00:46:31,166 --> 00:46:34,750
he's launched a pilot program
that's giving 125 residents
1076
00:46:34,834 --> 00:46:37,333
$500 a month for 18 months.
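(For scale, a quick bit of arithmetic on the figures just quoted puts the pilot's total payout at:)

$125 \times \$500/\text{month} \times 18\ \text{months} = \$1{,}125{,}000$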
1077
00:46:37,417 --> 00:46:40,417
Why do Silicon Valley
guys like this so much?
1078
00:46:40,500 --> 00:46:43,041
Can't speak for all of them,
but I think a lot of them see
1079
00:46:43,125 --> 00:46:45,041
how detrimental it
would be to society
1080
00:46:45,125 --> 00:46:47,583
if there's a mass number
of people who are automated
1081
00:46:47,667 --> 00:46:49,625
without any way to make
a means for themselves,
1082
00:46:49,709 --> 00:46:51,834
without any way to provide
for themselves.
1083
00:46:51,917 --> 00:46:54,834
I mean, is it people in tech
kind of paying indulgences,
1084
00:46:54,917 --> 00:46:57,959
and saying, "Hey, we're
kind of screwing this up.
1085
00:46:58,041 --> 00:47:00,041
We feel bad about it.
Here's some money"?
1086
00:47:00,125 --> 00:47:02,125
I know some people are
talking about robot taxes
1087
00:47:02,208 --> 00:47:03,917
and things of that
sort, and I...
1088
00:47:04,000 --> 00:47:05,291
I'm in very much
agreement with that.
1089
00:47:05,375 --> 00:47:07,333
That they have
a responsibility
1090
00:47:07,417 --> 00:47:08,834
to society.
1091
00:47:08,917 --> 00:47:11,458
It's voluntary now, but it might
be at the point of a gun later.
1092
00:47:11,542 --> 00:47:13,542
Yeah, voluntary
to start and pilot,
1093
00:47:13,625 --> 00:47:15,583
but absolutely
I think to scale it,
1094
00:47:15,667 --> 00:47:16,834
it's not gonna be
about generosity.
1095
00:47:16,917 --> 00:47:18,166
It's gonna be
a matter of policy.
1096
00:47:18,250 --> 00:47:19,667
♪ ♪
1097
00:47:19,750 --> 00:47:21,458
Moynihan:
The criticism that
most people get
1098
00:47:21,542 --> 00:47:23,625
for UBI-type
experiments is that,
1099
00:47:23,709 --> 00:47:26,750
all right, you're just giving
people money, a handout, etc.
1100
00:47:26,834 --> 00:47:28,333
That's a start,
1101
00:47:28,417 --> 00:47:31,583
but what's the long-term
goal for jobs in Stockton?
1102
00:47:31,667 --> 00:47:33,417
The hypothesis
we're working on is
1103
00:47:33,500 --> 00:47:35,542
that folks have their
basic needs met,
1104
00:47:35,625 --> 00:47:38,959
and then create
the workforce of the future.
1105
00:47:39,041 --> 00:47:41,792
Primarily starting with
the kids in our schools now,
but also with the adults,
1106
00:47:41,875 --> 00:47:44,834
to give them opportunities
for retraining, reskilling,
1107
00:47:44,917 --> 00:47:47,333
but also supporting people in
their entrepreneurial pursuits.
1108
00:47:47,417 --> 00:47:49,500
Folks with a lot of potential
but historically folks
1109
00:47:49,583 --> 00:47:51,667
who haven't been seen as
important enough for investment
1110
00:47:51,750 --> 00:47:53,875
or important enough
for government
to really partner with,
1111
00:47:53,959 --> 00:47:55,458
so that's what
makes people excited
1112
00:47:55,542 --> 00:47:57,250
about the work we're
doing in Stockton.
1113
00:47:58,083 --> 00:48:01,667
So yeah, this is
the downtown marina.
Watch your step.
1114
00:48:01,750 --> 00:48:03,375
Moynihan:
There was basically
nothing here
1115
00:48:03,458 --> 00:48:05,875
when you were a kid,
when you were five,
six years old.
1116
00:48:05,959 --> 00:48:07,667
Tubbs:
I never really came out here
1117
00:48:07,750 --> 00:48:09,542
till I came back
for city council.
1118
00:48:09,625 --> 00:48:12,750
-Yeah.
-This wasn't part of my
Stockton, but I think for me,
1119
00:48:12,834 --> 00:48:15,750
this spot's so important 'cause
it represents real potential.
1120
00:48:15,834 --> 00:48:19,333
-There's not many cities
that have... this.
-Yeah.
1121
00:48:20,458 --> 00:48:24,041
When we talk about the future
of work, we're talking about
1122
00:48:24,125 --> 00:48:26,250
how do we ensure that
those left behind today
1123
00:48:26,333 --> 00:48:27,875
aren't further left
behind tomorrow,
1124
00:48:27,959 --> 00:48:30,083
and that's my biggest fear.
1125
00:48:30,166 --> 00:48:31,875
The folks who are making
the least now
1126
00:48:31,959 --> 00:48:35,166
are the most likely
to be automated out of jobs
and making anything.
1127
00:48:35,250 --> 00:48:38,667
So for me, a basic income isn't
even about the future of work.
1128
00:48:38,750 --> 00:48:41,041
It's about getting
our foundation set
in the present,
1129
00:48:41,125 --> 00:48:42,792
so that when
the future of work happens,
1130
00:48:42,875 --> 00:48:45,125
we have a firm foundation
on which we can pivot
1131
00:48:45,208 --> 00:48:47,417
and figure out what we can do
with and for people.
1132
00:48:47,500 --> 00:48:50,709
Andavolu:
What's attractive about
UBI is its simplicity.
1133
00:48:50,792 --> 00:48:53,875
But that's also what makes
it vulnerable to critique.
1134
00:48:53,959 --> 00:48:57,375
Daron Acemoglu:
I don't think these
utopian ideas
1135
00:48:57,458 --> 00:48:59,291
of universal basic income,
1136
00:48:59,375 --> 00:49:01,166
robots do all the production
and then everybody
1137
00:49:01,250 --> 00:49:04,208
stays at home
with a decent income
and plays video games.
1138
00:49:04,291 --> 00:49:06,125
I think that's
a very dystopic future,
1139
00:49:06,208 --> 00:49:09,959
and it won't work,
and I think it will lead
to a huge amount of discontent.
1140
00:49:10,041 --> 00:49:13,917
We really have no option
but to create jobs for the future.
1141
00:49:14,000 --> 00:49:16,417
Politicians have to start
engaging these issues
1142
00:49:16,500 --> 00:49:19,583
to figure out
what's politically feasible.
1143
00:49:19,667 --> 00:49:22,041
It's gonna trigger
fundamental debates
1144
00:49:22,125 --> 00:49:23,875
about things like
a universal basic income
1145
00:49:23,959 --> 00:49:26,417
that everybody ought to get
money and the question is,
1146
00:49:26,500 --> 00:49:28,917
okay, where's that
money gonna come from?
1147
00:49:29,000 --> 00:49:31,709
Andavolu:
According to a progressive
think tank's analysis,
1148
00:49:31,792 --> 00:49:35,625
a UBI program giving every
American $10,000 a year
1149
00:49:35,709 --> 00:49:39,208
would cost the government
more than $3 trillion annually.
1150
00:49:39,291 --> 00:49:44,291
The entire 2018 federal budget
was just over $4 trillion.
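(Those two figures are consistent with simple arithmetic; assuming a U.S. population of roughly 330 million, a round number on our part rather than the think tank's:)

$330{,}000{,}000 \times \$10{,}000/\text{year} \approx \$3.3\ \text{trillion per year}$

Yang's adults-only Freedom Dividend lands in the same range: roughly 250 million adults at $12,000 a year is about $3 trillion annually.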
1151
00:49:44,375 --> 00:49:47,375
Universal basic income isn't
the only policy being floated
1152
00:49:47,458 --> 00:49:51,000
to shore up the workforce's
shaky financial foundation,
1153
00:49:51,083 --> 00:49:55,125
whether it's using federal
funds to guarantee jobs
for anyone who wants one,
1154
00:49:55,208 --> 00:49:58,583
giving tax breaks to companies
to create more jobs,
1155
00:49:58,667 --> 00:50:01,792
or strong-arming companies
to keep their factories open.
1156
00:50:01,875 --> 00:50:04,959
What all these policies
and strategies tell us
1157
00:50:05,041 --> 00:50:07,417
is that there's a broad consensus
that right now...
1158
00:50:07,500 --> 00:50:09,041
Protesters:
We say fight back!
1159
00:50:09,125 --> 00:50:11,542
Andavolu:
...things aren't working.
1160
00:50:14,583 --> 00:50:16,583
♪ ♪
1161
00:50:18,125 --> 00:50:20,333
Brynjolfsson:
We've designed a system where,
1162
00:50:20,417 --> 00:50:23,417
as the technology
creates more wealth,
1163
00:50:23,500 --> 00:50:27,041
some people, bizarrely,
are made worse off.
1164
00:50:27,667 --> 00:50:30,500
We have to update
and reinvent our system
1165
00:50:30,583 --> 00:50:35,000
so that this explosion
of wealth and productivity
1166
00:50:35,083 --> 00:50:37,375
benefits not just
the few, but the many.
1167
00:50:37,458 --> 00:50:39,458
Haass:
This challenge
of new technology
1168
00:50:39,542 --> 00:50:42,083
is not taking place
in a vacuum.
1169
00:50:42,166 --> 00:50:43,875
This country's
already divided.
1170
00:50:43,959 --> 00:50:48,166
It's divided geographically.
It's divided culturally,
politically.
1171
00:50:48,250 --> 00:50:50,375
We've gotta be prepared
for the fact that it could
1172
00:50:50,458 --> 00:50:51,875
actually take
the social differences
1173
00:50:51,959 --> 00:50:54,500
we already have,
and make it worse.
1174
00:50:54,583 --> 00:50:57,083
There's gotta be
a sense of urgency here.
1175
00:50:58,083 --> 00:50:59,792
A higher level of disconnection,
1176
00:50:59,875 --> 00:51:02,500
alienation, more declines
in social capital,
1177
00:51:02,583 --> 00:51:04,500
more groups of
people left behind,
1178
00:51:04,583 --> 00:51:06,583
more geographic
areas left behind.
1179
00:51:06,667 --> 00:51:08,250
-Protester: What do we want?
-All: Justice!
1180
00:51:08,333 --> 00:51:10,125
-Protester: When do we want it?
-All: Now!
1181
00:51:10,208 --> 00:51:13,625
McAfee:
This is not a recipe
for a stable, prosperous,
happy society.
1182
00:51:13,709 --> 00:51:16,458
My worry is not that the robots
will take all the jobs.
1183
00:51:16,542 --> 00:51:18,792
My worry is that more
people will be left behind
1184
00:51:18,875 --> 00:51:21,375
and will feel left behind
by what's going on.
1185
00:51:21,458 --> 00:51:25,041
Stiglitz:
If we continue in the way
we've run our economy
1186
00:51:25,125 --> 00:51:29,667
for the last 40 years,
it will be disastrous.
1187
00:51:29,750 --> 00:51:32,542
So, when you talk about
the future of work,
1188
00:51:32,625 --> 00:51:34,583
we're kind of talking about
the future of the whole system?
1189
00:51:34,667 --> 00:51:37,417
That's right. I mean,
you cannot have
1190
00:51:37,500 --> 00:51:40,792
a prosperous economy
without prosperous workers.
1191
00:51:40,875 --> 00:51:42,667
Voltaire said
that work saves us
1192
00:51:42,750 --> 00:51:44,500
from three great evils:
1193
00:51:44,583 --> 00:51:46,792
boredom, vice, and need.
1194
00:51:46,875 --> 00:51:50,500
Haass:
So much of our identities
are tied up with our jobs.
1195
00:51:50,583 --> 00:51:52,750
People ask you how you doing,
who you are, what you do,
1196
00:51:52,834 --> 00:51:55,667
and that's essential
to our sense of self.
1197
00:51:56,375 --> 00:51:58,667
If we have millions,
or tens of millions,
1198
00:51:58,750 --> 00:52:01,333
of chronically
long-term unemployed,
1199
00:52:01,417 --> 00:52:02,583
what's gonna become
of those people?
1200
00:52:02,667 --> 00:52:04,583
It's not simply
a question of how are they
1201
00:52:04,667 --> 00:52:07,333
going to support themselves.
What are they gonna do?
1202
00:52:09,041 --> 00:52:10,375
(horn honks)
1203
00:52:12,667 --> 00:52:14,250
(engine rumbling)
1204
00:52:16,041 --> 00:52:19,250
(drill whirring)
1205
00:52:24,041 --> 00:52:26,208
This is your NOx sensor drift.
1206
00:52:27,000 --> 00:52:29,083
-One box.
-Conversion efficiency.
1207
00:52:29,166 --> 00:52:31,792
Yep. One box gone.
1208
00:52:31,875 --> 00:52:33,083
(rattling)
1209
00:52:33,166 --> 00:52:35,041
You know,
someone might say
1210
00:52:35,125 --> 00:52:37,959
robots are coming. Might as
well hang up the keys now.
1211
00:52:38,041 --> 00:52:39,125
Does that go
through your head?
1212
00:52:39,208 --> 00:52:40,875
It's gonna happen.
1213
00:52:40,959 --> 00:52:42,834
-It's gonna happen, I mean--
-What's gonna happen?
1214
00:52:42,917 --> 00:52:45,834
With these vehicles
driving by themselves.
1215
00:52:45,917 --> 00:52:48,667
Change is good. Some change
ain't good, you know?
1216
00:52:48,750 --> 00:52:51,667
I mean, that's gonna be a lot
of people out of work, and...
1217
00:52:51,750 --> 00:52:53,333
Andavolu:
What do you think
you're gonna do?
1218
00:52:53,417 --> 00:52:55,625
♪ ♪
1219
00:52:55,709 --> 00:52:57,625
There's gonna be a crash.
1220
00:52:57,709 --> 00:53:00,500
I think there's gonna
be a lot of outrage.
1221
00:53:00,583 --> 00:53:03,625
-Riots more or less,
you know what I mean?
-Sure.
1222
00:53:03,709 --> 00:53:06,375
'Cause they're gonna fight
to try to keep their job.
1223
00:53:06,458 --> 00:53:09,125
I mean, would I do it?
Yeah, I would do it.
1224
00:53:09,208 --> 00:53:11,542
It's gonna be
a chain reaction.
1225
00:53:11,625 --> 00:53:13,667
In reality, you gotta
look at the economy
1226
00:53:13,750 --> 00:53:15,250
and what is it gonna
do to the economy.
1227
00:53:15,333 --> 00:53:17,875
What is it gonna do to
the American people?
1228
00:53:18,583 --> 00:53:22,458
Not just the industry,
I mean to... to the world.
1229
00:53:22,542 --> 00:53:25,458
What's gonna happen you got
the whole world pissed off?
1230
00:53:25,542 --> 00:53:28,166
(hammering, clatter)
1231
00:53:28,500 --> 00:53:30,333
But me,
I wanna die in a truck.
1232
00:53:30,417 --> 00:53:33,166
-Really? (laughs)
-I'm gonna die
in a truck, brother.
1233
00:53:33,250 --> 00:53:36,333
I've told all my friends,
I've told my family.
1234
00:53:36,417 --> 00:53:39,959
-That's when I retire is
when I die in a truck.
-Yeah.
1235
00:53:40,041 --> 00:53:43,458
I mean, I been doing
it for too long.
1236
00:53:43,542 --> 00:53:45,000
It's in the blood.
1237
00:53:45,083 --> 00:53:48,125
♪ ♪
1238
00:54:14,291 --> 00:54:17,333
♪ ♪