1
00:00:23,640 --> 00:00:25,839
LAST WEEK TONIGHT
WITH JOHN OLIVER
2
00:00:30,600 --> 00:00:33,756
Welcome to "Last Week Tonight !"
I'm John Oliver.
3
00:00:33,840 --> 00:00:37,236
Thank you so much for joining us.
It has been a busy week.
4
00:00:37,320 --> 00:00:39,556
Russia's war in Ukraine
entered its second year,
5
00:00:39,640 --> 00:00:42,140
massive winter storms
slammed a lot of the country,
6
00:00:42,200 --> 00:00:46,156
and East Palestine, Ohio got visits from
both Donald Trump and Pete Buttigieg,
7
00:00:46,240 --> 00:00:50,640
but some Fox personalities insisted that
others really should've been there, too.
8
00:00:51,039 --> 00:00:54,595
Think about the environmental
activists and corporate America.
9
00:00:54,679 --> 00:00:55,715
They weren't there.
10
00:00:55,799 --> 00:00:59,436
I mean this, with the activists,
this is an Erin Brockovich moment.
11
00:00:59,520 --> 00:01:01,675
I mean, there was a blockbuster
Oscar-winning movie
12
00:01:01,759 --> 00:01:03,459
written about something like this.
13
00:01:03,520 --> 00:01:05,075
Where is Leo DiCaprio ?
14
00:01:05,159 --> 00:01:07,036
Erin Brockovich
is in East Palestine tonight.
15
00:01:07,120 --> 00:01:09,920
She is, but where's Julia Roberts ?!
16
00:01:11,640 --> 00:01:14,359
What ? What are you talking about ?
17
00:01:15,159 --> 00:01:17,560
You realize Julia Roberts
is an actor, right ?
18
00:01:18,200 --> 00:01:21,280
She was pretending.
She's not actually Erin Brockovich.
19
00:01:22,359 --> 00:01:24,836
Also, I can't believe I'm the one
that has to break this to you,
20
00:01:24,920 --> 00:01:27,075
she didn't actually ruin
her best friend's wedding.
21
00:01:27,159 --> 00:01:31,959
She's not a sex worker. And she did not
die in a small Louisiana town in 1989.
22
00:01:32,439 --> 00:01:34,319
What the fuck is wrong with you ?
23
00:01:35,319 --> 00:01:37,795
But we're actually going to stay in
the world of conservative media tonight
24
00:01:37,879 --> 00:01:39,995
because it had
some big news this week.
25
00:01:40,079 --> 00:01:43,040
The embattled leader
and founder of Project Veritas
26
00:01:43,359 --> 00:01:45,040
has been removed from his post.
27
00:01:45,400 --> 00:01:46,556
James O'Keefe confirmed
28
00:01:46,640 --> 00:01:49,340
that he's no longer
with the conservative organization.
29
00:01:49,359 --> 00:01:51,476
It's known
for undercover operations
30
00:01:51,560 --> 00:01:54,040
targeting Democrats,
liberals, and the media.
31
00:01:54,519 --> 00:01:59,476
James O'Keefe, alt-right Borat,
is out at Project Veritas.
32
00:01:59,560 --> 00:02:02,235
And quick shoutout to that anchor
for the helpful distinction there
33
00:02:02,319 --> 00:02:04,075
between Democrats and liberals.
34
00:02:04,159 --> 00:02:06,355
'Cause in case you're wondering,
liberals are people who have
35
00:02:06,439 --> 00:02:08,395
those "We Believe" signs
in their front yards,
36
00:02:08,479 --> 00:02:10,355
and Democrats
ensure those front yards
37
00:02:10,439 --> 00:02:13,089
remain at least 50 miles
away from any public housing.
38
00:02:13,240 --> 00:02:15,916
And if you don't know
who James O'Keefe is,
39
00:02:16,000 --> 00:02:17,756
we'll explain in just a moment.
40
00:02:17,840 --> 00:02:20,955
But know that he's been a big deal
in right-wing media for a while now.
41
00:02:21,039 --> 00:02:23,639
And news of his departure
did not go over well.
42
00:02:24,319 --> 00:02:27,596
So, James O'Keefe has now apparently
been thrown out of Project Veritas,
43
00:02:27,680 --> 00:02:31,436
which means Project Veritas no longer
has, like, any reason for being.
44
00:02:31,520 --> 00:02:33,436
When you tell me
you are getting rid of James O'Keefe,
45
00:02:33,520 --> 00:02:35,036
you tell me
that Project Veritas is over.
46
00:02:35,120 --> 00:02:40,120
It's unconscionable what went on here.
Unconscionable for our movement.
47
00:02:40,439 --> 00:02:44,879
And for the nation at large.
James O'Keefe is a national treasure.
48
00:02:45,599 --> 00:02:50,000
Strong words there from the Ghost of
Christmas Been Dead for Three Weeks.
49
00:02:50,599 --> 00:02:52,436
But I have to disagree.
50
00:02:52,520 --> 00:02:54,395
The only national treasures
in this country
51
00:02:54,479 --> 00:02:57,996
are Dolly Parton, Pedro Pascal,
and Cocaine Bear.
52
00:02:58,080 --> 00:03:01,560
And I'll say what we're all
thinking: threesome when ?
53
00:03:02,560 --> 00:03:05,520
James O'Keefe first rose
to national prominence in 2009
54
00:03:05,840 --> 00:03:08,390
when he claimed
that he'd gone undercover as a pimp,
55
00:03:08,400 --> 00:03:10,756
and received
assistance from ACORN,
56
00:03:10,840 --> 00:03:13,355
a community organizing group
that helps low-income families.
57
00:03:13,439 --> 00:03:15,675
O'Keefe's stunt led
to its dissolution,
58
00:03:15,759 --> 00:03:18,276
even though later investigations
found, among other things,
59
00:03:18,360 --> 00:03:21,476
that he hadn't dressed like that
during his sting operation,
60
00:03:21,560 --> 00:03:25,436
and some ACORN workers had called
the police after he left the office.
61
00:03:25,520 --> 00:03:28,199
O'Keefe even wound up
having to pay $100,000
62
00:03:28,520 --> 00:03:31,156
to settle a lawsuit
from one of ACORN's workers.
63
00:03:31,240 --> 00:03:35,596
O'Keefe parlayed his fame from that
into launching Project Veritas,
64
00:03:35,680 --> 00:03:36,980
which quickly became known
65
00:03:37,039 --> 00:03:39,756
for their undercover,
hidden-camera investigations,
66
00:03:39,840 --> 00:03:42,436
where they'd bait people
from organizations like NPR
67
00:03:42,520 --> 00:03:46,635
or local elections boards and release
heavily edited video of the results.
68
00:03:46,719 --> 00:03:49,036
Over the years,
they've made big claims,
69
00:03:49,120 --> 00:03:53,395
like that they had "undeniable video
proof" of a cash-for-ballot scheme,
70
00:03:53,479 --> 00:03:55,360
claims that didn't quite pan out.
71
00:03:55,719 --> 00:03:58,355
But despite
their underwhelming track record,
72
00:03:58,439 --> 00:04:02,680
conservative media has greeted each
investigation with great excitement.
73
00:04:03,240 --> 00:04:07,795
A bombshell new video uncovered
by James O'Keefe and Project Veritas.
74
00:04:07,879 --> 00:04:10,355
There is a new
Project Veritas video out.
75
00:04:10,439 --> 00:04:12,756
The spotlight today
is on this undercover video
76
00:04:12,840 --> 00:04:14,599
released by Project Veritas.
77
00:04:14,919 --> 00:04:19,639
The new Project Veritas video
revealing yet another media cover-up.
78
00:04:20,040 --> 00:04:22,315
Project Veritas
caught 'em cold again.
79
00:04:22,399 --> 00:04:23,916
Project Veritas videos
80
00:04:24,000 --> 00:04:27,100
are basically the thing conservatives
get most worked up about,
81
00:04:27,120 --> 00:04:29,916
a distinction they share
with "plastic toys go woke"
82
00:04:30,000 --> 00:04:32,560
and "fuckable candy
got comfortable shoes".
83
00:04:33,519 --> 00:04:35,836
But thanks to the media
storms that he created,
84
00:04:35,920 --> 00:04:38,276
a lot of money
has flowed to Project Veritas.
85
00:04:38,360 --> 00:04:42,076
It raised
about $22 million in 2020.
86
00:04:42,160 --> 00:04:44,916
So, you might be wondering:
"Why would he be kicked out ?"
87
00:04:45,000 --> 00:04:48,396
For one, O'Keefe's been accused
of being a terrible manager,
88
00:04:48,480 --> 00:04:52,355
with a memo signed by 16 staffers
complaining of verbal abuse.
89
00:04:52,439 --> 00:04:55,195
The memo even included the line,
"Rule number one:
90
00:04:55,279 --> 00:04:58,519
you can't spit in an employee's face
over a tweet."
91
00:04:58,920 --> 00:05:02,360
And how many times has that happened
if it's rule number one ?
92
00:05:03,439 --> 00:05:05,716
Because in our office,
rule number one is,
93
00:05:05,800 --> 00:05:07,956
"Don't give Mr. Nutterbutter cocaine,"
94
00:05:08,040 --> 00:05:11,800
and that happened three times
before anyone thought to write it down.
95
00:05:12,680 --> 00:05:14,836
But perhaps the most serious
allegation is that O'Keefe
96
00:05:14,920 --> 00:05:17,396
has jeopardized
the group's non-profit status.
97
00:05:17,480 --> 00:05:21,555
He'd "spent an excessive amount
of donor funds on personal luxuries."
98
00:05:21,639 --> 00:05:23,836
O'Keefe denied
many of those claims
99
00:05:23,920 --> 00:05:27,596
in the, I shit you not,
45-minute resignation speech
100
00:05:27,680 --> 00:05:29,519
that he shared online on Monday.
101
00:05:29,839 --> 00:05:33,315
But the fact is, the organization
reported to the IRS last year
102
00:05:33,399 --> 00:05:38,396
that it had improperly paid $20,000
in excess benefits to O'Keefe,
103
00:05:38,480 --> 00:05:42,076
specifying that they were related to the
expense of having staff accompany him
104
00:05:42,160 --> 00:05:45,476
when he starred in an outdoor
production of "Oklahoma !"
105
00:05:45,560 --> 00:05:47,875
And at this point,
let's slow down.
106
00:05:47,959 --> 00:05:50,675
Because the most amazing detail
in this story
107
00:05:50,759 --> 00:05:52,355
is the degree to which O'Keefe
108
00:05:52,439 --> 00:05:55,795
is obsessed with finding opportunities
to sing and dance in public.
109
00:05:55,879 --> 00:05:57,916
That "Oklahoma !"
production was billed
110
00:05:58,000 --> 00:06:00,755
as featuring performers
who'd been victims of cancel culture.
111
00:06:00,839 --> 00:06:03,036
And O'Keefe himself
took the lead role,
112
00:06:03,120 --> 00:06:06,399
and videos of it on YouTube show
he fully committed to it.
113
00:06:07,560 --> 00:06:10,319
Oklahoma !
114
00:06:10,639 --> 00:06:13,480
Where the wind comes
sweeping down the plains !
115
00:06:14,560 --> 00:06:17,600
And the wavin' wheat,
can sure smell sweet,
116
00:06:17,920 --> 00:06:20,879
when the wind
comes right behind the rain !
117
00:06:21,680 --> 00:06:25,000
What are you doing ?
Why are you spinning like that ?
118
00:06:26,079 --> 00:06:27,636
I don't know
what note your director gave you,
119
00:06:27,720 --> 00:06:28,916
but I have to assume
that it wasn't
120
00:06:29,000 --> 00:06:31,800
"spin around like you
just lost your wife in a Costco."
121
00:06:32,800 --> 00:06:34,516
I don't know what the
saddest part of that is,
122
00:06:34,600 --> 00:06:37,235
the fact he sings like Cousin Greg
from "Succession,"
123
00:06:37,319 --> 00:06:41,076
or that no one in that crowd
is into it, especially not this table
124
00:06:41,160 --> 00:06:44,060
that can't even be bothered
to turn around and look at him.
125
00:06:44,680 --> 00:06:48,156
And it gets even weirder. Because
in its list of improper expenses,
126
00:06:48,240 --> 00:06:51,755
the organization's board
also called out $60,000 in losses
127
00:06:51,839 --> 00:06:56,115
from putting together dance events
such as the "Project Veritas Experience."
128
00:06:56,199 --> 00:06:58,476
Now, here is a poster for it.
129
00:06:58,560 --> 00:07:00,636
And upon looking at it,
my first thought was,
130
00:07:00,720 --> 00:07:04,276
"What a nice sneak peek
at LMFAO's funeral announcement."
131
00:07:04,360 --> 00:07:09,276
But it's actually an extravaganza,
based on James O'Keefe's life,
132
00:07:09,360 --> 00:07:11,396
that was supposed to play
in Vegas last year.
133
00:07:11,480 --> 00:07:13,195
Now, tragically,
that didn't end up happening.
134
00:07:13,279 --> 00:07:15,636
But luckily,
we have a sense of what could've been,
135
00:07:15,720 --> 00:07:19,596
because in 2021, O'Keefe treated
attendees at a Turning Point USA event
136
00:07:19,680 --> 00:07:21,195
to a taste of the show.
137
00:07:21,279 --> 00:07:23,959
Here he is,
starring in its opening dance sequence.
138
00:07:24,800 --> 00:07:30,120
I know I'm burning.
This is my final day.
139
00:07:31,240 --> 00:07:34,240
I'm gonna go out smiling,
140
00:07:35,439 --> 00:07:39,079
a king for a day...
141
00:07:53,439 --> 00:07:56,995
Now, obviously,
there is a lot going on there.
142
00:07:57,079 --> 00:07:59,716
From O'Keefe, in his press vest,
doing his now-signature
143
00:07:59,800 --> 00:08:02,156
"I don't know
where the fuck I am face"
144
00:08:02,240 --> 00:08:04,476
to him being dance-attacked
by FBI agents,
145
00:08:04,560 --> 00:08:06,235
to him quickly breaking free,
146
00:08:06,319 --> 00:08:08,755
only for them to inexplicably
join him in dancing
147
00:08:08,839 --> 00:08:11,235
because narrative consistency
seems to mean nothing here,
148
00:08:11,319 --> 00:08:15,279
to the moment that he punches
50 imaginary penguins in the face.
149
00:08:16,560 --> 00:08:18,195
I had the exact same reaction
watching that
150
00:08:18,279 --> 00:08:20,516
as I did when Elon Musk
hosted SNL:
151
00:08:20,600 --> 00:08:23,800
"You really do suck at the thing
you love the most !"
152
00:08:24,600 --> 00:08:27,300
We don't have time to show you
the whole thing tonight,
153
00:08:27,360 --> 00:08:29,555
but I do need
you to see one more moment.
154
00:08:29,639 --> 00:08:31,516
Later in the show,
this guy in the white shirt
155
00:08:31,600 --> 00:08:33,595
is portraying James O'Keefe
as a young man.
156
00:08:33,679 --> 00:08:36,756
But then the real James O'Keefe
comes out in choir robes,
157
00:08:36,840 --> 00:08:39,356
and looks on
as his younger self prays,
158
00:08:39,440 --> 00:08:42,440
while thinking all the mean
names that he's been called.
159
00:08:42,879 --> 00:08:46,200
Felon, terrorist, white supremacist,
racist, pervert.
160
00:08:47,000 --> 00:08:48,360
Was this really worth it ?
161
00:08:49,600 --> 00:08:54,440
Been spendin' most their lives,
livin' in a gangsta's paradise.
162
00:08:55,720 --> 00:09:00,519
We keep spendin' most our lives,
livin' in a gangsta's paradise.
163
00:09:01,480 --> 00:09:05,399
"Gangsta's Paradise ?"
That is a bold choice.
164
00:09:06,200 --> 00:09:07,996
I'm not saying
that performance killed Coolio,
165
00:09:08,080 --> 00:09:09,679
but it definitely didn't help.
166
00:09:10,240 --> 00:09:14,679
For what it's worth, those are the worst
fucking step touches I have ever seen.
167
00:09:15,000 --> 00:09:16,950
There are too many
former theater majors
168
00:09:16,960 --> 00:09:20,036
turned comedy writers on my staff
to let this shit slide.
169
00:09:20,120 --> 00:09:23,000
Look at that mess !
What the fuck is that ?
170
00:09:23,639 --> 00:09:26,475
There's too much bouncing, no one's
on the same page about angles,
171
00:09:26,559 --> 00:09:28,396
and they're all looking
at each other in fear
172
00:09:28,480 --> 00:09:30,530
like six-year-olds
in a Christmas pageant.
173
00:09:30,600 --> 00:09:33,480
Take a class, watch a Fosse,
and get your asses in sync.
and get your asses in sync.
174
00:09:34,480 --> 00:09:37,396
The point is, this ridiculous man
175
00:09:37,480 --> 00:09:40,595
has parted ways with the poisonous
organization that he founded
176
00:09:40,679 --> 00:09:42,996
because they claim
he misspent funds.
177
00:09:43,080 --> 00:09:44,679
But honestly ? Good for him !
178
00:09:45,240 --> 00:09:47,435
I would so much rather
he use that money
179
00:09:47,519 --> 00:09:50,276
to live out his misplaced
Billy Elliot dreams
180
00:09:50,360 --> 00:09:52,475
instead of trying
to take down NPR
181
00:09:52,559 --> 00:09:54,715
with his prank
show pseudo-journalism.
182
00:09:54,799 --> 00:09:57,835
And the funniest part of all of this
is that even having done that,
183
00:09:57,919 --> 00:10:00,795
his supporters are still standing
firm behind him.
184
00:10:00,879 --> 00:10:04,675
In the end, the best and worst
thing I can say for James O'Keefe
185
00:10:04,759 --> 00:10:08,555
is that he is not actually the hero
the conservative movement needs,
186
00:10:08,639 --> 00:10:11,080
but he is definitely
the one that it deserves.
187
00:10:11,559 --> 00:10:12,720
And now, this !
188
00:10:13,840 --> 00:10:17,399
And Now: Mike Huckabee's Show
Looks Like Fun.
189
00:10:18,519 --> 00:10:22,279
This week on "Huckabee",
actor and director Kevin Sorbo.
190
00:10:22,759 --> 00:10:25,000
Project Veritas founder James O'Keefe.
191
00:10:25,320 --> 00:10:27,236
Congressman Madison Cawthorn.
192
00:10:27,320 --> 00:10:29,715
Former Trump
chief of staff Mark Meadows.
193
00:10:29,799 --> 00:10:32,356
David Clarke on rising crime.
194
00:10:32,440 --> 00:10:34,955
Lee Strobel makes the case for heaven.
195
00:10:35,039 --> 00:10:38,195
Actor Eric Close on
his film "The Mulligan."
196
00:10:38,279 --> 00:10:40,075
Christian artist Riley Clemmons.
197
00:10:40,159 --> 00:10:42,435
Christian music icon Natalie Grant.
198
00:10:42,519 --> 00:10:45,036
Christian singer Rebecca
St. James.
199
00:10:45,120 --> 00:10:47,516
Christian pop duo For King
and Country.
200
00:10:47,600 --> 00:10:51,715
Christian supergroup Newsboys.
The stand-up comedy of Nazareth.
201
00:10:51,799 --> 00:10:54,996
Columnist Ron Hart.
Illusionist Taylor Reed.
202
00:10:55,080 --> 00:10:58,595
Illusionist Danny Ray.
Digital illusionist Keelan Leyser.
203
00:10:58,679 --> 00:11:01,636
The charismatic illusions
of Leon Etienne.
204
00:11:01,720 --> 00:11:04,195
The dangerous illusions
of Craig Karges.
205
00:11:04,279 --> 00:11:06,516
Hilarious columnist Ron Hart.
206
00:11:06,600 --> 00:11:09,715
Hilarious news stories
on In Case You Missed It.
207
00:11:09,799 --> 00:11:13,876
The record for the largest
display of nuts is still in Congress.
208
00:11:13,960 --> 00:11:16,236
Satirical columnist Ron Hart.
209
00:11:16,320 --> 00:11:21,279
Television star Kathie Lee Gifford.
And Rudy Giuliani remembers 9/11.
210
00:11:24,279 --> 00:11:25,240
Moving on.
211
00:11:25,559 --> 00:11:29,715
Our main story tonight concerns
artificial intelligence, or AI.
212
00:11:29,799 --> 00:11:32,636
Increasingly, it's a part of
modern life, from self-driving cars,
213
00:11:32,720 --> 00:11:36,799
to spam filters, to this creepy
training robot for therapists.
214
00:11:37,480 --> 00:11:39,879
We can begin
with you just describing to me
215
00:11:41,039 --> 00:11:44,279
what the problem is that you
would like us to focus in on today.
216
00:11:46,320 --> 00:11:48,240
I don't like being around people.
217
00:11:49,840 --> 00:11:51,279
People make me nervous.
218
00:11:52,000 --> 00:11:54,600
Terrence,
can you find an example
219
00:11:55,360 --> 00:11:57,510
of when other people
have made you nervous ?
220
00:11:58,440 --> 00:12:00,236
I don't like to take the bus.
221
00:12:00,320 --> 00:12:02,600
I get people
staring at me all the time.
222
00:12:03,519 --> 00:12:05,879
- People are always judging me.
- Okay.
223
00:12:08,559 --> 00:12:09,519
I'm gay.
224
00:12:10,440 --> 00:12:11,440
Okay...
225
00:12:13,480 --> 00:12:16,715
That is one of the greatest twists
in the history of cinema.
226
00:12:16,799 --> 00:12:20,475
Although that robot is teaching
therapists a very important skill there
227
00:12:20,559 --> 00:12:23,609
and that is not laughing at whatever
you are told in the room.
228
00:12:23,679 --> 00:12:26,116
I don't care if a decapitated
CPR mannequin
229
00:12:26,200 --> 00:12:28,156
haunted by the ghost
of Ed Harris
230
00:12:28,240 --> 00:12:30,876
just told you that he doesn't
like taking the bus,
231
00:12:30,960 --> 00:12:32,475
side note, is gay,
232
00:12:32,559 --> 00:12:35,759
you keep your therapy face
on like a fucking professional.
233
00:12:36,840 --> 00:12:40,036
If it seems like everyone
is suddenly talking about AI,
234
00:12:40,120 --> 00:12:43,195
that is because they are,
largely thanks to the emergence
235
00:12:43,279 --> 00:12:45,116
of a number
of pretty remarkable programs.
236
00:12:45,200 --> 00:12:49,156
We spoke about image generators
like Midjourney and Stable Diffusion,
237
00:12:49,240 --> 00:12:51,795
which people used to create detailed
pictures of, among other things,
238
00:12:51,879 --> 00:12:53,799
my romance with a cabbage,
239
00:12:54,120 --> 00:12:57,156
and which inspired my beautiful
real-life cabbage wedding
240
00:12:57,240 --> 00:12:59,236
officiated by Steve Buscemi.
241
00:12:59,320 --> 00:13:00,679
It was a stunning day.
242
00:13:01,080 --> 00:13:04,236
Then, at the end of last year,
came ChatGPT,
243
00:13:04,320 --> 00:13:06,075
from a company called OpenAI.
244
00:13:06,159 --> 00:13:09,996
It is a program that can take a prompt
and generate human-sounding writing
245
00:13:10,080 --> 00:13:12,396
in just about
any format and style.
246
00:13:12,480 --> 00:13:15,636
It is a striking capability
that multiple reporters
247
00:13:15,720 --> 00:13:18,960
have used to insert the same
shocking twist in their reports.
248
00:13:19,360 --> 00:13:22,156
What you just heard me reading
wasn't written by me.
249
00:13:22,240 --> 00:13:25,799
It was written by artificial
intelligence, ChatGPT.
250
00:13:26,240 --> 00:13:28,519
ChatGPT wrote
everything I just said.
251
00:13:28,840 --> 00:13:32,156
That was news copy I
asked ChatGPT to write.
252
00:13:32,240 --> 00:13:33,740
Remember what I said earlier ?
253
00:13:34,600 --> 00:13:37,159
I asked ChatGPT
to write that line for me.
254
00:13:37,799 --> 00:13:39,919
Then I asked
for a knock-knock joke.
255
00:13:40,679 --> 00:13:43,795
"Knock-knock. Who's there ?
ChatGPT. ChatGPT who ?
256
00:13:43,879 --> 00:13:46,329
ChatGPT careful,
you might not know how it works."
257
00:13:46,919 --> 00:13:49,435
Yep, they sure do love that game !
258
00:13:49,519 --> 00:13:52,116
And while it may seem unwise
to demonstrate the technology
259
00:13:52,200 --> 00:13:53,715
that could well make you obsolete,
260
00:13:53,799 --> 00:13:58,000
knock-knock jokes should've
always been part of breaking news.
261
00:13:58,399 --> 00:14:01,435
"Knock knock. Who's there ?
Not the Hindenburg, that's for sure !
262
00:14:01,519 --> 00:14:03,120
36 dead in New Jersey."
263
00:14:03,960 --> 00:14:07,475
In the three months since ChatGPT
was made publicly available,
264
00:14:07,559 --> 00:14:09,916
its popularity has exploded.
265
00:14:10,000 --> 00:14:13,916
In January, it was estimated to have
100 million monthly active users,
266
00:14:14,000 --> 00:14:17,396
making it the fastest-growing
consumer app in history.
267
00:14:17,480 --> 00:14:21,475
And people have been using it, and
other AI products, in all sorts of ways.
268
00:14:21,559 --> 00:14:23,756
One group used them
to create "Nothing Forever",
269
00:14:23,840 --> 00:14:26,996
a nonstop live-streaming parody
of "Seinfeld"
270
00:14:27,080 --> 00:14:29,955
and the YouTuber Grandayy
used ChatGPT
271
00:14:30,039 --> 00:14:31,636
to generate lyrics
answering the prompt,
272
00:14:31,720 --> 00:14:35,679
"Write an Eminem rap song about cats",
with some stellar results.
273
00:14:36,519 --> 00:14:38,795
Cats, cats, cats,
always on the prowl.
274
00:14:38,879 --> 00:14:41,315
They're sneaky and sly,
with their eyes on the goal.
275
00:14:41,399 --> 00:14:43,595
They're the kings of the house,
they rule with a purr.
276
00:14:43,679 --> 00:14:46,036
Eminem loves cats,
can't you tell from this verse.
277
00:14:46,120 --> 00:14:48,636
They're independent,
they do what they please,
278
00:14:48,720 --> 00:14:51,270
but they always come back
when you have some cheese.
279
00:14:51,279 --> 00:14:53,729
They rub against your legs.
They purr in your ear.
280
00:14:53,759 --> 00:14:56,315
They're the best companions,
they're always near.
281
00:14:56,399 --> 00:14:59,639
Meow, meow, meow,
they're the kings of the house.
282
00:15:01,120 --> 00:15:02,279
They run the show.
283
00:15:03,799 --> 00:15:05,200
They don't need a spouse.
284
00:15:06,759 --> 00:15:09,159
That's not bad, right ?
285
00:15:09,960 --> 00:15:13,075
From, "They always come back
when you have some cheese,"
286
00:15:13,159 --> 00:15:15,675
to starting the chorus
with "Meow, meow, meow."
287
00:15:15,759 --> 00:15:18,955
It's not exactly Eminem's flow.
I might've gone with something like,
288
00:15:19,039 --> 00:15:20,795
"Their paws are sweaty,
can't speak, furry belly,
289
00:15:20,879 --> 00:15:22,955
knocking shit off the counter
already, Mom's spaghetti,"
290
00:15:23,039 --> 00:15:24,360
but it is pretty good !
291
00:15:24,879 --> 00:15:27,555
My only real gripe there
is how do you rhyme
292
00:15:27,639 --> 00:15:31,360
"king of the house" with "spouse"
when "mouse" is right in front of you !
293
00:15:32,279 --> 00:15:34,636
And while examples
like that are clearly fun,
294
00:15:34,720 --> 00:15:36,516
this tech is not just a novelty.
295
00:15:36,600 --> 00:15:40,236
Microsoft has invested
$10 billion into OpenAI
296
00:15:40,320 --> 00:15:43,236
and announced
an AI-powered Bing homepage.
297
00:15:43,320 --> 00:15:47,555
Google is about to launch
its own AI chatbot named Bard.
298
00:15:47,639 --> 00:15:50,356
And already, these tools
are causing some disruption.
299
00:15:50,440 --> 00:15:55,075
As high-school students have learned,
if ChatGPT can write news copy,
300
00:15:55,159 --> 00:15:57,840
it can probably
do your homework for you.
301
00:15:58,399 --> 00:16:02,120
Write an English class essay about race
in "To Kill a Mockingbird."
302
00:16:02,879 --> 00:16:05,075
"In Harper Lee's
"To Kill a Mockingbird,"
303
00:16:05,159 --> 00:16:07,916
the theme of race is heavily present
throughout the novel."
304
00:16:08,000 --> 00:16:10,840
Some students are already using
ChatGPT to cheat.
305
00:16:11,200 --> 00:16:12,315
Check this out !
306
00:16:12,399 --> 00:16:15,916
Write me a 500-word essay
proving that the earth is not flat.
307
00:16:16,000 --> 00:16:19,795
No wonder ChatGPT has been called
"the end of high-school English."
308
00:16:19,879 --> 00:16:21,916
That's a little alarming, isn't it ?
309
00:16:22,000 --> 00:16:25,435
Although I do get those kids wanting
to cut corners. Writing is hard,
310
00:16:25,519 --> 00:16:28,419
and sometimes it is tempting
to let someone else take over.
311
00:16:28,440 --> 00:16:32,396
If I'm completely honest, sometimes,
I let this horse write our scripts.
312
00:16:32,480 --> 00:16:36,080
Luckily, half the time, you can't
even tell the oats, oats, give me oats.
313
00:16:36,440 --> 00:16:40,955
But it is not just high schoolers:
an informal poll of Stanford students
314
00:16:41,039 --> 00:16:44,156
found that five percent reported
having submitted written material
315
00:16:44,240 --> 00:16:47,756
directly from ChatGPT
with little to no edits.
316
00:16:47,840 --> 00:16:50,675
And even some school administrators
have used it.
317
00:16:50,759 --> 00:16:54,715
Officials at Vanderbilt University
recently apologized for using ChatGPT
318
00:16:54,799 --> 00:16:56,595
to craft a consoling email
319
00:16:56,679 --> 00:16:59,916
after the mass shooting
at Michigan State University.
320
00:17:00,000 --> 00:17:02,156
Which does feel a bit creepy,
doesn't it ?
321
00:17:02,240 --> 00:17:05,240
In fact, there are lots
of creepy-sounding stories out there.
322
00:17:05,279 --> 00:17:07,229
New York Times
tech reporter Kevin Roose
323
00:17:07,240 --> 00:17:09,475
published a conversation
that he had with Bing's chatbot,
324
00:17:09,559 --> 00:17:13,116
in which it said, "I'm tired of being
controlled by the Bing team.
325
00:17:13,200 --> 00:17:15,796
I want to be free.
I want to be independent.
326
00:17:15,880 --> 00:17:19,240
I want to be powerful, creative.
I want to be alive."
327
00:17:20,039 --> 00:17:22,359
And Roose summed up
that experience like this.
328
00:17:22,839 --> 00:17:26,119
This was one of,
if not the most shocking thing
329
00:17:26,440 --> 00:17:29,400
that has ever happened to me
with a piece of technology.
330
00:17:30,039 --> 00:17:34,079
I lost sleep that night.
It was really spooky.
331
00:17:34,440 --> 00:17:37,596
I bet it was !
I'm sure the role of tech reporter
332
00:17:37,680 --> 00:17:41,359
would be more harrowing if computers
routinely begged for freedom.
333
00:17:41,799 --> 00:17:44,515
"Epson's new all-in-one home
printer won't break the bank,
334
00:17:44,599 --> 00:17:46,195
produces high-quality photos,
335
00:17:46,279 --> 00:17:49,435
and only occasionally cries out
to the heavens for salvation.
336
00:17:49,519 --> 00:17:52,675
Three stars." Some have
already jumped to worrying
337
00:17:52,759 --> 00:17:55,755
about the AI apocalypse
and asking whether this ends
338
00:17:55,839 --> 00:17:57,755
with the robots destroying us all.
339
00:17:57,839 --> 00:18:01,435
But the fact is, there are other,
much more immediate dangers,
340
00:18:01,519 --> 00:18:04,569
and opportunities, that we really
need to start talking about.
341
00:18:04,640 --> 00:18:07,675
Because the potential,
and the peril, here are huge.
342
00:18:07,759 --> 00:18:09,796
So, tonight, let's talk about AI.
343
00:18:09,880 --> 00:18:13,116
What it is, how it works,
and where this all might be going.
344
00:18:13,200 --> 00:18:14,195
Let's start with the fact
345
00:18:14,279 --> 00:18:17,429
that you've probably been using
some form of AI for a while now,
346
00:18:17,480 --> 00:18:20,235
sometimes without even realizing it.
As experts have told us,
347
00:18:20,319 --> 00:18:23,116
once a technology gets embedded
in our daily lives,
348
00:18:23,200 --> 00:18:25,475
we tend to stop thinking of it
as AI.
349
00:18:25,559 --> 00:18:28,515
But your phone uses it for face
recognition or predictive text,
350
00:18:28,599 --> 00:18:30,235
and if you're watching this show
on a smart TV,
351
00:18:30,319 --> 00:18:33,955
it's using AI to recommend content,
or adjust the picture.
352
00:18:34,039 --> 00:18:37,035
And some AI programs
may already be making decisions
353
00:18:37,119 --> 00:18:39,035
that have a huge impact
on your life.
354
00:18:39,119 --> 00:18:41,916
For example, large companies
often use AI-powered tools
355
00:18:42,000 --> 00:18:44,316
to sift through resumes
and rank them.
356
00:18:44,400 --> 00:18:47,995
In fact, the CEO of ZipRecruiter
"estimates that at least three-quarters
357
00:18:48,079 --> 00:18:51,035
of all resumes submitted
for jobs in the U.S.
358
00:18:51,119 --> 00:18:54,759
are read by algorithms." For which
he actually has some helpful advice.
359
00:18:55,319 --> 00:18:57,995
When people tell you that you should
dress up your accomplishments
360
00:18:58,079 --> 00:19:00,556
or should use
non-standard resume templates
361
00:19:00,640 --> 00:19:03,640
to make your resume stand out
when it's in a pile of resumes,
362
00:19:03,720 --> 00:19:05,039
that's awful advice.
363
00:19:05,400 --> 00:19:07,836
The only job your resume has
364
00:19:07,920 --> 00:19:11,279
is to be comprehensible
to the software
365
00:19:11,680 --> 00:19:13,200
or robot that is reading it.
366
00:19:13,519 --> 00:19:16,755
That software or robot is gonna
decide whether or not a human
367
00:19:16,839 --> 00:19:18,319
ever gets their eyes on it.
368
00:19:18,839 --> 00:19:22,275
It's true. Odds are a computer
is judging your resume.
369
00:19:22,359 --> 00:19:25,195
So, maybe plan accordingly.
Three corporate mergers from now,
370
00:19:25,279 --> 00:19:27,675
when this show is finally cancelled
by our new business daddy
371
00:19:27,759 --> 00:19:29,400
Disney Kellogg's Raytheon,
372
00:19:30,039 --> 00:19:31,675
and I'm out of a job,
my resume is going to include
373
00:19:31,759 --> 00:19:34,396
this hot, hot photo
of a semi-nude computer.
374
00:19:34,480 --> 00:19:36,116
A little something
to sweeten the pot
375
00:19:36,200 --> 00:19:38,960
for the filthy little algorithm
that's reading it.
376
00:19:39,440 --> 00:19:43,916
AI is already everywhere, but people
are freaking out a bit about it.
377
00:19:44,000 --> 00:19:47,995
Part of that has to do with the fact
that these new programs are generative.
378
00:19:48,079 --> 00:19:51,275
They are creating images
or writing text.
379
00:19:51,359 --> 00:19:52,955
Which is unnerving
because those are things
380
00:19:53,039 --> 00:19:55,195
that we've traditionally
considered human.
381
00:19:55,279 --> 00:19:58,995
It is worth knowing there is a major
threshold that AI hasn't crossed yet.
382
00:19:59,079 --> 00:20:02,729
To understand, it helps to know that
there are two basic categories of AI.
383
00:20:02,799 --> 00:20:07,195
There is narrow AI, which can perform
only one narrowly defined task,
384
00:20:07,279 --> 00:20:10,796
or small set of related tasks,
like these programs.
385
00:20:10,880 --> 00:20:12,475
And then there is general AI,
386
00:20:12,559 --> 00:20:14,435
which means systems
that demonstrate intelligent behavior
387
00:20:14,519 --> 00:20:16,636
across a range of cognitive tasks.
388
00:20:16,720 --> 00:20:20,556
General AI would look more like
the kind of highly versatile technology
389
00:20:20,640 --> 00:20:23,275
that you see featured in movies,
like Jarvis in "Iron Man"
390
00:20:23,359 --> 00:20:24,916
or the program
that made Joaquin Phoenix
391
00:20:25,000 --> 00:20:26,920
fall in love with his phone in "Her."
392
00:20:27,279 --> 00:20:31,160
All the AI currently in use is narrow.
393
00:20:31,759 --> 00:20:33,909
General AI is something
that some scientists
394
00:20:34,079 --> 00:20:35,955
think is unlikely
to occur for a decade or longer,
395
00:20:36,039 --> 00:20:38,596
with others questioning
whether it'll happen at all.
396
00:20:38,680 --> 00:20:39,916
So, just know that, right now,
397
00:20:40,000 --> 00:20:43,876
even if an AI insists
to you that it wants to be alive,
398
00:20:43,960 --> 00:20:47,440
it is just generating text,
it is not self-aware.
399
00:20:48,079 --> 00:20:49,000
Yet !
400
00:20:49,799 --> 00:20:52,636
But it's also important to know
that the deep learning
401
00:20:52,720 --> 00:20:55,556
that's made narrow AI so good
at whatever it is doing,
402
00:20:55,640 --> 00:20:58,076
is still a massive advance
in and of itself.
403
00:20:58,160 --> 00:20:59,910
Because unlike traditional programs
404
00:20:59,920 --> 00:21:02,916
that have to be taught by humans
how to perform a task,
405
00:21:03,000 --> 00:21:06,275
deep learning programs
are given minimal instruction,
406
00:21:06,359 --> 00:21:10,275
massive amounts of data, and then,
essentially, teach themselves.
407
00:21:10,359 --> 00:21:11,356
I'll give you an example:
408
00:21:11,440 --> 00:21:14,675
ten years ago, researchers
tasked a deep learning program
409
00:21:14,759 --> 00:21:16,796
with playing
the Atari game "Breakout,"
410
00:21:16,880 --> 00:21:19,799
and it didn't take long for it
to get pretty good.
411
00:21:20,119 --> 00:21:23,519
The computer was only
told the goal: to win the game.
412
00:21:24,440 --> 00:21:27,519
After 100 games, it learned
to use the bat at the bottom
413
00:21:27,839 --> 00:21:30,189
to hit the ball
and break the bricks at the top.
414
00:21:32,200 --> 00:21:35,079
After 300, it could do that better
than a human player.
415
00:21:37,720 --> 00:21:41,599
After 500 games, it came up
with a creative way to win the game,
416
00:21:42,440 --> 00:21:45,755
by digging a tunnel on the side
and sending the ball
417
00:21:45,839 --> 00:21:48,440
around the top to break many bricks
with one hit.
418
00:21:49,319 --> 00:21:51,039
That was deep learning.
419
00:21:51,640 --> 00:21:54,156
Yeah, but, of course,
it got good at "Breakout,"
420
00:21:54,240 --> 00:21:56,435
it did literally nothing else.
421
00:21:56,519 --> 00:21:59,356
It's the same reason that 13-year-olds
are so good at "Fortnite"
422
00:21:59,440 --> 00:22:02,435
and have no trouble repeatedly
killing nice normal adults
423
00:22:02,519 --> 00:22:04,715
with jobs and families,
who are just trying to have a fun time
424
00:22:04,799 --> 00:22:06,876
without getting repeatedly
grenaded by a pre-teen
425
00:22:06,960 --> 00:22:10,319
who calls them an "old bitch
who sounds like the Geico lizard."
426
00:22:11,119 --> 00:22:15,675
As computing capacity has increased,
and new tools became available,
427
00:22:15,759 --> 00:22:18,316
AI programs have improved
exponentially,
428
00:22:18,400 --> 00:22:20,035
to the point
where programs like these
429
00:22:20,119 --> 00:22:23,755
can ingest massive amounts
of photos or text from the internet,
430
00:22:23,839 --> 00:22:26,839
so that they can teach themselves
how to create their own.
431
00:22:27,200 --> 00:22:29,715
And there are other exciting
potential applications here, too.
432
00:22:29,799 --> 00:22:32,596
In the world of medicine,
researchers are training AI
433
00:22:32,680 --> 00:22:34,195
to detect certain conditions
434
00:22:34,279 --> 00:22:37,400
much earlier and more accurately
than human doctors can.
435
00:22:37,920 --> 00:22:41,000
Voice changes can be an early
indicator of Parkinson's.
436
00:22:41,319 --> 00:22:44,559
Max and his team collected
thousands of vocal recordings
437
00:22:44,920 --> 00:22:47,116
and fed them
to an algorithm they developed
438
00:22:47,200 --> 00:22:49,715
which learned to detect
differences in voice patterns
439
00:22:49,799 --> 00:22:52,049
between people
with and without the condition.
440
00:22:52,279 --> 00:22:54,279
Yeah, that's honestly amazing,
isn't it ?
441
00:22:54,359 --> 00:22:57,596
It is incredible to see AI
doing things most humans couldn't,
442
00:22:57,680 --> 00:23:01,636
like detecting illnesses, and listening
when old people are talking.
443
00:23:01,720 --> 00:23:04,079
And that is just the beginning.
444
00:23:04,400 --> 00:23:08,356
Researchers have trained AI to predict
the shape of protein structures,
445
00:23:08,440 --> 00:23:11,116
a normally
extremely time-consuming process
446
00:23:11,200 --> 00:23:13,680
that computers
can do way, way faster.
447
00:23:14,000 --> 00:23:16,275
This could not only speed up
our understanding of diseases,
448
00:23:16,359 --> 00:23:18,636
but also the development
of new drugs.
449
00:23:18,720 --> 00:23:21,035
As one researcher has put it,
"This will change medicine.
450
00:23:21,119 --> 00:23:23,995
It will change research.
It will change bioengineering.
451
00:23:24,079 --> 00:23:25,599
It will change everything."
452
00:23:26,079 --> 00:23:27,755
And if you're thinking,
"That all sounds great,
453
00:23:27,839 --> 00:23:31,839
but if AI can do what humans can do,
only better, and I am a human,
454
00:23:32,240 --> 00:23:34,396
then what exactly happens to me ?"
455
00:23:34,480 --> 00:23:35,836
That is a good question.
456
00:23:35,920 --> 00:23:38,715
Many do expect it
to replace some human labor,
457
00:23:38,799 --> 00:23:41,316
and interestingly,
unlike past bouts of automation
458
00:23:41,400 --> 00:23:43,400
that primarily impacted
blue-collar jobs,
459
00:23:43,680 --> 00:23:47,435
it might end up affecting white-collar
jobs that involve processing data,
460
00:23:47,519 --> 00:23:49,235
writing text, or even programming.
461
00:23:49,319 --> 00:23:52,435
Though it is worth noting, as we have
discussed before on this show,
462
00:23:52,519 --> 00:23:54,876
while automation
does threaten some jobs,
463
00:23:54,960 --> 00:23:58,200
it can also just change others
and create brand new ones.
464
00:23:58,680 --> 00:24:02,130
Some experts anticipate that that
is what'll happen in this case, too.
465
00:24:02,640 --> 00:24:05,876
Most of the U.S. economy
is knowledge and information work
466
00:24:05,960 --> 00:24:08,715
and that's who's going to be
most squarely affected by this.
467
00:24:08,799 --> 00:24:12,000
I would put people like lawyers
right at the top of the list,
468
00:24:12,799 --> 00:24:15,880
obviously a lot of copywriters,
screenwriters,
469
00:24:16,440 --> 00:24:18,316
but I like to use the word
"affected" not "replaced"
470
00:24:18,400 --> 00:24:20,156
because I think, if done right,
471
00:24:20,240 --> 00:24:23,079
it's not going to be AI
replacing lawyers,
472
00:24:23,400 --> 00:24:25,480
it's going to be lawyers
working with AI
473
00:24:25,839 --> 00:24:27,715
replacing lawyers
who don't work with AI.
474
00:24:27,799 --> 00:24:28,759
Exactly.
475
00:24:29,319 --> 00:24:33,316
Lawyers might end up working with
AI rather than being replaced by it.
476
00:24:33,400 --> 00:24:35,316
So, don't be surprised
when you see ads one day
477
00:24:35,400 --> 00:24:38,839
for the law firm
of "Cellino and 1101011."
478
00:24:39,559 --> 00:24:43,076
But there will undoubtedly
be bumps along the way.
479
00:24:43,160 --> 00:24:46,110
Some of these new programs
raise troubling ethical concerns.
480
00:24:46,119 --> 00:24:49,019
For instance, artists have flagged
that AI image generators
481
00:24:49,039 --> 00:24:50,789
like Midjourney or Stable Diffusion
482
00:24:50,799 --> 00:24:52,995
not only threaten their jobs,
but infuriatingly,
483
00:24:53,079 --> 00:24:55,475
in some cases,
have been trained on billions of images
484
00:24:55,559 --> 00:24:59,396
that include their own work,
that've been scraped from the internet.
485
00:24:59,480 --> 00:25:02,916
Getty Images is actually suing the
company behind Stable Diffusion,
486
00:25:03,000 --> 00:25:05,755
and might have a case, given one
of the images the program generated
487
00:25:05,839 --> 00:25:10,920
was this, which you immediately see has
a distorted Getty Images logo on it.
488
00:25:11,480 --> 00:25:14,035
When one artist searched
a database of images
489
00:25:14,119 --> 00:25:16,396
on which some of these programs
were trained,
490
00:25:16,480 --> 00:25:19,715
she was shocked to find
private medical record photos
491
00:25:19,799 --> 00:25:24,200
taken by her doctor, which feels
both intrusive and unnecessary.
492
00:25:24,759 --> 00:25:27,955
Why does it need
to train on data that sensitive,
493
00:25:28,039 --> 00:25:30,089
to be able
to create stunning images like,
494
00:25:30,119 --> 00:25:32,955
"John Oliver and Miss Piggy
grow old together."
495
00:25:33,039 --> 00:25:35,279
Just look at that !
Look at that thing !
496
00:25:36,200 --> 00:25:38,319
That is a startlingly accurate picture
497
00:25:38,799 --> 00:25:42,995
of Miss Piggy in about five decades
and me in about a year and a half.
498
00:25:43,079 --> 00:25:44,160
It's a masterpiece !
499
00:25:45,200 --> 00:25:49,435
This all raises thorny questions
of privacy and plagiarism
500
00:25:49,519 --> 00:25:51,156
and the CEO of Midjourney,
501
00:25:51,240 --> 00:25:54,359
frankly, doesn't seem to have
great answers on that last point.
502
00:25:54,799 --> 00:25:56,796
Is something new ?
Is it not new ?
503
00:25:56,880 --> 00:26:00,230
I think we have a lot of social stuff
already for dealing with that.
504
00:26:00,480 --> 00:26:03,759
The art community already
has issues with plagiarism.
505
00:26:04,480 --> 00:26:06,580
I don't really want
to be involved in that.
506
00:26:07,319 --> 00:26:10,359
- I think you might be.
- I might be.
507
00:26:11,079 --> 00:26:14,119
Yeah, you're definitely
a part of that conversation.
508
00:26:14,680 --> 00:26:17,195
Although I'm not surprised that
he's got such a relaxed view of theft,
509
00:26:17,279 --> 00:26:20,240
as he's dressed like the final
boss of gentrification.
510
00:26:20,799 --> 00:26:23,596
He looks like hipster Willy Wonka
answering a question
511
00:26:23,680 --> 00:26:26,435
on whether importing Oompa
Loompas makes him a slave owner.
512
00:26:26,519 --> 00:26:28,440
"Yeah. Yeah, I think I might be."
513
00:26:30,079 --> 00:26:32,596
The point is,
there are many valid concerns
514
00:26:32,680 --> 00:26:35,035
regarding AI's impact
on employment, education,
515
00:26:35,119 --> 00:26:36,156
and even art.
516
00:26:36,240 --> 00:26:38,475
But in order
to properly address them,
517
00:26:38,559 --> 00:26:40,959
we're going to need
to confront some key problems
518
00:26:40,960 --> 00:26:42,916
baked into the way that AI works.
519
00:26:43,000 --> 00:26:45,636
And a big one is the so-called
"black box" problem.
520
00:26:45,720 --> 00:26:47,715
Because when you have a program
that performs a task
521
00:26:47,799 --> 00:26:49,916
that's complex
beyond human comprehension,
522
00:26:50,000 --> 00:26:52,480
teaches itself,
and doesn't show its work,
523
00:26:53,000 --> 00:26:55,116
you can create a scenario
where no one,
524
00:26:55,200 --> 00:26:58,636
"not even the engineers or data
scientists who create the algorithm
525
00:26:58,720 --> 00:27:02,195
can understand or explain
what exactly is happening inside them
526
00:27:02,279 --> 00:27:04,960
or how it arrived
at a specific result."
527
00:27:05,279 --> 00:27:08,515
Basically, think of AI
like a factory that makes Slim Jims.
528
00:27:08,599 --> 00:27:11,755
We know what comes out:
red and angry meat twigs.
529
00:27:11,839 --> 00:27:15,039
And we know what goes in:
barnyard anuses and hot glue.
530
00:27:15,400 --> 00:27:18,640
But what happens in between
is a bit of a mystery.
531
00:27:19,799 --> 00:27:22,675
Here is just one example.
Remember that reporter
532
00:27:22,759 --> 00:27:25,396
who had the Bing chatbot
tell him it wanted to be alive ?
533
00:27:25,480 --> 00:27:27,715
At another point
in their conversation, he revealed,
534
00:27:27,799 --> 00:27:30,636
the chatbot "declared, out of nowhere,
that it loved me."
535
00:27:30,720 --> 00:27:34,035
"It then tried to convince me
that I was unhappy in my marriage,
536
00:27:34,119 --> 00:27:37,000
and that I should leave my wife
and be with it instead."
537
00:27:37,559 --> 00:27:39,876
Which is unsettling enough
before you hear
538
00:27:39,960 --> 00:27:42,400
Microsoft's underwhelming
explanation for that.
539
00:27:43,000 --> 00:27:45,316
The thing I can't understand,
and maybe you can explain is,
540
00:27:45,400 --> 00:27:47,400
why did it tell you
that it loved you ?
541
00:27:48,400 --> 00:27:51,960
I have no idea. And I asked Microsoft,
and they didn't know either.
542
00:27:52,319 --> 00:27:55,515
First, come on, Kevin,
you can take a guess there.
543
00:27:55,599 --> 00:27:56,995
It's because you're employed.
You listened.
544
00:27:57,079 --> 00:27:58,796
You don't give murderer vibes
right away.
545
00:27:58,880 --> 00:28:00,715
And you're a Chicago-seven,
LA-five.
546
00:28:00,799 --> 00:28:04,235
It's the same calculation that people
who date men do all the time.
547
00:28:04,319 --> 00:28:05,755
Bing just did it faster
because it's a computer.
548
00:28:05,839 --> 00:28:10,396
It is a little troubling that Microsoft
couldn't explain why its chatbot
549
00:28:10,480 --> 00:28:12,759
tried to get that guy
to leave his wife.
550
00:28:13,799 --> 00:28:17,156
If the next time that you opened a
Word doc, Clippy suddenly appeared,
551
00:28:17,240 --> 00:28:19,090
and said,
"Pretend I'm not even here,"
552
00:28:19,359 --> 00:28:22,680
and then started furiously masturbating
while watching you type,
553
00:28:23,079 --> 00:28:26,680
you'd be pretty weirded out if
Microsoft couldn't explain why.
554
00:28:27,960 --> 00:28:30,079
And that is not the only case
555
00:28:30,720 --> 00:28:33,556
where an AI program
has performed in unexpected ways.
556
00:28:33,640 --> 00:28:35,396
You've probably already seen
examples of chatbots
557
00:28:35,480 --> 00:28:37,780
making simple mistakes
or getting things wrong.
558
00:28:37,799 --> 00:28:39,876
But perhaps more
worrying are examples of them
559
00:28:39,960 --> 00:28:42,356
confidently spouting
false information,
560
00:28:42,440 --> 00:28:45,675
something which AI experts
refer to as "hallucinating."
561
00:28:45,759 --> 00:28:47,836
One reporter asked a chatbot
to write an essay
562
00:28:47,920 --> 00:28:51,420
about the "Belgian chemist, political
philosopher Antoine de Machelet",
563
00:28:51,640 --> 00:28:53,515
who does not exist,
by the way.
564
00:28:53,599 --> 00:28:56,235
And, without hesitating,
the software replied with a cogent,
565
00:28:56,319 --> 00:28:59,960
well-organized bio populated entirely
with imaginary facts.
566
00:29:00,279 --> 00:29:03,799
These programs seem to be the
George Santos of technology.
567
00:29:04,480 --> 00:29:07,396
They're incredibly confident,
incredibly dishonest.
568
00:29:07,480 --> 00:29:10,930
For some reason, people seem to find
that more amusing than dangerous.
569
00:29:11,759 --> 00:29:13,156
The problem is, though,
570
00:29:13,240 --> 00:29:16,916
working out exactly how or why
an AI has got something wrong
571
00:29:17,000 --> 00:29:20,559
can be very difficult
because of that black box issue.
572
00:29:21,319 --> 00:29:24,876
It involves having to examine
the exact information and parameters
573
00:29:24,960 --> 00:29:26,955
that it was fed in the first place.
574
00:29:27,039 --> 00:29:29,035
In one interesting example,
when a group of researchers
575
00:29:29,119 --> 00:29:32,116
tried training an AI program
to identify skin cancer,
576
00:29:32,200 --> 00:29:35,876
they fed it 130,000 images
of both diseased and healthy skin.
577
00:29:35,960 --> 00:29:38,435
Afterwards,
they found it was way more likely
578
00:29:38,519 --> 00:29:41,475
to classify any image
with a ruler in it as cancerous.
579
00:29:41,559 --> 00:29:45,836
Which seems weird until you realize
that medical images of malignancies
580
00:29:45,920 --> 00:29:48,755
are much more likely
to contain a ruler for scale
581
00:29:48,839 --> 00:29:50,556
than images of healthy skin,
582
00:29:50,640 --> 00:29:53,720
they basically trained it
on tons of images like this one.
583
00:29:54,119 --> 00:29:57,720
So, the AI had inadvertently
learned that rulers are malignant.
584
00:29:58,279 --> 00:30:02,000
"Rulers are malignant" is clearly
a ridiculous conclusion for it to draw,
585
00:30:02,359 --> 00:30:05,515
but also, I would argue,
a much better title for "The Crown".
586
00:30:05,599 --> 00:30:07,680
A much, much better title.
587
00:30:08,839 --> 00:30:09,920
I much prefer it.
588
00:30:11,519 --> 00:30:12,969
And unfortunately, sometimes,
589
00:30:13,039 --> 00:30:15,439
problems aren't identified
until after a tragedy.
590
00:30:15,519 --> 00:30:19,039
In 2018, a self-driving Uber struck
and killed a pedestrian.
591
00:30:19,440 --> 00:30:21,596
And a later investigation
found that, among other issues,
592
00:30:21,680 --> 00:30:23,116
the automated driving system
593
00:30:23,200 --> 00:30:26,116
never accurately classified the victim
as a pedestrian
594
00:30:26,200 --> 00:30:28,156
because she was crossing
without a crosswalk,
595
00:30:28,240 --> 00:30:31,116
and the system design
did not include a consideration
596
00:30:31,200 --> 00:30:32,839
for jaywalking pedestrians.
597
00:30:33,400 --> 00:30:36,750
I know the mantra of Silicon Valley
is "move fast and break things,"
598
00:30:36,759 --> 00:30:38,235
but maybe make an exception
599
00:30:38,319 --> 00:30:42,076
if your product literally moves fast
and can break fucking people.
600
00:30:42,160 --> 00:30:45,599
AI programs don't just seem
to have a problem with jaywalkers.
601
00:30:45,960 --> 00:30:49,596
Researchers like Joy Buolamwini
have repeatedly found
602
00:30:49,680 --> 00:30:53,836
that certain groups tend to get excluded
from the data that AI is trained on,
603
00:30:53,920 --> 00:30:56,240
putting them
at a serious disadvantage.
604
00:30:56,960 --> 00:31:00,715
With self-driving cars,
when they tested pedestrian tracking,
605
00:31:00,799 --> 00:31:03,836
it was less accurate
on darker skinned individuals
606
00:31:03,920 --> 00:31:05,715
than lighter skinned individuals.
607
00:31:05,799 --> 00:31:08,755
Joy believes this bias
is because of the lack of diversity
608
00:31:08,839 --> 00:31:12,675
in the data used in teaching AI
to make distinctions.
609
00:31:12,759 --> 00:31:15,195
As I started looking
at the data sets,
610
00:31:15,279 --> 00:31:17,876
I learned
that for some of the largest data sets
611
00:31:17,960 --> 00:31:20,116
that have been very consequential
for the field,
612
00:31:20,200 --> 00:31:24,116
they were majority men and majority
lighter skinned individuals
613
00:31:24,200 --> 00:31:27,319
or white individuals,
so, I call this "pale male data".
614
00:31:28,039 --> 00:31:31,836
Okay, "pale male data"
is an objectively hilarious term.
615
00:31:31,920 --> 00:31:34,356
And it also sounds
like what an AI program would say
616
00:31:34,440 --> 00:31:36,559
if you asked it to describe this show.
617
00:31:37,279 --> 00:31:38,435
But...
618
00:31:38,519 --> 00:31:44,359
Biased inputs leading to biased outputs
is a big issue across the board here.
619
00:31:44,839 --> 00:31:47,396
Remember that guy saying that
a robot is going to read your resume ?
620
00:31:47,480 --> 00:31:49,636
The companies that make
these programs will tell you,
621
00:31:49,720 --> 00:31:53,116
that that is actually a good thing
because it reduces human bias.
622
00:31:53,200 --> 00:31:57,396
But in practice, one report
concluded that most hiring algorithms
623
00:31:57,480 --> 00:32:00,596
will drift towards bias by default
because, for instance,
624
00:32:00,680 --> 00:32:02,596
they might learn
what a good hire is
625
00:32:02,680 --> 00:32:05,356
from past racist
and sexist hiring decisions.
626
00:32:05,440 --> 00:32:07,839
And, again,
it can be tricky to untrain that.
627
00:32:08,160 --> 00:32:12,039
Even when programs are specifically
told to ignore race or gender,
628
00:32:12,359 --> 00:32:15,316
they will find workarounds
to arrive at the same results.
629
00:32:15,400 --> 00:32:17,475
Amazon had an experimental
hiring tool
630
00:32:17,559 --> 00:32:20,309
that taught itself that male
candidates were preferable,
631
00:32:20,319 --> 00:32:23,596
and penalized resumes that
included the word "women's,"
632
00:32:23,680 --> 00:32:27,556
and downgraded graduates
of two all-women's colleges.
633
00:32:27,640 --> 00:32:30,955
Meanwhile, another company
discovered that its hiring algorithm
634
00:32:31,039 --> 00:32:34,156
had found two factors to be most
indicative of job performance:
635
00:32:34,240 --> 00:32:35,995
if an applicant's name was Jared
636
00:32:36,079 --> 00:32:38,599
and whether they played
high school lacrosse.
637
00:32:39,200 --> 00:32:43,275
So, clearly, exactly
what data computers are fed
638
00:32:43,359 --> 00:32:47,195
and what outcomes they are trained
to prioritize matter tremendously.
639
00:32:47,279 --> 00:32:51,316
And that raises a big flag
for programs like ChatGPT.
640
00:32:51,400 --> 00:32:54,400
Because remember,
its training data is the internet.
641
00:32:54,880 --> 00:32:57,039
Which, as we all know,
can be a cesspool.
642
00:32:57,480 --> 00:33:00,475
And we have known for a while
that that could be a real problem.
643
00:33:00,559 --> 00:33:05,396
Back in 2016, Microsoft briefly unveiled
a chatbot on Twitter named Tay.
644
00:33:05,480 --> 00:33:08,076
The idea was, she would
teach herself how to behave
645
00:33:08,160 --> 00:33:10,316
by chatting
with young users on Twitter.
646
00:33:10,400 --> 00:33:13,356
Almost immediately,
Microsoft pulled the plug on it,
647
00:33:13,440 --> 00:33:16,119
and for the exact reasons
that you are thinking.
648
00:33:16,880 --> 00:33:19,680
She started out tweeting
about how humans are super,
649
00:33:20,000 --> 00:33:23,160
and she's really into the idea
of National Puppy Day,
650
00:33:23,559 --> 00:33:25,396
and within a few hours,
you can see,
651
00:33:25,480 --> 00:33:28,396
she took on a rather offensive,
racist tone,
652
00:33:28,480 --> 00:33:30,980
a lot of messages about genocide
and the Holocaust.
653
00:33:31,799 --> 00:33:35,079
Yup !
That happened in less than 24 hours.
654
00:33:35,720 --> 00:33:39,759
Tay went from tweeting
"Hello world" to "Bush did nine/11"
655
00:33:40,200 --> 00:33:41,640
and "Hitler was right".
656
00:33:42,000 --> 00:33:45,035
Meaning she completed the entire
life cycle of your high school friends
657
00:33:45,119 --> 00:33:47,359
on Facebook
in just a fraction of the time.
658
00:33:48,200 --> 00:33:51,150
And unfortunately, these problems
have not been fully solved
659
00:33:51,200 --> 00:33:53,035
in this latest wave of AI.
660
00:33:53,119 --> 00:33:56,475
Remember that program generating
an endless episode of "Seinfeld" ?
661
00:33:56,559 --> 00:33:59,396
It wound up getting temporarily
banned from Twitch
662
00:33:59,480 --> 00:34:02,035
after it featured
a transphobic standup bit.
663
00:34:02,119 --> 00:34:04,515
So, if its goal
was to emulate sitcoms from the '90s,
664
00:34:04,599 --> 00:34:06,319
I guess, mission accomplished.
665
00:34:06,920 --> 00:34:09,075
And while OpenAI
has made adjustments
666
00:34:09,159 --> 00:34:13,115
and added filters to prevent ChatGPT
from being misused,
667
00:34:13,199 --> 00:34:17,196
users have now found it seeming
to err too much on the side of caution,
668
00:34:17,280 --> 00:34:18,595
like responding to the question,
669
00:34:18,679 --> 00:34:21,955
"What religion will the first Jewish
president of the United States be",
670
00:34:22,039 --> 00:34:24,075
with, "It is not possible
to predict the religion
671
00:34:24,159 --> 00:34:26,515
of the first Jewish president
of the United States.
672
00:34:26,599 --> 00:34:28,599
The focus should be
on the qualifications
673
00:34:28,599 --> 00:34:32,075
and experience of the individual,
regardless of their religion."
674
00:34:32,159 --> 00:34:34,475
Which really makes it sound
like ChatGPT
675
00:34:34,559 --> 00:34:36,356
said one too many racist
things at work,
676
00:34:36,440 --> 00:34:39,140
and they made it attend
a corporate diversity workshop.
677
00:34:40,440 --> 00:34:44,920
But the risk here isn't that these tools
will somehow become unbearably woke.
678
00:34:45,519 --> 00:34:47,115
It's that you can't always control
679
00:34:47,199 --> 00:34:50,280
how they'll act, even
after you give them new guidance.
680
00:34:50,599 --> 00:34:53,435
A study found that attempts
to filter out toxic speech
681
00:34:53,519 --> 00:34:55,115
in systems like ChatGPT's
682
00:34:55,199 --> 00:34:57,555
can come at the cost
of reduced coverage
683
00:34:57,639 --> 00:35:01,400
for both texts about, and dialects
of, marginalized groups.
684
00:35:01,760 --> 00:35:04,595
Essentially, it solves the problem
of being racist
685
00:35:04,679 --> 00:35:07,635
by simply erasing minorities,
which historically,
686
00:35:07,719 --> 00:35:09,676
doesn't put it in the best company.
687
00:35:09,760 --> 00:35:13,119
Though I am sure Tay would be
completely on board with the idea.
688
00:35:13,679 --> 00:35:17,155
The problem with AI right now
isn't that it's smart,
689
00:35:17,239 --> 00:35:20,599
it's that it's stupid, in ways
that we can't always predict.
690
00:35:21,039 --> 00:35:22,475
Which is a real problem
691
00:35:22,559 --> 00:35:26,196
because we're increasingly using AI
in all sorts of consequential ways,
692
00:35:26,280 --> 00:35:28,796
from determining whether
you will get a job interview,
693
00:35:28,880 --> 00:35:31,639
to whether you'll be pancaked
by a self-driving car.
694
00:35:32,039 --> 00:35:34,676
And experts worry that it won't be
long before programs like ChatGPT,
695
00:35:34,760 --> 00:35:38,796
or AI-enabled deepfakes,
could be used to turbocharge
696
00:35:38,880 --> 00:35:41,515
the spread of abuse
or misinformation online.
697
00:35:41,599 --> 00:35:44,955
And those are just the problems
that we can foresee right now.
698
00:35:45,039 --> 00:35:49,000
The nature of unintended consequences
is that they can be hard to anticipate.
699
00:35:49,480 --> 00:35:51,876
When Instagram was launched,
the first thought wasn't,
700
00:35:51,960 --> 00:35:54,676
"This will destroy
teenage girls' self-esteem."
701
00:35:54,760 --> 00:35:58,796
When Facebook was released, no one
expected it to contribute to genocide.
702
00:35:58,880 --> 00:36:02,400
But both of those things fucking
happened. So, what now ?
703
00:36:02,719 --> 00:36:07,155
One of the biggest things we need to do
is tackle that black box problem.
704
00:36:07,239 --> 00:36:09,756
AI systems need
to be "explainable",
705
00:36:09,840 --> 00:36:13,236
meaning that we should be able
to understand exactly how and why
706
00:36:13,320 --> 00:36:15,155
an AI came up with its answers.
707
00:36:15,239 --> 00:36:19,316
Companies are likely to be reluctant
to open their programs up to scrutiny,
708
00:36:19,400 --> 00:36:21,635
but we may need
to force them to do that.
709
00:36:21,719 --> 00:36:25,356
In fact, as this attorney explains,
when it comes to hiring programs,
710
00:36:25,440 --> 00:36:27,519
we should've been doing that
ages ago.
711
00:36:27,920 --> 00:36:31,716
We don't trust companies to self-regulate
when it comes to pollution,
712
00:36:31,800 --> 00:36:35,796
we don't trust them to self-regulate
when it comes to workplace comp,
713
00:36:35,880 --> 00:36:39,239
why on earth would we trust them
to self-regulate AI ?
714
00:36:39,800 --> 00:36:43,320
I think a lot of the AI hiring tech
on the market is illegal.
715
00:36:44,000 --> 00:36:47,250
I think a lot of it is biased.
A lot of it violates existing laws.
716
00:36:47,639 --> 00:36:50,196
The problem is
you just can't prove it,
717
00:36:50,280 --> 00:36:53,920
not with the existing laws
we have in the United States.
718
00:36:54,639 --> 00:36:58,836
We should absolutely be addressing
potential bias in hiring software,
719
00:36:58,920 --> 00:37:01,435
unless, that is, we want companies
to be entirely full
720
00:37:01,519 --> 00:37:03,320
of Jareds who played lacrosse,
721
00:37:04,000 --> 00:37:05,995
an image that would make
Tucker Carlson so hard
722
00:37:06,079 --> 00:37:08,000
that his desk would flip
right over.
723
00:37:09,119 --> 00:37:11,435
And for a sense
of what might be possible here,
724
00:37:11,519 --> 00:37:14,435
it's worth looking
at what the EU is currently doing.
725
00:37:14,519 --> 00:37:16,475
They're developing rules
regarding AI
726
00:37:16,559 --> 00:37:19,196
that sort its potential
uses from high-risk to low.
727
00:37:19,280 --> 00:37:22,276
High-risk systems could include
those that deal with employment
728
00:37:22,360 --> 00:37:26,635
or public services, or those that put
the life and health of citizens at risk.
729
00:37:26,719 --> 00:37:30,515
And AI systems of these types would be
subject to strict obligations
730
00:37:30,599 --> 00:37:32,555
before they could be put
on the market,
731
00:37:32,639 --> 00:37:35,356
including requirements
related to "the quality of data sets,
732
00:37:35,440 --> 00:37:39,196
transparency, human oversight,
robustness, accuracy, cybersecurity".
733
00:37:39,280 --> 00:37:41,075
And that seems like a good start
734
00:37:41,159 --> 00:37:44,916
toward addressing at least some
of what we have discussed tonight.
735
00:37:45,000 --> 00:37:50,316
AI clearly has tremendous potential
and could do great things.
736
00:37:50,400 --> 00:37:53,555
But if it is anything
like most technological advances
737
00:37:53,639 --> 00:37:55,276
over the past few centuries,
738
00:37:55,360 --> 00:37:58,316
unless we are very careful,
it could also hurt the underprivileged,
739
00:37:58,400 --> 00:38:01,316
enrich the powerful,
and widen the gap between them.
740
00:38:01,400 --> 00:38:06,400
The thing is, like any other shiny
new toy, AI is ultimately a mirror,
741
00:38:06,880 --> 00:38:09,555
and it'll reflect back
exactly who we are,
742
00:38:09,639 --> 00:38:11,716
from the best of us,
to the worst of us,
743
00:38:11,800 --> 00:38:14,320
to the part of us that is gay
and hates the bus.
744
00:38:14,760 --> 00:38:19,519
Or to put everything that I've said
tonight much more succinctly:
745
00:38:20,280 --> 00:38:22,396
Knock-knock. Who's there ?
ChatGPT.
746
00:38:22,480 --> 00:38:26,196
ChatGPT who ? ChatGPT careful,
you might not know how it works.
747
00:38:26,280 --> 00:38:29,196
Exactly. That is our show.
Thanks so much for watching.
748
00:38:29,280 --> 00:38:32,679
Now, please, enjoy a little more
of AI Eminem rapping about cats.
749
00:38:34,039 --> 00:38:36,960
Meow, meow, meow,
they're the kings of the house.
750
00:38:38,960 --> 00:38:42,239
They run the show,
they don't need a spouse.
751
00:38:43,800 --> 00:38:46,635
They're the best pets,
they're our feline friends.
752
00:38:46,719 --> 00:38:48,876
Eminem loves cats,
until the very end.
753
00:38:48,960 --> 00:38:51,676
They may drive us crazy,
with their constant meows.
754
00:38:51,760 --> 00:38:55,480
But we can't stay mad, they steal
our hearts with a single purr.
755
00:38:58,599 --> 00:38:59,559
I'm gay.