1
00:00:00,125 --> 00:00:02,168
Do you affirm
the testimony you're about to give before
2
00:00:02,168 --> 00:00:03,878
the committee will be the truth,
the whole truth,
3
00:00:03,878 --> 00:00:15,098
and nothing but the truth,
so help you God?
4
00:00:15,181 --> 00:00:15,515
Chairman
5
00:00:15,515 --> 00:00:18,893
Durbin, Ranking Member Graham,
and members of the Committee.
6
00:00:18,977 --> 00:00:23,064
Every day, teens and young people
do amazing things on our services.
7
00:00:23,106 --> 00:00:26,901
They use our apps to create new things,
express themselves, explore the world
8
00:00:26,901 --> 00:00:30,155
around them, and feel more connected
to the people they care about.
9
00:00:30,238 --> 00:00:33,825
Overall, teens tell us that
this is a positive part of their lives.
10
00:00:33,908 --> 00:00:37,620
But some face challenges online,
so we work hard to provide
11
00:00:37,620 --> 00:00:42,542
parents and teens support
and controls to reduce potential harms.
12
00:00:42,625 --> 00:00:45,462
Being a parent is one of the hardest jobs
in the world.
13
00:00:45,462 --> 00:00:47,964
Technology gives us new ways
to communicate with our kids
14
00:00:47,964 --> 00:00:49,215
and feel connected to their lives.
15
00:00:49,215 --> 00:00:51,676
But it can also make parenting
more complicated.
16
00:00:51,676 --> 00:00:55,972
And it's important to me that our services
are positive for everyone who uses them.
17
00:00:56,056 --> 00:01:00,185
We are on the side of parents everywhere,
working hard to raise their kids.
18
00:01:00,268 --> 00:01:03,646
Over the last eight years, we've built
more than 30 different tools, resources
19
00:01:03,646 --> 00:01:07,484
and features so that parents can set time
limits for their teens using our apps,
20
00:01:07,567 --> 00:01:12,030
see who they're following,
or know if they report someone for bullying.
21
00:01:12,113 --> 00:01:14,157
For teens,
we've added nudges to remind them
22
00:01:14,157 --> 00:01:15,825
when they've been using Instagram
for a while
23
00:01:15,825 --> 00:01:17,869
or if it's getting late
and they should go to sleep
24
00:01:17,869 --> 00:01:21,623
as well as ways to hide words or people
without those people finding out.
25
00:01:21,706 --> 00:01:25,126
We put special restrictions
on teen accounts on Instagram by default.
26
00:01:25,210 --> 00:01:27,712
Accounts for under-16 teens
are set to private,
27
00:01:27,712 --> 00:01:31,299
have the most restrictive content settings
and can't be messaged by adults
28
00:01:31,299 --> 00:01:34,219
that they don't follow
or people they aren't connected to.
29
00:01:34,302 --> 00:01:34,969
With so much of
30
00:01:34,969 --> 00:01:37,972
our lives spent on mobile devices
and social media,
31
00:01:38,056 --> 00:01:42,143
it's important to look into the effects
on teen mental health and well-being.
32
00:01:42,185 --> 00:01:44,354
I take this very seriously.
33
00:01:44,354 --> 00:01:46,648
Mental health is a complex issue
34
00:01:46,648 --> 00:01:50,068
and the existing body of scientific work
has not shown a
35
00:01:50,068 --> 00:01:53,154
causal link between using social media
and young people
36
00:01:53,154 --> 00:01:56,366
having worse mental health outcomes.
37
00:01:56,407 --> 00:01:58,451
A recent National Academies of Science
38
00:01:58,451 --> 00:02:03,581
report evaluated over 300 studies
and found that research, quote,
39
00:02:03,665 --> 00:02:06,376
did not support the conclusion
that social media causes
40
00:02:06,376 --> 00:02:10,213
changes in adolescent mental health
at the population level, unquote,
41
00:02:10,296 --> 00:02:13,299
and also suggested
that social media can provide significant
42
00:02:13,299 --> 00:02:16,511
positive benefits when young people use it
to express themselves,
43
00:02:16,594 --> 00:02:18,638
explore and connect with others.
44
00:02:18,638 --> 00:02:20,682
Still, we're going to continue
to monitor the research
45
00:02:20,682 --> 00:02:23,184
and use it to inform our roadmap.
46
00:02:23,268 --> 00:02:23,852
Keeping young
47
00:02:23,852 --> 00:02:27,522
people safe online has been a challenge
since the Internet began,
48
00:02:27,605 --> 00:02:32,777
and as criminals evolve their tactics,
we have to evolve our defenses too.
49
00:02:32,861 --> 00:02:33,611
We work closely
50
00:02:33,611 --> 00:02:37,448
with law enforcement to find bad actors
and help bring them to justice.
51
00:02:37,532 --> 00:02:40,952
But the difficult reality
is that no matter how much we invest
52
00:02:40,952 --> 00:02:44,289
or how effective our tools are,
there are always more.
53
00:02:44,289 --> 00:02:47,250
There's always more to learn
and more improvements to make.
54
00:02:47,250 --> 00:02:51,171
But we remain ready to work with members
of this committee, industry and parents
55
00:02:51,254 --> 00:02:54,007
to make the Internet safer for everyone.
56
00:02:54,007 --> 00:02:56,926
I'm proud of the work
that our teams do to improve online
57
00:02:56,926 --> 00:03:00,138
child safety on our services
and across the entire Internet.
58
00:03:00,221 --> 00:03:04,017
We have around 40,000 people overall
working on safety and security,
59
00:03:04,017 --> 00:03:07,979
and we've invested
more than $20 billion in this since 2016,
60
00:03:07,979 --> 00:03:11,316
including around
$5 billion in the last year alone.
61
00:03:11,399 --> 00:03:13,776
We have many teams
dedicated to child safety
62
00:03:13,776 --> 00:03:15,528
and teen well-being,
and we lead the industry
63
00:03:15,528 --> 00:03:18,323
in a lot of the areas
that we're discussing today.
64
00:03:18,406 --> 00:03:21,284
We built technology
to tackle the worst online risks
65
00:03:21,284 --> 00:03:24,787
and share it to help our whole industry
get better, like Project Lantern,
66
00:03:24,787 --> 00:03:28,458
which helps companies share data
about people who break child safety rules.
67
00:03:28,458 --> 00:03:31,544
And we're founding members of
Take It Down, a platform which helps
68
00:03:31,544 --> 00:03:35,256
young people prevent their nude images
from being spread online.
69
00:03:35,340 --> 00:03:37,592
We also go beyond legal requirements
70
00:03:37,592 --> 00:03:41,804
and use sophisticated technology
to proactively discover abusive material.
71
00:03:41,888 --> 00:03:44,140
And as a result, we find and report
72
00:03:44,140 --> 00:03:47,936
more inappropriate content
than anyone else in the industry.
73
00:03:47,977 --> 00:03:51,564
As the National Center for Missing
and Exploited Children put it this week,
74
00:03:51,606 --> 00:03:56,027
Meta goes, quote, above and beyond
to make sure that there are no portions
75
00:03:56,027 --> 00:03:59,948
of their network where this type of
activity occurs, end quote.
76
00:04:00,031 --> 00:04:03,034
I hope we can have a substantive
discussion today that drives improvements
77
00:04:03,034 --> 00:04:03,826
across the industry,
78
00:04:03,826 --> 00:04:08,539
including legislation that delivers
what parents say they want: a clear system
79
00:04:08,539 --> 00:04:13,336
for age verification and control over
what apps their kids are using.
80
00:04:13,419 --> 00:04:14,879
Three out of four parents
81
00:04:14,879 --> 00:04:18,424
want App store age verification
and four out of five
82
00:04:18,424 --> 00:04:23,179
want parental approval whenever
teens download apps.
83
00:04:23,263 --> 00:04:25,390
We support this.
84
00:04:25,390 --> 00:04:28,017
Parents should have the final say on
what apps are appropriate
85
00:04:28,017 --> 00:04:31,729
for their children and shouldn't
have to upload their ID every time.
86
00:04:31,813 --> 00:04:34,148
That's what app stores are for.
87
00:04:34,148 --> 00:04:37,694
We also support setting industry
standards on age appropriate content
88
00:04:37,694 --> 00:04:40,697
and limiting signals
for advertising to teens
89
00:04:40,738 --> 00:04:43,616
to age and location and not behavior.
90
00:04:43,616 --> 00:04:47,203
At the end of the day,
we want everyone who uses our services
91
00:04:47,203 --> 00:04:50,206
to have safe and positive experiences.
92
00:04:50,290 --> 00:04:53,418
Before we wrap up,
I want to recognize the families
93
00:04:53,418 --> 00:04:56,879
who are here today
who have lost a loved one or lived through
94
00:04:56,879 --> 00:05:02,260
some some terrible things
that no family should have to endure.
95
00:05:02,343 --> 00:05:02,593
These
96
00:05:02,593 --> 00:05:05,596
issues are important for every parent
and every platform.
97
00:05:05,680 --> 00:05:08,141
I'm committed
to continuing to work in these areas
98
00:05:08,141 --> 00:05:09,934
and I hope we can make progress today.
99
00:05:09,934 --> 00:05:10,435
Okay.
100
00:05:10,435 --> 00:05:13,438
Are there others
that support that bill on this?
101
00:05:13,563 --> 00:05:16,024
No. Okay.
102
00:05:16,024 --> 00:05:17,442
Last, Mr.
103
00:05:17,442 --> 00:05:23,031
Zuckerberg, in 2021, the Wall Street
Journal reported on internal Meta
104
00:05:23,031 --> 00:05:27,410
research documents asking
why do we care about tweens?
105
00:05:27,452 --> 00:05:28,953
These were internal documents.
106
00:05:28,953 --> 00:05:31,998
I'm quoting the document,
which answered its own question
107
00:05:31,998 --> 00:05:35,001
by citing Meta internal emails.
108
00:05:35,168 --> 00:05:39,255
They are a valuable but untapped audience.
109
00:05:39,339 --> 00:05:41,883
At a Commerce hearing,
I'm also on that committee.
110
00:05:41,883 --> 00:05:45,136
I asked Meta's head of Global Safety why
111
00:05:45,136 --> 00:05:48,514
children age 10
to 12 are so valuable to Meta.
112
00:05:48,723 --> 00:05:51,726
She responded,
We do not knowingly attempt to recruit
113
00:05:51,726 --> 00:05:54,979
people
who aren't old enough to use our apps.
114
00:05:55,063 --> 00:05:59,734
Well, when the 42 state attorneys
general, Democrat and Republican,
115
00:05:59,817 --> 00:06:03,446
brought their case,
they said this statement was inaccurate.
116
00:06:03,488 --> 00:06:04,781
A few examples.
117
00:06:04,781 --> 00:06:07,325
In 2021, she received an email, Ms.
118
00:06:07,325 --> 00:06:11,621
Davis from Instagram's research director,
saying that Instagram
119
00:06:11,621 --> 00:06:18,086
is investing in experiences
targeting youth, roughly ages 10 to 12.
120
00:06:18,127 --> 00:06:22,048
In a February 2021 instant message,
one of your employees wrote
121
00:06:22,048 --> 00:06:27,887
that Meta is working to recruit Gen Alpha
before they reach teenage years.
122
00:06:27,970 --> 00:06:32,892
A 2018 email that circulated inside
Meta says that you were briefed
123
00:06:32,892 --> 00:06:36,437
that children under 13 will be critical
for increasing
124
00:06:36,437 --> 00:06:43,069
the rate of acquisition
when users turn 13.
125
00:06:43,152 --> 00:06:44,946
Explain that with what I heard
126
00:06:44,946 --> 00:06:48,866
at that testimony at the Commerce
hearing that they weren't being targeted.
127
00:06:48,950 --> 00:06:53,621
And I just ask again,
as I asked the other witnesses, why your company
128
00:06:53,788 --> 00:06:59,419
does not support the STOP CSAM Act
or the SHIELD Act.
129
00:06:59,502 --> 00:07:00,128
Sure, Senator.
130
00:07:00,128 --> 00:07:02,505
I'm happy to talk to both of those.
131
00:07:02,505 --> 00:07:05,925
We had discussions
internally about whether we should build
132
00:07:05,925 --> 00:07:09,345
a kids' version of Instagram, like the ones
133
00:07:09,512 --> 00:07:13,015
that I remember YouTube
and other services have.
134
00:07:13,099 --> 00:07:13,599
We haven't
135
00:07:13,599 --> 00:07:16,602
actually moved forward with that
and we currently have no plans to do so.
136
00:07:16,602 --> 00:07:21,732
So I can't speak directly to the exact
emails that you cited,
137
00:07:21,732 --> 00:07:25,069
but it sounds to me
like they were deliberations
138
00:07:25,069 --> 00:07:28,072
around a project that people internally
thought was important
139
00:07:28,072 --> 00:07:30,867
and we didn't end up moving forward with.
140
00:07:30,950 --> 00:07:31,534
Okay.
141
00:07:31,534 --> 00:07:34,537
And the bills, what are you going to say
about those two bills?
142
00:07:34,662 --> 00:07:34,912
Sure.
143
00:07:34,912 --> 00:07:39,375
So overall, I mean, my position on
the bills is I agree with
144
00:07:39,417 --> 00:07:40,835
the goal of all of them.
145
00:07:40,835 --> 00:07:43,212
There are many things within them
that I agree with.
146
00:07:43,212 --> 00:07:45,798
There are specific things
that I would probably do differently.
147
00:07:45,798 --> 00:07:49,385
We also have our own legislative proposal
for what we think
148
00:07:49,635 --> 00:07:53,931
would be most effective in terms
of helping the Internet
149
00:07:54,015 --> 00:07:58,644
and the various companies
give parents control over the experience.
150
00:07:58,686 --> 00:08:00,855
So I'm happy to go into the detail
on any one of them.
151
00:08:00,855 --> 00:08:02,064
But ultimately, I mean.
152
00:08:02,064 --> 00:08:05,193
I just, again, I think these parents
will tell you that
153
00:08:05,193 --> 00:08:08,196
this stuff hasn't
worked to just give parents control.
154
00:08:08,196 --> 00:08:09,405
They don't know what to do.
155
00:08:09,405 --> 00:08:10,740
It's very, very hard.
156
00:08:10,740 --> 00:08:14,494
And that's why we are coming up
with other solutions that we think
157
00:08:14,577 --> 00:08:17,079
are much more helpful to law enforcement,
158
00:08:17,079 --> 00:08:20,750
but also this idea of finally
getting something going on liability,
159
00:08:20,917 --> 00:08:24,253
because I just believe
with all the resources you have
160
00:08:24,337 --> 00:08:27,882
that you actually would be able to do
more than you're doing, or these parents
161
00:08:27,882 --> 00:08:32,094
wouldn't be sitting behind you right now
in this Senate hearing room?
162
00:08:32,136 --> 00:08:33,304
Thank you, Senator.
163
00:08:33,304 --> 00:08:34,388
Can I speak to that?
164
00:08:34,388 --> 00:08:39,060
Or do you? I'm
going to come back later. Please go ahead.
165
00:08:39,143 --> 00:08:41,979
I don't think
that parents should have to upload an I.D.
166
00:08:41,979 --> 00:08:42,438
or prove
167
00:08:42,438 --> 00:08:46,817
that they're the parent of a child in
every single app that their children use.
168
00:08:46,901 --> 00:08:49,820
I think the right place to do this
and a place where it'd be actually
169
00:08:49,820 --> 00:08:53,366
very easy for it to work
is within the app stores themselves,
170
00:08:53,407 --> 00:08:57,245
where my understanding
is Apple and Google already, or at least
171
00:08:57,245 --> 00:09:01,749
Apple already requires parental consent
when a child does a payment with an app.
172
00:09:01,832 --> 00:09:04,460
So it should be pretty trivial
to pass a law
173
00:09:04,460 --> 00:09:08,673
that requires them to make it
so that parents have control.
174
00:09:08,756 --> 00:09:13,511
Any time a child downloads
an app and offers consent to that,
175
00:09:13,594 --> 00:09:16,847
and the research that we've done
shows that
176
00:09:16,931 --> 00:09:19,725
the vast majority of parents want that.
177
00:09:19,725 --> 00:09:22,979
And I think that that's the type
of legislation, in addition
178
00:09:22,979 --> 00:09:24,689
to some of the other ideas
that you all have
179
00:09:24,689 --> 00:09:26,315
that would make this
a lot easier for parents.
180
00:09:26,315 --> 00:09:29,777
Just to be clear, I remember one
mom telling me with all these things
181
00:09:29,777 --> 00:09:32,029
she could maybe do that
she can't figure out.
182
00:09:32,029 --> 00:09:35,658
It's like a faucet overflowing in a sink
and she's out there with a mop
183
00:09:35,825 --> 00:09:37,243
while her kids are getting addicted
184
00:09:37,243 --> 00:09:40,246
to more and more different apps
and being exposed to material.
185
00:09:40,246 --> 00:09:43,958
We've got to make this simpler for parents
so they can protect their kids.
186
00:09:43,958 --> 00:09:46,752
And I just don't think
this is going to be the way to do it.
187
00:09:46,752 --> 00:09:49,714
I think the answer is what Senator
Graham has been talking about,
188
00:09:49,714 --> 00:09:52,717
which is opening up
the halls of the courtroom.
189
00:09:52,758 --> 00:09:57,096
So that puts it on you guys to protect
these parents and protect these kids
190
00:09:57,263 --> 00:10:01,726
and then also to pass some of these laws
that makes it easier for law enforcement.
191
00:10:01,809 --> 00:10:05,021
We pioneered quarterly
reporting on our community
192
00:10:05,021 --> 00:10:08,649
standards enforcement across all these
different categories of harmful content.
193
00:10:08,899 --> 00:10:11,902
We focus on prevalence,
which you mentioned,
194
00:10:12,153 --> 00:10:17,158
because what we're focused on is what
percent of the content that we take down.
195
00:10:17,199 --> 00:10:17,742
So, Mr.
196
00:10:17,742 --> 00:10:20,745
Zuckerberg, I'm going to wrap
and you're very talented.
197
00:10:20,745 --> 00:10:22,830
I have very little time left.
198
00:10:22,830 --> 00:10:27,043
I'm trying to get an answer to a question,
not as a percentage of the total,
199
00:10:27,251 --> 00:10:30,630
because remember, it's a huge number,
so the percentage is small.
200
00:10:30,713 --> 00:10:34,091
But do you report
the actual amount of content
201
00:10:34,091 --> 00:10:37,928
and the amount of views self-harm
content received?
202
00:10:38,012 --> 00:10:39,722
No, I believe we focus on prevalence.
203
00:10:39,722 --> 00:10:41,390
Correct? You don't.
204
00:10:41,390 --> 00:10:42,975
What's... what's odd?
205
00:10:42,975 --> 00:10:47,396
What I'm trying to understand is
206
00:10:47,480 --> 00:10:51,651
why it is that Instagram is
207
00:10:51,734 --> 00:10:53,444
only restricting access
208
00:10:53,444 --> 00:10:58,157
just to sexually explicit content,
209
00:10:58,240 --> 00:11:02,411
but only for teens ages 13 to 15.
210
00:11:02,495 --> 00:11:07,041
Why not restrict it for 16 and 17
year olds as well?
211
00:11:07,124 --> 00:11:07,541
Senator, my
212
00:11:07,541 --> 00:11:10,544
understanding is that we don't allow
sexually explicit content
213
00:11:10,544 --> 00:11:13,547
on the service for people of any age.
214
00:11:13,631 --> 00:11:15,299
The, the...
215
00:11:15,299 --> 00:11:17,510
How is that going?
216
00:11:17,510 --> 00:11:20,304
You know,
217
00:11:20,304 --> 00:11:23,307
our prevalence metrics suggest that.
218
00:11:23,349 --> 00:11:26,894
I think it's 99%
or so of the content that we remove.
219
00:11:26,894 --> 00:11:29,647
We're able to identify
automatically using a system.
220
00:11:29,647 --> 00:11:31,399
So I think that our efforts in this,
221
00:11:31,399 --> 00:11:34,068
while they're not perfect,
I think are industry leading.
222
00:11:34,068 --> 00:11:37,655
The other thing that you asked about
was self-harm content,
223
00:11:37,655 --> 00:11:42,827
which is what we recently restricted
and we made that shift to the the
224
00:11:42,910 --> 00:11:46,372
I think the state of the science
is shifting a bit.
225
00:11:46,455 --> 00:11:49,166
Previously we believed
226
00:11:49,208 --> 00:11:52,753
that when
people were thinking about self-harm,
227
00:11:52,753 --> 00:11:55,756
it was important for them to be able
to express that and get support.
228
00:11:55,840 --> 00:12:00,052
And now more of
the thinking in the field is that it's
229
00:12:00,052 --> 00:12:01,929
just better
to not show that content at all,
230
00:12:01,929 --> 00:12:05,474
which is why we recently moved
to restrict that from showing up
231
00:12:05,558 --> 00:12:08,102
for those teens at all.
232
00:12:08,102 --> 00:12:10,938
Is there a way for
233
00:12:10,938 --> 00:12:14,900
parents to make a request
on what their kid can see or not
234
00:12:14,900 --> 00:12:18,446
see on your sites?
235
00:12:18,529 --> 00:12:20,448
There are a lot of parental controls.
236
00:12:20,448 --> 00:12:21,991
I'm not sure if there are.
237
00:12:21,991 --> 00:12:24,994
I don't think that
we currently have a control around topics,
238
00:12:25,161 --> 00:12:29,123
but we do allow parents to control
239
00:12:29,206 --> 00:12:31,459
the time
that the children are on the site.
240
00:12:31,459 --> 00:12:34,628
And also a lot of it
is based on kind of monitoring
241
00:12:34,628 --> 00:12:38,966
and understanding
what the teens experience is.
242
00:12:39,049 --> 00:12:40,050
Instagram also
243
00:12:40,050 --> 00:12:43,262
displayed the following warning screen
244
00:12:43,345 --> 00:12:45,973
to individuals
who were searching for child
245
00:12:45,973 --> 00:12:50,728
abuse material.
246
00:12:50,811 --> 00:12:52,396
These results
247
00:12:52,396 --> 00:12:55,775
may contain images of child sexual abuse.
248
00:12:55,775 --> 00:12:59,028
And then you gave users two choices
249
00:12:59,111 --> 00:13:02,031
get resources
250
00:13:02,031 --> 00:13:06,577
or see results anyway.
251
00:13:06,660 --> 00:13:11,624
Mr. Zuckerberg,
What the hell were you thinking?
252
00:13:11,707 --> 00:13:13,334
All right, Senator,
253
00:13:13,334 --> 00:13:15,586
the basic science behind
254
00:13:15,586 --> 00:13:20,591
that is that when people are searching
for something that is problematic,
255
00:13:20,674 --> 00:13:25,721
it's often helpful to rather than just
blocking it, to help direct them towards
256
00:13:25,721 --> 00:13:29,558
something that could be helpful
for getting them to get help.
257
00:13:29,558 --> 00:13:33,229
What also can get resources
In what sane universe
258
00:13:33,395 --> 00:13:36,607
is there a link for "see results anyway"?
259
00:13:36,690 --> 00:13:38,025
Or because we might be wrong?
260
00:13:38,025 --> 00:13:43,697
We try to trigger this
warning, or we try to,
261
00:13:43,781 --> 00:13:46,242
when we think that there's any chance
that...
262
00:13:46,242 --> 00:13:47,326
Okay, you might be wrong.
263
00:13:47,326 --> 00:13:51,205
Let me ask you, how many times
was this warning screen displayed?
264
00:13:51,247 --> 00:13:51,789
I don't know.
265
00:13:51,789 --> 00:13:54,708
But you don't know. Why don't you know?
266
00:13:54,708 --> 00:13:56,377
I, I don't know the answer to that.
267
00:13:56,377 --> 00:13:58,629
I'll stop my head.
But you know what, Mr. Zuckerberg?
268
00:13:58,629 --> 00:13:59,922
It's interesting you say you don't know
269
00:13:59,922 --> 00:14:04,510
off the top of your head,
because I asked it in June of 2023
270
00:14:04,510 --> 00:14:08,639
in an oversight letter,
and your company refused to answer.
271
00:14:08,722 --> 00:14:11,767
Will you commit right now
to within five days
272
00:14:11,767 --> 00:14:14,186
answering this question
for this committee?
273
00:14:14,186 --> 00:14:16,272
We'll follow up on that. Is that a yes?
274
00:14:16,272 --> 00:14:17,356
Not a "we'll follow up."
275
00:14:17,356 --> 00:14:20,442
I know how lawyers write statements
saying we're not going to answer.
276
00:14:20,526 --> 00:14:25,072
Will you tell us how many times this
warning screen was displayed, yes or no?
277
00:14:25,155 --> 00:14:26,866
Senator, I'll personally look into it.
278
00:14:26,866 --> 00:14:27,825
I'm not sure if... Okay.
279
00:14:27,825 --> 00:14:30,578
So you're refusing to answer that.
Let me ask you this.
280
00:14:30,578 --> 00:14:35,165
How many times did an Instagram user
who got this warning
281
00:14:35,249 --> 00:14:37,626
that you're seeing images of child
sexual abuse?
282
00:14:37,626 --> 00:14:40,504
How many times did that
user click on "see results anyway"?
283
00:14:40,504 --> 00:14:43,007
I want to see that,
284
00:14:43,090 --> 00:14:43,465
Senator.
285
00:14:43,465 --> 00:14:44,633
I'm not sure if we stored that,
286
00:14:44,633 --> 00:14:46,927
but I'll personally look into this
and we'll follow up.
287
00:14:46,927 --> 00:14:49,805
And what follow-up does Instagram
288
00:14:49,805 --> 00:14:53,100
do when you have
289
00:14:53,183 --> 00:14:57,396
a potential pedophile
clicking on "I'd like to see child porn"?
290
00:14:57,521 --> 00:15:00,816
What did you do next
when that happens, Senator?
291
00:15:00,816 --> 00:15:01,483
I think that an
292
00:15:01,483 --> 00:15:05,988
important piece of context here is that
any content that we think is child...
293
00:15:06,030 --> 00:15:08,157
Mr. Zuckerberg, that's called a question.
294
00:15:08,157 --> 00:15:13,037
What did you do next when someone clicked?
295
00:15:13,120 --> 00:15:15,748
You may be getting child
296
00:15:15,748 --> 00:15:19,251
sexual abuse images
and they click "see results anyway."
297
00:15:19,251 --> 00:15:21,462
What was your next step?
298
00:15:21,462 --> 00:15:23,172
You said you might be wrong.
299
00:15:23,172 --> 00:15:26,926
Did anyone examine whether it was
in fact child sexual abuse material?
300
00:15:26,926 --> 00:15:28,636
Did anyone report that user?
301
00:15:28,636 --> 00:15:30,971
Did anyone go
and try to protect that child?
302
00:15:30,971 --> 00:15:33,766
What did you do next?
303
00:15:33,766 --> 00:15:37,019
Senator We take down
anything that we think is sexual abuse
304
00:15:37,019 --> 00:15:38,812
material on the service. And we did.
305
00:15:38,812 --> 00:15:43,734
Did anyone verify whether it was
in fact child sexual abuse material?
306
00:15:43,776 --> 00:15:46,904
Senator, I don't know if
every single search is followed up on.
307
00:15:46,904 --> 00:15:49,239
But did you report
the people who wanted it?
308
00:15:49,239 --> 00:15:50,324
Do you want me to answer your question?
309
00:15:50,324 --> 00:15:52,076
I want you to answer the question
I'm asking.
310
00:15:52,076 --> 00:15:53,744
Did you report
311
00:15:53,744 --> 00:15:56,956
the people who click "see results anyway"?
312
00:15:57,039 --> 00:16:00,709
That's probably one of the factors
that we use in reporting in general.
313
00:16:00,709 --> 00:16:04,672
And we've reported more people and done
more reports like this to NCMEC,
314
00:16:04,672 --> 00:16:07,257
the National Center
for Missing and Exploited Children,
315
00:16:07,257 --> 00:16:08,717
than any other company in the industry.
316
00:16:08,717 --> 00:16:12,221
We proactively go out of our way
across our services to do this
317
00:16:12,304 --> 00:16:15,599
and have made
I think it's more than 26 million reports,
318
00:16:15,599 --> 00:16:18,227
which is more than the whole rest
of the industry combined.
319
00:16:18,227 --> 00:16:19,687
Mr. Zuckerberg, let me start with you.
320
00:16:19,687 --> 00:16:22,189
Did I hear you
say in your opening statement
321
00:16:22,189 --> 00:16:27,277
that there is no link between mental
health and social media use?
322
00:16:27,361 --> 00:16:30,072
Senator, What I said is I think
it's important to look at the science.
323
00:16:30,072 --> 00:16:33,659
I know that people widely talk about this
as if that is something
324
00:16:33,659 --> 00:16:35,160
that's already been proven.
325
00:16:35,160 --> 00:16:38,664
And I think that the bulk of the
scientific evidence does not support that.
326
00:16:38,706 --> 00:16:42,876
Well, really, let me just remind you of
some of the science from your own company.
327
00:16:42,876 --> 00:16:48,173
Instagram studied
the effect of your platform on teenagers.
328
00:16:48,215 --> 00:16:49,967
Let me just read you some quotes
from The Wall Street
329
00:16:49,967 --> 00:16:51,301
Journal's report on this company.
330
00:16:51,301 --> 00:16:55,264
Researchers found that Instagram
is harmful for a sizable percentage
331
00:16:55,264 --> 00:16:58,267
of teenagers, most notably teenage girls.
332
00:16:58,434 --> 00:16:59,768
Here's a quote from your own study.
333
00:16:59,768 --> 00:17:05,607
Quote, We make body image issues
worse for one in three teen girls.
334
00:17:05,691 --> 00:17:07,568
Here's
another quote Teens blame Instagram.
335
00:17:07,568 --> 00:17:11,905
This is your study for increases
in the rate of anxiety and depression.
336
00:17:11,947 --> 00:17:15,868
This reaction was unprompted
and consistent across all groups.
337
00:17:15,951 --> 00:17:17,786
That's your study.
338
00:17:17,870 --> 00:17:18,287
Senator.
339
00:17:18,287 --> 00:17:23,375
We try to understand the feedback
and how people feel about the services
340
00:17:23,375 --> 00:17:23,917
so we can improve.
341
00:17:23,917 --> 00:17:24,209
Wait a minute.
342
00:17:24,209 --> 00:17:25,377
Your own, your own
343
00:17:25,377 --> 00:17:30,716
studies say that you make life worse
for one in three teenage girls.
344
00:17:30,758 --> 00:17:32,926
You increase anxiety and depression.
345
00:17:32,926 --> 00:17:33,719
That's what it says.
346
00:17:33,719 --> 00:17:36,680
And you're here testifying to us in public
that there's no link.
347
00:17:36,680 --> 00:17:37,973
You've been doing this for years.
348
00:17:37,973 --> 00:17:40,976
For years you've been coming in public
and testifying under oath
349
00:17:41,018 --> 00:17:42,770
that there's absolutely no link.
350
00:17:42,770 --> 00:17:44,146
Your product is wonderful.
351
00:17:44,146 --> 00:17:48,275
The science is nascent, full speed ahead,
while internally you know full
352
00:17:48,275 --> 00:17:52,029
well your product is a disaster
for teenagers
353
00:17:52,112 --> 00:17:55,324
and you keep right on doing what
you're doing. Right?
354
00:17:55,365 --> 00:17:56,617
True. That's not true.
355
00:17:56,617 --> 00:17:58,452
Let me, let me
show you some other facts
356
00:17:58,452 --> 00:18:01,455
that I know you're familiar with. Wait a minute.
Wait a minute.
357
00:18:01,455 --> 00:18:03,248
That's not a question.
That's not a question.
358
00:18:03,248 --> 00:18:05,000
Those are facts, Mr. Zuckerberg.
359
00:18:05,000 --> 00:18:06,710
That's not a question. That's the answer.
360
00:18:06,710 --> 00:18:08,295
Let me tell you some more facts.
361
00:18:08,295 --> 00:18:11,715
Here are some and here's some information
from a whistleblower who came before
362
00:18:11,715 --> 00:18:13,634
the Senate testified under oath in public.
363
00:18:13,634 --> 00:18:15,886
He worked for you as a senior executive.
364
00:18:15,886 --> 00:18:20,265
Here's what he showed
he found when he studied your products.
365
00:18:20,349 --> 00:18:24,436
So, for example, this is girls
between the ages of 13 and 15 years old.
366
00:18:24,686 --> 00:18:28,690
37% of them reported
that they had been exposed
367
00:18:28,816 --> 00:18:32,569
to unwanted nudity on the platform
in the last seven days.
368
00:18:32,653 --> 00:18:37,491
24% said that they had experienced
unwanted sexual advances.
369
00:18:37,491 --> 00:18:40,494
They've been propositioned
in the last seven days.
370
00:18:40,619 --> 00:18:43,747
17% said they had encountered
self-harm content
371
00:18:43,747 --> 00:18:47,835
pushed at them in the last seven days.
372
00:18:47,918 --> 00:18:49,461
Now, I
know you're familiar with these stats
373
00:18:49,461 --> 00:18:52,422
because he sent you an email
where he lined it all out.
374
00:18:52,422 --> 00:18:56,218
I mean, we've got a copy of it right here.
375
00:18:56,301 --> 00:19:00,055
My question is, who did you fire for this?
376
00:19:00,139 --> 00:19:02,349
Who got fired because of that?
377
00:19:02,349 --> 00:19:05,769
Senator, we study all this because it's
important we want to improve our service.
378
00:19:05,769 --> 00:19:09,439
Well, you just told me a moment ago
you studied it and there was no linkage.
379
00:19:09,523 --> 00:19:10,691
Who did you fire?
380
00:19:10,691 --> 00:19:12,109
I said you mischaracterized...
381
00:19:12,109 --> 00:19:15,946
37% of teenage girls between 13 and 15
382
00:19:16,071 --> 00:19:19,658
were exposed to unwanted nudity
in a week on Instagram.
383
00:19:19,700 --> 00:19:21,493
You knew about it. Who did you fire?
384
00:19:21,493 --> 00:19:23,287
Senator, this is why we're building...
385
00:19:23,287 --> 00:19:25,414
Who did you fire?
386
00:19:25,414 --> 00:19:26,999
Senator, that's...
I don't think that that's...
387
00:19:26,999 --> 00:19:28,208
Who did you fire?
388
00:19:28,208 --> 00:19:30,085
I'm not going to answer that.
389
00:19:30,085 --> 00:19:32,087
That's because you didn't fire anybody.
390
00:19:32,087 --> 00:19:32,796
Right.
391
00:19:32,796 --> 00:19:34,464
You didn't take any significant action.
392
00:19:34,464 --> 00:19:37,092
It's appropriate to talk about it.
393
00:19:37,092 --> 00:19:39,011
It's not appropriate to talk about personnel decisions.
394
00:19:39,011 --> 00:19:41,305
Do you know who's sitting behind you?
395
00:19:41,346 --> 00:19:44,349
You've got families from across the nation
396
00:19:44,600 --> 00:19:48,145
whose children are either
severely harmed or gone.
397
00:19:48,228 --> 00:19:50,022
And you don't think it's appropriate
to talk
398
00:19:50,022 --> 00:19:53,442
about steps that you took, the fact
that you didn't fire a single person?
399
00:19:53,442 --> 00:19:54,193
Let me ask you this.
400
00:19:54,193 --> 00:19:57,070
Let me ask you this.
Have you compensated any of the victims?
401
00:19:57,070 --> 00:19:57,613
Sorry?
402
00:19:57,613 --> 00:20:00,908
Have you compensated any of the victims
these girls?
403
00:20:00,908 --> 00:20:02,075
Have you compensated them?
404
00:20:02,075 --> 00:20:04,244
I don't believe so.
405
00:20:04,244 --> 00:20:06,788
Why not?
406
00:20:06,788 --> 00:20:09,499
Don't
you think they deserve some compensation
407
00:20:09,499 --> 00:20:11,168
for what your platform has done?
408
00:20:11,168 --> 00:20:12,753
Help with counseling services.
409
00:20:12,753 --> 00:20:16,632
Help with dealing with the issues
that your service has caused?
410
00:20:16,840 --> 00:20:20,177
Our job is to make sure
that we build tools to help keep you.
411
00:20:20,260 --> 00:20:22,304
Are you going to compensate them?
412
00:20:22,304 --> 00:20:25,390
Senator, our job
and what we take seriously is making sure
413
00:20:25,390 --> 00:20:29,102
that we build industry leading tools
to find harmful content, to make money,
414
00:20:29,102 --> 00:20:32,940
take it off the services, to make money,
and to build tools that empower parents.
415
00:20:32,981 --> 00:20:34,650
So you didn't take any action.
416
00:20:34,650 --> 00:20:36,735
You didn't take any action,
you didn't fire anybody.
417
00:20:36,735 --> 00:20:39,196
You haven't compensated a single victim.
Let me ask you this.
418
00:20:39,196 --> 00:20:39,780
Let me ask you this.
419
00:20:39,780 --> 00:20:41,531
There's families of victims here today.
420
00:20:41,531 --> 00:20:43,867
Have you apologized to the victims?
421
00:20:43,951 --> 00:20:46,578
I would you like to do so now?
422
00:20:46,578 --> 00:20:49,039
Well, they're here.
You're on national television.
423
00:20:49,039 --> 00:20:51,083
Would you like now to apologize
424
00:20:51,083 --> 00:20:52,876
to the victims
who have been harmed by your product?
425
00:20:52,876 --> 00:20:54,836
Show them the pictures.
426
00:20:54,836 --> 00:20:59,925
Would you like to apologize for what
you've done to these good people? I,
427
00:21:00,008 --> 00:21:05,222
I, I'm supposed
428
00:21:05,305 --> 00:21:06,306
to go through
429
00:21:06,306 --> 00:21:09,101
the things
that your families have suffered.
430
00:21:09,101 --> 00:21:11,103
And this is why we invested so much.
431
00:21:11,103 --> 00:21:12,646
And our hope
432
00:21:12,646 --> 00:21:16,984
is, in building these efforts, to make sure that
433
00:21:17,067 --> 00:21:18,193
no one has to go through the
434
00:21:18,193 --> 00:21:23,573
types of things
that your families have suffered.
435
00:21:23,657 --> 00:21:24,908
Will you commit
436
00:21:24,908 --> 00:21:28,370
to reporting measurable child safety data
437
00:21:28,578 --> 00:21:33,792
on your quarterly earnings
reports and calls?
438
00:21:33,875 --> 00:21:35,210
Senator, it's a good question.
439
00:21:35,210 --> 00:21:39,339
We actually already have
a quarterly report that we issue
440
00:21:39,339 --> 00:21:43,302
and do a call to answer questions for how
we're enforcing our community standards.
441
00:21:43,302 --> 00:21:46,722
That includes
not just the child safety issues.
442
00:21:46,805 --> 00:21:49,141
So is that a yes?
443
00:21:49,141 --> 00:21:51,226
We have a separate call
that we do this on.
444
00:21:51,226 --> 00:21:53,061
But we've I think that,
445
00:21:53,061 --> 00:21:56,356
you know, you have to report
your earnings to the SEC.
446
00:21:56,440 --> 00:22:00,485
Will you report to them this kind of data
And by numbers, by the way,
447
00:22:00,485 --> 00:22:01,987
because Senator Cruz
448
00:22:01,987 --> 00:22:05,532
and others have said percentages
don't really tell the full story.
449
00:22:05,615 --> 00:22:09,953
Will you report to the SEC
the number of teens?
450
00:22:09,995 --> 00:22:12,247
And sometimes you don't even know
whether they're teens or
451
00:22:12,247 --> 00:22:14,207
not, because they just claim to be adults.
452
00:22:14,207 --> 00:22:18,879
Will you report the number of underage
children on your platforms
453
00:22:18,879 --> 00:22:25,719
who experience unwanted
contact and other kinds of
454
00:22:25,802 --> 00:22:27,554
messaging
455
00:22:27,554 --> 00:22:29,639
that harm them?
456
00:22:29,639 --> 00:22:33,894
Will you commit to
457
00:22:33,935 --> 00:22:38,523
citing those numbers to the SEC
when you make your quarterly report?
458
00:22:38,607 --> 00:22:41,109
Well, Senator,
I'm not sure it would make as much sense
459
00:22:41,109 --> 00:22:42,486
to include it in the SEC filing,
460
00:22:42,486 --> 00:22:45,489
but we file it publicly
so that way everyone can see this.
461
00:22:45,614 --> 00:22:49,242
And I'd be happy to follow up and talk
about what specific metrics.
462
00:22:49,242 --> 00:22:51,495
I think on the specific things,
some of the ones
463
00:22:51,495 --> 00:22:54,664
that you just mentioned
around underage people in our services,
464
00:22:54,748 --> 00:22:57,250
we don't allow
people under the age of 13 on our service.
465
00:22:57,250 --> 00:23:00,462
So if we find anyone who's under the age
of 13, we remove them from our service.
466
00:23:00,462 --> 00:23:03,215
Now, I'm not saying that people don't lie
and that there aren't, yes, people
467
00:23:03,215 --> 00:23:04,841
under the age of 13
who are using it.
468
00:23:04,841 --> 00:23:06,009
But I'm not going to build tools...
469
00:23:06,009 --> 00:23:07,761
We're not going to count
how many people there are,
470
00:23:07,761 --> 00:23:11,390
because fundamentally, if we identify
that someone is underage, we remove them.
471
00:23:11,515 --> 00:23:13,934
I think that's really important
that we get actual numbers
472
00:23:13,934 --> 00:23:15,352
because these are real human beings.
473
00:23:15,352 --> 00:23:17,562
That's
why all these parents and others are here,
474
00:23:17,562 --> 00:23:21,233
because each time that a person,
a young person is exposed
475
00:23:21,316 --> 00:23:25,404
to this kind of unwanted material,
they get hooked.
476
00:23:25,445 --> 00:23:27,781
It is a danger to that individual.
477
00:23:27,781 --> 00:23:31,827
So I'm hoping that you are saying
that you do report
478
00:23:31,827 --> 00:23:36,164
this kind of information to the
if not the FCC, that it is made public.
479
00:23:36,164 --> 00:23:37,582
I think I'm hearing that.
480
00:23:37,582 --> 00:23:38,458
Yes, you do.
481
00:23:38,458 --> 00:23:42,045
So, yes, Senator,
I think we report more publicly
482
00:23:42,254 --> 00:23:45,424
on our enforcement
than any other company in the industry.
483
00:23:45,590 --> 00:23:49,094
And we're very supportive of transparency.
484
00:23:49,177 --> 00:23:50,512
I'm running out of time, Mr.
485
00:23:50,512 --> 00:23:53,890
Zuckerberg,
But so I will follow up with what exactly
486
00:23:53,890 --> 00:23:56,893
it is that you do report.
487
00:23:56,893 --> 00:23:58,770
Again, Meta automatically
488
00:23:58,770 --> 00:24:01,773
places young people's accounts
and you testified to this
489
00:24:01,815 --> 00:24:05,944
on the most restrictive privacy
and content sensitivity settings.
490
00:24:06,027 --> 00:24:09,406
And yet teens are able to opt
out of these safeguards.
491
00:24:09,448 --> 00:24:10,657
Isn't that right?
492
00:24:10,657 --> 00:24:15,871
Yeah, It's not mandatory
that they remain on these settings.
493
00:24:15,954 --> 00:24:17,581
They can opt out.
494
00:24:17,581 --> 00:24:17,956
Senator?
495
00:24:17,956 --> 00:24:22,043
Yes, we default
teens into private accounts,
496
00:24:22,043 --> 00:24:24,045
so they have a private
restricted experience.
497
00:24:24,045 --> 00:24:28,049
But some teens want to be creators
and want to have content
498
00:24:28,091 --> 00:24:29,259
that they share more broadly.
499
00:24:29,259 --> 00:24:33,472
And I don't think that that's something
that should just blanketly be banned.
500
00:24:33,555 --> 00:24:36,516
You have
501
00:24:36,600 --> 00:24:41,354
you have convinced over 2 billion people
502
00:24:41,438 --> 00:24:42,939
to give up all of
503
00:24:42,939 --> 00:24:47,027
their personal information,
every bit of it,
504
00:24:47,110 --> 00:24:50,947
in exchange for getting to see
505
00:24:51,031 --> 00:24:55,410
what their high school friends
had for dinner Saturday night
506
00:24:55,494 --> 00:24:59,331
That's pretty much your business
model, isn't it?
507
00:24:59,414 --> 00:25:01,082
It's not how I would characterize it.
508
00:25:01,082 --> 00:25:04,211
I mean, we give people the ability
to connect with the people they care about
509
00:25:04,211 --> 00:25:08,882
and and to engage
with the topics that they care about.
510
00:25:09,007 --> 00:25:13,220
And and you take this information,
511
00:25:13,303 --> 00:25:16,223
this abundance of personal information,
512
00:25:16,223 --> 00:25:19,226
and then you develop algorithms
513
00:25:19,476 --> 00:25:23,939
to punch people's hot buttons,
514
00:25:24,022 --> 00:25:27,901
and steer to them information
515
00:25:27,984 --> 00:25:32,072
that punches their hot buttons again
and again and again
516
00:25:32,280 --> 00:25:36,618
to keep them coming back
and to keep them staying longer.
517
00:25:36,701 --> 00:25:43,875
And as a result, your users
see only one side of an issue.
518
00:25:43,959 --> 00:25:46,711
And so to some extent, your
519
00:25:46,711 --> 00:25:51,383
platform has become a killing field
for the truth, hasn't it?
520
00:25:51,466 --> 00:25:52,842
I mean, Senator, I disagree with that
521
00:25:52,842 --> 00:25:55,095
characterization.
522
00:25:55,095 --> 00:25:57,847
You know,
we build ranking and recommendations
523
00:25:57,847 --> 00:26:01,017
because people have a lot of friends
and a lot of interests,
524
00:26:01,017 --> 00:26:04,271
and they want to make sure that they see
the content that's relevant to them.
525
00:26:04,354 --> 00:26:08,775
We're trying to make a product that's
useful to people and make our services
526
00:26:08,858 --> 00:26:10,986
as helpful as possible
for people to connect with the people
527
00:26:10,986 --> 00:26:12,404
they care about and the interests they care about.
528
00:26:12,404 --> 00:26:14,614
But you don't show them both sides.
529
00:26:14,614 --> 00:26:17,075
You don't give them balanced information.
530
00:26:17,075 --> 00:26:20,662
You just keep punching their hot buttons,
punching their hot buttons.
531
00:26:20,745 --> 00:26:23,623
You don't show them balanced information
532
00:26:23,623 --> 00:26:26,626
so people can discern
the truth for themselves
533
00:26:26,668 --> 00:26:33,466
and you rev them up so much that so
often your platform and others
534
00:26:33,550 --> 00:26:35,510
become just cesspools of
535
00:26:35,510 --> 00:26:39,431
snark where nobody learns anything.
536
00:26:39,514 --> 00:26:40,515
Don't they?
537
00:26:40,515 --> 00:26:42,058
Well, Senator. I disagree with that.
538
00:26:42,058 --> 00:26:45,520
I think people can engage
in the things that they're interested in
539
00:26:45,604 --> 00:26:47,606
and learn quite a bit about those.
540
00:26:47,606 --> 00:26:52,986
We have done a handful of different
experiments and things in the past around
541
00:26:53,069 --> 00:26:58,575
news and trying to show content
on a diverse set of perspectives.
542
00:26:58,658 --> 00:27:01,828
I think that there's more
that needs to be explored there, but
543
00:27:01,911 --> 00:27:04,456
I don't think that we can solve
that by ourselves. We do that.
544
00:27:04,456 --> 00:27:07,459
I think. I'm sorry to cut you off, Mr.
545
00:27:07,626 --> 00:27:10,420
President,
but I'm going to run out of time.
546
00:27:10,420 --> 00:27:14,090
Do you think your users really understand
547
00:27:14,174 --> 00:27:17,260
what they're giving to you,
all their personal information
548
00:27:17,469 --> 00:27:21,473
and how you process it
and how you monitor it?
549
00:27:21,514 --> 00:27:24,684
Do you think people really understand.
550
00:27:24,768 --> 00:27:25,060
Senator?
551
00:27:25,060 --> 00:27:29,064
I think people understand the basic terms.
552
00:27:29,272 --> 00:27:32,233
I mean, I think that there's
things that a lot of...
553
00:27:32,233 --> 00:27:33,443
Let me put it another way.
554
00:27:33,443 --> 00:27:36,404
We spent a couple of years
as we talked about this.
555
00:27:36,404 --> 00:27:40,575
Does your user agreement still suck?
556
00:27:40,659 --> 00:27:42,202
I'm not sure how to answer that.
557
00:27:42,202 --> 00:27:43,953
Senator. We can deal with that.
558
00:27:43,953 --> 00:27:48,166
Can you still hide a dead body
in all that legalese
559
00:27:48,249 --> 00:27:50,919
where nobody can find it?
560
00:27:50,919 --> 00:27:51,878
Senator, I'm not
561
00:27:51,878 --> 00:27:52,879
quite sure what you're referring to,
562
00:27:52,879 --> 00:27:56,383
but I think people get the basic deal
of using these services.
563
00:27:56,383 --> 00:27:57,676
It's a free service.
564
00:27:57,676 --> 00:28:00,053
You're using it to connect
to the people you care about.
565
00:28:00,053 --> 00:28:01,763
If you share something with people,
566
00:28:01,763 --> 00:28:03,682
other people will be able
to see your information.
567
00:28:03,682 --> 00:28:04,933
It's inherently,
568
00:28:04,933 --> 00:28:07,936
you know, if you're putting
something out there to be shared publicly
569
00:28:08,103 --> 00:28:09,729
or with a private set of people, it's,
570
00:28:09,729 --> 00:28:11,272
you know,
you're inherently putting it out there.
571
00:28:11,272 --> 00:28:13,358
So I think people get that basic.
572
00:28:13,358 --> 00:28:14,526
But part of how... but let's
573
00:28:14,526 --> 00:28:17,862
just talk about...
You're in the foothills of creepy. You
574
00:28:17,862 --> 00:28:22,075
track, you track, you track people
575
00:28:22,117 --> 00:28:24,703
who aren't even Facebook users.
576
00:28:24,703 --> 00:28:29,374
You track your own people,
your own users who are your product,
577
00:28:29,457 --> 00:28:33,545
even when they're not on Facebook.
578
00:28:33,628 --> 00:28:37,298
I mean, I'm going to land this plane
pretty quickly, Mr.
579
00:28:37,298 --> 00:28:40,301
Chairman. I mean, it's creepy.
580
00:28:40,427 --> 00:28:44,514
And I understand
you make a lot of money doing it,
581
00:28:44,556 --> 00:28:49,352
but I just wonder if our technology
582
00:28:49,436 --> 00:28:51,813
is greater than our humanity.
583
00:28:51,813 --> 00:28:54,315
I mean, let me ask you
this final question.
584
00:28:54,315 --> 00:28:56,860
Instagram
585
00:28:56,860 --> 00:28:59,237
is harmful
586
00:28:59,237 --> 00:29:02,615
to young people, isn't it?
587
00:29:02,699 --> 00:29:03,867
Senator? I disagree with that.
588
00:29:03,867 --> 00:29:05,326
That's not what the research shows.
589
00:29:05,326 --> 00:29:07,328
On balance,
that doesn't mean that individual
590
00:29:07,328 --> 00:29:10,832
people don't have issues
and that there aren't things that we need
591
00:29:10,832 --> 00:29:13,835
to do to to help
provide the right tools for people.
592
00:29:13,835 --> 00:29:17,172
But across all of the research
that we've done internally,
593
00:29:17,255 --> 00:29:23,344
I mean, the survey
that the senator previously cited,
594
00:29:23,428 --> 00:29:25,096
you know, there are
595
00:29:25,096 --> 00:29:29,309
12 or 15 different categories of harm
that we asked
596
00:29:29,392 --> 00:29:32,395
teens if they felt that Instagram
made it worse or better.
597
00:29:32,562 --> 00:29:35,899
And across all of them,
except for the one
598
00:29:35,982 --> 00:29:39,486
that Senator Hawley cited, more
people said that using
599
00:29:39,486 --> 00:29:40,987
Instagram, I've got two lands.
600
00:29:40,987 --> 00:29:43,323
Apiece, either positive or.
601
00:29:43,323 --> 00:29:45,366
I would just have to agree to disagree.
602
00:29:45,366 --> 00:29:49,913
If you believe that Instagram,
I know I'm not saying it's intentional,
603
00:29:49,913 --> 00:29:52,415
but if you agree that Instagram,
604
00:29:52,415 --> 00:29:56,419
if you think that Instagram is not hurting
millions of our young people,
605
00:29:56,503 --> 00:30:01,966
particularly young teens, particularly
young women, you shouldn't be driving.
606
00:30:02,050 --> 00:30:04,636
Mr. Zuckerberg,
I want to come back to you.
607
00:30:04,636 --> 00:30:10,099
I talked with you about being a parent
608
00:30:10,183 --> 00:30:13,603
to a young child who doesn't have a phone,
609
00:30:13,812 --> 00:30:17,899
doesn't,
you know, is not on social media at all.
610
00:30:17,982 --> 00:30:21,945
And one of the things
that I am deeply concerned
611
00:30:21,945 --> 00:30:27,951
with as a parent to a young black girl
612
00:30:28,034 --> 00:30:32,372
is the utilization of filters
613
00:30:32,455 --> 00:30:34,666
on your platform
614
00:30:34,666 --> 00:30:37,293
That would suggest
615
00:30:37,293 --> 00:30:40,421
to young girls utilizing your platform
616
00:30:40,588 --> 00:30:45,969
the evidence that they are
not good enough as they are.
617
00:30:46,052 --> 00:30:48,346
I want to
618
00:30:48,346 --> 00:30:50,890
ask more specifically
619
00:30:50,890 --> 00:30:54,811
and refer to some unredacted court
documents
620
00:30:54,811 --> 00:30:57,814
that revealed that your own researchers
621
00:30:57,856 --> 00:30:59,983
concluded that these face filters
622
00:30:59,983 --> 00:31:03,611
that mimic plastic surgery
623
00:31:03,695 --> 00:31:05,780
negatively impact youth mental
624
00:31:05,780 --> 00:31:09,075
health and, indeed, well-being.
625
00:31:09,075 --> 00:31:11,661
Why should we believe...
626
00:31:11,661 --> 00:31:13,371
why should we believe that
627
00:31:13,371 --> 00:31:18,042
you are going to do more
to protect young women
628
00:31:18,209 --> 00:31:23,464
and young girls when it is that
you give them the tools to affirm the
629
00:31:23,464 --> 00:31:24,966
self-hate that is
630
00:31:24,966 --> 00:31:27,385
spewed across your platforms?
631
00:31:27,385 --> 00:31:29,679
Why is it that we should believe.
632
00:31:29,679 --> 00:31:34,809
That you are committed to doing anything
more to keep our children safe?
633
00:31:34,893 --> 00:31:36,311
Sorry, there's a lot to unpack there.
634
00:31:36,311 --> 00:31:39,439
There are a lot of tools
for people to express themselves
635
00:31:39,439 --> 00:31:43,526
in different ways, and people use face
filters and different tools
636
00:31:43,568 --> 00:31:49,699
to make media and photos
and videos that are fun or interesting
637
00:31:49,782 --> 00:31:52,577
across a lot of the different products that...
Plastic
638
00:31:52,577 --> 00:31:58,291
surgery pins are good tools to express
creativity?
639
00:31:58,374 --> 00:32:00,126
Senator, I'm not speaking... Skin
640
00:32:00,126 --> 00:32:03,338
lightening tools are tools to express
641
00:32:03,379 --> 00:32:04,130
creativity?
642
00:32:04,130 --> 00:32:06,549
This is the direct thing
that I'm asking you about.
643
00:32:06,549 --> 00:32:09,135
I'm not defending any specific one of those.
644
00:32:09,135 --> 00:32:11,888
I think that the ability
645
00:32:11,888 --> 00:32:15,099
to kind of filter
646
00:32:15,183 --> 00:32:18,686
and edit images is generally a useful tool
for expression.
647
00:32:18,937 --> 00:32:22,482
For that specifically, I'm not familiar
with the study that you're referring to,
648
00:32:22,482 --> 00:32:27,320
but we did make it so that we're not
recommending this type of content.
649
00:32:27,403 --> 00:32:30,073
Mr. Zuckerberg, I want to begin by
just asking you a simple question,
650
00:32:30,073 --> 00:32:34,702
which is, do you want
kids to use your platform more or less?
651
00:32:34,786 --> 00:32:37,330
Well, we don't want people
under the age of 13 using it. Do you
652
00:32:37,330 --> 00:32:42,585
want teenagers 13 and up
to use your platform more or less?
653
00:32:42,669 --> 00:32:44,170
What we would like to build a product
654
00:32:44,170 --> 00:32:46,422
that is useful
and that people want to use.
655
00:32:46,422 --> 00:32:49,008
My time is going to be limited,
so it's just...
656
00:32:49,008 --> 00:32:51,552
Do you want them to use it? More or less?
657
00:32:51,552 --> 00:32:53,680
Teenagers, 13 to 17 years old.
658
00:32:53,680 --> 00:32:55,807
Do you want them using Meta products?
659
00:32:55,807 --> 00:32:57,100
More or less.
660
00:32:57,100 --> 00:33:00,103
I'd like them to be useful enough
that they want to use them more.
661
00:33:00,353 --> 00:33:04,816
And some of the children from our state,
some of the children,
662
00:33:04,899 --> 00:33:08,236
the parents that we have worked with,
663
00:33:08,319 --> 00:33:11,322
just to think whether it is
664
00:33:11,364 --> 00:33:14,367
Becca Schmidt, David Malak,
665
00:33:14,450 --> 00:33:18,413
Sarah Flat, Emily Shout,
666
00:33:18,496 --> 00:33:22,166
Would you say that life is only worth
667
00:33:22,166 --> 00:33:25,670
$270?
668
00:33:25,753 --> 00:33:28,339
What could possibly lead you?
669
00:33:28,339 --> 00:33:31,175
I mean, I listened to that.
I know you're a dad.
670
00:33:31,175 --> 00:33:34,053
I'm a mom. I'm a grand mom.
671
00:33:34,053 --> 00:33:39,976
And how could you possibly even
have that thought?
672
00:33:40,184 --> 00:33:42,645
It is astounding to me.
673
00:33:42,645 --> 00:33:47,483
And I think this
is one of the reasons that
674
00:33:47,567 --> 00:33:48,901
states, 42
675
00:33:48,901 --> 00:33:53,573
states are now suing you
because of features that they consider
676
00:33:53,573 --> 00:33:58,286
to be addictive,
that you are pushing forward.
677
00:33:58,327 --> 00:34:01,622
And in the emails that we've got from 2021
678
00:34:01,622 --> 00:34:04,876
that go from August to November,
679
00:34:04,959 --> 00:34:07,879
there is the staff plan
that is being discussed.
680
00:34:07,879 --> 00:34:11,841
And Antigone Davis,
Nick Clegg, Sheryl Sandberg, Chris Cox,
681
00:34:12,050 --> 00:34:15,803
Alex Schultz,
Adam Mosseri are all on this chain
682
00:34:15,803 --> 00:34:20,016
of emails on the well-being plan
and then we get to one.
683
00:34:20,016 --> 00:34:25,855
Nick did email Mark
to emphasize his support
684
00:34:25,897 --> 00:34:29,358
for the package,
but it sounds like it lost out
685
00:34:29,358 --> 00:34:33,071
to various other pressures.
686
00:34:33,154 --> 00:34:36,866
See, this is what bothers us.
687
00:34:36,908 --> 00:34:39,285
Children are not your priority.
688
00:34:39,285 --> 00:34:42,246
Children are your product.
689
00:34:42,246 --> 00:34:48,294
Children you see as a way to make money
690
00:34:48,336 --> 00:34:49,962
and children
691
00:34:49,962 --> 00:34:53,424
protecting children in this virtual space.
692
00:34:53,508 --> 00:34:57,011
You made a conscious decision
693
00:34:57,095 --> 00:35:00,556
even though Nick Clegg
694
00:35:00,640 --> 00:35:02,266
and others
695
00:35:02,266 --> 00:35:06,646
were going through the process of saying,
This is what we do,
696
00:35:06,729 --> 00:35:09,732
these documents are really illuminating,
697
00:35:09,816 --> 00:35:14,946
and it just shows me that
698
00:35:15,029 --> 00:35:17,115
growing this business,
699
00:35:17,115 --> 00:35:20,618
expanding your revenue,
700
00:35:20,701 --> 00:35:26,249
what you were going to put on those
quarterly filings, that was the priority.
701
00:35:26,332 --> 00:35:32,088
The children were not. Very clear.
702
00:35:32,171 --> 00:35:34,382
I want to talk with you
about a pedophile ring,
703
00:35:34,382 --> 00:35:40,096
because that came up earlier and The Wall
Street Journal reported on that.
704
00:35:40,179 --> 00:35:45,017
And one of the things that we found out
was after that became evident,
705
00:35:45,101 --> 00:35:50,189
then you didn't take that content down
and it was content.
706
00:35:50,189 --> 00:35:56,195
It showed that teens were for sale
and were offering themselves to older men.
707
00:35:56,279 --> 00:36:01,200
And you didn't take it down because it
didn't violate your community standards.
708
00:36:01,284 --> 00:36:06,831
Do you know how often a child is
bought or sold for sex in this country?
709
00:36:06,914 --> 00:36:08,916
Every 2 minutes.
710
00:36:08,916 --> 00:36:12,587
Every 2 minutes a child
711
00:36:12,670 --> 00:36:15,840
is bought or sold for sex?
712
00:36:15,840 --> 00:36:18,092
That's not my stat.
713
00:36:18,092 --> 00:36:22,096
That is a TBI stat
714
00:36:22,180 --> 00:36:25,433
Now, finally, this content
715
00:36:25,516 --> 00:36:28,936
was taken down
after a congressional staffer
716
00:36:28,936 --> 00:36:31,939
went to Meta's global head of safety.
717
00:36:32,023 --> 00:36:36,110
So would you please explain to me
and to all these parents
718
00:36:36,194 --> 00:36:40,448
why explicit predatory content
does not violate
719
00:36:40,448 --> 00:36:46,287
your platform's terms of service
or your community standards?
720
00:36:46,370 --> 00:36:48,998
Sure, Senator, let me try to address
all of the things that you just said.
721
00:36:48,998 --> 00:36:51,125
It does violate our standards.
722
00:36:51,125 --> 00:36:53,794
We work very hard to take it down.
You didn't take it down.
723
00:36:53,794 --> 00:36:55,588
We've what we've reported,
724
00:36:55,588 --> 00:36:59,091
I think it's more than 26 million examples
of this kind of content.
725
00:36:59,091 --> 00:37:02,261
You didn't take it down until a congressional staffer
brought it up.
726
00:37:02,470 --> 00:37:05,389
It may be that in this case
we made a mistake, missed something.
727
00:37:05,389 --> 00:37:06,724
I think it's a lot of mistakes.
728
00:37:06,724 --> 00:37:08,893
But we have, we have industry-leading tools on that.
729
00:37:08,893 --> 00:37:12,939
I want to talk with you about,
your Instagram
730
00:37:12,939 --> 00:37:15,233
creator's program and about the push.
731
00:37:15,233 --> 00:37:19,779
We found out through these documents
that you actually are pushing forward
732
00:37:20,029 --> 00:37:23,741
because you want to bring kids in early.
733
00:37:23,950 --> 00:37:26,744
You see these younger
734
00:37:26,744 --> 00:37:30,581
teenagers as a valuable
but untapped audience.
735
00:37:30,665 --> 00:37:36,128
Quoting from the emails and suggesting
teens are actually household influencers
736
00:37:36,337 --> 00:37:39,340
to bring their younger siblings
737
00:37:39,548 --> 00:37:43,010
into your platform, into Instagram.
738
00:37:43,094 --> 00:37:45,888
Now, how can you ensure that
739
00:37:45,888 --> 00:37:49,308
Instagram creators, your product,
740
00:37:49,392 --> 00:37:53,396
your program, does not facilitate
illegal activities
741
00:37:53,396 --> 00:37:57,650
when you fail to remove content
742
00:37:57,733 --> 00:38:01,612
pertaining to the sale of minors?
743
00:38:01,696 --> 00:38:08,452
And it is happening once
every 2 minutes in this country.
744
00:38:08,536 --> 00:38:09,662
Senator, our
745
00:38:09,662 --> 00:38:12,665
tools for identifying
that kind of content are industry leading.
746
00:38:12,665 --> 00:38:13,958
That doesn't mean we're perfect.
747
00:38:13,958 --> 00:38:16,961
There are definitely issues that we have,
but we.
748
00:38:17,169 --> 00:38:20,840
Mr. Zuckerberg, yes, there is a lot
that is slipping through.
749
00:38:20,840 --> 00:38:24,719
It appears that you're trying to be
the premier sex trafficking
750
00:38:24,802 --> 00:38:26,595
site in this country.
751
00:38:26,595 --> 00:38:29,098
That's ridiculous. No,
it is not ridiculous.
752
00:38:29,098 --> 00:38:30,349
We don't
753
00:38:30,349 --> 00:38:31,976
want this content on our platforms.
754
00:38:31,976 --> 00:38:34,186
Why don't you take it down? We do.
755
00:38:34,186 --> 00:38:35,354
We are here. Discuss.
756
00:38:35,354 --> 00:38:35,771
We do.
757
00:38:35,771 --> 00:38:38,232
We're working to take it down because.
758
00:38:38,232 --> 00:38:40,901
No, you're not.
759
00:38:40,901 --> 00:38:42,403
You are not.
760
00:38:42,403 --> 00:38:45,531
And the problem
is, we've been working on this.
761
00:38:45,531 --> 00:38:46,824
Senator Welch is over there.
762
00:38:46,824 --> 00:38:49,535
We've been working on this stuff
for a decade.
763
00:38:49,535 --> 00:38:54,999
You have an army of lawyers and lobbyists
that have fought us on this
764
00:38:54,999 --> 00:38:59,211
every step of the way. You work
with NetChoice, the Cato Institute,
765
00:38:59,211 --> 00:39:03,466
Taxpayers Protection
Alliance and Chamber of Progress
766
00:39:03,549 --> 00:39:06,427
to actually fight
767
00:39:06,427 --> 00:39:11,599
our bipartisan legislation
to keep kids safe online.
768
00:39:11,682 --> 00:39:14,393
So are you going to stop
funding these groups?
769
00:39:14,393 --> 00:39:18,314
Are you going to stop lobbying
against this and come to the table
770
00:39:18,314 --> 00:39:19,065
and work with us?
771
00:39:19,065 --> 00:39:20,232
Yes or no?
772
00:39:20,232 --> 00:39:23,444
Senator, we. A yes or no?
773
00:39:23,527 --> 00:39:25,488
Of course
we'll work with you on the legislation.
774
00:39:25,488 --> 00:39:26,322
Okay.
775
00:39:26,322 --> 00:39:27,740
The door is open.
776
00:39:27,740 --> 00:39:30,534
We've got all these bills. You need
777
00:39:30,534 --> 00:39:32,203
You need to come to the table.
778
00:39:32,203 --> 00:39:36,957
Each and every one of you need to come
to the table and you need to work with us.
779
00:39:37,166 --> 00:40:00,523
Kids are dying.
780
00:40:00,606 --> 00:40:01,649
Chair Durbin,
781
00:40:01,649 --> 00:40:04,568
Ranking Member Graham,
and members of the committee,
782
00:40:04,568 --> 00:40:08,322
I appreciate the opportunity to appear
before you today.
783
00:40:08,406 --> 00:40:12,368
My name is Shou Chew
and I am the CEO of TikTok,
784
00:40:12,451 --> 00:40:16,288
an online community
of more than 1 billion people worldwide,
785
00:40:16,372 --> 00:40:19,667
including well over 170 million Americans
786
00:40:19,750 --> 00:40:22,545
who use our app every month to create,
787
00:40:22,545 --> 00:40:25,297
to share, and to discover
788
00:40:25,297 --> 00:40:28,467
Although the average age on TikTok in
the U.S.
789
00:40:28,509 --> 00:40:33,431
is over 30, we recognize that special
safeguards are required to protect minors,
790
00:40:33,514 --> 00:40:37,768
and especially when it comes to combating
all forms of CSAM.
791
00:40:37,852 --> 00:40:41,564
As a father of three young children
myself, I know that the issues
792
00:40:41,564 --> 00:40:46,402
that we're discussing today are horrific
and the nightmare of every parent.
793
00:40:46,485 --> 00:40:49,738
I am proud of our efforts
to address the threats
794
00:40:49,822 --> 00:40:53,451
to young people online, from a commitment
to protecting them
795
00:40:53,534 --> 00:40:57,663
to our industry leading policies,
use of innovative technology
796
00:40:57,746 --> 00:41:03,878
and significant ongoing investments
in trust and safety to achieve this goal.
797
00:41:03,961 --> 00:41:06,255
TikTok is vigilant about
798
00:41:06,255 --> 00:41:10,634
its 13 and up age
policy and offers an experience for teens
799
00:41:10,801 --> 00:41:16,390
that is much more restrictive
than you and I would have as adults.
800
00:41:16,474 --> 00:41:19,477
We make careful product design choices
to help
801
00:41:19,477 --> 00:41:23,647
make our app inhospitable
to those seeking to harm teens.
802
00:41:23,731 --> 00:41:26,817
Let me give you a few examples of
long-standing policies that are
803
00:41:26,859 --> 00:41:27,943
unique to TikTok.
804
00:41:27,943 --> 00:41:30,112
We didn't do them last week.
805
00:41:30,112 --> 00:41:32,406
First, direct messaging
806
00:41:32,406 --> 00:41:36,368
is not available
to any users under the age of 16.
807
00:41:36,452 --> 00:41:38,037
Second,
808
00:41:38,037 --> 00:41:40,122
accounts for people under 16
809
00:41:40,122 --> 00:41:44,418
are automatically set to private
along with their content.
810
00:41:44,502 --> 00:41:47,421
Furthermore,
the content cannot be downloaded
811
00:41:47,421 --> 00:41:51,967
and will not be recommended to people
they do not know.
812
00:41:52,051 --> 00:41:56,514
Third, every teen under 18
has a screen time limit
813
00:41:56,597 --> 00:42:00,976
automatically
set to 60 minutes. And fourth,
814
00:42:01,060 --> 00:42:06,649
only people 18 and above are allowed to
use our live stream feature.
815
00:42:06,732 --> 00:42:07,191
I'm proud
816
00:42:07,191 --> 00:42:10,945
to say that TikTok was among the first
to empower parents
817
00:42:10,986 --> 00:42:13,989
to supervise their teens on our app
818
00:42:13,989 --> 00:42:16,700
with our family pairing tools.
819
00:42:16,700 --> 00:42:19,954
This includes setting screen time
limits, filtering out
820
00:42:19,954 --> 00:42:23,874
content from their feed, amongst others.
821
00:42:23,916 --> 00:42:27,169
We made these choices
after consulting with doctors
822
00:42:27,253 --> 00:42:32,049
and safety experts who understand
the unique stages of teenage development.
823
00:42:32,132 --> 00:42:34,760
To ensure
that we have the appropriate safeguards
824
00:42:34,760 --> 00:42:38,472
to prevent harm and minimize risk.
825
00:42:38,556 --> 00:42:41,350
Now, safety is one of the core priorities
826
00:42:41,350 --> 00:42:44,353
that defines TikTok under my leadership.
827
00:42:44,353 --> 00:42:46,730
We currently have more than 40,000
828
00:42:46,730 --> 00:42:51,110
trust and safety professionals
working to protect our community globally,
829
00:42:51,193 --> 00:42:54,947
and we expect to invest
more than $2 billion
830
00:42:55,030 --> 00:42:58,367
in trust
and safety efforts this year alone.
831
00:42:58,450 --> 00:43:00,536
A significant part of that is in our U.S.
832
00:43:00,536 --> 00:43:02,121
operations.
833
00:43:02,204 --> 00:43:03,247
Our robust
834
00:43:03,247 --> 00:43:08,961
community guidelines strictly prohibit
content or behavior that puts teenagers
835
00:43:08,961 --> 00:43:11,964
at risk of exploitation or other harm,
836
00:43:12,006 --> 00:43:15,968
and we vigorously enforce them.
837
00:43:16,051 --> 00:43:19,555
Our technology moderates
all content uploaded to our app
838
00:43:19,638 --> 00:43:24,893
to help quickly identify potential CSAM
and other material that breaks our rules.
839
00:43:24,935 --> 00:43:28,022
It automatically removes the content
or elevates it
840
00:43:28,022 --> 00:43:31,567
to our safety professionals
for further review.
841
00:43:31,650 --> 00:43:33,569
We also moderate direct messages
842
00:43:33,569 --> 00:43:37,865
for CSAM and related material
and use third party
843
00:43:37,865 --> 00:43:42,369
tools like PhotoDNA
and Take It Down to combat CSAM
844
00:43:42,453 --> 00:43:46,415
to prevent content
from being uploaded to our platform.
845
00:43:46,498 --> 00:43:49,918
We continually meet with parents, teachers
and teens.
846
00:43:50,002 --> 00:43:53,255
In fact, I sat down with a group
just a few days ago.
847
00:43:53,339 --> 00:43:57,176
We use their insights
to improve the protections on our platform,
848
00:43:57,217 --> 00:44:01,889
and we also work with leading groups
like the Technology Coalition.
849
00:44:01,972 --> 00:44:02,556
The steps that
850
00:44:02,556 --> 00:44:05,559
we're taking to protect
teens are a critical part
851
00:44:05,559 --> 00:44:09,980
of our larger trust and safety work
as we continue our voluntary
852
00:44:10,064 --> 00:44:15,110
and unprecedented efforts to build a safe
and secure data environment for U.S.
853
00:44:15,110 --> 00:44:18,697
users,
ensuring that our platform remains free
854
00:44:18,781 --> 00:44:22,785
from outside manipulation
and implementing safeguards
855
00:44:22,868 --> 00:44:26,830
on our content recommendation
and moderation tools.
856
00:44:26,914 --> 00:44:28,666
Keeping teens safe online requires
857
00:44:28,666 --> 00:44:31,669
a collaborative effort
as well as collective action.
858
00:44:31,835 --> 00:44:36,382
We share the committee's concern and
commitment to protect young people online,
859
00:44:36,465 --> 00:44:37,716
and we welcome the opportunity
860
00:44:37,716 --> 00:44:41,303
to work with you on legislation
to achieve this goal.
861
00:44:41,345 --> 00:44:44,056
Our commitment is ongoing and unwavering
862
00:44:44,056 --> 00:44:47,935
because there is no finish line
when it comes to protecting teens.
863
00:44:47,976 --> 00:44:50,938
Thank you for your time and consideration
today.
864
00:44:50,938 --> 00:44:52,815
I'm happy to answer your questions.
865
00:44:52,815 --> 00:44:55,859
Can you explain to us
what you are doing particularly
866
00:44:55,859 --> 00:45:01,073
and whether you've seen
any evidence of CSAM in your business?
867
00:45:01,156 --> 00:45:02,282
Yes, Senator,
868
00:45:02,282 --> 00:45:05,953
we have a strong commitment
to invest in trust and safety.
869
00:45:06,036 --> 00:45:09,456
And as I said in my opening statement,
I intend to invest
870
00:45:09,707 --> 00:45:13,085
more than $2 billion in trust and safety
this year alone.
871
00:45:13,168 --> 00:45:17,005
We have 40,000 safety professionals
working on this topic.
872
00:45:17,047 --> 00:45:21,135
We have built a specialized child
safety team to help us identify
873
00:45:21,135 --> 00:45:26,724
specialized issues, horrific issues like
material like the ones you have mentioned.
874
00:45:26,765 --> 00:45:31,145
If we identify any on our platform,
and we proactively do detection,
875
00:45:31,353 --> 00:45:37,276
we will remove it and we will report it
to NCMEC and other authorities.
876
00:45:37,359 --> 00:45:38,610
Why is TikTok
877
00:45:38,610 --> 00:45:43,949
allowing children to be exploited
into performing commercialized sex acts?
878
00:45:44,032 --> 00:45:44,533
Senator,
879
00:45:44,533 --> 00:45:47,119
I respectfully disagree
with that characterization.
880
00:45:47,119 --> 00:45:51,498
Our live streaming product
is not for anyone below the age of 18.
881
00:45:51,582 --> 00:45:53,375
We have taken action
882
00:45:53,375 --> 00:45:57,212
to identify anyone who violates
that, and we will remove them
883
00:45:57,296 --> 00:46:00,299
from using that service.
884
00:46:00,299 --> 00:46:03,010
Mr. Chew, I'm a co-sponsor of Chair
885
00:46:03,010 --> 00:46:07,055
Durbin's STOP CSAM Act of 2023,
886
00:46:07,139 --> 00:46:10,684
along with Senator Hawley,
the lead Republican, I believe,
887
00:46:10,768 --> 00:46:14,646
which among other things empowers victims
by making it easier for them to ask
888
00:46:14,646 --> 00:46:18,317
tech companies to remove the material
889
00:46:18,400 --> 00:46:21,236
and related imagery from their platforms.
890
00:46:21,236 --> 00:46:23,947
Why would you not support this bill?
891
00:46:23,947 --> 00:46:25,866
Senator, We largely support it.
892
00:46:25,866 --> 00:46:28,869
I think the spirit of it is very aligned
with what we want to do.
893
00:46:29,036 --> 00:46:32,706
There are questions about implementation
that I think companies like us
894
00:46:32,706 --> 00:46:35,125
and some other groups have,
and we look forward to asking those.
895
00:46:35,125 --> 00:46:39,254
And of course, if this legislation is law,
we will comply.
896
00:46:39,338 --> 00:46:40,756
Mr. Chew,
897
00:46:40,756 --> 00:46:45,969
I think your company is unique among
the ones represented here today
898
00:46:45,969 --> 00:46:50,724
because of its ownership by Bytedance,
899
00:46:50,766 --> 00:46:53,602
a Chinese company.
900
00:46:53,602 --> 00:46:56,605
And I know there have been some steps
that you've taken
901
00:46:56,647 --> 00:47:00,984
to wall off the data
collected here in the United States.
902
00:47:00,984 --> 00:47:04,988
But the fact of the matter
is that under Chinese law
903
00:47:04,988 --> 00:47:07,991
and Chinese national intelligence laws,
904
00:47:08,158 --> 00:47:12,788
all information accumulated by companies
in the People's Republic of China
905
00:47:12,788 --> 00:47:20,379
is required to be shared
with the Chinese intelligence services.
906
00:47:20,462 --> 00:47:22,506
ByteDance,
907
00:47:22,506 --> 00:47:26,176
the initial release of TikTok,
I understand, was in 2016.
908
00:47:26,260 --> 00:47:29,263
These efforts that you made
with Oracle
909
00:47:29,429 --> 00:47:32,516
under the so-called Project Texas
to wall off the U.S.
910
00:47:32,516 --> 00:47:35,686
data, that was in 2021, and apparently,
911
00:47:35,769 --> 00:47:39,439
allegedly, it was fully walled off in March of '23.
912
00:47:39,523 --> 00:47:43,318
What happened to all of the data
that TikTok collected
913
00:47:43,402 --> 00:47:46,071
before that?
914
00:47:46,071 --> 00:47:47,656
Senator, thank you.
915
00:47:47,656 --> 00:47:48,949
From American users.
916
00:47:48,949 --> 00:47:52,786
Understand, TikTok is owned by ByteDance,
which is majority
917
00:47:52,786 --> 00:47:54,079
owned by global investors.
918
00:47:54,079 --> 00:47:57,708
And we have three Americans on the board
out of five.
919
00:47:57,791 --> 00:48:00,711
You are right in pointing out that over
the last three years,
920
00:48:00,711 --> 00:48:04,006
we have spent billions of dollars
building out Project Texas,
921
00:48:04,006 --> 00:48:07,009
which is a plan
that is unprecedented in our industry
922
00:48:07,217 --> 00:48:10,888
to firewall off and protect U.S.
data from the rest of our staff.
923
00:48:10,971 --> 00:48:11,680
We also have I'm.
924
00:48:11,680 --> 00:48:16,435
I'm asking about all of the data that you
collected prior to that, that event.
925
00:48:16,518 --> 00:48:19,396
Yes, Senator,
we have started the data deletion plan.
926
00:48:19,396 --> 00:48:21,273
I talked about this a year ago.
927
00:48:21,273 --> 00:48:23,442
We have finished the first phase
of data deletion
928
00:48:23,442 --> 00:48:27,195
through our data centers outside
of the Oracle Cloud infrastructure.
929
00:48:27,279 --> 00:48:29,990
And we're beginning
phase two where we will not only delete
930
00:48:29,990 --> 00:48:33,619
from the data centers, we will hire
a third party to verify that work.
931
00:48:33,702 --> 00:48:37,664
And then we will go into,
you know, for example, employees
932
00:48:37,748 --> 00:48:39,666
working laptops to delete data as well.
933
00:48:39,666 --> 00:48:45,047
Was all of the data collected
by TikTok prior to Project Texas shared
934
00:48:45,047 --> 00:48:50,010
with the Chinese government
pursuant to the national intelligence law,
935
00:48:50,093 --> 00:48:51,219
of that country?
936
00:48:51,219 --> 00:48:53,555
Senator,
we have not been asked for any data
937
00:48:53,555 --> 00:49:01,355
by the Chinese government
and we have never provided it.
938
00:49:01,438 --> 00:49:03,982
Your company is unique, again, among the
939
00:49:03,982 --> 00:49:07,444
the ones represented here today
because you're currently undergoing
940
00:49:07,486 --> 00:49:10,906
review by the Committee on
Foreign Investment in the United States.
941
00:49:10,906 --> 00:49:12,783
Is that correct?
942
00:49:12,783 --> 00:49:15,077
Senator, Yes,
there are ongoing discussions
943
00:49:15,077 --> 00:49:19,206
and a lot of our Project Texas
work is informed by the
944
00:49:19,289 --> 00:49:23,210
discussions with many agencies
under the CFIUS umbrella.
945
00:49:23,293 --> 00:49:27,172
Well, CFIUS is designed
specifically to review foreign investments
946
00:49:27,172 --> 00:49:31,885
in the United States
for national security risks, correct?
947
00:49:31,927 --> 00:49:32,886
Yes, I believe so.
948
00:49:32,886 --> 00:49:38,058
And your company is currently
being reviewed by this interagency
949
00:49:38,141 --> 00:49:41,019
committee
at the Treasury Department
950
00:49:41,019 --> 00:49:44,564
for potential national security risks.
951
00:49:44,648 --> 00:49:49,444
Senator,
this review is on the acquisition of Musical.ly,
952
00:49:49,444 --> 00:49:52,823
which is an acquisition
that was done many years ago.
953
00:49:52,906 --> 00:49:58,161
I mean, is this a casual conversation
or are you actually providing information
954
00:49:58,161 --> 00:50:04,376
to the Treasury Department about
how your platform operates
955
00:50:04,418 --> 00:50:07,754
for evaluating
a potential national security risk?
956
00:50:07,838 --> 00:50:11,049
Senator, it's
been many years across two administrations
957
00:50:11,174 --> 00:50:16,680
and a lot of discussions around
how our plans are, how our systems work.
958
00:50:16,763 --> 00:50:22,561
We have a lot of robust discussions
about a lot of detail.
959
00:50:22,644 --> 00:50:26,356
63% of teens, I understand, use TikTok.
960
00:50:26,356 --> 00:50:29,359
Does that sound about right?
961
00:50:29,526 --> 00:50:31,778
Senator, I cannot verify that.
962
00:50:31,778 --> 00:50:34,364
We know we are popular amongst many age
groups.
963
00:50:34,364 --> 00:50:35,866
The average age in the U.S.
964
00:50:35,866 --> 00:50:40,203
today for our user base is over 30,
but we are aware we are popular.
965
00:50:40,287 --> 00:50:44,416
And you reside in Singapore
with your family, correct?
966
00:50:44,499 --> 00:50:45,584
Yes, I have.
967
00:50:45,584 --> 00:50:48,336
I reside in Singapore and I work here
in the United States as well.
968
00:50:48,336 --> 00:50:52,674
And do your children
have access to TikTok in Singapore?
969
00:50:52,758 --> 00:50:54,968
Senator,
if they lived in the United States,
970
00:50:54,968 --> 00:50:57,179
I would give them access
to our under 13 experience.
971
00:50:57,179 --> 00:50:59,806
My children are below the age of 13.
972
00:50:59,806 --> 00:51:03,226
My question is in Singapore,
do they have access to TikTok
973
00:51:03,226 --> 00:51:07,314
or is that restricted by domestic law?
974
00:51:07,355 --> 00:51:10,025
We do not have an under 13 experience
in Singapore.
975
00:51:10,025 --> 00:51:14,571
We have that in the United States
because we are a mixed-audience app
976
00:51:14,654 --> 00:51:19,951
and we created an experience
in response to that.
977
00:51:20,035 --> 00:51:22,996
A Wall
Street Journal article published yesterday
978
00:51:22,996 --> 00:51:27,626
directly contradicts
what your company has stated publicly.
979
00:51:27,667 --> 00:51:31,797
According to the Journal, employees
under Project Texas say that U.S.
980
00:51:31,797 --> 00:51:35,300
user data, including user emails,
981
00:51:35,383 --> 00:51:39,429
birthdates, and IP addresses
continue to be shared
982
00:51:39,513 --> 00:51:44,851
with Bytedance's staff, again
owned by a Chinese company.
983
00:51:44,935 --> 00:51:46,812
Do you dispute that?
984
00:51:46,812 --> 00:51:49,064
Yes, Senator,
there are many things about that article.
985
00:51:49,064 --> 00:51:50,232
that are inaccurate.
986
00:51:50,232 --> 00:51:54,903
What it gets right is that
this is a voluntary project that we built.
987
00:51:54,986 --> 00:51:56,488
We spent billions of dollars.
988
00:51:56,488 --> 00:51:59,491
There are thousands of employees involved
and it's very difficult
989
00:51:59,491 --> 00:52:03,912
because it's unprecedented.
990
00:52:03,995 --> 00:52:04,621
Why is
991
00:52:04,621 --> 00:52:08,458
it important
that the data collected from U.S.
992
00:52:08,458 --> 00:52:13,380
users be stored in the United States?
993
00:52:13,463 --> 00:52:16,258
Senator,
this was a project we built in response
994
00:52:16,258 --> 00:52:19,970
to some of the concerns that were raised
by members of this committee and others.
995
00:52:20,053 --> 00:52:23,056
And that was because of concerns
that the data that was stored
996
00:52:23,056 --> 00:52:28,937
in China could be accessed
by the Chinese Communist Party by
997
00:52:28,937 --> 00:52:31,940
and according to the National Intelligence
laws.
998
00:52:31,940 --> 00:52:33,191
Correct?
999
00:52:33,191 --> 00:52:37,320
Senator, we are not the only
company that does business
1000
00:52:37,404 --> 00:52:39,197
that has Chinese employees, for example.
1001
00:52:39,197 --> 00:52:42,492
We're not even the only company
in this room that hires Chinese nationals.
1002
00:52:42,576 --> 00:52:44,828
But in order to address
some of these concerns,
1003
00:52:44,828 --> 00:52:47,622
we have moved the data
into the Oracle Cloud infrastructure.
1004
00:52:47,622 --> 00:52:51,835
We built a 2,000-person team to oversee
the management of the data, based here.
1005
00:52:52,085 --> 00:52:55,088
We firewalled it off
from the rest of the organization,
1006
00:52:55,213 --> 00:52:58,258
and then we opened it up to third parties
like Oracle,
1007
00:52:58,508 --> 00:53:02,012
and we will onboard others
to give them third party validation.
1008
00:53:02,095 --> 00:53:03,972
This is unprecedented access.
1009
00:53:03,972 --> 00:53:04,848
I think we are unique
1010
00:53:04,848 --> 00:53:08,018
in taking even more steps
to protect user data in the United States.
1011
00:53:08,018 --> 00:53:13,190
You disputed The Wall Street
Journal story published yesterday.
1012
00:53:13,273 --> 00:53:16,484
Are you going to conduct
any sort of investigation to see
1013
00:53:16,484 --> 00:53:18,904
whether there is any truth to
1014
00:53:18,987 --> 00:53:20,864
the allegations made in the article?
1015
00:53:20,864 --> 00:53:23,200
Or are you just going to dismiss them
outright?
1016
00:53:23,200 --> 00:53:24,409
We're not going to dismiss them.
1017
00:53:24,409 --> 00:53:29,080
So we have ongoing security inspections,
not only by our own personnel,
1018
00:53:29,164 --> 00:53:33,501
but also by third parties to ensure
that the system is rigorous and robust.
1019
00:53:33,585 --> 00:53:36,546
No system that we,
any one of us can build is perfect.
1020
00:53:36,671 --> 00:53:38,173
But what we need to do is to make sure
1021
00:53:38,173 --> 00:53:41,551
that we are always improving it
and testing it against
1022
00:53:41,551 --> 00:53:43,094
people who may try to bypass it.
1023
00:53:43,094 --> 00:53:46,056
And if anyone breaks our policies
within our organization,
1024
00:53:46,056 --> 00:53:48,433
we will take disciplinary action
against them.
1025
00:53:48,433 --> 00:53:50,769
Mr. Chew, let's get straight to the chase.
1026
00:53:50,769 --> 00:53:56,816
Is TikTok under the influence
of the Chinese Communist Party?
1027
00:53:56,858 --> 00:53:57,525
No, Senator.
1028
00:53:57,525 --> 00:53:59,194
We are a private business.
1029
00:53:59,194 --> 00:54:02,364
Okay.
So you concede that your parent ByteDance
1030
00:54:02,364 --> 00:54:05,367
is subject to the 2017
National Security law,
1031
00:54:05,533 --> 00:54:08,495
which requires Chinese companies
to turn over information
1032
00:54:08,495 --> 00:54:11,248
to the Chinese government
and conceal it from the rest of the world.
1033
00:54:11,248 --> 00:54:12,749
You concede that, correct?
1034
00:54:12,749 --> 00:54:14,542
Senator, for the Chinese businesses.
1035
00:54:14,542 --> 00:54:15,669
You concede that, yes?
1036
00:54:15,669 --> 00:54:16,628
Any company
1037
00:54:16,628 --> 00:54:20,465
that does global business,
that does business in China, has to follow the law.
1038
00:54:20,548 --> 00:54:22,008
Isn't it the case that Bytedance
1039
00:54:22,008 --> 00:54:25,679
also has an internal Chinese Communist
Party committee?
1040
00:54:25,762 --> 00:54:28,765
Like I said, all businesses that
operate in China have to follow the law.
1041
00:54:28,890 --> 00:54:32,769
So your parent company is subject
to the national security law
1042
00:54:32,811 --> 00:54:34,437
that requires it to answer to the party.
1043
00:54:34,437 --> 00:54:37,440
It has its own internal Chinese Communist
Party committee.
1044
00:54:37,607 --> 00:54:39,567
You answer to that parent company,
1045
00:54:39,567 --> 00:54:40,944
but you expect us to believe that you're
1046
00:54:40,944 --> 00:54:43,446
not under the influence
of the Chinese Communist Party.
1047
00:54:43,446 --> 00:54:47,784
I understand this concern, Senator,
which is why we built the.
1048
00:54:47,867 --> 00:54:49,703
But you used to work for Bytedance,
didn't you?
1049
00:54:49,703 --> 00:54:51,496
You were the CFO for Bytedance.
1050
00:54:51,496 --> 00:54:52,289
That is correct.
1051
00:54:52,289 --> 00:54:55,292
In April 2021, while you were the CFO,
1052
00:54:55,417 --> 00:54:59,004
the Chinese Communist Party's
China Internet Investment Fund purchased
1053
00:54:59,004 --> 00:55:04,259
a 1% stake in ByteDance's main Chinese
subsidiary, the ByteDance Technology Company.
1054
00:55:04,259 --> 00:55:07,887
In return for that so-called
1% golden share.
1055
00:55:07,971 --> 00:55:11,850
The party took one of three board seats
at that subsidiary company.
1056
00:55:11,891 --> 00:55:13,560
That's correct, isn't it?
1057
00:55:13,560 --> 00:55:15,270
It's for the Chinese business.
1058
00:55:15,270 --> 00:55:16,271
Is that correct?
1059
00:55:16,271 --> 00:55:17,480
It is for the Chinese business.
1060
00:55:17,480 --> 00:55:21,526
That deal was finalized on April
30th, 2021.
1061
00:55:21,693 --> 00:55:24,029
Is it true that you were appointed
the CEO of TikTok
1062
00:55:24,029 --> 00:55:27,198
the very next day, on May 1, 2021?
1063
00:55:27,282 --> 00:55:28,992
Well, this is a coincidence.
1064
00:55:28,992 --> 00:55:30,076
It's a coincidence.
1065
00:55:30,076 --> 00:55:31,453
It is. The CCP,
1066
00:55:31,453 --> 00:55:31,995
Senator, the
1067
00:55:31,995 --> 00:55:35,081
Chinese Communist Party, took its golden
share and its board seat.
1068
00:55:35,206 --> 00:55:38,209
And the very next day,
you were appointed the CEO of TikTok.
1069
00:55:38,376 --> 00:55:42,255
That's a hell of a coincidence.
It really is, Senator.
1070
00:55:42,339 --> 00:55:43,757
It is.
1071
00:55:43,757 --> 00:55:45,133
Okay.
1072
00:55:45,133 --> 00:55:48,595
And before ByteDance,
you were at a Chinese company called Xiaomi,
1073
00:55:48,636 --> 00:55:50,472
is that correct?
1074
00:55:50,472 --> 00:55:52,265
Yes. I used to work around the world.
1075
00:55:52,265 --> 00:55:54,392
Where did you live
when you worked at Xiaomi?
1076
00:55:54,392 --> 00:55:56,436
I lived in China, like many expats.
1077
00:55:56,436 --> 00:55:57,228
Where exactly?
1078
00:55:57,228 --> 00:55:58,063
In Beijing. In China.
1079
00:55:58,063 --> 00:56:00,190
How many years did you live in Beijing?
1080
00:56:00,190 --> 00:56:02,108
Senator,
I worked there for about five years.
1081
00:56:02,108 --> 00:56:03,943
So you lived there five years? Yes.
1082
00:56:03,943 --> 00:56:06,654
Is it the case that Xiaomi was sanctioned
by the United States
1083
00:56:06,654 --> 00:56:10,909
government in 2021 for being a communist
Chinese military company?
1084
00:56:10,992 --> 00:56:12,786
I'm here to talk about TikTok, I think.
1085
00:56:12,786 --> 00:56:15,455
I think they then had a lawsuit
and it was overturned.
1086
00:56:15,455 --> 00:56:16,289
I can't remember that.
1087
00:56:16,289 --> 00:56:16,581
No, no.
1088
00:56:16,581 --> 00:56:20,085
It was overturned by an administration
that reversed those sanctions, just like,
1089
00:56:20,085 --> 00:56:23,713
by the way, they reversed the terrorist
designation on the Houthis in Yemen.
1090
00:56:23,713 --> 00:56:25,173
How's that working out for them?
1091
00:56:25,173 --> 00:56:29,302
But it was sanctioned
as a Chinese communist military company.
1092
00:56:29,511 --> 00:56:33,973
So you said today, as you often say,
that you live in Singapore.
1093
00:56:34,057 --> 00:56:36,518
Of what nation are you a citizen?
1094
00:56:36,518 --> 00:56:37,602
Singapore.
1095
00:56:37,602 --> 00:56:39,604
Are you a citizen of any other nation?
1096
00:56:39,604 --> 00:56:40,230
No, Senator.
1097
00:56:40,230 --> 00:56:42,524
Have you ever applied
for Chinese citizenship?
1098
00:56:42,524 --> 00:56:44,859
Senator, I served my nation in Singapore.
1099
00:56:44,859 --> 00:56:46,403
No, I did not.
1100
00:56:46,403 --> 00:56:48,279
Do you have a Singaporean passport?
1101
00:56:48,279 --> 00:56:51,074
Yes. And I served in the military
for two and a half years in Singapore.
1102
00:56:51,074 --> 00:56:53,201
Have any other.
Do you have any other passports?
1103
00:56:53,201 --> 00:56:54,661
In any other nation? No.
1104
00:56:54,661 --> 00:56:56,079
Your wife is an American citizen.
1105
00:56:56,079 --> 00:56:58,123
Your children are American citizens.
That's correct.
1106
00:56:58,123 --> 00:57:00,458
Have you ever applied
for American citizenship?
1107
00:57:00,458 --> 00:57:05,171
Not no, not yet. Okay.
1108
00:57:05,255 --> 00:57:07,674
Have you ever been
a member of the Chinese Communist Party?
1109
00:57:07,674 --> 00:57:09,884
Senator. I'm Singaporean. No.
1110
00:57:09,884 --> 00:57:12,470
Have you ever been associated
or affiliated with the Chinese Communist
1111
00:57:12,470 --> 00:57:14,806
Party?
No, Senator. Again, I'm Singaporean.
1112
00:57:14,806 --> 00:57:16,683
Let me ask you some
hopefully simple questions.
1113
00:57:16,683 --> 00:57:20,728
You said earlier in response to a question
that what happened at Tiananmen
1114
00:57:20,728 --> 00:57:24,023
Square in June of 1989
was a massive protest.
1115
00:57:24,107 --> 00:57:26,526
Anything else happened in Tiananmen
Square?
1116
00:57:26,526 --> 00:57:28,987
Yes, I think it's well documented.
There was a massacre.
1117
00:57:28,987 --> 00:57:33,950
There was indiscriminate slaughter of
hundreds or thousands of Chinese citizens.
1118
00:57:34,033 --> 00:57:37,036
Do you agree with the Trump administration
and the Biden administration
1119
00:57:37,162 --> 00:57:40,415
that the Chinese government is committing
genocide against the Uyghur people?
1120
00:57:40,498 --> 00:57:41,916
Senator, I've said this before.
1121
00:57:41,916 --> 00:57:43,042
I think it's really important
1122
00:57:43,042 --> 00:57:46,212
that anyone who cares about this topic
or any topic can freely express
1123
00:57:46,212 --> 00:57:47,046
themselves on TikTok.
1124
00:57:47,046 --> 00:57:48,173
It's a very simple question
1125
00:57:48,173 --> 00:57:52,135
that unites both parties in our country
and governments around the world.
1126
00:57:52,135 --> 00:57:55,638
Is the Chinese government committing
genocide against the Uyghur people?
1127
00:57:55,722 --> 00:57:59,767
Senator, anyone, including you,
can come in to talk about this.
1128
00:57:59,767 --> 00:58:01,227
topic. I'm asking you. Any topic.
1129
00:58:01,227 --> 00:58:02,145
United States,
1130
00:58:02,145 --> 00:58:06,441
cosmopolitan, well-educated man who's
expressed many opinions on many topics.
1131
00:58:06,524 --> 00:58:09,611
Is the Chinese government committing
genocide against the Uyghur people?
1132
00:58:09,819 --> 00:58:12,572
Actually, Senator,
I'm here to talk about my company,
1133
00:58:12,572 --> 00:58:15,283
and what
TikTok does. Yes or no? You're here.
1134
00:58:15,283 --> 00:58:19,537
We allow. You're here to give testimony
that's truthful and honest and complete.
1135
00:58:19,621 --> 00:58:20,330
Let me ask you this.
1136
00:58:20,330 --> 00:58:23,082
Joe Biden last year
said that Xi Jinping was a dictator.
1137
00:58:23,082 --> 00:58:25,376
Do you agree with Joe Biden?
Is Xi Jinping a dictator?
1138
00:58:25,376 --> 00:58:28,338
Senator, I'm not going to comment on
any world leaders.
1139
00:58:28,338 --> 00:58:30,590
Why won't you answer these
very simple questions?
1140
00:58:30,590 --> 00:58:31,633
Senator, it's not appropriate.
1141
00:58:31,633 --> 00:58:33,343
for me as a businessman
to comment on a world leader.
1142
00:58:33,343 --> 00:58:35,470
Are you scared that you'll lose your job
if you say anything
1143
00:58:35,470 --> 00:58:37,639
negative
about the Chinese Communist Party?
1144
00:58:37,639 --> 00:58:40,600
I disagree. You will find content
that is critical of China, and I'll.
1145
00:58:40,600 --> 00:58:41,559
Tell you next time you go.
1146
00:58:41,559 --> 00:58:43,311
Are you scared that you'll be arrested
and disappear
1147
00:58:43,311 --> 00:58:45,104
the next time you go to mainland China?
1148
00:58:45,104 --> 00:58:47,690
Senator, you will find content
that is critical of China
1149
00:58:47,690 --> 00:58:49,651
and any other country freely on TikTok.
1150
00:58:49,651 --> 00:58:50,735
Okay. Okay.
1151
00:58:50,735 --> 00:58:54,614
Let's turn to what
TikTok, a tool of the Chinese Communist
1152
00:58:54,614 --> 00:58:56,533
Party is doing to America's youth.
1153
00:58:56,533 --> 00:59:01,246
Does
the name Mason Edens ring a bell?
1154
00:59:01,329 --> 00:59:03,998
Senator, you may have to
give me more specifics, if you don't mind.
1155
00:59:03,998 --> 00:59:04,249
Yeah.
1156
00:59:04,249 --> 00:59:07,710
He was a 16-year-old Arkansan.
After a breakup in 2022,
1157
00:59:07,752 --> 00:59:09,254
he went on your platform and searched
1158
00:59:09,254 --> 00:59:12,882
for things like inspirational quotes
and positive affirmations.
1159
00:59:13,091 --> 00:59:17,804
Instead, he was served up numerous videos
glamorizing suicide
1160
00:59:17,804 --> 00:59:21,558
until he killed himself by gun.
1161
00:59:21,641 --> 00:59:22,934
What about the name Chase
1162
00:59:22,934 --> 00:59:24,644
Nasca?
1163
00:59:24,644 --> 00:59:25,853
Does that ring a bell?
1164
00:59:25,853 --> 00:59:27,605
Would you mind giving me more details?
1165
00:59:27,605 --> 00:59:30,483
He was a 16 year old
who saw more than a thousand videos
1166
00:59:30,483 --> 00:59:32,652
on your platform
about violence and suicide
1167
00:59:32,652 --> 00:59:36,155
until he took his own life
by stepping in front of a train.
1168
00:59:36,239 --> 00:59:39,367
Are you aware that his parents, Dean
and Michelle, are suing
1169
00:59:39,367 --> 00:59:43,538
TikTok and Bytedance for pushing their
son to take his own life?
1170
00:59:43,621 --> 00:59:45,415
Yes, I'm aware of that.
1171
00:59:45,415 --> 00:59:47,959
Okay.
1172
00:59:47,959 --> 00:59:48,668
Finally, Mr.
1173
00:59:48,668 --> 00:59:52,672
Chew, has the Federal Trade
Commission sued TikTok
1174
00:59:52,672 --> 00:59:55,049
during the Biden administration?
1175
00:59:55,049 --> 00:59:57,176
Senator,
I cannot talk about whether there's any.
1176
00:59:57,176 --> 00:59:57,760
Are you being.
1177
00:59:57,760 --> 01:00:00,638
Are you currently being sued
by the Federal Trade Commission?
1178
01:00:00,638 --> 01:00:03,516
Senator, I cannot talk about
any potential lawsuits.
1179
01:00:03,516 --> 01:00:06,978
Are you actually being sued
by the Federal Trade Commission?
1180
01:00:07,061 --> 01:00:08,521
Senator, I think I've given you my answer.
1181
01:00:08,521 --> 01:00:10,481
I cannot talk about no.
1182
01:00:10,481 --> 01:00:13,151
Ms. Yaccarino's company is being sued,
I believe.
1183
01:00:13,151 --> 01:00:16,112
Mr. Zuckerberg's company is being sued,
I believe.
1184
01:00:16,112 --> 01:00:18,823
Yet TikTok,
the agent of the Chinese Communist
1185
01:00:18,823 --> 01:00:21,993
Party, is not being sued
by the Biden administration.
1186
01:00:22,076 --> 01:00:24,329
Are you familiar with
1187
01:00:24,329 --> 01:00:27,957
the name Christina Canfora?
1188
01:00:28,041 --> 01:00:29,250
You may have to give me more.
1189
01:00:29,250 --> 01:00:32,629
Christina Carfora was a paid adviser
to Bytedance,
1190
01:00:32,629 --> 01:00:35,965
your communist influenced parent company.
1191
01:00:36,049 --> 01:00:41,095
She was then hired by the Biden
FTC to advise on how to sue Mr.
1192
01:00:41,095 --> 01:00:43,640
Zuckerberg's company.
1193
01:00:43,723 --> 01:00:46,934
Senator, ByteDance is a global company
and not a Chinese Communist Party company.
1194
01:00:46,976 --> 01:00:48,853
The company is owned by global investors.
1195
01:00:48,853 --> 01:00:49,896
Public reports indicate
1196
01:00:49,896 --> 01:00:53,775
that your lobbyist visited the White House
more than 40 times in 2022.
1197
01:00:53,983 --> 01:00:55,693
How many times did your company visit?
1198
01:00:55,693 --> 01:00:58,613
Did your company's lobbyist visit
the White House last year?
1199
01:00:58,613 --> 01:00:59,947
I don't know that.
1200
01:00:59,947 --> 01:01:01,991
Are you aware that the Biden campaign
1201
01:01:01,991 --> 01:01:05,495
and the Democratic National Committee
are on your platform?
1202
01:01:05,495 --> 01:01:07,372
They have Tik Tok accounts.
1203
01:01:07,372 --> 01:01:10,583
Senator, we encourage people to come on,
which, by the way, they don't.
1204
01:01:10,625 --> 01:01:12,669
They won't let their staffers
use their personal phones.
1205
01:01:12,669 --> 01:01:15,421
They give them separate phones
that they only use Tik Tok on.
1206
01:01:15,421 --> 01:01:17,507
We encourage everyone to join,
including myself, Senator.
1207
01:01:17,507 --> 01:01:20,468
So all these companies are being sued
by the FTC.
1208
01:01:20,468 --> 01:01:23,888
You're not. The FTC has a former paid
adviser to your parent,
1209
01:01:24,097 --> 01:01:25,515
talking about how they can sue Mr.
1210
01:01:25,515 --> 01:01:28,476
Zuckerberg's company,
Joe Biden's reelection campaign.
1211
01:01:28,476 --> 01:01:31,479
The Democratic National Committee
is on your platform.
1212
01:01:31,479 --> 01:01:35,149
Let me ask you,
have you or anyone else at TikTok
1213
01:01:35,233 --> 01:01:38,903
communicated with or coordinated
with the Biden administration,
1214
01:01:38,903 --> 01:01:41,406
the Biden campaign
or the Democratic National Committee
1215
01:01:41,406 --> 01:01:45,076
to influence the flow of
information on your platform?
1216
01:01:45,159 --> 01:01:46,327
We work with
1217
01:01:46,327 --> 01:01:51,290
anyone, any creators who want to use
our platform. Campaigns all use the same process.
1218
01:01:51,374 --> 01:01:52,250
So what we have,
1219
01:01:52,250 --> 01:01:56,170
we have a company that is a tool of the
Chinese Communist Party that is poisoning
1220
01:01:56,170 --> 01:01:59,924
the minds of America's children,
in some cases driving them to suicide,
1221
01:02:00,007 --> 01:02:03,136
and that at best, the Biden administration
is taking a pass on.
1222
01:02:03,219 --> 01:02:05,346
At worst, may be in collaboration with.
1223
01:02:05,346 --> 01:02:08,474
Thank you, Mr. Chew. Mr. Chew,
1224
01:02:08,516 --> 01:02:10,184
how many minors are on TikTok,
1225
01:02:10,184 --> 01:02:14,147
and how many of them have a caregiver
that uses your family tools?
1226
01:02:14,230 --> 01:02:17,734
Senator, I need to get back to you
on the specific numbers, but
1227
01:02:17,775 --> 01:02:21,904
we were one of the first platforms to give
what we call family pairing to parents.
1228
01:02:21,988 --> 01:02:25,324
You go to settings, you turn on the QR
code, your teenager's QR code and yours,
1229
01:02:25,324 --> 01:02:26,200
you scan it.
1230
01:02:26,200 --> 01:02:28,995
And what it allows you to do is
you can set screen time limits,
1231
01:02:28,995 --> 01:02:33,291
you can filter out some keywords,
you can turn on a more restricted mode.
1232
01:02:33,374 --> 01:02:35,251
And we are always talking to parents.
1233
01:02:35,251 --> 01:02:39,255
I met a group of parents and teenagers
and high school teachers last week
1234
01:02:39,338 --> 01:03:05,364
to talk about what more we can provide
in the family pairing mode.
1235
01:03:05,448 --> 01:03:06,908
Chairman Durbin,
1236
01:03:06,908 --> 01:03:09,911
Ranking Member Graham,
and members of the Committee,
1237
01:03:10,036 --> 01:03:12,371
thank you for convening this hearing
and for moving forward.
1238
01:03:12,371 --> 01:03:15,374
important legislation
to protect children online.
1239
01:03:15,541 --> 01:03:19,462
I'm Evan Spiegel,
the co-founder and CEO of Snap.
1240
01:03:19,545 --> 01:03:23,174
We created Snapchat, an online service
that is used by more than 800 million
1241
01:03:23,174 --> 01:03:27,053
people worldwide to communicate
with their friends and family.
1242
01:03:27,136 --> 01:03:28,429
I know that many of you have been working
1243
01:03:28,429 --> 01:03:31,849
to protect children online
since before Snapchat was created,
1244
01:03:31,933 --> 01:03:34,685
and we are grateful for your long term
dedication to this cause
1245
01:03:34,685 --> 01:03:38,648
and your willingness to work together
to help keep our community safe.
1246
01:03:38,731 --> 01:03:41,776
I want to acknowledge the survivors
of online harms and the families
1247
01:03:41,776 --> 01:03:45,530
who are here today
who have suffered the loss of a loved one.
1248
01:03:45,613 --> 01:03:48,032
Words cannot begin
to express the profound sorrow
1249
01:03:48,032 --> 01:03:49,283
I feel that a service
1250
01:03:49,283 --> 01:03:54,497
we designed to bring people happiness
and joy has been abused to cause harm.
1251
01:03:54,580 --> 01:03:56,165
I want to be clear that we understand
1252
01:03:56,165 --> 01:04:00,169
our responsibility
to keep our community safe.
1253
01:04:00,253 --> 01:04:03,381
I also want to recognize the many families
who have worked to raise awareness
1254
01:04:03,381 --> 01:04:07,301
on these issues, pushed for change,
and collaborated with lawmakers
1255
01:04:07,301 --> 01:04:12,098
on important legislation like, the Cooper
Davis Act, which can help save lives.
1256
01:04:12,181 --> 01:04:14,559
I started building Snapchat
with my co-founder, Bobby Murphy,
1257
01:04:14,559 --> 01:04:16,143
when I was 20 years old.
1258
01:04:16,143 --> 01:04:18,396
We designed Snapchat
to solve some of the problems
1259
01:04:18,396 --> 01:04:21,190
that we experienced online
when we were teenagers.
1260
01:04:21,190 --> 01:04:24,569
We didn't have an alternative
to social media that meant pictures shared
1261
01:04:24,569 --> 01:04:28,573
online were permanent public and subject
to popularity metrics.
1262
01:04:28,656 --> 01:04:30,575
It didn't feel very good.
1263
01:04:30,575 --> 01:04:32,785
We built Snapchat differently
because we wanted a new way
1264
01:04:32,785 --> 01:04:33,911
to communicate with our friends.
1265
01:04:33,911 --> 01:04:36,622
That was fast, fun and private.
1266
01:04:36,622 --> 01:04:38,165
A picture is worth a thousand words,
1267
01:04:38,165 --> 01:04:41,627
so people communicate
with images and videos on Snapchat.
1268
01:04:41,711 --> 01:04:45,590
We don't have public likes or comments
when you share your story with friends.
1269
01:04:45,673 --> 01:04:49,385
Snapchat is private by default, meaning
that people need to opt in to add friends
1270
01:04:49,635 --> 01:04:52,013
and choose who can contact them.
1271
01:04:52,096 --> 01:04:54,098
When we built
Snapchat, we chose to have the images
1272
01:04:54,098 --> 01:04:57,101
and videos sent through our service
delete by default.
1273
01:04:57,226 --> 01:05:00,396
Like prior generations
1274
01:05:00,396 --> 01:05:03,649
who've enjoyed the privacy afforded
by phone calls, which aren't recorded,
1275
01:05:03,733 --> 01:05:06,485
our generation has benefited
from the ability to share moments
1276
01:05:06,485 --> 01:05:07,361
through Snapchat
1277
01:05:07,361 --> 01:05:11,741
that may not be picture perfect, but
instead convey emotion without permanence.
1278
01:05:11,824 --> 01:05:14,076
Even though Snapchat messages
are deleted by default,
1279
01:05:14,076 --> 01:05:18,581
we let everyone know that images
and videos can be saved by the recipient.
1280
01:05:18,664 --> 01:05:21,500
When we take action on illegal
or potentially harmful content,
1281
01:05:21,500 --> 01:05:24,629
we also retain the evidence
for an extended period which allows us
1282
01:05:24,629 --> 01:05:28,341
to support law enforcement
and hold criminals accountable
1283
01:05:28,424 --> 01:05:30,801
to help prevent
the spread of harmful content on Snapchat.
1284
01:05:30,801 --> 01:05:34,513
We approve the content that is recommended
on our service using a combination
1285
01:05:34,513 --> 01:05:37,516
of automated processes and human review.
1286
01:05:37,600 --> 01:05:41,520
We apply our content rules consistently
and fairly across all accounts.
1287
01:05:41,604 --> 01:05:45,107
We run samples of our enforcement actions
through quality assurance to verify
1288
01:05:45,191 --> 01:05:46,943
that we are getting it right.
1289
01:05:46,943 --> 01:05:50,112
We also proactively scan for known child
sexual abuse, material,
1290
01:05:50,112 --> 01:05:53,115
drug related content,
and other types of harmful content.
1291
01:05:53,199 --> 01:05:56,869
We remove that content, deactivate
and device-block accounts,
1292
01:05:56,953 --> 01:06:00,081
preserve the evidence for law enforcement,
and report certain content
1293
01:06:00,081 --> 01:06:02,917
to the relevant authorities
for further action.
1294
01:06:03,000 --> 01:06:05,336
Last year, we made 690,000 reports to
1295
01:06:05,336 --> 01:06:07,421
the National Center
for Missing and Exploited Children,
1296
01:06:07,421 --> 01:06:09,674
leading to more than 1000 arrests.
1297
01:06:09,674 --> 01:06:12,176
We also removed 2.2 million pieces of drug
1298
01:06:12,176 --> 01:06:16,639
related content
and blocked 705,000 associated accounts.
1299
01:06:16,722 --> 01:06:20,309
Even with our strict privacy
settings, content moderation efforts,
1300
01:06:20,393 --> 01:06:23,813
proactive detection, and law enforcement
collaboration.
1301
01:06:23,896 --> 01:06:27,650
Bad things can still happen
when people use online services.
1302
01:06:27,733 --> 01:06:29,735
That's why we believe
that people under the age of
1303
01:06:29,735 --> 01:06:33,155
13 are not ready
to communicate on Snapchat.
1304
01:06:33,239 --> 01:06:35,616
We strongly encourage parents
to use the device level
1305
01:06:35,616 --> 01:06:37,952
parental controls on iPhone and Android.
1306
01:06:37,952 --> 01:06:39,286
We use them in our own household.
1307
01:06:39,286 --> 01:06:43,666
And my wife approves every app
that our 13-year-old downloads.
1308
01:06:43,749 --> 01:06:45,668
For parents
who want more visibility and control,
1309
01:06:45,668 --> 01:06:47,837
we built Family Center in Snapchat
where you can view
1310
01:06:47,837 --> 01:06:53,009
who your teen is talking to, review
privacy settings and set content limits.
1311
01:06:53,092 --> 01:06:55,761
We have worked for years with members
of the committee on Legislation
1312
01:06:55,761 --> 01:06:57,388
like the Kids Online Safety Act
1313
01:06:57,388 --> 01:07:00,224
and the Cooper Davis Act,
which we are proud to support.
1314
01:07:00,224 --> 01:07:01,726
I want to encourage broader industry
1315
01:07:01,726 --> 01:07:04,687
support for legislation
protecting children online.
1316
01:07:04,687 --> 01:07:10,401
No legislation is perfect, but some rules
of the road are better than none.
1317
01:07:10,484 --> 01:07:12,737
Much of the work
that we do to protect people that use
1318
01:07:12,737 --> 01:07:15,364
our service would not be possible
without the support of our partners
1319
01:07:15,364 --> 01:07:19,702
across the industry, government,
nonprofit organizations, NGOs,
1320
01:07:19,785 --> 01:07:22,371
and in particular law enforcement
and the first responders
1321
01:07:22,371 --> 01:07:25,374
who have committed their lives to helping
keep people safe.
1322
01:07:25,416 --> 01:07:28,419
I am profoundly grateful for the
extraordinary efforts across our country
1323
01:07:28,419 --> 01:07:32,256
and around the world to prevent criminals
from using online services to perpetrate
1324
01:07:32,256 --> 01:07:33,758
their crimes.
1325
01:07:33,841 --> 01:07:35,551
I feel an overwhelming sense of gratitude
1326
01:07:35,551 --> 01:07:38,554
for the opportunities that this country
has afforded me and my family.
1327
01:07:38,763 --> 01:07:42,141
I feel a deep obligation to give back
and to make a positive difference.
1328
01:07:42,141 --> 01:07:43,601
And I'm grateful to be here today
1329
01:07:43,601 --> 01:07:46,937
as part of this
vitally important democratic process.
1330
01:07:47,021 --> 01:07:48,314
Members of the committee,
1331
01:07:48,314 --> 01:07:52,193
I give you my commitment that we will be
part of the solution for online safety.
1332
01:07:52,276 --> 01:07:56,614
We will be honest about our shortcomings
and we will work continuously to improve.
1333
01:07:56,697 --> 01:07:57,156
Thank you.
1334
01:07:57,156 --> 01:07:59,867
And I look forward
to answering your questions.
1335
01:07:59,867 --> 01:08:00,993
It's never been a secret
1336
01:08:00,993 --> 01:08:04,205
that Snapchat is used to send
1337
01:08:04,205 --> 01:08:08,084
sexually explicit images. In 2013,
1338
01:08:08,167 --> 01:08:11,212
earlier in your company's history,
you admitted this in an interview.
1339
01:08:11,212 --> 01:08:15,007
Do you remember that interview?
1340
01:08:15,091 --> 01:08:17,093
Senator,
I don't recall the specific interview.
1341
01:08:17,093 --> 01:08:19,762
You said that when you were first trying
to get people on the app, you would,
1342
01:08:19,762 --> 01:08:23,265
quote, go up to the people and be like,
Hey, you should try this application.
1343
01:08:23,265 --> 01:08:25,017
You can send disappearing photos.
1344
01:08:25,017 --> 01:08:27,895
And they would say, for sexting,
1345
01:08:27,978 --> 01:08:30,940
Do you remember that interview?
1346
01:08:30,940 --> 01:08:33,567
Senator, when we first created
the application, it was actually called
1347
01:08:33,567 --> 01:08:34,068
Picaboo.
1348
01:08:34,068 --> 01:08:36,821
And the idea was around disappearing
images.
1349
01:08:36,821 --> 01:08:38,906
The feedback
we received from people using the app
1350
01:08:38,906 --> 01:08:40,783
is that they were actually
using it to communicate.
1351
01:08:40,783 --> 01:08:42,952
So we changed the name of the application
to Snapchat
1352
01:08:42,952 --> 01:08:45,955
and we found that people were using it
to talk visually.
1353
01:08:46,080 --> 01:08:49,875
As early as 2017, law enforcement
identified Snapchat
1354
01:08:49,959 --> 01:08:54,338
as the pedophile's
go-to sexual exploitation tool. The case
1355
01:08:54,338 --> 01:08:57,883
of a 12 year old girl identified in court
only as L.W.
1356
01:08:57,883 --> 01:08:59,510
shows the danger.
1357
01:08:59,510 --> 01:09:03,180
Over two and a half years, a predator
sexually groomed her,
1358
01:09:03,264 --> 01:09:07,601
sending her sexually explicit
images and videos over Snapchat.
1359
01:09:07,685 --> 01:09:11,021
The man admitted that
he only used Snapchat with LW
1360
01:09:11,105 --> 01:09:16,277
and not any other platforms because he,
quote, knew the chats would go away.
1361
01:09:16,360 --> 01:09:19,738
Did you or anyone else at Snap
really fail
1362
01:09:19,738 --> 01:09:25,119
to see that the platform
was the perfect tool for sexual predators?
1363
01:09:25,202 --> 01:09:27,997
Sir, that behavior is disgusting
and reprehensible.
1364
01:09:27,997 --> 01:09:31,458
We provide in-app reporting tools
so that people who are being harassed
1365
01:09:31,542 --> 01:09:34,086
or who have been shared inappropriate
sexual content
1366
01:09:34,086 --> 01:09:37,089
can report it in the case of harassment
or sexual content.
1367
01:09:37,089 --> 01:09:38,591
We typically respond to those reports
1368
01:09:38,591 --> 01:09:41,051
within 15 minutes
so that we can provide help.
1369
01:09:41,051 --> 01:09:43,721
When L.W., the victim, sued Snapchat,
1370
01:09:43,721 --> 01:09:48,350
her case was dismissed under Section
230 of the Communications Decency Act.
1371
01:09:48,434 --> 01:09:53,147
Do you have any doubt that, had Snap faced
the prospect of civil liability
1372
01:09:53,230 --> 01:09:55,816
for facilitating sexual exploitation,
1373
01:09:55,816 --> 01:10:00,863
the company would have implemented
even better safeguards?
1374
01:10:00,946 --> 01:10:04,909
Senator, We already work extensively to
proactively detect this type of behavior.
1375
01:10:04,909 --> 01:10:08,621
We make it very difficult for predators
to find teens on Snapchat.
1376
01:10:08,621 --> 01:10:12,291
There are no public friends lists,
no public profile photos.
1377
01:10:12,374 --> 01:10:14,960
When we recommend friends
1378
01:10:14,960 --> 01:10:18,672
for teens, we make sure that they have
several mutual friends in common.
1379
01:10:18,672 --> 01:10:20,382
Before making that recommendation.
1380
01:10:20,382 --> 01:10:21,967
And we believe those safeguards
are important
1381
01:10:21,967 --> 01:10:25,095
to preventing predators
from misusing our platform.
1382
01:10:25,179 --> 01:10:26,680
Mr. SPIEGEL.
1383
01:10:26,680 --> 01:10:28,557
I know we talked ahead of time.
1384
01:10:28,557 --> 01:10:33,312
I do appreciate your company's support
for the Cooper Davis Act,
1385
01:10:33,395 --> 01:10:37,316
which will finally, it's a bill
with Senators Shaheen and Marshall,
1386
01:10:37,358 --> 01:10:41,070
which will allow law enforcement
to do more when it comes to fentanyl.
1387
01:10:41,111 --> 01:10:43,364
I think you know what a problem this is.
1388
01:10:43,364 --> 01:10:45,741
Devin Norring, a teenager from Hastings.
1389
01:10:45,741 --> 01:10:50,079
I mentioned his mom is here.
He was suffering dental pain and migraines.
1390
01:10:50,079 --> 01:10:52,998
So he bought
what he thought was a Percocet over Snap.
1391
01:10:52,998 --> 01:10:56,001
But instead he bought a counterfeit drug
laced with a
1392
01:10:56,126 --> 01:10:58,087
lethal dose of fentanyl.
1393
01:10:58,087 --> 01:11:02,258
As his mom, who's here with us
today, said, all of the hopes and dreams
1394
01:11:02,258 --> 01:11:06,720
we as parents had for
Devon were erased in the blink of an eye.
1395
01:11:06,845 --> 01:11:10,224
And no mom should have to bury their kid.
1396
01:11:10,266 --> 01:11:14,770
Talk about why you
support the Cooper Davis Act.
1397
01:11:14,853 --> 01:11:15,437
Senator, thank you.
1398
01:11:15,437 --> 01:11:19,108
We strongly support the Cooper Davis Act,
and we believe it will help DEA
1399
01:11:19,191 --> 01:11:22,945
go after the cartels and get more dealers
off the streets to save more lives.
1400
01:11:23,028 --> 01:11:25,781
Mr. SPIEGEL, how many minors use Snapchat?
1401
01:11:25,781 --> 01:11:27,741
And of those minors,
how many have caretakers
1402
01:11:27,741 --> 01:11:30,286
that are registered
with your family center?
1403
01:11:30,286 --> 01:11:32,413
Senator, I believe approximately...
1404
01:11:32,413 --> 01:11:34,873
In the United States,
there are approximately 20 million
1405
01:11:34,873 --> 01:11:38,711
teenage users of Snapchat,
I believe approximately 200,000 parents
1406
01:11:38,711 --> 01:11:41,964
use Family Center
and about 400,000 teens have linked
1407
01:11:42,047 --> 01:11:44,133
their accounts to their parents
using Family Center.
1408
01:11:44,133 --> 01:11:46,635
So 200,000, 400,000
sounds like a big number,
1409
01:11:46,635 --> 01:11:50,264
but small as a percentage of the minors
using Snapchat.
1410
01:11:50,347 --> 01:11:51,849
What are you doing to ensure
that young people
1411
01:11:51,849 --> 01:11:54,476
or guardians are aware of the tools
you offer?
1412
01:11:54,476 --> 01:11:58,314
So we create a banner for Family Center
on users' profiles,
1413
01:11:58,314 --> 01:12:01,066
so the accounts we believe
may be at the age that they could be
1414
01:12:01,066 --> 01:12:05,070
parents can see the entry point
into Family Center easily.
1415
01:12:05,154 --> 01:12:10,993
Mr. SPIEGEL, there are a number of parents
whose children have been able
1416
01:12:10,993 --> 01:12:12,119
to access
1417
01:12:12,119 --> 01:12:14,455
illegal drugs on your platform.
1418
01:12:14,455 --> 01:12:19,043
What do you say to those parents?
1419
01:12:19,126 --> 01:12:22,296
Senator,
we are devastated that we cannot...
1420
01:12:22,296 --> 01:12:23,839
Tell the parents.
1421
01:12:23,839 --> 01:12:25,799
What do you say to those parents? Mr.
1422
01:12:25,799 --> 01:12:29,803
SPIEGEL I'm sorry that we have
not been able to prevent these tragedies.
1423
01:12:29,803 --> 01:12:34,516
We work very hard to block all search
terms related to drugs from our platform.
1424
01:12:34,516 --> 01:12:37,811
We proactively look for and detect drug
related content.
1425
01:12:37,811 --> 01:12:40,814
We remove it from our platform,
preserve it as evidence,
1426
01:12:40,898 --> 01:12:43,901
and then we refer it to law enforcement
for action.
1427
01:12:44,109 --> 01:12:47,946
We've worked together with nonprofits
and with families on education campaigns
1428
01:12:47,946 --> 01:12:50,741
because the scale of the fentanyl
epidemic is extraordinary.
1429
01:12:50,741 --> 01:12:53,035
Over 100,000
people lost their lives last year.
1430
01:12:53,035 --> 01:12:56,622
And we believe people need to know
that one pill can kill. That campaign
1431
01:12:56,622 --> 01:12:56,789
reached
1432
01:12:56,789 --> 01:13:22,398
more than 200... and was viewed
more than 260 million times on Snapchat.
1433
01:13:22,481 --> 01:13:24,108
My name is Jason Citron
1434
01:13:24,108 --> 01:13:28,404
and I am the co-founder
and CEO of Discord.
1435
01:13:28,445 --> 01:13:30,114
We are an American company
1436
01:13:30,114 --> 01:13:34,910
with about 800 employees
living and working in 33 states.
1437
01:13:34,993 --> 01:13:37,204
Today, Discord has grown to more
1438
01:13:37,204 --> 01:13:40,999
than 150 million monthly active users.
1439
01:13:41,083 --> 01:13:43,460
Discord is a communications platform
1440
01:13:43,460 --> 01:13:47,965
where friends hang out and talk online
about shared interests
1441
01:13:48,048 --> 01:13:53,137
from fantasy sports
to writing music to video games.
1442
01:13:53,220 --> 01:13:57,224
I've been playing video games
since I was five years old, and as a kid
1443
01:13:57,307 --> 01:14:00,936
it's how I had fun and found friendship.
1444
01:14:01,019 --> 01:14:03,105
Many of my fondest memories are
1445
01:14:03,105 --> 01:14:06,066
of playing video games with friends.
1446
01:14:06,191 --> 01:14:09,236
We built Discord
so that anyone could build friendships.
1447
01:14:09,361 --> 01:14:15,325
Playing video games from Minecraft to Will
and everything in between.
1448
01:14:15,409 --> 01:14:17,202
Games have always brought us together.
1449
01:14:17,202 --> 01:14:20,956
And Discord makes that happen. Today
1450
01:14:21,039 --> 01:14:24,209
Discord is one of the many services
that have revolutionized
1451
01:14:24,251 --> 01:14:28,255
how we communicate with each other
in the different moments of our lives.
1452
01:14:28,297 --> 01:14:32,968
iMessage, Zoom, Gmail, and on and on.
1453
01:14:33,051 --> 01:14:35,971
They enrich our lives, create communities,
1454
01:14:35,971 --> 01:14:41,351
accelerate commerce,
health care and education.
1455
01:14:41,435 --> 01:14:45,439
Just like with all technology and tools,
there are people who exploit
1456
01:14:45,439 --> 01:14:50,360
and abuse our platforms
for immoral and illegal purposes.
1457
01:14:50,444 --> 01:14:54,531
All of us here on the panel today
and throughout the tech industry
1458
01:14:54,615 --> 01:14:58,744
have a solemn and urgent responsibility
to ensure that
1459
01:14:58,744 --> 01:15:00,996
everyone who uses our platforms
1460
01:15:00,996 --> 01:15:05,792
is protected from these criminals,
both online and off.
1461
01:15:05,876 --> 01:15:08,754
Discord has a special responsibility
to do so
1462
01:15:08,754 --> 01:15:11,757
because a lot of our users
are young people.
1463
01:15:11,757 --> 01:15:14,259
More than 60% of our active users
1464
01:15:14,259 --> 01:15:17,554
are between the ages of 13 and 24.
1465
01:15:17,638 --> 01:15:20,641
It's why safety is built into everything
we do.
1466
01:15:20,891 --> 01:15:24,436
It's essential to
our mission and our business.
1467
01:15:24,520 --> 01:15:28,023
And most of all, this is deeply personal.
1468
01:15:28,106 --> 01:15:30,484
I'm a dad with two kids.
1469
01:15:30,484 --> 01:15:33,487
I want Discord to be a product
that they use
1470
01:15:33,654 --> 01:15:35,864
and love, and I want them to be safe.
1471
01:15:35,864 --> 01:15:37,699
On Discord.
1472
01:15:37,699 --> 01:15:39,660
I want them to be proud of me for helping
1473
01:15:39,660 --> 01:15:43,247
to bring this product to the world.
1474
01:15:43,330 --> 01:15:45,916
That's why I am pleased to be here today
1475
01:15:45,916 --> 01:15:50,254
to discuss the important
topic of the online safety of minors.
1476
01:15:50,337 --> 01:15:54,341
My written testimony provides
a comprehensive overview of our safety
1477
01:15:54,341 --> 01:15:55,634
programs.
1478
01:15:55,634 --> 01:15:58,220
Here are a few examples of how we protect
1479
01:15:58,220 --> 01:16:01,223
and empower young people.
1480
01:16:01,473 --> 01:16:05,269
First, we've put our money into safety.
1481
01:16:05,352 --> 01:16:09,523
The tech sector has a reputation for larger
companies buying smaller ones
1482
01:16:09,565 --> 01:16:13,819
to increase user numbers
and boost financial results.
1483
01:16:13,902 --> 01:16:16,238
But the largest acquisition
we've ever made
1484
01:16:16,238 --> 01:16:19,241
at Discord was a company called Sentropy.
1485
01:16:19,366 --> 01:16:23,245
It didn't help us expand our market share
or improve our bottom line.
1486
01:16:23,287 --> 01:16:27,624
In fact,
because it uses AI to help us identify,
1487
01:16:27,708 --> 01:16:31,086
ban and report criminals and bad behavior,
1488
01:16:31,169 --> 01:16:36,550
it has actually lowered our user count
by getting rid of bad actors.
1489
01:16:36,633 --> 01:16:37,593
Second,
1490
01:16:37,593 --> 01:16:40,596
you've heard of end-to-end encryption,
which blocks anyone,
1491
01:16:40,721 --> 01:16:45,225
including the platform itself,
from seeing users' communications.
1492
01:16:45,309 --> 01:16:48,312
It's a feature on dozens of platforms,
1493
01:16:48,520 --> 01:16:50,897
but not on discord.
1494
01:16:50,897 --> 01:16:52,733
That's a choice we've made.
1495
01:16:52,733 --> 01:16:55,986
We don't believe
we can fulfill our safety obligations
1496
01:16:55,986 --> 01:16:58,989
if the text messages of teens
are fully encrypted
1497
01:16:59,197 --> 01:17:03,952
because encryption would block our ability
to investigate a serious situation
1498
01:17:04,036 --> 01:17:08,123
and when appropriate, report
to law enforcement.
1499
01:17:08,206 --> 01:17:11,126
Third, we have a zero tolerance policy
1500
01:17:11,126 --> 01:17:14,588
on child sexual abuse material, or CSAM.
1501
01:17:14,671 --> 01:17:17,883
We scan images uploaded to discord
to detect
1502
01:17:17,883 --> 01:17:21,637
and block
the sharing of this abhorrent material.
1503
01:17:21,720 --> 01:17:26,224
We've also built an innovative tool,
Teen Safety Assist, that blocks explicit
1504
01:17:26,224 --> 01:17:31,563
images and helps young people
easily report unwelcome conversations.
1505
01:17:31,647 --> 01:17:34,816
We've also developed a new semantic
hashing technology
1506
01:17:34,900 --> 01:17:37,903
for detecting novel
forms of CSAM, called CLIP,
1507
01:17:38,111 --> 01:17:43,158
and we're sharing this technology with other
platforms through the Tech Coalition.
1508
01:17:43,241 --> 01:17:46,286
Finally,
we recognize that improving online
1509
01:17:46,286 --> 01:17:50,040
safety requires
all of us to work together.
1510
01:17:50,123 --> 01:17:54,461
So we partner with nonprofits,
law enforcement and our tech colleagues
1511
01:17:54,461 --> 01:17:58,090
to stay ahead of the curve
in protecting young people online.
1512
01:17:58,173 --> 01:18:02,594
We want to be the platform that empowers
our users to have better online
1513
01:18:02,594 --> 01:18:05,847
experiences,
to build true connections, genuine
1514
01:18:05,847 --> 01:18:08,850
friendships, and to have fun.
1515
01:18:08,892 --> 01:18:11,895
Senators,
I sincerely hope today is the beginning
1516
01:18:11,895 --> 01:18:15,857
of an ongoing dialog that results
in real improvements in online safety.
1517
01:18:15,941 --> 01:18:17,526
I look forward to your questions
1518
01:18:17,526 --> 01:18:19,653
and to helping the committee
learn more about Discord.
1519
01:18:19,653 --> 01:18:23,281
What it says is there is civil liability
1520
01:18:23,281 --> 01:18:28,662
if you intentionally or knowingly host
or store
1521
01:18:28,745 --> 01:18:31,998
child sexual abuse materials
or make child sex
1522
01:18:31,998 --> 01:18:36,211
abuse materials available.
1523
01:18:36,294 --> 01:18:39,589
Secondly,
intentionally or knowingly promote or aid
1524
01:18:39,589 --> 01:18:44,553
and abet a violation of child
sexual exploitation laws.
1525
01:18:44,636 --> 01:18:45,971
Is there anyone here who believes
1526
01:18:45,971 --> 01:18:49,975
you should not be held civilly liable
for that type of conduct?
1527
01:18:50,016 --> 01:18:54,938
Mr. Citron.
1528
01:18:55,021 --> 01:18:58,024
Good morning, Chair.
1529
01:18:58,233 --> 01:18:59,693
You know, we very much
1530
01:18:59,693 --> 01:19:05,323
believe that this content is disgusting
and that
1531
01:19:05,407 --> 01:19:09,202
there are many things about the STOP CSAM
bill that I think are very encouraging.
1532
01:19:09,286 --> 01:19:12,080
And we very much support
adding more resources
1533
01:19:12,080 --> 01:19:16,376
for the CyberTipline and modernizing
that, along with
1534
01:19:16,418 --> 01:19:18,920
giving more resources to fact check.
1535
01:19:18,920 --> 01:19:22,883
And we're very
1536
01:19:22,966 --> 01:19:24,217
open to having conversations with
1537
01:19:24,217 --> 01:19:27,220
you and your team to talk
through the details of the bill some more.
1538
01:19:27,220 --> 01:19:31,475
I sure would like to do that,
because if you intentionally or knowingly
1539
01:19:31,558 --> 01:19:35,729
host or store CSAM, I think you ought
to at least be civilly liable.
1540
01:19:35,812 --> 01:19:38,648
I can't imagine anyone who would disagree
with that.
1541
01:19:38,648 --> 01:19:41,234
It's disgusting content. Certainly is.
1542
01:19:41,234 --> 01:19:43,904
That's why we need
you supporting this legislation.
1543
01:19:43,904 --> 01:19:46,698
Mr. Citron,
according to Discord's website,
1544
01:19:46,698 --> 01:19:49,451
it takes a, quote, proactive
and automated approach
1545
01:19:49,451 --> 01:19:53,914
to safety only on servers
with more than 200 members.
1546
01:19:53,997 --> 01:19:55,957
Smaller servers rely on
1547
01:19:55,957 --> 01:20:00,754
server owners and community moderators
to define and enforce behavior.
1548
01:20:00,837 --> 01:20:05,258
So how do you defend an approach to safety
that relies on groups of fewer than 200
1549
01:20:05,258 --> 01:20:09,387
sexual predators to report themselves
for things like grooming,
1550
01:20:09,471 --> 01:20:13,266
trading CSAM, or sextortion?
1551
01:20:13,350 --> 01:20:16,978
Our goal is to
get all of that content
1552
01:20:17,062 --> 01:20:21,817
off of our platform and ideally prevent it
from showing up in the first place
1553
01:20:21,817 --> 01:20:25,654
or from people engaging
in these kind of horrific activities.
1554
01:20:25,737 --> 01:20:28,949
We deploy a wide array of techniques
that work
1555
01:20:28,949 --> 01:20:32,118
across every surface on Discord.
1556
01:20:32,202 --> 01:20:35,288
And I mentioned we recently launched
something called Teen Safety Assist,
1557
01:20:35,288 --> 01:20:38,291
which works everywhere,
and it's on by default for teen users.
1558
01:20:38,333 --> 01:20:41,503
That kind of acts like a buddy
that lets them know
1559
01:20:41,503 --> 01:20:45,131
if they're in a situation or talking
with someone that may be inappropriate
1560
01:20:45,131 --> 01:20:49,177
so they can report that to us
and block that user.
1561
01:20:49,261 --> 01:20:53,765
If that were working,
we wouldn't be here today.
1562
01:20:53,849 --> 01:20:57,978
So this is an ongoing challenge
for all of us.
1563
01:20:58,061 --> 01:20:59,563
That is why we're here today.
1564
01:20:59,563 --> 01:21:03,108
But we do have
50% of our company focused
1565
01:21:03,275 --> 01:21:06,319
on trust and safety, of which
this is one of our top issues.
1566
01:21:06,319 --> 01:21:09,364
That's more people than we have working on
marketing and promoting the company.
1567
01:21:09,364 --> 01:21:11,324
So we take these issues very seriously,
1568
01:21:11,324 --> 01:21:15,078
but we know it's an ongoing challenge
and I look forward to working with you
1569
01:21:15,078 --> 01:21:18,290
and collaborating with our tech peers
and the nonprofits
1570
01:21:18,373 --> 01:21:19,916
to improve our approach.
1571
01:21:19,916 --> 01:21:22,168
Mr. Citron, you said
1572
01:21:22,168 --> 01:21:25,171
we need to start a discussion.
1573
01:21:25,422 --> 01:21:26,172
To be honest with you,
1574
01:21:26,172 --> 01:21:30,135
we've been having this discussion
for a very long time.
1575
01:21:30,218 --> 01:21:32,804
We need to get a result, not a discussion.
1576
01:21:32,804 --> 01:21:35,056
Do you agree with that?
1577
01:21:35,056 --> 01:21:36,683
Ranking Member, I agree.
1578
01:21:36,683 --> 01:21:40,270
This is an issue
that we've also been very focused on.
1579
01:21:40,478 --> 01:21:42,939
We started our company in 2013,
but this is the first.
1580
01:21:42,939 --> 01:21:47,903
Are you familiar with the EARN IT Act,
authored by myself and Senator Blumenthal?
1581
01:21:47,986 --> 01:21:49,404
A little bit, yes.
1582
01:21:49,404 --> 01:21:52,449
Okay. Do you support that?
1583
01:21:52,532 --> 01:21:56,119
We'd like... Yes or no?
1584
01:21:56,202 --> 01:21:58,914
We're not prepared to support it today,
but we believe in section.
1585
01:21:58,914 --> 01:22:02,542
Support the CSAM Act? The Stop CSAM Act.
1586
01:22:02,626 --> 01:22:04,169
We are not prepared to support it.
1587
01:22:04,169 --> 01:22:06,796
Would you support the SHIELD Act?
1588
01:22:06,796 --> 01:22:09,966
We believe that the CyberTipline...
Support it,
1589
01:22:09,966 --> 01:22:11,343
Yes or no?
1590
01:22:11,343 --> 01:22:14,554
We believe that the CyberTipline and...
1591
01:22:14,596 --> 01:22:19,726
I'll take that to be a no. The Project Safe
Childhood Act, do you support it?
1592
01:22:19,809 --> 01:22:21,686
We believe that.
1593
01:22:21,686 --> 01:22:25,607
I'll take that as a no. The REPORT
Act, do you support it?
1594
01:22:25,690 --> 01:22:28,568
Ranking Member Graham, we very much
look forward to having conversations
1595
01:22:28,568 --> 01:22:29,569
with you and your team.
1596
01:22:29,569 --> 01:22:32,572
I look forward to passing
a bill that will solve the problem.
1597
01:22:32,697 --> 01:22:33,907
Do you support removing
1598
01:22:33,907 --> 01:22:37,786
Section 230 liability protections
for social media companies?
1599
01:22:37,869 --> 01:22:42,082
I believe that Section 230
needs to be updated.
1600
01:22:42,082 --> 01:22:43,124
It's a very old law.
1601
01:22:43,124 --> 01:22:48,380
Do you support repealing it so people can
sue if they believe they're harmed?
1602
01:22:48,463 --> 01:22:50,882
I think that Section 230 as written,
1603
01:22:50,882 --> 01:22:53,802
while it has many downsides,
has enabled innovation on the Internet,
1604
01:22:53,802 --> 01:22:56,680
which I think is largely...
Thank you very much.
1605
01:22:56,680 --> 01:22:58,807
So I'm going to start with that
with you, Mr.
1606
01:22:58,807 --> 01:23:00,225
Citron.
1607
01:23:00,225 --> 01:23:01,518
My bill with Senator Cornyn,
1608
01:23:01,518 --> 01:23:05,146
the SHIELD Act, includes a threat provision
that would help provide protection
1609
01:23:05,271 --> 01:23:09,651
and accountability for those
that are threatened by these predators.
1610
01:23:09,818 --> 01:23:13,113
Young kids get a picture, send it in,
1611
01:23:13,238 --> 01:23:17,158
think they got a new girlfriend
or a new boyfriend, ruins their life
1612
01:23:17,158 --> 01:23:19,995
or they think it's going to be ruined
and they kill themselves.
1613
01:23:19,995 --> 01:23:26,042
So could you tell me
why you're not supporting the SHIELD Act?
1614
01:23:26,126 --> 01:23:28,211
Senator, we think it's very important
1615
01:23:28,211 --> 01:23:31,881
that teens have a safe experience
on our platforms.
1616
01:23:31,965 --> 01:23:37,387
I think that the portion
to strengthen law enforcement's ability
1617
01:23:37,387 --> 01:23:40,849
to investigate crimes against children
and hold bad actors accountable is.
1618
01:23:40,932 --> 01:23:43,518
So you're holding open
that you may support it.
1619
01:23:43,518 --> 01:23:45,812
We very much
would like to have conversations with you.
1620
01:23:45,812 --> 01:23:50,442
We're open to discussing further
and we do welcome legislation, regulation.
1621
01:23:50,525 --> 01:23:52,902
This is a very important issue
for our country
1622
01:23:52,902 --> 01:23:56,573
and we've been prioritizing safety
for change.
1623
01:23:56,656 --> 01:23:59,200
What about you, Mr. Citron, on Discord?
1624
01:23:59,200 --> 01:24:00,869
Do you have
1625
01:24:00,952 --> 01:24:02,829
to allow kids to have accounts
1626
01:24:02,829 --> 01:24:06,249
to access encrypted messaging?
1627
01:24:06,332 --> 01:24:08,668
Discord is not allowed
to be used by children
1628
01:24:08,668 --> 01:24:12,380
under the age of 13, and we do not use
end-to-end encryption for text messages.
1629
01:24:12,380 --> 01:24:15,633
You know, we believe that it's
very important to be able to respond to
1630
01:24:15,675 --> 01:24:19,304
welfare and law enforcement requests, too.
1631
01:24:19,387 --> 01:24:22,599
And we're also working
on proactively building technology.
1632
01:24:22,640 --> 01:24:24,309
We're working with a nonprofit
called Thorn
1633
01:24:24,309 --> 01:24:27,604
to build a grooming classifier
so that our Teen Safety Assist feature
1634
01:24:27,604 --> 01:24:30,565
can actually identify these conversations
if they might be happening
1635
01:24:30,565 --> 01:24:34,235
so we can intervene
and give those teens tools
1636
01:24:34,235 --> 01:24:36,279
to get out of that situation
or potentially
1637
01:24:36,279 --> 01:24:39,074
even report those conversations
and those people to law enforcement.
1638
01:24:39,074 --> 01:24:42,285
Mr. Citron, Discord allows
pornography on its site.
1639
01:24:42,285 --> 01:24:46,664
Now, reportedly, 17% of minors
who use Discord have
1640
01:24:46,664 --> 01:24:52,462
had online sexual interactions
on your platform, 17% and 10%
1641
01:24:52,545 --> 01:24:52,962
have those
1642
01:24:52,962 --> 01:24:56,966
interactions with someone
that the minor believed
1643
01:24:57,008 --> 01:25:01,012
to be an adult. Do you restrict minors
1644
01:25:01,096 --> 01:25:03,765
from accessing servers
1645
01:25:03,765 --> 01:25:08,228
that host pornographic material on them?
1646
01:25:08,311 --> 01:25:10,230
Senator, yes, we do restrict
1647
01:25:10,230 --> 01:25:13,525
minors from accessing content
that is marked for adults.
1648
01:25:13,566 --> 01:25:15,819
Discord also does not recommend
content to people.
1649
01:25:15,819 --> 01:25:16,778
Discord is a chat app.
1650
01:25:16,778 --> 01:25:20,115
We do not have a feed or an algorithm
that boosts content,
1651
01:25:20,198 --> 01:25:23,827
so we allow adults to share content
with other adults
1652
01:25:23,827 --> 01:25:27,413
in adult label spaces and we do not
allow teens to access that content.
1653
01:25:27,455 --> 01:25:28,623
Okay.
1654
01:25:28,623 --> 01:25:33,002
Let me continue with a follow up
question for Mr.
1655
01:25:33,044 --> 01:25:33,878
Citron.
1656
01:25:33,878 --> 01:25:37,841
In addition to keeping parents informed
about the nature of various Internet
1657
01:25:37,841 --> 01:25:41,803
services, there is a lot more we obviously need
to do. For today's purposes,
1658
01:25:41,886 --> 01:25:46,808
while many companies offer a broad range
of quote unquote user empowerment tools,
1659
01:25:47,058 --> 01:25:50,895
it's helpful to understand whether young
people even find these tools helpful.
1660
01:25:50,979 --> 01:25:53,523
So I appreciate you
sharing your Teen Safety
1661
01:25:53,523 --> 01:25:56,526
Assist and the tools
and how you're advertising it.
1662
01:25:56,526 --> 01:25:59,320
But have you conducted any assessments
1663
01:25:59,404 --> 01:26:01,990
of how these features are impacting
minors'
1664
01:26:01,990 --> 01:26:05,326
use of your platform?
1665
01:26:05,410 --> 01:26:09,622
Our intention is to give teens
tools and capabilities that they can use
1666
01:26:09,622 --> 01:26:13,751
to keep themselves safe, and also so
our teams can help keep teens safe.
1667
01:26:13,835 --> 01:26:17,714
We recently launched Teen Safety Assist
last year, and we do not have a study
1668
01:26:17,714 --> 01:26:41,154
off the top of my head, but we'd be happy
to follow up with you on that.
1669
01:26:41,196 --> 01:26:42,405
Chairman Durbin,
1670
01:26:42,405 --> 01:26:45,450
Ranking Member Graham, and esteemed
1671
01:26:45,575 --> 01:26:49,746
members of the Committee,
thank you for the opportunity
1672
01:26:49,746 --> 01:26:52,874
to discuss X's work to protect
1673
01:26:52,874 --> 01:26:56,586
the safety of minors online.
1674
01:26:56,669 --> 01:27:00,590
Today's hearing is titled A Crisis,
1675
01:27:00,673 --> 01:27:04,427
which calls for immediate action.
1676
01:27:04,510 --> 01:27:08,097
As a mother, this is personal,
1677
01:27:08,181 --> 01:27:11,559
and I share the sense of urgency.
1678
01:27:11,643 --> 01:27:14,646
X is an entirely new company,
1679
01:27:14,896 --> 01:27:21,236
an indispensable platform
for the world and for democracy.
1680
01:27:21,319 --> 01:27:21,903
You have
1681
01:27:21,903 --> 01:27:26,616
my personal commitment
that X will be active
1682
01:27:26,699 --> 01:27:29,702
and a part of this solution.
1683
01:27:29,869 --> 01:27:34,624
While I joined X only in June of 2023,
1684
01:27:34,707 --> 01:27:38,670
I bring a history of working together
with governments
1685
01:27:38,753 --> 01:27:42,590
advocates and NGOs to harness
1686
01:27:42,674 --> 01:27:46,761
the power of media to protect people.
1687
01:27:46,844 --> 01:27:51,224
Before I joined,
I was struck by the leadership steps
1688
01:27:51,307 --> 01:27:55,520
this new company was taking to protect
children.
1689
01:27:55,603 --> 01:27:57,772
X is not the platform
1690
01:27:57,772 --> 01:28:01,276
of choice for children and teens.
1691
01:28:01,359 --> 01:28:07,115
We do not have a line of business
dedicated to children,
1692
01:28:07,198 --> 01:28:09,742
children under the age of 13
1693
01:28:09,742 --> 01:28:12,662
are not allowed to open an account.
1694
01:28:12,662 --> 01:28:16,624
Less than 1% of the U.S.
1695
01:28:16,666 --> 01:28:22,297
users on X
are between the ages of 13 and 17,
1696
01:28:22,380 --> 01:28:25,091
and those users are automatically
1697
01:28:25,091 --> 01:28:28,094
set to a private default setting
1698
01:28:28,136 --> 01:28:34,600
and cannot accept a message
from anyone they do not approve.
1699
01:28:34,684 --> 01:28:36,936
In the last 14 months,
1700
01:28:36,936 --> 01:28:43,192
X has made material changes
to protect minors.
1701
01:28:43,276 --> 01:28:45,403
Our policy is clear.
1702
01:28:45,403 --> 01:28:47,572
X has zero tolerance towards
1703
01:28:47,572 --> 01:28:50,992
any material that features or promotes
1704
01:28:51,075 --> 01:28:54,620
child sexual exploitation.
1705
01:28:54,704 --> 01:28:57,165
My written testimony details
1706
01:28:57,165 --> 01:29:00,835
X's extensive policies on content
1707
01:29:00,835 --> 01:29:05,256
or actions that are prohibited and include
1708
01:29:05,340 --> 01:29:07,842
grooming, blackmail
1709
01:29:07,842 --> 01:29:11,929
and identifying alleged victims of CSEA.
1710
01:29:12,013 --> 01:29:16,434
We've also strengthened our enforcement
with more tools
1711
01:29:16,517 --> 01:29:21,356
and technology to prevent those bad actors
from distributing,
1712
01:29:21,439 --> 01:29:25,860
searching for
and engaging with CSE content.
1713
01:29:25,943 --> 01:29:30,531
If CSE content is posted on X,
we remove it.
1714
01:29:30,615 --> 01:29:34,577
And now we also remove any account
1715
01:29:34,660 --> 01:29:39,123
that engages with CSC content,
whether it is real
1716
01:29:39,207 --> 01:29:41,918
or computer generated.
1717
01:29:41,918 --> 01:29:44,379
Last year,
1718
01:29:44,379 --> 01:29:48,424
X suspended 12.4 million accounts
1719
01:29:48,424 --> 01:29:52,345
for violating our CSE policies.
1720
01:29:52,387 --> 01:29:57,016
This is up from 2.3 million accounts
1721
01:29:57,016 --> 01:30:02,188
that were removed by Twitter in 2022.
1722
01:30:02,271 --> 01:30:04,148
In 2023,
1723
01:30:04,148 --> 01:30:09,821
850,000 reports were sent to NCMEC,
1724
01:30:09,904 --> 01:30:14,283
including our first ever
auto generated report.
1725
01:30:14,367 --> 01:30:18,162
This is eight times more than was reported
1726
01:30:18,162 --> 01:30:22,458
by Twitter in 2022.
1727
01:30:22,542 --> 01:30:24,961
We've changed our priorities.
1728
01:30:24,961 --> 01:30:28,256
We've restructured
our trust and safety teams
1729
01:30:28,256 --> 01:30:31,634
to remain strong and agile.
1730
01:30:31,717 --> 01:30:36,013
We are building a Trust
and Safety Center of Excellence
1731
01:30:36,097 --> 01:30:39,475
in Austin, Texas, to bring more agents
1732
01:30:39,684 --> 01:30:44,147
in-house, to accelerate our impact.
1733
01:30:44,230 --> 01:30:48,317
We're applying to the Technology
Coalition's Project Lantern
1734
01:30:48,401 --> 01:30:53,489
to make further industry-wide
progress and impact.
1735
01:30:53,573 --> 01:30:56,159
We've also opened up our algorithms
1736
01:30:56,159 --> 01:30:59,328
for increased transparency.
1737
01:30:59,412 --> 01:31:04,000
We want America to lead in the solution.
1738
01:31:04,083 --> 01:31:08,254
X commends the Senate
for passing the Report Act,
1739
01:31:08,337 --> 01:31:11,549
and we support the SHIELD Act.
1740
01:31:11,632 --> 01:31:16,012
It is time for a federal standard
to criminalize
1741
01:31:16,095 --> 01:31:22,143
the sharing
of non-consensual intimate material.
1742
01:31:22,226 --> 01:31:22,518
We need to
1743
01:31:22,518 --> 01:31:26,272
raise
the standards across the entire Internet
1744
01:31:26,272 --> 01:31:30,860
ecosystem,
especially for those tech companies
1745
01:31:30,943 --> 01:31:36,240
that are not here today
and not stepping up.
1746
01:31:36,324 --> 01:31:41,370
X supports the Stop CSAM Act.
1747
01:31:41,454 --> 01:31:45,791
The Kids Online
Safety Act should continue to progress
1748
01:31:45,875 --> 01:31:50,296
and we will continue
to engage with it
1749
01:31:50,505 --> 01:31:55,718
and ensure the protections
of the freedom of speech.
1750
01:31:55,801 --> 01:31:57,970
There are two additional areas
1751
01:31:57,970 --> 01:32:02,141
that require everyone's attention.
1752
01:32:02,225 --> 01:32:04,560
First,
1753
01:32:04,560 --> 01:32:07,563
as the daughter of a police officer,
1754
01:32:07,647 --> 01:32:09,774
law enforcement
1755
01:32:09,774 --> 01:32:12,652
must have the critical resources
1756
01:32:12,652 --> 01:32:16,948
to bring these bad offenders to justice.
1757
01:32:17,031 --> 01:32:20,576
Second, with artificial intelligence
1758
01:32:20,660 --> 01:32:25,164
offenders' tactics
will continue to sophisticate and evolve.
1759
01:32:25,248 --> 01:32:30,461
Industry collaboration is imperative here.
1760
01:32:30,545 --> 01:32:33,214
X believes that the freedom of speech
1761
01:32:33,214 --> 01:32:38,135
and platform safety can and must coexist.
1762
01:32:38,219 --> 01:32:40,680
We agree that now is the time
1763
01:32:40,680 --> 01:32:43,891
to act with urgency.
1764
01:32:43,975 --> 01:32:44,767
Thank you.
1765
01:32:44,767 --> 01:32:46,602
I look forward
to answering your questions.
1766
01:32:46,602 --> 01:32:49,689
We report the absolute number
of how many images,
1767
01:32:49,814 --> 01:32:54,527
or how often, posts and accounts
that we've taken down. In 2023,
1768
01:32:54,610 --> 01:32:57,947
we've taken down
almost a million posts
1769
01:32:58,030 --> 01:33:01,534
in regards
to mental health and self-image. Ms.
1770
01:33:01,617 --> 01:33:05,746
YACCARINO, based on your understanding
of existing law, what might it take
1771
01:33:05,746 --> 01:33:10,418
for a person
to have those images removed, say, from X?
1772
01:33:10,501 --> 01:33:13,170
Senator Lee, thank you.
1773
01:33:13,170 --> 01:33:17,633
It sounds like what you're going
to introduce into law
1774
01:33:17,633 --> 01:33:22,179
in terms of ecosystem
wide and use her consent
1775
01:33:22,430 --> 01:33:25,266
sounds exactly like part of the philosophy
1776
01:33:25,266 --> 01:33:28,269
of why we're supporting the SHIELD Act.
1777
01:33:28,269 --> 01:33:32,690
And no one should have to endure
non-consensual images.
1778
01:33:32,690 --> 01:33:35,693
Ms. YACCARINO,
how many minors use X
1779
01:33:35,693 --> 01:33:38,696
and are you planning to implement
safety measures or guidance
1780
01:33:38,696 --> 01:33:42,033
for caretakers
like your peer companies have?
1781
01:33:42,116 --> 01:33:43,367
Thank you, Senator.
1782
01:33:43,367 --> 01:33:46,203
Less than 1% of all U.S.
1783
01:33:46,203 --> 01:33:49,957
users are between the ages of 13 and 17.
1784
01:33:49,957 --> 01:33:50,875
Less than 1%.
1785
01:33:50,875 --> 01:33:54,086
How many of 90 million U.S. users?
1786
01:33:54,337 --> 01:33:56,172
So still hundreds of thousands continue.
1787
01:33:56,172 --> 01:33:56,922
Yes. Yes.
1788
01:33:56,922 --> 01:33:59,925
And every single one is very important.
1789
01:33:59,925 --> 01:34:03,846
Being a 14 month old company,
we have reprioritized child
1790
01:34:03,971 --> 01:34:08,768
protection and safety measures
and we have just begun to talk about
1791
01:34:08,768 --> 01:34:12,980
and discuss how we can enhance
those with parental controls.
1792
01:34:13,064 --> 01:34:15,608
Ms.
YACCARINO, have you talked to parents
1793
01:34:15,608 --> 01:34:19,362
directly, young people,
about designing your product
1794
01:34:19,445 --> 01:34:21,656
as a new leader of X?
1795
01:34:21,656 --> 01:34:22,615
The answer's yes.
1796
01:34:22,615 --> 01:34:26,744
I've spoken to them
about the behavioral patterns,
1797
01:34:26,827 --> 01:34:31,666
even though less than 1% of our users
are in that age group.
1798
01:34:31,666 --> 01:34:33,751
But you have spoken to them.
1799
01:34:33,751 --> 01:34:37,171
A lot of examples of a young person
1800
01:34:37,254 --> 01:34:40,633
finding out about an image that is of them
1801
01:34:40,633 --> 01:34:45,680
and really compromises them
and actually can create suicidal thoughts.
1802
01:34:45,763 --> 01:34:49,308
And they want to call up or they want
to send an email and say, take it down.
1803
01:34:49,517 --> 01:34:55,106
I mean, why is it not possible
for that to be responded to immediately?
1804
01:34:55,147 --> 01:34:58,025
Well, we all strive to take down any type
1805
01:34:58,025 --> 01:35:02,321
of violative content
or disturbing content immediately.
1806
01:35:02,405 --> 01:35:07,702
At X, we have increased our capability
to start the reporting process.
1807
01:35:07,827 --> 01:35:11,455
I'm a parent or I'm a kid
and I want this down.
1808
01:35:11,497 --> 01:35:16,252
Shouldn't there be a mechanism in place
where it comes down?
1809
01:35:16,252 --> 01:35:17,878
You can see what the image is.