1
00:00:19,300 --> 00:00:22,195
- TikTok has very
much become a way
2
00:00:22,220 --> 00:00:23,660
for the young generation
3
00:00:23,700 --> 00:00:25,600
to express ourselves
in every way.
4
00:00:27,113 --> 00:00:29,676
You can be yourself, you're
at home, you're filming,
5
00:00:29,700 --> 00:00:31,586
and there's always
gonna be like millions
6
00:00:31,610 --> 00:00:33,036
of people watching you.
7
00:00:33,540 --> 00:00:35,202
I like this one. Look and see.
8
00:00:35,382 --> 00:00:37,254
You never know when
you could blow up.
9
00:00:41,450 --> 00:00:43,036
- I think every
young kid's dream
10
00:00:43,060 --> 00:00:45,047
is to be successful online.
11
00:00:45,530 --> 00:00:47,326
The Chinese
social media platform,
12
00:00:47,350 --> 00:00:49,963
TikTok, has
changed the internet.
13
00:00:51,070 --> 00:00:54,346
It's become the most
popular app in the world.
14
00:00:54,370 --> 00:00:56,166
- It's fishing
videos, it's cooking.
15
00:00:56,190 --> 00:00:58,056
You can make skits,
singing, dancing.
16
00:00:58,080 --> 00:01:00,827
Literally everything you
can think of, TikTok have.
17
00:01:05,540 --> 00:01:07,306
- It's not an app on
their phone anymore.
18
00:01:07,330 --> 00:01:08,356
It's their livelihood.
19
00:01:08,380 --> 00:01:10,166
It's how they communicate
with their friends.
20
00:01:10,190 --> 00:01:11,636
It's how they see their world.
21
00:01:11,660 --> 00:01:13,976
That's a part that I
don't think everybody
22
00:01:14,000 --> 00:01:15,273
has adjusted to yet.
23
00:01:17,060 --> 00:01:20,546
- We're really at risk of having
generations of young people
24
00:01:20,570 --> 00:01:23,506
that perform identities
in response to something
25
00:01:23,530 --> 00:01:27,683
that a technology platform
prescribes to be the new normal.
26
00:01:29,250 --> 00:01:31,106
Behind the shiny dance videos,
27
00:01:31,130 --> 00:01:34,673
the platform is leading
people down dangerous paths.
28
00:01:36,130 --> 00:01:37,916
- I'd like to think that I
wouldn't have struggled
29
00:01:37,940 --> 00:01:40,349
with an eating disorder if I
hadn't downloaded TikTok.
30
00:01:41,840 --> 00:01:43,106
- My claim with TikTok
31
00:01:43,130 --> 00:01:47,146
is that they are harvesting
huge amounts of data illegally
32
00:01:47,170 --> 00:01:49,552
without the consent of
children or their parents.
33
00:01:50,990 --> 00:01:52,746
- If you just look at
TikTok in isolation,
34
00:01:52,770 --> 00:01:54,006
it seems innocuous.
35
00:01:54,030 --> 00:01:56,876
But it really takes place
in this much larger context
36
00:01:56,900 --> 00:01:59,656
of data collection,
artificial intelligence,
37
00:01:59,680 --> 00:02:00,966
and a real effort by the Chinese
38
00:02:00,990 --> 00:02:04,483
to consolidate influence in
the region and across the globe.
39
00:02:06,150 --> 00:02:08,556
- Tonight on Four
Corners, TikTok.
40
00:02:08,580 --> 00:02:11,256
In a joint investigation
with Hack on Triple J,
41
00:02:11,280 --> 00:02:13,056
we're going down the rabbit hole
42
00:02:13,080 --> 00:02:15,176
to reveal the dark side of the app.
43
00:02:15,200 --> 00:02:17,686
How the platform
censors political content
44
00:02:17,710 --> 00:02:20,046
and harvests children's data.
45
00:02:20,070 --> 00:02:23,886
And how the app's powerful
algorithm exposes people
46
00:02:23,910 --> 00:02:26,773
to misinformation and
dangerous content.
47
00:02:35,660 --> 00:02:37,456
- Hi, my name is Rory Eliza.
48
00:02:37,480 --> 00:02:38,925
And, then what?
49
00:02:40,416 --> 00:02:42,076
And what do you do?
50
00:02:42,100 --> 00:02:43,848
- I am a full-time TikToker.
51
00:02:53,250 --> 00:02:54,626
So in the morning I'll wake up,
52
00:02:54,650 --> 00:02:55,946
maybe eight
o'clock, nine o'clock.
53
00:02:55,970 --> 00:02:57,226
I'll check my phone,
54
00:02:57,250 --> 00:02:59,176
check if my videos
have done well,
55
00:02:59,200 --> 00:03:01,646
or how my followers are reacting
56
00:03:01,670 --> 00:03:03,723
to the content that
I've just posted.
57
00:03:07,060 --> 00:03:09,740
TikTok, honestly, I get
so much love on there.
58
00:03:09,770 --> 00:03:12,136
It's so weird because
that's my biggest platform,
59
00:03:12,160 --> 00:03:13,736
is TikTok with 5
million followers.
60
00:03:13,760 --> 00:03:16,326
It's crazy to think
that 5 million people,
61
00:03:16,350 --> 00:03:18,376
that's people, it's
not just the number.
62
00:03:18,400 --> 00:03:20,216
And if you really think
about it, it's 5 million people
63
00:03:20,240 --> 00:03:21,640
that have tapped
that follow button.
64
00:03:23,230 --> 00:03:24,492
They're all just so friendly and
65
00:03:24,516 --> 00:03:25,656
they're kind of
like your family.
66
00:03:25,680 --> 00:03:27,483
It's just weird, like you
don't know these people,
67
00:03:27,507 --> 00:03:29,216
but they know so much about you
68
00:03:29,240 --> 00:03:31,481
that they treat you
like a family member.
69
00:03:43,750 --> 00:03:46,763
And on that note, welcome
to the new Rory Eliza.
70
00:03:47,900 --> 00:03:49,796
Rory Eliza is one of millions
71
00:03:49,820 --> 00:03:53,026
of young Australians
recording virtually every moment
72
00:03:53,050 --> 00:03:55,666
of their lives to get
famous on TikTok.
73
00:03:55,690 --> 00:03:57,466
- Get ready with me for a date.
74
00:03:57,490 --> 00:03:58,916
Yo, I chose an outfit, let's go.
75
00:03:58,940 --> 00:04:00,686
Transition, yeah.
76
00:04:00,710 --> 00:04:02,486
So I think every
young kid's dream
77
00:04:02,510 --> 00:04:04,776
is to, you know, be
successful online.
78
00:04:04,800 --> 00:04:08,116
So, I think there's
definitely a group
79
00:04:08,140 --> 00:04:10,276
where they all
wanna be influencers
80
00:04:10,300 --> 00:04:11,726
'cause it's kind of
like the in thing now.
81
00:04:11,750 --> 00:04:13,456
And I think that's
because of TikTok.
82
00:04:14,340 --> 00:04:15,700
TikTok has been downloaded
83
00:04:15,710 --> 00:04:18,663
more than 3 billion
times around the world.
84
00:04:21,710 --> 00:04:23,717
It's become a
cultural phenomenon.
85
00:04:23,742 --> 00:04:24,773
I'm 21 and just
86
00:04:24,798 --> 00:04:26,123
learned how to
do my own laundry.
87
00:04:26,210 --> 00:04:28,536
Some tomatoes and some cheese.
88
00:04:28,560 --> 00:04:31,320
Dude, no, you gotta
go late like, hey.
89
00:04:32,380 --> 00:04:33,553
Oh, okay, okay. Okay.
90
00:04:38,470 --> 00:04:40,570
Everything is about going viral.
91
00:04:41,850 --> 00:04:43,406
The dance, started by someone
92
00:04:43,430 --> 00:04:46,138
in their living room
and uploaded to TikTok,
93
00:04:47,720 --> 00:04:49,876
can turn into a
stadium full of people,
94
00:04:49,900 --> 00:04:51,586
performing it in unison.
95
00:04:58,165 --> 00:05:00,096
- I like how creative
you can be on it.
96
00:05:00,120 --> 00:05:02,293
Like it's just so fun
to go on that app
97
00:05:02,317 --> 00:05:04,138
and just express your real self.
98
00:05:13,970 --> 00:05:16,626
Rory started posting
comedic skits on TikTok
99
00:05:16,650 --> 00:05:18,153
and her following snowballed.
100
00:05:19,090 --> 00:05:21,827
- No worries. I'll scan that one
right through for you right now.
101
00:05:22,700 --> 00:05:24,436
This was her first viral video.
102
00:05:24,460 --> 00:05:26,746
It got nearly 14 million views.
103
00:05:31,050 --> 00:05:33,637
- How about some Peking duck?
104
00:05:33,870 --> 00:05:36,816
Oh yeah, but we actually
don't have the Peking duck.
105
00:05:36,840 --> 00:05:38,586
But we've got this
sneaking goose.
106
00:05:38,610 --> 00:05:40,102
What an odd name.
107
00:05:40,126 --> 00:05:42,756
It is pretty normal for a book.
108
00:05:42,780 --> 00:05:46,016
In 2019, Rory
decided to leave school
109
00:05:46,040 --> 00:05:48,166
to become a full-time TikToker.
110
00:05:48,590 --> 00:05:50,623
- Wait, wait, is this a library?
111
00:05:51,510 --> 00:05:53,870
School was just one of those
things I just was not good at.
112
00:05:54,320 --> 00:05:57,156
I decided to leave school
when I was in year 11
113
00:05:57,180 --> 00:05:58,536
and I was never there, you know.
114
00:05:58,560 --> 00:06:00,336
I was always in
Sydney doing meetings
115
00:06:00,360 --> 00:06:02,526
or presentations
for TikTok events.
116
00:06:02,550 --> 00:06:04,576
I just wasn't there. And
when I would come to school,
117
00:06:04,600 --> 00:06:06,476
I would have no
idea what we're doing.
118
00:06:06,500 --> 00:06:09,093
'Cause you know, I've
been away for heaps of days.
119
00:06:10,440 --> 00:06:13,076
No worries at all. Thanks
for coming to our library.
120
00:06:13,100 --> 00:06:15,051
School, you can
go back and do it at TAFE.
121
00:06:15,075 --> 00:06:17,026
You can go back any
time and do it if you need it.
122
00:06:17,050 --> 00:06:19,106
But you may never get
this opportunity again.
123
00:06:19,130 --> 00:06:21,256
So, we just thought it
was worth leaving school
124
00:06:21,280 --> 00:06:22,796
and pursuing all the
business opportunities
125
00:06:22,820 --> 00:06:23,846
while they were there for her.
126
00:06:23,870 --> 00:06:26,416
- No worries at all. Thanks
for coming to our library.
127
00:06:26,680 --> 00:06:28,648
- How do you feel about the
fact that 5 million people are
128
00:06:28,672 --> 00:06:30,446
watching her content?
129
00:06:30,470 --> 00:06:31,927
- It's incredible.
It's even, when she
130
00:06:31,951 --> 00:06:33,606
goes live, there was
a time she went live
131
00:06:33,630 --> 00:06:36,226
and she had 22,000 people
watching her in her room.
132
00:06:36,250 --> 00:06:38,226
And I just sort of, in
my mind, go back
133
00:06:38,250 --> 00:06:39,263
to an Elton John concert here.
134
00:06:39,287 --> 00:06:40,756
And she had more
people watching her
135
00:06:40,780 --> 00:06:42,436
than we had at that
Elton John concert.
136
00:06:42,460 --> 00:06:44,226
And it's kind of way out
that that's happening
137
00:06:44,250 --> 00:06:46,336
in my daughter's
bedroom at the moment.
138
00:06:46,360 --> 00:06:49,849
It was a bit, yeah, different.
139
00:06:54,130 --> 00:06:56,266
- Big fashion and
cosmetic brands
140
00:06:56,290 --> 00:06:59,126
started noticing Rory's
success on TikTok
141
00:06:59,150 --> 00:07:01,553
and wanted to tap into
a growing audience.
142
00:07:07,149 --> 00:07:09,306
Companies sponsor
influencers like Rory.
143
00:07:09,330 --> 00:07:13,083
And businesses pay TikTok
to advertise on the platform.
144
00:07:14,680 --> 00:07:17,553
This is central to the app's
lucrative business model.
145
00:07:19,110 --> 00:07:21,226
- In this work industry
being an influencer,
146
00:07:21,250 --> 00:07:23,436
you have to present
yourself as a brand, you know.
147
00:07:23,460 --> 00:07:25,566
We aren't really people
anymore, we're brands.
148
00:07:25,590 --> 00:07:26,926
We're selling
products for brands.
149
00:07:26,950 --> 00:07:28,750
So, you kind of
gotta look the part.
150
00:07:32,070 --> 00:07:34,494
The money involved,
it's enough to live off.
151
00:07:34,518 --> 00:07:36,835
So, it's a pretty fair amount.
152
00:07:36,860 --> 00:07:38,416
I'm about in the
medium to high range
153
00:07:38,440 --> 00:07:39,926
of incomes in Australia.
154
00:07:39,950 --> 00:07:41,549
So yeah. Very, very
decent.
155
00:07:42,600 --> 00:07:44,246
- Well, it's hard not to
even be jealous sometimes
156
00:07:44,270 --> 00:07:46,016
'cause you look at
our life and you know,
157
00:07:46,040 --> 00:07:48,590
we get up and we go to
work and we come home.
158
00:07:49,905 --> 00:07:51,245
And she can earn money
159
00:07:51,270 --> 00:07:53,696
that can take us days
to earn in minutes.
160
00:07:53,720 --> 00:07:56,526
- I found myself
driving and just crying,
161
00:07:56,550 --> 00:07:58,666
having like a total breakdown.
162
00:07:58,690 --> 00:07:59,856
And I found myself having
163
00:07:59,880 --> 00:08:02,206
some really quite
nasty thoughts and-
164
00:08:02,230 --> 00:08:05,566
- Rory shares her
life with 5 million people.
165
00:08:05,590 --> 00:08:07,438
Even her lowest moments.
166
00:08:07,462 --> 00:08:08,994
- Why am I meant
to be on this earth?
167
00:08:09,018 --> 00:08:11,626
Like, why does no one like me?
168
00:08:11,650 --> 00:08:13,263
Why do I have no friends?
169
00:08:14,630 --> 00:08:17,459
But most days
she feels very alone.
170
00:08:17,483 --> 00:08:19,286
- Okay. That's
an old name for,
171
00:08:19,310 --> 00:08:22,176
Being away from people, it's
definitely lonely, you know.
172
00:08:22,200 --> 00:08:24,136
I film, oh, four videos a day.
173
00:08:24,160 --> 00:08:25,778
That's a good three
hours outta my day.
174
00:08:25,820 --> 00:08:27,100
And then I've got
another eight hours
175
00:08:27,111 --> 00:08:28,476
and I'm like, what the
heck am I gonna do
176
00:08:28,500 --> 00:08:30,412
for the rest of the day? Like
I can't ring up my friends.
177
00:08:30,436 --> 00:08:32,226
Like, y'all, want to hang
out? 'Cause they're at work.
178
00:08:32,250 --> 00:08:34,201
So it definitely
gets lonely at times.
179
00:08:34,230 --> 00:08:35,726
And you know,
sometimes if you're reading
180
00:08:35,750 --> 00:08:38,006
the hate comments
and the stress load,
181
00:08:38,030 --> 00:08:39,386
it can be so much for your body
182
00:08:39,410 --> 00:08:41,856
and you're just overwhelmed
and you're lonely.
183
00:08:41,880 --> 00:08:43,984
So that can also
creep into depression.
184
00:08:44,008 --> 00:08:45,294
- Catherine hasn't
had a question.
185
00:08:45,318 --> 00:08:47,953
I'm happy to return to you
but let's just keep it civil.
186
00:08:47,977 --> 00:08:48,977
Andrew?
187
00:08:50,860 --> 00:08:51,860
Catherine.
188
00:08:55,860 --> 00:08:58,106
With people stuck
at home during lockdown,
189
00:08:58,130 --> 00:08:59,896
desperate for entertainment,
190
00:08:59,920 --> 00:09:03,660
TikTok became the world's
most downloaded app in 2020.
191
00:09:07,010 --> 00:09:09,779
And it's continued to
hold that title this year.
192
00:09:13,172 --> 00:09:17,416
- TikTok in Australia has
seen the same kind of bump
193
00:09:17,440 --> 00:09:19,663
in 2020 as elsewhere
in the world.
194
00:09:23,280 --> 00:09:24,306
In October of 2020, there were
195
00:09:24,330 --> 00:09:26,406
an estimated 2.5
million users on TikTok.
196
00:09:26,430 --> 00:09:29,230
Which was about a 50% growth
from earlier on in the year.
197
00:09:41,863 --> 00:09:43,522
Of the popular
social media apps,
198
00:09:43,547 --> 00:09:45,545
TikTok is the most addictive.
199
00:09:46,030 --> 00:09:48,966
Leaked TikTok advertising
data shows users spend
200
00:09:48,990 --> 00:09:52,213
an average of an hour and
a half on the app each day.
201
00:09:54,270 --> 00:09:56,688
- You know it's like 8:00 PM
and I'm watching and watching
202
00:09:56,712 --> 00:09:59,216
and then I look up at
my clock and it's 2:00 AM.
203
00:09:59,434 --> 00:10:01,394
And I'm like, where the
heck did those hours go?
204
00:10:01,420 --> 00:10:03,380
It's 'cause this, um, "For
You" page is so addictive.
205
00:10:03,390 --> 00:10:05,174
It's just so spot on.
206
00:10:09,290 --> 00:10:12,090
TikTok's algorithm
is its most valuable asset.
207
00:10:14,610 --> 00:10:17,296
It's designed to
determine your interests
208
00:10:17,320 --> 00:10:19,406
and send you
personalized content
209
00:10:19,430 --> 00:10:22,086
to keep you on the app
for as long as possible.
210
00:10:22,110 --> 00:10:23,186
- I went and saw my mama.
211
00:10:23,210 --> 00:10:25,226
And I went and got my
hair done, as well just to-
212
00:10:25,250 --> 00:10:28,656
- TikTok works by
recommending content to you
213
00:10:28,680 --> 00:10:30,976
through your
activity on the app.
214
00:10:31,000 --> 00:10:33,506
So the more that you
scroll through the app,
215
00:10:33,530 --> 00:10:35,176
the better the
recommendations are tailored
216
00:10:35,200 --> 00:10:36,783
to your specific interests.
217
00:10:37,850 --> 00:10:40,806
Rather than selecting
content that you want to watch
218
00:10:40,830 --> 00:10:43,231
like you would on
YouTube or on Netflix,
219
00:10:43,600 --> 00:10:47,216
you primarily access
content through one main feed,
220
00:10:47,240 --> 00:10:49,185
which is called the For
You page on TikTok.
221
00:10:49,450 --> 00:10:51,936
Which is essentially just
an endlessly scrolling,
222
00:10:51,960 --> 00:10:55,266
algorithmically
curated feed of videos
223
00:10:55,290 --> 00:10:57,543
that refreshes each
time you open the app.
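
A minimal sketch of the mechanism just described, assuming a simple engagement-weighted model (all names here are invented; TikTok's real ranking system is proprietary): every watch strengthens per-topic interest weights, and each refresh of the feed re-ranks candidate videos by those weights.

    # Toy "For You" style feed, purely illustrative.
    from collections import defaultdict
    import random

    class ForYouFeed:
        def __init__(self, candidates):
            self.candidates = candidates       # list of (video_id, topics) pairs
            self.weights = defaultdict(float)  # topic -> accumulated interest

        def record_watch(self, topics, watch_seconds):
            for topic in topics:
                self.weights[topic] += watch_seconds  # longer watch, stronger signal

        def refresh(self, k=2):
            # Re-rank candidates by the viewer's weights; the small random
            # term stands in for exploration of new content.
            def score(item):
                _, topics = item
                return sum(self.weights[t] for t in topics) + random.random()
            return sorted(self.candidates, key=score, reverse=True)[:k]

    feed = ForYouFeed([("v1", ["dance"]), ("v2", ["fitness"]), ("v3", ["comedy"])])
    feed.record_watch(["fitness"], watch_seconds=90)
    print(feed.refresh())  # fitness content now ranks first
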
224
00:10:58,947 --> 00:11:00,696
As soon as
you sign up to TikTok,
225
00:11:00,720 --> 00:11:02,476
the app starts collecting data
226
00:11:02,500 --> 00:11:05,676
about you, your
location, gender, and age,
227
00:11:05,700 --> 00:11:08,726
and also your facial data
to figure out who you are
228
00:11:08,750 --> 00:11:10,863
and what kind of
videos you want to see.
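
The signals just listed can be pictured as a simple profile record. A minimal sketch; the field names are assumptions drawn only from the categories mentioned (location, gender, age, facial data):

    # Hypothetical record of the sign-up signals described above.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class NewUserProfile:
        location: str                                  # e.g. coarse region from the device
        gender: Optional[str] = None
        age: Optional[int] = None
        facial_features: Optional[List[float]] = None  # biometric face data

    profile = NewUserProfile(location="Sydney, AU", age=17)
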
229
00:11:11,730 --> 00:11:15,703
- Your face is a form
of biometric information.
230
00:11:15,727 --> 00:11:19,475
And your face can be
analyzed to distinguish a range
231
00:11:19,500 --> 00:11:21,860
of personality and
demographic traits.
232
00:11:24,590 --> 00:11:26,336
TikTok collects your facial data
233
00:11:26,360 --> 00:11:29,746
every time you make a video
or use a filter on the app.
234
00:11:29,770 --> 00:11:33,816
And can even access photos
and videos saved on your phone
235
00:11:33,840 --> 00:11:35,930
that aren't being
used on the platform.
236
00:11:36,680 --> 00:11:40,686
To understand how an app
like TikTok interprets that data
237
00:11:40,710 --> 00:11:42,656
scientists in Melbourne
have developed
238
00:11:42,680 --> 00:11:44,706
what's called a
biometric mirror.
239
00:11:45,160 --> 00:11:46,866
- So biometric
mirror for instance,
240
00:11:46,890 --> 00:11:49,486
is trained by way of
artificial intelligence
241
00:11:49,510 --> 00:11:51,466
to distinguish how
intelligent you are,
242
00:11:51,490 --> 00:11:54,596
how attractive, how
weird, how responsible
243
00:11:54,620 --> 00:11:57,723
and how emotionally
unstable you are.
244
00:11:58,900 --> 00:12:00,366
The interesting thing
there, of course,
245
00:12:00,390 --> 00:12:02,756
is that biometric mirror
bases its assumptions
246
00:12:02,788 --> 00:12:05,115
on a single
snapshot of your face.
247
00:12:05,140 --> 00:12:07,936
So all of these assumptions
are generated based
248
00:12:07,960 --> 00:12:10,606
on the exact
appearance of your face
249
00:12:10,630 --> 00:12:13,653
at that exact microsecond
that the photo has been taken.
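
From the outside, a single-snapshot trait classifier of the kind described looks like the sketch below: one photo in, a set of confident-sounding scores out. The function and numbers are stubs invented for illustration; the traits mirror the ones the researcher lists.

    # Sketch of a Biometric Mirror style interface. A real system would
    # run a trained model here; placeholder scores in [0, 1] show only
    # the shape of the output.
    def analyze_face(image_bytes: bytes) -> dict:
        return {
            "intelligence": 0.62,
            "attractiveness": 0.48,
            "weirdness": 0.71,
            "responsibility": 0.55,
            "emotional_instability": 0.33,
        }

    traits = analyze_face(b"one snapshot")  # a single microsecond's appearance
    print(max(traits, key=traits.get))      # the trait it is most confident about
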
250
00:12:14,834 --> 00:12:17,356
The TikTok algorithm
might read your face
251
00:12:17,380 --> 00:12:19,906
and think that you are dealing
252
00:12:19,930 --> 00:12:22,456
with a significant
mental health challenge.
253
00:12:22,480 --> 00:12:25,346
You might be
presented with videos
254
00:12:25,370 --> 00:12:27,726
that are created by
users who are going through
255
00:12:27,750 --> 00:12:29,946
a similar challenge
at that time.
256
00:12:29,970 --> 00:12:34,076
And it might really create a
very colored worldview for you
257
00:12:34,100 --> 00:12:35,516
where it's really hard to deal
258
00:12:35,540 --> 00:12:37,716
with your mental health
challenge at that time.
259
00:12:44,980 --> 00:12:47,636
Lauren Hemings
is studying to be a midwife.
260
00:12:47,660 --> 00:12:49,544
She used to spend
her uni breaks
261
00:12:49,568 --> 00:12:51,265
scrolling through TikTok.
262
00:12:51,880 --> 00:12:53,896
- I think it was
quarantine boredom
263
00:12:53,920 --> 00:12:56,863
that kind of motivated
me to download it.
264
00:12:57,950 --> 00:12:59,276
It was quite an innocent hope
265
00:12:59,300 --> 00:13:01,376
of just getting a
good laugh, really.
266
00:13:01,400 --> 00:13:03,426
You know, like
getting funny videos
267
00:13:03,450 --> 00:13:05,633
and seeing what was on it.
268
00:13:06,870 --> 00:13:08,136
I never had the intention
269
00:13:08,160 --> 00:13:09,716
of making TikToks
or sharing them.
270
00:13:09,740 --> 00:13:13,045
It was more just kind of from
the viewpoint of a viewer.
271
00:13:13,820 --> 00:13:15,046
Lauren started following
272
00:13:15,070 --> 00:13:17,679
a popular fitness
influencer on the app.
273
00:13:20,100 --> 00:13:24,436
- There's one woman who had
like quite a similar body type
274
00:13:24,460 --> 00:13:28,486
to me and she'd expressed
that she was unhappy
275
00:13:28,510 --> 00:13:29,593
with that body type.
276
00:13:30,770 --> 00:13:33,136
And she had started tracking
calories over quarantine.
277
00:13:33,160 --> 00:13:36,003
She had lost a really, really
significant amount of weight.
278
00:13:39,260 --> 00:13:41,306
The algorithm
then flooded her feed
279
00:13:41,330 --> 00:13:44,283
with content promoting
unhealthy weight loss.
280
00:13:46,330 --> 00:13:48,516
- I was no longer seeing
funny dance videos or anything.
281
00:13:48,540 --> 00:13:50,736
It was just like
this complete focus
282
00:13:50,760 --> 00:13:54,863
on that like fitness and
healthy lifestyle goal.
283
00:13:58,763 --> 00:14:01,366
- TikTok pushed Lauren
toward the popular trend
284
00:14:01,390 --> 00:14:05,303
of meticulously tracking how
many calories you eat in a day.
285
00:14:06,290 --> 00:14:09,453
Something researchers warn
promotes disordered eating.
286
00:14:10,850 --> 00:14:12,746
The hashtag,
What I eat in a day,
287
00:14:12,770 --> 00:14:15,665
has more than 7
billion views on TikTok.
288
00:14:18,480 --> 00:14:20,056
- It turned into
like this obsession
289
00:14:20,080 --> 00:14:23,496
and I felt that I
could not eat anything
290
00:14:23,520 --> 00:14:26,526
without knowing how
many calories it contained
291
00:14:26,550 --> 00:14:28,796
and without meeting, you
know, my target number
292
00:14:28,820 --> 00:14:30,336
of calories throughout the day.
293
00:14:30,360 --> 00:14:32,516
There was a few months
where I didn't put anything
294
00:14:32,540 --> 00:14:34,673
into my mouth that
I had not weighed.
295
00:14:36,150 --> 00:14:38,356
Four months
after downloading TikTok,
296
00:14:38,380 --> 00:14:40,636
Lauren admitted to
her friends and family
297
00:14:40,660 --> 00:14:42,337
she had an eating disorder.
298
00:14:43,000 --> 00:14:44,806
- I'd like to think that I
wouldn't have struggled
299
00:14:44,830 --> 00:14:47,156
with an eating disorder if I
hadn't downloaded TikTok.
300
00:14:47,180 --> 00:14:48,236
I think, you know, TikTok
301
00:14:48,260 --> 00:14:51,783
was the main contributor
to the development of that.
302
00:14:53,470 --> 00:14:56,156
Young users are
increasingly turning to TikTok
303
00:14:56,180 --> 00:15:00,086
to find and spread information
on how to restrict food
304
00:15:00,110 --> 00:15:02,893
and hide their disordered
eating from their families.
305
00:15:04,250 --> 00:15:06,916
- What they do is they
actually share content
306
00:15:06,940 --> 00:15:10,176
of what they go through and
what they have done for the day
307
00:15:10,200 --> 00:15:11,913
in the fascination
to become thin.
308
00:15:12,800 --> 00:15:14,286
So they would share recipes.
309
00:15:14,310 --> 00:15:16,526
They would share diet plans.
310
00:15:16,550 --> 00:15:19,549
They would share how
you need to be disciplined.
311
00:15:19,980 --> 00:15:23,620
For someone who's
vulnerable and desperate,
312
00:15:23,650 --> 00:15:26,286
they would follow
anyone's advice.
313
00:15:26,310 --> 00:15:28,306
None of this advice
is actually good
314
00:15:28,330 --> 00:15:31,396
because some of these
advice is, oh lick a pumpkin
315
00:15:31,420 --> 00:15:33,811
for your lunch, but don't eat.
316
00:15:34,600 --> 00:15:36,973
Drink a liter of water
and you should be fine.
317
00:15:37,900 --> 00:15:40,046
- I was super hesitant
to get on TikTok
318
00:15:40,070 --> 00:15:41,806
because I'd heard that
it was a really bad space
319
00:15:41,830 --> 00:15:42,956
for people with
eating disorders.
320
00:15:42,980 --> 00:15:45,006
Because the algorithm
knows everything
321
00:15:45,030 --> 00:15:46,916
and then it would
curate your feed
322
00:15:46,940 --> 00:15:48,485
to be interested
in that kind of stuff.
323
00:15:49,440 --> 00:15:51,906
Claire Benstead
has been in and out of hospital
324
00:15:51,930 --> 00:15:54,504
for anorexia for
more than five years.
325
00:15:54,590 --> 00:15:56,346
She decided to download TikTok
326
00:15:56,370 --> 00:15:59,613
to find support and to
promote her earrings business.
327
00:16:00,090 --> 00:16:01,736
- You want that support
328
00:16:01,760 --> 00:16:03,326
because it's such
an isolating illness.
329
00:16:03,350 --> 00:16:04,766
And there's so many
people in my life
330
00:16:04,790 --> 00:16:06,926
that don't get it and
don't understand it.
331
00:16:06,950 --> 00:16:09,286
Claire says the
TikTok algorithm identified
332
00:16:09,310 --> 00:16:11,556
she had an eating
disorder and she noticed
333
00:16:11,580 --> 00:16:15,436
an immediate change to the
types of videos on her feed.
334
00:16:15,460 --> 00:16:16,686
- So it went from
being, you know,
335
00:16:16,710 --> 00:16:18,726
my algorithm was, you
know, Australian humor
336
00:16:18,750 --> 00:16:21,536
and musical theater humor,
and all of that kind of stuff
337
00:16:21,560 --> 00:16:25,346
to just being eating
disorder content all the time.
338
00:16:25,370 --> 00:16:28,330
And as I got sicker and
I got more obsessive,
339
00:16:28,840 --> 00:16:30,746
all I could do was just
flick through my phone,
340
00:16:30,770 --> 00:16:33,692
and look at this footage.
341
00:16:34,220 --> 00:16:37,076
I spent hours on it
and just fixated on it.
342
00:16:37,100 --> 00:16:38,106
I wasn't recovering at all.
343
00:16:38,130 --> 00:16:39,723
I was actively relapsing.
344
00:16:40,520 --> 00:16:42,256
Claire was admitted to hospital.
345
00:16:42,280 --> 00:16:45,566
As part of her treatment, her
psychologists worked with her
346
00:16:45,590 --> 00:16:48,976
to remove the toxic
content from her TikTok feed
347
00:16:49,000 --> 00:16:52,326
by unfollowing accounts
and reporting videos.
348
00:16:52,350 --> 00:16:54,436
How long did it actually
take you to get rid
349
00:16:54,460 --> 00:16:56,875
of that eating disorder
content from your algorithm?
350
00:16:57,850 --> 00:16:58,858
Ages.
351
00:16:58,882 --> 00:17:01,126
Pretty much being in hospital,
so probably two months,
352
00:17:01,150 --> 00:17:02,510
it took me to
change the algorithm.
353
00:17:03,200 --> 00:17:04,796
When you're kind of
scrolling through like this-
354
00:17:04,820 --> 00:17:06,768
- Even while Claire was
showing me her
355
00:17:06,792 --> 00:17:08,436
cleaned up TikTok feed,
356
00:17:08,460 --> 00:17:11,659
videos about eating
disorders began reappearing.
357
00:17:12,020 --> 00:17:14,021
Hey, there we go.
Here's one right now.
358
00:17:14,540 --> 00:17:17,246
Just every five or six videos.
359
00:17:17,270 --> 00:17:21,246
And so, I'm in a good spot
that this doesn't trigger me.
360
00:17:21,270 --> 00:17:22,856
- So even though you're
saying not interested,
361
00:17:22,880 --> 00:17:24,586
it's still coming up?
- It's still coming up.
362
00:17:24,610 --> 00:17:27,086
If you report TikTok
videos, the company says
363
00:17:27,110 --> 00:17:30,046
its moderators then
decide whether to ban them.
364
00:17:30,070 --> 00:17:31,486
Which in turn is supposed
365
00:17:31,510 --> 00:17:34,076
to teach the algorithm
to stop featuring them.
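
In code terms, that report-then-retrain loop might look like the minimal sketch below. All names are hypothetical; this is an illustration of the described process, not TikTok's actual moderation system.

    # Illustrative moderation feedback loop: a report goes to a moderator;
    # if the video is banned, a negative signal teaches the ranker to
    # stop featuring similar content.
    from dataclasses import dataclass, field

    @dataclass
    class Video:
        video_id: str
        topic: str
        banned: bool = False

    @dataclass
    class Ranker:
        weights: dict = field(default_factory=dict)  # topic -> ranking weight

        def demote(self, topic):
            self.weights[topic] = self.weights.get(topic, 0.0) - 1.0

    def handle_report(video, moderator_bans, ranker):
        if moderator_bans:
            video.banned = True
            ranker.demote(video.topic)  # stop featuring videos like this one

    ranker = Ranker()
    handle_report(Video("v42", "eating_disorder"), moderator_bans=True, ranker=ranker)
    print(ranker.weights)  # {'eating_disorder': -1.0}
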
366
00:17:34,100 --> 00:17:35,736
- I just say that
I'm not interested in that-
367
00:17:35,760 --> 00:17:38,676
- TikTok's policies
say the app bans content
368
00:17:38,700 --> 00:17:41,963
promoting, normalizing or
glorifying eating disorders.
369
00:17:41,980 --> 00:17:43,580
And you can
say that it's offensive,
370
00:17:43,588 --> 00:17:44,966
But when users like Claire
371
00:17:44,990 --> 00:17:47,676
have reported those
videos, they were told
372
00:17:47,700 --> 00:17:50,026
they don't breach
any guidelines.
373
00:17:50,050 --> 00:17:51,946
- You would think that, you
know, something this serious
374
00:17:51,970 --> 00:17:53,476
and it's got the
highest mortality rate
375
00:17:53,500 --> 00:17:55,456
of any mental illness,
you would think that,
376
00:17:55,480 --> 00:17:57,056
that would be something
that you could report.
377
00:17:57,080 --> 00:17:58,556
Because it is promoting
those behaviors
378
00:17:58,580 --> 00:18:00,375
and it's making it worse.
379
00:18:05,940 --> 00:18:07,158
TikTok also says it
380
00:18:07,180 --> 00:18:09,295
bans pro-eating
disorder hashtags
381
00:18:09,320 --> 00:18:11,776
so users can't search
for those videos.
382
00:18:11,800 --> 00:18:13,496
And if they try to, a number
383
00:18:13,520 --> 00:18:15,356
for the eating disorder
support service,
384
00:18:15,380 --> 00:18:18,756
the Butterfly Foundation,
automatically pops up.
385
00:18:18,780 --> 00:18:20,823
But users find ways around it.
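
The search intercept just described can be sketched as a simple blocklist check. The hashtag entries are invented examples; the gap the sketch exposes is the one described next, where a coded tag that is not on the list sails straight through.

    # Minimal sketch of a banned-hashtag search intercept.
    BANNED_HASHTAGS = {"proana", "thinspo"}  # hypothetical blocklist entries

    def search(hashtag: str) -> str:
        if hashtag.lower() in BANNED_HASHTAGS:
            # Show a support resource instead of any videos.
            return "Support is available from the Butterfly Foundation."
        return f"results for #{hashtag}"

    print(search("thinspo"))        # intercepted: support message shown
    print(search("famous_singer"))  # an evasive, innocuous-looking tag gets through
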
386
00:18:21,790 --> 00:18:25,486
- But the issue is now
that it's ever evolving.
387
00:18:25,510 --> 00:18:27,836
Like there's a hashtag
now that people
388
00:18:27,860 --> 00:18:29,356
with eating disorders use.
389
00:18:29,380 --> 00:18:31,106
And you would never guess that
390
00:18:31,130 --> 00:18:33,096
it was an eating
disorder hashtag.
391
00:18:33,120 --> 00:18:34,720
Like it's after a famous singer.
392
00:18:36,700 --> 00:18:40,046
So just changing them
to be completely irrelevant
393
00:18:40,070 --> 00:18:42,328
from what an eating disorder is.
394
00:18:42,670 --> 00:18:44,386
And so it's so hard
to escape now.
395
00:18:44,410 --> 00:18:46,496
And I think it's
really hard for TikTok
396
00:18:46,520 --> 00:18:47,818
to keep up with that all.
397
00:18:48,360 --> 00:18:51,066
- There are mechanisms in place
398
00:18:51,090 --> 00:18:52,446
to screen some of that content.
399
00:18:52,470 --> 00:18:55,646
But a lot of it is also reliant
on human moderation.
400
00:18:55,670 --> 00:18:58,026
And when you consider
the amount of videos
401
00:18:58,050 --> 00:19:00,039
and the volume that is
being uploaded to TikTok,
402
00:19:00,063 --> 00:19:02,176
it's a very difficult
task to imagine
403
00:19:02,200 --> 00:19:04,100
human moderators
can catch everything.
404
00:19:09,526 --> 00:19:12,286
Last year,
TikTok established a council
405
00:19:12,310 --> 00:19:14,366
of outside experts to advise
406
00:19:14,390 --> 00:19:16,773
the company about
content moderation.
407
00:19:17,730 --> 00:19:19,553
David Polgar is one of them.
408
00:19:20,940 --> 00:19:23,226
- As we know with great power
comes great responsibility.
409
00:19:23,250 --> 00:19:25,566
There's a lot of power
in TikTok's algorithm.
410
00:19:25,590 --> 00:19:28,196
Therefore you have
to constantly be aware
411
00:19:28,220 --> 00:19:31,836
of how it's impacting
other individuals
412
00:19:31,860 --> 00:19:33,261
and other communities.
413
00:19:33,980 --> 00:19:37,060
I think comparatively
speaking TikTok
414
00:19:37,082 --> 00:19:39,886
has done a pretty decent job
415
00:19:39,910 --> 00:19:44,016
with being more
reflective on rabbit holes
416
00:19:44,040 --> 00:19:47,986
and how that can
affect individuals.
417
00:19:48,010 --> 00:19:53,010
But at the same time, you're
dealing with human behaviour.
418
00:19:53,180 --> 00:19:55,656
You're dealing with bad actors.
419
00:19:55,680 --> 00:19:58,106
You're dealing with
major differences
420
00:19:58,130 --> 00:20:02,653
of how people define appropriate
versus inappropriate.
421
00:20:02,790 --> 00:20:04,106
And we have this tricky kind
422
00:20:04,130 --> 00:20:07,013
of balancing act that's
constantly happening.
423
00:20:09,700 --> 00:20:13,380
- TikTok's business model
is built on creating a fun,
424
00:20:13,410 --> 00:20:16,356
glossy and glamorous
version of the world.
425
00:20:16,380 --> 00:20:19,526
And the company has been found
to strictly control content
426
00:20:19,550 --> 00:20:21,446
that doesn't fit
with that image.
427
00:20:21,470 --> 00:20:24,936
In March last year, TikTok
policy documents were leaked,
428
00:20:24,960 --> 00:20:27,456
showing content
moderators were instructed
429
00:20:27,480 --> 00:20:29,796
to suppress posts by creators
430
00:20:29,820 --> 00:20:32,663
considered ugly,
poor or disabled.
431
00:20:36,460 --> 00:20:38,288
The documents said,
"Videos, including
432
00:20:38,312 --> 00:20:40,786
people who are chubby or obese
433
00:20:40,810 --> 00:20:42,096
with ugly facial looks,
434
00:20:42,120 --> 00:20:44,946
like too many wrinkles
or facial deformities
435
00:20:44,970 --> 00:20:47,806
and other disabilities
should be excluded."
436
00:20:47,830 --> 00:20:51,403
TikTok has said it no longer
engages in these practices.
437
00:20:55,899 --> 00:20:57,206
- I don't want to
admit it, but looks
438
00:20:57,230 --> 00:20:58,326
have a lot to do with it.
439
00:20:58,350 --> 00:21:00,706
And you know, we're
all secretly a bit vain.
440
00:21:00,730 --> 00:21:02,396
As much as you don't
wanna admit it,
441
00:21:02,420 --> 00:21:04,806
you go for looks over
non-looks, you know.
442
00:21:04,830 --> 00:21:06,596
So I think looks definitely
have a lot to do with it.
443
00:21:06,620 --> 00:21:09,426
And if you look at all the
really big time influencers,
444
00:21:09,450 --> 00:21:10,546
they're all beautiful.
445
00:21:10,570 --> 00:21:12,259
Like, if you look at
all these influencers,
446
00:21:12,283 --> 00:21:14,376
they're all stunning, like
nothing wrong with them.
447
00:21:14,400 --> 00:21:16,865
So I think looks definitely
have a lot to do with it.
448
00:21:30,347 --> 00:21:31,968
Much of TikTok's popularity
449
00:21:31,992 --> 00:21:33,584
is driven by dance trends,
450
00:21:33,620 --> 00:21:35,826
choreographed by black creators
451
00:21:35,850 --> 00:21:38,302
and then copied
by white influencers.
452
00:21:40,290 --> 00:21:42,026
But black content makers say
453
00:21:42,050 --> 00:21:45,343
that the platform actively
discriminates against them.
454
00:21:47,220 --> 00:21:49,746
Think it's high
time we let black women
455
00:21:49,770 --> 00:21:52,126
on this app also be famous
for doing the bare minimum.
456
00:21:52,150 --> 00:21:54,836
Like I should be able
to just sit here in silence,
457
00:21:54,860 --> 00:21:56,006
and let y'all look at me
458
00:21:56,030 --> 00:21:58,830
and the next thing you know,
I have a million followers.
459
00:22:00,180 --> 00:22:01,540
- Petition for black
people for the rest
460
00:22:01,560 --> 00:22:03,206
of April to stop talking.
461
00:22:03,230 --> 00:22:04,856
- There have been instances
462
00:22:04,880 --> 00:22:08,116
of black creator-led
mass walk-offs
463
00:22:08,140 --> 00:22:09,756
from the platform
called Blackouts.
464
00:22:09,780 --> 00:22:11,166
Where on a certain day,
465
00:22:11,190 --> 00:22:12,726
black creators will
stop using the platform
466
00:22:12,750 --> 00:22:15,896
or urge other creators
to leave the platform
467
00:22:15,920 --> 00:22:19,466
because of TikTok's inaction
and failure to respond to
468
00:22:19,490 --> 00:22:21,696
or engage with
some of the criticisms
469
00:22:21,720 --> 00:22:25,224
and the discourse that
black creators have raised.
470
00:22:25,294 --> 00:22:30,869
So if the company continues
to be reactive and responsive,
471
00:22:30,980 --> 00:22:33,826
rather than proactive and
really meaningfully engage,
472
00:22:33,850 --> 00:22:36,125
then these issues are
gonna continue to occur.
473
00:22:39,260 --> 00:22:41,356
- Often, it makes me
quite furious, I guess,
474
00:22:41,380 --> 00:22:42,886
'cause it's like
these black creators,
475
00:22:42,910 --> 00:22:45,526
they got talent, they're
out here dancing
476
00:22:45,550 --> 00:22:47,566
and showing what
they're capable of.
477
00:22:47,880 --> 00:22:50,246
So it's kind of very
much disappointing
478
00:22:50,270 --> 00:22:53,516
and hard on us when
we're out here expected
479
00:22:53,540 --> 00:22:55,976
to have all of these in
order to get the views
480
00:22:56,000 --> 00:22:57,956
in order to get the
likes and shares.
481
00:22:57,980 --> 00:23:00,066
But no matter how much we try,
482
00:23:00,090 --> 00:23:02,022
we're just not gonna get that.
483
00:23:08,620 --> 00:23:12,896
- Unice Wani is an 18-year-old
TikTok creator from Perth.
484
00:23:13,010 --> 00:23:14,800
- I like this one.
Look and see.
485
00:23:15,120 --> 00:23:16,456
I feel like the more I go viral,
486
00:23:16,480 --> 00:23:20,496
the more I can basically
show the younger generation
487
00:23:20,520 --> 00:23:23,216
and show more
colored girls, I guess,
488
00:23:23,240 --> 00:23:27,016
or people out there like
I'm okay in my own skin
489
00:23:27,040 --> 00:23:28,726
and I love myself the way I am.
490
00:23:28,750 --> 00:23:31,236
I don't care what social
media says about me.
491
00:23:31,260 --> 00:23:34,146
What people on the other side
of the screen says about me.
492
00:23:34,170 --> 00:23:36,163
You can be yourself
at the end of the day.
493
00:23:36,668 --> 00:23:38,136
Let me quickly address this-
494
00:23:38,160 --> 00:23:39,326
- As her following grew,
495
00:23:39,350 --> 00:23:41,816
so did the hateful comments.
496
00:23:41,840 --> 00:23:44,816
And she decided to
confront the issue on the app.
497
00:23:44,840 --> 00:23:46,636
- So a majority of you
guys still feel the need
498
00:23:46,660 --> 00:23:49,226
to comment about my skin
color and about how dark I am
499
00:23:49,250 --> 00:23:51,616
and about how black, black,
black, black, black I am.
500
00:23:51,620 --> 00:23:52,654
Well, guess what?
501
00:23:52,679 --> 00:23:54,523
I'm black and I'm so proud.
502
00:23:57,891 --> 00:24:00,516
- Unice says often
her videos are hidden
503
00:24:00,540 --> 00:24:02,356
or muted from the TikTok feed.
504
00:24:02,380 --> 00:24:03,946
Meaning few people see them.
505
00:24:03,970 --> 00:24:05,917
A practice known
as Shadow Banning.
506
00:24:05,941 --> 00:24:07,617
- Are you pressed?
507
00:24:07,641 --> 00:24:09,147
Are you mad?
508
00:24:09,171 --> 00:24:10,598
Are you upset?
509
00:24:10,622 --> 00:24:11,622
Are you sad?
510
00:24:12,920 --> 00:24:14,076
Sorry, what?
511
00:24:14,100 --> 00:24:17,236
I guess you tend to
get a lot of shadow bans
512
00:24:17,260 --> 00:24:20,556
for speaking up about
stuff such as racism.
513
00:24:20,580 --> 00:24:21,866
Stuff you couldn't mention.
514
00:24:21,890 --> 00:24:24,476
One word, black.
You could say all of this
515
00:24:24,500 --> 00:24:26,876
and your video could
get shadow banned.
516
00:24:26,900 --> 00:24:32,300
When you post a video,
the video, it's just on the app.
517
00:24:32,330 --> 00:24:34,536
It's just, you're not
gonna get any views for it.
518
00:24:34,560 --> 00:24:36,916
So you can see it. It's just
other people can't see it
519
00:24:36,940 --> 00:24:39,286
when they go onto
your account as well.
520
00:24:39,310 --> 00:24:42,509
So it's up there. It's just,
it's not going to get any views.
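
In code terms, the behaviour Unice describes is a per-viewer visibility rule: the author still sees the post, while everyone else's feed quietly skips it. A minimal sketch, with invented names:

    # Illustrative shadow ban check: the video stays on the app and looks
    # normal to its author, but is filtered out of other people's feeds.
    def visible_to(viewer_id: str, author_id: str, shadow_banned: bool) -> bool:
        if not shadow_banned:
            return True
        return viewer_id == author_id  # only the author can see it

    print(visible_to("fan123", "unice", shadow_banned=True))  # False: hidden
    print(visible_to("unice", "unice", shadow_banned=True))   # True: looks normal
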
521
00:24:47,270 --> 00:24:49,506
Last year
TikTok creators noticed
522
00:24:49,530 --> 00:24:51,776
the algorithm was
suppressing posts
523
00:24:51,800 --> 00:24:55,113
with the hashtag Black
Lives Matter or George Floyd.
524
00:24:59,620 --> 00:25:04,316
- So word on the street
is that TikTok has banned
525
00:25:04,410 --> 00:25:06,169
the Black Lives Matter hashtag.
526
00:25:12,060 --> 00:25:14,332
One of those
creators was Sydney man
527
00:25:14,357 --> 00:25:16,512
Paniora Nukunuku,
who had created
528
00:25:16,540 --> 00:25:18,900
a video using a pool table
529
00:25:18,930 --> 00:25:22,296
to explain the Black Lives
Matter issue to Australians.
530
00:25:22,320 --> 00:25:23,886
- This is a white
Australia table.
531
00:25:23,910 --> 00:25:25,796
And they pretty much
had 200 years head-start
532
00:25:25,820 --> 00:25:27,517
and they had established
everything in the country.
533
00:25:27,541 --> 00:25:29,476
So their break looks like this.
534
00:25:29,500 --> 00:25:32,455
Bro, can you get home
ownership and business?
535
00:25:32,500 --> 00:25:33,540
Beautiful.
536
00:25:33,552 --> 00:25:35,056
That was spicy.
537
00:25:35,080 --> 00:25:37,161
That blew up bigger
than I thought it would.
538
00:25:37,278 --> 00:25:38,635
I just need to put this here.
539
00:25:38,660 --> 00:25:39,806
Boy, what the,
540
00:25:39,830 --> 00:25:42,586
- Don't worry. It's trauma,
injustice and discrimination.
541
00:25:42,610 --> 00:25:44,056
But I said, sorry,
so it should be fine.
542
00:25:44,080 --> 00:25:45,230
So just go for it, bro.
543
00:25:46,520 --> 00:25:48,526
It was the biggest video
at the time that I've done.
544
00:25:48,540 --> 00:25:50,060
I think you're just
being lazy, hey.
545
00:25:50,085 --> 00:25:51,596
I don't know why.
546
00:25:51,620 --> 00:25:53,962
Oh, I do know why,
because it was good.
547
00:25:53,980 --> 00:25:55,361
I shouldn't look at
the camera, but I'm
548
00:25:55,385 --> 00:25:56,981
just really proud right now.
549
00:25:57,220 --> 00:25:58,396
Using these two cue balls,
550
00:25:58,420 --> 00:25:59,616
I'll explain to you.
- That resulted
551
00:25:59,640 --> 00:26:01,136
in my account getting banned
552
00:26:01,160 --> 00:26:02,776
for like seven days.
553
00:26:02,800 --> 00:26:03,856
I don't know why.
554
00:26:03,880 --> 00:26:07,786
They claimed that my video
breached community guidelines,
555
00:26:07,810 --> 00:26:10,926
which is extremely vague
because there is no swearing,
556
00:26:10,950 --> 00:26:13,079
there is no explicit language.
557
00:26:13,103 --> 00:26:17,536
There's no nudity or
explicit like sexual stuff.
558
00:26:17,560 --> 00:26:20,139
None of that.
And my account got banned.
559
00:26:20,900 --> 00:26:22,566
- Black Lives Matter
is trending on TikTok,
560
00:26:22,590 --> 00:26:24,976
which is ironic considering
how much time TikTok spends
561
00:26:25,000 --> 00:26:26,696
silencing the voices
of black creators.
562
00:26:26,720 --> 00:26:28,966
TikTok apologized
for suppressing hashtags,
563
00:26:28,990 --> 00:26:30,786
referring to Black Lives Matter,
564
00:26:30,849 --> 00:26:32,815
blaming a glitch
in the algorithm.
565
00:26:33,003 --> 00:26:34,803
- Let's take a moment
of silence for this man.
566
00:26:34,939 --> 00:26:37,019
- The company responded
with a new initiative
567
00:26:37,050 --> 00:26:40,056
for black creators called the
TikTok Black Creator Programme.
568
00:26:40,080 --> 00:26:42,926
I've spoken to creators
who had been approached
569
00:26:42,950 --> 00:26:46,566
for that programme, who
felt that it was lip service.
570
00:26:46,590 --> 00:26:49,836
It wasn't really a
well-meaning effort
571
00:26:49,860 --> 00:26:52,826
to engage with black voices
and engage with discourse
572
00:26:52,850 --> 00:26:54,800
that is important to
black communities.
573
00:26:56,840 --> 00:27:00,835
Paniora has more than
180,000 followers on TikTok.
574
00:27:01,500 --> 00:27:03,996
He often posts about
living with a disability.
575
00:27:04,020 --> 00:27:05,326
- So growing up
with the fake leg,
576
00:27:05,350 --> 00:27:06,876
I always got in trouble
every time I park
577
00:27:06,900 --> 00:27:08,216
in my disabled spot.
578
00:27:08,240 --> 00:27:10,396
The first video I
did, was me going up
579
00:27:10,420 --> 00:27:13,516
to a pool and telling
my friends to record me,
580
00:27:13,540 --> 00:27:16,476
dip my fake leg in the
water to test the water out.
581
00:27:16,500 --> 00:27:18,236
It was a really dumb idea.
582
00:27:18,260 --> 00:27:19,816
But for some reason,
people loved it.
583
00:27:19,840 --> 00:27:21,606
And in this space
of eight hours,
584
00:27:21,630 --> 00:27:24,353
it hit about 780,000 views.
585
00:27:25,190 --> 00:27:29,486
If you have this many
followers and that many likes,
586
00:27:29,510 --> 00:27:30,916
it's 'cause you're pretty.
587
00:27:30,940 --> 00:27:32,986
If you have this many followers
588
00:27:33,010 --> 00:27:34,783
and the same amount of likes,
589
00:27:38,410 --> 00:27:39,410
you're just funny.
590
00:27:41,280 --> 00:27:44,106
Paniora ran into trouble
with the TikTok censors
591
00:27:44,130 --> 00:27:46,766
when he posted a
video of a confrontation
592
00:27:46,790 --> 00:27:48,296
with someone
who was telling him,
593
00:27:48,320 --> 00:27:50,896
he shouldn't have
a disability permit.
594
00:27:50,920 --> 00:27:54,326
- So this old lady had
the nerve to ask me
595
00:27:54,350 --> 00:27:57,063
if this is my disability card.
596
00:27:58,900 --> 00:28:00,186
This,
597
00:28:00,210 --> 00:28:02,816
I wonder if this is enough.
598
00:28:02,840 --> 00:28:04,696
The video was taken down.
599
00:28:04,720 --> 00:28:07,906
TikTok said it breached the
app's community guidelines.
600
00:28:07,930 --> 00:28:10,606
Paniora appealed
and it was put back up.
601
00:28:10,630 --> 00:28:11,886
But he's had other videos
602
00:28:11,910 --> 00:28:14,456
about his disability
removed as well.
603
00:28:14,480 --> 00:28:15,796
- You don't need
to worry about it.
604
00:28:15,820 --> 00:28:18,020
The video got taken down
and I didn't even know it
605
00:28:18,030 --> 00:28:20,056
until I looked back
at the hashtags
606
00:28:20,080 --> 00:28:22,386
and decided to see which
videos that I've done
607
00:28:22,410 --> 00:28:24,808
have like made it to the
top and that wasn't there.
608
00:28:25,260 --> 00:28:28,386
I appealed it and I don't
know why that was taken down.
609
00:28:28,420 --> 00:28:29,656
Don't ever do that again.
610
00:28:29,820 --> 00:28:31,906
Do I feel like TikTok
is being racist?
611
00:28:31,930 --> 00:28:32,930
I don't know.
612
00:28:34,190 --> 00:28:36,602
Has TikTok been
hit up in the past,
613
00:28:36,660 --> 00:28:39,086
around the moderators being told
614
00:28:39,100 --> 00:28:44,580
to limit the exposure of
disabled people and ugly people?
615
00:28:44,590 --> 00:28:46,216
Yes. They've been
called out on that.
616
00:28:46,240 --> 00:28:47,426
Is this happening again?
617
00:28:47,450 --> 00:28:51,016
I hope not, but it
definitely feels like it has.
618
00:28:51,040 --> 00:28:54,186
We know
that to decolonize Palestine
619
00:28:54,210 --> 00:28:56,516
means also to decolonize-
620
00:28:56,540 --> 00:28:58,986
- I'll probably keep
moving, get some shots.
621
00:28:59,010 --> 00:29:00,016
In May of this year,
622
00:29:00,040 --> 00:29:03,406
Paniora posted a video
from a pro-Palestine rally.
623
00:29:03,430 --> 00:29:05,636
But TikTok's
algorithm flagged it.
624
00:29:05,660 --> 00:29:07,439
And it was instantly taken down.
625
00:29:09,950 --> 00:29:13,086
Other creators posting
TikToks about Palestine
626
00:29:13,110 --> 00:29:15,855
have said they've
experienced the same thing.
627
00:29:18,198 --> 00:29:20,856
- When TikTok started
removing my videos
628
00:29:20,880 --> 00:29:22,566
about the protests in regards
629
00:29:22,590 --> 00:29:25,506
to the Palestinian
situation, I was furious.
630
00:29:25,530 --> 00:29:29,216
I was like, why? There
is nothing in these videos
631
00:29:29,240 --> 00:29:33,066
that would justify,
like a removal.
632
00:29:33,090 --> 00:29:34,333
There really isn't.
633
00:29:34,710 --> 00:29:37,326
- One of the big
problems with TikTok
634
00:29:37,350 --> 00:29:42,046
and the unique nature
of its opaque algorithm,
635
00:29:42,070 --> 00:29:45,736
is that it's very
difficult to understand
636
00:29:45,760 --> 00:29:49,534
or to recognise when
censorship is taking place.
637
00:29:49,540 --> 00:29:51,140
People came together to try to-
638
00:29:51,167 --> 00:29:54,796
- So it is possible
for content on the app
639
00:29:54,820 --> 00:29:58,676
to be promoted or demoted
without anyone knowing.
640
00:29:58,700 --> 00:30:01,566
- I'm so sick and tired of
every social media platform
641
00:30:01,590 --> 00:30:03,366
silencing Palestinian voices.
642
00:30:03,390 --> 00:30:08,066
- But we also see evidence
of how content moderation
643
00:30:08,090 --> 00:30:09,616
that takes place in China,
644
00:30:09,640 --> 00:30:13,806
how that type of
thinking is still applied
645
00:30:13,830 --> 00:30:15,843
to TikTok outside of China.
646
00:30:18,591 --> 00:30:21,445
TikTok is owned by
Chinese start-up ByteDance,
647
00:30:21,470 --> 00:30:25,098
which is believed to be
worth more than $250 billion.
648
00:30:25,490 --> 00:30:28,166
It's heavily regulated by
the Chinese government.
649
00:30:28,190 --> 00:30:30,847
And there's a Communist
Party Internal Committee
650
00:30:30,872 --> 00:30:34,696
in ByteDance, which ensures
the party's political goals
651
00:30:34,720 --> 00:30:37,009
are pursued alongside
the company's.
652
00:30:38,880 --> 00:30:40,976
- We have to be extra concerned
653
00:30:41,000 --> 00:30:45,006
about how apps like
TikTok can be used
654
00:30:45,030 --> 00:30:47,883
as a vector for censorship
and surveillance.
655
00:30:49,990 --> 00:30:52,386
- The Australian
Strategic Policy Institute
656
00:30:52,410 --> 00:30:54,566
did the first
academic investigation
657
00:30:54,590 --> 00:30:56,346
into censorship on TikTok,
658
00:30:56,370 --> 00:30:59,376
concluding the company
actively uses the algorithm
659
00:30:59,400 --> 00:31:02,660
to hide political speech
it deems controversial.
660
00:31:03,020 --> 00:31:05,616
The research was funded
by the US State Department
661
00:31:05,640 --> 00:31:08,566
and found anti-Russian
government videos
662
00:31:08,580 --> 00:31:12,035
as well as hashtags
about LGBTQI issues
663
00:31:12,060 --> 00:31:13,820
and the mass
detention of Uyghurs
664
00:31:13,860 --> 00:31:15,833
were among those
being suppressed.
665
00:31:18,510 --> 00:31:22,636
- The company has cooperated
with public security bureaus
666
00:31:22,660 --> 00:31:26,076
all throughout China
and including in Xinjiang.
667
00:31:26,100 --> 00:31:30,306
And that means that they work,
668
00:31:30,330 --> 00:31:33,176
they coordinate with
government agencies
669
00:31:33,200 --> 00:31:36,676
to ensure that the
information space in China
670
00:31:36,700 --> 00:31:39,966
is pumped full of
this propaganda.
671
00:31:39,990 --> 00:31:42,596
That shows a very rosy picture
672
00:31:42,620 --> 00:31:44,484
of what's happening in Xinjiang.
673
00:31:48,640 --> 00:31:51,296
In 2018, the then CEO of ByteDance
674
00:31:51,320 --> 00:31:53,366
was forced to
publicly apologise.
675
00:31:53,390 --> 00:31:55,496
Saying one of the
company's platforms
676
00:31:55,520 --> 00:31:59,396
had gone against China's
core socialist values.
677
00:31:59,420 --> 00:32:02,986
- We have a very
clear public statement
678
00:32:03,010 --> 00:32:04,766
from the founder of ByteDance,
679
00:32:04,790 --> 00:32:08,786
that this is something that
he's committed to doing
680
00:32:08,810 --> 00:32:11,426
and to ensuring that
the company continues
681
00:32:11,450 --> 00:32:15,316
to push this type of propaganda,
certainly inside of China.
682
00:32:15,340 --> 00:32:18,676
Whether that is then extended
out to the rest of the world
683
00:32:18,700 --> 00:32:22,916
via apps like TikTok,
is another question.
684
00:32:22,940 --> 00:32:25,183
And it's something
worth watching.
685
00:32:26,190 --> 00:32:27,626
In a statement TikTok said,
686
00:32:27,650 --> 00:32:30,216
it does not moderate
or remove content
687
00:32:30,240 --> 00:32:32,086
based on political
sensitivities.
688
00:32:32,110 --> 00:32:34,516
And has never removed
content at the request
689
00:32:34,540 --> 00:32:35,906
of the Chinese government.
690
00:32:35,930 --> 00:32:37,976
It also said it
embraces diversity
691
00:32:38,000 --> 00:32:41,006
and denied it discriminates
"against any creator
692
00:32:41,030 --> 00:32:42,943
or community on our platform."
693
00:32:46,730 --> 00:32:49,186
- We've known for a
better part of a decade,
694
00:32:49,210 --> 00:32:51,146
both here in the
US and in Australia,
695
00:32:51,170 --> 00:32:53,706
about the concerns
raised by the prevalence
696
00:32:53,730 --> 00:32:55,306
of Chinese
telecommunications companies.
697
00:32:55,330 --> 00:32:56,726
And so then the next
question became,
698
00:32:56,750 --> 00:32:58,996
well, what about all
these apps that have,
699
00:32:59,020 --> 00:33:00,906
of companies that are
headquartered in China?
700
00:33:00,930 --> 00:33:03,072
They're collecting tremendous
amounts of user data.
701
00:33:03,280 --> 00:33:06,233
They have access to
the devices of individuals.
702
00:33:07,440 --> 00:33:08,846
Jamil Jaffer is Founder
703
00:33:08,870 --> 00:33:11,996
of the National Security
Institute in Washington.
704
00:33:12,020 --> 00:33:16,016
And has advised the US
government on cybersecurity.
705
00:33:16,040 --> 00:33:18,476
- In China, it's all the
central government,
706
00:33:18,500 --> 00:33:19,506
the Communist Party.
707
00:33:19,530 --> 00:33:21,926
There's no separation
between the branches.
708
00:33:21,950 --> 00:33:24,426
And so, when these
apps have all that data,
709
00:33:24,450 --> 00:33:26,006
it's much easier for
the Chinese government
710
00:33:26,030 --> 00:33:28,436
to simply obtain
access to that data.
711
00:33:31,320 --> 00:33:32,326
- My understanding is that,
712
00:33:32,350 --> 00:33:34,916
about a quarter of
the world's population
713
00:33:34,940 --> 00:33:37,316
is a member of TikTok
if I'm not mistaken.
714
00:33:37,340 --> 00:33:39,596
So that's obviously an
enormous amount of data
715
00:33:39,620 --> 00:33:40,676
that's being generated.
716
00:33:40,700 --> 00:33:43,006
That's being
handed over for free
717
00:33:43,030 --> 00:33:45,016
to that single social network
718
00:33:45,040 --> 00:33:46,856
that has pretty
much full control
719
00:33:46,880 --> 00:33:48,876
over what it does to the data.
720
00:33:48,900 --> 00:33:52,875
It might analyze it to
generate personalized content
721
00:33:52,900 --> 00:33:55,180
for you, but it might
also use that data
722
00:33:55,190 --> 00:33:58,026
to offer technology
products and services
723
00:33:58,050 --> 00:34:01,056
to other companies
moving forward in the future.
724
00:34:23,830 --> 00:34:25,476
- Hello, it's Avani in Sydney.
725
00:34:25,500 --> 00:34:26,726
How's it going?
726
00:34:26,750 --> 00:34:28,246
- Hi.
727
00:34:28,270 --> 00:34:29,836
Anne Longfield is England's
728
00:34:29,860 --> 00:34:31,764
former children's commissioner.
729
00:34:31,780 --> 00:34:33,650
Anne's interview, take one.
730
00:34:33,660 --> 00:34:36,300
- She's representing
millions of kids on TikTok
731
00:34:36,340 --> 00:34:39,650
in the UK and Europe in a class
action against the company.
732
00:34:39,950 --> 00:34:43,036
- My claim with TikTok
at the moment is that,
733
00:34:43,060 --> 00:34:46,216
they are harvesting huge
amounts of data illegally
734
00:34:46,240 --> 00:34:48,856
without the consent of
children or their parents.
735
00:34:48,880 --> 00:34:52,056
And they aren't
giving the right level
736
00:34:52,080 --> 00:34:55,016
of transparency about
what happens to that data,
737
00:34:55,040 --> 00:34:57,993
or actually what
that data includes.
738
00:35:01,600 --> 00:35:06,086
Almost a third of TikTok's
Australian users are under 14.
739
00:35:06,110 --> 00:35:08,666
Lawyers say TikTok
takes personal information
740
00:35:08,690 --> 00:35:11,836
like phone numbers, videos,
locations, and facial data
741
00:35:11,860 --> 00:35:13,896
from kids without their consent.
742
00:35:13,920 --> 00:35:17,416
As well as the photos and
videos recorded using TikTok,
743
00:35:17,440 --> 00:35:20,033
but not uploaded or
saved to the platform.
744
00:35:21,180 --> 00:35:25,216
- Given the level of data
and the lack of transparency
745
00:35:25,240 --> 00:35:27,306
around there, it's
difficult to imagine
746
00:35:27,330 --> 00:35:28,536
that this isn't just a kind of
747
00:35:28,560 --> 00:35:30,386
information gathering service,
748
00:35:30,410 --> 00:35:34,966
which is thinly veiled as some
kind of enjoyable platform,
749
00:35:34,990 --> 00:35:38,006
which appeals to young children.
750
00:35:38,030 --> 00:35:41,516
So the real incentive here,
751
00:35:41,540 --> 00:35:43,813
when you look at it
in really cold terms,
752
00:35:43,837 --> 00:35:47,596
seems to be, to gather
as much data as possible
753
00:35:47,620 --> 00:35:50,460
to really be able
to monetize that.
754
00:35:53,180 --> 00:35:55,676
TikTok's already
been fined millions of dollars
755
00:35:55,700 --> 00:35:59,113
in the US and South Korea
for harvesting children's data.
756
00:36:01,290 --> 00:36:04,136
The company restricted
app access for children
757
00:36:04,160 --> 00:36:08,196
and has taken down millions
of under-age users' accounts.
758
00:36:08,220 --> 00:36:10,855
There's been no legal
action in Australia.
759
00:36:12,530 --> 00:36:15,156
- I think that governments
do have a responsibility
760
00:36:15,180 --> 00:36:18,066
to intervene to
ensure that children
761
00:36:18,090 --> 00:36:20,546
are protected in whatever
kind of environment they're in.
762
00:36:20,570 --> 00:36:23,106
And you see those
protections and measures
763
00:36:23,130 --> 00:36:25,356
in terms of the
physical environment,
764
00:36:25,380 --> 00:36:28,376
in terms of their
safety, you know,
765
00:36:28,400 --> 00:36:29,856
in the communities they live in,
766
00:36:29,880 --> 00:36:31,726
in the environments they are.
767
00:36:31,750 --> 00:36:34,387
But it hasn't always
been the case online.
768
00:36:34,420 --> 00:36:37,420
And some governments have
769
00:36:37,461 --> 00:36:39,529
struggled to see
what that means.
770
00:36:42,730 --> 00:36:44,254
If the case is successful,
771
00:36:44,278 --> 00:36:45,946
TikTok could have
to pay children
772
00:36:45,970 --> 00:36:49,166
from the UK and Europe
billions in compensation.
773
00:36:49,360 --> 00:36:50,826
TikTok is fighting the case.
774
00:36:50,850 --> 00:36:52,453
In a statement,
the company said,
775
00:36:52,477 --> 00:36:55,696
"Privacy and safety are
top priorities for TikTok
776
00:36:55,720 --> 00:36:58,026
and we have robust
policies, processes,
777
00:36:58,050 --> 00:37:01,736
and technologies in place
to help protect all users
778
00:37:01,760 --> 00:37:04,056
and our teenage
users in particular.
779
00:37:04,080 --> 00:37:05,836
We believe the claims lack merit
780
00:37:05,860 --> 00:37:08,627
and intend to vigorously
defend the action."
781
00:37:11,630 --> 00:37:14,006
The US government
is reviewing TikTok
782
00:37:14,030 --> 00:37:15,928
and hasn't ruled out a ban.
783
00:37:16,340 --> 00:37:17,585
- The real question
you ask is, what
784
00:37:17,609 --> 00:37:19,106
about the national
security implications?
785
00:37:19,130 --> 00:37:21,426
So, okay. Yes, a lot of
people are using it, right.
786
00:37:21,450 --> 00:37:22,536
But why does that matter?
787
00:37:22,560 --> 00:37:24,986
And it matters, I think,
because of the access
788
00:37:25,010 --> 00:37:27,110
it gives you to this
large amount of data.
789
00:37:28,290 --> 00:37:31,416
You never think about the
Chinese government in Beijing
790
00:37:31,440 --> 00:37:34,136
having videos of
you in your home,
791
00:37:34,160 --> 00:37:36,936
outside your home, at
the park with your kids,
792
00:37:36,960 --> 00:37:38,333
knowing who your kids play with.
793
00:37:38,760 --> 00:37:40,666
I mean, that's
what they have now
794
00:37:40,690 --> 00:37:42,186
potentially with this data set.
795
00:37:42,210 --> 00:37:44,516
We've seen now two
consecutive presidents
796
00:37:44,540 --> 00:37:45,616
sign executive orders,
797
00:37:45,640 --> 00:37:47,976
making clear that they
are very concerned
798
00:37:48,000 --> 00:37:49,996
about the national
security implications
799
00:37:50,020 --> 00:37:51,954
of TikTok's data collection.
800
00:37:52,249 --> 00:37:54,025
As well as the impact it has
801
00:37:54,050 --> 00:37:56,250
on the privacy and civil
liberties of Americans.
802
00:38:01,550 --> 00:38:03,574
India has
announced a ban on TikTok.
803
00:38:04,050 --> 00:38:06,716
And in July last year, Prime
Minister Scott Morrison
804
00:38:06,740 --> 00:38:10,223
ordered a review by intelligence
agencies into the app.
805
00:38:12,750 --> 00:38:16,503
- We are always
very mindful of those risks
806
00:38:16,527 --> 00:38:20,326
and we are always
monitoring them very closely.
807
00:38:20,350 --> 00:38:22,246
And if we consider
there is a need
808
00:38:22,270 --> 00:38:24,836
to take further action
than we are taking now,
809
00:38:24,860 --> 00:38:27,073
then I can tell you, we
won't be shy about it.
810
00:38:28,720 --> 00:38:31,016
- It's certainly a
security concern
811
00:38:31,040 --> 00:38:33,506
that the data of
Australian users
812
00:38:33,530 --> 00:38:36,726
is potentially going
back to Beijing.
813
00:38:37,260 --> 00:38:39,780
TikTok maintained
that this is not the case.
814
00:38:39,810 --> 00:38:44,486
And our analysis showed that
815
00:38:44,510 --> 00:38:46,546
there's certainly
not a fire hose
816
00:38:46,570 --> 00:38:50,016
of content that's being
sent back to Beijing.
817
00:38:50,040 --> 00:38:51,386
But that doesn't mean
818
00:38:51,410 --> 00:38:53,486
that content can't be accessed
819
00:38:53,510 --> 00:38:55,531
from Beijing, if
that's required.
820
00:38:55,820 --> 00:38:56,826
How are you going?
821
00:38:56,874 --> 00:38:58,406
- I'm good, thank
you. How are you?
822
00:38:58,431 --> 00:39:00,735
TikTok maintains
Australian users' data
823
00:39:00,760 --> 00:39:03,426
is held on servers in
the US and Singapore.
824
00:39:03,450 --> 00:39:05,456
And that it has
never provided data
825
00:39:05,480 --> 00:39:06,923
to the Chinese government.
826
00:39:08,480 --> 00:39:11,036
Staff in the Home Affairs
and Defence Departments
827
00:39:11,060 --> 00:39:14,016
have been told not to
have TikTok on their phones
828
00:39:14,040 --> 00:39:15,842
because of security risks.
829
00:39:16,470 --> 00:39:19,196
But Scott Morrison said
there wasn't enough evidence
830
00:39:19,220 --> 00:39:21,037
to ban TikTok in Australia.
831
00:39:21,780 --> 00:39:24,276
- The scope of the
investigation did seem
832
00:39:24,300 --> 00:39:25,996
to be quite limited.
833
00:39:26,020 --> 00:39:31,075
And that scope is not
really enough to be able
834
00:39:31,100 --> 00:39:34,110
to tell the rest of Australia
835
00:39:34,180 --> 00:39:36,196
and regular Australian citizens,
836
00:39:36,220 --> 00:39:39,786
whether it's a good idea
for them to be using the app.
837
00:39:41,150 --> 00:39:44,556
- There should definitely
be another more rigorous
838
00:39:44,580 --> 00:39:47,046
and lengthy review into TikTok
839
00:39:47,070 --> 00:39:51,478
to fully understand the
risks that TikTok presents.
840
00:39:52,300 --> 00:39:53,973
- And so if you just look
at TikTok in isolation,
841
00:39:53,997 --> 00:39:55,683
and say, well, it's
just this one app,
842
00:39:55,707 --> 00:39:57,516
and it's just kids
doing dancing videos.
843
00:39:57,540 --> 00:39:58,836
It seems innocuous.
844
00:39:58,860 --> 00:40:01,706
But it really takes place
in this much larger context
845
00:40:01,730 --> 00:40:04,486
of data collection,
artificial intelligence,
846
00:40:04,510 --> 00:40:08,046
and a real effort by the
Chinese to consolidate influence
847
00:40:08,070 --> 00:40:09,602
in the region and
across the globe.
848
00:40:13,450 --> 00:40:14,942
In just two years,
849
00:40:14,980 --> 00:40:17,875
TikTok has cemented
itself as the app of choice
850
00:40:17,900 --> 00:40:20,016
for millions of Australians.
851
00:40:20,040 --> 00:40:22,916
- So you guys kept telling
me to go on The Voice 2021.
852
00:40:22,940 --> 00:40:25,446
So, I did.
853
00:40:25,470 --> 00:40:26,926
There are serious concerns
854
00:40:26,950 --> 00:40:30,406
that TikTok's fun and
beautiful version of reality
855
00:40:30,430 --> 00:40:33,103
is distorting the way
we see the world.
856
00:40:34,680 --> 00:40:38,967
And questions about whether
its users understand the risks.
857
00:40:41,280 --> 00:40:44,136
- So we're really at risk
of having generations
858
00:40:44,160 --> 00:40:46,306
of young people
that haven't been able
859
00:40:46,330 --> 00:40:49,416
to form their own
identity in natural ways.
860
00:40:49,440 --> 00:40:51,396
And instead have
formed identities
861
00:40:51,420 --> 00:40:54,026
in response to something
that a technology
862
00:40:54,050 --> 00:40:56,456
or a technology
platform prescribes
863
00:40:56,480 --> 00:40:58,476
to be the normal
or the new normal.
864
00:40:58,500 --> 00:41:00,177
- Last time this lady
came up to me and goes,
865
00:41:00,201 --> 00:41:01,673
"You don't look disabled enough."
866
00:41:01,820 --> 00:41:02,980
I don't look disabled enough?
867
00:41:03,010 --> 00:41:05,296
- I understand that
TikTok is trying its very best
868
00:41:05,320 --> 00:41:07,706
to make the platform
palatable to everyone
869
00:41:07,730 --> 00:41:12,730
by just having fun dance
videos and lip sync videos.
870
00:41:16,290 --> 00:41:18,946
But I know that my
content gives value
871
00:41:18,970 --> 00:41:21,156
to so many people
who look like me.
872
00:41:21,180 --> 00:41:23,876
Who live the same life as me.
873
00:41:23,900 --> 00:41:25,595
Who are brown like me.
874
00:41:33,750 --> 00:41:37,976
- I ended up cutting off
TikTok like after a few months.
875
00:41:38,000 --> 00:41:40,186
But even with that,
like it still left me
876
00:41:40,210 --> 00:41:42,526
with the eating
disorder, you know.
877
00:41:42,550 --> 00:41:45,336
Like, TikTok kind of led to
the development and then
878
00:41:45,360 --> 00:41:48,923
it has taken a really,
really long time to fix that.
879
00:41:50,238 --> 00:41:52,275
TikTok isn't out
here to help people.
880
00:41:52,300 --> 00:41:53,716
I don't think it's
coming to the world
881
00:41:53,740 --> 00:41:55,396
with this intention
of helping people.
882
00:41:55,420 --> 00:41:57,226
If they're going to make
money off of something,
883
00:41:57,250 --> 00:41:59,646
then they will make
money off of something.
884
00:42:00,120 --> 00:42:02,246
I think they maybe
need to realise
885
00:42:02,270 --> 00:42:04,170
the impact that it's
having on people.