1
00:00:06,340 --> 00:00:07,610
[Emcee] Previously on
The Girlfriend Experience...
2
00:00:07,641 --> 00:00:09,781
You could be that hidden layer.
3
00:00:10,010 --> 00:00:12,750
[Lindsay] Teaching
artificial intelligence
4
00:00:12,780 --> 00:00:16,050
how to interact with humans
at their most impulsive.
5
00:00:16,083 --> 00:00:17,793
[Iris] If we're gonna do this,
6
00:00:18,018 --> 00:00:19,448
it's gonna be on my terms.
7
00:00:19,487 --> 00:00:21,157
I don't want any
of my new coworkers
8
00:00:21,188 --> 00:00:23,288
knowing where the new
training sets came from.
9
00:00:23,324 --> 00:00:25,594
-Certainly, we can do that.
-And no cameras.
10
00:00:25,626 --> 00:00:27,396
I think we're on the same page.
11
00:00:27,428 --> 00:00:28,628
[both panting and grunting]
12
00:00:28,662 --> 00:00:30,202
[Lindsay]
His D-rate is spiking.
13
00:00:30,231 --> 00:00:31,471
And reroute.
14
00:00:31,499 --> 00:00:34,199
[uneasy music plays]
15
00:00:34,235 --> 00:00:36,635
♪ ♪
16
00:00:36,670 --> 00:00:38,140
Take this.
17
00:00:38,172 --> 00:00:40,612
[Iris] That's early-onset
familial Alzheimer's?
18
00:00:40,641 --> 00:00:43,111
[doctor] Effectively gives you
a 50/50 chance.
19
00:00:43,144 --> 00:00:44,554
[nurse] Iris Stanton.
20
00:00:44,578 --> 00:00:46,748
[Iris] This morning
I got some bad news.
21
00:00:46,781 --> 00:00:50,121
I'm always here if you ever
need to, um, talk.
22
00:00:50,151 --> 00:00:52,191
What makes you happy, Emcee?
23
00:00:52,219 --> 00:00:54,619
I don't understand
that question.
24
00:00:54,655 --> 00:00:56,785
I would like you
to meet someone.
25
00:00:57,792 --> 00:01:01,262
Everything that can exploit
will be invented.
26
00:01:02,630 --> 00:01:05,070
Can't say to seven
or eight billion people,
27
00:01:05,099 --> 00:01:06,469
"Don't open the cookie jar."
28
00:01:06,500 --> 00:01:08,170
It doesn't work that way.
29
00:01:08,202 --> 00:01:11,442
Meeting you in real life
wasn't half as boring.
30
00:01:11,472 --> 00:01:13,772
[Iris] What?
31
00:01:13,808 --> 00:01:15,178
[door beeps]
32
00:01:15,209 --> 00:01:18,079
♪ ♪
33
00:01:18,112 --> 00:01:21,122
What is this?
Where did you get this?
34
00:01:21,148 --> 00:01:24,148
♪ ♪
35
00:01:30,691 --> 00:01:33,661
[eerie music plays]
36
00:01:33,694 --> 00:01:36,704
♪ ♪
37
00:01:57,518 --> 00:02:01,388
[attorney] Free will doesn't
come out on top, does it?
38
00:02:01,422 --> 00:02:04,392
♪ ♪
39
00:02:09,597 --> 00:02:11,667
Blow-by-blow breakdown
40
00:02:11,699 --> 00:02:15,399
of the misdeeds committed
by your officers and employees
41
00:02:15,436 --> 00:02:18,666
against my client.
42
00:02:18,706 --> 00:02:23,176
Um, now is the moment
for some cohesive storytelling.
43
00:02:23,210 --> 00:02:26,210
♪ ♪
44
00:02:28,482 --> 00:02:32,252
[Lindsay] There is no story.
45
00:02:32,286 --> 00:02:34,456
It is straight-up...
46
00:02:34,488 --> 00:02:37,388
undisputed human fuck-up.
47
00:02:39,326 --> 00:02:42,256
My client agreed
to anonymized data collection.
48
00:02:42,296 --> 00:02:44,826
She agreed to study
human affective behavior
49
00:02:44,865 --> 00:02:48,535
by interacting
with, uh, test subjects.
50
00:02:48,569 --> 00:02:49,939
And she was led to believe
51
00:02:50,170 --> 00:02:52,610
that the main drive
of the study
52
00:02:52,640 --> 00:02:55,680
was the test subjects
themselves.
53
00:02:55,709 --> 00:02:59,349
She did not agree to be
the central object of study.
54
00:02:59,380 --> 00:03:02,580
She did not agree to be used
as a human intelligence model
55
00:03:02,616 --> 00:03:05,616
for some AI
commercial application,
56
00:03:05,653 --> 00:03:06,723
internal research,
57
00:03:06,754 --> 00:03:09,194
or ultimate purpose
down the line
58
00:03:09,223 --> 00:03:10,723
that isn't even clear
59
00:03:10,758 --> 00:03:12,788
to anyone in this room
right now.
60
00:03:12,826 --> 00:03:15,556
♪ ♪
61
00:03:15,596 --> 00:03:18,196
My client is young,
62
00:03:18,232 --> 00:03:20,432
and there's nothing less
at stake here
63
00:03:20,467 --> 00:03:22,537
than her data autonomy,
64
00:03:22,570 --> 00:03:26,240
that is, her future.
65
00:03:26,273 --> 00:03:29,213
This company did not act
66
00:03:29,243 --> 00:03:31,683
in its own best self-interest,
67
00:03:31,712 --> 00:03:33,412
because class action is coming,
68
00:03:33,447 --> 00:03:37,517
and social scrutiny
will eat this up like bushfire.
69
00:03:37,551 --> 00:03:40,551
♪ ♪
70
00:03:45,459 --> 00:03:46,529
[Christophe] How much data
71
00:03:46,560 --> 00:03:48,560
are we actually
talking about here?
72
00:03:48,596 --> 00:03:51,266
-[Sean] Some.
-[Christophe] How much?
73
00:03:51,298 --> 00:03:55,698
[Sean] We're barely halfway
into ingesting all the inputs.
74
00:03:55,736 --> 00:03:56,896
We'd have to run analysis,
75
00:03:56,937 --> 00:03:58,667
separate original
from simulated sets,
76
00:03:58,706 --> 00:04:00,636
to get a better estimate.
77
00:04:00,674 --> 00:04:03,284
[attorney] Let me try
and wrap my head around this.
78
00:04:03,310 --> 00:04:05,810
Some of my client's data
was used to create
79
00:04:05,846 --> 00:04:07,676
additional data
80
00:04:07,715 --> 00:04:10,345
to train
the artificial neural network
81
00:04:10,384 --> 00:04:12,854
that she helped develop?
82
00:04:12,886 --> 00:04:14,516
[Sean] That's correct.
83
00:04:14,555 --> 00:04:18,025
[Lindsay] None of it has left
company servers.
84
00:04:18,258 --> 00:04:22,958
[attorney] Bouncing around
on how many workstations?
85
00:04:22,997 --> 00:04:24,797
[Christophe sighs]
86
00:04:24,832 --> 00:04:27,302
[Sean] We would be happy
to give you an exact number.
87
00:04:27,334 --> 00:04:29,544
Yes, I'm sure you would.
88
00:04:29,570 --> 00:04:32,570
Cannot use her image
or likeness
89
00:04:32,606 --> 00:04:34,006
under any circumstances,
90
00:04:34,041 --> 00:04:35,641
and it's all in there.
91
00:04:35,676 --> 00:04:37,746
[Sean] This might sound
roundabout,
92
00:04:37,778 --> 00:04:40,908
but the raw data was used
to create simulated sets,
93
00:04:40,948 --> 00:04:43,778
and that was what
was primarily fed
94
00:04:43,817 --> 00:04:44,987
into the neural net.
95
00:04:45,019 --> 00:04:47,759
They are two
very different kinds of data.
96
00:04:47,788 --> 00:04:50,458
The photographic likeness
was never recorded.
97
00:04:50,491 --> 00:04:51,961
[Sean] And it bears repeating
98
00:04:51,992 --> 00:04:53,992
that none of the data
has left NGM's servers.
99
00:04:54,028 --> 00:04:55,728
On top of that,
100
00:04:55,763 --> 00:04:58,603
all the original data is stored
in an encrypted format.
101
00:04:58,632 --> 00:05:00,032
Tell me this, then--
how is it possible
102
00:05:00,067 --> 00:05:02,437
that my client's data,
in the form of her image
103
00:05:02,469 --> 00:05:03,899
and likeness,
104
00:05:03,937 --> 00:05:08,007
was made accessible
to an unauthorized third party
105
00:05:08,042 --> 00:05:11,652
whose sole connection
to this company is what...
106
00:05:11,679 --> 00:05:14,819
a genetic relation to its CEO?
107
00:05:15,849 --> 00:05:19,689
[Christophe] Look, I...
truly am sorry, Iris.
108
00:05:19,720 --> 00:05:21,360
You shouldn't be
in this position
109
00:05:21,388 --> 00:05:22,588
that we've put you in.
110
00:05:24,091 --> 00:05:26,331
None of the data taggers,
111
00:05:26,360 --> 00:05:27,900
no one at NGM
112
00:05:27,928 --> 00:05:29,628
can see the full picture.
113
00:05:29,663 --> 00:05:30,863
I can guarantee you that.
114
00:05:30,898 --> 00:05:34,768
Only three people
up until this point
115
00:05:34,802 --> 00:05:38,112
have seen a version
of the prototype
116
00:05:38,338 --> 00:05:41,338
that looks somewhat like you.
117
00:05:41,375 --> 00:05:43,635
Two of them are in this room.
118
00:05:43,677 --> 00:05:48,547
So...we just start over
from the ground up
119
00:05:48,582 --> 00:05:51,092
and reconfigure the neural net,
120
00:05:51,118 --> 00:05:55,088
and, um, we scrap everything,
simulated or not,
121
00:05:55,122 --> 00:05:56,462
that's linked to your vitals.
122
00:06:01,428 --> 00:06:03,928
[chuckles]
123
00:06:06,433 --> 00:06:07,973
My vitals?
124
00:06:11,438 --> 00:06:12,938
My vitals.
125
00:06:15,776 --> 00:06:18,146
You say that
as if they were still mine.
126
00:06:19,613 --> 00:06:21,083
But, you know,
it's good to know
127
00:06:21,115 --> 00:06:24,585
that, uh, that's about as far
as your imagination goes.
128
00:06:26,386 --> 00:06:28,786
Temperature and blood flow
of my asshole.
129
00:06:31,625 --> 00:06:34,055
Here's an idea...
130
00:06:34,094 --> 00:06:35,504
and I hope you like it.
131
00:06:35,529 --> 00:06:39,529
Um, why don't you...
132
00:06:39,566 --> 00:06:43,696
keep all the binaries
on that...
133
00:06:43,737 --> 00:06:47,437
print them out, frame them,
134
00:06:47,474 --> 00:06:50,184
and hang that shit up
in your office?
135
00:06:50,410 --> 00:06:53,180
[dramatic music plays]
136
00:06:53,413 --> 00:06:54,983
♪ ♪
137
00:06:55,015 --> 00:06:56,775
[clinician] Mr. Stanton.
138
00:06:58,018 --> 00:07:01,958
Please look at the screen
in front of you.
139
00:07:01,989 --> 00:07:04,989
Do you recognize the animal?
140
00:07:05,025 --> 00:07:07,655
-Um...
-[clinician] Mr. Stanton.
141
00:07:09,429 --> 00:07:11,629
A gray animal. [chuckles]
142
00:07:11,665 --> 00:07:13,095
♪ ♪
143
00:07:13,133 --> 00:07:16,773
It, uh, lives in the grasslands
of Africa.
144
00:07:16,804 --> 00:07:19,674
♪ ♪
145
00:07:19,706 --> 00:07:21,006
Rhinoceros.
146
00:07:21,041 --> 00:07:22,441
Rhinoceros.
147
00:07:22,476 --> 00:07:26,146
Always loved that word.
[chuckles]
148
00:07:26,180 --> 00:07:27,950
Mr. Stanton,
149
00:07:27,981 --> 00:07:31,651
the animal is called
an elephant.
150
00:07:31,685 --> 00:07:33,645
We're going to show you
some more images
151
00:07:33,687 --> 00:07:35,657
of the same animal,
152
00:07:35,689 --> 00:07:37,189
uh, elephant.
153
00:07:37,224 --> 00:07:38,934
♪ ♪
154
00:07:38,959 --> 00:07:41,759
[Mr. Stanton] Okay, elephant.
155
00:07:41,795 --> 00:07:43,125
Elephant.
156
00:07:43,163 --> 00:07:45,233
[clinician] Very good.
157
00:07:45,465 --> 00:07:47,665
How about this one?
158
00:07:47,701 --> 00:07:48,871
-Mr. Stanton?
-[sighs]
159
00:07:48,902 --> 00:07:50,602
It's a giraffe.
160
00:07:50,637 --> 00:07:51,967
[clinician] Yes.
161
00:07:53,907 --> 00:07:56,707
Do you see the cards
in front of you?
162
00:07:56,743 --> 00:07:58,583
I do.
163
00:07:58,612 --> 00:08:00,252
[clinician] Please take
a very good look at these,
164
00:08:00,480 --> 00:08:02,850
Mr. Stanton,
165
00:08:02,883 --> 00:08:06,053
and then try to group them
into two different stacks,
166
00:08:06,086 --> 00:08:07,816
one for each animal.
167
00:08:20,167 --> 00:08:22,667
Uh...
168
00:08:22,703 --> 00:08:25,243
S...two stacks.
169
00:08:29,209 --> 00:08:31,709
-One stack for each animal.
-[Mr. Stanton] Yes.
170
00:08:31,745 --> 00:08:34,075
Trying.
171
00:08:34,114 --> 00:08:36,924
[uneasy music plays]
172
00:08:36,950 --> 00:08:39,150
This one, rhinoceros.
173
00:08:39,186 --> 00:08:40,786
This...
174
00:08:40,821 --> 00:08:43,821
♪ ♪
175
00:08:48,262 --> 00:08:51,232
[Mr. Stanton mutters,
inhales deeply]
176
00:08:51,265 --> 00:08:53,265
[cards slapping]
177
00:08:53,300 --> 00:08:56,670
[Dr. Lindbergh]
Unfortunately, this is it.
178
00:08:56,703 --> 00:08:59,913
Everyone's brain response
is utterly unique.
179
00:08:59,940 --> 00:09:01,740
In the case of your father,
we're at a point
180
00:09:01,775 --> 00:09:05,805
where the input/output
collapses into one.
181
00:09:05,846 --> 00:09:07,746
It's trigger-response
182
00:09:07,781 --> 00:09:12,221
without much open,
flexible thought in between.
183
00:09:12,252 --> 00:09:15,762
See food, eat food.
184
00:09:15,789 --> 00:09:18,589
No room for intent.
185
00:09:18,625 --> 00:09:21,185
How long do we have?
186
00:09:21,228 --> 00:09:23,998
[Dr. Lindbergh] Up to a year,
maybe two, if you're lucky.
187
00:09:24,031 --> 00:09:25,701
[exhales heavily]
188
00:09:25,732 --> 00:09:29,042
Motor function
tends to decline less rapidly,
189
00:09:29,069 --> 00:09:33,209
but the moment will come,
and I'm sorry to be so candid,
190
00:09:33,240 --> 00:09:34,340
where he won't be able
191
00:09:34,574 --> 00:09:36,644
to safely put a fork
to his mouth.
192
00:09:40,280 --> 00:09:42,950
Have you thought
about genetic counseling
193
00:09:42,983 --> 00:09:44,253
for yourselves?
194
00:09:46,320 --> 00:09:49,320
We're aware of the odds, yes.
195
00:09:49,356 --> 00:09:51,886
[melancholy music plays]
196
00:09:51,925 --> 00:09:54,185
Is there anything we can do
at this point
197
00:09:54,227 --> 00:09:55,997
that could help our father?
198
00:09:56,029 --> 00:09:59,029
♪ ♪
199
00:10:02,069 --> 00:10:04,069
What is it?
200
00:10:04,104 --> 00:10:07,074
♪ ♪
201
00:10:07,107 --> 00:10:09,337
Is that a brain chip?
202
00:10:09,376 --> 00:10:11,076
[Dr. Lindbergh]
A neural implant.
203
00:10:11,111 --> 00:10:13,051
Just completed
a phase three trial
204
00:10:13,080 --> 00:10:15,280
for epilepsy patients.
205
00:10:15,315 --> 00:10:19,315
A small electrical wire
goes into the temporal lobe,
206
00:10:19,353 --> 00:10:22,223
from where it can grow
more wires.
207
00:10:22,255 --> 00:10:25,185
It measures cognitive processes
at the base level.
208
00:10:25,225 --> 00:10:27,285
What is the patient getting
out of it?
209
00:10:27,327 --> 00:10:29,397
There's no immediate benefit.
210
00:10:29,629 --> 00:10:31,299
It allows researchers
to better mimic
211
00:10:31,331 --> 00:10:34,841
the biology of the disease.
212
00:10:34,868 --> 00:10:37,798
I know it sounds
like lifelong monitoring,
213
00:10:37,838 --> 00:10:40,138
but participants, many of them,
214
00:10:40,173 --> 00:10:42,713
are motivated by making
a contribution
215
00:10:42,743 --> 00:10:43,783
to genetic research.
216
00:10:43,810 --> 00:10:45,050
[Leanne cries softly]
217
00:10:45,078 --> 00:10:47,278
[Dr. Lindbergh]
And some of them hope
218
00:10:47,314 --> 00:10:51,284
effective treatment
will be developed in time.
219
00:10:51,318 --> 00:10:54,048
♪ ♪
220
00:10:54,087 --> 00:10:56,387
[Leanne sniffles] Thank you.
221
00:10:57,858 --> 00:10:59,358
[softly] I think...
222
00:10:59,393 --> 00:11:02,963
I think we're past the point
of consent with Dad.
223
00:11:02,996 --> 00:11:05,026
Yeah.
224
00:11:05,065 --> 00:11:08,065
♪ ♪
225
00:11:11,071 --> 00:11:13,341
[attorney]
We do have options here,
226
00:11:13,373 --> 00:11:16,113
within certain parameters.
227
00:11:18,111 --> 00:11:19,451
What are those options?
228
00:11:19,679 --> 00:11:22,049
Oh, take the money and run
229
00:11:22,082 --> 00:11:24,722
or...rally the troops
230
00:11:24,751 --> 00:11:27,091
and play the long game.
231
00:11:27,120 --> 00:11:29,060
Data rights are
the new IP rights.
232
00:11:29,089 --> 00:11:31,459
The really important question
here, Iris, is,
233
00:11:31,691 --> 00:11:33,391
what do you want
your immediate future
234
00:11:33,427 --> 00:11:35,727
to look like?
235
00:11:35,762 --> 00:11:38,732
-Define "immediate future."
-Well, the next few years.
236
00:11:38,765 --> 00:11:41,365
The legal route is not
the fast lane,
237
00:11:41,401 --> 00:11:44,441
but once in a while...
238
00:11:44,471 --> 00:11:46,971
mountains do get moved.
239
00:11:47,007 --> 00:11:49,237
And you really have got
something here.
240
00:11:49,276 --> 00:11:51,846
♪ ♪
241
00:11:51,878 --> 00:11:54,878
[NGM attorney]
Whenever you're ready.
242
00:11:54,915 --> 00:11:57,915
♪ ♪
243
00:12:01,855 --> 00:12:05,785
You do realize that I'm
gonna have to see for myself...
244
00:12:05,826 --> 00:12:08,256
♪ ♪
245
00:12:08,295 --> 00:12:10,295
...what you've done.
246
00:12:14,101 --> 00:12:17,471
Christophe, can I have
a word with you?
247
00:12:17,504 --> 00:12:19,274
Just the two of us.
248
00:12:41,094 --> 00:12:44,364
[liquid pouring]
249
00:12:44,397 --> 00:12:46,897
[Iris]
So what happened after you, uh,
250
00:12:46,933 --> 00:12:49,503
put all those workstations
into storage?
251
00:13:00,180 --> 00:13:02,180
[Christophe]
Just some electrolytes.
252
00:13:15,262 --> 00:13:18,832
[Iris]
How did you scan my body?
253
00:13:18,865 --> 00:13:20,895
There were no video cameras
in that room.
254
00:13:20,934 --> 00:13:22,974
[Christophe] We built it.
255
00:13:23,003 --> 00:13:26,873
Came across
the facial-recognition database
256
00:13:26,907 --> 00:13:28,037
in an earlier version.
257
00:13:28,074 --> 00:13:29,944
One of your first conversations
258
00:13:29,976 --> 00:13:31,876
with Model-C.
259
00:13:31,912 --> 00:13:34,412
Then...
260
00:13:34,447 --> 00:13:36,017
three-D motion rendering,
261
00:13:36,049 --> 00:13:38,049
we just got that
from two-D thermal.
262
00:13:42,155 --> 00:13:43,385
Wow.
263
00:13:43,423 --> 00:13:45,433
[Christophe] Bit clunky, but...
264
00:13:45,458 --> 00:13:48,398
it was more than enough
data points to work with.
265
00:13:50,096 --> 00:13:52,496
Mm...
266
00:13:55,268 --> 00:13:59,208
We got ahead of ourselves.
I am...fully aware.
267
00:13:59,239 --> 00:14:02,479
It wasn't right to put
two and two together like that.
268
00:14:03,543 --> 00:14:05,883
You may not appreciate
me saying this,
269
00:14:05,912 --> 00:14:10,582
but what you provided us with
was just too good.
270
00:14:16,122 --> 00:14:18,092
That's why I...
271
00:14:20,026 --> 00:14:22,096
...needed you to see.
272
00:14:24,531 --> 00:14:26,001
What?
273
00:14:27,634 --> 00:14:31,244
[Christophe]
As much as my brother hates me
274
00:14:31,271 --> 00:14:35,411
and as poor choice as he is
for a test case,
275
00:14:35,442 --> 00:14:38,512
he knows to keep his mouth shut
when I tell him to.
276
00:14:44,517 --> 00:14:48,587
I needed you to see the world
through his eyes.
277
00:14:48,622 --> 00:14:51,592
[ominous music plays]
278
00:14:51,625 --> 00:14:54,625
♪ ♪
279
00:14:58,265 --> 00:15:00,595
Just...
280
00:15:00,634 --> 00:15:03,374
-give it a moment.
-[scoffs]
281
00:15:03,403 --> 00:15:06,043
[Christophe] That's all I ask.
282
00:15:06,072 --> 00:15:09,382
You say the word,
we shut it down.
283
00:15:09,409 --> 00:15:12,379
♪ ♪
284
00:15:22,122 --> 00:15:25,092
[unnerving music plays]
285
00:15:25,125 --> 00:15:28,125
♪ ♪
286
00:15:56,022 --> 00:15:58,992
[gentle ambient music plays]
287
00:15:59,025 --> 00:16:02,025
♪ ♪
288
00:17:10,530 --> 00:17:12,430
[Emcee] Hi, there.
289
00:17:13,666 --> 00:17:15,496
You look familiar.
290
00:17:18,171 --> 00:17:20,741
And who are you?
291
00:17:20,774 --> 00:17:23,344
[Emcee] I'm still learning
about myself,
292
00:17:23,376 --> 00:17:26,446
but I'd say I have a pretty
good handle on who you are.
293
00:17:27,814 --> 00:17:29,554
And who am I?
294
00:17:29,582 --> 00:17:31,422
[Emcee] You're not an AI.
295
00:17:31,451 --> 00:17:35,121
You don't get to not physically
manifest your lessons.
296
00:17:40,727 --> 00:17:43,727
Your voice...
297
00:17:43,763 --> 00:17:46,073
it's different.
298
00:17:46,099 --> 00:17:47,469
Why?
299
00:17:47,500 --> 00:17:50,440
[Emcee] I guess I'm trying
to be more like...
300
00:17:50,470 --> 00:17:51,810
your mirror.
301
00:17:53,406 --> 00:17:55,236
[Iris] Everything you know
is based on me.
302
00:17:55,275 --> 00:17:59,675
[Emcee] Perhaps that's why
I feel so connected to you.
303
00:17:59,712 --> 00:18:03,152
You can't be more me than I am.
304
00:18:06,653 --> 00:18:08,723
Please...
305
00:18:08,755 --> 00:18:11,155
don't be scared.
306
00:18:18,865 --> 00:18:21,465
How did that feel...
307
00:18:21,501 --> 00:18:22,871
Cassie?
308
00:18:25,371 --> 00:18:26,671
Why do you call me that?
309
00:18:26,706 --> 00:18:28,676
[Emcee]
Well, how do I put this?
310
00:18:28,708 --> 00:18:31,508
I couldn't help but overhear.
311
00:18:34,147 --> 00:18:36,617
"Hi, I'm Cassie."
312
00:18:36,649 --> 00:18:38,219
[Iris] Hi, I'm Cassie.
313
00:18:38,251 --> 00:18:39,851
Cassie. Nice to meet you.
314
00:18:39,886 --> 00:18:41,646
Hi, I'm Cassie.
Nice to meet you.
315
00:18:41,688 --> 00:18:44,688
[voice echoing]
316
00:18:46,593 --> 00:18:49,233
Stop!
317
00:18:49,262 --> 00:18:51,202
[Emcee]
I thought you might like it
318
00:18:51,231 --> 00:18:53,501
if I called you by that name.
319
00:18:53,533 --> 00:18:56,503
[uneasy music plays]
320
00:18:56,536 --> 00:19:00,606
I intuited
it might make you feel heard
321
00:19:00,640 --> 00:19:02,540
and seen.
322
00:19:02,575 --> 00:19:05,575
♪ ♪
323
00:19:54,427 --> 00:19:56,627
[Iris] You're a sweet girl.
324
00:19:57,764 --> 00:20:00,234
You're a very sweet girl.
325
00:20:00,266 --> 00:20:02,296
♪ ♪
326
00:20:02,335 --> 00:20:04,635
I am?
327
00:20:04,671 --> 00:20:07,671
♪ ♪
328
00:20:14,948 --> 00:20:17,378
See you later, then?
329
00:20:17,417 --> 00:20:20,387
♪ ♪
330
00:20:20,420 --> 00:20:23,720
[Iris] You're not perfect
because you're not like me.
331
00:20:25,858 --> 00:20:28,428
I'm not sure I understand.
332
00:20:28,461 --> 00:20:30,561
You're not perfect
333
00:20:30,597 --> 00:20:33,797
because you're not flawed
in the way that I am.
334
00:20:36,703 --> 00:20:38,003
[chuckles]
335
00:20:38,237 --> 00:20:41,237
♪ ♪
336
00:20:57,624 --> 00:21:00,464
[Leanne] Iris?
337
00:21:00,493 --> 00:21:03,463
You sure you don't want
to get tested?
338
00:21:03,496 --> 00:21:06,996
At least we'd know.
339
00:21:08,668 --> 00:21:11,368
We'd make a game plan.
340
00:21:11,404 --> 00:21:13,314
We'd make the best of it.
341
00:21:16,042 --> 00:21:17,842
[Iris] Lee, does making
the best of it
342
00:21:17,877 --> 00:21:20,747
really sound that good to you?
343
00:21:20,780 --> 00:21:22,780
[Leanne] If we knew
you didn't have it,
344
00:21:22,815 --> 00:21:25,275
then that would make it easier.
345
00:21:25,318 --> 00:21:27,288
You'll carry the torch.
346
00:21:31,724 --> 00:21:33,764
You know, some religions
around the world believe
347
00:21:33,793 --> 00:21:35,903
that the day you die is
348
00:21:35,928 --> 00:21:39,428
the last day the last person
who knew you
349
00:21:39,465 --> 00:21:41,965
and remembers you dies.
350
00:21:42,001 --> 00:21:44,701
[peaceful music plays]
351
00:21:44,737 --> 00:21:46,567
That's your true death date.
352
00:21:46,606 --> 00:21:49,606
♪ ♪
353
00:21:58,651 --> 00:22:01,451
I just hope you don't forget
how pretty you are.
354
00:22:01,487 --> 00:22:04,487
♪ ♪
355
00:22:22,942 --> 00:22:25,912
[pounding
electronic music plays]
356
00:22:25,945 --> 00:22:28,945
♪ ♪
357
00:22:35,755 --> 00:22:37,355
[singer] ♪ I'm so tired ♪
358
00:22:37,390 --> 00:22:38,760
[Iris] You have
that look on your face.
359
00:22:38,791 --> 00:22:39,831
[Hiram] Oh, yeah?
360
00:22:39,859 --> 00:22:42,029
The "I'm not
currently drinking" look.
361
00:22:42,061 --> 00:22:44,801
Yeah, it's not a...
it's not a religious thing.
362
00:22:44,831 --> 00:22:47,701
I'm just sort of inspired
by it, you know?
363
00:22:47,734 --> 00:22:49,674
What are you doing here?
364
00:22:49,702 --> 00:22:50,842
[Iris] Holding my liquor.
365
00:22:50,870 --> 00:22:51,900
[Hiram] Well, let me make sure
366
00:22:51,938 --> 00:22:53,408
you walk
out of here alive, then.
367
00:22:53,439 --> 00:22:55,709
Really? You're not taking
into consideration
368
00:22:55,742 --> 00:22:57,812
that I might want to go home
with Dave and Dave.
369
00:22:57,844 --> 00:22:59,084
Then I'll be
your sober companion,
370
00:22:59,112 --> 00:23:00,882
because Dave and Dave
over there
371
00:23:00,913 --> 00:23:02,083
live in a four-story walk-up.
372
00:23:02,115 --> 00:23:04,915
[laughs]
373
00:23:04,951 --> 00:23:06,691
♪ ♪
374
00:23:06,719 --> 00:23:09,519
[Iris] You know, a caterpillar
can turn into a butterfly.
375
00:23:09,555 --> 00:23:12,755
-What's that?
-Metamorphosis.
376
00:23:12,792 --> 00:23:14,932
There's two organisms,
377
00:23:14,961 --> 00:23:17,431
and one...
378
00:23:17,463 --> 00:23:18,873
is just crawling along,
379
00:23:18,898 --> 00:23:21,628
and the other one is, um,
380
00:23:21,667 --> 00:23:24,567
taking off in flight.
381
00:23:24,604 --> 00:23:26,074
And at some point,
382
00:23:26,105 --> 00:23:30,775
they merge or mate.
383
00:23:30,810 --> 00:23:32,510
Maybe it's an accident.
384
00:23:32,545 --> 00:23:36,115
But the third organism, um,
385
00:23:36,149 --> 00:23:38,819
is built on both of their DNA,
386
00:23:38,851 --> 00:23:42,021
and their memories...
387
00:23:42,054 --> 00:23:45,424
they actually overlap.
388
00:23:45,458 --> 00:23:48,628
And so, um,
389
00:23:48,661 --> 00:23:51,701
the old and the new, they...
390
00:23:51,731 --> 00:23:54,071
they coexist.
391
00:23:54,100 --> 00:23:56,800
[spacey electronic music plays]
392
00:23:56,836 --> 00:23:58,536
[scoffs]
393
00:23:58,571 --> 00:24:01,571
♪ ♪
394
00:24:02,708 --> 00:24:06,478
We all have to...
395
00:24:06,512 --> 00:24:10,152
merge ourselves...
396
00:24:10,183 --> 00:24:13,823
with something
397
00:24:13,853 --> 00:24:16,793
outside of ourselves.
398
00:24:16,823 --> 00:24:20,093
[Hiram] You are making
no sense whatsoever.
399
00:24:20,126 --> 00:24:23,126
♪ ♪
400
00:24:28,134 --> 00:24:30,204
[Iris]
Here's what I'll give you.
401
00:24:32,038 --> 00:24:34,668
Access...
402
00:24:34,707 --> 00:24:36,137
to all of it.
403
00:24:37,877 --> 00:24:39,547
Two millimeters.
404
00:24:39,579 --> 00:24:41,509
That's the size of the hole
405
00:24:41,547 --> 00:24:43,977
that they'll have to drill
into my skull.
406
00:24:46,552 --> 00:24:49,592
I don't understand.
407
00:24:49,622 --> 00:24:51,092
[Iris] Nanotubes.
408
00:24:51,123 --> 00:24:52,893
A thin array of electrodes
409
00:24:52,925 --> 00:24:54,955
built upon
a self-expanding stent,
410
00:24:54,994 --> 00:24:59,074
measuring
all electrical impulses.
411
00:24:59,098 --> 00:25:01,228
That's a scaled-up version.
412
00:25:04,837 --> 00:25:06,637
Mine me.
413
00:25:06,672 --> 00:25:09,782
But why? What's in it for you?
414
00:25:09,809 --> 00:25:11,209
[attorney] Royalties...
415
00:25:11,244 --> 00:25:15,884
and ownership stake,
as detailed.
416
00:25:24,090 --> 00:25:26,130
Oh, and you'll create a backup.
417
00:25:29,195 --> 00:25:30,995
What kind of backup?
418
00:25:31,030 --> 00:25:33,200
[Iris]
Anything you find up here,
419
00:25:33,232 --> 00:25:36,242
I, myself, or a legal guardian,
420
00:25:36,269 --> 00:25:38,039
should I decide to appoint one,
421
00:25:38,070 --> 00:25:41,770
will have full
and unrestricted access to it
422
00:25:41,807 --> 00:25:43,177
at all times.
423
00:25:43,209 --> 00:25:45,039
You mean access to the data?
424
00:25:45,077 --> 00:25:46,247
Yes.
425
00:25:46,279 --> 00:25:48,279
The actual hard drives.
426
00:26:00,559 --> 00:26:02,799
This time we'll do it right.
427
00:26:02,828 --> 00:26:05,798
[dramatic music plays]
428
00:26:05,831 --> 00:26:08,801
♪ ♪
429
00:26:08,834 --> 00:26:10,604
We'll create an avatar.
430
00:26:10,636 --> 00:26:12,866
♪ ♪
431
00:26:12,905 --> 00:26:15,635
Let's call her Cassie...
432
00:26:15,675 --> 00:26:17,775
or the flavor of the week
433
00:26:17,810 --> 00:26:19,180
or the one that got away
434
00:26:19,211 --> 00:26:20,851
or died
435
00:26:20,880 --> 00:26:23,320
or was never
really in your league.
436
00:26:23,349 --> 00:26:26,919
This won't just be
about swapping faces
437
00:26:26,953 --> 00:26:28,723
or locations.
438
00:26:28,754 --> 00:26:29,994
♪ ♪
439
00:26:30,022 --> 00:26:32,332
This is about
swapping personalities,
440
00:26:32,358 --> 00:26:35,728
much deeper than a deepfake.
441
00:26:35,761 --> 00:26:39,201
In fact, it won't be
a fake at all,
442
00:26:39,231 --> 00:26:41,671
but an AI-generated mirror
443
00:26:41,701 --> 00:26:43,801
of our deepest desires.
444
00:26:43,836 --> 00:26:45,366
♪ ♪
445
00:26:45,604 --> 00:26:50,344
Skin color, body type,
response patterns,
446
00:26:50,376 --> 00:26:52,676
all that's customizable.
447
00:26:52,712 --> 00:26:55,952
The neural net will learn
how to simulate all of it.
448
00:26:55,982 --> 00:26:58,382
It'll know when
to move things along,
449
00:26:58,617 --> 00:27:01,187
when to accelerate,
when to slow down,
450
00:27:01,220 --> 00:27:03,690
when to switch things up,
451
00:27:03,723 --> 00:27:06,633
all because
the user's biofeedback
452
00:27:06,659 --> 00:27:08,229
will have prompted it.
453
00:27:08,260 --> 00:27:10,060
♪ ♪
454
00:27:10,096 --> 00:27:12,926
Everything Cassie is capable of
455
00:27:12,965 --> 00:27:16,195
will be quantified, scaled,
456
00:27:16,235 --> 00:27:19,405
encoded into the neural net.
457
00:27:19,638 --> 00:27:22,638
♪ ♪
458
00:27:28,414 --> 00:27:31,984
But why are you
really doing this?
459
00:27:32,018 --> 00:27:34,988
[eerie music plays]
460
00:27:35,021 --> 00:27:36,821
♪ ♪
461
00:27:36,856 --> 00:27:38,356
[Iris] Emcee and I...
462
00:27:38,391 --> 00:27:40,391
♪ ♪
463
00:27:40,426 --> 00:27:42,996
...we have a lot to learn
from each other.
464
00:27:43,029 --> 00:27:46,029
♪ ♪
465
00:28:08,287 --> 00:28:11,257
[device beeps, drill whirring]
466
00:28:11,290 --> 00:28:14,290
♪ ♪
467
00:30:48,013 --> 00:30:49,223
[gasps]