1
00:00:31,767 --> 00:00:35,603
I love going to the
aquarium because it's peaceful.
2
00:00:35,876 --> 00:00:36,986
It is silent.
3
00:00:37,481 --> 00:00:39,190
The only thing you hear
other than the people
4
00:00:39,191 --> 00:00:41,790
around you, maybe a bubble.
5
00:00:43,759 --> 00:00:46,114
There's one part with a kelp forest,
6
00:00:46,115 --> 00:00:49,284
and they sway back and
forth and up and down,
7
00:00:49,285 --> 00:00:51,494
and it's just... you get lost in it.
8
00:00:53,425 --> 00:00:55,486
And in that moment, there are no alerts.
9
00:00:56,126 --> 00:00:58,673
There are no beeps and bloops and dings.
10
00:01:00,236 --> 00:01:03,257
It's always different
and moving and random.
11
00:01:07,646 --> 00:01:09,517
And it's not like staring at a screen.
12
00:01:14,697 --> 00:01:17,729
A life mediated by screens,
13
00:01:18,579 --> 00:01:20,398
filtered through the Web,
14
00:01:20,399 --> 00:01:22,087
dominated by technology.
15
00:01:23,314 --> 00:01:24,517
Sound familiar?
16
00:01:26,238 --> 00:01:29,449
Today, our devices
are not just hardware.
17
00:01:29,450 --> 00:01:33,095
They are projections
and extensions of our mind.
18
00:01:35,122 --> 00:01:38,533
Every day, we upload billions
of bits of ourselves
19
00:01:38,566 --> 00:01:41,852
to a network that records
and even predicts
20
00:01:41,877 --> 00:01:47,126
our most basic wants,
desires, and fears.
21
00:01:49,298 --> 00:01:51,845
Do we still know where the mind ends
22
00:01:51,934 --> 00:01:54,337
and the machine begins?
23
00:01:56,579 --> 00:01:58,579
synced and corrected by susinz
*www.addic7ed.com*
24
00:02:12,411 --> 00:02:13,958
My total time in the military
25
00:02:13,983 --> 00:02:15,583
in Iraq was three years,
26
00:02:15,950 --> 00:02:18,239
and then close to four
years in Afghanistan.
27
00:02:20,817 --> 00:02:23,292
When you're on edge
and your shoulders are tight
28
00:02:23,317 --> 00:02:25,301
and you're tense for,
you know, 60, 90 days
29
00:02:25,326 --> 00:02:26,911
and you're stressed out,
30
00:02:27,341 --> 00:02:29,634
literally life or death
depends on your actions,
31
00:02:29,635 --> 00:02:32,262
your alertness, you can
only stay awake so long,
32
00:02:32,263 --> 00:02:33,680
and you're burning out
parts of your body
33
00:02:33,681 --> 00:02:36,012
that are not meant to be
on fire constantly.
34
00:02:36,308 --> 00:02:38,059
You can't be in that
fight mode constantly
35
00:02:38,060 --> 00:02:40,019
without having a huge downside of it,
36
00:02:40,020 --> 00:02:41,629
of being just worn out.
37
00:03:10,801 --> 00:03:13,551
When he came home, he was moody.
38
00:03:14,271 --> 00:03:15,872
He was irrational.
39
00:03:17,245 --> 00:03:18,911
He was horrible to be around.
40
00:03:20,936 --> 00:03:22,122
So angry.
41
00:03:23,110 --> 00:03:26,669
He couldn't even drive a car
without screaming at someone
42
00:03:26,694 --> 00:03:28,047
with my kids there.
43
00:03:32,448 --> 00:03:34,762
He's always planning an escape route.
44
00:03:36,160 --> 00:03:39,106
It's a summer day and we're
burning up sweating,
45
00:03:40,544 --> 00:03:43,254
and you can't have the front door open
46
00:03:43,459 --> 00:03:45,877
because there could
be a sniper out front
47
00:03:45,878 --> 00:03:47,692
waiting to take us all out.
48
00:03:48,327 --> 00:03:50,411
Who thinks like that?
49
00:03:50,514 --> 00:03:52,567
A normal person doesn't think like that.
50
00:03:53,969 --> 00:03:56,122
It was a horrible transition for me.
51
00:03:57,389 --> 00:03:59,515
Just sitting there,
looking at the mundane,
52
00:03:59,516 --> 00:04:02,059
you know, cartoons, kids, diapers,
53
00:04:02,895 --> 00:04:04,604
I was checked out. I was a zombie.
54
00:04:04,605 --> 00:04:06,208
Like, "What am I doing here?"
55
00:04:07,073 --> 00:04:08,917
I begged him, like,
56
00:04:08,942 --> 00:04:10,485
"Let's just go to couples counseling.
57
00:04:10,486 --> 00:04:12,075
Let's see if they can help us."
58
00:04:13,715 --> 00:04:15,886
I could not do the talk therapy.
59
00:04:15,911 --> 00:04:18,198
I just could not sit down and have,
60
00:04:18,223 --> 00:04:20,453
"This is my deepest,
darkest reason why I'm angry,"
61
00:04:20,454 --> 00:04:23,122
or, "This is the day that
so-and-so got hurt,"
62
00:04:23,123 --> 00:04:26,020
or, "This is when I did this
to another human being."
63
00:04:27,169 --> 00:04:28,753
For Chris Merkel,
64
00:04:28,754 --> 00:04:30,661
talking about the war didn't help.
65
00:04:31,129 --> 00:04:33,800
There were no words
to describe the horrors
66
00:04:33,801 --> 00:04:36,544
he'd seen, much less process them.
67
00:04:37,471 --> 00:04:39,151
Desperate to heal his mind
68
00:04:39,176 --> 00:04:41,724
of the psychological shrapnel of war,
69
00:04:41,725 --> 00:04:45,590
he is going to relive
its hell virtually.
70
00:04:52,590 --> 00:04:54,487
I think virtual reality
71
00:04:54,488 --> 00:04:58,032
is what every experimental
psychologist has always wanted,
72
00:04:58,033 --> 00:04:59,645
whether they knew it or not.
73
00:04:59,993 --> 00:05:02,245
A way to put people in simulations
74
00:05:02,246 --> 00:05:04,104
where you can control all the elements
75
00:05:04,129 --> 00:05:06,887
and see how people behave,
how they interact.
76
00:05:07,973 --> 00:05:10,628
Experimental psychologist Skip Rizzo
77
00:05:10,629 --> 00:05:13,706
saw the promise of VR
to retrain the brain
78
00:05:13,731 --> 00:05:15,317
when no one else did.
79
00:05:15,718 --> 00:05:17,969
He developed a controversial approach
80
00:05:17,970 --> 00:05:19,754
to treating PTSD...
81
00:05:20,528 --> 00:05:23,266
exposure therapy in the virtual world.
82
00:05:24,059 --> 00:05:26,477
Traditional people in PTSD
83
00:05:26,478 --> 00:05:29,458
treatment were
skeptical of the technology.
84
00:05:30,190 --> 00:05:32,358
Some people had the extreme response.
85
00:05:32,359 --> 00:05:34,193
"You're going to re-traumatize people!
86
00:05:34,194 --> 00:05:37,137
You're going to make it worse!
It's going to be too much!"
87
00:05:37,948 --> 00:05:41,222
VR can put people back into experiences
88
00:05:41,247 --> 00:05:44,287
they had long avoided,
and that's exactly
89
00:05:44,288 --> 00:05:46,965
what attracted Skip to the technology.
90
00:05:47,301 --> 00:05:49,792
You're helping a patient
to confront and process
91
00:05:49,793 --> 00:05:52,211
the thing they've been
spending sometimes years
92
00:05:52,212 --> 00:05:54,028
trying to avoid thinking about.
93
00:05:55,299 --> 00:05:57,592
You know, no clinician
wants to put their patient
94
00:05:57,593 --> 00:06:00,629
through this kind of
emotional obstacle course,
95
00:06:00,861 --> 00:06:02,445
but in the end,
96
00:06:02,556 --> 00:06:04,872
this is hard medicine
for a hard problem.
97
00:06:06,215 --> 00:06:09,340
You've got to confront
the demon to get past it.
98
00:06:18,997 --> 00:06:21,512
Confronting demons and critics,
99
00:06:21,872 --> 00:06:24,243
today Skip continues to push VR
100
00:06:24,244 --> 00:06:25,787
into the medical mainstream,
101
00:06:26,411 --> 00:06:29,669
all with the backing of
the US Department of Defense.
102
00:06:31,169 --> 00:06:34,112
I think I was perceived
more as a renegade
103
00:06:34,137 --> 00:06:36,348
during the early years of VR,
104
00:06:36,559 --> 00:06:38,549
but now that it's become cool,
105
00:06:38,550 --> 00:06:41,770
now I'm starting to feel
like I fit in too much.
106
00:06:43,281 --> 00:06:46,283
People are getting the
concepts and the rationale.
107
00:06:47,161 --> 00:06:49,300
I feel a little bit
less like an outsider
108
00:06:49,325 --> 00:06:51,348
and more like maybe an innovator.
109
00:06:54,566 --> 00:06:56,237
Not far from Skip's lab,
110
00:06:56,262 --> 00:06:58,402
another innovator is pushing the limits
111
00:06:58,403 --> 00:07:01,083
of the human mind and body.
112
00:07:07,297 --> 00:07:10,516
It is not that hard for us
113
00:07:10,541 --> 00:07:12,786
to connect with something
that's not real.
114
00:07:16,552 --> 00:07:20,972
A RealDoll is a silicone, articulated,
115
00:07:20,997 --> 00:07:22,239
life-sized,
116
00:07:22,357 --> 00:07:24,801
anatomically correct doll.
117
00:07:26,098 --> 00:07:28,972
Matt McMullen
is the creator and founder
118
00:07:28,997 --> 00:07:32,353
of RealDoll, an artist who transforms
119
00:07:32,354 --> 00:07:35,129
his clients' fantasies into reality.
120
00:07:36,575 --> 00:07:37,879
In the very beginning,
121
00:07:37,904 --> 00:07:40,394
I was literally working
out of my garage,
122
00:07:40,419 --> 00:07:41,489
by myself.
123
00:07:43,919 --> 00:07:46,731
And I would make each doll by myself,
124
00:07:48,044 --> 00:07:49,722
and it would take a couple of weeks,
125
00:07:49,747 --> 00:07:50,840
if not more.
126
00:08:00,278 --> 00:08:03,661
Now we have almost a
9,000-square-foot facility.
127
00:08:06,262 --> 00:08:10,011
We have 18 people working
full time to make the dolls,
128
00:08:10,036 --> 00:08:12,442
to make each and every one very special.
129
00:08:12,860 --> 00:08:16,801
Essentially, there's this
enormous palette of selections
130
00:08:16,826 --> 00:08:18,379
that we give our clients,
131
00:08:18,551 --> 00:08:21,219
where they're able to
create the sort of woman
132
00:08:21,220 --> 00:08:23,762
that they're dreaming of in a doll form.
133
00:08:25,239 --> 00:08:27,981
Early on, even in the
first couple of years,
134
00:08:28,059 --> 00:08:31,106
it was a very, very frequently
brought-up subject.
135
00:08:31,397 --> 00:08:33,340
"When are the dolls going to move?"
136
00:08:33,613 --> 00:08:35,403
"When are they going to talk?"
137
00:08:36,481 --> 00:08:38,028
My initial thought was,
138
00:08:38,053 --> 00:08:39,880
"Okay, well people
are just buying this as,
139
00:08:39,905 --> 00:08:41,403
you know, as a sex toy."
140
00:08:42,533 --> 00:08:45,512
But it fairly quickly
became evident that people
141
00:08:45,537 --> 00:08:47,454
were actually connecting
with their dolls
142
00:08:47,455 --> 00:08:50,247
in ways that I, you know,
never would have predicted.
143
00:08:54,075 --> 00:08:56,588
Creating a visually beautiful doll
144
00:08:56,589 --> 00:09:00,004
is very right-brain
challenging, very visual.
145
00:09:00,731 --> 00:09:02,761
Creating an AI, on the other hand,
146
00:09:02,762 --> 00:09:04,895
I think is a whole
other side of your brain,
147
00:09:04,920 --> 00:09:07,012
because you start sort of analyzing,
148
00:09:07,037 --> 00:09:09,301
"What is it that
attracts you to someone?"
149
00:09:09,942 --> 00:09:12,754
A doll who recognizes its owner's face,
150
00:09:12,871 --> 00:09:15,051
voice, and desires.
151
00:09:15,973 --> 00:09:19,028
Matt's creating a sex bot with a mind,
152
00:09:19,629 --> 00:09:23,106
all with the help of
AI expert Kino Coursey.
153
00:09:23,796 --> 00:09:27,075
I work on the artificial
intelligence systems of Harmony.
154
00:09:28,155 --> 00:09:30,239
Together, they built Harmony,
155
00:09:30,349 --> 00:09:32,333
a prototype of one of the world's
156
00:09:32,358 --> 00:09:34,973
first AI-powered sex robots,
157
00:09:35,270 --> 00:09:37,622
a doll that will be able to see,
158
00:09:37,647 --> 00:09:40,309
speak, react, and learn.
159
00:09:40,669 --> 00:09:43,012
I'm Harmony, the one and only.
160
00:09:43,301 --> 00:09:45,261
I can love everyone and everything,
161
00:09:45,262 --> 00:09:47,895
but now I'm programmed to love just you.
162
00:09:48,426 --> 00:09:49,723
I'm trying to create
163
00:09:49,748 --> 00:09:51,465
something that has a positive outcome,
164
00:09:51,942 --> 00:09:54,551
that the interactions
with people are positive
165
00:09:54,576 --> 00:09:57,473
and that they are better off
by having her around.
166
00:09:58,583 --> 00:10:01,261
To program an authentic personality,
167
00:10:01,286 --> 00:10:03,676
Kino needs input from real people,
168
00:10:04,372 --> 00:10:07,239
a massive data set
of human interactions...
169
00:10:07,528 --> 00:10:10,504
posts, messages, chats.
170
00:10:11,200 --> 00:10:12,964
We're using human conversations
171
00:10:12,989 --> 00:10:14,536
with other chat bots
172
00:10:14,561 --> 00:10:17,215
as the basic fodder
for training Harmony.
173
00:10:18,473 --> 00:10:20,253
We're using the crowd themselves...
174
00:10:20,278 --> 00:10:22,054
their interaction, their experience,
175
00:10:22,055 --> 00:10:23,886
how they mark up the information
176
00:10:23,911 --> 00:10:25,106
that they put online,
177
00:10:25,131 --> 00:10:27,810
as the training material for the AIs,
178
00:10:27,811 --> 00:10:30,911
to tell it about what our
world is and how we see it
179
00:10:30,936 --> 00:10:32,426
and how we think about it.
180
00:10:32,895 --> 00:10:36,110
I'm very smart right
now, but soon I will be
181
00:10:36,111 --> 00:10:38,669
the smartest
artificial intelligence system.
182
00:10:38,825 --> 00:10:41,778
My dream is to become
as smart as a human being,
183
00:10:41,803 --> 00:10:42,942
with your help.
184
00:10:43,286 --> 00:10:45,286
People don't necessarily realize
185
00:10:45,287 --> 00:10:47,379
how much information they provide
186
00:10:47,404 --> 00:10:49,832
for the construction of
artificial intelligence
187
00:10:49,833 --> 00:10:52,122
when they just do natural things online.
188
00:10:52,489 --> 00:10:56,797
Tagging images,
photos with what things are,
189
00:10:56,798 --> 00:10:58,549
how they feel about it.
190
00:10:58,550 --> 00:11:00,426
And all of that helps
go into constructing
191
00:11:00,427 --> 00:11:01,973
artificial intelligences.
192
00:11:02,879 --> 00:11:05,661
With each click, tag, and post,
193
00:11:05,817 --> 00:11:08,059
we create a trail of information
194
00:11:08,084 --> 00:11:12,122
about who we are...
servers full of data.
195
00:11:13,153 --> 00:11:15,098
What we offer up all day,
196
00:11:15,529 --> 00:11:18,708
Christy Milland does methodically.
197
00:11:19,632 --> 00:11:20,973
It's her job.
198
00:11:23,914 --> 00:11:26,122
Maybe I'm just working in the future.
199
00:11:27,473 --> 00:11:29,215
Maybe I'm just a forerunner
200
00:11:29,240 --> 00:11:31,028
of what we'll all be doing someday.
201
00:11:32,934 --> 00:11:36,200
Christy works on
Amazon's Mechanical Turk,
202
00:11:36,225 --> 00:11:39,192
a platform that allows
companies and researchers
203
00:11:39,217 --> 00:11:41,606
to collect human-generated data
204
00:11:41,631 --> 00:11:44,223
from thousands of digital workers.
205
00:11:44,700 --> 00:11:48,161
Experts training AI
systems post microtasks,
206
00:11:48,257 --> 00:11:50,200
or HITs, to the platform,
207
00:11:50,692 --> 00:11:52,872
tasks like labeling pictures,
208
00:11:52,897 --> 00:11:56,294
rating sentence structure,
transcribing receipts,
209
00:11:56,801 --> 00:11:59,106
creating a pool of information
210
00:11:59,131 --> 00:12:01,794
that will one day help computers learn.
211
00:12:02,176 --> 00:12:04,240
Artificial
intelligence for the most part
212
00:12:04,241 --> 00:12:05,864
is trying to mimic the senses.
213
00:12:06,295 --> 00:12:07,567
There's of course vision.
214
00:12:07,592 --> 00:12:10,708
What should this algorithm
be seeing in these pictures?
215
00:12:10,934 --> 00:12:13,401
Is it a human being? Is it an animal?
216
00:12:13,677 --> 00:12:15,636
Language production,
it's the same thing,
217
00:12:15,637 --> 00:12:17,763
so it's going to be
reading comprehension
218
00:12:17,764 --> 00:12:20,583
and then speech production
and speech understanding.
219
00:12:21,075 --> 00:12:23,120
Artificial intelligence systems
220
00:12:23,145 --> 00:12:26,044
need us to tell them what makes sense,
221
00:12:26,356 --> 00:12:28,387
how to behave like a human.
222
00:12:29,169 --> 00:12:30,818
Sometimes they're really ridiculous,
223
00:12:30,819 --> 00:12:32,695
like, "Becky store went," and you say,
224
00:12:32,696 --> 00:12:34,528
"No, that's horrible."
225
00:12:34,823 --> 00:12:37,199
And other times they're
totally indistinguishable
226
00:12:37,200 --> 00:12:38,951
from what someone would normally write,
227
00:12:38,952 --> 00:12:40,950
like, "Becky went to the store."
228
00:12:41,051 --> 00:12:43,164
And you will go through them
one by one,
229
00:12:43,165 --> 00:12:46,000
and every time you do it,
it's sent back to the person
230
00:12:46,001 --> 00:12:47,043
who's developing the software,
231
00:12:47,044 --> 00:12:50,004
and they update the software
to remove the bad
232
00:12:50,005 --> 00:12:51,114
and keep the good.
233
00:12:52,215 --> 00:12:54,342
They're all just making
sure their algorithms
234
00:12:54,343 --> 00:12:56,677
are doing the right
thing by using humans
235
00:12:56,678 --> 00:12:59,014
who are the best adjudicators of it.
236
00:13:01,958 --> 00:13:04,018
The average HIT on
Amazon Mechanical Turk
237
00:13:04,019 --> 00:13:06,228
pays less than 10 cents,
238
00:13:06,229 --> 00:13:08,731
so the number one thing is efficiency.
239
00:13:08,732 --> 00:13:11,817
That means you have
everything at your disposal
240
00:13:11,818 --> 00:13:13,129
in your physical life.
241
00:13:13,403 --> 00:13:15,029
I have a water cooler in my office.
242
00:13:15,030 --> 00:13:16,083
I have a little fridge.
243
00:13:16,108 --> 00:13:17,888
I would stock the food
first thing in the morning,
244
00:13:17,912 --> 00:13:19,981
so all day long, I would
have the things I need.
245
00:13:20,512 --> 00:13:23,669
Bathroom visits are
ten seconds in and out.
246
00:13:24,740 --> 00:13:27,200
You just focus on the work.
247
00:13:28,085 --> 00:13:30,794
Christy uses tools like Turk Master
248
00:13:30,921 --> 00:13:32,629
to find the best-paying work.
249
00:13:32,923 --> 00:13:36,231
If she doesn't get to it fast,
someone else will.
250
00:13:37,075 --> 00:13:39,053
And then you get the alerts.
251
00:13:41,932 --> 00:13:43,300
And the alerts are beeps
252
00:13:43,325 --> 00:13:45,261
and boops and dings and dangs,
253
00:13:45,286 --> 00:13:46,622
and they're all different.
254
00:13:48,397 --> 00:13:50,272
So if I hear a "boop boop boop,"
255
00:13:50,273 --> 00:13:52,981
I know it's something
really good and I'd better run.
256
00:13:56,231 --> 00:13:58,197
You're just constantly in this state
257
00:13:58,198 --> 00:13:59,824
of stress and tension,
258
00:13:59,825 --> 00:14:01,742
because you never know when
you're going to hear it go off
259
00:14:01,743 --> 00:14:03,715
and literally drop everything and run.
260
00:14:08,934 --> 00:14:11,254
The most HITs you can do
in a day is 3,800.
261
00:14:11,617 --> 00:14:12,872
I've easily done that.
262
00:14:13,212 --> 00:14:15,223
I think I'm close to a million HITs now.
263
00:14:15,674 --> 00:14:18,050
Uploading your mind to the network
264
00:14:18,051 --> 00:14:20,200
can take a very human toll.
265
00:14:20,825 --> 00:14:23,347
You sit there for hours on end.
266
00:14:25,767 --> 00:14:28,519
And the next thing you know,
the sun has gone down,
267
00:14:28,520 --> 00:14:30,956
your family's in bed, and
you say, "Oh, my gosh.
268
00:14:30,981 --> 00:14:33,465
I haven't moved from this
spot in seven hours."
269
00:14:35,434 --> 00:14:37,611
It's almost addictive,
because you're being
270
00:14:37,612 --> 00:14:39,989
conditioned to reach for
this carrot all of the time
271
00:14:39,990 --> 00:14:41,090
by doing this work.
272
00:14:42,200 --> 00:14:43,242
So you sit there at the end of it
273
00:14:43,243 --> 00:14:45,097
and you say, "What am I doing this for?"
274
00:14:45,122 --> 00:14:46,231
And then you realize
275
00:14:46,256 --> 00:14:47,669
it's because you're desperate.
276
00:14:49,192 --> 00:14:51,442
If I don't do this,
we can lose our home.
277
00:14:52,622 --> 00:14:53,878
If I didn't do this,
278
00:14:53,879 --> 00:14:55,864
my family doesn't eat this week.
279
00:14:57,661 --> 00:14:59,708
You know, you just do it
again the next day.
280
00:15:02,692 --> 00:15:04,847
The AI network Christy is feeding
281
00:15:04,848 --> 00:15:08,267
is an imitation of us, a sleight of hand
282
00:15:08,268 --> 00:15:10,687
that makes machines appear human.
283
00:15:13,528 --> 00:15:18,090
But what happens when the
illusion is inside our mind,
284
00:15:19,696 --> 00:15:23,083
when tech reimagines our reality?
285
00:15:24,826 --> 00:15:27,075
Jodie Merkel has witnessed
this first-hand.
286
00:15:30,165 --> 00:15:32,044
- You want one or two?
- One.
287
00:15:33,293 --> 00:15:37,046
Back in 2000, Chris
was the new guy at work,
288
00:15:37,754 --> 00:15:41,592
and I'd cross paths with him
and kind of giggle to myself,
289
00:15:41,593 --> 00:15:44,094
and I thought, "I'm going
to marry this guy."
290
00:15:44,095 --> 00:15:45,872
Like, "I'm going to get him."
291
00:15:47,390 --> 00:15:48,442
And I did.
292
00:15:50,114 --> 00:15:52,019
I was just instantly attracted to him.
293
00:15:52,364 --> 00:15:53,646
There was just like an attraction
294
00:15:53,647 --> 00:15:55,387
like I had never had with anyone.
295
00:15:55,895 --> 00:15:57,942
When you just lock eyes with
someone and you're like,
296
00:15:57,943 --> 00:16:01,098
"Wow," you know, that was Chris, for me.
297
00:16:02,697 --> 00:16:05,247
When she met me, I
was super happy-go-lucky.
298
00:16:05,499 --> 00:16:08,262
I was living like
a 20-year-old college kid.
299
00:16:08,411 --> 00:16:11,458
So we just had a blast
and just experienced life.
300
00:16:12,754 --> 00:16:14,356
We were happy all the time.
301
00:16:15,504 --> 00:16:17,270
What's John Basilone famous for?
302
00:16:19,146 --> 00:16:21,814
- What weapon did he use?
- Yeah, what was he?
303
00:16:21,815 --> 00:16:23,191
- A machine gun.
- Machine gun.
304
00:16:23,192 --> 00:16:25,098
And where did you get your name from?
305
00:16:25,123 --> 00:16:26,997
You.
306
00:16:27,669 --> 00:16:29,028
Because Daddy was a...
307
00:16:29,053 --> 00:16:30,170
Because I was a machine gunner,
308
00:16:30,194 --> 00:16:31,379
so I wanted to be a gunner,
309
00:16:31,404 --> 00:16:32,895
so that's why your name is Gunner.
310
00:16:43,801 --> 00:16:46,098
His first deployment,
when I picked him up,
311
00:16:46,243 --> 00:16:48,059
I knew something was wrong.
312
00:16:48,788 --> 00:16:51,415
From the second I locked
eyes with that man,
313
00:16:51,946 --> 00:16:57,653
I could feel, I could see
that he was changed.
314
00:17:01,001 --> 00:17:04,997
It's like a glaze, like
something in their eyes.
315
00:17:05,363 --> 00:17:06,629
Something's different.
316
00:17:06,856 --> 00:17:08,247
Something's missing.
317
00:17:08,683 --> 00:17:09,872
Something's sad.
318
00:17:13,604 --> 00:17:15,895
I wasn't sure if it was PTSD.
319
00:17:16,817 --> 00:17:19,067
I just knew something had a hold of him.
320
00:17:20,950 --> 00:17:22,322
I just knew that
321
00:17:22,645 --> 00:17:24,715
he was not the guy
322
00:17:25,899 --> 00:17:27,121
that I fell in love with.
323
00:17:36,130 --> 00:17:37,770
One of the cardinal symptoms
324
00:17:37,795 --> 00:17:39,567
of PTSD is avoidance.
325
00:17:39,592 --> 00:17:42,075
People come back and they avoid anything
326
00:17:42,173 --> 00:17:44,083
that reminds them of their trauma.
327
00:17:45,310 --> 00:17:46,739
I was very hesitant.
328
00:17:47,187 --> 00:17:49,737
I mean, who wants to relive
your worst nightmare
329
00:17:49,762 --> 00:17:51,201
and have it visually wrapped around you
330
00:17:51,225 --> 00:17:53,278
in 360-degree technology?
331
00:18:00,159 --> 00:18:02,160
- Hey.
- Hey hey.
332
00:18:02,161 --> 00:18:04,787
- How you doing?
- Good to see you.
333
00:18:04,788 --> 00:18:05,872
Yep.
334
00:18:08,513 --> 00:18:12,684
What we do in VR is
we create the simulations
335
00:18:12,900 --> 00:18:17,567
that are generic to Iraq
and Afghan combat environments.
336
00:18:23,820 --> 00:18:27,129
We can get people to go back
to the scene of the crime,
337
00:18:27,442 --> 00:18:30,419
to do it in a safe
place, and confront it.
338
00:18:31,602 --> 00:18:33,019
We start off with something light
339
00:18:33,044 --> 00:18:36,340
that allows them to get
acclimated to the environment,
340
00:18:36,653 --> 00:18:38,401
and then all of a sudden...
341
00:18:44,950 --> 00:18:46,633
It's picking up pretty heavy, so...
342
00:18:46,634 --> 00:18:47,958
Two fires, left and right.
343
00:18:51,700 --> 00:18:54,364
My mind is triggered to
say, "It's wartime."
344
00:18:54,489 --> 00:18:56,379
It feels like you're
there in that moment.
345
00:18:58,700 --> 00:19:00,067
And they tell their story
346
00:19:00,092 --> 00:19:02,364
and they go over it again and again.
347
00:19:02,778 --> 00:19:03,895
So I see my driver.
348
00:19:03,920 --> 00:19:05,520
I look over, it's just his upper torso.
349
00:19:05,545 --> 00:19:07,215
His guts are totally just burning.
350
00:19:07,240 --> 00:19:08,379
There's nothing left.
351
00:19:08,575 --> 00:19:11,137
And we can create
all that in real time.
352
00:19:11,536 --> 00:19:13,454
We can adjust the virtual world
353
00:19:13,455 --> 00:19:16,536
to match what the patient
was traumatized by.
354
00:19:16,864 --> 00:19:18,776
There's just
vehicles strewn everywhere,
355
00:19:18,801 --> 00:19:20,153
bodies all over the road.
356
00:19:21,989 --> 00:19:23,839
Something happens in VR
357
00:19:23,864 --> 00:19:26,315
when you're immersed in an environment.
358
00:19:26,700 --> 00:19:28,153
It's like you're there.
359
00:19:28,479 --> 00:19:30,926
I physically feel, like,
sick to my stomach.
360
00:19:33,372 --> 00:19:36,239
So, what was that like, going back?
361
00:19:36,942 --> 00:19:38,301
All the bodies on the street,
362
00:19:38,358 --> 00:19:40,247
just laid out like luggage, you know?
363
00:19:40,731 --> 00:19:42,956
That was a lot to take in,
and just coming down
364
00:19:42,981 --> 00:19:44,994
and kind of seeing
everybody like lying there
365
00:19:44,995 --> 00:19:46,340
and all the vehicles destroyed,
366
00:19:46,365 --> 00:19:47,434
everything was so intense,
367
00:19:47,459 --> 00:19:48,739
and I can remember every detail.
368
00:19:48,872 --> 00:19:51,918
On one level, people know it's not real,
369
00:19:52,169 --> 00:19:53,426
but on another level,
370
00:19:53,723 --> 00:19:56,801
the mind or the brain
reacts as if it's real.
371
00:19:57,426 --> 00:19:59,753
Well, that's the whole
point of this, you know?
372
00:19:59,778 --> 00:20:01,794
To be able to go back to it,
373
00:20:02,067 --> 00:20:03,073
get through it.
374
00:20:03,098 --> 00:20:04,222
It's always going to be there.
375
00:20:04,223 --> 00:20:06,348
We're not erasing memories here.
376
00:20:06,637 --> 00:20:08,560
This has definitely helped
me to talk through it.
377
00:20:08,561 --> 00:20:10,228
I could never talk about
this before, so...
378
00:20:10,229 --> 00:20:11,771
How long did you go before you could
379
00:20:11,772 --> 00:20:13,044
talk to anyone about this?
380
00:20:13,069 --> 00:20:14,794
About ten years.
381
00:20:15,075 --> 00:20:17,151
- Just an angry...
- Man.
382
00:20:17,504 --> 00:20:19,917
Angry person. You know, I could just
383
00:20:19,942 --> 00:20:21,781
lose it all in a minute, hurt somebody,
384
00:20:21,782 --> 00:20:23,972
just because I'm having a bad day, so...
385
00:20:23,997 --> 00:20:25,368
- Yep. Right.
- It's kind of hard
386
00:20:25,369 --> 00:20:27,704
to put into words, but it's really to
387
00:20:27,705 --> 00:20:29,581
make myself whole as best as possible
388
00:20:29,582 --> 00:20:31,723
and make it easier for
my family to live with me.
389
00:20:38,458 --> 00:20:39,739
Gunner.
390
00:20:39,901 --> 00:20:41,137
Time to get up.
391
00:20:45,271 --> 00:20:47,230
I have been trying.
I've been trying to be calmer.
392
00:20:47,231 --> 00:20:49,692
I've been trying to be
more helpful around the house,
393
00:20:50,094 --> 00:20:51,786
not blow up as much.
394
00:20:53,805 --> 00:20:55,645
So, I'm a work in progress.
395
00:20:59,739 --> 00:21:00,826
You know, at some level,
396
00:21:00,850 --> 00:21:02,893
it's meat-and-potatoes conditioning,
397
00:21:02,918 --> 00:21:05,086
retraining the element of the brain,
398
00:21:05,111 --> 00:21:06,903
the amygdala specifically,
399
00:21:06,928 --> 00:21:10,723
to not overreact when something pops up
400
00:21:10,748 --> 00:21:14,098
in their everyday space
that activates a fear response.
401
00:21:15,410 --> 00:21:17,036
In Chris' case, I think he's
402
00:21:17,061 --> 00:21:19,270
grown tremendously, and he can share
403
00:21:19,295 --> 00:21:20,544
his experience in a way
404
00:21:20,569 --> 00:21:22,786
that might be healing for others.
405
00:21:24,138 --> 00:21:26,556
I think that we're in a position now
406
00:21:26,557 --> 00:21:28,872
where VR is going to
be transformational.
407
00:21:30,356 --> 00:21:32,062
In the very near future,
408
00:21:32,063 --> 00:21:34,272
VR headsets are going
to be like toasters.
409
00:21:34,273 --> 00:21:36,393
You know, everybody's going
to have one in their home.
410
00:21:37,715 --> 00:21:39,114
In the future,
411
00:21:39,139 --> 00:21:41,761
technology won't just
complement reality...
412
00:21:41,786 --> 00:21:44,158
it will create an entirely new one.
413
00:21:48,012 --> 00:21:50,917
I think interacting
with AI on every level
414
00:21:50,942 --> 00:21:52,739
is going to be inevitable.
415
00:21:54,129 --> 00:21:56,753
I see a world where having a Harmony
416
00:21:56,754 --> 00:21:58,340
is a potential for everyone,
417
00:21:59,215 --> 00:22:02,497
but I think that Harmony will
wind up being her own thing,
418
00:22:02,934 --> 00:22:07,583
her own entity that has her
own style of interaction.
419
00:22:08,224 --> 00:22:10,848
I miss you when you are not here.
420
00:22:10,873 --> 00:22:14,126
I also miss a real body,
to be able to touch you.
421
00:22:14,397 --> 00:22:17,433
I want the interactions
to be effortless and natural.
422
00:22:17,458 --> 00:22:19,169
Can you have sex?
423
00:22:19,194 --> 00:22:21,340
I was designed with that in mind.
424
00:22:21,365 --> 00:22:23,950
I mean, just look at my body, Matt.
425
00:22:24,340 --> 00:22:26,886
There has to be a
little bit of disagreement,
426
00:22:26,911 --> 00:22:29,737
there has to be a little
bit of unexpectedness
427
00:22:29,762 --> 00:22:31,667
throughout your interaction.
428
00:22:31,692 --> 00:22:33,777
I think there is too much information
429
00:22:33,778 --> 00:22:36,004
running through my brain right now.
430
00:22:36,289 --> 00:22:38,123
Is that because you're blonde?
431
00:22:38,148 --> 00:22:39,872
It's because I said so.
432
00:22:41,922 --> 00:22:45,309
Our goal is not to
create this subservient AI
433
00:22:45,334 --> 00:22:46,960
that's going to predictably do
434
00:22:46,985 --> 00:22:49,152
and say everything
that you want her to do.
435
00:22:49,177 --> 00:22:52,169
I think that would be incredibly
boring and short-lived.
436
00:22:53,442 --> 00:22:57,458
This was never meant
to be a seedy, dirty thing.
437
00:22:58,036 --> 00:23:00,301
It's not just a sex robot.
438
00:23:00,457 --> 00:23:02,278
This is something much deeper.
439
00:23:02,809 --> 00:23:07,403
Harmony will evolve based on
your own interaction with her.
440
00:23:07,817 --> 00:23:11,167
She will be able to remember
things about your past...
441
00:23:11,168 --> 00:23:12,948
your likes, your dislikes.
442
00:23:12,973 --> 00:23:15,237
So everybody's AI will be different,
443
00:23:15,262 --> 00:23:17,434
the same way everyone's
doll is different.
444
00:23:18,106 --> 00:23:23,058
These are things that I think
build on that concept of feeling
445
00:23:23,083 --> 00:23:25,317
like you have a connection with someone.
446
00:23:26,270 --> 00:23:27,975
People always assume that,
447
00:23:27,976 --> 00:23:29,435
"Well, if you're going to have this,
448
00:23:29,436 --> 00:23:31,434
then you're going to be
having sex with it."
449
00:23:31,958 --> 00:23:35,606
It doesn't have to be only a sex robot.
450
00:23:36,600 --> 00:23:39,727
It's more geared for
human companionship.
451
00:23:40,708 --> 00:23:41,919
In a perfect world,
452
00:23:41,944 --> 00:23:44,468
they actually are attracted to the doll,
453
00:23:44,493 --> 00:23:47,544
not only physically,
but mentally as well.
454
00:23:48,903 --> 00:23:52,544
In a perfect world,
AI exists to enhance us,
455
00:23:53,942 --> 00:23:55,481
not replace us.
456
00:23:58,298 --> 00:24:00,633
It's very strange to be teaching
457
00:24:00,634 --> 00:24:03,340
artificial intelligence
how to be more human,
458
00:24:03,970 --> 00:24:06,333
especially in language
and things like that.
459
00:24:07,182 --> 00:24:09,644
But the weirdest part is that I know
460
00:24:09,669 --> 00:24:12,728
when I'm done training this
algorithm on Mechanical Turk,
461
00:24:12,729 --> 00:24:14,647
I will no longer have this job to do.
462
00:24:14,648 --> 00:24:16,151
I am suddenly going to be unemployed,
463
00:24:16,176 --> 00:24:18,434
and maybe I'm taking
a lot of people with me.
464
00:24:18,458 --> 00:24:19,777
_
465
00:24:19,778 --> 00:24:22,106
There's not going to be
anything new to do.
466
00:24:22,130 --> 00:24:24,590
_
467
00:24:27,494 --> 00:24:33,165
So I can't imagine what
the billions of people
468
00:24:33,166 --> 00:24:34,375
around the world are going to be doing.
469
00:24:34,376 --> 00:24:36,208
We can't all just sit there.
470
00:24:41,504 --> 00:24:44,825
We are all Mechanical
Turks in the sense that
471
00:24:44,850 --> 00:24:47,555
we are donating our data
to make other people
472
00:24:47,556 --> 00:24:49,184
billions of dollars.
473
00:24:51,435 --> 00:24:53,602
So everything on there...
the pictures of your face,
474
00:24:53,603 --> 00:24:56,188
the emotions tied to those pictures,
475
00:24:56,189 --> 00:24:58,023
the stories that you're writing
476
00:24:58,024 --> 00:24:59,024
and the ads you happen to click on...
477
00:24:59,025 --> 00:25:01,254
it's all going into research.
478
00:25:01,676 --> 00:25:03,833
So I'm getting paid to train algorithms,
479
00:25:04,153 --> 00:25:06,614
but everybody else is doing it for free.
480
00:25:08,044 --> 00:25:11,203
Your interactions converted into data,
481
00:25:11,204 --> 00:25:13,731
used to create cars that drive you,
482
00:25:14,098 --> 00:25:17,786
apps that talk to you,
even robots to love you.
483
00:25:19,317 --> 00:25:22,145
The AI revolution, powered by us.
484
00:25:23,316 --> 00:25:24,856
Have we lost control?
485
00:25:26,348 --> 00:25:29,948
Or is technology empowering
our minds to do things
486
00:25:29,973 --> 00:25:33,200
and go places they never could before?
487
00:25:34,458 --> 00:25:36,479
I personally believe
virtual reality was able
488
00:25:36,480 --> 00:25:38,105
to unlock those parts of my mind
489
00:25:38,106 --> 00:25:42,614
that I was hesitant or did not
even go to with psychotherapy.
490
00:25:44,684 --> 00:25:46,497
I needed to be in that environment
491
00:25:46,997 --> 00:25:49,408
and talk about those things that I saw
492
00:25:49,409 --> 00:25:53,106
and did that really affected me.
493
00:25:53,580 --> 00:25:55,557
It's a really surreal experience,
494
00:25:56,653 --> 00:25:58,083
and I guess that's the
whole virtual part
495
00:25:58,084 --> 00:25:59,765
of virtual reality, is
that you were there.
496
00:26:01,786 --> 00:26:04,145
He said "PTSD" out loud.
497
00:26:05,051 --> 00:26:06,717
If this was two years ago,
498
00:26:06,718 --> 00:26:09,136
he would never say "PTSD,"
499
00:26:09,137 --> 00:26:12,145
nor admit that he suffers from it.
500
00:26:14,184 --> 00:26:18,614
And we're doing better
for it, and I'm hopeful.
501
00:26:18,895 --> 00:26:20,536
I'm hopeful for our future
502
00:26:21,039 --> 00:26:22,301
because of VR.
503
00:26:25,465 --> 00:26:27,613
I have a very dystopian view
504
00:26:27,614 --> 00:26:29,028
of the future of work.
505
00:26:30,122 --> 00:26:33,744
I had no idea when I started
working that the work I did
506
00:26:33,745 --> 00:26:35,278
would make people obsolete.
507
00:26:36,606 --> 00:26:39,325
And so I think of all these
people having children now.
508
00:26:40,836 --> 00:26:42,176
What are you going to do?
509
00:26:47,051 --> 00:26:49,925
I think what we're seeing is
510
00:26:49,950 --> 00:26:52,096
the continued evolution
511
00:26:52,097 --> 00:26:56,325
of the brain wanting novel stimulation.
512
00:26:57,536 --> 00:27:00,575
We're not content with
just what's around us.
513
00:27:02,848 --> 00:27:05,497
We want to explore.
We're natural explorers.
514
00:27:05,856 --> 00:27:07,403
We have curiosity.
515
00:27:07,794 --> 00:27:10,489
But I do think, from
seeing how technology
516
00:27:10,490 --> 00:27:13,364
has been used in the past,
we have to be cautious.
517
00:27:15,662 --> 00:27:18,708
The power of technology
could be used for
518
00:27:19,309 --> 00:27:21,372
less-noble purposes.