1
00:00:01,750 --> 00:00:05,250
We live under
a billion unblinking eyes...
2
00:00:05,260 --> 00:00:09,420
A global surveillance system
that solves crime...
3
00:00:09,430 --> 00:00:12,960
Uncovers terrorist plots...
4
00:00:12,960 --> 00:00:16,300
And helps stop abuse of power.
5
00:00:16,300 --> 00:00:19,630
But are we ready
for a world without secrets...
6
00:00:21,540 --> 00:00:25,170
...where not even our homes
are off-limits
7
00:00:25,180 --> 00:00:27,640
and corporations know
our every desire?
8
00:00:27,640 --> 00:00:31,280
Should we say goodbye
to our privacy?
9
00:00:31,280 --> 00:00:34,850
Or is it time for the watched
to become the watchers?
10
00:00:40,390 --> 00:00:44,730
Space, time, life itself.
11
00:00:47,300 --> 00:00:52,230
The secrets of the cosmos
lie through the wormhole.
12
00:00:52,240 --> 00:00:55,240
Captions by VITAC...
www.vitac.com
13
00:00:55,240 --> 00:00:58,240
Captions paid for by
Discovery Communications
14
00:01:04,410 --> 00:01:07,950
Ever have the feeling
you're being watched?
15
00:01:07,950 --> 00:01:11,250
Well, you probably are.
16
00:01:11,250 --> 00:01:12,990
If you live in a large city,
17
00:01:12,990 --> 00:01:15,460
surveillance cameras
take your picture
18
00:01:15,460 --> 00:01:17,660
hundreds of times per day.
19
00:01:17,660 --> 00:01:21,730
Every transaction you make
is electronically logged.
20
00:01:21,730 --> 00:01:24,770
Scanners at airports
can peer through your clothes.
21
00:01:24,770 --> 00:01:28,170
The latest models can even
detect your emotional state.
22
00:01:28,170 --> 00:01:31,410
And in those moments
when you're not being tracked,
23
00:01:31,410 --> 00:01:35,010
we're busy giving away
our personal information
24
00:01:35,010 --> 00:01:36,380
on social media.
25
00:01:36,380 --> 00:01:40,380
We think of our privacy
as a fundamental right.
26
00:01:40,380 --> 00:01:44,820
Now it appears to be
on the brink of extinction,
27
00:01:44,820 --> 00:01:47,560
which sounds like a nightmare.
28
00:01:47,560 --> 00:01:49,320
But is it?
29
00:01:53,000 --> 00:01:57,500
This footage was shot by the
Hawkeye II surveillance camera
30
00:01:57,500 --> 00:02:01,640
flying two miles
above Ciudad Juárez in Mexico.
31
00:02:01,640 --> 00:02:04,540
Once every second
for hours on end,
32
00:02:04,540 --> 00:02:07,010
it takes a picture
of the entire city.
33
00:02:07,010 --> 00:02:10,980
Here it documents
the murder of a police officer
34
00:02:10,980 --> 00:02:13,310
by members of a drug cartel.
35
00:02:13,320 --> 00:02:16,580
But it also captures
the movements of the assassins.
36
00:02:16,590 --> 00:02:19,320
It tracks their cars
as they leave the scene
37
00:02:19,320 --> 00:02:21,860
and leads the police
to their hideout.
38
00:02:21,860 --> 00:02:24,060
Cities around the world
39
00:02:24,060 --> 00:02:27,100
are beginning to use these
total surveillance systems.
40
00:02:27,100 --> 00:02:30,830
One could be watching you
right now.
41
00:02:30,830 --> 00:02:35,370
Nick Bostrom runs
the Future of Humanity Institute
42
00:02:35,370 --> 00:02:37,310
at Oxford University.
43
00:02:37,310 --> 00:02:39,810
He believes
constant surveillance
44
00:02:39,810 --> 00:02:42,310
will radically reshape
our lives,
45
00:02:42,310 --> 00:02:45,950
but we won't end up fearing it
like Big Brother.
46
00:02:45,950 --> 00:02:49,050
Nick believes we'll embrace it.
47
00:02:49,050 --> 00:02:51,350
Surveillance technology
might be one of those things
48
00:02:51,350 --> 00:02:53,420
that could change
social dynamics
49
00:02:53,420 --> 00:02:55,220
in some fairly fundamental way.
50
00:02:55,230 --> 00:02:57,790
Yes, already
in urban environments,
51
00:02:57,790 --> 00:03:01,530
there are a lot of cameras
looking at us all the time.
52
00:03:01,530 --> 00:03:03,100
So, it's a lot of eyeballs,
53
00:03:03,100 --> 00:03:05,230
but they are
kind of semi-isolated.
54
00:03:05,240 --> 00:03:06,530
An obvious next step,
55
00:03:06,540 --> 00:03:09,600
where all these video feeds
are stored in perpetuity
56
00:03:09,610 --> 00:03:12,940
and coupled with
facial-recognition systems
57
00:03:12,940 --> 00:03:16,140
so you could automatically tag
and keep track
58
00:03:16,150 --> 00:03:18,650
of where
any individual has been,
59
00:03:18,650 --> 00:03:20,520
whom they have been
talking with,
60
00:03:20,520 --> 00:03:21,920
what they have been doing.
61
00:03:24,420 --> 00:03:27,020
That sounds like a bad thing,
62
00:03:27,020 --> 00:03:28,960
but it doesn't have to be.
63
00:03:28,960 --> 00:03:32,160
Think about contagious diseases.
64
00:03:32,160 --> 00:03:34,100
The virus this man is carrying
65
00:03:34,100 --> 00:03:37,230
could spread across the city
in just a few days.
66
00:03:38,370 --> 00:03:39,830
It could start a pandemic
67
00:03:39,840 --> 00:03:42,140
that kills tens of thousands.
68
00:03:43,540 --> 00:03:44,970
If you have a new outbreak,
69
00:03:44,980 --> 00:03:49,180
you have SARS or H1N1 or
some new disease that pops up,
70
00:03:49,180 --> 00:03:52,150
it gets really important
to try to trace
71
00:03:52,150 --> 00:03:53,950
who might have been exposed
to the germ.
72
00:03:53,950 --> 00:03:55,150
And it's painstaking work.
73
00:03:55,150 --> 00:03:56,580
You have to interview the people
74
00:03:56,590 --> 00:03:58,390
and try to get
who they have interacted with,
75
00:03:58,390 --> 00:04:00,420
and then you have to go
and interview those people.
76
00:04:00,420 --> 00:04:02,820
But constant surveillance
77
00:04:02,830 --> 00:04:06,290
could spot the origins
of the outbreaks in real time,
78
00:04:06,300 --> 00:04:08,060
locating the infected,
79
00:04:08,060 --> 00:04:10,870
dispatching
medical teams to them,
80
00:04:10,870 --> 00:04:14,000
and establishing quarantines.
81
00:04:14,000 --> 00:04:16,000
Please return to your homes.
82
00:04:16,010 --> 00:04:18,610
This area
is under temporary quarantine.
83
00:04:20,280 --> 00:04:22,510
Those things obviously
can become more efficient,
84
00:04:22,510 --> 00:04:24,680
the more detailed information
you have.
85
00:04:24,680 --> 00:04:26,210
So, if you nip it in the bud,
86
00:04:26,220 --> 00:04:28,820
you potentially save
millions of lives.
87
00:04:28,820 --> 00:04:32,920
Imagine if every single
person you interacted with
88
00:04:32,920 --> 00:04:35,690
were tracked 24/7.
89
00:04:35,690 --> 00:04:37,690
Actually to be able to see
90
00:04:37,690 --> 00:04:39,460
what somebody has been up to
in the past.
91
00:04:39,460 --> 00:04:40,700
Have they kept their promises?
92
00:04:40,700 --> 00:04:43,430
There are a lot of jerks
and cheaters in the world
93
00:04:43,430 --> 00:04:44,630
who get away with it.
94
00:04:44,630 --> 00:04:46,370
And by the time people wise up,
95
00:04:46,370 --> 00:04:48,770
they have moved on
to their next victims.
96
00:04:48,770 --> 00:04:49,970
Kind of nice
97
00:04:49,970 --> 00:04:53,470
to be able to disempower
the jerks and cheaters,
98
00:04:53,480 --> 00:04:55,340
and it would encourage
more people
99
00:04:55,350 --> 00:04:59,180
to behave in ways that...
that are good.
100
00:04:59,180 --> 00:05:02,950
And if cameras tracked you
every moment of the day,
101
00:05:02,950 --> 00:05:04,350
some aspects of your life
102
00:05:04,350 --> 00:05:06,390
would become
a lot more convenient.
103
00:05:06,390 --> 00:05:09,590
You could go into a shop
and just take what you need.
104
00:05:09,590 --> 00:05:11,760
And the camera
recognizes who you are
105
00:05:11,760 --> 00:05:14,600
and automatically rings it up
to your bank account,
106
00:05:14,600 --> 00:05:16,500
and it's all taken care of.
107
00:05:16,500 --> 00:05:18,900
If you hadn't yet
kicked the old habit
108
00:05:18,900 --> 00:05:20,500
of carrying a wallet,
109
00:05:20,500 --> 00:05:24,040
you'd never have to worry about
remembering where you left it.
110
00:05:24,040 --> 00:05:27,440
In some ways,
it's actually a return
111
00:05:27,440 --> 00:05:29,510
to a more normal
human condition.
112
00:05:29,510 --> 00:05:32,180
We used to live in small tribes,
small bands.
113
00:05:32,180 --> 00:05:33,980
You kind of know
what everybody's doing,
114
00:05:33,980 --> 00:05:35,580
who they are,
what they are up to,
115
00:05:35,590 --> 00:05:37,390
what they have been doing
in the past.
116
00:05:37,390 --> 00:05:39,350
In some respects,
it's not a complete novelty.
117
00:05:39,360 --> 00:05:41,860
It might be more a return
to normalcy.
118
00:05:41,860 --> 00:05:44,660
Life under global surveillance
119
00:05:44,660 --> 00:05:47,460
might resemble life
in a small village,
120
00:05:47,460 --> 00:05:50,970
but could we adapt
to being constantly surveilled,
121
00:05:50,970 --> 00:05:53,270
even inside our own homes?
122
00:05:53,270 --> 00:05:57,000
Most people say
they don't like being watched
123
00:05:57,010 --> 00:06:02,280
when they're eating, washing,
or doing anything in the nude.
124
00:06:02,280 --> 00:06:06,150
But our homes
are already full of cameras,
125
00:06:06,150 --> 00:06:08,650
from security cameras
and cellphones
126
00:06:08,650 --> 00:06:10,320
to laptops and TVs.
127
00:06:10,320 --> 00:06:13,050
They're even hidden
inside clocks
128
00:06:13,060 --> 00:06:15,260
that keep an eye on the nanny.
129
00:06:15,260 --> 00:06:17,760
You might think
these cameras are harmless,
130
00:06:17,760 --> 00:06:19,990
but they all connect
to the Internet,
131
00:06:20,000 --> 00:06:23,660
which means they can be hacked.
132
00:06:26,400 --> 00:06:29,500
Cognitive scientist
Antti Oulasvirta
133
00:06:29,510 --> 00:06:31,270
lives in Finland,
134
00:06:31,270 --> 00:06:35,140
a country known
for its notoriously shy people.
135
00:06:35,150 --> 00:06:37,680
They say you can spot
an extroverted Finn
136
00:06:37,680 --> 00:06:41,180
because they're looking
at your shoes, not their own.
137
00:06:41,180 --> 00:06:45,520
Antti realized his countrymen
would be perfect guinea pigs
138
00:06:45,520 --> 00:06:46,950
for an experiment
139
00:06:46,960 --> 00:06:51,030
to see how people react
to having no privacy.
140
00:06:51,030 --> 00:06:54,660
I have a motivation
to keep the kitchen clean,
141
00:06:54,660 --> 00:06:59,030
but I have an extra motivation
today... this camera.
142
00:06:59,040 --> 00:07:02,670
Antti wanted to see
how people's behavior changed
143
00:07:02,670 --> 00:07:06,070
when their homes were wired
for constant surveillance,
144
00:07:06,080 --> 00:07:08,680
so he persuaded
several households
145
00:07:08,680 --> 00:07:11,550
to do what for Finns
is unthinkable...
146
00:07:11,550 --> 00:07:15,820
submit to being watched
for an entire year.
147
00:07:15,820 --> 00:07:19,150
We wired 10 households
in Finland for 12 months,
148
00:07:19,160 --> 00:07:22,220
including cameras and microphones
and even screen capture.
149
00:07:22,230 --> 00:07:24,190
So, this is Bob,
150
00:07:24,190 --> 00:07:27,460
and Bob looks like a regular
piece of home electronics,
151
00:07:27,460 --> 00:07:28,360
but he's not.
152
00:07:28,360 --> 00:07:29,930
B-O-B,
153
00:07:29,930 --> 00:07:33,430
which stands for
"behavioral observation system,"
154
00:07:33,440 --> 00:07:36,970
records all the video and audio
around the house.
155
00:07:36,970 --> 00:07:41,180
Bob also keeps track
of all e-mail, web traffic,
156
00:07:41,180 --> 00:07:44,810
online purchases,
and television-viewing habits.
157
00:07:44,810 --> 00:07:47,080
So, we had them covered
pretty well in all areas
158
00:07:47,080 --> 00:07:48,420
of ubiquitous surveillance.
159
00:07:48,420 --> 00:07:51,020
We didn't want to bust people
for doing anything wrong.
160
00:07:51,020 --> 00:07:53,020
We simply wanted to see
how they would react
161
00:07:53,020 --> 00:07:56,060
to not being able to be alone
in their own homes.
162
00:07:56,060 --> 00:07:58,460
In the first weeks of the study,
163
00:07:58,460 --> 00:08:01,060
Antti noticed his subjects
appeared unsettled
164
00:08:01,060 --> 00:08:03,160
by the presence of the cameras.
165
00:08:03,170 --> 00:08:06,470
They had to keep
their impulses in check,
166
00:08:06,470 --> 00:08:07,940
control their shouting.
167
00:08:07,940 --> 00:08:10,100
And if there was
a stressful situation
168
00:08:10,110 --> 00:08:11,570
playing out in their lives,
169
00:08:11,570 --> 00:08:13,410
that could amplify the stress.
170
00:08:13,410 --> 00:08:15,840
And... no surprise...
171
00:08:15,850 --> 00:08:18,250
they were sensitive
about being naked.
172
00:08:18,250 --> 00:08:20,580
Being naked was,
of course, an issue,
173
00:08:20,580 --> 00:08:24,090
and we left them a few spots...
for example, bathrooms...
174
00:08:24,090 --> 00:08:27,290
where they could be alone
without the cameras.
175
00:08:27,290 --> 00:08:29,960
They were like fish
in a fishbowl.
176
00:08:29,960 --> 00:08:32,230
But as time went on,
177
00:08:32,230 --> 00:08:35,900
Antti noticed
something surprising.
178
00:08:35,900 --> 00:08:39,200
So after surveilling
people for six months,
179
00:08:39,200 --> 00:08:41,940
we asked them to draw us a graph
of their stress levels.
180
00:08:41,940 --> 00:08:44,210
They were stressed out
in the beginning,
181
00:08:44,210 --> 00:08:47,880
but after a while,
it leveled off.
182
00:08:47,880 --> 00:08:51,410
Eventually,
the subjects started to relax.
183
00:08:51,410 --> 00:08:54,850
They stopped worrying about
being seen naked.
184
00:08:54,850 --> 00:08:56,450
So, the mentality was that,
185
00:08:56,450 --> 00:08:59,220
"now you've seen me once walking
around the kitchen naked,
186
00:08:59,220 --> 00:09:01,260
so what's the point
of continuing to hide?"
187
00:09:01,260 --> 00:09:03,790
And when
they really needed privacy
188
00:09:03,790 --> 00:09:05,590
for a delicate conversation,
189
00:09:05,600 --> 00:09:07,560
they figured out how to get it.
190
00:09:07,560 --> 00:09:10,930
They went to cafés
to have private conversations,
191
00:09:10,930 --> 00:09:14,340
and they avoided the cameras
in creative ways.
192
00:09:14,340 --> 00:09:16,170
Antti's study shows
193
00:09:16,170 --> 00:09:19,170
we can adapt
to almost constant surveillance.
194
00:09:19,180 --> 00:09:21,880
He admits that
this was a special case.
195
00:09:21,880 --> 00:09:25,280
The subjects knew him and trusted
him not to share the data.
196
00:09:25,280 --> 00:09:27,350
But to Antti's surprise,
197
00:09:27,350 --> 00:09:30,280
some people
didn't care who was watching.
198
00:09:30,290 --> 00:09:31,720
We asked the subjects
199
00:09:31,720 --> 00:09:33,690
who would they least want
to share the data with,
200
00:09:33,690 --> 00:09:35,760
and the most striking
feature was,
201
00:09:35,760 --> 00:09:37,560
some went as far as saying that,
202
00:09:37,560 --> 00:09:41,330
"it doesn't matter
to whom you share the data."
203
00:09:41,330 --> 00:09:44,170
We have an amazing ability
204
00:09:44,170 --> 00:09:47,500
to adapt
to changing environments.
205
00:09:47,500 --> 00:09:50,170
But if we learn
to ignore cameras,
206
00:09:50,170 --> 00:09:54,540
it won't be long before we stop
thinking about who's watching
207
00:09:54,540 --> 00:09:57,810
and why they are watching.
208
00:09:57,810 --> 00:10:01,520
They say ignorance is bliss.
209
00:10:01,520 --> 00:10:03,880
But in this case,
210
00:10:03,890 --> 00:10:06,620
what you don't know
could hurt you.
211
00:10:10,130 --> 00:10:14,140
In George Orwell's novel "1984,"
212
00:10:14,140 --> 00:10:16,740
everyone lived
under the watchful eye
213
00:10:16,740 --> 00:10:19,110
of an authoritarian
government...
214
00:10:19,110 --> 00:10:21,080
Big Brother.
215
00:10:21,080 --> 00:10:22,780
Today, in real life,
216
00:10:22,780 --> 00:10:27,250
there's a different watchful eye
we should worry about...
217
00:10:27,250 --> 00:10:29,750
big business.
218
00:10:29,750 --> 00:10:31,450
Ask yourself this...
219
00:10:31,460 --> 00:10:35,160
do you know when corporations
are watching you
220
00:10:35,160 --> 00:10:39,900
or how much
they already know about you?
221
00:10:39,900 --> 00:10:42,870
The answer will shock you.
222
00:10:47,970 --> 00:10:51,440
Alessandro Acquisti
always puts safety first.
223
00:10:51,440 --> 00:10:54,910
He knows a helmet
will keep him safe on the road,
224
00:10:54,910 --> 00:10:58,850
and the same goes for his
social-media profile picture.
225
00:10:58,850 --> 00:11:01,280
I do have a Facebook profile.
226
00:11:01,290 --> 00:11:04,290
For my profile picture,
I wear a motorcycle helmet.
227
00:11:04,290 --> 00:11:09,260
Your name
and your profile picture
228
00:11:09,260 --> 00:11:10,730
are public by default.
229
00:11:10,730 --> 00:11:12,660
Therefore, they're searchable.
230
00:11:12,660 --> 00:11:17,370
So, my question is,
how much can I learn about you
231
00:11:17,370 --> 00:11:20,440
starting just from a photo
of your face?
232
00:11:20,440 --> 00:11:23,670
Alessandro
is a behavioral economist
233
00:11:23,680 --> 00:11:26,440
at Carnegie Mellon University
in Pittsburgh.
234
00:11:26,440 --> 00:11:29,550
He's trying to find out
what private corporations
235
00:11:29,550 --> 00:11:31,910
might be able to find out
about you
236
00:11:31,920 --> 00:11:34,380
just by taking a picture
of your face.
237
00:11:34,390 --> 00:11:35,890
Would you like to help us
with a study?
238
00:11:35,890 --> 00:11:36,890
Sure.
239
00:11:36,890 --> 00:11:39,520
So, Alessandro and his team
240
00:11:39,520 --> 00:11:42,190
developed their own
data-mining app.
241
00:11:42,190 --> 00:11:44,360
We took a shot of their faces.
242
00:11:44,360 --> 00:11:46,560
And then you wait a few seconds.
243
00:11:48,570 --> 00:11:49,800
In the meanwhile,
244
00:11:49,800 --> 00:11:52,400
the shot is being uploaded
to a cloud,
245
00:11:52,400 --> 00:11:55,670
where we have
previously downloaded
246
00:11:55,670 --> 00:11:59,240
a few hundred thousand images
from Facebook profiles.
247
00:11:59,240 --> 00:12:00,540
The app uses
248
00:12:00,550 --> 00:12:02,810
commercially available
facial-recognition software
249
00:12:02,810 --> 00:12:05,820
to find a matching face online.
250
00:12:05,820 --> 00:12:08,320
This information
is sent back to the phone
251
00:12:08,320 --> 00:12:12,420
and overlaid on the face
of the subject in front of you.
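
To make the narrated pipeline concrete, here is a minimal sketch, assuming a locally stored gallery of public profile photos (one metadata JSON per image) and the open-source face_recognition library; the paths, threshold, and library choice are illustrative assumptions, not details of Alessandro's actual app.

import json
from pathlib import Path

import face_recognition  # assumed dependency
import numpy as np

def load_gallery(gallery_dir):
    # Encode each profile photo; profile metadata sits in a .json beside each image.
    encodings, profiles = [], []
    for img_path in Path(gallery_dir).glob("*.jpg"):
        faces = face_recognition.face_encodings(face_recognition.load_image_file(img_path))
        if not faces:
            continue  # skip images where no face is detected
        encodings.append(faces[0])
        profiles.append(json.loads(img_path.with_suffix(".json").read_text()))
    return np.array(encodings), profiles

def identify(snapshot_path, gallery_dir, threshold=0.6):
    # Return the best-matching profile for the face in the snapshot, or None.
    gallery, profiles = load_gallery(gallery_dir)
    faces = face_recognition.face_encodings(face_recognition.load_image_file(snapshot_path))
    if not faces or len(gallery) == 0:
        return None
    distances = face_recognition.face_distance(gallery, faces[0])
    best = int(np.argmin(distances))
    return profiles[best] if distances[best] <= threshold else None

A match below the distance threshold would return the stored profile fields (name, hometown, interests), which is all the on-screen overlay in the demonstration needs.
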
252
00:12:12,420 --> 00:12:15,630
To see if we can identify you
and see what information we...
253
00:12:15,630 --> 00:12:16,990
Once it matches the photo
254
00:12:17,000 --> 00:12:18,090
to a social-media profile,
255
00:12:18,100 --> 00:12:21,160
the software can find out
someone's name,
256
00:12:21,170 --> 00:12:24,870
their birth city,
their interests, and much more.
257
00:12:24,870 --> 00:12:28,440
Starting from just one snapshot
of a person...
258
00:12:28,440 --> 00:12:31,470
no name
and no personal information...
259
00:12:31,480 --> 00:12:33,280
we were able to lock on
260
00:12:33,280 --> 00:12:36,980
to the Facebook profiles
of these subjects.
261
00:12:36,980 --> 00:12:38,550
Wow. No way.
262
00:12:38,550 --> 00:12:40,680
And once you get
to the Facebook profile,
263
00:12:40,690 --> 00:12:43,190
a world of information opens up.
264
00:12:43,190 --> 00:12:44,620
That's really eerie.
265
00:12:44,620 --> 00:12:48,120
Most of us post
photos of ourselves online,
266
00:12:48,130 --> 00:12:52,500
but not everyone realizes
that photos are also data.
267
00:12:52,500 --> 00:12:55,460
And no one stole
the data from us.
268
00:12:55,470 --> 00:12:59,100
We are willingly
and publicly disclosing it.
269
00:12:59,100 --> 00:13:02,670
With a name
and birthplace in hand,
270
00:13:02,670 --> 00:13:06,010
deeper corporate data mining
can reveal date of birth,
271
00:13:06,010 --> 00:13:07,110
criminal record,
272
00:13:07,110 --> 00:13:08,980
and can even make a close guess
273
00:13:08,980 --> 00:13:11,310
at someone's
Social Security number.
274
00:13:11,320 --> 00:13:13,880
That's one digit away from my
actual Social Security number.
275
00:13:13,890 --> 00:13:16,120
How'd you predict that?
276
00:13:16,120 --> 00:13:19,160
Once they have this information,
277
00:13:19,160 --> 00:13:20,760
there is virtually no limit
278
00:13:20,760 --> 00:13:23,790
to what else they may be able
to find out about you.
279
00:13:23,800 --> 00:13:25,490
Alessandro says
280
00:13:25,500 --> 00:13:29,300
he designed this software
demonstration as a warning.
281
00:13:29,300 --> 00:13:32,070
It took very little effort
to develop it.
282
00:13:32,070 --> 00:13:35,440
Imagine what the corporations
that rule the Internet
283
00:13:35,440 --> 00:13:36,970
might already be doing.
284
00:13:36,970 --> 00:13:38,670
On any given day,
285
00:13:38,680 --> 00:13:42,650
2.1 billion people
are active on social media.
286
00:13:42,650 --> 00:13:44,880
They tweet 500 million times.
287
00:13:44,880 --> 00:13:47,680
They share on Facebook
one billion times.
288
00:13:47,690 --> 00:13:51,490
They upload 1.8 billion photos.
289
00:13:51,490 --> 00:13:54,120
And every time
you click on "like,"
290
00:13:54,130 --> 00:13:56,360
a record is made
of what you like.
291
00:13:58,160 --> 00:13:59,800
Today the Internet
292
00:13:59,800 --> 00:14:02,360
is essentially
a surveillance economy.
293
00:14:02,370 --> 00:14:05,470
Companies like Amazon
can sell millions of dollars
294
00:14:05,470 --> 00:14:07,600
of merchandise an hour.
295
00:14:07,610 --> 00:14:11,210
And much of these revenues
come through ads,
296
00:14:11,210 --> 00:14:13,680
which are tailored
to your preferences.
297
00:14:13,680 --> 00:14:15,980
The more a company knows about you,
298
00:14:15,980 --> 00:14:18,110
the more they can manipulate you
299
00:14:18,120 --> 00:14:21,350
into clicking this link
or buying this product.
300
00:14:21,350 --> 00:14:24,190
Alessandro is convinced
301
00:14:24,190 --> 00:14:27,760
we'll keep giving up our
personal data and our privacy
302
00:14:27,760 --> 00:14:31,160
because corporations
make it so easy for us.
303
00:14:31,160 --> 00:14:34,430
Marketers entice us
304
00:14:34,430 --> 00:14:37,300
into revealing more and more
personal information.
305
00:14:37,300 --> 00:14:40,340
They work hard to make it
a good experience for us.
306
00:14:40,340 --> 00:14:43,140
To us, it looks like
the garden of Eden,
307
00:14:43,140 --> 00:14:44,840
where everything is free.
308
00:14:44,840 --> 00:14:47,140
You get free apps, free content,
309
00:14:47,150 --> 00:14:49,210
you get to play Angry Birds,
310
00:14:49,210 --> 00:14:50,710
all of this in exchange
311
00:14:50,720 --> 00:14:53,220
for, say,
having your location tracked
312
00:14:53,220 --> 00:14:54,880
1,000 times per day.
313
00:14:54,890 --> 00:14:56,650
Once corporations
314
00:14:56,650 --> 00:14:59,860
have collected enough
of your location history,
315
00:14:59,860 --> 00:15:02,390
they know you better
than you know yourself.
316
00:15:02,390 --> 00:15:06,930
They can predict where you will
be at a particular time of day
317
00:15:06,930 --> 00:15:12,370
with 80% accuracy
up to one year into the future.
318
00:15:12,370 --> 00:15:14,570
With that kind of information,
319
00:15:14,570 --> 00:15:15,910
your phone...
320
00:15:15,910 --> 00:15:18,510
and the companies that control
the data in your phone...
321
00:15:18,510 --> 00:15:22,080
will be able quite literally
to steer your day.
322
00:15:22,080 --> 00:15:24,350
They may buy new shoes for you
323
00:15:24,350 --> 00:15:27,180
before you even know
you need them.
324
00:15:27,190 --> 00:15:29,080
They may influence
your decisions,
325
00:15:29,090 --> 00:15:30,690
which job you're going to take.
326
00:15:30,690 --> 00:15:32,420
Alessandro believes
327
00:15:32,420 --> 00:15:34,960
we're losing the battle
for our privacy,
328
00:15:34,960 --> 00:15:38,560
and it's a battle we don't even
know we're fighting.
329
00:15:38,560 --> 00:15:40,460
The problem is, then,
330
00:15:40,460 --> 00:15:43,870
the system is basically built
around trying to nudge us
331
00:15:43,870 --> 00:15:46,670
into revealing more and more
personal information
332
00:15:46,670 --> 00:15:48,100
so that we no longer know
333
00:15:48,110 --> 00:15:50,640
whether what's being collected
about us
334
00:15:50,640 --> 00:15:53,210
will be used
in our best interest
335
00:15:53,210 --> 00:15:57,410
or will be used in the best
interests of another entity.
336
00:15:57,420 --> 00:16:00,450
But even if we wise up,
337
00:16:00,450 --> 00:16:03,220
we may have a hard time
stopping ourselves
338
00:16:03,220 --> 00:16:06,590
from sharing our most intimate
likes and needs online
339
00:16:06,590 --> 00:16:09,120
because this scientist believes
340
00:16:09,130 --> 00:16:12,900
we may already be hooked
on sharing, like a drug.
341
00:16:16,380 --> 00:16:19,580
People like to share.
342
00:16:19,590 --> 00:16:22,190
After all,
we are social animals.
343
00:16:22,190 --> 00:16:24,920
But somehow
the age of social media
344
00:16:24,930 --> 00:16:28,960
has got us sharing
more and more...
345
00:16:28,960 --> 00:16:35,170
No matter how uninteresting
it might be...
346
00:16:35,170 --> 00:16:38,770
Even though we know
every post gives marketers
347
00:16:38,770 --> 00:16:41,670
more and more information
about us.
348
00:16:41,680 --> 00:16:45,740
So, why do we do it?
349
00:16:45,750 --> 00:16:50,020
And could we stop ourselves
even if we tried?
350
00:16:52,190 --> 00:16:54,590
Psychologist Diana Tamir knows
351
00:16:54,590 --> 00:16:57,820
she's sometimes guilty
of over-sharing...
352
00:16:57,830 --> 00:17:00,060
- You gonna try this one?
- I'm gonna try this one.
353
00:17:00,060 --> 00:17:01,830
...Especially
when she tries out
354
00:17:01,830 --> 00:17:03,830
her favorite new hobby.
355
00:17:05,200 --> 00:17:07,900
It's super satisfying
to be able to do a route
356
00:17:07,900 --> 00:17:10,100
that you weren't able
to do before.
357
00:17:10,100 --> 00:17:12,900
That's part of the joy
of rock climbing.
358
00:17:12,910 --> 00:17:15,670
Getting to the top of a wall
can feel rewarding.
359
00:17:15,680 --> 00:17:18,040
You don't have to conquer K2
360
00:17:18,050 --> 00:17:20,280
to feel
that basic human impulse...
361
00:17:20,280 --> 00:17:23,180
the impulse
to talk about yourself.
362
00:17:27,790 --> 00:17:29,720
Check it out.
363
00:17:29,720 --> 00:17:31,260
People talk about themselves
all the time.
364
00:17:31,260 --> 00:17:32,620
They talk about themselves
365
00:17:32,630 --> 00:17:34,630
when they're having a
conversation with other people.
366
00:17:34,630 --> 00:17:37,200
They talk about themselves
367
00:17:37,200 --> 00:17:39,600
when they're sharing information
about themselves on social media
368
00:17:39,600 --> 00:17:41,800
or taking pictures of the thing
that they ate for breakfast
369
00:17:41,800 --> 00:17:44,400
and posting it
for the world to see.
370
00:17:44,410 --> 00:17:45,800
That's a really good one.
371
00:17:45,810 --> 00:17:47,110
There was a study
372
00:17:47,110 --> 00:17:48,610
that looked at what people
tweet about on Twitter,
373
00:17:48,610 --> 00:17:51,780
and they found that about 80% of
what people are tweeting about
374
00:17:51,780 --> 00:17:53,910
is just their own
personal experiences.
375
00:17:55,620 --> 00:17:58,550
Why do we enjoy this so much?
376
00:17:58,550 --> 00:18:00,590
As a neuroscientist,
377
00:18:00,590 --> 00:18:04,390
Diana thinks the answer
may be hiding in our brains.
378
00:18:04,390 --> 00:18:08,730
So she designed an experiment
using an MRI scanner
379
00:18:08,730 --> 00:18:12,730
to see how talking about
ourselves... versus others...
380
00:18:12,730 --> 00:18:14,630
changes brain activity.
381
00:18:18,310 --> 00:18:22,110
Picture it like
a public-access talk show,
382
00:18:22,110 --> 00:18:25,010
with Diana taking the role
of the interviewer...
383
00:18:26,150 --> 00:18:27,780
Is it rewarding
to talk about yourself?
384
00:18:27,780 --> 00:18:29,350
Let's find out!
385
00:18:29,350 --> 00:18:32,420
...And a ficus
standing in for the scanner.
386
00:18:32,420 --> 00:18:34,120
- Hey, Adam.
- Hey.
387
00:18:34,120 --> 00:18:37,820
Do you get excited
to dress up for Halloween?
388
00:18:37,830 --> 00:18:39,720
Diana asks her subjects
389
00:18:39,730 --> 00:18:43,030
to respond to questions
about themselves or other people
390
00:18:43,030 --> 00:18:46,260
while showing them
corresponding photographs.
391
00:18:46,270 --> 00:18:48,370
Does your dad like
being photographed?
392
00:18:48,370 --> 00:18:51,600
Do you enjoy
spending time in nature?
393
00:18:54,240 --> 00:18:56,040
Do you like being photographed?
394
00:18:57,680 --> 00:19:00,550
Do you enjoy
having a dog as a pet?
395
00:19:01,880 --> 00:19:06,150
For Diana,
the answers weren't important.
396
00:19:06,150 --> 00:19:07,650
What mattered was
397
00:19:07,650 --> 00:19:10,660
how her subjects' brains
responded to the questions.
398
00:19:10,660 --> 00:19:14,130
All of them activated
the prefrontal cortex,
399
00:19:14,130 --> 00:19:17,360
a region associated
with higher thought.
400
00:19:17,360 --> 00:19:19,060
But something else happened
401
00:19:19,070 --> 00:19:22,300
when a subject answered
questions about themselves.
402
00:19:22,300 --> 00:19:25,740
Diana saw activation
in two brain regions...
403
00:19:25,740 --> 00:19:30,140
the ventral tegmental area
and the nucleus accumbens.
404
00:19:30,140 --> 00:19:32,940
They belong
to what neuroscientists call
405
00:19:32,950 --> 00:19:35,750
the reward pathway.
406
00:19:35,750 --> 00:19:38,120
So, we have these
reward pathways in our brain
407
00:19:38,120 --> 00:19:39,480
that motivate our behavior
408
00:19:39,490 --> 00:19:43,020
by helping us to learn
what things in the world
409
00:19:43,020 --> 00:19:44,260
feel rewarding
410
00:19:44,260 --> 00:19:48,430
that we need or want or desire,
like food or sex.
411
00:19:48,430 --> 00:19:50,600
The brain's reward system
412
00:19:50,600 --> 00:19:53,930
is powered by
a key chemical called dopamine.
413
00:19:53,930 --> 00:19:56,970
A surge of dopamine
can trigger pleasant feelings,
414
00:19:56,970 --> 00:20:00,070
which motivate us
to seek further rewards.
415
00:20:00,070 --> 00:20:02,410
It's the same system
that fires up
416
00:20:02,410 --> 00:20:04,440
when people do drugs
like cocaine
417
00:20:04,440 --> 00:20:06,610
or eat chocolate.
418
00:20:06,610 --> 00:20:10,180
So, like drugs,
sharing can become addictive.
419
00:20:10,180 --> 00:20:13,020
But why does
the dopamine system activate
420
00:20:13,020 --> 00:20:15,690
when we talk about ourselves?
421
00:20:15,690 --> 00:20:17,820
Humans have
a fundamental need to belong
422
00:20:17,820 --> 00:20:19,290
or connect with other people.
423
00:20:19,290 --> 00:20:21,660
So, social connection
and making friends
424
00:20:21,660 --> 00:20:23,230
and interacting with people
425
00:20:23,230 --> 00:20:25,600
are something that
we're highly motivated to get.
426
00:20:25,600 --> 00:20:29,400
So, being part of a group
gets you more resources, food,
427
00:20:29,400 --> 00:20:34,210
reproductive options
than if you were by yourself.
428
00:20:34,210 --> 00:20:37,840
Self-promotion
helps establish us
429
00:20:37,840 --> 00:20:39,840
as members of a group.
430
00:20:39,850 --> 00:20:42,080
And for hundreds
of thousands of years,
431
00:20:42,080 --> 00:20:45,820
being part of a group has been
essential to our survival.
432
00:20:45,820 --> 00:20:48,590
Even when we can't see
the other people in our group,
433
00:20:48,590 --> 00:20:53,860
we still have the instinctual
urge to promote ourselves.
434
00:20:53,860 --> 00:20:56,460
Part of the reason people share
so much on social media
435
00:20:56,460 --> 00:20:59,030
is because it activates
the same sort of neural systems
436
00:20:59,030 --> 00:21:01,130
as self-disclosing in person.
437
00:21:01,130 --> 00:21:04,770
Sharing stems from
a deep evolutionary drive.
438
00:21:04,770 --> 00:21:07,940
That's why it's so easy
to get hooked on it.
439
00:21:07,940 --> 00:21:11,340
Diana wanted to know
how easy it would be
440
00:21:11,350 --> 00:21:14,080
for her subjects to kick
the habit of over-sharing...
441
00:21:16,180 --> 00:21:18,280
...so she tried bribing them.
442
00:21:18,290 --> 00:21:20,550
What we were looking at
is whether or not
443
00:21:20,550 --> 00:21:23,120
people would kind of forego
some extra monetary rewards
444
00:21:23,120 --> 00:21:25,690
in order to answer a question
about themselves.
445
00:21:25,690 --> 00:21:29,930
This time,
Diana let her subjects decide...
446
00:21:29,930 --> 00:21:32,560
talk about yourself
and earn nothing
447
00:21:32,570 --> 00:21:35,700
or get paid
to talk about somebody else.
448
00:21:35,700 --> 00:21:37,770
Can you tell me about
whether you or your friend
449
00:21:37,770 --> 00:21:39,140
like spending time in nature?
450
00:21:39,140 --> 00:21:41,210
Mm-hmm.
451
00:21:41,210 --> 00:21:43,640
Money activates
the dopamine system.
452
00:21:43,640 --> 00:21:48,550
In fact, our neural wiring
has taught us to chase it.
453
00:21:48,550 --> 00:21:51,380
But they say
money can't buy happiness...
454
00:21:51,390 --> 00:21:54,020
at least,
not as much happiness as you get
455
00:21:54,020 --> 00:21:55,790
when you talk about you.
456
00:21:55,790 --> 00:21:58,660
While some participants
chose the money,
457
00:21:58,660 --> 00:22:00,560
most turned it down.
458
00:22:00,560 --> 00:22:03,260
We see that people place
significant amounts of value
459
00:22:03,260 --> 00:22:05,330
on answering questions
about themselves
460
00:22:05,330 --> 00:22:06,670
and significantly less value
461
00:22:06,670 --> 00:22:08,800
on answering questions
about other people.
462
00:22:08,800 --> 00:22:10,470
It kind of really
brought the point home
463
00:22:10,470 --> 00:22:12,270
that sharing information
is rewarding.
464
00:22:14,840 --> 00:22:17,280
Our compulsion to share
465
00:22:17,280 --> 00:22:20,950
is part of
our biological makeup.
466
00:22:20,950 --> 00:22:24,080
But our biology
could be the next target
467
00:22:24,080 --> 00:22:26,280
in the assault on our privacy.
468
00:22:26,290 --> 00:22:30,160
Our most sensitive
personal information
469
00:22:30,160 --> 00:22:33,830
may already have been sold
to the highest bidder.
470
00:22:38,470 --> 00:22:42,110
Which would you hate to lose
the most...
471
00:22:42,110 --> 00:22:44,410
your phone or your wallet?
472
00:22:44,410 --> 00:22:47,510
If either one of these
is stolen,
473
00:22:47,520 --> 00:22:49,680
it's a total hassle.
474
00:22:49,690 --> 00:22:52,190
Your private information
is exposed.
475
00:22:52,190 --> 00:22:54,450
However,
you can cancel bank cards,
476
00:22:54,460 --> 00:22:56,260
wipe the data from your phone,
477
00:22:56,260 --> 00:22:59,030
and change
all of your passwords.
478
00:22:59,030 --> 00:23:02,760
But there is something else
you leave behind every day
479
00:23:02,770 --> 00:23:06,800
that could be far more
devastating to your privacy.
480
00:23:09,040 --> 00:23:10,470
A single strand of hair
481
00:23:10,470 --> 00:23:13,440
contains the most private
information you have...
482
00:23:15,540 --> 00:23:18,380
...your DNA.
483
00:23:23,890 --> 00:23:27,020
Yaniv Erlich is a former hacker
484
00:23:27,020 --> 00:23:30,220
who used to break into banks
to test their security.
485
00:23:40,470 --> 00:23:42,900
Now he's
a computational biologist,
486
00:23:42,910 --> 00:23:45,540
and he's concerned
about the security
487
00:23:45,540 --> 00:23:47,670
of a different kind of bank...
488
00:23:47,680 --> 00:23:48,880
a DNA bank,
489
00:23:48,880 --> 00:23:52,480
which can store
the individual genetic code
490
00:23:52,480 --> 00:23:54,950
of hundreds of thousands
of people.
491
00:23:57,250 --> 00:23:58,790
He believes that hackers
492
00:23:58,790 --> 00:24:01,720
will soon be able to break in
to those biobanks
493
00:24:01,720 --> 00:24:05,690
and steal our most valuable
and most private asset.
494
00:24:05,690 --> 00:24:09,560
A number of large-scale biobanks
495
00:24:09,570 --> 00:24:13,570
offer you the opportunity to
contribute your DNA to science.
496
00:24:13,570 --> 00:24:16,170
It just takes
a simple cheek swab
497
00:24:16,170 --> 00:24:18,270
to get the DNA
out of your mouth.
498
00:24:18,270 --> 00:24:21,610
And then, in a matter of days
with the current technology,
499
00:24:21,610 --> 00:24:24,240
we can analyze
your entire genome.
500
00:24:24,250 --> 00:24:30,550
Companies like
23andMe and Ancestry.com
501
00:24:30,550 --> 00:24:32,020
will sequence your DNA
502
00:24:32,020 --> 00:24:35,360
and send you back information
about your family tree
503
00:24:35,360 --> 00:24:38,730
or whether you are at risk
for certain inherited diseases.
504
00:24:38,730 --> 00:24:40,360
And scientists are using
505
00:24:40,360 --> 00:24:43,060
this huge database
of genetic information
506
00:24:43,070 --> 00:24:48,670
to develop new cures
for a wide range of diseases.
507
00:24:48,670 --> 00:24:51,100
So, with all these types
of information,
508
00:24:51,110 --> 00:24:53,070
scientists can really understand
509
00:24:53,080 --> 00:24:55,880
how the vulnerability
within the population
510
00:24:55,880 --> 00:24:59,250
is affected
by the DNA material we have.
511
00:24:59,250 --> 00:25:01,880
If the contents of your DNA
512
00:25:01,880 --> 00:25:03,720
were stolen and disclosed,
513
00:25:03,720 --> 00:25:06,790
the consequences
could be disastrous.
514
00:25:06,790 --> 00:25:10,620
Imagine being denied health
insurance or losing your job
515
00:25:10,630 --> 00:25:12,360
because your genes show
516
00:25:12,360 --> 00:25:15,330
you're at high risk
for a heart attack.
517
00:25:15,330 --> 00:25:20,030
So, biobanks say they make sure
your DNA remains anonymous.
518
00:25:20,040 --> 00:25:22,970
To increase the security,
519
00:25:22,970 --> 00:25:25,840
biobanks usually don't store
your identifiers
520
00:25:25,840 --> 00:25:28,210
together with
your genetic material.
521
00:25:28,210 --> 00:25:31,480
They will keep your name,
telephone number, and address
522
00:25:31,480 --> 00:25:33,880
totally separated
from this information.
523
00:25:33,880 --> 00:25:36,220
This way, no one knows
what is the origin
524
00:25:36,220 --> 00:25:39,820
of the genetic material
that you gave.
525
00:25:39,820 --> 00:25:42,390
But Yaniv has found
a serious flaw
526
00:25:42,390 --> 00:25:43,620
in biobank security.
527
00:25:43,630 --> 00:25:45,890
In fact, he's discovered
that even those of us
528
00:25:45,890 --> 00:25:50,730
who have never had our
DNA sequenced are at risk, too.
529
00:25:52,630 --> 00:25:57,100
Our DNA is vulnerable
to theft every single day.
530
00:25:57,110 --> 00:25:59,810
Just think about what happens
when you get a haircut.
531
00:25:59,810 --> 00:26:02,110
Although DNA
is something very personal,
532
00:26:02,110 --> 00:26:03,710
you shed it everywhere.
533
00:26:03,710 --> 00:26:05,210
You go to the barbershop.
534
00:26:05,210 --> 00:26:06,450
You get a shave,
535
00:26:06,450 --> 00:26:09,220
you leave some of your DNA
on the blade.
536
00:26:09,220 --> 00:26:10,920
You take a sip from a glass.
537
00:26:10,920 --> 00:26:13,120
You have some of your saliva
on the glass,
538
00:26:13,120 --> 00:26:14,990
you leave behind
some of your DNA.
539
00:26:14,990 --> 00:26:19,190
Maybe if you're chewing gum
or you smoke a cigarette,
540
00:26:19,190 --> 00:26:21,460
you leave
the cigarette butt behind.
541
00:26:21,460 --> 00:26:23,030
You left some of your DNA.
542
00:26:23,030 --> 00:26:26,400
If a gene thief
got ahold of your DNA,
543
00:26:26,400 --> 00:26:29,570
they could discover which
inherited diseases you have,
544
00:26:29,570 --> 00:26:31,170
whether you have a tendency
545
00:26:31,170 --> 00:26:33,710
towards alcoholism
or mental illness,
546
00:26:33,710 --> 00:26:36,310
and threaten to reveal
that information
547
00:26:36,310 --> 00:26:41,580
to employers or insurers
unless you pay up.
548
00:26:41,580 --> 00:26:43,550
Aah!
549
00:26:43,550 --> 00:26:45,420
The key to being able to tie
550
00:26:45,420 --> 00:26:47,920
a piece of anonymous DNA
to a name,
551
00:26:47,920 --> 00:26:50,560
whether in a biobank
or a barbershop,
552
00:26:50,560 --> 00:26:53,330
is in the "y" chromosome.
553
00:26:53,330 --> 00:26:54,560
If you're a male,
554
00:26:54,560 --> 00:26:58,730
we can know more
about your paternal ancestry
555
00:26:58,730 --> 00:27:01,130
because you inherited
a short piece of DNA
556
00:27:01,140 --> 00:27:02,570
called the "y" chromosome
557
00:27:02,570 --> 00:27:04,500
that you just get
from your father's side.
558
00:27:04,510 --> 00:27:07,670
Now, here is the funny thing
about your "y" chromosome.
559
00:27:07,680 --> 00:27:09,980
You get your surname
from your father.
560
00:27:09,980 --> 00:27:12,180
He got it from his own father.
561
00:27:12,180 --> 00:27:14,550
And you got your "y" chromosome
from the same path.
562
00:27:14,550 --> 00:27:15,780
This creates a correlation
563
00:27:15,780 --> 00:27:18,590
between "y" chromosome
and surnames.
564
00:27:18,590 --> 00:27:20,920
In men, the "y" chromosome
565
00:27:20,920 --> 00:27:23,560
contains patterns
of repeating letters of DNA,
566
00:27:23,560 --> 00:27:24,960
a genetic fingerprint
567
00:27:24,960 --> 00:27:29,200
that passes from grandfather
to father to son unchanged,
568
00:27:29,200 --> 00:27:31,100
just like the surname does.
569
00:27:31,100 --> 00:27:34,700
To prove that our genetic
privacy is under threat,
570
00:27:34,700 --> 00:27:37,540
Yaniv pretends to be
a gene thief.
571
00:27:37,540 --> 00:27:41,540
He downloads an anonymous
DNA sequence from a biobank
572
00:27:41,540 --> 00:27:44,910
and zeroes in on its unique
"y" chromosome patterns.
573
00:27:44,910 --> 00:27:47,680
Then he logs on
to a genealogy database,
574
00:27:47,680 --> 00:27:51,820
where people voluntarily upload
their "y" chromosome sequences,
575
00:27:51,820 --> 00:27:55,820
along with their names,
to locate long-lost family.
576
00:27:55,820 --> 00:28:00,190
That allows him to match
the anonymous biobank DNA
577
00:28:00,200 --> 00:28:02,330
to a specific surname.
578
00:28:02,330 --> 00:28:04,830
And since the anonymous
biobank sequences
579
00:28:04,830 --> 00:28:07,600
are tagged with the age
and state of residence
580
00:28:07,600 --> 00:28:10,500
of the person
who supplied the DNA,
581
00:28:10,510 --> 00:28:14,440
a simple Internet search
reveals their identity.
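
As a rough sketch of the re-identification logic just described, the code below matches an anonymous donor's Y-chromosome short-tandem-repeat (Y-STR) profile against a genealogy table of surname/marker pairs, then narrows candidates using the age and state attached to the biobank record; the data structures and thresholds are hypothetical stand-ins, not Yaniv's actual method or data.

from dataclasses import dataclass

@dataclass
class GenealogyRecord:
    surname: str
    y_str: dict  # marker name -> repeat count, e.g. {"DYS19": 14}

def shared_markers(a, b):
    # Count Y-STR markers with identical repeat counts in both profiles.
    return sum(1 for marker, repeats in a.items() if b.get(marker) == repeats)

def infer_surname(anon_y_str, genealogy, min_shared=12):
    # Return the surname whose Y-STR profile best matches the anonymous donor, if close enough.
    best = max(genealogy, key=lambda rec: shared_markers(anon_y_str, rec.y_str), default=None)
    if best and shared_markers(anon_y_str, best.y_str) >= min_shared:
        return best.surname
    return None

def shortlist(candidates, surname, age, state):
    # Narrow public-record candidates by surname plus the age and state tagged on the biobank entry.
    return [c for c in candidates
            if c["surname"] == surname and c["state"] == state and abs(c["age"] - age) <= 2]

With a surname inferred and the shortlist reduced to a handful of people, an ordinary web search is typically enough to complete the identification.
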
582
00:28:14,440 --> 00:28:19,010
He has done this
successfully 50 times.
583
00:28:19,010 --> 00:28:21,450
I was so shocked by the results
584
00:28:21,450 --> 00:28:23,520
that I had to take a walk
585
00:28:23,520 --> 00:28:26,120
to think about the implications
of our method
586
00:28:26,120 --> 00:28:27,250
for genetic privacy.
587
00:28:27,260 --> 00:28:29,560
It means that if hackers can get
588
00:28:29,560 --> 00:28:32,230
the de-identified
genetic information
589
00:28:32,230 --> 00:28:33,890
that is allegedly anonymous,
590
00:28:33,900 --> 00:28:35,800
it means that we cannot promise,
591
00:28:35,800 --> 00:28:37,730
we cannot guarantee
full privacy,
592
00:28:37,730 --> 00:28:39,770
and we need to seek
a different way
593
00:28:39,770 --> 00:28:43,170
to engage participants
in these large-scale biobanks.
594
00:28:43,170 --> 00:28:46,770
In the wrong hands,
a single strand of hair
595
00:28:46,780 --> 00:28:50,740
can ruin the life of the person
who left it behind.
596
00:28:53,580 --> 00:28:58,220
How can we shield ourselves
from this privacy onslaught?
597
00:28:58,220 --> 00:29:01,960
Lock ourselves in our homes
and never go outside?
598
00:29:01,960 --> 00:29:04,960
Sterilize every room
we've been in?
599
00:29:04,960 --> 00:29:06,490
One scientist thinks
600
00:29:06,500 --> 00:29:09,530
there's only one way
to save our privacy.
601
00:29:09,530 --> 00:29:13,930
For him, the best defense
is offense.
602
00:29:19,120 --> 00:29:20,520
Feels like pretty soon,
603
00:29:20,520 --> 00:29:22,720
there won't be
a minute of the day
604
00:29:22,720 --> 00:29:24,420
when we aren't being watched.
605
00:29:24,420 --> 00:29:28,090
Any device you own
could be hacked into
606
00:29:28,090 --> 00:29:30,890
and used to spy on you.
607
00:29:30,900 --> 00:29:33,830
So, what's the answer?
608
00:29:33,830 --> 00:29:36,970
Go completely off-grid?
609
00:29:36,970 --> 00:29:39,700
Maybe there's another way.
610
00:29:39,700 --> 00:29:44,240
We could develop technology
to know when we're being watched
611
00:29:44,240 --> 00:29:49,140
and when we truly have privacy.
612
00:29:49,150 --> 00:29:54,650
Steve Mann has worn a computer
every day for the last 38 years.
613
00:29:54,650 --> 00:29:56,920
In fact, he's been called
614
00:29:56,920 --> 00:29:59,890
the father
of wearable computing.
615
00:29:59,890 --> 00:30:04,160
His ideas inspired better-known
devices like Google Glass.
616
00:30:04,160 --> 00:30:06,200
But back when he began,
617
00:30:06,200 --> 00:30:09,300
his digital eyeglass
was so bulky,
618
00:30:09,300 --> 00:30:12,430
he was often the subject
of ridicule.
619
00:30:12,440 --> 00:30:14,900
So, 35 years
of digital eyeglass,
620
00:30:14,910 --> 00:30:17,770
and finally we see
how the industry is catching on
621
00:30:17,780 --> 00:30:19,270
to some of these concepts.
622
00:30:19,280 --> 00:30:22,280
So I feel kind of vindicated
after people laughed at me
623
00:30:22,280 --> 00:30:25,380
for all the sort of stupid
eyeglasses and crazy things.
624
00:30:25,380 --> 00:30:29,850
Steve, a professor
at the University of Toronto,
625
00:30:29,850 --> 00:30:32,390
has a cult following
among his students
626
00:30:32,390 --> 00:30:34,620
as the original cyborg.
627
00:30:34,630 --> 00:30:38,360
His digital eyewear
is bolted to his skull.
628
00:30:38,360 --> 00:30:42,430
His interest in using technology
to augment what he could see
629
00:30:42,430 --> 00:30:44,470
began when he was a kid.
630
00:30:44,470 --> 00:30:46,770
Then in the 1970s,
631
00:30:46,770 --> 00:30:50,140
I started to notice
these things watching us
632
00:30:50,140 --> 00:30:51,370
and sensing us...
633
00:30:51,380 --> 00:30:54,080
microwave motion detectors
and burglar alarms
634
00:30:54,080 --> 00:30:55,310
and stuff like that.
635
00:30:55,310 --> 00:30:56,580
And I was wondering,
636
00:30:56,580 --> 00:30:58,110
"well, why are all
these machines spying on us?"
637
00:30:58,120 --> 00:31:01,480
And today he runs
an entire research team
638
00:31:01,490 --> 00:31:03,950
dedicated to developing
technology
639
00:31:03,950 --> 00:31:07,060
that can sniff out
when we're being surveilled.
640
00:31:07,060 --> 00:31:09,690
So, this device
will help you identify
641
00:31:09,690 --> 00:31:12,330
what devices
are recording your sound.
642
00:31:12,330 --> 00:31:15,800
So, the lights over here
move faster and bigger
643
00:31:15,800 --> 00:31:17,230
near a microphone.
644
00:31:17,230 --> 00:31:19,330
And then it locates the mic,
645
00:31:19,340 --> 00:31:21,970
and that's how
you can sweep bugs.
646
00:31:21,970 --> 00:31:25,470
They have devices
that can pick up radio waves,
647
00:31:25,480 --> 00:31:28,180
including those
from your cellphone.
648
00:31:28,180 --> 00:31:30,850
So, the radio waves coming
from my smartphone here,
649
00:31:30,850 --> 00:31:32,180
for example...
650
00:31:32,180 --> 00:31:35,080
if I block that with my hand,
the wave is very weak.
651
00:31:35,090 --> 00:31:38,190
See how weak that wave is
when it's going through my hand,
652
00:31:38,190 --> 00:31:41,120
and then, whereas if I hold it
like this,
653
00:31:41,130 --> 00:31:42,960
the wave is much stronger.
654
00:31:42,960 --> 00:31:45,460
Perhaps
his most important invention
655
00:31:45,460 --> 00:31:48,160
in this age
of near total surveillance
656
00:31:48,170 --> 00:31:51,330
is technology
that can detect precisely
657
00:31:51,340 --> 00:31:54,100
when you're being watched
by a camera.
658
00:31:54,100 --> 00:31:56,540
So, there's a camera
inside this dome,
659
00:31:56,540 --> 00:31:58,440
and we don't know
which way it's pointing
660
00:31:58,440 --> 00:32:00,280
because it's shrouded
in this dark dome.
661
00:32:00,280 --> 00:32:01,710
But the light here,
662
00:32:01,710 --> 00:32:04,150
when it comes into the field
of view of the camera, glows,
663
00:32:04,150 --> 00:32:06,350
and when it goes out
of the field of the camera,
664
00:32:06,350 --> 00:32:07,820
it goes dim again.
665
00:32:07,820 --> 00:32:11,090
And so you can see here it sort
of paints out, if you will,
666
00:32:11,090 --> 00:32:14,120
the sight field of the camera.
667
00:32:14,120 --> 00:32:17,730
If I put my coat in front of it,
my jacket, the bulb...
668
00:32:17,730 --> 00:32:19,360
I haven't moved the bulb at all.
669
00:32:19,360 --> 00:32:21,330
I've just blocked it
with my jacket,
670
00:32:21,330 --> 00:32:22,960
and when I unblock it, it glows.
671
00:32:22,970 --> 00:32:27,900
Most of us are used to
seeing cameras everywhere.
672
00:32:27,910 --> 00:32:31,740
But Steve believes that if we
knew when we were being watched,
673
00:32:31,740 --> 00:32:36,550
we'd start asking more questions
about who's watching and why.
674
00:32:36,550 --> 00:32:37,880
It could be the police.
675
00:32:37,880 --> 00:32:39,050
It could be a computer.
676
00:32:39,050 --> 00:32:40,880
It could be
artificial intelligence.
677
00:32:40,880 --> 00:32:43,220
It could be machine learning.
We often don't know.
678
00:32:43,220 --> 00:32:45,920
Many times,
surveillance embraces hypocrisy,
679
00:32:45,920 --> 00:32:48,220
wanting to watch
and not be watched,
680
00:32:48,230 --> 00:32:50,060
wanting to see and not be seen,
681
00:32:50,060 --> 00:32:52,360
wanting to know
everything about us,
682
00:32:52,360 --> 00:32:54,260
but reveal nothing about itself.
683
00:32:54,260 --> 00:32:56,370
To redress that balance,
684
00:32:56,370 --> 00:32:59,100
Steve is working
to commercialize technology
685
00:32:59,100 --> 00:33:01,900
to detect the zones
where a camera sees us...
686
00:33:01,910 --> 00:33:05,740
What he calls
its veilance field.
687
00:33:05,740 --> 00:33:08,110
Ryan Jansen
is working with Steve
688
00:33:08,110 --> 00:33:12,380
on getting a veilance-field
detector into a wearable device.
689
00:33:12,380 --> 00:33:17,620
These are some glasses where
I can see the veilance fields
690
00:33:17,620 --> 00:33:19,860
from this surveillance camera.
691
00:33:19,860 --> 00:33:23,530
So, what it does is pokes
and prods at the optical field
692
00:33:23,530 --> 00:33:26,660
until it figures out
how much the camera is seeing.
693
00:33:26,660 --> 00:33:28,500
So, you can really
take this around
694
00:33:28,500 --> 00:33:30,270
and measure
a whole veilance field
695
00:33:30,270 --> 00:33:31,730
from a surveillance camera.
696
00:33:31,740 --> 00:33:35,140
What I'm excited about is to be
able to finally see and know
697
00:33:35,140 --> 00:33:36,540
how much we're watched,
698
00:33:36,540 --> 00:33:38,810
know how much the watchers
are watching us.
699
00:33:38,810 --> 00:33:40,910
Veilance fields
700
00:33:40,910 --> 00:33:43,750
aren't always places
you want to avoid.
701
00:33:43,750 --> 00:33:47,050
Sometimes
you may want to be watched.
702
00:33:47,050 --> 00:33:49,620
A lot of people say, "Oh,
you're into sensing cameras,
703
00:33:49,620 --> 00:33:51,450
so you must be against cameras."
704
00:33:51,460 --> 00:33:53,820
Sometimes I'm walking home
late at night in a dark alley
705
00:33:53,820 --> 00:33:55,590
and there's somebody
sharpening a knife
706
00:33:55,590 --> 00:33:57,160
and somebody
loading a gun down there,
707
00:33:57,160 --> 00:33:58,260
I might say, "you know what?"
708
00:33:58,260 --> 00:33:59,660
I think I'd like to be watched,"
709
00:33:59,660 --> 00:34:01,960
sort of say, "oh, there's
a veilance flux over there."
710
00:34:01,970 --> 00:34:03,930
I think I'm gonna move
towards the camera."
711
00:34:03,930 --> 00:34:06,900
What I'm really against
is the one-sided veilance.
712
00:34:06,900 --> 00:34:08,900
Government or big business
713
00:34:08,910 --> 00:34:12,010
use cameras
to watch regular people.
714
00:34:12,010 --> 00:34:14,780
But regular people
rarely turn their cameras
715
00:34:14,780 --> 00:34:16,680
on big business and government.
716
00:34:16,680 --> 00:34:20,980
Steve believes wearable devices
like his digital eyeglass
717
00:34:20,980 --> 00:34:24,520
can help us watch the watchers.
718
00:34:24,520 --> 00:34:26,190
"Surveillance" is a French word
719
00:34:26,190 --> 00:34:27,690
that means
"to watch from above."
720
00:34:27,690 --> 00:34:29,460
When we're doing the watching,
721
00:34:29,460 --> 00:34:32,030
we call that undersight,
or sousveillance.
722
00:34:32,030 --> 00:34:34,930
But Steve has already discovered
723
00:34:34,930 --> 00:34:37,530
that sousveillance
can invite trouble.
724
00:34:37,530 --> 00:34:40,340
Recently, he walked
into a fast-food restaurant
725
00:34:40,340 --> 00:34:41,970
wearing his digital eyeglass
726
00:34:41,970 --> 00:34:44,810
and was confronted by employees
enforcing policies
727
00:34:44,810 --> 00:34:46,940
that don't allow filming
in their buildings.
728
00:34:46,940 --> 00:34:48,180
Hey!
729
00:34:48,180 --> 00:34:51,580
Despite the eyeglass
being bolted to his skull,
730
00:34:51,580 --> 00:34:54,120
the employees
tried to remove it,
731
00:34:54,120 --> 00:34:57,220
damaging it in the process.
732
00:34:57,220 --> 00:34:59,990
The cameras want to watch
but not be seen.
733
00:34:59,990 --> 00:35:02,090
And, in fact,
even if you photograph cameras,
734
00:35:02,090 --> 00:35:04,690
you find very quickly people
come running out to tell you,
735
00:35:04,700 --> 00:35:05,990
"no cameras are allowed here."
736
00:35:06,000 --> 00:35:07,730
And you say, "well, aren't those
all cameras around here?"
737
00:35:07,730 --> 00:35:09,830
"Oh, no, but those aren't cameras.
They're surveillance."
738
00:35:09,830 --> 00:35:13,370
Steve believes that if
we know when we're being watched
739
00:35:13,370 --> 00:35:16,170
and if sousveillance
becomes widespread,
740
00:35:16,170 --> 00:35:18,510
we'll finally have
the weapons we need
741
00:35:18,510 --> 00:35:22,040
to fight back against
the governments and corporations
742
00:35:22,050 --> 00:35:25,510
that constantly peer
into our private lives.
743
00:35:25,520 --> 00:35:27,520
The goal is to create systems
744
00:35:27,520 --> 00:35:30,520
that improve the quality
of people's lives,
745
00:35:30,520 --> 00:35:32,990
systems in which people
are innately aware
746
00:35:32,990 --> 00:35:34,190
of what's happening,
747
00:35:34,190 --> 00:35:36,360
to create a society in which
748
00:35:36,360 --> 00:35:39,930
sousveillance is balanced
with surveillance.
749
00:35:39,930 --> 00:35:43,570
One day,
widespread digital eyesight
750
00:35:43,570 --> 00:35:46,940
will merge sousveillance
and surveillance
751
00:35:46,940 --> 00:35:50,010
and transform society.
752
00:35:50,010 --> 00:35:52,570
Although someone
may be watching you,
753
00:35:52,580 --> 00:35:54,840
you can now
do your own watching.
754
00:35:54,850 --> 00:35:58,050
But what will a world
where everyone is watched
755
00:35:58,050 --> 00:36:00,620
and everyone is a watcher
look like?
756
00:36:00,620 --> 00:36:06,150
What will life be like
in a world with no more secrets?
757
00:36:11,590 --> 00:36:15,260
We stand
on the brink of a new era.
758
00:36:15,260 --> 00:36:17,060
Governments and corporations
759
00:36:17,060 --> 00:36:19,930
are peering into every corner
of our lives.
760
00:36:19,930 --> 00:36:24,430
And we are developing tools
to watch the watchers.
761
00:36:24,430 --> 00:36:29,200
So, will life in a world
with almost no secrets
762
00:36:29,210 --> 00:36:31,510
be a living nightmare?
763
00:36:31,510 --> 00:36:35,640
Or will the naked truth
set us free?
764
00:36:38,610 --> 00:36:42,180
Futurist and science-fiction
author David Brin
765
00:36:42,180 --> 00:36:45,120
thinks there is no point
in trying to hide,
766
00:36:45,120 --> 00:36:47,890
so he's putting it all
on display.
767
00:36:48,960 --> 00:36:51,090
In this modern era,
768
00:36:51,090 --> 00:36:54,400
when eyes are
proliferating everywhere,
769
00:36:54,400 --> 00:36:57,130
with the cameras getting
smaller, faster, cheaper,
770
00:36:57,130 --> 00:36:59,100
more numerous every day,
771
00:36:59,100 --> 00:37:01,370
the human reflex is to say,
772
00:37:01,370 --> 00:37:03,000
"get those things away from me.
773
00:37:03,010 --> 00:37:04,240
Ban them."
774
00:37:04,240 --> 00:37:07,640
But over the long run,
775
00:37:07,640 --> 00:37:10,840
that approach
is not only futile.
776
00:37:10,850 --> 00:37:14,620
It also is kind of cowardly.
777
00:37:17,350 --> 00:37:19,950
David has got used to the idea
778
00:37:19,960 --> 00:37:23,490
that even private spaces
aren't so private anymore.
779
00:37:23,490 --> 00:37:25,260
He thinks the key
to getting comfortable
780
00:37:25,260 --> 00:37:27,700
is to look to the past.
781
00:37:27,700 --> 00:37:30,100
After all,
for most of human history,
782
00:37:30,100 --> 00:37:31,670
we lived without privacy.
783
00:37:31,670 --> 00:37:34,070
- Thank you.
- Sure.
784
00:37:34,070 --> 00:37:35,670
Cheers.
785
00:37:35,670 --> 00:37:38,170
Our ancestors didn't have
much of a concept of privacy.
786
00:37:38,170 --> 00:37:41,210
Families would crowd
into single cottages,
787
00:37:41,210 --> 00:37:42,810
knowing each other's business,
788
00:37:42,810 --> 00:37:44,880
seeing everything
that was going on.
789
00:37:44,880 --> 00:37:49,050
The advantage was
everybody knew your name.
790
00:37:49,050 --> 00:37:50,990
There was
some sense of solidarity.
791
00:37:50,990 --> 00:37:54,320
But the olden times
were no utopia.
792
00:37:54,320 --> 00:37:56,420
The disadvantages were huge.
793
00:37:56,430 --> 00:37:57,890
You were dominated
794
00:37:57,890 --> 00:38:00,190
by the lord on the hill
and his thugs
795
00:38:00,200 --> 00:38:05,830
and by the local busybodies
who knew everybody's business.
796
00:38:05,840 --> 00:38:10,370
Today, we have our own
versions of these watchers.
797
00:38:10,370 --> 00:38:13,610
You can think of
the lord of the village
798
00:38:13,610 --> 00:38:15,410
as the NSA or the FBI.
799
00:38:18,150 --> 00:38:21,220
The busybodies are
the media and your neighbors,
800
00:38:21,220 --> 00:38:24,520
who can see
almost anything you do.
801
00:38:24,520 --> 00:38:27,820
David thinks
it's with our fellow citizens,
802
00:38:27,820 --> 00:38:29,190
not the government,
803
00:38:29,190 --> 00:38:32,530
that the battle to reclaim
our privacy must begin.
804
00:38:32,530 --> 00:38:34,630
The first step is to make sure
805
00:38:34,630 --> 00:38:38,670
people who are watching us
and talking about us can't hide.
806
00:38:38,670 --> 00:38:41,600
We're all so used
to personal gossip,
807
00:38:41,600 --> 00:38:47,540
where exchanging stories
about other people is so natural
808
00:38:47,540 --> 00:38:49,440
that we put up with
809
00:38:49,450 --> 00:38:53,210
the filthier,
more destructive aspects
810
00:38:53,220 --> 00:38:55,320
as just being part of life.
811
00:38:55,320 --> 00:38:57,450
What's going to bring this
to a head
812
00:38:57,450 --> 00:38:59,150
is what's happening online.
813
00:39:00,360 --> 00:39:04,060
We all know about
horrible crimes of bullying
814
00:39:04,060 --> 00:39:08,560
that have taken place online,
empowered by anonymity.
815
00:39:08,560 --> 00:39:11,970
But we can use
the tools of surveillance
816
00:39:11,970 --> 00:39:15,400
to expose prying eyes.
817
00:39:15,400 --> 00:39:19,410
The way to deal with the eyes
is to spot them.
818
00:39:19,410 --> 00:39:21,110
Hey!
819
00:39:21,110 --> 00:39:24,780
To find out who's looking
and hold them accountable.
820
00:39:24,780 --> 00:39:28,220
If we all look back
at the watchers,
821
00:39:28,220 --> 00:39:31,190
we have the power to change
the way they behave.
822
00:39:31,190 --> 00:39:36,790
It's a step towards what David
calls the transparent society.
823
00:39:36,790 --> 00:39:39,490
Transparency
can stamp out bad behavior
824
00:39:39,500 --> 00:39:40,960
from nosy neighbors.
825
00:39:40,960 --> 00:39:43,000
They won't be so quick
to talk about you
826
00:39:43,000 --> 00:39:45,000
if they know
you could talk about them.
827
00:39:45,000 --> 00:39:47,230
But it doesn't stop there.
828
00:39:47,240 --> 00:39:50,770
It ripples
all the way up our society.
829
00:39:50,770 --> 00:39:54,410
2013 was the best year
for civil liberties
830
00:39:54,410 --> 00:39:57,380
in the United States of America
in a generation.
831
00:39:59,280 --> 00:40:00,750
That was the year
832
00:40:00,750 --> 00:40:02,650
that the administration joined
the courts
833
00:40:02,650 --> 00:40:05,720
in declaring
a universal right of citizens
834
00:40:05,720 --> 00:40:09,590
to record their encounters
with police.
835
00:40:09,590 --> 00:40:11,990
It is empowering the good cops,
836
00:40:11,990 --> 00:40:16,200
but it's empowering groups
like Black Lives Matter to say,
837
00:40:16,200 --> 00:40:19,170
"what you do to us
is what matters,
838
00:40:19,170 --> 00:40:21,100
and now we can prove it."
839
00:40:21,100 --> 00:40:24,000
A loss of privacy
for those in power
840
00:40:24,010 --> 00:40:26,870
can make society better.
841
00:40:26,880 --> 00:40:30,210
In fact, to make
the transparent society work,
842
00:40:30,210 --> 00:40:33,610
David believes the government's
right to secrecy
843
00:40:33,620 --> 00:40:35,580
must be massively curtailed.
844
00:40:35,580 --> 00:40:38,390
It should be able to keep
secrets for a while,
845
00:40:38,390 --> 00:40:41,920
like plans to arrest criminals
or launch military invasions,
846
00:40:41,920 --> 00:40:44,960
but nothing should
stay secret forever.
847
00:40:44,960 --> 00:40:47,660
Any practical,
tactical value to a secret
848
00:40:47,660 --> 00:40:51,630
is going to decay over time.
849
00:40:51,630 --> 00:40:55,470
Let's say government agencies
and corporations
850
00:40:55,470 --> 00:40:59,040
can get five years of secrecy
for free.
851
00:40:59,040 --> 00:41:00,310
After five years,
852
00:41:00,310 --> 00:41:03,540
you have to cache the secrets
in a secure place
853
00:41:03,550 --> 00:41:07,820
and pay money to extend it
for another five years.
854
00:41:07,820 --> 00:41:11,190
It's for this reason
that David supports
855
00:41:11,190 --> 00:41:13,520
whistle-blowers
like Edward Snowden.
856
00:41:13,520 --> 00:41:17,190
They shine a light in the dark
corners of government.
857
00:41:17,190 --> 00:41:19,090
And in a free society,
858
00:41:19,090 --> 00:41:22,530
their leaks
ultimately make us stronger.
859
00:41:22,530 --> 00:41:25,100
Everything leaks.
860
00:41:25,100 --> 00:41:27,100
Not a month goes by
861
00:41:27,100 --> 00:41:30,700
when something
has not hemorrhaged
862
00:41:30,710 --> 00:41:32,140
all over the Internet,
863
00:41:32,140 --> 00:41:33,340
getting headlines.
864
00:41:33,340 --> 00:41:35,380
But somehow western governments
865
00:41:35,380 --> 00:41:38,080
and western civilization
keep surviving.
866
00:41:38,080 --> 00:41:40,710
In fact, it makes us better.
867
00:41:40,720 --> 00:41:42,580
Now think about our enemies...
868
00:41:42,590 --> 00:41:45,920
terrorists,
tyrannical governments,
869
00:41:45,920 --> 00:41:47,620
and criminal gangs.
870
00:41:47,620 --> 00:41:49,960
To them, it's lethal.
871
00:41:49,960 --> 00:41:52,760
The world is never going back
872
00:41:52,760 --> 00:41:56,230
to the way it was
just two decades ago.
873
00:41:56,230 --> 00:41:58,230
Eyes will be everywhere.
874
00:41:58,230 --> 00:42:00,870
There will be no escaping them.
875
00:42:00,870 --> 00:42:06,240
But if we change our behavior,
we can keep the privacy we need.
876
00:42:06,240 --> 00:42:08,240
Privacy is essential
to being human.
877
00:42:09,550 --> 00:42:13,480
We're just going to
have to defend it differently
878
00:42:13,480 --> 00:42:14,920
and redefine it.
879
00:42:14,920 --> 00:42:20,790
We will probably look back
on the last couple of centuries
880
00:42:20,790 --> 00:42:23,620
as a golden age of privacy,
881
00:42:23,630 --> 00:42:28,660
a time before the age
of almost total surveillance.
882
00:42:28,660 --> 00:42:30,860
But there is an upside.
883
00:42:30,870 --> 00:42:34,500
If we accept that
we are going to be watched,
884
00:42:34,500 --> 00:42:37,340
then governments
and corporations
885
00:42:37,340 --> 00:42:39,340
must accept the same.
886
00:42:39,340 --> 00:42:42,740
We need the privacy
of our bedrooms.
887
00:42:42,750 --> 00:42:46,050
Government needs the privacy
of its war rooms.
888
00:42:46,050 --> 00:42:50,180
Beyond that, our society
will be transparent.
889
00:42:50,190 --> 00:42:55,020
And this loss of secrecy
could herald a new age,
890
00:42:55,020 --> 00:42:57,760
the age of honesty.