English subtitles for Dark Net S02E05 "My Identity" (Dark.Net.S02E05.My.Identity.HDTV.x264-aAF.en)

Synced and corrected by susinz (www.addic7ed.com)
I've been told that I look like an average person. But some of the characteristics that somewhat stick out are, I would say, I have a big nose. It's not very symmetrical. My eyes are spaced apart. I do have distinct moles on the right side of my face. I think a lot of people probably look like that.
The face... it's the way we recognize each other. But to machines, we are code. We are being transformed into a face print, a unique set of points that converts our very humanity into data, traceable, trackable, forever, online or on the streets. Will you ever be a face in the crowd again?
On September 5th, we opened the bank at 9:00. We had a line, probably five or six people in line. The next client that came to the window handed me a piece of paper. It was enclosed in a plastic envelope. And I'm going, "Oh, my gosh," you know. And it's like my heart just sunk.
On September 5, 2014, Bonita Shipp came face-to-face with a bank robber.
As I was getting the money ready, I studied his face. I tried to recognize any markings at all. He had sunglasses on. And he had a cap on. So I couldn't see anything from his eyes up. But I looked at his nose. He had thin lips. He had no tattoos, no scars, nothing. So I took all of the money out of my drawer, put it all in an envelope. And I handed it to him. And then he left. He got away with just a little under $3,000.
One of the most prominent things about humans is that they use their vision to do many things. We recognize friend from foe. We recognize familiar faces. We recognize objects. We use our vision in a way that's very, very essential to our survival.
Could computers one day recognize faces? That's the question that haunted face recognition pioneer Joseph Atick.
I grew up in Jerusalem, where it was necessary to always travel with an I.D., to present it two or three times a day to go from one town to the other. I believed that we needed something that was more effective in securing the world. So I applied mathematical techniques to the study of the human brain.
There are four elements in a facial-recognition technology. One is the algorithm, which is inspired by the human brain. Second was the need to have a camera in order to allow the vision, like the eye. You needed to also have the database. You needed to have the memory of people who are known. Humans, for example, we remember about 2,000 people. That's our database. And in order to run the algorithm, we need processing power. We do it effortlessly because we don't think of it. But, in fact, when I look at your face, many, many calculations happen in my brain. It is so innate that we don't even think about it.
Three or four years of mathematical work took us to develop the first facial-recognition algorithm.
This is FaceIt, the first commercial product using face-recognition technology, developed by Atick and his team in 1995. It opened a new window to our digital world.
The genie's now out of the bottle. And any attempt to put it back, technologically, is doomed, because I started getting a lot of calls from intelligence agencies around the world who thought this would be quite useful for their mission.
Facial recognition is simply the ability for law enforcement to electronically search against millions of photo images.
Next Generation Identification, the FBI's biometric network, overseen by Assistant Director Stephen Morris.
The system is looking for key features of a face. It'll measure the distance between the eyes, the distance between the ears, the ears in relation to the mouth. And then it's looking for other images in that repository that have those same measurements.
Our database consists of around 30 million mug-shot photos. There are also repositories of photographs that have been lawfully collected, such as visa photos, travel documents and driver's license photo files. So we're talking about more than just 30 million photos that can be searched using facial-recognition technology. The potential is unlimited.
About half of American adults are in a law-enforcement database. Most don't even know it. Housing the world's largest biometrics database is a 100,000-square-foot data center, about the size of a lower Manhattan block.
Just be advised, there was an RP accident at 112 precinct.
I've been working in the 6-7 for about 2 1/2, going on 3 years. We have the pictures of people who were involved in a shooting. So when you come into contact with that person, they know who we are. But now we have a step up, because we know who you are.
About 35,000 officers patrol the streets of New York City. But even the nation's largest police force can use an extra pair of eyes... or 10,000 of them.
These cameras aren't just watching. With the help of live analytics, they can detect what the human eye cannot: a shot fired, a suspicious package or a suspect running from a crime. And they transmit this data directly to the real-time crime center.
The main purpose of this unit is to help identify anybody who's unknown in a criminal investigation. The software enhances surveillance footage. We're able to convert a 2D image to a 3D image. And it's going to convert that image to a more proper pose that we're going to need, similar to a driver's license or a mug shot.
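The matching step described above, measuring a handful of distances between facial features and then ranking repository photos by how closely their measurements agree, can be sketched roughly as follows. This is only an illustration of the idea, not the FBI's or NYPD's actual algorithm; the landmark set, coordinates and scoring below are invented for the example.

```python
# Minimal sketch: represent each face as a vector of normalized distances
# between a few landmarks, then rank a gallery by how closely its
# measurements match the probe. All data here is hypothetical.
import numpy as np

LANDMARKS = ["left_eye", "right_eye", "left_ear", "right_ear", "mouth"]

def face_signature(points):
    """Turn landmark coordinates into a scale-invariant distance vector."""
    coords = np.array([points[name] for name in LANDMARKS], dtype=float)
    diffs = coords[:, None, :] - coords[None, :, :]    # pairwise differences
    dists = np.linalg.norm(diffs, axis=-1)             # pairwise distances
    upper = dists[np.triu_indices(len(LANDMARKS), k=1)]
    return upper / dists[0, 1]                         # normalize by inter-eye distance

def rank_gallery(probe_points, gallery):
    """Rank gallery photos by closeness of their measurements to the probe."""
    sig = face_signature(probe_points)
    scores = {photo_id: -float(np.linalg.norm(sig - face_signature(pts)))
              for photo_id, pts in gallery.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical probe image and a two-photo gallery (pixel coordinates).
probe = {"left_eye": (100, 120), "right_eye": (160, 118),
         "left_ear": (70, 140), "right_ear": (190, 138), "mouth": (130, 200)}
gallery = {
    "mugshot_001": {"left_eye": (50, 60), "right_eye": (80, 59),
                    "left_ear": (35, 70), "right_ear": (95, 69), "mouth": (65, 100)},
    "dmv_4417":    {"left_eye": (52, 61), "right_eye": (81, 60),
                    "left_ear": (30, 72), "right_ear": (102, 70), "mouth": (66, 95)},
}
print(rank_gallery(probe, gallery))   # candidate list, best match first
```

Modern systems generally replace hand-picked distances with learned embeddings, but the search-and-rank structure is the same.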
Facial recognition has tremendous potential when you're looking for the proverbial needle in a haystack. It provides you a lead, particularly in an instance where you don't know who it is you're looking for.
My name is Steve Talley. I lived in Colorado. I had a beautiful family. I was a loving father. I have two kids. I have a daughter who is 12 this February. I have a son who had just turned 9 last week. We lived in a very nice, family-oriented community. And I had a great career in financial services. And I was, at that time, excited about my life and my prospects and the future. I was living the American dream... or so I thought.
My life changed dramatically. I got divorced from my ex-wife. And then I got laid off due to corporate restructuring. I had financial obligations. I still had child-support payments of about $2,000 a month. But I considered myself to still be an average, law-abiding citizen.
But the authorities thought he was living a double life as a serial bank robber. And even local news joined in the hunt.
Do you recognize this man? Denver police are looking for him tonight. They say he robbed the U.S. Bank on South Colorado Boulevard near East Mississippi and Glendale last week. Have a look at his picture taken by surveillance cameras in the bank. Police say he may be armed with a gun.
One day, there was a pounding at my door. All of a sudden, I see a gentleman with the FBI jacket. He handcuffed me behind my back. He said, "Do you know why we're arresting you?" He said, "We're arresting you for two armed bank robberies and assaulting a police officer." I was driven to the detention center. I was in prison in a maximum-security pod because they had very strong facial recognition that proved that I was the guy. But I always said I was innocent. I had an air-tight alibi. I shared it with everyone.
I had my own witnesses for the alibi come in and prove I was there. But my only crime is I, apparently, look like someone else. I really have nothing to hide.
Nothing to hide, nothing to fear, right? Face rec gets the bad guys off the streets. So what's the harm?
I grew up in a world where identity was part of our daily experience at a time when the world was in conflict. In societies where there was an oppressive regime, there was a chilling factor. People did not express themselves freely because there was a fear that they would be persecuted. Now we have a different kind of chilling factor. And it is driven not by governments, necessarily, but by the surveillance camera. And that chilling factor, it means we're going to change our behavior. And we no longer live in a free society. Will that be a society that we will accept?
Zoom in. Try to get as close up as you can. Take a quick snapshot.
By walking into a casino, you have effectively given up your privacy, because, in a casino, you really want to know your customer. So facial recognition dramatically improved the ability to actively track card counters, high net-worth individuals, cheaters and all sorts of other individuals that the casinos are interested in tracking.
The man behind this technology is Wyly Wade, who provides it to 200 casinos across the country.
Facial recognition is contactless. It is noninvasive. And it is the link from your digital environment to your physical environment. Part of this is a very personal issue to me. I've got a daughter with special needs. She's deaf. She's autistic. My daughter loves to climb. She flips. She twirls. I'm the human jungle gym. Oh, now you're going to run away, huh? She laughs. She plays. She runs. She hides.
But she doesn't have this filter on what is good and what is bad. So we need an extra level of security to re-create some of that filter for her.
We have cameras that automatically rotate, pan, tilt, zoom, all based off of either sound or motion. If that camera detects that there's a face there, then what it does is it drops it down to our security system and then compares that to the people that I don't want in our house, so that way you can be prewarned, or preconfirm whether or not that person is a rapist or sex offender, because we have hundreds of thousands of registered sex offenders in the United States. So if the UPS driver happens to be a registered sex offender, yeah, I'd like to be notified about that.
Facial recognition gives us peace of mind. You don't know where the real danger lies. You don't know who the hooligans are. You probably don't even know your neighbors.
Facial recognition is not positive identification. If you're looking for an individual, and you submit a search, and you get 14 back in your gallery, there could be 14 wrong people in there. And that is where it's up to the investigator to take the information that comes with that picture.
After the robbery, the police and the FBI came, and they interviewed me, and they wanted me to identify the person and do a photo lineup. There were six people on that page. You know, I said, "Well, this kind of looks like the guy." And they asked me like, "What percentage do you think?" And I said, "Well, probably maybe 85%, but I'm not 100% sure."
Instead of using my mug shot in this photo lineup, which they typically do because it's the most recent picture, they actually used the picture where I got my DUI in 2011. They're using this picture from 5 years ago where I look younger.
That same DUI mug shot is what the FBI used to compare Talley to the robber.
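The "gallery of candidates" point above, a search that returns a ranked list of possible matches, any or all of which may be wrong, boils down to thresholding and sorting similarity scores. A minimal sketch follows, with invented scores, record names and cutoff; no real system's scoring is implied.

```python
# Hypothetical sketch: a probe search returns every enrolled face whose
# similarity clears a threshold, ranked best-first -- a list of leads,
# not a positive identification.
from dataclasses import dataclass

@dataclass
class Candidate:
    record_id: str
    similarity: float   # 0.0 = no resemblance, 1.0 = identical template

def search_gallery(similarities, threshold=0.80, max_candidates=14):
    """Return ranked candidates above the threshold; all of them may be wrong."""
    hits = [Candidate(rid, s) for rid, s in similarities.items() if s >= threshold]
    hits.sort(key=lambda c: c.similarity, reverse=True)
    return hits[:max_candidates]

# Invented similarity scores for one probe image against a small repository.
probe_scores = {"mugshot_104": 0.91, "dmv_552": 0.86,
                "visa_031": 0.79, "mugshot_977": 0.83}

for c in search_gallery(probe_scores):
    # Every hit is an investigative lead that still needs human verification.
    print(f"LEAD ONLY: {c.record_id} (similarity {c.similarity:.2f})")
```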
I didn't think he looked like me. But I could see some similarities. And I think he probably looked like a gazillion people.
But in court, the eyewitness faces the suspect not in pixels but in the flesh.
I insisted I would have to see Mr. Talley in person, because I want to make sure that my decision is right. The robber had no markings, no moles. And when you see this Mr. Talley, he has a mole on his cheek. Also, Mr. Talley had a long nose. The other guy's nose was shorter. And Mr. Talley is a big man. And the robber was not. And I told the judge, I said, "If you're asking me if this was the guy that was the robber," I said, "I am absolutely, 100% positive it is not."
The FBI and local police said it was me based on the similar features. But they totally ignored features that would exclude me as being a suspect. Finally, they wanted to do a height analysis. The bank robber was 6 foot. And they determined I was 6'3 1/2". So I was 3 inches too tall. That proved I couldn't possibly be the guy.
His face got him arrested. But his height set him free.
I did get dismissed. But they seemed to be trying to build a case where there really wasn't a case there. They didn't have any evidence, not even a shred, except that I looked like the bank robber. This is really scary stuff, this technology. It could be a great shortcut to help with investigations but at the expense of possibly, you know, targeting the wrong people.
It could happen to you. As the database of faces grows larger, so does your chance of having a doppelganger.
There are more cameras now than humans in the world. With people having cameras in their pocket, cameras in the street, there are billions of cameras in the world. That is a natural progression. And we saw some of it happening.
But one thing we did not see, that no one saw or counted on, was the emergence of social media. If, in the '90s, I told you that, one day, there was going to be a database where people voluntarily add their faces and the faces of their friends and the faces of their children, and that database could be used to identify every single one of those billion people, I would be laughed at back then. But now Facebook is essentially a database that is the dream of a Big Brother.
With 1.8 billion users and 350 million pictures uploaded every day, we are the ones training Facebook's facial recognition. How? By tagging ourselves and our friends. Their algorithm is so strong, Facebook can I.D. you even if your face is hidden.
There is money to be made from the recognition of your face. Imagine if you walked around and, on top of your head, it said your name, how much money you have, what you like, what your preferences in life are. Machines that run algorithms will start to make decisions for us. And as a consequence, we may lose the battle for privacy.
I think privacy is a misnomer at this point. We have to get over the idea that you own your data and your identity. We gave up our right to most of our data every time we signed up to Google or to Facebook, because it was convenient. It improved your life in a lot of ways. But it also broke down a lot of barriers, to where that data that you claim is your identity is no longer your data. You don't own it. We've almost gone to a post-privacy era.
What does post-privacy look like? Just ask the Russians.
FindFace, an app that allows you to find any person in the largest Russian social network. It lets a user photograph a stranger, upload that picture and compare it to social media profiles to unmask the person's identity.
The service allows you to not only find the desired user but also to send them messages and other information. In other words, it's a dream come true for stalkers.
To see face recognition being abused in a certain way means that we no longer live in a free society. But face recognition can be good as long as there's oversight to make sure no innocent individuals are accused.
Now I'll wake up in the middle of the night, and I won't know where I'm at. All of a sudden, it dawns on me. This nightmare is my life. Here I am. I'm actually in a shelter right now.
Even though they had no case, it dragged on. It's two years later, and I am still struggling. Because of this incident, I lost my housing. And even just being associated with bank robberies has, basically, effectively, blackballed me or destroyed my career in financial services. No one's going to touch me.
The biggest thing is my custody situation. I haven't been able to see my kids in two years. I've missed the Christmases, the birthdays, the Thanksgivings, all the memories of them growing up. I'm always concerned about them, what they're feeling, what their thoughts are about me. Do they feel that they've been abandoned, that their father doesn't love them?
Being homeless is extremely hard. You know, I feel isolated. I feel like I'm invisible, too, because a lot of people don't like to look at the homeless. I'm in a pretty big hole right now that I have to dig myself out of. I'm fighting for my life back. That's really what this is about: fighting to get my life back to where it was, to fight for myself, for my family.
There's a lot of good that comes of this technology. I use this to protect my family. Are there bad uses of facial recognition? Absolutely, because we don't live in a perfect world.
Facial recognition is going to create problems. There's going to be some collateral damage.
I guess I was collateral damage. My family was collateral damage. It happened to me. Why couldn't it happen to other people?
Chances are your face can be used against you. We all feed the network. The more data we upload, the more powerful it becomes, entangling our physical selves into a digital web.
We, as technologists, have to be responsible for the creations that we've made. And we have to explain to the world the inherent danger. And that is the missing link between our online persona and our offline persona. The offline persona is our only persona. It's unique. It's us; it makes us human. And the ability to protect that will depend on the ability to stop face recognition from recognizing us without consent. And I don't believe we can afford to lose control over the most precious thing... which is our identity.
