1
00:00:00,500 --> 00:00:03,461
I miss how my family used to gather
2
00:00:03,461 --> 00:00:04,796
at the end of the day.
3
00:00:04,796 --> 00:00:06,423
How we used to talk.
4
00:00:06,423 --> 00:00:08,591
My home was like a normal home.
5
00:00:08,591 --> 00:00:12,637
The simple, daily details that everyone has.
6
00:00:12,637 --> 00:00:16,015
Heba lived here, in northern Gaza.
7
00:00:16,015 --> 00:00:21,104
Her family evacuated on October 11th, 2023.
8
00:00:21,104 --> 00:00:25,567
By February, she learned her home was no longer there.
9
00:00:25,567 --> 00:00:30,280
She talked to me from a friend's home,
in Rafah, in southern Gaza.
10
00:00:30,488 --> 00:00:32,656
We received a picture of our house
11
00:00:32,656 --> 00:00:34,784
and we were in shock.
12
00:00:34,784 --> 00:00:36,494
We had down there, like,
13
00:00:36,494 --> 00:00:38,538
a place where we have trees,
14
00:00:38,538 --> 00:00:40,582
we have flowers planted.
15
00:00:40,582 --> 00:00:44,294
Heba didn't know exactly why her home had been destroyed.
16
00:00:44,294 --> 00:00:46,755
But over the past few months, Israeli journalists
17
00:00:46,755 --> 00:00:49,049
have found that much of the destruction in Gaza
18
00:00:49,049 --> 00:00:51,176
since the attacks of October 7th
19
00:00:51,176 --> 00:00:56,431
has been enabled and often directed
by an artificial intelligence system.
20
00:00:56,431 --> 00:00:58,600
The promise of AI generally
21
00:00:58,600 --> 00:01:01,144
is a promise in two respects.
22
00:01:01,144 --> 00:01:04,605
One is swiftness and the second is accuracy.
23
00:01:04,605 --> 00:01:07,442
The whole dream of AI is
24
00:01:07,442 --> 00:01:10,236
that it would offer these precision strikes.
25
00:01:10,236 --> 00:01:13,573
But with over 34,000 Palestinians killed,
26
00:01:13,573 --> 00:01:18,453
compared to just over 1,400 in Israel's 2014 war in Gaza,
27
00:01:18,453 --> 00:01:21,456
it's clear something different is happening.
28
00:01:21,456 --> 00:01:24,250
So what does AI have to do with it?
29
00:01:24,250 --> 00:01:28,213
To get some answers, we called a couple of AI experts, reporters
30
00:01:28,213 --> 00:01:30,215
and investigative journalists.
31
00:01:36,096 --> 00:01:39,641
The Israel Defense Forces’ use of AI is not new.
32
00:01:39,641 --> 00:01:42,310
I think that the most famous use of AI
33
00:01:42,310 --> 00:01:44,854
by the IDF is, of course, the Iron Dome,
34
00:01:44,854 --> 00:01:48,608
which is a defensive system that aims to disrupt
35
00:01:48,608 --> 00:01:50,485
the threat of missile attacks.
36
00:01:50,485 --> 00:01:53,029
This system is partly what defended Israel
37
00:01:53,029 --> 00:01:57,158
against Iran's drone and missile attacks in April 2024.
38
00:01:57,158 --> 00:02:00,578
The other one is another homegrown weapon that they have called
39
00:02:00,578 --> 00:02:02,956
the SMASH from Smartshooter,
40
00:02:02,956 --> 00:02:06,459
which is an AI precision assault rifle sight
41
00:02:06,459 --> 00:02:09,670
that you add on to handheld weapons.
42
00:02:09,670 --> 00:02:11,714
And what it does is it uses advanced
43
00:02:11,714 --> 00:02:14,884
image-processing algorithms to home in on a target,
44
00:02:14,884 --> 00:02:18,513
sort of like an auto-aim in Call of Duty.
45
00:02:18,513 --> 00:02:21,474
Another way Israel uses AI is through surveillance
46
00:02:21,474 --> 00:02:24,519
of Palestinians in the occupied territories.
47
00:02:24,519 --> 00:02:27,689
Every time they pass through one of the hundreds
48
00:02:27,689 --> 00:02:30,608
of checkpoints, their movements are being registered.
49
00:02:30,608 --> 00:02:33,403
Their facial images and other biometrics
50
00:02:33,403 --> 00:02:35,864
are being matched against a database.
51
00:02:35,864 --> 00:02:38,199
But we're now learning more about the AI systems
52
00:02:38,199 --> 00:02:40,618
that choose bombing targets in Gaza,
53
00:02:40,618 --> 00:02:47,584
from two reports in the Israeli publications +972 and Local Call.
54
00:02:49,085 --> 00:02:52,005
Gospel is a system that produces bombing targets
55
00:02:52,005 --> 00:02:54,883
for specific buildings and structures in Gaza.
56
00:02:54,883 --> 00:02:58,720
It does this by working in conjunction with other AI tools.
57
00:02:58,720 --> 00:03:00,555
And like any AI system,
58
00:03:00,555 --> 00:03:03,850
the first step is the large-scale collection of data.
59
00:03:03,850 --> 00:03:06,603
In this case, surveillance and historical data
60
00:03:06,603 --> 00:03:09,522
on Palestinian and militant locations in Gaza.
61
00:03:09,522 --> 00:03:11,649
The most famous application
62
00:03:11,649 --> 00:03:14,235
would be Alchemist,
63
00:03:14,235 --> 00:03:16,988
which is a platform that collects data
64
00:03:16,988 --> 00:03:20,950
and allows the transfer of data between different departments,
65
00:03:20,950 --> 00:03:24,495
with the data later being transferred to another platform,
which is called the Fire Factory.
66
00:03:24,495 --> 00:03:29,083
The Fire Factory observes the data and categorizes it.
67
00:03:29,083 --> 00:03:32,921
The generated targets are generally put into one of four categories.
68
00:03:32,921 --> 00:03:34,631
First, tactical targets,
69
00:03:34,631 --> 00:03:37,926
which usually include armed militant cells, weapons warehouses,
70
00:03:37,926 --> 00:03:40,386
launchers and militant headquarters.
71
00:03:40,386 --> 00:03:42,472
Then there are underground targets,
72
00:03:42,472 --> 00:03:45,141
primarily tunnels under civilian homes.
73
00:03:45,141 --> 00:03:47,644
The third category includes the family homes
74
00:03:47,644 --> 00:03:50,230
of Hamas or Islamic Jihad operatives.
75
00:03:50,230 --> 00:03:54,609
And the last category includes targets
that are not obviously military in nature,
76
00:03:54,609 --> 00:03:58,947
particularly residential and high-rise buildings
with dozens of civilians.
77
00:03:58,947 --> 00:04:02,367
The IDF calls these power targets.
78
00:04:02,367 --> 00:04:03,910
Once the data is organized,
79
00:04:03,910 --> 00:04:07,121
it goes through a third layer called the Gospel.
80
00:04:07,121 --> 00:04:09,582
The Gospel creates an output
81
00:04:09,582 --> 00:04:13,002
which suggests specific possible targets,
82
00:04:13,002 --> 00:04:15,755
possible munitions,
83
00:04:15,755 --> 00:04:18,675
warnings of possible collateral damage, and so on.
84
00:04:18,675 --> 00:04:23,137
This system produces targets in Gaza faster than a human can.
85
00:04:23,137 --> 00:04:25,014
And within the first five days of the war,
86
00:04:25,014 --> 00:04:30,061
half of all the targets identified
were from the power targets category.
87
00:04:30,061 --> 00:04:34,857
Multiple sources who spoke to +972
reported that the idea behind power targets
88
00:04:34,857 --> 00:04:37,902
is to exert civil pressure on Hamas.
89
00:04:37,902 --> 00:04:40,488
Heba’s home was most likely one of the power targets
90
00:04:40,488 --> 00:04:44,450
picked up by the Gospel system.
91
00:04:45,368 --> 00:04:48,538
Months after the Gospel investigation, +972
92
00:04:48,538 --> 00:04:52,000
also surfaced a more opaque and secretive AI system,
93
00:04:52,000 --> 00:04:54,794
built for targeting specific people,
94
00:04:54,794 --> 00:04:57,338
known as Lavender.
95
00:04:57,338 --> 00:04:59,090
As the Israel-Hamas war began,
96
00:04:59,090 --> 00:05:01,592
Lavender used historic data and surveillance
97
00:05:01,592 --> 00:05:06,723
to generate as many as 37,000 Hamas and Islamic Jihad targets.
98
00:05:06,723 --> 00:05:09,517
Sources told +972 that about 10% of
99
00:05:09,517 --> 00:05:12,562
those targets were wrong.
100
00:05:12,562 --> 00:05:17,066
But even when determining the 90% of supposedly correct targets,
101
00:05:17,066 --> 00:05:19,360
Israel also expanded the definition
102
00:05:19,360 --> 00:05:22,113
of a Hamas operative for the first time.
103
00:05:22,113 --> 00:05:25,199
The thing is, Hamas ultimately runs the Gaza Strip.
104
00:05:25,199 --> 00:05:28,161
So you have a lot of civil society that interacts with Hamas.
105
00:05:28,161 --> 00:05:31,664
Police force, doctors, civil society in general.
106
00:05:31,664 --> 00:05:35,084
And so these are the targets that we know they're looking at.
107
00:05:35,084 --> 00:05:38,671
After Lavender used its data to generate these targets,
108
00:05:38,671 --> 00:05:42,550
AI would then link the target to a specific family home,
109
00:05:42,550 --> 00:05:46,095
and then recommend a weapon for the IDF to use on the target,
110
00:05:46,095 --> 00:05:49,390
mostly depending on the ranking of the operative.
111
00:05:49,390 --> 00:05:54,645
What we were told is that for low-ranking Hamas militants,
112
00:05:54,645 --> 00:05:57,982
the army preferred to use “dumb bombs,”
113
00:05:57,982 --> 00:06:01,694
meaning bombs that are not guided, because they are cheaper.
114
00:06:01,694 --> 00:06:06,074
So in a strange way, the less of a danger you posed,
115
00:06:06,074 --> 00:06:11,788
the less sophisticated the bombs they used,
116
00:06:11,788 --> 00:06:15,166
therefore maybe creating more collateral damage.
117
00:06:15,166 --> 00:06:17,585
Sources told reporters that for every junior Hamas
118
00:06:17,585 --> 00:06:18,920
operative that Lavender marked,
119
00:06:18,920 --> 00:06:22,673
it was permissible to kill up to 15 or 20 civilians.
120
00:06:22,673 --> 00:06:24,425
But also that for some targets,
121
00:06:24,425 --> 00:06:26,928
the number of permissible civilian casualties
122
00:06:26,928 --> 00:06:30,056
was as high as 300.
123
00:06:30,056 --> 00:06:31,557
[Arabic] More than 50 displaced people were in the building.
124
00:06:31,557 --> 00:06:35,853
More than 20 children were in it.
125
00:06:38,689 --> 00:06:41,234
AI systems do not produce facts.
126
00:06:41,234 --> 00:06:42,819
They only produce predictions,
127
00:06:42,819 --> 00:06:45,905
just like a weather forecast or the stock market.
128
00:06:45,905 --> 00:06:48,116
The “intelligence” that’s there
129
00:06:48,116 --> 00:06:51,994
is completely dependent on the quality, the validity,
130
00:06:51,994 --> 00:06:55,540
the understanding of the humans
131
00:06:55,540 --> 00:06:57,875
who created the system.
132
00:06:57,875 --> 00:07:01,295
In a statement to the Guardian, the IDF “outright rejected”
133
00:07:01,295 --> 00:07:05,466
that they had “any policy to kill
tens of thousands of people in their homes”
134
00:07:05,466 --> 00:07:08,052
and stressed that human analysts must conduct
135
00:07:08,052 --> 00:07:11,889
independent examinations before a target is selected.
136
00:07:11,889 --> 00:07:15,435
Which brings us to the last step of both of these processes:
137
00:07:15,435 --> 00:07:18,104
Human approval.
138
00:07:18,104 --> 00:07:22,400
Sources told +972 that the only
human supervision protocol in place
139
00:07:22,400 --> 00:07:26,612
before bombing the houses of suspected
junior militants marked by Lavender,
140
00:07:26,612 --> 00:07:29,073
was to conduct a single check:
141
00:07:29,073 --> 00:07:32,201
Ensuring that the AI-selected target is male
142
00:07:32,201 --> 00:07:35,246
rather than female.
143
00:07:40,084 --> 00:07:44,505
Experts have been telling us that
essentially, Gaza has become
144
00:07:44,505 --> 00:07:49,635
an unwilling test site for future AI technologies.
145
00:07:49,635 --> 00:07:51,345
In November 2023,
146
00:07:51,345 --> 00:07:53,723
the US released an international framework
147
00:07:53,723 --> 00:07:56,976
for the responsible use of AI in war.
148
00:07:56,976 --> 00:08:00,480
It gathered signatures from more than 50 countries.
149
00:08:00,480 --> 00:08:03,399
Israel has not signed on to this framework.
150
00:08:03,399 --> 00:08:06,694
So we're in sort of this space
151
00:08:06,694 --> 00:08:08,946
where we lack sufficient oversight
152
00:08:08,946 --> 00:08:11,449
and accountability for drone warfare,
153
00:08:11,449 --> 00:08:16,037
let alone new systems being introduced like Gospel and Lavender.
154
00:08:16,037 --> 00:08:19,332
And we're looking at a future, really, where
155
00:08:19,332 --> 00:08:21,667
there is going to be more imprecise
156
00:08:21,667 --> 00:08:24,128
and biased automation of targets
157
00:08:24,128 --> 00:08:27,465
that makes these civilian casualties much worse.
158
00:08:27,465 --> 00:08:28,508
The fallacy of,
159
00:08:28,508 --> 00:08:31,302
you know, the premise that faster war fighting is somehow
160
00:08:31,302 --> 00:08:34,847
going to lead to global security and peace.
161
00:08:34,847 --> 00:08:38,768
I mean, this is just not the path that's going to get us there.
162
00:08:38,768 --> 00:08:40,352
And on the contrary,
163
00:08:40,352 --> 00:08:46,234
I think a lot of the momentum of these technological initiatives
164
00:08:46,234 --> 00:08:50,738
needs to be interrupted, in whatever ways we can.
165
00:08:51,364 --> 00:08:56,077
It really aches my heart that these
moments are never going to be back.
166
00:08:56,077 --> 00:08:59,163
It's not like I left home and like, for example,
167
00:08:59,163 --> 00:09:01,624
I traveled and I know it's there.
168
00:09:01,624 --> 00:09:03,918
No, it's not.
169
00:09:03,918 --> 00:09:06,629
It's not there anymore.