this post was submitted on 08 Apr 2024
234 points (95.0% liked)

World News


As civilian casualties continue to mount in the war-torn Gaza Strip, reports of Israel's use of artificial intelligence (AI) in its targeting of Hamas militants are facing increasing scrutiny. A report by the Israeli outlets +972 Magazine and Local Call earlier this month said that Israeli forces had relied heavily on two AI tools so far in the conflict — "Lavender" and "Where's Daddy."

While "Lavender" identifies suspected Hamas and Palestinian Islamic Jihad (PIJ) militants and their homes, "Where's Daddy" tracks these targets and alerts Israeli forces when they return home, per the report, which cites six Israeli intelligence officers who had used AI systems for operations in Gaza, including "Where's Daddy."

"We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity," one of the officers told +972 and Local Call. "On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations," they added.

[–] [email protected] 3 points 5 months ago

The point is that the dude has no means to collect such statistics: he's not a research agency surveying representative samples about their medical history, documenting his methodology, and making the data available, and he's not even citing statistics supplied by hospitals.

He literally just pulled those charts out of his ass. Instead of "well yes, they're lying, but I'm sure the broader point is true", consider "would they have any need to fabricate things if the broader point is true?"

Surely it would be easy enough to go to Xinjiang and verify it, since there's no travel restrictions and they wouldn't have to worry about getting caught making shit up.