MichelLouise

joined 4 years ago
[–] [email protected] 1 points 3 years ago* (last edited 3 years ago)

New reminder (after the police's use of facial recognition during the protests of course, but also after the more recent Twitter image-cropping episode) that most current AI computer vision software has a racist and sexist bias. Joy Buolamwini and the now-famous Timnit Gebru showed in 2018 that commercial gender classification systems claiming >99% accuracy had actually been evaluated mostly on (and developed by?) white men, and that accuracy dropped by as much as 35 points when evaluated on a dataset of Black women. Basically, if you're a Black woman, there is a >1/3 chance that the AI will classify you as a man.
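
To make the arithmetic concrete: a benchmark dominated by one group lets a vendor quote a near-perfect aggregate number while a minority subgroup sits at ~65%. Here is a minimal sketch of a disaggregated audit in the spirit of that study, with made-up counts chosen to mirror the ~35-point gap (none of this is the paper's actual data):

```python
# Sketch of a disaggregated accuracy audit: the aggregate number can
# look excellent while one subgroup fares far worse. All counts below
# are hypothetical, for illustration only.
from collections import defaultdict

def accuracy(records):
    return sum(r["pred"] == r["true"] for r in records) / len(records)

# Hypothetical benchmark skewed toward lighter-skinned male faces.
records = (
      [{"group": "lighter_male",  "pred": "male",   "true": "male"}]   * 990
    + [{"group": "lighter_male",  "pred": "female", "true": "male"}]   * 10
    + [{"group": "darker_female", "pred": "female", "true": "female"}] * 65
    + [{"group": "darker_female", "pred": "male",   "true": "female"}] * 35
)

by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r)

print(f"overall:       {accuracy(records):.1%}")  # ~95.9% -- looks fine
for group, rs in sorted(by_group.items()):
    print(f"{group}: {accuracy(rs):.1%}")         # darker_female: 65.0%
```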

(They re-evaluated the same software in a follow-up paper and showed that, compared with a control group of products not included in the initial study, the systems that had been publicly named improved their fairness over time. So it seems that even when it comes through academic publications, bullying works.)

But with this app there's an additional problem: the system misgendering someone won't even be considered a bug, but precisely a feature.


You owe him RSS, you owe him Markdown, you owe him Creative Commons, you owe him Reddit.

He was 26, he wanted to share knowledge for free, and they pushed him to suicide for it.

He would be 34 by now, and I cannot imagine all the wonderful things he would have created.