this post was submitted on 02 Feb 2021
5 points (100.0% liked)

LEAVE BAD ANDROID REVIEWS HERE

LEAVE BAD IOS REVIEWS HERE

:crab-party: :crab-party: :crab-party:

↓↓↓ Read theory :LIB: ↓↓↓

Trans women are not allowed to use Giggle

It claims to enforce this by requiring users to "verify they are a female" by submitting an image to be analysed by AI

Giggle has a community called 'Gender Identity' for women who are de-transitioning only

The CEO/Founder of this flaming pile of garbage is a notorious TERF who made a name for herself after having a hissy fit about swimming pools

She's currently trying to get a journalist fired (for calling her a TERF)

There are many more examples of why this is a terrible thing run by a terrible person, but I'm sure you all get the point.

EDIT: Just found an article where she doubles down on EVERYTHING

top 36 comments
[–] [email protected] 8 points 3 years ago* (last edited 3 years ago) (1 children)

just used my own face for their ai recognition thing just to see if it'd work and it fucking did lmao

i am now a TERF certified passing transgenderist

[–] [email protected] 9 points 3 years ago (4 children)

bruh really it didn’t work for me and i’m a c*s woman

[–] [email protected] 4 points 3 years ago (1 children)

your "Dudes Rock" t-shirt is in the mail.

welcome

[–] [email protected] 3 points 3 years ago

That's what happens when you base an AI on incorrect assumptions about people, I guess.

[–] [email protected] 2 points 3 years ago (2 children)

well im basically a bimbo so i guess you just gotta be a bimbo like me to get accepted

bimbo only app

[–] [email protected] 1 points 3 years ago
[–] [email protected] 1 points 3 years ago (1 children)

Me, a skinny bitch, seething, wig burnt to cinders, outside the gates

[–] [email protected] 2 points 3 years ago* (last edited 3 years ago)

the process has really been strange i didnt expect my lips to get this big lmao

like all the women in my family are sturdy turnip growing kinda women, i think i got the bimbo genes from my dad

if the rate of lip growth continues ill be a duck by the time im 50

[–] [email protected] 2 points 3 years ago* (last edited 3 years ago) (1 children)

nope. ur a dude. the TERF queens declare it. im sorry. you've been wrong your entire life, sadly misled by your parents and school friends and doctors and lovers, maybe for decades.

[–] [email protected] 1 points 3 years ago (1 children)
[–] [email protected] 2 points 3 years ago

you should probably tell all ur binary-attraction-having exes that they're attracted to a different thing than they thought. gonna be super shitty and risk violence if they thought they were straight dudes. sympathies. you should probably get, like, lessons in how to punch things. or do you just know that now?

[–] [email protected] 5 points 3 years ago* (last edited 3 years ago) (2 children)

Please shove phone down your pants to continue

Also who wants to bet their AI shits all over the floor when it has to analyse a black person's face

[–] [email protected] 4 points 3 years ago* (last edited 3 years ago)

thats probably at least somewhat intentional :(

[–] [email protected] 4 points 3 years ago* (last edited 3 years ago)

who also wants to bet that it will let in some trans women but keep out some cis butch women?

[–] [email protected] 4 points 3 years ago (1 children)

The fact that Sall is literally reading this thread and absolutely seething is just :chefs-kiss:

Of course you're from the Gold Coast, Sall. The jokes write themselves

[–] [email protected] 3 points 3 years ago

Chapo.chat doesn't exactly have a wide distribution so the most likely scenario is someone sitting at home, hitting F5 continuously for any new mentions.

[–] [email protected] 3 points 3 years ago

Hey Sall, if you're reading this, please end your own life as soon as possible. Thanks!

[–] [email protected] 2 points 3 years ago (1 children)

How much you want to bet their terf AI is also racist and does the thing where it fails at recognizing black people

[–] [email protected] 2 points 3 years ago

Or disproportionately mislabels them as trans. Every white AI is racist.

[–] [email protected] 2 points 3 years ago* (last edited 3 years ago) (1 children)

It's definitely transphobic but I gotta say this is some impressive tech. The verification pic I took of my junk had pretty poor lighting and it still wouldn't let me sign up. Not sure why it thought I'd be wearing glasses on my penis but w/e.

[–] [email protected] 1 points 3 years ago (1 children)
[–] [email protected] 0 points 3 years ago (1 children)

Amazing. Some dipshit linked to Chapo in the replies!

[–] [email protected] 1 points 3 years ago
[–] [email protected] 2 points 3 years ago (1 children)

"Secured by gender verification" — Phrenology for transphobes.

[–] [email protected] 0 points 3 years ago (1 children)

The obvious response is an app that uses 'AI' to filter out white people

[–] [email protected] 0 points 3 years ago (1 children)

Or an app that uses AI to filter out terfs...

[–] [email protected] 1 points 3 years ago* (last edited 3 years ago)

You could throw away the image and just reject any British IP addresses

[–] [email protected] 2 points 3 years ago

connect with girls to form a giggle

yay! it's giggle time

sounds like it's for 5 year olds

[–] [email protected] 2 points 3 years ago

Failed screenwriter attempts to flog an app that had huge security vulnerabilities for its first year (https://research.digitalinterruption.com/2020/09/10/giggle-laughable-security/) and lies to users about what data is collected and can be sold. Fortunately the app is a failure and very few users have downloaded it since 2019.

[–] [email protected] 1 points 3 years ago

★☆☆☆☆

They don't let the hot women, devilishly handsome dudes, or the ethereal thembos from :flag-trans-pride: on the app. No point in using it.

[–] [email protected] 1 points 3 years ago (1 children)

Why call it 'giggle'

Such a cringe name :cringe:

[–] [email protected] 3 points 3 years ago

I'm discriminating against people. tee-hee

[–] [email protected] 1 points 3 years ago

dearest ceo of terf tech,

:get-in::gui-trans:

sincerely, :chapochat-trans:

[–] [email protected] 1 points 3 years ago* (last edited 3 years ago)

New reminder (after the use of facial recognition by police during the protests, of course, but also after the more recent Twitter cropping thing) that most current AI computer vision software has a racist and sexist bias. Joy Buolamwini and the now-famous Timnit Gebru showed in 2018 that systems claiming >99% accuracy for gender classification were actually mostly evaluated on (and developed by?) white men, and that there was a 35-point drop in accuracy when evaluating on a dataset of black women. Basically, if you're a black woman, there is a >1/3 chance that the AI will classify you as a man.

(They re-evaluated the same software in a later paper and showed that, compared to a control group of products not included in the initial study, the fairness of the systems they had exposed improved over time. So it seems that even when it's done through academic publications, bullying works.)

But with this app, the additional problem is that the system misgendering someone will not even be considered a bug, but precisely a feature.
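
As a rough illustration of what that kind of audit looks like (a minimal sketch with invented numbers, not the actual Gender Shades figures or code): when a benchmark is dominated by one subgroup, a single overall accuracy score hides how badly the classifier does on everyone else, which is why the audit reports accuracy per intersectional subgroup.

```python
# Minimal sketch with made-up numbers (not the Gender Shades data):
# a benchmark dominated by one subgroup makes the overall accuracy look
# great, so the audit breaks accuracy out per intersectional subgroup.
from collections import Counter

# (subgroup, prediction_was_correct) for each test image -- hypothetical
results = (
    [("lighter-skinned man", True)] * 297 + [("lighter-skinned man", False)] * 3 +
    [("darker-skinned woman", True)] * 33 + [("darker-skinned woman", False)] * 17
)

seen, correct = Counter(), Counter()
for subgroup, ok in results:
    seen[subgroup] += 1
    correct[subgroup] += ok  # bool counts as 0/1

print(f"overall: {sum(correct.values()) / sum(seen.values()):.1%}")  # ~94%, looks fine
for subgroup in seen:
    print(f"{subgroup}: {correct[subgroup] / seen[subgroup]:.1%}")   # 99% vs 66%
```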

[–] [email protected] 1 points 3 years ago

just FYI: AI-generated test images are available at thispersondoesnotexist.com

if somebody wanted to trick a malevolent AI of some sort. for some reason.
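
If somebody did want to poke at it, here's a rough sketch of pulling down a generated face, assuming the site still serves a fresh image on every request to its root URL and that a browser-ish User-Agent is enough to not get blocked (both assumptions worth checking first):

```python
# Rough sketch (assumptions: root URL returns a new generated face per
# request; a browser-like User-Agent avoids being blocked).
import requests

resp = requests.get(
    "https://thispersondoesnotexist.com",
    headers={"User-Agent": "Mozilla/5.0"},
    timeout=30,
)
resp.raise_for_status()

with open("not_a_real_person.jpg", "wb") as f:
    f.write(resp.content)
print(f"saved {len(resp.content)} bytes to not_a_real_person.jpg")
```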