[-] EliteCloneMike@lemmy.zip 4 points 1 day ago

It’s too bad that the software and hardware are not wholly independent. As for Google pulling that kind of nonsense, I wouldn’t put it past them. They are as sketchy as sketchy gets. But this seems isolated and out of the prying hands/eyes of Google (hopefully).

[-] EliteCloneMike@lemmy.zip 29 points 1 day ago

Seems likely at this moment that Motorola phones will run GrapheneOS right out of the gate, since it is a partnership. Hopefully it's not a one-sided partnership where Motorola strong-arms the GrapheneOS people into something more locked down that just works like an app. https://www.digitaltrends.com/phones/motorola-plans-to-put-grapheneos-on-phones-so-why-is-it-a-big-deal/

[-] EliteCloneMike@lemmy.zip 3 points 5 days ago

Will no one think of the shareholders? 🫤🙄 I am very much against this push of AI onto everything without proper informed consent. I'm mainly thinking about how bad it is that AI at companies like Google and Meta is scanning people's private photos in the name of looking for child abuse. It's an easy sell if you say anything is done to "save kids", but it's just mass surveillance, and it creates more harm than good.

[-] EliteCloneMike@lemmy.zip 3 points 6 days ago

If you’re comfortable using a NAS, that is probably the best option, but Proton also offers an end-to-end encrypted drive storage service. For photos, Ente Photos is a good option.

Google’s algorithms go through all the files you upload to their servers and check them for anything that might go against their terms of service. I mean, you could encrypt your files yourself prior to uploading, but that’s a lot of work. If their algorithm labels even one file as violating their terms of service, they may lock you out of all your data and your account. Their appeal process is useless: it is likely either checked by the same algorithm that closed the account in the first place, or rubber-stamped by a person who goes through thousands of reports a day. Most appeals are rejected, and they just delete lifetimes of data/memories like it’s nothing. Of course backups are recommended. Their AI algorithms were rolled out too soon and should never be used as judge, jury, and executioner for people’s data.
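For what it's worth, the "encrypt before uploading" option is less work than it sounds if you're comfortable with a terminal. A minimal sketch using openssl (file names and the source folder are just placeholders):

```shell
# Bundle the photos, then encrypt locally with a passphrase-derived AES-256 key,
# so the cloud provider only ever sees ciphertext.
tar -czf photos.tar.gz ~/Pictures/family
openssl enc -aes-256-cbc -pbkdf2 -salt -in photos.tar.gz -out photos.tar.gz.enc
# Upload photos.tar.gz.enc and delete the plaintext archive locally.
# To restore after downloading:
openssl enc -d -aes-256-cbc -pbkdf2 -in photos.tar.gz.enc -out photos.tar.gz
```

The obvious downside is that you lose search, thumbnails, and sharing on the provider's side, and if you forget the passphrase the data is gone, which is why end-to-end encrypted services like Proton Drive or Ente are the more practical route for most people.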

Google reports cartoon images and family photos as well. Forbes reported on it (https://www.forbes.com/sites/thomasbrewster/2021/12/20/google-scans-gmail-and-drive-for-cartoons-of-child-sexual-abuse/), and they closed a fairly high-profile YouTube channel’s account over cartoons too (https://en.wikipedia.org/wiki/Naoki_Saito). Same for family photos/medical photos, of which there are plenty of reports from various news networks; the most prominent were probably the three from the NYT (https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html, https://www.nytimes.com/2022/12/30/technology/google-appeals-change.html, and https://www.nytimes.com/2023/11/27/technology/google-youtube-abuse-mistake.html). Plus more from El Pais (https://english.elpais.com/science-tech/2022-09-19/google-closed-my-account-over-sexual-content-but-theyre-not-telling-me-what-it-is-and-ive-lost-everything.html), Business Insider (https://www.businessinsider.com/google-users-locked-out-after-years-2020-10?op=1), Android Police (https://www.androidpolice.com/2021/03/08/when-google-locks-you-out-of-your-account-begging-the-internet-for-help-is-your-first-and-last-resort/), the India Times, etc. And tons of self-reporting (https://piunikaweb.com/2026/02/03/google-photos-false-csam-flags-users-locked-out/).

My point is that it is not black and white, or as simple as “don’t download it.” There are plenty of cases in which a person would not know, such as downloading an AI training set (https://www.404media.co/a-developer-accidentally-found-csam-in-ai-data-google-banned-him-for-it/). If they truly wanted to follow the law, only knowing possession would end with a person’s account being terminated; in all other cases, the file should at most be reported and deleted. But their system is deeply flawed and most appeals are denied, which is nonsense when less than 1% of these reports end in an arrest, and even fewer lead to convictions (https://stacks.stanford.edu/file/druid:pr592kc5483/cybertipline-paper-2024-04-22.pdf).

To be honest, I don’t think this is entirely a failure of Google or Meta or Microsoft; much of it lies with NCMEC and Thorn. They are the real threat to child safety, as they use their platforms to claim to want to save children while having other agendas (https://www.techdirt.com/2024/08/08/the-many-reasons-why-ncmecs-board-is-failing-its-mission-from-a-ncmec-insider/ and https://www.jezebel.com/ashton-kutcher-thorn-sex-workers-1850852760). Plus, Thorn at least has been found to lie about the number of children it has rescued (https://www.snopes.com/fact-check/kutcher-software-child-trafficking/).

All Google and the others are doing is over-reporting, which makes it harder to find actual criminals. It’s hardly worth celebrating when one is caught while thousands of innocent people are harmed. There need to be penalties for false reports, or at least a way for people to reclaim their data and accounts when cleared of wrongdoing. The number of false positives is absurd; researchers at Facebook and LinkedIn have both found the reporting to be highly erroneous (https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse?language=en).

I think we desperately need data privacy and data protection laws. And the “think of the children” and “I have nothing to hide” arguments against them are just trickle-down ideas from data brokers who profit heavily from invading people’s personal data.

[-] EliteCloneMike@lemmy.zip 10 points 1 week ago

In case you are looking for alternatives to other Google services, I highly recommend Organic Maps, Magic Earth, or Kagi Maps instead of Google Maps. Also FreeTube or Yewtu.be instead of YouTube. And mail providers like Proton or Tuta Mail that are end-to-end encrypted. And VPNs like Proton or Mullvad. And most of all, search engines like DuckDuckGo, Ecosia, or Kagi, or the Tor Browser to anonymize your searching. There are many alternatives to Google. I try to recommend that people move away from Google where they can. I realize Google has worked its way into many websites and can be hard to get around in that sense, but ad blockers like uBlock Origin and DNS resolvers like NextDNS help to prevent tracking from Google.

[-] EliteCloneMike@lemmy.zip 8 points 1 week ago

Me too. Even after a tragedy caused by Google’s awful AI. My situation was that I uploaded old family photos in 2022, looking for photos of my best friend who had passed away from cancer. A week later, Google disabled my decade-old account without warning, in the middle of the night. No explanation. They just said “harmful content” was found and linked to a page of the most heinous accusations you could imagine. My appeal was rejected within hours. Several months later, after an NYT article on the issue, the message changed to say “child abuse.” Some of my family have changed their habits, but not really by a lot; most still use Google services heavily. I am still in therapy because of the damage they caused. Some people blame me for not having a backup. I usually did, but I was in such a sad state after losing my friend that I wasn’t thinking clearly, and I certainly had no reason to expect anything like what they did.

Google reports cartoon images and family photos as well. Forbes reported on it (https://www.forbes.com/sites/thomasbrewster/2021/12/20/google-scans-gmail-and-drive-for-cartoons-of-child-sexual-abuse/), and they closed a fairly high-profile YouTube channel’s account over cartoons too (https://en.wikipedia.org/wiki/Naoki_Saito). Same for family photos/medical photos, of which there are plenty of reports from various news networks; the most prominent were probably the three from the NYT (https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html, https://www.nytimes.com/2022/12/30/technology/google-appeals-change.html, and https://www.nytimes.com/2023/11/27/technology/google-youtube-abuse-mistake.html). Plus more from El Pais (https://english.elpais.com/science-tech/2022-09-19/google-closed-my-account-over-sexual-content-but-theyre-not-telling-me-what-it-is-and-ive-lost-everything.html), Business Insider (https://www.businessinsider.com/google-users-locked-out-after-years-2020-10?op=1), Android Police (https://www.androidpolice.com/2021/03/08/when-google-locks-you-out-of-your-account-begging-the-internet-for-help-is-your-first-and-last-resort/), the India Times, etc. And tons of self-reporting (https://piunikaweb.com/2026/02/03/google-photos-false-csam-flags-users-locked-out/).

My point is that it is not black and white, or as simple as “don’t download it.” There are plenty of cases in which a person would not know, such as downloading an AI training set (https://www.404media.co/a-developer-accidentally-found-csam-in-ai-data-google-banned-him-for-it/). If they truly wanted to follow the law, only knowing possession would end with a person’s account being terminated; in all other cases, the file should at most be reported and deleted. But their system is deeply flawed and most appeals are denied, which is nonsense when less than 1% of these reports end in an arrest, and even fewer lead to convictions (https://stacks.stanford.edu/file/druid:pr592kc5483/cybertipline-paper-2024-04-22.pdf).

To be honest, I don’t think this is entirely a failure of Google or Meta or Microsoft; much of it lies with NCMEC and Thorn. They are the real threat to child safety, as they use their platforms to claim to want to save children while having other agendas (https://www.techdirt.com/2024/08/08/the-many-reasons-why-ncmecs-board-is-failing-its-mission-from-a-ncmec-insider/ and https://www.jezebel.com/ashton-kutcher-thorn-sex-workers-1850852760). Plus, Thorn at least has been found to lie about the number of children it has rescued (https://www.snopes.com/fact-check/kutcher-software-child-trafficking/).

All Google and the others are doing is over-reporting, which makes it harder to find actual criminals. It’s hardly worth celebrating when one is caught while thousands of innocent people are harmed. There need to be penalties for false reports, or at least a way for people to reclaim their data and accounts when cleared of wrongdoing. The number of false positives is absurd; researchers at Facebook and LinkedIn have both found the reporting to be highly erroneous (https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse?language=en).

I think we desperately need data privacy and data protection laws. And the “think of the children” and “I have nothing to hide” arguments against them are just trickle-down ideas from data brokers who profit heavily from invading people’s personal data.
