this post was submitted on 18 Oct 2024
159 points (97.6% liked)
Asklemmy
Let me tell you a story. Many years ago I worked for big banks and insurance companies. One day I was tasked with a project that was, from a tech point of view, amazing. It went something like this: a user navigates to a bank website looking for information about some product. The website presents the user with a simple contact form - first name, last name, phone number and/or email. The bank would use the provided data to update its record of that user (if there was no official account, it would update the "ghost" account, aka "I know about you, but you don't know about me"). Next, the bank would scrape all publicly available social media accounts and build the "hidden" profile (I'll get to this later). Based on all that data, the user would be assigned a score, which determined all future interactions with the bank. For a regular person this would mean "I'm sorry, but according to our system we cannot give you a loan."
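The pipeline above can be sketched as a toy model. Every field name, weight, and threshold below is invented for illustration - the real system's signals and scoring rules were obviously far more elaborate:

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    # "Ghost" record: the bank knows about you even if you have no account
    first_name: str
    last_name: str
    contact: str
    official_account: bool = False
    social_signals: dict = field(default_factory=dict)  # scraped public data

def hidden_score(profile: Profile) -> int:
    """Toy scoring rule; weights and signal names are made up."""
    score = 50
    if profile.official_account:
        score += 20
    score += 10 * profile.social_signals.get("stable_employment", 0)
    score -= 15 * profile.social_signals.get("risk_keywords", 0)
    return max(0, min(100, score))

def loan_decision(profile: Profile, threshold: int = 60) -> str:
    # Frontline workers only ever see the verdict, never the inputs
    if hidden_score(profile) >= threshold:
        return "eligible"
    return "I'm sorry, but according to our system we cannot give you a loan"
```

The point of the shape, not the numbers: the score is computed from data the customer never consented to hand over, and the decision surfaces only as an opaque verdict.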
Now, about the "hidden" profile. It's something all big companies (including banks and insurance companies) maintain: all the data collected from publicly available profiles (and sometimes from shady sites), used to build a profile that isn't visible to frontline workers and is only ever referenced as "the system decided based on your data".
Now, to make this scarier: this happened 10-15 years ago, way before so-called AI. Imagine how much more data those companies have about you today, and how good they are at processing it.
Now I have another question: what's the issue if they're ONLY using this info to improve my experience or to make sensible business decisions?
Suppose they start out entirely benevolent. That commitment must be perpetually renegotiated and upheld over time. As the landscape changes, as the profit motive applies pressure, as new data and technologies become available, and as new people on the next step of their careers get handed the reins, the consistency of intention will drift.
The nature of data and privacy is such that they are perpetually subject to these dynamics. The fabric of any pact is always being rewoven - first with little compromises, then with big ones.