[–] [email protected] 10 points 7 months ago* (last edited 7 months ago) (2 children)

A few questions about that: unless they're literally taking their model and putting it in your own box, running on its own compute power, I don't see how that's possible. They can call it "your" Copilot all they want, but if they're reading your data and prompts and computing on their own box, then they're using your data, right?

[–] [email protected] 3 points 7 months ago (2 children)

Major organizations use encryption where they hold the keys, so Microsoft is unable to read their data. They can run thousands of servers on Microsoft's Azure stack, and yet Microsoft cannot read the data being processed.
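
The idea is customer-managed (client-side) encryption: data is encrypted with a key only the customer holds before it ever reaches the provider, so the provider only ever stores ciphertext. Here's a minimal sketch in Python using AES-GCM from the `cryptography` package; the "quarterly financials" payload and the encrypt/decrypt helper names are just illustrative, not any actual Azure API:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The customer generates and keeps this key; the cloud provider never sees it.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

def encrypt_for_upload(plaintext: bytes) -> bytes:
    # Fresh 12-byte nonce per message, stored alongside the ciphertext.
    nonce = os.urandom(12)
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

def decrypt_after_download(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)

# Whatever lands on the provider's servers is ciphertext only.
blob = encrypt_for_upload(b"quarterly financials")
assert decrypt_after_download(blob) == b"quarterly financials"
```

Note the caveat the parent comment raises, though: for a service like Copilot to compute over your data, something has to see plaintext at some point, so key ownership on its own only settles the question for data at rest.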

[–] [email protected] 4 points 7 months ago (2 children)

That only holds if all auditors are uncorrupted, highly competent, and have full visibility. Boeing was able to corrupt its government auditors to save some money on redundant sensors. With Microsoft pushing hard on gathering and selling data, I wouldn't trust a byte that passes through their servers.

[–] [email protected] 1 points 7 months ago

You clearly do not understand encryption or corporate auditing.

[–] [email protected] 0 points 7 months ago (1 children)

Microsoft has to compete with other cloud providers on security, unlike Boeing, which has no domestic competition. Google, Amazon, or Oracle would love to find out that Microsoft is decrypting user data to sell to partners, because they would scream to the high heavens that O365/Azure is insecure and enterprises must switch to their solutions. SaaS/IaaS subscriptions are much more profitable than selling user data; there is a near-zero chance that Microsoft is improperly handling enterprise data (on purpose).

[–] [email protected] 2 points 7 months ago

Microsoft cannot decrypt your data when you hold the keys.

[–] [email protected] 2 points 7 months ago

Thanks for the explanation

[–] [email protected] 2 points 7 months ago

I’m not an admin, but I do provision Microsoft cloud licensing and have run across this question more than a few times. At the enterprise level, I’m told the Copilot data is “walled off,” secure, and not harvested by MS. I have nothing to back that up, but that’s what I’m told. I’m certain that if it weren’t true, I would have heard about it by now.