[–] SirHaxalot@nord.pub 2 points 6 hours ago* (last edited 6 hours ago) (1 children)

Those seem to be the terms for the personal edition of Microsoft 365, though? I’m pretty sure the enterprise edition, which has features like DLP and tagging content as confidential, is covered by a separate agreement where they aren’t passing on the data.
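
For context, that “tagging content as confidential” is a Microsoft Information Protection sensitivity label, and Office embeds applied labels as MSIP_Label_* custom document properties inside the file itself. As a rough sketch of what that looks like (this is not Microsoft’s official SDK, and report.docx is just a hypothetical file name), you can read those properties straight out of a .docx:

```python
# Minimal sketch: read the Microsoft Information Protection (MIP)
# sensitivity-label properties that Office stores in a .docx file's
# custom document properties (docProps/custom.xml). Labels appear as
# properties named like "MSIP_Label_<guid>_Name", "..._Enabled", etc.
import zipfile
import xml.etree.ElementTree as ET

CUSTOM_PROPS_NS = "{http://schemas.openxmlformats.org/officeDocument/2006/custom-properties}"

def read_msip_labels(path: str) -> dict:
    """Return the MSIP_Label_* custom properties embedded in an Office file."""
    labels = {}
    with zipfile.ZipFile(path) as docx:
        # Unlabeled files may not contain docProps/custom.xml at all.
        if "docProps/custom.xml" not in docx.namelist():
            return labels
        root = ET.fromstring(docx.read("docProps/custom.xml"))
        for prop in root.iter(f"{CUSTOM_PROPS_NS}property"):
            name = prop.get("name", "")
            if name.startswith("MSIP_Label_"):
                # The value sits in a variant-type child element (vt:lpwstr).
                value = next(iter(prop), None)
                labels[name] = value.text if value is not None else ""
    return labels

if __name__ == "__main__":
    # "report.docx" is a hypothetical example file.
    for name, value in read_msip_labels("report.docx").items():
        print(f"{name} = {value}")
```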

That’s basically the main selling point of paying extra for enterprise AI services over the free, publicly available ones.

Unless this boundary has actually been crossed, in which case, yes, it’s very serious.

[–] dgdft@lemmy.world 2 points 4 hours ago

This part applies to all customers:

v. Use of Your Content. As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.

And while Microsoft has many variations of its licensing terms for different jurisdictions and market segments, what they generally promise opted-out enterprise customers is only that they won’t use their inputs to train “public foundation models”. They still retain those inputs, and they reserve the right to use them to train proprietary or specialized models, such as the safety filters or summarizers that act as part of their broader AI platform, which could leak down the line.

That’s also assuming Microsoft are competent, good-faith actors — which they definitely aren’t.