this post was submitted on 19 Sep 2023
223 points (99.1% liked)
Technology
This will definitely make customers less trusting of Microsoft's privacy-focused AI projects. Here's hoping that open-source LLMs become more advanced and optimized.
I am not sure. This was mostly a case of human error in not properly securing URLs/storage accounts. The lack of centralised control of SAS tokens that the article highlights was a contributing factor, but the root cause was still human error.
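For anyone unfamiliar with SAS tokens: a SAS URL is essentially the storage URL signed with the account key, with the permissions and expiry baked into the signature. This is just a stdlib sketch of the general idea, not Azure's actual signing scheme (the key and URL here are made up):

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

ACCOUNT_KEY = b"hypothetical-account-key"  # stand-in for the real storage account key

def make_sas_url(base_url: str, permissions: str, expiry_epoch: int) -> str:
    # Sign the URL together with permissions and expiry, which is roughly
    # what a SAS token does: anyone holding the URL gets exactly these rights.
    payload = f"{base_url}\n{permissions}\n{expiry_epoch}".encode()
    sig = hmac.new(ACCOUNT_KEY, payload, hashlib.sha256).hexdigest()
    query = urlencode({"sp": permissions, "se": expiry_epoch, "sig": sig})
    return f"{base_url}?{query}"

def is_valid(url: str, base_url: str, permissions: str, expiry_epoch: int) -> bool:
    # A token is only good if the signature matches AND it has not expired.
    expected = make_sas_url(base_url, permissions, expiry_epoch)
    return hmac.compare_digest(url, expected) and time.time() < expiry_epoch

# Read-only ("r") link that expires in an hour: the narrowly-scoped configuration.
expiry = int(time.time()) + 3600
url = make_sas_url("https://example.blob.core.windows.net/data/model.bin", "r", expiry)
```

The human error in the incident was, in effect, issuing tokens with far broader permissions and a far longer expiry than this, and because the signature is computed client-side from the account key, there's no central record of what's been handed out. That's the centralised-control gap the article is pointing at.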
If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
In a private setting, one person's mistake affects only that person; mistakes happen, period.
A corporate environment absolutely needs robust procedures to keep one person's mistake from having this kind of impact on the company and all its clients.
But that's a looong tradition at M$ - not having it, I mean.
The root cause is whatever allowed the human error to happen in the first place.