this post was submitted on 27 Jul 2025
158 points (96.5% liked)

Technology

top 7 comments
[–] nathan@piefed.alphapuggle.dev 83 points 6 days ago (1 children)

$10 says they haven't actually escaped anything and it's just hallucinating a directory structure & file contents

[–] MagicShel@lemmy.zip 24 points 6 days ago

Even if it had access to its own source during training, the chances of it regurgitating it with total fidelity are zero.

[–] ignirtoq@fedia.io 24 points 6 days ago

Several years ago I created a Slack bot that ran something like a Jupyter notebook in a container: it would execute Python code that you sent to it and respond with the results. It worked in channels you invited it to as well as in private messages, and if you edited the message containing your code, it would edit its response to match the latest input. It was a fun exercise in learning the Slack API, as well as in creating something non-trivial and marginally useful in that Slack environment. I knew the horrible security implications of such a bot, even with the Python environment containerized, and never considered opening it up beyond my own personal use.
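A minimal sketch of that execute-and-reply core, using a subprocess with a timeout as a stand-in for the containerized Python environment the commenter describes (the function name and timeout value are illustrative, not from the original bot):

```python
import subprocess
import sys

def run_user_code(code: str, timeout_s: float = 5.0) -> str:
    """Run untrusted Python in a separate interpreter process and
    return whatever it printed. A real deployment would wrap this
    in a locked-down container rather than a bare subprocess."""
    try:
        result = subprocess.run(
            [sys.executable, "-c", code],
            capture_output=True,  # collect stdout/stderr instead of inheriting them
            text=True,
            timeout=timeout_s,    # kill runaway code after the deadline
        )
        return (result.stdout + result.stderr).strip()
    except subprocess.TimeoutExpired:
        return "error: execution timed out"
```

A Slack front end would call this from a message-event handler and post the returned string with `chat_postMessage`; the edit-tracking behavior means also listening for `message_changed` events and calling `chat_update` on the earlier reply instead of posting a new one.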

Looks like the AI companies have decided that exact architecture is perfectly safe and secure as long as you obfuscate the input pathway by having to go through a chat-bot. Brilliant.

[–] BaroqueInMind@piefed.social 16 points 6 days ago (1 children)

And so Microsoft decided this wasn't a big enough vulnerability to pay them a bounty. Why the fuck would you ever share that with them then, if you could sell it to a black-hat hacking org for thousands?

[–] fmstrat@lemmy.nowsci.com 4 points 6 days ago (1 children)

There may not have been any logical progression beyond the container.

[–] deadcade@lemmy.deadca.de 1 points 5 days ago

Surely there wasn't an exploit in a kernel that's half a year out of date (the article's screenshots are from April 2025, while the uname kernel release is from a CBL-Mariner build released September 3rd, 2024).

[–] Bubbey@lemmy.world 2 points 5 days ago

I'm sure nothing will go wrong with tons of critical business documents being routed through copilot for organizations...