How do you even get these chatbots to start telling you shit like this? Is it just from having a conversation for too long in the same chat window or something? I don't understand how this keeps happening.
There's an EULA for that.
I would like to see the full transcript.
How do we know this didn't start off with prompts about creating a book, or asking about exciting things in life, or I don't know what.
Context would help a lot. Maybe it will come out in discovery.
That said, Gemini is garbage for anything anyway. Even by AI standards, it's bad.
I was thinking the same thing: what was the flow of the chat that got it to this point?
This is so wild. The article frames Gemini as the active party, making the guy do things the whole time. I can't imagine how this works without roleplay prompting and requesting those things from the chatbot. Not that I want to blame the victim and side with Google. It's obviously dangerous to hand tools with strong persuasive capabilities to unstable people. And weapons.