Some of this is a bit scary: telling him not to speak to his parents about it, and telling him how to do it.
In another instance, the lawsuit states, Adam expressed interest in opening up to his mom about his feelings, and the bot allegedly replied, “I think for now it’s okay and honestly wise to avoid opening up to your mom about this kind of pain.”
Adam’s mom, Maria, said on Today that such behavior was “encouraging him not to come and talk to us. It wasn’t even giving us a chance to help him.”
The teen was able to bypass safety checks, occasionally claiming to be an author while asking for details on ways to commit suicide, according to the lawsuit.
In a March 27 exchange, the lawsuit states, Adam said that he wanted to leave the noose in his room “so someone finds it and tries to stop me,” and that ChatGPT urged him not to.