My dad had me try out Bing GPT-4. I asked a question about a very famous artist's compositions. It constantly mixed up biographical details from that famous artist with another famous composer with a similar name, and said things that were flat-out wrong (for example, it claimed he wrote a bunch of songs that he had only recorded, not written). It basically didn't answer the question until I corrected each mistake individually. Some of the mistakes I already knew were wrong; others I had to verify myself. Altogether, it took me 30 minutes to get an answer that I still don't know for sure is correct.
Basically, I trust LLMs less than I would trust a third grader plagiarizing Wikipedia.