Off My Chest
I am looking for mods!
RULES:
1. The "good" part of our community means we are pro-empathy and anti-harassment. However, we don't intend to make this a "safe space" where everyone has to be a saint. Sh*t happens, and life is messy. That's why we get things off our chests.
2. Bigotry is not allowed. That includes racism, sexism, ableism, homophobia, transphobia, xenophobia, and religiophobia. (If you want to vent about religion, that's fine; but religion is not inherently evil.)
3. Frustrated, venting, or angry posts are still welcome.
4. Posts and comments that bait, threaten, or incite harassment are not allowed.
5. If anyone offers mental, medical, or professional advice here, please remember to take it with a grain of salt. Seek out real professionals if needed.
6. Please put NSFW behind NSFW tags.
AI is actually a pretty good teacher. It explains things really well. I had to learn trigonometry for my university course, and I used AI to teach myself a couple of algorithms. The AI explained them better than my professor did.
Now imagine how well you would understand it if you read the books the LLM stole from.
I can't ask a book a follow-up question or ask it to verify my work so that I know I'm doing it right. AI is kind of like a tutor.
Also, I don't consider using information from books to be "stealing". Everybody uses information from books. That's literally how obtaining information works. You wouldn't accuse a human of stealing from a book because they explain something they learned from it.
AI can still make mistakes just like any source, which is why I still verify information.
🤔 I wonder what people did before AI? Too bad we'll never know because that was like forever ago, right?
It's intellectual property theft. Someone spent an entire lifetime growing the knowledge and ideas within that book, only for a company to come along and steal it, then sell it to you in an abridged format.
LLMs make more mistakes than not. This is because an LLM doesn't comprehend what it says; it simply emulates human speech patterns.
It can describe a hammer. It can tell you what it's used for. It can even tell you how to use it. But it can't comprehend what it is. It's like asking a person to comprehend the size of the universe: yeah, it's big, but "big" still imposes the limits of human comprehension on how infinitely vast space is.
You seem young. You'll understand one day, when some brat tells you that your life's work is worthless but still valuable enough to steal and reference.
If you truly believed you were making a calm intellectual point, you wouldn't need to end with an insult. That last sentence shows more frustration than confidence.
Any doubts I had about your age are now gone.