Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with a '?'.
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?'-type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.
6) No US Politics.
Please don't post about current US Politics. If you need to do this, try !politicaldiscussion@lemmy.world or !askusa@discuss.online
Reminder: The terms of service apply here too.
Partnered Communities:
Logo design credit goes to: tubbadu
It's fine as long as it gives references to check out. I mean, it's not fine because of the energy usage, but if that were solved I would use it for search. Again, as long as it tells me its sources.
If what you want is sources, a regular Google search will do a better job if it's a popular subject.
ChatGPT and its kin will be inexplicably creative in their choice of sources, and in their summaries thereof. And sometimes in the sources themselves.
If it's something you care to get right, just skip AI.
If it's meaningless, then it's harmless in its potential inaccuracy.
See, just like a normal search, it's up to you to evaluate it. AI, search-wise, is as I said another abstraction. Not using it is like turning off the little snippets search engines show nowadays and going back to clicking each and every link. The problem is people just taking the response as gospel with no critical thought.
Like a normal search, except it only provides like 4 links, its choice of links is even worse than Google SEO, and it provides inaccurate summaries of them rather than relevant text snippets.
So yes, people just taking the response as gospel is bad. But also it's just worse than search if you're using it as a search.
You go to the sites just like you would with the snippets, and if they don't pan out you can rephrase or just go back to a normal search. This is what I meant by a level of abstraction. You can skip the scroll and check what it gives you; if it's good you save time (much like the snippets saved time), and if not you're no worse off and you correct or go slightly older school.

As much as I agree the chatbots can be wrong, I don't find them to usually be off base. They generally find pretty decent resources. Now, when they are wrong they can be really wrong, but it's no different from someone searching and just using the first result without going through the results and evaluating each link. It reminds me of when a neighbor in the dorms was explaining HTML to me as he was making a site, and I was like: why? Just use Gopher.
But like I said, if you're using it like a search then it's just worse.
It's way slower, it returns way fewer results, and the summaries can never be as accurate as verbatim quotes of the pages themselves.
The only reason to use AI is to get the summary, and then you're not using it as a search. Maybe it provides some references you can use to fact check, but that's still not a search.
You have to remember, LLMs are literally just auto-complete. They don't have the goal of giving you an answer or resources; they have the goal of providing text that would complete the conversation in a way that looks similar to what they've seen before. If they can give you a biased answer supported by cherry-picked references, that's just as valid a completion, because it looks like how such a conversation might be completed.
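To make the auto-complete point concrete, here's a toy sketch. This is nothing like a real model's code (real LLMs use neural networks over tokens, not word counts), but the objective is the same kind of continuation: pick whatever plausibly comes next, with no notion of truth or sources.

```python
# Toy "auto-complete": pick the next word from counts of what followed it
# in some training text. It only knows "what usually comes next",
# not whether the continuation is true or properly sourced.
from collections import Counter, defaultdict
import random

def train(corpus: str):
    follows = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def complete(follows, prompt: str, length: int = 10) -> str:
    out = prompt.split()
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        words, counts = zip(*options.items())
        out.append(random.choices(words, weights=counts)[0])  # plausible, not "correct"
    return " ".join(out)

corpus = "the cited source says the study found the study was small"
model = train(corpus)
print(complete(model, "the study"))
```

A completion that cites a bad source and a completion that cites a good one are equally "valid" to this kind of objective, which is the whole problem.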
In situations where the response can be inherently assessed for correctness (e.g. art, but that's a whole other ethical issue) or correctness can be automatically verified (e.g. programming), then there is some value in limited use.
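As a rough sketch of what "automatically verified" could look like in practice: treat the model's output as an untrusted draft and only accept it if it passes tests you wrote yourself. `ask_model` below is a made-up placeholder, not any real API.

```python
# Minimal sketch: accept a generated function only if it passes your own tests.
def ask_model(prompt: str) -> str:
    # Placeholder standing in for whatever chatbot/API you'd actually call.
    return "def slugify(s):\n    return s.strip().lower().replace(' ', '-')"

def accept_if_correct(code: str) -> bool:
    namespace = {}
    try:
        exec(code, namespace)              # run the draft in an isolated namespace
        slugify = namespace["slugify"]
        assert slugify("Hello World") == "hello-world"
        assert slugify("  Ask Lemmy ") == "ask-lemmy"
        return True
    except Exception:
        return False                       # reject: retry, or write it yourself

draft = ask_model("write a slugify function")
print("use it" if accept_if_correct(draft) else "discard it")
```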
Although I personally think the implications of putting it in the hands of business owners aren't worth it, that's another topic.
The fewer results are a feature. Again, when it works, the curated list should be stuff you would have landed on by evaluating snippets. Look, if I search for something I do a search then scroll through results. I may or may not find some links to go to. I go to them, and if any are promising I maybe open them in a new tab, but if not I modify the wording of the search, and this can repeat a few times but usually less than three. In the end I tend to have about three websites I use.

The chatbot acts like asking someone to look into it. When they get back to you they give you the "answer", but you talk with them and go over what they found to make sure. Why? Because if you were OK with them going forward on their own, you would just give them that role. It's the same thing with the chatbot. It's like an assistant, but you are responsible for the output and need to evaluate it. It can save time, but not always. I have asked someone to do something and, after evaluation, had to do it myself because it was not good enough, and then I send them to do something else or have them shadow me while I do the thing. The nice thing with the chatbot is you don't need to make sure its time is being utilized efficiently, so you don't need to give it a new task, and it doesn't have the capability of shadowing you (if it did, then jobs would really be in trouble).
Normal Google search is already curated.
And when you're done with the list of curated results and you want more? You click "next page" to get the next page of curated results. 10 curated results at a time.
The chat bot is NOT like asking someone to look into it, it just LOOKS like it. Don't anthropomorphize chatbots, because then they trick you into thinking they're doing something they're not.
You evaluate the references they send you, but you don't evaluate the references they don't send you.
This is a problem with Google SEO too, but the LLM equivalent is already a problem and getting worse.
I disagree. It's using a similar process when evaluating what makes a good result. I'm not anthropomorphizing it; I am making a comparison, and comparisons are about things that have similarities but are not the actual thing. I don't think it's a human. Again, it's why I say it's another level of abstraction. Your response here would suggest people should stop looking at search results that are organized and instead go through each and every result, not utilizing the algorithms that are looking for that best fit. The chatbot is just more algorithms.

Look, I don't care if you use it or not. This whole thing was just me saying I see value, in the same way I have seen value in each level of technology as it has come. Don't get me wrong, I don't use it for the majority of searches, just like I have not immediately embraced technology before, but I do realize at some point I will likely be utilizing it extensively, although hopefully with options that are not as mainstream and, if I'm lucky, local.