this post was submitted on 27 Jun 2024
388 points (96.6% liked)

Comic Strips

Source: Webtoon - RSS

[–] Tar_alcaran@sh.itjust.works 62 points 1 year ago (9 children)

People really should remember: generative AI makes things that look like what you want.

Now, usually that overlaps a lot with what you actually want, but not nearly always, and especially not when details matter.

[–] FaceDeer@fedia.io 21 points 1 year ago (2 children)

It also isn't telepathic, so the only thing it has to go on when determining "what you want" is what you tell it you want.

I often see people gripe about how ChatGPT's essay writing style is mediocre and always sounds the same, for example. But that's what you get when you just tell ChatGPT "write me an essay about X." It doesn't know what kind of essay you want unless you tell it. You have to give it context and direction to get good results.
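To make the point concrete, here is a minimal sketch (mine, not from the thread; the audience, tone, and structure details are illustrative) of how little separates the low-effort prompt from a directed one:

```python
# Two ways to ask for the same essay. The difference is entirely in how
# much context and direction the prompt carries.

def bare_prompt(topic: str) -> str:
    """The low-effort prompt that tends to produce the generic 'ChatGPT voice'."""
    return f"Write me an essay about {topic}."

def directed_prompt(topic: str) -> str:
    """Same request, plus audience, tone, structure, and constraints."""
    return (
        f"Write a 600-word persuasive essay about {topic} "
        "for an audience of first-year engineering students. "
        "Open with a concrete anecdote, avoid bullet points, "
        "and end by addressing the strongest counterargument."
    )

if __name__ == "__main__":
    print(bare_prompt("renewable energy"))
    print(directed_prompt("renewable energy"))
```

The model never sees your intent, only the string you send, so everything the directed version spells out is information the bare version silently withholds.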

[–] gbuttersnaps@programming.dev 12 points 1 year ago (1 children)

Not disagreeing with you at all; you make a good point. But when engineering the prompt takes 80% of the effort that just writing the essay (or the code, for that matter) would take, I think most people would rather write it themselves.

[–] FaceDeer@fedia.io 1 points 1 year ago

Sure, in those situations. In most situations, though, I find it doesn't take that much effort to write a prompt that gets me something useful. You just need to make some effort. A lot of people put in none, get a bad result, and conclude "this tech is useless."

[–] slazer2au@lemmy.world 9 points 1 year ago (1 children)

We are all annoyed at clients for not saying what they actually want in a Scope of Works, yet we do the same to LLMs, assuming they will fill in the blanks the way we want them filled in.

[–] takeda@lemmy.world 7 points 1 year ago (1 children)

Yet that's usually enough when talking to another developer.

The problem is that we already have unambiguous languages, understood by both humans and computers, for telling a computer exactly what we want it to do.

With an LLM, we instead opt for a natural language that is imprecise and full of ambiguity to do the same job.
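As a toy illustration of that ambiguity gap (example mine, not from the thread): the English instruction "sort the list" leaves ordering, mutation, and case handling unspecified, while the code form pins every one of those choices down:

```python
# "Sort the list" in English is ambiguous: ascending or descending?
# In place or a copy? Case-sensitive or not? The code answers all of it.

names = ["Bob", "alice", "Carol"]

# One precise reading: a new list, descending, case-insensitive,
# leaving the original untouched.
result = sorted(names, key=str.lower, reverse=True)
print(result)  # ['Carol', 'Bob', 'alice']
```

Each of those keyword arguments is a decision a human developer would make by convention or by asking; an LLM given only the English sentence has to guess.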

[–] FaceDeer@fedia.io 0 points 1 year ago

You communicate with co-workers using natural languages but that doesn't make co-workers useless. You just have to account for the strengths and weaknesses of that mechanism in your workflow.
