this post was submitted on 25 Sep 2025
-4 points (30.0% liked)

Perchance - Create a Random Text Generator

1185 readers
20 users here now

⚄︎ Perchance

This is a Lemmy Community for perchance.org, a platform for sharing and creating random text generators.

Feel free to ask for help, share your generators, and start friendly discussions at your leisure :)

This community is mainly for discussions between those who are building generators. For discussions about using generators, especially the popular AI ones, the community-led Casual Perchance forum is likely a more appropriate venue.

See this post for the Complete Guide to Posting Here on the Community!

Rules

1. Please follow the Lemmy.World instance rules.

2. Be kind and friendly.

  • Please be kind to others in this community (and in general), and remember that for many people Perchance is their first experience with coding. We have members for whom English is not their first language, so please take that into account too :)

3. Be thankful to those who try to help you.

  • If you ask a question and someone has made an effort to help you out, please remember to be thankful! Even if they don't manage to help you solve your problem, remember that they're spending time out of their day to try to help a stranger :)

4. Only post about stuff related to perchance.

  • Please only post about Perchance-related stuff like generators on it, bugs, and the site.

5. Refrain from requesting Prompts for the AI Tools.

  • We would like to ask you to refrain from posting here for help specifically with prompting/achieving certain results with the AI plugins (text-to-image-plugin and ai-text-plugin), e.g. "What is a good prompt for X?", "How do I achieve X with Y generator?"
  • See Perchance AI FAQ for FAQ about the AI tools.
  • You can ask for help with prompting at the 'sister' community Casual Perchance, which is for more casual discussions.
  • We will still be helping/answering questions about the plugins as long as they are related to building generators with them.

6. Search through the Community Before Posting.

  • Please search through the community posts here (and on Reddit) before posting, to see whether a similar post already exists.

founded 2 years ago

After having used the new model for over a month, mostly on the AI Story Generator, and after investigating the old and new AI models used, I've reached a conclusion that, in my opinion, makes sense.

The old model was Llama 2. Llama 2 (and Llama 3) are models fed on books, as in lots of literature; Meta licensed a LOT of them to train the models.

The new model is DeepSeek (DS), or at least it seems to be. We'll assume it is, though to be fair, that doesn't change the argument much. DS has an issue: it is trained on more general content, say the internet, some books obviously, interactions, etc.

Now, what's the issue with this?

Llama is a model that knows WAY better how a story works, having had hundreds of them in its dataset and having processed them during training. DS doesn't; DS is a more generalist model, conceived more as an assistant than as a story writer.

For the kind of usage done here, essentially either chatting with characters in AI Character Chat or writing a story with the AI Story Generator, the improvement in context and general knowledge DS brings is not worth the decrease in narrative quality and understanding of story writing. And that's not even mentioning all the hallucinations, the outright disregard of context and prompting, and similar problems the new model has.

Llama 2 is a way better option for the kind of usage we have. Yes, we would be losing some general knowledge. Yes, it may not be the best AI model out there. But all things considered, it's a matter of choosing the best option for our use case.

I understand the dev does all this work alone, and I appreciate his effort. That's why, as a really active user of this platform and service, I consider that the best choice here is to return to the old model.

If you have any more arguments for it, please add them in the comments. Thanks, everyone, for your time.

-Lucalis.

[–] Garth01@lemmy.world 2 points 6 days ago (13 children)

I agree. The new model isn't very good. The main issues I have with it are that it tends to break immersion, saying stuff like "BREAKING_BAD_PATTERNS" or talking about "breaking bad patterns" for no given reason whatsoever and with no connection to the roleplay at hand. Characters also tend to speak like this all the time:

John nods, "I think we've got it under control," he said "but watch your back just in case."

Even if the initial message has asterisk roleplay, characters added to that thread will still talk like the above example; asterisks have to be incorporated a LOT into the initial message (at least from what I've observed) in order for the new characters to RP like that.

The new model isn't that bad though, as it is able to portray fictional characters almost perfectly. I don't really use the AI Story Generator much; I only use the AI Character Chat and occasionally the image generator. I'm saying this based on what I've observed there.

[–] Almaumbria@lemmy.world 1 points 5 days ago (11 children)

Hi! :)

Just commenting to clarify that the 'break bad patterns' bug is unrelated to the new model: this behavior is actually caused by the prompt used by AI Character Chat when generating the bot reply; it always contains the phrase "Remember the break bad patterns rule", in reference to an item in the default writing instructions. IIRC, and in case it hasn't yet been fixed, this line is added to the end of the prompt somewhere deep in the getBotReply function; I forked ACC and can confirm that editing it out removed the issue.
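
In case it's useful, here's a rough sketch of the kind of edit I mean. This is not the actual ACC code (the real prompt assembly in getBotReply is much more involved, and the strings here are just placeholders), only an illustration of leaving that reminder line out of the prompt before it gets sent:

```typescript
// Illustrative only: a stand-in for the prompt the generator assembles.
// The idea is simply that a reminder line gets appended to the prompt,
// and filtering it out stops the model from echoing "break bad patterns".
const assembledPrompt = [
  "You are roleplaying as {character}.",
  "Continue the conversation below.",
  "Remember the break bad patterns rule.", // the line that leaks into replies
].join("\n");

// strip the offending line before the prompt would be sent to the model
const cleanedPrompt = assembledPrompt
  .split("\n")
  .filter(line => !line.toLowerCase().includes("break bad patterns"))
  .join("\n");

console.log(cleanedPrompt); // the reminder line is gone
```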

Anyway, there are similar bugs in other generators, and I suspect most of them are also due to the prompt containing similar instructions that were originally meant to mitigate quirks of the old model, but now only cause problems.

More on-topic, I've been testing the new model a lot, writing prompts for it from scratch, and the results are amazing: it can consistently understand complex, structured instructions, so one can more reliably make little 'programs' with it, not just narrative stuff. But you have to understand that generators using old prompts will more than likely not work out of the box; you have to tinker with them to get the results you want.
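
To give an idea of what I mean by little 'programs', here's an illustrative sketch (not code from any existing generator; the prompt and the sample reply below are made up): if you instruct the new model to answer in a strict format, you can treat its reply as data rather than prose.

```typescript
// Illustrative sketch: a structured instruction plus a reply parsed as data.
const structuredPrompt = [
  "You are a quest generator.",
  'Reply ONLY with JSON of the form {"title": string, "objective": string, "twist": string}.',
  "Do not add any text before or after the JSON.",
].join("\n");

// pretend this string came back from the model after sending structuredPrompt
const modelReply =
  '{"title": "The Salt Road", "objective": "Escort the caravan", "twist": "The guide is a smuggler"}';

// because the output format is predictable, the reply can drive other parts
// of a generator instead of just being displayed as text
const quest = JSON.parse(modelReply) as {
  title: string;
  objective: string;
  twist: string;
};

console.log(`${quest.title}: ${quest.objective} (${quest.twist})`);
```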

I really, really hope the new model stays. It has opened up a lot of possibilities for making new generators, and a rollback would really suck for me as a developer. I'm specifically hacking away at ACC to put together a new tool for narrative, world-building and roleplay; it's working fantastically, so I get a feeling of absolute dread each time I see posts like this! Please, don't take it away from me, I only need some more time! ;)

Anyway, just wanted to share these bits. Cheers!

[–] Randomize@lemmy.world 1 points 5 days ago (6 children)

Good luck, I hope you're successful! I also like the new model for RP (when it works properly once in a while); it's much smarter and doesn't need me to hold its hand for every little detail. I immediately notice the difference: the old model often doesn't understand that characters can't know what happens in faraway places unless there is some kind of stable connection. My user isn't constantly with the char 24/7, so that irked me quite a bit. The new model knew without being prompted. <3

[–] Lucalis@lemmy.world 1 points 5 days ago (1 children)

it’s much smarter and doesn’t need me to hold its hand for every little detail

Maybe I'm the unluckiest guy ever, but on my end, the AI just hallucinates whatever it wants when I do something with a character. I MUST actively guide it toward obvious things, and it still just completely ignores that, something that never happened before.

[–] Randomize@lemmy.world 1 points 5 days ago* (last edited 5 days ago) (2 children)

That's why I said "when it works once in a while". There are certain hours when I think the dev is working on the model constantly (like right now), and yes, then it's dumb af. But like... uh... I don't know, like 10 hours ago or so, it worked perfectly fine for me. I was able to have a real flow of back-and-forth messages for the 10 mins I used it, without much rerolling or needing to prompt real-life mechanics like "char can't see what user does while texting" (from across the city).

In those 10 mins I got more story done than in two hours yesterday. And this wasn't the first time; that's why I think the dev might work on it at certain more or less fixed times.

[–] Randomize@lemmy.world 1 points 4 days ago

Uhhh... damn, I jinxed it. Maybe because it's the weekend.

[–] Lucalis@lemmy.world 1 points 5 days ago (1 children)

In those 10 mins I got more story done than in two hours yesterday. And this wasn't the first time; that's why I think the dev might work on it at certain more or less fixed times.

The issue is... I use it all the time; it helps with my Asperger's and ADHD.

[–] Randomize@lemmy.world 0 points 5 days ago* (last edited 5 days ago) (1 children)

Maybe see it like your doc being on vacation. If there's a serious issue, you need to find a substitute; otherwise you need to wait. It might not be perfect, but once the doc comes back, they are (hopefully) better than before.

It will probably take more time: days, weeks, maybe even months; idk much about programming. But complaining right now always seems to me like shouting at a surgeon mid-surgery about why there's so much blood.

[–] Lucalis@lemmy.world 1 points 5 days ago

In those 10 mins I got more story done than in two hours yesterday. And this wasn't the first time; that's why I think the dev might work on it at certain more or less fixed times.

Following your own analogy, the problem is: this didn't need any surgery.
