Mixel

joined 2 years ago
[–] Mixel@feddit.de 6 points 2 years ago (2 children)

Can you elaborate on what questionable things they did? I am thinking of hosting my own Gitea instance.

[–] Mixel@feddit.de 1 points 2 years ago

Thank you, that's what I was looking for!

[–] Mixel@feddit.de 2 points 2 years ago

Present 👍🏻

[–] Mixel@feddit.de 2 points 2 years ago (5 children)

Is there an ETA yet for when the instance will be upgraded to 0.18?

[–] Mixel@feddit.de 2 points 2 years ago

Yes, thank you for the information, I really appreciate it! I decided to go with kobold.cpp for the time being, using CLBlast, which overall works much better than standard CPU inference. But I'm also looking into llama.cpp's ROCm support, which I am currently trying out.

[–] Mixel@feddit.de 3 points 2 years ago

Thanks a lot for the tip, but I think I'll stick with Jerboa until the first version of Boost for Lemmy comes out :) I used it on Reddit for a very long time.

[–] Mixel@feddit.de 3 points 2 years ago* (last edited 2 years ago)

I will try that once I'm home! Thanks for the suggestions. Can I also use Kobold in SillyTavern? IIRC there was an option for KoboldAI or something; is that koboldcpp, or what does that option do?

EDIT: I got it working and it's wonderful, thank you for suggesting this :) I had some difficulties setting it up, especially with opencl-mesa; I had to install opencl-amd and then find the device ID and so on, but once it was working it was great!
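For anyone hitting the same wall, the rough sequence was something like the sketch below. This is a hedged example for an Arch-based system: the package name `opencl-amd` (AUR), the `yay` helper, the model filename, and the platform/device IDs `0 0` are assumptions that will vary per machine; `--useclblast` is koboldcpp's CLBlast flag.

```shell
# Replace the Mesa OpenCL runtime with AMD's (AUR package; name may differ
# on other distros):
yay -S opencl-amd

# List OpenCL platforms and devices to find the IDs to pass to koboldcpp:
clinfo | grep -E "Platform Name|Device Name"

# Launch koboldcpp with CLBlast, giving the platform ID and device ID
# (here 0 0) found above; model path is a placeholder:
python koboldcpp.py --useclblast 0 0 --model model.gguf
```

The main gotcha is that `clinfo` may show several platforms (e.g. Mesa's `rusticl` alongside AMD's), so the first number passed to `--useclblast` has to match the AMD platform's index, not just 0 by default.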

[–] Mixel@feddit.de 3 points 2 years ago (2 children)

Right, thanks again for the answer :) I also saw a thread on lemmy.world where the admins tried to update to 0.18 but ran into some difficulties. I hope that improves with the next update, so that feddit can also upgrade without complications.

[–] Mixel@feddit.de 2 points 2 years ago (2 children)

How do you use ooba with ROCm? I looked at the Python file where you can select AMD during install, and it just says "amd not supported" and exits. I guess it just doesn't update webui.py when I update ooba? I heard somewhere that llama.cpp with CLBlast wouldn't work with ooba, or am I wrong? Also, is koboldcpp worth a shot? I've heard of some success with it.

[–] Mixel@feddit.de 1 points 2 years ago

This is sadly the correct answer. Many people also just use whatever their preferred OS preinstalls; users rarely switch to an alternative.

[–] Mixel@feddit.de 1 points 2 years ago (1 children)

You can just enable the extended interface so you have more options; sadly there's no gesture for it yet.

[–] Mixel@feddit.de 1 points 2 years ago

Have you searched ProtonDB for some reports? Not sure if the game runs well, I've never had it.
