this post was submitted on 31 May 2025
210 points (90.4% liked)

[–] AmbiguousProps@lemmy.today 38 points 2 months ago (6 children)

Why would I use this over Ollama?

[–] Greg@lemmy.ca 31 points 2 months ago (4 children)

Ollama can’t run on Android

[–] AmbiguousProps@lemmy.today 22 points 2 months ago (1 children)

That's fair, but in that case I think I'd rather self-host an Ollama server and connect to it with an Android client. Much better performance.
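
For anyone curious, connecting to a self-hosted Ollama server is just an HTTP call to its REST API (default port 11434). A minimal sketch in Python; the host address and model name here are placeholders, and it assumes the model has already been pulled on the server:

```python
import requests

# Hypothetical LAN/VPN address of the self-hosted Ollama server
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

payload = {
    "model": "llama3",            # any model already pulled on the server
    "prompt": "Why is the sky blue?",
    "stream": False,              # return one complete JSON response
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])    # the generated text
```

Any Android client that speaks this API can point at the same endpoint.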

[–] Greg@lemmy.ca 4 points 2 months ago

Yes, that's my setup. But this will be useful for cases where the internet connection isn't reliable.

[–] Diplomjodler3@lemmy.world 3 points 2 months ago

Is there any useful model you can run on a phone?

[–] gens@programming.dev 2 points 2 months ago

Llama.cpp (which Ollama runs on) can. And many chat programs for phones can use it.
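
As a rough illustration, llama.cpp's Python bindings (llama-cpp-python) drive the same engine; a minimal sketch, assuming a small quantized GGUF model is already on the device (the path is a placeholder):

```python
from llama_cpp import Llama

# Hypothetical path to a small quantized GGUF model stored on the device
llm = Llama(model_path="/sdcard/models/phi-3-mini-q4.gguf", n_ctx=2048)

out = llm(
    "Name three uses for a local LLM on a phone:",
    max_tokens=128,      # keep generation short on mobile hardware
    temperature=0.7,
)
print(out["choices"][0]["text"])
```

Small quantized models (a few GB or less) are the realistic option on phone hardware.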
