this post was submitted on 16 Apr 2024
82 points (95.6% liked)

Apple

fartsparkles@sh.itjust.works 18 points 2 years ago

Quantised models can be surprisingly small. And if Apple aren’t targeting LLMs for local use, more specific/tailored models absolutely can run on device.
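To put rough numbers on "surprisingly small": here is a back-of-the-envelope sketch of how quantisation shrinks a model's weight footprint. The parameter counts and bit widths below are illustrative assumptions, not figures for any real Apple model.

```python
def weight_size_gb(num_params: float, bits_per_weight: int) -> float:
    """Approximate size of the weights alone, in gigabytes (decimal GB)."""
    return num_params * bits_per_weight / 8 / 1e9

# Hypothetical examples: a 7B-parameter general LLM vs. a 0.3B-parameter
# task-specific model, at float16 and at 4-bit quantisation.
for name, params in [("7B general LLM", 7e9), ("0.3B tailored model", 3e8)]:
    for bits in (16, 4):
        print(f"{name} @ {bits}-bit: {weight_size_gb(params, bits):.2f} GB")
```

Under these assumptions, a 7B model drops from about 14 GB at float16 to about 3.5 GB at 4-bit, and a 0.3B tailored model needs only ~0.15 GB — comfortably within reach of a phone's memory budget.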

That said, given the precedent set by Siri, their next progression of Siri into an LLM will absolutely require a network connection and be executed server-side.