frownGuy12

joined 2 years ago
[–] frownGuy12@alien.top 1 points 2 years ago

There’s a lot of low-hanging fruit for LLM integration with mobile devices. OpenAI doesn’t have a mobile OS to integrate with, and Apple doesn’t have an LLM. It’ll be much easier for Apple to make an LLM than for OpenAI to make an OS.

[–] frownGuy12@alien.top 1 points 2 years ago (1 children)

If it’s cloud-based it won’t. If it’s a local model then I imagine you’ll need a new chip and much more RAM.