Ask HN: Browser-Based LLM Models?
Does anyone know if there are any plans for browsers to natively integrate LLMs, LLM APIs, or LLM models like Llama for local use by web applications?
I feel there's a large opportunity here for a more privacy-friendly, on-device solution that doesn't send the user's data to OpenAI.
Is RAM the current main limitation?
https://simonwillison.net/2024/Jul/3/chrome-prompt-playgroun...
https://developer.chrome.com/docs/ai/built-in
Thank you! This is exactly what I was looking for. I hope these become part of the web platform APIs! With Google pushing this effort, it seems quite likely.
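For anyone curious what Chrome's built-in Prompt API looks like in practice: the surface is experimental and has changed names across Chrome versions (`window.ai.createTextSession`, `ai.languageModel.create`, etc.), so treat this as a sketch with feature detection rather than a stable API:

```javascript
// Sketch of calling an on-device model via Chrome's experimental
// built-in AI (Prompt API). The exact property names are assumptions
// based on early Chrome Canary builds and may differ in your version.
async function promptLocalModel(text) {
  // Feature-detect: outside a supporting browser, fall back gracefully.
  if (typeof window === "undefined" || !window.ai || !window.ai.languageModel) {
    return null; // no on-device model available; caller can fall back to a server API
  }
  const session = await window.ai.languageModel.create();
  return session.prompt(text);
}
```

The feature-detection guard matters here: a privacy-friendly web app would try the local model first and only fall back to a remote API when it is unavailable.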
I hope Apple will follow suit with some of their small models (https://huggingface.co/apple/OpenELM).
And then maybe even Firefox will join them...
Every big tech company is trying to do this: FB (through WhatsApp), Google (through Chrome/Android), Apple (through Safari/iOS/etc.). As soon as they meet their internal metrics, they will release these to the public.
"Is RAM the current main limitation?"
(V)RAM + processing power + storage (what kind of average user wants to clog half their hard drive for a subpar model that outputs one token a second?)
Check out https://github.com/mlc-ai/web-llm
IMO the main limitation is access to powerful GPUs for running models locally, plus the size of some models, which causes UX problems with cold starts.