How to run a local LLM as a browser-based AI with this free extension
The idea of querying a remote LLM makes my spine tingle, and not in a good way. When I need to do a spot of research via AI, I opt for a local LLM, such as Ollama.

If you haven't yet installed Ollama, you can read about it in my guide on how to install an LLM on MacOS (and why you should). You can also install Ollama on Linux and Windows, and, given that the Firefox extension works on all three platforms, whatever desktop OS you use will work.

Using Ollama from within the terminal window is actually quite easy, but it doesn't give you obvious access to other features, such as LLM/prompt selection, image upload, enabling or disabling internet search, and settings.

The free extension I will point out works on Firefox, Zen Browser (one of my favorites), and others. Let's get to that extension.

How to install the Page Assist extension in Firefox

What you'll need: To make this work, you'll need Ollama installed and running, as well as the Firefox browser. That's it. Let's make some magic.
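Before installing the extension, it's worth confirming that the Ollama server is actually up. A minimal sketch, assuming Ollama's default address of localhost on port 11434 (the port is configurable, so yours may differ):

```shell
# Check whether the local Ollama server is reachable on its default
# address. The /api/tags endpoint lists your installed models.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable on localhost:11434 - start it with: ollama serve"
fi
```

If the check fails, running `ollama serve` in a terminal (or launching the Ollama desktop app) should bring the server up before you continue.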
