I’ve turned to locally installed AI for research because I don’t want third parties using my information to either build a profile or train their large language models (LLMs). My local AI of choice is the open-source Ollama. I recently wrote a piece on how to make using this local LLM easier with the help of a browser extension, which I use on Linux. But on macOS, I turn to an easy-to-use, free app called Msty.

Also: How to turn Ollama from a terminal tool into a browser-based AI with this free extension

Msty allows you to use locally installed and online AI models. However, I default to the locally installed option. And, unlike the other options for Ollama, there’s no container to deploy, no terminal to use, and no need to open another browser tab.

Msty’s features include split chats (so you can run more than one query at a time), model response regeneration, chat cloning, support for multiple models, real-time data summoning (which only works with certain models), Knowledge Stacks (where you can add files, folders, Obsidian vaults, notes, and more to train your local model), a prompt library, and more.

Msty is one of the best tools for interacting with Ollama. Here’s how to use it.
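To make "locally installed" concrete, here is a minimal sketch of the kind of request a front end like Msty sends to a local Ollama instance. It assumes Ollama is running on its default port (11434) and that a model named llama3 has already been pulled; the model name and prompt are placeholders, not anything Msty requires.

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default, so the prompt and the
# model's response never leave your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",  # assumes this model was pulled, e.g. `ollama pull llama3`
    "prompt": "Summarize the benefits of running an LLM locally.",
    "stream": False,    # return one complete JSON object instead of a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.load(response)

# The generated text comes back in the "response" field.
print(result["response"])
```

Everything in that exchange happens over localhost, which is the privacy advantage the rest of this piece builds on; Msty simply wraps this loop in a polished GUI.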