
My go-to LLM tool just dropped a super simple Mac and PC app for local AI – why you should try it


ZDNET’s key takeaways

  • Ollama AI devs have released a native GUI for MacOS and Windows.
  • The new GUI greatly simplifies using AI locally.
  • The app is easy to install and allows you to pull different LLMs.

If you use AI, there are several reasons why you might want to work with it locally instead of in the cloud.

First, it offers much more privacy. When you use a Large Language Model (LLM) in the cloud, you never know whether your queries or results are being tracked or even saved by a third party. Second, using an LLM locally can save energy. The amount of energy required to run cloud-based LLMs is growing and could become a problem in the future.

Ergo, locally hosted LLMs.

Also: How to run DeepSeek AI locally to protect your privacy – 2 easy ways

Ollama is a tool that allows you to run a variety of LLMs. I’ve been using it for some time and have found that it simplifies the process of downloading and using different models. Although it requires serious system resources (you wouldn’t want to use it on an aging machine), it runs fast and makes switching between models easy.

But Ollama by itself has been a command-line-only affair. There are some third-party GUIs (such as Msty, which has been my go-to). Until now, the developers behind Ollama hadn’t produced their own GUI.

That all changed recently, and there’s now a straightforward, user-friendly GUI, aptly named Ollama.

Works with common LLMs – but you can pull others

The GUI is fairly basic, but it’s designed so that anyone can jump in and start using it right away. The LLM drop-down also includes a short list of common models, such as Gemma, DeepSeek, and Qwen. Select one of those models, and the Ollama GUI will pull it for you.

If you want to use a model that isn’t listed, you’ll need to pull it from the command line like so:

ollama pull MODEL

Where MODEL is the name of the model you want.
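For example, to grab Mistral (one of the many models in the Ollama Library, used here purely as an illustration), you would run:

ollama pull mistral

Once the pull finishes, the model is available to Ollama just as if you had pulled it from the GUI.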

Also: How I feed my files to a local AI for better, more relevant responses

You can find a full list of available models in the Ollama Library.

After you’ve pulled a model, it appears in the drop-down to the right of the query bar.
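If you ever want to confirm from the terminal which models are already on your machine, Ollama includes a command for that (a quick sanity check, not a required step):

ollama list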

The Ollama app is as easy to use as any cloud-based AI interface on the market, and it’s free to use for MacOS and Windows (sadly, there’s no Linux version of the GUI).

I’ve kicked the tires of the Ollama app and found that, although it doesn’t have quite the feature set of Msty, it’s easier to use and fits in better with the MacOS aesthetic. The Ollama app also seems to be a bit faster than Msty (in both opening and responding to queries), which is a good thing because local AI can often be a bit slow (due to a lack of system resources).

How to install the Ollama app on Mac or Windows

You’re in luck, as installing the Ollama app is as easy as installing any app on either MacOS or Windows. You simply point your browser to the Ollama download page, download the app for your OS, double-click the downloaded file, and follow the directions. For example, on MacOS, you drag the Ollama app icon into the Applications folder, and you’re done.

Using Ollama is equally easy: select the model you want, let it download, then query away.

The Ollama app showing the model select drop-down.

Pulling an LLM is as easy as selecting it from the list and letting the app do its thing.

Jack Wallen/ZDNET

Should you try the Ollama app?

If you’ve been looking for a reason to try local AI, now is the perfect time. 

Also: I tried Sanctum’s local AI app, and it’s exactly what I needed to keep my data private

The Ollama app makes migrating away from cloud-based AI as easy as it can get. The app is free to install and use, as are the LLMs in the Ollama library. Give this a chance, and see if it doesn’t become your go-to AI tool.

Want more stories about AI? Check out AI Leaderboard, our weekly newsletter.

