Ollama on macOS Simplified

Using Msty with Ollama on macOS

I’ve turned to locally installed AI for research because I don’t want third parties using my information either to build a profile or to train their large language models (LLMs). My local AI of choice is the open-source Ollama. I recently wrote a piece on how to make using a local LLM easier with the help of a browser extension, which I use on Linux. On macOS, however, I turn to an easy-to-use, free app called Msty.

Installing Msty

What you’ll need: The only things you’ll need for this are a macOS device and Ollama installed and running. If you haven’t installed Ollama, do that first (here’s how). You’ll also need to pull down one of the local models (also demonstrated in the article above).
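If you prefer the Terminal, the prerequisites above can be handled with a few commands. This is a rough sketch that assumes you use Homebrew; the model name matches the Llama 3.2 model referenced later in this guide:

```shell
# Install Ollama (assumes Homebrew is installed)
brew install ollama

# Start the Ollama server in the background
ollama serve &

# Pull the Llama 3.2 model used later in this guide
ollama pull llama3.2

# Sanity check: list the models the server knows about
curl http://localhost:11434/api/tags
```

If you installed Ollama from its official installer instead, skip the `brew` step; the server usually starts automatically in that case.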

Head to the Msty website, click the Download Msty dropdown, select Mac, and then select either Apple Silicon or Intel. When the download is complete, double-click the file and, when prompted, drag the Msty icon to the Applications folder.

Using Msty

1. Open Msty

Next, open Launchpad and locate the launcher for Msty. Click the launcher to open the app.

2. Connect your local Ollama model

When you first run Msty, click Setup Local AI and the app will download the components it needs. Once the download completes, Msty takes care of the configuration and pulls down a default local model of its own; this is separate from your existing Ollama installation.

To connect Msty to Ollama, click Local AI Models in the sidebar and then click the download button associated with Llama 3.2. Once downloaded, you can select it from the models dropdown. You can also add online models, for which you’ll need to retrieve an API key from your account with that provider; local models don’t require a key. Msty should now be connected to the local Ollama LLM.
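Behind the scenes, apps like Msty talk to Ollama over its local REST API, which listens on port 11434 by default. As a rough sketch of what that connection looks like, here is how you could query the same Llama 3.2 model directly from Python; `build_request` and `ask_ollama` are hypothetical helper names, not part of Msty or Ollama:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # Minimal payload for Ollama's /api/generate endpoint;
    # stream=False asks for a single JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, model: str = "llama3.2") -> str:
    # Send the prompt to the locally running Ollama server
    # and return just the generated text.
    data = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires Ollama running with llama3.2 pulled):
#   print(ask_ollama("In one sentence, what is a local LLM?"))
```

Nothing here is required to use Msty; it simply shows that the connection you just made is an ordinary HTTP conversation on your own machine.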

3. Model Instructions

One of the cool features of Msty is that it allows you to change the model instructions.

For example, you might want to use the local LLM as an AI-assisted doctor, for writing, accounting, as an alien anthropologist, or as an artistic advisor.

To change the model instructions, click Edit Model Instructions in the center of the app and then click the tiny chat button to the left of the broom icon. From the popup menu, you can select the instructions you want to apply. Click “Apply to this chat” before running your first query.
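Model instructions are essentially a system prompt layered on top of the model. If you want a similar effect in Ollama itself, without Msty, a Modelfile can bake instructions into a named model; the persona below is just an illustration:

```
# Illustrative Modelfile: bake a persona into Llama 3.2
FROM llama3.2
SYSTEM """You are an artistic advisor. Give concise, practical feedback on creative work."""
```

Build it with `ollama create artistic-advisor -f Modelfile`, and the new model appears alongside your other local models.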

Conclusion

Msty is one of the best tools for interacting with Ollama. With its easy-to-use interface and features like split chats, response regeneration, chat cloning, multi-model support, real-time data summoning, Knowledge Stacks, a prompt library, and more, you’ll be able to get the most out of your local LLM.

FAQs

Q: What is Msty?
A: Msty is a free app that allows you to use locally installed and online AI models.

Q: How do I install Msty?
A: You can download Msty from the Msty website and follow the installation instructions.

Q: How do I connect Msty to Ollama?
A: To connect Msty to Ollama, click Local AI Models in the sidebar and then click the download button associated with Llama 3.2. Once downloaded, you can select it from the models dropdown.

Q: What are the benefits of using Msty with Ollama?
A: Msty makes it easy to use your local Ollama LLM, and features like split chats, response regeneration, chat cloning, multi-model support, real-time data summoning, Knowledge Stacks, and a prompt library make it a powerful tool for interacting with your local LLM.
