Use Ollama for local LLM inference in Raycast. This extension is not directly affiliated with Ollama.ai.
Ollama must be installed and running on your Mac. At least one model needs to be installed through the Ollama CLI tools or with the 'Manage Models' command. You can find all available models here.
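For reference, a model can also be pulled programmatically through Ollama's REST API, which is what the `ollama pull` CLI command wraps. A minimal TypeScript sketch, assuming the default local endpoint `http://localhost:11434` and the example model name `llama3.2`:

```typescript
const OLLAMA_HOST = "http://localhost:11434";

// Pull a model through Ollama's REST API instead of the CLI.
async function pullModel(model: string): Promise<void> {
  const res = await fetch(`${OLLAMA_HOST}/api/pull`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // stream: false tells Ollama to reply once, after the pull completes
    body: JSON.stringify({ model, stream: false }),
  });
  if (!res.ok) throw new Error(`pull failed: ${res.status} ${res.statusText}`);
  console.log(`Model "${model}" is now available locally.`);
}

// Example model name; any model from the Ollama library works.
pullModel("llama3.2").catch(console.error);
```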
View, add, and remove models that are installed locally or on a configured remote Ollama server. To manage and use models from a remote server, first register it with the Add Server action.
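To illustrate the kind of request involved (this is a generic sketch, not the extension's actual code), listing installed models comes down to Ollama's `/api/tags` endpoint; the remote address below is a placeholder for whatever you register with Add Server:

```typescript
// List the models installed on a local or remote Ollama server.
async function listModels(host = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${host}/api/tags`);
  if (!res.ok) throw new Error(`request failed: ${res.status}`);
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

// Local server first, then a hypothetical remote one.
listModels().then((names) => console.log("local:", names));
listModels("http://192.168.1.50:11434").then((names) => console.log("remote:", names));
```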
Chat with your preferred model from Raycast, with the following features:
From the extension preferences you can choose how many messages to use as memory. By default it uses the last 20 messages.
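A sketch of how such a rolling memory window can be applied before each request. The trimming logic here is a generic illustration rather than the extension's exact implementation, and it assumes Ollama's `/api/chat` endpoint with the example model `llama3.2`:

```typescript
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function chatWithMemory(
  history: ChatMessage[],
  userInput: string,
  memorySize = 20, // mirrors the extension's default of 20 messages
): Promise<string> {
  history.push({ role: "user", content: userInput });
  // Keep only the most recent `memorySize` messages as context.
  const window = history.slice(-memorySize);
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.2", messages: window, stream: false }),
  });
  const data = (await res.json()) as { message: ChatMessage };
  history.push(data.message);
  return data.message.content;
}
```

A larger window gives the model more context at the cost of a longer prompt, which is why the size is left as a preference.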
All preconfigured commands are crafted for general use. This command allows you to create a custom command for your specific needs.
Prompts use the Raycast Prompt Explorer format, with the following tags supported:
View, add, and remove MCP servers for use with "Chat With Ollama." Currently, only tools are supported. A model with tool capabilities is required.
This feature was tested with duckduckgo-mcp-server, which allows the model to search for information on DuckDuckGo.
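For context on the tool requirement, here is a sketch of how tool definitions reach a model through Ollama's `/api/chat` endpoint. The `web_search` tool below is a simplified stand-in for what an MCP server such as duckduckgo-mcp-server exposes, not its actual schema:

```typescript
// A tool definition in the shape Ollama's chat API expects (JSON Schema parameters).
const searchTool = {
  type: "function",
  function: {
    name: "web_search",
    description: "Search DuckDuckGo and return result snippets",
    parameters: {
      type: "object",
      properties: { query: { type: "string", description: "Search terms" } },
      required: ["query"],
    },
  },
};

async function askWithTools(prompt: string) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // must be a model with tool capabilities
      messages: [{ role: "user", content: prompt }],
      tools: [searchTool],
      stream: false,
    }),
  });
  const data = await res.json();
  // A tool-capable model may reply with tool_calls instead of plain text;
  // the caller then runs the tool and sends the result back in a follow-up turn.
  console.log(data.message.tool_calls ?? data.message.content);
}

askWithTools("What is the latest Ollama release?").catch(console.error);
```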