
Quick Setup

1. Navigate to AI Settings

   Open chrome://settings/oryx and add Ollama as a provider.

2. Get Model ID

   Note the model ID of your Ollama model (e.g., gpt-oss:20b).

3. Start Ollama Server

   Start Ollama with CORS enabled:

   OLLAMA_ORIGINS="*" ollama serve

4. Select and Use

   Select the model in the agent and start using it! 🥳
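The CLI side of the steps above can be sketched from the shell. This is a minimal sketch, assuming Ollama is installed locally and serving on its default port 11434; `ollama list` and the `/api/tags` endpoint are standard Ollama CLI/API surfaces:

```shell
# Step 2: list installed models to find the model ID (first column)
ollama list

# Step 3: start the server with CORS open to all origins, so the
# browser extension's origin is allowed to call the local API
OLLAMA_ORIGINS="*" ollama serve &

# Optional check: /api/tags returns the installed models as JSON,
# confirming the server is reachable
curl http://localhost:11434/api/tags
```

If the `curl` call returns a JSON list of models, the server is up and the model should be selectable in the settings page.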
If you don't want to run Ollama from the CLI with CORS settings, we recommend using LM Studio instead. See the LM Studio setup guide.

Alternative: LM Studio

LM Studio Setup

If you prefer not to run Ollama from the command line, LM Studio provides a more user-friendly alternative with a graphical interface.