Settings
Configure your API provider and model
Note:
- OpenAI (provider): any OpenAI-compatible API, e.g. OpenAI, DeepSeek, or Kimi. Fill in that service's Base URL and API Key.
- Ollama: run locally with ollama serve; the Base URL is usually http://localhost:11434.
- LiteLLM: use any LLM that supports an OpenAI-compatible API.
- Test Connection checks whether the Polarity backend is reachable (default http://localhost:8000). Start the backend with polarity serve or uvicorn polarity_agent.api:app --port 8000.
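The Test Connection check described above can be sketched with a small stdlib helper. This is a minimal illustration, not Polarity's actual implementation: is_reachable is a hypothetical name, and it treats any HTTP response (even an error status) as "reachable", since only connection-level failures mean the server is down.

```python
import urllib.request
import urllib.error

def is_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url.

    Hypothetical sketch of a connectivity check: any HTTP response,
    even a 404, counts as reachable; only connection-level failures
    (refused, timed out, unresolvable host) count as unreachable.
    """
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded with an error status, so it is up.
        return True
    except (urllib.error.URLError, OSError):
        return False

# Example: check the default Polarity backend address.
# is_reachable("http://localhost:8000")
```

The same helper works for the Ollama Base URL (http://localhost:11434) or any other provider endpoint you configure.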