Local models and custom OpenAI-compatible endpoints
planned
Brian
Imported from GitHub feature request.
Source issue:
- #337 Add capability for local models / any OpenAI Endpoint — https://github.com/refactoringhq/tolaria/issues/337
Original context from GitHub:
Users would like Tolaria's AI integration to support local models, OpenRouter, and custom OpenAI-compatible API endpoints, rather than relying solely on external coding-agent CLIs such as Claude Code or Codex.
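For context, most OpenAI-compatible servers (local llama.cpp, Ollama, OpenRouter, etc.) differ from the official API only in their base URL, model name, and API key. A minimal sketch of what such a request looks like; the URL and model name below are placeholder assumptions, not part of Tolaria:

```python
import json

# Hypothetical settings: any OpenAI-compatible endpoint is just a base
# URL, a model name, and an optional API key.
BASE_URL = "http://localhost:11434/v1"  # e.g. a local Ollama server (assumption)
MODEL = "llama3"                        # assumption: any model the server exposes


def build_chat_request(prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON body for a /chat/completions call."""
    url = f"{BASE_URL}/chat/completions"
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, body


url, body = build_chat_request("Hello")
print(url)
print(json.dumps(body))
```

Supporting this would mostly mean making the base URL, model, and key configurable wherever Tolaria currently assumes a fixed provider.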
Luca Rossi marked this post as planned.