You write the query once. The router decides which model answers it — cheapest for simple questions, most capable for complex ones — and injects your codebase context automatically before sending. Supports Anthropic, OpenAI, Google Gemini, Together AI, Groq, Mistral, and Ollama. Works with any tool built on these providers, including Cursor, Windsurf, and GitHub Copilot. One endpoint, every model, zero manual switching.
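A minimal sketch of what "one endpoint, zero manual switching" looks like from the client side. Everything here is an assumption, not confirmed by this document: the router's address (`localhost:8000`), that it speaks the OpenAI chat-completions wire format, and that `"auto"` is the sentinel model name that tells the router to choose for you.

```python
import json

# Hypothetical router endpoint (assumption: OpenAI-compatible wire format)
ROUTER_URL = "http://localhost:8000/v1/chat/completions"

# The client sends one payload; the router picks the provider and model,
# and injects codebase context before forwarding upstream.
payload = {
    "model": "auto",  # hypothetical sentinel: let the router decide
    "messages": [
        {"role": "user", "content": "Summarize what src/router.py does"}
    ],
}

# The actual call would be a plain HTTP POST, e.g. with urllib:
#   req = urllib.request.Request(
#       ROUTER_URL,
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   resp = urllib.request.urlopen(req)

print(json.dumps(payload))
```

Because the router mimics a provider's existing API shape, tools built on that API (editors, agents, SDKs) only need their base URL pointed at the router; no per-tool model configuration changes.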