Sign up for an API account, connect it to something like Open WebUI[0], and you can have just that, with a few caveats (mostly around specific UI features).
Bonus is you can query multiple models at once, including local llama.cpp/Ollama models. I use it with the Claude and OpenAI APIs, as well as local Mistral, Qwen, and DeepSeek models.
[0] https://docs.openwebui.com/ (one liner if you have `uv` installed)
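For reference, the one-liner is roughly this (command per the Open WebUI docs; the `DATA_DIR` path is just an example location):

```shell
# Launch Open WebUI via uv without a permanent install.
# DATA_DIR persists chats/settings between runs (example path).
DATA_DIR=~/.open-webui uvx --python 3.11 open-webui@latest serve
```

Then point it at your API keys (or a local llama.cpp/Ollama endpoint) from the admin settings in the browser UI.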