Hacker News

Running Sourcebot with a self-hosted LLM is something we plan to support and have documented in the golden path very soon, so stay tuned.

We are using the Vercel AI SDK, which supports Ollama via a community provider, but that provider doesn't support v5 yet (the version Sourcebot is on): https://v5.ai-sdk.dev/providers/community-providers/ollama


