
It's possible I'm misunderstanding what you're trying to do, but Ollama works well for me for local inference with Qwen on my MacBook Pro (32 GB).


Yup, also using Ollama on a MacBook Pro. Ollama is #1


But isn't Ollama only for local chat? Or am I missing something? I'd like to set it up as a server that I can use from another laptop (use it as my local AI hub) and would love to integrate it with some IDE using MCP.


No, it can listen on 0.0.0.0, or you can serve it through a proxy.
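
To expand a bit: by default Ollama only binds to 127.0.0.1, and starting the server with the OLLAMA_HOST environment variable set (e.g. OLLAMA_HOST=0.0.0.0 ollama serve) makes it listen on all interfaces. Here's a minimal sketch of calling it from another laptop over the plain REST API; the LAN address 192.168.1.50 is hypothetical, and it assumes a Qwen model has already been pulled on the server machine:

    # Query a remote Ollama server over its REST API.
    # Assumes `OLLAMA_HOST=0.0.0.0 ollama serve` is running on the hub machine,
    # 192.168.1.50 is that machine's LAN address (hypothetical), and
    # `ollama pull qwen2.5` has already been run there.
    import json
    import urllib.request

    OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

    payload = {
        "model": "qwen2.5",  # any model already pulled on the server
        "prompt": "Explain what a Makefile is in one sentence.",
        "stream": False,     # return one JSON object instead of a stream
    }

    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())

    print(body["response"])

IDE integration usually works the same way: any plugin that speaks the Ollama API (or its OpenAI-compatible endpoint) just needs its base URL pointed at the hub machine instead of localhost.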



