
How is your offering different from local Ollama?



It's batteries-included. No config.

We also fine-tuned and did RL on our model, developed a custom context engine, trained an embedding model, and modified MLX to improve inference.

Everything is built to work together. So it's more like an Apple product than Linux: less config, but better optimized for the task.


I only understood half of the tech jargon in your answer. If I understood it all, I'd probably run it myself. If someone less knowledgeable than me is your customer, you need to explain in simpler terms!

Fair enough! The simple answer is: we did a lot of work to make the model better at coding without requiring complicated installation or configuration. One command to install and run.

All the benefits of Claude Code, without any of the limitations or rug pulls.


I'm not nitpicking, but are you saying it's better than Claude or Codex? Is it also focused and tested mainly on web/JS technologies? Building native apps is still very much an uphill battle. I think there's an untapped market for Swift / Android coding models.

Actually, we're more broadly trained than most models. We did long-tail training across languages, so we improved execution with languages like Java, Swift, and even COBOL.

It's definitely David vs. Goliath. But we know there's a subset of devs who need the privacy or unlimited nature of local.



