andreww_young | 3 months ago | on: How to Deploy LLM Locally
Ollama is very convenient, but I'd advise against bothering with it: the capabilities of local models are really poor.
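
For context on the "convenient" part: once the Ollama server is running and a model has been pulled, querying it is a few lines over its local HTTP API. A minimal sketch in Python, assuming the default port 11434 and that "llama3" (just an example name) has already been pulled with `ollama pull`:

    # Query a locally running Ollama server via its HTTP API.
    # Assumes the server is on the default port 11434 and the model
    # ("llama3" is only an example name) was pulled beforehand.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3",           # substitute whatever model you pulled
        "prompt": "Why is the sky blue?",
        "stream": False,             # one JSON object instead of a token stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

The ease of setup is real; the complaint above is about the output quality of the small models that fit on local hardware, not about the tooling.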