
Well, a larger context window makes it easier to integrate other tools, such as a vector database for information retrieval to jam into the context: the more context you have, the more potentially relevant information you can add. For models like LLaMA, where the context is usually capped at 2K tokens, you're fairly limited in how much potentially relevant information you can add when doing complex tasks.
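To make the constraint concrete, here's a minimal sketch of packing retrieved passages into a fixed token budget. The chunk texts and the whitespace-based token count are placeholder assumptions (a real system would use the model's actual tokenizer and a real retriever); the point is just that a small window forces you to drop relevant material.

```python
def count_tokens(text: str) -> int:
    """Rough token estimate via whitespace splitting.
    Stand-in for the model's real tokenizer."""
    return len(text.split())

def pack_context(chunks: list[str], budget: int) -> list[str]:
    """Greedily keep chunks (assumed pre-sorted by relevance,
    most relevant first) until the token budget runs out."""
    packed, used = [], 0
    for chunk in chunks:
        cost = count_tokens(chunk)
        if used + cost > budget:
            break  # window full; remaining relevant chunks are dropped
        packed.append(chunk)
        used += cost
    return packed

# Hypothetical retrieval results, most relevant first.
chunks = [
    "first relevant passage about the task",
    "second relevant passage with supporting detail",
    "third relevant passage, still useful",
]

# A tiny budget (standing in for a 2K window minus the prompt and
# generation headroom) forces some relevant chunks to be dropped.
print(pack_context(chunks, budget=12))
```

With a 2K window, the prompt template and generation headroom already eat much of the budget, so the retriever's results get truncated aggressively; a larger window simply relaxes this cutoff.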

