
An 8k context is tiny compared to what's already out there. They promise much larger context windows, but until then the model can't even reliably summarize an arbitrary web page.


There are techniques to extend the context window via fine-tuning.

The authors claim this method was used to extend Llama 2 to 128k: https://github.com/jquesnelle/yarn
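For intuition: YaRN builds on RoPE position interpolation, where positions are squeezed so a longer sequence maps into the rotation range the model saw during training (YaRN itself is more refined, scaling different frequency bands differently). A minimal numpy sketch of the simpler linear-interpolation idea, assuming an 8k-trained model stretched to 128k (scale factor 16); function names here are illustrative, not from the YaRN repo:

```python
import numpy as np

def rope_inv_freqs(dim, base=10000.0):
    # Standard RoPE inverse frequencies for an (even) head dimension `dim`.
    return 1.0 / (base ** (np.arange(0, dim, 2) / dim))

def rope_angles(positions, dim, scale=1.0):
    # Rotation angles for each (position, frequency) pair.
    # With scale > 1, positions are compressed (linear position
    # interpolation), so position 128k lands where 8k used to be.
    inv_freq = rope_inv_freqs(dim)
    return np.outer(np.asarray(positions, dtype=float) / scale, inv_freq)

# With scale=16, position 16 produces the same angles the model
# originally saw at position 1, keeping rotations in-distribution.
```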



