Well, a larger context window makes it easier to integrate other tools, like a vector database for retrieval: you pull potentially relevant documents and jam them into the prompt, and the more context you have, the more of that retrieved information fits. For models like LLaMA, where the context is (usually) capped at 2K tokens, you're fairly limited in how much relevant information you can include when doing complex tasks.
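To make the constraint concrete, here's a minimal sketch of greedily packing ranked retrieval results into a fixed token budget. The function name and the whitespace-based token estimate are my own assumptions for illustration; a real system would count tokens with the model's actual tokenizer, and the chunk texts are placeholders.

```python
# Sketch: fit retrieved chunks (ranked best-first) into a context budget.
# NOTE: token counts are approximated by whitespace splitting; use the
# model's tokenizer in practice.

def pack_context(chunks, budget_tokens):
    """Greedily add chunks until the next one would exceed the budget."""
    packed, used = [], 0
    for text in chunks:
        cost = len(text.split())  # rough token estimate
        if used + cost > budget_tokens:
            break
        packed.append(text)
        used += cost
    return "\n\n".join(packed)

# With a 2K-token model, the instruction and the answer eat much of the
# window, so perhaps only ~1000 tokens remain for retrieved passages.
ranked_chunks = [
    "First, most relevant passage " * 50,   # ~200 tokens
    "Second passage " * 100,                # ~200 tokens
    "Third passage " * 200,                 # ~400 tokens
]
context = pack_context(ranked_chunks, budget_tokens=1000)
```

With a 32K-token model the same loop simply keeps going, which is the whole point: the retrieval side doesn't change, you just get to include far more of what it returns.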