Hacker News

> 1T total / 32B active MoE model

Is this the largest open-weight model?
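For context on the "1T total / 32B active" figure: in a sparse MoE, every token is routed to only a few experts, so the parameters touched per token are far fewer than the parameters stored. A rough sketch of that arithmetic (the layer/expert counts below are hypothetical round numbers for illustration, not K2's actual architecture):

```python
def moe_params(n_layers, n_experts, expert_params, top_k, shared_params):
    """Rough total vs. per-token-active parameter counts for a sparse MoE.

    total  = all experts in all layers, plus shared (attention, embeddings, ...)
    active = only the top_k experts each token is routed to, plus shared
    """
    total = n_layers * n_experts * expert_params + shared_params
    active = n_layers * top_k * expert_params + shared_params
    return total, active

# Hypothetical example: 60 MoE layers, 320 experts of 50M params each,
# 8 experts routed per token, 10B shared parameters.
total, active = moe_params(
    n_layers=60, n_experts=320, expert_params=50_000_000,
    top_k=8, shared_params=10_000_000_000,
)
print(f"total: {total/1e12:.2f}T, active: {active/1e9:.0f}B")
# total lands near 1T while active stays in the tens of billions
```

This is why a 1T-total MoE can serve tokens at roughly the compute cost of a ~30B dense model, even though the full 1T must still be stored.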



No.

At 1T MoE trained on 15.5T tokens, K2 is one of the largest open-weight models to date. But Tele-FLM-1T (from TeleAI and BAAI) is 1T dense on 15.7T tokens: https://huggingface.co/CofeAI/Tele-FLM-1T

You can always check here: https://lifearchitect.ai/models-table/


I believe so.

Grok-1 is 314B, DeepSeek-V3 is 671B, and most recent open-weight releases are in the 70B~300B range.



