Hacker News

Where did you get that from? The cutoff date says August 2025. Looks like a newly pretrained model.




> This stands in sharp contrast to rivals: OpenAI’s leading researchers have not completed a successful full-scale pre-training run that was broadly deployed for a new frontier model since GPT-4o in May 2024, highlighting the significant technical hurdle that Google’s TPU fleet has managed to overcome.

- https://newsletter.semianalysis.com/p/tpuv7-google-takes-a-s...

It's also plainly obvious from using it. The "broadly deployed" qualifier is presumably referring to GPT-4.5.


How is that a technical hurdle if they obviously were able to do it before?

It's probably just a question of cost/benefit analysis: a full pre-training run is very expensive, so the benefits need to be significant.


If the pretraining rumors are true, they're probably using continued pretraining on the older weights. Right?



