
A “pretrained” ResNet could easily have been trained through a supervised signal like ImageNet labels.

“Pretraining” is not a correlate of the learning paradigm; it is a correlate of the subsequent “fine-tuning” process.
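A minimal sketch of that point using torchvision (the weights enum needs torchvision >= 0.13, and the 10-class head is purely illustrative):

    import torch
    import torch.nn as nn
    from torchvision import models

    # The "pretraining" here was ordinary supervised learning
    # on ImageNet labels.
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

    # What makes it "pretrained" is only that we fine-tune it next:
    # swap the classification head for a new task and keep training.
    model.fc = nn.Linear(model.fc.in_features, 10)  # 10 classes, illustrative
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)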

Also, LLM pretraining is self-supervised: the next-token targets come from the text itself, not from human annotation. Dwarkesh is wrong.
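For contrast, a toy sketch of that objective (some_language_model is a hypothetical stand-in, not a real API):

    import torch
    import torch.nn.functional as F

    # Next-token prediction: the "labels" are just the input
    # shifted by one position, so no human annotation is involved.
    tokens = torch.randint(0, 50257, (1, 128))       # stand-in token ids
    inputs, targets = tokens[:, :-1], tokens[:, 1:]
    logits = some_language_model(inputs)             # hypothetical; shape (1, 127, vocab)
    loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           targets.reshape(-1))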


