
(Former AI researcher + current technical founder here)

I assume you’re talking about the latest advances and not just regression and PAC learning fundamentals. I don’t recommend following a linear path - there are too many rabbit holes. Do two things - a course and a small course project. Keep it time-bound and aim to finish no matter what. Do not dabble outside of this for a few weeks :)

Then find an interesting area of research, find its GitHub repo, and run that code. Find a way to improve it and/or use it in an app.

Some ideas:

- do the fast.ai course (https://www.fast.ai/)

- read Karpathy’s blog posts about how transformers/LLMs work (and Lilian Weng’s https://lilianweng.github.io/posts/2023-01-27-the-transforme... for an update)

- Stanford CS231n on vision basics (https://cs231n.github.io/)

- Stanford CS324 on language models (https://stanford-cs324.github.io/winter2022/)
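To make the transformer material above concrete: the core operation those courses and posts keep circling back to is scaled dot-product attention. A minimal NumPy sketch (toy shapes and variable names are my own, not from any of the linked courses):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max before exponentiating, for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Score each query against every key, scaled by sqrt(d_k),
    # then take a softmax-weighted average of the values.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_q, seq_k)
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ V                  # (seq_q, d_v)

# Toy self-attention: 3 tokens, 4-dim embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = attention(x, x, x)
print(out.shape)  # (3, 4)
```

Real transformers add learned Q/K/V projections, multiple heads, and masking on top of this, but once this ~15-line core clicks, the rest of the architecture is mostly bookkeeping.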

Now, find a project you’d like to do.

eg: https://dangeng.github.io/visual_anagrams/

or any of the ones that are posted to hn every day.

(posted on phone in transit, excuse typos/formatting)



Would recommend Zero to Hero by Karpathy as well

https://karpathy.ai/zero-to-hero.html
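Zero to Hero starts by building a scalar autograd engine from scratch. A stripped-down sketch in that spirit (my own minimal version, supporting only + and *, not the lecture's actual code):

```python
class Value:
    """A scalar node in a computation graph, with reverse-mode autodiff."""

    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for c in v._prev:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a  # dc/da = b + 1 = 4, dc/db = a = 2
c.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

If you can write and debug something like this yourself, the later lectures (makemore, GPT) feel much less magical.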


I also recommend fastai. It gets you hands-on from the very beginning, with links to extra resources like papers and articles you can read to deepen your understanding.

Doing fastai while solving comparable problems on your own on Kaggle is quite enlightening.


Ah, visual anagrams: that was exactly the idea I had for a project that would let me learn. I hadn't dared to look whether it already existed. I will try to pretend it doesn't and find my own way...


fast.ai course (https://www.fast.ai/) gets a thumbs up from me as well



