With just NAND gates and D flip-flops as primitives: building a full 16-bit ALU, then an ISA for it, an assembler, a compiler, an operating system (with a memory-mapped keyboard and screen), and an interpreter. Then writing a Sudoku game in that interpreted language, seeing it reduce all the way down to bits, and running it on the simulated computer.
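The "everything from NAND" idea is easy to sketch: with NAND as the only primitive, every other gate falls out by composition. A minimal illustration in plain Python (not the course's HDL, just the same logic):

```python
def nand(a: int, b: int) -> int:
    """The only primitive: outputs 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

# Every other gate composes from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return nand(nand(a, b), nand(a, b))
def or_(a, b):  return nand(nand(a, a), nand(b, b))

def xor(a, b):
    """Classic 4-NAND XOR construction."""
    m = nand(a, b)
    return nand(nand(a, m), nand(b, m))

def half_adder(a, b):
    """First step toward a 16-bit ALU: returns (sum, carry)."""
    return xor(a, b), and_(a, b)
```

From here, chaining half adders into full adders, and full adders into a 16-bit ripple-carry adder, is just more of the same composition.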
Now I really have a holistic view of what actually happens when I run a program on a computer. It also demystified compilers for me.
Second thing was learning concurrency control, transactions, and all that. The eureka moment for me was understanding that git and databases solve a similar problem: concurrent writers racing to change shared state. After that, both tools lost their mystery.
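The shared problem can be sketched in a few lines of optimistic concurrency control: a commit succeeds only if the writer saw the latest version, otherwise it must pull/retry and merge. This is a toy illustration of the idea, not how either tool is actually implemented:

```python
class ConflictError(Exception):
    pass

class Store:
    """Toy versioned store with optimistic concurrency control."""
    def __init__(self, value):
        self.value, self.version = value, 0

    def read(self):
        return self.value, self.version

    def commit(self, new_value, base_version):
        # Reject stale writers -- analogous to a rejected git push
        # or a database serialization failure.
        if base_version != self.version:
            raise ConflictError("stale base; pull/retry and merge")
        self.value, self.version = new_value, self.version + 1
```

Two clients read version 0; the first commit wins, the second gets a conflict and must redo its change on top of the new state, which is exactly the workflow both tools force on you.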
Second to last but not least: making web apps! A low-barrier way to make things anyone can use really boosted my confidence.
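"Low entry barrier" is not an exaggeration; a working web app fits in a few lines of standard-library Python. A minimal WSGI sketch, no framework assumed:

```python
from wsgiref.simple_server import make_server

def app(environ, start_response):
    """Minimal WSGI app: greets whoever is named in the URL path."""
    name = environ.get("PATH_INFO", "/").strip("/") or "world"
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [f"Hello, {name}!".encode()]

# To serve it on http://localhost:8000/ :
#   make_server("", 8000, app).serve_forever()
```

Visiting /Ada would return "Hello, Ada!"; everything past this point is just more functions and more routes.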
Last but not least, learning the statistical underpinnings of machine learning.
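One concrete instance of those underpinnings: ordinary least squares is the maximum-likelihood estimate under Gaussian noise, and the familiar slope formula cov(x, y) / var(x) drops straight out of that probabilistic model. A pure-Python sketch:

```python
def ols_fit(xs, ys):
    """Fit y = a*x + b by least squares -- equivalently, the
    maximum-likelihood estimate assuming Gaussian noise on y."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx  # slope, intercept

xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]            # exactly y = 2x + 1
print(ols_fit(xs, ys))       # -> (2.0, 1.0)
```

Seeing the loss function as a log-likelihood is what turns "minimize squared error" from a recipe into a modeling choice you can question.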
Of course, I'm still in university, so I'm by no means an expert at any of the above, but even at my level these things really help you see where you truly stand.
Reading Designing Data-Intensive Applications was great in this same sense. Anything concurrent or distributed was always a mythical black box to me. It turns out there are no real, all-encompassing solutions to these issues. It's all trade-offs, and each solution has aspects it can't help with. Vector clocks (which allow ordering, but only logically) and the CAP theorem (availability vs. consistency under network partition) are terms that come to mind.
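Vector clocks are a good example of a deliberately partial answer: they order causally related events but leave concurrent ones incomparable. A small sketch (the node names and helper functions are made up for illustration):

```python
def vc_merge(a: dict, b: dict) -> dict:
    """Element-wise max: what a node does with an incoming clock."""
    return {k: max(a.get(k, 0), b.get(k, 0)) for k in a.keys() | b.keys()}

def happened_before(a: dict, b: dict) -> bool:
    """a -> b iff a <= b component-wise and a != b."""
    keys = a.keys() | b.keys()
    return all(a.get(k, 0) <= b.get(k, 0) for k in keys) and a != b

a = {"n1": 2, "n2": 0}
b = {"n1": 2, "n2": 1}   # b has seen everything in a, plus one local event
c = {"n1": 1, "n2": 3}   # neither dominates the other vs. a
```

Here happened_before(a, b) holds, but neither happened_before(a, c) nor happened_before(c, a) does: a and c are concurrent, and no amount of cleverness in the clock can break that tie.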
nand2tetris often comes up on HN. I like to link to a similar, fully in-browser game which I think achieves the same: https://www.nandgame.com/ (no affiliation, I just really liked it)