Hacker News

I think this is just it though. You don't necessarily need to be able to sit down and formally write out the exact amortized big-O complexity (or little-o, big or little omega, or theta), but equally it's good to have some idea of how something would scale.

I feel like self-taught developers who are serious just learn this by intuition because, frankly, if you're writing software where it matters then it very quickly becomes an obvious concern. If you're self-taught and it doesn't matter, then it doesn't matter!

With a formal background you may or may not use it, but I'd say the only difference is that knowing the formal notation makes talking about it with other programmers who also know that notation easier. Even then, it's not like algorithmic complexity is (at its heart) a particularly difficult concept when directly applied to a project. I always found it much harder as an abstract idea than when working with a specific algorithm.
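To make that concrete, here is a hypothetical example (function names are mine) of the kind of thing that makes complexity an obvious practical concern long before you put a name to it: the same deduplication task written two ways, one quadratic and one linear.

```python
def dedupe_quadratic(items):
    """O(n^2): each membership test scans the whole result list."""
    result = []
    for x in items:
        if x not in result:   # linear scan inside a loop -> quadratic overall
            result.append(x)
    return result

def dedupe_linear(items):
    """O(n): set membership is (amortized) constant time."""
    seen = set()
    result = []
    for x in items:
        if x not in seen:
            seen.add(x)
            result.append(x)
    return result
```

Both return the same answer on small inputs; the first just quietly stops scaling once the input grows, which is usually how the intuition gets learned.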



> You don't necessarily need to be able to sit down and formally write out the exact amortized big-O complexity (or little-o, big or little omega, or theta), but equally it's good to have some idea of how something would scale.

If you have a good idea of how things scale, being able to express exactly how they scale with succinct and clear notation is useful. Quite useful, in fact.

That's why formal notation exists: because it is handy. Not because there is an eternal, global conspiracy among academics to keep up useless habits just to show off.



