
I love how the first example is "use the common interfaces for new code". If only! That assumes there _is_ a common interface for the task, and that the existing code isn't just copy-pasted from something similar and tweaked to fit each use case.

So the only tweak I'd make here: if you are tempted to copy a bit of code that already exists in 100 places, with maybe a 1% change, please, for the love of god, make a common function and parameterize out the differences. Pick a dozen or so instances throughout the codebase and replace them with your new function, validating the abstraction. So begins the slow work of improving an old codebase created by undisciplined hands.
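As a sketch of that consolidation step, here the near-duplicate report formatters and their parameters are hypothetical stand-ins, but the shape of the refactor is the point: the 1% of difference becomes a parameter.

```python
# Before: near-identical copies scattered through the codebase,
# differing only in the delimiter and whether a total row is appended:
#
#   def format_sales_report(rows): ...   # joins with ","; appends total
#   def format_audit_report(rows): ...   # joins with "|"; no total
#
# After: one parameterized function replacing both.

def format_report(rows, delimiter=",", include_total=False):
    """Render rows of (label, amount) pairs as delimited lines."""
    lines = [f"{label}{delimiter}{amount}" for label, amount in rows]
    if include_total:
        total = sum(amount for _, amount in rows)
        lines.append(f"TOTAL{delimiter}{total}")
    return "\n".join(lines)

# Swap a handful of call sites over to the new function to validate
# that the abstraction actually covers the variants:
sales = format_report([("widgets", 3), ("gears", 5)], include_total=True)
audit = format_report([("widgets", 3)], delimiter="|")
```

If a call site can't be expressed through the parameters, that's useful information too: either the abstraction needs another knob, or that copy was genuinely different and should stay separate.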

Oh, and make sure you have regression tests. The stupider the better: for a given input, snapshot the output, and if the output ever changes, audit the change. If the program only takes user input, consider capturing it and playing it back; if the program produces no data as output, consider snapshotting the rendered frames.
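A minimal golden-master harness in that "stupid" spirit might look like this; the snapshot directory name and the example payload are placeholders, not anything from the comment above.

```python
import json
from pathlib import Path

def check_snapshot(name, output, snapshot_dir=Path("snapshots")):
    """Compare output against a stored snapshot; record it on first run.

    Any later mismatch raises, so the change gets audited rather than
    silently absorbed.
    """
    snapshot_dir.mkdir(exist_ok=True)
    path = snapshot_dir / f"{name}.json"
    serialized = json.dumps(output, indent=2, sort_keys=True)
    if not path.exists():
        path.write_text(serialized)  # first run: this becomes the golden output
        return
    if path.read_text() != serialized:
        raise AssertionError(f"snapshot {name!r} changed; audit the diff")

# First call records the snapshot; the second verifies against it.
check_snapshot("report_v1", {"rows": 2, "total": 8})
check_snapshot("report_v1", {"rows": 2, "total": 8})
```

The snapshot file goes into version control, so an intentional behavior change shows up as a reviewable diff rather than a silent drift.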



At some point, improvement and occasional ingenuity need to find a healthy way back into the workflow. More so early on, but consistently over time as well.

If we just create copies of copies forever, products slowly degrade over time. This is a problem in more than one sphere, to put it lightly.

The main rule is a good one, but the article overfocuses on it.


Yes, this is the counterpoint I'd make to "resist the urge to make every corner of the codebase nicer than the rest of it": in an inconsistent codebase, maybe we should prioritize making it consistent where possible, and reducing unnecessary duplication is one way to lower the cost of future changes.



