Multiple nethack ascender here (~50x in 20yrs). I usually play in the traditional November tournament (originally devnull, now tnnt). Never set aside the time to learn any of the other roguelikes mentioned in the article, but wanted to mention that nethack itself is a class of games. Many people have written variants to scratch a particular itch (the article briefly mentions spork, which was one of the very first). Some of those are wildly different from the base game. There is even a separate tournament in June dedicated just to playing as many of the variants as you can (Junethack).
That's one of the reasons I loved Ansible from the moment I saw it. As the OP points out, traditionally machines accumulated ad-hoc changes over a long period of time. Describing the "known good" state and running that "checklist" to make sure the machine is still in that state both documents the configuration and verifies it.
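For anyone who hasn't used it, a minimal sketch of how that plays out day to day (the playbook and inventory names here are hypothetical): `--check` reports how a machine has drifted from the declared state without changing anything, and a plain run converges it.

```
# Dry-run: show drift from the declared "known good" state, change nothing.
# site.yml and inventory.ini are hypothetical names.
ansible-playbook -i inventory.ini site.yml --check --diff
# Then run for real to bring the machine back to the declared state.
ansible-playbook -i inventory.ini site.yml
```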
Same reason we haven't typed "cc" on the command line to call the C compiler on individual files for about 30 years or more.
The last time I typed (well, pasted) "cc" on the command line to call the C compiler on an individual file was 26 hours ago. I wanted to recompile a single-file program I'd just written with debugging information (-g) and it seemed easier to copy, paste, and edit the command line rather than to manually delete the file and reinvoke make with different CFLAGS.
I mean, I've surely compiled orders of magnitude more C files without typing "cc" on the command line over the last week. But it's actually pretty common for me to tweak options like -mcpu, -m32, -pg, -Os, or -std=c11 -pedantic (not to mention diet cc) by running a "cc" command line directly.
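Concretely, it's stuff like this, where re-running a one-liner is faster than touching the Makefile (prog.c is a hypothetical stand-in for whatever I'm poking at):

```
# One-off single-file builds, tweaking flags by hand each time.
cc -g -o prog prog.c                      # rebuild with debug info
cc -Os -std=c11 -pedantic -o prog prog.c  # small, strict build
```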
Similarly, I often run Python or JS code in the REPL or in Jupyter rather than putting it in a file. The rapid feedback sometimes helps me learn things faster. (Other times it's an attractive nuisance.)
But I may be a bit of an odd duck. I've designed my own CPU, on paper. I write assembly code for fun. I've implemented several different programming languages for fun. I like to know what's underneath, behind the surface appearances of things. And that requires experimenting with it.
Of course I cc one-file quickie programs all the time. What I am talking about is a whole directory of source files, and just "knowing" which ones are out of date and building the object files manually.
I still remember years ago trying to convince one dev to use make on a package with 20-30 source files.
Running just cc instead of make is actually a much more reasonable thing to do nowadays than it was 10, 20, or 30 years ago.
https://gitlab.com/kragen/bubbleos/-/blob/master/yeso/admu-s... is the entry point to a terminal emulator I wrote, for example. `make -j 8` can build it with GCC from a `make clean` state in 380ms, but if I, for example, `touch admu-shell.c` after a build and run `make -j 8` to run an incremental build, it recompiles and relinks just that one file, which takes 200–250ms. So the incrementality of the build is saving me 130–180ms in that case.
Without -j, a nonincremental `make admu-shell` takes about 1100ms.
Compiling those 1100 lines of C in one cc invocation takes 900 milliseconds. This is a little bit faster than building from scratch without -j because I'm not compiling the .c files that go into libyeso-xlib.a that admu-shell doesn't use. So all the work of `make` figuring out which ones are out of date and building the object files automatically and in parallel across multiple cores has saved me a grand total of 600–700 milliseconds.
That's something, to be sure; it's a saving† that makes the compilation feel immediate. But it's really pretty minor. 900ms is small enough that it only affects my development experience slightly. If I were to run the build in the background as I was editing, I wouldn't be able to tell if it were incremental or from-scratch.
Unless it screwed up, that is, for example because I didn't bother to set up makedepends, so if I edit a header file or upgrade a system library I might have to do a scratch build anyway. The `make` incremental-build saving doesn't come without a cost, so we have to ask whether that cost is worth the benefit. (In this case I think it's worthwhile to use separate source files and `make` for other reasons: most of that source code is used in multiple Yeso programs, and `make -j` also makes a full build from scratch four or five times faster.)
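For what it's worth, the makedepends gap is cheap to close nowadays with the compiler's own dependency output; a minimal sketch with GCC or Clang, file and variable names hypothetical:

```
# -MMD has the compiler write a .d fragment listing the headers each .o
# actually included; -MP adds phony targets so a deleted header doesn't
# break the build. Including the .d files closes the loop, so editing a
# header recompiles exactly the objects that used it.
CFLAGS += -MMD -MP
SRCS   := $(wildcard *.c)
OBJS   := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) $(LDFLAGS) -o $@ $(OBJS) $(LDLIBS)

-include $(OBJS:.o=.d)
```

With that in place, a header edit triggers the right incremental rebuild instead of a silently stale one.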
If we extrapolate that 700ms saving backward to 25 years ago when our computers ran 500 million instructions per second instead of 30 billion, it's something like 45 seconds, which is enough of a wait to be distracting and maybe make me lose my train of thought. And 5 years further back, it would have taken several minutes. So `make` was an obvious win even for small projects like this at the time, and an absolute necessity for larger ones.
At the time, I was the build engineer on a largish C++ project which in practice took me a week to build, because the build system was kind of broken, and I had to poke at it to fix the problems whenever something got miscompiled. The compiler and linker were writing their output files to an NFS server over shared 10-megabit Ethernet.
As another data point, I just rebuilt the tcl8.6-8.6.13+dfsg Debian package. It took 1m24.514s. Recompiling just generic/tclIO.c (5314 SLOC) takes 1.7 seconds. So not doing a full rebuild of the Tcl library can save you a minute and a half, but 25 years ago (when Tcl 8 already existed) that would have been an hour and a half. If it's the late afternoon, you might as well go home for the day, or swordfight somebody in the hallway or something.
So incremental builds at the time were totally essential. Now they're a dispensable optimization that isn't always worth it.
______
† 1200 lines of C per second is pretty slow, so probably almost all of that is repeatedly lexing the system header files. I'm guessing that if I took the time to do a "unity build" by concatenating all the C files and consolidating the #includes, I could get that time down to basically the same as the incremental build.
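One common way to do that, sketched under the assumption that the program's translation units can tolerate being merged (file names other than admu-shell.c are hypothetical):

```
# unity.c just #includes every .c file in the program, e.g.:
#   #include "admu-shell.c"
#   #include "admu-term.c"
# so the system headers are lexed once instead of once per file.
cc -O2 -o admu-shell unity.c
```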
Never MUDed, but November (and now June) are my nethack months going back about two decades. Actually started playing about fifteen years prior to that (hack on SunOS 4 machines) but didn’t get good for a while.
They turned it around by not being GE anymore. The stock trading as "GE" now is just GE aircraft engines, aka "GE Aerospace". The healthcare stuff went to "GE Healthcare" (GEHC) and power systems went to "GE Vernova" (GEV).
They also sold GE's profitable biotech division to Danaher, where Larry Culp was previously CEO. The proceeds were then used to shore up GE's finances.
GE finance was spun off into Ally bank.
The other part of the turnaround, though, was the pension. A pension fund in a zero-interest-rate environment is a giant liability, and that was a huge part of what dragged down GE's finances for a while. Once interest rates started to go back up, the pension liability shrank by a lot.
Overall, GE is in a much better position now than it was at any point in the past two decades. I don't know why the post ends on such a negative note. GE has been severely humbled, but I think it's well positioned to grow again.
I've invested in GE twice, both during its downturns. This time I'm holding onto it because I think it's structurally in a better place than it was before.
GE Finance was spun off into Synchrony Bank. Ally Bank is the former General Motors Acceptance Corp (GMAC), the finance arm of GM before its financial-crisis bankruptcy.
Thanks for the history! Been driving past that plant for almost 40 years going back and forth across the state.
I was surprised when the article opened with what it said was the world's first transistor radio, because I always thought that was the TR-1. It looks like GE made a prototype but never took it to market.
I'm in the same boat. Been living here roughly 30 years, but this article was the first I'd heard of this fascinating bit of history! Glad the parent comment mentioned it!
Recently read The Man Who Broke Capitalism by David Gelles, which is an excellent review of the Welch years, how he worked, and how he sent his minions out across the corporate world to wreak havoc. Wonderful read, and it really provides perspective on why modern corporate America is what it is.
Lights Out by Gryta and Mann follows up by focusing on the Immelt years and how he tried to keep the ball rolling despite the hole that Jack left him in. Also an excellent read.
Sounds like Power Failure covers a lot of the same material as these two earlier books, perhaps with more historical material.
As an engineer I had always just blamed nebulous "MBAs" as the reason we can't have nice things, but the Gelles book makes a very strong case that it was Welch specifically who changed the rules of engagement to what they are now. It follows the thread through his career, showing the weaknesses he exploited, the benefits he gained, and how he spawned the modern cult of the Imperial CEO.
Many people have tried to run the same playbook with some success, but Welch had unique resources that let him continue the game far longer than anyone else.
As an aside, Gelles's publisher made him add a hopeful last chapter on the social-benefit corporation movement and the few corps that were able to resist Welchism, just so the book wouldn't leave the reader feeling hopeless!
> Recently read The Man Who Broke Capitalism by David Gelles, which is an excellent review of the Welch years, how he worked, and how he sent his minions out across the corporate world to wreak havoc.
The parts about Boeing in that book are... rough. Not rough as in "poorly-written" but rough as in "holy hell is that ever a brutal way to ruin a good company". Excellent book but lol it's not a feel-good read :)
Maybe a decade ago I read critics who were concerned that a shift to an aggressively globalized supply chain was certain to wreak havoc on Boeing's quality control.
E.g., safety-critical nuts and bolts used to be produced down the street; now you get a few nuts from, say, Thailand and a few bolts from Malaysia… the critics complained this was certain to lead to problems.
Was that a part of what you read about in that book?
Not significantly, no; it was much more focused on the McDonnell-Douglas reverse acquisition. To summarize: McDonnell-Douglas was failing and bought Boeing with Boeing's own stock (technically Boeing bought McDonnell-Douglas with Boeing stock, but in practice McDonnell management assumed control). MD's executives were Jack Welch protégés and did the same thing to Boeing that happened to GE.
The part where that story always stays silent is that Boeing's then-CEO was apparently a big fan of Welch-ism and oversaw major changes that caused long-term issues
... while new people (albeit not execs) from McDonnell-Douglas were publishing internal memos about how MD's experience showed why the actions taken by Boeing's (not MD's!) CEO would cause problems.
Been doing engineering software since the late '80s, and this article hit me in the feels. I always wondered what the obsession with Harvard Graphics was in the early versions of PowerPoint. Now I understand.
Never had to ship software on 9-track tape, but I remember receiving the source distro for C++ v1.0 from AT&T on one (cfront, no MI, etc.).
Did ship plenty of software on QIC tape though, and man what a PITA. After much experience, we ended up retensioning every tape before writing. Sending releases to 100+ customers generated a Borg-cube of tapes that had to go into individual boxes for shipping, along with the rainbow of other tape flavors like TK50s and the various 4mm and 8mm tapes.
Documentation was a big deal, because once it was printed, you had a Borg-cube of shrink-wrapped paper and binders that were not going to change until the next release. I still miss proper documentation. Endless web pages are a lot more difficult to sit down and read start to finish.
This article helped me realize that I was shaped by this in the same way that many people's grandparents were shaped by growing up in the Great Depression.
Alas, thought the original article was going to have some juicy details about the vector hardware. But since it was a survey of games inspired by the original, I feel compelled to mention "Maelstrom" by Ambrosia.
For my money, this was the absolute best take on Asteroids since the original. Originally Mac, but the source was later released and was ported to PC. We even had a tweaked version that we called "Carnage" that generated many storms of presents, comets, spiky balls, etc.
I have restored a couple of the G05-801 monitors for these. They are much more complicated to repair than standard raster monitors. There are multiple failure paths that end with the vector beam parked in the middle of the screen, incinerating the phosphor.