Hacker News

I think that looking at it from a MHz standpoint is a red herring. How many things can you do per watt now, versus what you could do per watt then?


Better to ask: how many things could you do per watt now, if software had that era's leanness?


I occasionally fire up Windows 95 in a VM and marvel at how fast it is. Sometimes I use it for actual tasks, although it is hard to find things Windows 95 is useful for nowadays. My copy has Office 97 installed, and I use that for some spreadsheety tasks.

I really like being able to click anything and know I won't see a loading spinner.


Gosh, Windows 95 was great, especially the ability to navigate the OS from the keyboard alone. At one point I knew W95 so well I could literally navigate the OS by feel, without a screen.

An acquaintance at one point changed all the graphics settings to black, so every screen, menu, and bit of text was black and you couldn't see anything. I was able to help her by using just the keyboard to navigate to settings and restore the defaults. I did it from memory and key-clicks.

Also, it was a fun OS for fucking with people in the nascent realm of viruses of the time: setting the desktop background to a screenshot of the desktop and hiding whatever was actually on it so nobody could click on anything... remote-access BSODs, etc.

I'm on an HP flagship gaming laptop now, and it consumes probably 1,000 times more power (watts) and compute just to display this single text entry form on Hacker News than any W95 machine did back in the day...

In ~1997 or so I had to shut down a branch office in San Diego, and when I did, we had a number of plastic-sealed boxes of brand-new W95 on 3.5" floppies... I kept them for over a decade and then sold them on eBay as collector's items for $75 each, and wrote a tale about how they were computing history.

But I had a PDA in 1993, made by CASIO, which had a little spreadsheet app on it. I made a Gematria translator in it, so I could type in any word and its formulae would spit out all the Gematria numbers for a name (The Celestine Prophecy was a famous book of the time).

That PDA was similar to this one, but less sophisticated, and I had that thing for ~25 years... but unlike OP I didn't take out the battery, and it ruined the device.
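The original spreadsheet formulae are long gone, but the idea is easy to sketch. Assuming the common "simple" English gematria scheme (A=1 through Z=26; whether the CASIO app used this exact mapping is a guess):

```python
def simple_gematria(word: str) -> int:
    """Sum letter values with A=1 ... Z=26 (ASCII letters assumed),
    ignoring anything that isn't a letter."""
    return sum(ord(c) - ord('a') + 1 for c in word.lower() if c.isalpha())

print(simple_gematria("Celestine"))  # → 92
```

Other gematria schemes (Hebrew, "Jewish" valuations, etc.) just swap in a different letter-to-number table.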


As fond as my memories are of the time I used Windows 95, and of how much I learned from using it as a pre-teen, I mostly remember my maintenance schedule of reinstalling the entire OS about once a month when it broke. I guess that jumpstarted my lifelong career/hobby of computer fuckery.

My first PDA was a Casio too, I don't remember which exact model but it was a Pocket Viewer and it was great. https://en.wikipedia.org/wiki/Pocket_Viewer

My "true" PDA was an Audiovox Maestro I got used off of eBay. I even got a compact flash modem adapter so I could dial-up using it. I was shocked to see almost 10 years later when I got my first smart phone, a T-Mobile Dash 3G, that the OS had barely changed since the Audiovox Maestro back then!

Not sure there's a point to any of this except my own anecdotal trip down memory lane :)

Edit: autocorrect fail


"Memories... they're basically the only thing you have to think back on"

-- Steven Wright

---

I love "anecdotal trips down memory lane" -- because its stunning how much I have done and known in my life that I forget.

I truly regret not listening to my mom's advice to journal.


Back when phones had physical keypad buttons and T9 text, I could send a message without taking the phone out of my pocket. It looked suspicious though


Well, UX is a spectrum. You can do things instantly on a very old machine if you don't mind using a CLI. If what you want is a game, you can't get Cyberpunk, but you can get Pacman.


Imagine the speed you could play Pacman on a modern CPU! You could probably play a whole level in nanoseconds.


Oh, I often do. It increases productivity by orders of magnitude.


Oh, but what if you had a text based implementation of cyberpunk? Where your imagination is the GPU (the best one there is)


Alas, my imagination is also text-based (aphantasia).


You mean like A Mind Forever Voyaging?


Apparently 30 days of things, per the article.


Most computers are inefficient per watt due to waiting on IO from the human.


There's always something running in the background (usually dozens of system services), so that's not a problem - plus they scale their power consumption with load.

Plus, the waiting on IO is when they're NOT doing something. When they are doing something (playing video, listing files and rendering thumbnails, processing video in an editing app, rendering the next frame in a game, playing a synth part in a music app, applying an effect to a photo, recalculating an Excel sheet after a change, and so on) is when you want them to be fast.


And power consumption is, among other things, a function of transistor size and process. A lot has changed in 30 years, but our methods of programming these things have veered towards incredible wastefulness.


But demand for software has also grown a lot in that period. If we trained software engineers and demanded quality to the same standards, we would not be able to meet demand.

Those standards were also set by the limitations of hardware, not because people had the will to do better back then.


To put it another way: how much longer would that same Psion's battery last if the CPU/board were made with modern transistor sizes?


> I think that looking at it from a MHz standpoint is a red herring. How many things can you do per watt now, than you could per watt then?

Obviously a lot more. The OP said the thing could run on two AAs for a month.

Nowadays, almost nothing can last a month on one charge/set of batteries. The only thing I can think of that might pass is a Kindle, due to the trick where it's literally shut down almost all the time.
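As a sanity check on what "a month on two AAs" implies, here's a back-of-the-envelope budget (assuming two alkaline AA cells at roughly 1.5 V and 2.5 Ah each; real capacity varies a lot with load, so these are illustrative figures):

```python
# Two alkaline AA cells in series: ~1.5 V each, ~2.5 Ah apiece (rough figure).
cells = 2
capacity_ah = 2.5
voltage_v = 1.5
energy_wh = cells * capacity_ah * voltage_v   # ~7.5 Wh total

hours_in_month = 30 * 24                      # 720 h
avg_draw_w = energy_wh / hours_in_month
print(f"{avg_draw_w * 1000:.1f} mW average draw")  # ~10 mW
```

A sustained ~10 mW average is a tiny power budget by modern standards; a laptop idles at several watts, hundreds of times more.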


Ultra low power processors exist:

"GreenArrays is shipping its 144-core asynchronous chip that needs little energy (7 pJ/inst). Idle cores use no power (100 nW). Active ones (4 mW) run fast (666 Mips), then wait for communication (idle).

Tight coding to minimize instructions executed will minimize power. The programmer can also reduce instruction fetches, transistor switching and duty cycle.

Chuck Moore

GreenArrays, Inc."

Source: https://youtu.be/0PclgBd6_Zs

They just are not widely used because ARM and similar processors offer much more computational power and people are happy charging their devices every day or so.
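The quoted figures are roughly self-consistent, as a quick check shows: energy per instruction times instruction rate gives the active power.

```python
energy_per_inst_j = 7e-12       # 7 pJ per instruction, per the quote
inst_per_sec = 666e6            # 666 Mips, per the quote
active_power_w = energy_per_inst_j * inst_per_sec
print(f"{active_power_w * 1000:.2f} mW")  # ~4.66 mW, close to the quoted 4 mW
```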


> They just are not widely used because ARM and similar processors offer much more computational power and people are happy charging their devices every day or so.

This is exactly my point. Ultra low power processors exist, but they're not used for consumer electronics. Developers would rather build something bloated, quickly, than take the time to optimize. And technological advancements have taken away a lot of the pressure to optimize (e.g. I'm sure it was a super high priority to get a Psion to sip power, because a "recharge" meant going to the store and paying $4 for a set of new batteries).

If I were a dictator that ruled with an iron fist, I'd mandate that all software be developed on underpowered devices, then released on fast ones.


> If I were a dictator that ruled with an iron fist, I'd mandate that all software be developed on underpowered devices, then released on fast ones.

Agreed! I wish there were widespread ways of throttling the CPU and memory given to desktop applications today. When I'm testing a web application, I tell Firefox to throttle my network down to GPRS and check the responsiveness (or lack thereof); once the work is done and it's reasonably fast on GPRS (or whatever), I can give a quick glance at normal 4G speeds and watch my web app scream.

So why can't I lock a desktop application to settings like "an unused 386 PC with 24MB of RAM"?
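On Linux you can approximate at least the memory half of this today. Below is a minimal sketch using the standard library's resource module, assuming a platform where RLIMIT_AS is enforced (Linux is; some others aren't); the 64 MB cap and the run_capped helper are illustrative choices, not an existing tool:

```python
import resource

def run_capped(alloc_bytes: int, cap_bytes: int) -> bool:
    """Cap this process's address space, then attempt one big allocation.

    Returns True if the allocation was refused (MemoryError), i.e. the
    cap made the process behave like cramped old hardware.
    """
    _, hard = resource.getrlimit(resource.RLIMIT_AS)
    # Lower only the soft limit; further mmaps past the cap will fail.
    resource.setrlimit(resource.RLIMIT_AS, (cap_bytes, hard))
    try:
        buf = bytearray(alloc_bytes)   # one big allocation
        return False
    except MemoryError:
        return True

# Pretend we're on a machine with ~64 MB of address space and ask for 512 MB.
print(run_capped(512 * 1024 * 1024, 64 * 1024 * 1024))
```

For the CPU half there's no equally portable knob, but cgroup-based wrappers (e.g. systemd's CPUQuota property) can cap a process's CPU share in a similar spirit.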


> If I were a dictator that ruled with an iron fist, I'd mandate that all software be developed[^W*] on underpowered devices, then released on fast ones.

*"tested"/"run during development exclusively"

You still want the text editor and more importantly compiler to run on beefy workstation hardware, in order to avoid programmer productivity problems like long compilation times[0] and to take advantage of CPU-intensive optimization techniques.

0: https://www.explainxkcd.com/wiki/index.php/303


I have a ton of devices that would disagree with you. If you run the Psion software on modern hardware, it'll run for two months.


Of course not, because no device can run at super low wattage with such a hypothetical OS.


It's not a hypothetical OS, it's what the Psion runs, and why not?


I did not know that the Psion OS is available for modern SOCs.


It isn't.

The Psion 3, 3A, 3C and 3MX ran EPOC, a proprietary 8086 OS with preemptive multitasking and a full keyboard-only GUI.

But their successors, the Psion 5, 5MX and netBook, ran a rewrite called EPOC32. That was written in C++, is native to ARM, and ran well in 4MB and very well in 8MB: full preemptive multitasking, a touchscreen GUI, IPv4 networking, FAT32 support and more.

That was later rebranded as Symbian and became the first mass-market smartphone OS. It was the only smartphone OS that could run the GSM comms stack on the same CPU as the user-facing GUI OS -- its realtime support was that good. iOS, Android and WinCE all need a separate CPU with its own RTOS for that.

Symbian is now 100% FOSS.

https://github.com/SymbianSource

It's a crime and a tragedy that nobody's picked it up and ported it to any modern SOC. It is much richer and more complete than any other modern C++ OS such as Genode or Serenity OS.


That's pointless since you need to recharge your device all the time. It's not progress if you have a lighter that needs gas every 2 minutes to work.


Except the lighter nowadays is a flamethrower. You can get a lighter that needs gas every month, or a flamethrower that needs gas every two minutes, but complaining that the flamethrower needs much more gas than the lighter is silly.


The problem is there are only flamethrowers now and no lighters.


I'm fairly sure you can still buy dumb phones that last a week+ though.


I think the argument is that we mostly want to accomplish the same sorts of things, but now we have to use a flamethrower.


But we don't, even the fact that we browse the Web and watch videos would have been impossible on a Nokia 3210.


We have been doing all of this since the 90s at least on desktop computers. Smartphones today are far more powerful than desktop computers from that era.


> doing all of this since the 90s at least on desktop computers

It wasn't a very nice experience. Why would I want to watch anything in 240p instead of 4K? Same for the web: despite the bloat, UX on most websites is miles ahead of what was common back in ~2000.


Probably the memory is more relevant. 1 or 2 MB!



