vaxman's comments | Hacker News

The downside of drawing the interest of Brewsters (https://youtu.be/fwYy8R87JMA) in Linux.

v259? [cue https://youtu.be/lHomCiPFknY]


But Linux doesn't run on a PET-2001

Want me to show you on the doll where he hurt me?


Where was this when I was integrating an rPi0-2W into what was once an Avegant Glyph? [project inspired by https://youtu.be/RQeOm3CuUFY]


- geerlingguy says no HDMI chip (sigh), ah well... where's my Subpac?


Computer/Network gear that is designed, manufactured, sold and distributed by Chinese companies should be assumed to come with MSS-infected microcode, SecureBoot ([U]EFI/BIOS), firmware and/or operating systems that are vulnerable in ways that offer “plausible deniability” as to the intent of the manufacturer. In some cases, the infected payload will be transmitted to the customer environment through updates supplied by Chinese websites, again in ways that allow for “plausible deniability” as to their intent. Hacking Fortune 100 companies was one thing (because, shame on the Fortune 100 companies), but the recent move by China to hack Anthropic and use Claude as a cyber weapon is unleashing bi-partisan congressional support for limits on China’s market access. Many more Chinese tech products, distributors and services will be removed from our markets, but the wheels are turning slowly and in the meantime, "Does anyone have any questions?" [https://youtu.be/_5yJZUyr_cM]

PS: The entertaining hacks (aka “influencers”) on YouTube, Reddit and in the blogosphere in general are aware of these risks but recommend potentially dangerous Chinese systems and networking products anyway. In many cases, they will falsely portray the items as already being popular and The Way that everyone is doing IT. While their defense might be “don’t blame the player, blame the game”, people were shot at and died to create the freedom that they are now putting others at risk of losing with such reckless recommendations. It’s not just hardware either. These same video grifters will cheerfully recommend remote-access software (and KVM) products without even including the most basic of warnings. Should these irresponsible influencers be “de-platformed” by a faceless “Trust and Safety” team that’s somehow expected to come up to speed on the complex risks of every product that is otherwise (for the moment) still being legally distributed? No! In the 250-year history of the United States and for a century or two prior in Europe, there have been entire classes of crackpots roaming the countryside peddling everything from hair-regrowth medicine and cult religions to life-extending hot springs (my favorite was the story of “Zzyzx”, which seemed to hit on all three). The people who make these infomercials while intentionally leaving out any account of the danger their viewers are placing themselves in by following such “advice” will ultimately feel the same wrath and suffer the common outcome associated with such behavior: being locked out of the economy and deprived of their assets. <cue https://youtu.be/BD2kWCfTcaU>


> Computer/Network gear that is designed, manufactured, sold and distributed by Chinese companies should be assumed to come with MSS-infected microcode, SecureBoot ([U]EFI/BIOS), firmware and/or operating systems that are vulnerable in ways that offer “plausible deniability” as to the intent of the manufacturer. In some cases, the infected payload will be transmitted to the customer environment through updates supplied by Chinese websites

Better buy Cisco equipment. They are known for their security. And Apple (they fixed the last iMessage exploit), and Microsoft (we, at Microsoft, take security very seriously)...

If the Chinese are so bad, why are all American products made in China?


OpenAI will go to zero unless it agrees to be acquired, because they're messing with public-company stock valuations using funky purchase orders, leaving those public companies no choice but to cancel their credit (at least unless they get a "government backstop" that they say they don't want or need). Those who compete with OpenAI will also "take a hit" if/when that happens, so they would be wise to look at making a deal to acquire OpenAI. Dude was from Y Combinator and liked to bank on hope, focusing on capturing market share and worrying about profits later, which is fine in software startups playing with Monopoly money, but when it impacts vendors that are publicly traded companies (to the point that one is now valued at $5T), post-1929 rules come into play. Anthropic has a similar issue, except there it's that their C-suite is making outrageous public statements suspected of being intended to manipulate the stock values of both private and public competitors and of the publicly held vendors to all of these players. I hope they both go away quietly and someone declares victory rather than the stock market crashing!

As far as xAI goes, I doubt it will go to zero or run afoul of any of those market-manipulation issues, because it owns Twitter/X and I think it powers the realtime Tesla cloud, but betting on it is fraught with peril because of the high likelihood that it will wind up under the control of some less capable conglomerate (cf. GM's acquisition of Hughes Aircraft and its resale to Raytheon, Boeing and News/DirecTV).

Google, Meta, a handful of B actors and China are where we have to place our bets, but only if we ourselves need (or want to invest on the theory that others need) trillion parameter models (and want to risk having the valuations lowered if/when adverse actions are taken against the above competitors).


*-"eventually" leaving those public companies no choice but to...

Clarifying, because there's no way a company (public or private) is going to reduce the credit line of a major customer until it's obvious that the orders "aren't real." But if Wall Street realizes it before they do, they can lose control of their business too. This is not quite Enron or WorldCom/MFS, but it's a very similar storm on the horizon. (BTW, ever wonder why Sprint never could remain airborne and eventually was merged with TeenMobile? It's because they overspent on CapEx trying to keep up with the fraud at WorldCom and could never dig out to actually use all that spectrum. Likewise, we are still dealing with the fallout of the Enron collapse on the US domestic energy grid a quarter century later.)


TFA had me at "Per ChatGPT:" ROFL

FORTH has been used to control deep space telescopes and planetariums --it's a great way to expose instrumentation for high-level access, but that access might not be human anymore; the same instrumentation could just as easily be exposed to AI agents.

For example, a FORTH page could contain a prompt "Animate a galloping black and white zebra on the attached 64x32 RGB LED matrix, cache the code to do this on a FORTH page called ZEBRASHOW6432." Another could prompt "Run ZEBRASHOW6432 at the top of the hour during weekdays in the Pacific Time Zone." This might involve lifting C drivers from Arduino and CircuitPython for the 64x32 RGB LED matrix driver into FORTH (trivial) and building an MCP-to-FORTH server, but endless fun...
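A rough sketch of the serial plumbing such an MCP-to-FORTH bridge might need (Python with pyserial; the port, the baud rate and the FORTH word body are all just assumptions for illustration, and the actual MCP tool layer is omitted):

    import serial  # pyserial

    # Assumed: a board running a serial FORTH listening on this port.
    PORT, BAUD = "/dev/ttyACM0", 115200

    def forth_eval(lines, timeout=2.0):
        # Send FORTH source line by line and collect the echoed replies.
        replies = []
        with serial.Serial(PORT, BAUD, timeout=timeout) as link:
            for line in lines:
                link.write((line + "\r").encode("ascii"))
                replies.append(link.readline().decode("ascii", "replace").strip())
        return replies

    # An MCP tool handler would wrap something like this, letting the agent
    # define a word on a FORTH page and invoke it later.
    forth_eval([
        ": ZEBRASHOW6432 ( -- ) ... ;",   # body would be generated by the agent
        "ZEBRASHOW6432",
    ])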

So I guess it's best to stop reading about FORTH and grab an rPi0-2W and attach desired indicators, sensors, actuators and transceivers to see for yourself.


> i used a chip8 emu

I used CHIP8 (on an RCA COSMAC VIP out "in the garage" bought with money from mowing lawns), would never have put it on my resume ROFL. Big Bad FORTRAN programmers would have looked at it and said "awe...ain't that cute." I did have 1802 assembler on there though. By 1992, at least one interviewer suspected I might be an "Easter Bunny" (not real) before finally meeting me. That's when I started scaling it back a bit (dropping a dozen assemblers, old languages like APL, etc.) I kept scaling it back for decades until eventually it fit on one page, but the people reading it didn't know what they didn't know (citizens of the dystopia) so my resume is only a formality anyway. (I probably should reduce it down to two or three sentences at this point. But back in the day man, I was almost typing (with an actual typewriter, also "out in the garage") in the margins --heh.) Hmm, valid question: How long before LinkedIn is an AI that conducts the entire search and hiring process? Also, WTH are we reading about CHIP8 in 2025 for...it's a dopeass dystopia, that's why. :D


A chip-8 interpreter is a common toy project for people interested in emulation.

I don't think people are building it to show off to employers as a portfolio project.

It's just a fun little weekend project. I find it's also a decent way to pick up the syntax for a new language.
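For a sense of scale, the core fetch/decode loop really is tiny. A rough Python sketch (only two opcodes handled; display, timers and input are hand-waved):

    # Minimal CHIP-8 fetch/decode sketch, not a full emulator.
    memory = bytearray(4096)        # a real setup loads the ROM at 0x200
    V = [0] * 16                    # registers V0..VF
    pc = 0x200

    def step():
        global pc
        op = (memory[pc] << 8) | memory[pc + 1]   # opcodes are 2 bytes, big-endian
        pc += 2
        x, nn = (op >> 8) & 0xF, op & 0xFF
        if op & 0xF000 == 0x6000:       # 6XNN: VX = NN
            V[x] = nn
        elif op & 0xF000 == 0x7000:     # 7XNN: VX += NN (no carry flag)
            V[x] = (V[x] + nn) & 0xFF
        # ...the other ~33 opcodes, the 64x32 display, timers and keypad go here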


Then what was the comment I quoted?


If it isn't on the Internet, it didn't happen, right? Maybe we can change that...

Unix _is_ a play on "eunuchs" but that fact wouldn't have sold well during the mini computer [ https://www.britannica.com/technology/minicomputer ] wars, especially in the late 1980s [ https://youtu.be/IRpKHFfsH3A ] when Unix was exiting the exclusive world of academia and Bell Labs. This was an era when everyone for the prior thirty years had come up on mainframes, and data centers were stocked full of "heavy iron" (IBM and IBM-clones like Amdahl) or at least very large "mini" computers from companies like DEC, which was so well run from the late 1950s on that its leader has been declared one of the greatest in the history of Corporate America and was studied at Harvard and Wharton for decades. The Unix technology "specialists", on the other hand, were super-nerds: ghastly, feral, mostly pear-shaped, plainly clothed technicians only BARELY tolerated in their own settings. [Ok, maybe all of them except Eric Schmidt heh.] Realize that the vast majority of American engineers in the 1980s still wore suits and ties --but not if they had anything to do with Unix. (Ken Olsen fer sure wore suits, but also drove a Ford Pinto, BTW.) When the demo dollies tasked with pushing any number of alternate hardware platforms up against IBM and DEC in the constant battle for those massive "heavy iron" budgets were asked to pitch UNIX (System III, System V, Version 7, BSD) up against bedrock OS/MVS and VMS, at first they would answer the obvious question (of what UNIX stood for) with "UNIX is not UNIX". That pretty much stuck in the period literature (COMPUTER MAGAZINE RAGS) too --no way they were going to answer "eunuchs"!

Also worth noting in this context: This was the era of "Nobody Ever Got Fired For Buying IBM" and the amount of money your company spent on "iron" was seen as a marker of its success AND YOUR PERSONAL CAREER STATUS in the tech universe, so you can imagine the type of customers and professionals that actually did buy into obscure UNIX-based hardware. This also created a lot of "friction" in the Industry that you can't easily learn about in this Future. It wasn't like today, where people have "home labs" and can train themselves to go for whatever job they want using free software (even while sitting in a hellish ghetto of the poorest country on Earth). Back in the day, one was trained on what their school or employer had available (or they learned from carrying around books and imagination, or using X.25-based timesharing if they were lucky). Period. So maybe you landed a great job, but you had to use a shitty Unix computer with broken down terminals; or maybe you had a shitty job but they gave you a coveted VAXstation. All your experience with Unix wouldn't buy you much in a DEC or IBM shop and vice versa. The implications this had for the layered applications of the day were profound, but mostly this created a lot of animosity between tech professionals of different backgrounds. There were constant attempts to address this, but the computer hardware manufacturers were complicit in it because it made it easier to lock their customers into one architecture or another.

<cue https://youtu.be/ciUfdVs-p84 >

Is it safe to now say that all general purpose operating systems except LINUX are nothing but husks to run LINUX (and whatever legacy ecosystem)? The most successful of all the 1980s demo dollies, The Scott McNealy, took a page out of DEC's playbook and, instead of trying to go in with a massive super powerful Unix mini computer, he would pitch a few workstations running something called "SunOS" (BSD eunuchs) that "networked" over TCP/IP to effect "the network is the computer" (a totally new concept then) before his company bet everything on a new chip (SPARC) that used RISC architecture to outperform the established industry players and make the guys in charge of those "heavy iron" budgets feel a bit inferior if they didn't buy in a little. SunOS, SPARC and Solaris definitely caused a lot of disruption, but they never really had much of a chance to unseat IBM or DEC and were also slowly sinking into the La Brea tar pit along with everything else (though they had a bit more life due to all the CapEx as the dot-com bubble was inflating around TCP/IP).

IBM had already totally lost control of its maverick PC initiative (by underestimating Billy The Kid, who had also hired away DEC's top VMS engineer), and the ENTIRE market for mini computers (whether they ran OS/MVS, VMS or eunuchs like SunOS or NeXTSTEP) totally collapsed. Just the promise that a PC might be as powerful as VMS and could network as well as SunOS was sufficient to change perception and bet corporate budgets on a "computer, not a terminal, for every desk." More importantly, the resulting PC industry economies of scale meant that all of the tech workers could own a "home lab" and, in particular, allowed at least one kid growing up just outside the Soviet Union to go through the pages of Andy Tanenbaum's famous book on operating systems (that demonstrated key concepts for the reader through the creation of a eunuchs operating system Andy called MINIX). Combined with the political antics of a creepy academic communist at MIT and an irresponsible Defense backbone ISP in San Diego, the slow death of all operating systems has manifested (because LINUX ELF binaries and runtime support are now available on IBM mainframes, Windows and as of last June, macOS). Of course, there are still legacy shops, embedded systems and most of the new ELF-running operating systems still run LINUX in nested virtualization, but LINUX has pretty much taken over the game and eunuchs is el muerto.

Meanwhile, even the AIs incorrectly think that UNIX is "a playful reference on UNICS, the larger, more complex Multics 'project'". Sounds totally plausible, like everything else coming out of an LLM, but we meat bags know better.

--- 'Now there were these places called cities and they had the knowin' of a lot of things, they did. They had skyscrapers, videos and sonic.. Then this thing called the Pockey Clips happened and you have to understand, this is Home and there's no Tomorrow Land.' https://youtu.be/rn4aIinTJBQ


> (because LINUX ELF binaries and runtime support are now available on IBM mainframes, Windows and as of last June, macOS)

I was not aware of Linux binary support for macOS; can someone link to that?

EDIT: it seems to be this: https://www.infoq.com/news/2025/06/apple-container-linux/


This was an incredibly interesting read and, combined with the other poster's response above, completely answered my question. Thank you so much for typing this out. What a completely different world this must have been. (I started my career in 2007.)


This was obsoleted by UWB and Apple’s Nearby Interaction API.

https://www.qorvo.com/innovation/ultra-wideband/products/uwb...

