We finally realise there is no such thing as a “reserve currency” in the floating exchange rate era and that the concept is a long-dead hangover from fixed exchange rates.
And that’s definitely going to upset the gold bugs.
(In reality lots of things are held in reserve)
USD is a routing currency that is used because it is cheaper than the mesh alternative. When it stops being cheaper, whoever is then cheapest will get the routing transactions.
> There was a time when you had to know ‘as’, ‘ld’ and maybe even ‘ar’ to get an executable.
No, there wasn't: you could just run the shell script, or (a bit later) the makefile. But there were benefits to knowing as, ld and ar, and there still are today.
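If you've never seen it done by hand, the pipeline is short (a minimal sketch, assuming x86-64 Linux with GNU binutils; file names are made up):

    cc -S hello.c -o hello.s    # compile C down to assembly
    as hello.s -o hello.o       # assemble into an object file
    ar rcs libhello.a hello.o   # optionally archive it into a static library
    cc hello.o -o hello         # let the driver invoke ld with the right startup files

Invoking ld directly works too, but then you supply crt1.o, crti.o, -lc and the dynamic linker path yourself, which is exactly the sort of detail the shell script or makefile hid.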
> But there were benefits to knowing as, ld and ar, and there still are today.
This is trivially true. The binding constraint on anything you do in your life is the time it takes to learn something.
So the far more interesting question is: at what level do you want to solve problems – and is knowledge of as, ld and ar more likely to pay off than anything else you could learn instead?
Knowledge of as, ld, ar, cc, etc. is only needed when setting up (or modifying) your build toolchain, and in practice you can just copy-paste the build script from some other, similar project. Knowledge of these tools has never been needed.
Knowledge of cc has never been needed? What an optimist! You must never have had headers installed in a place where the compiler (or Makefile author) didn’t expect them. Same problems with the libraries. Worse when the routine you needed to link was in a different library (maybe an arch-specific optimized lib).
The library problems you described are nothing that can't be solved using symlinks. A bad solution? Sure, but it works, and doesn't require me to understand cc. (Though when I needed to solve this problem, it only took me about 15 minutes and a man page to learn how to do it. `gcc -v --help` is, however, unhelpful.)
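For anyone hitting the same wall, the symlink hack looks like this (paths are hypothetical, purely for illustration):

    # the package installed things under /opt/foo, but the compiler and
    # Makefile expect them in the default search paths
    ln -s /opt/foo/include/foo.h /usr/local/include/foo.h
    ln -s /opt/foo/lib/libfoo.so /usr/local/lib/libfoo.so

The cc-literate fix is to point the compiler at them instead, e.g. `cc main.c -I/opt/foo/include -L/opt/foo/lib -lfoo`, but the symlinks get you going without learning any flags.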
"A similar project" as in: this isn't the first piece of software ever written, and many previous examples can be found on the computer you're currently using. Skim through them until you find one with a source file structure you like, then ruthlessly cannibalise its build script.
If you don't see a difference between a compiler and a probabilistic token generator, I don't know what to tell you.
And, yes, I'm aware that most compilers are not entirely deterministic either, but LLMs are inherently nondeterministic. And I'm also aware that you can tweak LLMs to be more deterministic, but in practice they're never deployed like that.
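For what it's worth, pinning a hosted model down looks something like this (a sketch against the OpenAI chat completions API; the model name is an assumption, and determinism is only best-effort even then):

    curl https://api.openai.com/v1/chat/completions \
      -H "Authorization: Bearer $OPENAI_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{
            "model": "gpt-4o",
            "temperature": 0,
            "seed": 42,
            "messages": [{"role": "user", "content": "Write a C hello world"}]
          }'

Temperature 0 plus a fixed seed makes runs more repeatable, but the API explicitly promises only best-effort determinism, which is rather the point.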
Besides, creating software via natural language is an entirely different exercise from using a structured language purpose-built for that.
We're talking about two entirely different ways of creating software, and any comparison between them is completely absurd.
They can function kind-of-the-same in the sense that both translate something written in a higher-level language into a lower-level one.
They're 100% different in every other way, but for coding, in some circumstances, if we treat the LLM as a black box, it can turn higher-level pseudocode into lower-level code (inaccurately), or even transpile.
Kind of like how email and the postal service can be kind of the same if you look at it from a certain angle.
> Kind of like how email and the postal service can be kind of the same if you look at it from a certain angle.
But they're not the same at all, except somewhat by their end result, in that they are both ways of transmitting information. That similarity is so vague that comparing them doesn't make sense for any practical purpose. You might as well compare them to smoke signals at that point.
It's the same with LLMs and programming. They're both ways of producing software, but the process of doing that and even the end result is completely different. This entire argument that LLMs are just another level of abstraction is absurd. Low-Code/No-Code tools, traditional code generators, meta programming, etc., are another level of abstraction on top of programming. LLMs generate code via pattern matching and statistics. It couldn't be more different.
People downvoting your comment are just "engineers" doomed to fail sooner or later.
Meanwhile, 9front users have at least read the plan9 intro and know about nm, 1-9c, 1-9l and the like. Vibe coders will be put in their place sooner or later. It's just a matter of time.
Do you think most people under the age of 30 remember that you can share a single computer between multiple users? When there was a single "home computer" or "PC" in the home, you learned about user accounts and different access rights. Unless you were a user back in those days or have tinkered with admin work since, you wouldn't know this in 2026.
It's not really my contention that the UK or any other nation can or can't afford to do things differently; it's more that this is the constant refrain coming from mainstream politics, along with a multitude of other excuses for relative inaction.
I’m going to disagree with the premise. The value in AI won’t come from providing AI but from using it.
The “knowledge cut-off date” is 12 to 18 months ago for models, which essentially means that copyright has, in some ways, shrunk to that period, since designing around it is now very easy.
Given that most people live on what they produced recently, not 20 years ago, there's an argument that this makes access to knowledge and techniques fairer. Constant new creation is required to earn a markup, and that drives productivity forward.
In other words it’s the copyright/patent argument all over again. And it’s perhaps a debate we need to have again as a service society.
>Remember that a therapist is really a friend you are paying for.
That's an awful, and awfully wrong, definition that's also harmful.
It's also disrespectful and demeaning to both the professionals and people seeking help. You don't need to get a degree in friendship to be someone's friend. And having friends doesn't replace a therapist.
The problem is that the analysis of the alternatives only ever takes into account efficiency and not resilience, which is typical of “rational expectations” belief systems based upon atomised individuals.
However, the real world has politics in it, as we saw during the pandemic, at which point jurisdictions commandeer resources for themselves regardless of whether a “better price” is available elsewhere.
Within a jurisdiction where resources can be directed, you only need one capacity for output. In a market situation you need multiple suppliers, all of them with excess capacity that you have reserved and that cannot be countermanded by other actors (so it needs to be defended with military capacity). Once you cost all that in, you may just find that doing it yourself is more efficient after all, with resilience properly taken into account.
Nature rarely goes for the most efficient solution. When it does, it tends to go the way of the Dodo.
I'm kinda surprised this isn't more popular. I figured we'd go this way eventually: single out the 10x-ers, give them a highly competent crew, and save a lot of money by not having your most expensive code monkey waste time attending meetings, filling out Jira tickets, and giving presentations to the customer. You pay them a shitload of money - shouldn't you get every dollar's worth?
Honestly, at every job I spend an unreasonable amount of time getting up to speed on things that are only tangentially related to my actual work. (No, here we need you to check all the boxes in the Jira ticket, ensure it's linked to a Zephyr ticket, and ensure it's linked to a git PR - we don't care about you adding attachments or comments!)