Cyberspace promised us we could all work together to create things, like one species coming together to solve problems. Now, in 2026, we need a “space” for every little tribe…
Exactly. There never was a declaration of independence of cyberspace. BUT government and law moved too slowly, by years and years. And they have, of course, not learned their lesson.
For example: suing Napster 2 years after it launched. And that was only because it was an extremely clear-cut case. By the time they did, there were 10 such networks, none of which were sued, and none of which were covered by clear laws or court decisions stating one way or the other whether they were legal.
And when we're talking about a genuinely vague issue, for example how copyright affects search engines, the first actually settled case (which was still a far cry from establishing the rules) happened in 2006, 16 years after the first search engine started operating and over 8 years after Google started its meteoric rise. The specific decision the courts deigned to make, after 16 years? That caching a page so it can be used to build a search index does not by itself violate copyright. Great, well, that covers it then. My point is, by then the cat was out of the bag, ran to the neighbor's house, got 6 kittens, who each got 6 kittens themselves, and one of its grandchildren ate the sandwich the judge was hoping to have for lunch, another kitten got adopted by the president of the US, and the rest invaded and destroyed the houses of publishers that tried to protect their copyright.
Imagine the insanity, the damage, that any real court decision against search engines would do today. "No, you can't show previews." "Ads can't use trademarks." There is no room for any such decisions now. The few decisions they have made (in >30 years) have amplified the damage to the very victims the court system was trying to help (just ask a few newspapers).
Of course, none of this has instilled any sense of reasonableness, modesty or urgency in any parliament, court or even executive around the globe. For instance, they could PRE-clarify the laws before AI takes over 5 industries. Does AI training violate copyright? What are the rights of an employee who gets fired because AI does their job? No government felt the need to answer the copyright question when it mattered, 7 years ago, and there is ZERO action on the second question. Are they planning to answer the displacement question once 99% of companies have already done it because competition forced them to?
Now any answer they give on the copyright front is beside the point, since no court or parliament actually has the power to order existing (potentially law-violating) models to be destroyed. Once again, they have placed themselves in a position where they are totally irrelevant. Now, one might say the question of the moment is whether you violate copyright by training a model on the output of a model that was itself trained in violation of copyright. Perhaps that one is still relevant. But nothing will be done.
And please, it doesn't matter what your position is on the issue. Can model training violate copyright? Yes or no? We live in a democracy, and no decision gets made. This is an important part of why big companies get to openly violate laws on an unprecedented scale, for billions and billions, without consequences, while kids sometimes get locked up for stealing a single candy.
That assumes LLMs are relevant and will be around a year from now. Let’s not forget NFTs.
Your comment is also blind to the absurd number of research efforts and projects that are born here but later move away in search of funding.
So the EU is not irrelevant, on the contrary, we’re just mourning the fall of the US and transitioning to an independent future. Who would’ve thought we’d end up needing to build a copy of everything…
I used it as another “there was a strong tech push but ultimately we couldn’t make it work” kind of idea. With NFTs the grift was immediately visible, with LLMs it’s a bit harder, the whole “AI” facade gives people hope - I want to believe and stuff.
Rubio is a mouthpiece for a regime that’s not qualified to discuss Europe, or even his very own US of A. All he meant in his speech is that his government has chosen isolation.
Swift never felt truly open source either. That people can propose evolution points doesn’t change the fact that Apple still holds all the keys and pushes whatever priorities it needs, even when they’re not a good idea (e.g. Concurrency, Swift Testing, etc.).
Also, funnily enough, all the cross-platform work is done by small work groups, some even looking for funding … anyway.
Apple has always been 'transactional' when it comes to OSS - they open source things only when it serves a strategic purpose. They open-sourced Swift only because they needed the community to build an ecosystem around their platform.
Yeah, well, sure, they've done some work around LLVM/Clang, WebKit, CUPS, but it's really not proportional to their size and the influence they still have.
Compare them to Google, with TensorFlow, k8s, Android (nominally), Golang, Chrome, and a long tail of other shit. Or Meta - PyTorch and the Llama model series. Or even Microsoft, which has dramatically reversed course from its "open source is a cancer" era (yeah, they were openly saying that, can you believe it?) to becoming one of the largest contributors on GitHub.
Apple, I've heard, even has the harshest restrictions around this - some teams are just not permitted to contribute to OSS in any way. Obsessively secretive, and at what price? No wonder Apple's software products are horrendously bad - maybe not all the time, but too often. And on their own hardware, too.
I wouldn't mind if Swift dies, I'm glad Objective-C is no longer relevant. In fact, I can't wait for Swift to die sooner.
Sort of an exception that proves the rule. Yes, it's great and was released for free. But that's at least partially not a strategic decision from Apple, just a requirement of the LGPLv2 license[1] under which they originally received it (as KHTML).
And even then, it was Blink and not WebKit that ended up providing better value to the community.
[1] It does bear pointing out that lots of the new work is dual-licensed as 2-clause BSD also. Though no one is really trying to test a BSD-only WebKit derivative, as the resulting "Here's why this is not a derived work of the software's obvious ancestor" argument would be awfully dicey to try to defend. The Ship of Theseus is not a recognized legal principle, and clean rooms have historically been clean for a reason.
>> some teams are just not permitted to contribute to OSS in any way
My understanding is that by default you are not allowed to contribute to open source even if it's your own project. Exceptions are made for teams whose function is to work on those open-source projects, e.g. Swift/LLVM/etc.
I talked to an Apple engineer at a bar years ago and he said they aren’t allowed to work on _anything_, including side projects, without getting approval first. Seemed like a total wtf moment to me.
I have never had a non-wtf moment talking to an Apple software engineer at a bar.
I can recall one explaining to me in the mid-2010s that the next iPhone would be literally impossible to jailbreak in any capacity, with 100% confidence.
I could not understand how someone that capable (he was truly bright) could be that certain. That is pure 90s security arrogance. The only secure computer is one powered off in a vault, and even then I am not convinced.
Multiple exploits were eventually found anyway.
We never exchanged names. That’s the only way to interact with engineers like that and talk in real terms.
No, as far as I know, at Apple this is strict - you cannot contribute to OSS, period. Not from your own equipment, not from your friend's, not even during a vacation. It may cost you your job. Of course, it's not universal for every team, but on the teams where I know a few people, that's what I heard. Some companies just don't give a single fuck about what you want or need, or where your ideals lie.
I suspect it's not just Apple. I have "lost" so many good GitHub friends - incredible artisans and contributors - they got well-paid jobs and then suddenly... not a single green dot on the wall since. That's sad. I hope they're getting paid more than enough.
Every programming job I've ever had, I've been required at certain points to make open source contributions. Granted, that was always "we have an issue with this OSS library/software we use, your task this sprint is to get that fixed".
I won't say never, but it would take an exceedingly large comp plan for me to sign paperwork forbidding me from working on hobby projects. That's pretty Orwellian. I'm not allowed to work on hobby projects on company time, but that seems fair, since I also can't spend work hours on non-programming hobbies.
The fact that Swift is an Apple baby should indeed be considered a red flag. I know there are some Objective-C lovers out there but I think it is an abomination.
Apple is (was?) good at hardware design and UX, but they're pretty bad at producing software.
For what it’s worth, ObjC is not Apple’s brainchild. It just came along for the ride when they chose NEXTSTEP as the basis for Mac OS X.
I haven’t used it in a couple decades, but I do remember it fondly. I also suspect I’d hate it nowadays. Its roots are in a language that seemed revolutionary in the 80s and 90s - Smalltalk - and the melding of it with C also seemed revolutionary at the time. But the very same features that made it great then probably (just speculating - again I haven’t used it in a couple decades) aren’t so great now because a different evolutionary tree leapfrogged ahead of it. So most investment went into developing different solutions to the same problems, and ObjC, like Smalltalk, ends up being a weird anachronism that doesn’t play so nicely with modern tooling.
I've never written whole applications in ObjC but have had to dabble with it as part of Ardour (ardour.org) implementation details for macOS.
I think it's a great language! As long as you can tolerate dynamic dispatch, you really do get the best of C/C++ combined with its run-time manipulable object type system. I have no reason to use it for more code than I have to, but I never grimace if I know I'm going to have to deal with it. Method swizzling is such a neat trick!
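For anyone who hasn't seen it in practice, here's a minimal sketch of what swizzling looks like when driven from Swift through the Objective-C runtime (the Plugin class is a made-up stand-in for code you don't control):

    import Foundation
    import ObjectiveC

    // Hypothetical stand-in for a third-party class whose source we can't edit.
    class Plugin: NSObject {
        @objc dynamic func process() -> String { "original" }
    }

    extension Plugin {
        // After the exchange below, calling swizzled_process() from inside this
        // body dispatches to the *original* implementation, so this wraps it.
        @objc func swizzled_process() -> String { "wrapped(" + swizzled_process() + ")" }
    }

    // Swap the two implementations at runtime.
    if let original = class_getInstanceMethod(Plugin.self, #selector(Plugin.process)),
       let replacement = class_getInstanceMethod(Plugin.self, #selector(Plugin.swizzled_process)) {
        method_exchangeImplementations(original, replacement)
    }

    print(Plugin().process()) // prints "wrapped(original)"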
It is, and that’s part of what I loved about it. But it’s also the kind of trick that can quickly become a source of chaos on a project with many contributors and a lot of contributor churn, like we tend to get nowadays. Because - and this was the real point of Dijkstra’s famous paper; GOTO was just the most salient concrete example at the time - control flow mechanisms tend to be inscrutable in proportion to their power.
And, much like what happened to GOTO 40 years ago, language designers have invented less powerful language features that are perfectly acceptable 90% solutions. e.g. nowadays I’d generally pick higher order functions or the strategy pattern over method swizzling because they’re more amenable to static analysis and easier to trace with typical IDE tooling.
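A rough sketch of what the less powerful alternative looks like (names invented for illustration): the customization point becomes an ordinary closure-valued property rather than a patched method, so the type checker and the IDE can both see it:

    // Hypothetical example: a closure-valued hook instead of a swizzled method.
    final class Exporter {
        var transform: (String) -> String = { $0 }   // default: identity

        func export(_ text: String) -> String {
            transform(text)
        }
    }

    let exporter = Exporter()
    exporter.transform = { $0.uppercased() }   // behavior change without runtime patching
    print(exporter.export("hello"))            // prints "HELLO"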
I don't really want to defend method swizzling (it's grotesque from some entirely reasonable perspectives). However, it does work on external/3rd party code (e.g. audio plugins) even when you don't have control over their source code. I'm not sure you can pull that off with "better" approaches ...
Many of the built-in types in Objective-C have names beginning with “NS”, like “NSString”. The NS stands for NeXTSTEP. I always found it insane that so many years later, every iPhone on Earth was running software written in a language released in the 80s. It’s definitely a weird language, but really quite pleasant once you get used to it, especially compared to other languages from the same time period. It’s truly remarkable they made something with such staying power.
>It’s truly remarkable they made something with such staying power
What has had the staying power is the API because that API is for an operating system that has had that staying power. As you hint, the macOS of today is simply the evolution of NeXTSTEP (released in 1989). And iOS is just a light version of it.
But 1989 is not all that remarkable. The Linux API (POSIX) was introduced in 1988, but work started in 1984, and it was based on an API that emerged in the 70s. And the Windows API goes back to 1985. Apple's is the newest API of the three.
As far as languages go, the Ladybird team is abandoning Swift to stick with C++, whose development goes back to 1979. And of course C++ is just an evolution of C, which goes back to 1972 and which almost all of Linux is still written in.
And what is Ladybird even? It is an HTML interpreter. HTML was introduced in 1993. Guess what operating system HTML and the first web browser were created on. That's right... NeXTSTEP.
In some ways ObjC’s and the NEXTSTEP API’s staying power is more impressive because they survived the failure of their relatively small patron organization. POSIX and C++ were developed at and supported by tech titans - the 1970s and 1980s equivalents of FAANG. Meanwhile back at the turn of the century we had all witnessed the demise of NeXT and many of us were anticipating the demise of Apple, and there was no particularly strong reason to believe that a union of the two would fare any better, let alone grow to become one of the A’s in FAANG.
I actually suspect that ObjC and the NeXT APIs played a big part in that success. I know they’ve fallen out of favor now, and for reasons I have to assume are good. But back in the early 2000s, the difference in how quickly I could develop a good GUI for OS X compared to what I was used to on Windows and GNOME was life changing. It attracted a bunch of developers to the platform, not just me, which spurred an accumulation of applications with noticeably better UX that, in turn, helped fuel Apple’s consumer sentiment revival.
Good take. Even back in the 1990s, OpenStep was thought to be the best way to develop a Windows app. But NeXT charged per-seat licenses, so it didn't get much use outside of Wall Street or other places where Jobs would personally show up. And of course something like iPhone is easier when they already had a UI framework and an IDE and etc.
Assuming you mean C (C++ is an 80s child), that’s trivially true because devices with an ObjC SDK are a strict subset of devices that are running on C.
Yes, that is why I don't find it "insane" like the grandparent does, like yeah, devices run old languages because those languages work well for their intended purpose.
You should feel that C’s longevity is insane. How many languages have come and gone in the meantime? C is truly an impressive language that profoundly moved humanity forward. If that’s not insane (used colloquially) to you, then what is?
NeXT was more or less an Apple spinoff that was later acquired by Apple. Objective-C was created because using standards is contrary to the company culture. And with Swift they are painting themselves into a corner.
> Objective-C was created because using standards is contrary to the company culture
Objective-C was actually created by a company called Stepstone that wanted what they saw as the productivity benefits of Smalltalk (OOP) with the performance and portability of C. Originally, Objective-C was seen as a C "pre-compiler".
One of the companies that licensed Objective-C was NeXT. They also saw pervasive OOP as a more productive way to build GUI applications. That was the core value proposition of NeXT.
NeXT ended up basically taking over Objective-C, and then it became a core part of Apple when Apple bought NeXT to create the next generation of macOS (the one we have now).
So, Objective-C was actually born from an attempt to "use standards" (C instead of Smalltalk) and really has nothing to do with Apple culture. Of course, Apple and NeXT were both brought into the world by Steve Jobs.
> Objective-C was created because using standards is contrary to the company culture.
What language would you have suggested for that mission and that era? Self or Smalltalk and give up on performance on 25-MHz-class processors? C or Pascal and give up an excellent object system with dynamic dispatch?
C's a great language in 1985, and a great starting point. But development of UI software is one of those areas where object oriented software really shines. What if we could get all the advantages of C as a procedural language, but graft on top an extremely lightweight object system with a spec of < 20 pages to take advantage of these new 1980s-era developments in software engineering, while keeping 100% of the maturity and performance of the C ecosystem? We could call it Objective-C.
Years ago I wrote a toy Lisp implementation in Objective-C, ignoring Apple’s standard library and implementing my own class hierarchy. At that point it was basically standard C plus Smalltalk object dispatch, and it was a very cool language for that type of project.
I haven’t used it in Apple’s ecosystem, so maybe I am way off base here. But it seems to me that it was Apple’s effort to evolve the language away from its systems roots into a more suitable applications language that caused all the ugliness.
Some refer to the “Tim Cook doctrine” as a reason for Swift’s existence. It’s not meant to be good, just to fulfill the purpose of controlling that part of their products, so they don’t have to rely on someone else’s tooling.
That doesn’t really make sense though. I thought they hired Lattner to work on LLVM/Clang so they could have a non-GPL compiler and make whatever extensions they wanted to C/Obj-C. Remember when they added (essentially) closures to C to serve their internal purposes?
So they already got what they wanted without inventing a new language. There must be some other reason.
The Accidental Tech podcast had a long interview with Lattner about Swift in 2017 [0]. He makes it out as something that started as a side project / exploration without much of an agenda, which grew mostly because of the positive feedback the project got from other developers. He had recently left Apple back then, and supposedly left the future of Swift in other people's hands.
I definitely agree with the first point - it's not meant to be the best.
On the second part, I think the big thing was that they needed something that would interop with Objective-C well and that's not something that any language was going to do if Apple didn't make it. Swift gave Apple something that software engineers would like a ton more than Objective-C.
I think it's also important to remember that in 2010/2014 (when Swift was started and when it was released), the ecosystem was a lot different. Oracle v. Google was still going on and wasn't finished until 2021, so Java really wasn't on the table. Kotlin hit 1.0 in 2016 and really wasn't at a stage to be used when Apple was creating Swift. Rust was still undergoing massive changes.
And a big part of it was simply that they wanted something that would be an easy transition from Objective-C without requiring a lot of bridging or wrappers. Swift accomplished that, but it also meant that a lot of decisions around Swift were made to accommodate Apple, not things that might be generally useful to the larger community.
All languages have this to an extent. For example, Go uses a non-copying GC because Google wanted it to work with their existing C++ code more easily. Copying GCs are hard to get 100% correct when you're dealing with an outside runtime that doesn't expect things to be moved around in memory. This decision probably isn't what would be the best for most of the non-Google community, but it's also something that could be reconsidered in the future since it's an implementation detail rather than a language detail.
I'm not sure any non-Apple language would have bent over backwards to accommodate Objective-C. But also, what would Apple have chosen circa 2010, when work on Swift started? Go was (and to an extent still is) "we only do things these three Googlers think is a good idea", it was basically brand-new at the time, and even today Go doesn't really have a UI framework. Kotlin hadn't been released when work started on Swift. C# was still closed source. Rust hadn't appeared yet and was still undergoing a lot of big changes through Swift's release. Python and other dynamic languages weren't going to fit the bill. There really wasn't anything that existed then which could have been used instead of Swift. Maybe D could have been used.
But also, is Swift bad? I think that some of the type inference stuff that makes compiles slow is genuinely a bad choice and I think the language could have used a little more editing, but it's pretty good. What's better that doesn't come with a garbage collector? I think Rust's borrow checker would have pissed off way too many people. I think Apple needed a language without a garbage collector for their desktop OS and it's also meant better battery life and lower RAM usage on mobile.
If you're looking for a language that doesn't have a garbage collector, what's better? Heck, what's even available? Zig is nice, but you're kinda doing manual memory management. I like Rust, but it's a much steeper learning curve than most languages. There's Nim, but its ARC-style system came 5+ years after Swift's introduction.
So even today and even without Objective-C, it's hard to see a language that would fit what Apple wants: a safe, non-GC language that doesn't require Rust-style stuff.
I think that their culture of trying to invent their own standards is generally bad, but it is even worse when it is a programming language. I believe they are painting themselves into a corner.
>For example, Go uses a non-copying GC because Google wanted it to work with their existing C++ code more easily. Copying GCs are hard to get 100% correct when you're dealing with an outside runtime that doesn't expect things to be moved around in memory.
Do you have a source for this?
C# has a copying GC, and easy interop with C has always been one of its strengths. From the perspective of the user, all you need to do is to "pin" a pointer to a GC-allocated object before you access it from C so that the collector avoids moving it.
I always thought it had more to do with making the implementation simpler during the early stages of development, with the possibility of making it a copying GC some time in the future (mentioned somewhere in the stdlib's sources, I think), but it never came to fruition because Go's non-copying GC was fast enough and a lot of code has since been written with the assumption that memory never moves. Adding a copying GC today would probably break a lot of existing code.
To add to this, whatever was to become Obj-C's successor needed to be just as or more well-suited for UI programming with AppKit/UIKit as Obj-C was. That alone narrows the list of candidates a lot.
I share the sentiment of the post. The best way I can describe it is that I’m tech-savvy enough to know that “AI” is just not capable of what people want it to do, and yet I apparently lack the “broligarch” gene required to ignore the technical reality in favour of seeking ways to exploit people who don’t know better (à la the OpenClaw guy).
> 2. Any pretense for AI Safety concerns that had been coming from OpenAI really fall flat with this move.
And Peter, creating what is essentially a giant scam/malware-as-a-service and then just walking away from it, without taking responsibility or making it safe.
I mean, most of the destruction I am aware of is recycling. Turning into rags is the fate of most unwanted clothing. Do the Euros burn it instead?
Give a man donated clothing and they will have clothes ... teach a man to become an indentured servant on minimum wage and they will be able to buy clothes every year for the rest of their lives.
That's pretty common in small companies. It's less common in large companies but can happen - you may use the "CTO" title for the founding engineer who still leads code and architecture, then hire someone under a different title (frequently "VP of Engineering") to handle the management / team growing side of the role.
That sounds like a reasonable split to me, so much so I’m not sure I’d understand why you’d want the same person handling both code/architecture and management.
The CTO in my company* remains the SME on several components, commits to several production repositories (and expects the most stringent PR checks), and maintains a couple of small tools used by us and the customers.
It's not that rare, I think.
*small fintech with a couple of billions in the accounts, not a startup, not a Fortune 500 company