FWIW, as an application developer, it increases fragmentation: more of the code paths in your app are determined by which OS version the user is running. If Apple were developing and debugging everyone's apps, that argument would make sense (but, of course, they are not). If you truly want to minimize development and debugging costs by minimizing fragmentation, you provide as uniform and stable an interface as possible for the developer, and let the app operate as identically as possible across every device it will ever run on, not just today but into the future.
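To make that concrete, here's a minimal sketch of the OS-version branching that creates this fragmentation; the list-layout APIs are just one example, and any OS-gated feature forces the same split:

```swift
import UIKit

// Each #available branch is a separate code path whose selection is
// decided by the user's OS version, and each must be tested on its own.
func makeListLayout() -> UICollectionViewLayout {
    if #available(iOS 14.0, *) {
        // Newer list API, only available on iOS 14+.
        let config = UICollectionLayoutListConfiguration(appearance: .plain)
        return UICollectionViewCompositionalLayout.list(using: config)
    } else {
        // Legacy path kept alive for users who stay on older OS versions.
        return UICollectionViewFlowLayout()
    }
}
```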
Most Apple users always run the latest iOS version. And with each iOS release, the vast majority of apps no longer see major new OS features that they must adopt to stay competitive; these days they can focus on their business.
So in that kind of market, this approach does reduce support costs: you pick a target iOS version, test with devices running it, and know that newer devices will also just work.
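For example, a minimal SwiftPM sketch of what picking that floor looks like (the package name is a placeholder and iOS 15 is an arbitrary choice):

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyApp",               // placeholder name
    // One minimum OS version: test on devices running it, and the
    // OS-shipped (ABI-stable) runtime keeps newer devices working.
    platforms: [.iOS(.v15)],
    targets: [
        .target(name: "MyApp")
    ]
)
```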
> Most Apple users always run the latest iOS version.
I assume you're talking about the latest iOS version for their device, which may or may not be the latest iOS version released by Apple. I was running around with an iPhone 5 until 2019 or so, when apps stopped working altogether.
Only 5–10% of people being on earlier versions absolutely backs up the statement “Most Apple users always run the latest iOS version.”
It’s not reasonable to compare native apps to browser support. Dropping support for an older version of iOS doesn’t cut users off; they can continue to use the most recent version of your app that supports their device.
It’s especially unreasonable to compare it to people supporting Internet Explorer for a decade. Microsoft halted all development for five years and people carried on using it much longer. Apple comes out with a new major version of iOS every year and nobody is using decade-old versions of iOS.
I don't see how this would reduce support costs compared to shipping the runtime with the app. Couldn't you just pick a target iOS/Swift runtime version and support that even if the runtime wasn't tied to the iOS version?
The reduced support cost is that your testing complexity (i.e., cost) is hugely driven by how many OS versions you support. By bundling the runtime with the app, you’re pretending the Swift runtime is the only thing you have to test, when in reality you have to test all the OS integration bits too. Tying the runtime to the OS says “these are the same,” so you have only a single compatibility flag to select. They do this with the C++ runtime as well, btw. And from what I’ve read, the language and the runtime are decoupled (i.e., you can enable new language features without requiring a new runtime), although I haven’t paid attention to Swift in a long time.
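A sketch of that decoupling, assuming a SwiftPM setup: upcoming-feature flags such as SE-0335's ExistentialAny are compile-time opt-ins, so enabling them doesn't require raising the deployment target or shipping a newer runtime (names here are illustrative):

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyLibrary",           // placeholder name
    platforms: [.iOS(.v15)],     // still targeting the older OS runtime
    targets: [
        .target(
            name: "MyLibrary",
            swiftSettings: [
                // Opt into a newer language feature at compile time;
                // no newer Swift runtime on the device is required.
                .enableUpcomingFeature("ExistentialAny")
            ]
        )
    ]
)
```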
The other support cost it reduces is Apple’s: their testing matrix for making runtime releases shrinks drastically, which means the Swift team runs more efficiently.
It’s annoying, but for end users it’s even more valuable: apps are smaller (which also means storage lasts longer) and less memory is used.
TL;DR: Apple has always shipped its language runtimes this way, and it works nicely for their ecosystem.