That doesn’t make much sense to me; there are literally billions of phones that people are using all the time.
Apple has over 2.3 billion active devices, of which only a small percentage are Macs (an estimated 24 million were sold in 2024, with around twice that many iPads).
The most difficult part of a cell network to scale is the number of devices connected, not the bandwidth used, and cellular Macs aren’t going to add significantly more load to a network. And that assumes Apple even cares what a carrier thinks.
I’m in Australia, not the USA, and for all that people like to complain about the internet here, we have excellent mobile coverage and it’s relatively affordable, but it’s all priced by usage.
I have 4 devices on my plan with shared 210GB of 4G usage between them for around AUD$200 (USD$130) a month on Australia’s best network (Telstra). I work remotely from cafes a lot (probably around 20-30 hours a week) as a developer and get nowhere close to that usage. I update all my apps, download all my podcasts, listen to lossless music and stream video whenever I want during breaks (although I’m not a huge out-of-home video consumer). I do literally nothing to limit my bandwidth usage and am lucky to use 30-40GB a month across all my devices.
The problem, I think, is that desktop software is often not written with the idea that it might need to sip through a straw.
Anyone who has tethered their machine to a phone with a middling connection knows how bad the computer experience can get.
Like you mentioned, 50 gigs a month per device... when I had to tether my machine for a week, I was finding myself using 10 gigs _a day_, and this was ~6 years ago.
Not an argument that this stuff is impossible, of course, but I do think these machines are different beasts.
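To make that concrete, here's a rough sketch (my own assumption about how it could be done on Apple platforms, not anything from this thread) of what "sipping from a straw" awareness might look like: the Network framework reports whether the current path is tethered/metered ("expensive") or in Low Data Mode ("constrained"), and an app could defer heavy transfers accordingly.

    import Foundation
    import Network

    // Sketch only: watch the current network path and defer heavy transfers
    // (app updates, podcast prefetch, etc.) when the link is tethered/metered
    // ("expensive") or in Low Data Mode ("constrained").
    let monitor = NWPathMonitor()
    monitor.pathUpdateHandler = { path in
        if path.isExpensive || path.isConstrained {
            print("Constrained link: deferring background downloads")
        } else {
            print("Unmetered link: background downloads allowed")
        }
    }
    monitor.start(queue: DispatchQueue(label: "path-monitor"))
    dispatchMain() // keep this command-line sketch alive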
> The most difficult part of a cell network to scale is the number of devices connected, not the bandwidth used
Not a network engineer, but isn't it possible that it's only easy to scale the number of devices because mobile devices play nice with the network? For example, battery life depends on batching network requests, meaning the incentives are aligned between Google, Apple, and the carriers.
If every device defaults to treating the network like a LAN, the way macOS is accustomed to being able to do, that may change which part of the network is easy to scale.
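For what it's worth, on Apple platforms that "play nice" behaviour is largely opt-in through URLSession: a discretionary background session hands scheduling to the OS, which can batch the transfer with other work and wait for a cheap moment instead of waking the radio on demand. A minimal sketch (the identifier and URL here are made up):

    import Foundation

    // Sketch only: a discretionary background session lets the OS decide when
    // to run the transfer, batching it for power and network efficiency.
    let config = URLSessionConfiguration.background(withIdentifier: "com.example.batched-sync")
    config.isDiscretionary = true                 // let the system pick the cheapest moment
    config.allowsExpensiveNetworkAccess = false   // skip cellular/tethered links
    config.allowsConstrainedNetworkAccess = false // respect Low Data Mode

    let session = URLSession(configuration: config)
    session.downloadTask(with: URL(string: "https://example.com/feed.json")!).resume()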