It's more likely users would be mad that their browser update makes them play cookie-clicker to get to their sites.
By and large, users are unaware of how many resources something uses, or should use. They don't really care about anything but getting from A to B as fast as possible, with as few interruptions as possible.
Computer resources are like any other resource: expendable. Users will always use more if that means more convenience. Human time is very valuable; accepting multiple dialogs would take even more time than loading a fat page.
To add: there is no single resource you can bind the multiplier to. Many sites use no CSS but lots of JS, or WebGL, Wasm, tables... There is simply no way to foresee what will be slow and what won't.
The "CSS pixel" is just a pixel before scaling for high-DPI displays. The point is to avoid revealing the hardware while allowing a bigger starting amount for a bigger window. If you prefer, just pretend I wrote "32 MiB".
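A minimal sketch of the budget scheme described above, with made-up constants (the 64 bytes-per-pixel factor is purely illustrative): the starting limit scales with the window's area in CSS pixels, which are device-independent, so the budget grows with window size without leaking hardware details like the device pixel ratio.

```typescript
// Hypothetical starting-budget calculation. CSS pixels are
// device-independent units, so the same window size yields the same
// budget on a 1x laptop and a 3x phone screen.
const BASE_BUDGET_BYTES = 32 * 1024 * 1024; // 32 MiB floor, as in the comment
const BYTES_PER_CSS_PIXEL = 64;             // illustrative scaling factor

function startingBudget(cssWidth: number, cssHeight: number): number {
  const areaBonus = cssWidth * cssHeight * BYTES_PER_CSS_PIXEL;
  return BASE_BUDGET_BYTES + areaBonus;
}
```

The key property is that the inputs are logical window dimensions, not physical ones, so a site cannot infer the display hardware from the budget it receives.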
Human time is valuable. That is the whole point of this. My time is wasted when my computer gets so slow that it takes 10 seconds for the Caps Lock light to respond. My time is wasted when the mouse lag is so awful that it takes me half an hour to kill a few tasks. My time is wasted when I have to walk away from an unusable computer, checking back every few hours to see if the OS might have killed the biggest task.
Web browser resource consumption is why I have a fresh new HN account. I had to power cycle the computer today, and it seems that Chromium won't save passwords over a restart unless I upload them all to Google.
All problems are fractal in nature; they require a defined scope in order to be solvable.
In this case, the question is "how fast is fast enough?", and the answer is: fast enough that the human operating the application doesn't lose focus.
For most pages I've viewed on my 150€ phone, submit-to-interactive is between 1 and 3 seconds for the first non-cached load, and much faster on revisits. That is a sufficient worst case for any conceivable task performed in a web app today.
Some exceptions that come to mind (like Reddit) have ulterior motives to force slowness so that people have to use their native app instead (which isn't much better, from what I've heard).
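To put the "doesn't lose focus" criterion in concrete terms, here is a sketch based on the classic human-factors response-time thresholds (roughly 0.1 s feels instant, 1 s keeps the user's flow of thought, 10 s is the limit of attention). The band names and exact cutoffs are my own illustration, not from the comment:

```typescript
// Rough responsiveness bands from the well-known 0.1 s / 1 s / 10 s
// response-time limits. Cutoff values and labels are illustrative.
type Band = "instant" | "flow" | "attention-holding" | "focus-lost";

function responsivenessBand(seconds: number): Band {
  if (seconds <= 0.1) return "instant";          // perceived as immediate
  if (seconds <= 1.0) return "flow";             // delay noticed, flow intact
  if (seconds <= 10.0) return "attention-holding"; // user waits, stays focused
  return "focus-lost";                           // user switches tasks
}
```

By this yardstick, the 1-3 second first loads mentioned above land in the band where the delay is noticeable but the user does not lose focus.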