Google, Apple, and Mozilla apparently all failed to find a solution to this problem, even though lots of people wanted there to be a solution. Certainly, any browser that shipped this feature in a privacy-respecting manner would have bragging rights for a while, so there is an incentive. Given that, I don't think I'm exaggerating the difficulty of the problem. This problem is also very similar to the challenges brought by Spectre-class vulnerabilities, and those have been enormously costly for the whole industry.
I think it is completely fair to say that privacy-respecting shared caches are not simple.
Some solutions can be imagined, but they come with weird trade-offs or they do nothing for the majority of the web. A new manifest format like you describe falls into the latter, since it would only apply to new websites using the new feature, and that's without digging into the other problems it would pose.
In practice, people often visit the same websites repeatedly; they aren't constantly visiting new websites only once. A partitioned cache works just fine for the normal scenario. It's slightly less efficient for the first day someone uses their browser, but things are honestly fine after that. It's unfortunate that we can't eke out the last tiny bit of performance here, but I think the difference would be hard to measure in practice.
In my opinion, if websites more commonly used Brotli, that would make a far larger difference in efficiency than returning to a shared cache. And if browsers had a standardized means of downloading only the bytes that changed in an asset like a JavaScript library, instead of downloading the new version from scratch, that would make a much bigger difference too.
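The delta-download idea can be sketched in a few lines: the client hashes fixed-size blocks of its cached copy, the server sends only the blocks that differ, and the client patches its copy. This is a toy illustration under invented names (`BLOCK`, `make_delta`, `apply_delta`), not any real browser or HTTP mechanism; real systems use rsync-style rolling hashes or formats like bsdiff.

```python
import hashlib

# Toy sketch of block-level delta updates for cached assets.
BLOCK = 1024  # fixed block size in bytes (arbitrary choice)

def _digest(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

def make_delta(old: bytes, new: bytes) -> tuple[list[tuple[int, bytes]], int]:
    """Return the (index, block) pairs that changed, plus the new total length."""
    old_hashes = [_digest(old[i:i + BLOCK]) for i in range(0, len(old), BLOCK)]
    delta = []
    for i in range(0, len(new), BLOCK):
        j = i // BLOCK
        block = new[i:i + BLOCK]
        if j >= len(old_hashes) or _digest(block) != old_hashes[j]:
            delta.append((j, block))
    return delta, len(new)

def apply_delta(old: bytes, delta: list[tuple[int, bytes]], new_len: int) -> bytes:
    """Rebuild the new asset from the cached copy plus the changed blocks."""
    blocks = [old[i:i + BLOCK] for i in range(0, len(old), BLOCK)]
    # Pad out to the new block count so appended blocks have a slot.
    blocks += [b""] * max(0, -(-new_len // BLOCK) - len(blocks))
    for j, block in delta:
        blocks[j] = block
    return b"".join(blocks)[:new_len]
```

With a 3.5 KB asset where only the middle changed, roughly a quarter of the bytes never need to be re-sent; for a large, mostly-stable library the savings would be far bigger.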
> I think it is completely fair to say that privacy-respecting shared caches are not simple.
Couldn't they make an exception for some domains and create a registry of really popular or fundamental links to packages like jQuery et al.? I have read on this topic before, but it sounded like all-or-nothing maximalism with no shades of grey. Fine, partition those memes from imgur CDNs, but let common libraries with known hashes be shared at least. The potential attack is based on leaving a CDN-hosted tracking pixel in the cache and timing its load on other sites. But there is no meaningful signal in who has the 10 most popular releases of wasm-sqlite, dayjs, or bootstrap.min.css in their cache. These could be warmed up from literally anywhere, or even synced in the background by an idle browser thread.
I feel like Google Chrome shipped an experiment at one point that bundled some of the most popular libraries with the browser, so they would be equally cached for all Chrome users, on all sites. I'm having trouble finding any announcements about this, though, so maybe I dreamed it up.
We have a functional internet. I'm using it to communicate right now.