It's just that git does a much more interesting job with compression, actually. Lots more to learn. It doesn't compress the snapshots via something like zstd directly; that comes much later, after a delta step. (Interestingly, that delta compression step doesn't use the diffs that `git show` shows you for your commits.)
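A minimal Python sketch of why a delta step before compression pays off. This is not git's actual binary copy/insert delta format (git's packfiles use an xdelta-like encoding); it just fakes a trivial append-only delta to show the size difference versus compressing each snapshot independently:

```python
import zlib

# Two snapshots of the same file: v2 is v1 plus one appended line.
v1 = "".join("line %d\n" % i for i in range(1000)).encode()
v2 = v1 + b"one more line\n"

# Naive store: compress each full snapshot independently.
independent = len(zlib.compress(v1)) + len(zlib.compress(v2))

# Crude "delta": store v2 as a reference to v1 plus only the new bytes.
# (Real deltas encode copy ranges and inserts, not just a tail append.)
delta = v2[len(v1):]
deltified = len(zlib.compress(v1)) + len(zlib.compress(delta))

print("independent:", independent, "deltified:", deltified)
```

The deltified total is far smaller because the shared prefix is stored once; zlib alone can't exploit redundancy across separately compressed objects.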
Very few people fall behind at the moment due to lack of access to information. People in poor countries largely have access to the internet now. It doesn’t magically make people educated and economically prosperous.
You are arguing the converse. Access to information doesn't make people educated, but lack of access definitely puts people at a big disadvantage. Chatbots are not just information; they are tools, and using them requires training because they hallucinate.
Times New Roman was designed for a time when printing quality was not that good. With 1080p screens commonplace nowadays, that barrier is removed, so optimizing for readability has different constraints.
The GPUs, sure. The mainboards and CPUs can be used in clusters for general-purpose computing, which is still more prevalent in most scientific research as far as I am aware. My alma mater has a several-thousand-core cluster that any student can request time on as long as they have reason to do so, and it's all CPU compute. Getting non-CS majors to write GPU code is unlikely in that scenario.
I provide infrastructure for such a cluster that is also available to anyone at the university free of charge. Every year we swap out the oldest 20% of the cluster, as we run a five-year depreciation schedule. In the last three years, we've mostly been swapping in GPU resources at a ratio of about 3:1. That's in response to both usage reports and community surveys.