The problem with hosting JS libraries on CDNs is that the caching benefit has a network effect.
You only gain performance if the browser already has a cached copy of this specific version of the library from this specific CDN.
If it doesn't, you end up losing performance, because now an additional DNS lookup needs to be performed and an additional TCP connection needs to be opened.
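If you do go the shared-CDN route anyway, resource hints can at least get that lookup and handshake started before the script tag is reached. A minimal sketch, with jQuery's CDN host standing in as an example:

    <!-- Start the DNS lookup (and, with preconnect, the TCP/TLS handshake)
         early. The host and version here are just examples. -->
    <link rel="dns-prefetch" href="//code.jquery.com">
    <link rel="preconnect" href="https://code.jquery.com">
    <script src="https://code.jquery.com/jquery-2.2.4.min.js"></script>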
The reason why I prefer to use a CDN is that it is a game theory example come to life. If everyone used the CDN version, then any user coming to your site would most likely have the CDN version in their cache, and performance would go up for everyone. But if you use the CDN version and your competitors don't, their performance is slightly better than yours, and so on and so forth. Game theory indicates that in most games of this sort cooperation is better than non-cooperation.
And really, if you are using one of the major libraries and a major CDN (Google's, jQuery's, etc.), over time your users will end up having it in their cache, either from your site or from other sites having used the same library version and CDN.
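The usual belt-and-braces way to do this is to try the shared CDN copy first and fall back to a self-hosted copy if it fails; the version number and local path below are just placeholders:

    <script src="https://code.jquery.com/jquery-2.2.4.min.js"></script>
    <script>
      // If the CDN copy failed (blocked, down, etc.), load our own copy instead.
      window.jQuery || document.write('<script src="/js/jquery-2.2.4.min.js"><\/script>');
    </script>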
I suppose someone has done a study on the spread of libraries and CDNs among users, so that you could figure out what the chance is that a user coming to your site will have a specific library cached. There's this: http://www.stevesouders.com/blog/2013/03/18/http-archive-jqu... but it's three years old; really, this information would need to be maintained at least annually to tell you what the top CDN for a given library is.
But there isn't one CDN, or one version... if you need two libraries, and the canonical CDN copy of jQuery is on one while your required extension is on another, that's two DNS lookups, two connections, two request cycles, etc.
So you use the CDN that has both, but then one of the copies is not canonical, which means more cache misses. And that doesn't even count the fact that there are multiple versions of each library, each with its own usage and distribution, so the common-CDN approach becomes far less valuable.
In the end, you're better off composing micro-frameworks and building the bundle yourself, though this takes effort... React + Redux with max compression in a simple webpack project comes to about 65K for me, before actually adding much to the project. Which isn't bad at all... if I can keep the rest of the project under 250K, that's less than the CSS + webfonts. It's still half a MB altogether; just the same, it's way better than a lot of sites manage, even with CDNs.
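For reference, the kind of build I mean is roughly the stock production recipe; this is a sketch, and the plugin choices and paths are my own assumptions about what "max compression" looks like, not the exact project above:

    // webpack.config.js - rough sketch of a "max compression" production build.
    var webpack = require('webpack');
    var CompressionPlugin = require('compression-webpack-plugin');

    module.exports = {
      entry: './src/index.js',
      output: { path: __dirname + '/dist', filename: 'bundle.min.js' },
      plugins: [
        // Lets React strip its development-only code paths out of the bundle.
        new webpack.DefinePlugin({
          'process.env.NODE_ENV': JSON.stringify('production')
        }),
        // Minify.
        new webpack.optimize.UglifyJsPlugin({ compress: { warnings: false } }),
        // Emit a pre-gzipped bundle.min.js.gz next to the bundle.
        new CompressionPlugin({ algorithm: 'gzip' })
      ]
    };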
That's two DNS lookups for any user who hasn't already done them somewhere in the past and had the results cached.
The question then is how likely they are to have done that with regard to your particular CDN and version of the library.
I agree that a lot of possible CDNs, versions, and so forth decreases the value of the common-CDN approach, but there are at least some libraries that have a canonical CDN (jQuery, for example), and not using it is essentially being the selfish player in a game theory style game.
Since I don't know of any long-running tracking of CDN usage that would allow you to predict how many people who visit your site are likely to have a popular library in their cache, it's really difficult to talk about this meaningfully (I know there are one-off evaluations done at a single point in time, but those aren't really helpful).
Anyway, it's my belief that widespread refusal to use CDN versions of popular libraries is beneficial in the short run for the individual site but detrimental in the long run for a large number of sites.
Latency of a new request, as mentioned in one of those articles, is the main reason why I self-host everything.
Since HTTPS needs an extra round trip to start up, it's now even more important not to put your libraries on a CDN. The average bandwidth of a user is only going to go up, while their connection latency will remain the same.
If you are making a SaaS product that businesses want, using CDNs also makes it hard to offer an enterprise on-site version, as those customers want the software to have no external dependencies.
This might make sense if all of your users are located near your web servers and you can comfortably handle the load of all the requests hitting them.
If the user making the request is in Australia, for example, and your web server is in the US, that user can complete many round trips to a local CDN PoP in Australia in the time it takes to make a single request to your server in the US.
Latency is one of the main reasons TO use a CDN. A CDN's entire business model depends on making sure they have reliable, low-latency connections to end users. They peer with multiple providers in multiple regions to make sure links aren't congested and requests are routed efficiently.
Unless you are going to run datacenters all around the world, you aren't going to beat a CDN on latency.
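Whether that holds for your particular audience is measurable rather than a matter of belief: the browser's Resource Timing API breaks every fetch down into DNS, connect, and transfer phases. A quick sketch you can paste into the console:

    // Cross-origin resources report 0 for the DNS/connect phases unless the
    // server sends a Timing-Allow-Origin header.
    performance.getEntriesByType('resource').forEach(function (e) {
      console.log(
        e.name,
        'dns: ' + (e.domainLookupEnd - e.domainLookupStart).toFixed(1) + 'ms',
        'connect: ' + (e.connectEnd - e.connectStart).toFixed(1) + 'ms',
        'total: ' + e.duration.toFixed(1) + 'ms'
      );
    });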
If the only thing you have on the CDN is libraries, it's faster to have your site host them, even if it's on the other side of the world. When HTTP/2 server push is widely supported, the balance tips even further toward hosting locally, as you can start sending your libraries right after you are done sending the initial page, without waiting for the browser to request them.
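To make that concrete, here's a sketch of what push could look like with Node's http2 module; the module choice and file paths are my assumptions, and any server-side push mechanism would do the same job:

    var http2 = require('http2');
    var fs = require('fs');

    var server = http2.createSecureServer({
      key: fs.readFileSync('server.key'),
      cert: fs.readFileSync('server.crt')
    });

    server.on('stream', function (stream, headers) {
      if (headers[':path'] === '/') {
        // Push the library alongside the page, before the browser asks for it.
        stream.pushStream({ ':path': '/js/vendor.min.js' }, function (err, pushStream) {
          if (err) return;
          pushStream.respondWithFile('static/js/vendor.min.js', {
            'content-type': 'application/javascript'
          });
        });
        stream.respondWithFile('static/index.html', {
          'content-type': 'text/html'
        });
      }
    });

    server.listen(8443);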
If you are using a CDN for images/video, then yes, you would see savings from serving libraries off it too, since your users will have to nail up a connection to your CDN anyway.
Then again, a fair number of the users of the site I'm currently working on have high-latency connections (800ms+), so that might be distorting my view somewhat.
Yeah, I would have to agree with you, tracker. Every 3rd-party dependency introduces another DNS lookup. The whole point of using a CDN effectively, besides lowering latency, is to reduce your DNS lookups to a bare minimum. For example, I use https://www.keycdn. They support HTTP/2 and HPACK header compression (with Huffman encoding), which reduces the size of your headers.
The benefit of hosting, say, Google Fonts, Font Awesome, jQuery, etc. all with KeyCDN is that I can take better advantage of parallelism over one single HTTP/2 connection. Not to mention I have full control over my assets: caching (Cache-Control), Expires headers, ETags, easier purging, and the ability to host my own scripts.
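The origin-side half of that control is just long-lived headers on the assets; a minimal sketch with Express standing in as an example origin (the max-age is a placeholder, and any server that sets these headers behaves the same way behind a CDN):

    var express = require('express');
    var app = express();

    // Serve self-hosted libraries with aggressive caching.
    app.use('/static', express.static('public', {
      maxAge: '365d',     // Cache-Control: public, max-age=31536000
      etag: true,         // revalidation via If-None-Match
      lastModified: true  // revalidation via If-Modified-Since
    }));

    app.listen(3000);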
Here are a few reasons people choose to avoid CDNized versions of JS libraries. http://www.sitepoint.com/7-reasons-not-to-use-a-cdn/
This is a 6-year-old post, but it raises some valid concerns: https://zoompf.com/blog/2010/01/should-you-use-javascript-li...