Most modern displays are calibrated to a reasonable degree and can easily accommodate the very limited gamut of an old CRT, especially anything supporting HDR10. I suspect this is more a case of "they need to be fudged so they're wrong" than anything else.
Plasma has great contrast and a slightly wider gamut than a CRT. Neither one has a particularly good gamut unless you're comparing to sRGB. Many current screens can do much better.
"Professional CRTs
were manufactured to the SMPTE‐C standard gamut,
which closely matches the NTSC standard and slightly
exceeds the EBU (PAL/SECAM) standard. Any
limitations on the reproduction of certain color
shades were due to the maximum saturated levels of
red, green, and blue phosphor compounds used in
these monitors.
While coverage of SMPTE‐C and BT.709 color spaces
isn’t difficult to accomplish with a CRT, coverage of
new, extended color gamuts such as xvYCC and the
digital cinema P3 minimum standard gamut is
problematic for CRT phosphor imaging. At best, a
well‐designed CRT monitor could expect to cover
about 60% ‐ 70% of these wider spaces, which
typically push deeper into the green section of the
1931 CIE “tongue” color diagram."
This shows plasma as a little bit better than CRT, and again lists the limits of CRT gamut.
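Out of curiosity, here's a quick way to sanity-check that 60-70% figure (a rough sketch in plain Python, no dependencies). The primaries are the standard published CIE 1931 xy coordinates for each gamut; triangle-overlap area on the xy diagram is only a crude proxy for perceptual coverage, since that diagram isn't perceptually uniform, so treat the exact percentages loosely:

    # Compare color gamuts as triangles on the CIE 1931 xy chromaticity diagram.
    # Coverage = area of the overlap between two gamut triangles, divided by the
    # area of the target gamut. This is a 2D approximation, not a volume metric.

    def shoelace_area(poly):
        """Area of a polygon given as [(x, y), ...] via the shoelace formula."""
        area = 0.0
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            area += x1 * y2 - x2 * y1
        return abs(area) / 2.0

    def clip(subject, clipper):
        """Sutherland-Hodgman clipping: the part of `subject` inside the convex
        `clipper` polygon (both listed counter-clockwise)."""
        def inside(p, a, b):
            # True if p is on the interior side of edge a->b.
            return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0
        def intersect(p1, p2, a, b):
            # Intersection of segment p1->p2 with the infinite line through a->b.
            dx, dy = p2[0] - p1[0], p2[1] - p1[1]
            ex, ey = b[0] - a[0], b[1] - a[1]
            t = ((a[0] - p1[0]) * ey - (a[1] - p1[1]) * ex) / (dx * ey - dy * ex)
            return (p1[0] + t * dx, p1[1] + t * dy)
        output = subject
        for i in range(len(clipper)):
            a, b = clipper[i], clipper[(i + 1) % len(clipper)]
            input_list, output = output, []
            for j in range(len(input_list)):
                cur, prev = input_list[j], input_list[j - 1]
                if inside(cur, a, b):
                    if not inside(prev, a, b):
                        output.append(intersect(prev, cur, a, b))
                    output.append(cur)
                elif inside(prev, a, b):
                    output.append(intersect(prev, cur, a, b))
        return output

    # Standard CIE 1931 xy primaries (R, G, B) for each gamut.
    GAMUTS = {
        "sRGB/BT.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
        "SMPTE-C":     [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)],
        "DCI-P3":      [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    }

    def coverage(src, dst):
        """Fraction of the `dst` gamut's xy area that `src` reaches."""
        overlap = clip(GAMUTS[src], GAMUTS[dst])
        return shoelace_area(overlap) / shoelace_area(GAMUTS[dst])

    for name in ("sRGB/BT.709", "SMPTE-C"):
        print(f"{name} covers {coverage(name, 'DCI-P3'):.0%} of DCI-P3 (xy area)")

With those primaries it comes out to roughly 68% for SMPTE-C against DCI-P3 and about 74% for sRGB/BT.709, which is in the same ballpark as the quoted figure for CRT phosphors.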
Thanks for the research. You might be right. I had CRTs and plasma side by side, and then I had plasma and IPS side by side. Plasma and CRTs looked about the same to me, while IPS looked a bit worse, but that might be because of analog vs. digital inputs and calibration. The CRTs had an auto-calibration function (Sony G520), but other than that I never did a color calibration on any of them.
I wouldn't be surprised at all if the particular IPS panels you looked at were worse, but displays that actually try can do better, especially these days.
Part of the issue is that so many things are calibrated for sRGB and that is a pretty small gamut.