How about dithering as a compositor-level rendering technique for low-bpp displays, rather than as a compression-at-rest effect?
I'm thinking specifically of non-HDR displays showing HDR content (where HDR10 uses 10-bit-per-channel color). Presumably the compositor, if it knew you were watching an HDR video on a non-HDR display, would do better by dithering the HDR content down to 24-bit non-HDR content for display, rather than letting it be naively truncated to 24 bits, which produces visible banding?
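For concreteness, here's a minimal sketch of what that dither step could look like for a single channel, assuming the 10-bit code values arrive as a numpy array. The Bayer matrix and function names here are just illustrative, not any real compositor's API:

```python
import numpy as np

# 4x4 Bayer matrix, normalized to per-pixel thresholds in [0, 1).
BAYER_4x4 = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0

def dither_10bit_to_8bit(channel_10bit: np.ndarray) -> np.ndarray:
    """Ordered (Bayer) dither: 2-D array of 10-bit codes (0..1023) -> 8-bit (0..255)."""
    h, w = channel_10bit.shape
    scaled = channel_10bit / 1023.0 * 255.0
    # Tile the Bayer thresholds over the image.
    threshold = BAYER_4x4[np.arange(h)[:, None] % 4, np.arange(w)[None, :] % 4]
    # Adding the spatially varying threshold before flooring makes pixels whose
    # lost fractional value exceeds the local threshold round up, so the
    # quantization error is spread out instead of forming bands.
    return np.clip(np.floor(scaled + threshold), 0, 255).astype(np.uint8)

def truncate_10bit_to_8bit(channel_10bit: np.ndarray) -> np.ndarray:
    """Naive truncation (drop the two low bits) -- the banding case."""
    return (channel_10bit >> 2).astype(np.uint8)
```

In practice this is often done temporally as well (varying the threshold pattern per frame so it averages out visually), but the spatial version shows the idea.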