
Why does it feel inevitable that standardization/licensing organizations in tech will eventually turn into a user-hostile mess?

USB, HDMI, what can we screw up next?

Is it incompetence? Malice? I'd really like to see an in-depth investigation of this phenomenon.



HDMI was never a pro-user protocol, it was made to encumber a digital display signal with DRM.


This. HDMI was cooked up as a proprietary connector with DRM by the big manufacturers in the DVD/Blu-ray, TV, and home-entertainment business, together with the big movie studios, to enforce stronger content protections for their IP. At that it failed miserably, as I can still torrent every Hollywood blockbuster and every Netflix series.

IIRC, every manufacturer must pay a fee to the HDMI consortium for every device with HDMI they sell.

DisplayPort, by contrast, is a more open standard, requiring only a flat fee for the standard documentation and membership instead of a per-unit royalty, IIRC.


DisplayPort and DVI both support HDCP. This wasn't the purpose behind HDMI, though support for it was no doubt a requirement for adoption. It was designed to be a single cable for carrying video and audio between playback devices, receivers, and displays.

For this purpose, it succeeded and did a much better job at it than alternatives. HDMI still makes far more sense for use in a home theater environment than DisplayPort thanks to features like ARC.


I think the better question is why SDI video connections aren't available on any consumer devices.

While HDMI is nice for consumers because it carries audio/data, SDI cables are cheap (just a single coax cable!) and easy to route (T-splitters are a thing!).

SDI does not support HDCP, however.


I think cost might be the main factor there. SDI is serial instead of parallel like all other consumer digital video cables. Hardware to serialize and deserialize bits this fast on a single conductor is expensive. HDMI 1.0 had a max transmission rate of 4.95 Gbit/s in 2002. Today, HDMI 2.1 goes up to 48 Gbit/s.
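
To put rough numbers on that (a back-of-the-envelope sketch; the aggregate link rates are from the published specs, the per-conductor split is my own arithmetic):

    # HDMI spreads its bandwidth across multiple data lanes, while SDI
    # pushes everything down a single coax conductor.
    links = {
        "HDMI 1.0 (3 TMDS data lanes)": (4.95e9, 3),   # 4.95 Gbit/s aggregate
        "HDMI 2.1 (4 FRL lanes)":       (48e9,   4),   # 48 Gbit/s aggregate
        "12G-SDI (1 coax)":             (11.88e9, 1),  # one conductor carries it all
    }

    for name, (total_bps, lanes) in links.items():
        print(f"{name}: {total_bps / lanes / 1e9:.2f} Gbit/s per conductor")

Even 2002-era HDMI only had to clock about 1.65 Gbit/s per lane; a single-coax format has to run its one conductor nearly an order of magnitude faster to keep up.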


On that note, are there TVs with DisplayPort?

I'm using my LG TV as a monitor for a PC and am forced to use HDMI.


Gigabyte has been selling a version of the LG CX48 slightly modified to be a monitor. It has both HDMI and DP.

Model name is AORUS FO48U.


IIRC, most panels interface via DisplayPort internally these days.


HDMI is great for a home theatre setup where there's an obvious central unit, but the ecosystem has gotten worse if your speakers don't take in HDMI, at least at the very cheap end of the spectrum I buy at.

My current TV will only put out an inferior "headphone" mix over the 3.5mm connection, and the SPDIF connection is coaxial on the TV but optical on the speaker. Having to run a powered converter box just to get audio from my TV to a speaker feels like such a step backwards.


It sounds like your speaker system predates HDMI's adoption, or was never intended for use in a home theater system. Even the lowest-budget soundbars will include an HDMI port and support ARC. I am surprised your TV has coaxial SPDIF but not TOSLINK, as the latter was the gold standard for home theater audio before HDMI came around.

It sounds like you just got bad luck with your TV having a bad 3.5mm jack and not supporting TOSLINK, as most I've seen will at the very least support the latter, and often have a separate "line out" port for the former.


Is there a big difference between 3.5mm and something digital?

I know 3.5mm is technically worse, but I've never been able to actually notice the difference.


3.5mm is an analogue connection that can only output stereo. Its quality will be limited by the device it comes out of.

SPDIF (both optical and coaxial) supports sending multiple audio channels in their original encoding, to support surround sound. The receiving device needs to support decoding of that audio, which is why there is often an option to force the SPDIF output to plain PCM stereo instead of surround sound.

If you have a cheap TV and a good hi-fi, you'll want to use SPDIF or HDMI so that your audio isn't ruined by a poor-quality audio chipset in the TV.
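
For a sense of scale (my arithmetic, using common S/PDIF parameters): the S/PDIF frame is sized for two PCM channels, which is why surround sound has to travel over it as a compressed bitstream that the receiver then decodes.

    # Why S/PDIF carries surround as a compressed bitstream:
    # its PCM payload is sized for two channels.
    sample_rate = 48_000   # Hz
    bit_depth   = 16       # bits per sample, typical for S/PDIF PCM

    stereo_pcm   = sample_rate * bit_depth * 2   # 2 channels
    surround_pcm = sample_rate * bit_depth * 6   # 5.1 = 6 channels

    print(f"stereo PCM: {stereo_pcm / 1e6:.2f} Mbit/s")    # ~1.54 Mbit/s: fits
    print(f"5.1 PCM:    {surround_pcm / 1e6:.2f} Mbit/s")  # ~4.61 Mbit/s: too much
    # Dolby Digital 5.1 compresses to roughly 0.45 Mbit/s, so it fits in the
    # two-channel-sized payload and the receiver does the decoding.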


My assumption was just that it was something about how the EQ is mixed for that jack, because it is labelled specifically as headphones. The other replies about surround sound are true as well, but I don't think that should apply to my TV -> stereo soundbar setup.

But the difference is definitely noticeable, even to my relatively forgiving ear.


The biggest difference is digital interconnects can carry extra data for surround sound.

(Also, unique to optical connections: it's easier to avoid ground-loop hum.)


Not quite true. The "DRM" mechanism you're most likely referring to is HDCP, which was designed separately by Intel to provide copy protection over multiple interface types, including DVI (the first implementation), DisplayPort, and of course HDMI.

It's not the HDMI interface that enforces copy protection; it's the software, firmware, and additional hardware on the devices that do this. You can use HDMI perfectly fine without the additional DRM crap.


HDCP can run on DVI or DisplayPort too. HDMI is a smaller, lower pin count connector than DVI, however.


HDMI's initial version is electrically and pin-compatible (via a passive adapter) with single-link DVI-D; HDCP even works across the adapter, assuming the DVI port supports it.

The parent post is correct in that the mandatory HDCP was a major feature (for the involved cabal of organizations).


> The parent post is correct in that the mandatory HDCP was a major feature

This is wrong. HDCP isn't mandatory to implement HDMI; they are two separate technologies. I'm not defending HDCP or DRM-encumbered content, but I wish folks would get their facts straight.


I almost always err on the side of "never attribute to malice that which can be adequately explained by incompetence". However, the "standards" bodies' ability to repeatedly make a complete pig's ear of every single interconnect system makes me assume the opposite.


I'm leaning towards malice (by way of not caring much about users), caused by big tech using this arena as a battleground.

I wish all cables were equal too, but c'est la vie


Greed, of course.

If it becomes too big of a problem, each cable and device will be required to have a challenge-response Obscure Brand Inc. proprietary U9 chip burned with a valid secret key and serial number at the factory that must return a valid response for the link to be enabled.
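
A hypothetical sketch of how such a handshake might work (every name here is invented, and HMAC stands in for whatever proprietary primitive a vendor would actually use):

    # Hypothetical cable-authentication handshake: the host enables the
    # link only if the cable's chip answers a random challenge with a MAC
    # keyed by a factory-burned secret. All names are invented.
    import hashlib, hmac, os

    FACTORY_SECRET = b"burned-in-at-the-factory"  # per-unit key in practice

    def cable_chip_respond(challenge: bytes, serial: bytes) -> bytes:
        # What the imagined "U9 chip" inside the cable would compute.
        return hmac.new(FACTORY_SECRET, challenge + serial, hashlib.sha256).digest()

    def host_verify(serial: bytes) -> bool:
        # Host side: send a random challenge, check the response.
        challenge = os.urandom(16)
        response = cable_chip_respond(challenge, serial)  # travels over the wire
        expected = hmac.new(FACTORY_SECRET, challenge + serial, hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)

    print("link enabled:", host_verify(b"SN-0001"))  # only "licensed" cables pass

A cable without the burned-in secret can't produce a valid response, so the host simply refuses to bring the link up.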


> Is it incompetence? Malice? I'd really like to see an in-depth investigation of this phenomenon.

"Word of advice. Agents are bad, but whatever you do, stay the hell away from Marketing."

- Thomas A. Anderson


Don't forget MPEG.





