
I bet if windows had supported DDC/CI for the past decade monitors would work with it reliably. (Well, as reliably as any hardware peripheral)


Yeah, I suspect the reason DDC (and EDID) are so bad is that they're not critical for the monitor to perform its basic function, so there's no "evolutionary pressure" to improve it.


Windows has had support for DDC/CI for a long time.


In what sense is it supported? What functionality is exposed to end users?


Automatic detection of default and supported resolutions and information about DPI? Or is it not the same?


DDC is how monitors send EDID information about their capabilities to the computer. DDC/CI adds the ability for the computer to send commands to the monitor for things like brightness, color settings, and input selection. So Windows is obviously using plain DDC to offer you the right set of resolution choices, but none of the DDC/CI functionality appears to be exposed to end users, making it effectively unsupported, unused, and untested.
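To make the distinction concrete: a DDC/CI command is just a small I2C write. Here's a rough sketch, based on my reading of the VESA DDC/CI and MCCS specs, of the "Set VCP Feature" packet a host would write to set brightness (VCP code 0x10). The framing constants are assumptions from the spec, not hardware-tested code:

```python
# Sketch of a DDC/CI "Set VCP Feature" frame (per my reading of the
# VESA DDC/CI spec -- an assumption, not verified against hardware).

DDC_I2C_ADDR = 0x37       # 7-bit I2C address monitors listen on for DDC/CI
DEST_ADDR_BYTE = 0x6E     # 0x37 << 1; included in the checksum per the spec
HOST_SOURCE_ADDR = 0x51   # host "source address" byte
SET_VCP_OPCODE = 0x03     # Set VCP Feature
VCP_BRIGHTNESS = 0x10     # MCCS code for luminance (brightness)

def set_vcp_packet(vcp_code: int, value: int) -> bytes:
    """Return the bytes to write to I2C address 0x37 to set one VCP feature."""
    payload = bytes([SET_VCP_OPCODE, vcp_code, (value >> 8) & 0xFF, value & 0xFF])
    frame = bytes([HOST_SOURCE_ADDR, 0x80 | len(payload)]) + payload
    checksum = DEST_ADDR_BYTE          # checksum is XOR over dest addr + frame
    for b in frame:
        checksum ^= b
    return frame + bytes([checksum])

pkt = set_vcp_packet(VCP_BRIGHTNESS, 50)  # set brightness to 50
```

Plain DDC, by contrast, is only the read side: the host reads the EDID blob from I2C address 0x50 and never sends commands.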


I thought it had been supported by the NVidia control panel for ages?

P.S. Not a user of NVidia now, might be a false memory.


I don't recall seeing anything like a monitor brightness slider in their control panel. And even if there was, it wouldn't be a reliable indicator of OS-level support for that functionality because the GPU vendor's control panel is the most likely piece of software to bypass the OS and control the display config through its own means.


Exposed to the user as a brightness control for external monitors? Where?


Not exposed in the GUI but has OS support which can be accessed through third party apps.



