
Yup, things are so much better now that they just work. Except when they don't, because now it's harder to do anything about it.

I've lost count of the number of Linux machines I've seen that won't offer the correct resolution for a particular monitor (typically locked to 1024x768 on a widescreen monitor).

I don't know whether the problem's with Linux, Xorg, crappy BIOSes or crappy monitors - but even now I occasionally resort to an xorg.conf file to solve such issues.



Do you work with a lot of KVMs? Directly plugged monitors usually just work thanks to EDID info, but cheap KVMs frequently block that signal and cause problems. It's rare for a monitor plugged directly into the computer to have problems these days, even on Linux.


No KVMs involved - but three of the machines I have in mind (not identical, but all running the same version of Linux Mint) have two monitors attached, one of which is OK and the other isn't. (Not mine - so I haven't put any time into trying to solve it yet.)

Another machine - which is mine - used to have a 19" VGA monitor attached which worked happily at 1280x1024 for months, then one day something got updated and it wouldn't do anything beyond 1024x768 after that until I resorted to an xorg.conf file.


Also, on modern machines you almost never want to be editing the xorg.conf. xrandr took over the responsibility of doing resolution stuff.

To fix the resolution on a modern distro the sequence is something like this (use your actual monitor dimensions and refresh rate of course):

    % cvt 1920 1080 60
Copy everything past "Modeline" into a cut buffer.

    % xrandr --newmode <paste the line from above>
Keep a note of the first field on that line; it will look something like "1920x1080_60". This is the "mode name".

Next, find out what your monitor is named:

    % xrandr | grep ' connected '
It will be HDMI-1 or VGA-1 or something like that, this is your "interface name".

Now add the mode to your monitor specification:

    % xrandr --addmode <interface name> <mode name>
Finally, switch to the new mode:

    % xrandr --output <interface name> --mode <mode name>
This is the modern way of doing it. Manually setting up modelines in the xorg config file is oldschool.
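For convenience, the steps above can be sketched as one small script. This is only a sketch, assuming an X session; HDMI-1 is a placeholder for your actual interface name:

```shell
#!/bin/sh
# Sketch of the cvt/xrandr sequence above; HDMI-1 is a placeholder.
WIDTH=1920; HEIGHT=1080; RATE=60
OUTPUT=HDMI-1   # find yours with: xrandr | grep ' connected '

if [ -n "$DISPLAY" ]; then
    # cvt prints e.g.:  Modeline "1920x1080_60.00"  173.00  1920 ...
    # Strip the keyword and the quotes so xrandr gets plain arguments.
    modeline=$(cvt "$WIDTH" "$HEIGHT" "$RATE" | sed -n 's/^Modeline //p' | tr -d '"')
    modename=${modeline%% *}          # first field, e.g. 1920x1080_60.00

    xrandr --newmode $modeline        # deliberately unquoted: many arguments
    xrandr --addmode "$OUTPUT" "$modename"
    xrandr --output "$OUTPUT" --mode "$modename"
fi
```

Note that the new mode set this way does not persist across reboots on its own.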


That's very good info.

> For a long time the X Window System had a reputation for being difficult to configure.

But apparently some things never actually change. :)


Author of article:

Honestly, it is pretty easy to configure the X Server these days; very little manual intervention has been required since the mid-2000s if you want to accept the defaults, which largely are correct and good. I am mindful about what families of hardware I buy, though, but that's not too restrictive. The only piece of hardware of mine that needs manual configuration is the Logitech TrackMan Marble, but that's only because I operate the mouse with a right-handed layout with my left hand. Interestingly the TrackMan Marble does not work with its full feature set in Wayland (core example: the buttons to enable horizontal/vertical panning of a wide or tall document), and this is not exotic hardware. How configuration is being handled in the X to Wayland conversion is a mystery to me. Some of it is happening in libinput (I think), but other parts aren't. This is one of the reasons I am deferring the Wayland migration for as long as I can.

Configuring the software stack that runs (think: what the ~/.xsession file manages) is where I've invested most of my effort, and that's purely about aesthetics and behavior: DPI, font rendering settings, window manager, etc. And this is pretty easy to do these days, because most of these things can be prototyped and altered in an existing X session (keeping a tight edit-run loop).

And both of these situations can be alleviated by storing critical configuration files (e.g., ~/.xsession or the X Server configuration) under version control. There's no point in having to invest in reconfiguration of the same hardware these days when there's cheap version control and storage.


Oh, for sure. I've been using X in various capacities for ~3 decades.

I remember how it was. I'm impressed with how it is.

I recently switched back to X as a primary desktop after a rather long hiatus of doing [mostly!] other things. There was some initial driver discourse (standard nVidia vs. OSS necessary nonsense), but it wasn't really so bad once I sorted out what I needed and most "regular" Linux users can skip by a lot of this by default.

So far, I've done zero manual configuration of X itself outside of using XFCE4's GUI tools to arrange the three monitors in front of me in the right order -- and I don't presently see any reason to change anything else.

It's been very pleasant, all said, even though I got here on Medium-Hard Mode with a rather barebones base install of Void on an existing ZFS pool for root.

X really was one of the easier parts of the whole operation.

(I have no interest in Wayland. It offers no clear advantage to me as a user that I can identify; even the games I like to play run splendidly in X. I've also always adored the concept of remotely displaying GUI applications. It's convenient -- I ran remote X apps for years immediately prior to this recent switch, and it worked well. Remote X apps have saved my bacon a few times by allowing me to quickly get a thing done in a familiar way instead of learning how to do it using something else entirely and maybe stuffing it up in some unforeseen fashion.)


You want arcane, try doing the same thing in Windows when it can't detect your monitor properly and doesn't like your video card.


Thanks. Now I'll have nightmares of the time I spent trying to help a friend get the 32" TV they won in a contest (back when an LCD of that size was still both unusual and expensive) to work at proper native resolution in Windows.

Windows really wanted it to be 1080p, and the TV supported this input, but it was a blurry mess.

It was advertised as 720p, and the TV supported this input as well, but that was also a blurry mess.

It actually had a physical vertical resolution of something like 760 lines, which was not one of the modes that it offered up over DDC as an option for whatever was driving it to use.

Fun times.

(I did eventually get 1:1 pixel mapping, but IIRC I had to give him a different video card for this to happen.)


What Linux solves through configuration, Windows solves by having everything you buy come with its own model-specific drivers that burn the needed configuration into the .INI + .DLL.

Windows "not liking your video card" is presumably because you either aren't using the right driver, or because Windows doesn't like your driver — i.e. the monitor is old enough that there's no version of that driver for current Windows.


>This is the modern way of doing it.

So by modern you mean the 1980s?

My biggest takeaway reading through all these comments is that Plug-and-Play might as well be heresy...


It works. LCD monitors will get their native resolution.

Of course if you use debian 3 in 2024 the fault might be your own…


PnP will always work for a simple+direct+modern (GPU → DisplayPort or HDMI → display) display path; but there are a lot of people who for whatever reason still need to use VGA.

Despite EDID being invented during the VGA era, it wasn't invented at the beginning of it — so older VGA displays don't support EDID, and therefore don't support reporting their valid modes. (And this is relevant not just for CRTs, but LCDs too — and especially projectors for some reason. Some projectors released as recently as 2010 were VGA-only + non-EDID-reporting!)

Remember Windows saying "you proposed a new monitor resolution; we're gonna try it; if you don't see anything, wait 15 seconds and we'll undo the change"? That's because of expensive mid-VGA-era EDID-less VGA monitors. These were advertised as supporting all sorts of non-base-VGA-spec modes that people wanted to use, but came with nary a driver to tell Windows what those modes were — so Windows was in that era just offering people essentially the same experience as editing xorg.conf to add Modeline entries, just through a UI. And obviously, if even Windows had no proprietary back-channel to figure out what the valid modes were, then Linux didn't have a penguin's chance in hell of deducing them without your manual intervention.

---

But also, people are often trying to "upcycle" old computers into Linux systems (embedded systems like digital-signage appliances being especially popular for this) — and these systems often only come with VGA outputs, and video controllers that don't capture EDID info for the OS even when they do receive it.

Hook an early-2000s "little guy" (https://www.youtube.com/watch?v=AHukN0JsMpo) to any display you like over VGA, no matter how modern — and it still won't know what it's talking to, and will need those modeset lines to be able to send anything other than one of the baseline-VGA-spec modes (usually 800x600@60Hz.)

And this is, of course, still true if you try to use one of these devices with a modern HDMI display using an adapter.

(You might think to get away from this by using a "USB video adapter" and creating an entirely-new PnP-compatible video path through that... but these devices are usually old enough that they only support USB 1.1. But hey, maybe you'll luck out and they have a Firewire port, and you could in theory convince Linux to use a Firewire-to-DVI adapter as an output for a display rather than an input from a camcorder!)

---

Besides the persisting relevance of VGA, there's also:

• https://en.wikipedia.org/wiki/FPD-Link, which you might encounter if you're trying to get Linux running on a laptop from the 1990s, or maybe even a "palmtop" (i.e. the sort of thing that would originally have been running Windows CE);

• and the MIPI https://en.wikipedia.org/wiki/Display_Serial_Interface, which you might see if you're trying to do bring-up for a new Linux on a modern ARM SBC (or hacking on a system that embeds one — a certain popular portable game console, say.)

In both of these cases, no EDID-like info is sent over the wire, because these protocols are for devices where the system integrator ships the display as part of the system; and so said integrator is expected to know exactly the specs of the display they're flex-cable-ing to the board, and to write those into a config file for the (proprietary firmware blob) driver themselves.

If you're rolling your own Linux for these systems, though, then you don't get a proprietary-firmware-blob driver to play with; the driver is generic, and that info has to go somewhere else. xorg.conf to the rescue!
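For an EDID-less display path like these, a minimal xorg.conf fragment might look something like the following. This is a sketch: the identifier name is a placeholder, and the Modeline shown is simply the cvt output for 1280x1024 at 60 Hz; substitute the timings for your actual panel.

```
Section "Monitor"
    Identifier "Monitor0"
    # Output of: cvt 1280 1024 60  (replace with the timings for your panel)
    Modeline "1280x1024_60.00"  109.00  1280 1368 1496 1712  1024 1027 1034 1063 -hsync +vsync
    Option "PreferredMode" "1280x1024_60.00"
EndSection
```

Depending on the driver, you may also need to tie the monitor section to a specific output in the corresponding Device section, e.g. Option "Monitor-VGA-1" "Monitor0".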


> on modern machines you almost never want to be editing the xorg.conf.

No one ever wanted to be editing xorg.conf! (xkcd 963 anyone?)

I did try the "modern" way when I hit this problem (which would have been in early 2022) - but even if it had worked (which it didn't) I don't think it would have persisted beyond a reboot?


I've never had this technique fail on me. I've done it a lot since I work with a variety of crappy KVMs and run into this problem often enough. You do need to make it a startup script, but that's pretty easy to do.

If it didn't work it's possible you have deeper problems, like X falling back to some crappy software only VESA VGA mode because the proper drivers for your card got corrupted. I've not seen this in many many years, but it's possible. The last time it happened it was really obvious because the whole thing was crazy slow, like the mouse cursor was laggy and typing text into the terminal had over a second of delay. It wasn't subtle at all.
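A sketch of such a startup script, assuming a setup that sources ~/.xprofile at login (the output name and timings below are placeholders; use the output of cvt for your own monitor):

```shell
# ~/.xprofile sketch: re-create and apply the custom mode at each login.
# HDMI-1 and the 1920x1080 timings are placeholders; substitute your own.
if [ -n "$DISPLAY" ]; then
    xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync 2>/dev/null
    xrandr --addmode HDMI-1 "1920x1080_60.00" 2>/dev/null
    xrandr --output HDMI-1 --mode "1920x1080_60.00"
fi
```

The 2>/dev/null on the first two commands keeps the script quiet on subsequent logins, when the mode already exists.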


I seem to remember at the time I had trouble finding "current" instructions - I think the syntax changed somewhere along the line? - so there may well have been some crucial step missing.

I'm sure it hadn't fallen back to a VESA mode because I was using compositor features like zooming in on particular windows while screencasting.


> Directly plugged monitors usually just work thanks to EDID info

If you are dealing with consumer grade stuff that is sold a million times sure. I stopped keeping track of how often some special purpose/overpriced piece of display hardware had bad EDID information that made it glitch out.


> If you are dealing with consumer grade stuff that is sold a million times sure.

It's not a sure thing. Out of a bunch of mass-produced monitors sharing the same model number and specs, some may still malfunction and report incorrect EDID data.


KVMs do tend to cause issues, especially when it comes to power management and waking from sleep. However, just two weeks ago I had issues with Debian when connecting directly to a monitor. Booting from the live image with a Nvidia GPU resulted in 1024x768 garbage. Surely the installer will take care of that and the open drivers will be sufficient. Surely.

Nope. I had to reinstall and the option to add the proprietary repository was not as obvious nor as emphasized as it should have been. It almost seemed like an intentional snub at Nvidia. I bailed for other desktop-related issues and ran back home to another distro.

But maybe Debian doesn't want to focus on desktop users and that's fine - they can continue to rule their kingdom of hypervisor cities filled with docker containers. The world needs that too.


> It almost seemed like an intentional snub at Nvidia.

I don't think anybody can come up with better intentional snubs at Nvidia than Nvidia itself.

When it comes to their older graphics hardware, their drivers just refuse to work with newer kernels. The GPU was capable of drawing windows and playing videos for a decade, but then, after a kernel update, it doesn't even show the 1024x768 "garbage". Just a black screen.

So effectively, buying Nvidia to use with Linux amounts to buying hardware with an expiration date.


I'm surprised the reverse-engineering folks that like jailbreaking game consoles and decompiling game ROMs, aren't all over the idea of decompiling old Nvidia drivers to modify + recompile them to speak to modern kernel APIs.


Such folks usually have modern GPUs, so they don't experience such problems.


Once a card is old enough you might have to switch to the Nouveau driver instead, which is probably fine since using a card that old on a modern machine suggests you aren't that interested in games or VR.


There is no other choice but Nouveau. But it's not that fine because it means losing hardware video decoding.

> using a card that old on a modern machine suggests

It's an old laptop. Totally adequate for scrolling the web, watching movies, and arguing about very important stuff on Hacker News. There is no way to change the GPU there or switch to an integrated Intel one.


> which is probably fine since using a card that old on a modern machine suggests you aren't that interested in games or VR

I think a more correct assumption is that you're likely interested in running games from, at most, the era the computer was purchased in. It'd be a shame if your 7-year-old GPU going out of support with a distro upgrade meant that you suddenly became unable to run the 7-year-old games you'd been happily playing up until that point.


Is it really only 7 years? nVidia still lists driver support on their website for the GeForce GTX 600 on Linux, a card that is 12 years old.

https://www.nvidia.com/download/driverResults.aspx/226760/en...


> I've seen that won't offer the correct resolution for a particular monitor (typically locked to 1024x768 on a widescreen monitor).

I've been using linux for over 20 years, Xorg for most of that time, and I've never had any issues with screen resolution.


I'm genuinely pleased to hear that it works for you.

Unfortunately that doesn't make the problem I'm having go away! (On two of the machines I have in mind the issue is with a second monitor - that may well have something to do with it.)


I've been using two monitors on several machines, on several occasions. And I've given plenty of presentations using projectors.



