We're under an article telling you that you can have problems with Wayland and hiDPI screens, and I'm one of those people. I use X11 because Wayland failed for me on many levels: buggy video playback, crashes when changing monitors, or simply when waking my laptop with an external monitor attached. I didn't give it more than a few days to fix these (cheers to the author for trying this long), so I went back to X11, which is still buggy, but buggy at a "you can live with it" level.
Btw, everybody I know, myself included, changes the font size and leaves the DPI scaling at 100% (or maybe 200%) on X11.
You can scale down from a higher resolution to make the UI perceptually the same size. You can do this with xrandr --scale, or for example with the GUI in Cinnamon on Mint after you check "fractional scaling" (under X, mind you).
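One way this is commonly done (a sketch, not necessarily what Cinnamon does internally): let the desktop/toolkits render at 200%, then use xrandr to squeeze the oversized framebuffer back onto the panel for an effective ~150%. The output name DP-1 is just a placeholder; use whatever xrandr lists for your monitor:

    xrandr                                     # list outputs and current modes
    xrandr --output DP-1 --scale 1.333x1.333   # render larger, scale it down onto the panel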
I have a setup with a high-DPI monitor mixed with a normal-DPI monitor, and KDE on Wayland just works fine. The only issues I've found are LibreOffice doing weird over-scaling and Chrome/Chromium resizing its window into oblivion.
Not to mention that fractional scaling is practically required in order to use the majority of higher-DPI monitors on the market today. Manufacturers have settled on 4K at 27" or 32" as the new standard, which lends itself to running at around 150% scale, so to avoid fractional scaling you either need to give up on high DPI or pay at least twice as much for a niche 5K monitor which only does 60 Hz.
Fractional scaling is a really bad solution. The correct way to fix this is to have DPI-aware applications and toolkits. This does in fact work: I have run Xfce under Xorg for years now on hi-DPI screens just by setting a custom DPI and using a hi-DPI-aware theme. When the goal is to have perfect output, why do people suddenly want to jump to stretching images?
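For reference, the custom-DPI part can be as simple as an Xft.dpi entry; whether you set it in ~/.Xresources or in Xfce's Appearance dialog is up to you, and 144 here is just an example for a roughly 1.5x screen:

    ! in ~/.Xresources (144 dpi is ~1.5x the 96 dpi baseline)
    Xft.dpi: 144

    # merged into the running X server at session start, e.g. from ~/.xinitrc
    xrdb -merge ~/.Xresources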
That doesn't gel with my experience: 1080p was the de facto resolution for 24" monitors, but 27" monitors were nearly always 1440p, and switching from 27" 1440p to 27" 4K requires a fractional 150% scale to maintain the same effective area.
To maintain a clean 200% scale you need a 27" 5K panel instead; those do exist but are vastly more expensive than 4K ones and perform worse in aspects other than pixel density, so they're not very popular.
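For the actual numbers (pixels per inch is just sqrt(w^2 + h^2) divided by the diagonal):

    echo 'sqrt(2560^2 + 1440^2) / 27' | bc -l   # ~109 ppi, 27" 1440p
    echo 'sqrt(3840^2 + 2160^2) / 27' | bc -l   # ~163 ppi, 27" 4K  -> 1.5x of 109
    echo 'sqrt(5120^2 + 2880^2) / 27' | bc -l   # ~218 ppi, 27" 5K  -> 2x of 109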
4K monitors aren't a significant expense at this point, and text rendering is a lot nicer at 150% scale. The GPU load can be a concern if you're gaming but most newer games have upscalers which decouple the render resolution from the display resolution anyway.
I used to be like this. I actually ran a 14" FHD laptop with a 24" 4K monitor, both at 100%. Using i3 and not caring about most interface chrome was great; it was enough for me to zoom the text on the 4K one. But then we got 27" 5K screens at work, and that made me move to Wayland, since 100% on that was ridiculously small.
Because although I don't care much about the chrome, I sometimes have to use it. For example, the address bar in Firefox is ridiculously small. Also, some apps, like Firefox (again), have a weird adaptation of the scrolling to the zoom level. So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
Also, 200% on an FHD 14" laptop means a 960x540 px equivalent. That's too big, to the point of rendering the laptop unusable. Also, X11 doesn't support switching DPI on the fly AFAIK, and I don't want to restart my session whenever I plug or unplug the external monitor, which happens multiple times a day when I'm at the office.
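(The closest thing I know of on X11 is re-merging a new Xft.dpi value into the running resource database, but most toolkits only read it when an application starts, so already-open windows don't follow:

    echo 'Xft.dpi: 192' | xrdb -merge   # only newly launched apps pick this up

which is exactly why it doesn't help with plugging and unplugging a monitor mid-session.)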
This really isn't that far off. If we imagined the screens overlaid semi-transparently, a 16-pixel letter would sit over a 14-pixel one.
If one imagines an ideal font size for a given user's preferred physical letterform height, one could imagine an idealized size of 12 on one screen and 14 on the other, set it to 13, and be extremely close to ideal on both.
>So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
This is because it's scrolling a fixed number of lines, which occupy more space at 300% zoom. Notably, this applies pretty much only to people running high-DPI screens at 100%, because if one zoomed to 300% otherwise, the letter T would be the size of the last joint on your thumb and legally blind folks could read it. It doesn't apply to setting the scale factor to 200%, nor to the setting for Firefox's internal scale factor, which is independent from the desktop, supports fractional scaling in 0.05 steps, and can be configured in about:config.
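For the curious, the about:config pref I mean is (if I remember right) layout.css.devPixelsPerPx: -1.0 means follow the desktop scale, and any positive value overrides it, e.g.

    layout.css.devPixelsPerPx = 1.25   # set in about:config, not in a config file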
Right, and 27" 5K is 218 ppi, which isn't that much more than the 24". But don't forget that viewing distance plays a big role in this, and my 14" laptop is much closer than a 27" monitor. Bonus points for our specific model having an absolutely ridiculous viewing angle, so if it's too close the outer borders are noticeably dark.
I don't really care about this but here's an example:
I have 2 27" screens, usually connected to a windows box, but while working they're connected to a MBP.
Before the MBP they were connected to several ThinkPads, and I don't remember what screen size or scaling those had; I don't even remember whether I used X11 or Wayland. But the next ThinkPad that gets connected will probably be HiDPI and on Wayland. What will happen without buying a new monitor? No one knows.
It does work and has worked for over a decade. You can configure scaling under settings in Cinnamon or Plasma, for instance, or via environment variables in a simple environment like i3wm.
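The environment-variable route is basically the Arch Wiki recipe; a sketch for a 2x screen (put it in ~/.xprofile or wherever your session sources its environment, and adjust the values to your panel):

    export GDK_SCALE=2                      # integer scale for GTK widgets
    export GDK_DPI_SCALE=0.5                # compensate so GTK text isn't scaled twice
    export QT_AUTO_SCREEN_SCALE_FACTOR=0    # don't let Qt guess
    export QT_SCALE_FACTOR=2                # scale Qt apps explicitly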
The post is from the dev of i3wm, an X11 window manager, noting among other things how well his 8K monitor works under X11 and complaining how poorly it works under Wayland.
You can also consult the Arch Wiki article on HiDPI, which is broadly applicable beyond Arch.
In that time I've had HiDPI work perfectly, first on Nvidia and more recently on AMD GPUs, across several different distros and desktops, all running on X. They all worked out of the box and scaled correctly once configured.
The totality of my education on the topic was reading the Arch Wiki page on HiDPI once.
AFAIK one cannot span one X session across multiple GPUs, although AMD had something it once referred to as "Eyefinity" for achieving this.
It is rarely needed anyway; discrete GPUs often support 3 or even 4 outputs.
One may wonder if you tried this a very long time ago, back when AMD sucked and Nvidia worked well (2005-2015).