4K Monitors

A couple of years ago a relative who uses a Linux workstation I support bought a 4K (4096*2160 resolution) monitor. That meant that I had to get 4K working, which was a couple of years of pain for me and probably not enough benefit for them to justify it. Recently I had the opportunity to buy some 4K monitors at a price low enough that it didn’t make sense to refuse, so I got to experience it myself.

The Need for 4K

I’m getting older and my vision is declining, as expected. I recently got new glasses, including a pair of reading glasses, as a reduced ability to change focus is common as you get older. Unfortunately I made a mistake when requesting the focus distance for the reading glasses: they work well for phones, tablets, and books but not for laptops and desktop computers. Now I have the option of either spending a moderate amount of money on another pair of reading glasses or just accepting that laptop/desktop use isn’t going to be as good until the next time I need new glasses (sometime in 2021).

I like having lots of terminal windows on my desktop. For common tasks I might need a few terminals open at a time, and if I get interrupted in a task I like to leave its terminal windows open so I can easily go back to it. Having more 80*25 terminal windows on screen increases my productivity. My previous monitor was 2560*1440, which for years had allowed me to have a 4*4 array of non-overlapping terminal windows as well as another 8 or 9 overlapping ones if I needed more. 16 terminals allows me to ssh to lots of systems and edit lots of files in vi. Earlier this year I found it difficult to read the font size that previously worked well for me, so I had to use a larger font, which meant that only a 3*3 array of terminals would fit on my screen. Going from 16 non-overlapping windows and an optional 8 overlapping to 9 non-overlapping and an optional 6 overlapping is a significant difference. I could get a second monitor, and I won’t rule out doing so at some future time. But it’s not ideal.
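The window counts above are just integer division; here is a quick sketch (the character-cell sizes are illustrative guesses, not measurements of my actual fonts):

```python
# How many non-overlapping terminal windows tile a screen?
def terminals_fit(screen_w, screen_h, char_w, char_h, cols=80, rows=25):
    """Count of cols x rows terminals fitting side by side on the screen."""
    return (screen_w // (cols * char_w)) * (screen_h // (rows * char_h))

# 2560*1440 with a small 8x14 pixel character cell gives a 4*4 grid:
print(terminals_fit(2560, 1440, 8, 14))   # 16
# The same screen with a larger 10x18 cell only fits 3*3:
print(terminals_fit(2560, 1440, 10, 18))  # 9
```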

When I got a 4K monitor working properly I found that I could go back to a smaller font that allowed 16 non-overlapping windows. So I got a real benefit from the 4K monitor!

Video Hardware

Version 1.0 of HDMI, released in 2002, only supports 1920*1080 (FullHD) resolution. Version 1.3, released in 2006, supports 2560*1440. Most of the PCIe video cards in my collection have a maximum HDMI resolution of 1920*1080, so it seems that they only support HDMI 1.2 or earlier. While investigating this I wondered what version of PCIe they were using; the command “dmidecode |grep PCI” gives that information. It seems that at least one PCIe video card supports PCIe 2.0 (released in 2007) but not HDMI 1.3 (released in 2006).
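A sketch of that check (dmidecode reads the SMBIOS tables and needs root; note that it reports the motherboard’s slots rather than the capabilities of the card plugged into them):

```shell
# List PCI/PCIe slot details from the SMBIOS tables (run as root)
dmidecode -t slot 2>/dev/null | grep -i 'PCI' || echo 'dmidecode needs root'
```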

Many video cards in my collection support 2560*1440 over DVI but only 1920*1080 over HDMI. As 4K monitors don’t accept DVI input, when I first used a 4K monitor I was running at 1920*1080, which is lower than the 2560*1440 I had with my old monitor.

I found that one of my old video cards supports 4K resolution. It has an NVidia GT630 chipset (here’s the page with specifications for that chipset [1]); it seems that because my card has 2G of RAM it is the “Kepler” variant, which supports 4K resolution. I got the video card in question because it uses PCIe x8 and I had a workstation that only had PCIe x8 slots and I didn’t feel like cutting a card down to size (which is apparently possible but not recommended). It is also fanless (quiet), which is handy if you don’t need a lot of GPU power.

A couple of months ago I checked the cheap video cards at my favourite computer store (MSY) and none of the cheap ones supported 4K resolution. Now it seems that all the video cards they sell could support 4K; by “could” I mean that a Google search of the chipset says it’s possible, but of course some of the surrounding chips could fail to support it.

The GT630 card is great for text, but the combination of it and an i5-2500 CPU (rating 6353 according to [3]) can’t play Netflix full-screen, and playing 1920*1080 videos scaled to full-screen sometimes gives mplayer messages about the CPU being too slow. I don’t know how much of this is due to the CPU and how much is due to the graphics hardware.

When trying the same system with an ATI Radeon R7 260X/360 graphics card (PCIe x16, and it draws enough power to need a separate connection to the PSU) the Netflix playback appears better but mplayer seems no better.

I guess I need a new PC to play 1920*1080 video scaled to full-screen on a 4K monitor. No idea what hardware will be needed to play actual 4K video. Comments offering suggestions in this regard will be appreciated.

Software Configuration

For GNOME apps (which you will probably run even if, like me, you use KDE for your desktop) you need to run commands like the following to scale menus etc:

gsettings set org.gnome.settings-daemon.plugins.xsettings overrides "[{'Gdk/WindowScalingFactor', <2>}]"
gsettings set org.gnome.desktop.interface scaling-factor 2
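The corresponding get commands can be used to check that the settings took effect (guarded so it is a no-op on systems without gsettings):

```shell
# Read back the scaling values set above
if command -v gsettings >/dev/null; then
    gsettings get org.gnome.settings-daemon.plugins.xsettings overrides
    gsettings get org.gnome.desktop.interface scaling-factor
fi
```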

For KDE run the System Settings app, go to Display and Monitor, then go to Displays and Scale Display to scale things.

The Arch Linux Wiki page on HiDPI [2] is good for information on how to make apps work with high DPI (or regular screens for people with poor vision).
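One concrete trick from that page, for X11 apps that ignore toolkit scaling, is to raise the X resource DPI; 192 is double the default of 96, so adjust to taste (a sketch that appends to your ~/.Xresources):

```shell
# Double the X font DPI (96 is the default) and reload the resource database
echo 'Xft.dpi: 192' >> ~/.Xresources
xrdb -merge ~/.Xresources || true   # needs a running X session
```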


Conclusion

4K displays are still rather painful to set up, in both hardware and software. For serious computer use it’s worth the hassle, but it doesn’t seem to be good for general use yet. 2560*1440 is pretty good, works with much more hardware, and requires hardly any software configuration.

9 comments to 4K Monitors

  • thanks for the info. The introduction to gsettings was useful. I don’t yet have a 4K monitor but recently was checking the resolution of the monitors that I have and discovered some differences.

  • I picked up an R9 270X back in ’14 for UHD support (@ 60 Hz over DisplayPort). Is there a particular reason to focus on HDMI?

    For CPU power, did you check which CPU frequency governor is enabled?

  • I use a 4K display on my Debian machine with a 1050TI Nvidia card and everything just worked. The 4K display is 43″ and I also have two 21″ 1080p displays attached, so the pixel density is just about the same. The 43″ display is a bit further from me than the 21″ displays used to be, but it’s great having about 10 megapixels available for terminals, web browsers, videos etc. Not sure I can go back. Perhaps I’ll switch the two 1080p panels for another 4K, then I’ll just have two big 4K screens.

  • Paul

    In the recent past, I spent some time thinking about the “I’m getting older and my vision is decreasing as expected.” topic, both professionally for my colleagues – and myself.
    And my conclusion for now is: it’s the screen SIZE in relation to the distance that matters. A bigger screen is the next step after a font-size increase (“Earlier this year I had found it difficult to read the font size that previously worked well for me so I had to use a larger font …”). You write about 4K displays but skip the display sizes. Would there be a point in having the same e.g. 27” visible screen diagonal with 2560×1440 or 3840×2160, apart from being able to disable hinting of fonts and putting it closer to your eyes? [1]

    Unfortunately, I can’t help much with “No idea what hardware will be needed to play actual 4K video.” since I’m lacking 4K source material, but I suspect the solution will have to be “support in hardware” in some way – since even a Raspberry Pi 2 can display FHD. A keyword for AMD systems could be “unified video decoder”. Also, it seems ffmpeg is able to use GPU hardware. [2,3,4]

    This reply was written at a 24”-FHD-Display (92dpi) in a distance of ~23 inch – and I wasn’t able to see individual pixels ;-)
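Paul’s 92dpi figure is easy to verify: pixel density is just the pixel diagonal divided by the physical diagonal. A quick check, with panel sizes taken from the comments above:

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    """Pixels per inch of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(1920, 1080, 24)))  # 92  - Paul's 24" FHD display
print(round(dpi(3840, 2160, 43)))  # 102 - the 43" 4K display mentioned above
print(round(dpi(1920, 1080, 21)))  # 105 - the 21" 1080p displays it sits beside
```

The 43″ 4K and 21″ 1080p figures come out almost identical, which matches the earlier comment about the pixel density being about the same.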



  • Frans: It seems that the input options for 4K monitors are HDMI and DisplayPort. The older/cheaper video cards have VGA, DVI, and HDMI output. Therefore the combination of using a cheaper video card with a 4K monitor means HDMI is the only option. Thanks for that Arch Linux Wiki reference, I’ll check that out and blog about how it goes.

    highvoltage: Sounds like a nice setup. A 1050TI Nvidia card costs $200 at MSY so it’s an option for me. I will try 2*4K monitors if I get one of those.

    Paul, the 4K monitor I’m using now is about 2cm larger diagonally than the 1440p monitor I was using before and it’s in about the same position on my desk. So resolution is providing a real benefit. Good point about the possibility of moving it closer to fully take advantage of the higher resolution, I’ll have to experiment with that. Thanks for all the links about hardware decoding.

    I’d rather not spend too much on video hardware at the moment. Apparently cards that can decode VP9 will be out soon and that will offer significant benefits, but I haven’t researched this much yet.

  • The thing about older cards + HDMI is that you’re presumably only getting 30Hz. Older (~2014) + DisplayPort means 60Hz. It’s a pretty significant difference in usability imho.

    Coincidentally I’m just looking into selling off my old R9 270X and my wife’s old R9 280X. The going rate seems to be about €50 for the former and €75-100 for the latter. Not necessarily worthwhile compared to a brand new 1050 TI with guarantee, depending on your needs, but on the flip side it’s a third of the price for the same performance, or half the price for quite a bit more performance.

  • mpv --hwdec=auto _-XQ2zCdxw0I.mp4
    I ran the above command to play a YouTube tutorial on React that I had previously downloaded and had problems displaying; it’s FullHD 60Hz. mpv says “Using hardware decoding (vdpau)”, so it looks like my hardware does what’s required by default. But then it says “Audio/Video desynchronisation detected! Possible reasons include…”.

    I used the command “youtube-dl -f 137+251” to get a 30Hz version of the video, and it seems to have fewer problems: it plays a few seconds of video before giving the warning, while the 60Hz video gives the warning instantly. So it seems that 30Hz vastly alleviates the problems.

    When I run time on mpv and quit playback after it says 5 seconds I get real/elapsed of 12.3/0.75s on the 60Hz file and 7.4/0.62s on the 30Hz file.

    Without --hwdec=auto I get 6.4/2.1s for the 60Hz file and 6.3/1.3s for the 30Hz file. I guess that the difference between real time and the play time displayed is the synchronisation problem, which means that 30Hz with no hardware decoding works best on my system.

    I just followed the instructions on this page to set my CPU governor to “performance” (instead of “powersave”). It improved the numbers slightly: enough to show that it changed something, but not enough to make a difference I’ll ever notice when watching video.
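For reference, the governor change can be done directly from a shell (the sysfs path is the standard Linux cpufreq interface; cpupower is in the linux-cpupower package on Debian):

```shell
# Show the current frequency governor for each core
cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor 2>/dev/null
# Switch to "performance" (needs root; set "powersave" again to revert)
sudo cpupower frequency-set -g performance || true   # needs root
```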

  • After setting the CPU governor to performance I tested all the mplayer options for output. I found -vo xv and -vo x11 about equal in performance and significantly better than the default, which turned out to be “vdpau” (i.e. hardware decoding).

    Thanks for the advice. It turns out that hardware decoding was the cause of my problems, but pointing me to look in this direction allowed me to solve them.