DisplayPort and 4K

The Problem

Video playback looks better with a higher scan rate. A lot of content that was designed for TV (e.g. almost all historical documentaries) is going to be 25Hz interlaced (UK and Australia) or 30Hz interlaced (US). If you view that on a low refresh rate progressive scan display (e.g. a modern display at 30Hz) then my observation is that it looks a bit strange: things that move seem to jump a bit and it’s distracting.

Getting HDMI to work with 4K resolution at a refresh rate higher than 30Hz seems difficult.

What HDMI Can Do

According to the HDMI Wikipedia page [1], HDMI 1.3 (introduced in June 2006) through 1.4b (introduced in October 2011) supports a 30Hz refresh rate at 4K resolution, and if you use 4:2:0 chroma subsampling (see the Chroma Subsampling Wikipedia page [2]) you can do 60Hz or 75Hz on HDMI 1.3 through 1.4b. Basically for colour 4:2:0 means half the horizontal and half the vertical resolution while giving the same resolution for monochrome. For video that apparently works well (4:2:0 is standard for Blu-ray) and for games it might be OK, but for text (my primary use of computers) it would suck.
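
To make the 4:2:0 trade-off concrete, here is a little Python sketch (my own arithmetic, not from the HDMI spec) of how subsampling the two chroma channels over 2x2 pixel blocks halves the average bits per pixel at 8 bits per sample:

    # In a 2x2 pixel block there are always 4 luma (Y) samples;
    # the number of chroma (Cb+Cr) samples depends on the scheme.
    def bits_per_pixel(scheme, bits_per_sample=8):
        chroma_samples = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}[scheme]
        return (4 + chroma_samples) * bits_per_sample / 4

    for scheme in ("4:4:4", "4:2:2", "4:2:0"):
        print(f"{scheme}: {bits_per_pixel(scheme):.0f} bits/pixel")
    # prints 24, 16, and 12 -- 4:2:0 is half the data of full 4:4:4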

So I need support for HDMI 2.0 (introduced in September 2013) on the video card and monitor to do 4K at 60Hz. Apparently none of the combinations of video card and HDMI cable I use for Linux support that.

HDMI Cables

The Wikipedia page alleges that you need either a “Premium High Speed HDMI Cable” or an “Ultra High Speed HDMI Cable” for 4K resolution at a 60Hz refresh rate. My problems probably aren’t related to the cable, as my testing has shown that a cheap “High Speed HDMI Cable” can work at 60Hz with 4K resolution with the right combination of video card, monitor, and drivers. A Windows 10 system I maintain has a Samsung 4K monitor and an NVidia GT630 video card running 4K resolution at 60Hz (according to Windows). The NVidia GT630 is a card I tried on two Linux systems at 4K resolution, and it causes random system crashes on both; it seems like a nice card for Windows but not for Linux.

Apparently the HDMI devices test the cable quality and use whatever speed seems to work (the cable isn’t identified to the devices). The prices at a local store are $3.98 for “high speed”, $19.88 for “premium high speed”, and $39.78 for “ultra high speed”. It seems that trying a “high speed” cable first before buying an expensive cable would make sense, especially for short cables which are likely to be less susceptible to noise.

What DisplayPort Can Do

According to the DisplayPort Wikipedia page [3], versions 1.2–1.2a (introduced in January 2010) support HBR2, which on a “Standard DisplayPort Cable” (which probably means almost all DisplayPort cables in use nowadays) allows 4K resolution at 60Hz and 75Hz.

Comparing HDMI and DisplayPort

In summary, to get 4K at 60Hz you need 2010-era DisplayPort or 2013-era HDMI. Apparently some video cards that I currently run for 4K (which were all bought new within the last 2 years) are somewhere between a 2010 and 2013 level of technology.
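
A back-of-the-envelope calculation (my own, with a rough 20% allowance for blanking intervals; the exact figures depend on the timing standard used) shows why DisplayPort 1.2 clears the bar for 4K 60Hz 8-bit RGB while HDMI 1.4 doesn’t:

    # Why DP 1.2 (HBR2) can carry 4K 60Hz 8-bit RGB but HDMI 1.4 can't.
    width, height, refresh, bpp = 3840, 2160, 60, 24  # 8 bits x RGB

    # Add roughly 20% for blanking intervals (reduced-blanking timings).
    needed = width * height * refresh * bpp * 1.2 / 1e9  # Gbit/s

    # Raw link rates, times 8/10 for 8b/10b encoding overhead.
    links = {"HDMI 1.4": 10.2 * 8 / 10, "DP 1.2 HBR2": 21.6 * 8 / 10}

    print(f"4K60 needs about {needed:.1f} Gbit/s of payload")
    for name, payload in links.items():
        print(f"{name}: {payload:.2f} Gbit/s ->",
              "OK" if payload >= needed else "too slow")
    # HDMI 1.4's ~8.2 Gbit/s falls short; HBR2's ~17.3 Gbit/s is enough.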

Also my testing (and reading review sites) shows that it’s common for video cards sold in the last 5 years or so to not support HDMI resolutions above FullHD, which means they would be HDMI version 1.1 at best. HDMI 1.2 was introduced in August 2005 and supports 1440p at 30Hz. PCIe was introduced in 2003, so there really shouldn’t be many PCIe video cards that don’t support HDMI 1.2. Yet I have about 8 different PCIe video cards in my spare parts pile that don’t support HDMI resolutions higher than FullHD, so it seems that such a limitation is common.
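
If you want to check what a particular card and monitor combination will actually offer under Linux, the kernel’s DRM subsystem exposes the mode list for each connector in sysfs. The following is a sketch of mine (not from the post); connector names like card0-HDMI-A-1 vary between systems:

    # List each display connector, whether anything is plugged in,
    # and the first few video modes the kernel reports for it.
    from pathlib import Path

    for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
        status = (conn / "status").read_text().strip()
        modes = (conn / "modes").read_text().split()
        print(f"{conn.name}: {status}, modes: {modes[:5]}")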

The End Result

For my own workstation I plugged a DisplayPort cable between the monitor and video card, and a Linux dialog appeared (from KDE I think) offering me some choices about what to do. I chose to switch to the “new monitor” on DisplayPort, and that defaulted to 60Hz. After that change TV shows on NetFlix and Amazon Prime both look better, so it’s a good result.
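
For anyone wanting to verify which mode actually got selected, one way under X11 is to parse the output of xrandr (the parsing below is my own sketch; xrandr’s output format can vary slightly between versions):

    # Print the active mode(s) and refresh rate(s) reported by xrandr.
    import re
    import subprocess

    out = subprocess.run(["xrandr", "--query"],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        # Active modes are marked with '*', e.g. "3840x2160 59.98*+"
        m = re.search(r"(\d+x\d+)\s.*?([\d.]+)\*", line)
        if m:
            print(f"active mode: {m.group(1)} @ {m.group(2)} Hz")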

As an aside, DisplayPort cables are easier to scrounge, as the HDMI cables get taken by non-computer people for use with their TVs.

10 comments to DisplayPort and 4K

  • Btw, this hopefully shouldn’t be relevant to monitors, but on my TV I had to go through special effort to enable 60 Hz UHD: https://fransdejonge.com/2020/01/enable-60-hz-and-10-bit-hdr-on-2018-sony-uhd-4k-tv/

  • Ram

    Maybe a little offtopic, but what is the difference between a Smart TV and a computer monitor? I mean both support HDMI, while Smart TVs nowadays are cheaper than computer monitors. Is it possible to connect a Smart TV to a computer mainboard or graphics card?

    By the by, I’m using a 2011-era Intel Sandybridge based mainboard (using VGA) with a 3.1GHz dual-core Core i3 (with HT), which plays 4K videos well at 30Hz (after downloading them from YouTube). 60FPS is not a problem for me.

  • Anonymous

    Unfortunately, neither Amazon Prime nor Netflix will actually give 4K video to a Linux device.

  • @Ram My TV is a giant monitor for my PC, Wii, and DVD/Blu-Ray player. The PC is the only thing that provides “4k HDR” (10-bit UHD), for example in Devil May Cry 5. I don’t care about its TV features.

    But I don’t think you can get a UHD TV smaller than 40″ or so. If that’s what you’re looking for, great, perhaps it’s a better deal than a monitor. (Keeping in mind features like speed & color accuracy are often worse on TVs, you won’t get FreeSync, etc.) Personally I think UHD is a must, but only because of the higher pixel density. Go beyond 28″ or so and you pretty much lose that.

    Anyway, just check the HDMI version on your PC. If it has HDMI 1.4 it should be capable of UHD @ 30 Hz. I’d say it’s more terrible than you might think but at least it’d be usable if or until you decide to grab a cheap used post-2014 GPU.

  • Frans: Annoying that it doesn’t just work by default.

    https://mjg59.dreamwidth.org/8705.html

    Ram: A Smart TV has an Android system inside it and can play YouTube, Netflix, etc on its own without a PC. The above post by Matthew Garrett explains some of the horror of HDTV. I’ve had enough problems getting monitors to work as monitors without wanting the extra pain of something that’s not designed to be a monitor. I’ve been using a PC to watch TV for 20 years now and it’s always worked well for me.

    https://www.reddit.com/r/linux/comments/73msw9/4k_netflix_on_linux_still_not_possible/

    https://www.digitaltrends.com/home-theater/getting-hd-netflix/

    Anon: As I’m using the free subscription to Amazon Prime and one of the cheaper subscriptions to NetFlix this hasn’t been an issue for me yet. According to the above Reddit post you can’t do 4K NetFlix on Linux due to DRM. According to the above DigitalTrends article the highest resolution on Chrome and Firefox is 720p and you need Edge to get 4K on a web browser.

    For the case of historical documentaries (something I watch a lot) this isn’t an issue, as most of them are based on TV footage from decades ago that isn’t even 720p.

    I had considered setting up a second monitor just for watching TV (maybe a large FullHD or even 720p display) and using the 4K monitor for coding etc. Generally 720p video scaled up to 4K looks reasonable.

  • Ram

    @Frans and @etbe,

    Thanks Russell sir for Mr. Matthew’s post. Anyway, that is a little old; I don’t think the situation is the same even now. The problem is now more with DRM & content protection things, especially for FLOSS users.

    Problems solved in the HD decades (probably 2.5 decades of fight):
    1. change of ratio from 4:3/5:4/2.x:1/16:10/whatever to 16:9.
    2. change of resolution from 480i/576i/720i/whatever to 1920×1080. From “HD ready” to “Full HD”.
    3. change of scanning from Interlaced (i) to Progressive (p).
    4. change of frame rate from 25FPS/whatever to 30FPS.
    5. change in color space from whatever to Rec. 709.
    6. change of audio from mono/2channels to 5.1channels.
    7. change of signal from analog to digital.
    8. change of digital formats from MPEG-2/DivX to MPEG4 (H.264+AAC).
    9. change of connecting interface from Composite/Component/S-Video/VGA/DVI/whatever to HDMI/DP.
    10. change of display unit technology from CRT/Plasma to LCD/LED.
    11. change of LCD display from CFL backlit to LED backlit; from TN panel to IPS/VA.
    12. change of display unit size from 14inches/whatever to 22inches (computer) or 32inches (HDTV).

    At present if you are thinking about a UHDTV (Android SmartTV only nowadays), you have very limited specifications in mind:
    signal (digital), ratio (16:9), resolution (4K or above), scanning (p), colour (Rec. 709, HDR 10 or better), frame rate (30FPS or above), connecting interface (HDMI/DP/USB/Wi-Fi), in-built speaker, digital format (MPEG4/WebM), connectable with set-top-box (CableTV/IPTV/DTH).

    My requirements:
    1. exploring Google Earth, KDE Marble, Stellarium, KStars etc
    2. exploring physics simulations e.g., playing Xplane.

    So, a large (above 50″) display is essential for me. A large display is also good for multi-window operations, although multiple displays can be used for that as long as you have interfaces left to connect them.
    My system is almost a decade old, so I will obviously go for a complete system upgrade, probably with Intel Cascade Lake or AMD Zen 2 and an AMD Navi (RDNA) GPU.

    So, for the display should I go for a SmartTV or wait for large computer monitors to arrive? Remember, I’m a PC guy and hate the ancient TV ecosystem.

  • You don’t state which cards you’re using. The GT630 is an old card released in 2012 – how could it support HDMI 2.0, which was released in 2013?

    I agree that 4K seems easier to achieve with DisplayPort, at least on Linux; but I was able to use 4K HDMI on a modern Linux with recent AMD cards (RX560 / RX570). I remember I had to tweak an apparently unrelated color setting regarding 10-bit colors or HDR.

  • dlehman

    @Ram Response time and input lag are two big factors that will determine if the TV will provide a reasonable experience if used as a monitor.

    https://www.howtogeek.com/411413/why-cant-you-use-a-tv-as-a-monitor/

  • Ram

    @dlehman, how much response time is good? Presently average SmartTVs are advertised as having 5ms to 2ms (I don’t know the reality).

    Also, manufacturers will not be so dumb as to forget about Google Play Games, Google Stadia, Microsoft xCloud, nVidia GeForce Now etc.

    Probably somebody having a good PC setup with GNU Linux or Windows and large (40″ or above) SmartTV bought within last year may be in a good position to judge the situation :)

    @etbe, to watch older (old is gold) TV shows/movies I think my age-old PC setup is more than fine, as I have a Full HD (1080p) 22″ Dell computer monitor (with VGA & HDMI ports) that is well supported by all the distros I have tested (a huge number) so far. As far as I know there is no broadcasting standard in use that supports 4K or above (maybe except ATSC v3.0 in the USA), and Netflix/Amazon are not bringing back the old content (not even the English content). And the Internet is not yet affordable for all; outside the cities, you can forget about streaming even at 480p.

  • Ram: Yes, Matthew’s post is old; it would be nice if someone were to investigate the current situation with 4K TVs. It’s not really my thing so I’m not even sure where to start with that. Matthew’s post referenced HDMI, so HDMI is not on its own the solution to this problem.

    50″ monitors are extremely expensive, while smart TVs in that size are very affordable.

    Alan: Maybe there were multiple revisions of the GT630 released and some used newer versions of HDMI. Either that or the card in question is lying about its refresh rate – which is possible.

    dlehman: Thanks for that link.

    Ram: Yes, an old PC with a 720p display would work well with NetFlix. But it wouldn’t do what I want for programming etc when I’m not watching TV.