Dynamic refresh rate technologies like AMD’s FreeSync and NVIDIA’s G-Sync have become de facto standards for gaming PCs and displays. Last year the HDMI Forum introduced a more industry-standard approach to variable refresh rate as a part of the HDMI 2.1 package, and recently makers of consumer electronics started to add VRR support to their products. At Computex the HDMI Forum demonstrated VRR operation using a Samsung QLED TV and a Microsoft Xbox One X, but the demonstration was somewhat inconclusive.

Select Samsung QLED TVs launching this year are set to support a 120 Hz maximum refresh rate, HDMI 2.1’s VRR, and AMD’s FreeSync technology, the company announced earlier this year. The two technologies do essentially the same thing, but they are not the same method – AMD’s FreeSync-over-HDMI is proprietary – and as such they are branded differently. From a technological point of view, both methods require hardware and firmware support on both the source (i.e., an appropriate display controller) and the sink (i.e., the display scaler). As it appears, Samsung decided to add support for both methods.
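To illustrate the distinction, here is a minimal sketch (in Python) of how a source might pick a refresh method after reading the sink’s capabilities; real implementations parse EDID/CTA extension data blocks, and the flag names and helper below are hypothetical:

    def pick_vrr_method(sink_caps: dict) -> str:
        # Prefer the industry-standard HDMI 2.1 VRR if the sink advertises it,
        # fall back to AMD's proprietary FreeSync-over-HDMI, else fixed refresh.
        if sink_caps.get("hdmi21_vrr"):
            return "HDMI 2.1 VRR"
        if sink_caps.get("freesync_over_hdmi"):
            return "FreeSync-over-HDMI"
        return "fixed refresh"

    # A TV that, like the Samsung sets described here, advertises both methods:
    samsung_qled = {"hdmi21_vrr": True, "freesync_over_hdmi": True}
    print(pick_vrr_method(samsung_qled))  # -> HDMI 2.1 VRR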

As an added wrinkle, AMD sees VRR and FreeSync as two equal technologies, which is why it intends to keep relying on its own brand even as it adds support for both to its products over time. An example of such universal support is Microsoft’s Xbox One X console, which, according to a Microsoft rep at the HDMI Forum booth at Computex, supports both technologies. Meanwhile, during its own press event at Computex, AMD demonstrated a Radeon RX Vega 56-based system with FreeSync working on a 1080p QLED TV from Samsung. Unless said GPU already supports HDMI 2.1’s VRR (something AMD would logically have announced), this more likely proves that Samsung supports both VRR and FreeSync on select TVs. Meanwhile, it does not appear that Samsung’s TVs support LFC (low framerate compensation), at least not right now.
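For context, LFC keeps variable refresh usable when a game’s frame rate falls below the panel’s minimum refresh rate by repeating frames. A minimal Python sketch, with an illustrative (not Samsung-specific) VRR window:

    def lfc_multiplier(frame_rate: float, vrr_min: float, vrr_max: float) -> int:
        # Smallest frame-repeat factor that lands inside [vrr_min, vrr_max].
        multiplier = 1
        while frame_rate * multiplier < vrr_min:
            multiplier += 1
        if frame_rate * multiplier > vrr_max:
            raise ValueError("frame rate cannot be mapped into the VRR window")
        return multiplier

    # A 25 fps game on a hypothetical 40-120 Hz panel: each frame is shown
    # twice, so the panel refreshes at an in-range 50 Hz.
    print(lfc_multiplier(25, vrr_min=40, vrr_max=120))  # -> 2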

The somewhat convoluted demonstration of HDMI 2.1’s VRR capabilities reveals the complexities of the HDMI 2.1 technology package in general, and the difficulties with HDMI 2.1 branding in particular.

As reported a year ago, the key feature that the HDMI 2.1 specification brings is 48 Gbps of bandwidth, set to enable the longer-term evolution of displays and TVs. To support that bandwidth, new 48G cables will be required. The increased bandwidth of HDMI 2.1’s 48G cables will enable new UHD resolutions and refresh rates (some of which will require compression), including 4Kp120, 8Kp100/120, and 10Kp100/120. In addition, the increased bandwidth will enable support for the latest and upcoming color spaces, such as BT.2020 (Rec. 2020) with 10, 12, or even 16 bits per color component.
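As a back-of-the-envelope check of those resolution figures (active-pixel data rate only; real links add blanking intervals and encoding overhead, and 10K is assumed here to mean 10240×4320), a short Python calculation shows why some modes require compression even on a 48G link:

    def raw_gbps(width, height, fps, bits_per_component, components=3):
        return width * height * fps * bits_per_component * components / 1e9

    for name, fmt in {
        "4Kp120, 10-bit":  (3840, 2160, 120, 10),
        "8Kp120, 10-bit":  (7680, 4320, 120, 10),
        "10Kp120, 10-bit": (10240, 4320, 120, 10),
    }.items():
        rate = raw_gbps(*fmt)
        note = "fits uncompressed" if rate < 48 else "needs compression (DSC)"
        print(f"{name}: ~{rate:.1f} Gbps -> {note}")
    # 4Kp120 at ~29.9 Gbps fits; 8K and 10K at 120 Hz exceed 48 Gbps.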

Finally, HDMI 2.1 supports a number of capabilities not available previously, including QMS (quick media switching), eARC (enhanced audio return channel), QFT (quick frame transport), and ALLM (auto low latency mode). The list of improvements the HDMI 2.1 spec brings is significant; furthermore, some of the new features require the new cable while others do not. The HDMI Forum therefore made no secret from the start that some of the new features might be supported on a given device whereas others might not be. Meanwhile, the HDMI 2.1 branding will be used for all of them, with an appropriate disclosure of which capabilities are supported.

There is a reason why the HDMI Forum wants to use the HDMI 2.1 brand for hardware that supports only one or two new features from the package, even if this causes some confusion. While the key features of HDMI 2.1 are its higher cable bandwidth and the resulting support for 8K resolutions, the Forum realizes that only a couple of countries are currently experimenting with 8K UHD TV broadcasting, so there is not much need yet for high-bandwidth/8K support in TVs sold in Europe or the U.S. Meanwhile, features like VRR and ALLM make sense for gamers today, but since they have to be supported by both sinks and sources, proper labeling is required so that people who want them know to get the right hardware.
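That endpoint-pairing requirement is simple to express; here is a minimal Python sketch (the device feature lists are illustrative, not official disclosure sheets):

    HDMI21_FEATURES = {"48G", "VRR", "ALLM", "QMS", "QFT", "eARC"}

    def usable_features(source: set, sink: set) -> set:
        # A feature only works end to end if both the source and the sink support it.
        return source & sink & HDMI21_FEATURES

    xbox_one_x = {"VRR", "ALLM"}          # illustrative; the rep confirmed VRR support
    samsung_tv = {"VRR", "ALLM", "eARC"}  # hypothetical disclosure sheet
    print(usable_features(xbox_one_x, samsung_tv))  # -> {'VRR', 'ALLM'} (set order may vary)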

Microsoft says that it plans to expand the feature set of its Xbox One X console going forward, so it is possible that it will gain more HDMI 2.1 capabilities eventually. Innovations are obviously good for hardware owners, but while HDMI 2.1 remains in its infancy, this approach causes confusion for people in the market for new hardware.

Want to keep up to date with all of our Computex 2018 Coverage? Follow AnandTech's breaking news here!
Comments

  • surt - Wednesday, June 20, 2018 - link

    Variable Refresh Rate.
    aka support for gaming devices to deliver frames as they are completed, rather than having to align with a fixed-pace refresh rate.
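    A toy timing example (illustrative numbers, in Python): at a fixed 60 Hz, a frame that finishes between refresh ticks must wait for the next one; with VRR it is shown immediately.

        import math

        FIXED_HZ = 60
        tick_ms = 1000 / FIXED_HZ            # 16.7 ms refresh grid
        frame_done_ms = 18.0                 # GPU finishes a frame at t = 18 ms
        next_tick_ms = math.ceil(frame_done_ms / tick_ms) * tick_ms
        print(next_tick_ms - frame_done_ms)  # ~15.3 ms added wait (or a tear); with VRR: ~0 ms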
  • mode_13h - Thursday, June 21, 2018 - link

    "Last year the HDMI Forum introduced a more industry-standard approach to variable refresh rate as a part of the HDMI 2.1 package, and recently makers of consumer electronics started to add VRR support to their products."

    So it was there, although it could've been clearer.
  • sunbear - Wednesday, June 20, 2018 - link

    HDMI is completely ill-conceived. All of the video streams on the source end of the cable are already compressed, but stupid HDMI insists on not taking advantage of that fact.

    Let’s say the bitrate from a UHD disc player source is 50 Mbps. HDMI requires that the player decompress that bit stream BEFORE sending it down the HDMI cable. Stupid! If they sent the stream as-is and instead decompressed it in the TV itself, then no one would need a cable capable of 48 Gbps; cheap category cable would do the job fine, even over 50-meter runs (no way any non-optical HDMI cable will be capable of that).

    And before people say “it’s too much to expect the TVs to be able to decompress/decode lots of different types of compressed streams”, remember that most TVs nowadays run Netflix, Amazon video, etc that do exactly that.
  • mode_13h - Thursday, June 21, 2018 - link

    No.

    HDMI needs to support uncompressed content, say from a games console or PC graphics card. So, you can't just cut the supported bandwidth to 50 Mbps or whatever. The cable & display device must support either uncompressed or (as the standard now has added) lossless compression.
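    To put rough numbers on it (active pixels only, in Python):

        # Even a modest uncompressed mode dwarfs a 50 Mbps disc bitstream,
        # and a console or GPU renders raw frames - there is no bitstream to forward.
        w, h, fps, bpp = 3840, 2160, 60, 24   # 4Kp60, 8-bit RGB
        print(w * h * fps * bpp / 1e9)        # ~11.9 Gbps uncompressed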

    The other scenario you're missing is devices which either apply some overlay, such as A/V receivers, or which do some post processing (i.e. video processors and outboard scalers).

    Next time, I hope you'll think a little harder and maybe educate yourself a bit more, before presuming to be such a vastly superior intellect.

    BTW, streaming compressed video over such distances is already a solved problem. Many TVs support DLNA. So, just use a media server + Ethernet or wifi. No need for HDMI to add support for that use case. And one can even run Ethernet over HDMI, for cables and devices that support it.
  • johnthacker - Thursday, June 21, 2018 - link

    Does Ethernet over HDMI really get used? I haven't seen it in the field. Now, HDMI over Ethernet, that is indeed a popular way to solve the problem of needing long cable runs.
  • mode_13h - Friday, June 22, 2018 - link

    My 2013-era TV can stream via DLNA from devices connected to it that aren't networked. So, I'd say it's there and it works, but you must have both a cable and devices which support it.

    Cable support is required due to leveraging previously-unused pins. I think the same pins might also be used for ARC.

    One of the most annoying things about HDMI is that the cables need standard markings so that you can tell which cable supports what speeds and features.
  • Stefan75 - Monday, July 2, 2018 - link

    Good luck with HDMI over Ethernet... HDMI 2.0 = 18 Gbit/s
  • 29a - Monday, June 25, 2018 - link

    Your reply was great until you got to:

    "Next time, I hope you'll think a little harder and maybe educate yourself a bit more, before presuming to be such a vastly superior intellect."

    Then you just turned into a dick, try being a little more friendly next time.
  • mode_13h - Tuesday, June 26, 2018 - link

    Well, let's see. I'm replying to a comment that starts:

    "HDMI is completely ill-conceived."

    ...not an innocent question, like:

    "Hey, why doesn't HDMI just keep the signal compressed?"

    Okay, so right off the bat, @sunbear is presuming that HDMI was slapped together by a bunch of idiots too dumb to do something so obvious. Such a harsh assumption deserves no less harsh a rebuke.

    I have very little patience for people who assume they know better than the developers of the tech we all use. Not that it's perfect, but to litigate your case on the basis of such facile analysis demeans us all.
  • A5 - Thursday, June 21, 2018 - link

    Doing decompression at the source instead of the sink makes it possible for a TV to last 10+ years.

    Imagine you bought a 1080p TV in 2008 that could only decode MPEG-2 - it would be useless for everything besides OTA broadcasts and playing DVDs.
