HDMI 2.0 is finally here, but may not be what you were expecting

On September 4th, 2013, the HDMI Forum released the new version of HDMI, upgraded to version 2.0. This new version brought some very necessary upgrades in terms of supported bandwidth and the much-needed new 4K resolutions. Also, as an added bonus, existing HDMI 1.4 category 2 cables would be compatible with the new standard and resolutions.

On the other hand, people have learned to take technological breakthroughs and product launches with a grain of salt, with some being called vapourware or paperware. Anyway, HDMI 2.0 has no reason to diverge from the rest of the industry's best practices, and as such it brings some much-needed goodies, technological gimmicks and a few less publicised restrictions. Of those, the cable compatibility claim seems to be the one causing the most fuss, even for those more literate in electronics. There are no more free upgrades than there are free lunches, so what's the catch?

What is new today is that there are finally some real HDMI 2.0 and 4K Ultra High Definition products on the market, both TVs and set-top boxes, so we can really find out what all the hype was about and whether all claims were in fact true. After some digging, we can now say that HDMI 2.0 is not completely what was expected, starting with the much-publicised legacy cable compatibility. In the end, get ready for real "HDMI 2.1" cables in order to fully take advantage of HDMI 2.0's new capabilities. But will it be worth it?

The Announcement

Well, first of all, let's see what was presented to the public, as visible on the HDMI.org website.

The new HDMI 2.0 brings a lot of new enhancements, but can one use all of them?
Source www.hdmi.org

So, again, let's summarize what HDMI 2.0 brings:

  • New bandwidth of up to 18Gbps
  • 4K@60fps (Yes! Finally…)
  • Several non-bandwidth consuming features
  • Dual video streams (but fails to mention the available resolutions…)
  • NO NEW CABLES! Current cables “are capable of carrying the increased bandwidth”.

One thing which was NOT announced was the availability of the Compliance Test Specification (CTS), which allows manufacturers to market HDMI 2.0 equipment. In fact, this was only released in April 2014, a full 7 months later (or late, depending on who you work for 🙂 ). This is also the reason why products only arrived on the market 7 months after the announcement: only after the CTS is published can equipment (both client equipment and test equipment) be certified and thus sold as HDMI 2.0 compliant.

It's the last feature which deserves questioning: free compatibility with current HDMI 1.4b cables. Since electronic communications began, people have acknowledged that it is not physically possible to arbitrarily add bandwidth to an already existing medium. From 10Base-T to 100Base-T, new cables were required (cat.5), and the same happened with 1000Base-T (cat.5e) and again with 10GBase-T (cat.6a). The same happened with DSL connections: although higher bandwidth was attained, from 8Mbps up to 24Mbps, it did not come from higher frequencies, but from higher-order modulations and better error correction. It is simply not possible to increase the frequency in use and expect everything to keep working.

Also, note that even though the publicised maximum new bandwidth is 18Gbps, only around 14.4Gbps is actually usable, due to 8b/10b TMDS encoding.
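
As a quick sanity check on that arithmetic, here is a minimal Python sketch, assuming the usual three TMDS data channels carrying 10 bits each per TMDS clock:

# Minimal sketch: usable HDMI bandwidth after 8b/10b TMDS encoding,
# assuming 3 TMDS data channels at 10 bits per channel per TMDS clock.

def usable_bandwidth_gbps(tmds_clock_mhz):
    """Usable bandwidth in Gbps for a given TMDS clock."""
    raw_bps = tmds_clock_mhz * 1e6 * 10 * 3  # 3 channels x 10 bits/clock
    return raw_bps * 8 / 10 / 1e9            # 8b/10b: 8 useful bits in 10

print(usable_bandwidth_gbps(340))  # HDMI 1.4 Cat.2 limit -> 8.16
print(usable_bandwidth_gbps(600))  # HDMI 2.0 maximum     -> 14.4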

The numbers

Continuing with public information, we can also find more details on how 4K@60fps resolutions are achieved. Please keep in mind that previous HDMI versions already supported 4K resolutions, but limited to 30fps at most, which is not a great idea.

The new image formats supported by HDMI 2.0
Source www.hdmi.org

This is where strange things start to appear. First, there's a new 4:2:0 chroma subsampling format (more information can be found here), and this new subsampling is only found on the new 4Kp50/60 formats… but what for, if there is so much more bandwidth available? Also, please note that 8-bit 4K is next to useless, as the new BT.2020 color space is only available with 10-bit-per-channel formats.
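
To make the trade-off concrete, here is a small Python sketch of the average number of samples each subsampling mode carries per pixel; the 50% saving of 4:2:0 is exactly what makes the bandwidth numbers below work out:

# Sketch: average samples carried per pixel by each chroma subsampling
# mode. Luma (Y) is never subsampled; only chroma (Cb/Cr) is reduced.

SAMPLES_PER_PIXEL = {
    "4:4:4": 3.0,  # Y + Cb + Cr for every pixel
    "4:2:2": 2.0,  # Y for every pixel, Cb/Cr on every other column
    "4:2:0": 1.5,  # Y for every pixel, Cb/Cr once per 2x2 pixel block
}

for mode, spp in SAMPLES_PER_PIXEL.items():
    print(f"{mode}: {spp} samples/pixel, "
          f"{1 - spp / 3.0:.0%} bandwidth saved vs 4:4:4")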

Let’s then have a look at some rough required bandwidth numbers:


[table id=5 /]

This table contains rough calculations on the necessary bandwidth for some supported HDMI 2.0 resolutions.
Please note these numbers are not exact to CEA-861-F specifications, but are close enough for understanding. Also, for the sake of simplicity we're assuming 4Kp50 and 4Kp60 use roughly the same bandwidth. This is not technically correct, but the differences won't change any conclusions.

Notes:
(1) – This is the maximum bitrate for HDMI 1.2 (TMDS@165MHz – 3.96Gbps)
(2) – Requires a Cat.2 cable (TMDS@340MHz)
(3) – This mode is not supported even in the new 18Gbps (14.4Gbps usable) bandwidth mode, nor is it defined in HDMI 2.0.
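
For the curious, here is a rough Python sketch of the arithmetic behind the table; the flat 20% blanking overhead is an assumed round figure for illustration, not the exact CEA-861-F timing:

# Rough sketch of the table's arithmetic. The 20% blanking overhead is
# an assumption; exact CEA-861-F timings differ slightly per format.

BLANKING = 1.20    # assumed overhead for blanking intervals
LIMIT_14 = 8.16e9  # usable bits/s at 340MHz (HDMI 1.4 Cat.2)
LIMIT_20 = 14.4e9  # usable bits/s at 600MHz (HDMI 2.0)

def required_bps(w, h, fps, bits_per_channel, samples_per_pixel):
    """Approximate bandwidth, in bits per second, for one video format."""
    return w * h * fps * bits_per_channel * samples_per_pixel * BLANKING

formats = [
    # (label, width, height, fps, bits/channel, samples/pixel)
    ("1080p60 24-bit 4:4:4", 1920, 1080, 60, 8, 3.0),
    ("4Kp50 30-bit 4:2:0",   3840, 2160, 50, 10, 1.5),
    ("4Kp60 24-bit 4:4:4",   3840, 2160, 60, 8, 3.0),
    ("4Kp60 48-bit 4:4:4",   3840, 2160, 60, 16, 3.0),
]

for label, w, h, fps, bpc, spp in formats:
    bw = required_bps(w, h, fps, bpc, spp)
    if bw <= LIMIT_14:
        verdict = "fits HDMI 1.4 (340MHz)"
    elif bw <= LIMIT_20:
        verdict = "needs HDMI 2.0 (600MHz)"
    else:
        verdict = "does not fit even HDMI 2.0"
    print(f"{label}: {bw / 1e9:5.2f} Gbps -> {verdict}")

As the output shows, 4Kp50 with 30-bit 4:2:0 squeezes under the 340MHz ceiling, while 4Kp60 with 48-bit colour exceeds even the new 14.4Gbps limit. These are exactly the points discussed below.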

This table brings some very interesting numbers:

  • Although HDMI 2.0 brings new resolutions, those new resolutions are not supported with all color depths; specifically, 4Kp50/60 with 48-bit color exceeds HDMI's new maximum bandwidth. So you can have 4K@50 or 48-bit color, but not both.
  • This is where 4:2:0 chroma subsampling starts showing its usefulness… or not. It doesn't make much sense to increase color depth to 48 bits just to trash most of the chroma resolution.
  • 4Kp50/60 is possible even within HDMI 1.4 bandwidths by using 4:2:0 subsampling! This undoubtedly makes it possible for current HDMI 1.4 Cat.2 cables to support 4K@50 resolutions. This is why current cables are said to support the new bandwidths (more on this later) and also why 4:2:0 chroma subsampling was brought to the table.
  • Not only that, but you can also get 4Kp50 AND the 30-bit enhanced BT.2020 colour space, by using 4:2:0 chroma subsampling. This is in fact the biggest gain a user can get from HDMI 2.0.

So, on one side, some limitations of HDMI 2.0 are now clearly visible, but most importantly, so is how current cables can support 4K@50, albeit by sacrificing chroma spatial accuracy. This is not perfect, but it is an acceptable tradeoff. Better yet, you can even have access to the new BT.2020 UHD color space using current-technology cables.

This also means there is no actual need for the 18Gbps (14.4Gbps usable) bandwidth modes for most purposes. This should only be a remark, but my (informed) guess is that 340MHz operation may become the industry's de facto standard, as we'll see next.


The industry moves

HDMI is not managed by a regular standards body, such as ISO or SMPTE, but by two industry groups: the HDMI Forum and HDMI Licensing. Things get a little messy here, as the two groups are not composed of the same entities, which in turn have different agendas.

HDMI 2.0 was urgently needed by both consumers and manufacturers, though from different perspectives:

Console manufacturers

Console manufacturers, specifically Sony(1) and Microsoft, had new consoles which by all means should support new and improved image quality at high frame rates. Consoles absolutely require 50/60 frames per second for a quality playing experience. HDMI 1.4 already supported 1080p60, but this was already available (though not used by all games) on previous-generation consoles. Proper 3D gaming is also possible with 1080p120 output, which HDMI 1.4 allowed, but only at non-trivial pixel frequencies (>330MHz). Increased resolution was simply a must for the new generation of consoles.

(1) – Sony was in a peculiar situation, as it was present in all industry groups: console manufacturer, TV manufacturer and content producer.

However, both the Microsoft Xbox One (November 22, 2013) and the Sony PS4 (November 15, 2013) were launched so close to the HDMI 2.0 announcement that neither supported HDMI 2.0 at launch, and still today, 6 months later, HDMI 2.0 is not to be found on either of them. There is light at the end of the tunnel, though, as both Microsoft and Sony have committed to supporting HDMI 2.0 in the future, whatever that means. Still, the time it's taking them to bring the consoles up to the new standard raises a very interesting question: why so long? Well, to start with, the HDMI Forum only finalised the CTS in April, so Microsoft and Sony are not that late, yet.

TV manufacturers

TV manufacturers were at the forefront of the 4K/UHD race. Samsung, LG, Panasonic, Sony, etc. were all racing to bring 4K UHD TVs to the market. This is one of those instances where we can clearly see something went terribly wrong in the HDMI 2.0 standardisation: all TV manufacturers had 4K TVs ready for market, but no means to get content onto them. Even so, most chose to bypass HDMI altogether, offering alternative means to get your favourite Ultra High Definition soma. Samsung and LG went down the alternative content route: offering on-demand content from the likes of Netflix or YouTube, neither of which depends on external devices, and as such, on HDMI 2.0.


Finally, once the last piece of the puzzle, the HDMI 2.0 CTS (Compliance Test Specification), became available, those same TV manufacturers rushed to the market with new and improved HDMI 2.0 enabled TV sets, albeit with a few quirks on today's models (June 2014) from LG and Samsung:

  • There is no support for 4Kp60 4:2:2, the highest possible quality setting on HDMI 2.0;
  • There is no support for BT.2020 colour space.

Although poorly documented, current models only support pixel rates up to 340MHz (or 340MCPS in current HDMI Forum parlance). This means that only HDMI 1.4 compatible bit rates can be used (roughly 8Gbps usable), but for sure (or not) HDMI 1.4 cables can also be used, thus delivering on the “HDMI 1.4 cable compatibility” promise.

This limitation is clearly visible in the E-EDID information from one such TV set:

HDMI VSDB:
CEC PA 3.0.0.0
Supports ACP, ISRC1, and/or ISRC2.
Supports 36, 30 bit color.
YCbCr 4:4:4 deep color supported.
Max TMDS clock: 300MHz
Content types: Graphics (text).
Basic 3D supported.
Image size is accurate to 1cm.
Supports 3840x2160 30Hz, 3840x2160 25Hz, 3840x2160 24Hz, 4096x2160 24Hz

On this TV, a 2014 curved 4K UHD model from Samsung, although 4K resolutions are properly supported, they are only achievable within the TV's 300MHz TMDS clock limit. This limitation is further explained when we discuss HDMI 2.0 cable compatibility below.
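
For reference, that 300MHz figure comes straight from a single byte of the HDMI Vendor-Specific Data Block (VSDB), stored in units of 5MHz. Below is a minimal Python decoding sketch; the example bytes are a hypothetical payload reconstructed to match the dump above, not read from the actual TV:

# Sketch: decoding the Max TMDS clock from an HDMI Vendor-Specific Data
# Block (VSDB) in the E-EDID CEA extension. The field is stored in units
# of 5MHz, so a 300MHz limit appears on the wire as 60 (0x3C).

def max_tmds_clock_mhz(vsdb):
    """vsdb: HDMI VSDB payload bytes, starting at the IEEE OUI."""
    if vsdb[0:3] != b"\x03\x0c\x00":  # HDMI Licensing OUI, LSB first
        raise ValueError("not an HDMI VSDB")
    if len(vsdb) < 7:                 # short block: field not present
        return None
    return vsdb[6] * 5                # Max_TMDS_Clock, in 5MHz units

# Hypothetical payload matching the dump above: OUI, CEC physical
# address 3.0.0.0, capability flags, Max_TMDS_Clock = 60 (300MHz).
example = bytes([0x03, 0x0C, 0x00, 0x30, 0x00, 0xB8, 0x3C])
print(max_tmds_clock_mhz(example))    # -> 300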

This is probably why Samsung has moved all external interfaces out of the TV set and into a dedicated break-out box, such as this one:

Samsung Evolution Kit 2014, included on some 2014 UHD TV models
Source: Samsung

These kinds of solutions enable users who have just purchased US$5000+ TV sets to spend a few hundred more bucks one or two years down the road. These toys go for the non-bargain price of US$397; however, that is probably the only way to keep those costly TV sets up to the technology promised by HDMI 2.0. This further demonstrates how 4K UHD TV sets were rushed to market with a half-baked implementation of HDMI 2.0. However, it may not be the manufacturers' fault, as this is probably as far as you'll ever see HDMI 2.0 go.


Cables and the CTS

For some reason which is not completely clear, neither the HDMI 2.0 specification proper nor the Compliance Test Specification is public. If the former is a clear means to fight shady manufacturers, keeping the latter closed doesn't seem to be in the best interest of either the public or the industry. The HDMI 2.0 brand, as well as the logo, is only allowed on devices and cables which have undergone proper HDMI 2.0 certification. So, it should be the HDMI 2.0 CTS which protects users from poorly designed and manufactured devices and cables, but without understanding what was actually certified, the public is left barehanded against the very manufacturers the certification was intended to protect them from.

Certification is any technology's best tool for success, as it assures the public that it doesn't have to worry about device compatibility. Without it, different manufacturers may turn the same exact standard into incompatible implementations, due to loose ends in the way standards are written. This has happened every single time there was no proper certification, especially in initial versions. Ever wondered why USB 1.0 products never reached the market…? Likewise, the initial versions of USB 2.0, and again USB 3.0, were riddled with issues and interoperability bugs. If proper certification were in place, users would be able to buy up-to-par devices without worrying about whether one device would be compatible with the next. Or so would be the goal of proper certification processes, if only they were carried out as they should be…

There are three main sections in the HDMI CTS, for both 1.4 and 2.0: sources (usually DVD and Blu-ray players and set-top boxes), cable assemblies (although cables are only tested against the HDMI 1.4 CTS) and sinks (TVs). There are also other device types, but those are not relevant here. As far as sources are concerned, for HDMI 2.0 it's only a matter of ensuring that the output is compliant. This mostly concerns signal levels, jitter and proper signalling, both in the blanking space and against the E-EDID.

On the cable side, nothing has changed: the same specifications which before only allowed for 340MHz operation on High Speed cables are now deemed enough for 600MHz operation. You might think the HDMI 1.4 CTS already had room for 600MHz operation, but that is simply not the case: all tests, both in terms of skew and jitter, explicitly call for operation up to 340MHz.

This would be enough to raise some eyebrows, but it gets worse. When you go to a store and see a huge number of HDMI cables, it doesn't mean that every single cable was certified. It doesn't even mean that that specific model was certified. It simply means that the manufacturer was able to certify one particular cable, and then commits to producing every similar cable to the same specs, regardless of length or any other characteristic. This is why there can be good and bad HDMI cables, and where certification starts to break apart. In fact, this is why a relatively high number of cables won't even work at 1080p resolutions.

Not only that, but if you have a look at the table above, you'll notice that 1080p60 only requires 170MHz operation. Why is this problematic? Because the whole 4Kp50 compatibility promise relies on the assumption that all cables correctly work at 340MHz, as specified by the 1.4 CTS. This may be true on paper, but in the real world cables are not stressed up to the spec, only to around half of it: 170MHz. When the HDMI Forum stated there would be no new cables, it was under the assumption that all current cables are at least able to work at 340MHz. Now, it is fairly well known that not all cables work correctly even at 1080p resolutions, so it's safe to assume that at 4K resolutions a lot more cables will fail, and we're still only talking about 340MHz operation. So much for cable compatibility. And if we're talking about 600MHz operation on cables tested up to 340MHz, things can only go from bad to worse.
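
To put numbers on this, here is a small Python sketch using the exact CTA-861 pixel clocks (rather than the rough 170/340MHz figures above) to show how much harder 4K formats push a cable that was only ever certified at 340MHz:

# Sketch: exact CTA-861 TMDS character rates versus the 340MHz the cable
# was certified for. For 4:2:0, the TMDS clock is half the pixel clock.

CERTIFIED_MHZ = 340  # HDMI 1.4 "High Speed" (Cat.2) test limit

modes = [
    # (label, total pixels per frame incl. blanking, fps, clock factor)
    ("1080p60 4:4:4", 2200 * 1125, 60, 1.0),  # 148.5MHz pixel clock
    ("4Kp60 4:2:0",   4400 * 2250, 60, 0.5),  # 297MHz TMDS clock
    ("4Kp60 4:4:4",   4400 * 2250, 60, 1.0),  # 594MHz TMDS clock
]

for label, total, fps, factor in modes:
    tmds_mhz = total * fps * factor / 1e6
    print(f"{label}: {tmds_mhz:.1f}MHz "
          f"({tmds_mhz / CERTIFIED_MHZ:.0%} of the certified 340MHz)")
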
Finally, there are the sinks, or TV sets in common language. This is where all the HDMI 2.0 magic happens. As there are no new cables, it's up to the HDMI sink to make sure that all data is properly received and decoded, regardless of the fact that the cable was not designed to work at the higher bit rates and frequencies. Although not technically impossible, this puts most of the effort on one single side of the connection and makes for a formidable challenge, and by challenging I also mean expensive. It's so much of a challenge that today I could not find a single TV set with the 600MHz, 4Kp60, 16-bit-per-channel (48-bit) colour capability.

Wait. If there are no TV sets compatible with 600MHz 4Kp60 48-bit colour, how can there be certified HDMI 2.0 devices? This is due to one fundamental characteristic (or flaw, depending on who you talk to): devices are not tested against a fixed set of tests, but only against those tests the manufacturer states the device is compatible with, plus a small set of minimal tests to ensure basic compatibility. This means that HDMI 1.4 devices are only guaranteed to support 720p signals, and HDMI 2.0 devices must only support 4K 4:2:0 signals, and everybody knows the industry always gravitates towards the lowest common denominator (although there are good reasons not to make this a hard rule in some details). This means that HDMI 2.0 will, for all purposes, be limited to 4Kp50 4:2:0 8-bit on most devices, which is a bit of a disappointment for some people, especially those expecting the proper UHD BT.2020 colour space. However, not everything is lost on this front: as 4K 10-bit is still supported on 340MHz cables, we'll probably see devices supporting BT.2020 colour, especially in the higher quality implementations.
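
The practical effect is easy to model. The sketch below is purely illustrative (real sources negotiate by parsing the sink's E-EDID, not by comparing label strings): when each certified device only declares the mandatory modes, the best common format collapses to 4K 4:2:0 8-bit.

# Purely illustrative sketch (not a real HDMI API): a source picking the
# best video format both ends claim to support. When certification only
# obliges vendors to declare the mandatory modes, negotiation collapses
# to the lowest common denominator.

PREFERENCE = [  # ordered from most to least desirable
    "4Kp60 4:2:2 12-bit",
    "4Kp60 4:4:4 8-bit",
    "4Kp50 4:2:0 10-bit",
    "4Kp60 4:2:0 8-bit",
    "1080p60 4:4:4 8-bit",
]

def negotiate(source_caps, sink_caps):
    """Return the best format supported by both the source and the sink."""
    for fmt in PREFERENCE:
        if fmt in source_caps and fmt in sink_caps:
            return fmt
    raise RuntimeError("no common format")

# Hypothetical capability sets: both devices are certified, but each
# vendor only declared the mandatory 4:2:0 mode plus legacy formats.
source = {"4Kp60 4:2:0 8-bit", "1080p60 4:4:4 8-bit"}
sink = {"4Kp50 4:2:0 10-bit", "4Kp60 4:2:0 8-bit", "1080p60 4:4:4 8-bit"}

print(negotiate(source, sink))  # -> 4Kp60 4:2:0 8-bit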

Conclusion

So, here you have it: the good, the bad and the ugly of HDMI 2.0. It's probably not what people were expecting, but it's also not a total disappointment. It does live up to some of its promises: 4K UHD for the people, using good-quality current cables. It will probably not remain the latest version of HDMI for long, as it doesn't support 8K resolutions, nor even all 4K resolutions or 3D 4K. However, it's a step in the right direction.

In the near future, you can expect incremental versions of HDMI. On one hand, we'll see improved cable specifications: one which will allow for cheaper 600MHz sinks, and another to support the upcoming 8K resolutions.

But, again, this is the way technology works: what is top of the line today is probably commonplace tomorrow, and legacy the day after.

In sum, HDMI 2.0 brings 4K resolutions to the masses, which can never be considered a bad thing.

The good:

  • 4K Ultra High Definition available to consumers
  • UHD BT.2020 color space
  • Most high quality HDMI 1.4 cables will support 4K and the new color space

The bad:

  • 48-bit 4K colour is only possible with 4:2:0 chroma subsampling
  • No 8K support
  • Current TV sets don’t support the new and improved resolutions

The ugly:

  • Cable compatibility will be touch and go.
