
4K and UHDTV Confusion Calamity


As with any new technology, there is often a clash between the engineering developers and the marketeers. (To this day, one of my pet peeves continues to be the way audio amplifier power is defined, but that’s another story.) Ultra-High Definition Television, unfortunately, continues that trend. What the term actually means, and how it is modified, don’t currently present a clear picture, if you’ll forgive the pun.

The origin of the term Ultra-High Definition (or Ultra-HDTV, UHD TV, etc.) goes back at least to 2002, when NHK described an “Ultra-high-definition, Wide-screen System with 4000 Scanning Lines,” and I’m sure astute readers will find even earlier references. Now that it’s out of the research labs and actually in retail stores, the mayhem begins.

The CEA saw this coming, so back in June they released the official CEA Ultra High-Definition Display Characteristics V2, which stated that a TV, monitor or projector may be referred to as “Ultra High-Definition” if it meets certain minimum performance attributes, including a display with at least 3840 pixels horizontally and at least 2160 vertically. Of course, all this means is that a product carrying the CEA-trademarked UHDTV logo must meet these performance requirements.
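
To make that pixel-count floor concrete, here is a minimal sketch; the function name and test values are my own, and the CEA document lists other required attributes that this check ignores:

    def meets_cea_uhd_pixel_minimum(h_pixels, v_pixels):
        """Check only the CEA pixel-count floor for "Ultra High-Definition":
        at least 3840 pixels horizontally and at least 2160 vertically."""
        return h_pixels >= 3840 and v_pixels >= 2160

    assert meets_cea_uhd_pixel_minimum(3840, 2160)       # a true UHD panel
    assert not meets_cea_uhd_pixel_minimum(1920, 1080)   # an "HD-capable" pretender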

Therein lies the next obfuscation opportunity: actual vs. “capable.” Yes, friends, history repeats itself. Remember when the market was flooded with “HD-capable” monitors that couldn’t actually display HDTV, and the ever-annoying “Real HD”? Of course, it’s happening again. Reminds me of “new and improved.” You mean I was a bozo to buy the original?

Are there other definitions? Of course. Last year, SMPTE updated several specifications that spell out what UHDTV is or, at least, what some related terms mean. In SMPTE ST 2036-1, they define “UHDTV1” (or UHD-1) as the 3840×2160-pixel image format, and “UHDTV2” (or UHD-2) as the 7680×4320-pixel image format. One assumes that UHDTV is the union of these two formats; aside from a variety of the usual frame rates, no other pixel arrays are defined.

Which brings up another factual fallacy: just what are “4K” and “8K”? SMPTE takes pains to declare that 4K is “a term used to describe images of 4096×2160 pixels although sometimes applied to UHDTV1 images … this term should not be used when referring to UHDTV1.” Similarly, they state that 8K is “a casual term for UHDTV2 images,” and “should not be used when referring to UHDTV2.” And, not to ignore their hard work, the ITU similarly recommends (R-REC-BT.2020-1-201406-I) that the pixel counts of 7680×4320 and 3840×2160 should be used “for UHDTV programme production and international exchange.”
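
For quick reference, the formal terms and the pixel arrays they denote fit in a plain lookup table (the variable name is mine; the numbers come straight from SMPTE ST 2036-1):

    # Pixel arrays per SMPTE ST 2036-1 (echoed by ITU-R BT.2020):
    STANDARD_FORMATS = {
        "UHDTV1": (3840, 2160),  # a.k.a. UHD-1; casually, and improperly, "4K"
        "UHDTV2": (7680, 4320),  # a.k.a. UHD-2; casually, and improperly, "8K"
    }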

If we want to blame someone for the term “4K,” we could hang it on the Digital Cinema folks, whose DCI spec (as well as SMPTE 21DC, ST 2048, etc.) specifies 4K as 4096×2160; halve each of those dimensions, and you get DCI’s 2K. For several reasons, pro and con (including the “mathematically nice” power of 2), the film clan decided they needed something different from the video geeks, so there you are: more room for confusion. (And, no, DCI does not specify “8K.”) But at least 4K makes sense here. Of course, the DCI spec was never intended for consumer use.
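
The arithmetic is easy to check (the constant names are mine; DCI just gives the numbers):

    DCI_4K = (4096, 2160)
    # Halve each dimension of DCI 4K and you get DCI 2K:
    DCI_2K = (DCI_4K[0] // 2, DCI_4K[1] // 2)
    assert DCI_2K == (2048, 1080)
    # The "mathematically nice" part: 4096 is an exact power of two.
    assert 4096 == 2 ** 12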

Retail outlets, however, are not amused, as it’s easier to just say “4K,” and that’s exactly what they’re doing when the product is actually 3840×2160. Sony, for instance, calls their X950B TV a “4K Ultra HD TV,” with the resolution of the panel listed as “QFHD.” Ugh. Someone tell me what that officially means. (Thanks for finding that one, Norbert.)

So, I’m sorry guys, but the CEA “4K-Ultra HD” logo is wrong. They should have just stuck with “Ultra HD.” While we wait for the four-grand sheriffs to circle their wagons, let’s take our own shot at a norm, as suggested by our own Bob Raikes:

  • UltraHD is 3840×2160
  • DCI-4K is 4096×2160
  • Any other use of “4K” is undefined
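
Codified as a sketch, using the labels from the list above (the function is hypothetical, not anyone’s official API):

    def classify_resolution(h, v):
        """Label a pixel array per the proposed norm."""
        if (h, v) == (3840, 2160):
            return "UltraHD"
        if (h, v) == (4096, 2160):
            return "DCI-4K"
        return "undefined"

    print(classify_resolution(3840, 2160))  # UltraHD
    print(classify_resolution(4096, 2160))  # DCI-4K
    print(classify_resolution(4000, 2000))  # undefined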

Will anyone stick to this? Some may. I know we will try. And don’t bug me about 8K.

—Aldo Cugnini