Apple introduced a TV calibration feature with the new Apple TV 4K in April. The color balance option uses the Face ID sensors on the front of an iPhone to optimize output from Apple’s streaming box (including the 2017 model). According to Apple, viewers will see “much more accurate colors and improved contrast” after calibration. But HDTVTest believes the feature doesn’t always deliver on that promise. And the channel knows a thing or two about visual quality, having previously poked holes in The Mandalorian’s HDR claims.
In a new video, Vincent Teoh of HDTVTest armed himself with an older Apple TV 4K box and an iPhone 12 Pro to apply Apple’s calibration to a trio of 55-inch TVs: an LG OLED TV, a Samsung QLED and a Sony Bravia LED LCD TV. The AV buff also ran side-by-side comparisons against a Sony LCD mastering monitor with reference-class color accuracy.
Across all three TVs, Apple’s color-balanced output appeared bluer than the original picture, Teoh said. On the Sony LED LCD, which was set to its most accurate out-of-the-box preset, Apple’s calibration shifted the white point bluer than the D65 standard used in the broadcast industry. In fact, this degraded color accuracy, increasing delta errors, Teoh noted.
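The “delta errors” Teoh cites are delta E values: a single number expressing how far a measured color sits from its target in perceptual L*a*b* space. A minimal sketch of the classic CIE76 formula, using hypothetical Lab values (the specific bluer-white numbers below are illustrative, not Teoh’s measurements):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# The reference white, expressed relative to D65, is (L*=100, a*=0, b*=0).
d65_white = (100.0, 0.0, 0.0)

# A hypothetical "bluer" white point: b* pushed negative (toward blue).
bluer_white = (100.0, -0.5, -4.0)

print(round(delta_e_76(d65_white, bluer_white), 2))  # 4.03
```

Calibrators typically aim to keep delta E below about 3, the rough threshold of visibility at normal viewing distances, so a shift of this size toward blue would count as a measurable accuracy regression.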
On the Samsung QLED TV in Filmmaker Mode, Apple’s balance also resulted in a bluer picture, with Teoh’s objective measurements confirming the blue shift. And on the LG OLED TV in Technicolor Expert picture mode, the calibration produced lower grayscale errors, contributing to better color accuracy with reduced delta errors. But, according to Teoh, the results still came nowhere near those of a proper calibration performed with specialized tools and software.
Another drawback is the feature’s inability to profile across different display technologies. That doesn’t bode well given the wide range of display tech on the market, according to Teoh: think LED LCDs with traditional phosphor or PFS-phosphor backlights, QLED TVs with quantum dot enhancement film, and WRGB OLED TVs. Some 2021 OLEDs even add a new green-emitting layer.
As Teoh points out in the clip, all of these display technologies have different spectral power distributions (SPDs). So, to obtain accurate luminance and color measurements, a colorimeter must be profiled against a spectroradiometer for each display type. However, the analyst notes that an iPhone is unlikely to have been profiled against a spectroradiometer in this way, which may explain why the color balance produced different results on OLED and LED LCD screens.
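The profiling step Teoh describes is commonly implemented as a per-display correction matrix: the colorimeter and a reference spectroradiometer both measure the display’s red, green and blue primaries, and a 3x3 matrix is solved that maps the colorimeter’s readings onto the reference readings. A minimal sketch with made-up XYZ tristimulus numbers (all values below are hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical XYZ readings of a display's R, G, B primaries (one column
# per primary) as measured by a reference spectroradiometer.
spectro = np.array([[41.2, 35.8, 18.0],
                    [21.3, 71.5,  7.2],
                    [ 1.9, 11.9, 95.0]])

# The same primaries read by an unprofiled colorimeter; its filters deviate
# from the CIE standard observer, so readings differ with the display's SPD.
colorimeter = np.array([[43.0, 33.9, 19.5],
                        [22.8, 69.0,  8.1],
                        [ 2.4, 10.8, 91.0]])

# Solve for the 3x3 correction matrix M such that M @ colorimeter = spectro.
M = spectro @ np.linalg.inv(colorimeter)

# Corrected readings now agree with the reference instrument for this
# particular display technology.
corrected = M @ colorimeter
print(np.allclose(corrected, spectro))  # True
```

Because M is derived from one display’s SPD, a matrix computed for, say, a WRGB OLED will not be valid for a quantum-dot LCD, which is exactly why a single unprofiled sensor gives inconsistent results across panel types.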
Giving Apple the benefit of the doubt, Teoh says it might be possible for Apple to “identify the TV via the mount” and apply the necessary EDR offset based on the known spectral response of that display technology. But that depends on TV manufacturers providing the correct information, and on Apple actually going through this process.
Teoh also admits that Apple’s color balance might work well on less accurate TV presets. Yet even when it improved an inaccurate picture, Apple’s calibration apparently introduced posterization into the image. That’s when shallow color bit depth produces distinct “steps” from one color gradation to the next instead of a smooth, continuous gradient. Finally, Teoh pointed out that Apple’s calibration feature only works for Apple TV output rather than all sources connected to the TV.
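The posterization effect is easy to demonstrate: quantizing a smooth intensity ramp to too few levels collapses it into a handful of discrete bands. A minimal sketch (the 4-level figure is illustrative, not a claim about Apple’s actual processing):

```python
def posterize(value, levels):
    """Quantize a normalized intensity in [0, 1] to a limited level count."""
    steps = levels - 1
    return round(value * steps) / steps

# An eleven-point smooth gradient...
ramp = [i / 10 for i in range(11)]

# ...collapses into just four discrete bands after 4-level quantization,
# which is what visible "stepping" in a sky or shadow gradient looks like.
stepped = [posterize(v, 4) for v in ramp]
print(len(set(stepped)))  # 4
```

Real displays work with 8- or 10-bit channels rather than 2-bit ones, but any processing that effectively discards precision pushes the image in this same direction.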
The analyst concluded that the feature is no match for a proper calibration, which involves adjusting a number of TV settings, from video black level to disabling superfluous edge enhancement and noise reduction to optimizing motion interpolation. In a nutshell, simply selecting the most accurate out-of-the-box picture preset is “more important than performing the color balance procedure,” Teoh said.
All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through any of these links, we may earn an affiliate commission.