On this episode of You Asked, buckle up for an on-the-road edition that’s all about acronyms and the troubles folks are having with them. ABL on OLEDs, APL, how HDR and WCG are not one and the same, MLA panel technology, and, of course, eARC.

As you can probably tell, I’m traveling (or I was — between the magic of publication dates and my crazy schedule, you can never be sure). If you hear honking horns, emergency vehicle sirens, or aggravated pedestrians in the background, kindly do me a favor and fuhgeddaboudit. Hey … when in New York, right?

Can SDR content trigger ABL on OLED TVs?

David Cobham in Ontario, Canada, is an ice hockey fan who is concerned about ABL, which stands for auto brightness limiter. David is leaning toward an OLED for his new TV, but is confused as to whether ABL applies to SDR content (that’s standard dynamic range, or, really, non-HDR — high dynamic range — content). Specifically, he means hockey! The last thing he wants is for all that gleaming white ice to cause his TV to dim when he’s watching games — which I gather he does a lot.

Thanks for your question, David! First, I’m going to explain ABL, and then we’ll get to your specific concerns.

Auto brightness limiter is a feature that was created for OLED (organic light-emitting diode) TVs. As just pointed out, the O in OLED stands for organic. And as organic beings, we are very familiar with the biological imperative that is aging and death. The organic compounds that make light in an OLED TV do wear down over time. And just as stress accelerates aging in humans, it accelerates the degradation of the organic compounds in an OLED TV. In other words: The harder you drive an OLED TV, the faster it will wear out.

To prolong the life of an OLED TV, an algorithm detects when the panel is being driven hard for long periods of time (that duration is an important qualifier) and then automatically reduces the TV’s brightness. That’s ABL.
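TV makers don’t publish their ABL logic, so here’s a purely conceptual sketch in Python of how a limiter like this behaves. Every number in it (the window length, the trigger threshold, the dimming rate, the brightness floor) is an invented placeholder for illustration, not anything pulled from actual TV firmware:

```python
from collections import deque

WINDOW_FRAMES = 600    # ~10 seconds at 60 fps (assumed window)
APL_THRESHOLD = 0.70   # "driven hard" cutoff as a fraction of peak (assumed)
DIM_STEP = 0.002       # per-frame brightness ramp rate (assumed)
GAIN_FLOOR = 0.6       # the limiter never dims below this (assumed)

recent_apl = deque(maxlen=WINDOW_FRAMES)
panel_gain = 1.0       # 1.0 = full requested brightness

def process_frame(frame_apl: float) -> float:
    """Given this frame's average picture level (0.0 to 1.0),
    return the brightness gain the panel should apply."""
    global panel_gain
    recent_apl.append(frame_apl)
    sustained = sum(recent_apl) / len(recent_apl)
    if len(recent_apl) == WINDOW_FRAMES and sustained > APL_THRESHOLD:
        # Bright content for a long stretch: ease the brightness down.
        panel_gain = max(GAIN_FLOOR, panel_gain - DIM_STEP)
    else:
        # The content gave the panel a break: recover toward full brightness.
        panel_gain = min(1.0, panel_gain + DIM_STEP)
    return panel_gain
```

The point of the sketch is that qualifier: a single bright frame does nothing. Only a sustained stretch of high average brightness trips the limiter, and it ramps down gradually rather than snapping.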

The good news is that as OLED TV tech has advanced, auto brightness limiting has been relaxed quite a bit. But if there’s one type of content that you can count on to trigger ABL, it’s hockey, which has a lot of bright white in it. And, unfortunately, hockey tends to make the auto brightness limiter more obvious because our eyes are sensitive to reductions in white light brightness.

Over the years, I’ve tried to include an ABL test in my reviews, but it’s become a less urgent issue these days. I suppose I should consider stress-testing every OLED for ABL, but I just don’t see it anymore in most types of content.

Now, let’s answer David’s question: Yes, SDR content can trigger ABL. In fact, ABL doesn’t care whether you’re watching SDR or HDR. It cares about how bright the TV is being asked to be. There’s a notion that HDR content is universally brighter, and that is not the case. HDR unlocks more dynamic range. And, yes, that means the brightest brights can be brighter. But HDR content is not brighter all the time. In fact, some folks complain that HDR content actually looks dimmer.

So it really depends on how bright you set your OLED to be when it’s playing SDR content. If you max out the OLED light level and choose options like “brightness preferred,” or set peak brightness to high, then your SDR content will have a high average picture level. That’s APL, an acronym that sits uncomfortably close to ABL.
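For the curious, APL is nothing exotic: it’s just the average brightness of a frame expressed as a fraction of peak. Here’s a minimal Python sketch, with a made-up frame standing in for a hockey broadcast:

```python
import numpy as np

def average_picture_level(frame: np.ndarray) -> float:
    """APL as a fraction of peak for an 8-bit grayscale frame."""
    return float(frame.mean()) / 255.0

# Fake frame: mostly near-white ice, with a darker strip of boards and crowd.
ice = np.full((1080, 1920), 235, dtype=np.uint8)
ice[900:, :] = 40
print(f"APL: {average_picture_level(ice):.2f}")  # about 0.79 -- very high

# A dim movie scene, by contrast, might average out near 0.1.
```

Hockey’s wall of white ice keeps that number high for minutes at a time, which is exactly the condition ABL watches for.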

With that said, I don’t think you’ll find that watching hockey means your TV is constantly dimming. The TV is going to get enough breaks showing players’ uniforms, the crowd, and the commentators that ABL is less likely to be triggered. If you paused on a very bright moment in the game, you would see the image dim slowly over time. But the issue where the TV slowly dims and never brightens back up? That was a bug found in a very small number of TVs, and it has been fixed.

So, from an ABL perspective, I think you’re safe going with OLED. It’s actually the potential for burn-in that I’d be thinking about. If you have hockey on the TV 8 to 10 hours a day, four or more days a week, that score and information ticker down at the bottom of the TV could cause burn-in. But, hey, if you’re worried about that, you can always buy an LG G4. It’s got a five-year warranty that covers burn-in.


Wide color gamut and HDR

Jay asks: Wide color gamut has always been promoted as an advantage of HDR video. Very rarely, though, have I ever heard it discussed in TV or video reviews or promoted by TV manufacturers. All the discussion about HDR seems to center on brightness and specular highlights, not the improved colors over SDR. So my question is: Do film and video makers have to purposely plan for and shoot in a wide color gamut, or do those extra colors just “magically appear” when watching HDR video? Are we often watching HDR video that doesn’t take advantage of WCG?

We often do see HDR and WCG, or wide color gamut, mentioned in the same breath. But, in fact, HDR and wide color gamut are not intrinsically linked. Technically speaking, you can have HDR without wide color gamut, and you can have wide color gamut without HDR. But the way content is packaged up, we simply don’t see wide color gamut content coming across without HDR.

Wide color gamut refers to a TV’s ability to produce more colors than was possible in the pre-ultra-high-definition days. Broadcast TV, DVDs, and Blu-ray discs were once capped at sRGB or Rec.709, represented by the smaller triangle on a color gamut chart.

With Ultra HD, we got wide color gamut, which expanded the range of colors out to a larger triangle on that chart. We call this color standard DCI-P3, or just P3 for short. And since we’re doing acronyms today, I’ll mention that the DCI in DCI-P3 stands for Digital Cinema Initiatives.
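If you want to put a rough number on “larger,” the published CIE 1931 xy coordinates of each standard’s red, green, and blue primaries are enough to compare the two triangles directly. Here’s a quick Python sketch using the shoelace formula, with the caveat that xy area is a crude way to compare gamuts:

```python
# Compare the Rec.709 and DCI-P3 gamut triangles from their
# published CIE 1931 xy primaries, using the shoelace formula.

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B

a709, ap3 = triangle_area(REC709), triangle_area(DCI_P3)
print(f"P3 is about {100 * (ap3 / a709 - 1):.0f}% larger in xy area")  # ~36%
```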

So, while HDR can have an effect on color brightness and color volume, HDR is about the dynamic range of brightness, not about expanding the color palette. Wide color gamut is what widens the range of colors a TV can show, while higher bit depth (the jump from 8-bit to 10-bit color) adds more shades of the colors within that range.
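The bit-depth side of that is simple arithmetic, and it’s why 10-bit gradients look smooth where 8-bit ones can show banding:

```python
# Shades per channel and total colors at each bit depth.
for bits in (8, 10):
    steps = 2 ** bits      # levels per color channel
    total = steps ** 3     # R x G x B combinations
    print(f"{bits}-bit: {steps:,} steps per channel, {total:,} total colors")

# 8-bit:  256 steps per channel, 16,777,216 total colors
# 10-bit: 1,024 steps per channel, 1,073,741,824 total colors
```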

The content capture, at least for movies, was never a limitation. It’s always been the delivery system. Ten-bit color requires a lot of storage space or a lot of internet bandwidth to be delivered, and it took 4K Blu-ray and modern streaming services to make wide color gamut content available to us. And it always comes packaged with HDR, too.

But the TV needs to be able to pull off wide color gamut, and that’s part of what makes quantum dots so desirable. Quantum dots help expand color gamut and color volume by producing a purer white light from which color filters create colors. OLEDs have naturally better color purity and are also capable of wide color gamut.

Anyway, hope that is helpful. And to any color scientists out there: Please forgive me for watering down this explanation. I’m aware there is a lot more to it, but we’re trying to keep it at the 101 level here.


Any downsides to MLA OLED technology?

Roy Rosenthal writes: I enjoy You Asked a lot. My question is about LG MLA technology. All of the reviews make it sound great. But is there any downside to MLA? For example: Is the off-axis viewing that is so good with regular OLED affected? Or is there any kind of content that MLA doesn’t handle well? Or is it just a pure win?

I’ve just finished testing the LG G4 OLED, which uses LG’s MLA panel. And as near as I can tell, MLA is just a pure win. I don’t see any issues inherent to the implementation of the micro lenses that make up MLA. And perhaps that’s due to how MLA works.

We have a whole explainer on MLA, but the short version is that MLA stands for micro lens array. It refers to a layer in certain premium OLED panels containing billions of tiny lenses (they measure in micrometers) that focus light out toward the viewer. MLA helps OLED TVs get brighter simply by funneling out light that would otherwise be scattered and lost inside the panel. It’s like reclaimed light output. And it doesn’t appear to get in the way of anything at all.


eARC on HDMI 2.1 ports

Bruce Macartney-Filgate writes in with questions about eARC, which stands for enhanced Audio Return Channel. Bruce asks why he so often hears reviewers like me complain about a TV’s eARC port taking up one of its HDMI 2.1 ports, when he thinks that would be an advantage for folks using an A/V receiver as a video switcher.

Bruce is right. For anyone who is going to use an HDMI 2.1-capable A/V receiver as a video switcher — which means you would plug all the source devices, like game consoles, Blu-ray players, streaming boxes, and maybe a cable or satellite box into it — it would be an advantage for the TV’s eARC port to also be an HDMI 2.1 port.

Also, for those of you not familiar with eARC, or ARC in general: Audio return channel is what allows an HDMI connection to be a two-way street. HDMI cables normally carry a signal in one direction, from a source device to a display or sound device (point A to point B).

ARC and eARC allow for a video and audio signal to go one way — we’ll call it upstream — and for an audio signal to go the opposite way, or downstream. This way, folks can get sound from their TV — whether it’s from built-in apps or from any devices connected to the TV — into a soundbar or receiver. And, finally, the e in eARC stands for enhanced, and it just means that it can carry more information, so you can get lossless, uncompressed audio down the pipeline.
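Some rough math shows why that extra bandwidth matters. eARC’s audio channel tops out at roughly 37 Mbps, and uncompressed multichannel PCM only just fits, as this back-of-the-envelope Python calculation shows:

```python
# Why eARC's bandwidth matters: uncompressed 7.1-channel PCM at
# 192 kHz / 24-bit needs nearly all of eARC's roughly 37 Mbps channel.
channels = 8            # 7.1 surround
sample_rate = 192_000   # samples per second
bit_depth = 24          # bits per sample

bitrate = channels * sample_rate * bit_depth
print(f"{bitrate / 1e6:.1f} Mbps")  # about 36.9 Mbps

# Plain ARC is commonly cited at around 1 Mbps of usable audio bandwidth,
# which is why it's limited to stereo PCM or compressed 5.1 formats.
```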

Now, a quick reminder for everyone: The only devices that occasionally need an HDMI 2.1 port are the Xbox Series X or S, the PlayStation 5, or a high-end gaming PC. That’s because those are the only devices that can offer 4K image resolution at 120Hz, so they need the extra bandwidth afforded by HDMI 2.1.
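The bandwidth claim checks out with quick arithmetic (ignoring blanking intervals and encoding overhead):

```python
# Back-of-the-envelope: why 4K at 120Hz needs HDMI 2.1.
width, height = 3840, 2160
refresh = 120          # Hz
bits_per_pixel = 30    # 10-bit color x 3 channels (RGB, no chroma subsampling)

raw = width * height * refresh * bits_per_pixel
print(f"{raw / 1e9:.1f} Gbps of raw pixel data")  # about 29.9 Gbps

# That already exceeds HDMI 2.0's 18 Gbps ceiling before overhead is added;
# HDMI 2.1's 48 Gbps handles it comfortably.
```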

So, if you had more than one of those three devices and an HDMI 2.1-capable A/V receiver, then you would be happy that the eARC port was also HDMI 2.1.

But for a lot of folks, that eARC port is going to be used with a soundbar, or perhaps an A/V receiver or preamp/processor that doesn’t handle HDMI 2.1. In that case, if you had more than one of those special devices that need HDMI 2.1, you would want the TV’s HDMI 2.1 ports kept free for them, independent of the eARC port, because all the eARC port needs to do is get great audio to your soundbar or receiver.
