On today’s You Asked: Do 8K TVs for gaming make sense now? When is it time to replace a TV? How does the way a TV makes a picture match up with how our vision works? And which is better at upscaling: the game console or TV?


8K TVs for gaming: Do we want them … now?

@adamparker1504 asks: With the PS5 Pro offering 8K gameplay, I can imagine the PS6 and next-gen Xbox targeting 8K resolutions. Do you think by the end of this decade 8K will become a genuine consumer want?

This is a perfectly timed question. The short answer is: yes, I do think consumer demand for 8K TVs is about to go way up, and quickly. Although not for the right reasons.

I think consumers will see the 8K logo on the PS5 Pro box, the subsequent PlayStation 6 and, no doubt, the new Xbox, and they will want their TV to have the same logo on it. (Cue the keyboard warriors commenting about how Sony is generating false demand for 8K TVs. But I don’t think that’s actually happening.)

Here’s the deal: I don’t see native 8K games coming very soon. It takes a monumental amount of processing power to render 4K games at higher frame rates. Doing 8K games at reasonable frame rates? … We’ll get there, but that is going to require a pretty huge advance in computing power.
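If you want to put rough numbers on that, here’s a quick back-of-the-envelope sketch (standard 16:9 pixel counts; actual render targets vary from game to game) showing why native 8K is such a heavy lift:

```python
# Rough per-frame pixel budgets at standard 16:9 resolutions.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels per frame")
# 1080p:  2,073,600
# 4K:     8,294,400
# 8K:    33,177,600  -> 4x the pixels of 4K, 16x the pixels of 1080p

# At 60 frames per second, native 8K means shading roughly 2 billion pixels every second.
print(f"8K at 60 fps: {7680 * 4320 * 60:,} pixels per second")
```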

What will be far more common, I suspect, is 4K games being upscaled by the console to 8K resolution — a task, by the way, that 8K TVs already perform out of necessity. You can upscale your 4K games to 8K with any 8K TV available right now, and that’s what the PS5 Pro will do, too.

But with console-sourced information for every pixel on an 8K TV, I can see 8K TV demand going up because game consoles are way more prevalent than high-end gaming PCs. (No offense to my high-end gaming PC people, but you know what I mean.)

People always ask, “Where’s the 8K content?” When there is an 8K logo on a PS5 box, folks who don’t know better might say, “There’s the 8K content” and will be stoked to buy an 8K TV, figuring they are going to unlock something special from their expensive new console.

I will say that this should all look pretty great at 77 inches or larger and, for sure, 85 inches or larger.


Upscaling conundrum

Thibault Ewbank writes: I’m a new cinephile and I watch all my 4K and Blu-ray discs with an Xbox One S. My TV is an LG A1 OLED from 2021. I don’t know whether to let the Xbox or the TV handle the upscaling. Upscaling seems (to me) a difficult process to test, so I haven’t found a lot of information.

Here’s the deal: If you set your Xbox to 4K, then it is upscaling everything (that isn’t 4K native content or gaming) to 4K. Your TV sees a 4K signal and it doesn’t upscale anything. So, practically speaking, you don’t really need to compare the upscaler in your Xbox to the upscaler in your LG A1 OLED.

If you wanted to rely exclusively on the upscaler in your TV, then I think you would need to set the resolution on your Xbox to match the native resolution of your game or content. So, for example, let’s say you were watching a DVD that’s at 480p. If you set your Xbox’s output resolution to 480p then played that DVD, you would be looking at the TV’s upscaling to 4K from 480p. You could then switch the Xbox’s output to 4K and watch the same DVD clip again to see which did better.

Upscaling to 4K from 480p is a hard job, though, so you’ll probably find that neither one does a super amazing job at it.
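To make that concrete, here’s a deliberately naive sketch of the simplest possible upscaler: nearest-neighbor scaling, which just copies the closest source pixel into every output position. Real console and TV upscalers are far more sophisticated than this (motion-adaptive filtering, machine-learning models, and so on), but the core problem is the same: blowing a 720 x 480 DVD frame up to 4K means the scaler has to fill in roughly 23 of every 24 output pixels with detail it never received.

```python
import numpy as np

def nearest_neighbor_upscale(frame, out_h, out_w):
    # Each output pixel copies the nearest source pixel; no new detail is created.
    in_h, in_w = frame.shape[:2]
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows[:, None], cols]

# A hypothetical 480p DVD frame (720 x 480) blown up to UHD (3840 x 2160).
sd_frame = np.random.randint(0, 256, size=(480, 720, 3), dtype=np.uint8)
uhd_frame = nearest_neighbor_upscale(sd_frame, 2160, 3840)
print(uhd_frame.shape)  # (2160, 3840, 3): 24x the pixels, same information
```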


Time to upgrade my TV?

@Allessio777 asks: How many years should I keep a TV before the tech improves enough to be worth a newer model?

And premium member Mike McIntosh commented: I have been struggling with FOMO as my Sony 65-inch X950G is about five years old. I have been eyeing a replacement in the Sony 65-inch A95L. But the hardest part is justifying the $3,000 price tag for the best picture quality in the land. Maybe I can keep living with the current TV until I can snag a deal in the future?

I was just thinking about this the other day. You know, in many of my TV reviews I’ve said something like, “If you’re upgrading from a TV that’s five years old or older, you’re likely to notice a big jump up in picture quality.”

Sometimes I qualify that and sometimes I don’t. Let me clarify.

If you have a mid-tier TV from five years ago and you upgrade to a mid-tier 2023 or 2024 TV? You’ll notice a pretty big improvement, and that’s because a lot of what was once reserved for the very best TVs has now trickled down to the mid-tier level. For example, the TCL 6-Series from five years ago was a great TV, but the processing, backlighting system, HDR peak brightness, and reduced blooming and haloing of today’s TCL QM8 are vastly superior, and so is the resulting picture quality.

But the jump in picture quality starts to be less stark if you’re upgrading from a high-end TV to another high-end TV. For example, if you have a 2019 Sony A9G OLED TV, you would see some improvement by upgrading to the Bravia 8 (I’m just keeping it to OLED TVs for continuity), but the difference between those two would be far less noticeable than the difference between a 2019 TCL 6-Series and the 2024 TCL QM8.

At the end of the day you need to look at what you’ve got now and compare it to what you can get today at or around the same price, adjusted for inflation.

Mini-LED backlights are now common on mid-tier TVs, not just super-high-end TVs. Processing has gotten noticeably better for some brands, like TCL and Hisense, whereas Sony’s processing, while still improving, has gained less ground because it was so good to begin with.

See where I’m going with this? It depends on a few factors.

And to Mike McIntosh: Your Sony X950G is a solid TV! The A95L is significantly better, partly by virtue of being an OLED and partly because it’s an exemplary TV in its own right. That price tag is steep, though.


Cones and color-mixing

Zunaid from Dubai writes: I had an interesting shower thought the other day. The human eye has red, green, and blue cones with pretty wide and overlapping wavelength sensitivities (e.g. our green and blue cones overlap heavily and are both stimulated by green light). This made me wonder: Do the RGB pixels on our TVs emit a broad range of wavelengths matched to what our cones are sensitive to, or do they emit just a narrow band of wavelengths each? How would it impact our perception of TV color if it was one or the other?
Bonus question: How are cameras and displays calibrated for color recording and reproduction accuracy? How do we know the RGB wavelengths recorded by the camera sensor are the same wavelengths displayed by the TV’s LEDs?

I love this question for so many reasons. First, we get to talk about color theory and we get to talk about why display calibration is so important.

The retinas in our eyes contain two different kinds of photoreceptors: rods and cones. Rods perceive levels of light, which gives us contrast. Cones perceive color, and we have cones that are most sensitive to red, green, and blue light, respectively.

Zunaid, you mentioned that there’s overlap among those cones in terms of the red, green, and blue wavelengths they pick up, and that’s correct. That overlap is necessary for us to perceive a wide spectrum. Here’s an example: As a blue wavelength shifts toward green and the blue cone responds to it less strongly, the green cone starts to pick it up. The sum of those two responses is what allows most folks to perceive balanced color; our brains do the color mixing and tell us what we see is yellow, magenta, and so on.

A TV works by combining red, green, and blue, too. It is important that the TV covers as much of the full spectrum of red, green, and blue wavelengths as possible, but what’s even more important is how well it can mix those wavelengths. If a TV can produce and mix color wavelengths in the same way our eyes perceive them, then we say it can reproduce color correctly.

But producing a broad range of wavelengths is relatively easy for TVs compared with producing extremely pure red, green, and blue. This is why, when we look at a chart of a TV’s pure red, green, or blue output, we want to see as narrow a mountain with as sharp a peak as possible. The purer those red, green, and blue primaries, the more exacting a TV can be with its color reproduction.

This is one reason we like QD-OLED TVs so much. They are able to make the purest red, green, and blue colors we’ve seen from a consumer display.

The only way we can know any of this with any kind of certainty is through measurements. We can use scientific instruments to measure colors and break them down into coordinates on a chart. For example, when I measure a TV using a colorimeter and Calman software, I’m measuring the coordinates of the colors the TV is making, then comparing them to a reference chart to see how accurate they are.
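As a rough illustration of what that comparison looks like (this isn’t Calman’s internals, and the meter reading below is made up), a measured XYZ tristimulus value gets projected onto the CIE 1931 chromaticity diagram and checked against a reference target:

```python
def xyz_to_xy(X, Y, Z):
    # Project a tristimulus (XYZ) reading onto the CIE 1931 chromaticity diagram.
    total = X + Y + Z
    return X / total, Y / total

# Hypothetical meter reading for a TV's pure-red test patch (illustrative numbers only).
x, y = xyz_to_xy(41.2, 21.3, 1.9)

# The Rec.709 / sRGB red primary sits at roughly (0.640, 0.330).
target_x, target_y = 0.640, 0.330
error = ((x - target_x) ** 2 + (y - target_y) ** 2) ** 0.5
print(f"measured ({x:.3f}, {y:.3f}), off target by {error:.4f}")
```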

And that’s where the importance of white balance comes in. I talk a lot about how well a TV matches its whites up to the D65 white point. That means we need to see the TV producing white at a color temperature of 6,504 kelvin. This reference white point is typically used during both the capture and reproduction of images, and we use it because it is meant to most closely represent daylight. If a TV’s white balance is off, its color is going to be off, too.
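And for the curious, the link between that D65 chromaticity point and the 6,504 kelvin figure can be sketched in a couple of lines using McCamy’s approximation (calibration software may compute it differently, but the numbers land in the same place):

```python
def mccamy_cct(x, y):
    # Approximate correlated color temperature (in kelvin) from CIE 1931 xy chromaticity.
    n = (x - 0.3320) / (y - 0.1858)
    return -449 * n**3 + 3525 * n**2 - 6823.3 * n + 5520.33

# The D65 white point sits at roughly x = 0.3127, y = 0.3290.
print(round(mccamy_cct(0.3127, 0.3290)))  # ~6505, i.e. the familiar 6,504 K reference
```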

To answer your bonus question: The only way we can know if a camera is capturing color wavelengths accurately is by — again — measuring what it recorded. If we can get the camera to accurately record what is happening in real life, then we’re good. But not all cameras do that, which is why color correction is so important.

Of course, much of the art in video and still images is in the manipulation of color to achieve a desired effect. Part of what we like about movies, TV, and photos is that they are not a perfect facsimile of real life. If you looked at the raw footage captured during the recording of the Matrix movies, you’d notice that it doesn’t have that distinctive green hue. The colorist made that greenish look after the fact. And the Matrix movies look the way they do because of that artistic choice.

This whole topic goes much deeper. But hopefully you understand a bit better the importance of measurements and accuracy when it comes to displays.





