4K Upscaling solutions

The place for all discussion on gaming hardware
User avatar
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: 4K Upscaling solutions

Post by elvis »

UHD's vertical resolution (2160 pixels) is a nice integer scale of both 720p (3 times) and 1080p (2 times). So if you can get your content scaled up to those resolutions, then the TV can usually do the rest.
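
If you want to check which source resolutions reach UHD by a clean integer factor (the easy case for the TV's scaler), the arithmetic is trivial; a quick illustration in Python:

Code: Select all

# Which common vertical resolutions divide evenly into UHD's 2160 lines?
UHD_HEIGHT = 2160

for src in (480, 540, 576, 720, 1080, 1440):
    factor = UHD_HEIGHT / src
    note = f"integer scale x{int(factor)}" if factor.is_integer() else "non-integer scale"
    print(f"{src:>5}p -> {note}")
# 720p scales x3 and 1080p x2 (540p happens to fit x4); 480p, 576p and 1440p don't divide evenly.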

Most UHD TVs do an adequate scaling job (speed/lag wise) from both 720p and 1080p (and plenty of reviewers test upscaling lag these days, so look for reviews that do).
thebigcheese
Posts: 707
Joined: Sun Aug 21, 2016 5:18 pm

Re: 4K Upscaling solutions

Post by thebigcheese »

Joelepain wrote:
thebigcheese wrote:If you want to use full range, go ahead, I am just now more convinced that it doesn't actually improve the visuals.
From a purely mathematical point of view, in a 24-bit format you only get about 10.6 million colors with limited-range RGB ((235-16+1)^3) vs about 11.1 million colors with limited-range YCbCr ((235-16+1)*(240-16+1)^2) vs 16.7 million colors with full-range RGB (256^3).
I'm not saying it's obviously visible, but people have to stop saying it's basically the same thing when you lose more than a third of your color palette...
And you get the same kind of wasted color palette at 30/36-bit color depth with Rec.2020...
thebigcheese wrote:However also keep in mind that the games would have been designed around limited range RGB (since Wii U only supported that, AFAIK), so insisting on playing those games in full range is kinda pointless IMO.
I'm not a game developer, but I'm pretty sure games aren't designed around limited range. Game engines are much more complicated than that, mostly working internally in floating point and doing calculations over a much bigger range. It would be a pain, and totally pointless, to design a game engine around this "feature". I suppose it's much simpler to let the final stage of the engine or the HDMI interface do its whatever_format_it_spits to limited-range RGB conversion.
And don't forget most Wii U ports run at ~1080p on Switch vs ~720p on Wii U, often with a more stable framerate too.

I'm also convinced that limited range (RGB or YCbCr) is an abomination that should never have existed. It's just pointless and stupid. But a lot of things are stupid in the HDMI realm...
I'm not saying it's not mathematically superior, I'm just saying that in practice it's not only unnoticeable but also irrelevant if the system/TV isn't taking advantage of it. And also that there are many reasons not to use full range, because of poor implementations on the console or TV side. The Xbox One, for example, crushes the hell out of blacks in full range in my experience. Really bad implementation. For older systems that can only display a limited number of colors at any time, there's absolutely no difference between full and limited range. For the newest consoles that output HDR, well, you're not getting full-range RGB in HDR mode anyway. There are only a couple of cases in between where you're going to get any real advantage from full over limited. So yes, maybe set your PS3 and 360 to full, maybe the Switch, but I probably wouldn't set anything else to full.
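
For what it's worth, the palette numbers in the quote above do check out; a quick sanity check in plain Python (just counting the inclusive code ranges, nothing console-specific):

Code: Select all

# Count how many distinct 24-bit codes each encoding can actually use.
full_rgb      = 256 ** 3                              # 0-255 on R, G and B
limited_rgb   = (235 - 16 + 1) ** 3                   # 16-235 on R, G and B
limited_ycbcr = (235 - 16 + 1) * (240 - 16 + 1) ** 2  # Y: 16-235, Cb/Cr: 16-240

print(f"full-range RGB : {full_rgb:,}")        # 16,777,216
print(f"limited RGB    : {limited_rgb:,}")     # 10,648,000
print(f"limited YCbCr  : {limited_ycbcr:,}")   # 11,137,500
print(f"codes lost vs full RGB: {1 - limited_rgb / full_rgb:.1%}")   # ~36.5%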

As far as why it exists at all, well, remember that we used to send TV signals over the air once upon a time :p
User avatar
ASDR
Posts: 831
Joined: Sat Aug 12, 2017 3:43 pm
Location: Europistan

Re: 4K Upscaling solutions

Post by ASDR »

In addition to the points already made that limited range doesn't save transmission bandwidth or simplify circuit design, I'd also say it doesn't help at all on the console side. I very much doubt there's any HDMI-era console / game / engine / GPU actually using or rendering with limited RGB in mind. The game / GPU / shaders will likely just have full-range textures, use at least full 8-bit precision for intermediate computation, and render everything to full-range framebuffers. That saves no memory or computation. Then, at the very end, precision is thrown away, for no sound reason or benefit at all.

Since HDMI apparently doesn't properly indicate limited vs full range data, we even get compatibility issues that result in washed-out images or black/white crush. I bet even at the display end the first thing done is converting the image back to full range; color math is just slower and more awkward with that weird truncated range than with plain full 8 bit. Also, I know there are 6, 8 and 10 bit LCD panels, but I kinda doubt there's a 7.5-bit one or whatever limited range boils down to. And even if there were limited-range input signals around for historical reasons, they could obviously just be converted to full range. This entire limited range thing is nothing but a boneheaded mistake.
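
To make the "precision gets thrown away" point concrete, here's a rough sketch of the usual full-to-limited squeeze and back, using the standard 16-235 scaling for 8-bit RGB (the exact rounding any given console or TV uses is anyone's guess):

Code: Select all

import math

# Squeeze full-range 8-bit values (0-255) into limited range (16-235) and back.
def full_to_limited(v: int) -> int:
    return 16 + round(v * 219 / 255)

def limited_to_full(v: int) -> int:
    return min(255, max(0, round((v - 16) * 255 / 219)))

# How many of the 256 input codes survive a full -> limited -> full round trip?
survivors = {limited_to_full(full_to_limited(v)) for v in range(256)}
print(f"distinct codes after round trip: {len(survivors)} of 256")   # 220 of 256

# And what the 220 limited-range levels work out to in bits:
print(f"log2(220) = {math.log2(220):.2f} bits")   # ~7.78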

Two more points:

- I don't understand how HDR plays into this at all. IIRC, HDR on current consoles is YCbCr at 10 bpc with chroma subsampling, because HDMI 2.0 doesn't have the bandwidth for 30 bpp @ 4K60. I don't think any range setting has any effect on that color encoding at all.

- I'm surprised that there's no reliable 'auto' setting for the range on many current TVs. My ancient Sony does this just fine; Wii U, OSSC, etc. all just get the correct setting. I mean, how hard could it be? Worst case you just assume a signal is limited until you see a single value outside the 16-235 window and then switch; for most movies and games that would probably happen almost immediately. Something like the sketch below.
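
A minimal sketch of that heuristic (my own toy illustration, not what any actual TV does; a real set would also want to tolerate the slight undershoot/overshoot a limited-range source is allowed to produce):

Code: Select all

# Guess limited vs full range from the pixel values themselves: assume limited
# until enough samples fall outside the 16-235 window, then call it full.
def guess_range(samples, threshold=16):
    """samples: iterable of 8-bit R/G/B values. Returns 'limited' or 'full'."""
    out_of_window = 0
    for v in samples:
        if v < 16 or v > 235:
            out_of_window += 1
            if out_of_window >= threshold:   # more than a few stray codes
                return "full"
    return "limited"

# A full-range gradient trips the detector, a limited-range one doesn't.
print(guess_range(range(0, 256)))    # full
print(guess_range(range(16, 236)))   # limited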
User avatar
Unseen
Posts: 724
Joined: Sun May 25, 2014 8:12 pm
Contact:

Re: 4K Upscaling solutions

Post by Unseen »

ASDR wrote:Since HDMI does apparently not properly indicate limited vs full range data
It does, but not all sinks use that information, and I wouldn't be surprised if some sources send incorrect information. For example, a DVDO Edge correctly recognizes the limited/full range indicator bit from the source unless you tell it to ignore it.
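
For anyone curious where that indicator lives: it's the quantization-range bits in the AVI InfoFrame (Q for RGB, YQ for YCbCr, per CTA-861). A rough sketch of pulling them out of an already-captured InfoFrame payload; the byte/bit layout here is from memory, so check it against the spec before relying on it:

Code: Select all

# Decode the quantization-range hints from an AVI InfoFrame payload
# (payload = data bytes 1..13, header and checksum already stripped).
RGB_QUANT = {0b00: "default (depends on format)", 0b01: "limited (16-235)", 0b10: "full (0-255)"}
YCC_QUANT = {0b00: "limited", 0b01: "full"}

def avi_quant_ranges(payload: bytes):
    q  = (payload[2] >> 2) & 0b11   # data byte 3, bits 3:2 -> Q1:Q0 (RGB range)
    yq = (payload[4] >> 6) & 0b11   # data byte 5, bits 7:6 -> YQ1:YQ0 (YCC range)
    return RGB_QUANT.get(q, "reserved"), YCC_QUANT.get(yq, "reserved")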
User avatar
ASDR
Posts: 831
Joined: Sat Aug 12, 2017 3:43 pm
Location: Europistan

Re: 4K Upscaling solutions

Post by ASDR »

Unseen wrote:
ASDR wrote:Since HDMI does apparently not properly indicate limited vs full range data
It does, but not all sinks use that information and I wouldn't be surprised if some sources send incorrect information. For example a DVDO Edge correctly recognizes the limited/full range indicator bit from the source unless you tell it to ignore it.
Thanks. Yeah, I can totally see that. The second a single popular device sends the wrong information, you might as well give up trusting the full/limited flag at all. I guess I'm lucky that my current TV still does it; it works 100% from what I can tell. I didn't even notice for the longest time that my Raspberry Pi defaulted to limited, because the TV just switched automatically.
Johnpv
Posts: 275
Joined: Tue Apr 04, 2017 4:46 pm

Re: 4K Upscaling solutions

Post by Johnpv »

I don't know if this helps the conversation at all, but I use two separate inputs on my TV. I have a late-2015 4K Samsung with HDR support. It was my understanding and experience that HDR can add some input lag; looking at newer tests it seems to vary from TV to TV, so maybe it's not such a big deal now. I use one input with HDR turned on and the calibration settings that go with that, and another with HDR turned off and its own calibration settings. Using two inputs on the TV seems like a better solution than trying to make everything play nicely on one. My Framemeister plays great with the HDR-off input, and I've run into no issues with it.