I was really just commenting on your claim that your OLED is already bright enough and that nothing higher is needed. I can see how I wasn't clear about that in my initial post.

Classicgamer wrote: I don't. I fully understand the difference between peak and average brightness and the type of content where you would see both. I just don't share the confidence that the higher potential brightness of Micro LED will result in a superior image to OLED. It's never that simple. If the tech is capable of higher average brightness without sacrificing black levels, it is not apparent on Sony's prototype. If it were, you would expect to see higher contrast (than OLED), especially on specially made HDR content, but it was nowhere near that. Ultimately, contrast is what matters, not brightness or black levels in isolation from each other.
Right now I don't really care about OLED vs Micro LED, because the tech is years away from being affordable anyway. Sure, you can pick apart Sony's prototype (key word) and claim that OLED is superior, but you have to remember that OLED at the same stage probably had a lot of shortcomings as well. Some of the first curved OLEDs we had in stock where I worked looked nowhere near as good as the current ones; I remember the blue colors especially being off on those sets. Right now Micro LED developers are focusing 100% on getting the LEDs smaller; picture optimisation comes later.
I believe that Micro LED has a good chance of surpassing OLED, provided that it's even possible to shrink the LEDs to the same size. Again, I love my OLED; it's by far the best TV tech to date IMO, but that doesn't mean there isn't room for improvement.
LG had to introduce white subpixels in their panels just to reach a brightness high enough for impactful HDR, and this can be a real problem for accurate color representation. Imagine a super bright (1000 nits or higher) green neon sign in HDR. To even attempt to reach that level of brightness (it won't get there), the TV has to mix the white subpixel in with the green, and this contaminates the color.
Luckily this is limited to rare occasions, because most HDR highlights are white anyway, but I would still like to see it improved upon eventually.
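To show what I mean by the white subpixel "contaminating" the color, here's a toy Python sketch. The numbers are my own invention and it assumes simple linear RGB with Rec.709 luminance weights; a real WRGB panel is far more complicated than this.

```python
# Toy illustration: mixing white light into a saturated green raises
# luminance but lowers saturation. Values are hypothetical, linear RGB.

def luminance(r, g, b):
    # Rec.709 relative luminance from linear RGB
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def saturation(r, g, b):
    # Simple HSV-style saturation: (max - min) / max
    mx, mn = max(r, g, b), min(r, g, b)
    return (mx - mn) / mx if mx else 0.0

pure_green = (0.0, 1.0, 0.0)   # green subpixel alone, full drive
with_white = (0.4, 1.0, 0.4)   # white subpixel mixed in (adds R and B light)

print(luminance(*pure_green), saturation(*pure_green))   # ~0.715, 1.0
print(luminance(*with_white), saturation(*with_white))   # ~0.829, 0.6
```

The mixed version is noticeably brighter, but its saturation drops from 1.0 to 0.6, which is exactly the washed-out neon sign effect.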
No OLED can do HDR at 4000 nits. HDR tone mapping is used to map the HDR metadata to fit the display. These methods vary from manufacturer to manufacturer, with Panasonic's being especially good IMO.

In terms of the brightness used for movie mastering, most of the current mastering reference monitors I have seen are OLED. I am fairly certain that the reference monitors they use are more than capable of displaying optimal brightness. An editor could not use a monitor for reference if it wasn't capable of displaying all the detail they wished to show in the final product.
Unfortunately there is no objectively correct standard for tone mapping, which is why some manufacturers choose to clip detail above a certain point to preserve image brightness, while others choose to dim the image to preserve detail. The LG 2017 OLED sets had very dim HDR for this very reason, whereas Sony's OLEDs the same year had really bright HDR in comparison but discarded all detail above 1000 nits (IIRC).
Both companies have (in my opinion) improved on this since with more advanced tone mapping, more like what Panasonic is doing: tracking the EOTF curve closely before rolling off just before it reaches the display's max brightness. Something like this image:
I would imagine the Sony BVM OLEDs are doing something similar.
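To illustrate the knee-and-roll-off idea, here's a rough Python sketch. The knee position, display peak, and the shape of the roll-off curve are all my own made-up numbers, not any manufacturer's actual algorithm:

```python
def tone_map(nits_in, display_peak=700.0, knee=0.7, master_peak=4000.0):
    """Toy tone map: track the EOTF 1:1 up to a knee, then roll off
    smoothly so master_peak lands exactly at display_peak (no hard clip).
    All numbers are hypothetical."""
    knee_nits = knee * display_peak
    if nits_in <= knee_nits:
        return nits_in  # follow the EOTF curve exactly below the knee
    # Compress [knee_nits, master_peak] into [knee_nits, display_peak]
    # with a simple ease-out curve instead of clipping.
    t = (nits_in - knee_nits) / (master_peak - knee_nits)
    return knee_nits + (display_peak - knee_nits) * (1 - (1 - t) ** 2)

print(tone_map(300.0))   # below the knee: tracked exactly -> 300.0
print(tone_map(4000.0))  # mastering peak lands at display peak -> 700.0
```

The contrast with hard clipping is that a 4000-nit highlight still renders differently from a 1000-nit one; it's just compressed into the top of the display's range.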
EDIT:
I'm surprised to find that the Sony BVM-X300 can actually display a full 1000 nits, but it clips everything above that:
https://youtu.be/ESzWY0hW85Y?t=478
Very impressive nonetheless
Classicgamer wrote: For years reputable reviewers have been trying to come up with a meaningful quality measure. The closest they have come, in terms of specs to compare image quality, is "motion resolution". It ultimately comes down to how much detail you can see in regular moving content. It doesn't tell you everything, but that and dynamic contrast are at least useful for narrowing down a search. Manufacturers don't state motion resolution specs on the Best Buy shelf label for a reason. Their cheaper, poor-quality TVs (that most people buy) don't score well. It pisses all over their marketing story, which is that you should upgrade to their latest 4K or 8K high-brightness TVs for a superior and more detailed image.

Motion resolution is the same on all sample-and-hold displays; the only real difference is pixel response, where OLED excels compared to LCD. To increase motion resolution on a sample-and-hold display you have two options:
Engage frame interpolation, which to my eyes looks terrible and more often than not gives that "soap opera effect". It also adds a ton of lag and is disabled in game modes.
Or enable black frame insertion, which also increases motion resolution by a lot but can cause eye strain over long sessions. It also tends to drastically dim the screen.
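The dimming from black frame insertion is just duty-cycle arithmetic. A quick sketch with made-up numbers (real BFI implementations vary in flash timing and may compensate by driving the panel harder):

```python
# Toy BFI brightness calculation, hypothetical numbers.
# Inserting a black frame every other refresh means the image is
# only lit for half of each frame period (50% duty cycle), so the
# average luminance the eye integrates is halved.

peak_nits = 700.0    # assumed full-screen brightness without BFI
duty_cycle = 0.5     # fraction of each frame period the image is lit

average_nits = peak_nits * duty_cycle
print(average_nits)  # 350.0
```

This is also why BFI and HDR are a poor match: you give back a big chunk of the brightness the panel worked so hard to produce.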