orange808 wrote:
eatnumber1 wrote:
If I understand the PWM issue correctly, cur should vary over time. Right? That's not happening here. Every metric (cur, average, min/max) is the same after a few seconds.
Why would flicker always vary? There are no regulations regarding cadence.
Also, you have an average reading that's fine. (By "fine", I mean the Time Sleuth gave you an estimate of the latency. The actual display lag of that particular LG panel isn't fine; three frames of lag is a bad score.)
Okay, saying this differently. My TV is an OLED display. It doesn't have a backlight, so it can't be prone to the problem as it's been described.
Thinking this through more carefully though: suppose my TV did have a backlight, flickering at some frequency fd, where fd > the refresh rate, and suppose the Time Sleuth samples at its own frequency ft, where ft is also > the refresh rate. If fd is an exact (integer) multiple of ft, then ft will lock into sampling fd at one particular point in its duty cycle, with that point determined by the phase of ft relative to fd. This would produce a consistent latency measurement that changes each time the fd or ft clocks are restarted, but between runs the reading would only vary within one full period of ft (the larger of the two periods, since fd > ft here). So the latency test might show very small variance between runs, but nowhere near the full 16 ms of variance I'm seeing. For this mechanism to produce a full 16 ms of variance, the backlight would have to flicker only once every frame.
If instead, as seems much more likely, fd is not an exact multiple of ft, the phase difference will drift over time, which I think should produce varying latency measurements without restarting either clock.
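If it helps, here's a quick Python sketch of that phase-locking argument. The frequencies are made-up placeholders (I have no idea what the Time Sleuth actually samples at); the point is just to show how the ratio fd/ft determines whether the sample point sticks or drifts:

```python
# Toy model of the aliasing argument above. fd = 960 Hz (backlight PWM)
# and ft = 120 Hz (sampler) are invented numbers, not real rates.

def sample_phases(fd, ft, n=8):
    """Fraction of the flicker period (0..1) at which each successive
    sample instant k/ft lands."""
    return [round((k * fd / ft) % 1.0, 3) for k in range(1, n + 1)]

# fd an exact multiple of ft: every sample hits the same point of the
# duty cycle, so the reported latency never varies between samples.
print(sample_phases(fd=960.0, ft=120.0))   # all 0.0

# fd not a multiple of ft: the sample point drifts steadily through the
# duty cycle, so readings should wander over time without any restart.
print(sample_phases(fd=973.0, ft=120.0))
```

Restarting either clock in the first case just picks a new (but again constant) phase offset, which matches the "consistent reading that changes per run" behavior described above.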
Also, FWIW: the reason I want a useful measurement at all here is that I want to then test the latency of various components in my HDMI chain, which I can't do until I have a base latency measurement for the TV itself that I'm confident in.