Measuring lag with the OSSC (LG OLEDs)

User avatar
orange808
Posts: 3196
Joined: Sat Aug 20, 2016 5:43 am

Re: Measuring lag with the OSSC (LG OLEDs)

Post by orange808 »

With 120Hz available as an option in some situations, I like knowing the total latency. Better testers that handle more signal types will almost certainly arrive, but the current "standard" of reporting total latency is future proof.

Total display latency doesn't say anything about the game machine or software, but it lets us know what the possibilities are. Naturally, higher frame rates will lower the possible display lag "floor". I'm okay with latency numbers that show that (as long as the frame rate is mentioned).

If anything, I would like to see lag always presented with the top, middle, and bottom readings, the signal information (resolution and frame rate), and the display settings used during the test ("game mode" settings, etc.). Few reviewers share all of that. It's also frustrating that nobody bothers to test deinterlacing lag; a Time Sleuth and five minutes isn't prohibitively expensive.

Showing me the total of processing latency plus "scan out" also tells me something about how the display works. Some displays completely buffer the frame before they begin a new "scan out". I can also make some generalisations about how a display's "black frame insertion" feature works from the lag numbers.

The real problem is: it's complicated and oversimplification will always confuse people in the end.

More and more often (it seems), there are multiple possible display latency numbers based on signal type, frame rate, and display feature settings.
We apologise for the inconvenience
User avatar
Xyga
Posts: 7181
Joined: Tue Nov 05, 2013 8:22 pm
Location: block

Re: Measuring lag with the OSSC (LG OLEDs)

Post by Xyga »

I would have bet money someone would immediately pop up to say all that. :mrgreen:

Ok, but personally, before anything else I like to point out the one big misunderstanding that plagues maybe 90% of the cases involving people, their video games, and their displays: what they need to understand first is what the figures they read on the most popular websites actually mean, so they can finally call things by their name.
If we start covering all the cases at the same time, also mentioning the software's own delay etc., people won't follow the reasoning; they haven't for like a decade, so...
(you're right that with more and more 4K displays around it's important to explain the oddities some of those show, but basics/basic cases first)

In any case, strictly display-speaking, whatever the signal type/rate/scan method, I don't call what's part of the normal display process for a source 'lag' or 'delay', because it isn't (which is why I don't use the misleading term 'total display lag' anymore either). Only what's abnormal/unwanted, actually adding to the normal-ideal chain within the display, deserves to be called lag/delay.

For your wish, oh, I feel you. Honestly, if I thought building a better database-type website for lag was at all possible, I'd have loved to participate. I have learned, however, that it is almost impossible to convince stores to let me test displays; at least in my country it's a dead end.
As for gathering data from people online, you know it never brings much: either the models are already obsolete, or contributors just drop a figure with no info whatsoever, sometimes not even which bar, and then they disappear...
One thing that would help change that, to begin with, is an ultra-cheap standalone tester, like 15~20 bucks max, because current options are all WAY too expensive. Naturally most people don't see the point of spending that money on something they'll probably use only 2~3 times ever.
Strikers1945guy wrote:"Do we....eat chicken balls?!"
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

Xyga wrote:The OSSC doesn't read/report like the other testers do: since it apparently doesn't add the frame 'draw' time wherever below the top you place the sensor, it reports only the actual lag/delay, and in most cases there is only one measurement to report.
(logically, though that way of reading is too counter-intuitive for what people expect; I can't say it enough)
Not sure what your point is here - please elaborate.
Xyga wrote:
I don't dismiss any possibilities, but xeos hasn't shared anything about his monitor besides the brand. Still, 4.3ms would be a very common result for tons of monitors today; that's what you'll read from a serious website like pcmonitors.info or tftcentral.co.uk, while lesser ones would give something around 10+ms taken from the middle of the screen or an average.
This case smells like it: like he read somewhere in a review/chart that his monitor is 10+ms, and now, testing himself, he's surprised to read less, same as Konsolkongen here.
The same happens often enough with people using a Leo Bodnar or any other lag tester by themselves, like "wait, this is less than x or y review said", and it's hard to swallow that much more popular websites like RTINGS or DisplayLag are, well, not technically 'wrong' but slightly misunderstanding the topic of lag/delay.
Actually, nothing of the sort - no lag numbers have been published for my monitor anywhere, as it's ancient, about 10 years old: a KDL-40VL130. I figured before I upgraded I should see how much of an improvement I'd get. Anyway, no way it's doing 4.3ms. It might *possibly* be strobing the backlight, however, which could easily create a false positive for the probe onset. Indeed, with more careful measuring I discovered that it sometimes returned the same nonsense <10ms measurement even when the sensor was not overlapping the probe but was placed on the black part of the screen. If all decent monitors are <10ms then there's little point in this bit of DIY hardware modding, at least IM(H)O. Nobody's ever going to be able to tell the difference between 0 lag and 10ms.

Xyga wrote: In regards to what you think about the brightness/sensor, honestly I don't think it makes a big difference. Maybe, like with the other testers, we can read at worst a couple of ms difference between 0% and 50% brightness, then another two between 50% and 100%.
Like response times, in most cases the difference brightness makes for lag testing shouldn't be significant at all.
Reading an analog value like brightness with a fixed digital threshold is going to work well only if you are very lucky. Otherwise the threshold can be too low (as in my case) or too high (resulting in no measurement at all), and depending on the empirical black-to-white transition curve it could add a lot to the lag. I did a simple modification to change the sensitivity on my setup: I covered part of the box holding the photodiode with an index card so that I could shift the trigger threshold (i.e. make the sensor less sensitive). This worked great to eliminate the false positives I was measuring below 10ms. Now, by adjusting the light level reaching the diode, I can measure lag levels of 20ms to 54ms. It doesn't matter where on the screen I make measurements - the effect of sensor sensitivity swamps screen painting time. This is why the wiki on the lag tester offhandedly mentions using a pot instead of a fixed resistor. Sadly, when ordering components I didn't realize that the OSSC was using a binary threshold instead of an ADC, though in retrospect it's obvious that's what's going on, and of course an ADC would be crazy given that the circuit was designed for just reading a button press. Using a pot would allow a bit more control than my index card trick, but it does not resolve the fundamental problem of measuring an analog value with a binary threshold.
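A minimal Python sketch of why this matters, using a made-up exponential black-to-white transition (the curve, rise time, and thresholds here are illustrative assumptions, not measurements of this monitor): a comparator with a fixed threshold fires at very different times depending on where that threshold sits, which shifts the apparent "lag" by tens of milliseconds.

```python
# Toy simulation: when a fixed binary threshold fires on an analog
# black-to-white transition. All numbers are made up for illustration.
import numpy as np

def brightness(t_ms, rise_ms=20.0):
    """Toy transition: 0.0 at t=0, rising towards 1.0 with the given time constant."""
    return 1.0 - np.exp(-t_ms / rise_ms)

def crossing_time(threshold, rise_ms=20.0, step_ms=0.1):
    """Time at which a comparator with the given threshold would trigger."""
    t = np.arange(0.0, 200.0, step_ms)
    above = brightness(t, rise_ms) >= threshold
    return t[above][0] if above.any() else None

for thr in (0.05, 0.5, 0.9):
    print(f"threshold {thr:.2f} -> fires {crossing_time(thr):.1f} ms after the transition starts")
```

With a pot (or a real ADC) you could at least choose where on that curve the trigger sits, which is exactly the control the index card trick approximates.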

Given my new understanding of the OSSC hardware I'm not so excited by the lag-testing feature. I mean, it's certainly a neat hack and there's no harm in including it in the firmware, but it's far from robust enough to be worth much. Perhaps it would fare better with OLED screens, where the black-to-white transition is essentially instantaneous and blacks really are black. But I assume all OLED screens have fantastically low lag anyway, better even than the CRTs of legend.

No offense to the designer, marqs85 - he didn't set out to build a lag tester, and the OSSC is good for what it was designed to do.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

Xyga wrote: Think about it: by the logic of taking the center or average as the reference for measuring, all CRTs lag 8.3ms (at 60Hz).
That's just silly.
Yes and no. The real lag of a 120Hz CRT is in fact lower than that of a 60Hz CRT, and presumably that would only be empirically measurable if you chose somewhere other than the top of the screen to measure. Moving to the present, 120Hz and 240Hz screens are almost mainstream options now, though with a price premium to be sure. A fancy graph or table can convey the fixed lag and the lag due to refresh rate, but if you want a single value that captures what people actually experience, the center of the screen is probably the ideal choice.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

Xyga wrote:I would have bet money someone would immediately pop up to say all that. :mrgreen:

One thing that would help change that, to begin with, is an ultra-cheap standalone tester, like 15~20 bucks max, because current options are all WAY too expensive. Naturally most people don't see the point of spending that money on something they'll probably use only 2~3 times ever.
After playing with a Raspberry Pi Zero a bit for some home automation projects, I think $10 is doable - certainly if you were willing to accept the binary threshold approach used by the OSSC. I'm not sure the Pi can do faster than 60Hz refresh over HDMI, though. Disclaimer: I'm not actually working on a Raspberry Pi latency tester.
strayan
Posts: 671
Joined: Sun Mar 19, 2017 8:33 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by strayan »

According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344
User avatar
RIP-Felix
Posts: 140
Joined: Sun Mar 31, 2019 7:54 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by RIP-Felix »

strayan wrote:According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344
Perhaps they don't want to disclose input latency on newer models because, by omission, they can imply it's the same going forward as it was for the best-performing model.

My first question is this: are there appreciable gains to be had if 144Hz or 240Hz OLEDs become a thing?

60Hz = 16.67 ms/Frame
120Hz = 8.33 ms/Frame (-8.33ms vs. 60Hz)
144Hz = 6.94 ms/Frame (-9.72ms vs. 60Hz)(-1.39ms vs.120Hz)
240Hz = 4.17 ms/Frame (-12.5ms vs. 60Hz)(-4.17ms vs.120Hz)(-2.78ms vs.144Hz)
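Those per-frame figures fall straight out of the refresh rate; here's a quick Python sketch reproducing the list above (purely arithmetic, no assumptions beyond the listed rates):

```python
# Frame period at each refresh rate, plus the saving relative to the slower rates.
rates_hz = [60, 120, 144, 240]
period_ms = {hz: 1000.0 / hz for hz in rates_hz}

for hz in rates_hz:
    savings = ", ".join(
        f"-{period_ms[slower] - period_ms[hz]:.2f}ms vs. {slower}Hz"
        for slower in rates_hz if slower < hz
    )
    print(f"{hz}Hz = {period_ms[hz]:.2f} ms/frame" + (f" ({savings})" if savings else ""))
```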

Not everyone buying an LG OLED is a console gamer. Anyone who tries a 60Hz vs 120Hz monitor can feel and see the 8ms difference (especially when using a mouse). However, the difference between 120Hz, 144Hz, and 240Hz may not be so easily discerned; there is only a 1-4ms difference between them. On paper, the jump from 120Hz to 240Hz should only be half as dramatic as the jump from 60Hz to 120Hz. I've never actually tried it, so is it? And how much of the difference do you think is due to display tech (IPS vs LCD vs LED vs OLED, pixel response times and such)?

Sure, psychologically gamers care about the extra 4ms, even if they couldn't feel it (and I'm not saying they can't). A 1ms display removes the lag excuse for why you suck, which is reassuring when you're putting in the time to not suck (it's one of the reasons I prefer retro games on a CRT). On the industry side, playing up the importance of those precious milliseconds sells monitors. Everything else being equal, if one PC monitor didn't have input latency listed, would you buy it or the one that says 1ms? Even if they were both 1ms displays, you would buy the one that says so, for reassurance. So I get why its importance might become inflated.

My second question is whether the gains would be worth the additional cable bandwidth and GPU horsepower required to handle 240Hz at anything beyond 1080p. Is 120Hz the sweet spot? Perhaps that's where LG's decision to stop highlighting input latency is coming from: it's good enough to not really matter anymore.
User avatar
orange808
Posts: 3196
Joined: Sat Aug 20, 2016 5:43 am

Re: Measuring lag with the OSSC (LG OLEDs)

Post by orange808 »

I worry that a push for higher frame rates will introduce more checkerboard rendering and all the artifacts that come with it to PC--where compression doesn't belong.

It's like pushing PC games to follow broadcast television down the interlacing rabbit hole.

If I have to give up proper rendering for high frame rates, I'm good with 60Hz and 1ms polling. If the games are coded well and you have a good display, the lag is sufficiently low.
We apologise for the inconvenience
User avatar
Konsolkongen
Posts: 2309
Joined: Fri May 16, 2008 8:28 pm
Location: Denmark

Re: Measuring lag with the OSSC (LG OLEDs)

Post by Konsolkongen »

strayan wrote:According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344
Have they ever listed input lag on their TVs? Has any manufacturer ever done that? I can't remember ever seeing this listed when looking at specs.
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by tongshadow »

Konsolkongen wrote:
strayan wrote:According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344
Have they ever listed input lag on their TVs? Has any manufacturer ever done that? I can't remember ever seeing this listed when looking at specs.
GAMING MONITORS :mrgreen:
User avatar
Konsolkongen
Posts: 2309
Joined: Fri May 16, 2008 8:28 pm
Location: Denmark

Re: Measuring lag with the OSSC (LG OLEDs)

Post by Konsolkongen »

You're probably right and that's what he meant, but he is being interviewed about their 2020 OLED TVs, which is why it didn't make sense to me :)
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

xeos wrote:
Xyga wrote:I would have bet money someone would immediately pop up to say all that. :mrgreen:

One thing that would help change that, to begin with, is an ultra-cheap standalone tester, like 15~20 bucks max, because current options are all WAY too expensive. Naturally most people don't see the point of spending that money on something they'll probably use only 2~3 times ever.
After playing with a Raspberry Pi Zero a bit for some home automation projects, I think $10 is doable - certainly if you were willing to accept the binary threshold approach used by the OSSC. I'm not sure the Pi can do faster than 60Hz refresh over HDMI, though. Disclaimer: I'm not actually working on a Raspberry Pi latency tester.
Disclaimer revoked: I wrote a Raspberry Pi latency tester.

https://alantechreview.blogspot.com/202 ... berry.html

Besides a Raspberry Pi Zero, you need a reasonably fast camera, such as the ones Samsung and Apple have been shipping in their flagship phones for the last 5 or so years.
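For anyone curious what such a tester boils down to, here is a minimal sketch of the camera-based idea, assuming a Pi (or any Linux box) with a display driven by pygame and a camera readable through OpenCV. The device index, brightness threshold, and sample count are illustrative assumptions and not taken from the linked write-up, and the precision is limited by the camera's own frame interval and latency, which is why a genuinely fast camera matters.

```python
# Hypothetical camera-based latency tester sketch: flash the screen white and
# time how long until the camera sees the change. Camera latency and frame
# interval are included in the result, so treat the numbers as rough.
import time
import cv2
import pygame

CAM_INDEX = 0           # assumption: camera is the first V4L2 device
BRIGHT_THRESHOLD = 200  # assumption: mean 8-bit pixel value that counts as "white"

def measure_once(screen, cam):
    # Settle on a black screen first so the baseline is dark.
    screen.fill((0, 0, 0))
    pygame.display.flip()
    time.sleep(0.5)

    # Flash white and note when the flip call returns.
    screen.fill((255, 255, 255))
    pygame.display.flip()
    t_flash = time.monotonic()

    # Poll camera frames until the mean brightness crosses the threshold.
    while True:
        ok, frame = cam.read()
        if not ok:
            continue
        if cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean() > BRIGHT_THRESHOLD:
            return (time.monotonic() - t_flash) * 1000.0  # milliseconds

def main():
    pygame.init()
    screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
    cam = cv2.VideoCapture(CAM_INDEX)
    samples = sorted(measure_once(screen, cam) for _ in range(10))
    print(f"min {samples[0]:.1f} ms, median {samples[len(samples) // 2]:.1f} ms")
    cam.release()
    pygame.quit()

if __name__ == "__main__":
    main()
```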
User avatar
Unseen
Posts: 723
Joined: Sun May 25, 2014 8:12 pm
Contact:

Re: Measuring lag with the OSSC (LG OLEDs)

Post by Unseen »

xeos wrote:Yes and no. The real lag of a 120hz CRT is in fact lower than a 60hz CRT, and presumably that would only be empirically measurable if you choose somewhere other than the top of the screen to measure.
But it's fake lag - it only exists because of an arbitrary decision to start measurement at the start of a frame instead of measuring from the point where the display receives the data it is supposed to show. Obviously, the display cannot look into the future, so starting the measurement before the data is sent does not make any sense - it's as if you're timing a 100m race, but start the timer when "on your mark" is called out instead of starting it at "go".

Furthermore, this fake lag is completely deterministic and can be calculated to an arbitrary starting point just from the video timing and the position of the measurement sensor on screen. If you really demand that your display must be able to look into the future, you can add it to a more logical "signal-to-photon" measurement.
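To make the "completely deterministic" part concrete, here is a small Python sketch of that conversion. The 1125-line total and the example signal-to-photon figure are assumptions for a generic 1080p60 mode, not measurements: the scanout portion is just the sensor's vertical position times the frame period, so it can be added to (or subtracted from) either style of reading.

```python
# Convert a signal-to-photon reading into a Bodnar-style start-of-frame reading
# by adding the deterministic scanout offset for the sensor's position.
def scanout_offset_ms(sensor_line, total_lines=1125, refresh_hz=60.0):
    """Time from start of frame until the signal for the sensor's line is transmitted."""
    return (1000.0 / refresh_hz) * (sensor_line / total_lines)

signal_to_photon_ms = 5.0  # assumed example reading, the same at any sensor position
for name, line in (("top", 0), ("middle", 540), ("bottom", 1080)):
    total = signal_to_photon_ms + scanout_offset_ms(line)
    print(f"{name}: {total:.1f} ms measured from start of frame")
```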

And as another point: A signal-to-photon measurement as (AFAIK) done by the OSSC puts its measurement boundaries at the back (signal input) and front (screen) of the display, while a Bodnar-style measurement moves one of the boundaries inside the signal source (start of frame). For fair comparisons between displays, the former way looks a lot more reasonable to me.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

Unseen wrote:
xeos wrote:Yes and no. The real lag of a 120hz CRT is in fact lower than a 60hz CRT, and presumably that would only be empirically measurable if you choose somewhere other than the top of the screen to measure.
But it's fake lag - it only exists because of an arbitrary decision to start measurement at the start of a frame instead of measuring from the point where the display receives the data it is supposed to show. Obviously, the display cannot look into the future, so starting the measurement before the data is sent does not make any sense - it's as if you're timing a 100m race, but start the timer when "on your mark" is called out instead of starting it at "go".

Furthermore, this fake lag is completely deterministic and can be calculated to an arbitrary starting point just from the video timing and the position of the measurement sensor on screen. If you really demand that your display must be able to look into the future, you can add it to a more logical "signal-to-photon" measurement.

And as another point: A signal-to-photon measurement as (AFAIK) done by the OSSC puts its measurement boundaries at the back (signal input) and front (screen) of the display, while a Bodnar-style measurement moves one of the boundaries inside the signal source (start of frame). For fair comparisons between displays, the former way looks a lot more reasonable to me.
I may be totally missing your point, but in fairness you might be missing mine too ;-)

Let me elaborate on my perspective: since this is a video game forum, we should consider when that frame of video is rendered. As far as I know, that's all at once. You don't render the top part, start sending it out over HDMI, and then move on to rendering the bottom. Well, actually, the Atari 2600 does work that way, but that's a pretty special case. So the last moment your input into the computer/console/whatever could change the picture rendered is some point before the top of the screen is scanned out. Since that's going to depend on the game, there's no point in trying to estimate it, other than to acknowledge that it can't be any point after the first pixel of the video signal has been sent over the cable. Thus the amount of lag between player input and photon produced is going to be higher at the bottom of the screen than at the top.

As for the complete determinism, that's true, but people sure like to compare numbers without paying attention to the context - such as when it's a plasma display, where the scanout actually happens faster than the vertical refresh. In fact, as lovers of low lag we should hope that this approach becomes the standard rather than a rare case from a bygone era of display tech. And with modern displays there's no reason that the scanout needs to take exactly one refresh cycle: you could do all your preprocessing/motion estimation/deinterlacing and then push it out to the entire display in 4 or 8ms instead of the 16 implied by 60Hz. I suppose it depends on whether the processing happens over a couple of scan lines or an entire frame, but given the lag you see in TVs, I'd guess it's often entire frames.

Finally, first photons and all that are nice and measurable, but realistically they suggest an overly rosy idea of the true lag. It would be better to report a number that factored in the g2g response time of the monitor or something of that type. I would personally report the first photon at the top, and the 50% or 90% complete brightness response at the bottom. Call them the "marketing bs" and true lag, respectively ;-)
User avatar
Unseen
Posts: 723
Joined: Sun May 25, 2014 8:12 pm
Contact:

Re: Measuring lag with the OSSC (LG OLEDs)

Post by Unseen »

xeos wrote:Let me elaborate on my perspective: since this is a video game forum, we should consider when that frame of video is rendered. As far as I know, that's all at once.
I had something about this in my post originally, but deleted it because I thought it would be too confusing. I don't think that assumption is valid in the general case.
Well actually, the atari 2600 does work that way, but that's a pretty special case.
Raster-line-based effects aren't limited to just the Atari 2600; e.g. byuu loves to bring up a certain SNES game as an emulator-accuracy example, one that modifies some registers in a small part of the frame to generate a shadow for the player's ship. Modern VR systems sample the user input as close as possible before scanning out the picture and apply transformations to the previously-rendered picture to reduce the motion-to-photon latency.

Oh, and some modern emulators also try to race the beam to minimize input latency.
So the last moment your input into the computer/console/whatever could change the picture rendered is some point before the top of the screen is scanned out.
For systems that use hardware-scanned inputs something like that could be true, although I suspect the scan point is not always the first visible line of the picture. For systems where inputs are handled in software, the input can be read whenever it is convenient for the programmer, for example in a C64 game that has the main play area at the top and score/status at the bottom I would not be surprised if the input is sampled after switching to the status part because that likely needs less CPU intervention and thus allows more CPU time for handling the game logic.
Thus the amount of lag between player input and photon produced is going to be higher at the bottom of the screen than the top.
Yes, but that does not matter as long as we are talking only about display lag. If you claim that a certain number represents the display lag, but actually measure the entire path from a controller input through a rendering system, video transmission and the display processor that is just misleading.
As for the complete determinism, that's true, but people sure like to compare numbers without paying attention to the context.
Exactly! This is why a display lag measurement must include only the lag caused by the display itself and nothing that is outside of the control of the display.
You could do all your preprocessing/motion estimation/deinterlacing and then push it out to the entire display in 4 or 8ms instead of the 16 implied by 60hz.
I'm not completely sure, but it appears to me that you propose to lower display lag by increasing display lag? To scan out a picture faster than it is received, you must first buffer a sufficient amount so you don't run out of pixels before the bottom of the screen. Any buffer adds lag, even if it is a clever implementation that buffers just enough that the final pixel is received just-in-time.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

Unseen wrote:
xeos wrote:Let me elaborate on my perspective: since this is a video game forum, we should consider when that frame of video is rendered. As far as I know, that's all at once. Well actually, the atari 2600 does work that way, but that's a pretty special case.
Raster-line-based effects aren't limited to just the Atari 2600; e.g. byuu loves to bring up a certain SNES game as an emulator-accuracy example, one that modifies some registers in a small part of the frame to generate a shadow for the player's ship. Modern VR systems sample the user input as close as possible before scanning out the picture and apply transformations to the previously-rendered picture to reduce the motion-to-photon latency.
Raster line effects don't necessarily mean that the game state itself is updated. If it were, you'd be at risk of a piecemeal appearance where there could be a discontinuity between states, such as is seen all the time with vsync off in 3D games, usually referred to as tearing. I don't deny it's possible to be smart about this and only update just before rendering, and also be smart enough to only do it when it's not going to be visible, but it starts to require an awful lot of cleverness. I'd be curious about any specific examples. My recollection of the VR stuff is that they do delay rendering as long as possible and at times even apply a whole-image transform after 3D rendering, but I've never heard of them actually repositioning objects on the screen as rendering progresses from top to bottom. Again, I'd welcome a specific citation to the contrary - it's an interesting idea but seems like it would be very prone to tearing artifacts.
So the last moment your input into the computer/console/whatever could change the picture rendered is some point before the top of the screen is scanned out.
For systems that use hardware-scanned inputs something like that could be true, although I suspect the scan point is not always the first visible line of the picture. For systems where inputs are handled in software, the input can be read whenever it is convenient for the programmer, for example in a C64 game that has the main play area at the top and score/status at the bottom I would not be surprised if the input is sampled after switching to the status part because that likely needs less CPU intervention and thus allows more CPU time for handling the game logic.
Right. So this means that lag will be even worse than the estimate I'm suggesting is most representative, but definitely not better, which is what you get if you measure signal-out to photon-out.
Thus the amount of lag between player input and photon produced is going to be higher at the bottom of the screen than the top.
Yes, but that does not matter as long as we are talking only about display lag. If you claim that a certain number represents the display lag, but actually measure the entire path from a controller input through a rendering system, video transmission and the display processor that is just misleading.
It's only misleading if it's not the typical scenario. My belief is that >90% of games are not updating the rendered output mid-frame, so the game state is as stale as the top-most pixel and only gets worse as you travel down the screen. Again, I'd be interested in references to the contrary.
You could do all your preprocessing/motion estimation/deinterlacing and then push it out to the entire display in 4 or 8ms instead of the 16 implied by 60hz.
I'm not completely sure, but it appears to me that you propose to lower display lag by increasing display lag? To scan out a picture faster than it is received, you must first buffer a sufficient amount so you don't run out of pixels before the bottom of the screen. Any buffer adds lag, even if it is a clever implementation that buffers just enough that the final pixel is received just-in-time.
Most displays do a lot of processing on the input - otherwise lag would be 0 at the upper-left corner and 16.6ms at the bottom, like our good old progressive CRTs. Instead it's 8-80ms more than that, so they are already buffering a lot, all in the name of reducing motion smear and so on. Or faking 120Hz from a 60Hz signal.
User avatar
Ed Oscuro
Posts: 18654
Joined: Thu Dec 08, 2005 4:13 pm
Location: uoıʇɐɹnƃıɟuoɔ ɯǝʇsʎs

Re: Measuring lag with the OSSC (LG OLEDs)

Post by Ed Oscuro »

I just want to point out that screen tearing is technically an example of a game updating a frame after the frame has begun drawing, even when it is due to the next frame's drawing being delayed rather than images being drawn too quickly. I don't know what proportion of current games this mangles, but it certainly was a problem in some 360 generation games. Some versions have tearing while others accept input lag to cover it up with v-sync. (Aside: I recently discovered I had my G-Sync settings wrong for quite a while, so lots of things were tearing on my PC, even in scenarios where it didn't make sense for them to be. Thankfully blurbusters points out that one should not have "fast" v-sync activated alongside G-Sync.)

Additionally, it might be a sensible alternative to talking about "display-based" or "console-based" latency to simply emphasize there is a window in which the output is meant to appear, and if something appears outside that window then it is objectionable. With faster scanout to the bottom of the screen (at least that's what I think happens on modern displays!) this should be closer to the truth than the inherent rolling shutter-like model on CRTs.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

Ed Oscuro wrote: With faster scanout to the bottom of the screen (at least that's what I think happens on modern displays!) this should be closer to the truth than the inherent rolling shutter-like model on CRTs.
Do you know of any specific examples other than most/all plasma displays? If it's already widespread, that's all the more reason to report lag at the bottom of the screen rather than at the top (or, of course, both).
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by tongshadow »

Getting these results on my CRT monitors, what does the second ms mean?
https://i.imgur.com/vU3GmoM.jpg
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

tongshadow wrote:Getting these results on my CRT monitors, what does the second ms mean?
https://i.imgur.com/vU3GmoM.jpg
Phosphor persistence. Keep in mind the threshold is arbitrary, so it doesn't really mean that much.
User avatar
orange808
Posts: 3196
Joined: Sat Aug 20, 2016 5:43 am

Re: Measuring lag with the OSSC (LG OLEDs)

Post by orange808 »

For the record, the Atari VCS (aka 2600) software won't usually poll the controller and perform updates to anything that would constitute "input lag" while updating the screen. Game logic is done during overscan and vblanking, so in practice the end-user experience is no different from using a machine with proper video firmware and a frame buffer. Paddle and trackball games may perform extra checks during the frame, but you still don't create a new/updated frame of game logic until the gun is finished.

Extremely low lag is not something you'll encounter in real games.

Technically, I could write you a Pong game with paddles at the top and bottom of the screen that did poll the controller just a few scanlines before rendering the player and update the game logic more than once a frame, but who would want it, play it, or care?

Furthermore, some legacy software will deal with limited resources by spreading logic out across more than one frame--effectively clouding the true framerate and response of the software into something that can't be measured or described well.

There's display lag, controller lag, and software lag. Sans source code, the software is a "black box". That leaves us running camera tests if we want real latency for each game.
We apologise for the inconvenience
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

orange808 wrote: Technically, I could write you a Pong game with paddles at the top and bottom of the screen that did poll the controller just a few scanlines before rendering the player and update the game logic more than once a frame, but who would want it, play it, or care?
Exactly. This is why I think you should report lag relative to the start (or really, the end) of vsync rather than when the signal is sent over the wire. Pixels at the bottom of the screen are always going to be more stale than those at the top for video game content. Only with video recorded by an old-fashioned rolling shutter exactly synced to 60Hz would it be any different (indeed, this is how old TV cameras worked). And since none of us care about lag for old-fashioned TV cameras...

And as you've (implicitly) pointed out, none of us have ever noticed that our spaceship/whatever lags more at the bottom of the screen than the top.

BUT: I still think it's worth measuring input lag in this day and age, where displays can vary from 3ms to 50ms of lag (or worse for interlaced signals). And once you start that project, it's better to be consistent about what you report. In my reviews I measure/report both input lag and response time, but emphasize the total. I just finished a write-up of a monitor that has an excellent 4ms input lag and a jaw-dropping 18ms of response time, switching it from a fantastic choice to a middle-of-the-line choice (https://alantechreview.blogspot.com/202 ... g-and.html). More controversially, I also think you should use lag at the bottom of the screen when ranking screens, even though that bakes in the refresh rate. My reasoning: an input lag of 4ms vs 8ms sounds significant when measured at the top of the screen, but less so when measured at the bottom: 20ms vs 24ms (assuming 60Hz refresh).
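As a rough illustration of that ranking argument (a sketch only; the 4ms and 8ms figures are the hypothetical displays from the paragraph above, and the bottom-of-screen number simply adds a full 60Hz frame of scanout):

```python
# Top-of-screen lag vs. lag at the bottom after a full 60Hz scanout.
frame_ms = 1000.0 / 60.0
for name, top_lag_ms in (("display A", 4.0), ("display B", 8.0)):
    print(f"{name}: {top_lag_ms:.1f} ms at the top, {top_lag_ms + frame_ms:.1f} ms at the bottom")
```

The 4ms gap itself is unchanged, but against roughly 21ms and 25ms totals it looks far less dramatic, which is the point.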
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by tongshadow »

Alright, for the heck of it I'm also adding the display lag of my BenQ XL2420T (3.66ms).
https://i.imgur.com/cjdi72X.jpg
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

tongshadow wrote:Alright, for the heck of it I'm also adding the display lag of my BenQ XL2420T (3.66ms).
https://i.imgur.com/cjdi72X.jpg
But what's its actual response time? Without that you don't really know how long it takes to actually render a change to the screen.
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by tongshadow »

1ms.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

tongshadow wrote:1ms.
Measured, or quoted from the spec sheet?
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Measuring lag with the OSSC (LG OLEDs)

Post by tongshadow »

Spec sheet.
User avatar
xeos
Posts: 167
Joined: Thu Oct 20, 2016 10:38 pm
Location: San Diego, California

Re: Measuring lag with the OSSC (LG OLEDs)

Post by xeos »

tongshadow wrote:Spec sheet.
Spec sheets can lie pretty badly. Did you see my earlier example of a 5ms-spec monitor actually taking 18ms in the real world?