shmups.system11.org

Shmups Forum
 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Tue Jan 07, 2020 11:35 pm 



Joined: 20 Aug 2016
Posts: 1726
With 120Hz as an available option in some situations, I like knowing the total latency. Better testers that handle more signal types will almost certainly arrive. The current "standard" is future proof.

Total display latency doesn't say anything about the game machine or software, but it lets us know what the possibilities are. Naturally, higher frame rates will lower the possible display lag "floor". I'm okay with latency numbers that show that (as long as the frame rate is mentioned).

If anything, I would like to see lag always presented with: the top, middle, and bottom readings AND the signal information ("resolution" and frame rate) AND the display settings during the test ("game mode" settings, etc., etc.). Few reviewers share all of that. It's also frustrating that nobody bothers to test deinterlacing lag. A Time Sleuth and five minutes isn't prohibitively expensive.

Showing me the total of processing latency plus "scan out" also tells me something about how the display works. Some displays completely buffer the frame before they begin a new "scan out". Also, from the lag numbers I can make some generalisations about how a display's "black frame insertion" feature works.
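As a rough illustration (my own back-of-the-envelope sketch, with made-up example numbers): if the top-of-screen reading by itself already approaches a full frame period, the display is almost certainly buffering a whole frame before it starts scanning out.

Code:
# Back-of-the-envelope check: how many whole frame periods of processing/buffering
# does a top-of-screen reading imply? Example numbers are made up for illustration.
def frame_periods_of_processing(top_of_screen_ms, refresh_hz):
    return top_of_screen_ms / (1000.0 / refresh_hz)

for label, top_ms in [("fast gaming monitor", 2.0), ("full-frame buffering TV", 17.5)]:
    periods = frame_periods_of_processing(top_ms, refresh_hz=60)
    print(f"{label}: {top_ms} ms at the top ~ {periods:.2f} frame(s) of processing at 60Hz")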

The real problem is: it's complicated and oversimplification will always confuse people in the end.

More and more often (it seems), there are multiple possible display latency numbers based on: signal type, frame rate, and display feature settings.
_________________
We apologise for the inconvenience


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Wed Jan 08, 2020 12:36 am 



Joined: 05 Nov 2013
Posts: 7181
Location: block
I would have bet money someone would immediately pop up to say all that. :mrgreen:

OK, but personally, before anything else I like to address the one big misunderstanding that plagues maybe 90% of the cases involving people, the games they play, and their displays: what they need to understand first is what the figures they read on the most popular websites actually mean, so they can finally call things by their name.
If we start covering every case at once, also mentioning the software's own delay etc., people won't follow the reasoning; they haven't for about a decade, so...
(You're right that with more and more 4K displays around it's important to explain the oddities some of those show, but basics/basic cases first.)

In any case, strictly speaking about displays, whatever the signal type, rate, or scan fashion, I don't call what's part of the normal displaying process for a source 'lag' or 'delay', because it isn't (which is why I don't use the misleading term 'total display lag' anymore either). Only what's abnormal or unwanted, actually adding to the normal, ideal 'flow' of the chain within the display, deserves to be called lag/delay.

As for your wish, oh, I feel you. Honestly, if I thought building a better database-type website for lag was at all possible, I'd have loved to participate. I have learned, however, that it is almost impossible to convince stores to let me test displays; at least in my country it's dead.
As for gathering results from people online, you know it never brings much: either the models are already obsolete, or contributors just drop a figure with no info whatsoever, sometimes not even which bar, and then they disappear...
One thing that would help change that, to begin with, is an ultra-cheap standalone tester, like 15~20 bucks max, because current options are all WAY too expensive. Naturally most people don't see the point of spending that money on something they'll probably use only 2~3 times ever.
_________________
Strikers1945guy wrote:
"Do we....eat chicken balls?!"


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Fri Jan 10, 2020 11:42 pm 



Joined: 20 Oct 2016
Posts: 33
Xyga wrote:
The OSSC doesn't read and report like the other testers do, so since it apparently doesn't add the frame 'drawn' time wherever below the top you place the sensor, it tells only the actual lag/delay, and in most cases there is only one measurement to report.
(Logically, though that way of reading is too counter-intuitive for what people expect, I can't say it enough.)



Not sure what your point is here - please elaborate.

Xyga wrote:

I don't dismiss any possibilities, but xeos hasn't shared anything about his monitor besides the brand. Still, 4.3ms would be a very common result expected for tons of monitors today; that's what you'll read from a serious website like pcmonitors.info or tftcentral.co.uk, while lesser ones would give something around 10+ms taken from the middle of the screen or an average.
This case smells like it: like he read somewhere in a review/chart that his monitor is 10+ms, and now testing it himself he's surprised to read less, same as Konsolkongen here.
The same happens often enough with people using a Leo Bodnar or any other lag tester themselves, like "wait, this is less than x or y review said", and it's hard to swallow that much more popular websites like Rtings or displaylag are, well, not technically 'wrong' but slightly misunderstanding the topic of lag/delay.



Actually, nothing of the sort - no lag numbers have been published for my monitor anywhere, as it's ancient, about 10 years old (a KDL-40VL130). I figured before I upgraded I should see how much of an improvement I'd get. Anyway, there's no way it's doing 4.3ms. It might *possibly* be strobing the backlight, however, which could easily create a false positive for the probe onset. Indeed, with more careful measuring I discovered that it sometimes returned the same nonsense <10ms measurement even when the sensor was not overlapping the test area but was placed on a black part of the screen. If all decent monitors are < 10 ms then there's little point in this bit of DIY hardware modding, at least IM(H)O. Nobody's ever going to be able to tell the difference between 0 lag and 10ms.


Xyga wrote:
In regards to what you think about the brightness/sensor, honestly I don't think it makes a big difference. Maybe, like with the other testers, we can read at worst a couple ms difference between 0% and 50% brightness, then another two between 50% and 100%.
Like response times, in most cases the difference brightness makes for lag testing shouldn't be significant at all.


Reading an analog value like brightness with a fixed digital threshold is going to work well only if you are very lucky. Otherwise the threshold can be too low (as in my case) or too high (resulting in no measurement at all). And depending on the empirical black-to-white transition curve it could add a lot to the lag. I made a simple modification to change the sensitivity on my setup - I covered part of the box holding the photodiode with an index card so that I could shift the trigger threshold (i.e. make the sensor less sensitive). This worked great to eliminate the false positives I was measuring below 10ms. Now, by adjusting the light level reaching the diode, I can measure lag levels of 20 ms to 54 ms. It doesn't matter where on the screen I make measurements - the effect of sensor sensitivity swamps screen painting time. This is why the wiki on the lag tester offhandedly mentions using a pot instead of a fixed resistor. Sadly, when ordering components I didn't realize that the OSSC was using a binary threshold instead of an ADC, though in retrospect it's obvious that's what's going on, and of course an ADC would be crazy given that the circuit was designed for just reading a button press. Using a pot would allow a bit more control than my index card trick, but it does not resolve the fundamental problem of measuring an analog value with a binary threshold.
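To illustrate the threshold problem with made-up numbers (this is not the OSSC's firmware, just a toy model assuming a linear black-to-white ramp): the less light reaches the sensor, the later a fixed threshold is crossed, and below some level it never triggers at all.

Code:
# Hypothetical sketch: a fixed trigger threshold applied to an analog brightness ramp.
# Numbers are illustrative only; they are not taken from the OSSC or any real panel.
def crossing_time_ms(threshold, peak_level, rise_time_ms=8.0, step_ms=0.1):
    """Time until a linear black-to-white ramp (reaching peak_level after
    rise_time_ms) first exceeds a fixed threshold; None if it never does."""
    t = 0.0
    while t <= rise_time_ms:
        level = peak_level * (t / rise_time_ms)  # simple linear transition curve
        if level >= threshold:
            return t
        t += step_ms
    return None  # threshold set higher than the panel/sensor ever reaches

THRESHOLD = 0.5  # fixed digital trigger point, e.g. a logic input on a GPIO pin
for peak in (1.0, 0.7, 0.55, 0.45):  # less and less light reaching a partially covered diode
    t = crossing_time_ms(THRESHOLD, peak)
    print(f"peak {peak:.2f} -> " + (f"trigger after ~{t:.1f} ms" if t is not None else "no trigger at all"))

With a pot (or a partially covered sensor) you're effectively moving that threshold relative to the peak brightness, which is why the measured figure can swing from false positives all the way to 20-54 ms depending on how much light reaches the diode.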

Given my new understanding of the OSSC hardware I'm not so excited by the lag testing feature. I mean, it's certainly a neat hack and there's no harm in including it in the firmware. But it's far from robust enough to be worth much. Perhaps it would fare better with OLED screens, where the black-to-white transition is essentially instantaneous and blacks really are black. But I assume all OLED screens have fantastically low lag anyway, better even than the CRTs of legend.

No offense to the designer, marqs85 - he didn't set out to build a lag tester, and the OSSC is good for what it was designed to do.


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Fri Jan 10, 2020 11:49 pm 



Joined: 20 Oct 2016
Posts: 33
Xyga wrote:
Think about it: by the logic of taking the center or average as the reference for measuring, all CRTs lag 8.3ms (at 60Hz).
That's just silly.


Yes and no. The real lag of a 120hz CRT is in fact lower than a 60hz CRT, and presumably that would only be empirically measurable if you choose somewhere other than the top of the screen to measure. Moving to the present, 120Hz or 240Hz screens are almost mainstream options now, though with a price premium to be sure. A fancy graph or table can convey the fixed lag and the lag due to refresh rate, but if you want a single value that captures what people actually experience, the center of the screen is probably the ideal.
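For reference, the ideal-CRT numbers are pure arithmetic (fractions of the frame period, zero processing assumed), so the refresh-rate gain shows up directly:

Code:
# Center-of-screen "lag" of an ideal CRT is just half the frame period,
# since the beam reaches mid-screen half a refresh after the frame starts.
for hz in (60, 120):
    frame_ms = 1000.0 / hz
    print(f"{hz}Hz CRT: top ~0 ms, center ~{frame_ms / 2:.2f} ms, bottom ~{frame_ms:.2f} ms")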


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Fri Jan 10, 2020 11:54 pm 



Joined: 20 Oct 2016
Posts: 33
Xyga wrote:
I would have bet money someone would immediately pop up to say all that. :mrgreen:

One thing that would help change that, to begin with, is an ultra-cheap standalone tester, like 15~20 bucks max, because current options are all WAY too expensive. Naturally most people don't see the point of spending that money on something they'll probably use only 2~3 times ever.


After playing with a raspberry pi zero a bit for some home automation projects, I think $10 is doable. Certainly if you were willing to accept the binary threshold approach used by the OSSC. I'm not sure the pi can do faster than 60hz refresh over HDMI though. Disclaimer: I'm not actually working on a raspberry pi latency tester.


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Sat Jan 11, 2020 9:28 am 



Joined: 19 Mar 2017
Posts: 323
According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Sun Jan 12, 2020 12:06 am 



Joined: 31 Mar 2019
Posts: 80
strayan wrote:
According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344
Perhaps they don't want to disclose input latency on newer models because they can imply by omission it's the same moving forward as it was for the best performing model.

My first question is this: are there appreciable gains to be had if 144Hz or 240Hz OLEDs become a thing?

60Hz = 16.67 ms/frame
120Hz = 8.33 ms/frame (-8.33 ms vs. 60Hz)
144Hz = 6.94 ms/frame (-9.72 ms vs. 60Hz) (-1.39 ms vs. 120Hz)
240Hz = 4.17 ms/frame (-12.5 ms vs. 60Hz) (-4.17 ms vs. 120Hz) (-2.78 ms vs. 144Hz)
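(Those per-frame figures are just 1000/Hz and the pairwise differences; a trivial snippet reproduces them:)

Code:
# Per-frame time in ms for each refresh rate, and the saving relative to slower rates.
rates = [60, 120, 144, 240]
frame_ms = {hz: 1000.0 / hz for hz in rates}
for hz in rates:
    deltas = ", ".join(f"-{frame_ms[slower] - frame_ms[hz]:.2f}ms vs {slower}Hz"
                       for slower in rates if slower < hz)
    print(f"{hz}Hz = {frame_ms[hz]:.2f} ms/frame" + (f" ({deltas})" if deltas else ""))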

Not everyone buying an LG OLED is a console gamer. Anyone who tries a 60Hz vs 120Hz monitor can feel and see the 8ms difference (especially when using a mouse). However, the difference between 120Hz, 144Hz, and 240Hz may not be so easily discerned. There is only 1-4ms difference between them. On paper, the jump from 120Hz to 240Hz should only be half as dramatic as the jump from 60Hz to 120Hz. I've never actually tried it. So is it? And how much of the difference do you think is due to display tech (IPS vs LCD vs LED vs OLED, pixel refresh rate and such)?

Sure, psychologically gamers care about the extra 4ms, even if they couldn't feel it (and I'm not saying they can't). A 1ms display removes the lag excuse for why you suck, which is reassuring when you're putting in the time to not suck (it's one of the reasons I prefer retro games on CRT). On the industry side, playing up the importance of those precious milliseconds sells monitors. Everything else being equal, if one PC monitor didn't have input latency listed, would you buy it or the one that says 1ms? Even if they both were 1ms displays, you would buy the one that says so for reassurance. So I get why its importance might become inflated.

My second question is whether the gains would be worth the additional cable bandwidth and GPU horsepower required to handle 240Hz for anything beyond 1080p. Is 120Hz the sweet spot? Perhaps that's where LG's decision to stop publicizing input latency is coming from. It's good enough to not really matter anymore.
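For a rough sense of the bandwidth side (my own quick figures: uncompressed active pixel data only, ignoring blanking, subsampling, DSC and higher bit depths):

Code:
# Uncompressed pixel-data rate for a few mode/refresh combinations (24-bit colour).
def gbits_per_s(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for name, (w, h, hz) in {"1080p @ 240Hz": (1920, 1080, 240),
                         "1440p @ 240Hz": (2560, 1440, 240),
                         "2160p @ 120Hz": (3840, 2160, 120)}.items():
    print(f"{name}: ~{gbits_per_s(w, h, hz):.1f} Gbit/s of pixel data")

Even before blanking and protocol overhead, 4K at 120Hz already wants roughly twice the raw pixel rate of 1080p at 240Hz.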


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Sun Jan 12, 2020 12:58 am 



Joined: 20 Aug 2016
Posts: 1726
I worry that a push for higher frame rates will bring more checkerboard rendering, and all the artifacts that come with it, to PC, where compression doesn't belong.

It's like pushing PC games to follow broadcast television down the interlacing rabbit hole.

If I have to give up proper rendering for high frame rates, I'm good with 60Hz and 1ms polling. If the games are coded well and you have a good display, the lag is sufficiently low.
_________________
We apologise for the inconvenience


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Sun Jan 12, 2020 10:30 am 



Joined: 16 May 2008
Posts: 1902
Location: Denmark
strayan wrote:
According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344


Have they ever listed input lag on their TVs? Has any manufacturer ever done that? I can't remember ever seeing it listed when looking at specs.


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Sun Jan 12, 2020 6:57 pm 



Joined: 07 Jan 2017
Posts: 173
Konsolkongen wrote:
strayan wrote:
According to an LG spokesperson they will no longer be disclosing input latency https://youtu.be/zRhd2Wsy6L0?t=344


Have they ever listed input lag on their TVs? Has any manufacturer ever done that? I can't remember ever seeing it listed when looking at specs.

GAMING MONITORS :mrgreen:


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Sun Jan 12, 2020 11:31 pm 



Joined: 16 May 2008
Posts: 1902
Location: Denmark
You're probably right and that's what he meant, but he is being interviewed about their 2020 OLED TVs, which is why it didn't make sense to me :)


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Mon May 18, 2020 12:26 am 



Joined: 20 Oct 2016
Posts: 33
xeos wrote:
Xyga wrote:
I would have bet money someone would immediately pop up to say all that. :mrgreen:

One thing that would help change that, to begin with, is an ultra-cheap standalone tester, like 15~20 bucks max, because current options are all WAY too expensive. Naturally most people don't see the point of spending that money on something they'll probably use only 2~3 times ever.


After playing with a raspberry pi zero a bit for some home automation projects, I think $10 is doable. Certainly if you were willing to accept the binary threshold approach used by the OSSC. I'm not sure the pi can do faster than 60hz refresh over HDMI though. Disclaimer: I'm not actually working on a raspberry pi latency tester.


Disclaimer revoked. I wrote a raspberry pi latency tester.

https://alantechreview.blogspot.com/2020/05/a-5-tv-input-lag-tester-using-raspberry.html

Besides a Raspberry Pi Zero, you need a reasonably fast camera, such as the ones Samsung and Apple have been shipping with their flagship phones for the last five or so years.
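For the curious, here's one minimal way the general Pi-plus-fast-camera idea could be wired up. This is an illustrative sketch only, not the exact method from the blog post: the pin number, framebuffer details, and "run as root with the console blanked" are all assumptions. The idea is to fire a GPIO LED and paint the HDMI framebuffer white at (nearly) the same moment, then count slow-motion camera frames between the LED lighting and the TV lighting.

Code:
import time
import RPi.GPIO as GPIO

LED_PIN = 18  # assumption: an LED (plus resistor) wired to BCM pin 18

def fb_size_bytes(fb="fb0"):
    # Work out how many bytes fill the framebuffer from sysfs, so the white fill
    # matches whatever resolution/bit depth the Pi is actually outputting.
    with open(f"/sys/class/graphics/{fb}/virtual_size") as f:
        width, height = map(int, f.read().split(","))
    with open(f"/sys/class/graphics/{fb}/bits_per_pixel") as f:
        bpp = int(f.read())
    return width * height * bpp // 8

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT, initial=GPIO.LOW)
white = b"\xff" * fb_size_bytes()  # all-ones is white in RGB565 and XRGB8888 alike

try:
    GPIO.output(LED_PIN, GPIO.HIGH)        # reference event, visible to the camera at once
    with open("/dev/fb0", "wb") as fbdev:  # screen event, delayed by the display chain
        fbdev.write(white)
    time.sleep(2)                          # hold so the slow-motion clip clearly shows both
finally:
    GPIO.output(LED_PIN, GPIO.LOW)
    GPIO.cleanup()

At 240fps slow motion each camera frame is ~4.2ms, so counting frames between LED-on and the panel lighting up gives a rough, camera-quantised bound on the whole chain's latency at that spot on the screen.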


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Mon May 18, 2020 12:02 pm 



Joined: 25 May 2014
Posts: 616
xeos wrote:
Yes and no. The real lag of a 120hz CRT is in fact lower than a 60hz CRT, and presumably that would only be empirically measurable if you choose somewhere other than the top of the screen to measure.

But it's fake lag - it only exists because of an arbitrary decision to start measurement at the start of a frame instead of measuring from the point where the display receives the data it is supposed to show. Obviously, the display cannot look into the future, so starting the measurement before the data is sent does not make any sense - it's as if you're timing a 100m race, but start the timer when "on your mark" is called out instead of starting it at "go".

Furthermore, this fake lag is completely deterministic and can be calculated to an arbitrary starting point just from the video timing and the position of the measurement sensor on screen. If you really demand that your display must be able to look into the future, you can add it to a more logical "signal-to-photon" measurement.

And as another point: A signal-to-photon measurement as (AFAIK) done by the OSSC puts its measurement boundaries at the back (signal input) and front (screen) of the display, while a Bodnar-style measurement moves one of the boundaries inside the signal source (start of frame). For fair comparisons between displays, the former way looks a lot more reasonable to me.
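Since it is deterministic, converting between the two conventions is just arithmetic. A minimal sketch (illustrative only; it assumes 1080p60 timing with 1125 total lines and lumps blanking into the line count): subtract the time the source needs to transmit down to the sensor's line from a start-of-frame reading to get a signal-to-photon figure for that spot.

Code:
def scanout_offset_ms(sensor_line, total_lines, refresh_hz):
    # Time from the start of the frame until the source has transmitted the sensor's line
    # (simplified: blanking is folded into total_lines, its exact placement is ignored).
    return (sensor_line / total_lines) * 1000.0 / refresh_hz

def signal_to_photon_ms(start_of_frame_reading_ms, sensor_line, total_lines, refresh_hz):
    # Bodnar-style reading minus the deterministic transmission time down to the sensor.
    return start_of_frame_reading_ms - scanout_offset_ms(sensor_line, total_lines, refresh_hz)

# Example: a 25 ms mid-screen reading at 1080p60 (1125 total lines) contains ~8.3 ms
# of pure transmission time, leaving ~16.7 ms of actual display-side delay.
print(signal_to_photon_ms(25.0, sensor_line=562, total_lines=1125, refresh_hz=60))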
_________________
GCVideo releases: https://github.com/ikorb/gcvideo/releases


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Mon May 18, 2020 7:23 pm 



Joined: 20 Oct 2016
Posts: 33
Unseen wrote:
xeos wrote:
Yes and no. The real lag of a 120hz CRT is in fact lower than a 60hz CRT, and presumably that would only be empirically measurable if you choose somewhere other than the top of the screen to measure.

But it's fake lag - it only exists because of an arbitrary decision to start measurement at the start of a frame instead of measuring from the point where the display receives the data it is supposed to show. Obviously, the display cannot look into the future, so starting the measurement before the data is sent does not make any sense - it's as if you're timing a 100m race, but start the timer when "on your mark" is called out instead of starting it at "go".

Furthermore, this fake lag is completely deterministic and can be calculated to an arbitrary starting point just from the video timing and the position of the measurement sensor on screen. If you really demand that your display must be able to look into the future, you can add it to a more logical "signal-to-photon" measurement.

And as another point: A signal-to-photon measurement as (AFAIK) done by the OSSC puts its measurement boundaries at the back (signal input) and front (screen) of the display, while a Bodnar-style measurement moves one of the boundaries inside the signal source (start of frame). For fair comparisons between displays, the former way looks a lot more reasonable to me.


I may be totally missing your point, but in fairness you might be missing mine too ;-)

Let me elaborate on my perspective: since this is a video game forum, we should consider when that frame of video is rendered. As far as I know, that's all at once. You don't render the top part, start sending it out over HDMI, and then move on to rendering the bottom. Well actually, the atari 2600 does work that way, but that's a pretty special case. So the last moment your input into the computer/console/whatever could change the picture rendered is some point before the top of the screen is scanned out. Since that's going to depend on the game there's no point in trying to estimate it, other than to acknowledge that it can't be any point after the first pixel of the video signal has been sent over the cable. Thus the amount of lag between player input and photon produced is going to be higher at the bottom of the screen than the top.

As for the complete determinism, that's true, but people sure like to compare numbers without paying attention to the context. Such as, if it's a plasma display, where the scanout actually happens faster than the vertical refresh. In fact, as lovers of low lag we should hope that this approach becomes the standard rather than a rare case from a bygone era of display tech. And with modern displays there's no reason that the scanout needs to take exactly one refresh cycle. You could do all your preprocessing/motion estimation/deinterlacing and then push it out to the entire display in 4 or 8ms instead of the 16 implied by 60hz. I suppose it depends on whether the processing is happening over a couple scan lines or an entire frame. But given the lag you see in TVs, I'd guess it's often entire frames.

Finally, first photons and all that are nice and measurable, but realistically they suggest an overly rosy idea of the true lag. It would be better to report a number that factors in the g2g response time of the monitor or something of that type. I would personally report the first photon at the top, and the 50% or 90% complete brightness response at the bottom. Call them the "marketing bs" and true lag, respectively ;-)


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Mon May 18, 2020 7:51 pm 



Joined: 25 May 2014
Posts: 616
xeos wrote:
Let me elaborate on my perspective: since this is a video game forum, we should consider when that frame of video is rendered. As far as I know, that's all at once.

I had something about this in my post originally, but deleted it because I thought it would be too confusing. I don't think that assumption is valid in the general case.

Quote:
Well actually, the atari 2600 does work that way, but that's a pretty special case.

Raster-line-based effects aren't limited to just the Atari 2600, e.g. byuu loves to bring up a certain SNES game as an emulator accuracy example that modifies some registers in a small part of the frame to generate a shadow for the player's ship. Modern VR systems sample the user input as close as possible before scanning out the picture and apply transformations to the previously-rendered picture to reduce the motion-to-photon latency.

Oh, and some modern emulators also try to race the beam to minimize input latency.

Quote:
So the last moment your input into the computer/console/whatever could change the picture rendered is some point before the top of the screen is scanned out.

For systems that use hardware-scanned inputs something like that could be true, although I suspect the scan point is not always the first visible line of the picture. For systems where inputs are handled in software, the input can be read whenever it is convenient for the programmer, for example in a C64 game that has the main play area at the top and score/status at the bottom I would not be surprised if the input is sampled after switching to the status part because that likely needs less CPU intervention and thus allows more CPU time for handling the game logic.

Quote:
Thus the amount of lag between player input and photon produced is going to be higher at the bottom of the screen than the top.

Yes, but that does not matter as long as we are talking only about display lag. If you claim that a certain number represents the display lag, but actually measure the entire path from a controller input through a rendering system, video transmission and the display processor that is just misleading.

Quote:
As for the complete determinism, that's true, but people sure like to compare numbers without paying attention to the context.

Exactly! This is why a display lag measurement must include only the lag caused by the display itself and nothing that is outside of the control of the display.

Quote:
You could do all your preprocessing/motion estimation/deinterlacing and then push it out to the entire display in 4 or 8ms instead of the 16 implied by 60hz.

I'm not completely sure, but it appears to me that you propose to lower display lag by increasing display lag? To scan out a picture faster than it is received, you must first buffer a sufficient amount so you don't run out of pixels before the bottom of the screen. Any buffer adds lag, even if it is a clever implementation that buffers just enough that the final pixel is received just-in-time.
_________________
GCVideo releases: https://github.com/ikorb/gcvideo/releases


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Mon May 18, 2020 9:37 pm 



Joined: 20 Oct 2016
Posts: 33
Unseen wrote:
xeos wrote:
Let me elaborate on my perspective: since this is a video game forum, we should consider when that frame of video is rendered. As far as I know, that's all at once. Well actually, the atari 2600 does work that way, but that's a pretty special case.


Raster-line-based effects aren't limited to just the Atari 2600, e.g. byuu loves to bring up a certain SNES game as an emulator accuracy example that modifies some registers in a small part of the frame to generate a shadow for the player's ship. Modern VR systems sample the user input as close as possible before scanning out the picture and apply transformations to the previously-rendered picture to reduce the motion-to-photon latency.



Raster line effects don't necessarily mean that the game state itself is updated. If it were, you'd be at risk of a piecemeal appearance where there could be a discontinuity between states, such as is seen all the time with vsync off in 3D games, usually referred to as tearing. I don't deny it's possible to be smart about this and only update just before rendering, and also be smart enough to only do it when it's not going to be visible. But it starts to require an awful lot of cleverness. I'd be curious about any specific examples. My recollection of the VR stuff is that they do delay rendering as long as possible and at times even apply a whole-image transform after 3D rendering, but I've never heard of them actually repositioning objects on the screen as rendering progresses from top to bottom. Again, I'd welcome a specific citation to the contrary - it's an interesting idea but seems like it would be very prone to tearing artifacts.

Quote:
Quote:
So the last moment your input into the computer/console/whatever could change the picture rendered is some point before the top of the screen is scanned out.

For systems that use hardware-scanned inputs something like that could be true, although I suspect the scan point is not always the first visible line of the picture. For systems where inputs are handled in software, the input can be read whenever it is convenient for the programmer, for example in a C64 game that has the main play area at the top and score/status at the bottom I would not be surprised if the input is sampled after switching to the status part because that likely needs less CPU intervention and thus allows more CPU time for handling the game logic.



Right. So this means that lag will be even worse than the estimate I am suggesting is most representative. But definitely not better, which is what you get if you measure signal-out to photon-out.

Quote:
Quote:
Thus the amount of lag between player input and photon produced is going to be higher at the bottom of the screen than the top.

Yes, but that does not matter as long as we are talking only about display lag. If you claim that a certain number represents the display lag, but actually measure the entire path from a controller input through a rendering system, video transmission and the display processor that is just misleading.


It's only misleading if it's not the typical scenario. My belief is that >90% of games are not updating the rendered output mid-frame. So the game state is as stale as the top-most pixel, and only gets worse as you travel down the screen. Again, I'd be interested in references to the contrary.

Quote:
Quote:
You could do all your preprocessing/motion estimation/deinterlacing and then push it out to the entire display in 4 or 8ms instead of the 16 implied by 60hz.

I'm not completely sure, but it appears to me that you propose to lower display lag by increasing display lag? To scan out a picture faster than it is received, you must first buffer a sufficient amount so you don't run out of pixels before the bottom of the screen. Any buffer adds lag, even if it is a clever implementation that buffers just enough that the final pixel is received just-in-time.


Most displays do a lot of processing on the input - otherwise lag would be 0 at the upper-left corner and 16.6ms at the bottom, like our good old progressive CRTs. Instead it's 8-80ms more than that. So they are already buffering a lot, all in the name of reducing motion smear, etc., or faking 120Hz from a 60Hz signal.


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Tue May 19, 2020 2:26 am 



Joined: 08 Dec 2005
Posts: 18368
Location: uoıʇɐɹnƃıɟuoɔ ɯǝʇsʎs
I just want to point out that screen tearing is technically an example of a game updating a frame after the frame has begun drawing, even when it is due to the next frame's drawing being delayed rather than images being drawn too quickly. I don't know what proportion of current games this mangles, but it certainly was a problem in some 360 generation games. Some versions have tearing while others accept input lag to cover it up with v-sync. (Aside: I recently discovered I had my G-Sync settings wrong for quite a while, so lots of things were tearing on my PC, even in scenarios where it didn't make sense for them to be. Thankfully blurbusters points out that one should not have "fast" v-sync activated alongside G-Sync.)

Additionally, it might be a sensible alternative to talking about "display-based" or "console-based" latency to simply emphasize there is a window in which the output is meant to appear, and if something appears outside that window then it is objectionable. With faster scanout to the bottom of the screen (at least that's what I think happens on modern displays!) this should be closer to the truth than the inherent rolling shutter-like model on CRTs.


 
 Post subject: Re: Measuring lag with the OSSC (LG OLEDs)
PostPosted: Tue May 19, 2020 5:03 pm 



Joined: 20 Oct 2016
Posts: 33
Ed Oscuro wrote:
With faster scanout to the bottom of the screen (at least that's what I think happens on modern displays!) this should be closer to the truth than the inherent rolling shutter-like model on CRTs.


Do you know of any specific examples other than most/all plasma displays? If it's already widespread, all the more reason to report lag at the bottom of the screen rather than at the top (or, of course, both).


 