I think a lot of it comes down to people's own understanding (or lack of it) of, and preference about, lag measurements:
Some only look at the raw input lag (top of the screen, very beginning of the frame), typically monitor review websites such as tftcentral, pcmonitors.info, etc.
Some only care about the center-of-the-screen display lag (the most popular/mainstream figure).
Some only mind the bottom, where the frame ends.
Some even take the three measurements and average the three figures, like displaylag.com does, I'm not sure why, since with a linear top-to-bottom scanout that average works out to roughly the mid-screen figure anyway.
Personally I prefer to know about the input lag first (top of the screen) even though I know the mid-screen total display lag is the most pertinent in practice.
What matters most is that people understand what the measurements mean. In lots of display discussions people mix different fashions of lag measurement in total confusion, but I can't blame them, since most websites that publish lag measurements also fail to explain clearly enough what they mean, how the measurement works, what they're showing you and why.
That's why I've been in favor for a while of always showing all three lag measurements: top, middle, bottom. If all websites did that, people might be confused for a while, but in the end there would be no room left for confusion anymore.
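If it helps, here's a rough sketch of how the three figures relate, assuming a linear top-to-bottom scanout at 60 Hz and a made-up 4 ms top-of-screen measurement (the numbers and variable names are purely illustrative, not anyone's actual test data):
[code]
# Rough sketch: how top / middle / bottom lag figures relate,
# assuming a linear top-to-bottom scanout (values are made up).
refresh_hz = 60.0
frame_ms = 1000.0 / refresh_hz          # ~16.7 ms per refresh at 60 Hz

top_ms = 4.0                            # hypothetical raw input lag, start of frame
middle_ms = top_ms + frame_ms / 2       # center of screen: half the scanout later
bottom_ms = top_ms + frame_ms           # last line: a full scanout later

average_ms = (top_ms + middle_ms + bottom_ms) / 3   # lands right on middle_ms

print(f"top {top_ms:.1f} ms, middle {middle_ms:.1f} ms, "
      f"bottom {bottom_ms:.1f} ms, average {average_ms:.1f} ms")
[/code]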
fandangos wrote:And when you say "lagless" how much is it? is 20ms lagless?
Well, 0 is lagless... but we're only human, so honestly even about a quarter of a frame (~4 ms at 60 Hz) at the top of the screen/beginning of the frame (plus half a frame of scanout on top of that for the mid-screen figure), which is quite common for monitors today, can reasonably be called 'lagless'.
20 ms, assuming we're talking about a mid-screen measurement, is not lagless but still quite good for most gaming, and again, unless you're a competition cyborg it won't get in the way.
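For the 20 ms question specifically, the same arithmetic gives a quick rule of thumb (my own back-of-the-envelope, assuming the 20 ms is a mid-screen figure at 60 Hz, nothing official):
[code]
# Quick check: strip the scanout share out of a mid-screen figure, then compare
# what's left against a quarter-frame budget to decide if "lagless" is fair.
refresh_hz = 60.0
frame_ms = 1000.0 / refresh_hz

midscreen_ms = 20.0                      # the figure being asked about
top_ms = midscreen_ms - frame_ms / 2     # remove half a frame of scanout -> ~11.7 ms
lagless = top_ms <= frame_ms / 4         # quarter-frame budget -> ~4.2 ms

print(f"processing lag ~{top_ms:.1f} ms, lagless? {lagless}")
[/code]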
fandangos wrote:If retroarch had a decent scanline shadder (all sucks and are not what a scanline looks like) I would play it at 2048x1536@60hz on it at it's native resolution.
CRT-Royale can look pretty awesome
http://shmups.system11.org/viewtopic.ph ... h&start=60
For shaders you absolutely need to use the native resolution of your display, otherwise your output gets rescaled by the display anyway. Also, the higher the output resolution a shader has to work with, the better the results, as it has more room to actually draw finer details.
But as good as those can look, the motion on flat panel displays isn't as good as on a CRT, and when there's a lot of movement many of the details those shaders draw are lost in the blur.
fandangos wrote:But that's not the focus of this discussion but I also want to add one other thing, maybe if you can convince me around this I would be happy to be proven wrong.
The Hi-Def NES is way sharper compared to retroarch at 4k. This doesn't make sense. I have bilinear filtering disabled and such but let me put it this way:
The hi def nes is sharp as the OSSC x5 and retroarch is sharp like the framemeister. Hope my analogy can paint a clear picture of how I see it.
I think something's wrong in your RA configuration.
Also remember the Hi-Def NES outputs 1080p at best, which means it gets upscaled to 4K by your TV, so it can't be pixel-sharp; there's a bit of interpolation smoothing (most displays don't have a setting that lets you disable scaling interpolation). Same goes for the OSSC and the Mini.
RA, on the other hand, is output from a PC and can therefore use the full 4K resolution of your TV and be pixel-sharp, since the full panel resolution is used, assuming you're in game or graphics mode (all TV processing disabled).
Really look into your RA settings and force it to output at 4K, then try shaders again. If you see scaling artifacts ('scanlines' of uneven thickness, banding, or moire patterns), try enabling integer scaling in RA's settings as well. It forces the picture to show at exact whole multiples (selectable, don't worry), which in some cases means black borders, or the opposite, a picture slightly too big, but the result will often be cleaner and more accurate than the usual fractional scaling (rough numbers in the sketch below).
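To get a feel for what integer scaling does, here's a quick sketch with made-up but typical numbers (a 256x224 SNES-style core output on a 3840x2160 panel); this isn't RetroArch's actual code, just the arithmetic behind the borders and the uneven-scanline artifacts:
[code]
# Sketch: integer vs fractional scaling of a 256x224 source on a 4K panel
# (illustrative numbers only, not RetroArch internals).
src_w, src_h = 256, 224
screen_w, screen_h = 3840, 2160

# fractional scaling fills the height exactly but lands on a non-integer factor
fractional = screen_h / src_h                           # ~9.64x -> uneven 'scanlines'
# integer scaling rounds down to the nearest whole multiple that still fits
int_scale = min(screen_w // src_w, screen_h // src_h)   # 9x here

out_w, out_h = src_w * int_scale, src_h * int_scale
border_x, border_y = screen_w - out_w, screen_h - out_h
print(f"fractional ~{fractional:.2f}x vs integer x{int_scale}: "
      f"{out_w}x{out_h}, borders {border_x}px wide / {border_y}px tall")
[/code]
In other words, with integer scaling every source pixel becomes the same whole number of screen pixels, so lines stay even at the cost of some border (or a slight overshoot if you pick the next multiple up); the fractional case is where the uneven thickness and moire come from.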