Fudoh wrote:
that's simply wrong - on so many levels that I can't even start. You make it sound as if you believe what you're writing. Please stop spreading misinformation like that.
a) monitors won't accept anything but a few standardized timings EXCEPT when they're driven by a G-Sync or FreeSync GPU.
b) rendering fps are only rarely related to the output refresh rate (and NEVER on consoles).
c) what's that BS about integer fps?
d) if you even try to framerate-convert a hardware-synced input signal by 0.1 Hz you get VERY noticeable and constant stutter.

It has always been console related. Gaming monitors can accept basically any fps as long as it is a whole number and doesn't exceed their maximum input (not a problem, since most consoles output 30-60 fps). So if you have a gaming monitor and use an OSSC to get HDMI in at a whole-number rate, it's compatible. Boom, no more issues, end of story. And you would just have to tolerate a virtually imperceptible fraction of a frame of lag.

a) Untrue. My display is a 40" 4K Wasabi Mango with neither G-Sync nor FreeSync, and it accepts whatever my GPU throws at it. As does my 27" Asus monitor.
b) It is related, because what I am saying is that gaming displays accept a wide variety of inputs, a lot more than displays marketed as "TVs".
c) The issue is with the HDMI standard not being as accepting of "non-standard" inputs. If a TV has an issue with an input, it is almost always because the rate is not a whole number, such as 59.9 Hz. I've never had a whole-number input from a console (anywhere from 1 Hz up to the display's maximum) get rejected.
d) Untrue. Simply adding a fraction of a single frame of delay to your display is imperceptible. For example, the XRGB-Mini adds around 1.5 frames of lag and the vast majority of people cannot tell at all. (Rough numbers are sketched below.)
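
To put rough numbers on both sides of point d), here is a quick back-of-the-envelope sketch in Python. The 59.9 Hz source, 60 Hz display, and 1.5-frame figures are just the examples already mentioned above, not measurements:

# Rough frame-timing arithmetic (illustrative figures only)
source_hz = 59.9    # example non-integer source rate, 0.1 Hz below the display
display_hz = 60.0   # fixed display refresh rate

# Without any sync tech, the display has to repeat (or drop) a frame each time
# the accumulated timing error adds up to one full frame:
seconds_between_hiccups = 1.0 / abs(display_hz - source_hz)
print(f"one repeated/dropped frame roughly every {seconds_between_hiccups:.1f} s")  # ~10 s

# Latency added by buffering frames in a scaler, e.g. roughly 1.5 frames:
buffered_frames = 1.5
added_latency_ms = buffered_frames * 1000.0 / display_hz
print(f"{buffered_frames} frames of buffering = about {added_latency_ms:.0f} ms of lag")  # ~25 ms

So a 0.1 Hz mismatch works out to one visible hiccup roughly every ten seconds (the stutter Fudoh describes), while 1.5 frames of buffering at 60 Hz is about 25 ms, which is where the "most people can't tell" argument comes from.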