orange808 wrote: There was some previous speculation that the 1:1 modes on many BenQ monitors applied filtering. I compared the GPU integer scales to using the XL2720's 1:1 modes. I don't see a difference.

Can you post some photos, please?
Nvidia Takes a stab at integer scaling
Re: Nvidia Takes a stab at integer scaling
But, but... integer scaling! It's the holy grail of scaling!

This is going to be bad. If they're fixated on only providing integer scales, or nearest neighbour as an alternative, there's no way to get smooth scrolling without sticking to integer scales. And we'll be stuck with this implementation for years. I don't understand why a company like Nvidia doesn't get the concept of combining integer scales with linear filtering.

I tried it on Wolfenstein Youngblood and couldn't tell any difference. I'm using a GTX 1080 Ti rn.

Ultra low latency is a neat idea, but don't try it with new games. It stutters badly. It's probably okay if you turn all the settings to minimum and game at 720p, but that's ridiculous. The "competitive" gamers will like it, though.
OSSC Forums - http://www.videogameperfection.com/forums
Please check the Wiki before posting about Morph, OSSC, XRGB Mini or XRGB3 - http://junkerhq.net/xrgb/index.php/Main_Page
Re: Nvidia Takes a stab at integer scaling
Integer scaling is mostly ideal in specific situations, where fractional scaling doesn't work well enough.
I don't know from where the "integer for everything" idea started. *shrug*
Strikers1945guy wrote:"Do we....eat chicken balls?!"
Re: Nvidia Takes a stab at integer scaling
BuckoA51 wrote: I tried it on Wolfenstein Youngblood and couldn't tell any difference. I'm using a GTX 1080 Ti rn

Shadow of the Tomb Raider with ray tracing and Assassin's Creed Odyssey are my newest games. Both of them had some stuttering at times while moving around. It's the kind of slight stutter we used to see in large-world games from hard drive loading, but they are running on a solid state drive. It doesn't happen with Low Latency at "On" or "Off"; "Ultra" triggers it. I could easily play the games with it there, but I didn't feel any difference in responsiveness (although there is probably slightly less lag).
We apologise for the inconvenience
Re: Nvidia Takes a stab at integer scaling
Integer scaling is of limited utility, because it rarely works properly without some sort of compromise, be it incorrect aspect ratios from non-square pixels, or letter/pillarboxing from non-integer resolution ratios.
A far more useful mode is one that does something like this: integer-scale each axis at the lowest multiplier that produces output larger than the target output resolution, then use a bilinear downscale to the target resolution/aspect ratio. There is probably some single-pass algorithm that does the same thing.
That provides results that are nearly as sharp (when talking about large scaling factors, like 224p to 1080p or 1440p or 2160p), fill the screen height, have a correct aspect ratio, have no shimmering, etc.
When you use this approach on actual integer scaling factors (like 240p or 480p or 720p to 1440p), then it's effectively just a pure integer scale anyhow.
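The overshoot-then-downscale rule above boils down to a one-line calculation. Here's a minimal sketch of picking that multiplier; the function name is my own, not from any driver:

```python
import math

def overshoot_multiplier(src: int, target: int) -> int:
    """Lowest integer multiplier whose output meets or exceeds the target size."""
    return max(1, math.ceil(target / src))

# 224 source lines on a 1080-line display: 5x nearest-neighbour gives 1120 lines,
# which a bilinear pass then downscales to 1080 with almost no visible softening.
print(overshoot_multiplier(224, 1080))  # 5  (5 * 224 = 1120 >= 1080)

# Exact integer fits degenerate to a pure integer scale, as noted above.
print(overshoot_multiplier(480, 1440))  # 3  (3 * 480 = 1440 exactly)
```

After picking the multiplier per axis, the actual resize would be a nearest-neighbour upscale followed by a bilinear downscale in whatever image pipeline you're using.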
Re: Nvidia Takes a stab at integer scaling
I was experimenting with that last time I tried playing StarCraft and found that on a 1920x1080 monitor, integer scaling to 1280x960 and then upscaling to 1920x1080 was sharper than integer scaling to 1920x1440 and downscaling.
So picking the integer scale factor that's closest to the output resolution might be better for sharpness, though not necessarily accuracy. Not that you would realistically be able to tell the difference in either case.
Re: Nvidia Takes a stab at integer scaling
Either way, that general two-step approach produces far more polished results than pure integer scales.
StarCraft, on the other hand, has a modern HD remaster that supports arbitrary resolutions, so there's no reason to use the old graphics.
Though IIRC the remastered version also lets you use the original graphics using some sort of interpolated-but-sharp-scaled approach that is vaguely similar to what we've described.

Re: Nvidia Takes a stab at integer scaling
More on Intel's implementation: https://cdn.discordapp.com/attachments/ ... image0.png
Re: Nvidia Takes a stab at integer scaling
in other words: same shit.
Re: Nvidia Takes a stab at integer scaling
Guspaz wrote: Though IIRC the remastered version also lets you use the original graphics using some sort of interpolated-but-sharp-scaled approach that is vaguely similar to what we've described.

It has a 1-4 sharpness slider.
Fudoh wrote: in other words: same shit.

Nvidia only offers integer scaling (with black bars), but Intel offers options for both integer scaling and nearest scaling, if you can handle the artifacts.
Re: Nvidia Takes a stab at integer scaling
if you can handle the artifacts.

Nobody should. If people start to accept that, we don't have to wonder why M2 doesn't give a shit and delivers the same inadequacies with their otherwise nice work.