PixelFX Morph

The place for all discussion on gaming hardware
vrunk11
Posts: 8
Joined: Thu May 20, 2021 8:56 am
Location: Europe

Re: PixelFX Morph

Post by vrunk11 »

Is there a planned function for converting limited range to full range (switching between IRE 7.5 and IRE 0) for NTSC input?

It could be really useful for enhancing the colors when using analog out (in film mode). Maybe it's related to the LUT mentioned earlier, but I think it's more specific to video than games, that's why I'm asking.
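For anyone wondering what such a conversion involves, here is a minimal sketch of my own (purely illustrative, not anything confirmed about the Morph): removing the 7.5 IRE setup means shifting black down to 0 IRE and stretching the remaining range back out to full scale.

```python
# Hypothetical sketch of 7.5 IRE pedestal removal (not the Morph's actual
# implementation): shift black from 7.5 IRE down to 0 IRE, then stretch
# the remaining 92.5 IRE of range back out to the full 0-100 IRE scale.

PEDESTAL = 7.5  # NTSC-US black setup, in IRE

def remove_pedestal(ire: float) -> float:
    """Map 7.5..100 IRE onto 0..100 IRE, clamping below-black noise."""
    stretched = (ire - PEDESTAL) * 100.0 / (100.0 - PEDESTAL)
    return max(0.0, min(100.0, stretched))

print(remove_pedestal(7.5))    # black pedestal -> 0.0
print(remove_pedestal(100.0))  # peak white stays at 100.0
```

A real device would do this on the luma signal before (or as part of) digitizing, but the mapping itself is just this shift-and-stretch.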

Also, what chip is used for the 3D comb filter and TBC? Is it an ADV7482? :roll:

thanks
HDgaming42
Posts: 331
Joined: Wed Jul 15, 2009 3:16 am
Location: Canada

Re: PixelFX Morph

Post by HDgaming42 »

Guspaz wrote:The problem with using a LUT for NES palettes is that you must have a known fixed source palette to transform it to the target, at which point you might as well just load the target palette directly onto the NESRGB in the first place.
True--I'm just spitballing use cases for LUTs here. I don't think many people who have the NESRGB flash new palettes; most just end up using whatever selection they/their modder chose during install.
fernan1234 wrote:Yeah, for the PC Engine the RGB -> YUV table is known now, which is why you can get accurate results like with the MiSTer TG16 core. But that's not the case with the NES, which is why there is still such a jungle of RGB palettes for it. The only way to get the "real" colors is by cloning the actual NES PPU, like what Kevtris did with the Nt Mini that can thus output the "real" colors via S-video, which can then be processed by a scaler much more nicely than composite.
Another great usage, and this will likely get me in trouble, would be to play 8 and 16bit games in D93 the way the gods intended with a D93 LUT, while viewing other systems in D65. Or being able to instantly jump between the two to play in whatever you prefer.

I did a color check on my PVM-2530 and it's way out to lunch. So much so that a LUT won't be able to dial it all in--I'll have to open it up to start from a better baseline to get average delta errors under 3. In the meantime I'm running a 12hr patch generation to see how much closer I can get it. Eventually I'll be able to compare a 33 vs 65pt cube.

I'll report back.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

HDgaming42 wrote:Another great usage, and this will likely get me in trouble, would be to play 8 and 16bit games in D93 the way the gods intended with a D93 LUT, while viewing other systems in D65. Or being able to instantly jump between the two to play in whatever you prefer.
Now this is a great idea!
HDgaming42
Posts: 331
Joined: Wed Jul 15, 2009 3:16 am
Location: Canada

Re: PixelFX Morph

Post by HDgaming42 »

HDgaming42 wrote:I did a color check on my PVM-2530 and it's way out to lunch. So much so that a LUT won't be able to dial it all in--I'll have to open it up to start from a better baseline to get average delta errors under 3. In the meantime I'm running a 12hr patch generation to see how much closer I can get it. Eventually I'll be able to compare a 33 vs 65pt cube.

I'll report back.
So here's the PVM-2530 raw D93 read against D65*, followed by a D65 LUT created with 100 patches, and finally a D65 LUT generated via over 1000 test patches.

Image


You can tell there were a few errors (one greyscale patch read as blue), but I ran it fully automated and didn't do any cleanup. Now, the 2530 was designed for D93, but the version of Colorchecker I have doesn't support it, so I had to verify against D65. This might be partially why the initial read is so far out, but I've never calibrated the colour on it either, so I don't know. I created the LUTs as D65, and that's what I verified against.

So a LUT can wrestle delta errors in excess of 10 down to below 2. Fully automated. With free software. You just need a compatible meter and a way to get the test signal to your display device.
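As a rough illustration of what the correction cube is doing (my own sketch, not the calibration software used above): each measured grid point stores a corrected RGB value, and colors between grid points are trilinearly interpolated, which is why a 65-point cube can resolve finer corrections than a 33-point one.

```python
# Sketch of 3D LUT sampling: measured grid nodes hold corrected RGB
# values, and anything between nodes is trilinearly interpolated.
# An identity cube is used here, so the "correction" is a no-op.

def make_identity_lut(n):
    """n x n x n cube where every node maps back to itself."""
    step = 1.0 / (n - 1)
    return [[[(r * step, g * step, b * step)
              for b in range(n)]
             for g in range(n)]
            for r in range(n)]

def sample_lut(lut, n, r, g, b):
    """Trilinearly interpolate the cube at normalized RGB in [0, 1]."""
    def axis(v):
        x = v * (n - 1)
        i = min(int(x), n - 2)   # lower grid index along this axis
        return i, x - i          # index and fractional position
    ri, rf = axis(r)
    gi, gf = axis(g)
    bi, bf = axis(b)
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):            # blend the 8 surrounding grid nodes
        for dg in (0, 1):
            for db in (0, 1):
                w = ((rf if dr else 1 - rf)
                     * (gf if dg else 1 - gf)
                     * (bf if db else 1 - bf))
                node = lut[ri + dr][gi + dg][bi + db]
                for c in range(3):
                    out[c] += w * node[c]
    return tuple(out)

lut = make_identity_lut(9)
print(sample_lut(lut, 9, 0.5, 0.25, 0.75))  # ~(0.5, 0.25, 0.75)
```

With a real calibration, the nodes would hold the measured corrections instead of identity values; the interpolation is the same.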

Believe...in the power of LUTs.
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: PixelFX Morph

Post by VEGETA »

Not just a bit and if anything it's the other way around. The Morph's custom scaler is more capable than the Intel Scaler IP currently being used in the OSSC Pro, it has Composite/S-Video support, a built-in DAC, and a very capable wifi system processor.
Looks like I was correct that the OSSC Pro is using Intel scaler IP in their FPGA. I assume you use some Xilinx IP too? And BTW, how can you buy HDMI- and HDCP-related ICs? They require $15k of licensing that I'm sure you're not paying, unless you use some IC from China that doesn't require a license to buy. I am interested. Maybe you could go for DisplayPort instead of HDMI, and then people can use a small cheap adapter to HDMI.

The built-in DAC seems like a nice addition for those who want analog RGB via VGA or so to display on their CRT, but not a main feature IMO. Looks like this device is going to be better than the 5X, I am cheering for you!
strayan
Posts: 671
Joined: Sun Mar 19, 2017 8:33 pm

Re: PixelFX Morph

Post by strayan »

VEGETA wrote:They require $15k of licensing that I'm sure you're not paying.
Why are you sure of this?
Extrems
Posts: 540
Joined: Sat Jan 30, 2016 5:01 pm

Re: PixelFX Morph

Post by Extrems »

Try searching for "Pixel Fx" here.
Dr. Claw
Posts: 30
Joined: Thu Jul 16, 2020 4:14 pm

Re: PixelFX Morph

Post by Dr. Claw »

Extrems wrote:Try searching for "Pixel Fx" here.
And that's a L.

I saw HD Retrovision in there too... wonder what they're cooking up.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

Assuming that they are all classified as "low volume" (<10k units), they'd be paying $5,000 per year + $1 per unit. So it'd be $15k if they actually got to that many units per year, but it seems more realistic to assume something like 1,000-2,000 units made per year. Seems sustainable and overall worthwhile.
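A quick sanity check of that arithmetic, using only the figures quoted in the post (a flat annual fee plus a per-unit royalty):

```python
# Back-of-envelope check of the low-volume adopter fee figures quoted
# above: a $5,000 flat annual fee plus $1 per unit shipped.

def annual_fee(units: int, flat: int = 5_000, per_unit: int = 1) -> int:
    return flat + per_unit * units

print(annual_fee(10_000))  # 15000 -- the "$15k" worst case at the 10k cap
print(annual_fee(1_500))   # 6500  -- a more realistic yearly volume
```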
Guspaz
Posts: 3136
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: PixelFX Morph

Post by Guspaz »

Dr. Claw wrote:
Extrems wrote:Try searching for "Pixel Fx" here.
And that's a L.

I saw HD Retrovision in there too... wonder what they're cooking up.
Just in case it's not clear to others, Pixel Fx is indeed an HDMI adopter listed there, listed since January 2021, as are several other retro companies.
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: PixelFX Morph

Post by VEGETA »

My assumption was not based on anything; I just figured people don't want to spend money on licenses when making such products. I mean, does the OSSC itself have a license?

I was also talking about the HDCP license, not just HDMI; HDCP is a minimum of $15k per year. I want to design my own scaler, and I found out about the licensing stuff along the way. That's why I chose DisplayPort instead. I hope I can start designing soon.

HDMI licensing is really a pain; I mean, not the $1 per device but rather the annual payment. HDCP seems to be even worse: many ICs are pre-programmed with their keys, so you must get the license just to purchase the ICs.

Sorry to bother you with all this, I was just wondering.

I am interested in what scaler IP you are using to achieve your results, and what killer features you have over the 5X or OSSC or even the Framemeister.

thanks for your efforts
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

VEGETA wrote:I am interested in what scaler IP you are using to achieve your results, and what killer features you have over the 5X or OSSC or even the Framemeister.
Pretty interesting that you wrote "even Framemeister" in that list since the 5X is already 100% better than it in every single way, and so was the OSSC since a couple of years ago in every way except compatibility, input type options (no composite/S-video), and deinterlacing (though at least it did it without the lag of the FM).

But in any case, have you checked out the roadmap linked on the first post? It lists a lot of killer features that should make you drool. And some of woozle's responses in this topic add even more things to look forward to.
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: PixelFX Morph

Post by VEGETA »

fernan1234 wrote:
VEGETA wrote:I am interested in what scaler IP you are using to achieve your results, and what killer features you have over the 5X or OSSC or even the Framemeister.
Pretty interesting that you wrote "even Framemeister" in that list since the 5X is already 100% better than it in every single way, and so was the OSSC since a couple of years ago in every way except compatibility, input type options (no composite/S-video), and deinterlacing (though at least it did it without the lag of the FM).

But in any case, have you checked out the roadmap linked on the first post? It lists a lot of killer features that should make you drool. And some of woozle's responses in this topic add even more things to look forward to.
I don't own a Framemeister nor recommend it to people; I recommend the OSSC most of the time. I have an OSSC from AliExpress that works well. My only complaint is bad 480i handling, since it outputs lots of flicker. I only play PS2 on it, and I had to change the PS2's internal resolution to 480p using GSM to be able to enjoy it better on the OSSC.

I will stick to my OSSC until your device hits the market, hoping it will be cheap enough for me to buy.

And just a side note: why not use DisplayPort instead of HDMI, then rely on adapters to deliver the signal? This would relieve you of the licensing fees.

thanks
Fudoh
Posts: 13015
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany

Re: PixelFX Morph

Post by Fudoh »

Whether you think the 5X is enough for your deinterlacing needs is really up to you, but from an objective view the FM is still considerably better in that regard. But that's not really surprising given that the FM uses probably THE best video processor IC created during the Full HD era. Any FPGA coming even close is a marvelous achievement already.

The 5X's scaling is pretty good. I'm not sure that many people would profit from a better algorithm. A more flexible engine (with free scaling on both axes) would be welcome, of course.

Marqs stated that the "full version" of Intel's deinterlacing algorithm (with diagonal interpolation added to match the Kyoto's capabilities in the FM) is too heavy for the FPGA used and the road map for the Morph also "only" mentions motion-adaptive deinterlacing without going into detail which features will be available.

So at the moment I ASSUME that the 5X, the OSSC Pro and the Morph will all use a rather simple motion-adaptive deinterlacing algorithm and the FM will continue to reign supreme in this particular category. (Important note: Of course I fully agree that the 5X is a much better overall package for almost all users).
anexanhume
Posts: 41
Joined: Sat Mar 06, 2021 1:12 am

Re: PixelFX Morph

Post by anexanhume »

Fudoh wrote:Whether you think the 5X is enough for your deinterlacing needs is really up to you, but from an objective view the FM is still considerably better in that regard. But that's not really surprising given that the FM uses probably THE best video processor IC created during the Full HD era. Any FPGA coming even close is a marvelous achievement already.

The 5X's scaling is pretty good. I'm not sure that many people would profit from a better algorithm. A more flexible engine (with free scaling on both axes) would be welcome, of course.

Marqs stated that the "full version" of Intel's deinterlacing algorithm (with diagonal interpolation added to match the Kyoto's capabilities in the FM) is too heavy for the FPGA used and the road map for the Morph also "only" mentions motion-adaptive deinterlacing without going into detail which features will be available.

So at the moment I ASSUME that the 5X, the OSSC Pro and the Morph will all use a rather simple motion-adaptive deinterlacing algorithm and the FM will continue to reign supreme in this particular category. (Important note: Of course I fully agree that the 5X is a much better overall package for almost all users).
The latency cost also has to be taken into account. The FM pays for its deinterlacing with more latency, and I wonder if those advanced methods could be implemented in FPGA with less latency than the FM.
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: PixelFX Morph

Post by Josh128 »

I guess de-interlacing is a big thing to some folks, but personally, I actually prefer simple bob over all other techniques, as it replicates the look of actual 480i on a CRT and should technically be one of the least latency-additive ones. Every single fancy de-interlacing technique I've ever seen looks great on still pictures, equal to 480p, but as soon as motion starts they all (every single one that I've seen) fall on their faces. It's impossible to create image data that doesn't exist, no matter how well you fake it.

I've played with the de-interlacing on the GBS-C and RT5X Pro a bit; the motion-adaptive GBS-C de-interlacing is very impressive on a VGA CRT, again, until things move. Bob, on the other hand, perfectly replicates 480i on a 15kHz CRT when applied to a 31kHz CRT, and that's my ideal picture, as true to the technology that I grew up playing these games on as possible.
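For anyone unfamiliar, bob really is about the simplest thing a deinterlacer can do, which is exactly why it adds so little latency. A toy sketch of my own (not any scaler's actual code):

```python
# Toy sketch of bob deinterlacing: each incoming field is line-doubled
# into a full frame on its own, never mixed with the other field, so
# motion stays clean. Rows are just lists of pixel values here.

def bob(field_rows):
    """Line-double one field (e.g. 240 lines) into a full frame."""
    frame = []
    for row in field_rows:
        frame.append(row)  # the real scanline
        frame.append(row)  # repeated to fill the missing line
    # A real deinterlacer also offsets odd fields by one output line so
    # the image doesn't bounce vertically; omitted for simplicity.
    return frame

print(bob([[10, 20], [30, 40]]))  # [[10, 20], [10, 20], [30, 40], [30, 40]]
```

There is no motion estimation and no frame buffering across fields, which is why it can run essentially lag-free.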
Dr. Claw
Posts: 30
Joined: Thu Jul 16, 2020 4:14 pm

Re: PixelFX Morph

Post by Dr. Claw »

Josh128 wrote:I guess de-interlacing is a big thing to some folks, but personally, I actually prefer simple bob over all other techniques, as it replicates the look of actual 480i on a CRT and should technically be one of the least latency-additive ones. Every single fancy de-interlacing technique I've ever seen looks great on still pictures, equal to 480p, but as soon as motion starts they all (every single one that I've seen) fall on their faces. It's impossible to create image data that doesn't exist, no matter how well you fake it.

I've played with the de-interlacing on the GBS-C and RT5X Pro a bit; the motion-adaptive GBS-C de-interlacing is very impressive on a VGA CRT, again, until things move. Bob, on the other hand, perfectly replicates 480i on a 15kHz CRT when applied to a 31kHz CRT, and that's my ideal picture, as true to the technology that I grew up playing these games on as possible.
I kind of agree with you on this, particularly with 480i sources. The first time I used one of the RetroTinks instead of another off-brand scaler on PS2, I was amazed at how "much like a CRT" it looked for 480i sources. I still like its "bob" use a little more than the OSSC for 480i.

On the Framemeister topic, it had two drawbacks. One, which was minor to me: the HDMI input didn't quite handle everything you threw at it. The other, resolution switching, was major. I still wish that had been mitigated.

That the new generation of scalers intended for video games seems to have squashed it makes any of them a "go" for me. The Morph seems very much a proper successor to the FM based on what I've read thus far, just for the input/output reasons alone.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

anexanhume wrote:The latency cost also has to be taken into account. The FM pays for its deinterlacing with more latency, and I wonder if those advanced methods could be implemented in FPGA with less latency than the FM.
This. With respect to Fudoh, but honestly, who really cares if the FM's deinterlacing merely looks a bit nicer to the eye than the adaptive deinterlacing of the 5X and ostensibly this Morph and the OSSC Pro, when this arguably nicer-looking deinterlaced picture comes at the cost of horrible lag? I'd say that matters for all games, even text-based, menu-heavy games. When I select a menu item in an RPG or even a text adventure game, I want it to feel snappy like on a CRT.

But this is all if you prefer a progressive-looking deinterlaced picture. I agree even more with the above comments that bob+"scanlines" (the bob in the GBS-C, bob+100% scanlines on the OSSC Pro, or "CRT simulate" on the TINK products) is the ideal solution, since it not only looks the most authentic for interlaced pictures but is also the fastest. We're not talking about video or film here, but games that were made to be played on CRTs. The ideal is for them to look and feel like they do on a CRT after scaling.
ldeveraux
Posts: 1113
Joined: Thu Mar 01, 2018 10:20 pm

Re: PixelFX Morph

Post by ldeveraux »

VEGETA wrote:And just a side note: why not use DisplayPort instead of HDMI, then rely on adapters to deliver the signal? This would relieve you of the licensing fees.
Not many TVs support DP, only monitors.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

ldeveraux wrote:Not many TVs support DP, only monitors.
I think the suggestion is that it's trivial to convert DP to HDMI with a cheap dongle, ostensibly with no effect on picture quality.

But the user who made that suggestion may be missing the fact that as a manufacturer of these devices it can make all the sense in the world financially to just swallow the HDMI fees and offer the convenience to increasingly demanding customers.
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: PixelFX Morph

Post by Josh128 »

fernan1234 wrote:
But this is all if you prefer a progressive-looking deinterlaced picture. I agree even more with the above comments that bob+"scanlines" (the bob in the GBS-C, bob+100% scanlines on the OSSC Pro, or "CRT simulate" on the TINK products) is the ideal solution, since it not only looks the most authentic for interlaced pictures but is also the fastest. We're not talking about video or film here, but games that were made to be played on CRTs. The ideal is for them to look and feel like they do on a CRT after scaling.
I think you hit the nail on the head here. For movies/video, etc., good de-interlacing replicates what you see in a movie theater/on film, and I can understand why some videophiles want the perfect de-interlacing solution, since it helps replicate the way the material was originally filmed/presented. But 480i games were always 480i, so anything other than a bob-type deinterlacing solution presents them in a historically unnatural/inaccurate way. And of course, the least amount of added input lag is of utmost importance for games, and makes zero difference for video/movies.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

Josh128 wrote:I think you hit the nail on the head here. For movies/video, etc., good de-interlacing replicates what you see in a movie theater/on film, and I can understand why some videophiles want the perfect de-interlacing solution, since it helps replicate the way the material was originally filmed/presented. But 480i games were always 480i, so anything other than a bob-type deinterlacing solution presents them in a historically unnatural/inaccurate way. And of course, the least amount of added input lag is of utmost importance for games, and makes zero difference for video/movies.
Right, though one detail I forgot to mention is that with this approach you darken the picture by half, just like when you use full scanlines for 240p. That can be tricky to compensate for on the scaler or monitor side to restore a bright, CRT-like picture without crushing whites and colors or elevating blacks (which is particularly bad if using OLEDs). And this darkening problem is further compounded if you use BFI, which is the only way on modern TVs to approach the original motion clarity that is also essential for a CRT-like experience. Using gamma compensation or LUTs is probably the best approach, and the Morph could use it to the benefit of both 240p and 480i games with scanlines. Then I dream further of a scaler that can add a rolling black bar to emulate CRT scanning (of course it would have to be a buffered output, so lag would be a tradeoff).

The point was also that, after the release of the RT5X, the Framemeister should be considered 100% obsolete for actual gaming. It shouldn't even come up as an option anymore. Of course it can still be seen as great, or even optimal, for other purposes like film/video viewing as well as capturing.
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: PixelFX Morph

Post by Josh128 »

fernan1234 wrote:Right, though one detail I forgot to mention is that with this approach you darken the picture by half, just like when you use full scanlines for 240p. That can be tricky to compensate for on the scaler or monitor side to restore a bright, CRT-like picture without crushing whites and colors or elevating blacks (which is particularly bad if using OLEDs).
Are you sure this is the case, though? When using the GBS-C on my VGA monitor, I don't remember a discernible difference in brightness between passthrough native 480p and 31kHz bob 480i when I tested the other day. Not saying you are wrong here, you may indeed be right, but I also know for certain that the Panasonic TC-P50-X60 I used to have used internal bob deinterlacing when fed a 480i signal, and it looked every bit as bright as 480p without touching settings, if I recall. It looked amazingly CRT-like.

If it is the case, you're right that it's really only an OLED issue, as modern LCDs easily get bright enough to counter any brightness loss in those modes.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

Josh128 wrote:Are you sure this is the case, though? When using the GBS-C on my VGA monitor, I don't remember a discernible difference in brightness between passthrough native 480p and 31kHz bob 480i when I tested the other day. Not saying you are wrong here, you may indeed be right, but I also know for certain that the Panasonic TC-P50-X60 I used to have used internal bob deinterlacing when fed a 480i signal, and it looked every bit as bright as 480p without touching settings, if I recall. It looked amazingly CRT-like.
Just to be clear, "bob deinterlacing" alone does not darken the picture. It looks annoyingly flickery precisely because it's not blacking out every other line (or field for each frame). That has to be what your Panasonic does, and what the OSSC does by default. But if you activate alternating scanlines on the OSSC and turn them up to 100%, then you get the authentic interlace look, at the cost of halving the brightness. Same with the TINK's "CRT simulate" mode, as well as the GBS-C's bob mode. I couldn't say why you didn't notice the difference when using the GBS-C on your VGA monitor if you switched to bob (adaptive deinterlace is the default), but it has to be the case by necessity--every other line is black! It's like outputting 240p line-doubled to 31kHz + full scanlines on a VGA CRT: it's half as bright as 240p is at 15kHz on an SD/multiformat CRT, and it's the reason why people do the whole 240p@120Hz thing on VGA CRTs.
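The halving is easy to see numerically. A small sketch of my own (in raw signal-level terms, ignoring display gamma): 100% scanlines simply zero every other line, so the average level of a flat frame drops to exactly 50%.

```python
# Sketch of why 100% scanlines halve brightness: blacking out every
# other line of a flat frame drops the average level to exactly 50%.

def average_luma(rows):
    pixels = [px for row in rows for px in row]
    return sum(pixels) / len(pixels)

frame = [[200] * 4 for _ in range(4)]               # flat mid-bright frame
scanlined = [row if i % 2 == 0 else [0] * len(row)  # 100% scanlines
             for i, row in enumerate(frame)]

print(average_luma(frame))      # 200.0
print(average_luma(scanlined))  # 100.0
```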

edit:
Josh128 wrote:If it is the case, you're right that it's really only an OLED issue, as modern LCDs easily get bright enough to counter any brightness loss in those modes.
And yes, OLEDs are particularly troublesome because they have terribly low nits for SDR signals compared to LCDs, as well as more aggressive ABL. The best thing for OLEDs would be to flag the scaler's output as HDR, which is what OLEDs do best.
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: PixelFX Morph

Post by VEGETA »

fernan1234 wrote:
ldeveraux wrote:Not many TVs support DP, only monitors.
I think the suggestion is that it's trivial to convert DP to HDMI with a cheap dongle, ostensibly with no effect on picture quality.

But the user who made that suggestion may be missing the fact that as a manufacturer of these devices it can make all the sense in the world financially to just swallow the HDMI fees and offer the convenience to increasingly demanding customers.
Well, you are correct about the HDMI and HDCP (this one is more important) fees being sometimes necessary. However, for someone who doesn't have nearly enough money, it is a dead end. I tried messaging the HDCP people, telling them I could pay per unit sold, or after selling and getting profits... but they always respond with a no. Buying the license and paying the annual $15k is a must before anything else.

However, those who have enough money can spend it easily on the license.

I just wanna ask if someone has tried a DP to HDMI adapter with, say, a 4K60 signal. Does it convert successfully without lag? How much does it cost?
bobrocks95
Posts: 3460
Joined: Mon Apr 30, 2012 2:27 am
Location: Kentucky

Re: PixelFX Morph

Post by bobrocks95 »

When did everyone get so sensitive to lag? 5-10 years ago 50ms+ LCDs were the norm, and people added scalers on top of that and were happy. Now you've got stuff like that guy in the 5x thread who refused to buy it because it was 4ms slower than the 2x or something close to that.

2 frames of lag on the Framemeister is of course worse on paper, but pair it with a fast display and show me the double blind test that shows anyone can reliably tell the difference. I think the placebo effect is strong here (and, again, sure, the 5X pretty much replaces the FM, yes).
PS1 Disc-Based Game ID BIOS patch for MemCard Pro and SD2PSX automatic VMC switching.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

bobrocks95 wrote:When did everyone get so sensitive to lag? 5-10 years ago 50ms+ LCDs were the norm, and people added scalers on top of that and were happy. Now you've got stuff like that guy in the 5x thread who refused to buy it because it was 4ms slower than the 2x or something close to that.

2 frames of lag on the Framemeister is of course worse on paper, but pair it with a fast display and show me the double blind test that shows anyone can reliably tell the difference. I think the placebo effect is strong here (and, again, sure, the 5X pretty much replaces the FM, yes).

It's not so much a placebo, but rather impressions will depend on what you've been used to. I'm not special, I've simply been using CRTs all my life, and only started really getting serious with scalers and flat panels this year. There's many people who've been on the scaler + flat panel game longer but have also been using CRTs on the side and thus remain alert to the differences. I could immediately tell a "big" (to me) difference between the 5X's triple buffered mode and the framelock mode, whereas other users on that topic were saying it feels the same to them.
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: PixelFX Morph

Post by VEGETA »

fernan1234 wrote:
bobrocks95 wrote:When did everyone get so sensitive to lag? 5-10 years ago 50ms+ LCDs were the norm, and people added scalers on top of that and were happy. Now you've got stuff like that guy in the 5x thread who refused to buy it because it was 4ms slower than the 2x or something close to that.

2 frames of lag on the Framemeister is of course worse on paper, but pair it with a fast display and show me the double blind test that shows anyone can reliably tell the difference. I think the placebo effect is strong here (and, again, sure, the 5X pretty much replaces the FM, yes).

It's not so much a placebo, but rather impressions will depend on what you've been used to. I'm not special, I've simply been using CRTs all my life, and only started really getting serious with scalers and flat panels this year. There's many people who've been on the scaler + flat panel game longer but have also been using CRTs on the side and thus remain alert to the differences. I could immediately tell a "big" (to me) difference between the 5X's triple buffered mode and the framelock mode, whereas other users on that topic were saying it feels the same to them.
I game primarily on flat panels (an LG PC monitor at 75Hz) and really don't feel a thing, using the OSSC and a generic component-to-HDMI box. Yes, input lag is overrated these days, but people think that as long as the ICs are capable of delivering zero lag, then they should.

I am not familiar with the 5X's modes, but "framelock" seems like "genlock", right? Where the input frame vsync is phase-aligned with the output frame vsync?
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: PixelFX Morph

Post by Josh128 »

VEGETA wrote:I game primarily on flat panels (an LG PC monitor at 75Hz) and really don't feel a thing, using the OSSC and a generic component-to-HDMI box. Yes, input lag is overrated these days, but people think that as long as the ICs are capable of delivering zero lag, then they should.

I am not familiar with the 5X's modes, but "framelock" seems like "genlock", right? Where the input frame vsync is phase-aligned with the output frame vsync?
Here are Time Sleuth results for two different LG 75Hz monitors, and as you can see, they are basically zero-lag. I was thinking my 1080p monitor was a bit slower, but I was mistaken--in these tests it's a few tenths of a ms faster, but that could be attributed to the fact that I was not feeding native res (my TS is not currently capable of 1440p) to the QHD monitor. This is within 0.5-1.5 ms of what I measure on an actual CRT with a zero-lag HDMI-to-VGA converter. I'm guessing this is the norm for most flat panels these days; up until a few days ago I did not realize that.

Concerning lag, I used to be a naysayer myself, but I find ~37ms (the input lag of my Samsung plasmas according to various tests, including Time Sleuth) to be the upper limit of what I consider acceptable for action games. The average Joe gamer who grew up gaming on flat-panel TVs would probably never even notice this amount of lag, but someone who grew up gaming on CRTs might. Anything over that and they definitely will, IMO. 16ms or below, I don't think anyone outside of X-Men/mutants would ever notice.

Image
Guspaz
Posts: 3136
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: PixelFX Morph

Post by Guspaz »

That's quite impressive, as the LG 27GL850 has 4.3ms of latency at 144Hz and 14ms of latency at 60Hz. Well, at 1440p. rtings didn't measure at 1080p.