PixelFX Morph

The place for all discussion on gaming hardware
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

I've also noticed that when reviewers check 1080p lag, they test 1080p@120fps, not 60fps. No one cares about our legacy stuff.
Guspaz
Posts: 3136
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: PixelFX Morph

Post by Guspaz »

rtings tested:

1440p144: 4.3 ms
1440p60: 14.0 ms
1440p144 VRR: 4.7 ms
1440p60 VRR: 13.2 ms
1440p144 HDR: 4.3 ms

I don't think we need detailed results for every possible resolution, but maybe throw in 1080p60, since that's a really common resolution if you're plugging a gaming console into the thing, like a Nintendo Switch or a PS4.
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: PixelFX Morph

Post by Josh128 »

Rtings has extremely suspect input latency results for many monitors. They completely botched the Samsung plasma results for the final-gen models:

RTings

F8500: 37.5ms (PC mode)
F5300: 70.8ms :o
F5500: 55.2ms
F4500: 50ms

Actual
F8500: 37.5ms (PC mode)
F5300: 37.5ms
F5500: Should be identical to F5300
F4500: 37.5ms

Also, for my LG 24MP59G, they have the input lag listed at 7.1 ms - 8 ms. The only thing I can think of is that they are measuring at the center of the screen, where of course you would measure 7 to 8 ms, just like the bottom will measure close to 16 ms. That's simply how the video is sent to the panel, and it's misleading. They should be measuring at the top left of the panel, in which case they would get the same 1.5 ms range that I'm measuring with the TS.
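
To put rough numbers on that, here's the scanout arithmetic as a quick back-of-envelope Python sketch (my own illustration, not Rtings' methodology):

Code: Select all

def scanout_latency_ms(y_fraction, refresh_hz=60.0):
    # Time from the start of a frame until scanout reaches a point
    # y_fraction (0.0 = top, 1.0 = bottom) down the screen, assuming
    # the display draws the signal as it arrives with no buffering.
    frame_period_ms = 1000.0 / refresh_hz
    return y_fraction * frame_period_ms

print(scanout_latency_ms(0.0))  # top: 0.0 ms
print(scanout_latency_ms(0.5))  # center: ~8.3 ms, the 7-8 ms they report
print(scanout_latency_ms(1.0))  # bottom: ~16.7 ms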

As for the plasmas, they appear to work differently from the zero-lag LCD panels: it's as if they take the entire 16 ms scan into a buffer and then display it all at once. This must be how it works, because I measure the exact same lag anywhere on the screen; it doesn't vary.
thebigcheese
Posts: 707
Joined: Sun Aug 21, 2016 5:18 pm

Re: PixelFX Morph

Post by thebigcheese »

VEGETA wrote:I just wanna ask if someone has tried a DP to HDMI adapter with, say, a 4K60 signal. Does it convert successfully without lag? How much does one cost?
Adapters that convert from one to the other are generally not great. Passive adapters are limited in what sorts of signals they can accept/output, and active ones have a tendency to add lag. So if we're going to go on complaining about things that add lag, offloading HDMI to an adapter is just a bad idea.
Guspaz
Posts: 3136
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: PixelFX Morph

Post by Guspaz »

No DisplayPort adapter (passive or active) will add any lag. Passive DisplayPort to HDMI adapters are perfectly fine. They're just that, passive adapters: the DisplayPort port itself is sending out an HDMI signal.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

Yeah, there is no lag with those converters, as long as they're not doing any scaling. Most of the cheap ones you find can have trouble doing 4K60 though, and may only do 4K30 at most. But this is irrelevant for us here, since no gaming scaler is expected to output 4K any time soon.
Guspaz
Posts: 3136
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: PixelFX Morph

Post by Guspaz »

I believe that retro scaler manufacturers have said they'd expect a scaler capable of 4K60 to have a price tag in the thousands of dollars, based on the cost of the FPGAs/RAM/HDMI transmitters required to do that. Nobody would buy that.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

Oh, I'm pretty sure some would buy it, but almost certainly not in enough numbers to justify the whole enterprise at this point. Give it about 4 more years I'd say.
thebigcheese
Posts: 707
Joined: Sun Aug 21, 2016 5:18 pm

Re: PixelFX Morph

Post by thebigcheese »

Guspaz wrote:No DisplayPort adapter (passive or active) will add any lag. Passive DisplayPort to HDMI adapters are perfectly fine. They're just that, passive adapters, the DisplayPort port is sending out an HDMI signal.
Admittedly, this comes mostly from my experience researching adapters to get VR to work. Oculus specifically recommends against using adapters and, indeed, I was never able to find one that worked reliably either for my Rift or for getting 4k60 on my TV. This is not even including trying to get various VRR and 444 and other sorts of things working. If you're only trying to get 1080p60, that might be fine, but even 1440p60 is likely to be sketchy with most of the common adapters out there. My point really is just that relying on adapters is a way worse solution than having native HDMI output and I would much, much prefer a native HDMI device.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

ross wrote:As someone who's never seen this bob + scanline effect, I'm assuming it produces an image that looks a lot more like a pin sharp PVM with noticeable flicker and combing in motion rather than an old CRT television that would blur/hide a lot of the quirks of interlaced video?
Yes, it's almost the same as how interlacing looks on a BVM: if you get close you can easily see the lines flickering, and combing artifacts are as clear as the picture itself, since you do have a fully resolved picture. And I say BVM specifically because, in my experience, those do not make 480i look as flickery as the PVMs I've seen, even with equivalent or similar TV lines. On BVMs or this simulation the flicker doesn't bother me personally at normal viewing distance, and it is objectively less noticeable than bob without scanlines. The combing can be a bit distracting sometimes, but that is hard to avoid with any kind of deinterlacing without adding blur to the picture.

I'm sure it'd be possible to add filtering to soften the picture and make it look more like on a consumer TV, of course at the cost of reducing how sharp and resolved the picture looks.
Guspaz
Posts: 3136
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: PixelFX Morph

Post by Guspaz »

thebigcheese wrote:
Guspaz wrote:No DisplayPort adapter (passive or active) will add any lag. Passive DisplayPort to HDMI adapters are perfectly fine. They're just that, passive adapters, the DisplayPort port is sending out an HDMI signal.
Admittedly, this comes mostly from my experience researching adapters to get VR to work. Oculus specifically recommends against using adapters and, indeed, I was never able to find one that worked reliably either for my Rift or for getting 4k60 on my TV. This is not even including trying to get various VRR and 444 and other sorts of things working. If you're only trying to get 1080p60, that might be fine, but even 1440p60 is likely to be sketchy with most of the common adapters out there. My point really is just that relying on adapters is a way worse solution than having native HDMI output and I would much, much prefer a native HDMI device.
Any passive adapter should be expected to work fine up to 4K30, as that was what was supported under the dual-mode standard for DP 1.2. 1440p60 requires even less bandwidth than that and should also be fine. 4K60 is another story: that was only added in DP 1.3, and GPUs as recent as the GeForce 1000 series (like the 1080) only shipped with DP 1.2 support out of the box (they later got a firmware update to add 1.3/1.4).
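
A rough bandwidth sanity check in Python (approximate numbers; I'm assuming a flat ~20% blanking overhead, where the real CVT/CEA timings differ slightly):

Code: Select all

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.20):
    # Active pixels per second, padded ~20% for blanking intervals.
    return width * height * refresh_hz * blanking / 1e6

for name, (w, h, hz) in {
    "1080p60": (1920, 1080, 60),
    "1440p60": (2560, 1440, 60),
    "4K30":    (3840, 2160, 30),
    "4K60":    (3840, 2160, 60),
}.items():
    print(f"{name}: ~{approx_pixel_clock_mhz(w, h, hz):.0f} MHz")

# 1440p60 (~265 MHz) and 4K30 (~299 MHz) both fit under the 300 MHz TMDS
# clock limit of DP 1.2 dual-mode; 4K60 (~597 MHz) needs HDMI 2.0-class
# signaling, hence DP 1.3+ or an active adapter.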

It should be noted that passive adapters do actually require a chip to handle the voltage-level change; dual-mode DisplayPort only switches the signaling over to match HDMI/DVI.
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: PixelFX Morph

Post by Josh128 »

ross wrote:
fernan1234 wrote:But this is all if you prefer a progressive-looking deinterlaced picture. I agree even more with the above comments that bob+"scanlines" (the bob in the GBS-C, bob+100% scanlines on the OSSC Pro, or "CRT simulate" on the TINK products) is the ideal solution, since not only does it look the most authentic for interlaced pictures, it is also the fastest. We're not talking about video or film here, but games that were made to be played on CRTs. The ideal is for them to look and feel like on a CRT after scaling.
As someone who's never seen this bob + scanline effect, I'm assuming it produces an image that looks a lot more like a pin sharp PVM with noticeable flicker and combing in motion rather than an old CRT television that would blur/hide a lot of the quirks of interlaced video?
Actually, no. The reason I didn't notice any brightness change from using bob deinterlacing + scanlines on the GBS-C with the GameCube the other day was because I had the scanline strength set to 50 (default/lightest). It really doesn't change the brightness at all! An added plus is that it looks very, very smooth on a VGA CRT in this mode. Super authentic looking (again though, this IS on a CRT). Increasing scanline strength doesn't affect sharpness or flicker either; all it does is darken the picture. A matter of preference, I would say. You don't even need scanlines in bob deinterlace mode if you don't want them, you can just run it without and it still looks great, but I find 50% scanlines + bob to look the most authentic. It's basically perfect.

Meanwhile, "CRT Simulate" on the 5X greatly darkens the screen vs other modes, presumably because it uses black scanlines by default, and I'm not sure if scanline strength is adjustable while using "CRT Simulate". However, if running on an LCD it's no problem to jack up the brightness to counter that loss.

I'd also like to add that the motion adaptive deinterlacing on the GBS-C looks REALLY damned good on a CRT, even in motion. It's 95-99% of the quality of 480p, I would say. It does add 16 ms of lag, but it's very impressive.

So fernan1234, my GBS-C being in 50% scanline mode the other day is the reason I didn't notice that brightness loss we discussed above. I wouldn't change a thing about it; that setting is essentially perfect. The GBS-C is an amazing little gem of a device.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

Josh128 wrote:So fernan1234, my GBS-C being in 50% scanline mode the other day is the reason I didn't notice that brightness loss we discussed above. I wouldn't change a thing about it; that setting is essentially perfect. The GBS-C is an amazing little gem of a device.
Yes, you're totally right, I forgot that the scanline settings affect how dark the picture becomes. Ever since I set mine up I've had the scanlines adjusted to the max, because that's what brings it closest to the kind of picture I was used to on a CRT BVM, but I can see that it can also look great with lighter scanlines.

And no, on the 5X's CRT simulate mode you can't make such an adjustment, but on the OSSC you can, and I bet you will be able to on the upcoming Morph as well.
Smashbro29
Posts: 532
Joined: Thu Dec 15, 2011 2:46 am

Re: PixelFX Morph

Post by Smashbro29 »

On paper doesn't this seem like the best upscaler? Am I missing something?
anexanhume
Posts: 41
Joined: Sat Mar 06, 2021 1:12 am

Re: PixelFX Morph

Post by anexanhume »

Smashbro29 wrote:On paper doesn't this seem like the best upscaler? Am I missing something?
I would say it and the OSSC Pro seem very close.
kitty666cats
Posts: 1270
Joined: Tue Nov 05, 2019 2:03 am
Location: Massachusetts, USA

Re: PixelFX Morph

Post by kitty666cats »

The Morph’s potential for use in digitizing finicky analog video makes it blow away the OSSC Pro, for me :)
Smashbro29
Posts: 532
Joined: Thu Dec 15, 2011 2:46 am

Re: PixelFX Morph

Post by Smashbro29 »

anexanhume wrote:
Smashbro29 wrote:On paper doesn't this seem like the best upscaler? Am I missing something?
I would say it and the OSSC Pro seem very close.
Would you like to expand on that? I'm very interested in all the upcoming tech.
HDgaming42
Posts: 331
Joined: Wed Jul 15, 2009 3:16 am
Location: Canada

Re: PixelFX Morph

Post by HDgaming42 »

Smashbro29 wrote:On paper doesn't this seem like the best upscaler? Am I missing something?
For me it's the support of LUTs. Whichever one implements them gets my money this round.
anexanhume
Posts: 41
Joined: Sat Mar 06, 2021 1:12 am

Re: PixelFX Morph

Post by anexanhume »

Smashbro29 wrote:
anexanhume wrote:
Smashbro29 wrote:On paper doesn't this seem like the best upscaler? Am I missing something?
I would say it and the OSSC Pro seem very close.
Would you like to expand on that? I'm very interested in all the upcoming tech.
Consult the feature list in the OP here versus the OP of the OSSC Pro thread. You'll see that they're quite similar.
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: PixelFX Morph

Post by VEGETA »

HDgaming42 wrote:
Smashbro29 wrote:On paper doesn't this seem like the best upscaler? Am I missing something?
For me it's the support of LUTs. Whichever one implements them gets my money this round.
can you explain what look-up tables you are referring to?
HDgaming42
Posts: 331
Joined: Wed Jul 15, 2009 3:16 am
Location: Canada

Re: PixelFX Morph

Post by HDgaming42 »

VEGETA wrote:
HDgaming42 wrote:
Smashbro29 wrote:On paper doesn't this seem like the best upscaler? Am I missing something?
For me it's the support of LUTs. Whichever one implements them gets my money this round.
can you explain what look-up tables you are referring to?
Happy to! Essentially they're text files that describe how you would like colours interpreted or altered. They can be corrective or purely for effect.
https://www.studiobinder.com/blog/what-is-lut/

Good video showing what is possible with LUTs:
https://3dlutcreator.com/

Free software to generate various LUT formats:
https://displaycal.net/

There was an FPGA box designed for this called the eeColor, and it was able to hold multiple 65-point cube LUTs.
https://www.displaycalibrationtools.com ... low-guide/

It only supports HD.

Simple conversion boxes now hold 33-point LUTs:
https://www.blackmagicdesign.com/ca/pro ... converters

You only need to "profile" a monitor once. Then you can generate multiple LUTs. You can have one to correct the imperfections in your set's colour reproduction. You can have one to "convert" your set to D93 (arguably what 8- and 16-bit Japanese games were programmed for), and another for North American D65. Maybe you have one to correct for PC Engine RGB. Maybe you really want to play SMB1 as Wario and change red to purple.
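
If you're curious what one of these boxes is doing per pixel, here's a minimal Python sketch of a 3D cube LUT lookup (my own illustration, nothing to do with any particular product; real hardware works in fixed point and interpolates between lattice points rather than snapping to the nearest one):

Code: Select all

import numpy as np

def apply_3d_lut(rgb, lut):
    # Look one RGB pixel (values 0..1) up in an N-point cube LUT of
    # shape (N, N, N, 3). Nearest-neighbour for brevity; real devices
    # do trilinear or tetrahedral interpolation.
    n = lut.shape[0]
    idx = np.clip(np.round(np.asarray(rgb) * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[0], idx[1], idx[2]]

# Start from an identity 17-point cube...
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = np.stack([r, g, b], axis=-1)

# ...then bend it: a crude "red turns purple" Wario-style effect.
lut[..., 2] = np.maximum(lut[..., 2], 0.6 * lut[..., 0])

print(apply_3d_lut((1.0, 0.1, 0.1), lut))  # red in, purplish out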

All possible with LUTs. Happy to expand upon anything you're interested in. :)
Sirotaca
Posts: 103
Joined: Sun Mar 19, 2017 12:08 am

Re: PixelFX Morph

Post by Sirotaca »

Another interesting possibility with LUTs is to eliminate mild video noise for consoles with a small color palette by effectively reducing the color depth. The PC Engine with its 9-bit palette would be a good candidate for that, for example. Get it dialed in and it would be indistinguishable from a pure digital output.

I don't know how difficult it would be to implement in an upscaler like the Morph, but if it's feasible it would be a very cool and powerful feature indeed.
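
For what it's worth, the concept is simple enough to sketch in a few lines of Python; a real LUT would express this as a full 3D cube, but per channel it amounts to snapping each value back to the nearest legal palette level (8 levels per channel for the PC Engine's 3 bits):

Code: Select all

def snap_to_palette(value_8bit, levels=8):
    # Quantize an 8-bit channel value to the nearest of `levels` evenly
    # spaced levels (8 per channel = a 9-bit RGB palette), then expand
    # back to 8-bit. Noise smaller than half a step disappears.
    step = 255 / (levels - 1)
    return round(round(value_8bit / step) * step)

# A noisy capture of palette level 2-of-7 (~73 in 8-bit terms):
for noisy in (70, 72, 73, 75, 77):
    print(noisy, "->", snap_to_palette(noisy))  # all collapse back to 73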
Unseen
Posts: 723
Joined: Sun May 25, 2014 8:12 pm

Re: PixelFX Morph

Post by Unseen »

Sirotaca wrote:I don't know how difficult it would be to implement in an upscaler like the Morph, but if it's feasible it would be a very cool and powerful feature indeed.
It depends on the size of the LUT: the more points you have along the edge of the cube, the larger the memory needs to be to hold all the values, and it must be very high-speed, low-latency memory, because you need to look up multiple values per pixel. If the feature is ever implemented, you should probably not expect something on the level of the eeColor or even Lumagen's offerings (17x17x17 cube).

I'm pretty sure that a LUT would increase the latency by at least 4 pixels (1 for lookup, 3 for interpolation) ;)
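
The memory numbers are easy to put figures on. A quick sketch, assuming 30 bits stored per lattice point (3 x 10-bit channels; the actual storage width is an implementation detail I'm guessing at):

Code: Select all

def lut_memory_kib(points, bits_per_entry=30):
    # points^3 lattice entries, one RGB triple each.
    return points ** 3 * bits_per_entry / 8 / 1024

for p in (17, 33, 65):
    print(f"{p}pt cube: ~{lut_memory_kib(p):.0f} KiB")

# 17pt: ~18 KiB, 33pt: ~132 KiB, 65pt: ~1006 KiB. The eeColor-class
# 65pt cube is where fast on-chip FPGA block RAM starts to get tight.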
HDgaming42
Posts: 331
Joined: Wed Jul 15, 2009 3:16 am
Location: Canada

Re: PixelFX Morph

Post by HDgaming42 »

Unseen wrote:
Sirotaca wrote:I don't know how difficult it would be to implement in an upscaler like the Morph, but if it's feasible it would be a very cool and powerful feature indeed.
It depends on the size of the LUT - the more points you have on the edge of the cube, the larger your memory needs to be to hold all the values and it must be very high-speed, low-latency memory because you need to look up multiple values per pixel. If the feature is ever implemented, you should probably not expect something on the level of eeColor or even Lumagen's offerings (17x17x17 cube).

I'm pretty sure that a LUT would increase the latency by at least 4 pixels (1 for lookup, 3 for interpolation) ;)
Thanks for looking into this! I wonder if a 17pt cube would be enough to move from D65 to D93.

Another possibility, and bear with me here, is to offer SDI out as an option. The DeckLink mini converters can hold a 33-point cube and only cost $200 CAD (they only take SDI in and output HDMI).

SDI timings are tight, and trying to use a LUT with legacy gaming hardware is really hard, as the output of the OSSC was never close enough to spec for the HDMI to SDI converters I tried. The world needs better ways to get video into UHD-compatible SDI anyway; the Morph could fill that gap.

Maybe you could have a header for SDI ready to populate if there are licensing considerations? Have the output ready to toggle within the software after the end user has soldered on the output?

Then you'd be off the hook for LUTs! ;)
Unseen
Posts: 723
Joined: Sun May 25, 2014 8:12 pm

Re: PixelFX Morph

Post by Unseen »

HDgaming42 wrote:Thanks for looking into this!
Just to clear up any potential confusion: Reading your post I'm under the impression that you seem to think I'm somehow involved with the PixelFX products. I am not, I'm just a reader of this forum who has a bit of background knowledge about video processing.
HDgaming42
Posts: 331
Joined: Wed Jul 15, 2009 3:16 am
Location: Canada

Re: PixelFX Morph

Post by HDgaming42 »

Unseen wrote:
HDgaming42 wrote:Thanks for looking into this!
Just to clear up any potential confusion: Reading your post I'm under the impression that you seem to think I'm somehow involved with the PixelFX products. I am not, I'm just a reader of this forum who has a bit of background knowledge about video processing.
Thanks for the clarification. Maybe the OP could be updated to include the members of PixelFX, which seem to be:

Woozle64
citrus3000psi
chriz2600

Unless they go by other handles, it seems only Woozle is active here?
energizerfellow‌
Posts: 208
Joined: Thu Sep 27, 2018 1:04 am

Re: PixelFX Morph

Post by energizerfellow‌ »

Konsolkongen wrote:Will the Morph offer some basic white balance adjustments? The Dreamcast with original Sega VGA box (maybe all VGA boxes?) has a slightly too green image. This can somewhat be corrected on the OSSC. I expect the OSSC Pro will offer the same options but it would be great if the Morph did too.
Sounds like a BT.601-into-sRGB problem, with a side of the CRT gamut being bigger than sRGB's. Anything vaguely modern will be expecting PC-style sRGB on its VGA input, not video-style BT.601.
vrunk11 wrote:Is there a planned function for converting limited range to full range (switching between IRE 7.5 and IRE 0) for NTSC input?
In analog YUV/YIQ land there are indeed both IRE 0.0 and 7.5 setups, but other than a slight variance in the voltage level of logical zero, there's no difference. Analog RGB has always been IRE 0.0, as far as I know.

The difference in analog voltage swing doesn't translate into the digital realm where, at least by official spec, 8-bit YCbCr should always be 16-235, regardless of what the analog side looked like. The "full range" 0-255 data in YUV is technically standards-breaking and support for it is functionally broken in many, many things, both software and hardware. When you're working in the RGB color space, however, virtually everything uses and expects the full 0-255 (but not always...).

This should also make it obvious why 10-bit color is a big deal for YUV, why 10-bit color was part of the SDI specification back in the '80s, and why basically anything newer than D1 in the pro space uses 10-bit color.
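
The range expansion itself is trivial, for what it's worth; the mess is everything around it (range flags and the devices that ignore them). A minimal 8-bit luma sketch:

Code: Select all

def limited_to_full(y):
    # Expand 8-bit limited-range luma (16-235) to full range (0-255),
    # clamping "superblack"/"superwhite" excursions. Chroma would use
    # the 16-240 nominal range instead.
    y = min(max(y, 16), 235)
    return round((y - 16) * 255 / (235 - 16))

print(limited_to_full(16))   # 0, black
print(limited_to_full(126))  # 128, mid grey
print(limited_to_full(235))  # 255, white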
HDgaming42 wrote: Another great usage, and this will likely get me in trouble, would be to play 8 and 16bit games in D93 the way the gods intended with a D93 LUT, while viewing other systems in D65. Or being able to instantly jump between the two to play in whatever you prefer.
It's not just the D93 white point for NTSC-J, but also PAL-style EBU phosphors defining the color space instead of SMPTE-C. Per the manual from one of Sony's newer PVMs, it does indeed look like NTSC-J is EBU + D93. This appears to be backed up by comments in Alexis Van Hurkman's The Color Correction Handbook: Professional Techniques for Video and Cinema.

[image: table from a Sony PVM manual showing the default colour space and white point for each region setting]
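
For reference, the chromaticities in question, as plain data (standard published CIE xy values; the D93 figures vary slightly between sources):

Code: Select all

# CIE xy chromaticity coordinates for the phosphor sets and white
# points being discussed.
PRIMARIES = {
    "SMPTE-C": {"R": (0.630, 0.340), "G": (0.310, 0.595), "B": (0.155, 0.070)},
    "EBU":     {"R": (0.640, 0.330), "G": (0.290, 0.600), "B": (0.150, 0.060)},
}
WHITE_POINTS = {
    "D65": (0.3127, 0.3290),  # SMPTE-C / sRGB reference white
    "D93": (0.2831, 0.2971),  # NTSC-J reference white, ~9300 K
}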
HDgaming42 wrote: Another possibility, and bear with me here, is to offer SDI out as an option.
If there ever ends up being a "pro" model of the Morph (as in sold to commercial customers via places like B&H), SDI I/O would be strongly desired in that market, along with BNC connectors and general mechanical hardening.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: PixelFX Morph

Post by fernan1234 »

energizerfellow‌ wrote:It's not just the D93 white point for NTSC-J, but also PAL-style EBU phosphors defining the color space instead of SMPTE-C. Per the manual from one of Sony's newer PVMs, it does indeed look like NTSC-J is EBU + D93. This appears to be backed up by comments in Alexis Van Hurkman's The Color Correction Handbook: Professional Techniques for Video and Cinema.
That picture is simply from a Sony monitor manual, showing what settings the monitor defaults to when you select a given region; the user can then change them depending on the source material being worked on. It's not necessarily representative of what was used generally in Japan over a given period of time.

What we do know is that many professional monitors, including Sony-made ones, used P22 phosphors until at least the 1990s, while there were also EBU-phosphor monitors later on, mainly non-Sony ones like Victor/JVC and Ikegami, but a few older Sony models as well. Sony, as the industry leader, switched to SMPTE-C for pro monitors sold in Japan in the mid/late 1990s, up through its last CRTs in the mid 2000s.
edit: I'm not so sure about the chronology here; it does look like there were times when these and other phosphor types co-existed in the Japanese market.
energizerfellow‌ wrote:If there ever ends up being a "pro" model of the Morph (as in sold to commercial customers via places like B&H), SDI I/O would be strongly desired in that market, along with BNC connectors and general mechanical hardening.
This whole idea sounded funny to think about for a moment, but then I realized how much gaming and streaming are growing, and although retro is a small subset of that, it's actually not inconceivable that there could be a large enough professional space for it at some point down the road.
vrunk11
Posts: 8
Joined: Thu May 20, 2021 8:56 am
Location: Europe

Re: PixelFX Morph

Post by vrunk11 »

energizerfellow‌ wrote: In analog YUV/YIQ land there are indeed both IRE 0.0 and 7.5 setups, but other than a slight variance in the voltage level of logical zero, there's no difference. Analog RGB has always been IRE 0.0, as far as I know.

The difference in analog voltage swing doesn't translate into the digital realm where, at least by official spec, 8-bit YCbCr should always be 16-235, regardless of what the analog side looked like. The "full range" 0-255 data in YUV is technically standards-breaking and support for it is functionally broken in many, many things, both software and hardware. When you're working in the RGB color space, however, virtually everything uses and expects the full 0-255 (but not always...).
But for the analog out it could be a good idea to have a full-range output with good blacks and colors. On the HDMI/DVI side, I'm not sure the limited range is handled properly when passed through other video processing (a switch, scaler, capture card, or DVI input), and we should consider that the conversion from limited range to full range has to be done somewhere, either on the TV or on the scaler. The benefit of doing it on the scaler is that you control everything and don't rely on any other processing. (And to use the YUV 4:4:4 input on a TV, it's necessary to use PC mode, which disables all other processing.)
orange808
Posts: 3196
Joined: Sat Aug 20, 2016 5:43 am

Re: PixelFX Morph

Post by orange808 »

That leads into something I mentioned in another thread just now.

There are almost no "all analog" transcoders available that will accept analog, transcode, and output analog without damaging the signal with a "generic" ADC sampling step (before processing). It would be nice to have a new and widely available option to properly sample, transcode, and standardise all signals to a single standard (RGB or component), and to be able to split that signal for both CRTs and feeding a video scaler. It would be nice to tackle color processing in this initial step as well.

If it were capable, I would use two Morph machines. One Morph would receive everything from the matrix and feed analog back into the switch. The other Morph would accept the transcoded signal (after the switch) and feed HDMI to the HDMI switch.