shmups.system11.org

Shmups Forum
 



Post new topic Reply to topic  [ 164 posts ]  Go to page Previous  1, 2, 3, 4, 5, 6  Next
Author Message
 Post subject: Re: PixelFX Morph
PostPosted: Tue Jun 01, 2021 11:51 pm 



Joined: 14 Aug 2017
Posts: 1576
I've also noticed that when reviewers check 1080p lag they check for 1080p@120fps, not 60fps. No one cares about our legacy stuff.


Top
 Offline Profile  
 
 Post subject: Re: PixelFX Morph
PostPosted: Wed Jun 02, 2021 1:15 am 


User avatar

Joined: 06 Oct 2015
Posts: 2722
Location: Montréal, Canada
rtings tested:

1440p144: 4.3 ms
1440p60: 14.0 ms
1440p144 VRR: 4.7ms
1440p60 VRR: 13.2ms
1440p144 HDR: 4.3ms

I don't think we need detailed results for every possible resolution, but, maybe throw in 1080p60, since that's a really common resolution if you're plugging a gaming console into the thing? Like, a Nintendo Switch or a PS4.


 Post subject: Re: PixelFX Morph
PostPosted: Wed Jun 02, 2021 12:45 pm 


User avatar

Joined: 16 Jan 2014
Posts: 1416
Rtings has extremely suspect input latency results on many monitors. They completely botched Samsung plasma results for the final gen models:

RTings

F8500: 37.5ms (PC mode)
F5300: 70.8ms :o
F5500: 55.2ms
F4500: 50ms

Actual
F8500: 37.5ms (PC mode)
F5300: 37.5ms
F5500: Should be identical to F5300
F4500: 37.5ms

Also, for my LG 24MP59G, they have the input lag listed at 7.1ms - 8ms. The only thing I can think of is that they are measuring at the center of the screen, where of course you would measure 7 to 8 ms, just like the bottom will measure close to 16ms. That's just how the video is scanned out to the panel, so it's misleading. They should be measuring at the top left of the panel, in which case they would be in the same 1.5ms range that I'm measuring with the TS.

As for the plasmas, they appear to work differently from the zero-lag LCD panels: it's as if they take the entire 16ms scan into a buffer and then display it all at once. This must be how it works, because I measure the exact same lag anywhere on the screen; it doesn't vary.
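The scanout arithmetic behind those center-of-screen numbers can be sketched in a few lines (a hypothetical helper, just illustrating the reasoning above, not any reviewer's actual method):

```python
# On a panel that draws the frame top-to-bottom with no buffering, a lag
# sensor placed partway down the screen reads the scanout time to that
# position, even though the display itself adds essentially no lag.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 Hz

def scanout_delay_ms(vertical_position: float) -> float:
    """Delay before the pixel at `vertical_position` (0.0 = top of
    screen, 1.0 = bottom) is drawn, relative to the frame start."""
    return FRAME_MS * vertical_position

print(round(scanout_delay_ms(0.0), 1))  # top: 0.0 ms
print(round(scanout_delay_ms(0.5), 1))  # centre: ~8.3 ms
print(round(scanout_delay_ms(1.0), 1))  # bottom: ~16.7 ms
```

This is why a sensor at screen center reads 7-8 ms on a "zero lag" 60 Hz panel, and why a plasma that buffers the whole field before displaying it reads the same value everywhere.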


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 2:42 pm 



Joined: 21 Aug 2016
Posts: 663
VEGETA wrote:
I just wanna ask if someone tried a DP to HDMI adapter with say 4k60 signal. does it convert it successfully without lag? how much price?


Adapters that convert from one to the other are generally not great. Passive adapters are limited in what sorts of signals they can accept/output and active ones have a tendency to add lag. So if we're going to go on complaining about things that add lag, offloading HDMI to an adapter is just a bad idea.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 3:09 pm 


User avatar

Joined: 06 Oct 2015
Posts: 2722
Location: Montréal, Canada
No DisplayPort adapter (passive or active) will add any lag. Passive DisplayPort to HDMI adapters are perfectly fine. They're just that, passive adapters, the DisplayPort port is sending out an HDMI signal.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 3:24 pm 



Joined: 14 Aug 2017
Posts: 1576
Yeah, there is no lag with those converters as long as they're not doing any scaling. Most of the cheap ones you find can have trouble doing 4k60 though, and may only do 4k30 at most. But this is irrelevant for us here, since no gaming scaler is expected to output 4k any time soon.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 3:50 pm 


User avatar

Joined: 06 Oct 2015
Posts: 2722
Location: Montréal, Canada
I believe that retro scaler manufacturers have said they'd expect a scaler capable of 4K60 to have a pricetag in the thousands of dollars, based on the cost of the FPGAs/RAM/HDMI transmitters required to do that. Nobody would buy that.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 4:20 pm 



Joined: 14 Aug 2017
Posts: 1576
Oh, I'm pretty sure some would buy it, but almost certainly not in enough numbers to justify the whole enterprise at this point. Give it about 4 more years I'd say.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 6:49 pm 



Joined: 15 Oct 2017
Posts: 256
fernan1234 wrote:
But this is all if you prefer a progressive-looking deinterlaced picture. I agree even more with the above comments about bob+"scanlines" (the bob in the GBS-C, bob+100% scanlines on OSSC Pro, or "CRT simulate" on the TINK products) is the ideal solution, since not only does it look most authentic as interlaced pictures, but is the fastest. We're not talking about video or film here, but games that were made to be played on CRTs. The ideal is for them to look and feel like on a CRT after scaling.

As someone who's never seen this bob + scanline effect, I'm assuming it produces an image that looks a lot more like a pin sharp PVM with noticeable flicker and combing in motion rather than an old CRT television that would blur/hide a lot of the quirks of interlaced video?


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 6:58 pm 



Joined: 21 Aug 2016
Posts: 663
Guspaz wrote:
No DisplayPort adapter (passive or active) will add any lag. Passive DisplayPort to HDMI adapters are perfectly fine. They're just that, passive adapters, the DisplayPort port is sending out an HDMI signal.


Admittedly, this comes mostly from my experience researching adapters to get VR to work. Oculus specifically recommends against using adapters and, indeed, I was never able to find one that worked reliably either for my Rift or for getting 4k60 on my TV. This is not even including trying to get various VRR and 444 and other sorts of things working. If you're only trying to get 1080p60, that might be fine, but even 1440p60 is likely to be sketchy with most of the common adapters out there. My point really is just that relying on adapters is a way worse solution than having native HDMI output and I would much, much prefer a native HDMI device.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 7:00 pm 



Joined: 14 Aug 2017
Posts: 1576
ross wrote:
As someone who's never seen this bob + scanline effect, I'm assuming it produces an image that looks a lot more like a pin sharp PVM with noticeable flicker and combing in motion rather than an old CRT television that would blur/hide a lot of the quirks of interlaced video?


Yes, it's almost the same as how interlaced video looks on a BVM: if you get close you can easily see the lines flickering, and combing artifacts are also as clear as the picture itself, since you do have a fully resolved picture. And I say BVM specifically because in my experience those do not make 480i look as flickery as the PVMs I've seen, even with equivalent or similar TV lines. On BVMs or this simulation the flicker doesn't bother me personally at normal viewing distance, and it is objectively less noticeable than bob without scanlines. The combing can be a bit distracting sometimes, but that is hard to avoid in general with any kind of deinterlacing that doesn't add blur to the picture.

I'm sure it'd be possible to add filtering to soften the picture and make it look more like on a consumer TV, of course at the cost of reducing how sharp and resolved the picture looks.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 7:22 pm 


User avatar

Joined: 06 Oct 2015
Posts: 2722
Location: Montréal, Canada
thebigcheese wrote:
Guspaz wrote:
No DisplayPort adapter (passive or active) will add any lag. Passive DisplayPort to HDMI adapters are perfectly fine. They're just that, passive adapters, the DisplayPort port is sending out an HDMI signal.


Admittedly, this comes mostly from my experience researching adapters to get VR to work. Oculus specifically recommends against using adapters and, indeed, I was never able to find one that worked reliably either for my Rift or for getting 4k60 on my TV. This is not even including trying to get various VRR and 444 and other sorts of things working. If you're only trying to get 1080p60, that might be fine, but even 1440p60 is likely to be sketchy with most of the common adapters out there. My point really is just that relying on adapters is a way worse solution than having native HDMI output and I would much, much prefer a native HDMI device.


Any passive adapter should be expected to work fine up to 4K30, as that was what was supported under the dual-mode standard for DP 1.2. 1440p60 requires even less bandwidth than that and should also be fine. 4K60 is another story, that was only added in DP 1.3, and GPUs as recent as the GeForce 1000 series (like 1080) only shipped with DP 1.2 support out of the box (they later had a firmware update to add 1.3/1.4).

It should be noted that passive adapters do actually require a chip to handle the voltage-level change; dual-mode DisplayPort only switches the signaling over to match HDMI/DVI.
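A quick pixel-rate comparison backs up the bandwidth argument above. The figures below count only active pixels and ignore blanking intervals, so treat them as rough relative numbers rather than actual link rates:

```python
# Rough pixel throughput per mode: 1440p60 needs less than 4K30 (both
# inside the dual-mode DP 1.2 envelope), while 4K60 needs roughly
# double, which is why it required DP 1.3.
def mpixels_per_s(width: int, height: int, hz: int) -> float:
    """Active pixel rate in megapixels/second (blanking ignored)."""
    return width * height * hz / 1e6

modes = {
    "4K30":    mpixels_per_s(3840, 2160, 30),
    "1440p60": mpixels_per_s(2560, 1440, 60),
    "4K60":    mpixels_per_s(3840, 2160, 60),
}
for name, rate in modes.items():
    print(f"{name}: {rate:.0f} Mpx/s")
```

Running this shows 1440p60 at ~221 Mpx/s versus ~249 Mpx/s for 4K30 and ~498 Mpx/s for 4K60, consistent with the post above.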


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 9:08 pm 


User avatar

Joined: 16 Jan 2014
Posts: 1416
ross wrote:
fernan1234 wrote:
But this is all if you prefer a progressive-looking deinterlaced picture. I agree even more with the above comments about bob+"scanlines" (the bob in the GBS-C, bob+100% scanlines on OSSC Pro, or "CRT simulate" on the TINK products) is the ideal solution, since not only does it look most authentic as interlaced pictures, but is the fastest. We're not talking about video or film here, but games that were made to be played on CRTs. The ideal is for them to look and feel like on a CRT after scaling.

As someone who's never seen this bob + scanline effect, I'm assuming it produces an image that looks a lot more like a pin sharp PVM with noticeable flicker and combing in motion rather than an old CRT television that would blur/hide a lot of the quirks of interlaced video?


Actually, no. The reason I didn't notice any brightness change from using bob deinterlacing + scanlines on the GBS-C with the Gamecube the other day was that I had the scanline strength set to 50 (default/lightest). It really doesn't change the brightness at all! An added plus is that it looks very, very smooth on a VGA CRT in this mode. Super authentic looking (again though, this IS on a CRT). Increasing scanline strength doesn't affect sharpness or flicker either; all it does is darken the picture. A matter of preference, I would say. You don't even need scanlines in bob deinterlace mode if you don't want them — you can run it without and it still looks great — but I find 50% scanlines + bob to look the most authentic. It's basically perfect.

Meanwhile, "CRT Simulate" on the 5X greatly darkens the screen vs other modes, presumably because it uses black scanlines by default, and I'm not sure whether scanline strength is adjustable while using "CRT Simulate". However, if running on an LCD it's no problem to jack up the brightness to counter that loss.

I'd also like to add that the motion adaptive deinterlacing on the GBS-C looks REALLY damned good on a CRT, even in motion. It's 95-99% of the quality of 480p, I would say. It does add 16ms of lag, but it's very impressive.

So fernan1234, the fact that my GBS-C was in 50% scanline mode the other day is the reason I didn't notice the brightness loss we discussed above. I wouldn't change a thing about it; that setting is essentially perfect. The GBS-C is an amazing little gem of a device.


 Post subject: Re: PixelFX Morph
PostPosted: Thu Jun 03, 2021 10:03 pm 



Joined: 14 Aug 2017
Posts: 1576
Josh128 wrote:
So fernan1234, the fact that my GBS-C was in 50% scanline mode the other day is the reason I didn't notice the brightness loss we discussed above. I wouldn't change a thing about it; that setting is essentially perfect. The GBS-C is an amazing little gem of a device.


Yes you're totally right, I forgot that the scanline settings affect how dark it becomes. Ever since I set it up I adjusted my scanlines to the max because that's what brings it closer to the kind of picture I was used to on a CRT BVM, but I can still see that it can look great with lighter scanlines.

And no, on the 5X's CRT simulate mode you can't make such an adjustment, but on the OSSC you can, and I bet also on the upcoming Morph.


 Post subject: Re: PixelFX Morph
PostPosted: Fri Jun 18, 2021 4:17 pm 


User avatar

Joined: 15 Dec 2011
Posts: 525
Location: NYC
On paper doesn't this seem like the best upscaler? Am I missing something?


 Post subject: Re: PixelFX Morph
PostPosted: Fri Jun 18, 2021 4:28 pm 



Joined: 06 Mar 2021
Posts: 30
Smashbro29 wrote:
On paper doesn't this seem like the best upscaler? Am I missing something?

I would say it and the OSSC Pro seem very close.


 Post subject: Re: PixelFX Morph
PostPosted: Fri Jun 18, 2021 9:03 pm 


User avatar

Joined: 05 Nov 2019
Posts: 771
Location: Massachusetts, USA
The Morph’s potential for use in digitizing finicky analog video makes it blow away the OSSC Pro, for me :)
_________________
For CRTs, A/V gear, video games & more, be sure to check out my eBay!



 Post subject: Re: PixelFX Morph
PostPosted: Fri Jun 18, 2021 9:32 pm 


User avatar

Joined: 15 Dec 2011
Posts: 525
Location: NYC
anexanhume wrote:
Smashbro29 wrote:
On paper doesn't this seem like the best upscaler? Am I missing something?

I would say it and the OSSC Pro seem very close.

Would you like to expand on that? I'm very interested in all the upcoming tech.


 Post subject: Re: PixelFX Morph
PostPosted: Fri Jun 18, 2021 10:42 pm 


User avatar

Joined: 15 Jul 2009
Posts: 317
Location: Canada
Smashbro29 wrote:
On paper doesn't this seem like the best upscaler? Am I missing something?


For me it's the support of LUTs. Whichever one implements them gets my money this round.


 Post subject: Re: PixelFX Morph
PostPosted: Fri Jun 18, 2021 10:56 pm 



Joined: 06 Mar 2021
Posts: 30
Smashbro29 wrote:
anexanhume wrote:
Smashbro29 wrote:
On paper doesn't this seem like the best upscaler? Am I missing something?

I would say it and the OSSC Pro seem very close.

Would you like to expand on that? I'm very interested in all the upcoming tech.


Consult the feature list in the OP here versus the OP of the OSSC Pro thread. You'll see that they're quite similar.


 Post subject: Re: PixelFX Morph
PostPosted: Sat Jun 19, 2021 1:39 am 


User avatar

Joined: 31 May 2021
Posts: 52
HDgaming42 wrote:
Smashbro29 wrote:
On paper doesn't this seem like the best upscaler? Am I missing something?


For me it's the support of LUTs. Whichever one implements them gets my money this round.


Can you explain which look-up tables you're referring to?


 Post subject: Re: PixelFX Morph
PostPosted: Sat Jun 19, 2021 2:36 am 


User avatar

Joined: 15 Jul 2009
Posts: 317
Location: Canada
VEGETA wrote:
HDgaming42 wrote:
Smashbro29 wrote:
On paper doesn't this seem like the best upscaler? Am I missing something?


For me it's the support of LUTs. Whichever one implements them gets my money this round.


Can you explain which look-up tables you're referring to?


Happy to! Essentially they're text files that describe how you would like colours interpreted or altered. They can be corrective or for effect.
https://www.studiobinder.com/blog/what-is-lut/

Good video showing what is possible with LUTs
https://3dlutcreator.com/

Free software to generate various LUT formats:
https://displaycal.net/

There was an FPGA box designed for this called the eeColor, which could hold multiple 65-point cube LUTs.
https://www.displaycalibrationtools.com ... low-guide/

It only supports HD.

Simple conversion boxes now hold 33pt LUTs
https://www.blackmagicdesign.com/ca/pro ... converters

You only need to "profile" a monitor once. Then you can generate multiple LUTs. You can have one to correct the imperfections in your set's reproduction of colour. You can have one to "convert" your set to D93 (arguably what 8 and 16bit Japanese games were programmed for). Another for North American D65. Maybe you have one to correct for PCEngine RGB. Maybe you really want to play SMB1 as Wario and change Red to Purple.

All possible with LUTs. Happy to expand upon anything you're interested in. :)
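For anyone wondering what "applying a LUT" actually does to a pixel, here's a toy sketch. A tiny identity lattice stands in for a real .cube file, and lookup snaps to the nearest lattice point; real processors interpolate between the surrounding points instead, so this is purely illustrative:

```python
# Toy 3D colour LUT: N lattice points per axis. Real LUTs use 17, 33,
# or 65 points per axis and are loaded from a .cube file.
N = 3

# Identity LUT: lattice point (r, g, b) maps to itself, scaled to 0-255.
lut = [[[(r * 255 // (N - 1), g * 255 // (N - 1), b * 255 // (N - 1))
         for b in range(N)]
        for g in range(N)]
       for r in range(N)]

def apply_lut(rgb):
    """Map an 8-bit RGB triple through the LUT, snapping to the nearest
    lattice point (hardware would interpolate the 8 surrounding points)."""
    r, g, b = (round(c / 255 * (N - 1)) for c in rgb)
    return lut[r][g][b]

print(apply_lut((255, 0, 127)))  # a colour on the lattice maps to itself
```

To do colour correction or a D93 conversion, you'd simply fill the lattice with measured/target values instead of the identity mapping; the lookup machinery stays the same.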


 Post subject: Re: PixelFX Morph
PostPosted: Sat Jun 19, 2021 5:24 am 



Joined: 19 Mar 2017
Posts: 81
Another interesting possibility with LUTs is to eliminate mild video noise for consoles with a small color palette by effectively reducing the color depth. The PC Engine with its 9-bit palette would be a good candidate for that, for example. Get it dialed in and it would be indistinguishable from a pure digital output.

I don't know how difficult it would be to implement in an upscaler like the Morph, but if it's feasible it would be a very cool and powerful feature indeed.
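The idea above amounts to quantizing each captured channel back to the console's native depth. A rough illustration (3 bits per channel matches the PC Engine's 9-bit palette; the sample pixel values are made up for the demo):

```python
# Snap each 8-bit channel of a captured pixel to the nearest level of a
# 3-bit-per-channel (9-bit total) palette, washing out mild analog
# noise. Illustrative only, not a measured PC Engine DAC response.
def snap_channel(value_8bit: int, bits: int = 3) -> int:
    levels = (1 << bits) - 1            # 7 non-zero levels for 3 bits
    level = round(value_8bit / 255 * levels)
    return round(level * 255 / levels)  # back to the 8-bit scale

noisy_pixel = (36, 141, 219)            # capture with slight analog noise
clean_pixel = tuple(snap_channel(c) for c in noisy_pixel)
print(clean_pixel)  # → (36, 146, 219)
```

A LUT can implement exactly this mapping since it's a fixed input-to-output colour function, which is why the PC Engine is such a natural candidate.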


 Post subject: Re: PixelFX Morph
PostPosted: Sat Jun 19, 2021 12:32 pm 


User avatar

Joined: 25 May 2014
Posts: 659
Sirotaca wrote:
I don't know how difficult it would be to implement in an upscaler like the Morph, but if it's feasible it would be a very cool and powerful feature indeed.

It depends on the size of the LUT - the more points you have on the edge of the cube, the larger your memory needs to be to hold all the values and it must be very high-speed, low-latency memory because you need to look up multiple values per pixel. If the feature is ever implemented, you should probably not expect something on the level of eeColor or even Lumagen's offerings (17x17x17 cube).

I'm pretty sure that a LUT would increase the latency by at least 4 pixels (1 for lookup, 3 for interpolation) ;)
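To put rough numbers on the memory argument, here's a back-of-the-envelope calculation, assuming 3 output channels each stored in a 16-bit word (the storage width is an assumption for illustration, not the spec of any particular device):

```python
# Memory needed to hold a full 3D LUT cube: points_per_axis cubed
# lattice entries, times 3 colour channels, times bytes per channel.
def lut_bytes(points_per_axis: int, bytes_per_channel: int = 2) -> int:
    return points_per_axis ** 3 * 3 * bytes_per_channel

for n in (17, 33, 65):
    print(f"{n}^3 cube: {lut_bytes(n) / 1024:.1f} KiB")
```

This gives roughly 29 KiB for a 17-point cube, 211 KiB for 33 points, and about 1.6 MiB for 65 points, which is why cube size drives the fast-memory requirement.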
_________________
GCVideo releases: https://github.com/ikorb/gcvideo/releases


 Post subject: Re: PixelFX Morph
PostPosted: Sat Jun 19, 2021 3:07 pm 


User avatar

Joined: 15 Jul 2009
Posts: 317
Location: Canada
Unseen wrote:
Sirotaca wrote:
I don't know how difficult it would be to implement in an upscaler like the Morph, but if it's feasible it would be a very cool and powerful feature indeed.

It depends on the size of the LUT - the more points you have on the edge of the cube, the larger your memory needs to be to hold all the values and it must be very high-speed, low-latency memory because you need to look up multiple values per pixel. If the feature is ever implemented, you should probably not expect something on the level of eeColor or even Lumagen's offerings (17x17x17 cube).

I'm pretty sure that a LUT would increase the latency by at least 4 pixels (1 for lookup, 3 for interpolation) ;)


Thanks for looking into this! I wonder if a 17pt cube would be enough to move from D65 to D93.

Another possibility (and bear with me here) is to offer SDI out as an option. The Decklink Mini converters can hold a 33-point cube and only cost $200 CAD (they take SDI in and output HDMI).

SDI timings are tight, and trying to use a LUT with legacy gaming hardware is really hard as the output of the OSSC was never close enough for the HDMI to SDI converters I tried. The world needs better ways to get video into UHD compatible SDI anyway--the Morph could fill that gap.

Maybe you could have a header for SDI ready to populate if there are licensing considerations? Have the output ready to toggle within the software after the end user has soldered on the output?

Then you'd be off the hook for LUTs! ;)


 Post subject: Re: PixelFX Morph
PostPosted: Sat Jun 19, 2021 3:44 pm 


User avatar

Joined: 25 May 2014
Posts: 659
HDgaming42 wrote:
Thanks for looking into this!

Just to clear up any potential confusion: Reading your post I'm under the impression that you seem to think I'm somehow involved with the PixelFX products. I am not, I'm just a reader of this forum who has a bit of background knowledge about video processing.
_________________
GCVideo releases: https://github.com/ikorb/gcvideo/releases


 Post subject: Re: PixelFX Morph
PostPosted: Sun Jun 20, 2021 4:01 am 


User avatar

Joined: 15 Jul 2009
Posts: 317
Location: Canada
Unseen wrote:
HDgaming42 wrote:
Thanks for looking into this!

Just to clear up any potential confusion: Reading your post I'm under the impression that you seem to think I'm somehow involved with the PixelFX products. I am not, I'm just a reader of this forum who has a bit of background knowledge about video processing.


Thanks for the clarification. Maybe the OP could be updated to include the members of PixelFX, which seem to be

Woozle64
citrus3000psi
chriz2600

Unless they go by other handles, it seems only Woozle is active here?


 Post subject: Re: PixelFX Morph
PostPosted: Sun Jun 20, 2021 8:42 am 



Joined: 27 Sep 2018
Posts: 203
Konsolkongen wrote:
Will the Morph offer some basic white balance adjustments? The Dreamcast with original Sega VGA box (maybe all VGA boxes?) has a slightly too green image. This can somewhat be corrected on the OSSC. I expect the OSSC Pro will offer the same options but it would be great if the Morph did too.

Sounds like a BT.601-into-sRGB problem, with a side of the CRT gamut being bigger than sRGB's. Anything vaguely modern will be expecting PC-style sRGB on its VGA input, not video-style BT.601.

vrunk11 wrote:
Is there a planned function for converting limited range to full range (switching between IRE 7.5 and IRE 0) for NTSC input?

In analog YUV/YIQ land there's indeed both IRE 0.0 and 7.5, but other than a slight variance in the voltage level of logical zero, there's no difference. Analog RGB has always been IRE 0.0, as far as I know.

The difference in analog voltage swing doesn't translate into the digital realm where, at least by official spec, 8-bit YCbCr should always be 16-235, regardless of what the analog side looked like. The "full range" 0-255 data in YUV is technically standards-breaking and support for it is functionally broken in many, many things, both software and hardware. When you're working in the RGB color space, however, virtually everything uses and expects the full 0-255 (but not always...).

This should also make it obvious why 10-bit color is a big deal for YUV: it's why 10-bit was part of the SDI specification back in the 80s, and why basically anything newer than D1 in the pro space uses it.
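The 16-235 versus 0-255 expansion mentioned above is just a linear rescale; a minimal sketch of the luma math:

```python
# Expand 8-bit limited-range luma (16-235) to full range (0-255),
# mapping 16 -> 0 and 235 -> 255, with out-of-range inputs clamped.
def limited_to_full(y: int) -> int:
    y = min(max(y, 16), 235)
    return round((y - 16) * 255 / 219)

print(limited_to_full(16))   # → 0
print(limited_to_full(235))  # → 255
print(limited_to_full(126))  # → 128 (mid grey)
```

Chroma uses a 16-240 range instead, and real converters differ in how they round and handle out-of-range "super-white"/"super-black" codes; this only shows the basic luma rescale.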

HDgaming42 wrote:
Another great usage, and this will likely get me in trouble, would be to play 8 and 16bit games in D93 the way the gods intended with a D93 LUT, while viewing other systems in D65. Or being able to instantly jump between the two to play in whatever you prefer.

It's not just a D93 white point for NTSC-J, but also PAL-style EBU phosphors defining the color space instead of SMPTE-C. Per the manual from one of Sony's newer PVMs, it does indeed look like NTSC-J is EBU + D93. This appears to be backed up by comments in Alexis Van Hurkman's The Color Correction Handbook: Professional Techniques for Video and Cinema.


HDgaming42 wrote:
Another possibility--and bare with me here--is to offer SDI out as an option.

If there ever ends up being a "pro" model of the Morph (as in sold to commercial customers via places like B&H), SDI I/O would be strongly desired in that market, along with BNC connectors and general mechanical hardening.


 Post subject: Re: PixelFX Morph
PostPosted: Sun Jun 20, 2021 1:09 pm 



Joined: 14 Aug 2017
Posts: 1576
energizerfellow‌ wrote:
It's not just a D93 white point for NTSC-J, but also PAL-style EBU phosphors defining the color space instead of SMPTE-C. Per the manual from one of Sony's newer PVMs, it does indeed look like NTSC-J is EBU + D93. This appears to be backed up by comments in Alexis Van Hurkman's The Color Correction Handbook: Professional Techniques for Video and Cinema.


That picture is simply from a Sony monitor manual; it shows what settings the monitor defaults to when a given region is selected, which the user can then change depending on the source material to be worked on. It is not necessarily representative of what was used generally in Japan in a given period.

What we can know is that many professional monitors, including Sony-made ones, used P-22 phosphors up to at least the 1990s, while there were also EBU ones later on, mainly on non-Sony monitors like Victor/JVC and Ikegami, but on a few older Sony ones as well. Sony, as the industry leader, itself switched to SMPTE-C for pro monitors sold in Japan from the mid/late 1990s up to their last CRTs in the mid 2000s.
edit: not so sure about the chronology here; it does look like there were times when these and other phosphor types co-existed in the Japanese market.

energizerfellow‌ wrote:
If there ever ends up being a "pro" model of the Morph (as in sold to commercial customers via places like B&H), SDI I/O would be strongly desired in that market, along with BNC connectors and general mechanical hardening.


This whole idea sounded funny to think about for a moment, but then I realized how much gaming and streaming is growing, and although retro is a small subset of that, it's actually not inconceivable that there can be a large enough professional space for it at some point down the road.


 Post subject: Re: PixelFX Morph
PostPosted: Mon Jun 21, 2021 8:25 am 


User avatar

Joined: 20 May 2021
Posts: 6
Location: Europe
energizerfellow‌ wrote:
In analog YUV/YIQ land there's indeed both IRE 0.0 and 7.5, but other than a slight variance in the voltage level of logical zero, there's no difference. Analog RGB has always been IRE 0.0, as far as I know.

The difference in analog voltage swing doesn't translate into the digital realm where, at least by official spec, 8-bit YCbCr should always be 16-235, regardless of what the analog side looked like. The "full range" 0-255 data in YUV is technically standards-breaking and support for it is functionally broken in many, many things, both software and hardware. When you're working in the RGB color space, however, virtually everything uses and expects the full 0-255 (but not always...).


But for the analog output it could be a good idea to have a full-range output with good blacks and colors. On the HDMI/DVI side, I'm not sure that limited range is handled properly when passed through other video processing (a switch, scaler, capture card, or DVI input), and we should consider that the conversion from limited range to full range has to be done somewhere, either in the TV or in the scaler.
The benefit of doing it in the scaler is that you control everything and don't rely on any other processing.
(And to use YUV444 input on a TV it's necessary to use PC mode, which disables all other processing.)


Space Pilot 3K template by Jakob Persson
Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group