OSSC Pro

The place for all discussion on gaming hardware
User avatar
vol.2
Posts: 2435
Joined: Mon Oct 31, 2016 3:13 pm
Location: bmore

Re: OSSC Pro

Post by vol.2 »

marqs wrote:CVBS/S-video would be fed into 1/2 channels of ISL51002 which would do line-locking and digitization just like for RGBS. The resulting signal obviously would not be directly usable as chroma is encoded in PAL/NTSC so you'd need to extract color burst, lock to it and finally do some DSP to decode color data. Not entirely trivial but should be still possible even with limitations of the ADC and FPGA.
What's the methodology from there? Are you virtualizing the analog color demod section of an NTSC chroma signal? Or is it more like virtualizing one of those all-in-one ICs that TVs began to use in the mid-to-late 80s? Or is it something entirely different?
User avatar
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: OSSC Pro

Post by VEGETA »

To give you an example, years ago in my day job we were checking the viability of developing an FPGA-based acceleration card, and the process included negotiations with the local Altera/Intel distributor. The quoted prices were around 20-30% of Digikey/Mouser if I remember correctly, assuming a yearly quota would be met. Those were higher-tier Arria FPGAs, though, and we're not getting quite as good a deal here, but you still should not take Digikey/Mouser FPGA pricing as a reference for batch production.
I know getting parts from the manufacturer is always better, but I didn't think it could be as little as 20% of the distributor price. And it wouldn't be like this unless you have very high quantities. If this is the case then yes, getting from the manufacturer is always better, since you also avoid separate shipping fees.
CVBS/S-video would be fed into 1/2 channels of ISL51002 which would do line-locking and digitization just like for RGBS. The resulting signal obviously would not be directly usable as chroma is encoded in PAL/NTSC so you'd need to extract color burst, lock to it and finally do some DSP to decode color data. Not entirely trivial but should be still possible even with limitations of the ADC and FPGA.
Oh, so this is the way then. I thought you could use an extra decoder module and then just feed the digital data into the FPGA scaler.
To my understanding, ADV740x is a family of basic CVBS/RGB decoders (later models also included a digital data input port), while ADV780x are mostly similar but have an added SDRAM interface for TBC, comb filtering etc. Then there is also the oddball budget model ADV7181D, which is the most recent of the bunch and thus possibly worth checking out if you can look past its limitations. Even if the datasheet mentions 240p, it doesn't guarantee the chip works with all sync types and quirks. For example, my old AE700 projector from 15+ years ago has an ADV7402 in it, and it didn't work with all SNES models (later ones don't have serration pulses). Whether the problem was in the chip configuration rather than in its capability is something I don't know, but I'd do some testing before jumping on using any one chip.
The most recent IC is the ADV7842, which seems to be an updated version of the ADV7441A used in the Framemeister. As everyone seems to agree, the FM deals perfectly well with 240p material.

I chose the ADV7800 for my design since it supports 1080p analog input. The ADV7842 seems better, but it is more expensive and requires that blasphemous HDCP license (or does it?) which I will never get xD. The ADV7800 seems old but is still in production... its dev board is non-existent, the ADV7403 board is a freaking $500+, while the ADV7842 has more features, is more recent, and has a mere $200 dev board.

I am still contacting them to find a solution cheap enough to test... Then I saw you taking the ADV7842 into consideration; why not use it? It supports all the formats you need and has a very cheap dev kit.

Thanks for your effort.
User avatar
marqs
Posts: 1034
Joined: Sat Dec 15, 2012 12:11 pm
Location: Finland

Re: OSSC Pro

Post by marqs »

vol.2 wrote:What's the methodology from there? Are you virtualizing the analog color demod section of an NTSC chroma signal? Or is it more like virtualizing one of those all-in-one ICs that TVs began to use in the mid-to-late 80s? Or is it something entirely different?
My thought was splitting luma & chroma (for CVBS) and doing demodulation in the digital domain, followed by CSC etc. I'm not an expert in this area, though, so I'd appreciate it if someone more familiar with the subject could take the lead in developing the feature (or tell us it's not feasible, if so).
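The digital-domain demodulation marqs describes can be sketched roughly like this (a toy NumPy model of the general technique, not anything from the OSSC Pro codebase): regenerate a subcarrier phase-locked to the color burst, quadrature-mix the chroma against it, and low-pass the products to recover the color-difference signals.

```python
import numpy as np

FSC = 3.579545e6          # NTSC color subcarrier (Hz)
FS = 4 * FSC              # sampling rate assumed here: 4x subcarrier

def demodulate_chroma(chroma, burst_phase):
    """Recover baseband U/V from one digitized line of chroma."""
    n = np.arange(len(chroma))
    # Regenerated subcarrier, phase-aligned to the measured color burst
    theta = 2 * np.pi * FSC / FS * n + burst_phase
    # Quadrature mix: the DC terms carry U/V, the 2*fsc terms get filtered out
    u = chroma * np.sin(theta)
    v = chroma * np.cos(theta)
    # Crude moving-average low-pass standing in for a proper FIR filter
    lp = np.ones(8) / 8
    return np.convolve(u, lp, 'same') * 2, np.convolve(v, lp, 'same') * 2

# Round-trip check: modulate a known U/V pair, then demodulate it
n = np.arange(1024)
theta = 2 * np.pi * FSC / FS * n
chroma = 0.3 * np.sin(theta) - 0.2 * np.cos(theta)
u, v = demodulate_chroma(chroma, burst_phase=0.0)
print(round(float(u[512]), 2), round(float(v[512]), 2))  # 0.3 -0.2
```

A real implementation would also need burst phase/amplitude measurement per line and a proper low-pass filter, but the mixing step itself is as simple as shown.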
VEGETA wrote:Oh, so this is the way then. I thought you could use an extra decoder module and then just feed the digital data into the FPGA scaler.
That would be the easy solution, but it'd require extra HW and come with the limitations these decoder chips typically have. Still, it might be the only viable solution, at least in the short term.
VEGETA wrote:The most recent IC is the ADV7842, which seems to be an updated version of the ADV7441A used in the Framemeister. As everyone seems to agree, the FM deals perfectly well with 240p material.
According to Analog, ADV7842 launched in 2010 while ADV7181D in 2011.
VEGETA wrote:I am still contacting them to find a solution cheap enough to test... Then I saw you taking the ADV7842 into consideration; why not use it? It supports all the formats you need and has a very cheap dev kit.
The primary goal was to find the most capable and flexible RGB digitizer. While the ADV7842 has CVBS and HDMI decoding built in, which would be nice, judging from the datasheet I'm not so sure whether its RGB digitization is best in class. It's also a prototype-unfriendly BGA chip and has unnecessary stuff like an SDRAM interface. With more time and resources I'd certainly have tested an eval board, but writing drivers for multiple chip candidates and debugging them is time-consuming and not that much fun.
User avatar
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: OSSC Pro

Post by VEGETA »

According to Analog, ADV7842 launched in 2010 while ADV7181D in 2011.
The ADV7181D is "not recommended for new designs", while the ADV7800, 7802, and similar are set to be discontinued soon. Now the only IC they fully recommend is the ADV7842, which seems the most capable too. The Framemeister uses the ADV7441A, which is obsolete and kind of the predecessor of the ADV7842. They confirmed the standard-definition processor of the ADV7842 is better than the 7441A's. Do you find the FM's handling of 240p and similar material to have flaws? The ADV7842 should be better than it.
judging from the datasheet I'm not so sure whether its RGB digitization is best in class.
How did you come to this conclusion from the datasheet? You can check the design support materials.
It's also prototype-unfriendly BGA chip and has unnecessary stuff like SDRAM interface.
BGA is a pain, but nowadays I see people reballing them on YouTube, plus you can let your assembler do it. You will for sure make a test board with lots of 0402 parts that are unsolderable by hand anyway; just add the BGA to that.

SDRAM is not necessary; you can ditch it altogether. It's only required if you want the 3D comb filter and similar stuff, which you will be doing in your scaler anyway. I guess there is a PDF for the ADV7800 showing "2D operation", which is without RAM.
With more time and resources I'd had certainly tested an eval board, but writing drivers for multiple chip candidates and debugging them is time-consuming and not that much fun.
Their eval board comes with its own software for free. I guess you just feed it script files, which they provide too... I still haven't gotten one myself.
User avatar
marqs
Posts: 1034
Joined: Sat Dec 15, 2012 12:11 pm
Location: Finland

Re: OSSC Pro

Post by marqs »

VEGETA wrote:They confirmed the standard-definition processor of the ADV7842 is better than the 7441A's. Do you find the FM's handling of 240p and similar material to have flaws? The ADV7842 should be better than it.
I've had my fair share of issues with the FM and sync detection etc., but not all of them are necessarily the ADV7441A's fault.
VEGETA wrote:How did you come to this conclusion from the datasheet? You can check the design support materials.
For example, it seems to lack features such as coast adjustment and an external sampling clock. Ideally those would not be needed, but after working with video digitizers I've become quite dubious about automation working flawlessly, so I welcome all knobs. Clamp and ALC controls also seemed thin compared to some other digitizers, unless I failed to notice them in the 500+ page manual.
VEGETA wrote:Their eval board comes with its own software for free. I guess you just feed it script files, which they provide too... I still haven't gotten one myself.
Feel free to list the upsides/downsides of the ADV7842 / ADV7800 if you end up evaluating them. The community needs a good reference on video digitizers, since nobody has publicly reviewed / tested them. New chip designs are no longer being made and production of many old ones has ended (or is ending soon), so now would be a good time to compile a summary of the still-available ones. I could fill in details for the TVP7002 and ISL51002, as I've mostly used them.
Nrg
Posts: 45
Joined: Sun Aug 30, 2015 8:36 pm

Re: OSSC Pro

Post by Nrg »

Here's an example of manually decoding the S-Video chroma/color signal using a Terasic FPGA board and a simple THDB-ADA ADC board (no video decoder IC is used at all):
- part1: https://www.youtube.com/watch?v=TGCjMlYM594
- part2: https://www.youtube.com/watch?v=HgMyCpVtBCk
- part3: https://www.youtube.com/watch?v=XFzlU050uH4

part3 description:
- "FPGA(Terasic C5G Cyclone V GX Starter Kit) samples NTSC(S-Video) signales(Y,C) at 14.318MHz/14bits(THDB-ADA), obtains R-Y, B-Y signales using color-burst signal directly (No PLL is used to decoding the color) and displays deocded NTSC-video on an LCD."

blog entry from the author: http://debuota23.blog106.fc2.com/blog-entry-30.html
I asked a couple of years ago if the author had plans to open-source the implementation, but unfortunately that was not the plan :(

Anyway, as those videos show, it's obviously possible to do PAL/NTSC decoding without video decoder ICs. I've been planning to try the same myself using an FPGA+ADC ("how hard can it be" :) ), but so far I haven't had enough time for it..
AaronSR
Posts: 84
Joined: Mon Aug 14, 2017 1:01 am

Re: OSSC Pro

Post by AaronSR »

Hi, not sure if the original OSSC can output a padded/pillarbox resolution or if the Pro will be able to? I own a monitor that has no way to force 4:3 on its own (the option is greyed out unless I use DisplayPort).
nmalinoski
Posts: 1974
Joined: Wed Jul 19, 2017 1:52 pm

Re: OSSC Pro

Post by nmalinoski »

AaronSR wrote:Hi, not sure if the original OSSC can output a padded/pillarbox resolution or if the Pro will be able to? I own a monitor that has no way to force 4:3 on its own (the option is greyed out unless I use DisplayPort).
Pillarboxing and windowboxing aren't possible on the original OSSC; the OSSC Pro should be able to do that.

Since aspect ratio adjustment is available with DP, perhaps look into an HDMI to DP converter?
AaronSR
Posts: 84
Joined: Mon Aug 14, 2017 1:01 am

Re: OSSC Pro

Post by AaronSR »

That's an idea; would that potentially introduce lag though? Thanks, I'll order one and test it. It's fine for HDMI/PC where you can easily set 4:3, but not for my console stuff; I kinda wanted to use it as both a TV and a monitor.

But either way I am looking to grab an OSSC Pro when they're available; I've always needed something that could deal with 240p/480i switches, and now I can finally retire the cheap SCART-to-HDMI converter that I've used for years.
User avatar
parodius
Posts: 720
Joined: Wed Jan 26, 2005 5:54 am
Location: Singapore

Re: OSSC Pro

Post by parodius »

Edit : woops wrong thread
ZellSF
Posts: 2642
Joined: Mon Apr 09, 2012 11:12 pm

Re: OSSC Pro

Post by ZellSF »

AaronSR wrote:That's an idea; would that potentially introduce lag though? Thanks, I'll order one and test it. It's fine for HDMI/PC where you can easily set 4:3, but not for my console stuff; I kinda wanted to use it as both a TV and a monitor.
Remember HDMI to DP adapters are directional. A DP to HDMI adapter will not work, and those are way more common.

An HDMI to DP adapter shouldn't introduce lag, but your monitor might process DP and HDMI differently, so you could end up with different amounts of lag that way. I bought one to reduce lag, since my monitor has 1 frame more lag on its HDMI ports than it does on its DP ports.

Also the adapter I bought (this one if you're curious) doesn't like the OSSC so compatibility might be an issue.
User avatar
Josh128
Posts: 2123
Joined: Thu Jan 16, 2014 9:01 am

Re: OSSC Pro

Post by Josh128 »

Marqs, any updates on availability timeframe?
User avatar
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: OSSC Pro

Post by VEGETA »

I forgot to ask about power supply filtering in this design. I don't see much going on there; I assume people will buy their own 12V power brick or something, right? That would need good filtering to eliminate high-frequency noise.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: OSSC Pro

Post by fernan1234 »

Has the Pro been confirmed to use 12V power? I was hoping it would be 5V like the OSSC Classic, since I've actually gotten the best noise-free results using USB power hubs with USB-to-barrel-plug cables. This is also more convenient than having another wall wart taking up one or more outlets.
User avatar
marqs
Posts: 1034
Joined: Sat Dec 15, 2012 12:11 pm
Location: Finland

Re: OSSC Pro

Post by marqs »

Nrg wrote:Here's an example of manually decoding the S-Video chroma/color signal using a Terasic FPGA board and a simple THDB-ADA ADC board (no video decoder IC is used at all):
- part1: https://www.youtube.com/watch?v=TGCjMlYM594
- part2: https://www.youtube.com/watch?v=HgMyCpVtBCk
- part3: https://www.youtube.com/watch?v=XFzlU050uH4

part3 description:
- "FPGA(Terasic C5G Cyclone V GX Starter Kit) samples NTSC(S-Video) signales(Y,C) at 14.318MHz/14bits(THDB-ADA), obtains R-Y, B-Y signales using color-burst signal directly (No PLL is used to decoding the color) and displays deocded NTSC-video on an LCD."

blog entry from the author: http://debuota23.blog106.fc2.com/blog-entry-30.html
I asked a couple of years ago if the author had plans to open-source the implementation, but unfortunately that was not the plan :(

Anyway, as those videos show, it's obviously possible to do PAL/NTSC decoding without video decoder ICs. I've been planning to try the same myself using an FPGA+ADC ("how hard can it be" :) ), but so far I haven't had enough time for it..
I wonder if the sampling clock (14.318MHz) being a multiple of the NTSC subcarrier (4x 3.579545MHz) is a hard requirement for color decoding to work in this design, or just a selected value. In the former case it wouldn't really allow freeform sampling and thus would not be superior to the common decoder ICs.
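For context on why 4x-subcarrier sampling is such a popular choice (a toy illustration of the general trick, not anything from this particular design): with fs = 4*fsc, the regenerated carrier's sine and cosine take only the values 0, +1 and -1 at the sample points, so quadrature demodulation reduces to sign flips and sample selection instead of full multiplies.

```python
import math

FSC = 3.579545e6  # NTSC color subcarrier (Hz)
FS = 4 * FSC      # sampling at exactly 4x the subcarrier

# Carrier values at the first 8 sample instants: they land exactly on the
# I/Q axes, so no multiplier or NCO is needed for the mix-down
sin_ref = [round(math.sin(2 * math.pi * FSC / FS * n)) for n in range(8)]
cos_ref = [round(math.cos(2 * math.pi * FSC / FS * n)) for n in range(8)]
print(sin_ref)  # [0, 1, 0, -1, 0, 1, 0, -1]
print(cos_ref)  # [1, 0, -1, 0, 1, 0, -1, 0]
```

At any other (freeform) sampling rate the carrier values fall between the axes, so you need a numerically controlled oscillator and real multipliers, which is presumably the trade-off marqs is weighing.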
AaronSR wrote:Hi, not sure if the original OSSC can output a padded/pillarbox resolution or if the Pro will be able to? I own a monitor that has no way to force 4:3 on its own (the option is greyed out unless I use DisplayPort).
Pillarboxing can be done, no problem there.
Josh128 wrote:Marqs, any updates on availability timeframe?
We're still trying to launch this year.
fernan1234 wrote:Has the Pro been confirmed to use 12V power? I was hoping it would be 5V like the OSSC Classic, since I've actually gotten the best noise-free results using USB power hubs with USB-to-barrel-plug cables. This is also more convenient than having another wall wart taking up one or more outlets.
It has a 5V barrel plug input like before, but practically all the ADCs and logic use 3.3V or below, so the input is not fed directly to any sensitive parts. I would still recommend using the 2.5A PSU that we plan to include with the board - you are likely just asking for trouble if you use an underpowered USB hub or a thin USB cable.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: OSSC Pro

Post by fernan1234 »

marqs wrote:It has a 5V barrel plug input like before, but practically all the ADCs and logic use 3.3V or below, so the input is not fed directly to any sensitive parts. I would still recommend using the 2.5A PSU that we plan to include with the board - you are likely just asking for trouble if you use an underpowered USB hub or a thin USB cable.
I see, that's good to know. I use a power strip with USB outputs that seems to follow the USB spec of 5V 2A, and I've never had issues with the OSSC Classic, whereas I actually did run into trouble using the PSU included with the VGP kit.
User avatar
ThaPhatCat
Posts: 3
Joined: Fri Jan 08, 2021 7:12 pm

Re: OSSC Pro

Post by ThaPhatCat »

Hi marqs : ) Please forgive me if this question was asked before; I only follow progress on the OSSC Pro every month or so.

Is there enough hardware on the FPGA to do a 3D LUT to match CRT "neon" colors (for lack of a better term) - a simulation of phosphor glow / a CRT's "color volume"? In the 90s, scalers and line doublers that used ASICs to do their thing cost an arm and a leg, but now we can do it with FPGAs - has something like this happened with 3D LUTs?

I've included some pictures from the @CRTpixels Twitter account to show how CRTs shift colors - to explain what I mean. The CRT colors look less cheap and less like a kids' cartoon and more... mature.

https://twitter.com/CRTpixels/status/14 ... 10593?s=19
https://twitter.com/CRTpixels/status/14 ... 48163?s=19
https://twitter.com/CRTpixels/status/14 ... 13603?s=19
https://twitter.com/CRTpixels/status/14 ... 13125?s=19
https://twitter.com/CRTpixels/status/14 ... 54976?s=19
https://twitter.com/CRTpixels/status/14 ... 01572?s=19


In the link below, there are photos from a computer LUT shader. They’re created from official color profiles distributed with CRT monitor drivers.

https://forums.libretro.com/t/crt-color ... ndai/18496

https://imgur.com/a/PGMWPgz
User avatar
marqs
Posts: 1034
Joined: Sat Dec 15, 2012 12:11 pm
Location: Finland

Re: OSSC Pro

Post by marqs »

ThaPhatCat wrote:Is there enough hardware on the FPGA to do a 3D LUT to match CRT "neon" colors (for lack of a better term) - a simulation of phosphor glow / a CRT's "color volume"? In the 90s, scalers and line doublers that used ASICs to do their thing cost an arm and a leg, but now we can do it with FPGAs - has something like this happened with 3D LUTs?
It comes down to capacity and performance requirements. If you'd use the full 24-bit value from the ADC as the address, the LUT would have 16.7M entries (of 24 bits each). There's 256MB of DRAM on board, so it wouldn't consume too big a portion of that, and the address space could easily be reduced for many classic consoles. Another matter is performance, since the lookup would need to be done for each sampled pixel. For low-res sources that shouldn't be a problem, but for inputs with high pixel clocks (>100MHz) it might cause contention with other blocks accessing DRAM (frame buffer, deinterlacer etc.).
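The capacity arithmetic above, made explicit (packing each 24-bit entry into 3 bytes is an assumption; real hardware might use 4 bytes per entry for alignment):

```python
# Full 3D LUT sizing: every possible 24-bit RGB input maps to one 24-bit output
ADDRESS_BITS = 24            # full 24-bit RGB value used as the LUT address
ENTRY_BYTES = 3              # 24-bit color result, assumed packed into 3 bytes

entries = 2 ** ADDRESS_BITS  # 16,777,216 entries
lut_bytes = entries * ENTRY_BYTES
print(f"{entries:,} entries -> {lut_bytes / 2**20:.0f} MB of the 256 MB DRAM")
```

So a full-resolution LUT needs 48 MB; halving the address space to 21 bits (7 bits per channel) would already shrink it eightfold, which is why reducing the address space for classic consoles is attractive.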
User avatar
ThaPhatCat
Posts: 3
Joined: Fri Jan 08, 2021 7:12 pm

Re: OSSC Pro

Post by ThaPhatCat »

marqs wrote:
ThaPhatCat wrote:Is there enough hardware on the FPGA to do a 3D LUT to match CRT "neon" colors (for lack of a better term) - a simulation of phosphor glow / a CRT's "color volume"? In the 90s, scalers and line doublers that used ASICs to do their thing cost an arm and a leg, but now we can do it with FPGAs - has something like this happened with 3D LUTs?
It comes down to capacity and performance requirements. If you'd use the full 24-bit value from the ADC as the address, the LUT would have 16.7M entries (of 24 bits each). There's 256MB of DRAM on board, so it wouldn't consume too big a portion of that, and the address space could easily be reduced for many classic consoles. Another matter is performance, since the lookup would need to be done for each sampled pixel. For low-res sources that shouldn't be a problem, but for inputs with high pixel clocks (>100MHz) it might cause contention with other blocks accessing DRAM (frame buffer, deinterlacer etc.).
I uploaded a photo for you:

https://ibb.co/JKKMN4M

It's a comparison of a real 20" Sony Trinitron CRT TV that I quickly overlaid next to the RetroTink 5X's new Aperture Grille Scanline Filter - I believe that if the colors were matched via 3D LUT on an OLED TV with BFI, then from a normal viewing distance the motion resolution, input lag, colors, contrast, and aperture grille anti-aliasing effect would be indistinguishable from a real consumer CRT TV.

And the geometry would be better. I'm not a PVM / BVM person (BVM scanlines are bigger than the actual lines); I like their better geometry, but I use a line doubler on a high-end CRT to achieve a no-scanlines effect, just like an arcade CRT.

Do you think the OSSC Pro is capable of a 3D LUT with 240p 5x scaling (to match a CRT TV's overscan) WITH artificial scanlines *at the same time*??? That would save people thousands of dollars spent on Sony L5 PVMs / BVMs and recapping them.
User avatar
maxtherabbit
Posts: 1763
Joined: Mon Mar 05, 2018 4:03 pm

Re: OSSC Pro

Post by maxtherabbit »

ThaPhatCat wrote:That would save people thousands of dollars spent on Sony L5 PVMs / BVMs and recapping them.
People who invest that kind of money in high-end CRTs do it because they want the genuine article. No one is going to jump ship to a "CRT simulation" no matter how good it looks.
strayan
Posts: 671
Joined: Sun Mar 19, 2017 8:33 pm

Re: OSSC Pro

Post by strayan »

maxtherabbit wrote:
ThaPhatCat wrote:That would save people thousands of dollars spent on Sony L5 PVMs / BVMs and recapping them.
People who invest that kind of money in high-end CRTs do it because they want the genuine article. No one is going to jump ship to a "CRT simulation" no matter how good it looks.
I am certainly willing to consider offloading my 21 inch CRTs (probably just garage them just in case :mrgreen: ) if it looks as good as I think it might on my OLED PVM: https://pro.sony/s3/cms-static-content/ ... 477047.pdf
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: OSSC Pro

Post by fernan1234 »

And saying that the BFI on LG's WOLEDs would be "indistinguishable" from a CRT for motion clarity is going quite a bit too far, and that's not counting the disadvantages introduced by BFI.
strayan wrote:I am certainly willing to consider offloading my 21 inch CRTs (probably just garage them just in case ) if it looks as good as I think it might on my OLED PVM: https://pro.sony/s3/cms-static-content/ ... 477047.pdf
A Sony pro OLED, on the other hand, will get you much closer because it uses scan driving rather than BFI, and without brightness loss. Not 100% like a CRT, but it's the best persistence/sample-and-hold blur solution available to date. These are 1080p panels, so there's no additional scaling with 4X and 5X outputs (actually 3X and 2X too, since these monitors can show 480p and 720p pixel-matched, unscaled).
User avatar
VEGETA
Posts: 425
Joined: Mon May 31, 2021 10:40 am

Re: OSSC Pro

Post by VEGETA »

It has 5V barrel plug input like before, but practically all ADCs and logic use 3.3V or below so the input is not fed directly to any sensitive parts. I would still recommend using the 2.5A PSU that we plan to include with the board - you are likely to just ask for trouble if you use an underpowered USB hub or a thinny USB cable.
Hmmm, I saw you using the LP5912 LDO, which won't be able to filter out much high-frequency noise. PSRR is rated at very low frequencies, say 1-50 kHz, while most SMPS switch a lot higher than that, 1 MHz being very common and sometimes more, especially if the supply is small (= small inductors). The LP5912's noise rejection figure looks good only up to 10 kHz, if not less; at 100 kHz it is down to 40 dB.

What are the noise and ripple generated by the wall PSU you'll supply? At what frequency does it operate? You'd have to check it on an oscilloscope to verify, but it is known that LDOs will not filter noise/ripple at higher frequencies.

It doesn't matter whether your sensitive parts use the 5V supply or not, since the 3.3V is generated from it directly without proper filtering (LC, pi filters, etc...). I think you know all this and took it into consideration; I just want to point it out. I am really interested in the PSU you are going to supply with the device in terms of price, origin, and most importantly its noise/ripple specs. Chinese stuff is almost always very bad in this regard.

If you still take suggestions, then I suggest getting an AC-DC power module from Mean Well, which will be <$10 and will switch at a frequency below 100 kHz (maybe 65 kHz), which is easily attenuated by that good LDO of yours.

I am really pumped for the pro this year!
strayan
Posts: 671
Joined: Sun Mar 19, 2017 8:33 pm

Re: OSSC Pro

Post by strayan »

Not to mention having the ability to correctly reproduce the color spaces of the ITU-709, EBU, and SMPTE-C standards, essentially zero input lag (it measures 16ms at the bottom of the screen with my TS), and being painfully luminous even at the lowest setting.

:mrgreen: :mrgreen: :mrgreen:
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: OSSC Pro

Post by fernan1234 »

strayan wrote:Not to mention having the ability to correctly reproduce the color spaces of the ITU-709, EBU, and SMPTE-C standards, essentially zero input lag (it measures 16ms at the bottom of the screen with my TS), and being painfully luminous even at the lowest setting.

:mrgreen: :mrgreen: :mrgreen:

That too and many other things. Totally outside of their designated purposes, but they remain the best flat panels that one can buy for retro gaming. Ironically they were not compatible with the OSSC past 2X mode, but the Pro should take care of that issue.
H6rdc0re
Posts: 224
Joined: Tue Jan 17, 2017 8:22 pm

Re: OSSC Pro

Post by H6rdc0re »

fernan1234 wrote:And saying that the BFI on LG's WOLEDs would be "indistinguishable" from a CRT for motion clarity is going quite a bit too far, and that's not counting the disadvantages introduced by BFI.
If you play in a light-controlled environment, or even better a dark room, then the brightness loss is compensated for. Motion is perfect though, and comparable to a CRT. Input lag is also minimal.

Throw in a Sinden Lightgun and a RetroTink 5X with its new awesome scanlines, and the reasons for CRTs start to fade.
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: OSSC Pro

Post by fernan1234 »

H6rdc0re wrote:If you play in a light-controlled environment, or even better a dark room, then the brightness loss is compensated for. Motion is perfect though, and comparable to a CRT. Input lag is also minimal.

Throw in a Sinden Lightgun and a RetroTink 5X with its new awesome scanlines, and the reasons for CRTs start to fade.

Sounds like you don't like using "scanlines", but for people who do need to use them, especially at 100%, it's basically a deal-breaker, because that's an additional 50% brightness loss on top of the 50% loss from 60Hz BFI, which is what you need for the most optimal motion clarity. And how close that gets you to "comparable to a CRT" seems to vary by perception; for me it's not that close, and according to measurements it's basically 1/4 as clear as a CRT, which is not bad but still a ways to go IMO.

I do agree that for many people commercial OLEDs + modern scalers can be quite a satisfactory alternative to CRTs.
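The brightness arithmetic in the post above, spelled out: each effect multiplies the light that remains, so the two 50% losses compound.

```python
# Compounding brightness losses from BFI plus 100% scanlines
bfi_60hz = 0.5        # full 60Hz BFI: panel dark for half of each frame
scanlines_100 = 0.5   # 100% scanlines: half the lines fully dark
remaining = bfi_60hz * scanlines_100
print(f"{remaining:.0%} of the panel's original brightness left")  # 25%
```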
User avatar
orange808
Posts: 3196
Joined: Sat Aug 20, 2016 5:43 am

Re: OSSC Pro

Post by orange808 »

fernan1234 wrote:
H6rdc0re wrote:If you play in a light-controlled environment, or even better a dark room, then the brightness loss is compensated for. Motion is perfect though, and comparable to a CRT. Input lag is also minimal.

Throw in a Sinden Lightgun and a RetroTink 5X with its new awesome scanlines, and the reasons for CRTs start to fade.

Sounds like you don't like using "scanlines", but for people who do need to use them, especially at 100%, it's basically a deal-breaker, because that's an additional 50% brightness loss on top of the 50% loss from 60Hz BFI, which is what you need for the most optimal motion clarity. And how close that gets you to "comparable to a CRT" seems to vary by perception; for me it's not that close, and according to measurements it's basically 1/4 as clear as a CRT, which is not bad but still a ways to go IMO.

I do agree that for many people commercial OLEDs + modern scalers can be quite a satisfactory alternative to CRTs.
Which specific OLED displays are you referencing? I agree that nothing really looks like a CRT. Brightness can be an issue in bad lighting. I'm fortunate and I have a room with ceiling mounted "track lighting" to accommodate my displays and beamers. Of course, flicker also bothers some people.

I ask because Sony's rolling scan in their OLED professional monitors was quite good. I think the motion resolution may be better than a CRT and the clarity can be jarring--especially when you're playing a familiar game. The biggest limitation on OLED motion clarity may be manufacturers that don't care. Although, it's entirely possible that rolling scan looks very bad on a large screen.
We apologise for the inconvenience
fernan1234
Posts: 2175
Joined: Mon Aug 14, 2017 8:34 pm

Re: OSSC Pro

Post by fernan1234 »

orange808 wrote:Which specific OLED displays are you referencing? I agree that nothing really looks like a CRT. Brightness can be an issue in bad lighting. I'm fortunate and I have a room with ceiling mounted "track lighting" to accommodate my displays and beamers. Of course, flicker also bothers some people.

I ask because Sony's rolling scan in their OLED professional monitors was quite good. I think the motion resolution may be better than a CRT and the clarity can be jarring--especially when you're playing a familiar game. The biggest limitation on OLED motion clarity may be manufacturers that don't care. Although, it's entirely possible that rolling scan looks very bad on a large screen.
First, it's good to distinguish between the different meanings of "OLED", because the technologies are quite different. You have LG's WRGB OLED (WOLED), which is bottom-emission OLED that uses color filters and big white pixels to try to compensate for brightness and color limitations. You have the newer QD-OLED from Samsung, which is blue OLED with a QD layer filter for color. And then there's the top-emission, unfiltered true RGB OLED used in Sony's PVMs (now discontinued) and BVMs. That's also the order in terms of quality (with LG's at the bottom, but also the most affordable, obviously).

Regarding LG's panels, I don't think BFI flicker at 60Hz (which is what you want to simulate a standard CRT refresh rate) should be bothersome for anyone, especially from the normal viewing distance that these large TVs require. For 50Hz I can see it being more of a problem. On the newer CX and C1 models with 120Hz refresh rates you can adjust the BFI duty cycle more, but while anything other than full BFI @ 60Hz (OLED Motion Pro at the max setting) will reduce brightness less, it will also be less effective at clearing motion with our retro games that target 60fps or a fraction of it (it can be excellent for 120fps games though). Speaking only about "legacy" 60fps or ~30fps content, at that max setting the motion clarity is 25% that of a CRT, which again is pretty good, but you can tell a big difference if you have a CRT next to it. Rolling scan on these panels would actually not look bad; in fact the lower Motion Pro settings kind of look like it (though they're moving quarter frames rather than an actual rolling bar), which ends up being less noticeable than the full-frame flicker.

I don't think anyone has taken accurate motion clarity measurements of Sony's rolling scan. BlurBusters has an article with some measurements, but I don't think they're correct. To my eyes, my guesstimate is that it gets you about halfway toward CRT motion clarity, which is pretty great.
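A rough persistence (MPRT) calculation behind the clarity figures being thrown around in this thread: perceived motion blur scales with how long each frame stays lit. The CRT number below is an assumed effective phosphor persistence, not a measurement.

```python
# Persistence comparison: 60Hz full BFI on a sample-and-hold panel vs CRT
frame_ms = 1000 / 60            # one 60Hz frame: ~16.7 ms
bfi_hold_ms = frame_ms * 0.5    # full BFI: lit for half the frame -> ~8.3 ms
crt_ms = 2.0                    # assumed effective CRT phosphor persistence

print(round(bfi_hold_ms, 1))            # 8.3
print(round(crt_ms / bfi_hold_ms, 2))   # 0.24, i.e. roughly "1/4 as clear"
```

A rolling scan that lights each line for only a couple of milliseconds would close most of the remaining gap, which matches the "about halfway or better" impressions of Sony's pro OLEDs.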
User avatar
orange808
Posts: 3196
Joined: Sat Aug 20, 2016 5:43 am

Re: OSSC Pro

Post by orange808 »

I assume the write-up is for other readers? Anyhow, any good motion technology should be frame-locked to the input. For instance, I have a projector here built for simulation. It locks "BFI" to the input frame rate. Two bulbs ensure brightness, but nothing stops a bit of flicker.

Just out of curiosity, are you judging the motion clarity of a Sony PVM from that article or have you purchased one?