Retro Consoles - 0 or 7.5 IRE for Black?

fernan1234
Posts: 2182
Joined: Mon Aug 14, 2017 8:34 pm

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by fernan1234 »

The moment you switch to 9300K everything will feel too blue, but our eyes naturally adjust to the new white balance within a few minutes. And if you then switch back to 6500K it will immediately feel too yellow. So give it a try for at least half an hour and see how you like it.
NewSchoolBoxer
Posts: 369
Joined: Fri Jun 21, 2019 2:53 pm
Location: Atlanta, GA

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by NewSchoolBoxer »

I'm starting to get interested in this thread. Japanese consoles are en route since they cost half what American ones do. Realized my 20L2 has IRE 0 and 7.5 settings + an additional "SMPTE" for Component. Color temperature settings for D56, 65, 93 and a custom USER, which obviously equate to 5600/6500/9300K. At least RGB has no IRE setting to worry about.

I have a long post in Saturn + OSSC thread. Summary is American SNES and PS2 models I tested were using IRE 7.5 as expected.

The best example is Vegas Stakes showing more detail in the bottom quarter of the screen with IRE 0 versus IRE 7.5: https://youtu.be/HjTnwJdDwHs
If the console were really outputting IRE 0, then my understanding is blacks would no longer be crushed and the image would be darker overall, versus actually brighter, when switching from IRE 7.5 (see the sketch below).
It's going to be messy if the right IRE setting can really be game dependent. Obviously not every game on analog consoles was made in Japan, and I think Super Metroid and Mario Paint are bit-for-bit identical between the US and Japan releases.
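
To make that concrete, here's a quick numeric sketch (my own numbers, assuming standard NTSC-M with 714 mV at 100 IRE and a monitor that linearly maps its assumed black-to-white range):

Code: Select all

# Quick numeric sketch of the IRE mismatch logic.
# Assumes NTSC-M levels: 714 mV at 100 IRE; monitor maps assumed black..white to 0..1.
IRE_MV = 714 / 100  # ~7.14 mV per IRE

def displayed(source_ire, monitor_setup_ire):
    """Normalize a source level against the black/white points the monitor assumes."""
    black = monitor_setup_ire * IRE_MV
    white = 100 * IRE_MV
    return (source_ire * IRE_MV - black) / (white - black)

# Console adds the 7.5 IRE pedestal, monitor set to IRE 0: blacks float, picture brighter.
print(displayed(7.5, 0.0))   # ~0.075 instead of 0.0
# Console outputs IRE 0, monitor set to IRE 7.5: dark detail below 7.5 IRE is crushed.
print(displayed(5.0, 7.5))   # negative -> clipped to black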

I found a useful chart on page 14 of the manual for the circa-2011 OLED PVM-740. A cool $2725 MSRP :mrgreen: got you 7.4″ at 960×540 resolution, so not even 720p: https://pro.sony/en_AO/support-resources/pvm-740/manual

Region                                                        Color Temp   NTSC Setup   Color Space
North America                                                 D65          7.5          SMPTE-C
Latin America - PAL and PAL-N / Argentina, Paraguay, Uruguay  D65          0            EBU
Latin America - NTSC and PAL-M / Brazil                       D65          7.5          SMPTE-C
Africa, Australasia, Europe, Middle East                      D65          0            EBU
Asia Except Japan - NTSC                                      D65          7.5          SMPTE-C
Asia Except Japan - PAL                                       D65          0            EBU
Japan - PAL                                                   D93          0            EBU


Color temperature is all Japan's fault. Putting together two message board posts, it seems Japan used 9300K between 1985 and 2011, which you could use to argue the Famicom (released 1983, before the switch) targets D65.
I also found a message board claim that Japanese PS1 SCPH-XXX3 models use IRE 7.5.
fernan1234
Posts: 2182
Joined: Mon Aug 14, 2017 8:34 pm

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by fernan1234 »

@NewSchoolBoxer:

1. As mentioned before, all of the IRE doubts and hesitations can be avoided completely by simply using RGB or YPbPr when available (that's the reason why your PVM has no IRE setting for RGB/YPbPr; it doesn't apply to them because they're not NTSC signals). If you insist on or have to use NTSC signals, then you'll have to play it by ear, because it is all indeed a mess within and across systems.
2. Component level can be left as SMPTE for virtually all content. The Betacam 0 and 7.5 settings are for, you guessed it, Betacam footage.
3. For Famicom you want IRE 0 if using its original NTSC output. Unsure about some American-developed games, but who cares about those? :P
4. For temperature/white balance, just use whatever looks nicer to you, D65 or D93 it really doesn't matter for these old games.
5. Pretty sure I told you this before in another topic, but SMPTE-C and EBU color spaces/phosphors came late into the CRT game and were only really used for broadcast content and film-to-video conversion. Game content was likely developed/referenced on P-22 or some older phosphor CRTs.
NewSchoolBoxer
Posts: 369
Joined: Fri Jun 21, 2019 2:53 pm
Location: Atlanta, GA

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by NewSchoolBoxer »

Thank you, fernan1234, for responding with expert knowledge. I get that RGB is outside the NTSC/PAL encoding system, but I like being scientific. I stream in S-Video since it's effortless to capture and I don't have to move my CRT.

IRE 0 for Famicom is helpful, and I can easily confirm it with an American NES. D65 vs D93 is a minor difference, like you're saying.
I've seen your SMPTE-C and EBU comment before, and again that's helpful. It was directed at someone else I think, but no matter.

SMPTE looks 100% correct for Mega Man X7 and Final Fantasy X on PS2. Beta 7.5 is too dark, 0 even worse, especially with greens. However, Beta 7.5 looked absolutely correct on PS1 Ogre Battle, which was ported from SNES. Setup is the PVM-20L2 on the right, with its video out going to the I'Art 20F703 on the left: https://imgur.com/a/KEx6tzh

The reds and burgundies are too bright and saturated in SMPTE compared to the I'Art and SNES version. Trying to come up with an explanation, I found the IRE and Component Level information in the manual of the M4 that 22point8 posted and my L2: https://imgur.com/a/MwbsYWW

My theory is that 240p games were developed with the intention of the player using the bundled Composite cable. Devs picked an IRE, and when porting these games to another analog system it was easy to preserve the NTSC (IRE) coloring rather than code logic for SMPTE levels. PS1 lacks YPbPr after all. You can see in the manual pic that the SMPTE 100% white voltage of 700 mV is too low for the 714 mV of the NTSC and Beta 7.5 levels. Basically, I'm crushing whites. Burgundy looked equally wrong in Final Fantasy VII.
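
Back-of-the-envelope on that mismatch (the 700/714 mV levels are from the manual pic; the clipping behavior is my assumption about how the set handles overshoot):

Code: Select all

# White-point mismatch from my theory: a 714 mV (NTSC/Beta 7.5) white fed to an
# input referenced to 700 mV (SMPTE) overshoots, so the top of the range clips.
smpte_white_mv = 700.0
ntsc_white_mv = 714.0

overshoot = ntsc_white_mv / smpte_white_mv - 1
print(f"overshoot: {overshoot:.1%}")  # ~2.0% above reference white -> crushed whites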

As niche a use case as the YPbPr Beta 0/7.5 setting is for watching Beta tapes (lol) while preserving NTSC coloring, I think 240p games over YPbPr are another. There are other SNES ports and PS1 games I can try.
fernan1234
Posts: 2182
Joined: Mon Aug 14, 2017 8:34 pm

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by fernan1234 »

NewSchoolBoxer wrote: SMPTE looks 100% correct for Mega Man X7 and Final Fantasy X on PS2. Beta 7.5 is too dark, 0 even worse, especially with greens. However, Beta 7.5 looked absolutely correct on PS1 Ogre Battle, which was ported from SNES. Setup is the PVM-20L2 on the right, with its video out going to the I'Art 20F703 on the left: https://imgur.com/a/KEx6tzh

The reds and burgundies are too bright and saturated in SMPTE compared to the I'Art and SNES version. Trying to come up with an explanation, I found the IRE and Component Level information in the manual of the M4 that 22point8 posted and my L2: https://imgur.com/a/MwbsYWW
This is interesting, and like you said it's most likely because neither of these systems originally supported YPbPr. What kind of YPbPr cables are you using? Or perhaps you're using RGB-to-YPbPr transcoding? I've never used them, but I am under the impression that the HD Retrovision YPbPr cables for these systems are tuned to deliver RGB-like results. It's good to know that any variation can be addressed with the component level settings on a professional monitor. I wonder what the results would be like on a consumer TV with component inputs that doesn't have such settings, though.
energizerfellow‌
Posts: 208
Joined: Thu Sep 27, 2018 1:04 am

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by energizerfellow‌ »

Per comments by Artemio and Rama in the 240p Test Suite thread, it would appear all (?) non-Sony pre-HD NTSC game consoles from a Japanese-based company are IRE 0.0 (and assume P22 + D93?), regardless of what country the hardware was sold into originally? Sony, on the other hand, appears to have region-specific black levels, but it's bugged and affects RGB too? Maybe the Dreamcast's and 1-chip SNES's too-hot levels are from 0-100 IRE incorrectly pushed up into 7.5-107.5 IRE?
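
Numerically, that theory would look something like this (just my reading of it, assuming NTSC-M's 714 mV at 100 IRE):

Code: Select all

# "Pedestal added on top of full swing" theory: 0-100 IRE video shifted up by
# 7.5 IRE without rescaling, so white ends up ~7.5% too hot.
IRE_MV = 714 / 100

normal_white_mv = 100 * IRE_MV          # ~714 mV
pushed_white_mv = (100 + 7.5) * IRE_MV  # ~768 mV
print(normal_white_mv, pushed_white_mv)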

The existence and state of the black level setup pedestal is easy enough to verify with an oscilloscope on the composite/luma pin (ideally under load via something like the RGBench), no display needed. Sampling the signal directly removes human color perception and display calibration from the data gathering process, yielding much more accurate information. As far as I know, nobody has publicly documented oscilloscope-verified black levels across various consoles and models in one place.
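
For reference, these are roughly the nominal levels you'd expect to see on the scope (NTSC-M into a 75 ohm load; nominal values, not measurements):

Code: Select all

# Nominal NTSC-M levels at the composite/luma pin into 75 ohms.
IRE_MV = 714 / 100

expected_mv = {
    "sync tip":                 -40 * IRE_MV,   # ~-286 mV below blanking
    "blanking level":             0.0,
    "black (IRE 0 console)":      0.0,          # no pedestal: black sits on blanking
    "black (IRE 7.5 console)":    7.5 * IRE_MV, # ~54 mV pedestal above blanking
    "100% white":               100 * IRE_MV,   # ~714 mV
}
for name, mv in expected_mv.items():
    print(f"{name:>24}: {mv:7.1f} mV")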

As for Beta component signaling levels (and more specifically Betacam, not consumer Betamax) vs SMPTE, no consumer-focused hardware ever used Betacam levels that I'm aware of. They're all SMPTE (and by extension IRE 0.0). End of discussion, really.
NewSchoolBoxer
Posts: 369
Joined: Fri Jun 21, 2019 2:53 pm
Location: Atlanta, GA

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by NewSchoolBoxer »

fernan1234 wrote: What kind of YPbPr cables are you using? Or perhaps you're using RGB-to-YPbPr transcoding?
I roll the Walmart-approved "Camponent Cable for PXX" that came with the PS2 I bought on Craigslist. Then I use a passive video balun over ethernet cable to route it to the JVC I'Art. I don't see any relevant JVC video settings, but the late October 2002 manufacturing date leads me to believe it uses the color gamut and temperature intended for American audiences. I'm confident it's the same SMPTE-C as the PVM, but the JVC seems to recognize the Composite/S-Video true white of 714 mV inside a Component signal and uses that instead. A Redditor told me that the HD Retrovision RGB -> YPbPr cable on SNES looks better than S-Video but worse than RGB. That's what I expected, but I'm curious if it crushes whites.

The PVM assumes we know what we're doing. That's not stopping me from turning it Japanese with IRE 0 and D93 from an American DVD player source. I was trying to find more info about the Beta part of its 0/7.5 setting. I see YCbCr was supported in high-end Betacam equipment alongside SDI. This source gives NTSC-U and J 100% white as 714 mV, like Composite/S-Video and gamma-corrected R'G'B', but unlike the 700 mV of SMPTE-C.

I don't know how they botched this description though, considering PAL stands for Phase Alternating Line: "In NTSC, the phase of the color subcarrier reverses every field, and in PAL, it indexes 90° per field."
NewSchoolBoxer
Posts: 369
Joined: Fri Jun 21, 2019 2:53 pm
Location: Atlanta, GA

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by NewSchoolBoxer »

Sorry, I'm verbose. Wanted to split this into two posts since it's a wall of text.
energizerfellow‌ wrote:Per comments by Artemio and Rama in the 240p Test Suite thread, it would appear all (?) non-Sony pre-HD NTSC game consoles from a Japanese-based company are IRE 0.0 (and assume P22 + D93?), regardless of what country the hardware was sold into originally? Sony, on the other hand, appears to have region-specific black levels, but it's bugged and affects RGB too? Maybe the Dreamcast's and 1-chip SNES's too-hot levels are from 0-100 IRE incorrectly pushed up into 7.5-107.5 IRE?
Thank you for the expert analysis and the links. I had seen those but not read through them enough. I like your Dreamcast and 1-chip SNES theories, and I ordered a 1-chip SFC for 6800 yen to test with. Screwing up the color space while consolidating the 2-chip design into the 1-chip for budget reasons would explain a lot. It would be a nice life if non-Sony consoles are all IRE 0.0 and 9300K, so I could avoid tracking every Saturn revision. IRE 7.5 still "feels" correct to me, having never seen IRE 0.0 until this week.
energizerfellow‌ wrote:The existence and state of the black level setup pedestal is easy enough to verify with an oscilloscope on the composite/luma pin (ideally under load via something like the RGBench), no display needed. Sampling the signal directly removes human color perception and display calibration from the data gathering process, yielding much more accurate information. As far as I know, nobody has publicly documented oscilloscope-verified black levels across various consoles and models in one place.
This is 10k% what I want to do. I've reached my theorizing limit, and I cringe watching commission-driven upscaler, cable, and switcher reviews that eyeball a BVM versus an obtainable CRT...then upload in 4:2:2 YCbCr for us to watch in miniature form...then don't bother comparing 480i to 480p with doubled impedance and no need to deinterlace...ahhh. Fudoh's reviews are the most objectively scientific. SNR mainstream when?

I'm sure several modern cable and device makers have oscilloscope-verified black levels but why give that info out for free? I want the Analog Discovery 2. Only 2 channels hurts but built-in signal generator and spectral analysis are legit. I like the RGBench quality of life features. Just that $65+$6 price and I don't use SCART.
energizerfellow‌ wrote:As for Beta component signaling levels (and more specifically Betacam, not consumer Betamax) vs SMPTE, no consumer-focused hardware ever used Betacam levels that I'm aware of. They're all SMPTE (and by extension IRE 0.0). End of discussion, really.
I mean, you're right. I still think the PVM's Beta 0.0/7.5 setting is correct for SNES and PS1 signals being transcoded into YPbPr, but I can prove it with an oscilloscope. Maybe I can get a production-quality Sony UVW-1600/1800 for the price people try to hawk VHS+DVD recorders for.
energizerfellow‌
Posts: 208
Joined: Thu Sep 27, 2018 1:04 am

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by energizerfellow‌ »

NewSchoolBoxer wrote: I roll the Walmart-approved "Camponent Cable for PXX" that came with the PS2 I bought on Craigslist. Then I use a passive video balun over ethernet cable to route it to the JVC I'Art. I don't see any relevant JVC video settings, but the late October 2002 manufacturing date leads me to believe it uses the color gamut and temperature intended for American audiences. I'm confident it's the same SMPTE-C as the PVM, but the JVC seems to recognize the Composite/S-Video true white of 714 mV inside a Component signal and uses that instead. A Redditor told me that the HD Retrovision RGB -> YPbPr cable on SNES looks better than S-Video but worse than RGB. That's what I expected, but I'm curious if it crushes whites.
In Sony land true-to-spec SMPTE-C and EBU phosphors seem to be limited to Sony's HR tubes in BVM, select late high-end PVM (800-line L5/L4/M4), and the GDM-FW900 PC monitor (not sure about other manufacturers...). Your Sony PVM-20L2, for instance, is still P22. I'm still not sure if anything other than P22 ever shipped to end consumers in regular TVs.

The NTSC decoder in that JVC should definitely be fully NTSC-M compliant with IRE 7.5 black level setup and 714 mV luma. That said, it may be configurable in the service menu.

Looking at your sample pictures, I think both the displays could use some color calibration. The best readily available analog test pattern generator I can think would be a CECH-2xxx PS3 slim (both new enough to avoid early PS3 hardware issues and old enough to not have intentionally crippled analog outputs by AACS mandate) running one of the better reference discs, e.g. Spears & Munsil, Disney WOW, or Digital Video Essentials.
NewSchoolBoxer wrote: The PVM assumes we know what we're doing. That's not stopping me from turning it Japanese with IRE 0 and D93 from an American DVD player source. I was trying to find more info about the Beta part of its 0/7.5 setting. I see YCbCr was supported in high-end Betacam equipment alongside SDI. This source gives NTSC-U and J 100% white as 714 mV, like Composite/S-Video and gamma-corrected R'G'B', but unlike the 700 mV of SMPTE-C.
Betacam having larger chroma voltage swings around a higher reference voltage (see Appendix A) looks to be the difference, along with NTSC/PAL variations. This can lead to some interesting color shifts if you have a SMPTE/EBU N10 vs Betacam mismatch. Also, the 700 mV vs 714 mV is a PAL vs NTSC thing.

Interestingly, that Tektronix document has a note about RGB in NTSC regions that may indicate the PS1's behavior is intentional?
The RGB Standards

An RGB component signal consists of three monochrome video signals, each representing the image for one of the primary colors. Combining these three monochrome images in a display results in a full color image. Possible sources of RGB video include cameras, telecine machines, composite decoders, character generators, graphics systems, color correctors, and others.

In general, RGB signals use the same peak-to-peak amplitude as the luminance signal in the local composite standard. This explains why there are several RGB standards in use today and why it’s important to determine the characteristics of your equipment and calibrate for the appropriate levels (including setup, if required). The following paragraphs describe the four RGB interconnect standards you might encounter:
• 700 mV RGB (SMPTE/EBU N10)
• 714 mV RGB (NTSC-related)
• 714 mV RGB with setup (NTSC-related)
• 700 mV RGB with setup (MII)

SMPTE/EBU N10

Since the non-NTSC regions have standardized on +700 mV video and -300 mV sync, this is the component interconnect standard in use in most non-NTSC regions. (See Figure 55.)

The SMPTE/EBU component standard specifies that the Y (luminance) signal is on channel one, the blue color difference signal is on channel two, and the red color difference signal is on channel three. Since luminance carries the sync information in color difference formats, and green carries the sync information in RGB, hardware compatibility is achieved by putting the green signal on channel one. Sync will thus always be on the same channel. (Although SMPTE RGB has sync on all channels, this is not always the case in other RGB formats.)

For similar reasons, the blue signal is put on channel two like the blue color difference signal, and the red signal is put on channel three like the red color difference signal. It therefore seems appropriate to call the SMPTE format "GBR" rather than "RGB." In the rest of this appendix, we will use the term GBR. Time will tell which term remains in common usage.

NTSC-RELATED

The NTSC system has two characteristics that may lead to differences in the related GBR interconnect: the 10:4 video-to-sync ratio and black-level setup. Setup is usually added as part of the encoding process, so GBR signals coming directly from a camera generally do not have setup. In this case, the non-composite GBR is at 714 mV peak. If sync is added in this system, it will be at –286 mV. (Sync is usually taken from the green channel, although it may be added to all three.) Prior to the advent of component video, this was the common GBR interconnect in NTSC regions. (See Figure 56.)

If an NTSC signal is decoded, and the resulting GBR is normalized to 714 mV peak, setup is included on GBR. Setup may also be added on non-decoded feeds to gain compatibility among various GBR sources. In this case, each of the GBR signals will have the same levels as luminance in NTSC. Another source of 714 mV GBR with setup is translated Betacam format component signals. (See Figure 57.)
Things seem a lot more sane over in YPbPr component land if you don't care about Betacam or MII:
The SMPTE/EBU standards for color difference format component analog video is much like its GBR counterpart. Each of the signal wires carries a 700 mV video signal, there is no black-level setup, and sync tip is at –300 mV. Sync is only on the Y (luminance) channel, which is channel one. The blue color difference signal, PB, is on channel two, and the red color difference signal, PR, is on channel three. (See Figure 59.)

It’s often convenient to display the PB and PR signals with the reference level offset to +350 mV on the display so all signals occupy the same range on the waveform display. (See Figure 60.)

The color difference signals are the familiar B-Y and R-Y, normalized for 700 mV peak-to-peak on a 100% amplitude color bar signal. Except for gain differences, these signals are identical to the U and V signals in PAL.
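Condensing the excerpt's four GBR interconnect standards into numbers (a sketch; the setup pedestals in mV are my arithmetic at 7.5 IRE, not stated in the document):

Code: Select all

# The four GBR interconnect standards from the Tektronix excerpt.
# (white level, setup pedestal) per standard; pedestal computed at 7.5 IRE where present.
gbr_standards = {
    "700 mV GBR (SMPTE/EBU N10)":   {"white_mv": 700, "setup_mv": 0.0},
    "714 mV GBR (NTSC-related)":    {"white_mv": 714, "setup_mv": 0.0},
    "714 mV GBR with setup (NTSC)": {"white_mv": 714, "setup_mv": 7.5 * 714 / 100},
    "700 mV GBR with setup (MII)":  {"white_mv": 700, "setup_mv": 7.5 * 700 / 100},
}
for name, levels in gbr_standards.items():
    print(f"{name}: white {levels['white_mv']} mV, setup {levels['setup_mv']:.1f} mV")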
NewSchoolBoxer wrote: I'm sure several modern cable and device makers have oscilloscope-verified black levels but why give that info out for free? I want the Analog Discovery 2. Only 2 channels hurts but built-in signal generator and spectral analysis are legit.
You'll want something with 4 channels to get all the video channels on one screen, plus you really need all 3 at once for the color-difference signals that make up YPbPr.
Roc-Y
Posts: 1
Joined: Tue Dec 26, 2023 3:10 am

Re: Retro Consoles - 0 or 7.5 IRE for Black?

Post by Roc-Y »

I encountered an interesting situation. I have a cheap 14-inch consumer-grade CRT TV with an M61260BFP chip. When I input a 0 IRE black level signal, whether it is CVBS, YUV, or RGB, NTSC or PAL, everything below 16/255 gets cut off. Adjusting the brightness cannot restore it, and the setting cannot be found in factory mode.
What's interesting is that it cuts off exactly the signal below 16, allows 17 to be displayed, and retains the signal above 235. 16 does not correspond to 7.5 IRE; 16/255 = 6.3 IRE. But this is an analog-signal TV, and 16-235 should only be relevant to digital signals.
Unless the TV assumes the input comes straight from a DAC fed 16-235 digital levels, no decoding device should be stupid enough to do this.
If the manufacturer was afraid that users would complain about a 7.5 IRE black level signal looking washed out, then why not clip at 7.5% instead of exactly 16/255?
I really can't figure it out.
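The arithmetic, in case I'm miscounting (assuming the TV maps full scale 0-255 onto 0-100 IRE):

Code: Select all

# Where code 16 lands if full scale 0-255 maps onto 0-100 IRE.
clip_code = 16
print(f"{clip_code / 255 * 100:.1f} IRE")   # ~6.3 IRE, not the 7.5 IRE pedestal
print(f"code {7.5 / 100 * 255:.0f}")        # 7.5 IRE would be ~code 19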
In addition, I also found that when I use an HDMI-to-CVBS converter, both the PAL and NTSC output signals have black level setup. So, to avoid signal clipping, many devices add black level setup regardless of the standards. Do TV manufacturers then crop the signal by default based on this phenomenon?