240p/480i bandwidth? (for choosing extron rgb amp model)

deezdrama
Posts: 222
Joined: Sat Dec 21, 2019 4:10 pm

240p/480i bandwidth? (for choosing extron rgb amp model)

Post by deezdrama »

Anyone know anything about the bandwidth required for 240p/480i RGB signals? I ask because there are Extron RGB distribution amp models with 80 MHz, 180 MHz, 300 MHz, and 600 MHz bandwidth, and the only old-ass brochure I could find states the 180 MHz model is good for SVGA resolutions and the 300 MHz and above are good for HD resolutions, so I assumed the 80 MHz one would be fine for 240p signals from MiSTer. I couldn't find any info from Google; it's like they wiped the results from the 90s from existence. Anyway... I ordered the Extron ADA 4 80: 4 outputs, 80 MHz. I hope it will suffice.

I'm going from a MiSTer FPGA through various PVMs/BVMs and then to a few RGB-modded consumer sets.
Guspaz
Posts: 3219
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by Guspaz »

NTSC transmissions occupied a total of 6 MHz.

[image: NTSC channel bandwidth chart]

I believe 13.5 MHz is the standard pixel clock for digitizing NTSC video?
Sirotaca
Posts: 103
Joined: Sun Mar 19, 2017 12:08 am

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by Sirotaca »

It'll be just fine. AFAIK BVMs top out at about 1000 TVL, and if I've done my math right, 1000 TVL 240p/480i corresponds to just under 13 MHz. You'll want some headroom above that for your amp's bandwidth since that's usually specified at -3 dB, but 80 MHz is plenty.
Guspaz wrote:NTSC transmissions occupied a total of 6 MHz.
Yeah, but this is about RGB, not NTSC.
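If anyone wants to check my math, here's a quick sketch. The 4:3 aspect ratio and ~52.7 µs active line time are my assumptions (typical NTSC values), not measurements:

```python
# Rough TVL-to-bandwidth estimate for 240p/480i RGB.
# Assumptions (mine): 4:3 picture, ~52.7 us active line time.

def tvl_to_bandwidth_hz(tvl, aspect=4/3, active_line_s=52.7e-6):
    # TVL counts alternating light/dark lines per picture height, so a
    # full-width pattern has tvl * aspect lines = tvl * aspect / 2 cycles.
    cycles_per_line = tvl * aspect / 2
    return cycles_per_line / active_line_s

print(tvl_to_bandwidth_hz(1000) / 1e6)  # ~12.7 MHz, i.e. just under 13 MHz
```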
Guspaz
Posts: 3219
Joined: Tue Oct 06, 2015 7:37 pm
Location: Montréal, Canada

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by Guspaz »

Sirotaca wrote:It'll be just fine. AFAIK BVMs top out at about 1000 TVL, and if I've done my math right, 1000 TVL 240p/480i corresponds to just under 13 MHz. You'll want some headroom above that for your amp's bandwidth since that's usually specified at -3 dB, but 80 MHz is plenty.
Guspaz wrote:NTSC transmissions occupied a total of 6 MHz.
Yeah, but this is about RGB, not NTSC.
It's NTSC timings, and as I said, 13.5 MHz is the typical pixel clock used.
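For reference, 13.5 MHz wasn't arbitrary: BT.601 picked a rate that fits an integer number of samples into both the 525/59.94 and 625/50 line structures (858 and 864 samples per line). A quick check:

```python
# Why 13.5 MHz specifically: one sampling rate, integer samples per line
# for both NTSC-style and PAL-style timings.

ntsc_line_rate = 525 * 30000 / 1001  # ~15734.27 lines/s
pal_line_rate = 625 * 25             # 15625 lines/s

print(858 * ntsc_line_rate)  # ~13,500,000 Hz (858 samples per NTSC line)
print(864 * pal_line_rate)   # 13500000 (864 samples per PAL line)
```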
NewSchoolBoxer
Posts: 369
Joined: Fri Jun 21, 2019 2:53 pm
Location: Atlanta, GA

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by NewSchoolBoxer »

That's smart of you to consider the bandwidth of what you're plugging in. Chart is good, just note that NTSC video caps at 4.2 MHz since 4.5-6 MHz is reserved for audio. PAL video is a little higher.

RGB at 240p/480i is about 6 MHz. 6 MHz in to 4x6 MHz out = 30 MHz. At 240p/480i, no way in hell you can exceed 80 MHz. 480p is also super safe for 1 in to 4 out since it's a little more than double 480i bandwidth.
Guspaz wrote: I believe 13.5 MHz is the standard pixel clock for digitizing NTSC video?
I've seen it in an official standards document as the pixel clock for digitizing Component/YPbPr, but it makes sense that it's the same for NTSC and PAL too: one ADC sampling rate for all SD video. The interesting thing to me is that 13.5 MHz was chosen to allow up to 6.75 MHz bandwidth on the luma and either 6.75 MHz or 3.375 MHz on Pb and Pr. I used to think only YCbCr had chroma subsampling.

I expect video game Component not to exceed NTSC or PAL video bandwidth but DVD players can go up to 6.75 MHz for 480i/576i. Not sure if most do or not.
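The Nyquist arithmetic behind those numbers, for anyone following along (fs/2 is the theoretical ceiling; real anti-alias filters have to stop somewhat below it):

```python
# Nyquist limits for BT.601-style sampling rates.

fs_luma = 13.5e6          # Y sampling rate
fs_chroma_422 = 6.75e6    # Cb/Cr at half rate in 4:2:2

print(fs_luma / 2)        # 6750000.0 -> 6.75 MHz luma ceiling
print(fs_chroma_422 / 2)  # 3375000.0 -> 3.375 MHz chroma ceiling
```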
deezdrama
Posts: 222
Joined: Sat Dec 21, 2019 4:10 pm

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by deezdrama »

Awesome... Looks like my hunch (hope) that 80 MHz would be plenty for 240p paid off.
Found the Extron RGB distro amp (1 in, 4 out) at a church surplus sale for $20. The higher-MHz models go up in price significantly with each tier jump... I'm a cheapass so I took a gamble. Glad it will work for my purposes. I see a PVM and RGB consumer CRT wall in my future :D
Sirotaca
Posts: 103
Joined: Sun Mar 19, 2017 12:08 am

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by Sirotaca »

NewSchoolBoxer wrote:I expect video game Component not to exceed NTSC or PAL video bandwidth
Pixels are square waves. If you want nice sharp pixel transitions, as I assume most people here do, you need more bandwidth than that. Consoles like the Genesis have meaningful frequency components well in excess of 20 MHz in RGB.

[image: oscilloscope capture of a Genesis RGB pixel transition]

I assume a MiSTer with a modern DAC could do even better. Though again, 80 MHz will be more than enough. The CRTs are going to be the limiting factor here.
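Back-of-envelope on where that energy comes from. The H40 pixel clock figure below is my assumption (NTSC master clock divided by 4), but the shape of the argument doesn't depend on it:

```python
# A 1-pixel-on/1-pixel-off pattern is a square wave at half the pixel
# clock, with power at the odd harmonics of that fundamental.

master_clock = 53.693175e6       # Genesis NTSC master clock (assumed)
pixel_clock = master_clock / 4   # ~13.42 MHz in H40 mode (assumed)
fundamental = pixel_clock / 2    # ~6.71 MHz for alternating pixels

for n in (1, 3, 5):
    print(f"harmonic {n}: {n * fundamental / 1e6:.1f} MHz")
# harmonics land near 6.7, 20.1, and 33.6 MHz
```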
NewSchoolBoxer
Posts: 369
Joined: Fri Jun 21, 2019 2:53 pm
Location: Atlanta, GA

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by NewSchoolBoxer »

Sirotaca wrote:Pixels are square waves. If you want nice sharp pixel transitions, as I assume most people here do, you need more bandwidth than that. Consoles like the Genesis have meaningful frequency components well in excess of 20 MHz in RGB.
I'm sorry this got so long. You're smart and technical and you help me. I don't intend to write so much to make a counter argument more difficult or time consuming.

I would say you're mistaken. More bandwidth on the Extron won't do anything here. Any information sitting outside of a harmonic of the fundamental frequency is noise.

If you take bandwidth to mean a substantially higher ADC sampling rate (more oversampling) then I could agree square wave distortion would be less. Less quantization noise. That helps to an extent. If you take bandwidth to mean the console RGB amp's gain-bandwidth product to be 20x6 MHz = 120 MHz or more, I also agree. NTSC Composite video gets distorted at < 50 MHz.

I've seen graphs here showing 1CHIP SNES has sharper RGB rise and fall times than 2CHIP, and that this is the main explanation for sharper pixel transitions. We're ignoring the obvious: the console is being modded to perform better with a modern video amp, with vastly superior slew rate and bandwidth, and I assume less crosstalk, gain/phase error, and harmonic distortion, versus the 90s amp in a console sold at a loss.

I don't see how faster rise and fall times contribute to sharper pixel transitions on a CRT, where true pixels don't exist. If rise time drops from 20 ns to 10 ns, is video improved? I think as long as rise and fall times are faster than the master clock period (1/f0 = 47 ns on SNES) and sync timings are being met, going faster is irrelevant. All SD CRT and VGA timing windows have a tolerance of 100 ns or greater, right? Faster rise/fall correlates with less power loss from parasitic capacitance and a higher-quality signal with less ringing in general, but, infamous saying or not, correlation isn't causation.

My knowledge is lacking on CRT video processing and I'm open to a correction.

Video being blurrier or less sharp is more clearly from greater noise and distortion in the 2CHIP signal, which could stem from any number of sources: square wave ringing from a worse video amp, a glitchier DAC, more aged capacitors, PCB traces between VPU and CPU that don't exist in 1CHIP, more heat from consuming more power, etc. Not sure who first said 2CHIP has a subpar video filter that can't be disabled, but it makes sense. The 2CHIP circuit proposed in this thread that gives better video is a massive redo of the RGB amp and circuitry, and I'm guessing it bypasses the filter like the 1CHIP mod does.

It can be shown with a Fourier series that a square wave is composed of a fundamental frequency and odd harmonics (no even-n terms) if the duty cycle is 50%. Whereas harmonics of a pure sine wave are the result of a non-LTI system, cutting off square wave harmonics with a low-pass filter, or a processor's inability to see data at 3x, 5x, 7x, etc. of the ~6 MHz fundamental, is going to distort the edges with ringing. Thankfully, if the processor can sample more or less in the exact middle of the square wave, which is to say use the correct sampling phase for the video resolution, the impact of a less-than-ideal square wave is minimized.

Still, minimization isn't removal. Analog video's phase drifts; the NTSC/PAL colorburst basically corrects for this, but RGB has no equivalent. Sharper transitions give a slightly longer flat part of the square wave, so if the ADC would otherwise sample over the edge areas, its error is reduced.

Since a sharp, instantaneous transition from 0 V to the square wave peak would require (impossible) infinite bandwidth, it's more accurate to model an RGB square wave as a trapezoid with Fourier. The rise and fall times can differ, but for the sake of intuition it's better to build the series with them equal. So there are two frequencies in the model: the fundamental of the square, f0, in its angular form, and 1/τ = 1/(rise = fall time). If 1/τ is an exact odd multiple of f0, that could degrade video a little.

The power contained in the τ terms, which is the high-frequency data you refer to, doesn't contain video data. All terms of the Fourier series come from the fundamental frequency, not the rise/fall time, and the rise/fall time has nothing to do with the amplitude in the time domain. I like getting use from my tax dollars, so here is a PPT from NASA showing how varying rise/fall doesn't change the amplitude or improve ringing. (Opens in LibreOffice Impress.)

tl;dr

I don't think it's correct to say the Genesis and SNES have meaningful information above the Standard Definition bandwidth limit of 6 MHz. The SNES can't encode any data above its 21.5 MHz master clock even if it wanted to, yet by drawing the rise and fall of squares in the nanosecond range, part of the wave's power exists above that. Like in your graph, 1/13.4ns rise time is the period = 1/frequency = 74.63 MHz. If the rise time took twice as long, you'd see 37.32 MHz instead. This is not useful information that improves sharpness.

What matters is the interpreted voltage level of the flat part of the square wave from
a) amplitude
b) sync timings being met
c) Extron DAC sampling at high enough frequency to include sufficient f0 terms
d) Extron sampling phase.

The amplitude is an offset = impulse at DC = 0 Hz that is independent of τ.

deezdrama having a 600 MHz splitter versus the 80 MHz one isn't going to make 6 MHz 240p/480i RGB video look any better. Both will apply a low-pass filter for anti-aliasing slightly above 6 MHz regardless of their own bandwidth, because no Standard Definition video information exists above 4.2-6 MHz. If the 600 MHz device has a much higher sampling rate, that's different; distortion would be less, but the bandwidth figure alone doesn't say anything about sampling, its analog filter, or its DSP ability.
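To make the odd-harmonics point concrete, here's a sketch of band-limited square wave partial sums (the f0 and cutoff values are purely illustrative, not measurements of any console):

```python
# Partial Fourier sums of a 50% duty cycle square wave: only odd
# harmonics n*f0 contribute, and truncating them changes the edge shape.
import math

def bandlimited_square(t, f0, cutoff_hz):
    """Sum odd harmonics of a unit square wave up to cutoff_hz."""
    s, n = 0.0, 1
    while n * f0 <= cutoff_hz:
        s += math.sin(2 * math.pi * n * f0 * t) / n
        n += 2
    return 4 / math.pi * s

f0 = 6.7e6  # illustrative ~1-pixel-detail fundamental
for cutoff in (7e6, 26e6, 80e6):
    v = bandlimited_square(1e-9, f0, cutoff)  # value 1 ns into the edge
    print(f"cutoff {cutoff/1e6:.0f} MHz: v(1 ns) = {v:.3f}")
# prints roughly 0.054, 0.107, 0.317 -- more harmonics = steeper edge
```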
Sirotaca
Posts: 103
Joined: Sun Mar 19, 2017 12:08 am

Re: 240p/480i bandwidth? (for choosing extron rgb amp model)

Post by Sirotaca »

Your fundamental misunderstanding seems to be the idea that there's digital sampling happening somewhere in this chain. There isn't. It's analog RGB going through an analog amp and into analog CRTs. If fast rise/fall times are what you put in, fast rise/fall times are what you'll get out, to the extent that your equipment is up to the task. Granted, on an RGB-modded consumer CRT TV you probably wouldn't notice if you passed your RGB signals through an SD LPF, but you might on a high-TVL PVM/BVM.
NewSchoolBoxer wrote:Like in your graph, 1/13.4ns rise time is the period = 1/frequency = 74.63 MHz.
Also, this is way off. The rule of thumb is [bandwidth] = 0.35 / [rise time]. So a rise time of 13.4 ns would correspond to a bandwidth of approximately 26 MHz.
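For anyone who wants to plug in other numbers, that rule of thumb in code (it comes from the standard single-pole approximation, where the 10-90% rise time is 2.2τ and f_3dB = 1/(2πτ), so their product is about 0.35):

```python
# Bandwidth from 10-90% rise time, single-pole rule of thumb.

def bandwidth_from_rise_time(t_rise_s):
    # t_rise * f_3dB ~= 2.2 / (2 * pi) ~= 0.35 for an RC-like response
    return 0.35 / t_rise_s

print(bandwidth_from_rise_time(13.4e-9) / 1e6)  # ~26.1 MHz
```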