Fudoh wrote:
Quote:
Isn't "OSSC compatibility" dependent on the signal source?
not really. It comes down to the capture unit's ability to lock to non-standard timings, since the OSSC uses "tricks" to achieve any resolution higher than 480p and isn't able to output timings close to the VESA or SMPTE standard timings.
On the one hand, because of the wide variety of analog signals you can throw at the OSSC, and because its output digital video timings are always closely coupled to the input analog video timings, the output timings vary accordingly across different signal sources, regardless of the OSSC settings in effect. On the other hand, while I don't have an ironclad overview, I doubt it's a black-and-white situation where some capture devices accept everything an OSSC will (for all practical intents and purposes) throw at them, while others very strictly accept only standard timings. Rather, my everyday informal experience with different source/sink combinations, and online reports by others, suggest there are more than just two degrees of tolerance across capture devices (just as with displays).
Even without going into technical details, note that the lion's share of the video game systems the OSSC is typically fed with output non-standard timings, and certain sources are known to cause problems on the sink side more often than others (even while the OSSC itself locks onto the signal just fine); examples include the PS1 outputting 314 total lines in some games in 50Hz mode, and the SNES's jitter issue. In both cases there are sinks that accept the resulting TMDS signal without further issues, and sinks for which this is not the case.
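To put rough numbers on the PS1 example (these are back-of-the-envelope figures of my own, assuming a nominal ~15.625 kHz line rate and plain line2x behaviour, not exact OSSC output timings), here's a small sketch of how a 314-line frame ends up at roughly 49.76 Hz, and how line-doubling carries that odd line count and refresh rate straight through to the digital side:

Code:
# Rough illustration (my own assumptions, not OSSC documentation): why a PS1
# title outputting 314 total lines in 50Hz mode yields a non-standard refresh
# rate, and how line multiplication carries that over to the digital output.
H_FREQ_HZ = 15_625.0  # assumed analog line rate, nominal PAL-ish value

def refresh_rate(total_lines: int, h_freq_hz: float = H_FREQ_HZ) -> float:
    """Vertical refresh = line rate / total lines per frame."""
    return h_freq_hz / total_lines

def line_multiplied(total_lines: int, mult: int, h_freq_hz: float = H_FREQ_HZ) -> dict:
    """Line multiplication: line count and line rate scale by the multiplier,
    while the vertical refresh stays locked to the source."""
    return {
        "lines": total_lines * mult,
        "h_khz": h_freq_hz * mult / 1000,
        "v_hz": refresh_rate(total_lines, h_freq_hz),
    }

for lines in (313, 314):  # 313 ~ a "normal" 288p-ish frame, 314 = the problem case
    t = line_multiplied(lines, mult=2)
    print(f"{lines} input lines -> line2x: {t['lines']} lines, "
          f"{t['h_khz']:.3f} kHz, {t['v_hz']:.2f} Hz")
# For comparison, standard 576p50 is 625 total lines, 31.250 kHz, 50.00 Hz.

Whether a given capture device tolerates a 628-line frame at ~49.76 Hz instead of the 625/50.00 it expects is exactly the kind of thing that varies from sink to sink.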
In light of this, wouldn't it be an oversimplification to speak of "OSSC (in)compatibility" while only factoring in either the source or the sink side of the chain?
Or was I merely misunderstanding you, and did you actually just mean something like "OSSC friendliness" or "good" compatibility?