Guspaz wrote:
There's been a nocut mod for it since last year, though.
Yes, but that requires you to either cut a section out of the lower EM shield (which undermines the point of a no-cut mod) or remove it entirely, and I'm not a fan of either approach. My point is that the UltraHDMI could have been designed to use a no-cut, 3D-printed AV shroud while leaving the lower EM shield in place.
Guspaz wrote:
Not much to be done about the UltraHDMI only supporting weave deinterlacing, other than to point out that only a handful of games used 480i full-time, and a bunch of those have framerate issues with 480i that make them best played at 240p anyhow. Still, it doesn't really make sense that bob deinterlacing and 480i passthrough aren't options. At least in the latter case the user could let their TV's normally much higher quality deinterlacer have at it, at the expense of a few frames of extra lag.
I believe you can get 240p/480i out of the UltraHDMI if you enable Direct mode, but that wouldn't eliminate the HDMI blackouts, which would make anything with frequent mode switches (like RE2) a pain to play. Bob deinterlacing support wouldn't really make a difference for me, because bob is all my current TV (a Samsung LN32B360) will do in game mode anyway, and I think I'd rather have interlacing artifacts than vertical flicker.
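For anyone unfamiliar with the weave/bob distinction being discussed, here's a toy sketch (not the UltraHDMI's actual pipeline, just an illustration of the two strategies) where each field is a list of scanlines:

```python
# Toy illustration of the two deinterlacing strategies discussed above.
# A 480i signal delivers alternating half-height "fields" (even lines,
# then odd lines); the deinterlacer must rebuild full-height frames.

def weave(even_field, odd_field):
    """Interleave two fields into one full frame.
    Sharp on static images, but moving objects show combing,
    because the two fields were captured ~1/60 s apart."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

def bob(field):
    """Line-double a single field into a full frame.
    No combing and full temporal resolution, but half the vertical
    detail, and static images flicker as the lines bounce each field."""
    frame = []
    for line in field:
        frame.append(line)  # original scanline
        frame.append(line)  # duplicate (real hardware would interpolate)
    return frame

even = ["E0", "E1"]
odd = ["O0", "O1"]
print(weave(even, odd))  # ['E0', 'O0', 'E1', 'O1']
print(bob(even))         # ['E0', 'E0', 'E1', 'E1']
```

The trade-off in the posts above falls out of this directly: weave stitches two moments in time into one frame (combing on motion), while bob uses only one field per output frame (flicker on static detail).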
Guspaz wrote:
Why would the component inputs be limited to 15 kHz? Component video typically goes up to 720p/1080i (45 kHz for 720p). The BNC inputs support VGA input too, since they have the five plugs for RGBHV.
I'm not sure about the XM27/29, but there are a lot of CRTs in the US with 15 kHz-only YPbPr component inputs; someone whose experience is limited to those displays might reasonably assume the NEC CRTs share that limitation.
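The 45 kHz figure for 720p in the quote falls out of simple arithmetic: horizontal scan rate is just total lines per frame (including blanking) times frames per second. A quick sanity check, using the standard line counts of 525 for 480i and 750 for 720p:

```python
# Back-of-the-envelope horizontal scan rates, showing why a 15 kHz-only
# component input can't accept 720p. Line counts include blanking
# intervals (525 total lines per frame for 480i, 750 for 720p).

def h_scan_khz(total_lines, frames_per_second):
    """Horizontal scan rate in kHz = scanlines drawn per second / 1000."""
    return total_lines * frames_per_second / 1000

# 480i: 525 lines/frame at ~29.97 frames/s -> ~15.7 kHz
print(round(h_scan_khz(525, 30000 / 1001), 2))  # 15.73

# 720p: 750 lines/frame at ~59.94 frames/s -> ~45 kHz
print(round(h_scan_khz(750, 60000 / 1001), 2))  # 44.96
```

So a 15 kHz-only set tops out around 480i/240p; accepting 720p over component requires a display whose deflection circuitry can run roughly three times faster.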
(Also, I believe YPbPr component maxes out at 1080p: the PS3 and Xbox 360 can output 1080p over YPbPr, but display support isn't universal, and a lot of TVs cap their component inputs at 1080i.)