Guspaz wrote:It's true that you can play GameCube games on a Wii, but it has softer component output and no HDMI option.
Do you think there is a need for an "HDMI option" for the Wii besides the existing Wii-to-HDMI dongles?
RGB32E wrote:From discussions with Unseen, he's indicated that a higher-performance FPGA would be required for upscaling and that he doesn't have the interest in implementing scaling that preserves the "non-square" pixel aspect ratios claimed for all 4:3 GCN games.
The Cube's pixels are always non-square; it doesn't matter whether the game is configured for 4:3 or 16:9 output. Nintendo chose to design the Cube's video system around standard video timings, so it uses a fixed 13.5MHz pixel clock (originally specified in BT.601 in 1982) for 480i/576i modes and 27MHz for 480p/576p. With this pixel clock and standard video timings, there is no combination of active video lines and aspect ratio that would result in square pixels.
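To make that concrete, here is a quick back-of-the-envelope calculation in Python, using the nominal numbers for 480i/480p (720 samples per line at 13.5MHz, 480 active lines). The pixel aspect ratio is just the display aspect ratio multiplied by the line-to-sample ratio:

    # Pixel aspect ratio for BT.601-style timing (nominal values).
    ACTIVE_SAMPLES = 720   # horizontal samples per line at 13.5 MHz
    ACTIVE_LINES   = 480   # active lines for 480i/480p

    for name, dar in (("4:3", 4/3), ("16:9", 16/9)):
        par = dar * ACTIVE_LINES / ACTIVE_SAMPLES
        print(name, "pixel aspect ratio =", round(par, 3))

    # 4:3  -> 0.889 (8/9, pixels narrower than tall)
    # 16:9 -> 1.185 (32/27, pixels wider than tall)

Neither value is 1, so the pixels are non-square either way.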
and the remaining pixels are part of the horizontal blanking period that have been added by the FPGA as visible padding.
Although the Gamecube's digital video data encodes these pixels as blanking, they absolutely must be translated to active-but-black video for the digital video output. With analog video, the display generally does not know whether a given pixel is black or blanking, so Nintendo could take this shortcut to simplify the configuration of the video processor for games that want to avoid rendering an image in the overscan area, which would often be invisible anyway. With DVI, blanking and active-but-black pixels are encoded differently, so without this translation in GCVideo the display would see various non-standard resolutions depending on the game. That is a great way to annoy users: some displays refuse those non-standard resolutions completely, and others show other oddities - for example, I have one that always zooms the picture to the full panel size and disables the aspect ratio button.
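To illustrate what that translation amounts to, here is a minimal sketch of the re-blanking idea in Python (the names and structure are mine, not GCVideo's, and the real implementation is FPGA logic rather than software): inside the nominal 720-pixel active window, anything the console flags as blanking is replaced by a black active pixel, so the display always receives the same full-width line.

    BLACK = (0, 0, 0)      # black as an active RGB pixel
    ACTIVE_WIDTH = 720     # nominal active samples per line

    def reblank_line(samples):
        # samples: ACTIVE_WIDTH (pixel, is_active) pairs covering one
        # line's nominal active window. Blanking inside that window is
        # converted to black active video instead of being passed on.
        assert len(samples) == ACTIVE_WIDTH
        return [pix if active else BLACK for pix, active in samples]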
There is already a known-good reference signal available from the Gamecube that shows why the black padding is the only correct option: the composite video output does not distinguish between "real" blanking, the increased blanking caused by the reduced image dimensions a game may use, and black pixels that may be present at the actual image borders. A digital display receiving this signal would always process it in a way that is equivalent to 720 active non-square pixels per line; an analog display (a CRT, assuming proper adjustment) would not care about pixels at all and would lock its horizontal sweep to the HSync signal, giving the same aspect ratio as the GCVideo DVI output with re-blanking to 720 horizontal pixels.
Even the OSD shows a conversion of 640x480 to 720x480!
There is no resolution conversion; the input line is just an indication of the laziness of the current game's programmers.
This choice is understandable for 480p only, as most HDTVs require the extra pixels for proper input sampling.
Not sure what you are talking about here - sampling is only relevant when you are dealing with analog signals, but GCVideo Lite doesn't touch the blanking area because the difference isn't visible at the output anyway.
Hence, I do not believe the claim that "oh, your TV will do a better job of upscaling" is anything more than shenanigans, and isn't the whole story.
I don't believe your claim "The TV's upscaling sucks, you can implement something better on an FPGA" either.
have you captured any GCN games that actually have more than 640 active pixels?
To pick a random example: everything on the PAL Zelda collection disc (which I believe only runs at 60Hz anyway) outputs more than 640 active pixels horizontally. The menu is 666x448i, the Wind Waker demo runs in 660x480i, the NES emulators use 660x232p, the N64 emulator uses 704x480i, and the movie player is 672x480i. Another well-known game with an "odd" resolution is Mario Kart Double Dash, which outputs 666x448.