trivial wrote:There's a half-pixel of uncertainty as to everything's static vertical position, too. Changes to vertical position add more uncertainty on top of that.
No, there is no "uncertainty." Either the lines are scanned in the same place every frame (progressive scan, like all the old consoles), or they alternate up and down (interlaced).
It's possible to use interlaced timing even when you're only displaying a low-res image, so the same lines get drawn during both fields and the picture appears flickery. But this is limited to a few cases, not including the NES (I believe the X68000 was mentioned, plus some ports on newer consoles, etc.). If you had an MSX2, it would be easy to see by turning on interlace mode from BASIC.
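To make the progressive-vs-interlaced distinction concrete, here's a rough sketch (my illustration, not from the thread; line counts and the half-line offset are normalized, not exact NTSC values) of where scanlines land in each timing mode:

```python
# Sketch: scanline positions per field, with line spacing normalized to 1.0.
# Progressive ("double-strike") timing: both fields land on the same lines,
# so the picture is stable. Interlaced timing: the second field is offset by
# half a line, so a low-res image drawn identically in both fields flickers.

def field_positions(interlaced, lines_per_field=5):
    positions = []
    for field in (0, 1):
        # Only the second field of an interlaced signal gets the half-line offset.
        offset = 0.5 if (interlaced and field == 1) else 0.0
        positions.append([line + offset for line in range(lines_per_field)])
    return positions

print(field_positions(False))  # both fields identical -> stable picture
print(field_positions(True))   # second field shifted -> same content flickers up/down
```

With interlaced timing and a true high-res source, the offset fields carry *different* image lines, which is the intended use; the flicker described above only appears when the same low-res lines are sent during both fields.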
I used to have a Trident VGA card with TV-out and a DOS utility that enabled a very primitive anti-flicker mode by halving the vertical resolution displayed on the TV. But vertical scrolling in MCGA games still looked flickery! Is that because of the 200-vs-240 line discrepancy?
First, did the picture have scanlines? If not, it wasn't really progressive scan; anti-flicker filters usually just soften the image in the vertical direction. Second, if the card was rescaling the 200 lines to fill more or less of the display area on the TV, that would cause noticeable artifacts, particularly during scrolling. Third, good old mode 13h (MCGA, standard VGA 320x200x8-bit) only provides a single frame buffer, so tearing is apt to occur unless the program is written carefully and the PC and video card are fast enough to stay ahead of the beam. Last but not least, mode 13h runs at 70 Hz, so on a 60 Hz TV some frames have to be dropped.
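The last point is simple arithmetic, which a short sketch can make explicit (my illustration, not from the post; it assumes the TV-out simply shows whichever source frame is most recent at each 60 Hz output tick):

```python
# Sketch: how many 70 Hz source frames never reach a 60 Hz display.
# Each output frame at time t shows the latest source frame available by t.

def dropped_frames(src_hz=70, dst_hz=60, seconds=1):
    shown = set()
    for out_frame in range(dst_hz * seconds):
        t = out_frame / dst_hz          # when this output frame is scanned out
        shown.add(int(t * src_hz))      # latest source frame generated by time t
    total_generated = src_hz * seconds
    return total_generated - len(shown)

print(dropped_frames())  # -> 10 (ten source frames per second are skipped)
```

Ten dropped frames per second means roughly one of every seven frames vanishes, and the uneven cadence is visible as judder during smooth scrolling, on top of any flicker from the interlaced TV signal itself.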