Yes, that thread is extremely popular for its... hardware discussion.
I think that I can answer two of these questions! You seem to know what interlacing is: it's the method you described, drawing the odd lines of a frame and then the even lines. It does this at twice the frame rate, which means that an NTSC TV can be thought of as refreshing sixty times per second (60 Hz), which it is, but that isn't the same thing as sixty frames per second. Kiken was right, with his mention. It's thirty frames per second, but each of those frames is split into two fields, and the fields are presented at sixty fields per second.
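If it helps to see it written out, here's a toy Python sketch of the frame/field split (just my own illustration of the idea, not anything a TV actually computes):

```python
# Toy illustration only: split one full frame into the two fields an
# interlaced display draws one after the other.

def split_into_fields(frame):
    """`frame` is a list of scanlines, top to bottom."""
    field_a = frame[0::2]  # lines 0, 2, 4, ... (one field)
    field_b = frame[1::2]  # lines 1, 3, 5, ... (the other field)
    return field_a, field_b

# 30 frames per second, but each frame contributes two fields shown
# roughly 1/60 of a second apart, so the tube refreshes 60 times per
# second even though only 30 complete pictures arrive each second.
frame = [f"scanline {i}" for i in range(480)]
first, second = split_into_fields(frame)
print(len(first), len(second))  # 240 240
```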
This would all seem fine and good, but your eyes can pick up on how half of the image is new and half of it is old, even if the rows of the new image are interlaced with rows from the old one. The only positive aspect that I can think of (which might not be true) is that the display would seem to flicker less than if it were actually progressive scan (entire frame at once, like a PC monitor), since I figure that watching a 30 Hz refresh rate would stuff your eyes into a bag and beat them, until they pulled their bonds up and strangled the refresh rate with them, running off to an opera house and becoming angels of music.
Note that this isn't the same thing as scan lines, which I myself am not even perfectly clear on. I figure that scan lines are either the intentional blanking of the odd or even lines of progressive scan video, to sort of "fake" the effect of interlaced video,
or, it's the presentation of interlaced video on a progressive scan screen, using only the odd field or only the even field. So, you're still seeing the same picture, but half of the interlaced picture is removed, so there isn't any "comb-like" effect during motion, but there's also technically only half of the vertical resolution being displayed (remove the even lines, and a 320 x 200 image is effectively 320 x 100, only with black lines interspersed to take up the screen area of 320 x 200).
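If it helps, here's a toy Python sketch of that second idea (again, purely my own illustration, so take it with a grain of salt): keep one field and fill the other lines with black.

```python
# Toy illustration only: fake the "scan line" look on a progressive display
# by keeping one field and blacking out the other lines.

def scanline_render(frame, keep_odd_lines=False):
    """Keep every other scanline of `frame`; replace the rest with black."""
    out = []
    for i, row in enumerate(frame):
        keep = (i % 2 == 1) if keep_odd_lines else (i % 2 == 0)
        out.append(row if keep else [0] * len(row))
    return out

# A 320 x 200 frame ends up with only 100 lines of real picture; the other
# 100 are black, so it still fills the 320 x 200 area but carries only half
# the vertical detail.
frame = [[255] * 320 for _ in range(200)]
rendered = scanline_render(frame)
print(sum(1 for row in rendered if any(row)))  # 100
```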
Then, since the display (such as a PC monitor) is already being refreshed at a different rate (such as 60 Hz), there isn't any flicker, but the interlaced/scan-line video is still presented at 30 frames per second. This can all be very confusing, and I hope that I'm explaining it well enough (and actually know what I'm talking about). Here's a page on the "interlacing" part of the equation:
http://neuron2.net/LVG/interlacing.html
I think that I myself just realized the issue with "true" and "fake" resolution. If people in the know are saying that some consoles have to produce an image at, say, 640 x 480, and they're then asked to produce a port of a Genesis/Mega Drive game, then they must produce a 320 x 224 image which would ideally use up the whole screen. Since this, doubled in size, would be 640 x 448, it would have a black horizontal band of 32 lines at the bottom (or 16 at the top and 16 at the bottom) and would appear vertically squashed compared to the original full-screen picture, when mapped onto the display of 640 x 480. I get it. (Or do I?)
So, the only way to get it to fill the whole screen would be to use some sort of scaling method, which would lessen the quality of the video. If a console were to simply output the 320 x 224 signal, it would logically be made to cover the whole screen, without scaling, but it wouldn't work properly with such devices as LCDs (and some HDTVs? I'm not sure of how those work, yet). I'm also sure that even if the original were 320 x 240, and it were scaled with a pixel resize to 640 x 480, it still wouldn't look the same as the actual 320 x 240 resolution being displayed on the screen.
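To check my own reasoning, here's a toy Python sketch of the "clean" case: a plain 2x pixel resize of a 320 x 224 image, padded with black lines so it sits inside a 640 x 480 frame. This is purely my own illustration; real hardware scalers are obviously more involved.

```python
# Toy illustration only: integer 2x "pixel resize" of a 320 x 224 image,
# then black padding so it sits inside a 640 x 480 frame.

def double_size(image):
    """Nearest-neighbour 2x: repeat every pixel and every scanline twice."""
    doubled = []
    for row in image:
        wide = [pixel for pixel in row for _ in range(2)]  # 320 -> 640
        doubled.append(wide)
        doubled.append(list(wide))                         # 224 -> 448
    return doubled

def pad_to_height(image, height=480):
    """Centre the image vertically by adding black lines above and below."""
    width = len(image[0])
    missing = height - len(image)          # 480 - 448 = 32 spare lines
    top = missing // 2
    black_line = [0] * width
    return ([black_line] * top
            + image
            + [black_line] * (missing - top))

source = [[1] * 320 for _ in range(224)]
framed = pad_to_height(double_size(source))
print(len(framed), len(framed[0]))  # 480 640
```

Filling all 480 lines instead would mean stretching 448 lines to 480, a non-integer ratio, so some lines have to be doubled or blended unevenly, and that's the quality loss I mean.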
CRT stands for "Cathode Ray Tube," and is essentially the "glass tube"-type display. It contrasts with displays such as LCDs in that any resolution can be displayed at any size in full quality on the screen, whereas LCDs (and other fixed-pixel displays) can only display their native resolution (the actual number of pixels built into the panel) without greatly sacrificing visual quality. Another downside to LCDs (and plasma, apparently?) is that it takes a while for the individual pixels to change state, as opposed to doing it instantly, so what you end up with is a "trailing" effect, an exaggerated motion blur, which makes games quite a bit less fun to play if you're trying to play well. Also, technically, LCD pixels don't "turn on" and off; there's a constant backlight behind them, which must be covered up to create darker colors (or black). This means that black isn't ever really black, at least not in the way that it can be with a CRT. You can adjust the brightness and contrast on an LCD all you like, and you'll never end up with a fully black screen, which means that darker images can look very washed out.
So, you know the problem with the "Sony 36" as well, since you're aware of the blur problems with LCD and plasma TVs.
As for 100 Hz... hmm... is this the "light gun incompatibility issue" thing? That's my best guess.
(Edit: After reading something else that bloodflowers posted, I'm guessing that 100 Hz TVs, while having a non-flickering display, have to calculate how/when to display each frame/field of the source video, thus causing a delay between input and output, which is probably acceptable for watching television and film, especially if the sound is delayed by the appropriate amount. However, for games, it would be
death.)
Wow. That's a lot of text. I'd like to pose a couple of questions as well (if anybody will even see them):
1) Do these RGB monitors that everyone talks about display interlaced video? Or are they basically VGA monitors? I was thinking that it was basically the highest-quality analog interlaced video possible, but I'm feeling a little bit confused now. Arcade machines use RGB monitors, right? So, they are... interlaced displays? I haven't been to an arcade in a while.
2) Next... do RGB-capable televisions simply use composite sync, or something else? Finally, if the preceding condition is true... what I'm getting at is: if one were to produce a color-difference component video signal based on an RGB signal and composite sync, would it appear properly on modern televisions (at least CRT models), and be comparable to an RGB monitor? I'm thinking that it's probably not worth the trouble, that the time it takes to transcode the signal would cause some sort of annoying delay, and that it would simply be easier and almost as good to use S-video.
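For what it's worth, the color-difference part of that is just a fixed bit of math (the standard-definition BT.601 matrix); here's a rough Python sketch of what such a transcoder would have to compute for every pixel, purely as an illustration. (That's only the color math; how the sync side would be handled, I'm not sure about.)

```python
# Rough sketch of the standard-definition (BT.601) RGB -> YPbPr math that an
# RGB-to-component transcoder has to perform. R, G, B are assumed to be 0..1.

def rgb_to_ypbpr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma
    pb = 0.564 * (b - y)                    # blue color-difference
    pr = 0.713 * (r - y)                    # red color-difference
    return y, pb, pr

# Pure red: low-ish luma, Pr swings strongly positive, Pb slightly negative.
print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # roughly (0.299, -0.169, 0.500)
```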
Not to hijack the thread, or anything.