What's the main reason for the 480i display?

The place for all discussion on gaming hardware
Overkill
Posts: 513
Joined: Mon Aug 22, 2011 6:11 pm
Location: Portugal

What's the main reason for the 480i display?

Post by Overkill »

So we all want old games in 240p and, if possible, 480p games like on the Xbox, DC, etc. But by the end of the PS1/Saturn era, games started to be displayed in 480i, with the majority of PS2 titles being like that.

What's the main reason 480i output became the standard in the home console market around the year 2000? Many titles, like the SNK/Playmore ones, are in 480i, even though the PS1 versions were in 240p. The Street Fighter Alpha series on PS1/Saturn is low resolution, but on PS2 it's 480i and 480p. Why didn't they leave it as it was in the arcades? The arcades were still using low resolution, yet the home console versions were made in 480i. There are a few exceptions, like the SEGA AGES 2500 titles (Sega Rally, for instance) with different resolutions available to choose from. There are so many arcade compilations available on the PS2, and all of them are in 480i. The only bonus we got in Europe at that time was that they included a 60Hz option.

I'm trying to understand the reason for that "dark age", as Fudoh called it on his website, because it seems that at that time things went backwards instead of getting better visuals.
fagin
Posts: 1654
Joined: Fri Mar 19, 2010 2:29 pm
Location: UK

Re: What's the main reason for the 480i display?

Post by fagin »

Programming libraries would have been centred around 480i video output for fake high res. 2D took a back seat to the developing 3D era. No one "gave a shit" about progressive vs interlaced back in the day, and it would have been easier to stick with the norm for coding and design purposes.

All imo.
Fudoh
Posts: 13040
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany

Re: What's the main reason for the 480i display?

Post by Fudoh »

Especially on the emulation packs (Capcom Classics etc.), 720x480i was required to use the original CPS1/2 resolutions without downscaling. In general, the higher resolution gives more room to accommodate a wider range of different in-game resolutions.
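
A rough way to see why the wider 720x480 frame helps - a toy sketch, assuming the CPS1/CPS2 native resolution of 384x224 (not stated in the thread). It just checks whether a source image can be placed into a framebuffer without shrinking it:

```python
# Minimal sketch: can an arcade source be placed in a console framebuffer
# without downscaling? (Assumes CPS1/2 output 384x224 - not stated in the thread.)

def fits_without_downscaling(src, fb):
    src_w, src_h = src
    fb_w, fb_h = fb
    return src_w <= fb_w and src_h <= fb_h

cps2 = (384, 224)
print(fits_without_downscaling(cps2, (320, 240)))  # False: 384 wide doesn't fit in 320
print(fits_without_downscaling(cps2, (720, 480)))  # True: room to spare, lines can even be doubled
```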

In the late 90s/early 00s, when CRTs were still in use, 480i wasn't the worst thing. Switching back to 240p for a game would cause most people to complain about the low-res look and the visible scanlines. ICO was one such example.
Overkill
Posts: 513
Joined: Mon Aug 22, 2011 6:11 pm
Location: Portugal

Re: What's the main reason for the 480i display?

Post by Overkill »

So, generally, in those days most people preferred the 480i look?
Fudoh
Posts: 13040
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany

Re: What's the main reason for the 480i display?

Post by Fudoh »

When the Dreamcast was released in 1998/1999, all major games were hi-res, but hardly anybody had a large VGA screen, so 15kHz RGB in 480i was the most common connection. People got used to the reduced scanlines, and going back to 240p for just a few games made them appear weird.
antron
Posts: 2861
Joined: Wed Feb 22, 2006 7:53 pm
Location: Egret 29, USA

Re: What's the main reason for the 480i display?

Post by antron »

My medical imaging book (Bushberg, I believe) claimed that interlacing had to do with the original design of cine cameras and CRTs. It was supposedly a way to obtain smoother motion without increasing the bandwidth requirements of the devices.
Fudoh
Posts: 13040
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany

Re: What's the main reason for the 480i display?

Post by Fudoh »

Of course - that's right. Did you ever see a TV program in 240p? It looks terrible...
Ed Oscuro
Posts: 18654
Joined: Thu Dec 08, 2005 4:13 pm
Location: uoıʇɐɹnƃıɟuoɔ ɯǝʇsʎs

Re: What's the main reason for the 480i display?

Post by Ed Oscuro »

It doesn't give smoother motion (it has only the same temporal resolution as 240p60, and only the same resolution per set of fields) but essentially splits a high-resolution frame into two sets of fields, one set displayed on each refresh. Sort of - there's a wrinkle here as well. Basically, it only gives the illusion of greater detail. If you cut two pictures that are very close in content into strips and put them back together, discarding all the odd-numbered strips from one picture and all the even-numbered strips from the other, you will perceive more detail, even though, taken individually, each set of strips is only half the resolution of the original full-resolution pictures (which, in NTSC's interlaced system, are never captured to begin with).
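
A small sketch of that splitting idea (a toy example, not how any console actually builds its signal): take two successive full-resolution images, keep the even lines of one and the odd lines of the other, and weave them together - the result has the full line count but mixes two moments in time.

```python
# Toy illustration of interlacing: weave the even lines of one image
# with the odd lines of the next. Frames are just lists of row strings.

def split_fields(frame):
    even = frame[0::2]   # lines 0, 2, 4, ... (one field)
    odd = frame[1::2]    # lines 1, 3, 5, ... (the other field)
    return even, odd

def weave(even, odd):
    woven = []
    for e, o in zip(even, odd):
        woven += [e, o]
    return woven

frame_t0 = ["A0", "A1", "A2", "A3"]   # scene at time t
frame_t1 = ["B0", "B1", "B2", "B3"]   # scene 1/60 s later
even_t0, _ = split_fields(frame_t0)
_, odd_t1 = split_fields(frame_t1)
print(weave(even_t0, odd_t1))         # ['A0', 'B1', 'A2', 'B3'] - two moments in one "frame"
```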

This was a choice made by the NTSC system designers at the outset of the medium, because (as Fudoh says) 240p would've looked terrible.

The other major factor involved was the desire to avoid lengthy frame times and the brightness variation - flicker - that comes with them. If you are scanning the whole screen twice as quickly, there's going to be less variation in brightness on each pass, and the eye is supposed not to notice the darker remnants of the previous scan between the newly updated scan lines. I doubt that accuracy of the scanning mechanism was any factor, because the electron gun is still deflected to each position you'd get if the set were scanning at 480p30 (instead of 240p60), using the "30" to mean the scan rate.
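
For reference, the standard NTSC numbers behind that trade-off (well-known figures, not taken from this thread) work out like this:

```python
# NTSC scan-rate arithmetic: the line rate is the same whether you think of the
# signal as 525-line frames at ~29.97 Hz or 262.5-line fields at ~59.94 Hz.

lines_per_frame = 525
frame_rate = 30000 / 1001            # ~29.97 Hz
field_rate = 2 * frame_rate          # ~59.94 Hz, one field (half the lines) per pass
line_rate = lines_per_frame * frame_rate

print(f"field rate: {field_rate:.2f} Hz")   # ~59.94 - the screen is refreshed this often
print(f"line rate:  {line_rate:.1f} Hz")    # ~15734 - the familiar 15.7 kHz horizontal rate
```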

In other respects, the design of the early television (not cinema) cameras of the time seems to be a good argument against interlacing, because (unless I'm very mistaken) they also scan the scene and generate a wholly analog signal that has to match the NTSC standard (whatever it is); there's no memory or buffer for doing operations like splitting a "real frame" into odd and even interlaced field sets. So when the camera generates the signal for the odd field, it's doing that in real time - when it generates the even field, it is again doing it in real time, along with generating the offset for those lines. In the meantime (the next 1/60th of a second) Jackie Gleason might have wobbled slightly (Chock Fulla Booze!) out of his previous position, so what the camera sees is actually a different scene - there's no "full frame" to reconstruct. (It also seems there should be a roughly 1/60 second difference between the beginning of one set of fields and the end of the other.) If you reconsider my "pictures" analogy from before, just remember that there is no full-resolution picture that the camera is working from - by the way it is set up, what it captures is, from the very start, like the full-resolution picture cut into strips with half of them discarded. In practice it's not usually a big problem for NTSC programming, because news anchors and carefully-selected ties don't move that much horizontally or show terrible shimmering problems.

That's a supposition on my part, but I think it's a good one. The other possibilities don't make sense. This was a time when the networks routinely recorded over important programs to save money on magnetic tape - there was certainly no interest in wasting money on putting very expensive capabilities into cameras that would not be used.

For HD-era ties and for games with lots of horizontal movement, interlacing is absolutely terrible. They just avoided these things in TV programming.
antron
Posts: 2861
Joined: Wed Feb 22, 2006 7:53 pm
Location: Egret 29, USA

Re: What's the main reason for the 480i display?

Post by antron »

I really don't understand this subject at all. Here is the beginning of Wikipedia's entry for interlaced video:
Interlaced video is a technique of doubling the perceived frame rate introduced with the signal without consuming extra bandwidth. Since the interlaced signal contains the two fields of a video frame captured at two different times, it enhances motion perception to the viewer and reduces flicker by taking advantage of the phi phenomenon effect.


I've read this in other places too, in an academic setting, but I really don't understand. How is it a reduction of flicker? It definitely flickers more for video games, to my eyes.

Perhaps it's the nature of video games with lots of small discrete objects on the screen. But video is more of a scene, or a face.
Last edited by antron on Sun Sep 15, 2013 2:37 am, edited 1 time in total.
Fudoh
Posts: 13040
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany

Re: What's the main reason for the 480i display?

Post by Fudoh »

The first part refers to 480p30 (15kHz) -> 480i60 (15kHz). The second part is plain wrong.
Ed Oscuro
Posts: 18654
Joined: Thu Dec 08, 2005 4:13 pm
Location: uoıʇɐɹnƃıɟuoɔ ɯǝʇsʎs

Re: What's the main reason for the 480i display?

Post by Ed Oscuro »

What's wrong with the second part?

There are obviously times when the Phi effect will induce new visual artifacts (the "cut frame," "combing," or tatami mat-like effect, with a ripple across the screen) when there is apparent movement in a direction against the field refresh - i.e. horizontal, side-to-side movement. But if details are reasonably stable, visual effects (the "phi" the article mentions is certainly a candidate) will allow the perception of more detail.

It actually makes sense to put it in terms of the phi phenomenon, because refreshing things that are close together (interlaced scan) should be less noticeable than refreshing things farther apart more quickly (progressive scan) when each bright impulse (the scan) is followed by a fade to dark.

There's also actually another name for the other interlacing effect I mentioned earlier - aliasing - that I didn't know: "interline twitter."

They are certainly correct about the two fields of a frame being captured at different times, although some people might object that the convention of calling the two interlaced fields, taken together, a "frame" is wrong: each field scan originates from a totally different original signal, unless you are holding one frame for transmission in both fields (this could be done when telecining a film by scanning the same film frame for both fields, and today it is commonly done with a frame buffer, of course).

Interlaced video game consoles don't do this, nor do video cameras (running an interlaced NTSC signal), nor any pre-HD video source that I know of. The persistence of vision from one set of fields to the next means that even computer graphics for broadcast generally won't need to worry about this. You would only do it if you were designing a scene that looked better at 30 FPS with a single frame split across both fields for transmission than it would look interlaced at 60 FPS, and you would also need the technical ability to do so. Attempting to "fix" the problem of interlacing comes with an inherent delay (which, in the context of video game systems, would be lag, as the second scan of fields would be based on out-of-date data when compared with the first).
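
As a rough illustration of that inherent delay (a toy model, not any specific deinterlacer): a "weave" deinterlacer can't output a full frame until the later of its two fields has arrived, so the earlier field is always held back by at least one field period.

```python
# Toy model of weave deinterlacing delay. Fields arrive every 1/60 s;
# a woven frame can only be emitted once both of its fields are in hand.

FIELD_PERIOD = 1 / 60  # seconds

def weave_output_time(t_even, t_odd):
    """Earliest time a woven frame can be shown: when the later field arrives."""
    return max(t_even, t_odd)

t_even = 0.0
t_odd = t_even + FIELD_PERIOD
print(f"even field captured at {t_even:.4f}s, shown no earlier than "
      f"{weave_output_time(t_even, t_odd):.4f}s")
# -> the even field's content is at least one field period (~16.7 ms) old by display time
```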
kamiboy
Posts: 1982
Joined: Sat Sep 04, 2010 4:40 pm
Location: Denmark

Re: What's the main reason for the 480i display?

Post by kamiboy »

None of the reasoning behind the creation of 480i back in, what, the 30s or 40s, should be applied to contemporary hardware.

CRT displays of yore neither looked nor performed like the CRTs of the 80s and 90s that we all grew up with. The same goes for recording and broadcasting hardware. Those decisions were all made for ancient technology that performed very differently.
Ed Oscuro
Posts: 18654
Joined: Thu Dec 08, 2005 4:13 pm
Location: uoıʇɐɹnƃıɟuoɔ ɯǝʇsʎs

Re: What's the main reason for the 480i display?

Post by Ed Oscuro »

Pretty much in agreement with you there, although flicker might still be somewhat reduced in 60 FPS interlaced over 30 FPS progressive.