Discovered something odd recently: Time Stalkers (Dreamcast)

Link83
Posts: 344
Joined: Tue May 21, 2013 2:39 am

Re: Discovered something odd recently: Time Stalkers (Dreamcast)

Post by Link83 »

fernan1234 wrote:It does look like the actual game picture is 640x480 within a 720x480 frame when outputting 31kHz RGBHV. In contrast, when outputting 15kHz RGBHV the game picture is actually 720x480 (which IMO looks like the better aspect ratio for most game content).
From reading the SDK, the Dreamcast's actual maximum resolution is 640x480 @ 60Hz for both "TV" and "VGA" modes (unless I'm misunderstanding something). Perhaps there is some function where it's automatically scaled/stretched to fill a 720x480 frame for 15kHz but not for 31kHz, but if so I haven't found any mention of it in the SDK yet :?
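
For reference, this is roughly how the homebrew side picks between the two modes. A minimal sketch using KallistiOS (the homebrew SDK, not the official Sega SDK quoted above; the function and constant names come from its dc/video.h, so treat them as assumptions and check your own headers):

/* Hypothetical KallistiOS sketch: pick 640x480 VGA when a VGA box is
   detected, otherwise fall back to 640x480 interlaced for TV cables. */
#include <kos.h>
#include <dc/video.h>

int main(int argc, char **argv) {
    /* vid_check_cable() reads the cable-detect pins: CT_VGA for a
       VGA box, CT_RGB / CT_COMPOSITE for 15kHz TV cables. */
    if (vid_check_cable() == CT_VGA)
        vid_set_mode(DM_640x480_VGA, PM_RGB565);      /* 31kHz progressive */
    else
        vid_set_mode(DM_640x480_NTSC_IL, PM_RGB565);  /* 15kHz interlaced */

    /* ...draw as normal; both paths render into a 640x480 framebuffer... */
    return 0;
}

Note that both branches set up the same 640x480 framebuffer, which fits the SDK's stated maximum; whatever produces the 720-wide 15kHz picture would have to happen downstream of this.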
nmalinoski
Posts: 1974
Joined: Wed Jul 19, 2017 1:52 pm

Re: Discovered something odd recently: Time Stalkers (Dreamcast)

Post by nmalinoski »

Link83 wrote:
fernan1234 wrote:It does look like the actual game picture is 640x480 within a 720x480 frame when outputting 31kHz RGBHV. In contrast, when outputting 15kHz RGBHV the game picture is actually 720x480 (which IMO looks like the better aspect ratio for most game content).
From reading the SDK, the Dreamcast's actual maximum resolution is 640x480 @ 60Hz for both "TV" and "VGA" modes (unless I'm misunderstanding something)
Is there any information about the encoder and its configuration? It sounds like the console renders in 640x480, which makes sense, because that's what the DCHDMI gets from the board; but then the analogue encoder must be pillarboxing 640x480 to 720x480 in "VGA" mode, and stretching it to 720x480 in "TV" mode.
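
If that's what's happening, the geometry works out something like this (a back-of-envelope sketch, assuming a display calibrated so the full 720-sample Rec.601 line fills a 4:3 raster):

/* Compare the displayed aspect of the two cases on a display where a
   full 720-sample line maps to the 4:3 screen width. */
#include <stdio.h>

int main(void) {
    const double raster_ar = 4.0 / 3.0;  /* calibrated screen width/height */

    /* "VGA" mode: 640 active square pixels pillarboxed in 720 samples. */
    double vga_ar = raster_ar * 640.0 / 720.0;

    /* "TV" mode: the same image stretched across all 720 samples,
       i.e. the full raster (slightly wider than true 4:3 by Rec.601
       reckoning, where roughly 704 samples = 4:3). */
    double tv_ar = raster_ar;

    printf("VGA picture aspect: %.3f (vs 4:3 = 1.333)\n", vga_ar); /* ~1.185 */
    printf("TV  picture aspect: %.3f\n", tv_ar);
    return 0;
}

The ~1.185 figure for "VGA" mode would line up with the pillarboxed 31kHz picture described earlier in the thread.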
Link83
Posts: 344
Joined: Tue May 21, 2013 2:39 am

Re: Discovered something odd recently: Time Stalkers (Dreamcast)

Post by Link83 »

nmalinoski wrote:Is there any information about the encoder and its configuration? It sounds like the console renders in 640x480, which makes sense, because that's what the DCHDMI gets from the board; but then the analogue encoder must be pillarboxing 640x480 to 720x480 in "VGA" mode, and stretching it to 720x480 in "TV" mode.
There are bits and pieces about the DAC, but based on chriz2600's post here, the pillarboxing is already present in the digital data before it even hits the DAC. Presumably HOLLY must be performing some sort of scaling/stretching, with a possibly incorrect setting for "VGA" mode (much like the Wii 480p configuration bug present in the official SDK). Perhaps the compiler is doing this automatically. I'll put a few interesting SDK tidbits below:
[SDK screenshots]

Supported Resolutions
[SDK screenshots]

Sofdec (FMV) Aspect Ratios (about the only mention of 720x480 I could find)
[SDK screenshots]
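
If anyone wants to poke at this directly, here is a rough homebrew sketch for dumping the relevant HOLLY video registers in "TV" vs "VGA" mode. The register names and addresses follow community Dreamcast documentation rather than the official SDK, so treat them as assumptions and verify against your own references:

/* Dump the HOLLY video timing/output registers so the two modes can
   be diffed. Addresses are offsets from the 0xA05F8000 register block
   as given in common Dreamcast homebrew docs - verify before relying
   on them. */
#include <stdio.h>
#include <stdint.h>

#define HOLLY_REG(ofs) (*(volatile uint32_t *)(0xA05F8000 + (ofs)))

void dump_video_regs(void) {
    printf("SPG_CONTROL = %08lx\n", (unsigned long)HOLLY_REG(0xD0)); /* sync type, interlace */
    printf("SPG_LOAD    = %08lx\n", (unsigned long)HOLLY_REG(0xD8)); /* hcount/vcount per line/frame */
    printf("VO_CONTROL  = %08lx\n", (unsigned long)HOLLY_REG(0xE8)); /* output control, pixel doubling */
    printf("VO_STARTX   = %08lx\n", (unsigned long)HOLLY_REG(0xEC)); /* horizontal start of active video */
}

Diffing VO_STARTX and the SPG timing between the two modes should show whether the pillarboxing is a border/position setting or an actual scale.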
Last edited by Link83 on Wed Feb 05, 2020 5:00 am, edited 2 times in total.
vol.2
Posts: 2995
Joined: Mon Oct 31, 2016 3:13 pm
Location: bmore

Re: Discovered something odd recently: Time Stalkers (Dreamcast)

Post by vol.2 »

Fusion916 wrote: I think you're just more of a fan of a soft image. On games with low-resolution textures and very little hardware anti-aliasing, I suppose an interlaced image can look a bit more pleasing to the eyes.
Interlacing is sharper in many games of the era. Wii GoldenEye, for example. The progressive scan just looks really soft and unpleasant.
Until you play an RPG where you need to read a lot of text...
I honestly think Final Fantasy 12 looks amazing, and it's interlaced. I never had an issue with the text.
fernan1234
Posts: 2242
Joined: Mon Aug 14, 2017 8:34 pm

Re: Discovered something odd recently: Time Stalkers (Dreamcast)

Post by fernan1234 »

Well, to be more precise, the DC's 15kHz/"TV" output is not completely stretched to 720x480; that would be too wide. But on a display calibrated to show 480i and DTV 480p at the same width, the DC's "VGA" pillarboxing will be larger, showing a narrower picture. On my multisync monitor, the DC's "TV" output more closely matches the 4:3 AR and non-square pixels of my other analogue sources, whereas the "VGA" picture is skinnier than all of them. I have a preset with 480p stretched to compensate for it when I need to use it.
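
(For anyone building a similar preset, the stretch factor is just the ratio of the two active widths; a rough sketch, assuming the pillarboxing really is 640 active samples in a 720-sample line:)

#include <stdio.h>

int main(void) {
    /* Horizontal stretch needed to make the pillarboxed "VGA" picture
       match the width of the full-line "TV" picture. */
    printf("H-size factor: %.3f (about +12.5%%)\n", 720.0 / 640.0);
    return 0;
}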

I think 480p is a standard that only partially belongs in the analogue realm. It came to prominence during the transition to fixed square-pixel displays. I feel like many home consoles of that time, namely the DC, PS2, GameCube, and Wii, didn't quite know how to optimize for it. The Wii is infamous for messing up its progressive output, and there was a lot of discussion about it in the cloning the GameCube component topic.
vol.2 wrote:Interlacing is sharper in many games of the era. Wii GoldenEye, for example. The progressive scan just looks really soft and unpleasant.
Some will tell you that's the Wii's fault, and not progressive itself. I don't think that's the whole story, though. Again, devs may not have known exactly what to do to get 480p right on CRTs, especially non-PC CRTs, whereas the 480i standard was long-established and familiar, so it wouldn't be surprising if, for software reasons, the interlaced picture is more optimal. I still think there are effects from interlacing itself that contribute to a better picture, under the right conditions at least (i.e., not relying on de-interlacing and/or scaling, not using a fixed-pixel display, etc.).

And I also never had issues with interlaced text either.
vol.2
Posts: 2995
Joined: Mon Oct 31, 2016 3:13 pm
Location: bmore

Re: Discovered something odd recently: Time Stalkers (Dreamcast)

Post by vol.2 »

fernan1234 wrote:
And I also never had issues with interlaced text either.
I don't doubt that the hardware/software being used is the issue with the progressive output looking soft. I'm assuming that the interlaced games with good, sharp text and images were specifically developed to be displayed interlaced.

I do often have trouble reading text from a PC that has been interlaced. For example, I have a 1080i monitor that looks fantastic at 540p, but the text is unreadable at 1080i. I assume it has to do with the software's target resolution being progressive, and that the text looks shitty because the interlacing process throws out some information or leaves artifacts. With a game like Final Fantasy 12, they would have tweaked the process so the end result was sharp. I'm making the assumption that optimizing for one method or the other makes it hard to look good both ways.
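
A toy way to see that information loss, assuming the simplest case of a 1-pixel-high progressive detail being split into fields:

/* Toy illustration of why thin progressive-rendered detail suffers in
   an interlaced signal: a 1-pixel-high line exists in only one of the
   two fields, so it flickers at half the refresh rate (or vanishes if
   a field is dropped instead of properly deinterlaced). */
#include <stdio.h>

#define HEIGHT 8

int main(void) {
    int frame[HEIGHT] = {0};
    frame[3] = 1; /* a 1-pixel-high detail, e.g. the stroke of a glyph */

    for (int field = 0; field < 2; field++) {
        printf("field %d:", field);
        for (int y = field; y < HEIGHT; y += 2) /* even rows, then odd rows */
            printf(" %d", frame[y]);
        printf("\n");
    }
    /* The detail appears in field 1 only, so on a 480i/1080i display it
       strobes at half rate; content authored for interlace avoids it. */
    return 0;
}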