Well, to be more precise, the DC's 15 kHz/"TV" output is not completely stretched to 720x480; that would be too wide. But on a display calibrated to show 480i and DTV 480p at the same width, the DC's "VGA" pillarboxing is larger, i.e. the picture is narrower. On my multisync monitor, the DC's "TV" output closely matches the 4:3 aspect ratio and non-square pixels of my other analogue sources, whereas the "VGA" picture is skinnier than all of them. I have a preset that stretches 480p to compensate when I need to use it.
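For anyone curious about the numbers, here's a rough back-of-the-envelope (Python) comparing textbook VGA 640x480 timing against a BT.601-style 480i/480p raster. These are the standard figures, not measurements of the DC's actual output, so treat it as a sketch:

def active_fraction(pixel_clock_hz, active_samples, total_samples):
    # Active picture time and total line time, in microseconds.
    line_us = total_samples / pixel_clock_hz * 1e6
    active_us = active_samples / pixel_clock_hz * 1e6
    return active_us, line_us

# "VGA" 640x480@60: 25.175 MHz pixel clock, 640 active of 800 total samples per line
vga_active, vga_line = active_fraction(25.175e6, 640, 800)

# BT.601-style 480p: 27 MHz clock, 720 active of 858 total samples per line
# (480i has the same proportions at 13.5 MHz and 63.6 us lines)
sd_active, sd_line = active_fraction(27.0e6, 720, 858)

print(f"VGA : {vga_active:.2f} us active of {vga_line:.2f} us line")
print(f"480p: {sd_active:.2f} us active of {sd_line:.2f} us line")
print(f"VGA width relative to the 720-sample raster: {vga_active / sd_active:.1%}")

Both rasters have a ~31.8 us line, but the VGA picture is only lit for about 25.4 us versus roughly 26.7 us for the 720-sample raster, so it's around 95% as wide. If the monitor's H-size is set so the TV-style raster fills 4:3, VGA timing comes out a few percent narrower, which would line up with the extra pillarboxing in the DC's VGA mode described above. The DC's actual pixel clock and blanking may differ from these nominal values, but the qualitative point is the same: fewer microseconds of active video per line means a narrower picture at the same H-size.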
I think 480p is a standard that only partially belongs in the analogue realm. It came to prominence during the transition to fixed square-pixel displays, and I feel like many home consoles of that era, namely the DC, PS2, Gamecube, and Wii, didn't quite know how to optimize for it. The Wii is infamous for messing up its progressive output, and there was a lot of discussion about it in the "cloning the Gamecube component" topic.
vol.2 wrote:interlacing is sharper in many games of the era. wii goldeneye for example. the progressive scan just looks really soft and unpleasant.
Some will tell you that's the Wii's fault, not progressive scan itself. I don't think that's the whole story, though. Again, devs may not have known exactly what to do to get 480p right on CRTs, especially non-PC CRTs, whereas the 480i standard was long-established and familiar, so it wouldn't be surprising if, for software reasons, the interlaced picture came out better optimized. I still think there are effects from interlacing itself that contribute to a better picture, under the right conditions at least (i.e., no de-interlacing and/or scaling, no fixed-pixel display, etc.).
And I've never had issues with interlaced text either.