MAME on a TV
I'm going to buy a TV to play games on, because a 17" monitor is right at the threshold of 'annoyingly small'...
I plan to use an S-video to composite cable (I have an NVIDIA 8800) and configure the TV as a second "monitor". Is this going to work?
Please post a picture!
I'm using the same setup: a laptop's S-Video out into the TV's yellow composite jack. The picture quality is a bit weird, though. If I go to the screen test in MAME games, during the white grid there's a ripple effect running through the grid.

Also, is there any way to test whether my TV supports 15kHz or not?

joeboto wrote: Dragon Blaze test screen. Have you tried using the Soft 15KHz software? I didn't see any improvement from it, probably because of the S-Video to RCA cable.

I'm using just an S-Video cable at this point (it turns out the TV has an S-Video jack) and the picture isn't that distorted. I don't understand the configuration procedure, though, and I don't even know if Soft 15KHz would work with my setup...?
captpain wrote: Still, I don't give two shits about the snobbery.

No snobbery intended. I'm not rich enough to own a house full of Japanese cabs (I have one Chinese knockoff cab, which took me ages to save for) and original PCBs, and I too play a lot of games on TV via consoles with interlaced pictures.
It's quite a step up.
I still think interlaced pictures suck. There are no consumer video cards with TV-out chips that support progressive scan pictures at 15KHz, and it stinks.
If someone could write a driver or other bit of software that gave you total control over your TV-out signal generating chip, they'd be my personal hero. Alternatively, if someone could make a RGB->S-Video converter that was reasonably priced, they'd also be my hero.
joeboto wrote: I'm using the same setup, a laptop S-Video out into the yellow TV jack. The picture quality is a bit weird; in the MAME screen test, during the white grid, there's a ripple effect through the grid. And is there any way to test whether my TV supports 15kHz or not?

Yes, your TV supports 15kHz. If it didn't, you wouldn't be seeing a picture.
The interference you see is just that: interference. You can try better cables (thicker shielding, better-quality connectors) and moving to S-Video instead. Both will give you a better picture.
Also, as a general bit of advice, make sure your TV and your computer are plugged into the same power point / power board. Slight ground differences can cause ripple effects too.
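As a rough back-of-the-envelope check (my own sketch, not something from the thread): a standard-definition CRT TV expects a horizontal scan rate of roughly 15.7kHz, and you can compute that rate from a video mode's timing numbers. The 13.5 MHz / 858 / 525 figures below are the standard NTSC-style timings, used purely as an illustration.

```python
# Sketch: compute the horizontal and vertical scan rates of a video mode
# from its timing numbers, to see whether it falls in the ~15.7 kHz range
# that a standard-definition CRT TV expects.

def scan_rates(pixel_clock_mhz, h_total, v_total, interlaced=False):
    """Return (horizontal_khz, vertical_hz) for a video mode."""
    h_khz = pixel_clock_mhz * 1000.0 / h_total  # lines drawn per second, in kHz
    v_hz = h_khz * 1000.0 / v_total             # full frames per second
    if interlaced:
        v_hz *= 2                               # two fields per frame
    return h_khz, v_hz

# Standard NTSC-style timings: 13.5 MHz pixel clock, 858 total pixels per
# line, 525 total lines, interlaced -> ~15.73 kHz / ~59.94 Hz.
h, v = scan_rates(13.5, 858, 525, interlaced=True)
print(f"{h:.2f} kHz horizontal, {v:.2f} Hz vertical")
```

If the computed horizontal rate is up around 31kHz (a typical VGA mode), the TV can't display it natively; the TV-out chip is downconverting for you.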
elvis wrote: No snobbery intended. I'm not rich enough to own a house full of Japanese cabs and original PCBs, and I too play a lot of games on TV via consoles with interlaced pictures.

Ah, OK, sorry about that.

So my best picture is going to come from:
a) Nice cables (I'm using a kind of flimsy S-Video cable that came with my video card, so maybe I should try a new one?)
b) The same power source (already done)
Is there anything else I can do?
I assume my 2004-ish CRT TV is only capable of one resolution, right? So I must already be using it, and Soft 15KHz won't do anything, I guess.