Technicalities about TV sets for gaming

The place for all discussion on gaming hardware
Post Reply
User avatar
zimeon
Posts: 89
Joined: Tue Feb 01, 2005 2:32 pm
Location: Sweden

Technicalities about TV sets for gaming

Post by zimeon »

I ripped this from the Mushihime thread.

Living in Europe, using Japanese machines, and thus battling with the PAL/NTSC problem quite a lot, combined with using Amigas and genlocks, has caused me to sort of self-learn facts about video technology. I know the NTSC signal isn't 60Hz but 59.94 (or thereabouts), I know that a standard TV set draws every other line first and the rest afterward. I know (or think I know) what the composite signal, the S-video signal and the RGB signal are.

But I've never read any theory on the subject. I can't say I truly know what interlacing is, and I certainly have (had) no idea about the true or fake low-res that ports could use, or how different TV sets react to different resolutions. Or what on earth 31kHz compatibility is.

Anyone care to fill me in? I'd really like to learn more.
bloodflowers wrote:Out of all the current CRT type TVs I tested recently, every single one had some kind of issue with interlace, aside from the Panasonic 36" which has some tearing on fast movement and an over-processed look instead. The processing can't actually keep up with 60 frames per second interlaced, I'm not sure of the algorithm used - but where you see an enemy that flashes for 1 frame when hit, you'll see it flashing alternate lines instead - shadows done by flickering a dark patch (1 frame on, 1 frame off - old cheap technique) will be done with alternate lines too. This is a Toshiba 36ZP48 by the way for anyone in the UK doing TV shopping.
What exactly is a CRT type TV? What does CRT stand for? And what other interlacing issues did you find?
bloodflowers wrote:And don't touch the Sony 36 with a 10 foot pole for games, and plasma screens suffer from horrible motion fade type blur. Yes all of them, even the very expensive ones (I know, I tested over 20 sets of different makes and types with Gradius V).
What exactly is the problem with the Sony 36?
Kron wrote:The Wega series is fine as long as you don't get a 100hz model.
What is the problem with a 100Hz machine in this case? Or in general cases?

I have to admit I had no idea sets could differ so, except of course that LCD TVs and plasma TVs could have blur problems. I have an old Philips IDTV 100Hz 36'' widescreen... Good or bad?
Enthusiasm without skill
User avatar
icepick
Posts: 443
Joined: Fri Feb 04, 2005 9:18 pm
Location: Minnesota, US

Post by icepick »

Yes, that thread is extremely popular for its... hardware discussion. :lol:

I think that I can answer two of these questions! You seem to know what interlacing is; it's the method you described, drawing the odd lines of a frame first and then the even lines. It does this at twice the frame rate, meaning that NTSC TVs can be thought of as producing video at 60 Hz (sixty times per second), which they are, but this isn't the same as 60 frames per second. Kiken was right when he mentioned it: it's thirty frames per second, but each frame is split into two fields, and the fields are presented at sixty fields per second.

This would seem all well and good, but your eyes can pick up on how half of the image is new and half of it is old, even if the rows of the new image are interlaced with rows from the old one. The only positive aspect that I can think of (which might not be true) is that the display would seem to flicker less than if it were actually progressive scan (entire frame at once, like a PC monitor), since I figure that watching a 30 Hz refresh rate would stuff your eyes into a bag and beat them, until they pulled their bonds up and strangled the refresh rate with them, running off to an opera house and becoming angels of music.


Note that this isn't the same thing as scan lines, which I myself am not even perfectly clear on. I figure that scan lines are either intentionally putting only odd or even lines into progressive scan video, to sort of "fake" the effect of interlaced video, or it's the presentation of interlaced video on a progressive scan screen, using only the odd field or only the even field. So, you're still seeing the same picture, but the interleaved half is removed, so there isn't any "comb-like" effect during motion, but there's also technically only half of the vertical resolution being displayed (remove the even lines, and a 320 x 200 image is effectively 320 x 100, only with black lines interspersed to take up the screen area of 320 x 200).

Then, since the display (such as a PC monitor) is already being refreshed at a different rate (such as 60 Hz), there isn't any flicker, but the interlaced/scan line video is still presented at 30 frames per second. This can all be very confusing, and I hope that I'm explaining it well enough (and actually know what I'm talking about). Here's a page on the "interlacing" part of the equation:

http://neuron2.net/LVG/interlacing.html
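As an aside, the field-splitting and "scan line" effects described above can be sketched in a few lines of Python. This is a toy illustration only, not real video code; the line counts are the ones from the post.

```python
# Toy illustration of interlaced fields and "fake" scan lines.
# Not real video code -- it only shows the bookkeeping described above.

def split_into_fields(frame):
    """Split a frame (a list of scan lines) into its two fields."""
    odd_field = frame[0::2]   # 1st, 3rd, 5th... lines
    even_field = frame[1::2]  # 2nd, 4th, 6th... lines
    return odd_field, even_field

def fake_scanlines(field, black=None):
    """Show one field on a progressive screen, interspersing black
    lines to keep the original screen area (the 'scan line' look)."""
    out = []
    for line in field:
        out.append(line)
        out.append(black)
    return out

# A toy 480-line frame: each "line" is just its line number.
frame = list(range(480))
odd, even = split_into_fields(frame)

print(len(odd), len(even))        # 240 lines per field
print(len(fake_scanlines(odd)))   # back to 480 rows, half of them black

# Two fields make one full frame, so 60 fields/s is ~30 full frames/s:
print(60 / 2)                     # 30.0
```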

I think that I myself just realized the issue with "true" and "fake" resolution. If people in the know are saying that some consoles have to produce an image at, say, 640 x 480, and they're then asked to produce a port of a Genesis/Mega Drive game, then they must produce a 320 x 224 image which would ideally use up the whole screen. Since this, doubled in size, would be 640 x 448, it would leave a black horizontal bar 32 lines tall at the bottom (or 16 at the top and bottom), and the picture would appear slightly squashed, when mapped onto the 640 x 480 display. I get it. (Or do I?)

So, the only way to get it to fit the whole screen would be to use some sort of scaling method, which would lessen the quality of the video. If a console could simply output the 320 x 224 signal, it would logically be made to cover the whole screen, without scaling, but it wouldn't work properly with such devices as LCDs (and some HDTVs? I'm not sure how those work, yet). I'm also sure that even if the original was 320 x 240, and it was scaled with a pixel resize to 640 x 480, it still wouldn't look the same as the actual 320 x 240 resolution being displayed on the screen.
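The arithmetic above can be written out explicitly. This is just a toy calculation using the numbers from the post; nothing console-specific is assumed.

```python
# The scaling arithmetic from the paragraph above, written out.
# Numbers are the ones in the post: a Genesis/Mega Drive 320 x 224
# image placed into a 640 x 480 frame.

src_w, src_h = 320, 224
dst_w, dst_h = 640, 480

doubled_w, doubled_h = src_w * 2, src_h * 2   # pixel-doubled size
print(doubled_w, doubled_h)                   # 640 448

leftover_lines = dst_h - doubled_h
print(leftover_lines)                         # 32 -> e.g. 16 blank lines
                                              # at top and bottom

# Filling all 480 lines instead would need a non-integer scale factor,
# i.e. uneven line duplication (which is why it looks worse):
print(dst_h / src_h)                          # ~2.14
```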

CRT stands for "Cathode Ray Tube," and is essentially the "glass tube"-type display. It contrasts with displays like LCDs in that any resolution can be displayed at any size in full quality on the screen, whereas LCDs (and other fixed-pixel displays) can display only their native resolution (the actual number of pixels built into the display) without greatly sacrificing visual quality.

Another downside to LCDs (and plasma, apparently?) is that it takes a while for the individual pixels to turn on and off, as opposed to instantly, so what you end up with is a "trailing" effect, an exaggerated motion blur, which makes games quite a bit less fun to play if you're trying to play well.

Also, technically, LCD pixels don't "turn on" and off; there's a constant light on behind them, meaning that it must be covered up to create darker colors (or black). This means that black isn't ever really black, at least not in the way that it can be with a CRT. You can adjust the brightness and contrast on an LCD all you like, and you'll never end up with a fully black screen, which means that darker images can look very washed out.

So, you know the problem with the "Sony 36" as well, since you're aware of the blur problems with LCD and plasma TVs.

As for 100 Hz... hmm... is this the "light gun incompatibility issue" thing? That's my best guess.

(Edit: After reading something else that bloodflowers posted, I'm guessing that 100 Hz TVs, while having a non-flickering display, have to calculate how/when to display each frame/field of the source video, thus causing a delay between input and output, which is probably acceptable for watching television and film, especially if the sound is delayed by the appropriate amount. However, for games, it would be death.)

Wow. That's a lot of text. I'd like to pose a couple of questions as well (if anybody will even see them):

1) Do these RGB monitors that everyone talks about display interlaced video? Or are they basically VGA monitors? I was thinking it was basically the highest-quality analog interlaced video possible, but I'm feeling a little bit confused now. Arcade machines use RGB monitors, right? So they are... interlaced displays? I haven't been to an arcade in a while. :oops:

2) Next... do RGB-capable televisions simply use composite sync, or something else? And if so, what I'm getting at is this: if one were to produce a color-difference component video signal from an RGB signal and composite sync, would it appear properly on modern televisions (at least CRT models), and be comparable to an RGB monitor? I'm thinking that it's probably not worth the trouble, that the time it takes to transcode the signal would cause some sort of annoying delay, and that it would simply be easier and almost as good to use S-video.

Not to hijack the thread, or anything. 8)
\\ /\/\ \
User avatar
system11
Posts: 6273
Joined: Tue Jan 25, 2005 10:17 pm
Location: UK
Contact:

Re: Technicalities about TV sets for gaming

Post by system11 »

zimeon wrote:What exactly is a CRT type TV? What does CRT stand for? And what other interlacing issues did you find?
CRT = Cathode Ray Tube - those big heavy glass screen ones we all know and love. Interestingly, a good one still gives a better picture overall than even the most expensive plasmas.
zimeon wrote:What exactly is the problem with the Sony 36?

What is the problem with a 100Hz machine in this case? Or in general cases?
The 100hz Sony ones suffer from a big delay between what is being fed into the inputs and what appears on the screen - like it's streaming it through (which I suspect it is). This delay is approximately half a second. Doesn't sound like much; in practice it is a huge issue. Take Rainbow Six for example. You're turning too late, and too far. Half a second is ample time for someone to lean around a corner and kill you before you even see them. Some plasmas I tested suffer this too, and the JVC 36".

All the 100hz screens have picture processing of some kind, some worse than others. As I said in the other thread - the Toshiba only draws every other line of a single interlaced frame, with the result that only every other line of a one-frame change is seen. I didn't test this on other models so I can't comment on whether or not they do the same thing. With the JVC you can almost see it building the picture in processing; it was terrible. The Sony has the aforementioned lag. The Panasonic apparently suffers motion tearing when things go -very- fast, but I'd already decided against that one as the geometry was suspect on all the sets I tried, and I saw some very strange processing testing with a DVD - it tries to do too much.
zimeon wrote:I have to admit I had no idea sets could differ so, except of course that LCD TVs and plasma TVs could have blur problems. I have an old Philips IDTV 100Hz 36'' widescreen... Good or bad?
I couldn't find any Philips sets on my travels that were still working and connected. The new ones have a poor track record with reliability, I'm told. I believe they got fair reviews though. While all CRTs are mostly the same in operation, it's the picture processing companies have added to "improve" pictures in films that has screwed gamers - this is where the differences are.

Again as mentioned in the other thread - every single plasma or LCD I tried had blur problems, some worse than others - even the one someone on NTSC-UK said had NO blur actually did. I must have sharper eyes for that sort of thing, or just know what to use as a decent test. The Panasonic ones are very close to acceptable in blur terms, and actually -are- acceptable if you don't mind fast-moving backgrounds with hard edges having a little motion blur. Of course, plasmas also burn; I saw several with Sky logos burned into the corner... DLP sets also suffered similarly with blur - a shame really, as the pictures are as good as a CRT with none of the other problems; I almost bought one. Some people can see 'rainbows' when watching DLP, though.

I learned much more than I ever wanted to, trying to find a decent basic television with a big screen. No single unit is perfect for gaming; if you need a big screen but want to use it for games too, you just have to pick which flaw bothers you least. Noticing that every other bar of a shadow is missing in interlaced games bothers me less than the other problems - so I bought the Toshiba. As a bonus, that set also has Dolby 5.1 built in, and comes with a set of speakers too - great for those who would like an entry-level home cinema setup without the hassle and cost of other solutions. It's the cheapest one too, and has the best geometry of all the 36s except for the Sony. No picture-in-picture, but it does have 3x SCART inputs (2 are RGB), plus component and digital audio inputs. It comes with the best stand too, 3-shelf. The speaker stands must be bought separately (Toshiba Prostand 3) - but they're very cheap.
System11's random blog, with things - and stuff!
http://blog.system11.org
User avatar
zimeon
Posts: 89
Joined: Tue Feb 01, 2005 2:32 pm
Location: Sweden

Post by zimeon »

Thanks a lot for the exhaustive answers.

Just to see that I've understood correctly... since you mention "interlaced games", I take it that games can be programmed to output different resolutions? Meaning the consoles themselves do not have a specific resolution that all games run in?

And you're further saying that a CRT TV can accept any (or at least several) resolutions, whereas a plasma TV or an LCD TV has to sort of reconstruct the picture to fit its native resolution? Does this resolution have anything to do with the NTSC 525/PAL 625 vertical lines on a CRT TV?

I don't know what resolution Mushihime runs in; I'll just say 320*240. If I've understood correctly, the "fake" low-res is that the machine is really emitting a, say, 640*480 signal, but the smallest pixel is a 2*2 block, instead of emitting a true 320*240 signal, right?

So what exactly do the manufacturers gain by using this method? It would seem easier to use the game's original resolution instead of emulating it...

And I'm still a bit at a loss as to how emulating a low-resolution picture in a higher resolution will deteriorate the picture...

I'll just whip some other comments I was wondering about...
jiji wrote:The problem is that when scaled up, these games are running at a resolution that must be interlaced for display on non-HD televisions.
If I've understood right, interlacing is a method one uses to sort of create a higher resolution than the monitor can take, right? Using sort of the technique of alternating between two pictures while sort of "shaking" the screen vertically half a vertical line at the same frame rate as the picture change? At least this is what I learned that interlacing meant on the Commodore Amiga -- improving the resolution while creating a sort of flickering picture.

So... what's being said is, that if these games are not presented in their true low resolution, CRT TV sets will have to interlace the picture in order to create the high resolution that the, say, PS2 emits? And this is sad since the games are not of that high resolution to begin with? Am I right here?
Recap wrote:Your explanation brings to think that a fake low res game (non interlaced, if you prefer) looks OK with a HD monitor, which actually doesn't.
Wait... why would a fake low-res (doubled pixels) look bad on an HD monitor? The HDTV would have to scale a low-res picture anyway, right? What's the difference whether the machine emitting the video signal or the HDTV itself does this scaling?
Jiji wrote:I wouldn't complain nearly so much about these scaled-up ports if they were simply pixel-doubled with no interlacing, but I play my games on a 13" 1084, not an HDTV (and I've heard very bad things about how low-res games look on those).
Pixel-doubled with no interlacing? Is "no interlacing" an alternative if the emitted resolution is high (even if it's a fake low-res)? Is no-interlacing at all an alternative if playing on a CRT TV? Don't all CRT TVs interlace in order to make up for their poor resolution?

Wait... is it the console that decides whether the picture is to be interlaced or not? Can a PS2 be set to emit a, say, interlaced 640*480 or non-interlaced 640*480? It isn't the video monitor that decides whether a picture is to be interlaced, then?

Excuse me for being dumbass stupid but...
Jiji wrote:He wondered why I was using a Commodore monitor and why I couldn't just use a new HDTV that used component video, and then said something about how we need to keep up with the times. I had to bite my tongue. :p
And if I've understood correctly, this was only because of the blur effects that plasma TVs have? I mean, isn't component video supposed to be of equal picture quality to an RGB signal (or am I really stupid again)? Not that I've seen a component output for the PS2, but still...

Hm... I'm getting more confused by the minute here...
Enthusiasm without skill
User avatar
Recap
Posts: 363
Joined: Tue Apr 19, 2005 10:13 am
Location: Spain
Contact:

Post by Recap »

I have no time to read the whole thread, but it seems you're still a bit confused. Things are easier than they seem. Without getting too "technical":

- 15 kHz resolutions are low resolutions (the ones smaller than 320 x 256, if I recall) while 31 kHz resolutions are the ones of 640 x 480 and up (that is, "hi res"). There's also a 24 kHz resolution used by some old arcade games (mostly Sega games), but that's not relevant these days.

- So there are monitors suitable for low-res systems (the ones which work at 15 kHz, like standard CRT TVs, standard arcade monitors, the Commodore 1084) and monitors suitable for hi-res systems (standard VGA monitors, HDTVs). So 31 kHz = VGA.

- A 15 kHz CRT displays *any* low resolution properly. A VGA CRT displays *any* hi res properly. A 31 kHz plasma TV/TFT/LCD monitor uses fixed frequencies/resolutions, so it doesn't display *any* resolution the way a CRT does.

- So to display 31 kHz resolutions at 15 kHz, and vice versa, there are "tricks" which will always affect the image quality.

- Interlacing. It's used to display a high resolution on a low-res TV. That is, to transform a 31 kHz signal into a 15 kHz one. 15 kHz monitors will support a max of 800 x 600i.

- Scaling-up. Used to display a native low-res image on a hi-res monitor. It's exactly what MAME and your graphics card do if you use a standard VGA monitor with any low-res game. A plasma/LCD TV will automatically do the same with a low-res console's picture. A pixel-doubled picture.

- Home consoles obviously work at different resolutions. Some are low-res only (MD), some are hi-res only (GCN), some are both (PS2). The console's video output and software development determine whether hi-res games are actually displayed in true hi res (31 kHz) or in "fake hi res" (interlaced). The PS2 is well known for its 31 kHz-unfriendly games; the DC, just the opposite.

- Now comes the evil thing. Any hi-res machine is able to display any low resolution in full screen thanks to its scaling capabilities. A low-res picture is scaled up in both dimensions until it fills the screen, which is pretty useful if you are porting a 15 kHz game and want, for obvious reasons, to keep the original aspect ratio. Pretty ugly, since you're seeing drawn lines which shouldn't be there. And I'm not even speaking about the picture in motion.

- That scaled-up (usually filtered) hi-res picture can again be output at its now "natural" 31 kHz or at its now "unnatural" 15 kHz with interlacing. So yep, in the latter case, they're readapting the picture again to be output at 15 kHz. Wicked.

- Keep in mind that a picture's aspect ratio is not only determined by the resolution but also by its pixels' shape, which is usually pretty different between the different low-res systems. Autoscaling to preserve the aspect ratio isn't as easy as it sounds, so devs obviously take the easy way.
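The scaling-up described above can be sketched with a toy nearest-neighbour pixel-doubler. This is a hedged illustration only: no real console scaler works on Python lists, and real scalers usually filter rather than plain-double.

```python
# A minimal nearest-neighbour "scale-up" of the kind described above:
# every source pixel becomes a 2 x 2 block. A sketch, not any
# console's actual scaler.

def pixel_double(image):
    """image is a list of rows; each row is a list of pixel values."""
    out = []
    for row in image:
        doubled_row = []
        for px in row:
            doubled_row.extend([px, px])   # double horizontally
        out.append(doubled_row)
        out.append(list(doubled_row))      # double vertically
    return out

src = [[1, 2],
       [3, 4]]
print(pixel_double(src))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```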
User avatar
GaijinPunch
Posts: 15660
Joined: Mon Jan 31, 2005 11:22 pm
Location: San Fransicso

Post by GaijinPunch »

The 100hz Sony ones suffer from a big delay between what is being fed into the inputs, and what appears on the screen
What did you make of the 50hz Wega stuff? Recap posted in my forums that 'Wegas are 100 hz', which I think he might've mistyped. I'm seeing 100 hz and non-100 hz models on some sites. Wondering if the non-100 hz is worth picking up.
User avatar
nZero
Posts: 2605
Joined: Wed Jan 26, 2005 1:20 am
Location: DC Area
Contact:

Post by nZero »

Recap wrote: - 15 kHz resolutions are low resolutions (the ones smaller than 320 x 256, if I recall) while 31 kHz resolutions are the ones over 640 x 480 (thst is, "hi res"). There're also 24 kHz resolution used by some old arcade games (mostly, Sega games), but that's not relevant these days.
15.75khz and 31.5khz are merely horizontal scan rates - the ability to send x amount of information in y slice of time. At a 60hz vertical refresh, a 15.75khz signal can send approximately 262 lines. Resolutions requiring more than 480 lines of vertical resolution at 60hz are going to require a scan rate higher than 31.5khz to send that information in.
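This arithmetic can be checked in a couple of lines. The exact figures below (15.734 kHz for NTSC, 31.5 kHz for VGA, both at 60 Hz) are the commonly cited standard rates, assumed here rather than taken from the post.

```python
# Back-of-the-envelope check of the scan-rate arithmetic above:
# at a given horizontal scan rate, how many lines fit into one
# vertical refresh period?

def lines_per_refresh(h_freq_hz, v_freq_hz):
    """Number of scan lines drawn during one vertical refresh."""
    return h_freq_hz / v_freq_hz

# NTSC-ish: ~15.734 kHz horizontal, 60 Hz fields
print(round(lines_per_refresh(15_734, 60)))   # 262 lines per field

# VGA-ish: 31.5 kHz horizontal, 60 Hz progressive
print(round(lines_per_refresh(31_500, 60)))   # 525 lines per frame
```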
User avatar
system11
Posts: 6273
Joined: Tue Jan 25, 2005 10:17 pm
Location: UK
Contact:

Post by system11 »

GaijinPunch wrote:
The 100hz Sony ones suffer from a big delay between what is being fed into the inputs, and what appears on the screen
What did you make of the 50hz Wega stuff? Recap posted in my forums that 'Wegas are 100 hz', which I think he might've mistyped. I'm seeing 100 hz and non-100 hz models on some sites. Wondering if the non-100 hz is worth picking up.
My TV shopping expeditions were only recent - I didn't see -any- 50hz stuff on shop floors except smaller screens that I wasn't looking at. Can't help there I'm afraid.

As for the interlace on Toshiba issue - I checked - my old Trinitron does the same thing, and is definitely not new enough to have processing built in. Only reason I'd never seen it before, is the picture on the new TV is quite a bit sharper.. So there's nothing wrong with the Tosh at all (after you tinker in the service menu to reduce the ntsc overscan). Thumbs up!
System11's random blog, with things - and stuff!
http://blog.system11.org
User avatar
GaijinPunch
Posts: 15660
Joined: Mon Jan 31, 2005 11:22 pm
Location: San Fransicso

Post by GaijinPunch »

I'm currently looking at TVs I can buy in the States (imported from somewhere over on your side of the world). There are some Wegas that come in 100 hz and (presumably) 50 hz models. This seems to be one of the few multisystem TVs with SCART input I can find in the US. Even then, it's pricey - probably 150 pounds more than you'd pay in the UK.
User avatar
zimeon
Posts: 89
Joined: Tue Feb 01, 2005 2:32 pm
Location: Sweden

Post by zimeon »

Thanks again for the very exhaustive answers, Recap. I might be starting to get this.

However:
nZero wrote:15.75khz and 31.5khz are merely horizontal scan rates - the ability to send x amount of information in y slice of time. At a 60hz vertical refresh, a 15.75khz signal can send approximately 262 lines. Resolutions requiring more than 480 lines of vertical resolution at 60hz are going to require a scan rate higher than 31.5khz to send that information in.
Aha, so the "15kHz" is referring to the number of lines drawn per second. Hey, this starts to make sense. I've heard the statement that the NTSC picture has not 525 lines but, to be exact, 262.5 (whereas the PAL system has 312.5).

However, I'm still wondering about one thing: if a CRT TV receives a 31.5kHz signal, that signal should be able to contain information for all 525 lines, right? How does the TV react then? Progressive scan? Or do CRT TVs always draw every odd line first and every even line second?

And one thing about 100Hz TVs: that makes sense in PAL land, where the standard frequency is 50Hz. But when displaying a native 60Hz signal, do 100Hz TVs go into a 120Hz mode? Or do they (the horror) try to recalculate a 60Hz signal into 100Hz? Is this, maybe, the cause of the "time delay"?

If someone could clarify these two questions for me, I'd be very grateful.
Enthusiasm without skill
User avatar
system11
Posts: 6273
Joined: Tue Jan 25, 2005 10:17 pm
Location: UK
Contact:

Post by system11 »

zimeon wrote: And one thing about 100Hz TVs, that makes sense in PAL land where the standard frequency is 50Hz. But when displaying a native 60Hz signal, do 100Hz TVs go into 120Hz mode? Or do they (the horror) try to recalculate a 60Hz signal into 100Hz? Is this maybe, the cause of the "time delay"?
I couldn't find an answer to this myself, despite extensive reading and checking specs. What I did find is that you're better off finding a TV where the 100hz mode can be turned off (for example, the Toshiba offers progressive/natural/active/100hz - prog is a non-100hz mode). The current 36" JVC is particularly bad in this respect, as it resamples the whole picture to 50hz even if it's 60. You can probably guess how awful this looks in games, although for DVDs it was fine. I didn't find a single set where you can turn off the processing entirely - it's simply not an option now, so you have to pick one that doesn't cause a lag. To save a lot of effort, if you go TV shopping, take your console with you - an NTSC one.
System11's random blog, with things - and stuff!
http://blog.system11.org
User avatar
GaijinPunch
Posts: 15660
Joined: Mon Jan 31, 2005 11:22 pm
Location: San Fransicso

Post by GaijinPunch »

Don't you think I'd be (relatively) safe avoiding a 100hz TV altogether and going for a standard 50hz one? Getting a 50hz TV with SCART input in the States is looking like it will take an act of God. The only one that I can pretty much order is the non-100hz Wega for $600. I really have no way of testing this, since I live on an island.
User avatar
system11
Posts: 6273
Joined: Tue Jan 25, 2005 10:17 pm
Location: UK
Contact:

Post by system11 »

If you can find a non-100hz set, grab it!

My choice was severely limited as I needed a 36" widescreen set. Why? Because I'm used to a 28" 4:3 set for gaming, which this TV was replacing - you have to go up to 36" to get the same size 4:3 picture as on the 28...

It looks awesome for films too ;-) The Toshiba has a built-in subwoofer, and when you connect the (supplied) surround speakers, the TV becomes a center speaker too. I just finished hooking this up and tested with Tomb Raider (recorded off digital cable) - it sounded great in the scene near the end in the woods.
System11's random blog, with things - and stuff!
http://blog.system11.org
User avatar
zimeon
Posts: 89
Joined: Tue Feb 01, 2005 2:32 pm
Location: Sweden

Post by zimeon »

Vaguely related, maybe but still...

How about projectors? What technique do they use, and how will this affect gaming? I have a feeling that most modern projectors have their own native resolution, but are there any that don't?
Enthusiasm without skill
User avatar
GaijinPunch
Posts: 15660
Joined: Mon Jan 31, 2005 11:22 pm
Location: San Fransicso

Post by GaijinPunch »

Jesus... 36"? How the hell do you TATE that thing w/o breaking your back? The one I'm looking at is 29" and will be amply heavy I'm sure.

EDIT:
This is the one I'm looking at. Two SCART inputs, 50hz love. Unless someone can tell me why not to, I'll be ordering this one likely tomorrow.
User avatar
system11
Posts: 6273
Joined: Tue Jan 25, 2005 10:17 pm
Location: UK
Contact:

Post by system11 »

GaijinPunch wrote:Jesus... 36"? How the hell do you TATE that thing w/o breaking your back? The one I'm looking at is 29" and will be amply heavy I'm sure.
You answered your own question there - you don't TATE an 85-kilo widescreen. I didn't turn my old 29" 4:3 either - you'd be surprised just how heavy they really are. If it has side speaker enclosures you'll need to be careful about cracking the sides and about instability. A 29", by the way, is again a good 50-60 kilos depending on the model.
System11's random blog, with things - and stuff!
http://blog.system11.org
User avatar
GaijinPunch
Posts: 15660
Joined: Mon Jan 31, 2005 11:22 pm
Location: San Fransicso

Post by GaijinPunch »

Yeah, it says it's 150 pounds shipping weight, so I figure at least 130 pounds for the TV alone. I had a 24" that I could TATE in my sleep, so I'm sure with a little TLC I can do a 29". 36" just ain't happening though. With my luck, it would fall on me and kill me. Then again, I managed to step on a sea urchin surfing one time.
User avatar
Kron
Posts: 475
Joined: Tue Feb 01, 2005 6:45 pm
Location: UK

Post by Kron »

GaijinPunch wrote: EDIT:
This is the one I'm looking at. Two SCART inputs, 50hz love. Unless someone can tell me why not to, I'll be ordering this one likely tomorrow.
That is a perfect 4:3 ratio 50/60hz TV which is ideal for low-res RGB, as it has no filters or artificially added picture effects. Only one of the SCART sockets will be RGB-enabled if it's a standard imported Euro model.

Remember to use a transformer if your standard mains power is 110v as it'll need 220/240v.
neorichieb1971
Posts: 7676
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

If you want close-to-RGB picture quality on a TV in the States, you should buy an RGB-to-component transcoder. Mine cost $160. Then you can just use a 15khz component video TV. You can get a 21" for $200 these days.

You will need a Euro SCART lead for each console, modified with stereo outputs (if your consoles have no L+R for sound). The benefit to this is that you will always get component-quality video, you can ship SCART leads more easily than TVs, and the quality is about the same.

Importing a TV from Europe is mega expensive; you need all kinds of transformers, and most of the stuff will not be warrantied.
This industry has become 2 dimensional as it transcended into a 3D world.
Post Reply