Sony Bravia LCD TVs

The place for all discussion on gaming hardware
User avatar
DC906270
Posts: 993
Joined: Wed Apr 27, 2005 7:34 pm
Location: THE UK!!!

Sony Bravia LCD TVs

Post by DC906270 »

Anyone have any opinions, things to watch for, or pros/cons of the Sony Bravia LCD TVs for use with shmup consoles (PSX/Saturn/PS2/Dreamcast/Xbox 360/MAME etc.)? I'm thinking of getting a Sony KDL-40L4000, a 40" widescreen 1080p Full HD Bravia LCD TV. It has an RGB SCART input, which is an obvious plus point.
User avatar
Fudoh
Posts: 13015
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany
Contact:

Post by Fudoh »

The lower Bravia series are really not good. A nice choice for gaming is the Z series, or the W if you don't have lots of money. But stay away from the rest of the current lineup.
User avatar
DC906270
Posts: 993
Joined: Wed Apr 27, 2005 7:34 pm
Location: THE UK!!!

Post by DC906270 »

Care to expand on why the cheaper ones are not so good?
User avatar
Fudoh
Posts: 13015
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany
Contact:

Post by Fudoh »

Yes, a few reasons. First, due to the lower picture processing quality it's not worth using the SCART input on those. There are only a few select LCD and plasma TVs out there which offer nice RGB quality. Even my Sony X series model is rather borderline: RGB via SCART is OK, but it's far from breathtaking and nowhere near as good as running it through an XRGB unit. Second would be the input lag introduced by the internal processing. The lower Bravia models do not have a dedicated game mode, yet they still suffer from 2-3 frames of lag depending on the input. Last, there's the question of how well the L4000 handles smearing of moving objects. The set probably does not have a 120/240Hz function, which would make it a subpar choice even for modern games.

I know it's tempting to buy a good-looking, nicely priced LCD set, but you should put a little more work into researching it. Depending on your Xbox 360 version you need a low-lag HDMI input. VGA is most likely fast out of the box, but to keep your older systems at the same "lag level" you need an XRGB or something similar to give you a lag-free RGB to VGA conversion.
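To put rough numbers on that 2-3 frame figure, here's a quick back-of-the-envelope sketch in Python (assuming a 60Hz signal, so roughly 16.7ms per frame):

# Convert frames of display lag to milliseconds, assuming a 60 Hz signal.
FRAME_TIME_MS = 1000.0 / 60.0  # ~16.7 ms per frame
for frames in (1, 2, 3):
    print(f"{frames} frame(s) of lag = {frames * FRAME_TIME_MS:.1f} ms")
# 1 frame(s) of lag = 16.7 ms
# 2 frame(s) of lag = 33.3 ms
# 3 frame(s) of lag = 50.0 ms

So a set with 2-3 frames of processing lag adds roughly 33-50ms before your inputs show up on screen, on top of whatever the game itself does.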
User avatar
DC906270
Posts: 993
Joined: Wed Apr 27, 2005 7:34 pm
Location: THE UK!!!

Post by DC906270 »

OK, so a Sony KDL32W5500U would be a better choice? It has a "game" mode, Motionflow (?), blur reduction, and 100Hz capability. It doesn't mention 240Hz though; is that the standard frequency?
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

I have the W4500 and it's awesome. I don't know how it is for shooters though; for that I use my cabs or my Hantarex monitor. I wouldn't buy an HDTV for shmupping. For a start, 16:9 is a retarded aspect ratio for shmupping.
This industry has become 2 dimensional as it transcended into a 3D world.
User avatar
Fudoh
Posts: 13015
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany
Contact:

Post by Fudoh »

The W5 series is pretty nice. They all have 100/120Hz. 200/240Hz is Z-Series exclusive.
User avatar
DC906270
Posts: 993
Joined: Wed Apr 27, 2005 7:34 pm
Location: THE UK!!!

Post by DC906270 »

Neorichieb, I don't have the luxury of a lot of space for a CRT though, so my only option is an LCD for TV and gaming duties combined. You can opt for a 14:9 display mode; this should be okay for shmupping, it just won't be full screen?
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

I'm buying a few Hansprees over the summer: 19" 4:3 LCDs. I bought a PC monitor stand with a rotate mechanism on it, and the Hanspree comes with all connections. It's probably a piece of crap, but I'll try it and use it for odd duties around the house since it's small.
This industry has become 2 dimensional as it transcended into a 3D world.
User avatar
system11
Posts: 6276
Joined: Tue Jan 25, 2005 10:17 pm
Location: UK
Contact:

Post by system11 »

I'll have to be honest, I haven't been able to detect any input lag on my 40X3000 using RGB SCART, and I don't even switch that awful game mode on. The test was to run things like Gradius 5 and tap the controller quickly and repeatedly; it seemed fine. It was one of the main reasons I bought the set.

The best thing for the OP to do is what I did: take your console to TV shops and try things out until you get one you're happy with. If you're serious and they won't let you do it, they don't want the sale. I had to go through seven, and that was after doing a bit of homework just to decide which ones to try.
System11's random blog, with things - and stuff!
http://blog.system11.org
PC Engine Fan X!
Posts: 8449
Joined: Wed Jan 26, 2005 10:32 pm

Post by PC Engine Fan X! »

bloodflowers wrote:I'll have to be honest, I haven't been able to detect any input lag on my 40X3000 using RGB SCART, and I don't even switch that awful game mode on. The test was to run things like Gradius 5 and tap the controller quickly and repeatedly; it seemed fine. It was one of the main reasons I bought the set.

The best thing for the OP to do is what I did: take your console to TV shops and try things out until you get one you're happy with. If you're serious and they won't let you do it, they don't want the sale. I had to go through seven, and that was after doing a bit of homework just to decide which ones to try.
I'm not so sure that would be allowed in US-based specialty A/V stores. Perhaps a quick call to ask management whether they would accommodate such a special favor before making such an expensive big-ticket purchase. Yep, buying an HDTV monitor does take some proper homework and hands-on trial to see if it's the right one for you.

Many years ago, when I got my first 20" CRT-based TV monitor, the specialty A/V store dealer had a purchase clause that said that if you weren't satisfied with your purchase, you could take it back (even if it was opened) and get credit towards something else. (Nowadays, most big-box A/V retailers don't play that particular game anymore -- once you open up an expensive big-ticket item, you can't take it back unless it is defective.) Well, I did just that, twice -- getting a more expensive 20" TV monitor each time. I finally settled on a nice Sony 20" Trinitron TV monitor with S-Video input -- basically, a dedicated gaming console monitor. This was before the days of the internet (and researching CRT-based TV monitors online), so I'm sure it was quite a hassle for the dealer to be so accommodating through all of this, though he did make a commission on each TV sale anyway. ^_~

PC Engine Fan X! ^_~
User avatar
cvaniafan
Posts: 441
Joined: Thu Sep 25, 2008 11:03 pm
Location: Франция !

Post by cvaniafan »

Fudoh wrote:The lower Bravia series are really not good. A nice choice for gaming is the Z series, or the W if you don't have lots of money. But stay away from the rest of the current lineup.


Any opinion on the V series? I have read that the V series is the same as the W, minus the HDTV tuner.
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

The V series has an 8-bit display in the older models; I can't say for the current model. It's not that good, as you can see stepped, brick-like changes across what should be a gradual red spectrum from, say, pink to dark red.

Here is a diagram showing 8-bit video:

[image: gradient diagram illustrating 8-bit banding]


The W series doesn't have the flashy gloss frame, but it does have a 10-bit video processor. I could only find this picture giving a side-by-side comparison.

[image: side-by-side skull comparison of 8-bit and 10-bit rendering]

As you can see, 8-bit video is really naff. The shades only change every handful of pixels; it's like it's drawn with a really fat pen instead of a finely tuned point.

I was extremely happy to learn of the differences in this technology, since most TV salesmen and spec sheets don't even list the video processor. For 240p or 480i you probably won't notice a difference, but with HD it would stand out.

AFAIK, anything less than the W series has an 8-bit video processor; you get what you pay for.
This industry has become 2 dimensional as it transcended into a 3D world.
User avatar
cvaniafan
Posts: 441
Joined: Thu Sep 25, 2008 11:03 pm
Location: Франция !

Post by cvaniafan »

It's interesting info, thanks.
Just wondering whether the V series/8 bits coupled with an XRGB3 would be sufficient for older systems (PS1/PS2, Neo Geo, Saturn, Super Famicom, etc.) and DVD? I currently have no plans to buy HD stuff. Even at 8 bits, I suppose the image processing quality is OK for standard definition?
neorichieb1971 wrote:The V series has an 8-bit display in the older models; I can't say for the current model. It's not that good, as you can see stepped, brick-like changes across what should be a gradual red spectrum from, say, pink to dark red.

neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

The XRGB3 has nothing to do with video colour processing.

For example, the PS3 boot-up screen has a gradation from a light to a dark colour, so it will be more apparent that you have an 8-bit screen when looking at such images.

The XRGB3 will do its job on any screen that it's compatible with, and all HDTVs are compatible with it. The V series only displays 256 colours at any one given moment and therefore the gradation between one colour and another is more noticeable than with a 10 bit HDTV.

Unfortunately scaling hardware, fake scanlines and other such equipment doesn't come cheaply, and the rule of thumb is: the more you spend, the better the results. There is no shortcut to spending the least amount of money and getting outstanding results. By buying the V series, you're basically going for the bigger TV in the cheaper range, so you're spending your money on size rather than quality.

I have noticed when viewing SD TV on my HDTV that the gradation of colours is very limited to start with. This is due to bandwidth and bitrates used for digital TV.

If you're talking about SD in videogames, it is ultimately the TV itself that is the bottleneck, since the console is hardwired into the TV with no bandwidth problems to speak of.

If you want a middle ground between HDTV and SDTV, then the V series is a good place to start, but don't expect the best of either world.
This industry has become 2 dimensional as it transcended into a 3D world.
Gwyrgyn Blood
Posts: 695
Joined: Mon Mar 06, 2006 9:48 pm

Post by Gwyrgyn Blood »

neorichieb1971 wrote:The V series has an 8-bit display in the older models; I can't say for the current model. It's not that good, as you can see stepped, brick-like changes across what should be a gradual red spectrum from, say, pink to dark red.

Here is a diagram showing 8-bit video:
Sorry, but this is a complete load of crap. Even the shittiest of shit TN LCD panels do 18-bit color (6 bits per channel). Pretty much all decent panels do 24-bit color (8 bits per channel). You are confusing 8-bit color (256 colors total) with 8 bits per channel (aka 24-bit). 256 colors is the kind of color depth they used to have back in the DOS days.

As for the difference between HDTVs doing 8 bits per channel and 10 bits per channel, it's pretty much entirely marketing hype. Neither consoles nor Blu-ray output at 10 bits per channel. They might one day in the future, but today they don't. The visual difference between 24-bit and 30-bit is very minor as well, though I guess it MIGHT make a difference if you are a hardcore photo editor, but then you'd probably be buying an LCD PC monitor anyway.
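To make the per-channel versus total-color distinction concrete, here's a small Python sketch (the panel labels are just the rough examples mentioned above, not exact specs):

# Total displayable colors for a given number of bits per RGB channel.
def total_colors(bits_per_channel):
    shades = 2 ** bits_per_channel   # shades per channel
    return shades ** 3               # combinations across R, G and B

for bits, label in [(6, "cheap TN panel"), (8, "typical panel"), (10, "10-bit panel")]:
    print(f"{bits} bits/channel ({label}): {total_colors(bits):,} colors")
print(f"8-bit total (DOS-era palette): {2 ** 8:,} colors")
# 6 bits/channel (cheap TN panel): 262,144 colors
# 8 bits/channel (typical panel): 16,777,216 colors
# 10 bits/channel (10-bit panel): 1,073,741,824 colors
# 8-bit total (DOS-era palette): 256 colors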
User avatar
Fudoh
Posts: 13015
Joined: Mon Mar 06, 2006 3:29 am
Location: Germany
Contact:

Post by Fudoh »

Gwyrgyn is right
The V series only displays 256 colours at any one given moment and therefore the gradation between one colour and another is more noticeable than with a 10 bit HDTV.
that's crap. The difference between 8- and 10-bit panels is that 8-bit panels display 2^8 = 256 shades per color (or 256 shades in a greyscale picture), whereas 10-bit panels do 2^10 = 1024 shades per color, resulting in smoother gradients IF there were 10-bit sources.
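To visualise what those extra shades would buy you if a true 10-bit source existed, here's a toy Python sketch of band widths on a full-HD-wide ramp (real panel processing is more involved than this, so treat it as illustration only):

# Width of each visible band when an ideal full-range ramp across a 1920 px
# wide screen is quantised to the panel's shades: more shades, narrower bands.
def band_width(pixels, bits):
    shades = 2 ** bits        # distinct levels per channel
    return pixels / shades    # pixels per band across the ramp

for bits in (8, 10):
    print(f"{bits}-bit panel: {2 ** bits} shades, bands ~{band_width(1920, bits):.1f} px wide")
# 8-bit panel: 256 shades, bands ~7.5 px wide
# 10-bit panel: 1024 shades, bands ~1.9 px wide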

In theory you gain something with 10-bit displays if you're using a Blu-ray player which does color upsampling (the $1500 Sony 5000ES, for example), but the inputs on the TV have to support Deep Color as well.

For "our" purposes here (=everything gaming related) it absolutely does NOT matter if your TV has 8-bit or 10-bit color processing.
User avatar
cvaniafan
Posts: 441
Joined: Thu Sep 25, 2008 11:03 pm
Location: Франция !

Post by cvaniafan »

Fudoh wrote:For "our" purposes here (=everything gaming related) it absolutely does NOT matter if your TV has 8-bit or 10-bit color processing.


Great, that's what I thought :)
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

The way I explained it might be wrong, because the info was passed to me 18 months ago by some guys at www.hdtvtest.co.uk and I recapped using Google and probably got something unrelated to the subject matter. After thinking about it, you're right: 256 colours is pre-SVGA era.

However,

The skull image on the left is relative to what you might see on a V series Bravia and the one on the right is relative to that of the W series. I think it's clear in a GIF, let alone a Blu-ray.

I think what I got wrong was that I googled "8 bit 10 bit" and got something unrelated to what I was talking about.

Sorry for the confusion.

My advice is to stay away from the V series unless you like Gloss that much.
This industry has become 2 dimensional as it transcended into a 3D world.
Gwyrgyn Blood
Posts: 695
Joined: Mon Mar 06, 2006 9:48 pm

Post by Gwyrgyn Blood »

neorichieb1971 wrote:The skull image on the left is relative to what you might see on a V series Bravia and the one on the right is relative to that of the W series.
No it isn't. There is literally NO difference in what you would see unless you actually had something outputting a 10-bit-per-channel signal, which you wouldn't have. Blu-rays do not output at 10 bits per channel, period. Neither do any consoles.
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

I just got on MSN with my friend who reviews TVs.

I got into a debate using your info, and some guy said that unless you're using Blu-ray with some colour upsampling system over HDMI 1.3, you're not going to need 10-bit.

oh, I see
well the data on the discs is 8-bit

I'm feeling a bit out of my league

but LCDs and other digital displays need to process the image before they can display it
so even though the source is 8-bit, you need greater than 8-bit at the display to accurately reproduce it

hmmm, so how does 10-bit improve the image? Is it like an artificial upgrade?

It sounds a bit like de-interlacing to me... where instead of filling in blank lines it's filling in blank gradients

hmm, not really
an 8-bit panel can only produce 256 shades per colour (RGB)

and a 10-bit panel can do 1024 per colour
and the data stored on the discs is 8-bit as well, so 256 shades there too
but your LCD panel isn't going to have a perfect output
you have to do gamma and white balance correction
which uses up a number of the panel's shades
purely hypothetical numbers, you might have 256 shades at the panel, but only, say, 200 after white balance and gamma correction, which is less than the 256 on the disc
if you have 1024 shades at the panel, after correction you might be left with 900 shades at the panel which is still enough to reproduce 256 accurately
it's a bit more complex than that, but that's the easiest way I can describe it

Well, I don't fully understand even after that explanation, but he did say the skull images were a true reflection of the difference between the V and W series Bravias. I trust his judgement; he reviews TVs.
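To put toy numbers on his explanation (a rough Python sketch; the 85% "usable after calibration" figure is made up purely for illustration, just like the hypothetical numbers in the chat):

# After gamma / white balance correction a panel can't use its full native range.
# Count how many of the 256 source shades still come out distinct when the
# usable output range shrinks to a made-up 85% of the panel's levels.
def surviving_shades(panel_bits, usable_fraction=0.85):
    usable_levels = int(2 ** panel_bits * usable_fraction)
    outputs = {round(i / 255 * (usable_levels - 1)) for i in range(256)}
    return len(outputs)

print("8-bit panel :", surviving_shades(8), "of 256 source shades survive")   # 217
print("10-bit panel:", surviving_shades(10), "of 256 source shades survive")  # 256

With the 8-bit panel some neighbouring source shades end up merged, while the 10-bit panel keeps all 256 distinct, which is his point about needing more precision at the display than in the source.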
This industry has become 2 dimensional as it transcended into a 3D world.
Gwyrgyn Blood
Posts: 695
Joined: Mon Mar 06, 2006 9:48 pm

Post by Gwyrgyn Blood »

Yeah, that's more or less correct. If you want it described in another way, read the 'Digital Representation' paragraph of this: http://en.wikipedia.org/wiki/Rec._709

However, as it says, almost any PC monitor or commercial-quality TV is not going to have this problem anyway.

The difference you will get, however, is way less significant than the image of the skulls you posted. What you posted is a good example of 8-bit color (256 colors) vs 16/24-bit color (65k/16.7M colors), i.e. several orders of magnitude difference in the number of colors. The difference between 16-235 and 0-255 is less than an order of magnitude.
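Rough numbers behind those two comparisons (a small Python sketch; the 16-235 range is the Rec. 709 "video level" convention from the link above):

# Compare the two jumps being discussed: palette color vs truecolor, and the
# limited 16-235 "video level" range vs the full 0-255 range per channel.
palette_256  = 2 ** 8          # 256 colors total (old palette modes)
truecolor    = (2 ** 8) ** 3   # 16,777,216 colors (8 bits per channel)
video_levels = 235 - 16 + 1    # 220 usable levels per channel
full_levels  = 256             # 0-255 per channel

print(f"palette vs truecolor: {truecolor // palette_256:,}x more colors")  # 65,536x
print(f"0-255 vs 16-235: {full_levels / video_levels:.2f}x more levels")   # ~1.16x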

Instead of spending your money on marketing hype to fix a problem with crappy TVs, just buy a TV that doesn't suck. Chances are, if you are buying a TV for gaming, there are lots of other concerns that are significantly more important than a minor difference in color quality anyway!
User avatar
cvaniafan
Posts: 441
Joined: Thu Sep 25, 2008 11:03 pm
Location: Франция !

Post by cvaniafan »

On a French website ( http://www.lesnumeriques.com/article-465-3516-16.html ), they review TVs with high accuracy. In their review of the Sony 40V4000, they state it has 8-bit processing, but they couldn't see any difference in screen gradation compared with the W4000. Basically, they conclude the V4000 is a W4000 without the HD tuner, USB-in, and a few options. In France, the V4000 is 200 euros cheaper than the W4000, which is not bad!
My assumption is the skull pics comparison might be a bit exaggerated for marketing purposes, but maybe I am wrong.
Gwyrgyn Blood
Posts: 695
Joined: Mon Mar 06, 2006 9:48 pm

Post by Gwyrgyn Blood »

cvaniafan wrote: My assumption is the skull pics comparison might be a bit exaggerated for marketing purposes, but maybe I am wrong.
Well, it's obviously greatly exaggerated, because if it were actually 8 bits per channel versus 10 bits per channel, you would not be able to tell the difference at all on your PC monitor. It's probably actually more like 16-bit versus 24-bit color or something. I'd check for sure, but I have no idea how to count color usage in GIMP. :I

The funny thing about colors is that each step up in bit depth gives you exponentially more colors, but the difference is significantly less noticeable than the previous step up.

Most normal consumers can't even tell the difference between TN panels (6 bits per channel) and IPS panels (8 bits per channel). The difference between HDTV 8-bit and 10-bit is significantly smaller than that.
Ex-Cyber
Posts: 1401
Joined: Thu Oct 25, 2007 12:43 am

Post by Ex-Cyber »

Whether banding is noticeable partly depends on the source material. If there are no smooth gradients, you won't see banding. In most cases live-action TV and movies don't cause problems, but sweaty foreheads on the cable news channels are common offenders on my low-end LCD HDTV, and certain scenes in Twilight Princess in particular look terrible (mostly ones going for a sunset look). That might have something to do with colorspace conversion from NTSC/S-Video as well, but I wouldn't be surprised if the panel is the limiting factor.
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

OK, so now that we have established what's not important, I would like to know where the $400 difference goes in the W range. I'm beginning to believe from these posts that my extra pennies went on a matte finish and not much else.

The spec sheet on the W range is on par with the X range, which is the aesthetically pleasing version, but also top of the range.
This industry has become 2 dimensional as it transcended into a 3D world.
User avatar
cvaniafan
Posts: 441
Joined: Thu Sep 25, 2008 11:03 pm
Location: Франция !

Post by cvaniafan »

neorichieb1971 wrote:OK, so now that we have established what's not important, I would like to know where the $400 difference goes in the W range.


According to the reviews, the $400 difference goes on:

- HD tuner
- USB 2.0 input
- XrossMediaBar on-screen menu
- x.v.Colour and a light sensor

The V4000 is a W4000 "Lite" with the same great image quality.
neorichieb1971
Posts: 7680
Joined: Wed Jan 26, 2005 1:28 am
Location: Bedford, UK
Contact:

Post by neorichieb1971 »

Pfft, the tuner is probably useless, as it was designed before the announced DVB-T2 standard for Freeview HD coming out next year.

Hopefully I'm wrong; otherwise I guess they will use a CAM card, or even worse I'll need a Freeview box. If that's the case, I'll buy a DVB-T2 PlayTV for my PS3, if one comes out of course.

It's interesting nonetheless; I haven't even read my manual yet :lol:
This industry has become 2 dimensional as it transcended into a 3D world.