Color temperature for games and consoles: 6500K or 9300K?

The place for all discussion on gaming hardware

Color temperature for games and consoles: 6500K or 9300K?

6500K for all: 19 votes (66%)
9300K for all: 6 votes (21%)
6500K for American content and 9300K for Japanese content: 2 votes (7%)
Other (please specify): 2 votes (7%)

Total votes: 29

Einzelherz
Posts: 1279
Joined: Wed Apr 09, 2014 2:09 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Einzelherz »

Elvis are you a professional patronizing dick or is it just a hobby?
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

Einzelherz wrote:Elvis are you a professional patronizing dick or is it just a hobby?
My apologies, I don't intend to patronise. I do work with a lot of good folk who, like nissling, go to the Nth degree of detail on deeply technical things that generally fly right over the head of the intended audience. All correct, and all generally TMI for regular folk.

Colour theory is crazy complex. Aiming to educate folks who, after four pages, still can't understand why there's no "perfect white" by pointing them at a 20-minute CIE1931 video is entirely correct, but my bet is that nobody who didn't get it by page 2 would bother to watch that video.
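
For the folks who do want a taste of the detail without the 20-minute video: "white" is just a point on the daylight locus that moves with the chosen temperature. A quick Python sketch of the published CIE D-illuminant formula (the equations are standard; the comments on the output are mine):

Code: Select all

def daylight_xy(cct):
    """CIE D-illuminant chromaticity for 4000K <= CCT <= 25000K."""
    if 4000 <= cct <= 7000:
        x = -4.6070e9 / cct**3 + 2.9678e6 / cct**2 + 99.11 / cct + 0.244063
    elif 7000 < cct <= 25000:
        x = -2.0064e9 / cct**3 + 1.9018e6 / cct**2 + 247.48 / cct + 0.237040
    else:
        raise ValueError("outside the defined daylight range")
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

for label, cct in (("D65", 6500), ("D93", 9300)):
    x, y = daylight_xy(cct)
    print(f"{label}: x = {x:.4f}, y = {y:.4f}")

# -> D65: x = 0.3128, y = 0.3292  (the official D65 is 0.3127, 0.3290,
#    defined at 6504K after a revision of the c2 radiation constant)
# -> D93: x = 0.2831, y = 0.2971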

If the KISS (Keep It Simple Stupid) approach comes across as patronising, then I apologise. Again, not my intention. My only aim is to inject some pragmatism into the direct question of how much time/effort/cost one should invest in calibrating a display for something that isn't income-generating, with the full understanding that it's subjective; sometimes folks just want to because they want to, and that's cool too.
Xer Xian
Posts: 881
Joined: Sun Feb 06, 2005 3:23 pm
Location: Italy

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Xer Xian »

@Elvis - Don't worry, I don't think that anyone here is going to buy pro equipment to pro calibrate their monitor for playing SMB or Zelda (except maybe GenerHAL).
If, like Nissling, you already have the means, then sure it makes sense - but for anyone else, not really. We're talking consoles that are known to have wildly inconsistent RGB voltages here, for which people are (rightly) happy to use standardized cables that attenuate the levels correctly only on average, so many are already saying goodbye to "perfect whites" no matter how well their monitor is calibrated.
Einzelherz
Posts: 1279
Joined: Wed Apr 09, 2014 2:09 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Einzelherz »

No one is suggesting you spend a ton of money on professional calibration gear. Commercial-level stuff is in the ~100 USD range, or was last time I checked, and there is free software available. In my case I spent $80 on a used ColorMunki (a licence-restricted i1 Display Pro) and $0 on HCFR, learned how to use them via online tutorials, and calibrated a dozen screens between my house and my parents'.
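
For anyone curious what those tutorials boil down to mechanically, here's a rough sketch of the greyscale readout these tools give you. It assumes Rec.709-ish primaries and a made-up meter reading, so treat it as an illustration of the idea, not HCFR's actual code:

Code: Select all

import numpy as np

# Assumed Rec.709 primaries and a D65 target white; a real tube's
# phosphors will differ somewhat.
PRIMARIES = {"R": (0.64, 0.33), "G": (0.30, 0.60), "B": (0.15, 0.06)}
WHITE = (0.3127, 0.3290)

def xy_to_XYZ(x, y, Y=1.0):
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

# RGB -> XYZ matrix scaled so RGB = (1,1,1) lands exactly on the target.
P = np.column_stack([xy_to_XYZ(*PRIMARIES[c]) for c in "RGB"])
M = P * np.linalg.solve(P, xy_to_XYZ(*WHITE))

# A hypothetical meter reading of the 100% white patch.
measured = xy_to_XYZ(0.302, 0.315)

# Express the measurement in the display's own RGB space; the deviation
# from equal channels is what the RGB-level bars are showing you.
rgb = np.linalg.solve(M, measured)
levels = 100.0 * rgb / rgb.mean()
print("R/G/B vs target: %.1f%% / %.1f%% / %.1f%%" % tuple(levels))
# -> here blue reads ~110%: the white is too cold. A channel above 100%
#    needs its gain turned down (or the others turned up).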

If we're going to the length (on this forum) of spending a crapton of money/time on screens, or RGB/HDMI kits, or RGB TV conversions, etc., then I think learning some basics about color theory and display calibration is something everyone would want to do.
Xer Xian
Posts: 881
Joined: Sun Feb 06, 2005 3:23 pm
Location: Italy

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Xer Xian »

Einzelherz wrote:If we're going to the length (on this forum) of spending a crapton of money/time on screens, or RGB/HDMI kits, or RGB TV conversions, etc., then I think learning some basics about color theory and display calibration is something everyone would want to do.
Definitely, I agree. But is it important to have the white point set to a specific temperature, and to know that the delta is under a certain critical threshold, for gaming (especially retro)? There are no flesh tones, natural scenery or carefully chosen lighting to get right, like there are in movies. In any case, it certainly depends on your standards and whether or not you have a trained eye. I don't, thankfully :lol:
Einzelherz
Posts: 1279
Joined: Wed Apr 09, 2014 2:09 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Einzelherz »

You're 100% correct about the necessity of white balance or temperature on most old consoles with their limited palettes. Nissling only brought up the temperature shift specifically when citing the flaws in the BKM probe.

But 1. old consoles aren't the only thing we deal with. Calibration works on every screen (like the plasma I watch nearly everything on), and some later consoles attempt flesh tones and more realistic color schemes. And 2. calibration also involves setting proper black levels and contrast, which I'd argue is important even on the oldest stuff. It just makes the sprites pop more, imo.

Luckily, if my experience is anything to go by, most monitors show up with good calibration, so nothing is needed for colorful older games.
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

Einzelherz wrote:No one is suggesting you spend a ton of money on professional calibration gear. Commercial-level stuff is in the ~100 USD range, or was last time I checked, and there is free software available. In my case I spent $80 on a used ColorMunki (a licence-restricted i1 Display Pro) and $0 on HCFR, learned how to use them via online tutorials, and calibrated a dozen screens between my house and my parents'.
I think that's a great solution. Cheaper colorimeters have been criticised elsewhere in this thread, but I think they're fine for this use (again, in the context that this isn't a business requirement for most of us here).

Also check out the ColorHug colorimeter:
http://www.hughski.com/

He only sells the second model now (95 GBP / 130 USD). I bought the first as an experiment back in 2012, and it was fine. It was quite slow compared to other devices (it takes roughly twice as long to get a reading), but very capable. It isn't supplied with software, but works fine with the free/open-source dispcalGUI/DisplayCAL:
https://displaycal.net/

You'll need a laptop (any OS works) to drive the probe, but the software is pretty easy to follow, and you can get basic curve and level readouts to begin your adjustments.
Einzelherz wrote: then I think learning some basics about color theory and display calibration is something everyone would want to do.
Fair enough. I did link to the VES colour "bible" a page back. There's a generous amount of information in there on colour theory as well that's worth reading (not in video form, which I know is preferred by some people):
elvis wrote:If you're interested in the deeper technical science of this from a TV/film/cinema point of view, I recommend reading "the bible" on the subject:
https://github.com/jeremyselan/cinemati ... or_VES.pdf
Last edited by elvis on Tue Mar 13, 2018 4:50 am, edited 3 times in total.
Saturngamer81
Posts: 49
Joined: Wed Apr 01, 2015 12:03 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Saturngamer81 »

nissling wrote:Anyone who believes that there is such a thing as "pure white" on a display should have a look at this video in order to understand how CIE1931 works.

https://www.youtube.com/watch?v=82ItpxqPP4I
great vid nissling
Lawfer
Posts: 2283
Joined: Fri Dec 01, 2006 3:30 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Lawfer »

GeneraLight wrote:Thanks. Displaying pure whites would age the phosphors the fastest and put the most stress on the CRT. I think I'll keep my white balance and colors at a calibrated D65, since it's the NTSC-U standard and apparently doesn't drain the life of a display as fast as D93 does.
Are you sure about that? 6500K looks whiter/yellower than 9300K, while 9300K looks darker/bluer than 6500K, so wouldn't 9300K wear the CRT out less quickly than 6500K?

I know that technically 9300K has a higher brightness than 6500K (but at the same time 6500K has a higher contrast than 9300K), but the bluish push on the whites makes it look less bright as a result.

But still, contrast does increase the luminance of the screen, and 6500K has a higher contrast level than 9300K.
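
One way to put numbers on this: assume a tube with Rec.709-ish phosphors whose full drive, RGB = (1,1,1), is balanced to D65, and ask what linear drive a D93 white of equal luminance would need. A sketch (the primaries are an assumption; real phosphor sets differ):

Code: Select all

import numpy as np

# Assumed Rec.709 primaries; D65 and D93 white chromaticities.
PRIMARIES = ((0.64, 0.33), (0.30, 0.60), (0.15, 0.06))
D65, D93 = (0.3127, 0.3290), (0.2831, 0.2971)

def xy_to_XYZ(xy, Y=1.0):
    x, y = xy
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

P = np.column_stack([xy_to_XYZ(p) for p in PRIMARIES])
M = P * np.linalg.solve(P, xy_to_XYZ(D65))   # RGB -> XYZ, white = D65

# Linear RGB drive needed for a D93 white at the same luminance (Y = 1):
rgb = np.linalg.solve(M, xy_to_XYZ(D93))
print(rgb.round(3))   # -> roughly [0.846 1.011 1.343]: blue is pushed
                      #    ~34% beyond full drive, red well below it.

# Blue can't exceed full drive, so everything scales down instead:
Y = (M @ (rgb / rgb.max()))[1]
print("D93 peak luminance vs D65 white: %.0f%%" % (100.0 * Y))   # ~74%

So with these assumed primaries, a D93 white leans hardest on the blue gun, and the achievable peak luminance at D93 is noticeably lower than at D65 on the same tube, which matches the intuition about contrast above.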
Taiyaki
Posts: 1050
Joined: Fri Apr 04, 2014 11:31 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Taiyaki »

Personally I think there's nothing wrong with either choice when it comes to CRTs and classic games.
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by tongshadow »

I think it's worth mentioning that most CRT TV manufacturers calibrated their displays to a 10000K/14000K color temperature. The ones that offered Normal, Cool and Warm color options used 10500K, 14000K and 8200K respectively.
Einzelherz
Posts: 1279
Joined: Wed Apr 09, 2014 2:09 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Einzelherz »

tongshadow wrote:I think it's worth mentioning that most CRT TV manufacturers calibrated their displays to a 10000K/14000K color temperature. The ones that offered Normal, Cool and Warm color options used 10500K, 14000K and 8200K respectively.
Do you have a source for this peculiar information?
tongshadow
Posts: 613
Joined: Sat Jan 07, 2017 5:11 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by tongshadow »

Not peculiar at all if you read most modern CRT TVs' service manuals. I saw this info in Panasonic, Philips and LG manuals.
Ryeno
Posts: 89
Joined: Tue Nov 05, 2019 6:50 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ryeno »

Einzelherz wrote:
tongshadow wrote:I think it's worth mentioning that most CRT TV manufacturers calibrated their displays to a 10000K/14000K color temperature. The ones that offered Normal, Cool and Warm color options used 10500K, 14000K and 8200K respectively.
Do you have a source for this peculiar information?
IIRC the Sony 34XBR960's coolest setting is 10k+.

With that said, you can still calibrate to 6500K. With TVs that have multiple color temps, you generally switch to Warm and calibrate to 6500K. With monitors like the Sony PVM-2530, disable New Dynamic Color and calibrate to 6500K.
bobrocks95
Posts: 3460
Joined: Mon Apr 30, 2012 2:27 am
Location: Kentucky

Re: Color temperature for games and consoles: 6500K or 9300K

Post by bobrocks95 »

I like cool colors, so I set the color temp on displays to Neutral or Cool :)
PS1 Disc-Based Game ID BIOS patch for MemCard Pro and SD2PSX automatic VMC switching.
kitty666cats
Posts: 1270
Joined: Tue Nov 05, 2019 2:03 am
Location: Massachusetts, USA

Re: Color temperature for games and consoles: 6500K or 9300K

Post by kitty666cats »

I usually calibrate TVs/monitors to 6500K for general use, and if they have a 9300K and/or "cool" setting, I switch to that when playing older Japan-developed games (it tends to look more "right" to me, heh).

I’ve also found some anime seems to look more “right” to me at 9300/cool. Makes sense!
JXS
Posts: 1
Joined: Mon Aug 15, 2022 4:35 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by JXS »

nissling wrote:After all the hype, simplification and misunderstanding in this thread, it's time to get things going with the BKM-14L. It has become legendary for its features, but is rarely questioned by anyone around here. So why do I want to do this?

The answer is simple: I have been using CalMAN for several years and learned quickly that "automatic calibrations" (a term I never want to use again) usually end up with more issues than you started with. This by itself doesn't mean that automatic processes and profiling are all bad, as they can move your image in the right direction. But that's the deal…

If you are aiming for a target, why stop halfway there? Why settle for a compromise that you haven't had any control over, and don't know exactly what went on in, when you can (if you've got the right knowledge) make the decisions yourself and confirm that the calibration you're doing ends up correct?

So, let’s just get down to this subject which I know will cause quite a bit of controversy, but I do believe that we shouldn’t put too much reliability in old measurement tools when we have much better options today. A friend of mine bought a Sony BVM-D24E1WE about six months ago. The tube has gone around 36 000 hours and despite its usage, it’s in overall very good condition. I got to see it just two weeks after he first bought it and was very impressed by the image. Two months ago he got to borrow a BKM-14L from a friend who claimed that he had used it with success for his D24E1WE and my friend certainly saw an improvement, but I was very hesitant from the start considering how “automatic calibrations” or profiling usually turns out.

I have used consumer versions of CalMAN for several years, and just recently I stepped it up a level by purchasing CalMAN Video Professional. Although I bought the license as a private person, I use it in business (both as a calibrator for consumers and as an employee at the film archive where I work). For measurement and source I'm using a SpectraCal C6 and a VideoForge HDMI. Both are NIST-certified, which ensures that any issues I see, either in the image or in the measurements, are caused by the monitor, TV or projector. If you're a consumer who just wants to play around, this is certainly overkill, but if you're being hired for this kind of task you need to step up to this sort of level just to justify the prices you're asking.

As soon as we started up the BVM-D24E1WE, I saw immediately that the white balance was off. Supposedly it had been calibrated to D65 with the BKM-14L, but the image was certainly too cold. Not cold like D93, but nowhere near D65 either. We let it warm up for about forty minutes before I started taking measurements. And… oh boy, I'll just cut to the fun part and show you a before and after graph…

[Image: before/after white balance graphs]

Now, note that these graphs only show the white balance, not the luminance. In other words, differences in gamma aren't shown, and that's perfectly fine for two reasons.

#1. I want to compare the white balance I got from the BKM-14L with what I could achieve by myself with my own tools, which I know measure as intended.
#2. Since I calibrate black level and gamma as well, it wouldn't be a fair comparison to begin with if those were measured. Still, even if they were, it would be night and day.

It should also be mentioned that the EOTF of a CRT is rather difficult to graph to begin with: it is always higher in the shadow details and gets lower and lower further up the greyscale. This can produce odd-looking measurements, but as long as you know how a CRT behaves and can calibrate the black level properly, you don't really have to mind how the gamma curve turns out in CalMAN (the software doesn't seem to have proper support for CRT EOTFs anymore). Instead it's a much better idea to pay attention to how the white balance measures and what you can do about it.
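
To illustrate that shadow behaviour, here's a sketch using BT.1886, a standard EOTF designed to approximate a CRT; the white and black luminances are assumed values, not measurements from this monitor:

Code: Select all

LW, LB = 100.0, 0.01   # assumed white and black luminance, cd/m^2

A = (LW ** (1 / 2.4) - LB ** (1 / 2.4)) ** 2.4
B = LB ** (1 / 2.4) / (LW ** (1 / 2.4) - LB ** (1 / 2.4))

def bt1886(v):
    """BT.1886 EOTF: a power law with a black-level lift term."""
    return A * max(v + B, 0.0) ** 2.4

for v in (0.05, 0.10, 0.25, 0.50, 0.75, 1.00):
    crt, flat = bt1886(v), LW * v ** 2.4
    print(f"stimulus {v:.2f}: CRT-like {crt:8.3f}  flat-2.4 {flat:8.3f}"
          f"  ratio {crt / flat:.2f}")
# The ratio is largest near black and falls toward 1.0 at white, which
# is why fitting a single gamma number to a CRT curve is misleading.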

With that said, things are very clear here. The probe is simply not as sensitive to blue as it's supposed to be, and therefore it gives you a far colder image than you desire. The same issue would also push the image colder than 10000K if you had chosen to calibrate to D93 with it. We could let it all end here, but let's dig a bit deeper into this subject.

When measuring the color temperature of the former "calibration", we see that the average is 7334K, with whites at nearly 7400K. That's a differential of 13% from 6500K and should certainly not be considered a reference point. In fact, I'd say it's noticeable even in general consumer environments.
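
For anyone who wants to reproduce that kind of number: McCamy's approximation converts a measured xy chromaticity into a correlated colour temperature. A minimal sketch; the second reading is hypothetical, not taken from my graphs:

Code: Select all

def mccamy_cct(x, y):
    """McCamy's cubic approximation of correlated colour temperature."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))   # D65 target -> ~6505
print(round(mccamy_cct(0.2990, 0.3150)))   # a too-blue white -> ~7500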

[Image: color temperature measurements of the previous "calibration"]

It should also be mentioned that even when the EOTF is considered by CalMAN, my calibration has a maximum dE2000 of 1.4-1.5. This is, of course, caused by the mathematical differences in EOTF, and even then these results are extremely good. It's certainly more than enough for any consumer and probably even holds up to reference standards.
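
As a side note on where such a number comes from: convert the target and the measurement to Lab and apply the CIEDE2000 metric. A sketch assuming the open-source colour-science Python package, with made-up measurement values rather than my data:

Code: Select all

import numpy as np
import colour   # the open-source colour-science package

D65 = np.array([0.3127, 0.3290])

def xy_to_XYZ(x, y, Y=1.0):
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

# Target white vs a made-up measurement (slightly off in xy and level).
target = colour.XYZ_to_Lab(xy_to_XYZ(0.3127, 0.3290), illuminant=D65)
probe = colour.XYZ_to_Lab(xy_to_XYZ(0.3141, 0.3305, 0.98), illuminant=D65)

print(colour.delta_E(target, probe, method="CIE 2000"))
# dE2000 around 1.0 is roughly the threshold of a visible difference;
# a ~7400K white measured against a D65 target scores far higher.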

[Image: post-calibration greyscale dE2000 measurements]

And here are some gamut measurements. Colors were generally very good before as well, but thanks to the calibrated white balance everything got even more fine-tuned. It could be mentioned that the green gun may be slightly aged, but it's not visible to the naked eye and still falls within reference. Also, the lines between green-magenta, blue-yellow and red-cyan are straight and all pass through the white point. In other words, both primaries and secondaries are properly balanced.
[Image: CIE gamut measurements]

The Sony BVM-D24E1WE is one of the very finest CRTs I've ever seen, and seeing it calibrated to 6500K at a light output of 100 cd/m² was a completely new experience. It's extremely stable and sharp, with an exceptional, natural EOTF that is very difficult (if not impossible) to match even on new displays. Very dark, almost-black details are there and visible, while you have clear whites on an overall bright image. The colors are pretty much flawless as well, and the greyscale looks perfect to the human eye. Along with my Sony HDM-3830E, this is probably my favorite CRT of all time.

For anyone interested, I also calibrated a BVM-D20F1E. It hadn't been used with the BKM-14L, and its white balance was also off, but in a different way. The tube has just under 5000 operational hours and performs like a charm.

[Images: BVM-D20F1E before/after measurements]

All in all, I honestly can't recommend that anyone use the BKM-14L to get accurate white balance. I will say that the issues I saw give me the impression it probably did work rather well in the 90s and early 00s, but nothing lasts forever. My C6 probably won't do the trick in 2030 either.

Now, am I saying that the BKM-14L is a bad product or that you should let it go? No, I'm not! There are probably probes out there that do still provide some accuracy, and those could possibly do the trick. It's also possible to use it to get a more uniform image and prevent tint in certain areas of the tube; that seems to be a more reliable feature. But if you're looking to get the very best white balance and an accurate color temperature out of your wonderful BVM, I'd honestly recommend looking around for ISF-certified calibrators who can still calibrate Grade 1 CRTs. I'm not saying it's free, but it will probably still turn out cheaper and easier than hunting down a BKM-14L that, in the end, you don't even know works properly.

As a final word, I'd like to add a comment on this whole D65 vs D93 discussion and how calibration is a very important part of it: unless you're 100% certain what color temperature your monitor shows, there's absolutely no guarantee whatsoever that what you're seeing is accurate. Therefore, first claiming that "D93 is a must for retro games" and later that "calibration isn't necessary for retro gaming" is nothing but a way of saying that facts are facts and research is overkill. Unless we dig into this subject and dare to question ourselves and the methods we're using, we will never get anywhere with a discussion like this. I may be uncomfortable to some, and I may seem overkill to the average Joe. But if I didn't look this extreme in others' eyes (though I wouldn't personally call myself that), I would most likely not have my current job.

Calibration is about manual work, knowledge and image. Not automation, digits or mini-golf. GLHF.
Was CalMAN set up to read Rec.709?

What exactly did you do to hit 709 targets?

I can't hit 709 with my BVM, a U.S. D-series.

I'm guessing it's the difference between EBU phosphors and SMPTE-C phosphors.
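
For what it's worth, that guess is easy to put numbers on: build the D65-balanced RGB-to-XYZ matrix for each published primary set and see how far each one sits from Rec.709. A Python sketch (published chromaticities only; an actual tube's phosphors will have drifted):

Code: Select all

import numpy as np

D65 = (0.3127, 0.3290)
SETS = {
    "Rec.709": ((0.640, 0.330), (0.300, 0.600), (0.150, 0.060)),
    "SMPTE-C": ((0.630, 0.340), (0.310, 0.595), (0.155, 0.070)),
    "EBU":     ((0.640, 0.330), (0.290, 0.600), (0.150, 0.060)),
}

def xy_to_XYZ(xy, Y=1.0):
    x, y = xy
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def rgb_to_xyz(primaries):
    P = np.column_stack([xy_to_XYZ(p) for p in primaries])
    return P * np.linalg.solve(P, xy_to_XYZ(D65))

# Matrix that re-expresses Rec.709 RGB in each phosphor set's RGB. With
# identical phosphors it would be the identity; the off-diagonal terms
# are the residual colour error when no correction is applied.
for name in ("SMPTE-C", "EBU"):
    conv = np.linalg.solve(rgb_to_xyz(SETS[name]), rgb_to_xyz(SETS["Rec.709"]))
    print(name, "->\n", conv.round(4))

EBU comes out much closer to Rec.709 (only the green x differs), while SMPTE-C shows larger off-diagonal terms, so a SMPTE-C tube genuinely can't land on 709 targets without a matrix correction somewhere in the chain.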