Color temperature for games and consoles: 6500K or 9300K?


Color temperature for games and consoles: 6500K or 9300K?

6500K for all: 19 votes (66%)
9300K for all: 6 votes (21%)
6500K for American content and 9300K for Japanese content: 2 votes (7%)
Other (please specify): 2 votes (7%)

Total votes: 29

Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

Einzelherz wrote:Suddenly GL isn't a minmaxer.
GeneraLight wrote: This forum is basically about getting the best possible image quality out of your games.
I am, but there are diminishing returns. Color values that are calibrated 99% or even 95% as well won't make any difference at all as far as human eyesight is concerned.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

elvis wrote:He's profiling his monitor with the Spyder. [...] This is how you calibrate a modern computer/workstation display
No, just no. The first sentence is correct. The second (quoted) sentence is nothing but a big, misleading lie.

With an ICC profile (i.e. a LUT), you're not changing anything on the display itself. Not at all. You're only altering the output signal. That goes against what calibration is all about, for a number of reasons.

Since the monitor itself isn't correct to begin with, an ICC profile just adds skewed values to the signal in order to counteract the errors that the display itself is still causing. You're moving and discarding bits to adjust certain values, so you end up with less than the full range you started with. Also, since the software only reads certain values, it can only know that those particular values measure correctly. It never sees the entire image when profiling.

All in all, you'll end up with an inaccurate greyscale with various tints and almost certainly increased posterization. All of this can be avoided by calibrating the display itself against a certified generator and colorimeter, along with suitable software.
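Just to put a number on the bit-depth point, here's a minimal sketch (the 12% scale factor is an arbitrary example figure, not taken from any real profile) of what a software-only 1D LUT does to an 8-bit channel:

Code: Select all

import numpy as np

# Any 1D LUT that scales an 8-bit channel down maps 256 input codes onto
# fewer distinct output codes. The 12% reduction is an arbitrary example.
codes = np.arange(256)
corrected = np.clip(np.round(codes * 0.88), 0, 255).astype(np.uint8)

print("distinct input levels: ", np.unique(codes).size)      # 256
print("distinct output levels:", np.unique(corrected).size)  # 225 -- levels are lost

Those lost levels are exactly where the extra banding comes from, and the display's own errors are still sitting on top of that.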
elvis wrote:As for the Spyder being "amongst the worst", it's a cheap device that gets you to about 95% accuracy on an 8-bit-per-pixel display. Is that good enough?
The Spyder colorimeters are infamous for poor quality as well as poor quality control (meaning two otherwise identical meters will give vastly different results even from the factory). I've seen a brand new Spyder 4 measure 100 nits as 115 nits. That's a 15% difference in light output. It's also common for the meters to give results that differ by ±2% when nothing has been changed, even after the display has warmed up. Their color accuracy isn't very good either. If you calibrate your monitor against the measurements, chances are you'll get a rather yellowish image, as they tend to measure blues incorrectly.

I also want to mention that the Spyders are generally speaking very slow, and since time is money it would be rather expensive for me to use one, because every time I use it I have to make sure I'm getting proper measurements. In CalMAN you can let the software take 5 samples to reduce the inaccuracy. This can help with a Spyder, but then it takes around 25-30 seconds just to get one single value.
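Averaging helps, but only against the random part of the error, and only slowly. A rough sketch with made-up numbers (a patch that is really 100 nits and a meter that scatters by about ±2% per reading):

Code: Select all

import numpy as np

rng = np.random.default_rng(0)
true_nits, sigma = 100.0, 2.0  # assumed values for illustration only

single   = rng.normal(true_nits, sigma, size=100_000)
averaged = rng.normal(true_nits, sigma, size=(100_000, 5)).mean(axis=1)

print(f"spread of single readings:   {single.std():.2f} nits")
print(f"spread of 5-sample averages: {averaged.std():.2f} nits")  # ~2 / sqrt(5), about 0.9

A systematic error, like a meter that simply reads 15% high, doesn't average away at all.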

Personally, I don't like giving a product series a thumbs down, but as a colorist and calibrator I honestly cannot recommend the Spyder to anyone. If you're looking to spend around $300 on a colorimeter, I'd say get an X-Rite i1Display Pro instead. It's far more reliable and certainly much faster. Also, if you profile the meter with an SPD for the display you're about to calibrate, you'll get more than enough accuracy for any consumer environment.
elvis wrote:If you want to calibrate 10-bit-per-pixel displays with true blacks (CRTs, OLEDs, etc), then you're spending 3-10 times the amount on better gear. And again, you can go completely bananas here and buy way more expensive devices if you want.
Considering the number of issues the Spyder gives you and how much time it costs, I'd sooner call that solution "completely bananas" than buying something from X-Rite or Spectracal that you can actually trust.
elvis wrote:But again, consider "fit for purpose". A $300 Spyder for calibrating your CRT to play NES is just fine. I have access to very expensive calibration hardware thanks to my job, and I don't even bother. I adjust my various CRTs (including PVMs and BVMs) by eye to the 240p test suite colour bars, as that's good enough for me. It is very easy to get caught up in the pedantry of all of this, but at some point you just have to relax and realise it's just a video game.
There's so much wrong in this statement that I don't even know where to start.

First of all, I know that a Spyder isn't good enough even for a consumer calibration, unless you want to take a real detour.
Second, whether or not you think a Spyder is good enough doesn't matter to me. Do you believe anyone would hire me if I used cheap, unreliable tools to perform calibrations?
And third, it's neither your task nor mine to decide whether it's worth it for my client to have their display calibrated. That's entirely up to them.

You don't think it's worth the money to have your monitor properly calibrated for whatever you're using it for? Okay, that's fine. That's your decision. It still doesn't change anyone else's mind, and a calibrated display is always preferable. Besides, if you've already bought such a fine Grade 1 CRT from Sony, it's already complete overkill for retro gaming. You may as well get the most out of it if you've come that far.
Rec.709 has been superseded by Rec.2020, which in itself has been extended for the wider gamut of UHD displays (which CRTs should be able to handle). But either is fine.
Why would you want Rec.2020 on a CRT?

For DCI-P3, CRTs are obsolete. They're not even considered references anymore. For Rec.709 they are, to a certain point, still considered references, since they correspond very well with CIE 1931 and thus it has been possible to "rely" on them for several years from an industry standpoint. But for DCI-P3, which is a digital cinema standard, CRTs aren't exactly usable...

Besides, there's no reason to go beyond DCI-P3 for color gamuts right now anyway.
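To put some rough numbers on that, here's a quick sketch comparing the three gamuts by their triangle area in the CIE xy diagram (standard published primaries; xy area is only a crude proxy, but it makes the point about relative sizes):

Code: Select all

# Standard CIE 1931 xy primaries for each gamut; shoelace formula for triangle area.
GAMUTS = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = triangle_area(GAMUTS["Rec.709"])
for name, primaries in GAMUTS.items():
    area = triangle_area(primaries)
    print(f"{name:8s} area = {area:.3f}  ({area / ref:.2f}x Rec.709)")

DCI-P3 comes out at roughly 1.36x the Rec.709 area and Rec.2020 at roughly 1.9x.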
Yeah. I just talked to Savon-pat last night, and he said that while there may be some validity to the BKM-14L being outdated for calibration by today's standards, it won't matter to someone who's just playing video games or watching movies. The difference is not perceptible to the human eye, and 100% color accuracy only matters to professional color purists who need values to be 100% accurate for reference in their work.
Judging both from my measurements and from what I saw with my own eyes, the BKM-14L wasn't even close to being accurate. In fact, it's pretty damn far from it. I will post more info on this later. It wouldn't really surprise me if the graphs make the BKM-14L drop in value dramatically...
I'm curious, what are you referencing the BKM against? If you posted it and I missed it I apologize.
A NIST-certified Spectracal C6.
I am, but there are diminishing returns. Color values that are calibrated 99% or even 95% as well won't make any difference at all as far as human eyesight is concerned.
And judging from what I saw and measured, the BKM-14L won't give you results anywhere near as good as you think it does.
Last edited by nissling on Mon Feb 26, 2018 5:45 pm, edited 2 times in total.
GojiFan90
Posts: 92
Joined: Fri Sep 22, 2017 1:28 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by GojiFan90 »

Speaking of color temperature, does anyone know offhand how the preset color temperatures on Samsung plasma sets correlate to kelvin temperatures? Is Warm2 6500K? And for 9300K, is Standard/Neutral or Cool closer to D93?
Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

nissling wrote:Judging both from my measurements and what I saw with my own eyes, the BKM-14L wasn't even close to being accurate. In fact, it's pretty damn far from it. I will post more info on this later on. It wouldn't really surprise me if the graphs will make the BKM-14L decrease in value dramatically...
And judging from what I saw and measured, the BKM-14L won't give you results anywhere near as good as you think it does.
Well, even so, I only paid $250 for mine, which is brand new, whereas the market value for these is anywhere from $500 to nearly $900 used. I need a colorimeter for the IPS LCD monitor anyway, so I may as well use it for my BVM as well...
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

I'm not saying that you should sell your BKM-14L. It does still work fine for the uniformity feature. There's just no guarantee whatsoever that the white balance will even come close to the standard you've chosen it to "calibrate" to.
Xer Xian
Posts: 881
Joined: Sun Feb 06, 2005 3:23 pm
Location: Italy

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Xer Xian »

This thread surely turned upside down, from cringe-worthy to overly technical.

I'm eagerly waiting to see a few fancy CalMan charts now! Just for the eye-candy. :mrgreen:
Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

nissling wrote:I'm not saying that you should sell your BKM-14L. It does still work fine for the uniformity feature. There's just no guarantee whatsoever that the white balance will even come close to the standard you've chosen it to "calibrate" to.
I'm not familiar with the uniformity feature. What does that even do?
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

After all the hype, simplification and misunderstanding in this thread, it's time to get things going with the BKM-14L. It has become legendary for its features, but it is rarely questioned by anyone around here. Why, then, do I want to do this?

The answer is simple: I have been using CalMAN for several years and learned quickly that "automatic calibrations" (a term I never want to use again) usually leave you with more issues than you started with. That by itself doesn't mean automatic processes and profiling are all bad, as they can push your image in the right direction. But that's the thing…

If you are aiming for a target, why stop halfway there? Why settle for a compromise that you had no control over, and don't know exactly what happened in, when you can (given the right knowledge) make the decisions yourself and confirm that the calibration you're doing ends up correct?

So, let's just get down to this subject, which I know will cause quite a bit of controversy, but I do believe that we shouldn't place too much reliance on old measurement tools when we have much better options today. A friend of mine bought a Sony BVM-D24E1WE about six months ago. The tube has around 36,000 hours on it and, despite that usage, it's in very good overall condition. I got to see it just two weeks after he bought it and was very impressed by the image. Two months ago he got to borrow a BKM-14L from a friend who claimed to have used it with success on his own D24E1WE, and my friend certainly saw an improvement, but I was hesitant from the start considering how "automatic calibrations" and profiling usually turn out.

I have used consumer versions of CalMAN for several years and just recently stepped it up to a new level by purchasing CalMAN Video Professional. Although I bought the license as a private person, I use it in business (both as a calibrator for consumers and as an employee at the film archive where I work). For measurement and source, I'm using a Spectracal C6 and a Videoforge HDMI. Both are NIST-certified, which ensures that any issues I see, either in the image or in the measurements, are caused by the monitor, TV or projector. If you're a consumer who just wants to play around, this is certainly overkill, but if you're being hired for this kind of task you need to step up to this sort of level to even justify the prices you're asking.

As soon as we turned on the BVM-D24E1WE, I saw immediately that the white balance was off. Supposedly he had used the BKM-14L to calibrate to D65, but the image was certainly too cold. Not cold like D93, but certainly nowhere near D65. We let it warm up for about forty minutes before I started taking measurements. And… Oh boy, I'll just cut to the fun part and show you a before and after graph…

Image

Now, note that these graphs only show the white balance and not the luminance. In other words, differences in gamma aren't shown, and that's perfectly fine for two reasons.

#1. I want to compare the white balance that I got from the BKM-14L to what I could achieve by myself with my own tools, which I know measure as intended.
#2. Since I do calibrate black level and gamma as well, it wouldn’t be a fair comparison to begin with if those were measured. Still, even if they were, it would still be night and day.

It should also be mentioned that the EOTF of a CRT is rather difficult to graph to begin with, as it is always higher in the shadow details and gets lower and lower further up the greyscale. This can produce weird-looking measurements, but as long as you know how a CRT behaves and can calibrate the black level properly, you don't really have to mind how the gamma curve turns out in CalMAN (the software doesn't seem to have proper support for CRT EOTF anymore). Instead it's a much better idea to pay attention to how the white balance measures and what you can do about it.

With that said, things are very clear here. The probe is simply not as sensitive to blue as it's supposed to be, and therefore it gives you a far colder image than you want. The same issue would push the image colder than 10,000° Kelvin if you chose to calibrate to D93 with it. We could let it all end here, but let's dig a bit deeper into this subject.

When measuring the color temperature of the former "calibration", we see that the average color temperature is 7334° Kelvin, with whites at nearly 7400° Kelvin. That's a 13% deviation from 6500° Kelvin and should certainly not be considered a reference point. In fact, I'd say it's clearly noticeable even in general consumer environments.

Image
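For anyone wondering where these Kelvin figures come from: the meter reports CIE xy chromaticity, and the correlated colour temperature is derived from that. A minimal sketch using McCamy's approximation (the xy values are the standard published white points, not my measurements from the D24):

Code: Select all

# McCamy's approximation for correlated color temperature from CIE 1931 xy.
def mccamy_cct(x: float, y: float) -> float:
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(f"D65 (x=0.3127, y=0.3290): {mccamy_cct(0.3127, 0.3290):.0f} K")  # ~6500 K
print(f"D93 (x=0.2831, y=0.2971): {mccamy_cct(0.2831, 0.2971):.0f} K")  # ~9300 K

Feed it a measured white point that sits too far towards blue and you get exactly the kind of 7300-7400 K average quoted above.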

It should also be mentioned that even when the EOTF is considered by CalMAN, my calibration has a maximum dE2000 of 1.4-1.5. This is, of course, caused by the mathematical differences in EOTF, and even then these results are extremely good. It's certainly more than enough for any consumer and probably even holds up to reference standards.

Image
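If a dE2000 of 1.4-1.5 sounds abstract, here's a tiny sketch of the same kind of number (using scikit-image; the Lab values are made up for illustration, a neutral white against a white with a slight blue tint, not my D24 measurements):

Code: Select all

import numpy as np
from skimage.color import deltaE_ciede2000

reference = np.array([95.0, 0.0, 0.0])    # CIELAB L*, a*, b*: neutral near-white
measured  = np.array([95.0, -0.6, -1.3])  # same lightness, slight blue tint

print(deltaE_ciede2000(reference, measured))  # roughly 1.5

Differences of that size are generally regarded as hard to spot outside a direct side-by-side comparison, which is why I'm happy with the result.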

And here are some gamut measurements. The colors were generally very good before as well, but thanks to the calibrated white balance it all got even more fine-tuned. It's worth mentioning that the green gun may be slightly aged, but it's not visible to the naked eye and still falls within reference. Also, there are straight lines between green-magenta, blue-yellow and red-cyan that all pass through the white point. In other words, both primaries and secondaries are properly balanced.
Image

The Sony BVM-D24E1WE is one of the very finest CRTs I've ever seen, and seeing it calibrated to 6500° Kelvin and a light output of 100cd/m^2 was a completely new experience. It's extremely stable and sharp, with an exceptional, natural EOTF that is very difficult (if not impossible) to match even on new displays. Very dark, almost-black details are there and visible while you have clear whites in an overall bright image. The colors are pretty much flawless as well, and the greyscale looks perfect to the human eye. Along with my Sony HDM-3830E, this is probably my favorite CRT of all time.

For anyone interested, I also calibrated a BVM-D20F1E. It hadn't been used with the BKM-14L and its white balance was certainly off too, but in a different way. The tube has just under 5,000 operational hours and performs like a charm.

Image

Image

All in all, I honestly cannot recommend the BKM-14L to anyone looking for accurate white balance. I will say that the issues I saw give me the impression that it probably did work rather well in the 90s and early 00s, but nothing lasts forever. My C6 probably won't do the trick in 2030 either.

Now, am I saying that the BKM-14L is a bad product or that you should let it go? No, I'm not! There are probably probes out there that do still provide some accuracy, and those could possibly do the trick. It's also possible to use it to get a more uniform image and prevent tint in certain areas of the tube. That seems to be a more reliable feature. But if you're looking to get the very best white balance and an accurate color temperature out of your wonderful BVM, I'd honestly recommend you look around for ISF-certified calibrators who can still calibrate Grade 1 CRTs. I'm not saying it's free, but it will probably still turn out cheaper and easier than hunting down a BKM-14L that you, in the end, don't even know works properly or not.

As a final word, I would like to add a comment regarding this whole D65 vs D93 discussion and how calibration is a very important part of this subject: Unless you're 100% certain what color temperature your monitor shows, there's absolutely no guarantee whatsoever that what you're seeing is accurate. Therefore, first claiming that "D93 is a must for retro games" and later "Calibration isn't necessary for retro gaming" is nothing but a way of saying that facts are facts and research is overkill. Unless we dig into this subject and dare to question ourselves and the methods we're using, we will never get anywhere with a discussion like this. I may make some people uncomfortable, and I may seem like overkill to the average Joe. But if I weren't this extreme in others' eyes (though I wouldn't personally call myself that), I would most likely not have my current job.

Calibration is about manual work, knowledge and image. Not automation, digits or mini-golf. GLHF.
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

nissling wrote:No, just no. The first sentence is correct. The second (quoted) sentence is nothing but a big, misleading lie.

With an ICC profile (i.e. a LUT), you're not changing anything on the display itself. Not at all. You're only altering the output signal. That goes against what calibration is all about, for a number of reasons.
Ho boy.

What your *computer* monitor displays is a combination of its natural light-generating rules and the signal the video card sends it. You can change these colours at either spot. We manually calibrate monitors as close as we can to whatever standards are required, but for the domestic monitors that occupy the majority of our production floor, it's not possible to get perfect accuracy, because the controls just aren't there (there's no remote you can plug in, so everything's done via controls on the panel). From there we profile them with Argyll, which generates information that can be fed into any software colour correction tool on any OS (we use Linux, but macOS and Windows have identical features). That gets the monitor closer again to reference.

And I say for a second time, that's not how we treat the OLEDs and digital cinema projectors we have, where we do bother to go to the pedantic detail you've mentioned in this thread, because that's where it matters. For the rest of our very large, very busy creative studio, we don't have thousands of man-hours per month to calibrate hundreds of monitors that drift every 30 days. So instead, we use quick, automated methods for this exact reason. (15 minutes times 200 monitors is still 50 man hours a month, assuming you can get on the computer to profile the display when you want, as often you have to work around the artist).

And none of that has anything to do with the OCIO and ACES colour science and software we use in our professional software, where we differentiate between camera look, display look and shot look, and the right combination for given footage depending on the hardware used along the way.

https://en.wikipedia.org/wiki/Academy_C ... ing_System
http://opencolorio.org/

That's entirely off topic for "I want to calibrate my home CRT to play video games".
nissling wrote:If you are aiming for a target, why stop halfway there?
Again, question what you want out of your efforts. If it took you an hour of work to get to "half way", is it worth another hour of work to get something that is going to deliver a 1% real world difference to your naked eye? If you're a professional colourist, sure. If you're a desktop designer, maybe. If you're playing a NES on a CRT, shit no.
nissling wrote:Calibration is about manual work, knowledge and image.
No argument. But again, horses for courses. Your very pedantic, very detailed post is the sort of thing we do in three of our suites, and not the 200+ other displays we have around the building. Why? Because those man hours aren't feasible, even in a professional studio.
GeneraLight wrote:I am, but there are diminishing returns. Color values that are calibrated 99% or even 95% as well won't make any difference at all as far as human eyesight is concerned.
You, sir, are 100% correct, on the topic of what this thread is about. Some of these posts are a cracking example of the difference between satisfying real world requirements, and what an over-excited engineer wastes precious time and money on for no perceivable difference, other than satisfying personal pedantry.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

elvis wrote:What your *computer* monitor displays is a combination of its natural light-generating rules and the signal the video card sends it. You can change these colours at either spot. We manually calibrate monitors as close as we can to whatever standards are required, but for the domestic monitors that occupy the majority of our production floor, it's not possible to get perfect accuracy, because the controls just aren't there (there's no remote you can plug in, so everything's done via controls on the panel). From there we profile them with Argyll, which generates information that can be fed into any software colour correction tool on any OS (we use Linux, but macOS and Windows have identical features). That gets the monitor closer again to reference.
And an ICC profile will still cause issues with accuracy. No matter what you say or want. It's not calibration, it's profiling.
elvis wrote:So instead, we use quick, automated methods for this exact reason.
And those monitors aren't calibrated. It doesn't really matter what you say or want.
elvis wrote:Again, question what you want out of your efforts. If it took you an hour of work to get to "half way", is it worth another hour of work to get something that is going to deliver a 1% real world difference to your naked eye? If you're a professional colourist, sure. If you're a desktop designer, maybe. If you're playing a NES on a CRT, shit no.
You are missing my point. The differences between what the BKM-14L achieved and what I could do in a relatively short period of time (about 30 minutes) are huge. And I'm doing this for a client who has paid me for the work. Is either of us the one who should cut in and say, "Hold on, are you going to play retro games? I may as well leave it here"?

If you think it's overkill to play video games on a properly calibrated monitor, how come you own a BVM to play games on? That, if anything, is extremely overkill.
elvis wrote:No argument. But again, horses for courses. Your very pedantic, very detailed post is the sort of thing we do in three of our suites, and not the 200+ other displays we have around the building. Why? Because those man hours aren't feasible, even in a professional studio.
And still you're missing my point. You think I'm the one being overkill and yet those who claim that D93 is the proper way to play retro games have no idea if their references are even correct?
elvis wrote:You, sir, are 100% correct, on the topic of what this thread is about. Some of these posts are a cracking example of the difference between real world requirements, and what an over-excited engineer wastes precious time and money on.
Like you do when you're playing on a BVM or PVM?
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

nissling wrote:And an ICC profile will still cause issues in accuracy. No matter what you say or want. It's not a calibration, it's a profiling.
Sure. And it's nice, quick and dirty for the lower end requirements on the desktop, particularly on monitors with very few adjustment options.
nissling wrote:If you think it's overkill to play video games on a properly calibrated monitor, how come you own a BVM to play games on?
Because I got one for free, before it was cool. Benefits of working for a VFX company and being known as the retro gaming guy when they throw out CRTs. :)
nissling wrote:And still you're missing my point.
Nah, I get your points fine. Everything you've said is deeply technical, and perfectly correct. But all equally pedantic too.
nissling wrote:yet those who claim that D93 is the proper way to play retro games have no idea if their references are even correct?
Oh, I'm 100% with you there. I read through this thread reading "D65 vs D93" the whole way, asking myself how the hell anyone was making these claims without knowing a damned thing about how the source was created. As you've alluded to above, and as I said point blank above, the source material matters. We're fortunate enough in our industry to have that information (we get detailed camera information from set with every single bit of information about the settings on the cameras - most of which are baked into metadata today, which is even nicer). But, how the hell do I know what settings some random Japanese game dev had their monitor calibrated to back in the 80s? Hell, it's difficult to get even basic information about that time in history, let alone pedantry to that nature.

So yeah, I support you in that claim. I still calibrate all my displays to a 6500K whitepoint, and basic sRGB/Rec709-type levels/curves, because that's just what I do everywhere else, and that just feels right to me. 9300K always looks too cold to me, for no scientific reason other than what I'm used to in my day-to-day work.
nissling wrote:Like you do when you're playing on a BVM or PVM?
Wanna know something funny? I've got another 15 PVMs/BVMs in my shed, untouched. Grand total cost: $0. :)
Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

nissling wrote:So, let's just get down to this subject, which I know will cause quite a bit of controversy, but I do believe that we shouldn't place too much reliance on old measurement tools when we have much better options today.
Why not? How does age play a factor?
A friend of mine bought a Sony BVM-D24E1WE about six months ago. The tube has around 36,000 hours on it and, despite that usage, it's in very good overall condition. I got to see it just two weeks after he bought it and was very impressed by the image. Two months ago he got to borrow a BKM-14L from a friend who claimed to have used it with success on his own D24E1WE, and my friend certainly saw an improvement
If your friend and his friend both saw an improvement on their BVM-D24s, then what's the issue here?
As soon as we turned on the BVM-D24E1WE, I saw immediately that the white balance was off. Supposedly he had used the BKM-14L to calibrate to D65, but the image was certainly too cold. Not cold like D93, but certainly nowhere near D65. We let it warm up for about forty minutes before I started taking measurements. And… Oh boy, I'll just cut to the fun part and show you a before and after graph…

With that said, things are very clear here. The probe is simply not as sensitive to blue as it's supposed to be, and therefore it gives you a far colder image than you want. The same issue would push the image colder than 10,000° Kelvin if you chose to calibrate to D93 with it. We could let it all end here, but let's dig a bit deeper into this subject.

When measuring the color temperature of the former "calibration", we see that the average color temperature is 7334° Kelvin, with whites at nearly 7400° Kelvin. That's a 13% deviation from 6500° Kelvin and should certainly not be considered a reference point. In fact, I'd say it's clearly noticeable even in general consumer environments.
Perhaps the probe was damaged or used incorrectly.
The Sony BVM-D24E1WE is one of the very finest CRTs I've ever seen, and seeing it calibrated to 6500° Kelvin and a light output of 100cd/m^2 was a completely new experience. It's extremely stable and sharp, with an exceptional, natural EOTF that is very difficult (if not impossible) to match even on new displays. Very dark, almost-black details are there and visible while you have clear whites in an overall bright image. The colors are pretty much flawless as well, and the greyscale looks perfect to the human eye. Along with my Sony HDM-3830E, this is probably my favorite CRT of all time.
Yes, I agree.

How noticeable is the difference in luminance between 100cd/m^2 and 70cd/m^2? The Sony BVM-D24E1WU has a maximum output of 100cd/m^2, while the Sony BVM-D32E1WU only has a maximum output of 70cd/m^2. Also, given that the two monitors have nearly identical specs but different sizes, can you say with certainty as a professional calibrator that size makes a difference in picture quality, given every other constant is exactly the same?
All in all, I honestly cannot recommend the BKM-14L to anyone looking for accurate white balance. I will say that the issues I saw give me the impression that it probably did work rather well in the 90s and early 00s, but nothing lasts forever. My C6 probably won't do the trick in 2030 either.
So you're basically saying that the accuracy of a calibration will depend on the age and condition of the tools being used?
Now, am I saying that the BKM-14L is a bad product or that you should let it go? No, I'm not! There are probably probes out there that do still provide some accuracy, and those could possibly do the trick. It's also possible to use it to get a more uniform image and prevent tint in certain areas of the tube. That seems to be a more reliable feature. But if you're looking to get the very best white balance and an accurate color temperature out of your wonderful BVM, I'd honestly recommend you look around for ISF-certified calibrators who can still calibrate Grade 1 CRTs. I'm not saying it's free, but it will probably still turn out cheaper and easier than hunting down a BKM-14L that you, in the end, don't even know works properly or not.
It sounds like the performance and results of the BKM-14L depend on what condition it's in. Probes that have a lot of dirt, wear/tear and usage probably won't perform nearly as well as a brand new one.

I once got my Sony KD-34XBR960 HD CRT professionally calibrated by a THX/ISF-certified calibrator. And while the difference in picture quality was mind-blowingly huge and amazing, the calibration was $425. Am I just better off buying a colorimeter and calibrating myself?
As a final word, I would like to add a comment regarding this whole D65 vs D93 discussion and how calibration is a very important part of this subject: Unless you’re 100% certain what color temperature your monitor shows, there’s absolutely no guarantee whatsoever that what you’re seeing is accurate. Therefore, first claiming that “D93 is a must for retro games” and later “Calibration isn’t necessary for retro gaming” is nothing but a way of saying that facts are facts and research is overkill.
I agree 100%. Even if you're only watching or playing content that was made for D65/D93, it won't matter if your display isn't calibrated correctly for D65/D93.
Unless we dig into this subject and dare to question ourselves and the methods we're using, we will never get anywhere with a discussion like this. I may make some people uncomfortable, and I may seem like overkill to the average Joe. But if I weren't this extreme in others' eyes (though I wouldn't personally call myself that), I would most likely not have my current job.

Calibration is about manual work, knowledge and image. Not automation, digits or mini-golf. GLHF.
True. Your results and the graphs were certainly an interesting and educational experience. I thank you for that.
elvis wrote:monitors that drift every 30 days. So instead, we use quick, automated methods for this exact reason.
Is it normal for displays to drift in calibration that quickly? I don't want to pay a ton of money to have someone calibrate my displays or spend several hours calibrating them myself just for them to fall out of spec after a month.
That's entirely off topic for "I want to calibrate my home CRT to play video games".

Again, question what you want out of your efforts. If it took you an hour of work to get to "half way", is it worth another hour of work to get something that is going to deliver a 1% real world difference to your naked eye? If you're a professional colourist, sure. If you're a desktop designer, maybe. If you're playing a NES on a CRT, shit no.
You make a good point.
elvis wrote:You, sir, are 100% correct, on the topic of what this thread is about. Some of these posts are a cracking example of the difference between satisfying real world requirements, and what an over-excited engineer wastes precious time and money on for no perceivable difference, other than satisfying personal pedantry.
Well, I don't mind if my calibration isn't 100% accurate as long as it falls within the margin of error for indistinguishable color to the naked human eye.
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

GeneraLight wrote:Is it normal for displays to drift in calibration that quickly? I don't want to pay a ton of money to have someone calibrate my displays or spend several hours calibrating them myself just for them to fall out of spec after a month.
First up, *all* displays drift. What makes them drift and how quickly is dependent on many things. Age in particular is the biggest cause of drift. OLEDs, for example, have a notable half-life which has a real-world impact on their colour accuracy. CRTs that rely heavily on capacitance and inductance of their components, as well as the health of the high voltage flyback transformer, degrade over time in a way that affects not only overall brightness, but also each gun independently.

Age is often the biggest factor (and when we say age, we mean how long the components have been turned on - details a PVM style monitor will give you, but often only about the tube, not telling you about the service dates of internal parts). The older a device is, the quicker it can drift (not always of course - manufacturing faults are a part of life). For most of us in the retro gaming community, we're all running old gear, so drift is a fact of life. Devices drift as they warm up, too. We never profile a device when it's just been turned on, regardless of display type. For LCDs and OLEDs, 15-30 minutes is a good wait time to see more clearly what the colour of the device is like. For CRTs, they can change over the course of an hour as components warm up (again, worse for older hardware).

In the VFX world, we're mandated a minimum of re-profiling/correcting (again, see pedantic discussion above on the difference) devices every 30 days. That's pushed down to us from upon high (i.e.: major film studios, Netflix, etc). And the science would back them up - we have some displays that (objectively measured with expensive equipment) drift quite a bit in that time, and not just because they're old (again, even within a single batch of monitors of the same brand and manufacture date, we get variance, because that's just life in electronics).

On top of that, ambient light makes a huge difference in the human eye's perception of light. When it comes down to it, we are hugely complex and infinitely variable analogue "machines" that interpret light based on millions of years of highly subjective evolution. And while colour science does everything in its power to try and formalise the rules of generating photons and wavelengths, a lot of the problem is how the wetware (your eyes/brain) interprets that energy output. Many colorimeters will include ambient light probes on the back of the devices (i.e.: pointing away from the screen). When measuring/profiling displays, that information can be taken into account to assist in delivering a relative measurement. Indeed, in a studio environment, we do everything we can to either standardise light (trying to ensure we have no direct light sources - walls painted monotone grey, identical brand lights of an agreed temperature pointing at walls only in rooms and suites), or remove light sources altogether (darkened rooms with only floor-level assistance lighting for walkways, or blockout curtains around operators). But say you've got a particular area that for whatever reason has some natural light hitting it, then that's going to give you subjective/relative differences in your perception across the course of the day.

Seeing this in action is easy. You can put different temperature bulbs in lights in two different rooms, and look at the same image on two different computers in those different rooms profiled/calibrated with respect to the ambient light. Move the two computers into the same room, and you instantly see the difference in the pictures side by side.

If you're interested in the deeper technical science of this from a TV/film/cinema point of view, I recommend reading "the bible" on the source:
https://github.com/jeremyselan/cinemati ... or_VES.pdf

That pretty much stands as the starting point most of the industry uses to build tools upon, which itself is ever changing as we develop newer and more accurate techniques to deal with all of these problems.

Looping all the way back around and answering your basic question, "is it worth it?" - again I repeat the core message I've said a few times in this thread. Colour is complex. Calibration is complex. You can spend dozens of hours getting things super accurate, only to have your display (especially if you're using a PVM with 10,000 hours on it) drift out of whack again.

Is it worth it? Only you can answer that. As I said above, despite having expensive equipment available to me that can calibrate my PVMs, I don't even bother for old games. I use the 240p test suite to get the gamma, grey, red, green and blue levels looking "good enough" to my naked eye, and I go from there. I'm not editing photos or making movies on this thing.

Perhaps if you've just picked up a device, getting it professionally calibrated once is worth it. Any drift from there will be relatively minor, and probably just a slight gamma adjustment away (assuming no major fault in the hardware). But yes, it will drift at some point. Whether you want to spend a bucket of cash in either the hardware to calibrate it yourself, or the professional services for someone else to do it, is up to you. For me, even with the tools and experience, the answer for my retro gaming enjoyment is "no, not worth it". I'd rather spend the precious spare hours outside of work and other commitments actually playing games. YMMV.
Einzelherz
Posts: 1279
Joined: Wed Apr 09, 2014 2:09 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Einzelherz »

Thank you, Nissling, for a fantastic set of posts.
SamIAm
Posts: 475
Joined: Thu Mar 03, 2005 1:09 am

Re: Color temperature for games and consoles: 6500K or 9300K

Post by SamIAm »

In the total absence of any apparent standardization for color display and correction by Japanese developers, and also in light of how most devs probably used composite video on a random consumer set to make their final calls anyway, there really is no such thing as a perfect calibration for console games. By all means, use the 240p Test Suite's pluge and color bar tests to make sure your black level is right and your individual R, G and B aren't way out of balance, but after that, I say go with what looks good, period.

I don't even use 75-ohm terminators on my monitors, except for one that has visible signal ringing without them. With the brightness and contrast turned down a bit to compensate, I can make the overall image nice and "punchy" in a way that I can't with terminators attached. Call me a philistine, but it looks like I've always imagined video games looking.

To me, 6500K is too mellow. The subtler tones are nice for live action, but I think that classic video games should have a certain intensity. I go with 9300K in all situations, and I've got my RGB gain levels balanced to keep the image from ever looking too cold.

These old broadcast monitors are fun, but I don't see them as a path to perfection. They're just a tool to help me get what I want, with minimal convergence and geometry issues and rich, high-TVL pixels.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

Just a quick, general reply. I may be able to get into more detail later on...
GeneraLight wrote:Why not? How does age play a factor?
Measurement tools of this sort do age and perform differently over time. A probe that's been sitting on a shelf for 20 years will most likely not perform like it did back then. A major advantage of the C6, though, is that every meter sold is manually certified by an engineer for various display types, plus you can always send it to Spectracal for re-certification. My C6 still measures properly, but it may drift somewhat for some display types by next year.
GeneraLight wrote:If your friend and his friend both saw an improvement on their BVM-D24s, then what's the issue here?
The issue is that the color temperature still wasn't accurate. As you can see in my graphs, the white balance was off by over 800° Kelvin. That's very noticeable even to the naked eye, yet they were still looking at the monitor thinking it displayed D65 properly.
GeneraLight wrote:Perhaps the probe was damaged or used incorrectly.
And here we come to one of my main points... Why do some people put so much confidence in such old calibration gear, which gives no information whatsoever about the changes being made and which you can't test with suitable software to see how it actually performs? It's all done blind and you have no control over anything. What you see is what you get, and believing that you'll get your monitor properly calibrated without knowing anything pretty much goes against what calibration is about.

It's fairly easy to follow the instructions for the BKM-14L and I don't believe it was misused.
GeneraLight wrote:So you're basically saying that the accuracy of a calibration will depend on the age and condition of the tools being used?
Yes, especially in a case where you don't get any graphs or other information on what's been done. With CalMAN, it's all done manually and I know the entire workflow.
GeneraLight wrote:It sounds like the performance and results of the BKM-14L depends on what condition it's in. Probes that have a lot of dirt, wear/tear and usage probably won't perform nearly as well as a brand new one.

I once got my Sony KD-34XBR960 HD CRT professionally calibrated by a THX/ISF-certified calibrator. And while the difference in picture quality was mind-blowingly huge and amazing, the calibration was $425. Am I just better off buying a colorimeter and calibrating myself?
These sorts of tools age even when not used. Still, what's the point in sticking to a 20-year-old method when you can spend your money either on the same service from someone with the right gear and knowledge to do it for you, or on some basic, modern calibration tools?

You can look into the X-Rite i1D3, as it's a colorimeter with great value for the money, and it supports CRTs. As for software, you'll probably want to start with HCFR, which is open source. Personally, I wouldn't say it's worth getting into calibration if you just want one display calibrated; in that case I'd buy the service like you did.
GeneraLight wrote:I agree 100%. Even if you're only watching or playing content that was made for D65/D93, it won't matter if your display isn't calibrated correctly for D65/D93.
And that's the point. There's no guarantee that the BKM-14L will give you accurate white balance, while manual work by someone who knows what he's doing will most likely end up much closer to the reference.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

elvis wrote:Oh, I'm 100% with you there. I read through this thread reading "D65 vs D93" the whole way, asking myself how the hell anyone was making these claims without knowing a damned thing about how the source was created. As you've alluded to above, and as I said point blank above, the source material matters. We're fortunate enough in our industry to have that information (we get detailed camera information from set with every single bit of information about the settings on the cameras - most of which are baked into metadata today, which is even nicer). But, how the hell do I know what settings some random Japanese game dev had their monitor calibrated to back in the 80s? Hell, it's difficult to get even basic information about that time in history, let alone pedantry to that nature.

So yeah, I support you in that claim. I still calibrate all my displays to a 6500K whitepoint, and basic sRGB/Rec709-type levels/curves, because that's just what I do everywhere else, and that just feels right to me. 9300K always looks too cold to me, for no scientific reason other than what I'm used to in my day-to-day work.
Finally, I think we're on the right track. I personally don't claim that D65 is the proper standard for retro gaming, nor that D93 would be more correct. And I can perfectly understand if someone doesn't care too much about color accuracy on their display while gaming. The thing is, just as I said, when people say that D65 is inaccurate for all retro games developed in Japan, I feel like asking them what kind of references such a statement is based on, other than D93 having been the NTSC-J standard. And when they just say that they've set their monitors to D65 and take that as a reference... *Facepalm*

What made me more interested in this subject was the BKM-14L, once it was brought up for discussion. If you've let it "calibrate" your monitor to a specific color temperature, you've at least tried to get a more accurate reference. But how well does it hold up? After seeing how this specific probe performed, it doesn't look too good.

Now, is the BKM-14L enough for anyone who just wants to get something more out of his or her BVM? Maybe. Is it enough for anyone who wants a reasonably accurate color temperature? No, I wouldn't say so. With deviations of 800-900° Kelvin, it's considerably off D65. Whether that's good enough for playing retro games is a completely different question, however, and for most people it probably is.

Personally I've always preferred D65, and having my BVM calibrated to it makes me confident that what I display on it is shown with an accurate color temperature. Whether that's accurate to the original, intended look of an NES game, however, I cannot answer. That's a completely different matter and one I'm not really interested in digging into any deeper.
GeneraLight wrote:How noticeable is the difference in luminance between 100cd/m^2 and 70cd/m^2? The Sony BVM-D24E1WU has a maximum output of 100cd/m^2, while the Sony BVM-D32E1WU only has a maximum output of 70cd/m^2. Also, given that the two monitors have nearly identical specs but different sizes, can you say with certainty as a professional calibrator that size makes a difference in picture quality, given every other constant is exactly the same?
Depending on the environment and what you're showing, it can be anywhere from negligible to drastic. The thing is, though, that the BVM-D32E1WU has a significantly larger screen than the D24, so the same light output will appear brighter on it. In a dark room I think I would be satisfied with 85-90cd/m^2 on a D32.

Besides, where did you get those specs? From my measurements the D24 could go as high as 160cd/m^2 without issues, though that's way too bright for a CRT if you ask me.
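As a rough illustration of why the gap can feel smaller than the numbers suggest: brightness perception is compressive and is often approximated with a cube-root law (as in CIE lightness). Under that assumption, and it really is only a rule of thumb:

Code: Select all

# Rule-of-thumb comparison of 100 vs 70 cd/m^2 peak white, assuming a cube-root
# brightness response. Illustration only; the viewing environment matters a lot.
peak_a, peak_b = 100.0, 70.0

print(f"measured ratio : {peak_b / peak_a:.2f}")               # 0.70
print(f"perceived ratio: {(peak_b / peak_a) ** (1 / 3):.2f}")  # ~0.89

So a 30% drop in measured output reads as something closer to a 10% step in apparent brightness, which is part of why it can feel negligible in a dim room.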

Regarding display drift, I'd say it all depends on how the display was calibrated and what it's used for, but to keep things simple let's just say you're using the display at home for playing games and watching some films...

If the display is calibrated against dE u'v', the RGB balance will generally be more stable to begin with (especially in the shadow details). Calibrating against dE2000, however, may end up giving you a more precise gamma. If your display decreases its light output over time, you'll most likely see more degradation in gamma than in white balance. For my OLED it's pretty much the opposite, though. The gamma and light output haven't changed, but the reds have shifted slightly since I had it professionally calibrated a year and a half ago (by an ISF-certified calibrator). But even when I measure it now and re-calibrate it, the differences are pretty much impossible to see. As long as you're just using it for enjoyment, it's simply not worth calibrating again at this point.

What you can do is use ColorChecker (which is free) maybe once in a while to get new measurements and see how it performs without knowing exactly what causes any of the issues. Generally speaking though I'd say that even if your calibrated display drifts over time it'll still look much better than a non-calibrated set. As long as you're happy and don't do anything that needs perfect accuracy you'll probably be fully satisfied with one, complete and successful calibration.
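As a side note, for a sense of scale on the u'v' side of things: here's a small sketch converting the standard D65 and D93 white points from xy to CIE 1976 u'v' and taking the distance between them (published coordinates, not measurements from this thread):

Code: Select all

# CIE 1931 xy -> CIE 1976 u'v', then the Euclidean distance between D65 and D93.
def xy_to_uv(x: float, y: float):
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

u1, v1 = xy_to_uv(0.3127, 0.3290)  # D65
u2, v2 = xy_to_uv(0.2831, 0.2971)  # D93
duv = ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5

print(f"D65 u'v' = ({u1:.4f}, {v1:.4f})")
print(f"D93 u'v' = ({u2:.4f}, {v2:.4f})")
print(f"distance = {duv:.4f}")

That comes out around 0.024, which is an enormous distance compared to the kind of white balance drift we've been talking about, so mixing the two standards up is never a subtle mistake.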
Last edited by nissling on Tue Feb 27, 2018 7:41 pm, edited 1 time in total.
Lord of Pirates
Posts: 508
Joined: Sun May 12, 2013 5:03 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Lord of Pirates »

GeneraLight wrote:
Now, am I saying that the BKM-14L is a bad product or that you should let it go? No, I'm not! There are probably probes out there that do still provide some accuracy, and those could possibly do the trick. It's also possible to use it to get a more uniform image and prevent tint in certain areas of the tube. That seems to be a more reliable feature. But if you're looking to get the very best white balance and an accurate color temperature out of your wonderful BVM, I'd honestly recommend you look around for ISF-certified calibrators who can still calibrate Grade 1 CRTs. I'm not saying it's free, but it will probably still turn out cheaper and easier than hunting down a BKM-14L that you, in the end, don't even know works properly or not.
It sounds like the performance and results of the BKM-14L depend on what condition it's in. Probes that have a lot of dirt, wear/tear and usage probably won't perform nearly as well as a brand new one.

I once got my Sony KD-34XBR960 HD CRT professionally calibrated by a THX/ISF-certified calibrator. And while the difference in picture quality was mind-blowingly huge and amazing, the calibration was $425. Am I just better off buying a colorimeter and calibrating myself?
elvis wrote:monitors that drift every 30 days. So instead, we use quick, automated methods for this exact reason.
Is it normal for displays to drift in calibration that quickly? I don't want to pay a ton of money to have someone calibrate my displays or spend several hours calibrating them myself just for them to fall out of spec after a month.
My guess would be that Sony used cheaper filters because they weren't concerned with longevity. I wouldn't buy a BKM-14L at all after seeing the results Nissling posted.

It depends on what meter they used. You'll lose accuracy by re-calibrating with an i1DP3 if they used something high end but it should still be acceptable considering cost.
Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

elvis wrote:Oh, I'm 100% with you there. I read through this thread reading "D65 vs D93" the whole way, asking myself how the hell anyone was making these claims without knowing a damned thing about how the source was created. As you've alluded to above, and as I said point blank above, the source material matters. We're fortunate enough in our industry to have that information (we get detailed camera information from set with every single bit of information about the settings on the cameras - most of which are baked into metadata today, which is even nicer). But, how the hell do I know what settings some random Japanese game dev had their monitor calibrated to back in the 80s? Hell, it's difficult to get even basic information about that time in history, let alone pedantry to that nature.
You're right. I don't think we'll ever know which color temperature Japanese game developers used for making NES, SNES, Genesis, Neo-Geo, N64, etc. games, and if there was only one used. We don't even know if their monitors were properly calibrated, so trying to figure out the standard is probably a futile and vain effort. Not to mention a plethora of other things to consider.
elvis wrote:Wanna know something funny? I've got another 15 PVMs/BVMs in my shed, untouched. Grand total cost: $0. :)
Nice. Which models do you have?
elvis wrote:First up, *all* displays drift. What makes them drift and how quickly is dependent on many things. Age in particular is the biggest cause of drift. OLEDs, for example, have a notable half-life which has a real-world impact on their colour accuracy. CRTs that rely heavily on capacitance and inductance of their components, as well as the health of the high voltage flyback transformer, degrade over time in a way that affects not only overall brightness, but also each gun independently.
How quickly would a CRT BVM drift and degrade, assuming it's brand new?
Age is often the biggest factor (and when we say age, we mean how long the components have been turned on - details a PVM style monitor will give you, but often only about the tube, not telling you about the service dates of internal parts). The older a device is, the quicker it can drift (not always of course - manufacturing faults are a part of life). For most of us in the retro gaming community, we're all running old gear, so drift is a fact of life. Devices drift as they warm up, too. We never profile a device when it's just been turned on, regardless of display type. For LCDs and OLEDs, 15-30 minutes is a good wait time to see more clearly what the colour of the device is like. For CRTs, they can change over the course of an hour as components warm up (again, worse for older hardware).
True. I guess it's something that we all have to accept, whether we want to or not.
In the VFX world, we're mandated a minimum of re-profiling/correcting (again, see pedantic discussion above on the difference) devices every 30 days. That's pushed down to us from upon high (i.e.: major film studios, Netflix, etc). And the science would back them up - we have some displays that (objectively measured with expensive equipment) drift quite a bit in that time, and not just because they're old (again, even within a single batch of monitors of the same brand and manufacture date, we get variance, because that's just life in electronics).
So you're saying that calibrating a display once a month isn't enough for a lot of major film studios and Netflix?
On top of that, ambient light makes a huge difference in the human eye's perception of light. When it comes down to it, we are hugely complex and infinitely variable analogue "machines" that interpret light based on millions of years of highly subjective evolution. And while colour science does everything in its power to try and formalise the rules of generating photons and wavelengths, a lot of the problem is how the wetware (your eyes/brain) interprets that energy output. Many colorimeters will include ambient light probes on the back of the devices (i.e.: pointing away from the screen). When measuring/profiling displays, that information can be taken into account to assist in delivering a relative measurement. Indeed, in a studio environment, we do everything we can to either standardise light (trying to ensure we have no direct light sources - walls painted monotone grey, identical brand lights of an agreed temperature pointing at walls only in rooms and suites), or remove light sources altogether (darkened rooms with only floor-level assistance lighting for walkways, or blockout curtains around operators). But say you've got a particular area that for whatever reason has some natural light hitting it, then that's going to give you subjective/relative differences in your perception across the course of the day.
Very well said. I've always shared the same sentiments about the sheer intricacy of humans. Now, I'm not going so far as to paint the walls in my room a monotone grey, but I can do a pitch-black room with zero light. What about white walls with natural sunlight coming in through a single window in the morning/noon (my favorite time to play games if not playing in pitch-black darkness)?
Seeing this in action is easy. You can put different temperature bulbs in lights in two different rooms, and look at the same image on two different computers in those different rooms profiled/calibrated with respect to the ambient light. Move the two computers into the same room, and you instantly see the difference in the pictures side by side.
Yep. The reflected light of an incandescent light bulb shining from above will give you much different colors than natural daylight pouring in from the window.
If you're interested in the deeper technical science of this from a TV/film/cinema point of view, I recommend reading "the bible" on the source:
https://github.com/jeremyselan/cinemati ... or_VES.pdf
I am, and thank you.
That pretty much stands as the starting point most of the industry uses to build tools upon, which itself is ever changing as we develop newer and more accurate techniques to deal with all of these problems.
It's pretty amazing thinking about the things we're capable of. Maybe one day we can develop displays and calibration methods that will never drift or degrade. :mrgreen:
Looping all the way back around and answering your basic question, "is it worth it?" - again I repeat the core message I've said a few times in this thread. Colour is complex. Calibration is complex. You can spend dozens of hours getting things super accurate, only to have your display (especially if you're using a PVM with 10,000 hours on it) drift out of whack again.
Yep. I just want a 100% accurate calibration that lasts a long time (if not forever). What about a brand new BVM in excellent condition?
Is it worth it? Only you can answer that. As I said above, despite having expensive equipment available to me that can calibrate my PVMs, I don't even bother for old games. I use the 240p test suite to get the gamma, grey, red, green and blue levels looking "good enough" to my naked eye, and I go from there. I'm not editing photos or making movies on this thing.
I guess so. It seems SamIAm and you both have the right idea.
Perhaps if you've just picked up a device, getting it professionally calibrated once is worth it. Any drift from there will be relatively minor, and probably just a slight gamma adjustment away (assuming no major fault in the hardware). But yes, it will drift at some point. Whether you want to spend a bucket of cash in either the hardware to calibrate it yourself, or the professional services for someone else to do it, is up to you. For me, even with the tools and experience, the answer for my retro gaming enjoyment is "no, not worth it". I'd rather spend the precious spare hours outside of work and other commitments actually playing games. YMMV.
Realistically, I think that's what I'd do. Either pay a THX/ISF-certified calibrator to get it fully calibrated and 100% accurate (color, color temperature, geometry, convergence, focus, contrast, brightness, white balance, etc.) just once and live with it, or buy/rent some good calibration tools at a modest price and fully calibrate all of my displays just once.

At the end of the day, I just want to play games and watch movies in the highest fidelity, and calibration is just one of the many things that delays reaching that point for me. Along with researching and buying switchers, fully shielded BNC/SCART cables with the correct specs, modding my consoles for RGB/HDMI, etc.
Einzelherz wrote:Thank you, Nissling for a fantastic set of posts.
SamIAm wrote:In the total absence of any apparent standardization for color display and correction by Japanese developers, and also in light of how most devs probably used composite video on a random consumer set to make their final calls anyway, there really is no such thing as a perfect calibration for console games. By all means, use the 240p Test Suite's pluge and color bar tests to make sure your black level is right and your individual R, G and B aren't way out of balance, but after that, I say go with what looks good, period.

I don't even use 75-ohm terminators on my monitors, except for one that has visible signal ringing without them. With the brightness and contrast turned down a bit to compensate, I can make the overall image nice and "punchy" in a way that I can't with terminators attached. Call me a philistine, but it looks like I've always imagined video games looking.

To me, 6500K is too mellow. The subtler tones are nice for live action, but I think that classic video games should have a certain intensity. I go with 9300K in all situations, and I've got my RGB gain levels balanced to keep the image from ever looking too cold.

These old broadcast monitors are fun, but I don't see them as a path to perfection. They're just a tool to help me get what I want, with minimal convergence and geometry issues and rich, high-TVL pixels.
Yeah, I'm beginning to agree with you. Even if we could somehow find out which color temperature Japanese game developers used and whether their monitors were properly calibrated (good luck with that), chances are they weren't calibrated to a standard D65 or D93 at all, or that games on a single console were made at multiple color temperatures.
nissling wrote:Just a quick, general reply. I may be able to get into more detail later on...
Alright. I'm interested in hearing your full insight.
Measurement tools of this sort do age and perform differently over time. A probe that's been sitting on a shelf for 20 years will most likely not perform like it did back then.
But why though? What would physically cause the probe to not function as well as it once did?
The issue is that the color temperature still wasn't accurate. As you can see on my graphs, the white balance was off by over 800 K. That's very noticeable even to the naked eye, yet they were still looking at the monitor and thinking it displayed D65 properly.
I don't doubt your graphs and result data for a second. But there are probably a lot more variables at play here. I mean, you only have a sample size of one.
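For reference, a figure like "over 800 K off" normally comes from converting the measured CIE 1931 xy chromaticity of the white patch into a correlated color temperature. Below is a minimal sketch of that conversion using McCamy's well-known approximation; the "measured" xy value is invented purely for illustration and is not taken from nissling's graphs.

Code: Select all
# Sketch: estimating correlated colour temperature (CCT) from a measured
# CIE 1931 (x, y) chromaticity using McCamy's approximation.
# The "measured" coordinates below are invented for illustration only.

def cct_mccamy(x: float, y: float) -> float:
    """McCamy's cubic approximation of CCT from CIE 1931 xy."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

reference = cct_mccamy(0.3127, 0.3290)   # D65 white point, ~6500 K
measured = cct_mccamy(0.302, 0.318)      # hypothetical bluish white point
print(f"target ~{reference:.0f} K, measured ~{measured:.0f} K, "
      f"error ~{measured - reference:+.0f} K")

With these made-up numbers the white point lands roughly 750 K above the D65 target, which is the scale of error being discussed here.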
And here we come to one of my main points... Why do some people have confidence in such old calibration gear that gives no information whatsoever about the changes being made, and that you can't test with suitable software to see how it actually performs? It's all being done blind and you have no control over anything. What you see is what you get, and believing that you'll get your monitor properly calibrated without knowing anything pretty much goes against what calibration is about.
Probably because it was made specifically for broadcast monitors that demanded 100% accurate colors and white balance. Sony wouldn't make an expensive probe for an expensive broadcast monitor that gives inaccurate results.
It's fairly easy to follow the instructions for the BKM-14L and I don't believe it was misused.
You're right, I can't argue that. It's so simple to use.
Yes, especially in a case where you don't get any graphs or other information on what's been done. With CalMAN, it's all done manually and I know the entire workflow.
That is one drawback for sure. I really liked how my XBR960 calibration came with a graphed report detailing before-and-after color/gamma values, much like your graphs. It really adds a lot of reassurance for the money you spent having someone calibrate a display for you.
These sorts of tools even age when not used. Still, what's the point in sticking to a 20 year old method when you can spend your money on either the same service from someone with the right gear and knowledge to do it for you or some basic, modern calibration tools?
What causes them to age though? Dust, light exposure?
You can look into the X-Rite i1D3, as it's a colorimeter with great value for your money. It supports CRTs. As for software you'll probably want to start with HCFR, which is open source. I personally wouldn't say it's worth getting into calibration if you just want one display calibrated; instead, buy the service like you did.
Hmm, $279. Not bad. I would use it for my 3 CRTs and 1 LCD, and probably even all of my family's HDTVs in the house just for the hell of it.
And that's the point. There's no guarantee that the BKM-14L will give you accurate white balance, while manual work by someone who knows what he's doing will most likely end up much closer to the reference.
What about doing multiple auto-calibrations with the probe?
nissling wrote:Finally I think we're on the right track. I personally don't claim that D65 is the proper standard for retro gaming, nor that D93 would be more correct. And I can perfectly understand if someone doesn't care too much about color accuracy on their display while gaming. The thing is, just like I said, that when people say that D65 is inaccurate for all retro games that were developed in Japan, I feel like asking them just what kind of references such a statement is based on, other than D93 having been the NTSC-J standard. And when they just say that they've set their monitors to D65 and take that as a reference... *Facepalm*

What made me more interested in this subject was the BKM-14L once it was brought up for discussion. If you've let it "calibrate" your monitor to a specific color temperature, you've at least tried to get a more accurate reference. But how well does it hold up? After seeing how this specific probe performed, it doesn't look too good.
I'm not sure if properly calibrated monitors and a standardized color temperature is even a thing in the video game industry. And even if it is, was it always like that? We have no way of knowing what the source material was designed on, aside from the fact that D93 was the NTSC-J standard for television broadcasts until 2011.
Now, is the BKM-14L enough for anyone who just wants to get something more out of his or her BVM? Maybe. Is it enough for anyone who wants a somewhat accurate color temperature? No, I wouldn't say so. With differences of 800-900 K, it's considerably off from D65. Whether that's good enough for playing retro games is a completely different question, however, and for most people it probably is.

Personally I've always preferred D65, and having my BVM calibrated to it makes me feel confident that what I display on it is also shown with an accurate color temperature. Whether or not that's accurate to the original, intended look of an NES game, however, I cannot answer. That's a completely different matter and one I'm not really interested in digging into any deeper.
Perhaps. But like I said before, these probes probably have a varying degree of accuracy and quality, and a sample size of only 1 simply isn't good enough to definitively conclude that the BKM-14L Auto-Setup Probe is a bad choice for 100% accurate calibrations.
Depending on the environment and what you're showing, it can be anywhere from negligible to drastic. The thing is, though, that the BVM-D32E1WU has a significantly larger screen than the D24, and thus the same luminance amounts to more total light output on it. In a dark room I think I would be satisfied with 85-90cd/m^2 for a D32.
What about a white room with bright natural morning/noon sunlight pouring in through a window with white blinds, and the same room completely pitch-black at night? Those are the conditions I plan on playing my games in.
Besides, where did you get those specs? From my measurements the D24 could go as high as 160cd/m^2 without issues, though that's way too bright for a CRT if you ask me.
http://broadcaststore.com/pdf/model/22107/22107.pdf

Specifications:
D32: Within 5% for luminance from 0 to 70 cd/m2
D24: Within 5% for luminance from 0 to 100 cd/m2
Regarding display drift, I'd say it all depends on how the display was calibrated and what the usage is, but to make things easier let's just say you're using the display in your home for playing games and watching some films...

If the display is calibrated with dE u'v', the RGB balance will generally be more stable to begin with (especially in shadow details). With dE2000, however, you may end up with a more precise gamma. If your display decreases its light output over time, you'll most likely see more degradation in gamma than white balance. For my OLED it's pretty much the opposite though. The gamma and light output haven't changed, but reds have shifted slightly since I had it professionally calibrated 1½ years ago (by an ISF certified calibrator). But even when I measure it now and re-calibrate it, the differences are pretty much impossible to see. As long as you're just using it for enjoyment, it's just not worth calibrating it again by now.
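For reference, the "dE u'v'" mentioned just above is simply the distance between two white points in the CIE 1976 u'v' chromaticity plane, which is how white-balance error and drift are usually quantified. A minimal sketch, with invented sample values rather than anything from the reports in this thread:

Code: Select all
# Sketch: the delta u'v' metric for white-balance error/drift.
# u'v' is the CIE 1976 uniform chromaticity scale; the straight-line
# distance between two points roughly tracks how visible the shift is.
import math

def xy_to_uv(x: float, y: float) -> tuple[float, float]:
    """Convert CIE 1931 xy to CIE 1976 u'v' chromaticity."""
    denom = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / denom, 9.0 * y / denom

def delta_uv(xy_a, xy_b) -> float:
    ua, va = xy_to_uv(*xy_a)
    ub, vb = xy_to_uv(*xy_b)
    return math.hypot(ua - ub, va - vb)

d65 = (0.3127, 0.3290)     # target white point
drifted = (0.306, 0.324)   # hypothetical slightly blue-shifted white
print(f"delta u'v' = {delta_uv(d65, drifted):.4f}")  # roughly 0.005 here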
So I should get my displays fully calibrated to be 100% accurate once and live with it, until I start seeing issues again?
What you can do is use ColorChecker (which is free) maybe once in a while to get new measurements and see how it performs without knowing exactly what causes any of the issues. Generally speaking though I'd say that even if your calibrated display drifts over time it'll still look much better than a non-calibrated set. As long as you're happy and don't do anything that needs perfect accuracy you'll probably be fully satisfied with one, complete and successful calibration.
You're right. I think that's what I'll do.
Lord of Pirates wrote:
GeneraLight wrote:
Now, am I saying that the BKM-14L is a bad product or that you should let it go? No, I'm not! There are probably probes out there that do still provide some accuracy, and those could possibly do the trick. It's also possible to use it in order to get a more uniform image and prevent tint on certain areas of the tube. This seems to be a more reliable feature. But if you're looking to get the very best white balance and accurate color temperature out of your wonderful BVM, I'd honestly recommend you look around for ISF-certified calibrators who can still calibrate Grade 1 CRTs. I'm not saying that it's free, but it will probably still turn out cheaper and easier than hunting down a BKM-14L that, in the end, you don't even know works properly or not.
It sounds like the performance and results of the BKM-14L depends on what condition it's in. Probes that have a lot of dirt, wear/tear and usage probably won't perform nearly as well as a brand new one.

I once got my Sony KD-34XBR960 HD CRT professionally calibrated by a THX/ISF-certified calibrator. And while the difference in picture quality was mind-blowingly huge and amazing, the calibration was $425. Am I just better off buying a colorimeter and calibrating myself?
elvis wrote:monitors that drift every 30 days. So instead, we use quick, automated methods for this exact reason.
Is it normal for displays to drift in calibration that quickly? I don't want to pay a ton of money to have someone calibrate my displays or spend several hours calibrating them myself just for them to fall out of spec after a month.
My guess would be that Sony used cheaper filters because they weren't concerned with longevity. I wouldn't buy a BKM-14L at all after seeing the results Nissling posted.

It depends on what meter they used. You'll lose accuracy by re-calibrating with an i1DP3 if they used something high-end, but it should still be acceptable considering the cost.
I see you bolded the part about the XBR960. What filters are you talking about? And I don't quite understand what you mean about the i1DP3 and meters being less accurate than the base calibration from the factory.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

But why though? What would physically cause the probe to not function as well as it once did?
These devices are designed to pick up the light that the monitor emits. This involves a sensitive step where the light must be captured and interpreted by a fixed sensor. Much like anything else, degradation is a fact. It's not just about wear and tear; time itself causes it to change.
I don't doubt your graphs and result data for a second. But there are probably a lot more variables at play here. I mean, you only have a sample size of one.
If you, or anyone else, can provide some graphs and overall analysis I'd be very interested. As a calibrator I want to take every chance I can get to see how the BKM-14L actually performs in comparison to a modern, certified colorimeter. In Sweden you'll most likely not find anyone else even willing to make this sort of comparison but me.
Probably because it was made specifically for broadcast monitors that demanded 100% accurate colors and white balance. Sony wouldn't make an expensive probe for an expensive broadcast monitor that gives inaccurate results.
That doesn't mean that:
1. Current options are less accurate.
2. The probe itself is still as reliable today as it was back then.

And considering the scant amount of information you get from the process, especially in comparison to even modern consumer software, you'll need to do a ColorChecker analysis in order to see if the probe can still hold up.
What causes them to age though? Dust, light exposure?
I'd say overall degradation. I don't think dust or light exposure was the issue with this probe, considering the clear shift in white balance. My guess is that the probe has become less sensitive over the years, and since blue carries comparatively little light, it measured it especially low. The probe therefore compensated by increasing B-gain and decreasing R-gain.
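To make that mechanism concrete, here is a toy sketch of how a desensitised blue channel could skew an auto-setup. This is an assumption for illustration only, not a description of the BKM-14L's actual algorithm:

Code: Select all
# Toy model (assumption, not the BKM-14L's real algorithm): a probe whose
# blue filter has degraded under-reads blue, so the auto-setup pushes the
# monitor's blue gain up until the probe is "happy".

target = {"R": 100.0, "G": 100.0, "B": 100.0}    # reading the probe aims for
sensitivity = {"R": 1.00, "G": 1.00, "B": 0.85}  # aged filter reads blue low

# Gain each gun needs so that the probe's reading reaches the target.
gains = {ch: target[ch] / (sensitivity[ch] * 100.0) for ch in target}

# What the screen actually emits after a "successful" auto-setup.
actual = {ch: 100.0 * gains[ch] for ch in target}
print(gains)   # blue gain pushed up by ~18%
print(actual)  # real output is blue-heavy even though the probe reads 100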
What about doing multiple auto-calibrations with the probe?
If the auto-calibration is this inaccurate, there's no point in doing it to multiple displays.
Perhaps. But like I said before, these probes probably have a varying degree of accuracy and quality, and a sample size of only 1 simply isn't good enough to definitively conclude that the BKM-14L Auto-Setup Probe is a bad choice for 100% accurate calibrations.
The only way to identify whether a BKM-14L is even worth using is to let it do its calibration and then check with certified equipment and CalMAN whether it still holds up. If you want to do it, go ahead. I'm waiting for more people to try this out and share their results. I've done this once and I'm not impressed. I can move on.

You will not be able to get an accurate calibration on a CRT this old by the way.
What about a white room with bright natural morning/noon sunlight pouring in through a window with white blinds, and the same room completely pitch-black at night? Those are the conditions I plan on playing my games in.
What? With those conditions, you should honestly reconsider whether a CRT is even worth it as an all-round solution. If I were you I'd rather buy an OLED or an LCD with a matte screen.
Thank you!
So I should get my displays fully calibrated to be 100% accurate once and live with it, until I start seeing issues again?
I cannot decide what you think is your best alternative. That's up to you.
Xer Xian
Posts: 881
Joined: Sun Feb 06, 2005 3:23 pm
Location: Italy

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Xer Xian »

GenerHALight wrote:CUT
Image

.. don't mind me, just kidding.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

Nerdgasm! :lol:
Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

nissling wrote:These devices are designed to pick up the light that the monitor emits. This involves a sensitive step where the light must be captured and interpreted by a fixed sensor. Much like anything else, degradation is a fact. It's not just about wear and tear; time itself causes it to change.
But how does time degrade it though? Assuming it's properly stored in a dark, room-temperature setting with no moisture, humidity or dust.
If you, or anyone else, can provide some graphs and overall analysis I'd be very interested. As a calibrator I want to take every chance I can get to see how the BKM-14L actually performs in comparison to a modern, certified colorimeter. In Sweden you'll most likely not find anyone else even willing to make this sort of comparison but me.
That doesn't mean that:
1. Current options are less accurate.
2. The probe itself is still as reliable today as it was back then.

And considering the scant amount of information you get from the process, especially in comparison to even modern consumer software, you'll need to do a ColorChecker analysis in order to see if the probe can still hold up.
The only way to identify whether a BKM-14L is even worth using is to let it do its calibration and then check with certified equipment and CalMAN whether it still holds up. If you want to do it, go ahead. I'm waiting for more people to try this out and share their results. I've done this once and I'm not impressed. I can move on.
If only I had the equipment and expertise.
I'd say overall degradation. I don't think dust or light exposure was the issue with this probe, considering the clear shift in white balance. My guess is that the probe has become less sensitive over the years, and since blue carries comparatively little light, it measured it especially low. The probe therefore compensated by increasing B-gain and decreasing R-gain.
But how did it get less sensitive? And if you say time, what exactly did time change to make it less sensitive?
If the auto-calibration is this inaccurate, there's no point in doing it to multiple displays.
That's not what I meant. I meant doing multiple consecutive auto-calibrations on the same monitor to even out any inaccuracies.
You will not be able to get an accurate calibration on a CRT this old by the way.
Why not?
What? With those conditions, you should honestly reconsider whether a CRT is even worth it as an all-round solution. If I were you I'd rather buy an OLED or an LCD with a matte screen.
The CRT screens face away from the window, so there's no light shining on them at all. I'm going to invest in blinds so I can adjust the amount of sunlight that enters my room, and allow for a pitch-black room at night, since there's a street light outside that shines into my room.

Why would I reconsider? Zero input lag, motion clarity and fluidity that isn't garbage, amazing colors and black levels, 240p - 1080i support at their native resolution, light gun games, the list goes on and on.
Thank you!
Yeah, I seem to have misinterpreted that data. You mentioned, though, that the D32 outputs a brighter image than the D24 with the same settings because it's a bigger screen? How is that even possible if the data suggests it has a lower maximum cd/m^2?
I cannot decide what you think is your best alternative. That's up to you.
Well, I think I'll go with the one-time complete calibration.
Xer Xian wrote:
GenerHALight wrote:CUT
Image

.. don't mind me, just kidding.
Well, forgive me for being so inquisitive but during the past few weeks I've wondered whether you might be having some second thoughts about the mission.
Last edited by Ikaruga11 on Tue Feb 27, 2018 11:19 pm, edited 1 time in total.
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

But how does time degrade it though? Assuming it's properly stored in a dark, room-temperature setting with no moisture, humidity or dust.
I'm not an engineer. I'm just a calibrator; I have to know how my tools behave.
But how did it get less sensitive? And if you say time, what exactly did time change to make it less sensitive?
It gets old and degrades. You may just as well ask why your CRT has degraded, or just about anything else you own.
That's not what I meant. I meant doing multiple consecutive auto-calibrations on the same monitor to even out any inaccuracies.
Why would that help if the probe is inaccurate to begin with?
Why not?
Because the tubes have aged and will not be able to perform completely accurately again. Those that are in good condition are still very capable, but they will never be completely accurate. In fact, it's debatable whether any display has ever been completely accurate.
Why would I reconsider? Zero input lag, motion clarity and fluidity that isn't garbage, amazing colors and black levels, 240p - 1080i support at their native resolution, light gun games, the list goes on and on.
You won't gain any of the contrast of a CRT if you're using it in a bright environment. The highly reflective glass and natural EOTF kill CRTs unless you're in a dark room (or with controlled dimming). If you're looking for "amazing black levels" I cannot recommend anything but an OLED.
You mentioned, though, that the D32 outputs a brighter image than the D24 with the same settings because it's a bigger screen? How is that even possible if the data suggests it has a lower maximum cd/m^2?
You're reading my posts completely wrong.

70cd/m^2 on a 32" display is brighter than 70cd/m^2 on a 24" display. I've never said 70cd/m^2 on a 32" display is brighter than 100cd/m^2 on a 24" display, nor that the D32 outputs a brighter image than the D24 "with the same settings". I didn't even mention anything about settings.

I honestly don't think that you're seriously into this. The fact still stands: You cannot know for sure that your BKM-14L does its process properly unless you measure it. If you ever do so please give us your results.
Ikaruga11
Posts: 1454
Joined: Thu Apr 07, 2016 1:32 pm

Re: Color temperature for games and consoles: 6500K or 9300K

Post by Ikaruga11 »

nissling wrote:I'm not an engineer. I'm just a calibrator; I have to know how my tools behave.
Fair enough.
It gets old and degrades. You may just as well ask why your CRT has degraded, or just about anything else you own.
A harsh and unfortunate reality. :cry:
Why would that help if the probe is inaccurate to begin with?
You said the BKM-14L you tested has very high blue levels and low red levels. What I'm saying is, if you kept using it, it would bring the blues down and bring the reds up. Repeat the auto-calibration until the RGB values are evened out.
Because the tubes have aged and will not be able to perform completely accurately again. Those that are in good condition are still very capable, but they will never be completely accurate. In fact, it's debatable whether any display has ever been completely accurate.
Fair enough. I guess the battle for perfection is an unwinnable one in the long term.
You won't gain any of the contrast of a CRT if you're using it in a bright environment. The highly reflective glass and natural EOTF kill CRTs unless you're in a dark room (or with controlled dimming). If you're looking for "amazing black levels" I cannot recommend anything but an OLED.
I also said I would be using it when my room is pitch-black. I know I won't get optimal picture quality if there's light in the room, and I can accept that. With blinds, I can make the room dimmer in the morning/noon so less light is coming in and my room is darker. I can also try to control any ambient light emitted from power LEDs on things such as game consoles, surge protectors and monitors. Not sure how much of a difference that will make, if any at all.
70cd/m^2 on a 32" display is brighter than 70cd/m^2 on a 24" display.
How though? I'm wondering if that can be a good thing or a bad thing.
I've never said 70cd/m^2 on a 32" display is brighter than 100cd/m^2 on a 24" display, nor that the D32 outputs a brighter image than the D24 "with the same settings". I didn't even mention anything about settings.
When I mentioned settings, I meant that all variables and constants are identical to have conclusive and accurate results. Same video source, same calibration, same lighting, same condition, etc.
I honestly don't think that you're seriously into this. The fact still stands: You cannot know for sure that your BKM-14L does its process properly unless you measure it. If you ever do so please give us your results.
I kind of am serious. Well, I'll look into buying that X-Rite i1D3 or something similar/better for calibrating my IPS LCD monitor, and I can use it to measure my BVM after auto-calibrating it with the BKM-14L. Aside from color, are contrast and brightness all you need to adjust on a fixed-panel display such as a plasma, LCD, OLED, etc.?
Last edited by Ikaruga11 on Wed Feb 28, 2018 2:18 am, edited 3 times in total.
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

GeneraLight wrote:Nice. Which models do you have?
I'm due for a stocktake. I've got over 40 CRTs now across my arcade and domestic tube collecting, so it all gets lost in the noise. I'll try and get model numbers soon.

Quick snap of one set of shelving in the shed. On the list of "things to clean up one day". Some PVM 20s, 14s and 9s in there. Mix and match of models (some RGB/YPbPr, some only composite/s-video, pretty sure at least one is only monochrome too).

Image
GeneraLight wrote:How quickly would a CRT BVM drift and degrade, assuming it's brand new?
Again, it depends on a lot of things. Age is one factor. Component quality is another. Drift happens all the time (minutes after you profile/calibrate), but at tiny percentages. When you notice drift also differs person by person. So again, we profile and calibrate every display every 30 days as a running average in our workplace. At home, I adjust displays when I notice problems, which can be months/years. But again, at home for retro gaming, my level of concern is quite low.
GeneraLight wrote:So you're saying that calibrating a display once a month isn't enough for a lot of major film studios and Netflix?
Like all business problems, the "cost to reward" ratio is the biggest determining factor. You can get all pedantic about colour and spend hours checking these things every week, but as I've said a few times now, that doesn't scale to hundreds of monitors, as you're just spending a tonne of money on not only multiple colorimeters to get the work done in parallel, but also on wages of technical staff to do the calibration/profiling, as well as dead time for artists who can't use their workstations.

So the big studios average it out to once per month per display. And again, the profiling we do on the workstation of someone doing 3D modelling or simulations is going to be far less detailed than what nissling documented a page back, because it's not feasible. But at the same time, we go to even deeper pedantic detail for our DIT/grading suites than that, because it matters more there. Our compositors sit somewhere in the middle. Pragmatism is necessary in business, not just over-excited theory.
GeneraLight wrote:What about white colored walls with natural sunlight coming in through a single window in the morning/noon (my favorite time to play games if not playing in pitch-black darkness).
Ambient light differs in warmth throughout the day. Midday is typically coolest; morning and evening are warmest. All of these are going to change your "white balance". How do you profile/calibrate for that? I'd probably take a running average and calibrate for when you most commonly play games. I play mostly during the evening with fairly consistent artificial lighting, so I calibrate for that.
GeneraLight wrote:It's pretty amazing thinking about the things we're capable of. Maybe one day we can develop displays and calibration methods that will never drift or degrade. :mrgreen:
To make a display that never shifts would mean making a display that either doesn't require traditional components (i.e.: no copper/silicon, no capacitors, etc), or making those components perfect (i.e.: they never change with temperature, they never degrade over time). If you could figure out how to make either of these things, you'd be VERY rich. :)
GeneraLight wrote:Yep. I just want a 100% accurate calibration that lasts a long time (if not forever). What about a brand new BVM in excellent condition?
*Everybody* wants "100% accurate calibration that lasts a long time (if not forever)". But there's a reason we all spend big dollars trying to get that. :)

Speaking very, very generally, the Sony BVM CRT was the result of many very clever people doing their best to produce a monitor as close to "perfect" as possible. One also needs to understand that these things retailed for $30,000 - $60,000 for certain models, so they're right up there at the "price is no issue" end of town. Assuming everything is as expected (no manufacturing faults, they sit in a humidity and temperature controlled environment), I'd expect these things after a proper professional calibration to be good for a year. BUT, I also know these things don't always sit in air-conditioned studios, and often travel out on site to shoot locations. Physical movement, transport, temperature and humidity fluctuations, etc, all play havoc with these poor things. The company I work for used to do outdoor broadcast before switching over to VFX, and we'd have to calibrate our PVMs and BVMs before and after every major job/shoot (which could last anywhere from single-digit days, to 3-6 months) because of all of these factors.

FWIW, we buy various broadcast OLED displays today that retail for the same sorts of prices. We don't have a lot of them (maybe 2-3 scattered around the studio), but we have some cheaper ones (in the $10K range) here and there too. 90% of our studio uses standard UHD/HDR displays that are factory calibrated to Adobe sRGB. These days we've got a few newer ones that will go all the way to the DCI-P3 gamut, which is nice. But again, these cost a lot, so they're not everywhere yet. We're still working with quite a lot of regular old 8-bit-per-pixel domestic LCDs for stuff that's not as colour-sensitive, and the final checking/finishing is done on the nicer displays. Again, business pragmatism all the way.
GeneraLight wrote:Realistically, I think that's what I'd do. Either pay an THX/ISF-certified calibrator to get it fully calibrated and 100% accurate (color, color temperature, geometry, convergence, focus, contrast, brightness, white balance, etc.) just once and live with it, or buy/rent some good calibration tools at a modest price and fully calibrate all of my displays just once.
I think for the purposes of a retro/arcade gaming enthusiast, this is pretty reasonable. You know the device is looking as it should at least from that point forward, and then it's up to you to make minor adjustments based on "eyeball tests".
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

You said the BKM-14L you tested has very high blue levels and low red levels. What I'm saying is, if you kept using it, it would bring the blues down and bring the reds up. Repeat the auto-calibration until the RGB values are evened out.
Why waste time and money on that? You're still as blind the tenth time as the first, and by then it would've been faster to use a colorimeter and CalMAN to get the measurements. Not to mention, you can get suitable modern equipment for what a BKM-14L costs anyway. The choice is easy for me.
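As an aside, a toy simulation of why repeating the process doesn't help: averaging can smooth out random run-to-run noise, but a systematic bias in the probe gets baked into every single pass. The bias and noise figures below are invented for illustration and are not measurements of any real probe:

Code: Select all
# Toy simulation (invented numbers, not the BKM-14L's behaviour): repeated
# auto-calibration with a biased probe converges on the bias, not the truth.
import random

TRUE_TARGET = 6500.0   # the white point we actually want, in kelvin
PROBE_BIAS = 800.0     # hypothetical systematic error of an aged probe
PROBE_NOISE = 50.0     # hypothetical random run-to-run spread

random.seed(1)
cct = 9300.0           # monitor starts way off
for run in range(10):
    measured = cct - PROBE_BIAS + random.gauss(0.0, PROBE_NOISE)
    cct -= measured - TRUE_TARGET   # probe drives the display to *its* 6500 K
    print(f"run {run + 1}: display is actually at ~{cct:.0f} K")
# The display settles around TRUE_TARGET + PROBE_BIAS (~7300 K), never 6500 K.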
How though? I'm wondering if that can be a good thing or a bad thing.
Because it's larger? You're missing that the measurements are per square meter.

For instance, 30cd/m^2 on a 100" screen is about equal to 120cd/m^2 on a 50" screen if the aspect ratio is the same. Since the screen area is four times as large, you'll end up with four times the total light output for the same luminance per square meter.
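A quick worked version of that arithmetic; the 16:9 aspect ratio and the screen sizes are my own example values, chosen only to show how the numbers scale:

Code: Select all
# Sketch: the same luminance (cd/m^2) on a bigger screen means more total
# emitted light, because luminance is defined per unit area.
import math

def screen_area_m2(diagonal_inches: float, aspect=(16, 9)) -> float:
    """Approximate screen area in square metres for a given diagonal."""
    w, h = aspect
    diag_m = diagonal_inches * 0.0254
    return (diag_m * w / math.hypot(w, h)) * (diag_m * h / math.hypot(w, h))

for size, luminance in [(24, 100.0), (32, 70.0), (50, 120.0), (100, 30.0)]:
    total = luminance * screen_area_m2(size)   # luminous intensity in cd
    print(f'{size}" at {luminance:g} cd/m^2 -> ~{total:.1f} cd in total')

Run with these numbers, the 100" screen at 30 cd/m^2 and the 50" screen at 120 cd/m^2 come out equal in total output, which is the four-times-the-area point being made above.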
Aside from color, are contrast and brightness all you need to adjust on a fixed-panel display such as a plasma, LCD, OLED, etc.?
On pretty much all consumer displays, brightness, contrast, light output, EOTF* and white balance are the only things you can calibrate. Otherwise, what you see is what you get. The CMS is pretty much always broken. CRTs aren't actually much different, apart from convergence and geometry.

*You take measurements to see which EOTF is suitable for your environment, but you cannot really change what the display has to offer in this regard.
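For anyone wondering what an EOTF target actually looks like inside calibration software, here is a minimal sketch of the BT.1886 reference EOTF commonly used for SDR displays. The white and black luminance values are example figures, not measurements of any particular monitor:

Code: Select all
# Sketch: the BT.1886 reference EOTF often used as an SDR calibration target.
# lw/lb are example white/black luminances in cd/m^2, not real measurements.

def bt1886(v: float, lw: float = 100.0, lb: float = 0.01,
           gamma: float = 2.4) -> float:
    """Luminance in cd/m^2 for a normalised video level v in [0, 1]."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

for level in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"video {level:.2f} -> {bt1886(level):7.3f} cd/m^2")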
tjstogy
Posts: 341
Joined: Tue Sep 01, 2015 1:27 am
Location: New York

Re: Color temperature for games and consoles: 6500K or 9300K

Post by tjstogy »

The first post may as well be "why is the sky blue? oh and what is the most accurate color of the sky, and how can it accurately be measured from everywhere in the world? Actually I don't even look at the sky--- but if I did, what device do I use to capture it so it can look perfect?"
nissling
Posts: 454
Joined: Sun May 10, 2015 8:12 am
Location: Stockholm, Sweden

Re: Color temperature for games and consoles: 6500K or 9300K

Post by nissling »

Anyone who believes that there is such a thing as "pure white" on a display should have a look at this video in order to understand how CIE1931 works.

https://www.youtube.com/watch?v=82ItpxqPP4I
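As a small companion to that video: "white" on a display is just an agreed-upon chromaticity coordinate, and D65 and D93 are simply two different points along the CIE daylight locus. A minimal sketch using the standard CIE daylight-illuminant approximation (treat the last decimal places as approximate):

Code: Select all
# Sketch: D65 and D93 as CIE 1931 xy chromaticities on the daylight locus,
# via the standard CIE daylight-illuminant approximation. There is no single
# "pure white"; only an agreed-upon target coordinate.

def daylight_xy(cct: float) -> tuple[float, float]:
    """Approximate CIE 1931 xy of a D-series illuminant (4000-25000 K)."""
    t = cct
    if 4000 <= t <= 7000:
        x = -4.6070e9 / t**3 + 2.9678e6 / t**2 + 0.09911e3 / t + 0.244063
    elif 7000 < t <= 25000:
        x = -2.0064e9 / t**3 + 1.9018e6 / t**2 + 0.24748e3 / t + 0.237040
    else:
        raise ValueError("outside the range of the approximation")
    y = -3.000 * x**2 + 2.870 * x - 0.275
    return x, y

# D-series illuminants are conventionally evaluated at 6504 K / 9305 K
# rather than a literal 6500 / 9300 (revised radiation constant).
for name, cct in [("D65", 6504), ("D93", 9305)]:
    x, y = daylight_xy(cct)
    print(f"{name}: x = {x:.4f}, y = {y:.4f}")

For D65 this lands at roughly x 0.3127, y 0.3291, and for D93 at roughly x 0.2831, y 0.2971: two different "whites" on the same locus.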
elvis
Posts: 984
Joined: Fri Nov 04, 2005 10:42 pm
Location: Brisbane, Australia

Re: Color temperature for games and consoles: 6500K or 9300K

Post by elvis »

nissling wrote:Anyone who believes that there is such a thing as "pure white"
I'm fairly certain people get the concept that white light doesn't exist. If not, ask them to give you the wavelength of white light, and say that you'll politely wait for them to find that information. If they need it visually, a close-up photo of the screen with even a modern phone camera illustrates immediately that there are no white pixels.

I mean, sure, delve into CIE standards and 21 minute long technical videos if you want. But go right back to the ultra basics and it all makes sense too.