Gaming on 77" Oled
Re: Gaming on 77" Oled
Strikers1945guy wrote:"Do we....eat chicken balls?!"
Re: Gaming on 77" Oled
I like turtles.Xyga wrote:orange808 wrote:Sure. You win.
We apologise for the inconvenience
-
bobrocks95
- Posts: 3472
- Joined: Mon Apr 30, 2012 2:27 am
- Location: Kentucky
Re: Gaming on 77" Oled
I think you should probably scoot your couch/TV back and invest in some extension cables...Flashman wrote:Yeah I must say I got a 55" LG for my gaming earlier this year. It's got a great game mode and I don't notice any lag at all. I would have preferred a smaller size though, and there didn't seem to be much available with the specs I was looking for. Since getting it, I do notice I sometimes have sore/bloodshot eyes after a long session - there's a limit to how far away I can sit due to wired controllers and space in the room. I have a bottle of eye drops on standby which sorts my eyes out, but obviously it's a little worrying what long-term effects it could be having.Xyga wrote:No, they present large screens because it's popular at the electronics shows, but MicroLED has been demonstrated in small modular panels (which can be tiled into custom-sized/shaped larger ones if desired)Classicgamer wrote:I suspect you'll be out of luck on smaller sizes with micro LED too. The consumer trend in TV sizes is only going up. Samsung's first micro LED set, "The Wall", is 140".
As far as TV tech goes, I think micro led is the least likely to appear in small sizes and the least likely to be suitable for users who sit close to the screen. There is a reason why they have only made a 140" one currently. I read that they were working on 98" and 75" models though.
MicroLED is precisely the tech that should make smaller sizes possible, while it's apparently a huge hassle with OLED because each panel size needs a costly dedicated production line, and there's no way to make modular designs, of course.
Anyway, MicroLED is nowhere near becoming affordable mainstream tech; we'll see what happens in a decade or so.
PS1 Disc-Based Game ID BIOS patch for MemCard Pro and SD2PSX automatic VMC switching.
-
- Posts: 873
- Joined: Thu Sep 11, 2014 3:37 pm
Re: Gaming on 77" Oled
One thing I found with OLED is that it can be too bright - to the point where it hurts my eyes and gives me headaches. The first thing I did when I got mine was to turn the brightness down until it was comfortable. This is good practice for looking after a display like this anyway, but also for looking after your eyes.
I worry about my son watching it as the excessive brightness of blue light from modern displays is now the leading cause of ocular degeneration which is now the leading cause of blindness.... Which is scary.... I like being able to see stuff.
I know it's tempting to leave an OLED brighter than the sun as it can handle it without messing up the contrast, but it isn't necessary. It still looks great when the brightness is turned down to the point where it stops causing pain.
-
- Posts: 124
- Joined: Tue Oct 30, 2018 8:29 pm
Re: Gaming on 77" Oled
Flashman - try turning down the brightness and the OLED light setting. I would also suggest a bias light strip behind your screen; $10-20, and it will help with your eye soreness.
Re: Gaming on 77" Oled
Any display can be too bright, and any display can be adjusted to be less bright. It's a meaningless complaint. Reducing the brightness of OLED displays prolongs their lifespan, so it's a good idea in any case.
As for blue light, there is no risk from the blue light from displays: the sun ordinarily puts out blue light at an order of magnitude greater intensity than even a bright display on max brightness. Staring at a high-powered LED or flashlight might be another story, but that's not what a display is.
https://www.health.harvard.edu/blog/wil ... 9040816365
Re: Gaming on 77" Oled
This is most likely untrue; we shouldn't be in a rush to believe such things.Classicgamer wrote:the excessive brightness of blue light from modern displays is now the leading cause of ocular degeneration which is now the leading cause of blindness....
-
BazookaBen
- Posts: 2079
- Joined: Thu Apr 17, 2008 8:09 pm
- Location: North Carolina
Re: Gaming on 77" Oled
Yeah, my first thought is: you probably get exposed to more blue light going outside in the sun than you would all day watching TV.shroom2k wrote:This is most likely untrue, we shouldn't be in a rush to believe such things.Classicgamer wrote:the excessive brightness of blue light from modern displays is now the leading cause of ocular degeneration which is now the leading cause of blindness....
-
- Posts: 3
- Joined: Sat Aug 31, 2019 3:23 pm
Re: Gaming on 77" Oled
I have an LG B7. When I got it and went looking for calibration guides, they all talked about reducing the "OLED Light" level as low as you can while still being comfortable with the image given the light level of the room it's in. As others have said, reducing the light level is better for the health of the display as well. There is also an energy-saving option which adjusts the brightness based on the light level of the room; most guides will have you turn that off, but if you're having eye strain, maybe give it a try.
Re: Gaming on 77" Oled
Typical brightness for a computer monitor: 100 nits
Typical sustained SDR brightness for a television: 200 nits
Typical direct sunlight: 20,000 - 40,000 nits
The sun has both blue *and* UV light, but sure, let's worry about our TVs being too bright.
-
- Posts: 873
- Joined: Thu Sep 11, 2014 3:37 pm
Re: Gaming on 77" Oled
I don't know how true the blue light / ocular degeneration thing is as it depends who you ask. I'm not sure anyone knows for sure yet.
Either way it's silly to use comparisons with the sun as a source of comfort because you don't spend hours each day staring at the sun. And, if you did, you would most certainly have serious vision problems including blindness. Most of us can barely stand staring at the sun for a few seconds.
We do spend a lot of time staring at screens though. Many modern displays are bright enough to be seen in direct sunlight which is a good indication of how much brighter they are than the screens of yesteryear.
What I know for sure is that my Oled displays hurt my eyes when I first got them so I turned down the brightness and now it doesn't. Did I save myself from certain blindness or just avoid a nasty headache? Who knows... Either way, it's a good idea.
BTW, the problem is not specific to OLED. It's any HDR-capable or ultra-high-brightness display. Current OLED and QLED TVs are both excessively bright by default imo.
Re: Gaming on 77" Oled
The 20-40 thousand nits is not from staring at the sun, it's from staring at any surface that the sun is shining on. In other words, it's the brightness of general matte reflective surfaces outdoors that are in the sunlight. The sidewalk outside your house on a clear day is hundreds of times brighter than your computer monitor, and you're getting way more blue light off that sidewalk than you'll ever get off your TV. You don't notice that it's so much brighter because your eyes adjust, but take your HDTV outside, let the sun shine directly on it, and maybe tape a sheet of paper to the screen: you'll see how much brighter that sheet of paper is than the brightest whites your TV can pump out.
-
- Posts: 873
- Joined: Thu Sep 11, 2014 3:37 pm
Re: Gaming on 77" Oled
I know enough about reflection to understand that the type of surface matters a great deal. Your average sidewalk or garden lawn is a very poor light reflector, and it's diffuse, i.e. it scatters light in all directions instead of focusing it. TVs are made to focus light towards our eyes.
If you were to walk on a retro-reflective or specular reflective surface, the reflection of the sun would be more than capable of damaging your eyes.
People that do those crazy treks across the North and South Pole have to wear eye protection to avoid damaging their eyes, as ice is significantly more reflective than concrete.
I'm no expert on ocular issues, but from what little I have read on it, it's not just the brightness that is problematic. It's the type and color of light.
Anyway, people experiencing headaches and eye strain from screen time have two choices. They can turn the brightness down and avoid issues. Or... Keep their headaches and eye strain because their TV isn't as powerful as the sun.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Gaming on 77" Oled
Or set the color temperature right (Warm 2), which will decrease the amount of blue light output. This will much closer match a properly calibrated screen.
Re: Gaming on 77" Oled
Here's a review of one of the TUF ELMB monitors.Fudoh wrote:LTT just posted a video with a new Asus monitor that offers strobing alongside variable refresh rate. Not the worst video although they don't differentiate between BFI and strobing. Definitely interesting monitor though for PC users.
https://www.tftcentral.co.uk/reviews/as ... vg27aq.htm
Looks like it has an odd "strobe scan" cadence of 2.5 strobes per screen refresh at 60Hz. Apparently, it looks bad at low refresh rates.
It's still interesting for very high refresh rates.
We apologise for the inconvenience
Re: Gaming on 77" Oled
Quick question about retro gaming on OLEDs. Has anyone ever reported suffering screen burn-in from using artificial scanlines from something such as an OSSC?
Just asking since scanlines are a must for me when it comes to 240p games, and a lot of artificial scanlines aren't solid black.
Re: Gaming on 77" Oled
Burn-in is not as big a problem as people seem to think. The RTings tests are an extreme case that represents something like 5-10 years of normal usage, at least. Their units are also the 2017 models, and newer models should have slightly better resistance to burn-in. For example, the 2018 models enlarged the red subpixel, presumably because red burned in faster than the other colours on the 2017 units.
Burn-in happens because of bright elements on the screen. RTings' test on a display that had letterboxed content with static ultra-bright logos in some places (the letterboxing would simulate the 4:3 pillarbox from retro games, and scanlines would be the same thing on a much smaller scale) showed extreme burn-in on the ultra-bright static logos, but no visible burn-in from the letterboxing.
Re: Gaming on 77" Oled
To add to that, if you're careful not to leave your TV on when you aren't using it, the risk of burn-in will be very low. When I see old TVs in the thrift store with burn-in, it's pretty much always the cable TV menu, or some advertising border that accompanied other content in a commercial situation. If you just use your TV normally and don't leave it sitting on a menu while you sleep, it'll probably be fine.Guspaz wrote:Burn-in is not as big a problem as people seem to think.
-
- Posts: 873
- Joined: Thu Sep 11, 2014 3:37 pm
Re: Gaming on 77" Oled
Real burn-in was a potential problem with CRT, plasma and now Oled. As with those other display technologies, you don't have to worry about it if you look after your TV properly.
It's unlikely to happen from letterbox content as the pixels are turned off completely with black screens on Oled displays. It's most likely to occur with bright static or flashing content over prolonged periods.
The most common occurrence on CRTs was from the flashing white "insert coin" on screens with brightness turned way up over years of 24/7 use. Or airport flight board displays. You could literally see the damage burnt into the phosphor grid even when the monitor was turned off. I have never seen "scanlines" burnt into a screen, and all CGA CRT monitors displayed 240p graphics with unused gaps between scanlines.
Most of the actual burn-in on OLED displays is from idiots - AKA "user error". They leave CNN's static logo on all day, every day, on its brightest HDR setting and never run the pixel refresher. Then they blame OLED tech for the burn-in.
I own 4 OLED screens if you include my phone. I regularly game on all of them, including MAME on my Windows 10 tablet, and no hint of burn-in. Just use common sense. Don't hold regular 12-hour Pong or Pac-Man sessions, as those old one-screen games were the worst offenders. And keep brightness down to sensible levels.
Re: Gaming on 77" Oled
OLED burn-in is cumulative, so it's going to happen to every OLED screen eventually. But the question is, will it happen to a problematic degree within the normal service life of a screen, and for the vast majority of people, no, it won't.
That said, I'm not sure why they haven't implemented burn-in correction by tracking the wear on individual subpixels. OLED subpixels have a predictable and consistent lifespan, they decrease in brightness in a predictable manner based on how much light they've pumped out. So why not track the total light output of each subpixel, so that you can correct for burn-in by reducing the brightness of the rest of the screen to compensate, or by increasing the brightness of the burnt-in subpixels?
It doesn't seem like it'd be that hard to do, maybe a 128-bit accumulator for every subpixel, and you take the current brightness of every subpixel, apply some weighting based on a curve, and add it up. Then you've got this convenient map of the effective brightness of the subpixels to do any corrections/compensations.
IIRC, LG OLEDs have four subpixels per pixel, so one gigabyte of flash storage and one gigabyte of RAM would be enough to track it. Plus the hardware to accumulate the data, but they're custom-designing the video chips in these things, they could add dedicated hardware to do it.
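[Editor's note: the accumulator scheme described above can be sketched in a few lines. This is a hypothetical toy, not anything LG is known to ship: the quadratic aging curve, the x1000 quantisation step, and the linear luminance-loss model are all assumptions, and the panel is shrunk to 4x4 so the logic is easy to follow. A real 4K WRGB set would track 2160 x 3840 x 4 subpixels with the same logic in dedicated hardware.]

```python
# Hypothetical per-subpixel wear tracker, scaled down to a tiny panel.
HEIGHT, WIDTH, SUBPIXELS = 4, 4, 4

# One integer accumulator per subpixel (the post suggests 128 bits;
# Python ints are unbounded, so that detail disappears here).
wear = [[[0] * SUBPIXELS for _ in range(WIDTH)] for _ in range(HEIGHT)]

def wear_step(level):
    """Assumed aging weight for one 8-bit drive level: wear grows
    superlinearly with brightness. The quadratic curve and the x1000
    quantisation step are illustrative, not measured."""
    return round((level / 255.0) ** 2 * 1000)

def accumulate(frame):
    """Fold one frame of 8-bit drive levels into the running wear map."""
    for y in range(HEIGHT):
        for x in range(WIDTH):
            for s in range(SUBPIXELS):
                wear[y][x][s] += wear_step(frame[y][x][s])

def uniformity_gain(loss_per_unit=1e-9):
    """Per-subpixel gain that dims the rest of the panel down to the
    most-worn subpixel, trading a little peak brightness for uniformity.
    The linear luminance-loss model is an assumption."""
    remaining = [[[1.0 - wear[y][x][s] * loss_per_unit
                   for s in range((SUBPIXELS))]
                  for x in range(WIDTH)]
                 for y in range(HEIGHT)]
    worst = min(min(min(col) for col in row) for row in remaining)
    return [[[worst / remaining[y][x][s]
              for s in range(SUBPIXELS)]
             for x in range(WIDTH)]
            for y in range(HEIGHT)]
```

The most-worn subpixel ends up with a gain of 1.0 and everything else is scaled down to match it, which is exactly the "uniform reduction in brightness" trade-off discussed later in the thread.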
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Gaming on 77" Oled
They must be doing something along those lines, because RTings' 1-year burn-in test showed no degradation in light output at all.
The TV also does compensation cycles in standby for every four hours of use to clean up any image retention. This is important, and most likely the reason why we see burn in on some store display models as most stores cut the power completely after closing time, never giving the TV a chance to compensate.
Re: Gaming on 77" Oled
Well, some of us like to use the TV as a secondary PC desktop screen. For example, I like to use it for equalizer/audio monitoring in DAWs - and for a good session, it needs to have that on screen for hours. So no OLED for me.
Re: Gaming on 77" Oled
That's actually proof that they're not doing it. If they were reducing the brightness of non-burnt-in subpixels to match the burnt-in ones, sacrificing overall screen brightness to hide or reduce burn-in (under the logic that burn-in or non-uniformity is more distracting than a uniform reduction in brightness), then you'd expect to see a reduction in light output with no visible burn-in. Instead, in their one year test, we see visible burn-in, but no reduction in overall light output.Konsolkongen wrote:They must be doing something along those lines. Because Rtings 1 year burn in test showed no degradation in light output at all.
It doesn't even need to be mandatory, all the other burn-in prevention features are optional toggles in the menu, so they can just add a toggle with some marketing name like "Uniformity Enhancer" or something.
-
bobrocks95
- Posts: 3472
- Joined: Mon Apr 30, 2012 2:27 am
- Location: Kentucky
Re: Gaming on 77" Oled
Ah yes let me just track 33,177,600 subpixels as individual data points and do some on-the-fly floating point arithmetic on themGuspaz wrote:OLED burn-in is cumulative, so it's going to happen to every OLED screen eventually. But the question is, will it happen to a problematic degree within the normal service life of a screen, and for the vast majority of people, no, it won't.
That said, I'm not sure why they haven't implemented burn-in correction by tracking the wear on individual subpixels. OLED subpixels have a predictable and consistent lifespan, they decrease in brightness in a predictable manner based on how much light they've pumped out. So why not track the total light output of each subpixel, so that you can correct for burn-in by reducing the brightness of the rest of the screen to compensate, or by increasing the brightness of the burnt-in subpixels?
It doesn't seem like it'd be that hard to do, maybe a 128-bit accumulator for every subpixel, and you take the current brightness of every subpixel, apply some weighting based on a curve, and add it up. Then you've got this convenient map of the effective brightness of the subpixels to do any corrections/compensations.
IIRC, LG OLEDs have four subpixels per pixel, so one gigabyte of flash storage and one gigabyte of RAM would be enough to track it. Plus the hardware to accumulate the data, but they're custom-designing the video chips in these things, they could add dedicated hardware to do it.
PS1 Disc-Based Game ID BIOS patch for MemCard Pro and SD2PSX automatic VMC switching.
Re: Gaming on 77" Oled
I get what you're saying, but we don't necessarily need to do any floating point in real time.bobrocks95 wrote:Ah yes let me just track 33,177,600 subpixels as individual data points and do some on-the-fly floating point arithmetic on themGuspaz wrote:OLED burn-in is cumulative, so it's going to happen to every OLED screen eventually. But the question is, will it happen to a problematic degree within the normal service life of a screen, and for the vast majority of people, no, it won't.
That said, I'm not sure why they haven't implemented burn-in correction by tracking the wear on individual subpixels. OLED subpixels have a predictable and consistent lifespan, they decrease in brightness in a predictable manner based on how much light they've pumped out. So why not track the total light output of each subpixel, so that you can correct for burn-in by reducing the brightness of the rest of the screen to compensate, or by increasing the brightness of the burnt-in subpixels?
It doesn't seem like it'd be that hard to do, maybe a 128-bit accumulator for every subpixel, and you take the current brightness of every subpixel, apply some weighting based on a curve, and add it up. Then you've got this convenient map of the effective brightness of the subpixels to do any corrections/compensations.
IIRC, LG OLEDs have four subpixels per pixel, so one gigabyte of flash storage and one gigabyte of RAM would be enough to track it. Plus the hardware to accumulate the data, but they're custom-designing the video chips in these things, they could add dedicated hardware to do it.
I can make a lookup table to avoid making the same extremely similar calculations over and over again. Accessing whatever data structure I construct is going to be dead easy--and fast.
Assuming 128 bit data, I can store the complete data set in about 500MB.
It's manageable.
We apologise for the inconvenience
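[Editor's note: the storage figure above checks out. A quick back-of-the-envelope, taking the 128-bit-per-subpixel accumulator at face value:]

```python
# Back-of-the-envelope for the storage claim above: a 4K WRGB panel
# (four subpixels per pixel, per the earlier post) with a 128-bit
# accumulator per subpixel.
pixels = 3840 * 2160
subpixels = pixels * 4            # 33,177,600 -- the figure quoted above
bytes_per_accumulator = 128 // 8  # 128 bits = 16 bytes

total_bytes = subpixels * bytes_per_accumulator
total_mib = total_bytes / 2**20   # roughly 506 MiB, i.e. "about 500MB"

print(subpixels, round(total_mib))
```

So "about 500MB" is right, and the earlier "one gigabyte would be enough" estimate is simply conservative.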
Re: Gaming on 77" Oled
The TV is already doing tons of mathematical operations on every single sub pixel every single frame. This is trivial in comparison to what it’s already doing, and it’s just a basic DSP-style operation with no logic. It’s already custom silicon they’re designing for the video processor in the TV.
Re: Gaming on 77" Oled
Why do any calculations at all? We know the approximate outcomes beforehand.Guspaz wrote:The TV is already doing tons of mathematical operations on every single sub pixel every single frame. This is trivial in comparison to what it’s already doing, and it’s just a basic DSP-style operation with no logic. It’s already custom silicon they’re designing for the video processor in the TV.
Just create a data structure (let's imagine a simple multidimensional array) that uses a few indexes (that correspond with the state of a pixel).
I already know how much "wear" that particular state of the pixel will create. I can precalculate it.
So, all you need to do is pull a value from the table and increment your current indexed "wear" value.
I'm just accessing ram, incrementing a value, and storing it.
Same concept applies when I "draw" a pixel and change the brightness based on wear. Lookup table.
Last edited by orange808 on Fri Sep 06, 2019 7:23 pm, edited 1 time in total.
We apologise for the inconvenience
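[Editor's note: the table-driven approach described above might look like this. It's a hypothetical sketch only; `WEAR_LUT` and the quadratic curve behind it are illustrative. The point is that all the floating-point work happens once at startup, leaving the per-frame hot path as a table lookup plus an integer add.]

```python
# Hypothetical precomputed wear table: one entry per possible 8-bit
# drive level. The quadratic aging curve is an assumption; it is
# evaluated once here instead of once per subpixel per frame.
WEAR_LUT = [round((level / 255.0) ** 2 * 1000) for level in range(256)]

def accumulate_row(wear_row, levels):
    """Per-frame hot path: table lookup plus integer add, no floating point.
    `wear_row` holds the running wear counters for one row of subpixels;
    `levels` holds that row's 8-bit drive values for the current frame."""
    for i, level in enumerate(levels):
        wear_row[i] += WEAR_LUT[level]
```

The drawing path would be the mirror image: index a second table by the accumulated wear bucket to get a brightness correction, again with nothing per-frame beyond the lookup.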
Re: Gaming on 77" Oled
Incrementing a stored value by a variable amount retrieved from a lookup table is a mathematical operation: it’s addition.
Re: Gaming on 77" Oled
Everything we perform is a mathematical operation, although the actual CPU or APU is just a circuit.Guspaz wrote:Incrementing a stored value by a variable amount retrieved from a lookup table is a mathematical operation: it's addition.
Floating point operations are inherently more complex to complete. So, it's to our advantage to avoid doing them when it's not necessary.
Of course, I'm only speaking from dev experience on 8 bit platforms and mobile, so what the fuck would I know about it?
We apologise for the inconvenience
Re: Gaming on 77" Oled
Right, and my point is that doing an additional mathematical operation (be it integral or floating point) on every subpixel is very easy when you consider that the video processor is already doing many mathematical operations on every subpixel. Replacing a floating-point multiply with a lookup table is just a cost optimization that reduces the transistor count at the expense of a tiny amount of extra RAM.