Cheapest video card for 4K@120hz for Retroarch?
When I get my 4K TV soon, I want to run Retroarch in 4K @ 120Hz and use CRT shaders. What would be the cheapest video card that would let me do this well? My thinking is that if I do black frame insertion at 120Hz, the CRT shader won't look as dark, and it will help motion look less blurry when using CRT shaders.
-
- Posts: 2185
- Joined: Mon Aug 14, 2017 8:34 pm
Re: Cheapest video card for 4K@120hz for Retroarch?
Which TV do you have that supports 4K @ 120Hz? I've never seen a commercial TV that goes above 60Hz with 4K. Some gaming monitors I'm sure do.
Regarding black frame insertion, if it's generated from software like Retroarch it will be subject to performance dips, whereas BFI from the TV/monitor itself would be consistent. Though on TVs as far as I've seen BFI is restricted to 60Hz, which is perfect for 60fps games, but will lead to a darker picture especially paired with scanline shaders. Backlight settings can compensate.
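The brightness cost of BFI mentioned above can be sketched with some back-of-the-envelope arithmetic (this is an illustration, not Retroarch code): at 120Hz with one black frame per 60fps game frame, the panel is lit only half the time, so perceived brightness is roughly halved. The 400-nit figure below is just a hypothetical panel spec.

```python
# Rough model: perceived brightness scales with the fraction of refresh
# cycles that actually show the image (the duty cycle). Software BFI at
# 120 Hz shows each 60 fps game frame once, then one black frame.
def perceived_brightness(panel_nits: float, lit_frames: int, total_frames: int) -> float:
    """Approximate perceived brightness as panel output times duty cycle."""
    duty_cycle = lit_frames / total_frames
    return panel_nits * duty_cycle

# Hypothetical 400-nit panel, 1 lit frame + 1 black frame per game frame:
print(perceived_brightness(400, 1, 2))  # 200.0 -> roughly half as bright
```

This is also why a darkening scanline shader stacked on top of BFI compounds the problem, and why people compensate with the backlight setting.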
Re: Cheapest video card for 4K@120hz for Retroarch?
4K120Hz? Sometime in 2020 or 2021.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
LG’s 2019 OLED TVs will have proper HDMI 2.1 inputs, so I would guess that these sets will support 4K@120Hz.
Re: Cheapest video card for 4K@120hz for Retroarch?
I was not aware that HDMI 2.1 is needed for 4K @ 120Hz. That is unfortunate, because I can get Samsung TVs at half price through my job this year, and I will likely be getting the Samsung 55" Q80R, which only has HDMI 2.0. I probably won't be at this job next year, which is why I want to buy a Samsung TV this year while I still have the discount.
That said, how do CRT shaders look in 4K @60Hz? CRT Royale using the aperture grille mask with settings slightly tweaked looks decent on my 40" 1080p Samsung CCFL LCD made in 2008 with brightness and contrast turned up. Some have said that the motion resolution with CRT shaders in 4K @ 60Hz isn't great and that is a significant concern of mine. However, I don't notice any motion resolution issues with CRT shaders on my 1080p display with horizontal scrolling. Motion does get pretty blurry with vertical scrolling but most retro games I play don't have vertical scrolling.
Last edited by Brad251 on Sat Mar 23, 2019 3:39 am, edited 1 time in total.
-
- Posts: 2185
- Joined: Mon Aug 14, 2017 8:34 pm
Re: Cheapest video card for 4K@120hz for Retroarch?
Brad251 wrote: That said, how do CRT shaders look in 4K @ 60Hz?
They can look pretty damn good. There was a good topic a while ago on this: viewtopic.php?f=6&t=58086&start=30
I posted some pics to this album:
https://imgur.com/a/3kd4WYI
If persistence blur doesn't bother you too much, the shader route on flat panels would work well for you. A lot of TVs can do black frame insertion which helps clear motion a lot, though the picture will look darker.
If these shaders are too heavy a load on your computer, a great option is to use the "interlacing" shader as the only shader pass in combination with CRTSwitchRes at either native or a super resolution, whatever works best with your TV or monitor.
Re: Cheapest video card for 4K@120hz for Retroarch?
For HDMI 2.1 you will need to change EVERYTHING, cables, TVs, receivers, everything, if even one item is not 2.1 throughout the chain, it will be downgraded to something lower.
As of right now there are no HDMI 2.1 TVs, receivers, nor cables. Meaning the best you would get is 4K60.
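The reason 4K60 is the ceiling on HDMI 2.0 comes down to link bandwidth, and a quick calculation shows it (this is a simplified sketch that ignores blanking intervals and TMDS/FRL encoding overhead, so the real requirements are somewhat higher than the raw pixel rate computed here):

```python
# Raw uncompressed pixel data rate for a given video mode, in Gbps.
# bits_per_pixel=24 assumes 8-bit RGB with no chroma subsampling.
def data_rate_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * refresh_hz * bits_per_pixel / 1e9

rate_4k120 = data_rate_gbps(3840, 2160, 120)  # ~23.9 Gbps of raw pixel data
rate_4k60 = data_rate_gbps(3840, 2160, 60)    # ~11.9 Gbps

print(rate_4k120 > 18.0)  # True: exceeds HDMI 2.0's 18 Gbps link budget
print(rate_4k60 < 18.0)   # True: fits, which is why 4K60 works on 2.0
```

Even before overhead, 4K120 at 8-bit RGB blows past HDMI 2.0's 18 Gbps link, which is why full-bandwidth HDMI 2.1 hardware is needed end to end.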
-
bobrocks95
- Posts: 3472
- Joined: Mon Apr 30, 2012 2:27 am
- Location: Kentucky
Re: Cheapest video card for 4K@120hz for Retroarch?
Lawfer wrote: For HDMI 2.1 you will need to change EVERYTHING, cables, TVs, receivers, everything, if even one item is not 2.1 throughout the chain, it will be downgraded to something lower.
As of right now there are no HDMI 2.1 TVs, receivers, nor cables. Meaning the best you would get is 4K60.
Considering LG's 2019 OLEDs start shipping in a couple of weeks, if OP is in that sort of price range then it's absolutely worth waiting for.
PS1 Disc-Based Game ID BIOS patch for MemCard Pro and SD2PSX automatic VMC switching.
Re: Cheapest video card for 4K@120hz for Retroarch?
bobrocks95 wrote: Considering LG's 2019 OLEDs start shipping in a couple of weeks, if OP is in that sort of price range then it's absolutely worth waiting for.
The risk of burn-in with OLED is too great for me to consider it for gaming. OLED displays are also a lot less bright than LED LCDs and aren't good for brighter rooms.
Re: Cheapest video card for 4K@120hz for Retroarch?
A lot of modern TVs with 60Hz BFI or 60Hz PWM cycle periods don't have much of a brightness problem anymore, and in a dark room it's not much of an issue.
You just won't get HDR-like levels of brightness. Most can still hit 100-200 nits, it seems, which is plenty bright.
My 1080p Sony hits around 80 nits and it's roughly a 2015 model. It works well in a dark room depending on the content. (Bad low-value transitions still cause dark detail to look smeary or show double images, unfortunately.) In a bright room, not so much, depending on the game.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
Brad251 wrote: The risk of burn in with OLED is too great for me to consider it for gaming. OLED displays are also a lot less bright than LED LCDs and aren't good for brighter rooms.
There is basically no risk of permanent burn in with normal use. If you use it as a PC monitor with static objects for hours on end every day, then yes, probably.
Re: Cheapest video card for 4K@120hz for Retroarch?
Konsolkongen wrote: There is basically no risk of permanent burn in with normal use. If you use it as a PC monitor with static objects for hours on end every day, then yes probably.
This just isn't true. Burn-in can occur, and a static object doesn't have to be on the screen for hours on end, day after day. I know several people who experienced permanent burn-in on their OLEDs. They had a static image on their screen for a few hours at a time, occasionally, and they got burn-in. Everyone has a different definition of normal use: what is normal TV use for a gamer would not be normal for someone who mainly watches movies. At the very least, the risk of burn-in on an OLED is much higher than with an LED LCD, and OLEDs are a really bad option for brighter rooms because they can't get that bright.
-
- Posts: 2185
- Joined: Mon Aug 14, 2017 8:34 pm
Re: Cheapest video card for 4K@120hz for Retroarch?
Do you know what year those OLEDs with burn-in are? 2016 panels seem to be the last ones that had a chance of burn-in even with non-insane usage. In 2017 LG changed the sub-pixel structure on their panels to vastly reduce the burn-in risk. Now it really does take static images displayed for hours on end, with certain colors and certain brightness settings.
Starting in 2017, brightness improved quite a bit too, enough to perform well in bright rooms. There are still things I dislike about current OLED panels, but these two are no longer issues.
Re: Cheapest video card for 4K@120hz for Retroarch?
fernan1234 wrote: Do you know what year those OLEDs with burn in are? 2016 panels seem to be the last ones that had the chance of burn in even with non-insane usage. In 2017 LG changed the sub-pixel structure on their panels to vastly reduce burn in risk. Now it does indeed take static images being displayed, and with certain colors, and with certain brightness settings, for hours on end.
Starting in 2017 brightness did improve quite a bit too, enough to perform well in bright rooms. There's still things I dislike about current OLED panels, but these two are no longer issues.
I work in retail, and the people who experienced burn-in on their OLEDs purchased the 2017-2018 models from my store. Even if the burn-in risk has been reduced, personally I don't want to take the risk that I could get burn-in. I'm happy to get a Samsung QLED with a picture that is nearly as good as OLED, save for the black levels.
-
- Posts: 208
- Joined: Thu Sep 27, 2018 1:04 am
Re: Cheapest video card for 4K@120hz for Retroarch?
Brad251 wrote: I work in retail and the people that experienced burn in on their OLED purchased the 2017-2018 models from my store. Even if the burn in risk has been reduced, personally, I don't want to take the risk that I could get burn in. I'm happy to get a Samsung QLED with a picture that is nearly as good as OLED, save for the black levels.
If you want some scientific data from relatively current OLED models:
https://www.rtings.com/tv/learn/real-life-oled-burn-in-test
https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled
Re: Cheapest video card for 4K@120hz for Retroarch?
Brad251 wrote: I work in retail and the people that experienced burn in on their OLED purchased the 2017-2018 models from my store.
Indeed, burn-in on OLED doesn't look like it's going anywhere. If you want to get an OLED to use lightly, only to watch anime or TV series off Blu-ray, it should be fine; anything beyond that might not be so fine.
https://www.youtube.com/watch?v=zyEA4YyjH9A
https://www.youtube.com/watch?v=GX78-Bw9lKM
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
Lawfer wrote: Indeed, burn in in OLED doesn't look like it's going anywhere, if you want to get an OLED to use it lightly only to watch anime or tv series off Blu-ray, then it should be fine, anything other than that and it might not be so fine.
Come on, implying that any kind of static image on an OLED will result in burn-in is completely unfair. As I've told you before, I game and watch local TV on my C8 for hours on end, and there isn't the slightest hint of burn-in, and I've looked for it.
https://www.youtube.com/watch?v=zyEA4YyjH9A
https://www.youtube.com/watch?v=GX78-Bw9lKM
Also, your links don't really prove anything. Yes, the last one with the 2016 set does indeed show burn-in, but we don't know how that set was treated; we only see the damage and one angry customer. He might have been incredibly careless with his TV. To be fair, I have no experience with OLEDs from 2016 and older, as I have only owned 2017 and 2018 sets, so let's assume for the sake of argument that burn-in did occur far more frequently on 2016 and older sets.
Now, for the first link: I don't see any burn-in at all in that video. I see the darker patches, yes, but it's also common knowledge that OLEDs have poor near-black uniformity, and on some panels this manifests as dark areas on a gray screen. His set is very bad, no doubt, and should absolutely be replaced.
The oval pattern we can see is most likely something the 2018 sets do sometimes on a uniformly colored screen. I have no idea why, as the 2017 sets didn't do this, but I suspect it has something to do with burn-in protection, and it never shows during actual content, so it has never bothered me.
I find it quite amusing that he claims to be an "expert calibrator" yet didn't know about these OLED quirks and immediately jumps the gun, shouting burn-in.
People reading this thread should follow energizerfellow's links, as they actually do a really good, scientific job of showing how much abuse is needed for a 2017 set to get permanent burn-in.
energizerfellow wrote: If you want some scientific data from relatively current OLED models:
https://www.rtings.com/tv/learn/real-li ... rn-in-test
https://www.rtings.com/tv/learn/permane ... n-lcd-oled
Re: Cheapest video card for 4K@120hz for Retroarch?
Look at this video:
https://www.youtube.com/watch?v=P_KW7WVBFb8
The TV is still good after 6 months of use with no signs of burn-in, but he clearly stresses twice that no gaming was done on the TV, and that the owners of this TV are not huge fanatics, just regular people doing regular TV viewing, because gaming on an OLED is known to carry a higher risk of burn-in.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
My personal experience is that gaming is completely fine and nothing to worry about. You can disregard that if you will. But it’s foolish to disregard Rtings’ very thorough and extensive testing, which even takes gaming with static HUDs into account.
This claim is also backed up by Vincent Teoh from HDTVTest, who IS a professional calibrator and has seen more OLEDs than we could ever dream of.
Just to make it clear: I’m not saying that burn-in can’t happen at all, but it will take quite a while.
Re: Cheapest video card for 4K@120hz for Retroarch?
65C7 owner here, and I used to put my OSSC on it. Big mistake. If you're one of the lucky ones who have no issues after extensive use, more power to you, but don't mislead people into thinking there isn't a risk. I love the OLED picture, but I cannot recommend it for retro gaming.
For the OP: best to wait and see if Samsung will at least make any sets this year with HDMI 2.1. I am waiting for the TCL 8K TV myself.
Displays I currently own:
LG 83C1(OLED),LG 77C2(OLED), LG 42C2(OLED),TCL 75R635(MiniLED),Apple Studio Monitor 21(PCCRT),SONY 34XBR960x2(HDCRT)
SONY 32XBR250,Samsung UBJ590(LED),Panasonic P50VT20(Plasma),JVC NZ8
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
I don’t think I’m misleading anyone here.
I would love to hear more info on your end. Like how long did you play at a time and did you have the panel brightness set to max?
On both a C7 and currently a C8, we have played 4:3 fighters through the OSSC several times for four to five hours straight. At the very worst there has been some slight image retention where the black bars were, but nothing that didn’t go away after a few hours.
Re: Cheapest video card for 4K@120hz for Retroarch?
Streets of Rage 2 in Line5x mode from the OSSC, from start to finish, about an hour of gameplay. I recall watching TV afterwards and still seeing the life bars. The image retention didn't go away for a couple of hours. Not going to risk it again; besides, 4:3 content would wear those pixels out over time.
With the LG OLED having 22ms of lag even in game mode, it just wasn't worth it when I could give up size and get better image quality and display lag on my Sony 34XBR960.
I do use the OLED for modern gaming. Shadow of the Tomb Raider looks gorgeous in 4K/60 on my 2080 Ti, so for single-player games the display lag isn't so bad. When the TCL 8K hits, expect a full review.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
So are the life bars from Streets of Rage 2 still visible on your TV to this day?
-
- Posts: 2185
- Joined: Mon Aug 14, 2017 8:34 pm
Re: Cheapest video card for 4K@120hz for Retroarch?
Bahn Yuki wrote: 65C7 owner here, and i used to put my OSSC on it. Big mistake.
I still remember your YouTube video where you gleefully exclaim "it's like a giant CRT!" from when I was looking into an OLED+OSSC combo last year. I tried it too and gave up on it as well, though for reasons other than burn-in.
Re: Cheapest video card for 4K@120hz for Retroarch?
Konsolkongen wrote: So are the life bars from Steets of Rage 2 still visible on your TV to this day?
No, but I won't risk the possibility of wearing out the pixels with 4:3 content. Yes, the infinite contrast and lower motion blur made it look like a giant CRT, but the display lag is noticeable compared to a CRT.
Hopefully the TCL 8K has similar results to the 4K series, and I'm curious to see how mini LED does, since microLED leaves a lot to be desired.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
Well then you don’t have burn-in on your TV. You had image retention, which is completely normal. Fair enough that you don’t want to risk anything, but I don’t think your one experience with image retention justifies calling my posts misleading.
I have never said that OLEDs are free of image retention. They are not. But I personally don’t consider that a problem anyway, as it goes away. Burn-in is permanent image retention and is much less common.
-
bobrocks95
- Posts: 3472
- Joined: Mon Apr 30, 2012 2:27 am
- Location: Kentucky
Re: Cheapest video card for 4K@120hz for Retroarch?
LG sets also have some sort of automatic wear leveling that runs every so many hours, so I would imagine 4:3 content wouldn't have as much of an impact as you'd think.
-
Konsolkongen
- Posts: 2315
- Joined: Fri May 16, 2008 8:28 pm
- Location: Denmark
Re: Cheapest video card for 4K@120hz for Retroarch?
That is correct. I believe that every four hours of use will trigger a compensation cycle when the TV is in standby.
Re: Cheapest video card for 4K@120hz for Retroarch?
Bahn Yuki wrote: With the LG oled having 22ms lag even in game mode it just wasn't worth it when I could just give up size and have better image quality and display lag on my Sony 34xbr960.
Remember, the quoted lag is measured at the middle of the screen, which includes input lag + 8.3ms; technically the processing lag on the LG is just under one frame.
Is that really noticeable vs. a CRT? I doubt many people could actually tell.
Also, the XBR960 is famous for being among the few CRTs that do deinterlacing and image processing for aspect correction on practically everything you feed it, generating input lag that is likely over one frame.
Maybe it's a case similar to the plasma motion placebo effect that for years made people claim plasmas had less lag than LCDs by default. That isn't true; rather, the better motion just made them feel more responsive.
It's possible you enjoy playing on the XBR960 more because of its CRT properties, but in reality it is very likely your LG that produces the least lag and the purest picture.
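The mid-screen measurement point matters because displays scan out from top to bottom: at 60Hz a full scanout takes about 16.7ms, so a sensor placed mid-screen picks up roughly half a scanout (~8.3ms) of raster delay that even a CRT would show. A small sketch of that correction (the 22ms figure is the number quoted in this thread, not an official spec):

```python
# Convert a mid-screen lag measurement into processing-only lag by
# subtracting half a scanout period, which is inherent to any raster display.
def processing_lag_ms(measured_mid_screen_ms: float, refresh_hz: float = 60.0) -> float:
    half_scanout = 1000.0 / refresh_hz / 2.0  # ~8.33 ms at 60 Hz
    return measured_mid_screen_ms - half_scanout

lag = processing_lag_ms(22.0)   # ~13.7 ms of actual processing delay
print(lag < 1000.0 / 60.0)      # True: just under one 60 Hz frame
```

By this accounting, a "22ms" mid-screen reading corresponds to under one frame of added processing versus an ideal zero-lag display.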
Strikers1945guy wrote:"Do we....eat chicken balls?!"
-
- Posts: 208
- Joined: Thu Sep 27, 2018 1:04 am
Re: Cheapest video card for 4K@120hz for Retroarch?
Xyga wrote: Also the XBR 960 is famous for being among the few CRTs to do interlacing+deinterlacing and image processing for aspect correction, on practically everything you feed it and generating input lag, which is likely to be over 1 frame.
Yep, those Hi-Scan and Super Fine Pitch FD Trinitrons like the KD-34XBR960 were a good ~2 frames behind on lag vs. an analog CRT TV. Lots of TVs are faster than that these days.
Here's a Hi-Scan Sony KV-27HS420 doing its laggy thing:
https://www.youtube.com/watch?v=--xY_NvwJ_g
Don't forget that YouTube lets you step a video frame by frame with the comma and period keys when the video is paused.