Using a PC for CRT shaders with original hardware
Requires a capture card, obviously. Install Shaderglass
https://github.com/mausimus/ShaderGlass
Use with monitoring or capture software. Ptbi is good for Blackmagic devices.
http://ptbi.metaclassofnil.com/
Install ReShade to Shaderglass for even more options.
https://reshade.me/
Re: Using a PC for CRT shaders with original hardware
Apologies, I'm not up-to-date on the latest capture cards, but wouldn't playing through the capture window introduce a lot of latency?
Re: Using a PC for CRT shaders with original hardware
Ptbi was designed with low latency in mind and has 2 to 4ms processing time. I’m not sure what the lag is from the Blackmagic card itself. From what I’ve read, the Micomsoft SC-512N1-L is the best out there.
https://www.youtube.com/watch?v=cUnZt7Kg03U
Don’t know about other cards. The trade-off is accepting some lag in favor of visuals.
Some good info here, but I’d look for something better than OBS to play games.
https://calvin.me/how-i-play-consoles-on-pc/
Re: Using a PC for CRT shaders with original hardware
I think you can also do this with the libretro/RetroArch video processor core, if you are running Linux. Maybe it would also work to feed live video from a capture card through the FFMPEG core.
Magewell PCIe cards might be another low latency option, and they support Nvidia GPUDirect Video as well.
No need to wait for a RetroTink 4K, move directly to 8K
Re: Using a PC for CRT shaders with original hardware
Going over this chart
https://calvin.me/static/dda1607f4b2149 ... atency.png
At 60fps, it looks like we can achieve 2 frames of lag pairing a line doubler with a low latency capture card.
https://fpstoms.com/
That seems perfectly acceptable to me. Does anyone know what brands the “generic” scaler with 48ms latency are?
ReShade can be installed to video programs like VLC as well, and ShaderGlass works with anything.
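The frames-to-milliseconds arithmetic behind the chart (and behind fpstoms.com) is simple enough to sketch. This is an illustration only; the function names are mine, and the 48ms figure is from the chart under discussion, not something I've measured.

```python
# Sketch: convert between frames of lag and milliseconds at a given
# source frame rate. One frame at `fps` lasts 1000/fps milliseconds.

def frames_to_ms(frames: float, fps: float = 60.0) -> float:
    """Latency in milliseconds for a given number of frames."""
    return frames * 1000.0 / fps

def ms_to_frames(ms: float, fps: float = 60.0) -> float:
    """Latency in frames for a given number of milliseconds."""
    return ms * fps / 1000.0

print(round(frames_to_ms(2), 1))  # 2 frames at 60fps -> 33.3 ms
print(ms_to_frames(48))           # the "generic" 48ms scaler -> 2.88 frames
```

So the chart's 48ms "generic" scaler is nearly a full frame worse than the 2-frame target.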
Re: Using a PC for CRT shaders with original hardware
I could never get reshade to look anything close to a consumer CRT or oldschool CRT PC monitor. I really tried. Spent hours with it but just couldn't. The only CRT shaders that I find convincing enough are the ones in Retroarch. CRT Royale is probably chief among them, but even that is not 1:1 like a CRT. Just the best replica I've seen.
Re: Using a PC for CRT shaders with original hardware
SGGG2 wrote:
Going over this chart
https://calvin.me/static/dda1607f4b2149 ... atency.png
At 60fps, it looks like we can achieve 2 frames of lag pairing a line doubler with a low latency capture card.
https://fpstoms.com/
That seems perfectly acceptable to me. Does anyone know what brands the “generic” scaler with 48ms latency are?
ReShade can be installed to video programs like VLC as well, and ShaderGlass works with anything.

For latency purposes, I would need to see a Time Sleuth photograph of the processed signal. There's really no need to speculate or guess; it can be tested. I'm not optimistic.
I don't remember the latency on my Datapath card, but it's an incredibly large and unplayable amount of lag.
I'm curious about some things:
How does the processing pipeline work? If you're scraping the "preview" window output from the capture card's software, that window is running inside the operating system. Modern operating systems and graphics drivers lock userland out of refresh rate controls. You need VRR to get a frame locked signal with the right refresh. What capture card software supports VRR? Do you have access to the raw input signal or are you processing video that has been permanently damaged by frame rate conversion?
Re: Using a PC for CRT shaders with original hardware
Extrems plays through his capture window and gets 2-3 frames of lag, not sure you can do much better than that in Windows.
Re: Using a PC for CRT shaders with original hardware
The latency chart is Time Sleuth results from a YouTuber who specializes in streaming technology. https://www.youtube.com/watch?v=AMuaQOXagxk&t=459s
ReShade has ports of several RA shaders, I’ve found them to be reasonably comparable. Librashader allows for RA shaders anywhere, but it’s in early development. https://snowflakepowe.red/blog/announci ... ader-0.1.0
The Elgato HD60X supports VRR. I had stuttering using a Blackmagic Intensity 4K Pro PCIe card with Ptbi on a fixed refresh display, but no such issues when switching to Freesync. I don’t know the technical details, just figured people here would be interested in testing it. I’d like to know, too. If you already have a gaming PC and capture card the only investment is time.
Re: Using a PC for CRT shaders with original hardware
Here are some RetroArch/Libretro projects that seem very promising. I like the idea of using CRT shaders with original hardware or old PC games. (I hope the OSSC Pro will get this functionality at some point.)
If anyone has a chance to test these out, come back and post your experience.
-WindowCast core Official release thread-
Libretro core to capture the contents of another window for video processing. This is useful, for say, capturing the output of a standalone emulator (like xemu, Dolphin standalone, RPCS3 or PCSX2 nightlies) or a PC game running in a window and then processing it with RetroArch’s shader stack.
https://forums.libretro.com/t/official- ... core/40464
libretro-video-processor
The basic idea is this -- plug your legacy console into a capture device and use RetroArch to upscale it and apply shaders to taste.
https://docs.libretro.com/library/video_processor/
https://github.com/libretro/RetroArch/t ... -processor
Re: Using a PC for CRT shaders with original hardware
WindowCast adds 2 frames of lag, although the dev is looking to improve it. ShaderGlass has all the shaders compiled within the program itself to minimize processing time, although I can’t find specific numbers. As far as I can tell, Libretro Video Processor is Linux only.
Re: Using a PC for CRT shaders with original hardware
+2 frames of lag on top of display lag doesn't sound very promising, though I agree the concept is cool. I just don't see any advantages at all for this over using an RT5X, GBS-C, or OSSC scaler.
Re: Using a PC for CRT shaders with original hardware
The advantage would be more detailed CRT mask/grille simulation filters, especially on displays with 4K or more resolution. I agree 2f or more of lag is not worth it though. Maybe not a big deal for certain types of games.
Re: Using a PC for CRT shaders with original hardware
Josh128 wrote:
+2 frames of lag on top of display lag doesn't sound very promising, though I agree the concept is cool. I just don't see any advantages at all for this over using an RT5X, GBS-C, or OSSC scaler.

The advantages should be obvious. The amount of post-processing options and customization on PC dwarfs anything available on even the most expensive retro scalers: scanlines, masks, bloom, LUTs, gamma, motion blur, downscaling, multi-layered effects, you name it. There’s no comparison. This is partly my fault, since I haven’t provided any media. On the other hand, anyone who’s used RetroArch should have a general idea of what’s happening here.
If your main concern is lag, this isn’t ideal; there’s no argument here. I’m of two minds about it. I’d prefer to keep lag as low as possible, but from experience I know anything at 5 frames or under is perfectly playable at 60fps, i.e., you adjust and forget about it, and I’d bet the same goes for most people. Not recommended for competitive fighting games or STG score chasing, but you get the idea. There are users on this forum with 2 or 3 frame lag scaler setups right now! It’s at least comparable to THAT. There are other drawbacks: some tearing and the usual capture woes.
Another justification: I prefer to play 3D PS2 games on original hardware when possible, but find the visual quality lacking even with optimizations. After PC shaders it’s hard to go back to scalers. This serves as a midway point between real hardware and emulation. Even with my laggy controller adapter setup (up to two frames) and processing chain, the lag is probably still less than PCSX2! I can’t believe I’m playing on real hardware.
The examples here shouldn’t be thought of as finished use cases, but rather a set of developing technologies that will only get better with time and interest. Very few of these were developed with playing consoles in mind. Most of them are intended for use with PC games.
Ptbi (monitoring software) gives us a glimpse of what’s possible when playing is taken into account: up to 1 frame of software lag (the author claims processing times as low as 5ms), integrated Nvidia scaling algorithms, and refresh rate control. However, it’s hardware-locked to Blackmagic devices, and Durante, of Dark Souls fame, has abandoned development.
In the first page of the DE-10 thread users are discussing the possibility of a straight digitizer with no lag. Now this is exciting!
https://shmups.system11.org/viewtopic.php?t=67775

donluca wrote:
I mean... if we're digitizing to a PC there's absolutely no need for any kind of processing on the board whatsoever (which is likely to add lag as well).
Once it's digitized you can do everything in software on your PC/Mac and the possibilities become literally endless.
I just want a good, cheap, "open source" digitizer compatible with win/*nix/macOS so that I can grab a 240p/480i signal and get it to my PC screen with the lowest amount of lag possible and THEN scale it to whatever I feel like.
Re: Using a PC for CRT shaders with original hardware
Got Ptbi working: the required DLL files must be in the program directory itself, and were only included with an older version; installing Visual Studio won’t work. Since this is the only program designed for actually playing games, I’ll be focusing most of my efforts here. Ptbi is far more plug-and-play, more of a “virtual video processor” than a capture program. I’ve had it run flawlessly for hours, but there are a ton of moving parts here, which can make troubleshooting difficult.
Getting audio crackling with Ptbi (but not the desyncs present with Blackmagic in OBS). This is a known latency issue with kernel drivers. I’ve greatly reduced the frequency, duration, and intensity through various power and performance settings.
Ptbi works with Blackmagic devices only, which are targeted at video professionals and adhere to strict broadcast standards. It doesn’t support 480i/p, and devices with off-spec signal modes, like a 240p console or an OSSC, aren’t compatible with the capture hardware. This requires an external scaler.
The Extron DSC 301 seems ideal here: it has excellent scaling, optional frame lock, signal conversion, and PC software which allows for real-time adjustments and robust preset management, all without switching operating environments. HDMI RGB Limited output seems a bit punchier than HDMI YUV Limited (not sure if that’s placebo), and I’d like to keep unnecessary conversions off the PC side until things are sorted, unless it causes an issue, which it might, as YUV output is far more compatible.
Onto the games!
Code: Select all
Xbox/480p/Otogi 2 - runs near flawlessly regardless of settings. There’s only one section with some (repeatable) tearing and stuttering which I’m 99% certain is from the game itself.
PS2/1080i GSM/Zone of the Enders 2 - see above, with some minor issues once in a great while, possibly setup related.
PS2/1080i GSM/Nightshade - this game is prone to tearing and stutter, and runs way better after switching to Ptbi. Usually runs well; it will run flawlessly for an hour or more, then start stuttering for no apparent reason. Sometimes it's just a little and goes away; power cycling the DSC sometimes fixes it, and sometimes it appears to be a Windows issue.
No issues with the few hours I spent with various 360 games.
Unless you’re using Windows 11, Shaderglass has a yellow border around the edge. This is an OS security feature. If this bothers you, you might want to skip it entirely and just install ReShade directly to your capture program. The CRT shader options are limited in comparison, though.
Shaderglass has frameskip options, with “frameskip 2” enabled by default. I’m struggling to wrap my head around what this means exactly. Supposedly it’s only processing every other frame, but it doesn’t appear to drop any. I don’t see any difference disabling it.
I’ll post media and lag numbers once my OSSC arrives; as of now I don’t have any way to capture 240p sources. My scalers are 480i-and-above units. Technically speaking, I’m a bit out of my depth here. Any help is appreciated.
Overall, very encouraging so far.
Re: Using a PC for CRT shaders with original hardware
I'd be passively interested to know how much lag Lossless Scaling adds for real-time machine learning upscaling in Windows 11. The GPU is the only unique feature a PC has. It would be nice to enhance PS1 3D someday, because ugly PS1 3D has aged like milk. Although we're looking ahead here; I doubt any of the current options do much good. Could turn into something good in the future.
Re: Using a PC for CRT shaders with original hardware
I've figured out how ptbi, Shaderglass and ReShade affect each other, and eliminated all computer-side motion issues using the on-screen statistics windows available in ptbi and ReShade. They interact with each other in ways that aren't always obvious. Processing times are very encouraging.
Observations and Recommendations:
- If using an external scaler, enable "Frame Lock" if possible. This will eliminate all tearing from the scaler itself.
- Blackmagic devices are optimized for YUV: YUV processing time at 1080p60 in ptbi averages about 1.45ms with spikes up to 2.5ms; RGB averages around 2.5ms with spikes up to 4.5ms. 720p60 was around half these numbers.
- Unfortunately, Blackmagic devices don't support 480p at all. They do support 480i, but ptbi doesn't. In other words, resolutions lower than 720p require a scaler.
- Even when running full screen on a 4k display, ptbi reports a max resolution of 1080p. I discovered this by installing ReShade directly into the capture program: the level of detail present in presets was much lower. There’s definitely some sort of scaling being applied within ptbi itself; not sure exactly what’s happening here.
- My unoptimized ReShade presets (with 13 effects and 45 passes) average about 9.5 - 10ms processing time at 4k60, and under 5ms at 1080p60.
- Using Shaderglass to clone the ptbi window provides full access to 4k image processing. Be sure to enable "Input > Window > ptbi" in Shaderglass. This has far better performance than desktop capture.
- Use the "Frame skip" option in Shaderglass to reduce the processing load in ReShade. If the processing time in ReShade is greater than the source frame time (16.7ms for 60fps, 8.3ms for 120), you WILL get frame drops and stuttering. At 4k120 you want to enable "Frame skip = 3" for 60fps sources.
- Playing 60fps sources with Windows set to 60hz and Frame skip set to zero feels laggy. 60fps source, Windows at 120hz, Frame skip 3 (60fps) feels very smooth. I refuse to believe 8ms makes a difference here, so I’m not sure what’s causing it.
- If Shaderglass isn't the primary program you will get stuttering. You can "click through" with the mouse to change focus to the source program, allowing key commands to register. You'll need to Windows+Tab to get back into Shaderglass. Be sure to click on the MENU BAR of the program when selecting it, otherwise Shaderglass won't see commands. You can't shift focus if the ReShade window is open. If you accidentally click the mouse, you'll shift focus.
- Given all this, ReShade should only be installed to Shaderglass. Be sure to check "Performance Mode" for reduced processing times and increased stability. If you're still experiencing stuttering or frame drops, first try Windows+Tab into ptbi, then Shaderglass, then try the fullscreen command once or twice. Repeat, or quit Shaderglass and relaunch.
- There's no stuttering or tearing on the display itself when "G-Sync compatible" mode is enabled in the Nvidia control panel. Both ptbi and Shaderglass work with adaptive sync.
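The frame-skip budget rule above can be sketched as a quick check. This is a minimal illustration of the arithmetic only; the function names are mine, and ShaderGlass's exact frame-skip semantics aren't documented here.

```python
# Sketch: a shader chain drops frames when its processing time exceeds
# the time available per source frame (the frame budget).

def frame_budget_ms(source_fps: float) -> float:
    """Time available to process one frame, in milliseconds."""
    return 1000.0 / source_fps

def will_stutter(processing_ms: float, source_fps: float) -> bool:
    """True if the shader chain can't keep up with the source."""
    return processing_ms > frame_budget_ms(source_fps)

print(round(frame_budget_ms(60), 1))   # ~16.7 ms budget for a 60fps source
print(round(frame_budget_ms(120), 1))  # ~8.3 ms for a 120fps source
print(will_stutter(9.5, 60))           # the 9.5ms 4k60 preset above fits: False
print(will_stutter(9.5, 120))          # same preset against a 120fps budget: True
```

This is why frame-skipping at 120hz helps: processing only the 60 unique source frames per second restores the full ~16.7ms budget.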
On the left is a source image from an Extron DSC-301; on the right is ptbi/Shaderglass with gamma correction and bilinear filtering enabled in ptbi, and the MAME HLSL shader with Pixel Size = 1 enabled in Shaderglass for further ReShade processing. I use a blue light filter at around 2900K, so these may look off if the screenshot removes it.
Frame spikes and processing stat differences are due to running in split screen and taking a screenshot. This doesn't happen with normal full screen use. I'm far more interested in making games look "good" as opposed to accurate. Anyone with a PC can install ReShade in Shaderglass, load up a screenshot or emulator and see whatever they want. The only difference here is the source.
System Specs:
Windows 11 Pro 22H2
Intel Core i7-10700 (8-Core, 16 threads) CPU@2.90GHz
32GB RAM
2TB Gen3 Nvme
EVGA 3070ti FTW
Sony 43" KD-43X85K 4k 120hz display (8ms lag at 4k120)
Last edited by SGGG2 on Wed Aug 09, 2023 5:03 pm, edited 2 times in total.
Re: Using a PC for CRT shaders with original hardware
Wow, I really like the look of your shader effects. If I may ask what presets did you use to get those results? I know a lot of people say they don't like Bloom lighting but I've always been really into it along with that chromatic RGB separation effect you got going on.
Does ptbi's anti-aliasing technique also work with your setup, or does it cause weird visual anomalies? I have a Blackmagic 4K that I purchased a while ago to test ptbi with, but I never got around to it. I also purchased one of the really low input lag AVerMedia capture cards.
Is this game running in 480p? Last time I used the Extron DSC-301 it was really bad at 480i processing and gave lots of combing artifacts. (It looked much better with 960i from the OSSC 3x mode, and I'm assuming it would do fine with 1080i from the PS2, but I haven't had time to test it.) Input lag would probably be pretty high, I think, since Extron's deinterlacing tends to be input lag heavy.
Re: Using a PC for CRT shaders with original hardware
That’s Nightshade on PS2 running at 1080i with GSM. Deinterlaced 1080i actually looks more detailed than progressive modes, although not quite as sharp or retro looking. It’s a field rendering game, if you examine closely you‘ll see weaving artifacts on the left and how image processing completely cleans it up on the right. Extron deinterlacing is one frame, the same as motion adaptive. I don’t have a processor with bob deinterlacing to test with yet.
All shaders used are listed in the Shaderglass (MAME HLSL) and ReShade windows. Bloom really makes CRT-type effects come alive, especially for 3D content. An optimized version of my shader could shave off as much as 5ms processing time, but I doubt I’m the one to do it. I take a painterly approach to stacking effects.
Anti-aliasing “works” as in it activates, but AA on sub-1080p sources is a non-starter. All it does is smear the image, and not in a good way; there’s not enough information for proper AA. Bilinear filtering, gaussian blur, and chromatic aberration provide much better results.
EDIT: I use bloom effects to offset brightness loss from CRT shaders, of which I’ll often stack 2 or 3, chromatic aberration helps reduce the resulting moire patterns, gives textures a more naturalistic look and helps round off overly sharp UI elements.
Re: Using a PC for CRT shaders with original hardware
It's a cool idea ...
Might really be a winner if you can find a cheap card it'll work with. If you've already got the kit, it's kind of a no-brainer to give it a go, imho.
The thing that seems (?) to be the ongoing issue with capture devices is that there's a bit of flakiness with capturing and post-processing.
It'll be interesting to see, case by case, how emulation issues stack up against streaming issues...
Re: Using a PC for CRT shaders with original hardware
Three Blackmagic Intensity Pro 4k PCIe cards on eBay $70 each for anyone interested in PtBi. No analog breakout cable. https://www.ebay.com/itm/295874373000?_ ... p_homepage
FYI, I haven’t figured out how to get scanlines correct in ShaderGlass for scaled 240p capture, the issue being downscaling (Pixel Size) not lining up correctly and degrading the image too much. If Shaderglass effects get used, they’re generally at Pixel Size 1 or 2 for mask effects. No issue with scanlines in ReShade.
Re: Using a PC for CRT shaders with original hardware
I've had some success approximating proper alignment for 240p sources in ShaderGlass. The main issue is that the program doesn't support proper pixel sizes for commonly used console resolutions: 224p, 256p, etc. This results in scaling artifacts such as half pixels and improper ratios. If this is something you'd like to see addressed, please let the author know here. Many modern retro PC titles use these resolutions as well. https://github.com/mausimus/ShaderGlass/issues/55
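The misalignment is easy to see with a little arithmetic. A sketch (function name is mine) checking whether common console line counts divide evenly into a 1080p target:

```python
# Sketch: scanline/mask shaders want an integer number of output rows per
# source line; fractional scale factors produce "half pixel" artifacts.

def scale_factor(target_h: int, source_h: int) -> float:
    """Vertical scale factor from source height to target height."""
    return target_h / source_h

for src in (224, 240, 256):
    f = scale_factor(1080, src)
    kind = "integer" if f.is_integer() else "fractional (artifacts)"
    print(f"{src}p -> 1080p: {f:.3f}x {kind}")
```

None of these come out to an integer at 1080p (even 240p lands on 4.5x), which is why per-resolution pixel size support matters.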
3D games fare better since bad scaling isn't as evident, and CRT effects can do a good job of covering it up with some tweaking. Here's Panzer Dragoon Zwei (Sega Saturn) - 5x OSSC Generic 4:3 profile, 1080p > Extron DSC 301 HD at 1080p (no scaling) > PtBi > ShaderGlass + image tweaking in ReShade. (Work in progress)
First picture is CRT-Hyllian-Sinc-Smartblur-Sgenpt - Pixel size 10
Pictures 2 and 3 are CRT-Torridgristle - Pixel Size 2, Scanline size set to 15
Re: Using a PC for CRT shaders with original hardware
EDIT: If you read an earlier version of this post, I did the math wrong. Corrected now. LOL
I was able to measure input lag on the capture card by feeding in video from the HDMI port of my graphics card directly into the Intensity Pro, running an onscreen stopwatch and recording the display at 240fps. 4k 120hz on the display (Gsync off) and 1080p 60 through the capture card, recorded with an iPad pro.
Results are about as expected, with an average of 50 milliseconds, or three frames. I was hoping for closer to two. It's still decent. Measurements *might* be lower if recorded with a faster camera.
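The stopwatch method above reduces to counting camera frames between the two screens. A minimal sketch of that arithmetic (names are mine), including the precision limit of a 240fps camera:

```python
# Sketch: filming both screens at 240fps and counting how many camera
# frames the captured image trails the direct output.

CAMERA_FPS = 240  # one camera frame ~= 4.2 ms of measurement resolution

def lag_ms(camera_frames_behind: int, camera_fps: float = CAMERA_FPS) -> float:
    """Measured lag in milliseconds from a camera frame count."""
    return camera_frames_behind * 1000.0 / camera_fps

print(lag_ms(12))                  # 12 camera frames -> 50.0 ms
print(round(50 / (1000 / 60), 2))  # 50 ms ~= 3.0 frames at 60fps
```

So the reported ~50ms average corresponds to about three 60fps frames, with each measurement quantized to roughly 4ms by the camera.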
The best-known combination right now is an Elgato HD60X paired with OBS (as tested by EposVox), at a latency of two frames (31ms). I've had issues with OBS where the frame rate randomly drops in half (my guess is the program lost focus), so I'm not sure it's worth switching. Ptbi is made strictly for monitoring; there's a lot less that can go wrong.
Shaderglass with ReShade active seems to add anywhere from 1 to 3 frames, but I didn't test much, and not in fullscreen mode, which has superior performance. There's another scaling program, Magpie, which people inject ReShade into, and which is supposedly faster.
https://github.com/Blinue/Magpie
Assuming a display has 8ms lag at 120hz, Windows Direct Composition (which has a framebuffer of 1 frame) gives half a frame of lag for 60hz sources, 240hz, one quarter of a frame, etc -- but the buffer for post processing goes down with it. Gsync somehow bypasses WDC, and in theory could reduce framebuffer latency to 0. A straight digitizer would have latency comparable to connecting directly to a display. (I was able to force ptbi to use Gsync when paired with Shaderglass, but it's no longer working after a firmware upgrade on my TV and graphics driver reinstall.)
https://blurbusters.com/gsync/gsync101- ... ttings/10/
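The composition-buffer reasoning above can be sketched numerically: a 1-frame buffer costs one display refresh, so it shrinks (relative to a 60fps source frame) as the refresh rate rises. A minimal illustration, function name is mine:

```python
# Sketch: latency of a 1-frame desktop composition buffer at various
# display refresh rates, expressed in 60fps source frames.

def compositor_lag_ms(display_hz: float, buffered_frames: int = 1) -> float:
    """One buffered frame costs one display refresh interval."""
    return buffered_frames * 1000.0 / display_hz

for hz in (60, 120, 240):
    ms = compositor_lag_ms(hz)
    frac = ms / (1000 / 60)  # fraction of a 60fps source frame
    print(f"{hz}Hz: {ms:.1f} ms = {frac:.2f} of a 60fps frame")
```

At 120hz the buffer costs ~8.3ms, i.e. half a 60fps frame, matching the figures above; bypassing the buffer entirely (as G-Sync appears to) would take it to zero.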
I tested frame generation with a program called "Lossless Scaling", and hate it. It makes me sick just looking at it.
https://anton-malezhik.itch.io/lossless-scaling
Solved a couple of issues as well. Scaling issues with ptbi were corrected by overriding system DPI scaling in the compatibility settings. This means you can bypass Shaderglass and install ReShade directly into ptbi at full 4k resolution. This bypasses any latency added by Shaderglass, but you no longer have the option to use Frameskip at 120hz for a 16ms post-processing buffer; 8ms is the max. Which is fine unless you have layers of effects.
All audio problems solved by routing sound into a USB audio device and a cheap ground loop isolator. Audio disabled in capture programs and "listened" to in sound settings. No more artifacts or de-syncs.
I was able to measure the capture card's input lag by feeding video from my graphics card's HDMI port directly into the Intensity Pro, running an on-screen stopwatch, and recording the display at 240fps: 4k 120hz on the display (Gsync off) and 1080p 60 through the capture card, recorded with an iPad Pro.
Results are about as expected, averaging 50 milliseconds, or three frames. I was hoping for closer to two, but it's still decent. Measurements *might* come out lower with a faster camera.
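For anyone repeating the stopwatch test, the conversion from camera frames to latency is straightforward. The 12-frame offset below is an illustrative number picked to match the ~50ms average, not a logged measurement:

```python
# Convert a high-speed camera offset into latency. Assumes a 240fps
# recording of a 60hz signal, as in the test described above.

CAMERA_FPS = 240
SOURCE_HZ = 60

def lag_ms(camera_frame_offset, camera_fps=CAMERA_FPS):
    # Each camera frame represents 1/240th of a second of delay.
    return camera_frame_offset / camera_fps * 1000.0

def lag_in_source_frames(ms, source_hz=SOURCE_HZ):
    return ms / (1000.0 / source_hz)

ms = lag_ms(12)  # hypothetical 12-camera-frame offset
print(f"{ms:.0f}ms = {lag_in_source_frames(ms):.1f} frames at 60hz")
```

A 12-frame offset at 240fps comes out to 50ms, i.e. three 60hz frames, matching the average above.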
Re: Using a PC for CRT shaders with original hardware
Big news: I've managed to get Gsync working in both Ptbi and OBS! This should bypass any lag introduced by the Desktop Window Manager, allowing rendering directly to the output while in borderless fullscreen.
Enable Gsync for windowed and fullscreen mode. While in the Gsync panel, click "Display" in the menu and enable "G-SYNC Compatible Indicator" so you can see whether it's working.
Open Nvidia Control Panel
Choose "Manage 3D settings"
Click the "Program Settings" tab
Click "add" and choose Ptbi
Ptbi Nvidia Control Panel settings:
Low Latency Mode - Ultra
Monitor Technology - G-Sync compatible
Power Management Mode - Prefer maximum performance
Preferred Refresh Rate - Highest available
Triple Buffering - Off
Vertical Sync - On
Vulkan/OpenGL Present Method - Prefer layered on DXGI swapchain
OBS is A LOT trickier; it doesn't even show the Gsync indicator.
Enable OpenGL via the command-line flag "--allow-opengl"
(Right-click the shortcut, choose "Properties", and append the flag to the "Target" field)
Launch OBS, and under File > Settings > Advanced switch the video renderer to OpenGL. Hit "Apply" and then "OK"
This will close the program
(Follow the same instructions as Ptbi to add OBS to the Nvidia Control Panel)
OBS Nvidia Control Panel settings:
Low Latency Mode - Ultra
Monitor Technology - G-Sync compatible
OpenGL GDI compatibility - Prefer Compatible
Power Management Mode - Prefer Maximum Performance
Preferred Refresh Rate - Highest Available
Triple Buffering - Off
Vertical Sync - On
Vulkan/OpenGL Present Method - Auto
Download "WineD3D", a DirectX 1-11 to OpenGL wrapper, and follow the instructions for a DX11 install. Right-click the OBS shortcut and choose "Open file location" to find the proper directory.
https://fdossena.com/?p=wined3d/index.frag
Now, when you launch OBS, the Gsync indicator will show status "Normal". Right-click, choose "Fullscreen Projector (Preview)" > "YourDisplay", and Gsync is active!
I'd appreciate it if someone were willing to lend me a Time Sleuth or similar, so I can confirm these readings are accurate.
Re: Using a PC for CRT shaders with original hardware
I've figured out how to enable Gsync with ShaderGlass cloning Ptbi. This requires ShaderGlass 0.8; it doesn't work with other versions. If Ptbi has a ReShade install, it must be uninstalled first.
Nvidia Control Panel Settings for Ptbi
Low Latency mode - Ultra
Monitor Technology - Use Global Setting (G-Sync Compatible) **will NOT WORK if you set G-Sync manually**
Power Management Mode - Prefer Maximum Performance
Triple Buffer - Off
V-Sync - On or Use 3D Application Setting
Vulkan/OpenGL Present Method - Use Global Setting (Auto)
ShaderGlass 0.8 settings. You'll want to use Nvidia Inspector if you have multiple versions installed; the control panel sees them all as the same program, but Nvidia Inspector lets you create individual profiles.
All the required Gsync options should be enabled by default. I like to turn on "G-SYNC Support Indicator Overlay" for testing.
Ultra Low Latency - Enabled
Vertical Sync - Force on
Launch Ptbi. The Gsync indicator will say "Normal"
Launch ShaderGlass. The Gsync indicator will be positive.
Make Ptbi fullscreen
In ShaderGlass, choose Ptbi under Input>Window
ShaderGlass will hook into Ptbi, enabling Gsync in both programs
Enter fullscreen with ShaderGlass
ShaderGlass must be in focus to maintain Gsync.
Please note the settings for Ptbi are different here than when running it in standalone mode with Gsync.