When it comes to 240p content, there is no such thing as "better objective picture quality"; it all comes down to preference. Some people love how high-end BVMs handle 240p content, others don't. No one is right or wrong here. The Kurozumi shader is fantastic when it comes to replicating that BVM look, and is a really good alternative if you don't want to shell out $1,000+ for a BVM.
When I say "objective picture quality" I just mean displaying the image in a way that resembles the native display environment as closely as possible while introducing as few artifacts as possible. At a minimum, this requires scanlines and some slight blur, since CRTs were less sharp than LCDs even when displaying the same resolution, as a result of the CRT's Gaussian beam profile. Since LCDs introduce their own blur when things are in motion, though, that blur effectively stands in for the CRT's, so you really just need to add scanlines when on an LCD.
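To make that concrete, here is a rough sketch of the bare-minimum treatment (integer nearest-neighbor scaling plus dark scanline gaps) in Python. This is purely illustrative and is not the Kurozumi shader; the filename, scale factor, and gap brightness are made-up placeholders:

```python
# Minimal sketch only -- NOT the Kurozumi shader. Scales a 240p frame with
# nearest neighbor and darkens part of each scaled line to fake scanline gaps.
import numpy as np
from PIL import Image

SCALE = 4            # placeholder integer scale: 240 lines -> 960 lines
GAP_LEVEL = 0.25     # placeholder brightness of the "gap" between scanlines

frame = np.asarray(Image.open("frame_240p.png").convert("RGB"), dtype=np.float32)

# repeat() is nearest-neighbor upscaling: each source pixel becomes a SCALE x SCALE block.
big = frame.repeat(SCALE, axis=0).repeat(SCALE, axis=1)

# Darken the bottom half of every scaled source line to mimic the dark gap.
row = np.arange(big.shape[0])
mask = np.where(row % SCALE >= SCALE // 2, GAP_LEVEL, 1.0).astype(np.float32)
out = big * mask[:, None, None]

Image.fromarray(out.clip(0, 255).astype(np.uint8)).save("frame_scanlines.png")
```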
Patrickbot wrote: contrary to some sources, ideal scanlines are closer to a 2:1 ratio than a 1:1 ratio
Again, this is a subjective opinion, not an objective one.
Edit: I stand corrected; "perfect" scanlines, such as would appear on a hypothetically perfect CRT, are in fact 1:1 when discussing *objective* brightness. Each visible line being twice as objectively bright results in the lines being *perceived* as roughly 1.4 times as bright. Either way, both objective and perceived brightness should be the same for 240p vs 480i.
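For anyone who wants to sanity-check that 1.4x figure, it falls out of modeling perceived brightness as roughly the square root of luminance (a common approximation; other models use a cube root and give a smaller number):

```python
# Rough arithmetic behind the ~1.4x figure, assuming perceived brightness
# scales as roughly the square root of luminance (an approximation, not an
# exact model of human vision).
luminance_ratio = 2.0                         # each visible line is twice as bright
perceived_ratio = luminance_ratio ** 0.5      # sqrt(2)
print(f"perceived: ~{perceived_ratio:.2f}x")  # -> perceived: ~1.41x
```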
I happen to own a BVM and a 4K OLED. The picture quality of 240p on the BVM is both sharper and brighter than Kurozumi running on the OLED. The BVM also has zero input lag; the OLED can't compete with that, even when running RetroArch's low latency features. I prefer the BVM, but the Kurozumi is still amazing. Will this change when, in 5-8 years, we have access to 8K OLEDs with zero input lag (fingers crossed) and higher brightness? I don't know, maybe, but as nostalgia and "authenticity" are important to me I will probably still prefer the BVM.
I wonder if your display has HDR, and if you're actually using it when running the emulator. AFAIK there is no way for the user to force HDR manually, except through Windows, and that's still less than ideal. In theory, an HDR display's peak brightness is roughly an order of magnitude higher than a consumer CRT's.
I believe you that Kurozumi on the OLED isn't as sharp as the BVM, because almost all shaders add blur. The solution is to always use nearest neighbor and integer scaling; see the sketch below for what that tradeoff looks like in numbers. You lose the perfect 4:3 AR, but you can still get pretty close to 4:3, and CRTs never had perfect geometry anyway. Unless there's blur being added somewhere, a digital display should always be sharper than a CRT displaying the same resolution, as a result of the CRT's natural Gaussian blur and phosphor structure.
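Here's a quick back-of-the-envelope look at that aspect-ratio tradeoff. The 256x240 source and 4K panel are just example numbers, and picking independent integer scales for width and height is one common approach, not the only one:

```python
# How close can pure integer scaling get to 4:3? Example numbers only:
# a 256x240 source (a common console framebuffer) on a 3840x2160 panel.
SRC_W, SRC_H = 256, 240
PANEL_W, PANEL_H = 3840, 2160
TARGET_AR = 4 / 3

v = PANEL_H // SRC_H                    # tallest integer vertical scale that fits: 9x
target_w = SRC_H * v * TARGET_AR        # width a perfect 4:3 picture would need: 2880
h = min(PANEL_W // SRC_W, round(target_w / SRC_W))  # nearest integer horizontal scale: 11x

out_w, out_h = SRC_W * h, SRC_H * v
print(f"{h}x{v} -> {out_w}x{out_h}, AR {out_w / out_h:.3f} (4:3 is {TARGET_AR:.3f})")
# 11x9 -> 2816x2160, AR 1.304 (4:3 is 1.333)
```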
RetroArch in 2018 is capable of lower input latency than the actual hardware, using the "run ahead" feature, but it's still not perfect.
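For anyone curious how an emulator can beat real hardware on latency: run-ahead speculatively emulates one or more frames into the future with your newest input, shows you that future frame, then rolls back with a savestate. Below is a much-simplified sketch of the idea; Core and its methods are hypothetical stand-ins, and RetroArch's actual implementation is considerably more involved:

```python
# Simplified run-ahead sketch. `Core` is a hypothetical libretro-style
# emulator core with run_frame / save_state / load_state methods.
RUNAHEAD = 1  # how many frames of the game's internal input lag to hide

def run_one_display_frame(core, pad_state):
    core.set_input(pad_state)
    core.run_frame(video=False, audio=True)   # advance the real timeline; keep its audio
    snapshot = core.save_state()              # remember where the real timeline is
    for i in range(RUNAHEAD):                 # speculatively emulate into the future
        is_last = (i == RUNAHEAD - 1)
        core.run_frame(video=is_last, audio=False)  # only the final frame is displayed
    core.load_state(snapshot)                 # roll back so next call resumes real time
```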