casualcoder wrote:
This is another thing that makes me reach for RetroArch. I'm not sure how it does it, but its vsync is perfect: no choppiness, and all but the most minimal lag if you use GPU sync mode.
I'm honestly not sure how it performs this, nor whether it really does without speeding everything up; I haven't investigated nor touched RA in a long time, but magical variable refresh does not exist. Have you checked the game's actual speed while playing? (pressing F11 to make MAME's internal refresh speed meter appear)
I mean, Groovy can force vsync to 60Hz and everything will be smooth (increase sync_refresh_tolerance to something like 10), and at the cost of only 1 frame too, since it's using d3d9ex. In fact, doing that you even gain access to frame_delay for all games, allowing you to reduce the lag further if your CPU is strong enough (see the snippet below).
But similarly, all games not at 60Hz will be sped up, which isn't a big deal when they're close, but pretty annoying for the 58, 56, 54 etc. games.
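If you want to try that, the mame.ini lines involved look something like this (option names as in recent GroovyMAME builds; the values here are just an illustration, not the one true setup):

Code:
# mame.ini (GroovyMAME) - force everything to vsync at 60Hz
monitor                  lcd
# default is 2.0 (Hz); raising it lets off-60Hz games sync at 60 anyway
sync_refresh_tolerance   10.0
# 0-9, higher = less input lag, but needs CPU headroom
frame_delay              5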
Going further with Groovy though, if you have a compatible monitor (CRT or a non-EDID-locked LCD), you can achieve sub-1-frame vsync at the exact original refresh rate, and use PortAudio to reduce audio lag too if you wish, all at the same time.
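For the audio side, stock MAME already ships a PortAudio sound module; the relevant mame.ini bits should look roughly like this (pa_latency is a suggested latency in seconds, raise it if you get crackling):

Code:
# mame.ini - lower audio lag via the PortAudio backend
sound          portaudio
# suggested stream latency in seconds; keep it as low as your system tolerates
pa_latency     0.003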
In fact I believe you can do the exact-refresh part with RetroArch too to some extent, using custom modes somewhere, though only from a set of fixed ones, whereas Groovy+AMD+Emudriver, being dynamic, automatically supports any mode with precision: a sort of pseudo-VRR, except it's native rates instead of adapting everything to a single one.
Of course, commercial FreeSync or G-Sync (and soon HDMI VRR) make all this Groovy and RetroArch lag-reduction stuff obsolete, indeed.
Some TVs already support FreeSync.
casualcoder wrote:
Not exactly. While I'm not a fan of input lag even if it's "baked in", my main intent isn't to eliminate it entirely but rather to compensate for the lag in the chain that I can't account for.
So, for example, my TV has 16ms of input lag (1 frame) and my RetroCade computer probably has ~6ms just to process the USB controller input itself. If I set a game like Guwange to 1 frame of input lag in RetroArch, it's actually 1 frame + 1 frame + 6ms, which is almost identical to the original PCB input lag. In cases where I surpass the PCB by a long margin (2 frames), I don't exactly get upset about it.
Sure, it's one way of doing it. Personally I don't much like the idea of having the input register too far into the game's own time just to compensate for a laggy chain (though shaving 1 frame off any game shouldn't hurt gameplay, just a personal belief*), so I tend to the chain first, making sure it's as low-lag as possible (monitor, controller, OS and ports), then apply a lag-reduction method to push the input as close to the game's as possible.
In any case, if the beam racing (aka frame slice) method becomes a thing in MAME one day, all of that will be less trouble for everyone. It still won't compensate for a laggy setup tho. :p
* I've been wondering: do we know for sure that in a game's program running on its intended PCB, the inputs always register after the equivalent of a whole number of frames has been produced? After all, these old-school PCBs aren't frame-based; MAME is. I don't know if the hardware always waits though, since the input side isn't the same 'thread' as video.
casualcoder wrote:
I've been MAME'ing for over 20 years, so I'll take every small victory in beating the system over enforced input lag any day.
Not sure what you mean by 'enforced', so just for the sake of precision: where MAME drivers can be considered accurate (not all of them, for sure, but for now consider the ones we can trust the most with what we know), and as long as we don't have hard data to contradict them (direct PCB lag measurements), in theory those drivers don't produce more lag than the original games running on PCB. So that part of the emulation delay cannot be considered 'enforced', or at least not yet on claims alone; it's still officially the legit delay.
It's only the vsync lag on top of that, plus our hardware's, that qualifies as enforced/undesirable, but we now have the means/tools to eliminate most of it, whatever your preference: Groovy, RetroArch, or Free/G-Sync.
casualcoder wrote:
I recall this is one of the things that impressed me about ShmupMAME originally. I thought its developer did something to the blitter rates to improve emulation accuracy. And I think technically you can adjust them further, but as you say, it sounds like a recipe for disaster.
Blitter delay has been in all MAME builds since then, and even before that in the 'too close' builds, and it doesn't help much. At some point we had determined a few rather well-working values, but the cv1k driver was later optimized a couple of times and that went to shit; today, with several games, you have to readjust the CPU% down first before using blitter delay. Not sure what to do exactly. I've found some rather nicely working settings, but there's too much margin for error: imagine the granularity is 1000, and for each game you're supposed to find the best balanced sweet spot using the two sliders, and that against the original PCB (or, worst case, a broad collection of reliable, quality gameplay videos).
It'll be a PITA to complete, and all that only to get a more-or-less-approximate, still inaccurate slowdown behaviour. Probably nobody will bother to research it thoroughly, or by the time it's done MAMEdev will have implemented wait states. We're not talking near future in either scenario.
casualcoder wrote:
dannycheeto wrote:
casualcoder, for GroovyMAME you just download the most recent d3d9ex build, and then in the mame.ini, under the 'CORE SWITCHRES OPTION' category, edit like this:

Code:
monitor lcd

That's it, I started using it myself thanks to xyga's help.
Thanks, I'll definitely give that a shot with that in mind. If I can match what I get with ShmupMAME (and not dive into another loop of compatibility headaches) by making those changes, then I'd be quite happy.
Doing that is just the most basic way of configuring Groovy: you get 2~3 frames better lag performance than baseline MAME, which means only 1 frame remains used to sync, and that's it if you don't do more.
Then - repeating what I've mentioned earlier in this post - if you want, you can further decrease the lag using frame_delay (in the sliders menu, only from Groovy v0.206 on, because the slider was a bit broken before), but by default that will work only for games within +/-2Hz of 60Hz.
If you wish to expand the number of games eligible for frame_delay, you need to open the mame.ini again and increase the sync_refresh_tolerance value.
The default is 2.0 (2Hz), so if for instance you increase it to 2.5, the 1st-gen Cave games will be included.
But again, if you do so, pressing F11 in-game you'll see the meter indicate ~104% speed; that's the tradeoff (on the plus side, the scrolling will be butter-smooth).
Increasing it more, to something like 10, will cover pretty much everything in MAME, but games like R-Type will of course be sped up considerably. Your choice.
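By the way, since MAME reads per-game ini files from the ini folder (and I believe Groovy's switchres options are picked up there too, though I haven't verified every single one), you should be able to keep the global tolerance at the default and raise it only for the games you don't mind speeding up. For example, an ini/guwange.ini containing just:

Code:
# ini/guwange.ini - raise the tolerance for this one game only
# (Guwange runs at ~57.55Hz, so 2.5 lets it sync at 60Hz and enables frame_delay)
sync_refresh_tolerance   2.5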
With ShmupMAME you get something similar to the default vsynced 1 frame if using ddraw (nb: not sure ddraw is fine in Win 10, or even 8), then the sprite-buffer hacks hardcoded in the drivers remove yet another frame, but that breaks video accuracy (the sprite layer in many games is no longer in sync with the tile layer, for instance; almost unnoticeable in some games, horribly so in others, and I suspect it goes as far as breaking proper timings depending on the game, changing the gameplay as a consequence).
IIRC Guwange fortunately seems almost unaffected.
But considering the numerous games that are affected, the obsolete drivers (MAMEdev have fixed tons of things since then), and the missing games, ShmupMAME is long past its coolness.
It keeps an advantage in terms of hardware requirements though, since unlike frame_delay the sprite hacks don't cost anything CPU-wise. Also, the old-style GUI is nice.