Computer hardware for MAME!

Post by Ed Oscuro »

Here's a rundown of the performance benefits I've seen moving from a Core 2 Duo to a Core i7, specifically for MAME. I don't have benchmark numbers, but I don't think that kind of precision is needed here. I should note that there are a few minor hardware issues I'm still trying to run down, but I don't think any of them have much of an impact on MAME. With the new chip, most arcade ROM sets will now fit in the processor's on-die cache, and probably a good chunk of the running MAME code as well. I have some thoughts for people looking to build a MAME rig, but I'd like to hear other anecdotes (or better yet hard data, if anybody is so inclined) to figure out a winning strategy for a MAME build.
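(For anyone inclined to gather that hard data rather than eyeball the in-game speed readout, here's a minimal harness sketch. It assumes a command-line MAME or MAME Plus! build that accepts -str and -nothrottle and prints an "Average speed" line on exit, which builds of this era should do; the executable name, ROM list, and the -video none switch are placeholders to adjust for your own setup.)

[code]
# Minimal MAME benchmark harness sketch (Python). Assumptions: the binary
# accepts -str (seconds to run) and -nothrottle, and reports "Average speed:
# NN.NN%" when it exits; drop "-video", "none" if your build rejects them.
import re
import subprocess

MAME = "mamep64.exe"   # assumed MAME Plus! 64-bit binary name; adjust to taste
GAMES = ["crusnusa", "gauntleg", "gradius4", "raystorm", "shinobi"]
SECONDS = 60           # how long to run each game unthrottled

for game in GAMES:
    result = subprocess.run(
        [MAME, game, "-str", str(SECONDS), "-nothrottle", "-video", "none"],
        capture_output=True, text=True)
    match = re.search(r"Average speed:\s*([\d.]+)%", result.stdout + result.stderr)
    print(f"{game:10s} {match.group(1) + '%' if match else 'no result reported'}")
[/code]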

Short background: my venerable Core 2 Duo machine running Windows XP recently died, and instead of spending a lot of money fixing it, I finally swapped out the bad motherboard in my Core i7 machine (parts bought at the end of 2008) and have it mostly working now. I just need to sort out sound and quiet the fans down; I might swap out the somewhat-too-loud case fan for the sake of my sanity, but that's a minor thing I mainly notice because I demand nearly silent towers. The part that's relevant for MAME is the bargain-basement Core i7 920 processor, the bottom end of the Core i7 series. I understand that a Core i5 might actually be more sensible for single-threaded applications, but the 920 is what was out at the time and I'm happy with it; it's a nice performance boost for everyday things, in fact. The setup, then:

* ASRock X58 Extreme6 (replacing a Dead On Arrival ASUS P6T)
* Windows 7 64-bit Ultimate (why not? I wanted to be able to switch the whole OS over to Japanese). This is a switch from a 32-bit operating system.
* 6GB of DDR3 RAM, triple channel (Patriot Viper; I need to have another look at the timings sometime, as the motherboard reports it at 1066MHz instead of the advertised 1600MHz, but I'm much happier that all three sticks are registering now, at least)
* ASUS EAH5850 (Radeon HD 5850), as a more environmentally friendly alternative to the still-unused Radeon HD 4870 X2. I'm unlikely to need this kind of gaming muscle for some time, but it was nice to play with tessellation in the Unigine "Heaven" demo.
* Hard drives:
* A 1TB Hitachi drive (HDT721010SLA360) that takes a few moments to spin up when I need 0.140 MAME ROMs (this drive was purchased along with the core of the computer components in 2008, and only in the last few months was moved to the old Core 2 Duo machine to replace a really old drive, one of the first SATA models, a 300GB unit), and
* the 2TB Samsung Spinpoint F4 HD204UI (in case anybody cares, I got it because it was the first drive with 667GB platters, and by all accounts it had sterling performance otherwise - quiet, low idle power, only a few platters so likely decent reliability, yadda yadda).

I'm looking to finally give Linux a try soon (probably Linux Mint if Ubuntu seems too bare; I grabbed the "perfect 10" October 10 release of Ubuntu, but I've been advised to hold off a month before actually installing it so any early fixes land, as there do seem to be some issues), likely putting it on the 1TB drive. It'll be interesting to see what impact, if any, this has on performance.

First, some thoughts and speculation about the current crop of CPUs and what other choices might have brought.

I'm becoming pretty familiar with the 2.4-2.8GHz range of CPUs, since that's where the bottom of Intel's performance lineup has sat at each of the last two big launches. I've MAMEd on a Compaq S5200NX, and comfortably looped Ninja Emaki more than a few times with a 360 controller (the good old days, huh). That's roughly a 2.7GHz Celeron with 512K of L2 cache - more than adequate for most arcade games without obvious hiccups. Then there was the Core 2 Duo E6600 at 2.4GHz, the lowest end at launch but still great, which bumped the cache up to an amazing 4MB shared between two cores. Then I used a T7700 in a "desktop replacement" laptop, which for all intents and purposes is the equal of the E6600 - same specs, slightly slower bus. And now we have today's (and yesterday's) Core i7 920, with 256K of L2 cache per core plus 8MB of L3 shared between all cores. Tom's Hardware ranks the E6600 as a "fifth tier" product, with the 920 a first-tier CPU. I suspect that overclocking will bring a big performance boost for MAME - perhaps close to a linear improvement, because I don't see the generous cache limiting any but the most demanding games MAME emulates, where 3D graphics may require more data to be moved than the shared cache can hold. Even then, the sheer speed of the processor and RAM may be enough to keep the CPU fed with data and working without obvious slowdowns - but this is all pure speculation on my part.
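(To put a rough number on the "fits in cache" idea, here's a quick sketch that sums the uncompressed ROM sizes in a zipped set and compares the total against the 920's 8MB shared L3. It only counts ROM data, not MAME's own code or working set, so it's strictly a ballpark; the path is a hypothetical example.)

[code]
# Rough check of whether a ROM set's data could fit in the i7 920's 8MB L3.
# This ignores MAME's own code and working set, so treat it as a ballpark only.
import zipfile

L3_CACHE = 8 * 1024 * 1024   # 8MB shared L3 on the Core i7 920

def rom_set_size(path):
    """Total uncompressed size of all files in a zipped ROM set."""
    with zipfile.ZipFile(path) as z:
        return sum(info.file_size for info in z.infolist())

size = rom_set_size("roms/shinobi.zip")   # hypothetical path to one of your sets
verdict = "fits within" if size <= L3_CACHE else "exceeds"
print(f"ROM data: {size / 1024:.0f} KB, which {verdict} the 8MB L3")
[/code]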

Obviously, the specific cache-sharing behavior of each CPU could be a factor, but the Core iX designs (i7, i5, even i3) all seem to have a shared L3, so single-threaded applications like MAME don't do too poorly. Intel brands this shared last-level cache "Smart Cache"; as far as I can tell the 256K L2 caches are private to each core rather than pooled, so a single MAME thread effectively gets its own L2 plus the full 8MB L3 - call it about 8.25MB - rather than the 9MB you'd get by adding in the other cores' L2s.

I dimly recall reading that the Core i5 has more aggressive single-core Turbo Boost than the early i7 releases, meaning it will shift the power budget away from idle cores to clock one core higher under single-threaded (or lightly threaded) load. From my perspective, the i7 runs at effectively a fixed speed; however, right now I'm using an ASRock utility called "Intelligent Energy Saver," which lowers clocks on the fly to reduce power consumption. I generally don't like these things (the ASUS equivalent for the P5B motherboard, which supported the Core 2 Duo, reportedly could throttle you down when you didn't want it to, and I didn't fancy running a program just to save CPU time for other programs... obviously, this is becoming less of an issue with each generation).

On to games. I've been able to play some pretty terrible stuff in MAME and enjoy it, but it was always depressing to try out some ancient, ugly-looking Midway game powered by a Voodoo GPU and have it crawl. I'm not up to 100% everywhere yet, but it now seems playable (until MAME, or the game itself, crashes - not sure which is at fault in the case of Gauntlet Dark Legacy, but it crashed on me within minutes of fooling around, though MAME itself stayed responsive). So, here's a short list of what things were like (as I remember them) and what they're like now. I'm getting these speeds with random crap loaded and Firefox going in the background.

Important note: I've been using MAME Plus! for a few years now. The Core 2 Duo got 32-bit releases; the Core i7 is getting 64-bit builds, put together by Sword on the MAME Plus! forum in the absence of the site maintainer. Someday I'll have to fool around with building from source myself - I looked into it recently on the 32-bit Windows XP / Core 2 Duo box, but only as a last resort.

Cruis'n USA: Somehow or other I managed to play through this (though somewhat badly) a few years ago at considerably less than full speed, with the keyboard, on the Core 2 Duo. I do believe I used savestates, and they worked rather fluidly, as the one bright spot. Now, Cruis'n exceeds 100% speed everywhere I've looked, at every point where the player has control, through the start of a game. There may be some choke points, but I haven't seen any.

Cruis'n World: Seems to be more of the same. Should be playable all the way through.

Gauntlet Legends: I recall sound and gameplay stutter on the Core 2 Duo, and generally unplayable framerates. It wasn't smooth and it wasn't playable - you'd press a button, the character might make a move toward it, and there'd be a ringing echo through the speakers a second later. On the Core i7, the situation is totally different. I can't judge the sound (my sound isn't working yet), but gameplay is fluid. The worst framerates come when the map overview is shown - the extra polygons seem to demand CPU time commensurate with the increased load on the emulated 3dfx Voodoo (2?) graphics. It dips to around 88% on the rolling level-selection screen, and at the beginning of Mountain dips further to around 77%, but quickly springs back to 100% and hovers around there. Overall it seems playable, and considering the Voodoo hardware and the generally buggy feel of the game (try running into the edge of the rope bridge from the cliff side on Mountain)... probably about par anyway. However, I experienced a second random crash while testing this, so there may be something the matter with the emulation.

Gradius IV: I recall major sound stutter and lag under the Core 2 Duo, and game-destroying slowdown when the gold sun or lava dragons show up near the beginning of the game. To be sure, whenever polygons get on screen there's still a bit of a framerate dip under 100%, but it remains fluid and seems playable, even when you get a few suns in and it dips to 65%. Strangely, when the high scores fly in (alternating lines from different sides of the screen) after a game and high-score entry, the framerate doesn't hit 100%; that sequence, like high-score entry, appears to be polygon-based rather than sprite-based.

Nebulasray: Well over 230% with the speed limiter turned off, both in demo and gameplay. Ah, the joys of trying to stay alive, and score, at 230-250% variable speed!

Ray Storm: Surprisingly, this 1996 release seems to be about the limit of what the i7 920 handles competently at stock speeds among the Ray games. The intro scene (before the title, with spaceship shenanigans... in space) stayed over 100% all the way to the end - mostly around 110-120%, with dips to 106% and occasionally 105%. Right before and at the big flash of light preceding the title screen it dipped to somewhere around 88% or lower, not that that's a critical moment. In a minute or two of gameplay, the game stayed above 100% the whole time. Fluid.

Shinobi: Protected (FD1094) and unprotected sets run at 1000-1100% through part of the demo at least. No problems. Ever. (Unless you're a robot trying to get through the MAME collection at ten times normal speed.)

Pengo: Sometimes it dips below 3000% in the demo. I think. It seems to get close to 4000% too, though.

Task Force Harrier: It's a minor point, but fast-forward is a good way to keep the action rolling and avoid sitting around through intermission scenes. For a game like Sky Shark, the intermissions matter because you're resting your hands; for a game like Task Force Harrier, with built-in autofire, frequent intermissions don't make much sense. On the Core 2 Duo, fast-forward felt "controllable" - on the Core i7, you have to be quick to release it, because the short cutscene at the start of the game is over before you know it, you have control of your plane, and enemies are already appearing. I have the feeling my habit of simply flicking the speed limiter to get things moving is going to meet a gruesome death. In terms of speed, I think it's an increase from somewhere around 800-900% to 1400%. (Who knew an unprotected set of Shinobi was more demanding on MAME than Task Force Harrier?)

Triple channel vs. not?
I'm not going to test for this specifically, but when I first tried out Gauntlet Legends, the computer only recognized 4GB of RAM, across two channels. Today Molag RAM has smiled on me and granted me the third channel, and I'm going to do all I can to keep him pleased in hopes that I don't have a defective part somewhere. (Amusing, but profoundly stupid, anecdote: I was fiddling with a power cable hanging out of the still-open case when I apparently shorted something that forced the PC to shut down - and when it came back, the third bank of RAM was finally showing up. How about that? It's not impossible that it was already there, or ready to be detected, just before and I simply hadn't noticed, but I don't believe it was.) For some programs, losing a channel would severely hamper data throughput; for MAME, I don't notice much if any difference between then and now. I believe my framerates in Legends are exactly the same today as yesterday. There did seem to be more dips below 100% the other day, but it's easy to imagine things from only a few data points (especially when they're pulled from personal memory).

CONCLUSION:
Tom's doesn't rank any AMD CPUs in the first tier of October's list of top gaming CPUs, but they also don't test MAME performance. While AMD hasn't had a top-tier performance CPU for what seems like an embarrassingly long time now, they do compete well on performance per dollar with most parts, and if you don't need the absolute top end you should do pretty well. I wouldn't expect their top CPUs to do much better than the Core i7 920 in some of its borderline cases here (Ray Storm, Cruis'n USA), but they may provide a good, stable MAME experience in most games. Strangely, I find that in some cases a faster CPU actually hurts usability (see Task Force Harrier above). Gradius IV takes a somewhat anachronistic "throw 3D on 2D and mush it together" approach, so its hardware requirements vary wildly from moment to moment - from 300% and more at ship selection, to 110%+ at the start of the game (the traditional chase of boring Option spheres around the screen), to 65% shortly after. I don't think any but the most expensive Intel CPUs can hold 100% through all of that, unless (and maybe not even then) you're heavily invested in overclocking.