Better performance from Windows programs on Mac using Boot Camp?

This weekend I got sick of dealing with Wineskin/PlayOnMac/CrossOver to (badly) run Windows programs (read: games) in “virtual” Windows wrappers, so I bit the bullet and bought a copy of Windows (luckily I work for a university, so I could get the $70 discounted student version; otherwise I wouldn’t have bothered).

As expected, my dedicated Windows games definitely work much, much better in Windows than my crappily self-ported Mac versions, but I was wondering:

I have a program that’s been professionally ported to Mac, namely Tomb Raider (2013), the Steam Play version. Despite the fact that every time I launch the game it tells me my computer does not meet the minimum requirements, it runs just fine (i.e. playable) on “medium” settings. I’m getting an average FPS of 20.1 (min 12.8, max 24.9), which is not great, but it’s still smooth and not choppy at all.

Probably a dumb question, but will I get better performance under Windows, or will it not make any significant difference because the hardware is still the same?

On a side note, as someone who’s had only Macs for at least a decade now, I am, surprisingly, enjoying Windows 8.1 once I figured out how to install it without already having Windows (it wouldn’t even let me download the ISO, but eventually I found a workaround). It’s like having a whole other computer! There are certain quirks about Windows that will take some getting used to, but I’m enjoying using it so far.

Not a dumb question, but not likely one that someone can answer off the cuff. You’ll probably have to test it out.

It depends on the driver support for your hardware in both OSes, how OS-agnostic (and graphics library-agnostic) the game was originally written and how good a job they did porting the code.

Odds are if it was originally written for Windows, it’ll run better on Windows.

It depends on the quality of the port. It would not surprise me one bit to learn that the OS X port has worse performance than the Windows version on the same hardware. I know from past experiences that OS X ports are frequently poorly-optimized. (Civilization 5 was terrible, for example.)

But there are way too many variables here to answer the question. Why don’t you just install it on Windows and check? Tomb Raider even has a nifty built-in benchmarking tool on the main menu.

I do support work for a company that produces computer games for Windows, Mac and Linux. We have to set different video requirements for different platforms; on Windows, 512 MB of video RAM is the minimum requirement, whereas it’s 1 GB on the other two.

The basic answer is that the DirectX stack and video drivers for Windows are far more efficient and polished than the OpenGL-based drivers the other two OSes use. And Apple cannot seem to keep up with current OpenGL versions; before 10.7, OS X was stuck on OpenGL 2.1, for heaven’s sake, and 10.9.2 only supports 4.1 even though 4.2 is out. And so on. Part of the reason may be that Apple has to write the OS X video drivers itself, and they can only be updated as part of a full OS X update. On Windows, nVidia, AMD and (to a lesser extent) Intel are in fierce competition for the graphics card market, so they release updated, optimized DirectX drivers every few months, which can be installed without an OS update.

I’ve done a lot of porting between Mac and Windows in my life. Bottom line: yes, it’s going to be faster in Windows, but probably not by enough that you’ll notice, or enough to make up for the hassle of constantly rebooting back and forth.

3D games will show this more than others. Direct3D, the graphics library used for most Windows games, is a little more efficient than OpenGL, the equivalent on the Mac (and on everything else non-Windows) at most things. By “a little more,” I generally mean 5-10%. But there’s a caveat: new graphics cards tend to have new features that enable better specific effects, such as better water, volumetric light beams, or mist/smoke. PC drivers are optimized and released for these features almost immediately, while the Mac ones lag, sometimes for years, and in general DirectX gets access to these new features before OpenGL does (or at least in a way that’s easier for the developer to use). So as you crank up the graphics options, the delta will shift a little further in favor of Windows, even though the hardware is identical.

For non-graphical things, there’s still a small Mac penalty: the Mac has more layers of API between it and the “bare metal.” This is in the 2-5% range, but in exchange you get all those expected Mac advantages: better graphics, better color, shadows, animation, nicer font handling, etc. Curiously, this penalty is also shrinking: since Snow Leopard, each OS X release has been noticeably faster than the last. Windows generally improves too, but the Mac improves more.

Which brings us to Mavericks. If you’re running the very latest OS X version, the above may not apply. Mavericks adds a feature called memory compression, which introduces an intermediate state for memory between “in use” and “swapped out to disk”: a stale page is first compressed in place (still in RAM), and only if that fails does it get written to disk, which is hundreds or thousands of times slower to access, depending on disk type. The performance gain from this is substantial. If you’ve got less than 8GB of memory in your Mac, or run enough applications to put the crunch on whatever memory you do have, the benefit of memory compression will dwarf the effects above, and you’ll probably be faster on the Mac, especially if you’re still using a magnetic disk as your swap device.
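To make the idea concrete, here’s a minimal sketch of why compressing idle pages in RAM pays off. This is not Apple’s actual implementation (which uses its own WKdm compressor inside the kernel); it just uses Python’s zlib on a made-up, repetitive “page” of data to show the kind of size reduction that lets the OS keep more pages resident instead of swapping:

```python
# Illustrative sketch only, not Apple's implementation: compress an idle
# "memory page" in RAM instead of writing it out to the swap file.
import zlib

# A made-up, highly compressible chunk of idle application data (~4.4 KB).
page = b"idle application data " * 200

compressed = zlib.compress(page)
ratio = len(page) / len(compressed)

# The win: decompressing from RAM costs microseconds of CPU time, while a
# swap-in from a magnetic disk costs milliseconds of I/O latency.
print(f"original:   {len(page)} bytes")
print(f"compressed: {len(compressed)} bytes ({ratio:.1f}x smaller)")
```

Real memory contents compress less uniformly than this toy example, but even a modest ratio means several idle pages fit in the RAM one used to occupy.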

All of the above assumes that the hardware actually is identical. If you’re booting one OS off an SSD and the other off a magnetic drive, disk access speed will dominate and the SSD OS will likely perform better (although this may only affect loading times, depending on the game; some are disk-bound all the time, some not).

So there’s the long-winded way of saying “it depends.”

Thanks for the responses.

I’m just going to go ahead and try it when I get home tonight. I’ll come back with a report.

Follow-up report:

I installed Tomb Raider on the Windows side and ran the benchmark with the same graphics settings:

Min FPS: 18.0
Max FPS: 34.0
Average FPS: 26.6

I messed with the settings a bit, turning off shadows and reflections, but still keeping the highest texture settings:

Min FPS: 23.3
Max FPS: 42.0
Average FPS: 34.0

So I guess the answer is a resounding “yes,” for this particular game, at least.
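For anyone skimming, a quick bit of arithmetic on the numbers reported in this thread (same hardware, same “medium” settings unless noted) shows how large the gap actually is:

```python
# Benchmark figures quoted earlier in this thread.
mac = {"min": 12.8, "max": 24.9, "avg": 20.1}          # OS X, medium settings
win = {"min": 18.0, "max": 34.0, "avg": 26.6}          # Windows, same settings
win_tweaked = {"min": 23.3, "max": 42.0, "avg": 34.0}  # shadows/reflections off

def gain(before, after):
    """Percentage improvement of `after` over `before`."""
    return (after - before) / before * 100

print(f"Windows vs OS X, average FPS:           +{gain(mac['avg'], win['avg']):.0f}%")
print(f"Windows (tweaked) vs OS X, average FPS: +{gain(mac['avg'], win_tweaked['avg']):.0f}%")
# → roughly +32% at identical settings, +69% with shadows/reflections off
```

That +32% at identical settings is well beyond the 5-10% Direct3D-vs-OpenGL delta suggested above, which points to the port itself, not just the graphics API, as the bottleneck.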