64 Bit Operating Systems and 32 Bit Applications

I’m curious about what is gained by making a 64 bit operating system refuse to execute a 32 bit application.

MacOS 10.14 was the last MacOS that would let you run a 32 bit app. It’s a 64 bit environment and cheerfully runs 64 bit apps (and warns you when you launch a 32 bit app that you’re gonna need to get an upgrade in the very near future).

Windows 10 (64) doesn’t seem to have any attitudes about running a 32 bit application. I just installed one in my virtual Parallels environment.

What does Apple attain by stripping out support for 32 bit code? Does it make the operating environment as a whole more efficient? Does it let them rip out oodles of OS-level code that they no longer need to lug around with them? What’s the tradeoff for “there’s something you used to be able to do and we’re taking that away as of version 10.15”?

[auto-hijack: While I’m at it, what keeps some 3rd party developer from creating a shareware or commercial package that provides the missing 32-bit support? I wondered the same thing when they dropped the original Rosetta: what would have prevented 3rd party developers from writing something that would let later versions of MacOS still execute PowerPC code in emulation? ]

Fewer moving pieces mean less development effort and testing and, therefore, a cost-reduction.

That’s really it.

Nothing, as long as the people who use that software are willing to accept a performance hit. The package would likely be some form of emulator like QEMU: Build a “system in a bottle” with tricks to access the external system and run the 32-bit code in that. It can be quite efficient, but nothing is quite as fast as native, and in some applications that matters.
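Speaking of running 32-bit code on a 64-bit macOS: the bitness of a Mac executable is declared right in its Mach-O header, which is how the OS (and tools like `file`) decide whether to refuse to launch it. As a minimal sketch, here's how you could classify a binary yourself by reading its magic number (the constants come from Apple's `<mach-o/loader.h>` and `<mach-o/fat.h>`; this ignores details like fat-binary slice inspection):

```python
import struct

# Mach-O magic numbers (from Apple's <mach-o/loader.h> and <mach-o/fat.h>).
MH_MAGIC    = 0xFEEDFACE  # 32-bit, stored big-endian
MH_CIGAM    = 0xCEFAEDFE  # 32-bit, byte-swapped (little-endian host)
MH_MAGIC_64 = 0xFEEDFACF  # 64-bit, stored big-endian
MH_CIGAM_64 = 0xCFFAEDFE  # 64-bit, byte-swapped
FAT_MAGIC   = 0xCAFEBABE  # "fat"/universal binary containing several slices
FAT_CIGAM   = 0xBEBAFECA

def mach_o_kind(path):
    """Classify a Mach-O file as '32-bit', '64-bit', 'fat', or 'unknown'."""
    with open(path, "rb") as f:
        raw = f.read(4)
    if len(raw) < 4:
        return "unknown"
    magic = struct.unpack(">I", raw)[0]  # read the first 4 bytes big-endian
    if magic in (MH_MAGIC, MH_CIGAM):
        return "32-bit"
    if magic in (MH_MAGIC_64, MH_CIGAM_64):
        return "64-bit"
    if magic in (FAT_MAGIC, FAT_CIGAM):
        return "fat"
    return "unknown"
```

A 32-bit-only result here is exactly what triggers the "needs to be updated" refusal on Catalina and later; a fat binary may still contain a usable 64-bit slice.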

Apple has been a control freak about their operating system (and other parts of their business) going back to the original Macintosh systems. It has been both a strength and a weakness. I know there are more specific answers to your query, but I think it’s worth pointing out that this has been a trait of the company for a very long time.

Well, to expand upon that: before, they’d have to actively develop (to keep parity) and test all 32 bit libraries (and there’s a LOT of libraries), which not only is a manpower issue but also means wasted disk space. In theory you could choose to not include the 32 bit libraries, but macOS has never been about such options with installation. They like it as smooth and simple as possible.

ETA:
Removed some stuff as I got sidetracked and blabbed on about maintaining a full 32 bit version of the OS.

It’s not just reducing maintenance and development. It’s a deliberate business strategy to keep you purchasing new software. Apple has always eschewed backwards compatibility, only allowing it temporarily. Rosetta only worked for a while, and Rosetta 2 (which allows you to run Intel 64-bit code on their new in-house CPUs) will only work for a while, too. It won’t be long until the latest macOS will not run on Intel hardware at all.

Once Apple decides something is obsolete, they don’t really want you using it. They don’t want to maintain it. And they don’t want you to think it’s good enough that you won’t have to upgrade.

(Which is odd since a lot of people like older Macbooks.)

You make it sound like it was around for a year or two (“Rosetta only worked for a while”). They introduced it to the OS in 10.4.4 (January 10th 2006) and it was removed in 10.7.0 (July 1st 2011). Although with 10.6 it was a downloadable extra and not part of a standard install.

That’s five and a half years.

Missed the edit window:

And you know what? I’m pretty sure I never even used Rosetta for anything. Hell, the only issue I have seen with the dropping of 32 bit support is that certain very old versions of Unity, that some of the REALLY old games at the company I work for were developed on, don’t run. But they were mobile apps that were never maintained or revisited. We’ve upgraded them all to be based on a 64 bit version of Unity now though, which frankly we should have been doing anyway. You can never be sure that an old version of anything is going to run on any Operating System. The same versions of Unity don’t play nice with APFS (Apple’s new file system that came in with 10.13). The software runs, but you can’t see any files. I have a “Legacy” partition, formatted in the older “Mac OS Extended” file format, on my computer as a workaround.

Part of Apple’s business model is to sell hardware, as much as the market will bear. The Apple user base is used to being forced to upgrade. Each microprocessor architecture transition (680x0 to PowerPC, PowerPC to Intel, and surely Intel to M1) has included an emulator for the previous processors, which was then dropped after about 5 years.

This is not very different from what you see on the mobile front. No modern phone will execute an application from 2010 (Android or iOS); it’s just less apparent because the mobile platforms have Stores where you magically upgrade your applications for free and transfer them when you upgrade your phone or tablet. The Mac itself is now using that model for most software.

Even on the Cloud side, Google is famous for killing off products even if they’re popular. It’s not that they have hardware to sell you, or that they want to punish their users; it’s about evolving their software / platform / security architecture while keeping their developer head count at its current level. They’re not going to kill Search, GMail, Chrome, YouTube, Maps or Drive/Docs/Sheets; but the secondary products are kept on minimum maintenance and will be killed off if the general platform’s evolution needs them to change significantly.

Microsoft is the exception: Windows desktop users (especially in business) demand that a 20-year-old application be kept operational. The transition to a Play Store-like environment started with Windows 8, but they’ve had to keep maintaining the 32-bit Intel platform for the desktop because they’d lose market share if they dropped it.

Yes. And that’s not very long when Windows can still run most stuff from Windows 95, 25 years ago. The idea of not having access to software I own that is more than five years old seems awful to me.

Ideally it would never be removed at all. It would become an optional component and never get updated, but it would still be there. It’s not like the hardware can’t support it. It’s ultimately just a Mac OS 9 emulator.

Apple just doesn’t count those of us who use older software (e.g. gamers and business users, who often have legacy software solutions) as important to its customer base. They use backwards compatibility as a stopgap only.

@Heracles I’m not so sure about that. I have apps on my Android phone designed with Android 4.0 minimum. I believe YouTube is still Android 5.0 (or possibly 6.0) minimum. Those OSes are more than 5 years old, and suggests backwards compatibility is still going strong.

As for the Cloud stuff, Google is generally made fun of for that, and people are wary to get invested in new Google stuff. The main issue I hear from everyone about Stadia is saying you’ll lose all your purchased games in a few years. There’s never any reason to believe Google is in anything for the long haul.

But that’s the thing, it wouldn’t “never get updated”; that’s just not how these sorts of things work. Fundamental changes to an OS can have a knock-on effect; take the example of Unity and APFS that I gave. Look at the various UI API changes the Mac has gone through. A Carbon app would still have to be supported, which means Carbon libraries. Or development to keep a bridge working with whatever changes appear in Cocoa (which is what Apple ended up doing until they retired PowerPC). With software there is ALWAYS work to keep older stuff that you officially support working.

But @Heracles touched on an important thing here: market share. Not only is macOS’s market share tiny compared to Windows but also a far larger proportion of macOS users upgrade. They even did before macOS became free. It is a business decision on the side of Apple, money lost by dropping support versus money lost by continuing to support them. Due to so few people running older versions of the OS they decided to cut support. That’s just the way it is.

People don’t tend to run mission critical things on older Macs. Even in a world where, in just the last week or so, we read about a Chinese train service shutting down because the Flash software it depended on stopped working when Flash was mothballed this year. Yes, there may be some older users, but it is way fewer than with the likes of Windows. And more often than not it appears to be non-Mac users who are moaning about it, in a continuation of the monumentally dumb Apple vs PC debate.

Maybe if Microsoft had had to deal with their main users (not minor users, like the DEC Alpha version of NT that, I think, didn’t go past NT4, but their main, core userbase) changing core architecture not just once but three times we’d see a different situation with Windows. A different attitude. But they didn’t; the biggest change they’ve had to deal with is bringing in 64 bit.

Weirdly, Photoshop runs on Mac as 32 bit just fine (and I honestly cannot tell the difference in performance) but some options in it become unavailable - mostly the automated bits.

What’s weird, perhaps, is that my introductory experience with the Mac and the PC environment gave me a diametrically opposite sense of which one was closed and inflexible and which would let you do things your own way. The PC was intimidating (I’m talking about the MS-DOS days at this point); you didn’t learn how to do new things on it by randomly poking around in menus and changing things. But once you’d seen your first dropdown menu, you explored all the settings and preferences on a Mac. Shareware and freeware from Info-Mac and ZiffNet let you install INITs (extensions) and cDEVs (control panels) that let you modify the heck out of how the system behaved.

PC users seemed to divide up into those who only knew how to use them for certain explicit tasks and those who knew their way around autoexec.bat and config.sys and IRQs, but the latter didn’t go around installing much environment-modifying software either because they were afraid of viruses and incompatible DLL library files.

Be that as it may, yeah, it has often felt like Apple themselves wanted you to use the computer like it was a Bic lighter, something you buy and use exactly as it came equipped and then throw it away when it runs out of butane. But most of the good software for the Mac didn’t come from Apple.

Oh well.

I’ve got the backward compatibility stuff licked. I can run every MacOS from System 0.9 to MacOS 11.2 on other hardware and/or in emulators and virtual machines, and PC operating systems from MS-DOS 3.3 to Windows 10, not to mention a couple BSD, Red Hat, AmigaOS, etc environments.

Well, the thing about emulators and auxiliary hardware boxes is that if you use them a lot, you need to be able to move things around between them and control them from where you’re sitting. Even when you’ve got everyone on TCP/IP, System 6 doesn’t make a good file share client for a Windows domain and Windows 7 isn’t equipped to open shared folders from MacOS 8.6; TeamViewer and PCAnywhere and GoToMyPC and VNC are nice cross-platform choices for remote control but tend not to play nicely with their own slightly older versions and won’t install the latest on decades-old operating systems. Enter Timbuktu.

Timbuktu is freaking amazing. With very few exceptions, all these environments can remote into each other and control each other and transfer files to each other. MacOS 10.3 on a PowerBook can be opened in a window from Windows 7 in a Parallels VM and vice versa; MacOS 8.6 in SheepShaver can open Windows 10. The exceptions to this interoperability are the pre-TCP environments, the Linux and Amiga, and… MacOS 10.15 and 11.2. Because Timbuktu ended up at Motorola (which bought Netopia and left the product to languish), then was sold to Arris, which continued the neglect and then killed it. Which means it’s a 32 bit app that won’t get updated. pout

Not so. Win 7 and beyond will not run any 16 bit software and a couple of my favorite programs disappeared. I still haven’t forgiven MS for that.

32 bit versions of Win7 and beyond can run 16 bit stuff, if the optional NTVDM component is added.

(I’m not going to say that all 16 bit software will work flawlessly in that environment, though.)

64 bit versions don’t have this available.
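Incidentally, whether a Windows executable is 16-, 32-, or 64-bit is visible right in its file header: 16-bit Win 3.x apps use the old "NE" (New Executable) format, while PE32 and PE32+ files carry a machine field after the "PE" signature. Here's a rough Python sketch of that check, using the well-known constants from Microsoft's PE format spec (it deliberately ignores corner cases like plain DOS executables with garbage where `e_lfanew` would be):

```python
import struct

def exe_kind(path):
    """Rough classifier: '16-bit (NE)', '32-bit (PE32)', '64-bit (PE32+)',
    or 'unknown'. Only reads the headers, not the whole file."""
    with open(path, "rb") as f:
        data = f.read(4096)
    if data[:2] != b"MZ":                 # every Windows .exe starts with an MZ (DOS) header
        return "unknown"
    if len(data) < 0x40:
        return "unknown"
    e_lfanew = struct.unpack_from("<I", data, 0x3C)[0]  # offset of the "new" header
    if e_lfanew + 6 > len(data):
        return "unknown"
    sig = data[e_lfanew:e_lfanew + 4]
    if sig[:2] == b"NE":                  # New Executable: 16-bit Windows 3.x era
        return "16-bit (NE)"
    if sig == b"PE\x00\x00":
        # IMAGE_FILE_HEADER.Machine immediately follows the 4-byte PE signature
        machine = struct.unpack_from("<H", data, e_lfanew + 4)[0]
        if machine == 0x8664:             # IMAGE_FILE_MACHINE_AMD64
            return "64-bit (PE32+)"
        if machine == 0x014C:             # IMAGE_FILE_MACHINE_I386
            return "32-bit (PE32)"
    return "unknown"
```

A 64-bit Windows kernel will refuse the NE case outright, which is exactly the situation described above: NTVDM only exists on 32-bit installs.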

I’m still pissed that I can’t get ‘regular’ leaded gasoline for my '58 Olds Starfire Ninety-Eight hardtop. I know tetraethyl lead is terrible for causing environmental lead levels to skyrocket and that it damages catalytic converters, but damnit, the Rocket V8 wasn’t built to be a clean-running econobox; it was meant to be a belching gas-guzzler that can ram a Chevy Tahoe off the road without showing a scratch! Also, they make parking spaces too damn narrow and short today!

Stranger

Right, the 64 bit versions of Windows dropped a lot of libraries that MS considered obsolete and didn’t want to get working with their 32 bit subsystem. One of the software projects I worked on 10 years ago hit this problem: we were using a .NET component that wouldn’t work on some of our customers’ Win7 64-bit computers, and since we weren’t even actively maintaining the project we decided to just start from scratch with a new version rather than find a different component.

To the question raised in the OP, Microsoft puts considerable effort into backwards compatibility. They’ve set that expectation with their customers and they have the development teams to do it. OSX is a different beast in terms of customer expectations and importance to the bottom line.

Not sure if this is a point for or against Microsoft, but I have an old computer with Windows 7 and some games on it. Not long ago, I installed Beyond a Steel Sky on it, which worked, but launching the program instantly crashed it because it could not locate xinput1_4.dll, which does not exist on Windows 7. “Damn it!” I thought, “must I install Windows 10 on this piece of junk? Fuck that,” I decided, and copied xinput1_3 from the system folder, dropped it into the directory next to the game, and renamed it xinput1_4. Crash! It couldn’t find some function. More cursing. Suddenly I realized. I copied xinput1_3 from Cyberpunk 2077, renamed that one xinput1_4, and the game ran flawlessly.
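For anyone wondering why that rename trick works at all: by default, Windows looks for a DLL in the application’s own directory before it looks in the system directory, so a copy dropped next to the .exe wins. Here’s a deliberately simplified Python model of that lookup (real Windows search order also involves KnownDLLs, SxS manifests, SafeDllSearchMode, PATH, etc., which this sketch ignores):

```python
import os

def resolve_dll(name, app_dir, system_dir):
    """Toy model of (part of) the Windows DLL search order:
    the application's directory is consulted before System32,
    which is why a renamed DLL beside the .exe takes precedence."""
    for directory in (app_dir, system_dir):   # app dir first, then system dir
        candidate = os.path.join(directory, name)
        if os.path.isfile(candidate):
            return candidate
    return None  # in real life: the "could not locate xinput1_4.dll" error
```

The first crash above then makes sense too: the loader found the renamed file fine, but the DLL it was renamed from simply didn’t export the function the game asked for; the copy shipped alongside Cyberpunk 2077 evidently did.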

So why wasn’t the game shipped that way? Obviously it could have run unmodified on Windows 7. Even Cyberpunk 2077 did, at least after a bit of hex editing to make it run on an obsolete CPU; the game worked fine after I hacked it, because it did not in fact need any of the features that microprocessor lacked.

You know, plenty of 64-bit OSes run 32-bit code fine. Removing lead from gasoline was an absolute human imperative, but removing some libraries from an OS is not, and it’s hardly curmudgeonly to want your computer to keep working correctly.

There have been a few watersheds like that one. Another big one was Hardware Abstraction that came along with the Windows NT family, hitting the consumer space with Windows XP - around that time, there was a chorus of complaint (feeling quite similar to the discontent level expressed later about Windows 8) about how older software and hardware just wouldn’t work any more when people migrated from Win98 to XP.
(It’s actually quite fun looking back at the clingy nostalgia people had over WinXP at the end of its life, in contrast to the reaction that happened when it first arrived)

The problem generally hit older/legacy stuff where the original manufacturer either wasn’t around any more, didn’t care to, or could not, write new drivers for their hardware device. That was the point where I learned that, as a provider of IT support, it was vital to keep systems and services up to date and in support by their suppliers.

You seem to have missed the point I was (admittedly obliquely) trying to make, which is that insisting on “backwards compatibility” in technology ad infinitum is swimming upstream into an ever-increasing current. At some point, any company or developer is going to shear away the mass of obsolescent requirements or standards for new product development just because of the time, expense, and trouble to continue to maintain continuity for features, capabilities, and interfaces that are only desired by a small niche of users.

As an example, the once ubiquitous VGA connector for computer monitors has virtually disappeared, and even the full-sized DVI is becoming more difficult to find as producers of both computers and peripherals have moved toward Mini- and Micro-DVI, or have dispensed with any analog input interface with DisplayPort or HDMI standards and connectors. While I am sure this was a source of frustration and complaint for people who wanted to keep using their Sony Trinitron CPD-E400 monitor they bought back in 1996 and that still “works fine and doesn’t have all of the distortion and dead pixels of those new fangled LED monitors”, the vast majority of users are happy to have a monitor that doesn’t occasionally buzz, build up static charge, and for which they have to worry about setting the compatible refresh rate or screen resolution, not to mention enjoying the vast improvements in image quality and stability.

If you want or need to still run 32 bit programs, there is an entire cottage industry of enthusiasts writing emulators and salvaging hardware that still runs older operating systems just fine, so you can take your pick of whether you want to go pseudo-retro emulation or full-on anachronistic in your computing setup. But insisting that computer and operating system manufacturers cater to the needs of the ever-dwindling demographic of customers needing to run decades-old software is pretty much the definition of curmudgeonly.

Stranger