Apple Will Phase Out Rosetta 2 in macOS 28
Key topics
Apple's plan to phase out Rosetta 2 in macOS 28 has sparked controversy among developers and users who rely on the compatibility layer for running x86 apps on Apple Silicon Macs, with concerns about the impact on various software and workflows.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 6h after posting
- Peak period: 61 comments in the 96-108h window
- Avg / period: 20 comments
- Based on 160 loaded comments
Key moments
- 01. Story posted: Oct 24, 2025 at 4:04 AM EDT (3 months ago)
- 02. First comment: Oct 24, 2025 at 9:59 AM EDT (6h after posting)
- 03. Peak activity: 61 comments in the 96-108h window, the hottest stretch of the conversation
- 04. Latest activity: Oct 31, 2025 at 2:17 PM EDT (2 months ago)
If you're a Mac user, you expect this sort of thing. If running neglected software is critical to you, you run Windows or you keep your old Macs around.
A lot of software is for x64 only.
If Rosetta 2 goes away, Parallels support for x64 binaries in VMs likely goes away too. Parallels is not neglected software. The x64 software you'd want to run on Parallels is not neglected software either.
This is a short-sighted move. It's also completely unprecedented; Apple has dropped support for previous architectures and runtimes before, but never when the architecture or runtime was the de facto standard.
https://docs.parallels.com/parallels-desktop-developers-guid...
Rosetta 2 never supported emulating a full VM, only individual applications.
https://www.parallels.com/blogs/parallels-desktop-20-2-0/
Nevertheless, running x64 software, including Docker containers, on aarch64 VMs does use Rosetta. There's still a significant, valid use case that has nothing to do with neglected software.
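For anyone curious how this shows up inside the guest: Rosetta for Linux is exposed to the VM as a binfmt_misc interpreter for x86-64 ELF binaries. Here's a minimal C sketch that checks for it from inside the VM, assuming the handler was registered under the name "rosetta" as in Apple's example registration (the actual entry name is whatever the VM's setup chose):

```c
#include <stdio.h>

int main(void) {
    // Registered binfmt_misc handlers appear as files under this directory.
    FILE *f = fopen("/proc/sys/fs/binfmt_misc/rosetta", "r");
    if (!f) {
        puts("no 'rosetta' binfmt_misc handler registered");
        return 1;
    }
    char line[256];
    while (fgets(line, sizeof line, f))  // status, interpreter path, magic, etc.
        fputs(line, stdout);
    fclose(f);
    return 0;
}
```

If the entry exists, its first line reads `enabled` and the interpreter path points at the mounted Rosetta runtime.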
Edited my post above. Thanks for the correction.
FWIW, Windows running on a 64-bit host no longer runs 16-bit binaries.
E.g. I have half of the macOS games in my Steam library as 32-bit Mac binaries. I don't know of a way to launch them at any reasonable speed. The best option is to ditch the macOS version altogether and emulate the Win32 version of the game (which will run at reasonable speed via Wine forks). Somehow the Win32 API is THE most stable ABI layer for Linux & Mac.
To be fair, it's the emulation of x86-32 on the new ARM64 architecture that causes the speed problems. That transition is also why MacBooks are, in terms of efficiency, the best portables you can buy right now.
All ARM chips have crippled x86-32 performance, because they're not x86-32 chips. You'll find the same (generally worse) performance issues trying to run ARM64 code on x86-64.
Which isn't an issue, since Windows 95 was not a 16-bit OS; that was MS-DOS. For 16-bit DOS apps there are virtualization options like DOSBox, or even hardware emulators.
I doubt such a thing has ever happened in the history of consumer-facing computing.
Linux users do it all the time with WINE/Proton. :-)
Before you complain about the term 'major OEM operating system': Ubuntu is shipped by major OEMs and listed in the supported requirements of many pieces of hardware and software.
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Comments like this show how low standards have fallen. Mac OS X releases have short support lifetimes. The hardware is locked down; you need a massive RE effort just to get Linux to work. The last few generations of x86 Mac hardware weren't locked down as much, but they were still locked down. M3 and M4 still don't have a working installer. As far as I know, none of this is funded by Apple, whether to get Linux working or to get Windows on ARM working.
In comparison, my brother-in-law found an old 32-bit laptop that had Windows 7. It forced itself, without his approval, to update to Windows 10. Windows 10 alone was supported by Microsoft for 10 years; starting from 7, that stretches to 13+ years of support.
And there's a near-100% chance you'll have to recompile (or download pre-recompiled binaries) if moving to a completely different architecture. Same here.
A few years ago, I installed Windows 10 on a cheap laptop from 2004. The laptop had been running Windows XP, with 1GB of memory, a 32-bit-only processor, and a 150GB hard drive. The computer didn't support USB boot, but once I got the installer running, it never complained that the hardware was unsupported.
To be fair, the computer ran horrendously slowly, but nothing ever crashed on me, and I actually think it ran a little faster with Windows 10 than with Windows XP. And I used this as my daily driver for about 4 months, so this isn't just based on a brief impression.
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Come on. I've done that and still do: I use an ancient version of Adobe Acrobat that I got with a student discount more than 10 years ago to scan documents and manipulate PDFs. I'd probably switch to an open source app, if one were feature comparable, but I'm busy and honestly don't have the time to wade through it all (and I've got a working solution).
Adobe software is ridiculously overpriced, and I'm sure many, many people have done the same when they had perpetual-use licenses.
If you were instead asking for hardware documentation, or open-sourcing of Rosetta once sunset, then we're on the same team.
Open-sourcing is one solution, but knowing Apple it's not a likely one. Their "we know best" mindset is why I quit dailying Macs entirely; it's not sustainable outside the mobile dev business. A computer that supports 32-bit binaries, OpenGL, or x86 translation when you bought it should be able to retain that capability into the future. Anything less is planned obsolescence, even if you want to argue there's a silver lining to introducing new tech. New tech should be competitive on its merits, not because its competitor was forcibly mutilated.
Apple has done this exact same thing for every architecture change and every API they sunset, but you gave them your money anyway. Their history of discontinuing software support and telling users to harangue third-party devs isn't exactly a secret.
It would be different if the feature wasn't popular at all but that doesn't seem to be the case.
Apple doesn't want to maintain it forever, and a handful of legacy apps will never be bothered to update to native Apple Silicon support unless it means losing access to their user base. Apple has given them plenty of time to do it naturally, and now Apple is giving them a stronger reason and a couple more years to get it done. Apple is not randomly discontinuing it with no notice; two years is plenty of time for maintained software to get over the finish line.
At the end of the day, Apple doesn't want to pay to maintain this compatibility layer for forever, and Apple's customers will have a better experience in the long run if the software they are using is not running through an extra translation layer.
There will always be some niche users who want this feature to remain forever, but it's clearly not a significant enough percentage of users for Apple to be worried about that, or else Apple would maintain it forever.
The one I have my eye on is Minecraft. While not mission-critical in any way, they were fairly quick to update the game itself, but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It's almost like the devs installed Rosetta 2 and don't realize their launcher is using it.
1. From the Apple menu, choose "About This Mac."
2. In the resulting window, click the "More Info..." button. This will open the System Settings window.
3. Scroll to the bottom of that window and click "System Report."
4. In the left side of the resulting window, under "Software," click "Applications." This will provide a list of installed applications. One of the columns for sorting is "Kind"; all x86 apps will be listed with the kind "Intel."
I'm super aware of the issues involved; I oversaw the transition from PPC to Intel at a university back in the day, using OG Rosetta. Even then, we had users who would only stop using their PPC apps when you took them from their cold, dead hands.
1. Go into Activity Monitor
2. From the CPU or memory tab, look at the “Kind” column. It’ll either say “Apple” or “Intel.” If the Kind column isn’t visible, right-click on the column labels and select Kind.
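A process can also ask whether it is itself running under translation. A small C sketch using the `sysctl.proc_translated` key that Apple documents for exactly this runtime check:

```c
#include <errno.h>
#include <stdio.h>
#include <sys/sysctl.h>

// Returns 1 if the current process runs translated under Rosetta,
// 0 if native, -1 on unexpected errors.
static int process_is_translated(void) {
    int ret = 0;
    size_t size = sizeof(ret);
    if (sysctlbyname("sysctl.proc_translated", &ret, &size, NULL, 0) == -1)
        return errno == ENOENT ? 0 : -1;  // key absent: native (or older OS)
    return ret;
}

int main(void) {
    printf("translated: %d\n", process_is_translated());
    return 0;
}
```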
There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.
My naive assumption is that, by the time 2027 comes around, they might have some sort of slow software emulation on par with, say, Rosetta performance on an M1.
> One of the key reasons why Rosetta 2 provides such a high level of translation efficiency is the support of x86-64 memory ordering in the M1 SoC. The SoC also has dedicated instructions for computing x86 flags.
[1] https://en.wikipedia.org/wiki/Rosetta_(software)
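To make the memory-ordering point concrete, here's a classic message-passing example in C. This is only a sketch of the hardware-level difference: the C compiler is itself free to reorder the relaxed accesses below, so read it as what translated machine code must preserve rather than a reliable runtime test.

```c
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

// x86-64's TSO model keeps the producer's two stores in order, so
// translated x86 code gets that guarantee "for free" when the core
// runs in the M-series' TSO mode. On normally ordered ARM64, the
// hardware may reorder these relaxed accesses, so a translator
// without TSO support would have to emit barriers around nearly
// every load and store.
static atomic_int payload, ready;

static void *producer(void *arg) {
    (void)arg;
    atomic_store_explicit(&payload, 42, memory_order_relaxed);
    atomic_store_explicit(&ready, 1, memory_order_relaxed);
    return NULL;
}

static void *consumer(void *arg) {
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_relaxed))
        ;  // spin until the flag becomes visible
    // In-order stores (TSO) make 42 the only possible result here;
    // on weakly ordered hardware a stale 0 is permitted.
    printf("payload = %d\n",
           atomic_load_explicit(&payload, memory_order_relaxed));
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
```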
[1] https://github.com/apple/container -- uses Rosetta translation for x64 images.
There is hardware acceleration in place that only exists to, as you just stated, give it acceptable performance.
It does take up die space, but they're going to keep it around because they've decided to reduce the set of applications that Rosetta 2 (and the hardware that exists only for it) will support.
So, seems like they've decided they can't fight the fact that gaming is a Windows thing, but there's no excuse for app developers.
Schematically "Rosetta 2" is multiple things:
- hardware support (e.g TSO)
- binary translation (AOT + JIT)
- fat binaries (dylibs, frameworks, executables)
- UI (inspector checkbox, arch(1) command, ...)
My bet is that, beyond the fancy high-level "Rosetta 2" label, what will happen is that they'll simply stop shipping fat x86_64+aarch64 system binaries and frameworks[0], while the remainder stays.
[0]: or rather, heavily cull
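On the fat-binary piece: a universal binary is just a small big-endian header listing per-architecture slices. A rough C sketch of a `lipo -archs`-style inspector (it ignores the 64-bit fat format, FAT_MAGIC_64, and keeps error handling minimal):

```c
#include <libkern/OSByteOrder.h>
#include <mach-o/fat.h>
#include <stdio.h>

int main(int argc, char **argv) {
    if (argc != 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    struct fat_header fh;
    if (fread(&fh, sizeof fh, 1, f) != 1) { fclose(f); return 1; }
    // Fat headers are stored big-endian regardless of host byte order.
    if (OSSwapBigToHostInt32(fh.magic) != FAT_MAGIC) {
        puts("thin binary (single architecture)");
        fclose(f);
        return 0;
    }
    uint32_t n = OSSwapBigToHostInt32(fh.nfat_arch);
    for (uint32_t i = 0; i < n; i++) {
        struct fat_arch fa;
        if (fread(&fa, sizeof fa, 1, f) != 1) break;
        cpu_type_t cpu = (cpu_type_t)OSSwapBigToHostInt32((uint32_t)fa.cputype);
        printf("slice %u: cputype 0x%x%s\n", i, (unsigned)cpu,
               cpu == CPU_TYPE_X86_64 ? " (x86_64)" :
               cpu == CPU_TYPE_ARM64  ? " (arm64)"  : "");
    }
    fclose(f);
    return 0;
}
```

Culling would mean the x86_64 slices simply disappear from these tables in system frameworks.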
If you're not willing to commit to supporting the latest and greatest, you shouldn't be developing for Apple.
It happens to be OK for me as a SWE with basic home uses, so I'm their exact target user. Given how many other people need their OS to do its primary job of running software, idk how they expect to gain customers this way. It's good that they don't junk up the OS with absolute legacy support, but they should at least provide some kind of emulation, even if it's slow.
I've never seen this make a practical difference. I'm sure you can spot differences if you look for them (particularly at the hardware interface level), but QEMU has done this for decades, and so has Apple.
And it looks like Rosetta 2 for containers will continue to be supported past macOS 28 just fine. It's Rosetta 2 for Mac apps that's being phased out, and not even all of that (they'll keep it for games that don't need macOS frameworks to be kept around in Intel format).
https://github.com/DLTcollab/sse2neon
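For context, sse2neon is a header that reimplements x86 SSE intrinsics on top of NEON, so SIMD code can be ported to arm64 source-compatibly instead of being translated at runtime. A minimal usage sketch:

```c
// On x86 this code would include <xmmintrin.h>; swapping in sse2neon.h
// maps each _mm_* intrinsic onto its closest NEON equivalent.
#include "sse2neon.h"

void add4(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a);             // unaligned 4-float load
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));  // element-wise add, store
}
```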
The low-level Rosetta as a translation layer (which is what containers use) will be kept, and they will even keep it for Intel games, as they say in the OP.
The only hold out is GraalVM which doesn’t trivially support cross compilation (yet).
I run that image (and a bunch of others) on my M3 dev machine in OrbStack, which I think provides the best docker and/or kubernetes container host experience on macOS.
They released this a while ago, which hints at supporting amd64 beyond the Rosetta end date.
Is it slow? Absolutely. But you'd be insane to run it in production anyway.
A test suite that becomes 10x slower is already a huge issue.
That said, it doesn't seem like Rosetta for container use is going anywhere. Rosetta for legacy Mac applications (the macOS-level layer) is.
You can of course always use QEMU inside that VM to run non-native code (e.g. x86 on Apple Silicon); however, this is perceived as much slower than using Rosetta.
The deprecation is mentioned in the context of the Rosetta translation environment [1]. Rosetta for Linux uses the same wording [2].
For example, Docker at least used to use this same binary translation internally years ago (the same tech whose deprecation is mentioned). I don't know how it is today.
[1]: https://developer.apple.com/documentation/apple-silicon/abou...
[2]: https://developer.apple.com/documentation/virtualization/run...
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
Since the Linux version of Rosetta requires even less from the host OS, I would expect it to stay around even longer.
https://news.ycombinator.com/item?id=42483895
It's crazy to me that Apple would put one guy on a project this important. At my company (another FAANG), I would have the CEO asking me for updates and roadmaps and everything. I know that stuff slows me down, but even without it, I don't think I could ever do something like this... I feel like I do when I watch guitar YouTubers: just terrible.
I hope you were at least compensated like a team of 20 engineers :P
https://www.quora.com/Apple-company/How-does-Apple-keep-secr...
It’s really unclear what it means to support old games but not old apps in general.
I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).
So then why say only games when the minimum to support the games probably covers a lot of non games too?
I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?
That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.
Bear in mind that a large chunk of Mac gaming right now that needs translation is Windows games translated via CrossOver.
So my point remains, if Apple has to continue providing Intel builds of all of these frameworks, that means a lot of other apps could also continue to run. But ... Apple says they won't, so how are they going to accomplish this? That's the mystery to me.
I’m assuming Apple isn’t going to arbitrarily restrict what runs but will remove things to just the subset that they believe are needed for games such that other stuff just implicitly won’t work.
I grant it’s probably possible to do, but I think that is a lot more work and more error prone than just continuing to ship the major frameworks as they were.
From Apple’s perspective I’m sure they have a few big goals here:
1. Encourage anyone who wants to continue offering software on Mac to update their builds to include arm64.
2. Reduce download size, on disk size, and memory use of macOS.
3. Reduce QA burden of testing ancient 3rd party software.
These are also the same motivations Apple had when they eliminated 32 bit Intel and when they eliminated Rosetta 1, but they were criticized especially for leaving behind game libraries.
Arguably, arbitrarily restricting what runs gets them the biggest slice of their goals with the minimum work. Devs are given the stick. People typically only play 1 game at a time and then quit it, so there isn’t a bunch of Intel code in RAM all the time because of a few small apps hanging out, and they have less to test because it’s a finite set of games. It just will chafe because if they do that then you know that some unblessed software could run but Apple is just preventing it to make their lives easier.
They already have the frameworks supporting intel. They can just start pruning away.
Some teams will draw the short straw of what needs to continue being supported, but it’s likely a very small subset of what they already maintain today.
And then the next question is: why? It's not like they've ever promised much compatibility for old software on new macOS. Why not let it be just best effort; if it runs, it runs?
You would hope that Apple would open source it, but they are one of the worst companies in the world for open-sourcing things. Shame on all their engineers.