Dissecting the Apple M1 GPU, the End
Key topics
As a young developer wrapped up her five-year journey to reverse-engineer the Apple M1 GPU for Asahi Linux, commenters celebrated her remarkable achievement and pondered her next move, particularly her new role at Intel. Some worried that Intel might not be the most stable or enlightened home for her talent, given the company's history with high-profile engineers and recent layoffs in its Linux driver team. Others were relieved that she landed at Intel, which has a strong track record of contributing to open-source projects like the Linux kernel and Mesa. The discussion also touched on Apple's restrictive approach to open-source work and its implications for innovation, with some lamenting the company's walled-garden approach and its potential impact on the broader tech ecosystem.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 23m after posting
- Peak period: 63 comments in 0-6h
- Average per period: 17.8 comments
- Based on 160 loaded comments
Key moments
1. Story posted: Aug 26, 2025 at 9:44 PM EDT
2. First comment: Aug 26, 2025 at 10:06 PM EDT (23m after posting)
3. Peak activity: 63 comments in 0-6h (hottest window of the conversation)
4. Latest activity: Aug 30, 2025 at 1:43 PM EDT
Want the full context?
Jump to the original sources
Read the primary article or dive into the live Hacker News thread when you're ready.
My point was that the graphics division itself will still be around, as integrated mobile SoCs are basically the only revenue stream Intel still has a good handle on. That requires a graphics core, and all of the other usable options are either not for sale to Intel, have burned Intel in the past, or are owned by Arm.
Don't get me wrong, IMO Intel's outlook is indeed rather bleak right now, but I would not completely write it off just yet.
I think at best you could say it's more challenging or perhaps risky being a bit restricted with IP, but I'd call it miles away from a "graveyard".
You can hardly call Intel/AMD/Qualcomm etc. struggling due to their architectures being locked down.
Look at PowerPC / the Power ISA. It's (entirely?) open and hasn't really done any better than x86.
Fundamentally you're going to be tied to backwards compatibility to some extent. You're limited to evolution, not revolution. And I don't think x86 has failed to evolve? (e.g. AVX10 is very new.)
And Apple, to complete the circle.
Apple does have open source projects. https://opensource.apple.com But the scope is rather limited. For someone of Alyssa's skillset there really isn't anything there.
You can already do this work on M1/M2 using Asahi. A compute server doesn't need fully working peripherals and external displays.
Man, I wish I had half the energy of this author.
Trolling will get you banned here, so please don't.
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
We can't not make mistakes. The best we can do is acknowledge when we make a mistake and do what we can to fix it.
> I censor whatever I like (dislike), I guess
If you knew how many comments I dislike on HN, you would no longer have that perception.
To be fair, even if you have the best CPU and GPU designers, it's not as if you can call up TSMC and have them do a run of your shiny new processor on their latest (or even older) process. You can't fab them at home either.
If it were up to me, 2 years of successful reverse engineering (of a variety of projects/products) would be a requirement to be called an engineer. You learn a lot from working on things that you can't learn from a book (and without having to make the mistakes yourself first…)
Just to make it clear: I am not implying anything about Alyssa - just stating an observation based on my own experience.
Creating things is a gamble, as mass adoption is almost never by technical merits, but by marketing. So you could make open documented everything but still end up with nobody benefiting from that openness, because a competitor (whether open or not) wipes you out. You saw this happen even in the era where electronic devices were expected to come with full schematics -- there were winners and losers even then.
But, if something has become widespread and well adopted, and it's not open, that's a problem. It absolutely should be opened up and documented. Especially if it's not because the money-grubbing creators of the something are deliberately hiding how it works and locking down control in order to extract more money from everyone else's pockets. The sooner you put an end to that, and the more often you fight against that, the sooner society itself becomes more efficient and fairer for everyone.
> With Linux 6.16, we also hit a pretty cool milestone. In our first progress report, we mentioned that we were carrying over 1200 patches downstream. After doing a little housekeeping on our branch and upstreaming what we have so far, that number is now below 1000 for the first time in many years, meaning we have managed to upstream a little over 20% of our entire patch set in just under five months. If we discount the DCP and GPU/Rust patches from both figures, that proportion jumps to just under half!
So if the discussions are true, it can take years for the developers to finish M1/M2 upstreaming, with all the Linux kernel bureaucracy. That is, unless they decide to start working on M3 before finishing the upstreaming.
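As a rough sanity check on that timeline, here is a back-of-envelope extrapolation in Python. The round numbers are assumptions read off the quoted progress report ("over 1200" patches carried, "below 1000" after just under five months), not exact counts from the Asahi tree.

```python
# Back-of-envelope extrapolation of the Asahi upstreaming pace.
# All figures are assumptions rounded from the quoted report,
# not exact patch counts.
carried_at_start = 1200   # "over 1200 patches" carried downstream
carried_now = 1000        # "below 1000" after housekeeping + upstreaming
months_elapsed = 5        # "in just under five months"

patches_per_month = (carried_at_start - carried_now) / months_elapsed  # ~40
months_remaining = carried_now / patches_per_month                     # ~25

print(f"~{patches_per_month:.0f} patches/month upstreamed")
print(f"~{months_remaining / 12:.1f} years to drain the remaining stack")
```

At that pace the remaining stack takes roughly two more years, which lines up with the "years" estimate above; in practice the rate is uneven, since review bandwidth varies and the hard DCP and GPU/Rust patches dominate the tail.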
Qualcomm has been beating the marketing drum on this instead of delivering. Ampere has delivered excellent hardware but does not seem interested in the desktop segment. The "greatest Linux laptop around" cannot be some unmaintained relic from a hostile hardware company.
If you want to do a device, and your only chip option is Qualcomm I'd recommend not doing a device at all.
Can you see any other machine coming close to a Mac in terms of hardware quality and performance? Obviously the cost is silly, but while I agree with your sentiment, it seems optimistic to hope.
IME the Asahi support page is spot-on: There are a couple of yet-unsupported features (DP-alt mode being a big one), but any feature listed as supported will just work without hidden gotchas. I find this a big contrast to other devices, which will often "work" but have annoying little quirks here and there that are workable but can feel like a downgrade compared to Windows.
There's some room for improvement, but that is purely relative to macOS. Asahi still solidly beats other x86 devices (other than the low end ones you wouldn't do development work on).
One issue is that idle battery consumption is higher than on macOS (an active area of improvement, though [1]), which you'll notice by an M1 laptop discharging by about 12% overnight when macOS would've eaten maybe 2-3%. Not a big issue normally, but it can be inconvenient if the device shuts down due to an empty battery overnight.
During more passive daytime use (e.g. playing music), the display tends to be the biggest power hog. That's not really Linux-specific, but it's why I actively turn off the screen when it's not needed (KDE lets you configure the power button to do so).
[1] https://social.treehouse.systems/@chaos_princess/11498433865...
Any sources for that? I'd be quite surprised if Apple had radically altered the architecture.
[1] https://developer.apple.com/videos/play/tech-talks/111375/
The great thing is, you can!
The MacBook Pro display is one of the best laptop displays.
(Better for the battery too, if you can keep most of the screen dark.)
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
https://asahilinux.org/2025/02/passing-the-torch/
> With Linux 6.16, we also hit a pretty cool milestone. In our first progress report, we mentioned that we were carrying over 1200 patches downstream. After doing a little housekeeping on our branch and upstreaming what we have so far, that number is now below 1000 for the first time in many years, meaning we have managed to upstream a little over 20% of our entire patch set in just under five months. If we discount the DCP and GPU/Rust patches from both figures, that proportion jumps to just under half!
While we still have quite a way to go, this progress has already made rebases significantly less hassle and given us some room to breathe.
https://asahilinux.org/2025/08/progress-report-6-16/
It is accepting a new challenge.
She did the challenging stuff she cares about. One aspect of nerd brain often is that you can hyperfocus on challenging stuff, but can't get the motivation to work on stuff you don't care about - and even what would be a 20 minute task can end up taking days because of that. It's great that she has the self awareness to set goals, and step away once they're done.
I didn't have that at that age, and still sometimes struggle. I was lucky enough that my employer back then recognized my issues and paired other people with me to do the stuff I was not interested in, and now I usually manage to offload that stuff onto co-workers by myself.
I've said it before and I will keep saying it: the financialization of everything and the utter dominance of braindead, long-since-disproven MBA ideology is going to seriously impede our societies in the coming decades.
And now that tech is flooded with people, it's gonna be easy to just not deal with the troublesome "weird" people, and instead go with those who are happy to go through all the bureaucracy and 10-stage interviews.
I wouldn't be so quick to judge someone for ADHD.
Because I have it, untreated. And I couldn't even finish university because of it. I'm unable to do certain things, like at all; I'm nearly physically ill when doing them. Hard to explain to someone without these problems :)
Luckily enough, it's not that important here / Idc about money, career etc.
I have ADHD-PI or whatever (diagnosed). I know the struggle. It's a lifelong battle that never ends.
Hang in there
(Mind you, I'm not talking about a matter of inborn temperament or character, much less a moral flaw! Rather, finding the compelling challenge even in "boring" tasks is a valuable skill and situational tactic that anyone should explicitly learn about and aim to acquire as part of becoming a mature professional, not a matter of morality or somehow being dismissed as "lazy"!)
At least with Panfrost it made more sense because it's still being used.
M1 laptops can only be bought second-hand at this point.
[1] https://9to5mac.com/2024/03/16/walmart-m1-macbook-air-launch...
[2] https://www.walmart.com/ip/Apple-MacBook-Air-13-3-inch-Lapto...
But 8GB of RAM… that's unfortunately completely unusable for most developers. (Panfrost drivers you can at least use on RPi-like devices.)
Maybe in another 5 years it'll work on the M3/M4 and I'll revisit this. Good to know the devices are still being built so long after release.
The Walmart deal is a total mystery. It started, seemingly, as dumping new old stock without selling it on Apple.com, but I think they've even updated the machine, so clearly it's an ongoing concern.
Nothing like it I know of for Apple, ever. I’d love to know the story.
It has been a long time since people have needed cutting edge laptops, so an M1 bought today will still work for 90% of people for the next 5+ years. Even if Apple doesn’t earn a large profit margin on the sale of the laptop, they could earn a decent amount on monthly services revenue, plus increased odds of that person buying a watch/airpods/phone/etc.
An M1 is great. But RAM and storage won’t hold up as long.
I suspect they can sell them at that price and still make a killing, and all the equipment to make chassis/etc is already paid off.
For the RAM, 8GB is not enough, but in fairness, when the system can page out at 200GB/s, paging doesn't hurt nearly as badly. It's only when things have to thrash the page file that it becomes readily apparent on these machines (say, when an application needs to keep more than a few GB resident in memory all the time).
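For a sense of scale, here is a minimal sketch of that argument, taking the comment's 200GB/s figure at face value (a best case closer to unified-memory/compressed-swap speeds than to SSD-backed swap, which is an order of magnitude slower). The working-set size is a made-up example.

```python
# Time to move a working set in or out of swap at a given bandwidth.
# 200 GB/s is the figure from the comment above (best case); the 4 GB
# working set is a hypothetical example, not a measured value.
bandwidth_gb_per_s = 200
working_set_gb = 4

seconds = working_set_gb / bandwidth_gb_per_s
print(f"{seconds * 1000:.0f} ms to stream {working_set_gb} GB")  # ~20 ms
```

At tens of milliseconds per few-GB eviction, occasional paging is barely perceptible; it only hurts once the live working set genuinely exceeds RAM and pages thrash back and forth, which is exactly the caveat in the comment.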
But even if 8 GB of RAM holds you today, will it hold you five years from now? Or will you have to get rid of the computer much sooner and buy another one by then?
Whereas simply doubling the RAM would likely extend the life a significant amount.
It’s not high spec for sure, but with M1 RAM counting double (they swap very efficiently up to a certain point) it’s still plenty for casual use.
I see most people around me watching media, using a web browser to shop, checking maps, looking at photos/videos (small storage is great for Apple; then more people buy iCloud), filling out PDFs, and maybe some email or light Excel.
Presumably, those are the people likely to buy a laptop at Walmart.
New M1 Macbook Airs are still available at Walmart (maybe elsewhere). But even if not, who cares? People are still writing code for computers that haven't been sold since the 1980s.
As the other replies show, you can still buy this machine, but it sounds like that likely won't be the case for much longer.
The developers involved must be acutely aware of it. Maybe they have some sense that the work will easily carry over to current M chips. Or maybe they don't really care about that; it was just an interesting exercise and they move on.
Maybe it's just due to a complete lack of attention, but I think M3/M4 support is extremely minimal at this point. Which is not a great sign.
Looking at the drama and people stepping down, I don't think MacBooks will be properly supported on Linux in this decade.
(The M3/M4 are in progress but not supported. That's public on the project's compatibility chart.)
Also, the infrequent random OS crashes were annoying. And sometimes WiFi would stop working after sleep (it would not show any access points) and would require a reboot.
The M1 is 5 years old already and is still not fully stable and lacks features. It seems like the overall development effort started slowing down a couple of years ago, and while we did get the amazing audio daemon and graphics driver, development of other things seems to be stuck.
If I remember correctly, there were also some comments from Marcan (?) on social media about issues with supporting newer chips (M3/M4), hinting that M3 and M4 are vastly different and require significant effort to add Linux support.
So if M3, M4 and other future versions are too different to get supported in a decent time frame, then that means Asahi is all about supporting years-old hardware. That reduces interest from Linux users looking to buy a laptop now, potentially reducing available donations, the developer pool, interest, etc.
I love what Marcan, Alyssa, James and others have achieved and how they have pushed Linux further. I think that their contributions will stay relevant and be useful for other hardware for many years to come.
For anyone who approaches it with the attitude that whatever Linux distros do is UNIX, there are enough surprises in there.
How tf does she juggle and manage to do all this? I can barely do one of the above properly.
Although most likely she’s well compensated, and doesn’t have to waste time on useless efforts at work, this level of discipline and striving towards a goal is just very rare in general.
Possibly also no family, limited social life and no other hobbies.
Half of me kinda wants another lockdown so I can do more discipline-y stuff, but the other half is like: dude, you're just gonna waste it playing more games. I just gotta face the music: I'm just not disciplined and I just don't have the drive.
Every person is different of course, there might be this one brilliant engineer forced to manage against his will somewhere.
However, discipline is an enormous factor too, actually using that extra available time on something “productive” is no easy feat.
Now I have kids and live in the same area as my parents and siblings again, entirely happy, but less free time.
[1] https://rosenzweig.io/resume-en.pdf
Thanks for all your amazing contributions Alyssa and all the best for the road ahead!
Well done.
64 more comments available on Hacker News