iPhone 17 Chip Becomes the Fastest Single-Core CPU in the World on PassMark
Posted 4 months ago · Active 3 months ago
tomshardware.com · Tech story · High profile
Tone: heated, mixed
Debate: 80/100
Key topics
Apple
Arm
CPU Performance
The iPhone 17 chip has become the fastest single-core CPU in the world on PassMark, sparking discussion on its implications for future MacBooks and the limitations of Apple's closed ecosystem.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 29m after posting
Peak period: 138 comments in 0-12h
Average per period: 32
Comment distribution: 160 data points (based on 160 loaded comments)
Key moments
1. Story posted: Sep 27, 2025 at 3:48 PM EDT (4 months ago)
2. First comment: Sep 27, 2025 at 4:17 PM EDT (29m after posting)
3. Peak activity: 138 comments in 0-12h (hottest window of the conversation)
4. Latest activity: Oct 2, 2025 at 2:45 PM EDT (3 months ago)
ID: 45398802 · Type: story · Last synced: 11/20/2025, 7:40:50 PM
Read the primary article or dive into the live Hacker News thread when you're ready.
It helps that for heavier work I tend to use my desktop I guess.
My M1 Max is getting along quite nicely; I love this thing.
But the average user will never encounter this. Typical web browsing and office tasks are bursty and mostly single threaded.
Gaming is the only area where it may be more of a problem. But it’s fine for this to simply not be a great gaming laptop. Most laptops aren’t.
Claude Code, some light video and photo editing, YouTube, and Netflix.
Not sure what others using an entry level laptop would expect it to do.
The A series has been passively cooled for 15 years inside of systems with much less thermal mass.
Ah but a Chromebook or low end budget PC laptop will? From experience these things fire up like jet engines just to open a text editor.
I would expect to see them use the same sort of game plan for an A series MacBook that they used with the iPhone SE.
Use previous gen parts for items like the screen and body (for cost savings) along with current gen SOCs.
Apple TV 4K at $200, MacBook at $600.
Back of the napkin math but it checks out.
With all of Apple's service offerings these days, they could also potentially justify slim margins by positioning the A-series MacBook as both a loss leader and gateway drug.
It was only intended for highway travel at best.
But hey, you can scroll at 120hz now, right? Think different!
All this tells me is that Intel and AMD are the only manufacturers making leading chips that you can do anything with.
Edit: Fixed last sentence.
Right now Intel is losing a lot; luckily for them, Nvidia invested. So far I would only use AMD. My desktop has an AMD chip because it is simply the fastest desktop CPU, and the Threadrippers are absolute multicore beasts for servers.
You're talking about a company that limits you from having a comma on the front of the iPhone keyboard. Why?
These phones record 4K video at 120 frames a second.
They play high intensity video games.
They run on device language models.
People keep their iPhones for years — this power will ensure these phones can run iOS versions 5, 6, even 7 years from now.
And the uArch you’re seeing here will end up re-purposed for big brother M series chips in laptops and desktops.
So what exactly do you want?
The iPads even more so, as some literally have the same kind of hardware as a MacBook already (minus the detachable keyboard).
Buy a laptop; you'll be OK.
> Buy a laptop; you'll be OK.
This kind of comment is uncalled for. Anyway, as mentioned, I already screen-share to a MacBook today. The advantage of having both devices is clearly for Apple, though, not for me.
For whatever reason, the community is now filled with a weird mix of corporate bootlickers who want to give the largest corporations money while those corporations take away our freedoms, and who simultaneously display their entitlement by claiming individual developers and small businesses should only ever give away their software for free.
It was bad enough that people here chime in on technical conversations they have no experience with, loudly contesting the points of actual experts and practitioners; but now they also want to tell us that we don't have a right to use our property how we want.
Yeah, I've been tinkering with Linux since 1994; those were good times. I've changed.
> For whatever reason, the community is now filled with a weird mix of corporate bootlickers who want to give the largest corporations money while those corporations take away our freedoms, and who simultaneously display their entitlement by claiming individual developers and small businesses should only ever give away their software for free.
A bit after the turn of the millennium I became legally blind, and I also needed to eat. Apple was able to serve my needs with their accessibility tools and keep me fully functional without additional costs beyond the hardware and base software, and still does to this day. Nothing came close back then, and nothing does now. While I appreciate Stallman's ideals and how he wouldn't use assistive technology if it isn't free software, I can't succumb to those artificial restrictions. The reality is that I would have been much worse off financially, in being self-sufficient, in going through my studies, in progressing in my career, etc., if it wasn't for Apple's accessibility tech. So yeah, I'll own the whole corporate-bootlicker nonsense, and it's why in later years of my life I now invest in Apple: because they deliver usable solutions to real problems for the vast population rather than catering to an insignificant population with piddly ideals.
It must be nice if your only issue with tech is that you need it to go brrrrrrrr.
And sure, you could go looking elsewhere in the market, but the same thing could happen with any vendor, and there's no guarantee that the specific set of features you desire is going to be available at a price point that makes sense (especially if your income changes due to further disability).
What if the current US federal administration continues its fascist descent and decides that supporting disabled people is "woke DEI shit" and puts incredible pressure on companies to discontinue features for, or even surveil and report, its disabled users? Eugenics is part and parcel of their ideology after all. Or more likely, the US govt requires a ChatControl-like feature in all commercial software.
Free software is a matter of personal self-reliance -- not just some hobby.
Your devices become useless, and the ability to run free software becomes the only way to salvage them.
Free software is a matter of personal self-reliance -- not just some hobby.
Free software provides me negligible utility; I can't rely on it for anything, since it has virtually unusable accessibility and assistive tech. The chances free software comes to the table with something usable are a lot lower than the scenarios you have dreamt up, so we'll be fine in the long run.
Like, if the Stallman life works for you, go for it; but I would've been left stranded in many areas of my life, with stunted growth (not physically), if I had not found another way to use tech/computers, which would probably have involved spending stupid amounts of money on (again) proprietary niche accessibility tech. Stallman and his FOSS sycophants only care about their ideals, not actually empowering people.
The hacker culture is something that fascinated me as teenager and the reason I am able to pay the bills today. I don't really know what would be of me without it.
What workload are you envisaging to be run on an iPhone where this even matters? Hyperbole aside, what target population of iPhone users even care about overclocking, and specifically what tangible benefit will they get out of it?
But there’s good news — this architecture will end up in MacBooks, Mac minis, Mac Studios etc.
It’s like complaining that they put a good engine in a civic when you can also buy that good engine in other configurations that will let you do more with it.
So why insist on doing it on a phone?
In practice, the engineering effort to enable that just doesn't seem worth it. And in a zero-sum world of engineer time, a cost better spent elsewhere. Let my laptop be a laptop, and focus on making that experience the best it can be. And let my phone be a phone, and focus on making that experience the best it can be.
I think people fundamentally don't understand Apple when they want them to engage in the same kind of "jack of all trades master of none" pursuits that led to subpar Windows experiences and the fragmented Android ecosystem.
You can kind of see Apple dabbling with this a bit with iPadOS. And it's an absolute mess. My least favorite operating system Apple makes. All available evidence right now points to Apple simply not being able to neatly converge different computing paradigms. They are right to show restraint with their most important product.
I'm ok with them experimenting with this with the iPad, because frankly, the iPad does not matter. But I do not want Apple to mess up the phone for the two people on hacker news that want to hook theirs up to a thunderbolt dock.
If I pay for hardware, I want to use it however I want. I know that is a fucking crazy idea for the corporate bootlickers.
To me it feels like you have a parasitic relationship with the brand of your phone maker. And you will fight other individuals to defend its monopolistic actions.
But to address your concern, Apple is clearly not for you and that is fine, you have choices, like the premium value and control you get with PinePhone. No corporate bootlicking required!
The choices are to run a worse OS or a worse CPU. How did I not see how many choices I have? I must be blind! Thanks for opening my eyes, mate.
So a market already exists for those that want full control over hardware and embrace hacker culture and don't subject you to corporate bootlicking, yet this is still not enough? I actually meant what I said when I say Apple is not for you, because this is a segment of the market they're not interested in.
If it matters to you, put your money where your mouth is and support the markets and initiatives that embody your values; you never know, you could contribute to efforts in creating a CPU that surpasses Apple or whoever is the incumbent.
Don't make me laugh.
You are not only a bootlicker, since you defend corporations' monopolistic practices, but also a troll.
M-series silicon is very efficient on workstations for dev work.
I've heard that Nvidia makes some GPUs that you can do things with.
> Intel is the only manufacturer making leading chips
lol wat
From context you seem to be talking about overclocking. In which case only a handful of Intel chips are unlocked per generation. By contrast, most of AMD chips are unlocked.
Not sure why the A18's PassMark scores are quite a bit lower than the A17's.
I don’t want a MacBook with an A19, I want to use the A19 I already have connected to a screen and a keyboard with a proper software stack.
Interesting how people rationalize this stuff. My phone can dock to a screen; I've never used the feature, but it doesn't obstruct anything I do. It's a nice-to-have, and it would be kinda appealing if Firefox and Spotify both work like normal. I might even deign to say I could get work done on it (though I've never tried).
Presumably the real reason is that Apple is still afraid to segment their market. A plug-in iPhone would stop people from buying the AppleTV or Mac Mini for home theater applications.
People manage to rationalize that decision, which makes the web largely unusable for users in the name of protecting Google's bottom line.
Thankfully, there are other options.
You should try it, but you might need a desktop computer if your smartphone discriminates against browser engines.
They seem very serious about using every lever at their disposal to prevent YouTube users from having access to adblock that works.
You can already stream from your iPhone via AirPlay to at least Roku sticks/TVs and I assume others. The number of people who want to use an iPhone as a full computer is miniscule.
Just having such a feature involves a cost, and the juice is just not worth the squeeze.
Curious, if your phone supports the feature and it doesn't work, what is your recourse?
The biggest hurdle standing in the way is that making a single one of iPhone/iPad/Mac too powerful (in terms of usability, not just raw processing power) will take away sales from the other two.
> The biggest hurdle standing in the way is that making a single one of iPhone/iPad/Mac too powerful (in terms of usability, not just raw processing power) will take away sales from the other two.
No it won’t. Nobody is cross-shopping a full size laptop with screen and keyboard or a phone with a tiny screen and no keyboard.
If the PinePhone had enough power to record a video, and maybe better Waydroid integration, I would use it. I like the convenience of using the same apps as on my laptop, and being able to develop on the same platform that I am targeting. That is a unique selling point.
Apple has the funding to do this, but they choose not to. It would damage their whole market segmentation scheme. It is a penetration strategy
And almost all of my apps have iPhone, iPad and Mac versions with cloud syncing of the data between them.
It’s just a flag for developers to allow iPad apps to run on ARM based Macs without any modifications.
> *and being able to develop on the same platform that I am targeting. That is a unique selling point.*
With ARM based Macs, they basically are the same except for the screen size. Compilation speed would be much slower on an iPhone than a Mac.
The iPad and the Mac combined is 20% of Apple’s revenue. People buy iPads because they want a larger screen.
I mean you can carry around a portable USB C monitor and plug it into your iPhone today. I have one that gets power and video from one USB cord for my laptop. But most people don’t want to do that.
Have some imagination? I know plenty of folks that use their phone as their main computer, but could use more screen space on occasion to finish a complex task at a desk with a mouse and keyboard.
Something like phone mirroring that utilizes the full display (Perhaps full macOS?) would be amazing for that use case. Could be wireless or a magsafe stand, or even a homepod-style handoff thing with the phone’s nfc chip.
The hardware for this is pretty much there, Apple just needs to productize it.
This is my iPhone 16 Pro Max with my external portable monitor. The phone is connected with a standard USB C cord and the Anker battery is plugged in with another USB C cord into the monitor.
https://imgur.com/a/1Fv6Zc6
I had my Apple Bluetooth keyboard and mouse with me. There is no reason I couldn’t pair them to my phone.
There’s no good reason I shouldn’t be able to plug my phone into my Studio Display when I need a bit more room to work on a task I started on my phone. Yes, there’s handoff between iOS & macOS, but it’s very tied to specific apps, and requires another Mac that may not really be necessary.
What work would you start on your phone that doesn't either have an app on your Mac where everything is synced via cloud services, or a web app where you can start on one device and keep going on another?
I can already use GSuite (work and personal), Office365 (personal subscription), and iWork across my Mac, iPhone, and iPad using apps and/or the web, and things automatically sync; of course notes, calendar, mail, messages, Slack, etc. are synced between everything.
My personal Trello board is synced between all of them and even my third party podcast app - Overcast - has an iPhone, iPad, Mac and web interface that syncs.
It was a compelling value proposition to Crackberries of the era; Apple clearly did market research before flinging it to the masses. Danger Inc / Google were already converging on this with their first Android.
The biggest hurdle standing in the way is that making a single one of iPhone/iPad/Mac too powerful (in terms of usability, not just raw processing power) will take away sales from the other two.
Cannibalising their product line is probably the most plausible explanation, but I'm sceptical. I reckon it's just that they haven't found a "killer" use case for it, and it's probably at the bottom of the list of ideas for improving the product (if it even made a list).
But for earlier examples, I had a Palm VII back in 1999. I was working for a CRM reseller and we got one to play with as a potential solution for a client project.
It was super limited but being able to browse the web while on the go was immediately obvious as a very big deal. BlackBerries didn't get web until a couple years later but I'm sure users of that would say something similar.
Please don’t say because they need to run a terminal and Docker.
For a small subset that is absolutely also stuff like terminal and Docker, but there's nothing special about that group beyond "they use a different set of apps not allowed in the App Store".
On iOS, Microsoft apps and a web browser can get you fairly far for many business/corporate use cases.
1. Zoom
2. The AWS console in a web browser
3. The terminal - and I can bring up CloudShell for simple things from the AWS web console
4. Slack/Notion/GSuite apps/Jira
5. Visual Studio Code and using Docker. For that, I would just spin up a Windows based AWS WorkSpace with the iPad client app and wouldn’t be able to tell the difference when using a regular Bluetooth keyboard and mouse.
Most people don’t need #5
The PC gaming market is really not that large in the grand scheme of things, and what serious gamer would want to play games on laptop-class hardware? Even with the iPad Pro you are talking about MacBook Air-level hardware with worse thermals for games.
And the most popular office apps, Microsoft Office and GSuite, are per-seat licenses, for both home and office. Meaning you can use the apps on your phone, laptop, or other mobile devices and sync your data between them.
How many productivity apps don’t “fit in Apple’s guideline”?
I don't mean office apps, I mean enterprise apps. I do see them becoming more web focused with time (which I think is a good thing - it's ultra portable when they are) but we're certainly not ready to claim victory just because most email and document editing can be done from a webview. Hell, there's one app I have to use daily which is still officially only 32 bit Windows (it, thankfully, works in Crossover).
> Hell, there's one app I have to use daily which is still officially only 32 bit Windows (it, thankfully, works in Crossover).
As an example. This one is Intangi IRIS, a BOM and quoting tool for network products, and they officially support Crossover. They've been talking about native macOS support since before I started using it in 2019... but that's the speed of business for you :).
It's the long tail of apps like these that can make it painful - not the bulk of the day (email, conference calls, and web ticketing systems) itself.
s/uninterested/unaware/
In reality, for such hardware to make sense, it would have to be a full MacBook Air minus the PCB. Would you be willing to spend 500-1000 USD for a piece of hardware that only works when your phone is connected? (i.e. an iPhone accessory)
https://www.amazon.com/dp/B095GG31KX?ref=ppx_pop_mob_ap_shar...
I take it with me when I travel.
Did you come to that conclusion based on the market not using a product that doesn’t exist?
https://arstechnica.com/gadgets/2013/01/canonical-unveils-ub...
Okay.
Notably the people pushing for dockable computers are computer geeks who say things like "I only listen to OGG Vorbis, MP3 is unusable" and "no wireless, less space than a Nomad, lame" and "I could build Dropbox in a weekend with rsync" and "I don't understand why people need to see pictures or video, text is all I need". Normal people aren't itching to SSH from their TV using a bluetooth keyboard connected to a smartphone with a VPN so they can do X-Forwarding of Brave browser. Normal people are fine with buying a laptop with Gmail and using an Android TV stick to watch from a streaming service.
Offering a dockable screen/keyboard/mouse, using the phone battery/compute/storage seems like it would be trivial for Apple.
Obviously it cannibalises laptop or tablet sales, but that's Apple's choice, not the market's disinterest.
source? or are we just going off vibes here?
15% and increasing quickly. That's about 1 in 7.
It would mean a lot more work for developers though if your app needs two different UI designs.
One issue that sticks out is that touch controls of a usable size just take up so much more screen space.
Also, look at how many years it took Microsoft to provide touch friendly access to the Windows control panels.
See Apple, but also US politics.
I stopped using Windows when Microsoft made it obvious that they fully intend to force users to use an online account to log into their own local machine.
Requiring an online account if you want to use optional online features was perfectly acceptable.
Needing an online account to access your own PC?
Absolutely not.
So by the time you put a screen, speakers, keyboard, trackpad, battery, ports, case, hinge, and charging into your dock, you might as well put a whole computer there.
The laptop as a device makes less and less sense now that the CPU and GPU inside a phone are good enough for most people. It's propped up only because phone providers artificially limit what the software on the phone can do.
The only way this can work is wireless at which point you are reinventing a laptop.
You already have the phone. All the pieces are technically there to seamlessly connect to a screen and a keyboard wirelessly and have it become a proper computer, which is what it always was. I think Google has seen it, and that's why they have started working on their desktop mode, but they are sadly stuck with underpowered CPUs.
You need to think bigger. The sole reason this is not already a reality is because it would be a loss of money for Apple.
PassMark and Geekbench are closed source. I don't know why I should trust them to e.g. treat fundamentally different kinds of devices in a sane and fair way. They have vastly different cooling mechanisms, for one. It matters if they can sustain a certain load for 5 minutes or for 5 hours.
It obviously doesn't scale well if every potential end user has to benchmark every device.
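The burst-vs-sustained distinction that comment raises is easy to probe with a crude harness of your own: run one fixed CPU-bound task repeatedly and compare early (cold, boosted) iterations against late (thermally settled) ones. This is a minimal sketch, not how PassMark or Geekbench actually work; the workload and iteration counts are illustrative assumptions:

```python
import time

def workload(n=200_000):
    # Fixed CPU-bound task: sum of integer squares.
    return sum(i * i for i in range(n))

def sustained_run(iterations=50):
    """Time the same workload repeatedly. On a passively cooled,
    thermally limited device the later iterations tend to slow
    down as the SoC throttles; on a desktop they usually don't."""
    times = []
    for _ in range(iterations):
        start = time.perf_counter()
        workload()
        times.append(time.perf_counter() - start)
    # Average the first few (burst) vs the last few (sustained).
    burst = sum(times[:5]) / 5
    sustained = sum(times[-5:]) / 5
    return burst, sustained

if __name__ == "__main__":
    burst, sustained = sustained_run()
    print(f"burst: {burst:.4f}s/iter  sustained: {sustained:.4f}s/iter")
```

A short burst benchmark only ever sees the first few data points, which is exactly the objection: a phone and a tower can post the same burst score yet diverge badly over hours.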
SPEC has been the industry standard performance benchmark for comparing between different CPU architectures for decades.
It takes all freaking day to run, but Anandtech published the benchmarks as soon as they got their grubby little hands on a new core design for every well known architecture.
Is GeekerWan on YouTube the only outlet still doing this today, albeit in Chinese with English subtitles?
You would think that Chips and Cheese, at least, would take up the gauntlet.
Modern Apple hardware has so much more memory bandwidth than the x86 systems they're being compared to - I'm not sure it's apples to apples.
Pushing this point further, x86 chips are also slower when the entire task fits in cache.
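The cache-residency point can be sketched with a toy measurement: keep the total number of element touches constant while varying the working-set size, so small buffers stay cache-resident and large ones stream from DRAM. Note the caveat: in CPython the interpreter overhead dominates, so the cache cliff that is obvious in a compiled language may barely show here; the sizes and names below are illustrative assumptions, and this only demonstrates the measurement structure:

```python
import array
import time

def traverse(buf, passes):
    # Same total work regardless of buffer size: touch every
    # element, looping over the buffer enough times to keep the
    # total element count constant across configurations.
    total = 0
    for _ in range(passes):
        for v in buf:
            total += v
    return total

def bytes_per_second(n_elems, total_elems=2_000_000):
    buf = array.array("q", range(n_elems))  # 8-byte signed ints
    passes = total_elems // n_elems
    start = time.perf_counter()
    traverse(buf, passes)
    elapsed = time.perf_counter() - start
    return (passes * n_elems * 8) / elapsed

if __name__ == "__main__":
    # Small buffer: cache-resident. Large buffer: DRAM-bound.
    print(f"small working set: {bytes_per_second(4_096):.2e} B/s")
    print(f"large working set: {bytes_per_second(2_000_000):.2e} B/s")
```

The same shape of experiment, written in C with a pointer-chasing pattern, is how tools like lmbench expose the L1/L2/L3/DRAM latency tiers, which is what makes cross-device comparisons with very different cache and bandwidth budgets so slippery.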
The real observation is how this isn’t some Apple black magic. All three of the big ARM core designers (Apple, ARM, and Qualcomm) are now beating x86 in raw performance and stomping them in performance per watt (and in performance per watt per area).
It's not just Apple's deep pockets either. AMD spent more on R&D than ARM's entire gross profit last I checked. Either AMD sucks or x86 has more technical roadblocks than some people like to believe.
I do feel like x86 has more technical roadblocks, but I disagree that the amount of investment is not the primary driving factor at this point. I haven't seen designs from ARM itself beat x86 on raw performance yet, and 100% of their funding goes toward that goal. E.g. the X925 core certainly doesn't, nor does the top single-core Android device on e.g. Geekbench come close to current iOS/PC device scores. They've announced future parts like the C1 which are supposed to, but now we're talking marketing claims about upcoming 2026 CPUs vs Zen 5 from 2024. Perf/watt-wise, absolutely, of course; that ship sailed long ago. Z1/Z2 were admirable attempts in that regard, but still a day late and a dollar short against leading ARM designs.
The other factor to consider is scale-out CPUs with massive DC core counts tend to have mediocre single core performance, and that's what AMD really builds Zen for. Compare to Graviton in the DC and AMD is actually performing really well in both single/multi performance, perf/watt, and perf/dollar. It just doesn't scale down perfectly.
Apple/Qualcomm have certainly dumped more R&D into their cores being low core count beasts, and it shows vs any competition (ARM or x86). The news likes to talk about how many of the Nuvia developers came from working on Apple Silicon, but I think that is a bit oversold - I think it's mostly that those two development programs had a ton of investment targeting this specific use case as the main outcome.
Meanwhile the AI Max+ 395 has at least twice the bandwidth + same number of cores and comes to more like a ~15% loss on single and ~30% loss on multithread due to other "traditional" reasons for performance difference. I still like my 395 though, but more for the following reason.
The more practical advantage of soldered memory on mobile devices is the power/heat reductions, same with increasing the cache on e-cores to get something out of every possible cycle you power rather than try to increase the overall computation with more wattage (i.e. transistors or clocks). Better bandwidth/latency is a cool bonus though.
For a hard number the iPhone 17 Pro Max is supposed to be around 76 GB/s, yet my iPhone 17 Pro Max has a higher PassMark single core performance score than my 9800X3D with larger L3 cache and RAM operating at >100 GB/s. The iPhone does have a TSMC node advantage to consider as well, but I still think it just comes out ahead due to "better overall engineering".
So I think there's more to it than memory bandwidth.
This makes a lot of sense. If the calculations are fast, they need to be fed quickly. You don't want to spend a bunch of time shuffling various caches.
Haven't they been playing leapfrog for years now? I avoid the ARM ecosystem because of how non-standardized the boot/BIOS story is (especially after being burned by several different SoC purchases), and I prefer compatibility over performance. But I think there have been high-performance ARM chips for quite some time.
I don’t know if that will be the M5 or M6 series but with active cooling and wall/big battery power I bet it will be impressive.
And a pro/ultra variant with lots of cores might post some very nice multi core numbers.
Way to go Apple. That’s a great accomplishment.
44 more comments available on Hacker News