Apple Barely Talked About AI at Its Big iPhone 17 Event
Posted 4 months ago · Active 4 months ago
theverge.com · Tech · story
calm · mixed
Debate
60/100
Key topics
Apple
AI
iPhone
Apple's recent iPhone 17 event barely mentioned AI, sparking discussion among HN users about the company's approach to AI integration and its potential impact on user experience.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 13m
Peak period: 0-6h (51 comments)
Avg / period: 12
Comment distribution: 84 data points
Based on 84 loaded comments
Key moments
- 01 · Story posted: Sep 9, 2025 at 3:54 PM EDT (4 months ago)
- 02 · First comment: Sep 9, 2025 at 4:07 PM EDT (13m after posting)
- 03 · Peak activity: 51 comments in 0-6h (hottest window of the conversation)
- 04 · Latest activity: Sep 12, 2025 at 1:44 PM EDT (4 months ago)
ID: 45187841 · Type: story · Last synced: 11/20/2025, 6:12:35 PM
...except for the motion-activated lighting in our foyer and laundry room. $15, 15 minutes to install, no additional charges, no external services, no security issues, and just works year after year with 100% reliability.
I want to reach for my tools when I want to use them.
They have all been done locally on your device for the last decade, at least.
Anyway, it's not the same thing: I'm fine with machine learning to give me better image search results; I'm not fine with machine learning to generate "art" or machine learning to generate text. Everyone has collectively agreed to call the latter "AI" rather than machine learning, so the term is a useful distinction.
Remember the term "smart" as applied to any device or software mode that made ~any assumptions beyond "stay on while trigger is held"? "AI" is the new "smart." Even expert systems, decision trees, and full-text search are "AI" now.
Not really, I'm taking the hint. If they call a feature "AI", there's a 99% chance it's empty hype. If they call a feature "machine learning", there may be something useful in there.
Notice how Apple, in this event even, uses the term "machine learning" for some features (like some of their image processing stuff) and "AI" for other features. Their usage of the terms more or less matches my line of features I want and features I don't want.
But that's not true of any other actor in the market. Everyone else — but especially venture-backed companies trying to get/retain investor interest — is still trying to find a justification for calling every single thing they're selling "AI".
(And it's also not even true of Apple themselves as recently as six months ago. They were approaching their marketing this way too, right up until their whole "AI" team crashed and burned.)
Apple-of-H2-2025 is literally the only company your heuristic will actually spit out any useful information for. For everything else, you'll just end up with 100% false positives.
The same product could be produced five years ago or today, and the one produced five years ago would not be described as having "AI features", while the one produced today would.
(You can check for yourself: look at the online product listing for any mature "smart" device that got a new rev in the last three years. The Clapper would be described as an "AI" device today.)
And I make an effort to avoid those with "AI" features where practical. I do not need an AI toothbrush.
All machine learning is AI, not all AI is machine learning.
Of course this is going to be spun and turned into a negative, but I basically want ML to be invisible again: the benefits clear, the underlying tech no longer mattering.
(of course can't speak to this new release, obviously)
(Genuinely curious, perhaps there are third-party apps I can use to bridge the gap.)
Is it something that is not usable on Apple devices?
It's also hard to find a better laptop for running an LLM locally.
ChatGPT being the number one app is a weird way for people to express they don't trust AI: https://apps.apple.com/us/charts/iphone
They’ve been putting AI in a lot of places over the years.
The expectation is that Apple will eventually launch a revolutionary new product, service or feature based around AI. This is the company that envisioned the Knowledge Navigator in the 80s after all. The story is simply that it hasn't happened yet. That doesn't make it a non-story, simply an obvious one.
So was last year’s, technically, but that didn’t stop Apple from making it all about AI.
I feel like he’d be obsessively working to combine AI, robotics, and battery technology into the classic sci-fi android.
Instead, modern Apple seems to be innovating essentially nothing unless you count the VR thing and the rumors of an Apple car, which sounds to me much like the Apple Newton.
* Apps are already logged in, so no extra friction to grant access.
* Apps mostly use Apple-developed UI frameworks, so Apple could turn them into AI-readable representations, instead of raw pixels. In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.
* iPhones already have specialized hardware for AI acceleration.
I want to be able to tell my phone to a) summarize my finances across all the apps I have, b) give me a list of new articles on a certain topic from my magazine/news apps, and c) combine internet search with on-device files to generate personal reports.
All this is possible, but Apple doesn't care to do this. The path not taken is invisible, and no one will criticize them for squandering this opportunity. That's a more subtle drawback with only having two phone operating systems.
Edit: And add strong controls to limit what it can and cannot access, especially for the creepy stuff.
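For what it's worth, much of the plumbing for the "summarize my finances" wish already exists: with App Intents, an app can expose typed actions and data to the system instead of raw pixels. A minimal sketch, assuming a hypothetical banking app (the intent name and the hard-coded balance are made up):

    import AppIntents

    // Hypothetical intent a banking app could expose so a system-level
    // assistant can fetch a balance without screen-scraping the UI.
    struct AccountBalanceIntent: AppIntent {
        static var title: LocalizedStringResource = "Get Account Balance"

        func perform() async throws -> some IntentResult & ReturnsValue<String> {
            // A real app would read this from its own data store.
            let balance = "$1,234.56"
            return .result(value: balance)
        }
    }

The API surface is there; what the comment is pointing at is the absence of a first-party assistant that actually chains intents like this across apps.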
People care about extra privacy when the delta in capability is minimal. But people won't accept a massive discrepancy, like the difference between an 8B model and a 700B model.
Apps already have such an accessibility tree; it's used for VoiceOver and you can use it to write UI unit tests. (If you haven't tested your own app with VoiceOver, you should.)
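Concretely: in SwiftUI those annotations are one-line modifiers, and a UI test queries the exact same tree VoiceOver walks. A minimal sketch (the view, label, and identifier are made-up examples):

    // In the app target
    import SwiftUI

    struct BalanceView: View {
        var body: some View {
            Text("$1,234.56")
                // These feed the accessibility tree that VoiceOver and
                // XCUITest both read; no pixel inspection involved.
                .accessibilityLabel("Account balance")
                .accessibilityIdentifier("balanceLabel")
        }
    }

    // In the UI test target
    import XCTest

    final class BalanceUITests: XCTestCase {
        func testBalanceIsExposedToAccessibility() {
            let app = XCUIApplication()
            app.launch()
            // Query by the identifier set above, through the same tree.
            XCTAssertTrue(app.staticTexts["balanceLabel"].waitForExistence(timeout: 5))
        }
    }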
This really is the problem. Why do I spend hundreds of dollars more for specialized hardware that’s better than last year’s specialized hardware if all the AI features are going to be an API call to ChatGPT? I am pretty sure I don’t need all of that hardware to watch YouTube videos or scroll Instagram/web, which is what 95% of the users do.
A big issue to solve is battery life. Right now there's already a lot that goes on at night while the user sleeps with their phone plugged in. This helps to preserve battery life because you can run intensive tasks while hooked up to a power source.
If apps are doing a lot of AI stuff in the course of regular interaction, that could drain the battery fairly quickly.
Amazingly, I think the memory footprint of the phones will also need to get quite a bit larger to really support the big use cases and workflows. (I do feel somewhat crazy that it is already possible to purchase an iPhone with 1TB of storage and 8GB of RAM).
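If heavy inference really does get deferred to the overnight, plugged-in window, the existing BackgroundTasks API can already express that constraint. A rough sketch (the task identifier is hypothetical, and it would also have to be listed under BGTaskSchedulerPermittedIdentifiers in Info.plist with a matching launch handler registered at app start):

    import BackgroundTasks

    // Defer heavy on-device inference until the phone is charging.
    func scheduleOvernightInference() {
        let request = BGProcessingTaskRequest(identifier: "com.example.overnight-inference")
        request.requiresExternalPower = true        // only run while plugged in
        request.requiresNetworkConnectivity = false // keep the work on-device
        try? BGTaskScheduler.shared.submit(request)
    }

On the memory point, a back-of-envelope number: an 8B-parameter model quantized to 4 bits is roughly 4 GB of weights before any KV cache or the rest of the OS, which is why 8GB of RAM looks tight for the workflows people are imagining.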
https://www.bhphotovideo.com/c/product/1868375-REG/sandisk_s... 2TB $185
https://www.bhphotovideo.com/c/product/1692704-REG/sandisk_s... 1TB $90
https://www.bhphotovideo.com/c/product/1712751-REG/sandisk_s... 512GB $40
Nevermind that—iOS just needs to reliably be able to play the song I’m telling it to without complaining “sorry, something went wrong with the connection…”
IMO, it was the research team's fault; good riddance.
Instead, we got, what? An automated Memoji maker? Holy hell, they dropped the ball on this.
https://machinelearning.apple.com/research/ferret-ui-2
just using "AI" as term... they are so on the forefront that they sent your data to ChatGPT, otherwise you would be too ahead of the pack...
THAT would make me take an upgrade. Until then, I'm just keeping this phone until it goes out of support.