What Happened to Apple's Legendary Attention to Detail?
Posted 2 months ago · Active 2 months ago
Source: blog.johnozbay.com · Tech story · High profile · Heated, negative debate (85/100)
Key topics: Apple, iOS, Software Quality
The article laments Apple's perceived decline in attention to detail, sparking a heated discussion among commenters about the company's priorities and the impact of its recent changes.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion. First comment: 11m after posting. Peak period: 145 comments in 0-12h. Average per period: 22.9. Comment distribution: 160 data points (based on 160 loaded comments).
Key moments
1. Story posted: Oct 23, 2025 at 3:05 PM EDT (2 months ago)
2. First comment: Oct 23, 2025 at 3:16 PM EDT (11m after posting)
3. Peak activity: 145 comments in 0-12h (the hottest window of the conversation)
4. Latest activity: Oct 30, 2025 at 7:08 PM EDT (2 months ago)
ID: 45685551 · Type: story · Last synced: 11/22/2025, 11:17:55 PM
(Perhaps charismatic leaders neglect it a bit on purpose because they enjoy being portrayed as irreplaceable.)
The people in power don't have a burning yes.
First, I quickly tap the first button, the one showing a picture of the credit card along with its name. As a result I find myself in a menu that shows me the billing address (go figure)! So I have to go back and use the button below, which simply states “Change the credit card” or something to that effect.
Why, for the love of god, does Apple use a picture of the credit card for the billing-address screen? Why is the billing address even the first option?!
So, multiple clicks where proper design could have avoided them (I think in the past the picture button was the one that changed credit cards, but I may be misremembering).
Every time it happens I think about Steve Jobs.
However, it seems that under Tim Cook, Apple has gradually lost many of its traditional values when it comes to usability and UI/UX perfectionism. I suspect that the company has not passed on "The Apple Way" to people who joined the company after Steve Jobs' passing. Not only that, there doesn't seem to be an "Apple Way" anymore.
Come to think of it, the old Apple had figures like Bruce Tognazzini who wrote about "The Apple Way"; I have a copy of Tog on Interface that distills many of the UI/UX principles of the classic Mac. I can't think of any figures like Tog in the modern era.
Gradually the Apple software ecosystem is losing its distinctiveness in a world filled with janky software. It's still better than Windows to me, but I'd be happier with Snow Leopard with a modern Web browser and security updates.
It's sad; the classic Mac and Jobs-era Mac OS X were wonderful platforms with rich ecosystems of software that conformed to the Apple Human Interface Guidelines of those eras. I wish a new company or a community open-source project would pick up from where Apple left off when Jobs passed away.
There's helloSystem[0]... not sure if it counts.
[0] https://hellosystem.github.io/docs/
Instead, they should have stayed on the Straight and Narrow of Quality - where they were for many years - where you move up to computing paradise by having fewer features but more time spent perfecting them.
We see what you did there!
Yep. The attention to detail is still there; it has just shifted from polishing and curating details to creating a lot of small, unpolished, uncalled-for, and thus very annoying details. From an MBA's POV there isn't much difference, and the latter even produces better KPIs.
Not entirely, though; there is joy and playfulness at the core of Liquid Glass. But delight is not that common, and refinement and focus are definitely lacking. They could have used more nos and fewer yeses.
That's the bed they made for themselves, and they lie in it willingly.
No one is forcing them to do huge yearly releases. No one is forcing them to do yearly releases. No one is forcing them to tie new features which are all software anyway to yearly releases (and in recent years actual features are shipping later and later after the announcement, so they are not really tied to releases either anymore).
Their software side however is riddled with issues, delayed or cancelled projects, meandering focus, unclear priorities. They are increasingly overpromising and underdelivering.
Their product side is neither here nor there. Vision Pro is a technically marvelous bust. iPhones rely on gimmicks because there's really nothing to differentiate them anymore. Peripherals (HomePod, Apple TV) are stagnant. The iPad suddenly showed signs of life this year with good functionality updates after a decade of "we have no idea what to do with it, here are meaningless and confusing hardware updates". MacBooks were completed as a product years ago (not that that's a bad thing), so, again, they are just randomly slapping nonsensical names on upgrades and chasing thinness.
Oh. Thinness. That is literally the only feature that Apple is obsessed with. You can't build a product strategy on thinness alone.
The stock market can easily be taught anything. And Jobs didn't even care about the stock market, or stockholders (Apple famously didn't pay dividends for a very long time), or investors (regularly ignoring any and all calls and advice from the largest investors).
You need political will and taste to say a thousand nos to every yes. None of the senior citizens in charge of Apple have that.
They could easily wait longer between releasing devices. An M1 Macbook is still in 2025 a massive upgrade for anybody switching from PC - five years after release.
If Apple included fully fledged apps for photo editing and video editing, and maybe small business tools like invoicing, there would be no reason for any consumer in any segment to purchase anything other than a Mac.
They could, but then they wouldn't be a trillion dollar company. They'd be a mere $800bn company, at best. ;)
Not many consumers go out to buy an Apple device because the new one has been released. They go out to buy a new phone or new computer because their old one gave out and will just take the Apple device that is for sale.
That's also why Apple bothers to do the silent little spec-bump releases: it gives Business Leasing corporate buyers a new SKU to use to justify staying on the upgrade treadmill for their 10k devices for another cycle (rather than holding off for even a single cycle because "it's the same SKU.")
I generally see complaints about advancement aimed at the hardware. Some are unreasonable standards, some are backlash to the idea of continuing to buy a new iphone every year or two as the differences shrink, but either way software feature spam is a bad response.
And then what? Mac users would buy some janky Acer with Windows 11 and a bunch of preinstalled malware instead?
1. They've stopped starting small and instead started unrealistically large. Apple Intelligence is a great recent example.
2. They've stopped iterating with small improvements and features, and instead decided that "iterating" just means "pile on more features and change things".
Either everyone is worried about the consequences of failing to produce high quality work (including at the VP level, given they can allocate additional time/resources for feature baking) or optimizing whatever OKR/KPI the CEO is on about this quarter becomes a more reliable career move.
And once that happens (spiced with scale), the company is lost in the Forest of Trying to Design Effective OKRs.
And that's not excusable - every feature should have its maintainer who should know that a large framework update like Liquid Glass can break basically anything and should re-test the app under every scenario they could think of (and as "the maintainer" they should know all the scenarios) and push to fix any found bugs...
Also a company as big as Apple should eat its own dogfood and force their employees to use the beta versions to find as many bugs as they could... If every Apple employee used the beta version on their own personal computer before release I can't realistically imagine how the "Electron app slowing down Tahoe" issue wouldn't be discovered before global release...
https://media.ycharts.com/charts/441687ba735392d10a1a8058120...
“The companies forget how to make great products. The product sensibility and product genius that brought them to this monopolistic position gets rotted out by people running these companies who have no conception of a good product vs. a bad product. They have no conception of the craftsmanship that’s required to take a good idea and turn it into a good product. And they really have no feeling in their hearts about wanting to help the customers.”
- Steve Jobs - https://en.wikipedia.org/wiki/Steve_Jobs:_The_Lost_Interview
That said, I wonder, Jobs lived through Apple's transformation, but not its peak phase where Apple was simply printing money year after year after year. I do wonder if Jobs in 2016 would have been able to keep the organization performing at such a high caliber.
Even he seemed like he was making unforced errors at times too, like the "you're holding it wrong" fiasco, but it's hard to say, since he didn't live through Apple 2013-2019, when it became an ever-increasing money-printing machine.
In the age of AI, COVID-19, etc., I wonder how Jobs would have treated things post-2020.
When I interviewed at a smaller company, someone high up interviewed me last. I passed everything on paper afaik, but he didn't think I was the right person for some reason. Which is fine for a small company.
I was in a (tech) meetup last week. We meet regularly, we are somewhere between acquaintances and friends. One thing that came up was a very candid comment about how "we should be able to tell someone 'that is just stupid' whenever the situation warrants it".
I believe that does more good than harm, even to the person it is being directed to. It is a nice covenant to have, "we'll call you on your bs whenever you bring it in", that's what a good friend would do. Embracing high standards in a community makes everyone in it better.
The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
I wish he'd bless a certain Linux distro for PCs so we can have some default. Current default is kinda Ubuntu, but they've made some weird decisions in the past. Seems like he'd make reasonable choices and not freak out over pointless differences like systemd.
You can tell someone their idea is substandard without implying they're stupid, which is generally taken as an insult. Tact in communication does matter. I don't think anyone needs to say "that is just stupid" to get a point across.
I've had plenty of tough conversations with colleagues where it was paramount to filter through ideas, and determining the viable ones was really important. Not once did anyone have to punch at someone's intelligence to make the point. Even the simple "That's a bad idea" is better than that.
>whenever the situation warrants it
Which will of course be up to interpretation by just about everyone. That's the problem with so-called "honest"[0] conversation. By using better language you can avoid this problem entirely without demeaning someone. Communication is a skill that can be learned.
>The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
Linus took a sabbatical in 2018 to work on his communication and lack of emotional empathy. He's had to make changes or he absolutely risked losing the respect of his peers and others he respected. He has worked on improving his communication.
To follow Linus as an example, would be to work on communication and emotional empathy. Not disregard your peers.
[0]: Most often, I find people who are adamant about this line of thinking tend to want an excuse to be rude without accountability.
In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap, not much usually gets done. This is how you get design-by-committee, lowest-common-denominator slop.
And even if you don't agree with what I'm saying here, "avoid criticising people" quickly turns into "avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket and the product would be greatly improved.
Those two things are not coupled. You can maintain a sense of politeness in face of conflict. This is the entire basis of Nonviolent Communication, a great book about handling and resolving conflict in such a manner.
It’s extremely effective in my experience and results in overall better clarity and less conversational churn.
>Why would anyone do that if they can't even be called out for messing something up, yet alone being held accountable
You can be, that is in part a definition of accountability and you’re conflating a lack of accountability with some idea that it requires behaving in a manner that may be construed as rude, and that’s simply not true.
So like anything, you hold them accountable. You can do that without being rude.
>In all of those projects and organsiations which value respectful language and inclusivity and all sorts of non-results-oriented crap
I’m getting a sense you have a predisposition to disliking these things. They’re really important because they are, when correctly understood, results oriented. It frees up people to feel comfortable saying things they may not have been otherwise. That is very productive.
Abusive and abrasive language does not do that.
>This is how you get design-by-committee lowest common denominator slop
No; in my experience, and in many reports from others, you get this for a myriad of reasons, but the consistent theme is lack of ownership or organizational politics, not people levelling up their communication skills.
>avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket
I don’t disagree with you because I don’t believe in proper criticism, I do. I disagree with you because the implicit messaging I’m getting here is the following
- you sometimes have to be a jerk
- therefore it’s okay to be a jerk sometimes
- somehow having an expectation of treating others with respect somehow equates to poor accountability
I’ve spent a good chunk of my years learning a lot about effective communication and none of it is about avoiding accountability, of yourself or others. It’s about respecting each other and creating an environment where you can talk about tough things and people are willing to do it again because they were treated respectfully
Yes, I'm conflating the two. Maybe they aren't intrinsically coupled, but from what I have seen, that is what tends to happen. When you forbid surface-level aggression, people don't stop being aggressive or frustrated. They just turn to more underhanded forms of aggression, like bullying or gaslighting.
>I’m getting a sense you have a predisposition to disliking these things. They’re really important because they are, when correctly understood, results oriented. It frees up people to feel comfortable saying things they may not have been otherwise. That is very productive.
I see what you're saying, but I do not think it plays out like that in practice. It's not results-oriented thinking when you prioritise how the other person feels above all. Not that you should never prioritise it (never doing so is called sociopathy), but if you prioritise it too much, you can say fewer things, not more, because disagreeing with someone always carries the risk of offence, especially if you are going to say that their idea isn't the best. If you nurture a culture of honesty - and that does not include being abusive - then people will feel free to push back on bad ideas, and that is results-oriented thinking.
>I disagree with you because the implicit messaging I’m getting here is the following
The first two points, absolutely, but I would like to push back on the third point. Not every idea deserves respect and, hell, not everyone deserves respect either! That is the hollowing out of what the word originally meant. Ideally, you only respect people who deserve respect, who return it to you in turn. To be respected is an honour, not a right, because it carries implicit trust in your words. If you consistently have a negative impact on the environment, I do not think it is a reasonable expectation that others continue to treat you with respect, because that wastes everyone's time.
And if someone "consistently has a negative impact on the environment" you can still confront them without being abrasive. They can still be fired without calling them stupid. Adding that kind of tone adds no information except that you lost your cool. You're making it sound like every instance that warrants confrontation is about an intentional and repeated offence.
What's wrong with calling an idea stupid? A smart person can have stupid ideas. (Or, more trivially, the person delivering a stupid idea might just be a messenger, rather than the person who originally thought of the idea.)
Though, to be clear, saying that an idea is stupid does carry the implication that someone who often thinks of such ideas is, themselves, likely to be stupid. An idea is not itself a mind that can have (a lack of) intelligence; so "that's stupid" does stand for a longer thought — something like "that is the sort of idea that only a stupid person would think of."
But saying that an idea is stupid does not carry the implication that someone is stupid just for providing that one idea. Any more than calling something you do "rude" when you fail to observe some kind of common etiquette of the society you grew up in, implies that you are yourself a "rude person". One is a one-time judgement of an action; the other is a judgement of a persistent trait. The action-judgements can add up as inductive evidence of the persistent trait; but a single action-judgement does not a trait-judgement make.
---
A philosophical tangent:
But what both of those things do — calling an idea stupid, or an action rude — is to attach a certain amount of social approbation or shame to the action/idea, beyond just the amount you'd feel when you hear all the objective reasons the action/idea is bad. Where the intended response to that "communication of shame" is for the shame to be internalized, and to backpropagate and downweight whatever thinking process produced the action/idea within the person. It's intended as a lever for social operant conditioning.
Now, that being said, some people externalize blame — i.e. they experience "shaming messaging" not by feeling shame, but by feeling enraged that someone would attempt to shame them. The social-operant-conditioning lever of shame does not work on these people. Insofar as such people exist in a group, this destabilizes the usefulness of shame as a tool in such a group.
(A personal hypothesis I have is that internalization of blame is something that largely correlates with a belief in an objective morality — and especially, an objective morality that can potentially be better-known/understood by others than oneself. And therefore, as Western society has become decreasingly religious, shame as a social tool has "burned out" in how reliably it can be employed in Western society in arbitrary social contexts. Yet Western society has not adapted fully to this shift yet; which is why so many institutions that expect shame to "work" as a tool — e.g. the democratic system, re: motivating people to vote; or e.g. the school system, re: bullying — are crashing and burning.)
I prefer the former by a lot, but of course you're free to spend your time in the latter.
The likeliest outcome from that is the other person gets defensive and everything stays the same or gets worse. It’s not difficult to learn to be tactful in communication in a way which allows you to get your point across in the same number of words and makes the other person thankful for the correction.
Plus, it saves face. It’s not that rare for someone who blatantly say something is stupid to then be proven wrong. If you’re polite and reasonable about it, when you are wrong it won’t be a big deal.
One thing I noticed about people who pride themselves in being “brutally honest” is that more often than not they get more satisfaction from being brutal than from being honest, and are incredibly thin-skinned when the “honest brutality” is directed at them.
> The Linux kernel would be absolutely trash if Linus were not allowed to be Linus.
I don’t understand why people keep using Torvalds as an example/excuse to be rude. Linus realised he had been a jerk all those years and that that was the wrong attitude. He apologised and vowed to do better, and the sky hasn’t fallen nor has Linux turned to garbage.
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
But that's a special case, not a usual one. Unfortunately, quite a lot of people say things are stupid when they don't understand them (often because of an inflated sense of their own expertise). If they can politely explain why they think an idea is bad, they are more likely to be listened to, and they can save face if the other person successfully counters their argument.
Bottom line is, if you go around calling ideas stupid you better make damn sure you're never wrong, otherwise, well... that's just stupid :)
This is a huge misunderstanding at best and a malicious re-framing of serious issues within portions of the tech industry at worst.
In what context is virtually everyone pushing to hire demonstrably unqualified people?
Any time anyone is celebrated for being an X in this role rather than being good at the role, this is being pushed for.
If you take a hardline attitude on keeping the gates up, you're just going to end up with a monoculture that stagnates.
Sure, they lack wisdom, but that doesn't mean they aren't smart, it just means they're young.
Gatekeeping doesn't have to mean "Don't hire anyone under 35" it means "Don't hire people who are bozos" and "don't hire people who don't give a shit"
I’ve worked at places that have the opposite philosophy - hire quickly and fire quickly. That works in terms of hiring people who already happen to be what you want them to be. It just leaves no room for anyone who could be, but isn’t yet, what you want them to be. It also leaves no room for anyone who is different from what you are looking for but who could still bring a lot to the table if you just take the time to figure out what that is, which I think describes a lot of people. You might have hired a mediocre programmer who would be a rockstar at documentation, for example. That kind of thing happens all the time, yet workplace culture and practices tend not to accommodate that. By all means have standards, but put in some effort to help your people reach them in their own way.
If Apple was made up of only top-end engineers led by a quality-obsessed maniac, would they put out better or worse products?
Of course, not everyone can follow this philosophy, but they don't have to, and most don't want to anyway.
The great engineers don’t graduate from college knowing everything they need to know, nor are they born with that knowledge. It takes time and help from other people to get them there. Even if they were already a top performing engineer at Netflix, that doesn’t mean they can smoothly transition into a role at your company and perform well with zero assistance. The on-ramp matters and has a huge impact on how they will perform. Some people will require more investment than others, but that’s true regardless of whether you stubbornly try to maintain your existing monoculture. And I firmly believe that everyone brings something different to the table. It’s mostly a matter of figuring out what that is for each person.
Another example is that a hobby I loved is now dead to me for lack of gatekeeping; Magic the Gathering. Wizards of the Coast started putting out products that were not for their core playerbase, and when players complained, were told "these products are not for you; but you should accept that because there's no harm in making products for different groups of people". That seems fair enough on its face. Fast forward a couple of years, and Magic's core playerbase has been completely discarded. Now Magic simply whores itself out to third party IPs; this year we'll get or have gotten Final Fantasy, Spiderman, Spongebob Squarepants, and Teenage Mutant Ninja Turtles card sets. They've found it more lucrative in the short-term to tap into the millions of fans of other media franchises while ditching the fanbase that had played Magic for 30 years. "This product is not for you" very rapidly became "this game is not for you", which is pretty unpleasant for people who've been playing it for most or all of their lives.
Also, it became the best selling set of all time even before it was out. Which isn’t an indicator of quality, for sure, but it does show Wizards understands something about their market.
I'm not sure Wizards does understand their market. As you noted, a set doing numbers pre-release has absolutely nothing to do with its quality; it just means there are a lot of Final Fantasy fans interested in collecting cards. But this is not necessarily sustainable for another 30 years, because those Final Fantasy fans are not necessarily going to stick around for Spiderman, and Spiderman fans are not necessarily going to stick around for Spongebob. The Spiderman set was already such a massive flop that they were trying to identify and blame which content creators/streamers were responsible for negatively influencing public opinion, as though that couldn't have happened organically.
At this point, I'm done with WotC. The Pinkerton thing was by far the worst and what made me turn my back forever. Bad rulesets or design decisions with which I disagree are one thing, but I refuse to do business with a company that thinks it's acceptable to use force to try to bully people into sticking to their release schedules. They can pound sand forever.
My take away is that diversity at a global level, and in some specific contexts, is a great thing. But diversity in some other specific contexts is entirely destructive and analogous to rot or decomposition.
When we rely on a core societal function (firefighting, accounting, waterworks maintenance, property rights, etc.) the people responsible for maintaining these functions need to maintain in themselves a set of core characteristics (values as patterns of action), and there is room to play outside of those cores, but those cores shouldn't be jeopardized as a tradeoff for diversity and inclusion.
For example, if the constructive core values of a railroad system are consistency and reliability, then these shouldn't be diminished in the name of diversity and inclusion; but if diversity and inclusion can be achieved secondarily without a tradeoff (or even to somehow further amplify the core values), then they are constructive. One has to thoughtfully weigh the tradeoffs in each context, and ensure that the values most important to maintaining the relevant function are treated as most important. The universe seems to favor pragmatism over ideology, at least in the long run.
So if a company's core values, the ones that make it successful, are diluted in exchange for diversity, it's no longer what it was, and it might not be able to keep doing what it did. That said, it also might have gained something else. One thing diversity tends to offer huge complex systems is stability, especially when it's incorporated alongside other values and not held up singularly.
In other words, my take on diversity (and by extension, inclusion) is that we need a diversity of diversity. Sometimes a lot of diversity is best, and sometimes very little diversity is best.
It's good to not exclude people for arbitrary reasons (though even this requires the caveat that one man's "arbitrary" is another man's "important part of our identity"). But we also need to recognize that it's ok for something to not be everyone's cup of tea. There isn't some kind of moral mandate that everything must be maximally welcoming to all. Unfortunately, we don't recognize that in our current culture, and in fact we stigmatize it as "gatekeeping" which is deemed to be toxic. But the culture is wrong about this.
I'm sceptical. I've never seen what you describe outside of toxic culture-war clickbait videos; what I have seen is nepotism, class privilege, and sprint culture pushed by investors - you know, the exact opposite of what you describe.
I do not for the life of me understand your point. Gatekeeping, as it's most commonly used, means controlling access to something (be it a resource, information, etc.) to deliberately and negatively affect others who are not part of a "blessed" group. It's not objective, and it certainly is not a practice reliant on merit. It's an artificial constraint applied selectively at the whim of the gatekeeper(s).
>There's been a shift where everyone wants to welcome everyone, but the problem is it erodes your company culture and lowers the average quality.
The first assertion and the second one are not related. Being welcoming to everyone is not the same thing as holding people to different standards. Company culture sets company inertia and how employees are incentivized to behave and what they care about. You can have the most brilliant engineers in the world, like Google most certainly does have its fair share, and as we have seen, with the wrong incentives it doesn't matter. Look at Google's chat offerings, the Google Graveyard, many of their policies becoming hostile to users as time goes on etc.
Yet you can have a company with what you may deem "average quality" that exceeds at its business goals because it's oriented its culture to do so. I don't think Mailchimp was ever lauded for its engineering talent the way Google has been, for example, but they dominated their marketplace and built a really successful company culture, at least before the Intuit acquisition.
It used to be hard and a liability to be a nerd
I'm pretty sure this would also render the dot-com bubble the nerds fault?
Let's not go back to how nerd culture used to be regarding diversity... or lack thereof.
I remember when Bill Gates was on magazine covers, viewed as a genius, a wonderful philanthropist, even spoofed in Animaniacs as "Bill Greats."
I guess my point is, "It used to be hard and a liability to be a nerd" was never true, and is nothing but industry cope. The good old days were just smaller, more homogenous, had more mutually-shared good old toxicity and misogyny (to levels that would probably get permabans here within minutes; there's been a lot of collective memory-holing on that), combined with greater idolization of tech billionaires.
What changed in 2010?
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
Successful publicly traded companies have a responsibility to generate more revenue and increase the stock price every year. Year after year. Once their product is mature after so many years, there aren't new variations to release or new markets to enter into.
Sales stagnate and costs stagnate; investors get upset. Only way to get that continual growth is to increase prices and slash costs.
When done responsibly, it's just good business.
The problem comes in next year when you have to do it again. And again. Then the year after you have to do it again. And again.
As with all things in life, all companies eventually die.
—Steve Jobs
https://www.youtube.com/watch?v=rQKis2Cfpeo
I don't use a Mac anymore, but I do use an iPhone. This is the worst version of iOS I can recall. Everything is low contrast and more difficult to see. The colors look washed out. The icons look blurry. In my opinion, Liquid Glass is a total bust. I don't know what these people are thinking. Times have certainly changed.
OP was talking about design languages
I think we're stuck with the notch forever on iPhones. Even if Apple uses an on-screen fingerprint reader in the future, like a billion phones already do, they're not going to go back from the face scanner. The only thing that will work is if the face scanner can read from behind the display.
Maybe it's because I use dark mode? I can only tell it's there if I move my mouse under it.
Here's a "workaround" that might help [1]. It entirely excludes the notch area from use.
[1] https://apple.stackexchange.com/a/460903/601283
Go to Settings > Displays. In the list of resolutions, enable "Show all resolutions"; then you can select one that will hide the notch.
Got the idea from the Top Notch app, which no longer seems to work: https://topnotch.app/
I believe 2026 will finally be the year of Linux desktop.
I’ve been hearing this substituting in YYYY+1 every YYYY for the last quarter century.
The year of Linux desktop will never come. Why?
- Money. Hardware manufacturers make more money selling computers that are optimized for Windows and there is nothing on the horizon that will change that meaning that the Linux desktop experience is always the worst of the three main options for hardware compatibility.
- Microsoft. Call me when Office runs natively in Linux. You might be happy with LibreOffice or Google Docs, but MS Office still dominates the space (and as someone who does a lot of writing and has a number of options, I find Word to be better than any of the alternatives, something that 30 years ago I would have scoffed at).
- Fidgetiness. All the tweaking and customizing that Linux fans like is an annoyance for most people. Every customization I have on my computer is one more thing I need to keep track of if I get a new computer and frankly it’s more than a little bit of a pain.
Everything seems to be lazily done now - by that I mean, a modal pops up and then resizes to fit the content. Never seen this before.
Or, you open settings (settings!) and it's not ready to use until a full second later because things need to pop in and shift.
And it's animated, with animation time, so you just have to wait for the transitions to finish.
And "reduce motion" removes visual feedback of moving things (e.g. closing apps) so I find it entirely unusable.
And as others have noted, the performance is completely unacceptable. I have a 16 Pro and things are slow... And forget "low battery mode" - it's now awful.
I'm not doing anything weird; I keep basically all apps closed and things off when I don't use them, and battery life is still significantly worse. (Noticed the same on M4 + Tahoe, upgraded at the same time.)
Very disappointed and I very much regret upgrading.
There's little problems that keep accumulating, like the camera app opening up and only showing black until restarting it, at which point I've missed the candid opportunity.
I'm not going anywhere, it's still the right mix of just-works across their ecosystem for me, but dang, the focus does feel different, and it's not about our experience using Apple.
[1] https://discussions.apple.com/thread/256140468?sortBy=rank
Also, I have the iPhone 15 Pro (iOS 26.0.1), never had the black screen on camera open yet. That's the kinda thing I'd get on Android.
DHH was someone I kinda read online, but he's been going all-in on these racist talking points, e.g., https://paulbjensen.co.uk/2025/09/17/on-dhhs-as-i-remember-l... :(
I feel like that loses a majority of people right there. I like the option to do common things with the keyboard, or to configure things with a file. But for things I don't do often, holding a bunch of keyboard shortcuts in my head feels like a waste for those rare things.
I'm not sure about anyone else, but I can't run whatever Linux distro I want at work. When an OS relies on muscle memory to get smooth and fluid operation, it seems like that would make it extra hard to jump between for work vs home. I spent years jumping between OS X and Windows, and I found it's much nicer now that I'm on the same OS for work and home. Even the little things, like using a different notes app at home vs work do trip me up a little, where I'll get shortcuts wrong, or simply not invest in them, because of the switching issue. Omarchy feels like it would be that situation on steroids.
Judging by the Omarchy presentation video, it feels too keyboard-oriented. Hotkeys for everything. And hotkeys for AI agents? It is opinionated indeed. Not my cup of tea.
Luckily Safari's Reader Mode didn't bug out
It's fascinating to me because that's the single thing every user goes through. It's the main branch, not some obscure edge case. How do you do testing and still miss that?
Ironically, I don't really mind the new design language, whatever, if only the damned thing worked.
In this case the inefficiency was attention to detail but in other companies it might be something else.
For several years, there's been an issue with audio message recording in iMessage. Prior to iOS 26, it would silently fail; the recording would "begin" but no audio would be captured. This would happen 3, 4, even 5 times in a row before it would actually record audio.
Apple is clearly aware of the issue, because in iOS 26 the failure is no longer silent. Now, you'll get feedback that "Recording isn't available right now". Yet the incidence is...exactly the same. You might have to try 5 times before you're actually able to record a message.
It's simply infuriating, and it makes no sense to a user why this would be happening. Apple owns the entire stack, down to the hardware. Just fix the fucking bug!
You outgrew this myth, congratulations!
> Look, I've got nothing but respect for the perfectly lovely humans who work at Apple. Several are classmates from university, or people I had the pleasure of working with before at different companies. But I rather suspect what's happened here is that some project manager ... convince Tim
But haven't outgrown this one yet, well, maybe in another 8 years...
https://media.nngroup.com/media/editor/2025/10/06/1-messages...
I mean, some people are just impossible to please!
415 more comments available on Hacker News