Meta replaces WhatsApp for Windows with web wrapper
Mood
heated
Sentiment
negative
Category
tech
Key topics
Meta
Windows 11
Web Wrapper
Meta has replaced the native WhatsApp desktop app for Windows with a web wrapper, which is causing performance issues and user dissatisfaction.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 1h after posting
Peak period: 157 comments (Day 1)
Avg / period: 40
Based on 160 loaded comments
Key moments
- 01 Story posted: 11/13/2025, 3:44:37 AM (6d ago)
- 02 First comment: 11/13/2025, 5:06:41 AM (1h after posting)
- 03 Peak activity: 157 comments in Day 1 (hottest window of the conversation)
- 04 Latest activity: 11/17/2025, 8:29:13 AM (2d ago)
Must be a tiny percentage, which is why this version is now just a basic web wrapper.
Anyway, I’d remind everyone that “using” RAM doesn’t mean “would not function with less RAM.”
Many applications just use a lot if it’s available.
RAM is not really something you explicitly ration.
I guess this modern attitude is how we got where we are.
RAM is absolutely a scarce precious resource that we optimize for. At least we used to, and some of us still do.
Oh and the browser (any browser, I tried many) just takes up 1GB per tab it seems. It’s insane. My old 8GB laptop is nearly unusable now and can barely be used to browse the internet and very little else. I can at least keep coding on emacs. Who would think emacs would one day be an example of a lean app!?
You can buy an entire complete mini PC including 16GB of modular RAM for $240 on AliExpress.
I’m not saying “don’t optimize.” I’m saying that watching your task manager, seeing a big number and freaking out isn’t really the definition of unacceptable performance.
Also, even in theory the issue isn't only "wouldn't function", but "would function slower due to e.g. disk swaps / cause other apps to function slower".
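To make the distinction being argued here concrete, below is a minimal sketch (not from the thread) of reading the two numbers Task Manager tends to conflate, assuming the Win32 GetProcessMemoryInfo API; the file name and build command are illustrative only:

```c
/* Working set (RAM resident right now) vs. private commit (what the process
   has asked for). Build (illustrative): cl /W4 meminfo.c psapi.lib */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

int main(void)
{
    PROCESS_MEMORY_COUNTERS_EX pmc;
    /* GetCurrentProcess() inspects this program itself; to inspect another
       app you would OpenProcess() its PID with
       PROCESS_QUERY_LIMITED_INFORMATION instead. */
    if (!GetProcessMemoryInfo(GetCurrentProcess(),
                              (PROCESS_MEMORY_COUNTERS *)&pmc, sizeof(pmc))) {
        fprintf(stderr, "GetProcessMemoryInfo failed: %lu\n", GetLastError());
        return 1;
    }
    printf("Working set (resident RAM): %zu KiB\n", pmc.WorkingSetSize / 1024);
    printf("Private commit:             %zu KiB\n", pmc.PrivateUsage / 1024);
    return 0;
}
```

A big working set alone does not prove the app would fail with less RAM; a big private commit that keeps getting touched is the number that actually crowds out other programs.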
> An app can use a lot of memory, and it does not necessarily mean it’s a performance nightmare, but the issue with the new WhatsApp is that it feels sluggish. You’re going to notice sluggish performance, long loading time, and other performance issues when browsing different conversations.
It most certainly is. My old PC ran on 8MB of RAM. Modern ones need 16GB for a comfortable experience. They do not do much more than I needed back then. I think it's reasonable to expect a simple chat app not to take up 128 times as much memory as my entire PC had when I was young.
Okay but I’m not trying to land on the moon, I’m trying to have an HD group video call and maybe play Cyberpunk in 4K later.
Let me ask you, how much did that 8MB of RAM cost you back in the day? I bet it was more than the $100 it costs to get 32GB of RAM or the $200 it costs to get 64GB. Before you apply inflation!
I have more memory so i can do more things, not so I can do the same things only slightly slower than I could before.
The issue, I think, is who the desktop users are. They are sales people, people who conduct business over WhatsApp. The buyers at a previous job used whatever the sellers in Asia, eastern Europe and the Middle East were using. A long time ago that was mostly Skype; now it's WhatsApp. There's a huge benefit to having WhatsApp on your desktop, with easy copy/paste, Excel and everything you need to make the deals.
Maybe Meta doesn't believe you should do business over WhatsApp and doesn't want to cater to that crowd.
I would love to see what a professional Windows application developer, if those are still around, could do with a native WhatsApp client. Using modern C++, or just C# and all the tooling provided by the platform, how small and functional could you actually make something like that?
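As a rough point of reference (a sketch, not anything from the thread), this is the bare skeleton such a native client would start from, shown in plain C for brevity; the class and window names are hypothetical, and an empty window like this idles at a working set on the order of a few megabytes:

```c
/* Minimal native Win32 window. Build (illustrative): cl /W4 min.c user32.lib */
#include <windows.h>

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    if (msg == WM_DESTROY) {            /* window closed: leave the message loop */
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmdLine, int nCmdShow)
{
    (void)hPrev; (void)cmdLine;                   /* unused */

    WNDCLASS wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
    wc.lpszClassName = TEXT("MiniChatWindow");    /* hypothetical class name */
    RegisterClass(&wc);

    HWND hwnd = CreateWindow(TEXT("MiniChatWindow"), TEXT("Native skeleton"),
                             WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                             800, 600, NULL, NULL, hInst, NULL);
    ShowWindow(hwnd, nCmdShow);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {    /* standard message pump */
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return (int)msg.wParam;
}
```

Everything WhatsApp-specific (networking, crypto, UI) would of course be added on top, but the baseline cost of "a real window on Windows" is nowhere near a gigabyte.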
Still, I think the experience reported is very similar to running Chrome, and I think any laptop with 8GB of RAM can handle the application plus Excel and a web browser (or just run WhatsApp in the browser) just fine.
A complete mini PC with 16GB RAM, 512GB storage, and a relatively modern processor goes for like $240 on AliExpress. And that’s before you consider used hardware.
It's much easier to locate an application that has its own process and presence in the operating system.
The author of the article would have nothing to complain about if this was Facebook.
> Many applications just use a lot if it’s available.
Some of that memory isn't going to be touched again and will eventually be moved to swap, but it still pushed other things out of RAM to get there, and that makes it a troublemaker.
The rest of that memory will be needed again, so if it gets swapped out it'll lag badly when you switch back to the program.
Either way 99% of programs are not doing any kind of intelligent use of spare memory. If you see them doing something that looks wasteful, that's because they're being wasteful.
The one thing to remember is that at the OS level, disk cache pretty much qualifies as free memory. But that's unrelated to this issue.
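For anyone who wants to see that accounting directly, here is a minimal sketch (not from the thread) using the Win32 GlobalMemoryStatusEx call; on Windows, "available" physical memory roughly includes standby (file-cache) pages, which is the point being made:

```c
/* System-wide memory as the OS accounts it.
   Build (illustrative): cl /W4 sysmem.c */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);                 /* must be set before the call */
    if (!GlobalMemoryStatusEx(&ms)) {
        fprintf(stderr, "GlobalMemoryStatusEx failed: %lu\n", GetLastError());
        return 1;
    }
    printf("Physical RAM total:     %llu MiB\n", ms.ullTotalPhys / (1024 * 1024));
    printf("Physical RAM available: %llu MiB\n", ms.ullAvailPhys / (1024 * 1024));
    printf("Memory load:            %lu%%\n", ms.dwMemoryLoad);
    return 0;
}
```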
Except when something really does need more RAM, and fails. LLVM for example having, somehow, become a bit chonky and now fails to compile on 32-bit OpenBSD systems because it wants more memory than is available. Less bloated software of course does not suffer from this problem, and continues to run on still functional 32-bit systems.
> Many applications just use a lot if it’s available.
Xorg is using 92M, irssi 21M (bloated, but I've been lazy about finding something leaner), xenodm 12M. That's the top three. Oh, Windows? Yeah. About that. Best you can hope for is not to catch too much of the splatter. (What passes for Mac OS X these days also seems fairly dismal.)
> RAM is not really something you explicitly ration.
Paperclips were hung on the rack doors to make it easier to poke the wee little red reset button when some poorly written software went all gibblesquik (as poorly written software is wont to do) and the OOM killer could not cope and, whelp, reset time. Elsewhere, RAM is explicitly rationed—perhaps certain aspects of timesharing have been somewhat forgotten in this benighted era of bloat?—and malloc will return NULL, something certain programmers often fail to check for, which is generally followed by the kernel taking the error-ridden code out back and shooting it.
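For readers who haven't run into the failure mode being described, a minimal C sketch (not from the thread) of the check in question; the allocation size is arbitrary:

```c
/* malloc reports failure by returning NULL, and that return value has to be
   checked. Note that on Linux with overcommit enabled, malloc may "succeed"
   and the OOM killer only strikes once the pages are actually touched. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    size_t n = (size_t)512 * 1024 * 1024;   /* 512 MiB, an arbitrary example */
    char *buf = malloc(n);
    if (buf == NULL) {                       /* the check that often gets skipped */
        fprintf(stderr, "allocation of %zu bytes failed\n", n);
        return 1;
    }
    memset(buf, 1, n);                       /* actually touch the pages */
    free(buf);
    return 0;
}
```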
Everybody is on telegram today, it's not like five years ago when people did not know what it was.
Matrix gives a similar experience with e2ee though, but you have to save a recovery key in case you lose access to all your sessions.
WhatsApp does work, I use it on a daily basis. The bot/bridge makes it easy to log in with the QR code. You need to keep the phone app as you need to open it at least once a month or so.
HMU if you need more info.
I wonder if they avoided that so they could use Electron and target MacOS / Linux too
We often hear stories about the speed of development and the issues of maintaining native apps, and then there are these rewrites every few years. Don't they waste more resources vs. creating / fixing the gaps in the native app? And this isn't some quick startup prototype app that can flop, wasting the effort.
Yet, I really don't understand why WhatsApp would need an app at all, especially in the state mentioned here (a basic wrapper).
There are no calls in the web app, but the modern web stack is more than enough to provide all the real functionality needed for it.
If it allowed me to do video calls from a laptop, that could be useful, so of course that can't be a feature they offer.
It does though?
According to this not any more.
Zuck, six months ago: “Within 12 to 18 months, most of the code will be written by AI. And I don’t mean autocomplete.”
Meta, today: "Maintaining this basic Windows app is just too much work."
I don’t think we have. This is always what efficiency leads to, higher resource consumption. The phenomenon was described already in the 1800s: https://en.wikipedia.org/wiki/Jevons_paradox
JS and the web have seen performance improvements. They lead to more ads being served and more code being released faster to users.
That said, I do firmly agree with the parent: there is choice involved here, engineering decisions.
The Microsoft world is particularly bloated, as they insist on shoehorning in unwanted anti-features into their OS. Much more efficient operating systems (and ways of building a chat client) exist.
Jevons paradox may describe a general tendency, but it's no excuse for awful software.
Without any regulations companies will create software that costs more to the users, but saves pennies to the company.
So, we have regressed in efficiency.
They are not mutually exclusive but one follows from the other.
It’s company vs user not regression vs efficiency
Completely wrong and irrelevant analogy!
I see where you went sideways: you completely confused trigger with consequence. Here the efficiency of the very same application got very, very, incredibly, galactically worse, not better. The premise of the linked article is that the same application gets more efficient, and then comes the increased use of the affected resource. Here the same application went to complete shit, efficiency-wise, and had no effect on memory manufacturing or prices; WhatsApp is not that significant in computing.
Probably a better analogy is that when technological and tightly related economic advances raise the availability of resources (here memory and CPU), things go dumb. If anything, the generalized (from time to any resource) Parkinson's law is relevant here: increasing available resources beyond reason leads to waste, bad-quality outcomes and overcomplication.
The application is “business logic”.
The engine is JS. The more efficient JS engines get the more compute and memory JS will use to deliver business logic in the universe.
We have more efficient hardware, so we should be seeing hardware everywhere. But actually we all use the same amount of hardware we did 20 years ago. We all have a desktop, a laptop, a smartphone, a modem, hell even a computer watch, like we did 20 years ago. But they're more efficient now.
Where we do see more hardware now, is in pre-existing appliances, like fridges, TVs. And why is there more hardware? Sometimes it's just a remote control ("turn off TV"). But more often, the increase in adoption follows a specific motive: like figuring out that they could sell ads or subscriptions through it. And the hardware itself is not what's making the ads work: it's the software, that collects the information, feeds it to companies over networks, lets them data-mine it and sell it continuously. Both of these are just serving a human interest to make more money through the creative use of surveillance and marketing. And honestly, most of this could've been done with hardware and software 20 years ago. But because it's in vogue now, we see more of the hardware and software now.
We are comforted by coming up with an explanation that makes logical sense, like the paradox. But the paradox applies most when it coincides with an unrelated human interest. What motivates us is not A+B=C, but a combination of different calculations that sometimes involve A and B, and incidentally add up to C.
That's not the same thing. If you make batteries more efficient then people build more devices that run on batteries and then you need more batteries than ever. But you also get a bunch of new devices you didn't used to have.
When computers get more efficient, miserly corporations cut staff or hire less competent programmers and force their customers to waste the efficiency gains on that, because they don't have enough competition or the customers are locked in by a network effect. The difference is whether you actually get something for it instead of having it taken from you so somebody else can cheap out.
You may not like that from a 'native look and feel' point of view, but the question 'what is a native Windows app these days anyway' is very much unanswerable, and you can actually implement stuff like this in a performant and offline-sensitive way.
But, yeah, by the time the resulting GPU worker process balloons up to 400MB, that pretty much goes out of the window. I'm actually sort-of impressed, in that I have no idea how I would even make that happen! But that's why I don't work at a powerhouse like Meta, I guess...
I completely agree it would be better to rethink what we want and have markup/code/etc optimised to the task of rendering applications. I don't think it'll happen unfortunately.
They had to stop because native widgets aren't secure enough.
JIT compiling, native graphics, quick and easy online deployment into sandboxes, support for desktop standards like keypresses, etc.
It feels like the web ate up the windows desktop experience instead of that experience spreading cross-platform and dominating.
Maybe what you're thinking of is a Wasm runtime like Wasmer.
https://engineering.fb.com/2014/10/31/ios/making-news-feed-n...
But to edit a large document or visualize a large corpus with side-by-side comparison, there is no real sane equivalent on mobile, unless we plug our phone into a large screen, a keyboard and some kind of pointer device.
Not any more. I kept Windows 11 around for gaming but I binned the partition. How they managed to make a 7950X3D/7900XTX feel "clunky" is astounding, given that I live in KDE, which has a reputation for being a "heavy" DE, and yet it feels instantaneously fast in every dimension compared to Windows 11.
Full disclosure: I use KDE almost exclusively.
macOS spoiled me.
(in case anyone needs a reminder of Microsoft's org chart: https://www.globalnerdy.com/2011/07/03/org-charts-of-the-big...)
To be a little glib:
As someone who has worked for a few Big Software Companies, I guarantee that Microsoft's org chart has changed significantly at least once in the last fourteen years.
Re-organizations aren't referred to as "shuffling the deck chairs [on the Titanic]" by the rank and file for no reason, yanno?
But maybe that impression is wrong and they now cooperate better. After all, since some Windows 10 update, Windows Explorer can even create files and folders starting with a dot (which from a kernel, fs and cmd perspective was always valid).
Based on my experience with Blasted Corporate Hellscapes, I find it very unlikely that they cooperate better. Middle-ish management lives to stab each other in the back, belly, and face.
> ...Windows Explorer can even create files and folders starting with a dot...
That's progress! Does Windows Explorer still shit the bed when you ask it to interact with a file whose name contains the '|' character? That's always been valid in NTFS, and I think is valid in at least a subset of the Windows programming interfaces.
Every place I've worked which did not use react had steady pushback from UI/UX to move to react. It took active resistance to not use react, even though it didn't make any sense to use.
As much as I like super snappy and efficient native apps, we just gotta accept that no sane company is going to invest significant resources in something that isn’t used 99%+ of the time. WhatsApp (and the world) is almost exclusively mobile + web.
So it’s either feature lag, or something like this. And these days most users won’t even feel the 1GB waste.
I think we’re not far away from shipping native compiled to Wasm running on Electron in a Docker container inside a full blown VM with the virtualization software bundled in, all compiled once more to Wasm and running in the browser of your washing machine just to display the time. And honestly when things are cheap, who cares.
But for real, the average number of apps people download gets smaller year over year. When the most popular/essential apps take up more RAM, this effect will only be exacerbated. RAM prices have also doubled over the last 3 months and I expect this to hold true for a couple more years.
It depends what metrics are considered. We can't keep transforming the Earth into a wasteland forever just because, within the narrow window it takes as its reference, the system disconnects reward from long-term effects.
Luckily for me, I have the ultimate power, so I can just say "Firefox doesn't support that. I don't use Chrome. Period."
But lately I had to start saying that Safari doesn't support that, so we would lose all iPhones, or that we can start investigating after we have a working solution. God damn React.
The advantage of the web app is that it just works, without installation, so there's no friction there. I'd very much prefer a native app, but the overhead is quite high, no?
And from a manager's point of view it seems wasteful to develop the same feature across multiple platforms. And if you look at the numbers it does, but numbers-driven development has been a huge issue for a long time now. They don't consider performance or memory usage a factor, and perceived performance is "good enough" for a web app.
Happy to learn otherwise, but might be a datapoint on user behaviour (which could also drive corporate choices).
Ever since UX and UIs started to be driven mainly by metrics and numbers, I felt something started going wrong already. Since then (the decades...), I've learned about "McNamara fallacy" which seems to perfectly fit a lot of "modern" software engineering and product management today:
> The McNamara fallacy (also known as the quantitative fallacy) [...] involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.
Let google do it on your behalf.
When a developer/company decides not to implement things in a local and proper way, and instead pushes it out and is done with it regardless of the resources the product uses on users' systems, I mark the company as lazy and cheap, actually.
Shoving the complexity and cost onto users is inconsiderate.
I guess it's because they decided to make the web client first-class, and instead of maintaining a native client for each platform (windows, mac, linux...) they opted to just serialize all non-mobile uses (which probably aren't that important to them to begin with) to web.
No 1 GB or installation needed
Why is the desktop app even a thing?
Pin tab, problem solved?
The ergonomics are significantly worse.
I want most of my browser windows full screen. I don't want my instant messenger full screen. Using it in a browser means I have to have one size, and resizing one changes the other.
The experience of using a native app is far superior.
Sure it's a little quirky at times (eg it closes if the browser restarts for update) and it doesn't have a system tray icon, but aside from that, it behaves like a separate app.
So it doesn't behave like a separate app.
I guess if you're using Microsoft's ill-advised window grouping feature it would work less well (require more clicks), but breaking websites out into entirely separate programs just so we can have separate windows because Microsoft screwed up the window management functionality seems like a very inefficient workaround.
With a native app it's just alt+tab - or, if the app is pinned to the taskbar, Win+(1/2/3/4...)
/s
the desktop app is considerably faster and more responsive.
the desktop app allows OS-level shortcut keys
the desktop app is easier to work with when applying settings in other programs, like excluding it from my VPN, sandboxing it, or isolating its network traffic. Or for looking at how much space it takes up on disk. (I'm not a web developer.) It doesn't cause any confusion or mistakes, as its logical separation in the OS is clear; this is also faster
the desktop app has better keyboard shortcuts that don't collide with your browser, and the same with right-click menus
I can easily video call from various PCs while still not trusting my browser with camera/mic permissions
Meta makes more money than god and there's over a billion WhatsApp users. It's not like this thing is Blender or a AAA game, it's a chat frontend. Maintaining it has to be a rounding error in the budget.
WhatsApp screams antitrust. If you look in the dictionary for antitrust, you see WhatsApp.
225 more comments available on Hacker News