Android developer verification: Early access starts
Mood
excited
Sentiment
positive
Category
tech
Key topics
Android development
Google Play Store
Developer verification
Google has started early access to Android developer verification, a new feature aimed at improving the security and trustworthiness of the Google Play Store.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: N/A
Peak period: 154 comments (Day 1)
Avg / period: 53.3
Based on 160 loaded comments
Key moments
- Story posted: 11/13/2025, 12:33:25 AM (6d ago)
- First comment: 11/13/2025, 12:33:25 AM (0s after posting)
- Peak activity: 154 comments in Day 1 (hottest window of the conversation)
- Latest activity: 11/15/2025, 1:47:14 AM (4d ago)
Based on this feedback and our ongoing conversations with the community, we are building a new advanced flow that allows experienced users to accept the risks of installing software that isn't verified. We are designing this flow specifically to resist coercion, ensuring that users aren't tricked into bypassing these safety checks while under pressure from a scammer. It will also include clear warnings to ensure users fully understand the risks involved, but ultimately, it puts the choice in their hands. We are gathering early feedback on the design of this feature now and will share more details in the coming months.
Absolutely not. This is for the user side. But if you're a developer planning to publish your app in an alternative app store or from your own website, you still have to go through the verification flow. Please read the full text.
Still, it seems like good news, so I'll take it.
I'm cautiously optimistic though. I'm generally okay with nanny features as long as there's a way to turn them off and it sounds like that's what this "advanced flow" does.
I highly doubt this is your "top" priority. Or if it is, then you've gotten there by completely ignoring Google account security.
> intercepts the victim's notifications
And who controls these notifications and forces application developers to use a specific service?
> bad actors can spin up new harmful apps instantly.
Like banking applications that use push or SMS for two-factor authentication. You seem to approve those without hesitation. I guess their "top" priority is dependent on the situation.
Protecting their app store revenues from competition exposes them to scrutiny from competition regulators and might be counterproductive.
Many governments are moving towards requiring tech companies to enforce verification of users and to limit access to some types of software and services, or imposing conditions that require software to limit certain features such as end-to-end encryption. Some prominent people in big tech believe very strongly in a surveillance state, and we are seeing a lot of buy-in across the political spectrum, possibly due to industry lobbying efforts. Allowing people to install unapproved software limits the effectiveness of surveillance technologies and the revenues of those selling them. If legal compliance risks are pushing this, then it is a job for voters, not Google, to fix.
Certainly voters need to have their say, but often their message is muffled by the layers of political and administrative material it passes through.
- Just yesterday there was a story on here about how Google found esoteric bugs in FFmpeg and told volunteers to fix them.
- Another classic example of how Google doesn't give a stuff about their users' security is the scam ads they allow on YouTube. Google knows these are scams but doesn't care, because there isn't regulation requiring oversight.
Fixed that for you. Google's public service was both entirely appropriate and highly appreciated.
Not by the maintainers it wasn't, Mr. Google.
I'd highly appreciate it even if the maintainers never did anything with the report, because in that case I would know to stop using ffmpeg on untrusted files.
Again, if YOU highly appreciate their service, that's great, but FFmpeg isn't fixing a codec for a decades-old game studio, so all Google has done is tell cybercriminals how to infect your Rebel Assault 2. I'm glad you find that useful.
See the PoC in the report by Google: the command they run is just `./ffmpeg -i crash.anim -f null /dev/null -loglevel repeat+trace -threads 1`, and the only part of that relevant to being vulnerable is that crash.anim is untrusted.
Edit: And to be clear, it doesn't care about the extension. You can name it kittens.mp4 instead of crash.anim and the vulnerability works the same way.
Of course, I'm not saying we shouldn't push to improve things, but I don't think this is the right reaction either.
How much they spend is no indicator of how and where they spend it, so it is hardly a compelling argument.
> And who controls these notifications and forces application developers to use a specific service?
Am I alone in being alarmed by this? Are they admitting that their app sandboxing is so weak that a malicious app can exfil data from other unaffiliated apps? And they must instead rely on centralized control to disable those apps after the crime? So... what's the point of the sandboxing, if this is just a desktop-level lack of isolation?
Glossing over this "detail" is not confidence-inspiring. Either it's a social engineering attack, in which case an app should have no meaningful advantage over traditional comms like web/email/social media impersonation. Or it's an issue of exploits not being patched properly, in which case it's Google's and/or the vendor's responsibility to push fixes quickly before mass malware distribution.
The only legit point for Google, to me, is apps that require very sensitive privileges, like packet inspection or OS control. You could make an argument that some special apps probably could benefit from verification or special approvals. But every random app?
An app can read the content of notifications if the appropriate permissions are granted, which includes 2FA codes sent by SMS or email. That those are bad ways to provide 2FA codes is its own issue.
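For concreteness, here is a minimal sketch of the notification-access mechanism being described; the class name is illustrative, and a real app would also have to declare the service with the BIND_NOTIFICATION_LISTENER_SERVICE permission and be granted "Notification access" by the user in Settings.

```kotlin
import android.app.Notification
import android.service.notification.NotificationListenerService
import android.service.notification.StatusBarNotification
import android.util.Log

// Illustrative listener: once the user grants notification access,
// Android delivers every posted notification here, including SMS/email 2FA texts.
class NotificationReader : NotificationListenerService() {
    override fun onNotificationPosted(sbn: StatusBarNotification) {
        val extras = sbn.notification.extras
        val title = extras.getCharSequence(Notification.EXTRA_TITLE)
        val text = extras.getCharSequence(Notification.EXTRA_TEXT)
        Log.d("NotificationReader", "${sbn.packageName}: $title / $text")
    }
}
```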
I want that permission to exist. I use KDE Connect to display notifications on my laptop, for example. Despite the name, it's not just for KDE or Linux - there are Windows and Mac versions too.
Do apps generally do this? I've never run into one that doesn't expect me to type in the number sent via SMS or email, rather than grabbing it themselves.
I don't use a lot of apps on my android phone, though, so maybe this is a dumb question to those who do.
Powerful stuff has room for abuse. I don't really think there's much of a way to make that not the case. It's especially true for anything that you grant accessibility-level access to, and "you cannot build accessibility tools" is a terrible trade-off.
(personally I think there's some room for options with taint analysis and allowing "can read notifications = no internet" style rules, but anything capable enough will also be complex enough to be a problem)
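As a rough illustration of the "can read notifications = no internet" idea above, a sketch like the following could flag apps that hold both capabilities today; this is only an assumption about how such a rule might be audited, not an existing Android policy mechanism, and the function name is made up.

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.provider.Settings

// Illustrative audit: list packages that hold notification access *and* request
// the INTERNET permission, the combination such a rule would forbid.
fun appsWithNotificationAccessAndInternet(context: Context): List<String> {
    // ':'-separated ComponentName strings ("package/ListenerClass") of enabled listeners.
    val enabled = Settings.Secure.getString(
        context.contentResolver, "enabled_notification_listeners"
    ) ?: return emptyList()
    val packagesWithAccess = enabled.split(":")
        .map { it.substringBefore("/") }
        .filter { it.isNotBlank() }
        .toSet()
    val pm = context.packageManager
    return packagesWithAccess.filter { pkg ->
        pm.checkPermission(Manifest.permission.INTERNET, pkg) == PackageManager.PERMISSION_GRANTED
    }
}
```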
Google's proposal was to require everyone to verify in order to publish any app through any channel. That would be the equivalent of a web browser enforcing a whitelist of websites because one scam site asked for access to something bad.
If scam apps use an API designed by Google to steal user data, then they should fix that, without throwing the baby out with the bathwater.
It's not news; both iOS and Android sandboxing are Swiss cheese compared to a browser.
People should only install apps from trusted publishers (and not everything from the store is trusted, as the store just does very basic checks).
The buried lede:
> a dedicated account type for students and hobbyists. This will allow you to distribute your creations to a limited number of devices without going through the full verification
So a natural limit on how big a hobby project can get. The example they give, where verification would require scammers to burn an identity to build another app instead of just being able to do a new build whenever an app gets detected as malware, shows that apps with few installs are where the danger is. This measure just doesn't add up.
> We are building a new advanced flow that allows experienced users to accept the risks of installing software that isn't verified
Also, this will kill any impetus that was growing on the Linux phone development side, for better or worse. We get to live in this ecosystem a while longer; let's see if people keep Damocles' sword in mind, and we might see more efforts towards cross-platform builds, for example.
There is also the same thing with L for loss/loser. "that's an L take", "L [person]", "take the L here", etc.
They are pretty straightforward in their meaning, basically what you described. I believe it comes from sports but they are used for any good or bad outcome regardless of whether it was a contest.
This is the first sign we're getting old :) new language features feel new. The language features I picked up in school, that my parents remarked upon, were simply normal to me, not new at all. I notice it pretty strongly nowadays with my grandma, where I keep picking up new terms in Dutch (mainly loan words) but she isn't exposed to them and so I struggle to find what words she knows. Not just new/updated concepts like VR, gender-neutral pronouns, or a new word for messages that are specifically in an online chat, but also old concepts like bias. It's always been there but I'd have no idea what she'd use to describe that concept
We no longer own our devices.
We're in a worse state than we were in before. Google is becoming a dictator like Apple.
If Google wants a walled garden, let it wall off its own devices, but what right does it have to command other manufacturers to bow down as well? At this stage we've got the choice of dictato-potato phone prime, or misc flavour of peasant.
If you want a walled garden, go use Apple. The option is there. We don't need to bring that here.
Wow, this really pulls back the veil. This vendor (Google) is only looking out for numero uno.
The angry social media narratives have been running wild from people who insert their own assumptions into what’s happening.
It’s been fairly clear from the start that this wasn’t the end of sideloading, period. However that doesn’t get as many clicks and shares as writing a headline claiming that Google is taking away your rights.
There may have been exaggerations in some cases, but these hand-wavy responses like "you can still do X but you just can't do Y and Z is now mandatory" or "you can always use Y" are how we got to this situation in the first place.
This is just the next evolution of SafetyNet & the Play Integrity API. Remember how many said to use alternatives. Not saying SafetyNet is bad, but I don't believe their intentions were to stop at just that.
I suspect they mean you have to create an Android developer account and sign the binaries; this new policy just allows you to proceed without completing the identity verification on that account.
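For context, "sign the binaries" typically means a release signing config like the following Gradle Kotlin DSL sketch; the keystore path, alias, and environment-variable names are placeholders, and this assumes the Android Gradle plugin is already applied.

```kotlin
// app/build.gradle.kts: illustrative release signing config (all names are placeholders).
android {
    signingConfigs {
        create("release") {
            storeFile = file("keystore/release.jks")            // placeholder path
            storePassword = System.getenv("KEYSTORE_PASSWORD")  // placeholder env var
            keyAlias = "upload"                                 // placeholder alias
            keyPassword = System.getenv("KEY_PASSWORD")         // placeholder env var
        }
    }
    buildTypes {
        getByName("release") {
            // Release APKs/bundles are signed with the key above.
            signingConfig = signingConfigs.getByName("release")
        }
    }
}
```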
No, until this post, Google had said that it simply wouldn't be possible to install, on your own device, an app from a developer who hadn't been blessed by Google. That is unacceptable. This blog post contains a policy change from Google.
A simple yes/no alert box is not "[...] specifically to resist coercion, ensuring that users aren't tricked into bypassing these safety checks while under pressure from a scammer". In fact, AFAIK we already have exactly that alert box.
No, what they want is something so complicated that no muggle could possibly enable it, either by accident or by being guided on the phone.
Macs block launching apps from unverified devs, but you can override that in Settings. I thought they could just do something along those lines.
Maybe this sounds dark but see also how the net is tightening around phones that allow you to run open firmware after you've bought the hardware for the full and fair price. We're slowly being relegated to crappy hobbyist projects once the last major vendors decide on this as well, and I don't even understand what crime it is I'm being locked out for
We're too small a group for commercial vendors to care. Switching away isn't enough, especially when there's no solidarity, not even among hackers. Anyone who uses Apple phones votes with their wallet for locking down the ability to run software of your choice on hardware of your choice. It's as anti-hacker as you can get but it's fairly popular among the HN audience for some reason
If not even we can agree on this internally, what's a bank going to care about the fifty people in the country that can't use a banking app because they're obstinately using dev tools? What are they gonna do, try to live bankless?
Of course, so long as we can switch away: by all means. But it's not a long-term solution
It seems like a finite solution though. Having a second phone is not something most people will do, so the apps that are relegated to run on such devices will become less popular, less maintained, less and less good
Currently, you can run open software alongside e.g. government verification software. I think it's important to keep that option if somehow possible
Sure, they'll keep building it forever — this is just a delay tactic.
There cannot exist an easy way for a typical non-technical user to install "unverified apps" (whatever that means), because the governments of countries where such scams are widespread will hold Google responsible.
Meanwhile this very fact seems fundamentally unacceptable to many, so there will be no end to this discourse IMO.
Just look at everything they've done to break yt-dlp over and over again. In fact their newest countermeasure is a frontpage story right beside this one: https://news.ycombinator.com/item?id=45898407
Of course I would be much happier if I didn't need to use Shizuku in the first place.
[0]: https://play.google.com/store/apps/details?id=moe.shizuku.pr...
Not only are users not connected to WiFi all the time, but in many developing countries people often have no WiFi at home and rely on mobile data instead. It's a solution, but not a solution for everyone or a solution that works all the time.
I think number of people caring about alternative app stores, F-droid or whatever is very similar to the number of people willing to use adb if necessary, so rather small.
Google is not rolling this out to protect against YouTube ReVanced; it's rolling it out only in a small number of countries. That's an illogical conclusion to draw from the facts.
Also, it's not SIDE loading. It's installing an app.
I'm not on the side of locking people out, but this is a poor argument.
Debian is already sideloaded by the grace of Microsoft's UEFI bootloader keys. Without that key, you could not install anything other than MS Windows.
Hence you don't realize how good of an argument it is, because you even bamboozled yourself without realizing it.
The argument gets worse if we want to discuss Qubes and other distributions that are actually focused on security, e.g. via firejail, hardened kernels or user namespaces to sandbox apps.
This is only true if you use Secure Boot. It is not needed anyway and is insecure, so it should be turned off. Then any OS can be installed.
> It is not needed anyway and is insecure, so it should be turned off.
You know what's even less secure? Having it off.
Oh, you don't use <thing literally named ‘Secure [Verb]’>?? You must not care about being secure, huh???
Dear Microsoft: fuck off; I refuse to seek your permission-via-signing-key to run my own software on my own computer.
Also, Secure Boot is vulnerable to many types of exploits. Having it enabled can be a danger in itself, as it can be used to infect the OS that relies on it.
No one is stopping you from installing your own keys, though?
Turning off UEFI secure boot on a PC to install another "unsecure distribution"
vs.
Unlocking fastboot bootloader on Android to install another "unsecure ROM"
... is the exact same language, which isn't really about security but about absolute control of the device.
The parallels are astounding, given that Microsoft's signing process of binaries also meanwhile depends on WHQL and the Microsoft Store. Unsigned binaries can't be installed unless you "disable security features".
My point is that it has absolutely nothing to do with actual security improvements.
Google could've invested that money instead into building an EDR and called it Android Defender or something. Everyone worried about security would've installed that antivirus. And on top of it, all the fake antiviruses in the Google Play Store (which haven't been removed by Google, by the way) would have no scamming business model anymore either.
> The parallels are astounding, given that Microsoft's signing process of binaries also meanwhile depends on WHQL and the Microsoft Store. Unsigned binaries can't be installed unless you "disable security features".
> My point is that it has absolutely nothing to do with actual security improvements.
I agree. It is the same type of language.
You can only turn off Secure Boot because Microsoft allows it. In the same way Android has its CDD with rules all OEMs must follow (otherwise they won't get Google's apps), Windows has a set of hardware certification requirements (otherwise the OEM won't be able to get Windows pre-installed), and it's these certification requirements that say "it must be possible to disable Secure Boot". A future version of Windows could easily have in its hardware certification requirements "it must not be possible to disable Secure Boot", and all OEMs would be forced to follow it if they wanted Windows.
And that already happened. Some time ago, Microsoft mandated that it must not be possible to disable Secure Boot on ARM-based devices (while keeping the rule that it must be possible to disable it on x86-based devices). I think this rule was changed later, but for ARM-based Windows laptops of that era, it's AFAIK not possible to disable Secure Boot to install an alternate OS.
The linked post is full of fluff and low on detail. Google doesn't seem to have the details themselves; they're continuing with the rollout while still designing the flow that will let experienced users install apps like normal.
But having seen how things work at large companies including Google, I find it less likely for Google's Android team to be allocating resources or making major policy decisions by considering the YouTube team. :-) (Of course if Android happened to make a change that negatively affected YouTube revenue, things may get escalated and the change may get rolled back as in the infamous Chrome-vs-Ads case, but those situations are very rare.) Taking their explanation at face value (their anti-malware team couldn't keep up: bad actors can spin up new harmful apps instantly. It becomes an endless game of whack-a-mole. Verification changes the math by forcing them to use a real identity) seems justified in this case.
My point though was that whatever the ultimate stable equilibrium becomes, it will be one in which the set of apps that the average person can easily install is limited in some way. I think Google's proposed solution here (hobbyists can make apps with only a small number of users, and "experienced users" can opt out of the security measures) is actually a "least bad" compromise, but still not a happy outcome for those who would like a world where anyone can write apps that anyone can install.
One way to achieve this is to only allow sideloading in "developer mode", which could only be activated from the setup / onboarding screen. That way, power users who know they'll want to sideload could still sideload. The rest could enjoy the benefits of an ecosystem where somebody more competent than their 80-year-old nontechnical self can worry about cybersecurity.
Another way to do this would be to enforce a 48-hour cooldown on enabling sideloading, perhaps waived if enabled within 48 hrs of device setup. This would be enough time for most people to literally "cool off" and realize they're being scammed, while not much of an obstacle for power users.
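As a toy sketch only (there is no such platform API today; the function, parameters, and threshold are all hypothetical), the proposed cooldown reduces to a simple policy check:

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical sketch of the 48-hour cooldown idea; none of this is a real Android API.
val COOLDOWN: Duration = Duration.ofHours(48)

fun sideloadingAllowed(
    deviceSetupTime: Instant,
    cooldownStartedAt: Instant?,   // when the user asked to enable sideloading, if ever
    now: Instant = Instant.now(),
): Boolean {
    // Waived if the user opts in within 48 hours of setting up the device.
    if (Duration.between(deviceSetupTime, now) <= COOLDOWN) return true
    // Otherwise the request must have been pending for the full cooldown period.
    return cooldownStartedAt != null && Duration.between(cooldownStartedAt, now) >= COOLDOWN
}
```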
In other words, it's not any quality of Linux other than how niche it is.
An 80-year-old nontechnical person can easily operate machines and devices that are much more complex and easily more dangerous than a smartphone.
And yet we're here pretending that those same people will install apps without even thinking about it.
Careless people are careless, we know that, we don't make them safer by treating everyone else like toddlers with a gun in their hands.
Yea no. Now companies have to supply two phones, one for dev and one for calling. It is hard enough to get one...
Of course that's a side effect Google probably wouldn't be sad about.
That's why a lot of low-end Android devices often have problems playing DRMed content on the Web: their keyboxes got cracked open and leaked widely enough for piracy that they got revoked and downgraded to L3.
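The L1/L3 distinction mentioned here is queryable on-device through the platform MediaDrm API; a minimal sketch, using the well-known public Widevine scheme UUID:

```kotlin
import android.media.MediaDrm
import java.util.UUID

// Widevine's well-known DRM scheme UUID.
val WIDEVINE_UUID: UUID = UUID.fromString("edef8ba9-79d6-4ace-a3c8-27dcd51d21ed")

// Returns "L1" (hardware-backed path) or "L3" (software-only) on most devices;
// revoked or leaked keyboxes are what push a device down to L3.
fun widevineSecurityLevel(): String {
    val drm = MediaDrm(WIDEVINE_UUID)
    return try {
        drm.getPropertyString("securityLevel")
    } finally {
        drm.close()   // use release() on API < 28
    }
}
```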
Naturally that got broken too, and even worse, broken when it's only supported by a minority of devices and content, because the more devices and content it's used for the easier it is to break and the larger the incentive to do it.
If you tried to require that for all content then it would have to be supported by all devices, including the bargain bin e-waste with derelict security, and what do you expect to happen then?
Without opening it up physically, there is no way to make it stop or to get the raw stream before it's displayed.
The real pain in the butt at present is Patreon, because I can't be arsed to write something separate for it. As is, I subscribe to people on Patreon and then never bother watching any of the exclusive content because it's too much work. Some solutions like Ghost (providing an API for donor content access) get part of the way to a solution, but they are not themselves a video host, and I've never seen anyone use it.
That's not real DRM then. Real DRM is sending the content such that it flows down the protected media path (https://en.wikipedia.org/wiki/Protected_Media_Path) or equivalent. Userspace never sees decrypted plaintext content. The programmable part of the GPU never sees decrypted plaintext content. Applying some no-op blur filter would be pointless, since anything doing the blur couldn't see the pixels. It's not something you can work around with clever CSS. To compromise it, you need to do an EoP into the ordinarily non-programmable scanout of the GPU, or find bad cryptography or a side channel that lets you get the private key that can decode the frames. Very hard.
Is this how YT works today? Not on every platform. Could it work this way? Definitely. The only thing stopping them is fear of breaking compatibility with a long tail of legacy devices.
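Whether a given device actually exposes such a hardware-protected decode path can be probed through the platform codec list; a brief sketch, with H.264 chosen purely as an example MIME type:

```kotlin
import android.media.MediaCodecInfo.CodecCapabilities
import android.media.MediaCodecList

// True if any decoder for the given MIME type advertises secure playback, i.e. it can
// decrypt and decode inside the protected path where userspace never sees the frames.
fun hasSecureDecoder(mimeType: String = "video/avc"): Boolean =
    MediaCodecList(MediaCodecList.REGULAR_CODECS).codecInfos
        .filter { !it.isEncoder }
        .any { info ->
            info.supportedTypes.any { it.equals(mimeType, ignoreCase = true) } &&
                info.getCapabilitiesForType(mimeType)
                    .isFeatureSupported(CodecCapabilities.FEATURE_SecurePlayback)
        }
```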
If I'm going to live in a walled garden, it's going to be the fanciest one.
Because the hardware is so constrained, an iPhone lasts forever compared to a similar Android. My two-year-old Pixel is slow now, but I know people completely happy with a five-year-old iPhone. Pause: I checked, and the oldest iPhone that still receives updates is the iPhone 11, which is the exact model I had before going back to Android.
You’re still missing the point the comment is making: In countries where governments are dead set on holding Google accountable for what users do on their phones, it doesn’t matter what you believe to be your natural right. The governments of these countries have made declarations about who is accountable and Google has no intention of leaving the door open for that accountability.
You can do whatever you want with the hardware you buy, but don’t confuse that with forcing another company to give you all of the tools to do anything you want easily.
I’m amazed at how gullible some people are but that’s how it is.
The era of United States companies using common sense United States principles for the whole world is coming to an end.
Of course there are no good options for open hardware, but that is a related but separate problem.
512 more comments available on Hacker News