Coinbase CEO Explains Why He Fired Engineers Who Didn't Try AI Immediately
Posted 5 months ago · Active 4 months ago
Source: techcrunch.com · Type: story
Tone: heated, negative · Format: debate · Score: 85/100
Key topics: AI Adoption, Crypto Industry, Management Practices
Coinbase CEO Brian Armstrong fired engineers who didn't immediately try AI coding tools, sparking controversy and criticism about his management style and the company's priorities.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion · First comment: 17m after posting
Peak period: 70 comments in 0-12h · Avg per period: 10.6
Comment distribution: 85 data points (based on 85 loaded comments)
Key moments
1. Story posted: Aug 22, 2025 at 7:12 PM EDT (5 months ago)
2. First comment: Aug 22, 2025 at 7:30 PM EDT (17m after posting)
3. Peak activity: 70 comments in 0-12h, the hottest window of the conversation
4. Latest activity: Aug 28, 2025 at 11:48 PM EDT (4 months ago)
ID: 44991082 · Type: story · Last synced: 11/20/2025, 2:40:40 PM
> meeting on Saturday with everybody who hasn’t done it
Even more toxic
Seriously, why anyone listens to crypto CEOs is beyond me. Modern-day snake oil salesmen.
Management is hard, so I’m generally a little more patient with managerial missteps. But this is a different level of unreasonable. Heck, a lot of developers in the finance world adopt new tools slowly because they’ve worked with compliance departments and it becomes a habit.
I assume "onboard" means something like "set up an account and get it working locally".
I personally ignore or delay things all the time because of actual work I'm doing. If running some random AI tool is more important than "keep the company working," that's a really sick and fragile culture.
Can’t say I’m surprised that a crypto CEO, in an industry totally overflowing with contradictions, is completely unfazed when confronted with yet another contradiction.
I don't think the CEO should know whether or not I've used AI, though, nor do I think it's fair to fire people for it.
I guess I could maybe see a case for catching someone saying "I don't care about trying out new tools" - it's a position held by some of the least productive people I've ever worked with. But there are other reasons why someone might not have picked up new tools yet, like "I'm just trying to get my damn work done" or "I tried this tool but it just seemed distracting".
I fall into those camps all the time wrt new tools.
From my reading of the article, you don’t have to think AI is useful or great to keep your job there, you just have to try the tool out because the CEO said to.
Just because I've found it to be very helpful doesn't mean everyone will.
They would rather go to a Saturday meeting than do the thing their CEO explicitly asked them to do in the very reasonable timeframe they were asked to do it.
> This LLM mandate is a terrible idea. And all I have to do for an opportunity to directly explain to the highest level of management why it's a terrible idea is say I haven't installed Codex yet.
My take on the past 20 years or so is that programmers gained enough market clout to demand a fair amount of agency over things like tooling and working conditions. One of the results is, for instance, that there are no more proprietary programming languages of any importance, that I'm aware of. Even the great Microsoft open-sourced their flagship language, C#.
Non-developers like myself looked to the programming world with a bit of admiration or perhaps even envy. I use programming tools in my job, and would not choose a proprietary tool even if offered. The engineering disciplines that depended on proprietary tooling tend to be lower paying, with less job mobility.
Maybe the tables have turned, and employers have the upper hand once again, so this may all be a moot point, or a period to look back on with fondness.
It's... "uncurious" is the best way I can think of to describe it.
And he also bucks the trend by running a crypto company in the US instead of some random island in the Caribbean, and actually talking to regulators in the hopes of getting regulatory clarity.
1. The idea of them using AI coding tools in a forced way like this. (Meticulous code quality, and perfect understanding of every detail, are critical.)
2. The culture implications of insta-firing someone whose explanation you didn't like, for why they hadn't started using AI tools yet.
3. Scheduling the firing call for a Saturday. Are they in some kind of whip-cracking forced march, and are staff going to be fatigued and stressed and sick and making mistakes?
I'm sorry but there's just no fucking way. Even before AI these crypto coins companies were absolute clown factories. There's no way they ever had it.
I've worked on the triaging side of large corporate bug bounty programmes & trust me when I say that security-by-obscurity is far more impactful in keeping our world (incidentally) secure than any active measure. Absence of exploit does not equal absence of vulnerability.
Sure, code quality is important everywhere, & even more so in finance, but if you're going through this world believing the mean standard across financial tech is high, even before considering the likely rot of coin-brained companies on their engineers' standards, then you need to readjust your skepticism.
On the other hand, the cultural implication of feeling my superiors even have any level of granular interest in monitoring the individual tools I personally use to generate outputs that benefit the company... outside of obvious security/endpoint concerns, there's no world in which that's an environment conducive to quality.
[1]: https://fly.io/blog/youre-all-nuts/
So I would steer the conversation towards ways of using AI for dev-related tasks other than coding: understanding codebases, confirming my intuition/hypotheses. For example, I find LLMs pretty shitty at modifying larger codebases, but it's been helpful to, say, point Codex at the github repo of a large unfamiliar codebase that I had to learn and run my ideas about it by the LLM as I was learning.
Also, you probably don't want to succeed at job interviews with managers that will insist on your using AI in ways you don't like. A job interview is a two way process and all that.
"I'm interested in the technology and have been paying attention to its development, but it's not yet to the point that I believe it will be worth integrating into my workflow."
Though I will say, if you copy and paste that question into ChatGPT, it can give you some options to respond in a diplomatic way ;)
The few people who don’t will be forced out naturally when they can’t keep up.
No. The Butlerian Jihad (https://dune.fandom.com/wiki/Butlerian_Jihad) hasn't happened yet.
I would guess that something similar could exist in the Terminator / Skynet timeline, but I am not aware of the religious beliefs of the humans struggling there.
I suspect I was just laid off for not using any of the AI tools at work. Here's why I didn't.
1) They were typically very low quality. Often just more hosted chatbots (and of course they pick the cheapest hosted models) with bad RAG on a good day.
2) It wasn't clear to me that my boss wasn't able to read correspondence with chatbots the way he could with my other coworkers, which creates a kind of chilling effect. I don't reflexively ask it casual questions the way I do at home.
3) Most of my blockers were administrative, not technical. Not only could AI tools not help me with that but in typical corporate fashion trying to use the few sanctioned tools actually generated more administrative work for me.
Oh well. I'm kind of over corporate employment anyway and moving on to my own thing. Just another insane misfeature of that mode of socialization at that scale.
"You want to use Claude code? Prove to us why copilot cannot do its job.". Wtf. I'm trying to do my job and the admins act as effing roadblocks so they can tick a box showing they gave everybody access. Now my job has suddenly become a judge of llms instead of doing actual work!
> Well, I thought I remembered you saying that you wanted to express yourself
A/V reference, for those inclined: https://www.youtube.com/watch?v=F7SNEdjftno
IMO this was the optimal approach, trying in my own time and not risking damaging the company codebase until it was safe... But I might have been fired, by the sounds of it.
This is not a story about AI.
Go sell stupid somewhere else. We're all stocked up here.
Is he returning a favor for all the goodies that "crypto" is getting from this administration? Like Tether being legitimized in El Salvador by best friend forever Bukele and having its finances and (alleged) USDT backing handled by Lutnick's Cantor Fitzgerald?
I think most people would agree that engineers outright refusing to comply with what was asked of them would be a "not good" reason for not onboarding.
But Brian Armstrong is playing Strong CEO for the podcast circuit. So he can't admit that engineers were let go for potentially justifiable reasons. He has to leave room for speculation. Speculation that maybe some engineers were let go for trivial reasons, because Brian is tough, and tough Brian demands compliant employees.
The people who didn't comply because they were on vacation and then had to go to a Saturday meeting to explain themselves think Brian is something -- but I guarantee it's not that he's tough.
We've all seen this playbook before. This is the incredibly dumb, Idiocracy-emulating world in which we now live.
I wonder how likely it is for CEO roles to get taken over by a sophisticated LLM at this point. I’d wager we’d see a 20x increase in value. I use and value llms in my coding and research workflows already but to fire people for careful and slow adoption speaks very poorly to individual and company maturity.
If I was working at a finance or finance-adjacent company, I'd be very hesitant to use anything that might send data outside the company.
I don't think this is true at all. In fact, there are major tech companies that ban the use of AI when coding, and those folks do their job every day without an LLM.