Coinbase CEO 'Went Rogue' and Fired Some Employees Who Didn't Adopt AI
Posted 4 months ago · Active 4 months ago
businessinsider.com · News story
Key topics: AI Research, Coinbase, Developer Management
Discussion activity: light. First comment 29m after posting; peak period 2 comments in 0-1h; average 1.3 comments per period.
Key moments:
- Story posted: Aug 26, 2025 at 7:43 AM EDT (4 months ago)
- First comment: Aug 26, 2025 at 8:12 AM EDT (29m after posting)
- Peak activity: 2 comments in the 0-1h window
- Latest activity: Aug 26, 2025 at 1:10 PM EDT
ID: 45025192 · Type: story · Last synced: 11/18/2025, 12:07:51 AM
There have been a few CEOs announcing that 'X% of our code is now written by AI', where X is some totally non-credible number to anyone who has used an LLM for this purpose.
But if you attach consequences to not using AI, engineers will simply give AI more credit for the end product, regardless of whether that credit is deserved.
Being a ruthless, unforgiving parent only makes your children better at hiding things.
Nice idea but I don't think it will work for very long.
This CEO's real objective is a significant cost reduction from the magic "coder in a box". Just giving it credit won't achieve this.
Those who were afforded the opportunity to work for a different CEO may not realize it yet, but things may ultimately turn out in their favor.
At least the interim lying buys him more time to accept reality. If AI starts actually writing 50% of the code, I assume it asymptotically tends towards a codebase that no human beings actually want to work on.
Why not?
He just inadvertently answered his own question as to why people resist using AI.
If AI is clearly not competent enough to trust for the "really important" stuff, what makes him think it is competent for other stuff?
His AI mandate is like forcing a team to hire cheap, incompetent coders with a language barrier --- and then blaming the team for the mistakes and lack of productivity.
Still, a bad coder can "learn" to get better. Can AI do this? In the meantime, "fixing" bad code can be harder than writing good code --- and a lot less fun.
Next up --- let's just mandate the use of AI to fix AI's bad code. Problem solved?