CEOs are hugely expensive. Why not automate them? (2021)
It seems like you'd need some sort of fairly radical control structure (say, no board, just ai interacting directly with shareholders) to get around this. But there's still the relationship with the automation provider...
Could be good, but could also be bad if it turns out the AI is able to be even more ruthless in how it treats its workforce.
The good news is that it doesn't need to be very accurate in order to beat the performance of most execs anyways.
Where "very often" means "almost never?"
Every time the LLM CEO gets caught doing a crime and goes to 'jail', the LLMs on the exec board can vote to replace it with another instance of the same LLM model.
Because building psychopathic AIs is - at the moment - still frowned upon.
Lots of people have legal obligations.
In this case, I assume you're referring to a fiduciary duty (i.e. to act in the best interests of the company), which is typically held not by the CEO but by the directors.
Ultimately the responsibility to assign daily operation of the company rests with the board, both legally and practically, as does the decision to use a human or AI CEO.
More practically, accountability would be placed in the individuals approving the LLM's actions and/or the entity providing the LLM service. The latter aspect is why many AI vendor deals fall through: everything is awesome until the contract arrives for signing and it's revealed that the vendor will accept no liability for anything their product does.
(Surprisingly though, that's enough for them to recognize that you're a human. Their models can identify your complex thought progression in your prompts - no matter how robotic your language is.)
The REAL problem here is the hideous narrative some of these CEOs spin. They swing the LLMs around to convince everyone that they are replaceable, thereby crashing the value of the job market and increasing their own profits. At the same time, they project themselves as some sort of super-intelligent divine beings with special abilities without which the world will not progress, while in reality they maintain an exclusive club of wealthy connections that they guard jealously by ruining the opportunities of others (the proverbial 'burning the ladder behind them'.) They use their PR resources to paint a larger-than-life image that hides the extreme destruction they leave behind in the pursuit of wealth - like hiding a hideous odor with bucketfuls of perfume. These two problems are two sides of the same coin that expose their duplicity and deception.
PS: I have to say that this doesn't apply to all CEOs. There are plenty of skilled CEOs, especially founders, who play a huge role in setting the company up. Here I'm talking about the stereotypical cosmopolitan bunch that comes to mind when we hear that word. The ones who have no qualms about destroying the world for their enjoyment and look down upon normal people as if they're just fodder.
The entire job is almost entirely human to human tasks: sales, networking, leading, etc.
What are people thinking CEOs do all day? The "work" work is done by their subordinates.
So, writing emails?
"Hey, ChatGPT. Write a business strategy for our widget company. Then, draft emails to each department with instructions for implementing that strategy."
There, I just saved you $20 million.
I think that you don't appreciate that charismatic emails are one of the few things that modern AI can do better than humans.
I wouldn't trust ChatGPT to do my math homework, but I would trust it to write a great op-ed piece.
What should we do while we wait for the good ol boys networks to dismantle themselves?
On a more serious note, the meritocracy, freedom, data, etc that big tech libertarians talk about seems to mostly be marketing. When we look at actions instead it's just more bog standard price fixing, insider deals, regulatory capture, bribes and other corruption with a little "create a fake government agency and gut the agencies investigating my companies" thrown in to keep things exciting.
They failed miserably in the Automotive industry in Europe. The only thing that they identified was: "Shit, the profits are falling, do something"
In order to save $20 million with this technique, the first step is to hire a CEO who gets paid $20 million. The second step is to replace the CEO with a bot.
I confess that I have not yet completed the first step.
Although I think it's more likely that we're going to enter an era of fully autonomous corporations, and the position of "CEO" will simply no longer exist except as a machine-to-machine protocol.
You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?
Everyone is indispensable until they aren't.
CEOs are a different class of worker, with a different set of customers, a smaller pool of workers. They operate with a different set of rules than music creation or coding, and they sit at the top of the economy. They will use AI as a tool. Someone will sit at the top of a company. What would you call them?
I can think of plenty, but none that matter.
As the AI stans say, there is nothing special about being human. What is a "CEO?" Just a closed system of inputs and outputs, stimulus and response, encased in wetware. A physical system that like all physical systems can be automated and will be automated in time.
The market they serve is themselves and powerful shareholders. They don't serve finicky consumers that have dozens of low-friction alternatives, like they do in AI slop Youtube videos, or logo generation for their new business.
A human at some point is at the top of the pyramid. Will CEOs be finding the best way to use AI to serve their agenda? They'd be foolish not to. But if you "replace the CEO", then the person below that is effectively the CEO.
That applies to every call to replace jobs with current-gen AI.
I can't name the difference between CEOs and other professions that works out in favor of keeping the CEOs over the rest, though.
And that trust can only be a person who is innately human, because the AI will make decisions which are holistically good and not specifically directed towards the above goals. And if some of the above goals are in conflict, then the CEO will make decisions which benefit the more powerful group because of an innately uncontrollable reward function, which is not true of AI by design.
This sounds a lot like the specious argument that only humans can create "art", despite copious evidence to the contrary.
You know what builds trust? A history of positive results. If AIs perform well in a certain task, then people will trust them to complete it.
> Trust from vendors/customers that someone at the company is trying to make a good product.
I can assure you that I, as a consumer, have absolutely no trust in any CEO that they are trying to make a good product. Their job is to make money, and making a good product is merely a potential side-effect.
Or maybe the person you're describing is right, and CEOs are just like a psy-rock band with a Macbook trying out some tunes hoping they make it big on Spotify.
You might as well ask why people don’t use AI pickup coaches.
And now I think such a company with an AI CEO should also have an AI CTO, COO, etc. Replace the entire upper layer with AI so that there's zero accountability and companies can commit (more blatant) fraud freely
It is good that CEOs also get some of this "You will be replaced by AI!" flak that we hear from CEOs of big tech directed at developers. Do those CEOs think their job is more complex than a software developer job, which they are so eager to replace? How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that as we are putting into trying to replace developers?
In the end neither will work out any time soon, judging by current "AI"'s actual AI level. I think for that we still need some 2-3 architectural leaps forward. And by that I don't mean simply building bigger ANNs and ingesting more data. It already seems like the returns for that are rapidly diminishing.
You can estimate the difficulty of a job by what fraction of the population can successfully do it and how much special training this takes. Both of which are reflected in the supply curve for labor for that job.
> How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that, as we are putting into trying to replace developers?
Pretty sure that (avg developer pay * number of developers) is a lot more than (avg CEO pay * number of CEOs).
Since businesses need to start somewhere/when and most startups fail, I think most people who even get into the role of CEO are doing it successfully. However, this is largely due to circumstances and many factors outside of their control. There are also many CEOs ruining their businesses with bad decisions. It is not certain that an "AI" wouldn't do at least as well as those failing CEOs. Similarly, many developers ruin things they touch, introducing tons of complexity and dependencies and breaking user workflows, or making workflows cumbersome without listening to user feedback, and so on.
In short many people do a bad job and businesses are carried by others, who do a good enough job to make a net positive for the final product. Or consequences of messing up are happening slowly, like a slow user drain, or a user replacement with bad actors until good actors start to leave, or any other possibility.
About the pay argument: Well, these days you still need a good crew of developers to make the shiny AI toys do what you want them to do, so you are not replacing all of the developers, so you can't calculate like that. If we calculate some Silicon Valley CEO making 2 million and a developer making 100k-200k, then we are still at a ratio of 10x-20x. If we manage to make only one CEO obsolete or 2 out of 3 CEOs 1.5x as efficient, we have achieved a cost saving of 10-20 developers! Yay!...
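The back-of-the-envelope arithmetic above can be sketched as follows. The figures are the comment's own hypothetical ones (a $2M Silicon Valley CEO vs. $100k-$200k developers), not real compensation data:

```python
# Hypothetical figures from the comment above, not real data.
CEO_PAY = 2_000_000
DEV_PAY_LOW, DEV_PAY_HIGH = 100_000, 200_000

# Replacing one CEO outright frees the budget of 10-20 developer salaries.
ratio_high = CEO_PAY // DEV_PAY_LOW   # 20x at the low end of dev pay
ratio_low = CEO_PAY // DEV_PAY_HIGH   # 10x at the high end of dev pay

# Alternatively: making 2 out of 3 CEOs 1.5x as "efficient" lets two
# do the work of three, saving one CEO salary per trio.
saved_per_trio = 3 * CEO_PAY - 2 * CEO_PAY

print(f"Replacing one CEO frees {ratio_low}-{ratio_high} developer salaries")
print(f"2-of-3 CEOs at 1.5x efficiency saves ${saved_per_trio:,} per trio")
```

Either way the claimed saving is on the order of 10-20 developer salaries per CEO removed, which is the comment's point.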
You'll easily find people preaching or selling that sort of thing on Twitter, and the sort of people who are still on Twitter are probably buying it.
(Probably mentally unhealthy people, but still it happens!)
My dad had a manager (a VP) whom he privately nicknamed "VPGPT", because despite being a very polite and personable guy he knew pretty much nothing about the engineering he was ostensibly managing, and basically just spoke in truisms that sounded kind of meaningful unless you did any kind of analysis on them.
I'm not saying that AI would necessarily be "better", but I do kind of hate how people who are utterly incapable of anything even approaching "technical" end up being the ones making technical decisions.
I thought you meant "AI-startup CEO" for a moment and was going to agree.
Don't steal this idea it's mine I'm going to sell it for a million dollars.
The movie makes it quite clear, actually.
Did we ever see him interacting with a customer? I don't remember that part and I can't find any clip of it. We see him in many other situations. We know he was not respected and was a weirdo in many ways, but that doesn't say anything about the quality of his customer communication.
Getting thousands of employees to all work towards a common goal is EXTREMELY difficult.
Especially when there is no common goal.
The goal of a CEO is profits. As long as it goes up, everything is ok. As soon as it starts to go down: we have to sack people.
For the soft CEO skills, not so much.
I'd fully trust an AI CEO's decision making, for a predictable business, at least.
I think an AI could be strong at a few skills, if appropriately chosen:
- being gaslightingly polite while firmly telling others no;
- doing a good job of compressing company wide news into short, layperson summaries for investors and the public;
- making PR statements, shareholder calls, etc; and,
- dealing with the deluge of meetings and emails to keep its subordinates rowing in the same direction.
Would it require that we have staff support some of the traditional soft skills? Absolutely. But there’s nothing fundamentally stopping an AI CEO from running the company.
There is no shortage of data a company has at its disposal these days, and a CEO will bias towards what they feel they are best at. We see that with Steve Jobs versus Tim Cook. Tim loves seeing numbers go up and to the right, so that's where the passion is in the company these days. An AI CEO that could not only balance that out but cancel it out could be a real strength.
The human ceo would still be indispensable in setting the company vision, and defining its culture and values; crucial ingredients for execution.
My business experience is that company culture is very important to a company’s success and I’m just doubtful that this can be created through AI.
This is how successful American propaganda is. 39% of people believed something that definitionally could never be true.
So you will find people who make average salaries defending the stratospheric salaries of CEOs because they believe they'll one day be the one benefitting or they've fallen for some sort of propaganda such as the myth of meritocracy or prosperity gospel.
Our entire economy is designed around exploiting working people and extracting all of their wealth to a tiny portion of the population. And we're reaching the point where the bottom 50% (if not more) have nothing left to exploit.
AI and automation could be used to improve all of our lives. It isn't and it won't be. It'll be used to suppress wages and displace workers so this massive wealth transfer can be accelerated.
I get the point of the article. But those with the wealth won't let themselves be replaced by AI, and seemingly the populace will never ask the question of why they can't be replaced until economic conditions deteriorate even further.
It's not that difficult to get into the top 1%. Most Americans earn a top 1% income globally. Even the top 1% of America is only a salary of around $500k. It's possible 19% of survey takers were in the top 1%, or were on a path to make that in the future.
I don't see how it's definitionally untrue to believe you could make $500k a year at some point...Let alone $34,000 a year...
1% of Americans earn a top 1% income. They weren't being asked "do you make more than an amputee kid in Gaza?"
> It's possible 19% of survey takers were in the top 1%…
There's a whole field of math devoted to preventing this. Polling works quite well, all things considered.
> They weren't being asked "do you make more than an amputee kid in Gaza?"
Context matters.
Often posed as a multiple choice question.
I'm not; this is quite well documented.
https://phys.org/news/2024-09-people-underestimate-income.ht...
> Barnabas Szaszi and colleagues conducted four studies to explore how well people understand the wealth held by others. In one study, 990 US residents recruited online were asked to estimate the minimum annual household income thresholds of various percentiles of American earners.
But more relevant is the top 1% of net worth is currently ~$11.6M [1], which is vastly more unattainable.
Also, the net worth of the bottom 99% is skewed by house prices. You might be sitting on a house worth $1M but when every other house also costs $1M and you have to live somewhere, you don't really have a net worth of $1M.
[1]: https://finance.yahoo.com/news/among-wealthiest-heres-net-wo...
I don't know how that particular poll was worded, but in general if you're a politician who rails against the top 1%, you might suffer from the fact that people have widely varying conceptions of who the 1% are.
I swear there’s a joke or cautionary tale here somewhere about “first they came for..” or something along those lines. The phrasing escapes me.
Maybe the problem isn't that you can't automate a CEO, it's that the actual tangible work just isn't worth as much as some companies pay for it, and this thread is touching a few too many raw nerves.
Well, either way it’s hilarious.
It would be an interesting experiment to promote an executive assistant to CEO though.
Also, I think it misses the critical point. C-suite executives operate under immense pressure to deliver abstract business outcomes, but the lack of clear, immediate feedback loops and well-defined success metrics makes their roles resistant to automation. AI needs concrete reward functions that executive decision-making simply doesn't provide.
but as someone who has the honor of working with a really good ceo i can definitely say that you cannot automate them. maybe in some streamlined corporate machine like ibm or something, but not in a living growing company
Whatever the merits of the argument here (and my bolshie side has also flippantly pushed it in the past) the motivation and thrust of the essay needs to be considered in that ideological grounding.
If you've ever worked at a company that's a chaotic shitshow, you'll know how strong the effect of the CEO is - it always comes down to the guy at the top not being up to it.
The leverage of the role is enormous, and the value of someone who can carry out this role well for a large company is sky high - there aren't many such people in the world, and a company only needs one.
So the math all comes out very straightforward: even at obscene looking salaries, they're still a bargain.
It’s been tried before, it didn’t work out well.
An even more interesting one is: What will we reward?
We've been rewarding labor quantity, as well as quality via higher wages - as motivation and as incentives for more education. This reflected the productivity primacy of knowledge work in modern economies, but that might not be the case down the road.
We've also been rewarding capital. Originally this was a way for the elites to keep themselves in place (a.k.a. economic rents), but in modern times it's been more of an entrepreneurial incentive (a.k.a. economic profits.)
Without the economic profit rationale, there's no reason to reward capital accumulation. Only pro-profit decisions are good for society, pro-rent decisions are awful. If there's no profit to incentivize, capitalism is just bad all around.
If AI becomes a better profit decision-maker than an entrepreneur, any humans left in the loop are nothing but high-rollers gambling with everyone else's money.
151 more comments available on Hacker News