
> The entire job is almost entirely human to human tasks: sales, networking, leading, etc.

So, writing emails?

"Hey, ChatGPT. Write a business strategy for our widget company. Then, draft emails to each department with instructions for implementing that strategy."

There, I just saved you $20 million.





People seem to have a poor model of what management and many knowledge workers actually do. Much of it isn't completing tasks, but identifying and creating them.

> Much of it isn't completing tasks, but identifying and creating them.

They failed miserably in the automotive industry in Europe. The only thing that they identified was: "Shit, the profits are falling, do something."


"ChatGPT, please identify the tasks that a CEO of this company must do."

I get your point, but if you think that list of critical functions (or the unlisted "good ol boys" style tasks) boils down to some emails, then I think you don't have an appreciation for the work or finesse or charisma required.

> I think you don't have an appreciation for the work or finesse or charisma required.

I think that you don't appreciate that charismatic emails are one of the few things that modern AI can do better than humans.

I wouldn't trust ChatGPT to do my math homework, but I would trust it to write a great op-ed piece.


For some reason the AI prompt "make me 20 million" hasn't been working for me. What am I doing wrong?

Have you got that plan reviewed by your analysts and handed over to implement by your employees? You may be missing those steps...

Automation depends on first getting paid to do something.

We could solve that by replacing all CEOs to remove the issue of finesse and charisma. LLMs can then discuss the actual proposals. (not entirely kidding)

It would actually be nicely self-reinforcing and resistant to a change back, because now it's in the board's interest to use an LLM which cannot be smooth-talked into bad deals. Charisma becomes the negative signal and excludes more and more people.


Why are there "good ol boys" tasks in the first place? Instead, automate the C-suite with AI, get rid of these backroom dealings and exclusive private networks, and participate in a purer free market based on data. Isn't this what all the tech libertarians who are pushing AI are aiming for anyways? Complete automation of their workforces, free markets, etc etc? Makes more sense to cut the fat from the top first, as it's orders of magnitude larger than the fat on the bottom.

A fairer, less corrupt system/market sounds great! I also think that once we solve that tiny problem, the "should AI do CEO jobs" problem gets way easier!

What should we do while we wait for the good ol boys networks to dismantle themselves?

On a more serious note, the meritocracy, freedom, data, etc. that big tech libertarians talk about seems to mostly be marketing. When we look at actions instead, it's just more bog-standard price fixing, insider deals, regulatory capture, bribes, and other corruption, with a little "create a fake government agency and gut the agencies investigating my companies" thrown in to keep things exciting.


> There, I just saved you $20 million.

If it were this easy, you could have done it by now. Have you?


> If it were this easy, you could have done it by now. Have you?

In order to save $20 million with this technique, the first step is to hire a CEO who gets paid $20 million. The second step is to replace the CEO with a bot.

I confess that I have not yet completed the first step.


Have you replaced the executive function in any one of your enterprises with ChatGPT?

I have completely replaced management of every company that I own with ChatGPT.

0 x 0 = 0 I guess?

How have they scaled?

This is literally a caricature of what the average HN engineer thinks a businessperson or CEO does all day; you couldn't write better satire if you tried.

It's mind-boggling. I get riffing on the hyped superiority of CEOs. I've heard inane things said by them. But, being a human being with some experience observing other humans and power structures, I can assuredly say that the tight-knit group of wealthy power-brokers who operate on gut and bullshitting each other (and everyone) will not cede their power to AI, but use it as a tool.

Or maybe the person you're describing is right, and CEOs are just like a psych-rock band with a MacBook trying out some tunes, hoping they make it big on Spotify.


Do you think CEOs have an accurate idea of what engineers do?

Neither side can truly know; that is the nature of a diffuse organization.

That won't stop them from replacing us.

Even if the AI gets infinitely good, the task of guiding it to create software for the use of other humans is called...software engineering. Therefore, SWEs will never go away, because humans do not know what they want, and they never will until they do.

I am sympathetic to your point, but reducing a complex social exchange like that down to "writing emails" wildly underestimates the problem. In any negotiation, it's essential to have an internal model of the other party. If you can't predict reactions, you don't know which actions to take. I am not at all convinced any modern AI would be up to that task. Once one exists, that is, I think we stop being in charge of our little corner of the galaxy.

Artists, musicians, scientists, lawyers and programmers have all argued that the irreducible complexity of their jobs makes automation by AI impossible and all have been proven wrong to some degree. I see no reason why CEOs should be the exception.

Although I think it's more likely that we're going to enter an era of fully autonomous corporations, and the position of "CEO" will simply no longer exist except as a machine-to-machine protocol.


The one big reason why CEOs exist is trust. Trust from the shareholders that someone at the company is trying to achieve gains for them. Trust from vendors/customers that someone at the company is trying to make a good product. Trust from the employees that someone is trying to bring money into the company (even if it doesn't reach them eventually).

And that trust can only be placed in a person who is innately human, because an AI will make decisions which are holistically good rather than specifically directed towards the above goals. And if some of the above goals are in conflict, then the CEO will make decisions which benefit the more powerful group because of an innately uncontrollable reward function, which is not true of AI by design.


> The one big reason why CEOs exist is trust.

This sounds a lot like the specious argument that only humans can create "art", despite copious evidence to the contrary.

You know what builds trust? A history of positive results. If AIs perform well in a certain task, then people will trust them to complete it.

> Trust from vendors/customers that someone at the company is trying to make a good product.

I can assure you that I, as a consumer, have absolutely no trust in any CEO that they are trying to make a good product. Their job is to make money, and making a good product is merely a potential side effect.


I feel like the people who can't comprehend the difficulties of an AI CEO are people who have never been in business sales or high level strategy and negotiating.

You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?


> I feel like the people who can't comprehend the difficulties of an AI <thing doer> are people who have never <tried to do that thing really well>.

That applies to every call to replace jobs with current-gen AI.

But I can't think of a difference between CEOs and other professions that works out in favor of keeping the CEOs over the rest.


You sound like a CEO desperately trying not to get fired.

Everyone is indispensable until they aren't.


This whole thread is delightful. Well done.

Alas, this doesn't answer the question I posed.

CEOs are a different class of worker, with a different set of customers and a smaller pool of workers. They operate under a different set of rules than music creation or coding, and they sit at the top of the economy. They will use AI as a tool. Someone will sit at the top of a company. What would you call them?


>You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?

I can think of plenty, but none that matter.

As the AI stans say, there is nothing special about being human. What is a "CEO?" Just a closed system of inputs and outputs, stimulus and response, encased in wetware. A physical system that like all physical systems can be automated and will be automated in time.


My assertion is that it's a small club of incredibly powerful people operating in a system of very human rules, not well-defined structures like programming or, to a lesser extent, law.

The market they serve is themselves and powerful shareholders. They don't serve finicky consumers that have dozens of low-friction alternatives, the way creators do with AI-slop YouTube videos or logo generation for a new business.

A human at some point is at the top of the pyramid. Will CEOs be finding the best way to use AI to serve their agenda? They'd be foolish not to. But if you "replace the CEO", then the person below that is effectively the CEO.


They've been proven wrong? I'm not sure I've seen an LLM that does anything beyond the most basic rote boilerplate for any of these. I don't think any of these professions have been replaced at all?


