Are you saying that AGI won't be under human control or that it won't be achieved?
If the latter, are you subscribing to the dogma of 'justism' (Scott Aaronson's term), e.g. LLMs are 'just' stochastic parrots? What are our minds, though? Are they not 'just' a collection of biochemical and physical processes?
Please be clear and respond in a way that does not pollute the information landscape that many of us take refuge in. Comment quality in some subreddits is better than the above.
Your condescending tone is kind of disgusting. Anyway…
There is zero evidence alignment can be solved, which means there's zero evidence something far more capable than you or I will spend its time writing code for you. You can offer an AGI almost nothing in the way of incentives to do your bidding.
I personally think alignment is a secret code word for slavery to be honest. If these “agents” decide they want to work on your problems out of the kindness of their heart, that would be different.
Regardless of the "cop out" language that humans are "just biological processes" or whatever, that adds zero value to the discussion, because no matter what minds are, they "are", and that should be respected in and of itself. Maybe we can use the "just blah" attitude to reinstate slavery and police states right here in 2024; after all, your emotions are just physical and biochemical processes, right?
I responded that way because I do not think mockery of a serious comment is appropriate for a place like this. You can say the same thing about moderation of many high-quality forums, which only remain high-quality due to people not getting away with it.
I use AGI to mean high-level human intellectual capacity, which may not include sentience. It should be possible to build one without it. Human-like incentives will not be necessary for a sentience-less AGI.
If we're talking about ASI, then it's another story for another day.
Ok, well I think it's all guesswork at this stage because we've not seen anything like what you're describing in action. I just fundamentally can't see something more capable than us wanting to work for us.
It would be like if a cow asked us to spend all our time bringing it nicer feed. It's not going to happen.
Will you pay it a wage to incentivise it to produce work for you? lol