Hoping that AI can create a pleasant world seems roughly as reasonable as hoping that humans can evolve, however gradually, until they can create a pleasant world for themselves. Doesn't it?
In the meantime, I'll cast my lot with creating a pleasant inner world.