ChatGPT is still not a meaningful threat because knowledge workers are paid to do things with the knowledge, not merely know things.
As well, there has been zero progress towards eliminating the need for a knowledgeable human participant to converse with the bot. A chatbot will, by definition, always need a human to push the conversation forward, and that's the real heavy lifting of the whole exercise. I have never seen an example of a chatbot questioning whether the human's query even makes sense, let alone dispensing constructive advice.
ChatGPT replaces use cases where Google was always a bad fit, not knowledge workers.
Every time industrialization, automation, robotics, and now AI makes a great leap in capability, from being able to do Task A to being able to do both Task A and Task B, someone is always there to point out, "Well, it can't do Task C or Task D yet, so it's not a meaningful threat!"
That is certainly the 10,000 ft view, but my experience has been that knowledge work is mostly about achieving consistency and agreement. In this type of work, implementation is absolutely not independent of defining and discovering the requirements. They're basically the same thing.
ChatGPT isn't stable tech. It has been improving dramatically and qualitatively in its short life.
Its interfaces are also still quite primitive, so its current value may be underrated.
I would love to be able to edit an outline while watching a filled-out essay update side-by-side. Being able to tweak the outline, add hints, adjust wording, or take full responsibility for particular sentences or paragraphs in the produced essay would be a massive time saver and productivity multiplier.
Note that documentation tends to be a secondary, not primary, concern of the products it serves. So demand for documentation is not likely to grow dramatically just because documentation becomes vastly cheaper to write.
In other words, a productivity bump like this could eliminate a lot of jobs in just one category of knowledge work.
Add in the ability to use sibling tech to generate lively images, diagrams, figures ...
And translate ...
Note that the outline prompts, and other prompts, used to generate a document this way would be orders of magnitude easier to refactor to reflect future changes than a manually written document would be ...
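To make that outline-driven workflow concrete, here is a minimal sketch, not an existing tool: `Section`, `call_llm`, and `render_essay` are hypothetical names, and `call_llm` is a placeholder for whatever completion API you wire in. The idea is simply that each section regenerates from its outline entry and hints unless the author has taken ownership of it.

```python
# Minimal sketch of an outline-to-essay workflow (hypothetical, not an existing tool).
from dataclasses import dataclass, field


@dataclass
class Section:
    heading: str
    hints: list[str] = field(default_factory=list)   # optional steering notes
    manual_text: str | None = None                    # author-owned text overrides generation


def call_llm(prompt: str) -> str:
    """Placeholder for a real completion API call; wire this to your model of choice."""
    raise NotImplementedError


def render_essay(sections: list[Section]) -> str:
    parts = []
    for s in sections:
        if s.manual_text is not None:
            # Keep author-owned prose exactly as written.
            parts.append(s.manual_text)
        else:
            # Regenerate this section from its outline entry and hints.
            prompt = f"Write a short section titled '{s.heading}'."
            if s.hints:
                prompt += " Incorporate these hints: " + "; ".join(s.hints)
            parts.append(call_llm(prompt))
    return "\n\n".join(parts)
```

Re-running `render_essay` on every outline edit would give the side-by-side behavior described above, and refactoring the document later means editing outline entries and hints rather than prose; author-owned paragraphs survive regeneration untouched.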