None of what GP describes is a hypothetical. Present-day LLMs are excellent editors and translators, and for many people, those were the only two things missing for them to be able to present a good idea convincingly.
Just because we have the tech doesn't mean we are forced to use it. We still have social cues and etiquette shaping what is and isn't appropriate.
In this case, presenting arguments you yourself do not even understand is dishonest, for multiple reasons. And I thought we were past the "thesaurus era" of communication, where we pepper a comment with uncommon words just to sound smarter.
> In this case, presenting arguments you yourself do not even understand is dishonest, for multiple reasons.
I fully agree. However, the original comment was about helping people express an idea in a language they're not proficient in, which seems very different.
> And I thought we were past the "thesaurus era" of communication, where we pepper a comment with uncommon words just to sound smarter.
I wish. Until we are, I can't blame anyone for using tools that level the playing field.
> about helping people express an idea in a language they're not proficient in, which seems very different.
Yes, but I see that as a rare case. Also, consider the mindset of someone learning a language:
You probably often hear "I'm sorry about my grammar, I'm not very good at English" from people whose communication is better than half of your native-speaking peers'. They are putting far more effort into communicating, while the natives take it for granted. That effort shows.
So in the context of an LLM: if they are using it to assist with their communication, they also tend to take more time to review and properly tweak the output instead of posting it wholesale, sloppy leftover prompt text and all. That effort is why I'm more lenient in those situations.