Not really, sometimes it's just plausible lies. We distort the world, but the distortion respects some basic rules, which keeps it believable. Another difference from LLMs is that we can store this distortion and rely on it as $TRUTH.
And we can distort quite far (see cartoons in drawing, dubstep in music, ...).
There's no reason to think an LLM (a few generations down the line, if not now) couldn't do the same.