And it depends on the person and their experience of chatbots. People were fooled in the 1960s by ELIZA, the chatbot that mostly just rephrased what the user said as a question (e.g. "I'm afraid of flying." "Why are you afraid of flying?"), and they believed it genuinely understood them.
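A minimal sketch of that kind of pattern-swapping rephrasing, in Python; the patterns and templates below are made up for illustration and are not Weizenbaum's actual ELIZA script:

```python
import re

# Illustrative ELIZA-style rules: match a simple statement pattern and
# turn it back on the user as a question.
PATTERNS = [
    (re.compile(r"i'?m (afraid of|worried about) (.+)", re.IGNORECASE),
     "Why are you {0} {1}?"),
    (re.compile(r"i feel (.+)", re.IGNORECASE),
     "Why do you feel {0}?"),
]

def respond(user_input: str) -> str:
    text = user_input.strip().rstrip(".!")
    for pattern, template in PATTERNS:
        match = pattern.match(text)
        if match:
            return template.format(*match.groups())
    # Fallback when nothing matches: a generic prompt to keep the user talking.
    return "Tell me more."

print(respond("I'm afraid of flying."))  # -> "Why are you afraid of flying?"
```

No understanding is involved anywhere in this loop, which is exactly the point: the effect on the user comes from the reflection, not from any model of what the words mean.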