
> A standard chatbot's job is to give the child the answer, thus removing the need to learn.

An LLM's job is not to give the child the answer (implying "the answer to some homework/exam question"); it's to answer the question that was asked. That's a huge difference. If you ask it to ask questions instead, it will do so. Over the next 24 hours alone (as of today, December 5th 2025), hundreds of thousands of people will write a prompt that includes exactly that: "ask me questions".

> Learning can still happen, but only if the child forces it themselves.

This is literally what my original comment said, although "forcing" is a needlessly negative framing; rather, "learning can still happen, if the child wants to". See this:

> In e.g. the US, it's a huge net negative because kids probably aren't taught these values and the required discipline. So the overwhelming majority does use it to cheat the learning process.

I never claimed that replacing effort can't be a problem either, just that this downside was never brought up in the context of access to a brilliant tutor, yet is suddenly treated as an impossible-to-overcome issue when it comes to LLMs.
