
That will solve the problem of the AI understanding you, but it still won't make me more precise and clear in my instructions. Yes, the AI can hopefully infer the rest, but even then, I can currently type faster than I can speak. Will the AI be able to infer enough of what I want that I can say substantially less and have it still get things right? At that stage, why not go all the way and remove me from the equation completely?


On the typing angle, I'd raise the HN vs. mass-market point. Typing is still a learned skill. As much as it's now assumed in first-world core demographics, there are a huge number of people for whom typing isn't faster than speaking.

And if you've ever watched a grandparent try to use Windows, there are a lot of other metaphors they miss too.

I'm not saying NL recognition is going to be easy, but I am saying that getting there will be far more valuable than the HN crowd might assume from self-reflection.


It depends on the use case. If it's a typical Google Home-style use ("turn on the lights", "look up this thing", "do my accounts"), then I agree, although the AI may need to do a lot of prompting, since people (at least me) often don't really know what we want.

For less casual uses, I'm not sure. I guess there's a spectrum.


That's a good point on the prompting, and something I don't believe many current systems do.

E.g. "Do X" "Would you like me to X1 or X2?"

I'd guess the complexity usually isn't there, as (afaik) current systems tend to map each command to a single action endpoint rather than parsing the request against any deeper knowledge.
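The "Do X" → "Would you like me to X1 or X2?" pattern above can be sketched in a few lines. This is a toy illustration, not how any shipping assistant works: the action list and the naive substring matching are made up for the example. The point is only that once a request can map to more than one action, the natural move is a clarifying follow-up rather than a single hard-wired endpoint.

```python
# Toy "clarify before acting" dispatcher. The actions and the matching
# strategy are hypothetical, chosen only to show the follow-up question.

ACTIONS = [
    "turn on the living room lights",
    "turn on the porch lights",
    "play music",
]

def handle(request: str) -> str:
    # Collect every action the request could plausibly mean
    # (naive substring match, purely for illustration).
    candidates = [a for a in ACTIONS if request in a]
    if len(candidates) == 1:
        return f"OK: {candidates[0]}"      # unambiguous: act directly
    if candidates:
        # Ambiguous: ask the user to pick, instead of guessing.
        return "Would you like me to " + " or ".join(candidates) + "?"
    return "Sorry, I didn't understand that."
```

So `handle("play music")` acts immediately, while `handle("turn on")` matches two light actions and comes back with a question.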


Yeah, I think you're right that current systems don't have the complexity yet. It's either simple tasks (turn on the lights, play music), search (find X for me), or menus. As the complexity of tasks increases, I think it'll become more important.





