
I think most software has stagnated because we have hit a wall in user interfaces, and it will stay that way until AR/VR/brain interfaces improve.

I have a few assorted goalposts:

* Where are the AR glasses? Google dropped the ball.

* I need to be able to wave my hand to send things to other people's eyes.

* Most games still can't run at 60 fps, which is 2006-era tech.

* VR headsets are still poor quality.

* Cloud gaming still can't achieve < 1ms input delay. What's up with that? How am I going to post a multi-speedrun where a program randomly switches between game tabs?

* Interfaces in general are slow. There needs to be as little delay as possible between my thoughts/actions and the computer's response.

* Why can't an iPad simulate the sensation of texture? Tracing lines in sand with my finger on my phone was a cool webgl tech demo 10 years ago. Now I want to feel the sand.

* Self-driving cars still don't work.



>Cloud gaming still can't achieve < 1ms input delay. What's up with that?

The laws of physics are a hard pill to swallow. Unless you want a datacenter at every street corner, this is not gonna happen.
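The physics point can be made concrete with a back-of-the-envelope calculation. The figures here are assumptions: light in optical fiber travels at roughly two-thirds of c, about 200 km per millisecond, and everything else (encoding, routing, display) is ignored, which only makes the real bound worse.

```python
# How far away can a cloud-gaming server be if the total
# round-trip budget is 1 ms? Propagation delay only.

C_FIBER_KM_PER_MS = 200.0  # light in fiber: ~200 km per ms (~2/3 c)

def max_server_distance_km(rtt_budget_ms: float) -> float:
    """One-way distance allowed by pure fiber propagation delay."""
    return (rtt_budget_ms / 2) * C_FIBER_KM_PER_MS

print(max_server_distance_km(1.0))   # 100.0 km -- before any processing at all
print(max_server_distance_km(16.7))  # 1670.0 km -- one 60 fps frame of budget
```

So a sub-1ms round trip caps the server at ~100 km away even with zero processing time, which is why "a datacenter at every street corner" is not an exaggeration.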


Much of this could be solved quickly if the Market Opportunity were made clear.

$$$$



