I disagree. There are definitely _some_ who will use these tools to build systems for themselves. But do you think the chef who's been pulling insane hours in the restaurant wants to come home and build his own software? Or the teacher who just had to deal with an annoying classroom all day?
People want software that just works, they'll pay for it, they don't want to use their computers to build their own software. That idea is just software and computer geeks (said affectionately) projecting their own desires on a larger community.
Does it have to be mutually exclusive? On-the-fly software does not destroy software. Removing the gatekeeping around software creation does not mean shoving the existing creators out, it just means creating a larger space that others can occupy, like when 'real' programmers had to slowly permit 'script kiddies' into their spaces. All feels a bit 'old guard' vs 'new guard'.
Not mutually exclusive, but I thought your initial post painted an overly rosy picture with the sentence "[..] allowing millions of people to finally use the devices they've been sitting in front of all these years".
I don't think it's happening at this scale. I'll admit I have no real data to back that up, it's just a hunch really. But I find it hard to believe that those people who previously weren't interested in building software are now suddenly interested in building stuff with an LLM. I'm sure _some_ people are doing this, and then they either hit roadblocks and quit or stick with it and learn actual software engineering.
Looking at my non-tech bubble of friends and family, I don't see anyone actually doing that. I think it's a vocal minority that is doing this. That's just anecdata of course.
I think using GPT et al. to create a bespoke tool to do what you need is giving the average home user too much credit. What I see more of is just using the prompt in the place of software to create an outcome. "Transcribe this recording", "give me a synopsis of the Godfather films", "How can I wow my girlfriend?". The fraction of home users who are using this to create software is likely limited to people with no skills trying to make apps to sell, which is building a product, not making a tool to help them with something else. Even the software devs I know are using tools made for them, not making their own Claude Code or Cursor.
Right now, the greenfield is in how you use these tools. Making a bespoke specialized tool for yourself, or automating onboarding or CICD setups with simple commands or building bridges between "gatekept" existing software and agents are ripe for growth.
I get that we should see this as a good thing, but I see it as entering the last act of a play. Thousands of people are doing these things and coming up with uses for the tools around the clock. Novel uses for the technology will all be exhausted in the next couple of years and there will be less room for innovation than there was before LLMs.
We’re not there yet, but that chef or that teacher definitely would want an AI voice assistant as good as the computer in Star Trek. Maybe to achieve that, a language model builds software entirely autonomously and runs it to carry out the user’s command. Or maybe they want the computer to build them software that they can then use to do their own work more efficiently.
> We’re not there yet, but that chef or that teacher definitely would want an AI voice assistant as good as the computer in Star Trek.
Since you brought up Star Trek, a good analogue for AI would be the holodeck. Given the appropriate prompts, it produces amazing scenery and even immersive fantasy narratives.
But occasionally, it goes haywire, the safeties no longer work, and the characters from your fictional adventure try to kill you.