Hacker News

Within the next 24 months, that's what all engineers will transition to. Healthy or not, that's how it goes. Engineers will move to abstract work like architecture and interfaces, while implementation is handled by AIs, with humans just tweaking when it's not right.


What are you basing this on?

I've played with code generation from AIs; it works sometimes, but it confidently produces bugs and doesn't scale at all.

I think another giant leap is required to get to the point where humans are just tweaking. I'm not saying we won't get there, but what's in the pipeline in the next 24 months that is going to get us there?


We all confidently write bugs and discover them in testing. The same process has been implemented for GPT, for example in GPT Engineer; you can also instruct the Code Interpreter model in GPT Plus to do it, and it works. I see some people are not up to date here.

What I base my guess on: the fact that we already have GPT apps that write apps, and clearly it works fine, and as we like to say, "this is the worst it'll ever be". When people say "it's not very good right now, it produces garbage", I only see people who are not used to the speed of progress right now. Midjourney a year ago produced weird abstract doodles that only looked like images from a distance. Now it produces photorealistic art that's taking jobs. One year.

What we need: bigger context, a constellation of expert models, and scale. You need nothing else. Of course some architectural modifications will emerge on the journey to achieving this, but that's a comparatively trivial constraint to solve; it's just normal software engineering as we've done it for decades, this time for models.


Just because you happen to do a trivial and menial job doesn't mean that every software developer does.


First: the AI isn't that accurate yet. It gets shit wrong. It's almost there, but not yet. Maybe in 24 months... but a lot of people think we've hit a wall.

Second: implementation is harder than design. If AI can do implementation, it can for sure do design. And it is demonstrably true that ChatGPT can do design just as well as, if not better than, implementation.


My observations are literally the opposite of what you said, so YMMV. Design is where the needs of humans interface with the needs of a machine. Models are good at translating basic needs to code. Not so great at knowing or figuring out those needs in a wider context. Yet. They're interns.


I'll take that bet.


I will take the other side of this bet.

I don’t see this happening within the next 24 months.


We already have apps written by non-coders in GPT. But sure, let's wait 24 months.


We also had apps written by "non-coders" in Excel. We had apps written by hackathon participants who hadn't coded at all a week before the hackathon. Writing the code is not the hard part of software development, and never has been.

The various LLMs do very poorly at generating and implementing high-level designs, especially if they need to generate the specs themselves from a bunch of requirements and/or poorly formulated user feedback.


You’re reiterating his argument almost perfectly. He initially said that code writing will be left to the AIs while high-level design will be left to the engineers. That’s exactly what they foresee in the next 24 months, which I also intuitively agree with.


Woooooooooow, a greenfield project developed by one or a couple of people!

That for sure will scale to larger projects!


Don't drink too much of the AI kool-aid.


Do the AIs need those things? I mean, they exist so that humans can understand their 10-million-line code bases. Maybe something else will be needed for AIs to understand their 10-billion-line code bases. Something that we won't even be able to discern.



