Archer6621's comments

That's a nice anecdote, and I agree with the sentiment - skill development comes from practice. It's tempting to see using AI as a free lunch, but it comes with a cost in the form of skill atrophy. I reckon this is even the case when using it as an interactive encyclopedia, where you may lose some skill in searching and aggregating information, but for many people the overall trade-off in terms of time and energy savings is worth it, giving them room to do more or other things.

If the computer was the bicycle for the mind, then perhaps AI is the electric scooter for the mind? Gets you there, but doesn't necessarily help you build healthy habits.

Trade-offs around "room to do more or other things" are an interesting and recurring theme in these conversations. Like two ends of a spectrum: on one end the ideal process-oriented artisan taking the long way to mastery, on the other the trailblazer moving fast and discovering entirely new things.

Comparing to the encyclopedia example: I'm already seeing that my own skillset for researching online has atrophied and become less relevant. Both because searching isn't as helpful anymore and because my muscle memory is shifting toward reaching for the chat window.


It's a servant, in the Claude Code mode of operation.

If you outsource a skill consistently, you will be engaging less with that skill. Depending on the skill, this may be acceptable, or even a desirable trade-off.

For example, using a very fast LLM to interactively make small edits to a program (a few lines at a time), outsources the work of typing, remembering stdlib names and parameter order, etc.

This way of working is more akin to power armor, where you are still continuously directing it, just with each of your intentions manifesting more rapidly (and perhaps with less precision, though it seems perfectly manageable if you keep the edit size small enough).

Whereas "just go build me this thing" and then you make a coffee is qualitatively very different; at that point you're more like a manager than a programmer.


> then perhaps AI is the electric scooter for the mind

I have a whole half-written blog post about how LLMs are the cars of the mind. Massive externalities, has to be forced on people, leads to cognitive/health issues instead of improving cognition and health.


Cars didn't have to be forced on people. They were adopted enthusiastically.


Maybe it was always about where you are going and how fast you can get there? And AI might be a few mph faster than a bicycle, and still accelerating.

I’ve also noticed that I’m less effective at research, but I think it’s our tools becoming less effective over time. Boolean search doesn’t really work anymore, and I’ve noticed that really niche things don’t surface in the search results (on Bing) even when I know the website exists. Just like LLMs seem lazy sometimes, search similarly feels lazy occasionally.

> perhaps AI is the electric scooter for the mind

More like a mobility scooter for the disabled. Literally Wall-E in the making.


This is the typical arrogance of developers not seeing the value in anything but the coding. I've been hands-on for 45 years, but have also spent 25 of those dealing with architecture and larger systems design. The actual programming is by far the simplest part of designing a large system. Outsourcing it is only dumbing you down if you don't spend the time it frees up to move up the value chain.

Talk about arrogance, Mr 45 years of experience. Ever thought that there might be people under the skyscraper that is your ego? I’m pretty sure the majority of tech workers aren’t even 45 years old. Where are they supposed to learn good design when slop takes over? You’ve spent at least 20 years JUST programming, assuming you never touched large-scale design before the last 25 years. Simplest part my ass.

> Ever thought that there might be people under the skyscraper that is your ego?

I do, which is exactly why I found the presumption that not spending your time doing the coding is equivalent to a disability both gross and arrogant.

> Where are they supposed to learn good design when slop takes over?

You're not learning good architecture and systems design from code. You learn good architecture and systems design from doing architecture and systems design. It's a very different discipline.

While knowing how to code can be helpful, and can even be important in narrow niches, it is a very minor part of understanding good architecture.

And, yes, I stand by the claim that coding is by far the simplest part, on the basis of having done both for longer than most developers have been doing either.


> And, yes, I stand by the claim that coding is by far the simplest part, on the basis of having done both for longer than most developers have been doing either.

Doubling down on your ignorance, bold strategy.


Speaking from experience.

"I reckon this is even the case when using it as an interactive encyclopedia".

Yes, that is my experience. I have done some C# projects recently, in a language I am not familiar with. I used the interactive encyclopedia method, "wrote" a decent amount of code myself, but several thousand lines of production code later, I don't think I know C# any better than when I started.

OTOH, it seems that LLMs are very good at compiling pseudocode into C#. And I have always been good at reading code, even in unfamiliar languages, so it all works pretty well.

I think I have always worked in pseudocode inside my head. So with LLMs, I don't need to know any programming languages!


I agree. I also don't think forcing yourself to be an organizer is necessarily a solution to fixing the loneliness, as it requires a certain passion. In my experience, some people love organizing things, others just really hate it. I am in that last camp, after having organized quite a lot. For me, simply participating in things that are organized by others has done me much more good. Of course, that still requires being in a state of mind where you are able to take initiative in signing up for such group activities.

Important point. AKA the halo effect, and it can have a significant influence. In general, I feel that this is a more widespread problem with stories and experiences such as these - there are simply too many "hidden" variables to take them at face value. Environment, genetics, circumstances, upbringing, cognitive biases and instinctual/biological human nature all work together to create a cocktail of unique experiences, leading to unique conclusions.


If anything, it made it more clickbait-y due to being an unusual title.


Yes, it's imperfect that way.


I wonder whether it could be related to some kind of over-fitting, i.e. a prompting style that tends to work better with the older models, but performs worse with the newer ones.


By using AI, you learn how to use AI, not necessarily how to build architecturally sound and maintainable software, so being able to do much more in a limited amount of time will not necessarily make you a more knowledgeable programmer, or at least that knowledge will most likely only be surface-level pattern recognition. It still needs to be combined with hands-on building your own thing, to truly understand the nuts and bolts of such projects.


If you end up with a working project where you understand all the moving parts, I think AI is great for learning, and the ultimate proof of whether the learning was successful is whether you can actually build (and ship) things.

Human teachers are good to have as well, but I remember they were of limited use for me when I was learning programming without AI. So many concepts they tried to teach me without having understood them themselves first. AI would likely have helped me get better answers instead of "because that is how you do it" when I asked why to do something a certain way.

So obviously I would have preferred competent teachers all the time, and would now prefer competent teachers with unlimited time over faulty AIs for the students, but in reality human time is limited and humans are flawed as well. So I don't see the doomsday expectations for the new generation of programmers. The ultimate goal, building something that works to spec, did not change, and horrible unmaintainable code was also shipped 20 years ago.


I don't agree. To me, switching from hand-coded to AI-coded source code is like going from a hand saw to an electric saw for your woodworking projects. In the end you still have to know woodworking, but you experiment much more, so you learn more.

Or maybe it's more like going from analog photography to digital photography. Whatever it is, you get more programming done.

Just like when you go from assembly to C to a memory-managed language like Java. I did some 6502 and 68000 assembly over 35 years ago; now nobody knows assembly.


> to me

Key words there. To you, it's an electric saw because you already know how to program, and that's the other person's point: it doesn't necessarily empower people to build software. You? Yes. Generally though, when you hand the public an electric saw and say "have at it, build stuff", you end up with a lot of lost appendages.

Sadly, in this case the "lost appendages" are going to be man-decades of time spent undoing all the landmines vibecoders are going to plant around the digital commons. Which means AI even fails as a metaphorical "electric saw", because a good electric saw should strike fear into the user by promising mortal damage through misuse. AI has no such misuse deterrent, so people will freely misuse it until consequences swing back wildly, and the blast radius is community-scale.

> more like going from analog photography to digital photography. Whatever it is, you get more programming done.

By volume, the primary outcome of digital photography has been a deluge of pointless photographs to the extent we've had to invent new words to categorize them. "selfies". "sexts". "foodstagramming". Sure, AI will increase the actual programming being done, the same way digital photography gave us more photography art. But much more than that, AI will bring the equivalent of "foodstagramming" but for programs. Kind of like how the Apple App Store brought us some good apps, but at the same time 9 bajillion travel guides and flashlight apps. When you lower the bar you also open the flood gates.


Being able to do it quicker and cheaper will often mean more people learn the basics. Electrical tools open up woodworking to more people, same with digital photography: more people take the effort to learn the basics. There will also be many more people making rubbish, but is that really a problem?

With AI it’s cheap and fast for a professional to ask: what does this rubbish software do, and can you create me a more robust version following these guidelines?


> With AI it’s cheap and fast for a professional to ask: what does this rubbish software do, and can you create me a more robust version following these guidelines?

This falls apart today with sufficiently complex software and also seems to require source availability (or perfect specifications).

One of the things I keep an eye out for in terms of "have LLMs actually cracked large-product complexity yet" (vs human-overseen patches or greenfield demos) is exactly that sort of re-implementation-and-improvement you talk about. Like a greenfield Photoshop substitute.


Your last point is also something that happened when the big game engines such as Unity became free to use. All of a sudden, Steam Greenlight was getting flooded with gems such as "potato peeling simulator" et al. I suppose it is just a natural side effect of making things more accessible.


> Sadly, in this case the "lost appendages" are going to be man-decades of time spent undoing all the landmines vibecoders are going to plant around the digital commons.

Aren't you being overly optimistic that these would even get traction?


Pessimistic, but yeah. It's just my whole life has been a string of the absolute worst ideas being implemented at scale, so I don't see why this would buck the trend.


> By using AI, you learn how to use AI, not necessarily how to build architecturally sound and maintainable software

> will not necessarily make you a more knowledgeable programmer

I think we'd better start separating "building software" from programming, because the act of programming is going to continue to get less and less valuable.

I would argue that programming has been very overvalued for a while, even before AI. And the industry believes its own hype, with a healthy dose of elitism mixed in.

But now AI is removing the facade, and it's showing that the idea and the architecture are actually the important part, not the coding of it.


I find it super ironic that you talk about "the industry believing its own hype" and then continue with a love letter for AI.


Ok. But most developers aren't building AI tech. Instead, they're coding an SPA or CRUD app or something else that's been done 10000 times before, just slightly differently. That's exactly why LLMs are so good at this kind of (programming) work.


I would say most people are dealing with tickets and meetings about the tickets more than they are actually spending time in their editor. It may be similar, but that 1 percent difference needs to be nailed down right, as that's where the business lifeline lies.

Also, not all dev jobs are web tech or AI tech.


I think one difference between a hammer and an LLM is that hammers have existed forever, so common sense about their purpose can be assumed. For LLMs though, people are still discovering on a daily basis to what extent they can usefully apply them, so it's much easier to take such promises made by companies out of context if you are not knowledgeable about LLMs and their limitations.


I wonder whether it subconsciously helps to have some sort of (automatic) expiration timer associated with things that end up in your "learning inbox".

I know from myself that I tend to bookmark and save many interesting talks/videos and articles for later, but often I never end up revisiting them; information hoarding in some sense.


Very nice! I like the feature that lets you make notes skip certain bars; it keeps the UI compact instead of having to show all the bars.

It would be nice if you could place notes while dragging the mouse instead of having to click all the time, and a clear button would also be cool.


Pretty damn cool, would be a nice feature :D

EDIT: Mine looks like some tentacle-headed creature

