
Because most people use 1% of the functionality of even a debugger like gdb, if they're aware it exists in the first place.

A debugger is something that helps sell an IDE by padding the feature list more than something most people will use much.

I'm not saying it ought to be like that, but in more than two decades of watching developers work, my conclusion is that most developers 1) are unfamiliar with debuggers, 2) if they do know about debuggers, they rarely use them, 3) if they do use them, they rarely use more than the very basics.

I myself fall in category 3. I use gdb, and I'm fine with that because all I tend to use is going up and down the stack, displaying some expressions and occasionally single-stepping.
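For reference, that minimal workflow amounts to only a handful of gdb commands (a sketch of a typical session, not a complete command reference):

```
# after the program stops at a crash or breakpoint:
bt            # backtrace: where am I?
up            # move one frame up the stack
down          # move one frame back down
print expr    # display an expression in the current frame
step          # single-step, descending into calls
next          # single-step, stepping over calls
```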

I've used Visual Studio, and I've used DDD, and frankly I fall back to the above because it's quick, and because it's very rare that spending lots of time in the debugger is as helpful as writing more test cases or spending more time reasoning about the code.

To displace more primitive debugging techniques, a debugger will need to be extremely quick to cut to the core of the problem, and basically help automate what is often a very basic workflow of narrowing things down: "that crash happened because of an incorrect value in x; when did x get that value, and what contributed to it?" followed by "how did y get the value that contributed to the broken value in x?"
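Debuggers do have partial answers to the "when did x get that value" question; a sketch using a gdb watchpoint, plus reverse execution if the run was recorded under rr (this assumes hardware watchpoint support and, for the reverse step, an rr recording):

```
watch x            # hardware watchpoint: stop whenever x is written
continue           # run until x changes; gdb reports old and new value
# under rr (record/replay), you can also work backwards from the crash:
reverse-continue   # execute backwards until the watchpoint fires,
                   # i.e. back to the write that produced the bad value
```

But this still has to be driven by hand, one watchpoint and one variable at a time, which is exactly the manual narrowing-down described above.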

You can provide lots of fancy ways of making those kinds of steps more pleasant, but they do too little to beat having a very minimal UI to deal with.

Instead, debuggers appear to have largely focused on making those hair-pullingly frustrating, long, but exceedingly rare debugging sessions more pleasant.

At least that's my impression: 99.9% of the time, I need something that's a smidgen above a printf; 0.1% of the time I get frustrated enough to consider looking at more advanced tools, but that doesn't happen often enough for the investment to usually be worth it.



A debugger is something that helps sell an IDE by padding the feature list more than something most people will use much.

Because you have the most experience in environments where debuggers can only reach a small fraction of their potential (barely better than printf). Let me assure you that there are very different environments out there, and have been for decades.


I don't see how that would change anything I said. I never claimed that there aren't people who use debuggers more fully; clearly there are. But most people won't. And I'd be very interested to hear about these "very different" environments - I've worked on tiny embedded systems, large distributed systems, desktops, and a wide range of other varied systems, and personally I've never felt compelled to dig into more advanced debugger features. Not that I haven't wanted them - but what I'd like to see is not what debuggers tend to offer.

(What I'd like to see is more context-aware automation of the environment; e.g. debugging a crash? I'd like to see attempts at tracing values backwards, and options for automatically re-trying runs with probes added based on the state of the environment when it crashed; done right, that'd save me time.)
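As a toy illustration of that wished-for "trace the value backwards" automation, here is a hypothetical sketch in Python using sys.settrace; this is not a feature of any existing debugger, and the names trace_variable and buggy are invented for the example:

```python
import sys

def trace_variable(func, name):
    """Run func() while recording every value the local variable
    `name` takes, so that after a crash you can ask "when did it
    get that value?" instead of re-running with printfs. Toy sketch."""
    history = []  # (line_number, value) pairs, recorded on change

    def tracer(frame, event, arg):
        if event == "line" and name in frame.f_locals:
            value = frame.f_locals[name]
            if not history or history[-1][1] != value:
                history.append((frame.f_lineno, value))
        return tracer

    sys.settrace(tracer)
    try:
        func()
    except Exception as exc:
        error = exc
    else:
        error = None
    finally:
        sys.settrace(None)
    return history, error

def buggy():
    x = 10
    x = 0          # the "incorrect value" sneaks in here
    return 1 // x  # crashes with ZeroDivisionError

history, error = trace_variable(buggy, "x")
for lineno, value in history:
    print(f"line {lineno}: x = {value}")
print(f"crashed with: {error!r}")
```

A real implementation would of course live in the debugger and work on compiled code (watchpoints, recorded execution), but the workflow it automates is the same as the manual one described above.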




