Ban The Debugger (iancartwright.com)
49 points by fogus on Nov 24, 2009 | 24 comments


"Ban the Debugger" was an unfortunate name for this post, because obviously debuggers are useful. Having worked in support, I always opened up a bug when log/console output was insufficient to pin down the problem a customer was having. The notion of (temporarily) banning the debugger seems great. I've seen some logging standards that you could follow, but the output still wouldn't be enough to nail down the cause of an error. If the failure of one test in (A && B && C) results in a single error, how were you supposed to know which of A/B/C caused the problem?

So, yes please, have some failure cases set up ahead of time, send the logs to the developers, and see if they can figure out which failure mode it was without getting any live access to the program. But don't call it "banning their debugger." :-)


If you can manage to ban the debugger, then you should rethink your entire debugging support strategy.

A debugger infrastructure can be a core and critical component within a non-trivial application. This holds whether the integrated debugging is used for classic application debugging, for call tracing and basic performance monitoring, for creating and then processing application dumps on severe errors, or for associated tasks including crash notifications, restarts, and failure-related statistics collection.

You build debugging in. From the onset.

One OS environment I work with can launch the system's debugger entirely under program control. The application can detect an "unexpected" failure and (barring cases of severe corruption) invoke application-specific debugger command sequences that access and display the core application context, generate a debugger-readable dump, and, if appropriate, restart the application.
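A rough analogue of the idea, sketched in Python rather than that OS environment's actual facilities: install an exception hook that dumps the context and then invokes the debugger post-mortem, under program control.

    import pdb
    import sys
    import traceback

    def crash_hook(exc_type, exc_value, exc_tb):
        # On an "unexpected" failure, dump the full context first...
        traceback.print_exception(exc_type, exc_value, exc_tb)
        # ...then invoke the debugger under program control, but only
        # when a human is actually attached to the terminal.
        if sys.stderr.isatty():
            pdb.post_mortem(exc_tb)

    sys.excepthook = crash_hook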

Logs never have enough data, and users (and programmers) don't always find and read logs. Better still, you don't have to switch techniques when you ship code; you're always using your primary tools. Bake debugging right into all non-trivial applications.


There are some types of bugs that are hard to track down without a debugger. If a typical ASP.Net page logs an "Object reference not set to an instance of an object" message, it tells you basically nothing, and you're stuck trying to figure out why it happened with little more information than which page it happened on. The debugger is your friend in that case.

It seems like there are other ways to achieve the same end that the author is seeking.


You've just stated the contrapositive of his argument.

ASP has only gotten away with its bizarro architecture and shitty implementation because it got such a full-fledged debugger for free.


Most of the typically criticized, "fucked up" aspects of ASP.Net (like the page life-cycle event model) have been cleaned up in ASP.Net MVC.

I still want my debugger though.


I agree with cabacon that the title is poorly-chosen. I was expecting to read one of those inane "I have transcended the debugger" posts, but instead found something insightful.

My team tries to fix the problem of log files being insufficient by having debug builds log to the console as well as to the log file. If we have useful log statements in place, debugging a simple problem becomes trivially easy, as we can see the problem unfolding in front of our eyes. This also helps solve the problem of logging too much data, because the more we log, the more console spew we have to deal with.
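As a sketch of the setup (Python here; our actual stack differs, and configure_logging is just an illustrative name):

    import logging

    def configure_logging(debug_build, log_path="app.log"):
        root = logging.getLogger()
        root.setLevel(logging.DEBUG if debug_build else logging.INFO)

        file_handler = logging.FileHandler(log_path)
        file_handler.setFormatter(logging.Formatter(
            "%(asctime)s %(levelname)s %(name)s: %(message)s"))
        root.addHandler(file_handler)

        # Debug builds also mirror everything to the console, so we
        # watch the problem unfold as the program runs.
        if debug_build:
            root.addHandler(logging.StreamHandler())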

I do like the idea of banning the debugger right before a launch, though.


I think a lot of tools can really cripple programmers. One needs the discipline to use such tools only as a convenience, once you already know what's going on under the surface, which is why I generally avoid IDEs, etc.


So you generally avoid using IDEs because you don't yet know what's going on under the surface? ;)

I've been on both sides of that particular fence. On the one hand, learning to use a fancy development tool effectively takes time and effort, and that's time and effort that often can't be used to understand what's going on under the covers. And then you're stuck learning about the inner workings anyway when something fails to build with some kind of vague error message.

But in the long run the automation provided by high-level development tools makes for more productive programming, and the time spent learning to use them is usually paid back handsomely.

My advice would be to do a few projects using little more than command-line tools (and no project templates), but then don't get stuck in a rut and do take the time to learn how to automate things and generally make your life easier with higher-level tools.

With debuggers in particular, there are classes of bugs where logging (or worse, Response.Write/echo/puts debugging) is pretty ineffective: if a value is actually a string where you expected an integer, printing the value to a log file doesn't give you much insight into what's going wrong.
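A tiny illustration of why the log line can't help there:

    value_a = 42
    value_b = "42"

    print(value_a)        # 42
    print(value_b)        # 42 -- indistinguishable in a plain log line

    # Logging the repr (or the type) exposes the mismatch that a
    # debugger's variable view would show you immediately:
    print(repr(value_a))  # 42
    print(repr(value_b))  # '42'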


I have never really understood the fascination with debuggers. A good log file will allow you to triage any error, even if it's not the one you were looking for. It seems to me that debuggers are mostly for spotting failed assumptions and won't find unreproducible or subtle systemic problems.

I consider the desire to use a debugger a bug in my logging and take corrective action.

Also, you can't grep an execution history in a debugger. I really have no idea how you would spot systemic errors without doing so.

Can anyone who lives by debuggers make a case to convince me to fire up the debugger more often?


I don't use debuggers often, but am glad they are available when I actually do.

If a running program is like a lab rat running through a maze in the dark, then using print statements is like placing bells on tripwires in select places in the maze. Using a debugger is like turning on the lights, and watching the rat.


I think this metaphor is somewhat wrong.

Log files are like attaching medical imagers to the maze and keeping a trace of the rat's physical and mental state as it runs the maze. Debuggers are like attaching medical imagers to the rat and being able to see only its immediate state.

Obviously, debuggers are much faster to bootstrap than good logging, and in that category I can see them being a clear win. However, I still see them as providing less information.

PS: That said, there are a few situations where I bust out a debugger too, mostly when I'm too lost to get started.


Debuggers help you in understanding the code.


I strongly believe code that can't be understood by inspection is code that shouldn't be.


The only problem I see in the post's story is that the tech lead didn't create a logging standard and review the code to make sure the team followed it. I don't see why we should blame debuggers.


He's not blaming the debuggers. He's advocating a step in the build process somewhat akin to dogfooding: encourage the developers to use the tools they've written for support to help find those last few bugs.


You're right. As others said, the title doesn't match the text. Still, I don't think this ad hoc solution is the right one. If your logs are important to running your operations (and if you are processing transactions online, they are), they should be included in your test plan. Thankfully, the most useful points to log are easy to instrument nowadays with listeners/interceptors/decorators, and sometimes it is just a configuration setting. I just wish it were easier to automate test assertions based on the absence/presence of messages in the log.
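For what it's worth, the presence half is already doable with stock tools; a minimal sketch with Python's unittest (process_transaction is hypothetical):

    import logging
    import unittest

    def process_transaction(amount):
        # Stand-in for the real transaction code.
        if amount <= 0:
            logging.getLogger("payments").error(
                "rejected: non-positive amount")
            return False
        return True

    class LogAssertionTest(unittest.TestCase):
        def test_rejection_is_logged(self):
            # assertLogs fails the test if no matching record is
            # emitted, covering the "presence" half of the wish.
            with self.assertLogs("payments", level="ERROR") as captured:
                process_transaction(-5)
            self.assertIn("non-positive amount", captured.output[0])

Asserting the absence of messages remains the fiddlier half.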


The point of this article seems to be that debuggers shouldn't be used prior to release, in order to force developers to output to logs that can be used in production.

I would say that developers should be logging anyway, and reading the logs in addition to using their debuggers.


Ok... When I think logging, I usually think "useful logging": log which server-side calls were executed from a webapp, log which logical points were hit and what the logical results were. "In function foo" and "x = foo" are tracing stuff: useless, counter-productive noise. We want that granularity turned on only when we really care about it; maybe even remove those statements and use a debugger. Writing good logging is an art form, much like writing useful comments or clean code.
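A made-up contrast between the two (the names are hypothetical):

    import logging

    log = logging.getLogger("webapp")

    def transfer(src, dst, amount):
        log.debug("entering transfer")  # tracing noise: says nothing

        # Versus logging the logical decision points and outcomes:
        log.info("transfer requested: %s -> %s, amount=%s",
                 src, dst, amount)
        if amount > src.balance:
            log.warning("transfer rejected: insufficient funds in %s",
                        src)
            return False
        log.info("transfer completed: %s -> %s", src, dst)
        return True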


Someone said it more succinctly in 1979:

"The most effective debugging tool is still careful thought, coupled with judiciously placed print statements." -- Brian Kernighan

and followed it up with

"Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?"


If your program cannot tell you where the problem is, it is better to invest your time in better code: add more self-diagnostics, constraints, asserts, and automated test cases.

If you waste your time on the debugger, you will have just the bug fixed and nothing else.

If you invest your time in the code, you will have the bug fixed AND much better code.

Computers can catch bugs much faster than humans; programming was invented for that. Act as if you already have a programmable debugger inside your program. Invest your time in useful tools that you can use to trace and check your program.
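A minimal sketch of what I mean by constraints and asserts (the function is hypothetical):

    def apply_discount(price, rate):
        # The program diagnoses itself, so a failure report points at
        # the broken assumption instead of requiring a debugger session.
        assert price >= 0, f"negative price: {price}"
        assert 0 <= rate <= 1, f"discount rate out of range: {rate}"
        result = price * (1 - rate)
        assert result <= price, "discount increased the price"
        return result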


debuggers help joe blow ASP.NET coder implement his spec faster. it doesn't really matter if he understands what's going on under the surface, nor does it even matter if his code sucks, because he's just a monkey. processes have been implemented so his crap code isn't a project risk.

edit: I work at a place like this.


I think there's a little more to it than that when you are talking about frameworks like ASP.NET. It's simply not plausible or necessarily even possible to log every piece of information you might possibly need to debug a given problem. That said, I am all for people putting more thought into logging and tools that help troubleshoot in production.


for that matter, you ain't taking MY debugger either, because it helps me understand everyone's complicated code faster and better.


Wow, stereotype and generalize much?



