Debugging as Solving a Mystery
What if I told you that fictional mysteries contain practical real-world methodologies?
My day jobs as a software engineer over the years have involved fixing bugs and solving bizarre cases of bad behavior in complex systems. And in that work, I feel like a detective.
Solving programming problems and debugging a program (or any system, such as a robot) often involves reducing the number of unknowns until only one culprit is left.
The situation is like a mystery story: A detective interrogates all the suspects and gathers clues until everybody but one suspect has an alibi. Along the way, various red herrings lead the hero astray. But with sufficient confidence that there must be a logical explanation, the mystery is eventually solved.
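In code, this process of elimination often looks like bisection: hold everything else fixed and repeatedly split the remaining suspects in half. Here is a minimal sketch in Python, assuming a hypothetical test_passes() check and an ordered list of changes (the names are illustrative, in the spirit of git bisect, not any particular tool’s API):

```python
def find_culprit(changes, test_passes):
    """Binary-search an ordered list of changes for the first bad one.

    Assumes the bug was introduced by one change and persists afterward:
    test_passes(change) is True before the culprit and False from it onward.
    """
    low, high = 0, len(changes) - 1
    while low < high:
        mid = (low + high) // 2
        if test_passes(changes[mid]):
            low = mid + 1   # still innocent here; the culprit comes later
        else:
            high = mid      # bug already present; culprit is here or earlier
    return changes[low]     # the one suspect left without an alibi
```

Each iteration gives half the remaining suspects an alibi, so even a long history of changes collapses to a single culprit in a handful of tests.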
Mastermind
In a book called Mastermind: How to Think Like Sherlock Holmes, Maria Konnikova also compares the mental approach of a detective with everyday, non-detective thinking.[1]
[1] Konnikova, M. Mastermind: How to Think Like Sherlock Holmes. Viking, 2013.
But Konnikova has leapt far beyond my own detective analogy by creating a metaphorical framework for mindfulness, motivation, and deduction, all tied to the fictional world of Sherlock Holmes. This framework is a convenient place to investigate cognitive biases as well.
The core components of the metaphor are:
- The Holmes system.
- The Watson system.
- The brain attic.
Both systems are modes of human thinking, and you can probably recall circumstances where you operated in the Watson system and others where, to some degree, you operated in the Holmes system.
Most people are probably more like Watson, who is intelligent but mentally lazy.
The Holmes system is the aspirational, hyper-aware, self-checking system that’s not afraid to take the road less traveled in order to solve the problem.
The brain attic metaphor comes in as a way to organize knowledge purposely instead of haphazardly. The Holmes system actively chooses what to store in its attic, whereas the Watson system just lets memory happen without much management.
Bias
Here’s an excerpt covering one of the many bias-related issues discussed, where the “stick” is the walking stick that the character James Mortimer has left behind:
Hardly has Watson started to describe the stick and already his personal biases are flooding his perception, his own experience and history and views framing his thoughts without his realizing it. The stick is no longer just a stick. It is the stick of the old-fashioned family practitioner, with all the characteristics that follow from that connection.
When I programmed military robots and human-robot interfaces, I often received feedback and problem reports as directly as possible from the field and/or from testers. I encouraged this because it was great from a user experience point of view, but I had to develop filters and Sherlockian methods in order to maintain sanity and actually solve the issues.
Just trying to comprehend what was wrong at all was sometimes a big hurdle. A tester or field service engineer might report a bug in the form of their personal theory, which, like Watson’s, was heavily biased; I then had to extract bits of evidence and form my own theories, which might or might not match theirs.
For example, some people thought the robot could go to “sleep” like their laptop (it couldn’t). Sometimes users thought the robot was overheating in the desert sun, which is a reasonable guess for machines in general, but that was not the problem. A couple of times a field guy reported a robot “doing the funky chicken.”
Or in some cases the people closest to the field reported the issue and data objectively, but by the time the report had passed through various Watsons, irrational assumptions about the cause had been added. Before you can figure out the problem, you have to figure out the real problem description and what data you actually have.
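One habit that helped was to quarantine the theory from the evidence as soon as a report came in. Here’s a minimal sketch in Python; the field names and the example report are hypothetical, just to illustrate the separation:

```python
from dataclasses import dataclass, field

@dataclass
class FieldReport:
    """A bug report with evidence kept separate from interpretation."""
    reporter: str
    observations: list[str]       # raw facts: what happened, when, where
    reporter_theory: str = ""     # the Watson layer: recorded, but quarantined
    data_available: list[str] = field(default_factory=list)  # logs, video, etc.

# Hypothetical example: "the robot went to sleep" is a theory, not an
# observation; the observations are what was actually seen.
report = FieldReport(
    reporter="field service engineer",
    observations=[
        "robot stopped responding to teleop around 14:30",
        "status LED stayed solid green",
    ],
    reporter_theory="robot went to sleep like a laptop",
    data_available=["onboard log, 14:00-15:00"],
)
```

Debugging then starts from the observations and the available data; the reporter’s theory stays visible but doesn’t get to frame the investigation.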
As Konnikova writes:
Holmes, on the other hand, realizes that there is always a step that comes before you begin to work your mind to its full potential. Unlike Watson, he doesn’t begin to observe without quite being aware of it, but rather takes hold of the process from the very beginning—and starting well before the stick itself.
And the walking stick example isn’t just about the removal of bias. It’s also about increased mindfulness.
Emotions
Emotional bias matters because it can determine which observations you are even able to access consciously, let alone remember in an organized way. For instance:
To observe the process in action, let’s revisit that initial encounter in The Sign of Four, when Mary Morstan, the mysterious lady caller, first makes her appearance. Do the two men see Mary in the same light? Not at all. The first thing Watson notices is the lady’s appearance. She is, he remarks, a rather attractive woman. Irrelevant, counters Holmes. “It is of the first importance not to allow your judgment to be biased by personal qualities,” he explains. “A client is to me a mere unit, a factor in a problem. The emotional qualities are antagonistic to clear reasoning…”
Emotions are a very important part of human minds; they evolved because of their benefits. I often talk about emotions and artificial intelligence on this blog. However, in some human contexts, perhaps the ones we’d consider “high level,” the dichotomy of emotion vs. reason becomes an issue. Konnikova says:
It’s not that you won’t experience emotion. Nor are you likely to be able to suspend the impressions that form almost automatically in your mind. But you don’t have to let those impressions get in the way of objective reasoning.
So, may your coding and debugging avoid bias. And remember: you are the hero in a mystery.