Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. – Brian W. Kernighan
I was rereading In the Beginning... Was the Command Line by Neal Stephenson, and he points out something interestingly obvious.
He wrote “Commercial OSes have to adopt the same official stance towards errors as Communist countries had towards poverty. For doctrinal reasons it was not possible to admit that poverty was a serious problem in Communist countries, because the whole point of Communism was to eradicate poverty.”
It would seem that modern society is almost entirely structured in a way that makes it impossible to admit the possibility of errors.
And I suppose there is a good reason for that – the existence of lawyers.
Where there’s a mistake, there’s blame – and where there’s blame there’s compensation.
And what that situation ends up creating is fear.
Think of this in systems-dynamics terms.
We start by making something – a product, a meal, something entertaining.
We find users, customers – people who will try it and like it or find problems with it.
We can apologise to those customers, thank them for bringing this to our attention and fix the problem – making our product better.
Or we can deny those problems exist, keeping them firmly in place – fixing a few in time, perhaps, but possibly making others worse through all the hiding and denial.
In that case, all we can do is hope the problems go away – quite possibly making our product worse as a whole, especially when it comes to service.
The former approach would seem the better one, but the latter is, unfortunately, the more likely one.
Because of fear – fear of getting in trouble, fear of losing customers, but most of all, fear of losing money.
Deming got this – in his book Out of the Crisis, among the fourteen principles he said were necessary to transform industry was point 8: drive out fear.
Fear, he writes, impairs performance and leads to economic losses.
It seems almost axiomatic, a law of nature, that an organisation that operates on the basis of fear will never admit it is wrong unless forced to.
Which pretty much sums up how the most powerful organisations in society work – including corporations, governments and religious institutions.
So, a necessary precondition for improvement appears to be the ability to be fearless.
Isaac Asimov was once asked how he could resist the political machinations of his university, and he said the answer could be summed up in two words – outside income.
I wrote some time back about three business models that work these days: getting tipped, staying small or having a patron.
You can only be fearless when you’re not worried about money – or the backlash from something going wrong.
And that’s why openness is a good business model.
If you build openness into everything from the start, you teach your customers that you're open about how you do things.
And the fact is that nothing is bug-free – programs aren't, for starters.
And everything we do in the world can be described in the form of a program, an algorithm, a plan, a proposal.
The best-laid plans of mice and men, as the saying goes, often go awry.
The reason why free software works is because it comes without a guarantee – you use it entirely at your own risk.
But, ironically, that lack of a guarantee makes it easier to admit there are bugs and fix them when that’s pointed out.
Now, in many cases, guarantees are a good thing – the extreme example here is airline safety.
Every aircraft incident is investigated, and pilots go through checklists to make sure they’ve not missed anything.
Aircraft are really very safe.
So surely having a legal obligation, and the huge lawsuits that result when airlines get it wrong, leads to better quality?
That would be the wrong conclusion to draw – for one very simple reason.
Pilots want to get it right because they’re in the same plane as you are.
The consequences of things going wrong end just as badly for them as they do for the passengers.
That doesn’t happen with doctors, architects or car salespeople.
What the research shows is that when you look deeply at any profession – take medicine as an example – its literature and practice are rife with errors, with bugs.
The financial crisis of 2008 saw the damage that could be done by a buggy mortgage product sold by people who thought they were very clever.
There is no easy answer to this – but the world is moving in the right direction when it comes to openness.
We’re also, at the same time, moving in the wrong direction as we become increasingly polarised in our views and politics.
But that’s also natural.
As the picture shows, both situations can exist quite happily in the same world.
We need to make choices about where we’d like to spend most of our time.
And that probably starts by keeping things simple and as transparent as possible.