I read an interesting article in the New York Times this week about the latest twist in the Volkswagen emissions scandal. The CEO wrote to all employees promising them that they would not face the sack if they came forward with any information about the emissions control cheating programme, but that this ‘amnesty’ would only last until the end of the month. At first sight this all seems hard to believe. Did the Directors really have so little idea about what had been going on that they have to encourage informants to tell them? And are they saying that if workers don’t come forward by the end of the month then they could be sacked for not doing so?
And this got me thinking more deeply about the balance of individual and corporate responsibility for a scandal like this, and the dynamics of relationships within groups of technical and managerial staff that could lead to such a disastrous situation emerging.
I spend a lot of my professional life working across complex organisational systems, helping clients spot emergent, innovative solutions and encourage their development. And in all the situations I’ve seen, positive ideas have emerged from collaboration between people with different perspectives on what had previously been thought to be an intractable problem. But what if that problem has been to sell diesel cars in a US market that seemed biased against them (and in favour of locally produced models)? And what if the solution that emerged from a combination of hardware and software teams passed all the official tests but performed very differently on the road? When should the ethical perspective kick in and say, ‘Stop! This is illegal and damaging to public health’? It’s easy to say that no-one should ever cross such a clear ethical line, even for a minute. But if I’m honest I also run a lot of workshops where I encourage leaders to challenge convention and break their own frame of reference. I’ve heard myself saying – ‘don’t judge someone else’s ideas – build on them and take them further, push the boundaries’. So is that all wrong?
To straighten out my own ethical knots I re-read Dan Ariely’s excellent book ‘The Honest Truth about Dishonesty’ – which I would recommend to any leader who is reflecting on the lessons of the VW scandal for their own organisation. (If you prefer the 10-minute version, see his brilliant RSA Animate presentation.)
In it he puts forward a theory that we all wrestle with a dilemma between:
a) Wanting to view ourselves as honest, honourable people, while at the same time
b) Wanting to benefit where we can from cheating a little bit (especially if we think those around us are all doing the same).
In a series of fascinating experiments he gives groups of students a chance to cheat when self-marking various short tests, and finds that most people over-score themselves a little bit (by 2 marks out of 20 in his initial example). He then tries increasing the financial reward for a high score (to $10 per correct answer) and finds this has no effect on the average level of cheating. He also experiments with increasing the likelihood of getting caught, and finds this has remarkably little impact either. He concludes that “we cheat up to the level that allows us to retain our self-image as reasonably honest individuals”.
So far so obvious perhaps – but he also discusses the ‘what the hell moment’, when a series of small dishonest actions in an individual or a close group spiral up to the point where they say ‘what the hell – we’ve already come so far, let’s just be the bad guys and take this to the limit to maximise our benefits’. And it’s these spirals of deceit that I think lead to the sort of corporate disaster that VW has walked into: a series of small dishonest actions being amplified across a network.
You could say the answer is to have absolute ethical codes and inspection regimes that mean everyone lives with being watched and works in expectation of being caught out if they step out of line. But I think that flies in the face of human nature, and also probably results in squashing out all creativity along the way.
Dan Ariely offers a glimpse of another model. One that the ancient world religions have all developed in one way or another. That of confession. A mechanism that allows anyone to say (in private) that they know they have done something ethically wrong, and they want to confess, to make amends in some way, and also to re-set their own ethical compass. In that way they can start again, restoring their self-image as a reasonably honest individual – and avoid the risks of ratcheting up the ‘what the hell’ spiral.
So the question I am now pondering is whether there are secular equivalents of this process. Confidential reporting systems that allow people to say – perhaps I have overstepped a line, and I want the organisation to know it so it can mitigate the consequences and reduce the chance of it happening again. With small sanctions for small misdemeanours acting as a relief valve to stop these growing into the sort of scandals that can bring down car companies, banks, or even whole economies.
We live in a complex world and that means leaders and Boards need to understand the dynamics of the systems and organisations they govern. It means analysing how game-changing ideas emerge whose impact can grow with explosive speed – for good or for ill. But we also need to learn how to design new safety mechanisms that can encourage people to speak up without fear of retribution, to spot trends and ethical problems early, and act as firebreaks to stop their spread before they put the whole organisation at risk.
So what is your organisation doing to get honest about dishonesty?