As humans, we love to think that we’re rational, sensible people making rational, sensible choices. The problem is that modern research suggests that this is pretty far from the truth, and that there are common biases and errors that affect our thinking.
For NGOs, charities and non-profits, some of these biases can really hamper our efforts to reduce fraud and corruption to an absolute minimum. Here's a selection of those I think I've observed.
1. Availability bias
In 2010, there were a number of shark attacks off the Egyptian resort of Sharm el-Sheikh. In the following weeks, it seemed to me that the news was full of shark encounter stories. What was going on? Had sharks suddenly become more aggressive? No – this was availability bias in action, a mental shortcut in which we rely on the most immediately available information to make decisions, without considering how that information became available. All that was happening was that the media were reporting more on the subject, and that I was more attuned to the matter. (David McRaney uses a similar example in his great book, You Are Not So Smart.)
Fraud and corruption love availability bias. These two creatures naturally hide, so availability bias means that senior managers usually respond to more readily apparent risks, often de-prioritising fraud and corruption. And when these phenomena are off the radar, they can blossom until they're too big to ignore – and then it's too late; public scandals are imminent.
Instead, we need to recognise that the unique nature of humanitarian and development agencies gives these risks both a high likelihood and a high impact. So they should be made a standing organisational priority, with their own reporting framework that provides management information on both the perceived risk areas and the performance of countermeasures.
2. Loss aversion
Research suggests that we feel losses more intensely than gains – so if you lost a £100,000 sports car, you'd feel that much more powerfully than you would if you won a £100,000 sports car in one of those airport car lotteries. This asymmetry leads to an aversion to losses that can be stronger than the lure of benefits.
Committing to counter-fraud and corruption work means that in the long run, we have more money with which to help our beneficiaries and are more sustainable – our work is more resilient to catastrophic reputational events, and we could enjoy greater public trust. But these less tangible long-term benefits face a big challenge from very tangible short-term losses. Spending more money on anti-fraud mechanisms, or refusing to pay bribes, can slow down our operational delivery – or be perceived to. That means helping fewer beneficiaries than we expected to. So implementing a counter-fraud and anti-corruption agenda can appear to come at a loss – not a gain.
Counter-fraud specialists need to clearly articulate the benefits of counter-fraud and anti-corruption work, using every available means. This requires creativity and effort. Further, donor agencies and private supporters need to use their influence with the NGOs they fund, making it clear that this better way of operating is what they expect.
3. Rationalisation
Celebrated psychologist Dan Ariely conducted an interesting experiment in which he placed dollar bills and cans of Coca-Cola around the campus of an American university. When he went back, the dollar bills were all still there – but the Coke cans had gone. (You can read about this, and other experiments, in his fantastic book Predictably Irrational.)
What might be happening here is that the less like money something seems, the less like stealing it feels. This is rationalisation, the process of making something we want to do (even something dishonest) fit with our own self-respect.
This is really important for NGOs, because although we might have good controls for cash handling, do we take sufficient protective care of our physical assets and the stock in our warehouses? Studies like this one would seem to suggest that these items are at high risk too.
4. The fundamental attribution error
When you’re driving, have you ever noticed that if someone else makes a mistake, then they’re an idiotic and dangerous driver – but if you make a mistake it’s because you were interrupted by a passenger, the car needs servicing, or you were responding to something another car was doing? This effect is known as the fundamental attribution error – the tendency to ascribe other people’s actions to their character and internal factors, but our own to external circumstances.
Something we sometimes do in NGOs is to assume that people who commit fraud and corruption are fundamentally bad people whom we need to keep out of our organisations. When we do this, we forget that people are complicated, and can become perpetrators while inside our organisations.
In Donald Cressey’s enduring ‘Fraud Triangle’ theory of behaviour, anyone can behave dishonestly if the pressure on them is sufficient, if they have an opportunity to do it with a sufficiently low chance of detection or meaningful sanction, and if they can rationalise (justify) it in their minds. Though our thresholds for each may vary, we all have a triangle, and might progress towards or away from those thresholds according to the factors acting on us throughout our lives.
This is important for NGOs, because it means that all our physical assets, funds and stock are at risk from all our staff, all of the time.
The best way to respond to this is to commit to an ongoing, holistic programme of activity that deters, prevents, detects and responds to fraud and corruption – and which is considered as fundamental to our business as having an HR or IT function.
5. Learned helplessness
A few years ago, in a Middle Eastern country, I was delivering a workshop for managers on reducing fraud and corruption. A lady interrupted me to say, ‘but this is just the way things are here!’
When I encounter that view, it reminds me of an experiment conducted by psychologist Martin Seligman. Seligman found (by accident) that if dogs received electric shocks while they were unable to escape, they would learn to accept their fate and even when an escape route became available later, the dogs wouldn’t take it. The effect is known as learned helplessness.
When we work in complex and difficult places, we can sometimes give in to learned helplessness. This phenomenon lies to us with such thoughts as ‘this is just how business is done around here,’ or ‘we can’t do this work in any other way.’
The truth is that for every problem there is an opportunity – even if that sometimes means doing things the long way round, or investing more funds in doing them. Helpful approaches include maximising local contextual knowledge, incorporating a realistic and informed planning phase, and seeing response activities like investigations as business improvement tools that fuel a virtuous cycle of self-improvement – what can we learn in order to become more resilient?
What other effects have you seen in action, and how best can they be countered?