What Coronavirus Tells Us About Risk

As I sit down to write this post I’ve just received an email from a weekly design blog I subscribe to.

This edition is titled, alarmingly, ‘Pandemic Prep’.

It begins “We are interrupting our regularly scheduled newsletter format and rhythm to advise our clients and subscribers to prepare for the possible impacts of the coronavirus”.

Now I don’t know about you, but when seeking advice about pandemics I might look to the NHS or the World Health Organisation, but I’m not sure service designers, innovation labs or bloggers would be my go-to source.

At the time of writing COVID-19 has led to approximately 3,000 deaths reported worldwide.

Deaths from regular flu, on the other hand, are somewhere between 291,000 and 646,000 – every year.

Coronavirus is extremely serious and could yet reach pandemic levels – but it is also a good illustration of how we can overestimate personal risk. UPDATE 4/3/20: The virus has killed about 3.4% of confirmed cases globally. The seasonal flu’s fatality rate is below 1%.
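For clarity, the fatality figures quoted above are simple ratios of reported deaths to confirmed cases. A minimal sketch of that arithmetic, using illustrative numbers (the 88,000 confirmed-case figure is an assumption chosen for the example, not an official statistic):

```python
# Case fatality rate (CFR) = confirmed deaths / confirmed cases.
# The inputs below are illustrative only.
def case_fatality_rate(deaths: int, confirmed_cases: int) -> float:
    return deaths / confirmed_cases

cfr = case_fatality_rate(3_000, 88_000)
print(f"{cfr:.1%}")  # prints "3.4%"
```

Note that a CFR computed mid-outbreak is a moving target: confirmed cases undercount mild infections, which tends to inflate the ratio.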

So why are people worrying about receiving post from Asian countries, or whether you can catch the virus from beer, or even choosing not to order food from Chinese takeaways?

According to Dr Ann Bostrom, the mind has its own – entirely non-evidenced – ways of measuring danger. And the coronavirus hits nearly every cognitive trigger we have.

Paul Slovic, a University of Oregon psychologist who helped pioneer modern risk psychology, speaking to The New York Times, helps explain what is going on in our minds here.

When we encounter a potential risk, our brain does a quick search for past experiences with it. If it can easily pull up multiple alarming memories, it concludes the danger is high. However, it often fails to assess whether those memories are truly representative.

“A classic example is airplane crashes. If two happen in quick succession, flying suddenly feels scarier — even if your conscious mind knows that those crashes are a statistical aberration with little bearing on the safety of your next flight. But if you then take a few flights and nothing goes wrong, your brain will most likely start telling you again that flying is safe.”

When it comes to the coronavirus, Dr. Slovic says, it’s as if people are experiencing one report after another of planes crashing.

This week we’ve launched the new Bromford Lab Podcast and in the first edition we interview Vicky Holloway and Mitch Harrington exploring the relationship between risk management and innovation – and our propensity to sometimes see risk in the wrong places.

Many of our organisations, we know, are risk averse and constrain innovation. The culture is superbly designed to repel anything new or mysterious.

There are two main reasons why we overemphasise risk:

We are scared of making mistakes

Failure is rarely promoted or even talked about in organisations. This can breed a culture where there is a fear of failure.

Existing in a culture like this promotes risk aversion: once colleagues are fearful of something, they tend to overestimate the likelihood of things going wrong. Research shows that fearful individuals overestimate the danger associated with their feared objects or situations.

In the same way that my fear of spiders leads me to overestimate a spider’s ability to harm me, an organisation whose biggest fear is negative media attention will tend to overestimate the reputational damage of trying out a new service or project.

Successful innovation however requires us to fail more often, and to get better at how we fail.

Arguably it’s not fear of failure we need to tackle but fear itself. How does fear manifest itself where you work? What are you frightened of and what is it preventing you from doing?

No-one ever gets fired for exaggerating

The second reason organisations can overestimate risk is that there are few negative consequences for estimating risk too highly.

Underestimating the risk of something bad happening has seen organisations go under and many people lose their jobs, but no-one has ever been sacked for over-estimation.

In 2002, the Guardian predicted that the world would face famine in just 10 years, and a few years later the UK Prime Minister went a step further and said we had only 50 days to save the planet.

Arguably these are just well-meaning attempts at highlighting a serious problem, but they also illustrate how hopeless we are at predicting the future. However, a climate of fear is never a good climate for clear-eyed problem definition.

This is why fear of failure should not go unchallenged, as it ultimately becomes debilitating and either stops you innovating or leads you to make bad choices.

As Vicky says in our podcast, we are all risk managers, and generally we do it very well. We manage risk every day in our personal lives and we largely make the right choices.

We need to look for risk in the right places and make intelligent assumptions, constantly challenging ourselves to seek out new experiences and solve problems.

The future requires us to be cautious, yes, but also to be a lot less fearful.


Labcast, the new podcast from Bromford Lab, will feature special guests discussing the innovation and design challenges of our day, the big ideas and the bad ideas.


It’s available now. 

Subscribe on Spotify 

Subscribe on Apple Podcasts

Featured Photo by Hello I’m Nik 🍌 on Unsplash

Is the social sector really getting better at learning from failure?


Guest post with Shirley Ayres, Chris Bolton and Roxanne Persaud

Innovation in the digital sphere can be complex and risky, and there are not sufficient opportunities to share learning from failure. One year on from the Practical Strategies For Learning From Failure workshops, we asked the organising team:

Is the social sector getting better at learning from failure?


I see few examples of organisations openly sharing where things haven’t gone as planned.

There’s a fascinating contradiction in behaviour. Whilst there are virtually no leaders who would argue against the importance of failure as a source of learning, our actions show a preoccupation with sharing success.

This cultural bias towards success runs deep. Projects are largely funded on the basis that they will all work – something that we all know to be entirely false.

Within organisations we still reward and promote based on results and past performance targets – not experimentation and deviation.

Destigmatising failure and capturing the learning will happen by organisations taking overt action rather than through inspirational leadership quotes shared happily on Twitter.

It will take establishing new teams and protocols – breaking the chains of targets with perverse incentives to guarantee “success”.

If the social sector is getting better at learning from failure – it’s doing so behind closed doors.


Acknowledging failure is a risky endeavour which has been unhelpfully constrained by risk management processes.  

The formula used by public sector and business leaders is simple: admit what went wrong, apologise and move on with a promise to learn lessons. Unfortunately, this usually happens in the face of mistakes which are too obvious to ignore. Dealing with this kind of crisis in public is more likely to traumatise a team than support the kind of reflective learning we introduced in #LFFdigital.

Another failure management approach, which seems easier to learn from, is the low-risk ‘fail fast, cheap, small’. The uncomfortable implication is that we can never get things right enough in the first place. Failing fast requires a shift in the way we do things, and threatens our credibility and our own sense of ever being able to do a really good job.

Perhaps we are not seeing an open culture of learning from failure because this is a frightening prospect for even the best social sector organisations.


What is changing is the public’s expectation of more openness, transparency and accountability in the social sector. What may have been hidden in a report a few years ago is much more likely to be subject to scrutiny and comment through social media. Trust in organisations whose stated mission is social good can be seriously damaged when there is a reluctance to admit they have not achieved what they set out to do.

Whilst there seems to be consensus about the need to share learning from failure many organisations are working in a competitive funding environment which does not encourage this to happen.

Funders and commissioners have an important role in sharing the learning and also quantifying the real costs of failure in terms of missed opportunities, bad investments and continuing to make the same mistakes.  

It is time to use the digital tools now available to provide real time data and analysis about whether projects have achieved their anticipated and desired outcomes and impact.


Moving away from a long-standing public service culture that does not treat failure as an inevitable consequence of working in complex situations is a huge shift in human behaviour.

The pervasive ‘social practice’ is generally one of: minimise the risk and ‘don’t talk about it’.

We behave to fit in with social practices in an unconscious, automatic manner. Shifting the burden of dealing with failure off the individual and making it part of the social practice is something you could argue the aircraft industry has achieved.

Over the last 18 months, however, I’ve observed more people talking about failure as something we should recognise and work with. While this is a long way off large-scale behaviour change, it is helpful in changing the ‘meaning’ around failure. If the ‘meaning’ (the unwritten rules, social taboos etc.) starts to shift, there is a chance we can start to change the ‘social practice’ around failure.

Are we getting better at learning from failure?

Please add your comments