Black swan

Understanding project risk as a whole and understanding the psychology of risk could help better manage black swan events.

I think current project risk management practices fail because they tend to be reductionist and ignore the human side of dealing with risk. One of the biggest consequences of this is that they fail to help us identify and manage what are known as ‘black swan’ events. 

Named by scholar and risk analyst Nassim Nicholas Taleb, a black swan event is a highly improbable occurrence with three characteristics: it is impossible to predict, it carries a massive impact (positive or negative), and its shock value is stunning because no one could have conceived of such an event occurring.

Negative black swan events can destroy an organisation, a project or a whole industry if they aren’t managed well. For example, Levi Strauss once suffered a US$192.5m charge against its earnings to compensate for an IT project that went wrong in disastrously unexpected ways.

The winners are the companies that are proactive and benefit from these unprecedented events. Just think about how Samsung exploited the rise of smartphones, for example.

In terms of project management, exploiting positive black swan events may mean significant cost or time savings, making it more likely that the project will meet its objectives. For example, in April 2014, Chinese company Win Sun was one of the first firms to exploit the rise of additive manufacturing and 3D printing: it made 10 full-size houses using a huge 3D printer in the space of a day.

Using 3D printed components in construction could become a game-changer for design and project management. If it does, companies like Win Sun are likely to reap the rewards while others could lose their competitive advantage.

The importance of dealing with black swan events seems evident to me, yet current project risk management practices fail to cope with these risks. So what’s wrong?

Firstly, project risk management is generally seen as an array of reductionist tools and techniques that supports a structured system. This perspective often ignores the fact that humans have inherent biases, which lead them to conclusions they find satisfying rather than conclusions that are optimal for the project.

This is why I think project risk management should have formal, structured processes to identify and measure the risk attitudes of practitioners so they are aware of their common biases. By categorising people on a scale from risk averse to risk seeking, you can make sure you’re building balanced project teams. And the decisions these teams make should be driven by data.
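To make the idea concrete, here is a minimal sketch of such a categorisation. The 0–10 scale, the thresholds and the definition of a "balanced" team are all hypothetical illustrations, not an established instrument:

```python
# Hypothetical risk-attitude scale: scores from a survey (0 = strongly
# risk averse, 10 = strongly risk seeking). Thresholds are illustrative.

def classify(score: float) -> str:
    """Map a 0-10 risk-attitude score to a coarse category."""
    if score < 4:
        return "risk averse"
    if score <= 6:
        return "risk neutral"
    return "risk seeking"

def is_balanced(scores: list[float]) -> bool:
    """Call a team 'balanced' if it mixes averse and seeking members,
    so neither attitude dominates its decisions."""
    categories = {classify(s) for s in scores}
    return {"risk averse", "risk seeking"} <= categories

# Example: a three-person team with one averse and one seeking member.
team = {"Ana": 2.5, "Ben": 5.0, "Chen": 8.0}
print({name: classify(s) for name, s in team.items()})
print("balanced:", is_balanced(list(team.values())))
```

In practice the scores would come from a validated risk-attitude questionnaire rather than invented numbers, but the principle is the same: measure first, then compose the team.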

The second problem with today’s risk management practices is that reductionist techniques take only a partial view of projects. Decomposing a project into manageable work packages lets project managers cope mentally with complexity, but it overlooks the fact that whole-project risk is more than the sum of individual risks.

I think system dynamics modelling would enable teams to see the bigger picture. This analysis and simulation technique was developed to overcome key issues in complex systems such as the unintended consequences of actions and ripple effects.

If you look at this bigger picture and consider the human elements of risk, you might just be able to cope with the next black swan event.