In this video I cover about 25 insights into various ideas and evolving concepts in risk management. From human factors to global catastrophes, this fast-paced presentation is a snapshot of some of the best ideas I've come across in risk management. The video is a whirlwind sample of ideas, so I've also put a collection of links at https://juliantalbot.fyi.to/5insights where you can find the references and further reading for each of the ideas. Enjoy.
Daniel Kahneman was once described by Harvard psychologist Daniel Gilbert as "... the most distinguished living psychologist in the world ..." So it's worth paying attention when Kahneman made the above statement in answer to a question from a colleague: "What scientific concept would improve everybody's cognitive toolkit?"
The concept is simple and obvious. But we often forget that by attempting to manage risk, we are attempting to shape, or even invent, the future of our choosing. In his acclaimed book, Against The Gods, Peter Bernstein gives us an insight into the idea that by taking our future into our own hands, humanity made a fundamental change. In many early cultures, our fate was in the hands of the gods. Today, we defy the gods in large and small ways by attempting to choose our own destiny. The audacity is impressive. Whatever will humans think of next? More to the point, we created the discipline of 'risk management' as a way of managing our future. But what is the next step in this process of shaping our reality?
This link will take you to my presentation "Five Insights In Risk Management" which I gave at RISK AWARENESS WEEK 2020. All the other links on this page are the supporting references or recommended further reading in support of the presentation.
It's tempting to believe that other people know more than you, especially when they are very confident. But bear in mind the case study of Long-Term Capital Management: a classic case of Nobel Prize winners and teams of PhDs who achieved 40% per annum for five years in the markets... and then almost bankrupted the entire financial system. Just because someone is smart, or sounds smart, doesn't mean they're right. The title of the book 'Only the Paranoid Survive' by Andy Grove, former CEO of Intel, sums it up. And last but not least, don't fall for someone simply because they are very confident. Remember the Dunning-Kruger effect. It is a type of cognitive bias in which people believe that they are smarter and more capable than they really are. Worst of all, low-ability people don't possess the skills to recognize their own incompetence. The combination of poor self-awareness and low cognitive ability leads them to overestimate their own capabilities, and to be completely confident yet completely wrong.
This book is about luck–or more precisely, about how we perceive and deal with luck in life and business. Set against the backdrop of the most conspicuous forum in which luck is mistaken for skill–the world of trading–Fooled by Randomness provides captivating insight into one of the least understood factors in all our lives. Writing in an entertaining narrative style, the author tackles major intellectual issues related to the underestimation of the influence of happenstance on our lives.
There is one school of thought that says it takes five years to change the culture of an organization. Another says you can do it in a weekend. Yet another says that changing the leadership will change the culture. All these ideas have their merits and their proponents. But the surest way I've seen to change culture in order to improve performance is to train people. With new skills, people have more options for dealing with situations in the workplace. This means they can and do apply a wider range of behaviors. When people behave a certain way, cognitive dissonance steps in to say, "if I'm behaving this way, it must be for good reason, otherwise I wouldn't do it." As a result, people's attitudes change. And when enough attitudes change in alignment, you have a cultural change. If you do this right (starting with the end in mind), you see performance benefits. Try it. You'll see.
We need a new word for risk. At least if it is to include positive outcomes. ISO 31000 will tell you that risk is 'the effect of uncertainty on objectives'. This is a lovely, succinct definition that includes the possibility of both positive and negative outcomes. But the rest of the world sees it as all negative. Most people and dictionaries view risk as negative, or some variation of the following:
• a situation involving exposure to danger: flouting the law was too much of a risk
• the possibility that something unpleasant or unwelcome will happen
• a person or thing regarded as a threat or likely source of danger
• a possibility of harm or damage against which something is insured
• a person or thing regarded as likely to turn out well or badly, as specified
• the possibility of financial loss
Maybe we need to start thinking about 'Uncertainty Management' or 'Possibility Management' if we want to include positive outcomes. Or a completely new word.
Feel free to drop me a line via the contact page at www.juliantalbot.com if you have any questions, suggestions, or a particular challenge. You can download various business and risk management templates from my website. You can also find a lot of graphics which are yours to use at www.srmam.com
This book is one of the best-expressed and most thoroughly researched overviews of intuition, what it is, and how to use it, that I've come across in all my reading. The four steps of intuition (according to J Talbot) are:
1. Learn how to listen
2. Recognise the intuition from the background chatter
3. Trust that it is right
4. Take action accordingly
Leave any one of those steps out and you may as well not have bothered. Gigerenzer goes into a lot more detail about biases, heuristics, and (my favorite) 'fast and frugal' trees.
Researchers have recently identified genetic predictors of sensation-seeking that have been linked to risky and impulsive behaviors. We examine the implications of these genetic polymorphisms for economic behavior. Subsequent research suggests that this effect is not so clear cut, so take this with a grain of salt for the moment. In practical terms, however, we know that some people are prone to taking more risks. The concept of risk homeostasis suggests that most of us have a 'risk set point' where we are most comfortable. DRD4 research suggests that there is a genetic component to risk-taking. People with certain configurations of DRD4 need more dopamine than other people in order to reach a set point where they feel 'happy' or comfortable. Other research shows that learning will trigger the release of dopamine in most people. Hence, you can, in theory, reduce risk-taking among construction workers, financiers, or any individuals by exposing them to constant, or at least regular, training.
I often find people have varying ideas about 'enterprise risk management', and many managers think it means identifying and treating every risk across an organization. While that sounds nice in theory, the reality of ERM is that we need to understand the organization as a whole. The idea is to understand and manage the organization's aggregate risk exposure, rather than every individual risk.
The probability of an event can be expressed as a number from 0 to 1, or as a percentage. But if/when that event occurs, can we really think of a risk as having a single, certain consequence? I'd suggest that even the most basic level of analysis will imply some sort of probability distribution in terms of what happens. In this diagram, for example, I've mapped a theoretical range of consequences for an organization where losing 100% of net worth is what they consider an existential threat. Risk A has a 60% likelihood of happening and will most likely take 70% of their net worth. But, if it happens, it will cost at least 40%, and has a slight chance of being an existential threat. In reality the curve might extend beyond 1.0 in terms of consequence, but by then it's a moot point.
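The Risk A numbers above can be turned into a quick Monte Carlo sketch. This is only an illustration: I've assumed a triangular distribution for the consequence (minimum 40%, mode 70%, with a small tail past 100%), which is one of many shapes the real curve could take.

```python
import random

# Hypothetical parameters for "Risk A" (assumptions, not data):
# 60% chance the event occurs; if it does, the loss (as a fraction of
# net worth) is modelled as triangular: at least 0.4, most likely 0.7,
# with a small tail past 1.0 (the existential threshold).
P_EVENT = 0.60
LOSS_MIN, LOSS_MODE, LOSS_MAX = 0.40, 0.70, 1.10

def simulate_annual_loss(trials=100_000, seed=1):
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        if rng.random() < P_EVENT:
            losses.append(rng.triangular(LOSS_MIN, LOSS_MAX, LOSS_MODE))
        else:
            losses.append(0.0)  # event didn't happen this year
    expected_loss = sum(losses) / trials
    p_existential = sum(1 for x in losses if x >= 1.0) / trials
    return expected_loss, p_existential

expected, p_ruin = simulate_annual_loss()
print(f"Expected loss: {expected:.2f} of net worth")
print(f"P(existential outcome): {p_ruin:.3f}")
```

Even this toy model makes the point: a "60% likely, 70% loss" risk is really a whole distribution, including a small but non-zero probability of an existential outcome.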
I'm constantly on the lookout for my blind spots. The assumptions that I've made but haven't seen as yet. The things that others can see about me that I haven't seen. Or sometimes the completely invisible. It's a constant journey of discovery and epiphanies that make life better incrementally and sometimes exponentially. This is the best single book I've found on the topic and I commend it to all. If it were up to me, I'd make it mandatory for every Homo sapiens. Enjoy. Oh, and it's not just my opinion. Thinking, Fast and Slow is a major New York Times bestseller with over two million copies sold. It was selected by the New York Times Book Review as one of the ten best books of 2011 and by The Wall Street Journal as one of the best nonfiction books of 2011, and its author, Daniel Kahneman, received the Presidential Medal of Freedom in 2013.
Our experience with COVID19, though tragic and challenging, is something we will come to be grateful for. In 1999, Australia helped East Timor achieve independence from Indonesia. Some Indonesians took umbrage at this and shot up the Australian Embassy in Jakarta. Australia responded by fitting bulletproof glass. When terrorists attacked the embassy in 2004 with a car bomb, the ballistic glass, although not designed to be blast resistant, saved countless lives. An unexpected benefit of preparations for a different risk. Inevitably, at some point, we will meet a pandemic more deadly than anything we've seen before. The experience, strategies, and public awareness of how to manage pandemics that we are learning now will save countless millions of lives in future pandemics. COVID19 isn't just a wake-up call. Not even a warning against complacency. COVID19 is the 'training exercise' that we needed, or at least will need in the future. I'll write a more detailed article one day, but in the meantime, this link will take you to some background on the Australian embassy bombing, which took place on 9 September 2004 in Jakarta, Indonesia.
"Expect the unexpected" is a popular mantra for a reason: it's rooted in experience. Since the dawn of civilization, organizations have been rocked by natural disasters, civil unrest, international conflict, and other unexpected events. Why is it that we regularly face challenges such as bank failures, intelligence failures, quality failures, and other organizational misfortunes, often sparked by organizational actions? This critical book focuses on why some organizations are better able to sustain high performance in the face of unanticipated change. High reliability organizations (HROs), including commercial aviation, emergency rooms, aircraft carrier flight operations, and firefighting units, are looked to as models of exceptional organizational preparedness. This essential text explains the development of unexpected events and guides you in improving your organization for more reliable performance.
The objective of this tool is to aid discussion and provide an initial categorization of risks into four groups:
BUSINESS AS USUAL (BAU): Risks that are unlikely to occur and will probably have only minor consequences if they do. Examples include fraud, common burglary, vandalism, and shoplifting. Some analysis and ongoing monitoring are usually sufficient for these risks.
In his 2001 book 'Fooled by Randomness', Nassim Nicholas Taleb introduced the theory of Black Swan events. He then turned that into the highly successful book 'The Black Swan', which is where the idea really caught the public imagination.
In this seminal work, academic and practitioner Robert Cialdini builds on the work of his iconic book 'Influence' to explore how and why we are so easily influenced. He explores the research and shows us just how subtle the effect of a change of environment can be, how to avoid being manipulated, and how to communicate more effectively by shaping your environment.
The Internet of Things (IoT) describes the network of physical objects—“things”—that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the internet.
"Thousands of years from now, when historians review the past, our ancient time here at the beginning of the third millennium will be seen as an amazing moment. This is the time when inhabitants of this planet first linked themselves together into one very large thing. Later the very large thing would become even larger, but you and I are alive at that moment when it first awoke. Future people will envy us, wishing they could have witnessed the birth we saw." Kevin Kelly
In every fatal or significant mishap that I've looked at, multiple things had to 'go wrong' or barriers had to fail. It's not just Swiss Cheese where the holes line up but, in the end, a series of events that have to occur. I'm yet to find this concept better illustrated than in this presentation by Brian Appleton (Technical Adviser to the Inquiry) on the Piper Alpha accident. If you're not familiar with the Piper Alpha incident, it is an iconic but tragic lesson from history that took the lives of 167 men in 1988, and ultimately changed the way we manage safety in virtually every industry. You can find more background online, and video of the change from an oil fire to a catastrophic gas explosion on YouTube. Enjoy.
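The "multiple barriers must fail" idea can be sketched in a few lines. The barrier names and failure probabilities below are hypothetical, purely for illustration; they are not figures from the Piper Alpha inquiry.

```python
# Swiss Cheese model sketch: a mishap only occurs when every barrier
# fails at once. Probabilities here are invented for illustration.
barriers = {
    "permit-to-work system": 0.05,
    "pressure-relief valve": 0.02,
    "gas detection alarm":   0.10,
    "emergency shutdown":    0.01,
}

# Assuming the barriers fail independently of one another,
# the joint probability is the product of the individual ones.
p_mishap = 1.0
for name, p_fail in barriers.items():
    p_mishap *= p_fail

print(f"P(all barriers fail) = {p_mishap:.2e}")  # prints 1.00e-06
```

The caveat is the independence assumption: in real accidents, barrier failures are often correlated (a common cause, such as poor maintenance culture, weakens several barriers at once), which is exactly why catastrophes happen far more often than a naive multiplication suggests.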
Julian has 35 years of international security risk management experience, up to and including Director-level and C-suite experience in the commercial, government, and not-for-profit sectors in Australia and internationally. His credentials include a Master of Risk Management (MRiskMgt), Graduate of the Australian Institute of Company Directors (GAICD), Australian Security Medal (ASM), Certified Protection Professional (CPP), Diploma of Security Risk Management, Diploma of Project Management, Microsoft Certified Systems Engineer, Fellow of the Risk Management Institution of Australasia (RMIA), and Fellow of the Institute of Strategic Risk Management (F.ISRM). He is also lead author of the Security Risk Management Body of Knowledge.