In an exclusive Risk.net panel session, convened in collaboration with ServiceNow, experts discussed the unique challenges facing risk and resilience practitioners, and the utilisation of frameworks and risk quantification for optimal decision-making across financial services organisations
- John Goodman, Senior vice-president, Cyber Risk Institute
- Jack Jones, Chairman, FAIR Institute
- Greg Kanevski, Global head of banking, ServiceNow
- Moderator: Mark Hofberg, Financial services risk solutions executive, ServiceNow
There is no shortage of rules, methods or digital frameworks in risk management.
The role of risk quantification in decision-making and the value of harmonising regulatory requirements are key themes, especially as firms worldwide prepare for uncertainty against a backdrop of rising inflation and geopolitical pressures.
Navigating this landscape has underscored the need to transform risk assessments – via intelligent automation into digital business processes – to continuously monitor and prioritise risk. The modernisation of risk programmes aimed at continuous monitoring is on the up, providing firms with better and more timely data, and embedding processes within core business operations in a more cost-effective manner.
At the Risk.net panel session with ServiceNow, experts discussed how the speed of change in digital has combined with unprecedented events and market conditions to drive the need for more data and integration, as well as robust risk quantification frameworks to better influence decisions.
Here are the key themes that emerged from the discussion.
Experimentation and collaborative learning
Over centuries of medical development, two fundamental components have carried our understanding from where we once were to where we are now: experimentation and collaborative learning.
Just as medicine was unable to make greater advances until knowledge of physiology caught up with that of anatomy, similarly, within the frameworks of risk management and cyber security, risk controls are the industry’s anatomy, but the physiology – the understanding of how risk controls interact – has been lacking.
There is still a long way to go towards collaborative learning, as firms taking part in risk assessment working groups are often reluctant to share information and experiments with the wider industry. There simply isn’t enough knowledge sharing among firms, the panel said.
However, practitioners can adopt the Factor Analysis of Information Risk (FAIR) Controls Analytics Model (FAIR–CAM), an extension of FAIR – an international standard for quantifying cyber and technology risk – that models the factors that drive risk.
FAIR–CAM enables firms to assess how controls impact the magnitude and frequency of loss events. It also explains control physiology and is directly applicable to any form of loss exposure in today’s market.
Using the FAIR model, financial services firms can empirically measure control value and efficacy, account for individual and systemic effects of control functionality, and more effectively leverage telemetry in cyber risk and security. That said, challenges to adopting the model persist, and overcoming them is key to optimal risk quantification.
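FAIR decomposes risk into loss event frequency and loss magnitude, and practitioners typically estimate annualised loss exposure by running Monte Carlo simulations over estimated ranges rather than point values. The sketch below illustrates that idea only – the function name, parameter ranges and efficacy figure are invented for illustration and are not part of the FAIR or FAIR–CAM standards:

```python
import random
import statistics

def annualised_loss(lef_range, magnitude_range, control_efficacy=0.0,
                    trials=10_000, seed=7):
    """Estimate annualised loss exposure by Monte Carlo simulation.

    lef_range        -- (min, max) loss events per year
    magnitude_range  -- (min, max) loss per event, in currency units
    control_efficacy -- fraction of would-be events a control prevents
    """
    rng = random.Random(seed)
    yearly_losses = []
    for _ in range(trials):
        # Sample a loss event frequency, thinned by the control's efficacy
        lef = rng.uniform(*lef_range) * (1.0 - control_efficacy)
        # Turn the fractional rate into an integer event count for the year
        events = int(lef) + (1 if rng.random() < lef - int(lef) else 0)
        yearly_losses.append(
            sum(rng.uniform(*magnitude_range) for _ in range(events))
        )
    return statistics.mean(yearly_losses)

# Hypothetical scenario: one to four loss events a year, each costing
# $50k-$250k, and a control assumed to stop 60% of events.
baseline = annualised_loss((1, 4), (50_000, 250_000))
mitigated = annualised_loss((1, 4), (50_000, 250_000), control_efficacy=0.6)
```

Comparing `baseline` with `mitigated` is the kind of empirical measurement of control value the panel describes: the gap between the two estimates is the control's modelled contribution, expressed in currency rather than a qualitative rating.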
Quantification is essential
There is no question that quantifying cyber risk greatly improves firms’ ability to focus on what matters and apply resources cost-effectively.
Qualitative measurements do not readily support cost/benefit analysis; at best, they let firms prioritise risks, the panel said. “But even that is typically done so poorly in qualitative measurements today, it’s a bit of a fallacy.”
Quantification methods are important as regulators continually step up efforts to ensure companies keep pace with regulatory mandates. Prioritising third-party risk is imperative as financial services firms increasingly rely on third parties for core business functions. Beyond direct third parties, their own providers – in effect, fourth parties – represent a tremendous risk if they are not adequately monitored and controlled.
The panel emphasised that regulatory agencies “will absolutely insist” on cyber risk quantification at some point in the future. “Everybody stands to gain from that in terms of holding their own in a complex, problematic threat landscape,” a panellist said.
One of the largest impacts FAIR has had on risk assessments is that, by focusing on the quantitative, it substantially reduces the number of factors considered ‘high risk’.
“[FAIR] gives you a structured language for how to represent the different components of risk that contribute to overall risk,” said a panellist. “And, by putting labels on them, you can then start measuring against them. So it gives you a common lexicon and a common way, or ontology, for structuring your analysis of risk.”
Through the model, firms can normalise risk nomenclature, as well as the mental models behind what is meant when talking about risk.
Layering controls on top of one another is troublesome – and the stack needs periodic rebalancing for harmonisation. “Periodically, this has to be rebalanced because controls are put in because of an inferiority, an issue in the process or someone feels as though it’s needed and the controls aren’t written well,” said Greg Kanevski, global head of banking at ServiceNow.
“There are so many [controls] out there that the first-line-of-defence managers don’t understand all of them. They don’t understand how they apply in their function. And when an oversight person – second, third, fourth line – comes in and asks about it, they don’t even understand how that control is supposed to work.”
However, firms can help themselves by avoiding low-value-add compliance activities – tracking their controls and identifying which ones productively mitigate risk. Furthermore, automation and modern technology platforms can assist by mapping and reconciling all of these controls and strengthening risk frameworks.
Engage the board
A quantitative approach to risk assessment empowers businesses to make informed decisions on residual risk. However, getting the board’s support is crucial to achieving the required cultural change.
“How are you going to do it differently – where you actually bring it to the board in a manner they can consume?” Kanevski said. “Say, here’s what’s really important. It’s how we know it’s important. We’re not putting our spin on it. These are the numbers, and rate ranges that are helping us. And here’s what we must do to get that plane going. Here’s the money we need. Here are the people we need.”
If an organisation is not ready for it – if it is not far enough along the maturity curve to deploy it – then its culture needs to change first. “That’s as important as how you deploy and what you deploy,” Kanevski added.
There is no doubt that utilising quantitative models and FAIR–CAM can provide firms with better visibility and control for identifying and managing risks, especially in times of greater uncertainty. Ultimately, the goal is to boost speed, accuracy and confidence in decision-making.
Greater industry-wide collaboration will be crucial for future progress in advancing risk assessments. Panellists urged firms against reinventing the wheel, but encouraged them to share data and their own unique experiences for the benefit of the community.
Integration and connecting the dots – and data – between risk, compliance, continuity, security and assurance functions will mean greater resilience in the future.