Risk-analytics techniques make it possible to measure, quantify and even predict risks with more certainty.

In the past, companies relied heavily on the leaders at the business-unit level to monitor, assess and report risks to their senior management teams. In today’s world of multiple data sources, however, it becomes virtually impossible to manually search through large amounts of data and gather the critical risk information needed for an enterprise-level risk perspective. Such a view, spanning the totality of a company’s risks across many different parts of a corporation, is impossible without using risk analytics.

Risk analytics sets a baseline for measuring risk across the organization by pulling together many different types of risk data into one definitive system. It gives a company’s decision-makers a clear method for identifying, assessing, understanding and managing the company’s risks.

In general, a risk-analysis report can be either quantitative or qualitative. In quantitative risk analysis, an attempt is made to numerically determine the probabilities of various adverse events and the likely extent of the losses if a particular risk event takes place.
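The arithmetic behind a quantitative analysis can be sketched in a few lines. In this illustration, each risk carries an estimated probability and an estimated loss; the expected loss is their product. All event names and figures here are invented for illustration, not drawn from any real risk register.

```python
# Toy quantitative risk register (all figures hypothetical):
# expected loss = probability of the event x estimated loss if it occurs.
risks = [
    ("supplier failure", 0.10, 500_000),
    ("data breach",      0.02, 2_000_000),
    ("FX swing",         0.25, 120_000),
]

for name, prob, loss in risks:
    print(f"{name:16s} expected loss = ${prob * loss:,.0f}")

# Summing per-event expected losses gives a rough aggregate exposure.
total_expected_loss = sum(p * l for _, p, l in risks)
```

Ranking events by expected loss is one simple way to decide where mitigation spending does the most good.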

Qualitative risk analysis, which is used more often, does not involve numerical probabilities or predictions of loss. Instead, the qualitative method involves defining the various threats, determining the extent of vulnerabilities and devising countermeasures should a risk event occur.

Almost all large businesses require at least some form of risk analysis. For example, commercial banks need to properly hedge the foreign exchange exposure of overseas loans, while large department stores must factor in the possibility of reduced revenues caused by a global recession. Risk analysis allows professionals to identify and mitigate risks, but not avoid them completely. Often, it relies on mathematical and statistical software.

How can you use risk analytics to anticipate and avoid risks, as well as take smart risks to drive value in your company?

Traditional analytics can be instrumental when it comes to better understanding past events or risks that occur with a high degree of frequency. But for forward-looking “what-if” scenarios and strategic risks, risk modeling can deliver valuable insights.

What’s the difference between risk analytics and risk modeling? The type of data they use. Risk analytics can give organizations visibility into many kinds of systemic risks, from credit risk and market risk to operational, reputational and cyber risk. It can help leaders deploy capital and manage their supply chains at a level that matches their risk tolerance. For organizations exposed to significant regulatory risk, it can be an important tool for helping to achieve compliance. And that’s just the start.

Risk modeling organizes bits and pieces of information drawn from a wide range of similar scenarios that have already played out to create a big-picture view of scenarios that are likely to occur in the future. This can be particularly helpful when weighing strategy-level risks that may shape the future of an organization. The more abstract the risks, the more modeling may be of use.

Modeling can also be a good option when an organization is grappling with massive complexity. In a large company with many different types of businesses, even if you have data on each line of business, the task of assembling and using that data to better understand risk may be virtually impossible. For large, complex systems, risk modeling may offer a more direct path to the insights needed to make smarter decisions.

Risk modeling shouldn’t be considered a replacement for risk analytics. View it instead as another tool in the analytical arsenal – one that is best used when you need to make more informed decisions on forward-looking issues of strategic importance, but don’t have traditional data sources to draw from. Here are some examples of models that can be used.

**Credit-risk modeling** is intended to aid financial institutions in quantifying, aggregating and managing risk across geographical and product lines. The outputs of these models also play increasingly important roles in the risk-management and performance-measurement processes of banks and insurance companies, in particular.

These outputs inform performance-based compensation, customer profitability analysis, risk-based pricing and, to a lesser (but growing) degree, active portfolio management and capital-structure decisions.

**Asset and portfolio modeling** is used to manage and streamline the organization’s balance sheets by enhancing profitability while taking into account such conflicting objectives as reducing capital, minimizing risk and increasing liquidity.

**Stress-testing** is “a risk-management tool used to evaluate the potential impact on portfolio values of unlikely, although plausible, events or movements in a set of financial variables,” according to the Federal Reserve Bank of San Francisco.

**Liquidity risk modeling** analyzes the liquidity issues within an organization’s balance sheets, under both normal and stressed conditions.

**Cash flow at risk**, which measures possible shortfalls in cash flow, helps to analyze a company’s commodity-purchasing exposures.
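A minimal sketch of a cash-flow-at-risk (CFaR) calculation, assuming a set of simulated or historical cash-flow outcomes is already available; all figures are invented for illustration. CFaR is taken here as the gap between the expected cash flow and a low percentile of the outcomes.

```python
import statistics

# Hypothetical simulated quarterly cash flows (in $ thousands).
simulated_cash_flows = [120, 95, 140, 60, 110, 80, 130, 45, 100, 150,
                        70, 125, 90, 55, 135, 105, 85, 115, 65, 145]

expected_cf = statistics.mean(simulated_cash_flows)

# 5th-percentile outcome: the cash-flow level undershot in only ~5% of scenarios.
idx = int(0.05 * len(simulated_cash_flows))
worst_5pct = sorted(simulated_cash_flows)[idx]

# CFaR: shortfall versus expectation at roughly 95% confidence.
cfar = expected_cf - worst_5pct
```

In practice the outcome set would come from a simulation of commodity prices, FX rates and volumes rather than a hand-typed list.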

It is important to keep in mind that when a company analyzes a potential project, it is forecasting potential, not actual, cash flows for a project. Forecasts, of course, are based on assumptions that may be incorrect. A company should thus perform sensitivity analyses on its assumptions to get a better sense of the risk of the project it is about to undertake.

There are three risk-analysis techniques that many organizations use to analyze the risks of a potential project:

**Net-present-value analysis** is a form of sensitivity analysis that can help a company gauge how sensitive its NPV is to changes in its underlying assumptions. To begin the analysis, one must first come up with a base-case scenario. This is typically the NPV using the assumptions one believes are most accurate. From there, one can vary individual assumptions and observe how the result responds.

NPV is then recalculated, and the sensitivity of the NPV to the change in assumptions is determined. Depending on one’s confidence in the assumptions, one can judge how variable the project’s risk projections may be.
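The recalculation step above can be sketched in a few lines of Python. The project figures (a five-year project, a 10% base-case discount rate, equal annual inflows) are hypothetical and chosen purely for illustration.

```python
def npv(rate, cash_flows):
    """Net present value, where cash_flows[0] is the time-0 outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

initial_outlay = -1000.0
base_flows = [initial_outlay] + [300.0] * 5   # five equal annual inflows

# Base case: the discount rate one believes is most accurate.
base_npv = npv(0.10, base_flows)

# Sensitivity: recalculate NPV while varying the discount-rate assumption.
for rate in (0.08, 0.10, 0.12):
    print(f"rate={rate:.0%}  NPV={npv(rate, base_flows):,.2f}")
```

The same loop can be repeated over any other assumption, such as the size or timing of the cash flows, one variable at a time.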

**Scenario analysis** takes sensitivity analysis a step further. Rather than just examining how the result responds to changes in individual variables, scenario analysis also considers the probability distribution of the variables.

Like sensitivity analysis, scenario analysis starts with the construction of a base-case scenario. From there, other scenarios are considered, known as the “best-case scenario” and the “worst-case scenario.” Probabilities are assigned to the scenarios, and the probability-weighted outcomes are combined to arrive at an expected value. Given its simplicity, scenario analysis is one of the most frequently used risk-analysis techniques.
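The probability-weighted calculation above reduces to a short sum. The three scenario NPVs and their probabilities below are hypothetical, chosen only to show the mechanics.

```python
# Hypothetical scenario NPVs with assigned probabilities (must sum to 1).
scenarios = {
    "worst case": (-200.0, 0.25),   # (NPV, probability)
    "base case":  ( 137.0, 0.50),
    "best case":  ( 450.0, 0.25),
}

# Expected value: the probability-weighted average of the scenario outcomes.
expected_npv = sum(value * prob for value, prob in scenarios.values())
print(f"Expected NPV: {expected_npv:,.2f}")
```

Because only three outcomes and three weights are involved, the whole analysis fits on a whiteboard, which is much of its appeal.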

**Monte Carlo simulations** produce numerous calculations of expected values given a set of constraints. Constraints are specified, and the system generates random values for the input variables. From there, metrics like NPV are calculated. Rather than generating just a few iterations, the simulation repeats the process many times. From the numerous results, the expected value is then calculated.
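A minimal Monte Carlo sketch of project NPV, under hypothetical assumptions: annual cash flows are drawn from a normal distribution, and the mean, volatility, discount rate and outlay are all invented for illustration.

```python
import random
import statistics

random.seed(42)  # fix the seed so runs are reproducible

def simulate_npv(rate=0.10, years=5, mean_cf=300.0, sd_cf=75.0, outlay=1000.0):
    """One random draw of project NPV with normally distributed cash flows."""
    flows = [random.gauss(mean_cf, sd_cf) for _ in range(years)]
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows, start=1)) - outlay

# Repeat the calculation many times, then summarize the distribution.
results = [simulate_npv() for _ in range(10_000)]
expected_npv = statistics.mean(results)
loss_prob = sum(r < 0 for r in results) / len(results)

print(f"expected NPV = {expected_npv:,.1f}, P(NPV < 0) = {loss_prob:.1%}")
```

Unlike scenario analysis, the simulation yields a full distribution of outcomes, so one can read off not just an expected value but also the probability of losing money.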

In short, almost every major decision in an organization – to drive revenue, control costs or mitigate risk, for instance – can be infused with risk analytics. It can transform a company’s risk-based decisions by infusing both historical and future risk information into the decision-making process.

*John Bugalla is a principal with ermINSIGHTS and Kristina Narvaez is president and CEO of ERM Strategies LLC.*