Debate and co-operation have fuelled the significant strides insurers have taken in risk analytics over the past two and a half decades, but there is much more to come. At Asta, we believe that exposure management has an essential role to play in the future of the insurance market and are dedicated to being at the forefront of its development. We would like to open the floor to all insurance professionals to consider the question of what the next ten years hold for exposure management, and how our tools and abilities could evolve to meet the challenges in front of us now.
By the time I entered the exposure management industry, the golden age of catastrophe modelling was already in full swing. Those powerful (for their day) tools gave carriers of all sizes access to advanced analytics that had previously been restricted to the largest reinsurers and brokers.
They were terrifying beasts. Analyses took hours. With an array of options and settings, and with the quality of your calculations reliant on the accuracy of your exposure data, the smallest error could set insurers back days. These models consumed all the available resources of an insurer’s desktop, so that no other work was possible while they ran. The result of those hours of analysis was a simple set of metrics – mean loss, standard deviation and a handful of return periods. If insurers wanted to know almost anything else – a summary of the exposure, a view on data quality, or how that mean loss broke down across policies – the only practical approach was to go diving into the raw databases that sat behind the scenes.
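For readers less familiar with those outputs, the sketch below illustrates one common way such metrics can be read off an event loss table. The figures and the Poisson frequency assumption are purely illustrative, not drawn from any particular vendor model.

```python
# Illustrative only: the classic cat-model outputs (mean loss, standard
# deviation, return-period losses) derived from a toy event loss table.
# Event losses and annual rates below are invented for this sketch.
import numpy as np

losses = np.array([250e6, 75e6, 20e6, 5e6])   # per-event loss, sorted descending (assumed)
rates = np.array([0.002, 0.01, 0.05, 0.20])   # annual frequency of each event (assumed)

mean_loss = (losses * rates).sum()            # average annual loss
std_dev = np.sqrt((rates * losses**2).sum())  # compound-Poisson approximation

# Occurrence exceedance probability: chance that at least one event
# at or above each loss level occurs in a year (Poisson assumption).
oep = 1.0 - np.exp(-np.cumsum(rates))

for rp in (50, 100, 250):
    exceeding = losses[oep >= 1.0 / rp]       # loss levels exceeded at least this often
    rp_loss = exceeding.max() if exceeding.size else 0.0
    print(f"{rp}-year occurrence loss ≈ {rp_loss:,.0f}")

print(f"Mean loss ≈ {mean_loss:,.0f}, standard deviation ≈ {std_dev:,.0f}")
```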
However, necessity is the mother of invention, and as an industry we have been inventing ever since.
The first step was to supply those using these models with a second desktop, then with remote desktops. Insurers soon began to invest in developing powerful analysis servers, with multiple users able to work from centrally held data. Cloud technology has more recently served as a catalyst for change, dramatically widening the network of professionals able to access catastrophe risk analytics.
As with Moore’s Law, each new approach delivered an exponential increase in the speed at which insurers could work. The models kept pace, running at higher granularity, providing more detail and analysing more events. We are now blessed with access to high-resolution maps, graphics, hazard overlays and data quality toolkits. Underwriters can view their exposure on their iPads, identifying potential clashes with the wider portfolio, seeing how close a risk sits to fault lines and visualising where the loss drivers are.
The presentation of results remains largely unchanged, although they are now easier to access and interrogate. Today we have access to almost any accumulation dimension one could consider: hazard footprints, historical event what-ifs, blast zones, spider analyses and clash assessments. All of these can be automated to land in the underwriter’s inbox, with little interaction required.
While there is much to celebrate, the pressure to continue to innovate, to re-invent and to improve is unending and immense. Operational costs will always be challenged, data availability and granularity will increase, and expectations of greater value from ongoing investment will continue to rise. The risk landscape also continues to grow in complexity. Hurricanes were one of the first risks that exposure management professionals tackled, and that was no easy feat. However, the precision required to model US flood, the uncertainty that needs to be managed for cyber, and the data quality in liability (which leaves much to be desired) present even greater challenges.
There are plenty of professionals equipped to embrace these challenges and an abundance of tools, technologies and models to assist them in the endeavour. The industry will look different in ten years’ time, of that there is no doubt. But if the industry could adopt just one advancement now, what would it be? What’s the one development that would bring the greatest benefit to the industry as a whole? Would it be something that sped up runtimes further or reduced operational costs in some other way? Or would it be something that improved our ability to comprehend and communicate a particular risk, or risks in general? Perhaps it is a technology from elsewhere in insurance that brings us the greatest benefit – smart contracts, or blockchain?
Alan Godfrey started his career at Amlin in 2004 after studying mathematics at the University of Cambridge. In 2006 he set up and led the company’s catastrophe modelling team, which by the time he left had grown to 40 full-time employees covering Reinsurance, Property and Marine classes, based across multiple international locations.
Through this role Alan gained extensive knowledge of the uses, strengths and weaknesses of the main catastrophe models, as well as of the developing best practice in Exposure Management. With a particular focus on operational efficiency and the effective use of capital, he supported Amlin in achieving one of the first Solvency II-approved Internal Models. Alan joined Asta in 2015 as Head of Exposure Management.