Insurers in Lloyd’s and around the world face an ever-growing regulatory reporting burden. Many find report creation a struggle. The solution is straightforward (if not necessarily simple): get the central data source right. By ensuring data is high quality and held well in an appropriate system, reporting becomes almost effortless, and need not require seemingly endless reactive efforts.
Companies that get their data warehousing right will enjoy a side-benefit. Those that have adopted a system allowing flexible reporting for regulators will also be able to mine and manipulate data for their own analysis, giving a clearer, more comprehensive, and simply better understanding of their own risks. That includes a better grasp of how an individual risk, if assumed, would affect the balance of the portfolio. However, organisations are often so tightly focused on regulatory reports that the potential benefits of improved internal reporting go underdeveloped.
The first step is to make data a strategic priority, and to ensure everyone buys into it. The second is to embrace data quality. Too often, ensuring data quality comes late in the day, just before an essential report must be created. This usually happens because those responsible for a specific data set are too busy with their day-to-day work to find time to get the data right. When data quality is placed at the forefront of the process, complete and accurate data is always ready when needed. The scramble of last-minute report-building is diminished, and users can spend their time analysing the data itself rather than augmenting and cleaning it.
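The idea of putting quality checks at the forefront, rather than just before a report is due, can be sketched in code. The following is a minimal illustration, not any particular firm's process; the field names and records are hypothetical.

```python
# Sketch of a routine data-quality check intended to run continuously
# (e.g. daily), so gaps surface long before a report is needed.
# Field names ("policy_id", "risk_code", "premium") are illustrative.

def quality_report(records: list[dict]) -> dict:
    """Count records with missing or blank required fields."""
    required = ("policy_id", "risk_code", "premium")
    issues = {field: 0 for field in required}
    for rec in records:
        for field in required:
            if rec.get(field) in (None, ""):
                issues[field] += 1
    return issues

records = [
    {"policy_id": "P001", "risk_code": "AO", "premium": 1000.0},
    {"policy_id": "P002", "risk_code": "", "premium": None},
]
print(quality_report(records))  # {'policy_id': 0, 'risk_code': 1, 'premium': 1}
```

A check like this, run on a schedule against the central data source, turns data cleaning from a pre-report scramble into routine housekeeping.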
Systems are a key element, and must be handled with care. Bespoke systems are ideal, but when over-designed they may lack the flexibility to deal with regulators’ changing demands (which seem to alter from year to year). Conversely, off-the-shelf solutions are sometimes limited, but they may be much less expensive and simpler to implement. Ultimately, two things are vital: data systems must meet each user’s needs, and they must integrate fully with other relevant systems, whether within the organisation or used jointly by the market.
Start-up businesses, such as the new syndicates which Asta supports, have a real advantage: they can get data warehousing right from Day Zero. Checks and controls can be put in place long before underwriting begins, so that on Day 3, when the first data error appears – a wrong risk code entered or a premium figure mistyped – the response is swift and simple.
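An entry-time check of the kind described above can be very simple. The sketch below assumes a hypothetical in-house feed; the set of valid risk codes and the premium rule are illustrative placeholders, not Lloyd's-published values.

```python
# Minimal sketch of entry-time validation: reject a wrong risk code or
# mistyped premium the moment it is keyed in, not at reporting time.
# VALID_RISK_CODES is an illustrative placeholder set.

VALID_RISK_CODES = {"AO", "B2", "CF", "NA"}

def validate_entry(risk_code: str, premium: float) -> list[str]:
    """Return a list of error messages; an empty list means the entry is clean."""
    errors = []
    if risk_code not in VALID_RISK_CODES:
        errors.append(f"unknown risk code: {risk_code!r}")
    if premium <= 0:
        errors.append(f"premium must be positive, got {premium}")
    return errors

assert validate_entry("AO", 125_000.0) == []
assert validate_entry("ZZ", -50.0) == [
    "unknown risk code: 'ZZ'",
    "premium must be positive, got -50.0",
]
```

Because the check runs at the point of entry, the error message reaches the person who can fix it while the slip is still fresh.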
Companies with legacy business – by far the majority – face a greater and messier challenge. However, by uprooting old systems and cleaning up the past, companies will see benefits almost immediately. A sound central data source holding high-quality data will be well worth the investment, and, if the data is mined effectively, self-financing in the medium term.
None of us will ever get our data perfect, and high data quality does not demand 100% accuracy. Little is gained by chasing down every last error, yet most organisations have room for vast improvement. With data quality an enterprise-wide ambition, and systems and processes configured to automate reporting, users – whether underwriters, claims managers, actuaries, or management – can spend more time analysing and understanding the risks on the books, supported by better-quality internal intelligence.
Clare Barley is a qualified actuary. After graduating with a master’s in mathematics from Warwick University, Clare worked as a pensions actuary before joining Lane Clark & Peacock in 2010, where she provided clients with actuarial advice on a range of insurance matters, including Solvency II, reserving, and pricing.
Clare joined Asta’s risk team in 2014 and has worked across all areas of risk management, with particular focus on the ORSA process and model validation. She was appointed Deputy CRO in 2014 and promoted to CRO in February 2016.
020 7743 0809