
Data Precision and Timeliness Fuel Exposure Management

June 26, 2017 | Leo Lockwood

A Formula One racing team can change four tyres in under two seconds. How do they do it? As James Vowles, chief strategist at Mercedes, explains, speed and precision are achieved “through data-driven adaptations. We place eight sensors on every one of the wheel gun nuts. When the gun operator initially connects to the wheel nut, I can tell that they, say, connected 20 degrees off the optimum angle.” In the world of Formula One, every second and every detail counts just to remain competitive, let alone to get ahead of the field.

For Mercedes, the drive for precision and perfection underpins every aspect of operations, from the design of the engine all the way through to the temperature of trackside drinks. And, of course, Mercedes is not an outlier; many of the most successful businesses in the world take similar approaches. Why, then, despite the emerging influence of Insurtech, improved data for augmentation purposes, and advancements in tools such as geospatial technologies, are insurance industry stakeholders still reliant upon imperfect and imprecise data? Data is the fuel that runs insurance, and it is surely incumbent on the industry to constantly strive for information of the highest possible standard.

I have been working within the field of exposure management (EM) for twelve years, specialising in the upstream energy sphere. During this time, the quality of risk data provided to underwriters has not improved significantly, with most submissions still requiring considerable scrutiny and enhancement. That means time and effort spent manipulating and manually mining data to achieve as much accuracy as possible in risk assessment, with the concession that the process is typically sluggish. EM is a vital part of the insurance industry’s service offering to insureds; to achieve even marginal gains, each part of the process needs to be examined and improved.

There are still many blind spots within the exposure management discipline. Take, for example, the 2011 Thai floods, in particular the contingent business interruption (CBI) losses caused by mass disruption to global supply chains resulting from the damage inflicted on 7,510 industrial plants. Carriers were caught off guard by the scale of those losses, in retrospect owing to poor-quality submissions, time and resource constraints on improving the exposure data, and slow adoption of new technologies. In the age of the Fourth Industrial Revolution, the data which exposure managers receive and rely upon could surely be better.

A further illustration of poor data exchanged in the market is the non-standardised risk exposure bordereaux provided by coverholders. Delegated authorities have been helping London insurers to expand their global footprint for more than a century. In that time, relatively little has changed in the methodologies used to disseminate generally poor-quality risk data to risk carriers. A former client was over-lined on a Floating Production Storage and Offloading (FPSO) risk, thanks to exposure blindness caused by poor data on an unwieldy bordereau. Fortunately, the client managed to get to the reinsurance market before the FPSO structure became exposed to spinning around the North Sea. If binding authorities were introduced to the market today, the use of blockchain, or distributed databases that maintain a continuously growing list of records, would be an obvious benefit. Blockchain would allow unbroken records of reliable data to pass through each stakeholder in the chain, from insured to reinsurer.
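To make the idea concrete, here is a minimal sketch in Python of the hash-linking principle behind a blockchain-style ledger: each bordereau record is hashed together with the hash of the record before it, so any tampering upstream invalidates every later link. The record fields and class names are hypothetical illustrations, not any production platform; a real market solution would add distributed consensus and digital signatures on top.

```python
import hashlib
import json


def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous link's hash,
    so altering any earlier record breaks all later hashes."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


class BordereauChain:
    """A minimal hash-linked ledger of exposure records."""

    def __init__(self):
        self.blocks = []  # list of (record, hash) pairs

    def append(self, record: dict) -> str:
        # Link each new record to the hash of the one before it.
        prev_hash = self.blocks[-1][1] if self.blocks else "genesis"
        h = record_hash(record, prev_hash)
        self.blocks.append((record, h))
        return h

    def verify(self) -> bool:
        """Recompute every link; returns False if any record was altered."""
        prev_hash = "genesis"
        for record, h in self.blocks:
            if record_hash(record, prev_hash) != h:
                return False
            prev_hash = h
        return True
```

In use, each stakeholder (coverholder, broker, insurer, reinsurer) would append its record and could verify the whole chain before relying on it; silently editing an earlier entry causes `verify()` to fail.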

Exposure managers should be benefiting from new technologies. With the Internet of Things increasingly at everyone’s disposal, accurate risk data is more accessible than ever. Brokers should not place business based on mediocre exposure data which requires a great deal of manual effort to decipher, improve, and analyse. But, unfortunately, they still do. Ultimately this can lead to inaccurate CAT modelling results, over-spend on reinsurance, and potentially imprecise capital allocation, a highly significant factor for companies carefully measuring return on capital.

Knowledge is king, and exposure management is primed to benefit from future technological developments. With re/insurers having a true understanding of their exposures, blind spots and expensive surprises can be avoided and underwriting performance improved.

Xceedance is committed to supporting the drive to improve exposure data standards. We are contributing to the industry solution through our technology and data sciences units, coupled with strong and experienced exposure analytic teams. Contact Xceedance to learn more.

Leo Lockwood is senior technical manager at Xceedance, supporting the business and technology requirements of new and existing Xceedance clients in the EMEA region.
