Richard Jelbert, product director, MyDrive Solutions, explores the data analysis methods available to insurers for extracting the most accurate picture of driver behaviour and, in turn, offering premiums that best reflect driver risk
Growing numbers of insurance companies are exploring the potential of telematics devices to add intelligence to the premium creation process. However, whilst early figures clearly indicate a strong link between driver behaviour and propensity to claim, insurance companies’ fears of being overwhelmed by vast quantities of data are potentially undermining the true value and adoption rate of telematics in the industry.
Today, upwards of 95% of collected data is being discarded by the telematics device, leaving the industry with very basic data that does little more than provide a proxy for the very worst drivers. If the insurance industry is to realise the potential of telematics, it needs to find a way of capturing driver behaviour at a granular level to create a more accurate profile, reflective of the broad range of drivers on the roads today. However, there is no need for expensive or extensive big data projects. (For more on insurance telematics, see Industry insight: Insurance telematics.)
2012 has seen a significant increase in the number of insurers and brokers adopting an insurance telematics offering, and although policy volumes are still relatively low, a clear link between driver behaviour and propensity to make claims has emerged. However, as the popularity of telematics continues to rise, there is growing pressure on insurers to justify the pricing models associated with a personalised premium.
Given the vast amounts of individual driver data being collated by multiple insurers in multiple ways, it comes as no surprise that the term ‘big data’ is causing confusion within the industry. Insurers collecting driver data via in-car telematics devices or a smartphone telematics app are adopting a variety of approaches to analyse this big data and then price individuals’ premiums accordingly. But there is a debate as to which approach best determines the risk a driver presents and, therefore, which insurer is providing the fairest insurance telematics premiums for customers.
Certainly there is growing concern in the industry about the sheer volume of telematics data being collected. How can it be analysed? Does each company need to invest in expensive big data projects? How can this data be incorporated into existing models for premium calculation?
There is little or no appetite for a major investment in new analytic tools, or for time-consuming, business-disrupting IT projects. Insurers want to start exploring this data immediately, preferably through the use of existing systems and knowledge. (For more on data, see Industry insight: Telematics and data and Industry insight: Telematics, intellectual property and privacy.)
The data challenge
One option has been for providers to offer a black box solution that restricts the data that is collected, thereby removing some of the demand for high-volume data analysis. However, packaging up the telematics information into basic customer behaviour traits that reflect little more than a count of harsh events offers limited long-term value to the industry.
Yes, drivers that brake or accelerate harder and more often are, without doubt, among the group more likely to claim. But these are also the highest-risk drivers that most companies have already identified. Furthermore, by giving actuaries no insight into how these figures are created, the exception-counting model relegates telematics to a position as just one more proxy.
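The exception-counting model described above can be sketched in a few lines. This is a hypothetical illustration only: the thresholds and field names are assumptions, not any vendor's actual parameters.

```python
# Hypothetical sketch of an exception-counting telematics model:
# only "harsh" events survive; everything else is discarded on-device.

HARSH_BRAKE_MS2 = -3.5   # assumed braking threshold, m/s^2
HARSH_ACCEL_MS2 = 3.0    # assumed acceleration threshold, m/s^2

def count_harsh_events(accel_trace):
    """Reduce a second-by-second acceleration trace to two counts."""
    brakes = sum(1 for a in accel_trace if a <= HARSH_BRAKE_MS2)
    accels = sum(1 for a in accel_trace if a >= HARSH_ACCEL_MS2)
    return {"harsh_brakes": brakes, "harsh_accels": accels}

# An entire journey collapses to two numbers; the road type, time of
# day and conditions are thrown away along with the raw trace.
trace = [0.5, -4.0, 1.2, 3.4, -0.8, -3.6, 0.0]
print(count_harsh_events(trace))  # {'harsh_brakes': 2, 'harsh_accels': 1}
```

The limitation is visible in the return value: two counters cannot distinguish a driver who brakes hard on an empty motorway from one who brakes hard outside a school at 3.30pm.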
The objective is not simply to create a link between behaviour and claim propensity but also between behaviour and type of claim. Are individuals more likely to hit pedestrians or cyclists, experience minor dents in car parks or back-end shunts at roundabouts? To gain true value from telematics, the industry needs to understand driver behaviour in far more granular detail than exception counting provides.
Rather than relying on exception-based telematics data, insurance companies need to record and retain driver behaviour every second and then exploit analytic techniques to create a highly customised subset of that data that accurately represents individual driver behaviour and preferences. One option is to use probabilistic techniques such as neural networks and Bayesian analysis; these complex mathematical algorithms are delivering good indications of propensity to claim. However, the methods are opaque, making it difficult for actuaries to immediately incorporate the new information into existing premium calculations.
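As a toy illustration of the probabilistic route (not MyDrive's method; the prior and likelihood figures below are invented for the example), a single Bayesian update turns an observed behaviour pattern into a revised claim probability:

```python
def bayes_update(prior, likelihood_if_claim, likelihood_if_no_claim):
    """Posterior P(claim | evidence) via Bayes' theorem."""
    num = likelihood_if_claim * prior
    den = num + likelihood_if_no_claim * (1.0 - prior)
    return num / den

# Invented figures for illustration: suppose 10% of drivers in a
# segment claim in a year, and frequent harsh braking is observed in
# 60% of claimants but only 20% of non-claimants.
prior = 0.10
posterior = bayes_update(prior, 0.60, 0.20)
print(round(posterior, 3))  # 0.25
```

One update is easy to follow; chain dozens of such factors through a network and the route from raw behaviour to final score becomes opaque, which is precisely the actuaries' objection the article raises.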
In contrast, deterministic analysis adds driving-domain-specific context to the calculation. Driver scores and premiums are generated by adding specific contextual information, including driving conditions, the underlying road map and time of day, to the telematics data. This contextualised approach not only provides insurers with an accurate and detailed profile based on 70 or so factors for each customer, but the model is transparent, enabling actuaries to confidently incorporate this new information into existing techniques.
Taking this approach, an insurance company does not have to consider how to analyse 1 million lines of data for each customer; instead, it needs just one line per customer. This massively reduces the complexity of the problem and enables the company to consider telematics scores using existing technologies and methodologies. The scale is manageable, but the data adds unprecedented depth of insight into customer driving behaviour, illuminating not only propensity to claim but the likelihood of specific accident types.
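The reduction described above can be sketched as follows. This is a minimal illustration of the idea, not MyDrive's implementation: the factor names, context buckets and sample records are all invented for the example.

```python
# Hypothetical sketch of deterministic, contextualised scoring: every
# second of data is labelled with its driving context, then the whole
# history is rolled up into one transparent row of named factors per
# customer. Factor names and context rules are assumptions.

from collections import defaultdict

def contextualise(samples):
    """Group per-second speed samples by (road type, time-of-day) context."""
    by_context = defaultdict(list)
    for s in samples:
        period = "night" if s["hour"] < 6 or s["hour"] >= 22 else "day"
        by_context[(s["road_type"], period)].append(s["speed_kph"] / s["limit_kph"])
    return by_context

def one_line_profile(samples):
    """Reduce millions of raw rows to a single row of named factors."""
    profile = {}
    for (road, period), ratios in contextualise(samples).items():
        profile[f"{road}_{period}_avg_speed_ratio"] = round(sum(ratios) / len(ratios), 2)
    return profile

samples = [
    {"road_type": "urban",    "hour": 23, "speed_kph": 36,  "limit_kph": 48},
    {"road_type": "urban",    "hour": 23, "speed_kph": 60,  "limit_kph": 48},
    {"road_type": "motorway", "hour": 14, "speed_kph": 110, "limit_kph": 112},
]
print(one_line_profile(samples))
# {'urban_night_avg_speed_ratio': 1.0, 'motorway_day_avg_speed_ratio': 0.98}
```

Each output column is a named, auditable factor, so an actuary can see exactly how the raw trace became a score and can feed the row into an existing rating model unchanged.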
In the early stages of telematics use, actuaries are also looking for a credible, objective benchmark against which each driver's capability can be measured. So what is the benchmark for safe driving? A recent study undertaken by the University of Sussex on behalf of the Royal Society for the Prevention of Accidents (RoSPA) analysed the insurance claims made by RoSPA Gold standard drivers and discovered that these individuals made more than 60% fewer at-fault claims than the average driver in the same demographic.
Combining deterministic analysis of detailed telematics data with RoSPA's definition of a ‘safe driver’ provides an excellent best-practice starting point for measuring propensity to claim. This enables insurers to begin exploring the growing volume of telematics information immediately and proactively, within their existing technology and analytics frameworks.
Telematics creates big data, and it should. It is the collection of detailed information relating to individual customer driving behaviour, by road type, time of day and road conditions, that is essential to creating profiles significantly more accurate than existing proxy data. Basing models on exception counting may help avoid the big data issue, but it also removes the chance to provide any real insight into all but the most extreme driving behaviours.
The key issue for the insurance industry is to ensure this information is not only captured and retained, but presented in a way that is meaningful to actuaries and usable within the current premium creation process. That does not mean the industry needs to invest in expensive big data analytics. Deterministic analysis of detailed telematics data can provide a factual, contextualised customer behaviour profile that delivers unprecedented insight into propensity to claim and type of claim and, critically, is totally transparent and easily incorporated into the actuarial process to accurately reflect driver risk.
Richard Jelbert is product director of MyDrive Solutions.