
Part 2: Digital Risk Assessment

Next Generation Risk Engineering Market Study

This is Part 2 in a four-part series publishing the results of the Next Generation Risk Engineering Market Study.

Digital Risk Assessment - The Risk Engineer of the Future

Figure 6: Data Availability and Automated Analysis Maturity 

Data Sources and Analysis Method Benchmark 

A key challenge that many Risk Engineers face when examining submission information to obtain a quality understanding of the account is not the lack of information but its unstructured nature. There is a vast amount of information out there – publicly available and proprietary data sources, websites, reports, spreadsheets, or geo-encoded and geo-spatial location information – but no market-trusted remedy to automatically pull the right information out and present it to the Risk Engineer.

This pain point is also reflected in the survey data (Figure 6). The diagram above depicts risk data sources and extraction methods, with the vertical axis representing the number of data sources used and the horizontal axis the level of innovation maturity.

Surprisingly, more than 90% of respondents use either internal risk data exclusively or pair it with manually evaluated external sources. Only 10% of all participants deploy digital solutions to gather, analyze, and extract key information and derive measures and actions. Text-mining tools leveraging machine learning or semantic rule-based tools are known to be used by large insurers concentrating on global programs and specialty business. For such functions, solutions like «Cogito» by «Expert System» have proven capable of extracting and scoring relevant text passages.
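As a rough illustration of how a semantic rule-based approach can surface relevant passages, the sketch below scores the sentences of a submission text against a small set of hazard keywords. The keyword list, weights, and the `score_passages` helper are illustrative assumptions for this article, not the workings of «Cogito» or any tool named by the survey respondents.

```python
import re

# Hypothetical keyword rules a Risk Engineer might care about (illustrative only).
HAZARD_KEYWORDS = {
    "sprinkler": 3, "flood": 3, "combustible": 2,
    "hot work": 2, "roof": 1, "storage": 1,
}

def score_passages(text: str, top_n: int = 3) -> list[tuple[int, str]]:
    """Split a submission text into sentences and score each by keyword hits."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    scored = []
    for sentence in sentences:
        lowered = sentence.lower()
        score = sum(weight for kw, weight in HAZARD_KEYWORDS.items() if kw in lowered)
        if score > 0:
            scored.append((score, sentence.strip()))
    return sorted(scored, reverse=True)[:top_n]

submission = (
    "The site has a combustible roof assembly. "
    "Sprinkler coverage is limited to the main storage area. "
    "The broker confirmed the renewal terms."
)
for score, passage in score_passages(submission):
    print(score, passage)
```

In practice, a production-grade tool would replace the static keyword list with learned models and domain ontologies, but the principle of extracting and ranking the passages most relevant to the Risk Engineer is the same.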

Figure 7: Leveraging Legacy Data

The Power of Legacy Information - Apply the Past to Read the Future

In Synpulse's experience with legacy systems and database structures, information is often unstructured and poorly maintained, or data is not stored in a central repository but instead buried in Word documents. As such, the survey set out to examine the state of the industry on this point.

The illustration above plots usage of historical data on the vertical axis and maturity of analytics capabilities for historical data insights on the horizontal axis (Figure 7). Well-structured and maintained legacy information can reveal a plethora of opportunities to enrich and validate assessment information. It not only shows the progression of the risk quality of portfolios, accounts, or single locations, it also allows occupancies to be benchmarked against one another, controlling for regions, jurisdictions, or other factors. Notably, only 10% of all surveyed respondents said that historical data is not used for Risk Engineering assessment purposes, and a solid two-thirds of all participants see the universal importance of legacy information but lack the right tools to harvest its benefits in an automated fashion.

In addition, if the data is stored in a database or data warehouse, changes to the underlying system landscape or to assessment frameworks can make compatibility and comparability difficult. Only 25% of all respondents use automated tools to benchmark risk assessment information against the legacy data pool. This will most likely change in the future, as organizations start to understand the power of data and the advantage of business intelligence systems that can extract key insights and are available off the shelf with reasonable implementation effort.
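As a minimal sketch of what such legacy benchmarking could look like, the example below compares a new assessment score against a historical data pool by occupancy and region using pandas. The column names, sample records, and the `compare_to_benchmark` helper are assumptions made for illustration, not the structure of any surveyed carrier's data warehouse.

```python
import pandas as pd

# Hypothetical legacy assessment pool (occupancy, region, risk score 0-100).
legacy = pd.DataFrame({
    "occupancy": ["warehouse", "warehouse", "chemical", "chemical", "warehouse"],
    "region":     ["EMEA",      "APAC",      "EMEA",     "EMEA",     "EMEA"],
    "risk_score": [72, 65, 58, 61, 70],
})

# Benchmark: average historical score per occupancy and region.
benchmark = legacy.groupby(["occupancy", "region"])["risk_score"].mean()

def compare_to_benchmark(occupancy: str, region: str, new_score: float) -> float:
    """Return the delta of a new assessment against the legacy benchmark."""
    return new_score - benchmark.loc[(occupancy, region)]

# A new warehouse assessment in EMEA scoring 80 sits ~9 points above its peers.
print(compare_to_benchmark("warehouse", "EMEA", 80))
```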

Figure 8: Risk Data Availability and Usage

Virtual Assessments Done Right - How to Leverage Internal and External Information

Mounting cost pressure is forcing Underwriting and Risk Engineering to streamline the submission process and overhaul risk assessment across the industry. As such, Risk Professionals are motivated to seek faster ways to classify information and make decisions that boost speed in the pre-binding Underwriting process while reducing costs. «Lean Underwriting», «Expedited Submission Handling» and «Express Quote» are some of the buzzwords that describe industry initiatives to improve speed to quote and hit ratio.

To aid in optimizing this risk triage, a plethora of data sources are available for the Risk Engineer to consume on demand; however, the degree to which such sources are leveraged varies significantly across the industry.

To understand the extent to which data sources are leveraged for Risk Engineering, the survey asked respondents to indicate which sources they rely upon for conducting virtual, or desktop, assessments (Figure 8). The results are depicted in the chart above. The usual suspects score highest: Map Services and Geocoding, Google Street View, and Site Cameras. That information is easy to obtain, free of charge, and gives a first, immediate overview of the insurable risk and its environment. However, pixel resolution is often insufficient, the imagery is dated, and only an exterior view is available to draw conclusions about roof protection.
Additional insights are often gained through 3rd Party Risk Reports or Desktop Google Research that improve the understanding of the risk and past claims history. Though these sources are being used, as indicated earlier in the study, most carriers lack a structured way of integrating loss control and data providers and are thus missing out on the huge potential to derive crucial conclusions about the underlying risk quality.
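As a simple sketch of consuming one of these sources on demand, the snippet below geocodes a site address with the open-source geopy library against the public Nominatim service. geopy is used here purely as an illustration and is not a tool named by the survey respondents, and the address is a placeholder, not data from the study.

```python
from geopy.geocoders import Nominatim

# Public Nominatim geocoder (OpenStreetMap data); a user_agent string is required.
geolocator = Nominatim(user_agent="risk-engineering-demo")

# Placeholder site address taken from a hypothetical submission.
location = geolocator.geocode("350 Fifth Avenue, New York, NY")

if location is not None:
    # Coordinates can feed map services, flood-zone lookups, or distance checks.
    print(location.latitude, location.longitude)
    print(location.address)
```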

Convinced that future Underwriting processes will continue to be driven to become faster and more efficient, Synpulse is developing solutions that consolidate these information sources in a holistic way, making it easy for the Risk Professional to digest, or even automatically process, that information.

Figure 9: Adoption Rates of On-site Enablers

The Risk Engineer's Onsite Toolset - What is the Future Swiss Army Knife of Risk Engineering?

Field Risk Engineers are often exposed to harsh, remote areas with unreliable internet connections when gathering relevant risk insights. Convenience and usability of the supporting tools are important to allow instantaneous and efficient data capturing. While most large insurers offer offline applications (tablet or desktop) to their field staff to capture all information while onsite, with the final results uploaded to the web-based application once connected to the internet, many do not use emerging technologies to supplement or enhance their risk assessments (Figure 9).
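A minimal sketch of such an offline-first capture flow is shown below: findings are written to a local SQLite queue while on site and flushed to the central application once connectivity returns. The table layout and the `upload` callback are illustrative assumptions, not the design of any insurer's actual field application.

```python
import json
import sqlite3

# Local store used while offline on site.
db = sqlite3.connect("site_visit.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS findings "
    "(id INTEGER PRIMARY KEY, payload TEXT, synced INTEGER DEFAULT 0)"
)

def capture(finding: dict) -> None:
    """Record a finding locally, regardless of connectivity."""
    db.execute("INSERT INTO findings (payload) VALUES (?)", (json.dumps(finding),))
    db.commit()

def sync(upload) -> None:
    """Push unsynced findings to the web-based application once online."""
    rows = db.execute("SELECT id, payload FROM findings WHERE synced = 0").fetchall()
    for row_id, payload in rows:
        upload(json.loads(payload))  # e.g. an HTTP POST to the carrier's API
        db.execute("UPDATE findings SET synced = 1 WHERE id = ?", (row_id,))
    db.commit()

capture({"location": "Boiler house", "hazard": "blocked sprinkler head"})
sync(upload=lambda record: print("uploaded", record))
```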


Only a few respondents leverage Augmented Reality (AR). AR is slowly being adopted, but use cases are still rare and the hardware too expensive to allow for mainstream acceptance. While AR can be successfully used for automated hazard detection, flood scenario modelling, or training junior Risk Engineers, it creates an unfavorable disconnect from the plant manager or other participants who do not take part in the AR experience. Also, multi-angle imagery technologies for interior modelling (360-degree cameras) or drones to capture exterior wall and roof conditions are not yet prominently used, as the additional benefits do not yet outweigh the costs of logistics and training needed to produce this imagery. In general, these technologies are too expensive and not yet mature enough to be applied extensively for onsite visits.

However, Laser Distance Measurement Tools, Infrared Cameras and Thermal Cameras are on the rise and more widely accepted. These tools are lightweight to carry, can be used on the spot, and have seen a massive price drop in the last couple of years.
 

Stay tuned for part 3 of our Risk Engineering Market Study coming in the next few weeks. But why wait? Download the full report now by clicking below!

 

Contacts

Marc Kirchhofer
Manager
marc.kirchhofer@synpulse.com
Joel Smith
Senior Consultant
joel.smith@synpulse.com