  • Corporate
  • Sep 9, 2019
  • By QOMPLX

The Smartest Companies are Embracing the Cyber Telematics Revolution

The insurance industry has struggled under the shadow of substantial uncertainty while writing a growing amount of cyber coverage. An important part of the long-term solution for sustainably managing cyber risk has emerged with the increased use of cyber telematics. Simply stated, the ongoing collection and analysis of real-time data is a necessary means of assessing cyber risk and supporting risk transfer. Smart companies are already working on ways to incorporate new types of cyber telematics into their processes as telematics becomes an industry differentiator.

Cyber data, available nearly instantaneously through telematics streams, provides underwriters and risk management teams with a current view of risk. Given the highly dynamic nature of cyber risk, that view enables portfolios to be analyzed effectively by region, peril, client, class, or arbitrary accumulation (also known as monoculture or common-mode failure). This necessary transformation will require companies to build, or partner with, expert data-driven platform(s) that support client security and IT operations organically, or that support the specific observability goals required for effective participation in risk quantification and transfer schemes. This approach gives decision-makers the ability to understand cyber-specific peril risk at a comprehensive level.
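
To make the accumulation idea concrete, here is a minimal, hypothetical sketch: the policy records, field names, and the `shared_dependency` attribute are illustrative assumptions for the example, not a QOMPLX schema. It simply groups a toy portfolio by a common technology dependency to surface monoculture exposure; the same pattern works for region, peril, or class.

```python
from collections import defaultdict

# Illustrative portfolio records; fields and values are assumptions for the example.
portfolio = [
    {"insured": "Acme Corp", "region": "US",   "limit": 5_000_000, "shared_dependency": "CloudProvider-A"},
    {"insured": "Globex",    "region": "EU",   "limit": 3_000_000, "shared_dependency": "CloudProvider-A"},
    {"insured": "Initech",   "region": "US",   "limit": 2_000_000, "shared_dependency": "DNSProvider-B"},
    {"insured": "Umbrella",  "region": "APAC", "limit": 4_000_000, "shared_dependency": "CloudProvider-A"},
]

# Aggregate exposed limits by the shared dependency (a simple monoculture /
# common-mode failure view of accumulation).
accumulation = defaultdict(float)
for policy in portfolio:
    accumulation[policy["shared_dependency"]] += policy["limit"]

for dependency, exposed_limit in sorted(accumulation.items(), key=lambda kv: -kv[1]):
    print(f"{dependency}: ${exposed_limit:,.0f} aggregate limit exposed")
```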

As an initial foray into scoring and benchmarking, numerous vendors and internal technology efforts across the industry perform simple scans of externally accessible data. Yet, in retrospect, this approach has not been a satisfactory means of assessment, given the paucity of robust correlations to breach and loss data or any practical working knowledge of modern network design and operation. Rapid change is a constant in the cyber environment and, therefore, cyber risk cannot be passively evaluated by external open source intelligence gathering or internal vulnerability scans – which offer visibility into only a small subset of the required data. Underwriters are realizing that telematics and real-time access to pertinent data sources can aid them in building a more comprehensive understanding – essentially by facilitating periodic network snapshots that combine data from a wider variety of internal and external sensors, across multiple tool vendors, to predict cyber risk. Such a solution supports proactive event characterization, vulnerability/impact analysis, and accumulation management.

Even smarter companies will construct a vision for adopting this complex, predominantly streaming data architecture, which requires several types of specialized persistence layers (e.g., for graph and timeseries data) at scale. Foundational requirements include the ability to ingest data from disparate, heterogeneous sources – internal or web, structured or unstructured – and to consolidate that data centrally even though it is collected across entire (sometimes global) customer IT and OT footprints. The data must also be made sufficiently uniform to extract insight and ultimately to run enhanced modeling and simulation in a consistent fashion. This requires a unified ontological framework and the ability to schematize, normalize, and semantify data on ingest.
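
As a rough illustration of schematizing and normalizing on ingest, the sketch below maps two hypothetical source formats onto one common event schema. The field names, source formats, and schema are assumptions for the example, not QOMPLX's actual ontology.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NormalizedEvent:
    # Minimal common schema every source is mapped onto at ingest time.
    timestamp: datetime
    source: str
    asset_id: str
    event_type: str
    severity: int

def from_firewall_csv(row: dict) -> NormalizedEvent:
    # Hypothetical firewall export with epoch seconds and text severity.
    sev = {"low": 1, "medium": 2, "high": 3}.get(row["sev"], 0)
    return NormalizedEvent(
        timestamp=datetime.fromtimestamp(int(row["epoch"]), tz=timezone.utc),
        source="firewall",
        asset_id=row["host"],
        event_type=row["action"],
        severity=sev,
    )

def from_edr_json(doc: dict) -> NormalizedEvent:
    # Hypothetical EDR record with ISO-8601 timestamps and numeric severity.
    return NormalizedEvent(
        timestamp=datetime.fromisoformat(doc["ts"]),
        source="edr",
        asset_id=doc["device"]["id"],
        event_type=doc["detection"]["category"],
        severity=int(doc["detection"]["score"]),
    )

# Once normalized, events from either source can flow into the same
# persistence layers, analytics, and models.
events = [
    from_firewall_csv({"epoch": "1567987200", "sev": "high", "host": "fw-01", "action": "block"}),
    from_edr_json({"ts": "2019-09-09T00:00:00+00:00", "device": {"id": "wks-042"},
                   "detection": {"category": "credential_theft", "score": 3}}),
]
```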

When breaking down the architecture to support cyber telematics, the following capabilities are necessary:

  • Data must be stored in an appropriate format so that it is maximally accessible and can be queried and processed efficiently.
  • Any initial focus should be on observability of potentially business-impacting events – “no silent failures” is an excellent goal.
  • Comprehensive analytics and visualization to drive user understanding of system monitoring and to enable quantification and actionable improvements in important metrics such as mean time to respond to failures and mean time between failure events.
  • Machine learning and statistical trend analysis to support retrospective modeling efforts that characterize historical decisions and interactions on networks.
  • Easy-to-run simulations – allowing potential problems to be identified by reviewing how assets, topologies, exploits, vulnerabilities, threat actors and tools, and the broader geopolitical and threat environment impact a reasonably plausible set of outcomes – and this must be achievable via “state estimation” from a plurality of operational data feeds, not as a sideshow consulting effort.
  • Representative network topologies and asset models using mixes of self-attested, observed, and synthetically generated data must be used during the pre-bind risk selection and pricing process to move beyond retrospective modeling techniques.
  • Uncertainty quantification or sensitivity analysis must be able to help risk managers and underwriters identify areas of significant model influence for additional data gathering, expert analysis (even qualitative), or modeling – and it must be efficient enough to be scalable and repeatable (a minimal sketch of this kind of analysis follows this list).
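
As a rough, hypothetical illustration of one-at-a-time sensitivity analysis, the toy loss model and parameter ranges below are assumptions for the example rather than a QOMPLX model; the point is simply to rank inputs by how much they swing the output.

```python
import numpy as np

def expected_annual_loss(attack_rate, detection_prob, loss_severity):
    # Toy loss model: successful attacks per year times average severity.
    return attack_rate * (1.0 - detection_prob) * loss_severity

# Baseline parameter estimates and plausible ranges (illustrative assumptions).
baseline = {"attack_rate": 12.0, "detection_prob": 0.7, "loss_severity": 250_000.0}
ranges = {
    "attack_rate": (6.0, 24.0),
    "detection_prob": (0.5, 0.9),
    "loss_severity": (100_000.0, 1_000_000.0),
}

base_loss = expected_annual_loss(**baseline)

# One-at-a-time sweep: vary each parameter across its range, hold the others at
# baseline, and record the spread in the output as a crude measure of influence.
influence = {}
for name, (lo, hi) in ranges.items():
    losses = []
    for value in np.linspace(lo, hi, 11):
        params = dict(baseline, **{name: value})
        losses.append(expected_annual_loss(**params))
    influence[name] = max(losses) - min(losses)

for name, spread in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{name}: output spread ${spread:,.0f} (baseline loss ${base_loss:,.0f})")
```

Parameters with the largest spread are the natural targets for additional data gathering or expert review.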

Actual telematics data sets must always serve operational effectiveness and efficiency – not just risk transfer or risk management goals. Current approaches to harvesting log data, application and performance data, and broader security data from the network are too slow, manual, and costly to achieve the goals above. Enabling this new type of quantitative approach to risk transfer, based on increasingly specific views of clients, therefore requires the ability to onboard new data sources within minutes; to unify and enrich disparate data into domain-specific ontologies; and to clean and organize it for effective analytics that improve detection, triage, alerting, and orchestrated response actions (human or machine). Underwriters require clarity on the external threat landscape, internal network security posture, external network visibility and access, and other key criteria for a holistic view of potential risks to the confidentiality, integrity, or availability of data and services within insureds' environments. Near real-time continuous monitoring, along with a carefully selected mix of streaming, microbatching, and batch processing of primary sensor data, derived events, and threat intelligence, will then allow modelers and underwriters to determine probabilistic attack patterns more realistically and efficiently.
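
The microbatching idea mentioned above can be sketched very simply; the window size, batch size, and event structure here are illustrative assumptions, not a description of any particular pipeline.

```python
import time
from typing import Iterable, Iterator, List

def microbatch(events: Iterable[dict], max_size: int = 500, max_wait_s: float = 5.0) -> Iterator[List[dict]]:
    """Group a continuous event stream into small batches, flushing when the
    batch is full or the wait window has expired (checked as each event arrives)."""
    batch, started = [], time.monotonic()
    for event in events:
        batch.append(event)
        if len(batch) >= max_size or (time.monotonic() - started) >= max_wait_s:
            yield batch
            batch, started = [], time.monotonic()
    if batch:  # flush whatever remains when the stream ends
        yield batch

# Each micro-batch can then be enriched (e.g., with threat intelligence) and
# written to the appropriate persistence layer before models consume it.
stream = ({"event_id": i} for i in range(1_200))  # stand-in for a live feed
for batch in microbatch(stream, max_size=500):
    print(f"processing micro-batch of {len(batch)} events")
```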

QOMPLX supports a range of retrospective and simulation-based modeling approaches using core Q:OS capabilities within its offerings for Q:CYBER and Q:INSURANCE, since modeling approaches have massively different data requirements (even with optional use of synthetic data), mathematical limitations, and costs. For example, we use a variety of graph-centric techniques to look at network resilience from snapshots of network privileges, assets and topologies, vulnerabilities, and exploits – including spreading phenomena for exploring graph permeability. We also use gaming-style, massive agent-based techniques that relax some mathematical constraints in order to simulate at a scale that can model core internet infrastructure – we have run simulations of global DDoS susceptibility and impact by modeling every autonomous system on the Internet for BGP issues and TCP/UDP DDoS attacks. In select cases we build higher-fidelity models that retain important mathematical properties (e.g., causality, reproducibility, and ergodicity) using discrete event simulation. Finally, sometimes one needs to validate that a given network configuration and asset set will behave in a precise way, which requires a “range” on which to launch real attacks against real systems; this is where full emulation of specific risk environments must occur to validate lower-cost model approximations.

QOMPLX doesn’t use all of these tools for every client. We work to maximize model relevance and appropriateness with continuous updates to select internal and external data sets (even mashing up other vendors’ data sets and feeds for enrichment) to ensure fitness for purpose and understandable model health. Orchestration processes and rules about model fitness limitations and retraining ensure the constant refreshing of the event sets and knowledge required for advanced analytics, while scaling scarce human time and helping maintain confidence in decisions drawn from a versioned, reproducible, and traceable data supply chain.

The ability to establish baseline cyber scores for specialized areas like Active Directory or UEBA is important, and Q:OS also supports extensions to base models, custom models, index creation and tracking, and the ability to harvest and consume data from many sources for operations and risk management professionals. We believe telematics sources and carefully constructed scoring models can allow risk to be assessed in a manner that can be applied directly to rating algorithms, underwriting rules, and acceptance criteria while still balancing client privacy and operational concerns.
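
As a loose illustration of the spreading-phenomena idea mentioned above (not QOMPLX's actual implementation), the sketch below uses a random graph as a stand-in for a network snapshot and estimates how far a compromise could spread from a random foothold when each edge can be traversed with some probability.

```python
import random
import networkx as nx

def permeability(graph: nx.Graph, traversal_prob: float, trials: int = 200) -> float:
    """Estimate the average fraction of assets reachable from a random foothold
    when each edge is independently traversable with `traversal_prob`."""
    nodes = list(graph.nodes)
    total_fraction = 0.0
    for _ in range(trials):
        # Sample a "permeable" subgraph by keeping each edge with the given probability.
        kept = [edge for edge in graph.edges if random.random() < traversal_prob]
        sub = nx.Graph()
        sub.add_nodes_from(nodes)
        sub.add_edges_from(kept)
        foothold = random.choice(nodes)
        reached = nx.node_connected_component(sub, foothold)
        total_fraction += len(reached) / len(nodes)
    return total_fraction / trials

# Stand-in for a real network snapshot: 200 assets with sparse random connectivity.
network = nx.erdos_renyi_graph(n=200, p=0.03, seed=42)
for p in (0.2, 0.5, 0.8):
    print(f"edge traversal probability {p:.1f}: "
          f"~{permeability(network, p):.0%} of assets reachable on average")
```

In practice the graph would come from observed privileges, assets, and topologies rather than a random generator, and the traversal probability would be informed by vulnerability and exploit data.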

The revolution is happening, and ultimately, the smartest companies are already looking to partner with those who understand the practical challenges of massive telemetry collection, ingestion, storage, and interpretation. No modeling effort will succeed if these prerequisites are not fulfilled. Those who embrace change, leverage cyber telematics to deliver superior value to their customers, and incorporate technology to gain efficiency and insight will have the industry edge. Select players are now learning to incorporate cyber telematics effectively in daily operations and are changing the way cyber coverage is written and managed across a broad swath of company sizes, as superior tools democratize the ability to gain visibility into modern networks and score them more realistically.
