Introduction – Dealing with the data tsunami
Data, when properly processed and analyzed, can replace assumptions with knowledge. That knowledge, in turn, serves as a critical competitive advantage.
Engineering and IT teams at Startups and Small & Medium Enterprises typically have skill sets aligned to the tech stack of their Core Applications and lack the specialist knowledge that contemporary Data management demands. As a result, implementing Data-driven systems is a daunting prospect, and valuable data goes unused or underutilized.
This article breaks down a strategy that Startups & SMEs can adopt to avoid getting overwhelmed with data and instead turn it into an opportunity for innovation, service differentiation, and informed business decisions.
From our experience working with various companies in the Insurance industry, we have broken down the myriad challenges that these businesses encounter into manageable projects with structured “wrapper” solutions for quick results. They are:
- Big data and Fast Data have distinct uses – Plan accordingly
- Adopt RPA to cleanse & organize existing legacy data
- Invest in an API layer – Exchange data securely between Legacy and Data ecosystem
- Continuously evaluate data sources/attributes – Filter out those that do not add value
- Be transparent on data collection & use – Build trust & drive positive behavior
The wrapper strategy lets you maximize outcomes with your current systems, gain a good understanding of the data available, and apply the insights to the extent possible with the current tech stack. Rather than overhauling the entire platform in one go, you can progressively modernize its components.
Let’s dive into the details of these five steps.
1. Big Data / Fast Data – Do I need both? Why?
Big data refers to the vast amounts of structured and unstructured data generated from customer interactions, IoT devices, social media, and more. Companies analyze this historical data to gain insights into customer behavior and preferences that drive informed business decisions. The data could be days, weeks, or even months old and take hours to analyze, and the resulting insights may grow stale and need periodic refreshing.
Fast data, on the other hand, is the real-time processing of data as it is generated. With this capability, InsurTech companies assess risk based on real-time data and can offer instant quotes, issue policies, detect fraud, or validate claims faster. As the risk assessment is based on real-time data, the Insurer can adapt to changes in the customer context and offer personalized solutions.
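To make the distinction concrete, here is a minimal sketch in Python, assuming simple in-memory event handling; the event and function names are illustrative, and a production setup would typically pair a stream processor for fast data with a data warehouse for big data.

```python
# A minimal sketch contrasting fast data (per-event updates) with
# big data (scheduled batch analysis). All names are illustrative.
from collections import deque
from statistics import mean

recent_speeds = deque(maxlen=50)  # rolling window of telematics readings

def on_speed_event(speed_kmh: float) -> float:
    """Fast data: refresh the risk signal as each event arrives."""
    recent_speeds.append(speed_kmh)
    return mean(recent_speeds)  # e.g., feeds an instant-quote engine

def batch_risk_report(history: list[float]) -> float:
    """Big data: periodic analysis over the accumulated history."""
    return mean(history)

print(on_speed_event(72.0))                   # updated per event, in real time
print(batch_risk_report([60.0, 72.0, 85.0]))  # runs hours or days later
```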
InsurTech companies like Trov use fast data to process real-time data from connected devices and sensors, determine which on-demand insurance products are relevant, and allow customers to insure specific items for specific periods.
Big data insights are used to identify market trends, determine value propositions, and influence product design. Fast data plays a vital role in ascertaining the correct risk profile, identifying the ideal product for the consumer from the portfolio, and processing claims. Big data investments aid strategic decisions, while fast data investments deliver bottom-line impact.
2. Harness existing data for analytics with RPA – Update data structures during Platform Modernization
Robotic Process Automation (RPA) is an effective, low-cost option to extract data from the silos of existing legacy systems for data analytics. RPA does not address the limitations of the data or the constraints of the legacy application architecture but provides a way to utilize the internal data available to optimize the product mix or assess the risk involved.
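As an illustration, here is a minimal sketch of the kind of extraction step an RPA flow might wrap, assuming a hypothetical fixed-width policy export from a legacy system; the file layout and field names are invented for the example.

```python
# A minimal sketch of normalizing a hypothetical fixed-width legacy export
# into clean CSV records for the analytics layer. The layout is invented.
import csv

FIELDS = [("policy_id", 0, 10), ("holder_name", 10, 40), ("premium", 40, 50)]

def parse_line(line: str) -> dict:
    record = {name: line[start:end].strip() for name, start, end in FIELDS}
    record["premium"] = float(record["premium"] or 0)  # coerce to a number
    return record

def extract(legacy_path: str, out_path: str) -> None:
    with open(legacy_path) as src, open(out_path, "w", newline="") as dst:
        writer = csv.DictWriter(dst, fieldnames=[name for name, _, _ in FIELDS])
        writer.writeheader()
        for line in src:
            if line.strip():                 # skip blank lines in the export
                writer.writerow(parse_line(line))
```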
Prudential Financial, New York Life, and Aetna use RPA to collate data from multiple legacy systems. The clean, processed data that is extracted helps streamline their underwriting processes, allowing Agents to focus on higher-value tasks and design policies that offer the best coverage for each customer’s unique needs.
RPA is an interim approach that allows Insurers, MGAs, and Brokers to work with data spread across their current standalone systems. RPA is the wrapper that insulates this new data layer from the limitations of the legacy platform. It gives you room to plan a phased modernization of Applications to utilize Cloud technology, making the platform agile and flexible enough to meet users’ expectations of instant responses and hyper-personalized coverage.
3. Build a secure API Layer – An investment with long-term value
Application Programming Interfaces, or APIs, need no introduction. A well-designed API layer isolates the backend services from the client applications and reduces the impact of changes in one component on the other. More importantly, it establishes standard data definitions, defines the data formats to be adopted, and enables the exchange of data between existing application silos.
In our context, though, APIs enable the current platform to access the insights derived from the data layer and augment the decision criteria configured in the application workflows. The result is improved risk assessment, customized policies, or add-on recommendations that make sense for the customer. While RPA allows data to be extracted from the current legacy applications, APIs enable the data analytics to be pushed back into the platform. However, the degree to which these insights can actually be utilized depends on the legacy Application architecture, and an App Modernization effort may be required to get the maximum results.
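As a sketch, a thin internal endpoint could expose a pre-computed insight back to the policy workflow. Flask is used here purely for brevity; the route, the in-memory store, and the response fields are all hypothetical.

```python
# A minimal sketch of an internal insights API; the endpoint, store,
# and fields are hypothetical, not a standard.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for insights pre-computed by the analytics layer.
RISK_SCORES = {"POL-1001": {"score": 0.82, "segment": "low-mileage"}}

@app.route("/v1/policies/<policy_id>/risk-score")
def risk_score(policy_id: str):
    insight = RISK_SCORES.get(policy_id)
    if insight is None:
        abort(404)  # no insight computed for this policy yet
    return jsonify({"policy_id": policy_id, **insight})

if __name__ == "__main__":
    app.run()
```

Versioning the route (/v1/...) from day one keeps the contract stable as the analytics layer evolves behind it.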
With the emerging trend of collaboration between Insurers, InsurTech companies, MGAs, Brokers, and non-Insurance entities as part of the InsurTech 2.0 ecosystem, this API investment is highly valuable. A secure API framework not only allows data to be shared between internal systems but also enables data to be exchanged with partners, regulators, and other 3rd parties as needed.
Honcho, an InsurTech company, offers a mobile app that uses APIs to connect with various insurance providers, exchange real-time data with its platform, and provide accurate, up-to-date insurance comparisons for customers.
4. Assess the value of the data collected – Eliminate clutter with Managed Data Services
Different data types have varying levels of relevance, quality, and value. Some data is more valuable than others – for example, a customer’s purchase history is more valuable than their social media activity when trying to understand their preferences and behavior. Additionally, the digital storage needed, the effort to cleanse the data, the compute capacity, and the time taken to analyze and derive insights all rise significantly as data volumes grow.
By carefully selecting and prioritizing the data to be analyzed, you can make the most of your big data analytics efforts. This is a step that many skip, resulting in insights that are either not actionable or lack impact. It is perhaps best handled by a Managed Data Services partner with a specific DataOps mandate to ensure quality and relevance.
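Here is a minimal sketch of this kind of attribute triage, assuming the collected data lands in a pandas DataFrame; the thresholds are illustrative starting points rather than recommendations.

```python
# A minimal sketch of filtering out attributes that add little value:
# drop columns that are mostly missing or effectively constant.
import pandas as pd

def select_attributes(df: pd.DataFrame,
                      max_missing: float = 0.4,
                      min_unique: int = 2) -> pd.DataFrame:
    keep = []
    for col in df.columns:
        missing = df[col].isna().mean()          # share of missing values
        distinct = df[col].nunique(dropna=True)  # constants carry no signal
        if missing <= max_missing and distinct >= min_unique:
            keep.append(col)
    return df[keep]
```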
Onboard devices in cars continuously capture more than 30 real-time data points. Bright Box, a connected car platform, analyzes data subsets such as driving behavior, speed, and braking to determine the risk profile of their customers and offer customized policies. Similarly, Slice Labs provides on-demand insurance for gig economy workers, analyzing data subsets such as earnings, location, and type of gig work to determine the customer risk profile.
5. Transparency is key to cultivating user trust and influencing positive behavior
Data Analytics insights are relevant and useful only when the underlying data is accurate and reliable. Customers willingly share personal data when they get tangible value from the service. In contrast, when businesses adopt opaque means to collect personal information and use lengthy terms of service as legal cover to share data with 3rd parties for marketing or unrelated reasons, customers limit access to their data, which renders it unreliable.
Insurance companies can earn customer trust by adopting transparent and ethical data practices with a combination of technology and associated business processes.
The technology elements include the UX on the website and mobile app, and details within the policy document, that inform customers of what data is being collected, why it’s being collected, and how it’s being used. Implement strong data security measures such as encryption, secure storage, and restricted access. Comply with regulations such as GDPR and HIPAA to ensure data is used in a responsible and ethical manner. When sharing data with 3rd parties, ensure that the information is anonymized or the user’s identity is hashed as needed.
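For instance, here is a minimal sketch of pseudonymizing an identifier before it leaves your systems, using Python's standard hmac module; the key handling is deliberately simplified, and in practice the key belongs in a secrets manager with rotation.

```python
# A minimal sketch of pseudonymizing customer identifiers with a keyed
# hash (HMAC-SHA256) so 3rd parties cannot reverse or brute-force them.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # e.g., loaded from a key vault

def pseudonymize(customer_id: str) -> str:
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()

# The shared record carries the keyed hash instead of the real identifier.
record = {"customer_ref": pseudonymize("CUST-42"), "claim_amount": 1250.00}
```

A keyed hash is preferable to a plain SHA-256 here because customer identifiers are low-entropy and trivially brute-forced without the secret.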
A culture of responsible data use, backed by internal processes aimed at ensuring data privacy and security, is essential. Ultimately, the weakest link in the system is the human factor, so sensitizing the people who handle customer data is important. Customer data must be shared with relevant 3rd parties only for insurance-related services rather than for “list monetization.” Customers should be made aware of who their data is being shared with and why it’s being shared, and should have control over it.
Hitch Insurance provides flexible, on-demand coverage for people on the go. They use cutting-edge security measures such as encryption and regular security audits to keep customer data safe, and they are transparent about how it is handled. Oscar Health is a technology-driven health insurance company that uses data to improve the customer experience and reduce costs. They give customers easy access to their health data and clear explanations of how it’s used.
Conclusion
Small and medium insurance companies face challenges in affording big data solutions due to limited budgets and resources. Adopting the ‘Wrapper’ strategy in combination with an incremental implementation approach, one that starts with the most critical data and progressively adds capabilities, allows them to access these solutions at a more affordable cost. Given the competitive advantage that Big Data & Fast Data analytics confer, using specialist vendors for Managed Data Services is a cost-effective alternative to building everything in-house.