Insurance data modernisation: where are we?
Question: is the insurance industry on the cusp of a transformational journey? If it is not, it should be.
Imagine an insurance landscape where claims are processed at the speed of a click, policies are intelligently tailored to customer needs, and AI-assisted underwriting is spot on. Twenty years ago this would have been a product of our imagination, but today it is pretty much where we are. And where it is not, the technology to streamline operations to that level is often already available.
What tends to be problematic here is twofold: integrating automation into the day-to-day business, and having the right fuel to run these processes, which is data. In this article, I would like to focus on the second factor and why its modernisation matters.
What is data modernisation?
This term refers to the broad idea of improving existing data platforms, systems and pipelines, and the way they are integrated. Simple enough. The aim is to:
1) take a look at the current state of data-related processes and systems,
2) identify existing problems or opportunities for improvement,
3) assess which issues are most problematic or would add the most value,
4) take the initiative to introduce the necessary changes, solve the problems or reduce the risk of encountering them in the future.
It can involve a transition from a legacy data warehouse to an upgraded, tidy and scalable data platform. It could be a migration from on-premises hosting to the cloud, or vice versa. It could be a reorganisation of BI reports and their datasets, or an upgrade from a 20-year-old tech stack to the latest one, better suited to current challenges. It all depends on the needs and circumstances.
Data modernisation by default is not about letting go of previous solutions or legacy systems for the sake of it. It is about bringing a fresh wave of scrutiny to an organisation’s existing IT landscape and driving improvements that would be valuable from a business perspective.
The crucial importance of data modernisation for the insurance industry
Well, in today’s dynamic and digitised world, data modernisation is a key strategy for the insurance industry. After all, data today is what oil was at the end of the 19th century. Directly or indirectly, it drives the business and determines the direction of its development. One inescapable characteristic of data is that what goes in comes out. In other words, the quality of the output data cannot exceed the quality of the input data.
There are many factors that contribute to the final quality of the data (e.g. the quality of the input data, formats and encodings, freshness), and all of them need to be taken into account if the data is to be of high quality. So, given the importance of data in our information age, if you are going to use it, it had better be good.
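As a down-to-earth illustration, here is a minimal sketch of two such checks – format and freshness – written in Python with pandas. The claims extract, column names and 30-day threshold are invented for this example and are not taken from any real system.

import pandas as pd
from datetime import datetime, timedelta, timezone

# Hypothetical claims extract; the data and column names are purely illustrative.
claims = pd.DataFrame({
    "claim_id": ["C-1", "C-2"],
    "reported_at": ["2024-05-01T10:00:00+00:00", "not-a-date"],
})

# Format check: can every timestamp actually be parsed?
parsed = pd.to_datetime(claims["reported_at"], errors="coerce", utc=True)
badly_formatted = claims.loc[parsed.isna(), "claim_id"].tolist()

# Freshness check: is the newest record recent enough to trust downstream?
is_fresh = parsed.max() >= datetime.now(timezone.utc) - timedelta(days=30)

print(badly_formatted, is_fresh)

Real pipelines would run dozens of such checks automatically, but the principle is the same: measure the quality of what goes in before relying on what comes out.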
Meanwhile, markets are not slowing down. On the contrary, they are becoming more diverse, fragmented and volatile. Data modernisation is no longer a luxury but a necessity for insurers to remain competitive. Insurance is one of the most data-intensive industries, and there are many reasons why data modernisation would be of value there. Here are three of them:
1. Regulatory compliance
Regulatory compliance is a critical issue, particularly in the insurance industry. Companies must comply with a web of local, national and international regulations and reporting requirements. For insurers, collecting and storing sensitive or confidential data, submitting data-intensive regulatory reports on a regular basis, and continuously monitoring and adapting to evolving regulatory changes while maintaining robust data security measures is part of business as usual. All of this while complying with the requirements of GDPR, Solvency II, IFRS, AML or other mandatory directives to come, such as FIDA or DORA. Adhering to these procedures takes human and financial resources, and is costly from a business perspective. Streamlining, automating and optimising them is therefore a long-term investment and a step towards cost reduction.
2. Operational efficiency
This is undoubtedly a key driver of profitability in the insurance industry. And, as in other sectors, insurance can struggle with typical IT/data inefficiencies such as shadow IT – where the company’s IT does not adapt to the needs of the business and the business has to compensate by building its own IT processes. Others include the lack of a single source of truth, single points of failure in architectures, no versioning or organisation of data transformations, and sub-optimal processes in the data area or simply in data processing. All of these issues make IT, which is the backbone of the business and the engine of its efficiency, less resilient and less adaptable to the business needs of the organisation.
3. Competitive advantage
The ultimate goal of optimising a company’s internal processes is to gain a competitive advantage. The same principle applies to data. What makes data different is its paramount importance in today’s business landscape. It has become a critical asset and one of the primary areas where investment and optimisation efforts yield the greatest returns. In the modern workplace, everyone consumes and generates vast amounts of data as part of their daily work and responsibilities. The ingestion of data fuels decision-making, while the output represents the fruits of labour. As a result, the ability to manage and harness the full potential of available data has become a key business differentiator and a competitive advantage.
How to do it
There is no one-size-fits-all approach to data modernisation. Each organisation requires a customised approach that takes into account its circumstances, needs, goals and, of course, budget. However, the majority of improvements are likely to fall into one of these categories:
1. Tech debt
Tech debt can be costly to eliminate, but it is even more costly to allow it to persist and grow. It is cheaper to invest in overhauling existing legacy systems sooner rather than later. During the overhaul and rebuild, new systems can be built or old systems redesigned using best software development practices (e.g. the Twelve-Factor App methodology). This may increase initial development costs and workload, but in the end it will reduce maintenance costs, reduce burnout from working with old, legacy software, and allow systems to last longer and keep adapting to ever-changing business needs.
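To give one concrete flavour of such practices, the sketch below shows a single Twelve-Factor principle – storing configuration in the environment rather than in the code base – in Python. The variable names and defaults are hypothetical and only illustrate the pattern.

import os

# Twelve-Factor style: configuration comes from the environment, not from the code.
# The variable names and defaults below are invented for illustration.
DB_HOST = os.environ.get("CLAIMS_DB_HOST", "localhost")
DB_PORT = int(os.environ.get("CLAIMS_DB_PORT", "5432"))
DB_NAME = os.environ.get("CLAIMS_DB_NAME", "claims")

def connection_string() -> str:
    # The same code runs unchanged in development, test and production;
    # only the environment around it differs.
    return f"postgresql://{DB_HOST}:{DB_PORT}/{DB_NAME}"

print(connection_string())

Small habits like this are what make a rebuilt system cheaper to operate and easier to move between environments later on.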
2. Data integration
In most cases, enterprise data is spread across multiple systems. Each system supports different connectivity, data structures, formats and standards. All of this makes it difficult to integrate data from different systems. The key challenge in data engineering is to enable integration between them as seamlessly as possible, in a way that does not increase complexity or compromise maintainability. The way out is to use tools and technologies that allow different data systems to integrate and interact directly. There are several things to consider when choosing them; a simple sketch of the idea follows below.
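As a minimal sketch of the idea, the Python snippet below joins policy data from a relational source with a claims extract arriving as a flat file. The in-memory SQLite database and CSV string merely stand in for real source systems, and all names and figures are invented.

import io
import sqlite3
import pandas as pd

# Stand-in for a policy administration database (here: in-memory SQLite).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id TEXT, product TEXT, premium REAL)")
conn.execute("INSERT INTO policies VALUES ('P-001', 'motor', 420.0), ('P-002', 'home', 310.0)")

# Stand-in for a claims export delivered as a flat file by another system.
claims_csv = io.StringIO("policy_id,claim_amount\nP-001,1500.0\nP-002,0.0\n")

policies = pd.read_sql("SELECT * FROM policies", conn)
claims = pd.read_csv(claims_csv)

# Integrate the two sources on a shared key and derive a simple per-policy loss ratio.
combined = policies.merge(claims, on="policy_id", how="left")
combined["loss_ratio"] = combined["claim_amount"] / combined["premium"]
print(combined)

In a real landscape the same pattern would usually sit behind an orchestration or ELT tool rather than a script, but the principle – a shared key and a common place to combine the data – stays the same.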
An additional possibility is to strive for an API-first strategy. The idea emphasises that good software should provide a robust and convenient integration interface. An API-first approach is farsighted when developing a new piece of software, but it is just as crucial when selecting existing off-the-shelf software for adoption.
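A minimal sketch of what API-first can look like, assuming Python with FastAPI; the endpoint, data model and in-memory store are invented for illustration, and a real service would sit in front of an actual policy system.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Policy API")

class Policy(BaseModel):
    policy_id: str
    product: str
    premium: float

# Illustrative in-memory store standing in for a real policy administration system.
POLICIES = {"P-001": Policy(policy_id="P-001", product="motor", premium=420.0)}

@app.get("/policies/{policy_id}", response_model=Policy)
def get_policy(policy_id: str) -> Policy:
    # The contract (path, parameters, response schema) is designed first,
    # so other systems can integrate against it from day one.
    policy = POLICIES.get(policy_id)
    if policy is None:
        raise HTTPException(status_code=404, detail="Policy not found")
    return policy

The point is not the framework: an OpenAPI specification written before any code would express the same API-first intent.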
3. Data governance
Another way to apply data modernisation is through data governance. Data governance may seem like overkill or redundant bureaucracy, but what it does is provide a backbone for any data policy within the organisation, whether it is related to data retention, compliance, access controls, security or simply quality and management. Putting data governance into practice is challenging because the change is not just technical or regulatory. It is a cultural one, and those often take time. Adopting better or smarter technology alone will not help. Technology (tools), decision-making (policies) and ownership (accountability) around data all need to come together to make it work. The ultimate goal of data governance is to establish a framework of responsibility for each data system, process and dataset within the organisation. The result? Data stops being an unused liability and becomes a valuable and well-managed asset.
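One lightweight way to make that accountability explicit is to describe each dataset declaratively – owner, classification, retention – and check the catalogue automatically. The sketch below uses plain Python; the datasets and fields are invented, and in practice this information would live in a proper data catalogue tool.

from dataclasses import dataclass

@dataclass
class DatasetPolicy:
    name: str
    owner: str             # accountable team or person
    classification: str    # e.g. "public", "internal", "confidential"
    retention_years: int   # how long the data may be kept

# Illustrative catalogue entries, not taken from any real organisation.
CATALOGUE = [
    DatasetPolicy("claims_history", "claims-data-team", "confidential", 10),
    DatasetPolicy("marketing_leads", "", "internal", 2),
]

def unowned(catalogue: list[DatasetPolicy]) -> list[str]:
    # Governance in miniature: every dataset must have an accountable owner.
    return [d.name for d in catalogue if not d.owner]

print(unowned(CATALOGUE))   # -> ['marketing_leads']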
Not easy, but worth it
Data modernisation is difficult, and that is part of what makes it valuable. While there are challenges in pursuing it, the tangible benefits are there. The process is complex because insurers have to deal with many issues: budgeting, the integration of legacy systems and the need for skilled data professionals, among others. A phased approach is the way to go. Starting with pilots and scaling up gradually introduces change gently, allowing time and space to mitigate risks or adjust direction.
As the insurance industry continues to evolve and digitise, the importance of modern, orderly and secure data platforms cannot be overstated. To paraphrase the famous Chinese proverb about planting a tree: if the best time to modernise your data was years ago, the second-best time is today – and with it comes a resilient foundation for the future.
Adam Byczyński – data analyst at Sollers Consulting
Adam is involved in the development of Data Competency at Sollers. He has experience in data ranging from analysis and modelling to the engineering and design of data platforms. With a focus on leveraging data and data systems, he strives to improve existing business intelligence solutions and drive insights and innovation.