Improving data capture through collaboration
Adrian Rands, Chairman, Quantemplate
When examining differences between the insurance technology arena and wider
financial services, you need only look at their origins to begin to understand why the
two are fundamentally different beasts.
In the late 1970s and early 1980s, financial services companies such as banks and hedge
funds started investing in teams of internally hired programmers to create and manage
vast databases. With the information accumulated, they applied innovative algorithms
that could analyse data from the capital markets and give their employers an edge over
the competition. Over time, many of these teams split away from the banks, either as
founders of their own companies or as subsidiaries servicing the capital markets.
The nature of these providers' origins led to collaboration and internal investment
between the technology and financial services industries. This cemented their place as
an important fixture in the financial services landscape and an inherent part of the
culture.
In contrast, catastrophe models, arguably the first complex fintech systems endorsed
by the insurance markets, originated from academia, not the industry itself. Similarly,
businesses like Xchanging and many of the London market software companies have
been service providers to the insurance industry from the point of origin, rather than
spin-offs from internal teams or subsidiaries.
A culture of collaboration between fintech firms and the companies they serve is critical
to the successful development and integration of standardised, reliable data and there
is no better way to guarantee a vested interest than investment. Bloomberg was
financially backed by Merrill Lynch, which fostered an environment for iterative product
evolution between the two parties for mutual benefit. Similarly, Markit, a younger
financial services information company, was born of demand from the banking sector,
which wanted more in-depth analysis of the less vanilla segments of the capital markets.
Collaboration
A consortium of the leading investment banks provided Markit's start-up capital and
additionally contributed their proprietary data on over-the-counter derivatives and
pockets of more unusual asset classes that Bloomberg and Reuters were not addressing. The
closest the insurance world has come to this type of collaboration is with Lloyd's- and
London Market Group-driven technology initiatives such as Kinnect or Placing Platform
Limited.
The fundamental difference between technology ventures such as Kinnect and Markit is
that the insurance initiatives have not been motivated by direct returns on investment, but
rather funded through a levy on members of the market for the collective benefit.
Although this altruistic approach may be desirable to many, history has favoured
businesses with a focus on profitable performance, particularly in the financial markets.
This further demonstrates how the insurance market is far less developed than the wider
financial services arena.
However, while a culture of collaboration is important, so too is the central management
of data, something the insurance markets have yet to develop. To process data, you need
an algorithm: simply a set of rules incorporating logical conditions, so that decisions
about how the data is handled are made in a structured way. The algorithm and the data
are the two core elements of well-managed, central data, and each relies on the other if
software is to bring value to an end user. Without the central management of data, there
is no standard of data in the market, and without a history of storing consistent,
uniform data, it has been very challenging for technology providers to deliver
meaningful, useful services to insurers.
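The interplay between rules and data described above can be sketched in a few lines: an "algorithm" here is nothing more than a mapping that coerces differently labelled records into one canonical schema. All field names, labels and values below are hypothetical, purely for illustration.

```python
# Minimal sketch: a rule set (the "algorithm") applied to heterogeneous
# records to produce standardised, centrally manageable data.
# All field names and mappings are hypothetical examples.

# Two insurers report the same facts under different labels and formats.
raw_records = [
    {"insurer": "A", "sum_insured_gbp": "1,200,000", "loc": "UK"},
    {"insurer": "B", "SumInsured": 950000, "country": "United Kingdom"},
]

# The rules: map each source layout onto one canonical schema.
FIELD_MAP = {
    "sum_insured_gbp": "sum_insured",
    "SumInsured": "sum_insured",
    "loc": "country",
    "country": "country",
}

COUNTRY_NORMALISE = {"UK": "United Kingdom"}

def standardise(record):
    """Apply the mapping rules to one raw record."""
    out = {"insurer": record["insurer"]}
    for key, value in record.items():
        target = FIELD_MAP.get(key)
        if target == "sum_insured":
            # Coerce "1,200,000" and 950000 to a common numeric type.
            out[target] = float(str(value).replace(",", ""))
        elif target == "country":
            out[target] = COUNTRY_NORMALISE.get(value, value)
    return out

standard = [standardise(r) for r in raw_records]
print(standard)
```

The value sits in neither half alone: the rule set is worthless without the data, and the data cannot be aggregated or compared without the rules, which is the mutual dependence the paragraph above describes.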
Fraught
The setting of data standards has been fraught in the insurance industry for a number of
reasons. When you examine data in the capital markets, say foreign exchange or
commodities trading, the variables include bid price, buy price, volume, buyer, seller,
instrument and perhaps a couple of other factors. The data is very simple, consistent and
fully homogenised. Similarly, with equities, if you look at the data that is reported for
financial statements of underlying companies, you have close to 400 variables that make
up a financial statement, seven or eight that make up the market price and another 100
or 200 qualitative factors such as directors, industry and so on for a software product to
process. This is still very simple thanks to the historic, standardised data captured by
the financial services industry, which has led to each factor being very well defined.
[Figure: data standards across industries — capital markets, equities, insurance]
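The contrast drawn above can be made concrete. A capital-markets record has a small, fixed set of well-defined variables, so a schema can be declared once and reused; an insurance submission has no such fixed shape. The field names and values below are hypothetical illustrations, not any real industry schema.

```python
# Illustration only: why capital-markets data standardises easily
# while insurance data resists a fixed schema.
from dataclasses import dataclass

@dataclass
class FxTick:
    # Every participant reports the same handful of variables,
    # so the schema is fixed once and shared market-wide.
    instrument: str
    bid: float
    offer: float
    volume: int
    buyer: str
    seller: str

tick = FxTick("GBP/USD", 1.2701, 1.2703, 5_000_000, "Bank A", "Bank B")

# An insurance submission carries whatever is "material" to that risk,
# so two records may share almost no fields at all.
submission_1 = {"assured": "Acme Ltd", "occupancy": "warehouse", "sprinklered": True}
submission_2 = {"assured": "Beta Co", "fleet_size": 42, "territory": "EU"}

shared_keys = set(submission_1) & set(submission_2)
print(shared_keys)
```

Only the assured's name is common to the two submissions, which is the standardisation problem in miniature: there is no stable set of columns to agree on in the first place.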
Insurance is a different animal and the issues go right back to the legal principles of the
transaction. Capital market investments operate under caveat emptor contracts, which
put the onus of discovery on the buyer and the questions that the capital markets ask
about an investment have been standardised over time. The insurance industry's use of
uberrima fides (utmost good faith) agreements makes the disclosure of all material
information by a broker or assured mandatory. The materiality of information is subjective and creates
inherently complex and unique data requirements that cannot be standardised. This is
why, from a data perspective, the insurance industry is where the capital markets were
in the 1970s (that is, without standards).
Intermediaries and brokers are now getting heavily involved in technology, both by
developing it themselves and acquiring businesses that have technological capabilities.
Many assume a key driver in Aon's acquisition of Benfield was to get hold of ReMetrica.
Equally, Willis's purchase of Towers Watson will give it Igloo, which probably drove
Towers Watson's original purchase of EMB, Igloo's creator, in the first place. The creation
of Aon Grip and ReMetrica demonstrates the increased interest in technology on the sell
side of insurance to enhance services delivered to clients.
The underlying challenges and requirements of the capital markets and of insurance
are so different that there is little crossover. Many tech companies that have
dominated the banking sector have later attempted to move into insurance, but most
have failed because they have come from a world where there's very good, clean,
instantly available data, to one where it is patchy, hidden and often does not exist at all.
In turn, the lack of technology buy-in among insurance professionals, whether due to
poor-quality software, expense or complexity, has left the once enthusiastic view of
insurance tech slightly dampened.
The right technology is available to the insurance industry; it has simply not hit the
mainstream yet. With ever-increasing levels of data maturity within the market, when
the tech firms that are offering genuine, innovative, insurance-specific solutions come
to the fore, cracking the challenges of insurance technology will be a victory worth
shouting about.