Friday 4 December 2009

Solvency II: Data – The Hidden Culprit?

In my previous blog, I commented on the inadequate preparation for the implementation of Solvency II, and on the challenges that a company’s information systems can pose to the implementation. Perhaps the most serious challenge to implementation will come from data. It has happened before – there are plenty of case studies on the Net about how inadequate and inconsistent data has led to project failures and inordinate delays. More recently, banks faced severe challenges during the implementation of Basel II because they underestimated their data issues.


 
Insurance companies hold large volumes of data. As some of the earliest adopters of IT, they have accumulated large quantities of data across their IT systems. However, the sheer volume of data has often proved to be a bane. Because their IT estates have grown tactically, the data residing in those systems is inconsistent, duplicated and orphaned. Traditionally, most reporting has been standard, generated out of a tactical process. Data users within insurance companies have learnt to receive data ‘piecemeal’ and to apply home-grown techniques for cleansing and de-duplication before putting it to use. However, this approach will not see them through the new regulation.

 
Insurance companies have made efforts in the past to rid their business of this problem. However, in the absence of a strategic approach, each new effort has led to the creation of yet another ‘definitive’ source of data – at the enterprise level, simply one more source of duplicate data.

 

Solvency II: The focus on data management

 
The Supervisory Review Process under Pillar 2 regulates how the ORSA (Own Risk and Solvency Assessment) is carried out. To do so, it places increased scrutiny on:

  • Controls around the gathering and storage of risk data
  • Evidence that risk assessments are based on accurate, complete and appropriate historical data
  • Evidence of consistency and accuracy between data used for capital calculation and that being used for reporting across the organisation.

During the Basel II implementation, most banks failed to demonstrate data governance from the creation of data, through its journey across various functional repositories, to the point at which it became available for analysis. Banks failed to estimate the scale of their data quality problems and the effort required to fix them. Consequently, many had to adopt tactical fixes that are still being remediated today. Another result of this underestimation was that banks had to hold more capital to compensate for the increased uncertainty arising from unresolved data issues.

 

To avoid this pitfall, insurance companies should adopt a data management framework comprising three key components – data architecture, data governance and data quality. By tackling data on these three fronts, insurance companies will be able to get their data into the shape needed for compliance with Solvency II, as well as to harness the capital efficiencies the regulation offers.

 

 
Solvency II – Data Architecture

 
Under Solvency II, there will be an increased demand for historical data to support capital calculation. A ‘piecemeal’ approach to data sourcing will not be a viable option, given the volume of data required and the increased scrutiny by the regulator on how data is sourced. Based on the experience of Basel II, a robust, well-documented data architecture will be a key enabler of compliance.

 
I have highlighted how insurance companies have unsuccessfully attempted to create an enterprise data source. With the benefit of this experience, a balanced approach to data architecture is perhaps the workable option. Companies should centralise the acquisition, preparation and storage of historical data, but decentralise the development of data stores for analysis and reporting. Centralising the management and sourcing of all reference data means that data is validated within the architecture and any anomalies can be remedied as close to the source as possible. A master source of data will avoid the inconsistencies in reporting that arise when data is sourced from functional data stores maintained by user departments.

 

  
Solvency II – Data Quality Management

 
The effectiveness of an insurance company’s internal models cannot be guaranteed unless the quality and accuracy of the input data is assured. Insurance companies traditionally suffer from serious data quality issues – incomplete validation, out-of-date reference data, missing fields, orphan records, inaccurate mapping rules and tables, and so on. Such issues can lead to delays in updating the models, a requirement to hold provisions for uncertainty, incorrect calculations and wasted effort. The greatest challenge in data quality management is that it cannot be handled at an individual department’s level. Data is a corporate asset and needs to be treated accordingly.
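The kinds of checks involved can be made concrete with a short sketch. The record layout, field names and rules below are hypothetical, purely for illustration; a real implementation would run against the company's own policy and claims repositories.

```python
# Illustrative data-quality checks: missing fields, orphan records, duplicates.
# All field names and sample records are hypothetical.

def find_missing_fields(records, required):
    """Return (index, field) pairs where a required field is absent or empty."""
    issues = []
    for i, rec in enumerate(records):
        for field in required:
            if not rec.get(field):
                issues.append((i, field))
    return issues

def find_orphans(claims, policies):
    """Return claims whose policy_id has no matching policy record."""
    known = {p["policy_id"] for p in policies}
    return [c for c in claims if c["policy_id"] not in known]

def find_duplicates(records, key):
    """Return records whose key value repeats an earlier record's."""
    seen, dupes = set(), []
    for rec in records:
        if rec[key] in seen:
            dupes.append(rec)
        seen.add(rec[key])
    return dupes

policies = [
    {"policy_id": "P1", "holder": "A", "premium": 100},
    {"policy_id": "P2", "holder": "", "premium": 200},   # missing holder
    {"policy_id": "P2", "holder": "B", "premium": 200},  # duplicate id
]
claims = [
    {"claim_id": "C1", "policy_id": "P1"},
    {"claim_id": "C2", "policy_id": "P9"},  # orphan: no such policy
]

print(find_missing_fields(policies, ["policy_id", "holder"]))       # [(1, 'holder')]
print([c["claim_id"] for c in find_orphans(claims, policies)])      # ['C2']
print([p["policy_id"] for p in find_duplicates(policies, "policy_id")])  # ['P2']
```

The point of the sketch is that each class of defect named above is mechanically detectable; the hard part, as the following paragraph notes, is running such checks consistently across departments rather than within one.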

 

The experience of Basel II suggests that data quality management can take considerable effort. Given the state of data in insurance companies, it is wise to start this exercise immediately.

  

 
Solvency II – Data Governance

 
The Pillar 2 requirements subject the data sourcing process to heavy scrutiny. In the new regime it will not be sufficient to demonstrate that those involved in capital calculation have a good understanding of the data; companies will have to demonstrate a similar level of rigour across the organisation. With this in mind, enforcing good data governance across the data management and analysis processes will ensure that the data platforms stand up to regulatory scrutiny and pave the way for wider data-based decision making.

 

Data governance must also extend to analytical processes. Traditionally, most risk calculations have been performed in macro-enabled spreadsheets residing on analysts’ PCs. These calculation tools need to be ‘stress-tested’ for robustness. The calculation methodologies, too, need to be stress-tested and documented for scrutiny by the regulators.
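As a sketch of what this might look like once a calculation is moved out of a spreadsheet: the margin formula below is hypothetical and deliberately simplified, not an actual Solvency II calculation. The point is that explicit boundary and error checks can be kept alongside the logic and shown to a reviewer, whereas a macro's behaviour on bad input is implicit.

```python
import math

# A deliberately simplified, hypothetical margin calculation, moved out of a
# spreadsheet macro into a testable function. Not a real Solvency II formula.

def required_margin(premiums, claims, factor=0.18):
    """Margin as a factor of the larger of written premiums or incurred claims.

    Unlike a spreadsheet cell, invalid input fails loudly here instead of
    silently producing a plausible-looking number.
    """
    if premiums < 0 or claims < 0:
        raise ValueError("premiums and claims must be non-negative")
    return factor * max(premiums, claims)

# 'Stress tests': boundary and error cases that can be shown to a reviewer.
assert math.isclose(required_margin(1000, 800), 180.0)  # claims below premiums
assert required_margin(0, 0) == 0.0                     # degenerate portfolio
try:
    required_margin(-1, 0)
    raise AssertionError("negative input should have been rejected")
except ValueError:
    pass  # rejected as expected
```

Versioning such a file and running its checks on every change is a modest step, but it gives the documented, repeatable evidence trail that a spreadsheet on an analyst's PC cannot.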

  
Unlike previous rules-based regulations, Solvency II seeks to bring about a cultural change in insurance companies: they need to embed a risk-based approach in day-to-day decision making. Compliance with Solvency II is not just a ‘tick-in-the-box’ exercise; its business implications extend far beyond that. For those who get it right, there are larger rewards in store – a company-wide data architecture, stronger governance and lower operating costs, to name a few. A key success criterion for harnessing the benefits is to look beyond the obvious, understand the full implications of the regulation across the company and make early advances towards compliance.
