The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Reference Data With Sarah Underwood: Getting to Grips with Data Management Development

Enterprise data management has become central to conversations at financial institutions about strategy setting, business development, profitability, risk management, regulatory compliance, technology innovation and more. Yet it remains difficult to itemise, and successful solutions are elusive. Some of the conversations concentrate on long-term data standardisation and centralisation, others on data management outsourcing, and many on tactical solutions to meet today's problems of complying with the slew of regulations that have emerged since the financial crisis.

Clearly, one size does not fit all, but there are common themes underlying data management that is fit for purpose and, equally importantly, promises business benefits and has the potential to be future-proof.

One of the most complex data management scenarios is in financial institutions operating international businesses. While cross-border trading and investment have become the norm, they bring with them the complexity of dealing with multiple regulatory regimes, market conventions and business definitions. Tackling these issues simultaneously is no mean feat and some firms continue to shy away from long-term data management projects, but there are pointers to success.

To help firms plan the best route to efficient, effective and beneficial data management, Wolters Kluwer Financial Services and A-Team Group have worked together to deliver a white paper on data management development, Data Management – a Finance, Risk and Regulatory Report. The white paper is free and you can download it here.

Essentially, the paper examines the challenges facing financial institutions operating across borders with respect to ensuring data consistency and usability across multiple types of user. It also offers guidance based on best practices for developing a data management solution.

The paper reviews the current landscape for these financial institutions, noting the need to satisfy several regulatory regimes, handle multiple data sources, types and formats in different timeframes, make access to the same underlying data available for different purposes, and create a data structure that can deliver reports to a range of internal recipients including the front office, risk, finance, treasury, credit and settlement functions. While the landscape looks tough to navigate, an integrated approach to data management can deliver cost advantages from standardised data, streamlined processes and improved risk profiles. Standardised datasets and common vocabularies across different business operations can be used to break down silos and integrate data as part of a data management strategy. Such a strategy will not only support the business model, but also set in place data quality assurance and a framework that will help to support future regulatory requirements as they come on stream.

The concept may be clear, but for many data management practitioners the pain points remain acute. Data architects creating a unified data management infrastructure across operations in different countries must often work with different internal systems. As well as the obvious issue of inconsistent data formats, these systems can lack common vocabularies and definitions, creating a significant obstacle to achieving a group level view or consolidated reports to meet regulatory requirements. Similarly, concepts like valuations may be defined differently across systems and businesses.
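To make the vocabulary problem concrete, the sketch below shows one common pattern: translating records from two internal systems, each with its own field names, into a single canonical schema so they can be compared at group level. The system names, field mappings and sample values are illustrative assumptions, not taken from the white paper.

```python
# Hypothetical sketch: normalising records from two internal systems that use
# different vocabularies into one canonical schema. All names are illustrative.

# Vocabulary maps: source field name -> canonical field name
SYSTEM_A_MAP = {"instr_id": "instrument_id", "mkt_val": "valuation", "ccy": "currency"}
SYSTEM_B_MAP = {"isin": "instrument_id", "fair_value": "valuation", "currency_code": "currency"}

def normalise(record: dict, vocab: dict) -> dict:
    """Translate a source record into the canonical vocabulary,
    dropping fields the group-level schema does not recognise."""
    return {canonical: record[src] for src, canonical in vocab.items() if src in record}

record_a = {"instr_id": "DE0001102580", "mkt_val": 1_000_000.0, "ccy": "EUR"}
record_b = {"isin": "DE0001102580", "fair_value": 990_000.0, "currency_code": "EUR"}

normalised_a = normalise(record_a, SYSTEM_A_MAP)
normalised_b = normalise(record_b, SYSTEM_B_MAP)

# Once both records share a schema, the definitional difference becomes
# visible: the same instrument is valued differently by the two systems.
print(normalised_a)
print(normalised_b)
```

The mapping step is mechanical; the hard work, as the paper notes, is agreeing what "valuation" means across businesses before the two figures can legitimately be reconciled.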

Even where group level integration is possible, the granularity of individual systems is often lost, diluting the ultimate value of a report. This problem seems to be a real struggle for many financial institutions that have succeeded in creating an aggregated data view for regulatory and risk purposes, but have found it difficult to give group level users the ability to drill down into underlying apps to retrieve detailed information they will increasingly need to answer regulators’ questions. The need to provide different views of the same underlying data to meet the disparate requirements of regulators, risk departments and finance departments is also a challenge.

As if these issues were not enough, time to market must be factored in and future regulatory requirements must be predicted on the basis of past experience. And as the white paper warns, in future it won’t be a firm’s risk manager who defines reporting around use of capital; instead, regulators will set restrictions on a firm’s use of capital.

These issues may seem insurmountable, but they can be addressed, often with the help of expert third-party vendors, and a successful data infrastructure can be built and deliver significant benefits. At the highest level, standardising the data management function across borders, regulatory regimes and business units goes to the heart of requirements for firms to simplify. If institutions active in several jurisdictions can create similarities between models, data sources, vocabularies and definitions, they can also yield huge benefits at the operational level, including economies of scale and the kind of cost savings that could be enough to tip the balance of an uncertain company board in favour of a major data management development.

Once the decision is made to standardise data across a global organisation, the real work of tackling the issues and delivering the benefits begins. For some firms a centralised approach comprising a global risk and regulatory data infrastructure based on a single provider or internal central services department makes sense, although it is still necessary to take into account local idiosyncrasies. For others, a more decentralised approach that matches the business model may be appropriate.

Diving deeper into standard data infrastructure development, the to-do list includes providing an integrated view of data as well as drill-down for a more granular view, making data meaningful to all potential recipients, implementing local regulatory requirements, supporting historical information, ensuring system security and building in the flexibility to answer future regulatory requirements.

These tasks are daunting and, like any development on this scale, there are no quick fixes, which is why Wolters Kluwer Financial Services and A-Team Group got together to write a white paper detailing not only the challenges and opportunities of standardising data, but also offering guidance on how to build a standardised data infrastructure.

If you have already embarked on an enterprise-wide data management programme or are about to start one, good luck! And let us know how it goes.
