Don’t Forget the Internal Data Controls When You Outsource

By Andrew Sexton, Director of Curium Data Systems

Asset managers and other financial services companies outsource many business functions across the enterprise to service providers, which enables them to focus on their core competencies such as product development and investment performance. The data management component supporting those functions is often included in the outsourcing arrangement. But while a firm may choose to outsource the function, it cannot outsource the risk or liability for errors caused by poor quality data – that responsibility remains firmly within the remit of the asset management firm itself.

Of course, poor quality data will negatively impact the firm’s ability to conduct its business, but it can also lead to repercussions from regulators. So how can asset management firms manage data quality across services that have been outsourced, and how can they demonstrate to their internal management, their clients and the regulator that they have robust data quality controls?

What do the regulations say?

Increasingly stringent regulations require asset managers that choose to outsource some of their operational functions to maintain internal control and the ability to monitor compliance with all of their obligations. It’s the old adage: you can outsource the function, but you can’t outsource your responsibilities.

To take a couple of examples from regulations that many asset managers will be familiar with:

MiFID Article 13(5) – An investment firm shall ensure, when relying on a third party for the performance of operational functions which are critical for the provision of continuous and satisfactory service to clients and the performance of investment activities on a continuous and satisfactory basis, that it takes reasonable steps to avoid undue additional operational risk. Outsourcing of important operational functions may not be undertaken in such a way as to impair materially the quality of its internal control and the ability of the supervisor to monitor the firm’s compliance with all obligations.

Likewise, the Solvency II Directive makes it clear under Article 49 that outsourcing does not discharge a firm from any of its obligations under the directive and that outsourced activities must not unduly increase operational risk.

Article 82 states that the firm shall have internal processes and procedures in place to ensure the appropriateness, completeness and accuracy of the data used (in the case of Solvency II, the data used in the calculation of its technical provisions).

Internal Reasons for Data Controls

But it’s not just for regulatory reasons that the firm needs to consider its data controls within the outsourcing model. There are many internal reasons too. Risk management in all its forms depends on the provision of accurate and complete data, and clients will take a dim view of errors caused by bad data if their portfolios are affected or the errors make their way into client reports or similar communications.

The typical outsourcing model will ensure that the service provider is on the hook for its service levels around the performance of the function, and the provider will naturally pay attention to data insofar as it affects its ability to provide that service.

But can the service provider apply the level of rigour across all aspects of data quality that is necessary for the client to meet its legal, regulatory and other commitments? The provider will maintain a certain level of sign-off around the key data sets needed to fulfil its function, but practical experience suggests that providers do not necessarily look for the same level of data quality, or across the same breadth of data attributes, as the asset manager needs to.

To take a simple example: an incorrect sector classification on a particular security probably doesn’t stop the provider from processing the transaction, but left uncorrected it may affect asset allocation decisions, breach client mandates or skew all kinds of downstream processes such as performance, risk and client reporting.
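To make this concrete, a control for the scenario above can be expressed as a straightforward comparison between the provider’s feed and the firm’s own reference data. The sketch below is purely illustrative: the file names, the ‘isin’ and ‘sector’ column names and the use of CSV files are assumptions made for the example, not a description of any particular provider’s output or of Curium’s products.

import pandas as pd

def sector_mismatches(provider_file: str, golden_source_file: str) -> pd.DataFrame:
    # Hypothetical check: return securities whose sector classification differs
    # between the outsourced provider's feed and the firm's internal golden source.
    provider = pd.read_csv(provider_file)      # e.g. the daily security master received from the provider
    golden = pd.read_csv(golden_source_file)   # the firm's own reference data
    merged = provider.merge(golden, on="isin", suffixes=("_provider", "_internal"))
    return merged.loc[
        merged["sector_provider"] != merged["sector_internal"],
        ["isin", "sector_provider", "sector_internal"],
    ]

issues = sector_mismatches("provider_security_master.csv", "internal_reference.csv")
if not issues.empty:
    print(f"{len(issues)} sector classification breaks need review")

A break flagged by a check like this would not block the provider’s processing, which is exactly why the asset manager needs to run it itself.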

Without a good oversight and data control process in place, the asset manager has no way to catch or manage data errors which, as we acknowledge, could have a material impact on the manager’s transactions, decision making and reporting.

Three Key Elements

So, what level of control do you need in place so as not to increase operational risk and to satisfy the regulator? There are three key elements that form the backbone of a successful and maintainable oversight process; a short illustrative sketch of how they might fit together follows the list.

1) Access to the data: be able to gather the data received from the service provider regardless of the formats and technologies employed, and to easily manipulate/transform that data into a form that is accessible to your data control process. Fixed format files and static reports are never enough.

2) Make sure that your own business operations teams have the ability and tools to set up, modify and own the control and oversight process around the data sets – this is a BAU process and therefore must sit as close to the business functions as possible.

3) Ensure that the data control process incorporates satisfactory levels of workflow over data issues (i.e. guaranteed ownership, prioritisation and resolution) and an audit trail that can give any outside party (regulator, client etc.) demonstrable evidence that the firm has oversight of its service providers.
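To show how the three elements hang together, here is a minimal sketch in Python. Everything in it is hypothetical – the file formats, the rule names, the column names and the ‘data-operations’ owner are assumptions made purely for illustration, not a description of Curium or any other product.

import csv
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

# 1) Access to the data: normalise whatever the provider sends into plain dicts,
#    regardless of whether it arrives as JSON or a delimited file.
def load_records(path: str) -> list[dict]:
    if path.endswith(".json"):
        with open(path) as f:
            return json.load(f)
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# 2) Business-owned controls: rules kept as simple, named checks that the
#    operations team can add to or amend without touching the integration layer.
RULES = {
    "sector_present": lambda rec: bool(rec.get("sector")),
    "price_positive": lambda rec: rec.get("price") not in (None, "") and float(rec["price"]) > 0,
}

# 3) Workflow and audit: every breach becomes an owned, time-stamped exception
#    that can later be evidenced to a regulator or client.
@dataclass
class ControlBreach:
    rule: str
    record_id: str
    owner: str = "data-operations"
    status: str = "open"
    raised_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def run_controls(records: list[dict]) -> list[ControlBreach]:
    breaches = []
    for rec in records:
        for name, check in RULES.items():
            if not check(rec):
                breaches.append(ControlBreach(rule=name, record_id=rec.get("isin", "unknown")))
    return breaches

The structure reflects the three elements: the loader hides the provider’s delivery format, the rules live as plain named checks the business operations team can own and extend, and each breach carries an owner, a status and a timestamp so the resolution trail can be demonstrated to an outside party.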

A good oversight process is not something that can be cobbled together. It needs thought, resource and a fair smattering of good technology to make it work. Outsourcing may be the right option for many firms, but it will need an investment in oversight and controls that is often overlooked.

Curium v7 introduces new ETL features that make the data gathering component of your data control process even easier to implement, including a range of options for gathering data that is managed outside the firm’s data architecture.

