
Don’t Forget the Internal Data Controls When You Outsource


By Andrew Sexton, Director of Curium Data Systems

Asset managers and other financial services companies outsource many business functions across the enterprise to service providers, which enables them to focus on their core competencies such as product development and investment performance. The data management component supporting those functions is often included in the outsourcing arrangement. But while the firm may choose to outsource the function, it cannot outsource the risk or liability of errors arising from poor quality data – that responsibility remains firmly within the remit of the asset management firm itself.

Of course, poor quality data will negatively impact the firm’s ability to conduct its business, but it can also lead to repercussions from regulators. So how can asset management firms manage data quality across services that have been outsourced, and how can they demonstrate that they have robust data quality controls to their internal management, their clients and the regulator?

What do the regulations say?

Increasingly stringent regulations specify that asset managers choosing to outsource some of their operational functions must maintain internal control and the ability to monitor compliance with all of their obligations. It’s the old adage: you can outsource the function, but you can’t outsource your responsibilities.

To take a couple of examples from regulations that many asset managers will be familiar with:

MiFID Article 13(5) – An investment firm shall ensure, when relying on a third party for the performance of operational functions which are critical for the provision of continuous and satisfactory service to clients and the performance of investment activities on a continuous and satisfactory basis, that it takes reasonable steps to avoid undue additional operational risk. Outsourcing of important operational functions may not be undertaken in such a way as to impair materially the quality of its internal control and the ability of the supervisor to monitor the firm’s compliance with all obligations.

Likewise, Solvency II makes it clear under Article 49 that outsourcing does not discharge a firm from any of its obligations under the directive and that outsourced activities must not unduly increase operational risk.

Article 82 states that the firm shall have internal processes and procedures in place to ensure the appropriateness, completeness and accuracy of the data used (in the case of Solvency II, relating to the calculation of its technical provisions).

Internal Reasons for Data Controls

But it’s not just for regulatory reasons that the firm needs to consider its data controls within the outsourcing model. There are many internal reasons too. Risk management in all its forms depends on the provision of accurate and complete data, and clients will take a dim view of errors caused by bad data if those errors affect their portfolios or find their way into client reports and similar communications.

The normal outsourcing model will ensure that the service provider is on the hook for its service levels around the performance of the function, and the provider will naturally pay attention to data insofar as it affects its ability to provide that service.

But can the service provider apply the level of rigour, across all the aspects of data quality, that is necessary for the client to meet its legal, regulatory and other commitments? The provider will maintain a certain level of sign-off around the key data sets needed to fulfil its function, but practical experience suggests that providers are not necessarily looking for the same level of data quality, or across the same breadth of data attributes, as the asset manager needs to.

To take a simple example: an incorrect sector classification on a particular security probably doesn’t stop the provider from processing the transaction, but left uncorrected it may affect asset allocation decisions, breach client mandates, or skew all kinds of downstream processes such as performance, risk and client reporting.
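
To make that concrete, the short Python sketch below shows the kind of cross-check an oversight process might run: comparing the provider’s sector classifications against the firm’s own golden source and flagging mismatches for review. It is illustrative only – the file and column names (provider_extract.csv, internal_master.csv, gics_sector) are assumptions, not references to any particular provider feed.

# Hypothetical sketch: file names, column names and the CSV format are assumptions.
import csv

def load_classifications(path, id_field, sector_field):
    """Read a security-to-sector mapping from a CSV extract."""
    with open(path, newline="") as f:
        return {row[id_field]: row[sector_field] for row in csv.DictReader(f)}

def sector_exceptions(provider, golden_source):
    """Return securities where the provider's sector disagrees with the internal record."""
    exceptions = []
    for isin, internal_sector in golden_source.items():
        provider_sector = provider.get(isin)
        if provider_sector is None:
            exceptions.append((isin, internal_sector, "MISSING"))
        elif provider_sector != internal_sector:
            exceptions.append((isin, internal_sector, provider_sector))
    return exceptions

if __name__ == "__main__":
    provider = load_classifications("provider_extract.csv", "isin", "gics_sector")
    golden = load_classifications("internal_master.csv", "isin", "gics_sector")
    for isin, expected, received in sector_exceptions(provider, golden):
        print(f"{isin}: internal sector '{expected}', provider sent '{received}'")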

Without a good oversight and data control process in place, the asset manager has no way to catch or manage data errors which, as we acknowledge, could have a material impact on the manager’s transactions, decision making and reporting.

Three Key Elements

So, what level of control do you need in place so as not to increase operational risk and satisfy the regulator? There are three key elements that form the backbone of a successful and maintainable oversight process.

1) Access to the data: be able to gather the data received from the service provider regardless of the formats and technologies employed, and to easily manipulate and transform that data into a form that is accessible to your data control process. Fixed-format files and static reports are never enough.

2) Make sure that your own business operations teams have the ability and tools to set up, modify and own the control and oversight process around the data sets – this is a BAU process and therefore must sit as close to the business functions as possible.

3) Ensure that the data control process incorporates satisfactory levels of workflow over data issues (i.e. a guarantee of ownership, prioritisation and resolution) and an audit trail that can give any outside party (regulator, client, etc.) demonstrable evidence that the firm has oversight of its service providers.
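
As a minimal sketch of the workflow element in point 3, the Python below models a data exception record that carries an owner, a priority and a timestamped audit trail – the sort of demonstrable evidence an oversight process needs to produce. The class and field names are illustrative assumptions, not a prescription for any particular tool.

# Hypothetical sketch: class and field names are illustrative, not a product feature.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataException:
    issue_id: str
    description: str
    owner: str                  # business user accountable for resolution
    priority: str               # e.g. "high" for anything touching client reporting
    status: str = "open"
    audit_trail: list = field(default_factory=list)

    def log(self, actor, action):
        """Append a timestamped entry so the issue's history can be evidenced later."""
        self.audit_trail.append((datetime.now(timezone.utc).isoformat(), actor, action))

    def resolve(self, actor, note):
        self.status = "resolved"
        self.log(actor, f"resolved: {note}")

if __name__ == "__main__":
    exc = DataException("DQ-0001", "Sector mismatch on ISIN GB00B03MLX29",
                        owner="data-operations", priority="high")
    exc.log("data-operations", "assigned for investigation")
    exc.resolve("data-operations", "sector corrected from internal master")
    for entry in exc.audit_trail:
        print(entry)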

A good oversight process is not something that can be cobbled together. It needs thought, resource and a fair smattering of good technology to make it work. Outsourcing may be the right option for many firms, but it will need an investment in oversight and controls that is often overlooked.

Curium v7 introduces new ETL features that make the data gathering component of your data control process even easier to implement, including a range of options for gathering data that is managed from outside the firm’s data architecture.

