
Don’t Forget the Internal Data Controls When You Outsource


By Andrew Sexton, Director of Curium Data Systems

Asset managers and other financial services companies outsource many business functions across the enterprise to service providers, enabling them to focus on core competencies such as product development and investment performance. The data management component supporting those functions is often included in the outsourcing arrangement. But while a firm may choose to outsource the function, it cannot outsource the risk or liability of errors caused by poor quality data – that responsibility remains firmly within the remit of the asset management firm itself.

Of course, poor quality data will negatively impact the firm’s ability to conduct its business, but it can also lead to repercussions from regulators. So how can asset management firms manage data quality across outsourced services, and how can they demonstrate robust data quality controls to their internal management, their clients and the regulator?

What do the regulations say?

Increasingly stringent regulations specify that asset managers which choose to outsource some of their operational functions must maintain internal control and the ability to monitor compliance with all of their obligations. It’s the old adage: you can outsource the function, but you can’t outsource your responsibilities.

To take a couple of examples from regulations that many asset managers will be familiar with:

MiFID Article 13(5) – An investment firm shall ensure, when relying on a third party for the performance of operational functions which are critical for the provision of continuous and satisfactory service to clients and the performance of investment activities on a continuous and satisfactory basis, that it takes reasonable steps to avoid undue additional operational risk. Outsourcing of important operational functions may not be undertaken in such a way as to impair materially the quality of its internal control and the ability of the supervisor to monitor the firm’s compliance with all obligations.

Likewise, Article 49 of the Solvency II Directive makes it clear that outsourcing does not discharge a firm from any of its obligations under the directive, and that outsourced activities must not unduly increase operational risk.

Article 82 states that the firm shall have internal processes and procedures in place to ensure the appropriateness, completeness and accuracy of the data used (in the case of Solvency II, the data used in the calculation of technical provisions).

Internal Reasons for Data Controls

But it’s not just for regulatory reasons that a firm needs to consider its data controls within the outsourcing model. There are many internal reasons too. Risk management in all its forms depends on accurate and complete data, and clients will take a dim view of errors caused by bad data if those errors affect their portfolios or find their way into client reports and similar communications.

The normal outsourcing model ensures the service provider is on the hook for its service levels around the performance of the function, and the provider will naturally pay attention to data insofar as it affects its ability to deliver that service.

But can the service provider apply the rigour, across all aspects of data quality, that the client needs to meet its legal, regulatory and other commitments? The provider will maintain a certain level of sign-off around the key data sets it needs to fulfil its function, but practical experience suggests that providers do not necessarily look for the same level of data quality, or across the same breadth of data attributes, as the asset manager requires.

To take a simple example: an incorrect sector classification on a particular security probably won’t stop the provider from processing the transaction, but left uncorrected it may distort asset allocation decisions, breach client mandates, or skew all kinds of downstream processes such as performance, risk and client reporting.
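To make the point concrete, a control of this kind can be very lightweight. The sketch below is a simplified, hypothetical illustration – not Curium’s implementation or any provider’s actual data model – that assumes a small pandas table of provider security data and an assumed sector taxonomy, and flags the records the asset manager would want to investigate even though the provider could still process the trades.

```python
import pandas as pd

# Hypothetical reference taxonomy – a real firm would use its own sector scheme.
VALID_SECTORS = {
    "Energy", "Financials", "Health Care", "Industrials",
    "Information Technology", "Utilities",
}

def sector_exceptions(securities: pd.DataFrame) -> pd.DataFrame:
    """Return the rows whose sector is missing or outside the reference taxonomy."""
    issues = securities["sector"].map(
        lambda s: "missing sector" if pd.isna(s)
        else ("unrecognised sector" if s not in VALID_SECTORS else None)
    )
    return securities.assign(issue=issues).dropna(subset=["issue"])

# Data as it might arrive from the service provider (illustrative values only).
provider_feed = pd.DataFrame({
    "isin": ["GB00B03MLX29", "US0378331005", "DE0007164600"],
    "sector": ["Energy", None, "Tech"],  # "Tech" is not a taxonomy value
})
print(sector_exceptions(provider_feed))
```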

Without a good oversight and data control process in place, the asset manager has no way to catch or manage data errors which, as noted, could have a material impact on the manager’s transactions, decision making and reporting.

Three Key Elements

So, what level of control do you need in place to avoid increasing operational risk and to satisfy the regulator? There are three key elements that form the backbone of a successful and maintainable oversight process (a simple sketch of how they might fit together follows the list).

1) Access to the data: be able to gather the data received from the service provider regardless of the formats and technologies employed, and be able to easily manipulate and transform that data into a form accessible from your data control process. Fixed-format files and static reports are never enough.

2) Make sure that your own business operations teams have the ability and tools to set up, modify and own the control and oversight process around the data sets – this is a business-as-usual (BAU) process and therefore must sit as close to the business functions as possible.

3) Ensure that the data control process incorporates satisfactory levels of workflow (i.e. guaranteed ownership, prioritisation and resolution of data issues) and an audit trail that can give any outside party (regulator, client etc.) demonstrable evidence that the firm has oversight of its service providers.
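By way of illustration only, the sketch below shows how these three elements might hang together in a very simplified form – the file formats, rule definitions and field names are assumptions, and it is not Curium’s product or any vendor’s actual API. It loads a provider file in whichever format arrives, applies rules defined as configuration that the business team can own, and records every breach as a data issue with an owner, a priority and an audit trail that a regulator or client could inspect.

```python
import csv
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone
from pathlib import Path

def load_provider_file(path: Path) -> list[dict]:
    """Element 1: normalise whatever the provider sends into plain records."""
    if path.suffix == ".csv":
        with path.open(newline="") as f:
            return list(csv.DictReader(f))
    if path.suffix == ".json":
        return json.loads(path.read_text())
    raise ValueError(f"Unsupported provider format: {path.suffix}")

# Element 2: rules held as configuration that business operations can own and
# change without code releases (field names and rules are assumptions).
RULES = [
    {"name": "sector_populated", "field": "sector", "priority": "high"},
    {"name": "currency_populated", "field": "currency", "priority": "medium"},
]

@dataclass
class DataIssue:
    """Element 3: a data issue with an owner, a priority and an audit trail."""
    rule: str
    record_key: str
    priority: str
    owner: str = "data-oversight-team"  # hypothetical default assignee
    status: str = "open"
    audit: list[str] = field(default_factory=list)

    def log(self, event: str) -> None:
        self.audit.append(f"{datetime.now(timezone.utc).isoformat()} {event}")

def run_controls(records: list[dict]) -> list[DataIssue]:
    """Apply every configured rule to every record and raise issues for breaches."""
    issues: list[DataIssue] = []
    for rec in records:
        for rule in RULES:
            if not (rec.get(rule["field"]) or "").strip():
                issue = DataIssue(rule["name"], rec.get("isin", "?"), rule["priority"])
                issue.log("raised against provider feed")
                issues.append(issue)
    return issues

if __name__ == "__main__":
    # "provider_positions.csv" is an assumed file name for the provider's delivery.
    records = load_provider_file(Path("provider_positions.csv"))
    for issue in run_controls(records):
        print(issue)
```

The point of keeping the rules as data rather than code is that the business operations team, not a development team, can add or adjust controls as mandates and data sets change, while every issue raised carries the ownership and audit evidence an outside party would ask to see.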

A good oversight process is not something that can be cobbled together. It needs thought, resources and a fair smattering of good technology to make it work. Outsourcing may be the right option for many firms, but it requires an investment in oversight and controls that is often overlooked.

Curium v7 introduces new ETL features that make the data gathering component of your data control process even easier to implement, including a range of options for gathering data that is managed outside the firm’s own data architecture.

