About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Outsourced Data Management Services: Will This Be the Year? By Tim Lind, Senior Analyst, Investment Management, TowerGroup


Although discussions surrounding the economics of outsourcing have only increased over the past three years, overcoming the securities industry's inertia toward business process outsourcing has been a frustrating experience. Entrenched attitudes take time to change, and outsourcing decisions continue to be driven more by political considerations than by the bottom line. The biggest hurdle has been proving a clear case for the cost effectiveness of outsourcing for both provider and prospect, especially for more complex functions such as the collection and management of historical and descriptive market data.

Data consumers can be a uniquely suspicious lot, particularly when it comes to any change in their control of and access to familiar sources of data. However, over the coming year, expect more firms to question whether managing their own infrastructure and data administration is a competitive differentiator or whether it should be delegated to third parties offering a shared infrastructure. Vendors and consulting firms clearly sense a market opportunity for new services targeting the data administration process. Are institutions ready to delegate aspects of their data management to a third party, or is this function too close to home for firms to actually pull the trigger on an outsourcing contract?

In an effort to improve the consistency, cost, and timeliness of updating security master data, many institutions have centralized data administration to provide a common repository for core applications (e.g., risk management, portfolio accounting, and settlements). The objective is to reduce overlapping and redundant data management functions, introduce enterprise data standards, and improve the scale of the data administration department by leveraging a common operational model across multiple functional groups.

The natural evolution of this trend toward consolidated operations is to extend the economies of scale beyond the enterprise by creating a shared infrastructure that services multiple institutions. A managed data service (MDS) would provide the technology, human capital, and expertise to capture and normalize data from a variety of internal and external sources, with commercial data vendors being the primary source of content. In its simplest form, a data service would combine the content from data vendors and publish a composite record of enriched data to clients within an aggregate feed product.
The MDS provider would host the software and hardware used to compare sources, along with the staff to research exceptions that occur when multiple data vendors are compared against one another. It is important to note that an MDS provider acts only as the agent of the financial institution; the end client remains liable for all the licenses paid to market data vendors.

As with all outsourcing propositions, an MDS must balance the trade-off between scalability and customization. Employing a defined set of vendor inputs and consistent business rules to normalize data ensures the scalability of the service. However, it does not provide a high degree of customization, nor does it allow each customer to select the vendor source data and hierarchies used to create composite records. Further, it does not address the integration of data into the client's business applications and processes, which is the most difficult aspect of data management. The next logical step in outsourcing will offer bespoke data derivations for individual clients, a choice of vendor data inputs, and systems integration support in the form of custom data file outputs.

The main value of data outsourcing resides in the skills of the service provider's staff and their judgment, experience, and ability to resolve exceptions. The vendor's ability to hire and retain experienced data professionals, along with the added focus that a specialty provider can offer, may affect the accuracy and timeliness with which new data records are captured. Data consultancy and professional services based on integration of the data may be the most lucrative component of the service. The key to the business model is reducing the cost of the infrastructure while improving the quality and breadth of data. The logic of spreading the technology overhead across a number of firms, including the cost of maintaining data expertise and the judgment needed to normalize data, is compelling.
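The composite-record and exception logic described above can be sketched in a few lines. The following is a hypothetical illustration only: the vendor names, field names, and per-field hierarchy are invented for the example, not drawn from any actual MDS product. It shows the two core ideas in the text: a source-of-record hierarchy that picks the preferred vendor's value for each field, and an exception queue flagged for analysts whenever vendors disagree.

```python
# Hypothetical sketch of "golden copy" assembly in a managed data service:
# per-field vendor precedence plus exception flagging on disagreement.
# Vendor and field names below are invented for illustration.

VENDOR_HIERARCHY = {
    # Preferred source order per field; the first vendor with a value wins.
    "issuer_name": ["VendorA", "VendorB"],
    "coupon_rate": ["VendorB", "VendorA"],
}

def build_composite(records: dict) -> tuple:
    """records maps vendor name -> that vendor's view of one security.

    Returns (composite_record, exceptions_for_analyst_review).
    """
    composite, exceptions = {}, []
    for fld, hierarchy in VENDOR_HIERARCHY.items():
        # Collect each vendor's value in hierarchy (priority) order.
        present = {
            v: records[v][fld]
            for v in hierarchy
            if v in records and records[v].get(fld) is not None
        }
        if not present:
            continue  # no vendor supplied this field
        # Take the highest-priority vendor's value (dicts keep insertion order).
        composite[fld] = next(iter(present.values()))
        # Queue an exception for human research when vendors disagree.
        if len(set(present.values())) > 1:
            exceptions.append(f"{fld}: {present}")
    return composite, exceptions

golden, breaks = build_composite({
    "VendorA": {"issuer_name": "Acme Corp", "coupon_rate": 5.25},
    "VendorB": {"issuer_name": "ACME Corporation", "coupon_rate": 5.25},
})
print(golden)  # issuer_name taken from VendorA, coupon_rate from VendorB
print(breaks)  # issuer_name flagged: the two vendors disagree on it
```

The exception list is the point at which the economics of the shared model show up: the cost of the analysts who research those breaks is spread across every client of the service rather than borne by each firm separately.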
Service-level agreements must be airtight and clearly outline expected performance in terms of the time needed to create new records or resolve discrepancies. The biggest challenge to an MDS will be firms' reluctance to give up control of data administration. Many firms believe they manage data better than their peers, and it will be difficult to convince them that they can get superior content through a third party. SLAs will only partially offset the customer's fear of losing control. Maintaining the history of data manipulations and retaining the raw data as it was delivered by the market data vendor may be necessary to meet institutions' audit and compliance requirements.

Market data vendors will be apprehensive about data management outsourcing and how it might impact their relationship with the customer. MDS providers may expose data quality issues that all vendors struggle with and may reduce some of the redundancy in how institutions currently consume data. However, vendors that compete on quality of content and service will ultimately benefit from the increased transparency that outsourcing offers their clients. Forward-thinking vendors will recognize the inevitability of new service models and a future where data is not locked into the terminal. Alliances with service providers may also become another sales channel. Market data vendors will likely review their approach to data licenses in an era of centralized infrastructure to ensure that data utilization and the corresponding fees are properly aligned. The bottom line is that as long as the end client is in compliance with the data license, vendors cannot prevent customers from nominating third parties to help them manage content that has been paid for.

MDS providers are beginning to show their cards. Consolidation in the boutique data management industry has begun, and two significant acquisitions have already been completed (i.e., SunGard's acquisition of Fame and Capco's acquisition of Iverson).
As the competition unfolds, expect to see new entrants into the space and more alliances between consultants and software companies. As MDS providers assemble their platforms to deliver a service, wholesale lift-outs of a financial institution's data administration department are possible but less likely in the near term. Although the portability of an infrastructure developed for an individual firm remains questionable, an architecture designed properly, with enough flexibility to support a large financial services organization, might well be able to support multiple firms.

Early service providers would be expected to focus on relatively commoditized areas where data acquisition is more straightforward, such as equities and other listed securities. Ironically, the first commercial services embracing a shared infrastructure model are targeting the collection and distribution of a far more complex domain: corporate actions (e.g., Fidelity ActionsXchange, DTCC Global Corporate Action). If a managed data service can cope with the complexity of scrubbing corporate actions data, then listed securities should surely be possible as well.

Objections to a managed data service based on the notion that the firm will lose control of the function are derived more from emotion than reality. Ironically, using internal staff to administer data may result in less control of the process, especially when data is redundantly managed within business silos. The reality is that many firms would be hard-pressed to boast about the degree of control they have over data management functions today. Business line managers can often get better SLAs from, and have more influence on, external providers than they can from an internal department (you can always fire a vendor!). Some firms, after taking a hard look at their lack of data management capabilities, will outsource to gain control, not lose it.
Despite some interest in MDS, institutions are not yet convinced that providers have the entire infrastructure for an enterprise solution in place. Expect the supply side of the equation to be slow to develop. Large systems integrators that may be attracted to the business are risk-averse when it comes to contributing their own time and materials to develop a solution before they have a paying customer. There is demand for data services, but because this is a new business model, it will be slow to materialize.

Providing rock-solid metrics for the business case to outsource will not be easy. Service providers will need to overcome ingrained attitudes and vested interests within institutions, which will make the sales cycle more like trench warfare. Competition from internal departments will represent the greatest challenge for an external service provider. Front-office sponsorship may be the MDS providers' best hope for winning business.

The market will need a boost from institutional early adopters to prove its merit. Early MDS contracts will be more tactical in nature, focusing on the data requirements of specific applications, business lines, or asset classes. It is a silo-dominated world, and consumption of data services will be no exception. Since discussions of enterprise data solutions would be premature, vendors must present a phased approach to foster confidence in the business model. They will need to offer various components of the infrastructure a la carte and grow the relationship organically.

As institutions centralize their data administration, TowerGroup believes firms will be more likely to purchase components of the infrastructure than to build everything from scratch. This will include engaging services companies to help manage external data. Licensing of software components such as feed handlers, data models, and workflow tools will continue to grow as this market matures.
Many financial institutions have already accepted the benefits of scaling the data administration function, but a consensus on the value of managed data services will take more time to develop.
