At your service…

The world of reference data is not typically associated with the newest and sexiest innovations in technology, more’s the pity. But there are signs that this is changing, with a growing number of players active in the service-oriented architecture (SOA) space rounding on reference data distribution as a target area for implementation of their wares.

Standards-based SOAs are the hot new thing in integration. The approach is designed to simplify and speed up integration between systems, and to enable a smooth and, if necessary, gradual migration away from legacy systems by wrapping applications and components of applications as “services”, available for re-use by other applications. Web Services standards are closely associated with the SOA approach, although they are not essential to it. The move to SOA is benefiting not only financial institutions but also their vendors, many of which are weighed down with old, inflexible systems and need a ready way to move forward with new functionality and improved implementation times. SunGard’s Common Services Architecture (CSA) programme is one example of an SOA approach, under which the company is building a repository of “services” for use in creating applications.
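
To make the “wrapping” idea concrete, here is a minimal sketch in Python. In the period described, Web Services usually meant SOAP and WSDL; the sketch uses plain HTTP and JSON for brevity, and the function name, URL path and sample data are illustrative assumptions rather than any vendor’s actual interface.

```python
# Minimal sketch of wrapping a legacy component as a reusable service.
# legacy_lookup, the /securities path and the sample record are all
# hypothetical, standing in for an existing proprietary call.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def legacy_lookup(isin: str) -> dict:
    """Stand-in for an existing, proprietary reference data lookup."""
    static_data = {
        "US0378331005": {"name": "Apple Inc.", "currency": "USD"},
    }
    return static_data.get(isin, {})

class SecurityService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expose the legacy call behind a simple, standards-based
        # interface: GET /securities/<isin> returns JSON.
        _, _, isin = self.path.rpartition("/")
        record = legacy_lookup(isin)
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(record).encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), SecurityService).serve_forever()
```

Once the wrapper is in place, any consuming application can request, say, http://localhost:8080/securities/US0378331005 without knowing anything about the legacy system behind the endpoint, which is precisely the re-use the SOA approach promises.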

The concept of data as a “service” in this context has been gaining ground for some time, and indeed a number of data management system vendors report that most RFIs and RFPs they see today specify that the buyer will consider only technology approaches that adhere to SOA principles. Now a number of vendors touting modern, standards-based integration approaches are converging on the reference data distribution space, in the belief that it is a prime target for SOA implementations.

The vendors – companies such as PolarLake, ReferenceDataFactory, Informatica and Volante – are seeking to fill a gap that is not typically addressed today by the “data warehouse” players active in the reference data marketplace: the distribution of data to downstream systems. It is all very well to create a centralised security or counterparty data master, and to apply rules to cleanse incoming data and create a golden copy, but this activity is of little value if it is then difficult or time-consuming to get data out of the central repository and into the other applications that need it.

Historically, this downstream distribution has been done using traditional, proprietary integration techniques, which undoubtedly work but are, the SOA proponents say, more time and resource intensive than modern standards-based approaches. Many of these vendors are partnering with the leading data management platform providers in the hope of selling integration in behind them.

It might be slightly galling for buyers to have to acquire yet more technology to make the technology they’ve just bought work more effectively. But we all know that building a business case for an enterprise data management (EDM) project and generating a satisfactory ROI from such an investment are challenging. Anything that enables cheaper and more rapid implementations surely helps on both scores, and is therefore good news for EDM system vendors and users alike.
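
As an illustration of the downstream distribution problem, the sketch below models a central golden copy fanning updates out to subscribed consumers. It is an in-process simplification under stated assumptions: real distribution layers of the kind these vendors sell run over messaging middleware, and the class and system names here are hypothetical.

```python
# Illustrative sketch: once a golden-copy record is updated centrally,
# every subscribed downstream system receives the change, without
# bespoke point-to-point plumbing for each consumer.
from typing import Callable

class GoldenCopyPublisher:
    def __init__(self) -> None:
        self._subscribers: list[Callable[[dict], None]] = []

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        # A downstream system registers interest once, instead of
        # being wired in with its own proprietary integration.
        self._subscribers.append(handler)

    def publish(self, record: dict) -> None:
        # Fan the cleansed, golden-copy record out to all consumers.
        for handler in self._subscribers:
            handler(record)

if __name__ == "__main__":
    publisher = GoldenCopyPublisher()
    publisher.subscribe(lambda r: print("risk system received:", r))
    publisher.subscribe(lambda r: print("settlement system received:", r))
    publisher.publish({"isin": "US0378331005", "rating": "AA+"})
```

The design point is that downstream systems subscribe once against a common interface, rather than each being connected to the repository through its own hand-built integration – which is the time and resource saving the SOA proponents claim.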
