
At your service…


The world of reference data does not typically get associated with the newest and sexiest innovations in technology, more’s the pity. But there are signs that this is changing, with a growing number of players active in the service-oriented architecture (SOA) space homing in on reference data distribution as a target area for implementation of their wares.

Standards-based SOAs are the hot new thing in integration. The approach is designed to simplify and speed up integration between systems, and to enable a smooth and, if necessary, gradual migration away from legacy systems by wrapping applications and components of applications as “services”, available for re-use by other applications. Web Services standards are closely associated with the SOA approach, although they are not essential to it. The move to SOA is benefiting not only financial institutions but also their vendors, many of which are likewise weighed down with old, inflexible systems and need a ready way to deliver new functionality and improved implementation times. SunGard’s Common Services Architecture (CSA) programme, for example, is an SOA approach under which the company is building a repository of “services” for use in creating applications.
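
To make the “wrapping” idea concrete, here is a minimal sketch in Java of how a legacy golden-copy lookup might be exposed as a reusable service contract. It is purely illustrative: the SecurityMasterService and LegacyStoreAdapter names are hypothetical and are not drawn from any vendor’s actual API.

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch: wrapping a legacy golden-copy store behind a reusable
// "service" contract, so other applications code against the interface rather
// than the legacy system itself. All names here are illustrative only.
public class ServiceWrappingSketch {

    // The service contract downstream applications re-use.
    interface SecurityMasterService {
        Optional<SecurityRecord> getByIsin(String isin);
    }

    // A minimal golden-copy record.
    record SecurityRecord(String isin, String issuer, String currency) {}

    // Adapter exposing an existing (here, in-memory) store through the contract.
    static class LegacyStoreAdapter implements SecurityMasterService {
        private final Map<String, SecurityRecord> legacyStore;

        LegacyStoreAdapter(Map<String, SecurityRecord> legacyStore) {
            this.legacyStore = legacyStore;
        }

        @Override
        public Optional<SecurityRecord> getByIsin(String isin) {
            return Optional.ofNullable(legacyStore.get(isin));
        }
    }

    public static void main(String[] args) {
        SecurityMasterService service = new LegacyStoreAdapter(Map.of(
                "US0378331005", new SecurityRecord("US0378331005", "Apple Inc", "USD")));

        // Any consuming application can now re-use the same contract.
        service.getByIsin("US0378331005")
                .ifPresent(r -> System.out.println(r.issuer() + " / " + r.currency()));
    }
}
```

In practice the contract would be published via Web Services or another standards-based mechanism rather than called in-process, but the principle is the same: consumers depend on the service interface, not on the legacy system behind it.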

The concept of data as a “service” in this context has been gaining ground for some time, and indeed a number of data management system vendors report that most RFIs and RFPs they see today specify that the buyer will only consider technology approaches that adhere to SOA principles. Now a number of vendors touting modern, standards-based integration approaches are homing in on the reference data distribution space, in the belief that it is a prime target for SOA implementations.

The vendors – companies such as PolarLake, ReferenceDataFactory, Informatica and Volante – are seeking to fill a gap that is not typically addressed by the “data warehouse” players active in the reference data marketplace, namely the distribution of data to downstream systems. While it is all very well to create a centralised security or counterparty data master, and to apply rules to cleanse incoming data and create a golden copy, this activity is of little value if it is then difficult or time-consuming to get data out of the central repository and into the other applications that need it. Historically, this downstream data distribution has been done using traditional, proprietary integration techniques, which undoubtedly work but are, the SOA proponents say, more time- and resource-intensive than modern standards-based integration approaches.

Many of these vendors are partnering with the leading data management platform providers in the hope of selling their integration technology in behind them. While it might be slightly galling for buyers to have to acquire yet more technology to make the technology they have just bought work more effectively, we all know that building a business case for an enterprise data management (EDM) project and generating a satisfactory ROI from such an investment are challenging. Anything that enables cheaper and more rapid implementations surely helps on both scores, and is therefore good news for EDM system vendors and users alike.
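
The distribution gap described above can also be sketched in code. The example below shows, in deliberately simplified Java, a central repository publishing golden-copy updates to downstream subscribers through a single shared interface rather than point-to-point links. The DistributionHub and GoldenCopyUpdate names are hypothetical and do not represent any vendor’s product.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of downstream distribution: the central golden-copy
// repository publishes updates once, and each downstream system subscribes,
// instead of being wired in through its own proprietary integration.
public class GoldenCopyDistributionSketch {

    record GoldenCopyUpdate(String isin, String field, String newValue) {}

    // The central repository publishes; downstream systems subscribe.
    static class DistributionHub {
        private final List<Consumer<GoldenCopyUpdate>> subscribers = new ArrayList<>();

        void subscribe(Consumer<GoldenCopyUpdate> downstreamSystem) {
            subscribers.add(downstreamSystem);
        }

        void publish(GoldenCopyUpdate update) {
            subscribers.forEach(s -> s.accept(update));
        }
    }

    public static void main(String[] args) {
        DistributionHub hub = new DistributionHub();

        // Two hypothetical downstream consumers of the same golden copy.
        hub.subscribe(u -> System.out.println("Risk system applied: " + u));
        hub.subscribe(u -> System.out.println("Settlement system applied: " + u));

        hub.publish(new GoldenCopyUpdate("US0378331005", "issuer", "Apple Inc"));
    }
}
```

A real deployment would of course use messaging middleware or Web Services rather than in-process callbacks, but the point the SOA proponents make is the same: once distribution is expressed as a shared, standards-based service, adding another downstream consumer is a subscription rather than a fresh integration project.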

