About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

At your service…


The world of reference data is not typically associated with the newest and sexiest innovations in technology, more’s the pity. But there are signs that this is changing, with a growing number of players in the service-oriented architecture (SOA) space targeting reference data distribution as a prime area for implementation of their wares.

Standards-based SOAs are the hot new thing in integration. The approach is designed to simplify and speed up integration between systems, and to enable a smooth and, if necessary, gradual migration away from legacy systems by wrapping applications and components of applications as “services” available for re-use by other applications. Web Services standards are closely associated with the SOA approach, although they are not essential to it. The move to SOA is benefiting not only financial institutions but also their vendors, many of which are likewise weighed down with old, inflexible systems and need a ready way to move forward with new functionality and improved implementation times. SunGard’s Common Services Architecture (CSA) programme is one example of an SOA approach, under which it is building a repository of “services” for use in creating applications.
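The “wrapping” idea can be sketched in miniature. In the hypothetical example below, an old, tightly coupled pricing routine is exposed behind a simple request/response contract; the function names and message shapes are invented for illustration, and a real SOA stack would of course expose this via a standards-based interface such as SOAP/WSDL Web Services rather than a plain function call.

```python
# Hypothetical sketch: wrapping a legacy component as a re-usable "service".
# All names and data here are invented for illustration, not any vendor's API.

def legacy_price_lookup(ticker):
    # Stand-in for an old, proprietary pricing routine with its own quirks.
    prices = {"IBM": 81.50, "MSFT": 26.10}
    return prices.get(ticker)

def price_service(request):
    """Service facade: a standard request/response contract that any
    consuming application can call without knowing the legacy internals."""
    ticker = request.get("ticker")
    price = legacy_price_lookup(ticker)
    if price is None:
        return {"status": "error", "message": "unknown ticker: %s" % ticker}
    return {"status": "ok", "ticker": ticker, "price": price}
```

The point of the facade is that consuming applications depend only on the contract, so the legacy internals can be replaced later without touching the consumers, which is the gradual-migration story SOA proponents tell.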

The concept of data as a “service” in this context has been gaining ground for some time, and indeed a number of data management system vendors report that most RFIs and RFPs they see today specify that the buyer will only consider technology approaches that adhere to SOA principles. And now a number of vendors touting modern, standards-based integration approaches are turning their attention to reference data distribution, in the belief that it is a prime target for SOA implementations.

The vendors – companies such as PolarLake, ReferenceDataFactory, Informatica and Volante – are seeking to fill a gap not currently addressed by the “data warehouse” players active in the reference data marketplace: the distribution of data to downstream systems. It is all very well to create a centralised security or counterparty data master, and to apply rules to cleanse incoming data and create a golden copy, but this activity is of little value if it is then difficult or time-consuming to get data out of the central repository and into the other applications that need it. Historically, this downstream distribution has been done using traditional, proprietary integration techniques, which undoubtedly work but are, the SOA proponents say, more time and resource intensive than modern standards-based approaches.

Many of these vendors are partnering with the leading data management platform providers in the hope of selling integration in behind them. While it might be slightly galling for buyers to have to acquire yet more technology to make the technology they’ve just bought work more effectively, we all know that building a business case for an enterprise data management project and generating a satisfactory ROI from such an investment are challenging. Anything that enables cheaper and more rapid implementations surely helps on both scores, and is therefore good news for EDM system vendors and users alike.
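The distribution gap described above can be illustrated with a toy publish/subscribe sketch: a golden-copy store that fans cleansed records out to downstream consumers as they register, rather than each consumer being wired in through a bespoke, point-to-point feed. The class and method names below are invented for illustration and do not reflect any vendor’s actual product.

```python
# Hypothetical sketch of "data as a service" distribution: a golden-copy
# store notifying downstream systems. Names are invented for illustration.

class GoldenCopyStore:
    def __init__(self):
        self._records = {}       # golden copies keyed by security identifier
        self._subscribers = []   # downstream-system callbacks

    def subscribe(self, callback):
        # A downstream application registers interest once, instead of
        # being integrated via a proprietary point-to-point link.
        self._subscribers.append(callback)

    def update(self, security_id, record):
        # The cleansed golden copy lands once centrally, then fans out
        # to every registered consumer.
        self._records[security_id] = record
        for notify in self._subscribers:
            notify(security_id, record)

    def get(self, security_id):
        return self._records.get(security_id)
```

In this model, adding a new downstream system is one `subscribe` call against a published contract rather than a fresh integration project, which is the cost and speed argument the SOA vendors are making.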
