The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

At your service…


The world of reference data is not typically associated with the newest and sexiest innovations in technology, more’s the pity. But there are signs that this is changing, with a growing number of players in the service-oriented architecture (SOA) space settling on reference data distribution as a target area for implementation of their wares.

Standards-based SOAs are the hot new thing in integration. The approach is designed to simplify and speed up integration between systems, and to enable a smooth and, if necessary, gradual migration away from legacy systems by wrapping applications and components of applications as “services”, available for re-use by other applications. Web Services standards are closely associated with the SOA approach, although they are not essential to it. The move to SOA is benefiting not only financial institutions but also their vendors, many of which are likewise weighed down by old, inflexible systems and need a ready way to deliver new functionality with improved implementation times. SunGard’s Common Services Architecture (CSA) programme, for example, takes an SOA approach, under which the company is building a repository of “services” for use in creating applications.
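The service-wrapping idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration only: the class names (`LegacySecurityMaster`, `SecurityLookupService`) and the sample ISIN record are invented for the example and do not come from the article or any named vendor's product.

```python
# Hypothetical sketch: wrapping a legacy component behind a stable service
# contract so other applications can re-use it without touching its internals.
# All names and data below are illustrative assumptions.

class LegacySecurityMaster:
    """Stand-in for an old, tightly coupled security master application."""
    def __init__(self):
        self._records = {"US0378331005": {"name": "Apple Inc.", "currency": "USD"}}

    def fetch(self, isin):
        # Legacy-style call: returns an internal record, or None if unknown.
        return self._records.get(isin)


class SecurityLookupService:
    """Service facade: exposes the legacy component through a stable,
    self-describing response shape, which is what makes it re-usable."""
    def __init__(self, backend):
        self._backend = backend

    def get_security(self, isin: str) -> dict:
        record = self._backend.fetch(isin)
        if record is None:
            return {"status": "not_found", "isin": isin}
        return {"status": "ok", "isin": isin, **record}


service = SecurityLookupService(LegacySecurityMaster())
print(service.get_security("US0378331005")["name"])  # Apple Inc.
```

In a real SOA deployment the facade would typically be exposed over a Web Services interface rather than called in-process, but the migration benefit is the same: consumers depend on the service contract, not on the legacy system behind it.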

The concept of data as a “service” in this context has been gaining ground for some time, and indeed a number of data management system vendors report that most RFIs and RFPs they see today specify that the buyer will only consider technology approaches that adhere to SOA principles. Now a number of vendors touting modern, standards-based integration approaches are turning their attention to reference data distribution, in the belief that it is a prime target for SOA implementations.

The vendors – companies such as PolarLake, ReferenceDataFactory, Informatica and Volante – are seeking to fill a gap not typically addressed by the “data warehouse” players active in the reference data marketplace: the distribution of data to downstream systems. It is all very well to create a centralised security or counterparty data master, and to apply rules to cleanse incoming data and create a golden copy, but this activity is of little value if it is then difficult or time-consuming to get data out of the central repository and into the other applications that need it. Historically, this downstream distribution has been done using traditional, proprietary integration techniques, which undoubtedly work but are, the SOA proponents say, more time- and resource-intensive than modern standards-based integration approaches.

Many of these vendors are partnering with the leading data management platform providers in the hope of selling their integration technology in behind them. While it might be slightly galling for buyers to have to acquire yet more technology to make the technology they have just bought work more effectively, we all know that building a business case for an enterprise data management project – and generating a satisfactory ROI from such an investment – is challenging. Anything that enables cheaper and more rapid implementations helps on both scores, and is therefore good news for EDM system vendors and users alike.
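The golden-copy distribution problem described above is essentially publish/subscribe: the central repository pushes cleansed updates to whichever downstream systems have registered interest, rather than each consumer building its own point-to-point extract. The sketch below is a hypothetical, in-memory illustration of that pattern; the names (`GoldenCopyRepository`, `publish`) and the sample record are assumptions, not any vendor's actual API.

```python
# Hypothetical sketch of golden-copy distribution via publish/subscribe.
# Downstream systems register callbacks; the repository notifies them all
# whenever a cleansed record is published. Names and data are illustrative.

class GoldenCopyRepository:
    def __init__(self):
        self._store = {}        # golden-copy records, keyed by identifier
        self._subscribers = []  # downstream system callbacks

    def subscribe(self, callback):
        """Register a downstream consumer to receive every update."""
        self._subscribers.append(callback)

    def publish(self, key, record):
        """Store the cleansed record, then distribute it downstream."""
        self._store[key] = record
        for notify in self._subscribers:
            notify(key, record)


repo = GoldenCopyRepository()

# Two hypothetical downstream systems, e.g. a risk engine and a settlement system.
risk_feed, settlement_feed = [], []
repo.subscribe(lambda k, r: risk_feed.append((k, r)))
repo.subscribe(lambda k, r: settlement_feed.append((k, r)))

repo.publish("US0378331005", {"name": "Apple Inc.", "currency": "USD"})
```

In a standards-based implementation the callbacks would be replaced by message queues or Web Services endpoints, but the architectural point stands: adding a new downstream consumer means one subscription, not a new proprietary interface.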
