The world of reference data is not typically associated with the newest and sexiest innovations in technology, more’s the pity. But there are signs that this is changing, with a growing number of players in the service-oriented architecture (SOA) space homing in on reference data distribution as a target area for implementation of their wares.
Standards-based SOAs are the hot new thing in integration. The approach is designed to simplify and speed up integration between systems, and to enable a smooth and, if necessary, gradual migration away from legacy systems by wrapping applications and components of applications as “services” available for re-use by other applications. Web Services standards are closely associated with the SOA approach, although they are not essential to it. The move to SOA is benefiting not only financial institutions but also their vendors, many of which are themselves weighed down with old, inflexible systems and need a ready way to deliver new functionality and improved implementation times. SunGard’s Common Services Architecture (CSA) programme, for example, is an SOA approach under which it is building a repository of “services” for use in creating applications.
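To make the “wrapping” idea concrete, the sketch below exposes a slice of a security master as a web service using the standard JAX-WS API bundled with Java SE 6–8. It is a minimal illustration only: the service name, method and in-memory data are hypothetical stand-ins, not any particular vendor’s product, and a real implementation would query the central repository rather than a map.

```java
import java.util.HashMap;
import java.util.Map;

import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical wrapper exposing a golden-copy lookup as a SOAP web service.
@WebService
public class SecurityMasterService {

    // Stand-in for the central repository; real code would query the data master.
    private final Map<String, String> issuerByIsin = new HashMap<String, String>();

    public SecurityMasterService() {
        issuerByIsin.put("US0378331005", "Apple Inc.");
    }

    @WebMethod
    public String getIssuerName(String isin) {
        return issuerByIsin.get(isin);
    }

    public static void main(String[] args) {
        // Publishes the service, and its WSDL contract, at the given address.
        Endpoint.publish("http://localhost:8080/securityMaster",
                new SecurityMasterService());
        System.out.println("Running at http://localhost:8080/securityMaster?wsdl");
    }
}
```

Any application that can read the published WSDL can then generate its own client bindings, which is, in essence, the “re-use by other applications” that the SOA pitch describes.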
The concept of data as a “service” in this context has been gaining ground for some time; indeed, a number of data management system vendors report that most RFIs and RFPs they see today specify that the buyer will only consider technology approaches that adhere to SOA principles. And now a number of vendors touting modern, standards-based integration approaches are turning their attention to the reference data distribution space, in the belief that it is a prime target for SOA implementations.
The vendors – companies such as PolarLake, ReferenceDataFactory, Informatica and Volante – are seeking to fill a gap that is not currently addressed by the “data warehouse” players active in the reference data marketplace, namely the distribution of data to downstream systems. It is all very well to create a centralised security or counterparty data master, and to apply rules to cleanse incoming data and create a golden copy, but that activity is of little value if it is then difficult or time-consuming to get data out of the central repository and into the other applications that need it. Historically this downstream distribution has been done using traditional, proprietary integration techniques, which undoubtedly work but are, the SOA proponents say, more time and resource intensive than modern standards-based approaches (a sketch of what a standards-based downstream consumer might look like follows below).

Many of these vendors are partnering with the leading data management platform providers in the hope of selling their integration technology in behind them. While it might be slightly galling for buyers to have to acquire yet more technology to make the technology they have just bought work more effectively, we all know that building a business case for an enterprise data management (EDM) project and generating a satisfactory ROI from such an investment are challenging. Anything that enables cheaper and more rapid implementations surely helps on both scores, and is therefore good news for EDM system vendors and users alike.
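By way of illustration only, the sketch below shows how a downstream application might pull golden-copy data from such a service, again using the standard JAX-WS API. The endpoint address, interface and QName values are assumptions that would have to match whatever contract the deployed service actually publishes.

```java
import java.net.URL;

import javax.jws.WebService;
import javax.xml.namespace.QName;
import javax.xml.ws.Service;

// Hypothetical service endpoint interface mirroring the published contract.
@WebService
interface SecurityMaster {
    String getIssuerName(String isin);
}

public class DownstreamConsumer {
    public static void main(String[] args) throws Exception {
        // The WSDL is published alongside the endpoint; the namespace and
        // service name below are assumptions and must match the real WSDL.
        URL wsdl = new URL("http://localhost:8080/securityMaster?wsdl");
        QName serviceName = new QName("http://example.com/",
                "SecurityMasterServiceService");
        Service service = Service.create(wsdl, serviceName);
        SecurityMaster port = service.getPort(SecurityMaster.class);
        System.out.println(port.getIssuerName("US0378331005"));
    }
}
```

Because the contract is expressed in standard WSDL rather than a proprietary API, any downstream system that can speak SOAP can consume the data in the same way, which is precisely the re-use argument the SOA vendors are making against traditional point-to-point integration.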