About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ReferenceDataFactory Unveils Bloomberg Adaptive Client


Data integration vendor ReferenceDataFactory, founded by a team of ex-FTI/GoldenSource people, has launched RDF Adaptive Client for Bloomberg Back Office. The service-oriented architecture (SOA) based solution integrates across Bloomberg Back Office and Per Security, and is designed to enable standards-based distribution of the data enterprise-wide. The Adaptive Client joins similar offerings for Reuters and Interactive Data sources in the ReferenceDataFactory stable. Because the technology is kept separate from the configuration of the data, the vendor can also quickly create Adaptive Clients for new sources on demand, according to managing director Andy Dilkes.

Dilkes says the ReferenceDataFactory technology is used by Accenture within its Managed Reference Data Service. ReferenceDataFactory is also partnering with LakeFront Data Ventures, the consultancy founded by Dale Richards, formerly of SunGard, and recently bolstered by the hire of fellow SunGard alumni Marc Odho and Rob Ord (Reference Data Review, February 2007). ReferenceDataFactory hopes to work with the large data vendors as well as financial institutions.

ReferenceDataFactory’s aim is “to enable the adaptive enterprise”, says Dilkes. “Our solution is a J2EE-based container, enabling you to plug in anything you like. Our intention is not to replace solutions like GoldenSource and Asset Control. We offer configurable adapters for existing databases – our technology could be used to get data into and out of databases like those. We can make an Asset Control or a GoldenSource behave like a service – or we could be implemented in conjunction with JRules from ILOG, for example.” The data management systems vendors are more focused on rules, internal data management and data centralisation than on integration with downstream systems, Dilkes adds. “Often they are built on proprietary technology bases, rather than on open platforms for data integration. The need for integration is obvious: the value comes with getting market and securities data out to downstream systems. Service-oriented implementations are inevitable, and this space is ideal for them.”
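The configurable-adapter approach Dilkes describes can be illustrated with a minimal Java sketch. This is not ReferenceDataFactory’s actual API – all class, interface and field names here are invented for illustration – but it shows the underlying idea: a vendor-neutral service interface, with per-source adapters whose field mappings live in configuration rather than in code, so a new source needs only a new mapping.

```java
import java.util.HashMap;
import java.util.Map;

public class AdapterSketch {

    /** Vendor-neutral view of a security record, as a downstream system would consume it. */
    interface SecurityDataService {
        Map<String, String> getSecurity(String isin);
    }

    /** Adapter: wraps a raw source feed and applies a configured field mapping. */
    static class ConfiguredAdapter implements SecurityDataService {
        private final Map<String, Map<String, String>> rawFeed; // isin -> vendor-specific fields
        private final Map<String, String> fieldMapping;         // vendor field -> standard field

        ConfiguredAdapter(Map<String, Map<String, String>> rawFeed,
                          Map<String, String> fieldMapping) {
            this.rawFeed = rawFeed;
            this.fieldMapping = fieldMapping;
        }

        @Override
        public Map<String, String> getSecurity(String isin) {
            Map<String, String> out = new HashMap<>();
            Map<String, String> raw = rawFeed.get(isin);
            if (raw != null) {
                for (Map.Entry<String, String> e : raw.entrySet()) {
                    String standardName = fieldMapping.get(e.getKey());
                    if (standardName != null) {
                        out.put(standardName, e.getValue());
                    }
                }
            }
            return out;
        }
    }

    public static void main(String[] args) {
        // A toy source record; the vendor field names below are hypothetical.
        Map<String, Map<String, String>> feed = Map.of(
            "US0378331005", Map.of("ID_ISIN", "US0378331005", "SECURITY_DES", "APPLE INC"));
        // Configuration, not code, decides how vendor fields map onto the standard model.
        Map<String, String> mapping = Map.of("ID_ISIN", "isin", "SECURITY_DES", "description");

        SecurityDataService svc = new ConfiguredAdapter(feed, mapping);
        System.out.println(svc.getSecurity("US0378331005"));
    }
}
```

In this sketch, swapping Bloomberg for Reuters or Interactive Data means supplying a different feed and mapping to the same adapter class – which is the separation of technology from data configuration that, per Dilkes, lets new Adaptive Clients be created on demand.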
