About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ReferenceDataFactory Unveils Bloomberg Adaptive Client


Data integration vendor ReferenceDataFactory (founded by a team of ex-FTI/GoldenSource people) has launched RDF Adaptive Client for Bloomberg Back Office. The service-oriented architecture (SOA)-based solution for integration across Bloomberg Back Office and Per Security is designed to enable standards-based distribution of the data enterprise-wide. The Adaptive Client joins similar offerings for Reuters and Interactive Data sources within the ReferenceDataFactory stable. The vendor can also quickly create Adaptive Clients for new sources on demand, because the technology is kept separate from the configuration of the data, according to its managing director Andy Dilkes.

Dilkes says the ReferenceDataFactory technology is used by Accenture within its Managed Reference Data Service. ReferenceDataFactory is also partnering with LakeFront Data Ventures, the consultancy founded by Dale Richards, formerly of SunGard, and recently bolstered by the hire of other ex-SunGard staff Marc Odho and Rob Ord (Reference Data Review, February 2007). ReferenceDataFactory hopes to work with the large data vendors as well as financial institutions.

ReferenceDataFactory’s aim is “to enable the adaptive enterprise”, says Dilkes. “Our solution is a J2EE-based container, enabling you to plug in anything you like. Our intention is not to replace solutions like GoldenSource and Asset Control. We offer configurable adapters for existing databases – our technology could be used to get data into and out of databases like those. We can make an Asset Control or a GoldenSource behave like a service – or we could be implemented in conjunction with JRules from Ilog, for example.” The data management systems vendors are more focused on rules, internal data management and data centralisation than on integration with downstream systems, Dilkes adds. “Often they are built on proprietary technology bases, rather than on open platforms for data integration. The need for integration is obvious: the value comes with getting market and securities data out to downstream systems. Service oriented implementations are inevitable, and this space is ideal for them.”
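The adapter approach Dilkes describes, where one piece of technology serves many sources because the vendor-specific details live in configuration, can be illustrated with a minimal sketch. All names here (`SecurityDataService`, `ConfigurableAdapter`, the field names) are hypothetical, assumed for illustration only; they are not ReferenceDataFactory's actual API:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical uniform service interface that any data-source adapter implements,
// so downstream systems see one contract regardless of the vendor behind it.
interface SecurityDataService {
    Map<String, String> getSecurity(String identifier);
}

// One adapter implementation reused across feeds: the vendor-to-canonical field
// mapping is supplied as configuration, separate from the adapter code itself.
class ConfigurableAdapter implements SecurityDataService {
    private final Map<String, Map<String, String>> store; // stand-in for a vendor feed or database
    private final Map<String, String> fieldMapping;       // vendor field name -> canonical field name

    ConfigurableAdapter(Map<String, Map<String, String>> store,
                        Map<String, String> fieldMapping) {
        this.store = store;
        this.fieldMapping = fieldMapping;
    }

    @Override
    public Map<String, String> getSecurity(String identifier) {
        Map<String, String> raw = store.getOrDefault(identifier, Map.of());
        Map<String, String> canonical = new HashMap<>();
        // Rename each vendor field to its canonical name; pass unknown fields through.
        raw.forEach((field, value) ->
            canonical.put(fieldMapping.getOrDefault(field, field), value));
        return canonical;
    }
}

public class AdapterDemo {
    public static void main(String[] args) {
        // Toy record using invented Bloomberg-style field names.
        Map<String, Map<String, String>> feed = Map.of(
            "US0378331005", Map.of("ID_ISIN", "US0378331005", "SECURITY_DES", "Apple Inc"));
        Map<String, String> mapping = Map.of("ID_ISIN", "isin", "SECURITY_DES", "name");

        SecurityDataService service = new ConfigurableAdapter(feed, mapping);
        System.out.println(service.getSecurity("US0378331005").get("name")); // prints "Apple Inc"
    }
}
```

Supporting a new source then means writing a new field-mapping configuration rather than new integration code, which is consistent with the on-demand Adaptive Client claim above.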

