About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ReferenceDataFactory Unveils Bloomberg Adaptive Client


Data integration vendor ReferenceDataFactory, founded by a team of former FTI and GoldenSource staff, has launched RDF Adaptive Client for Bloomberg Back Office. The service oriented architecture (SOA)-based solution, which integrates across Bloomberg Back Office and Per Security, is designed to enable standards-based distribution of the data enterprise-wide. This Adaptive Client joins similar offerings for Reuters and Interactive Data sources within the ReferenceDataFactory stable. Because the technology is separated from the configuration of data, the vendor can also create Adaptive Clients for new sources quickly on demand, according to its managing director Andy Dilkes.
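The separation Dilkes describes can be illustrated with a minimal sketch: one generic adapter class whose behaviour is driven entirely by a field-mapping configuration, so that supporting a new source means supplying a new mapping rather than new code. All class and field names below are invented for illustration; this is not ReferenceDataFactory's actual design.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a config-driven adapter: the translation logic is
// generic, and only the mapping (source field -> canonical field) changes
// per data source.
public class ConfigDrivenAdapter {
    private final Map<String, String> fieldMapping;

    public ConfigDrivenAdapter(Map<String, String> fieldMapping) {
        this.fieldMapping = fieldMapping;
    }

    // Translate a raw source record into the canonical data model,
    // keeping only the fields the mapping knows about.
    public Map<String, String> toCanonical(Map<String, String> sourceRecord) {
        Map<String, String> canonical = new LinkedHashMap<>();
        for (Map.Entry<String, String> m : fieldMapping.entrySet()) {
            String value = sourceRecord.get(m.getKey());
            if (value != null) {
                canonical.put(m.getValue(), value);
            }
        }
        return canonical;
    }

    public static void main(String[] args) {
        // Mapping for an imaginary feed; the field names are invented.
        Map<String, String> mapping = new LinkedHashMap<>();
        mapping.put("ID_ISIN", "isin");
        mapping.put("SECURITY_DES", "description");

        ConfigDrivenAdapter adapter = new ConfigDrivenAdapter(mapping);

        Map<String, String> raw = new LinkedHashMap<>();
        raw.put("ID_ISIN", "US0378331005");
        raw.put("SECURITY_DES", "APPLE INC");
        raw.put("UNUSED_FIELD", "ignored");

        // Fields absent from the mapping are simply dropped.
        System.out.println(adapter.toCanonical(raw));
    }
}
```

Swapping in a different mapping would, in this sketch, yield a new "adaptive client" for another source with no change to the adapter code itself.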

Dilkes says the ReferenceDataFactory technology is used by Accenture within its Managed Reference Data Service. ReferenceDataFactory is also partnering with LakeFront Data Ventures, the consultancy founded by Dale Richards, formerly of SunGard, and recently bolstered by the hire of other ex-SunGard men Marc Odho and Rob Ord (Reference Data Review, February 2007). ReferenceDataFactory hopes to work with the large data vendors as well as financial institutions.

ReferenceDataFactory’s aim is “to enable the adaptive enterprise”, says Dilkes. “Our solution is a J2EE-based container, enabling you to plug in anything you like. Our intention is not to replace solutions like GoldenSource and Asset Control. We offer configurable adapters for existing databases – our technology could be used to get data into and out of databases like those. We can make an Asset Control or a GoldenSource behave like a service – or we could be implemented in conjunction with JRules from Ilog, for example.” The data management systems vendors are more focused on rules, internal data management and data centralisation than on integration with downstream systems, Dilkes adds. “Often they are built on proprietary technology bases, rather than on open platforms for data integration. The need for integration is obvious: the value comes with getting market and securities data out to downstream systems. Service oriented implementations are inevitable, and this space is ideal for them.”
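The idea of making a master database "behave like a service" is the classic service-facade pattern: downstream systems call a neutral interface, while the implementation wraps whatever store sits behind it. The sketch below assumes invented names throughout and stands in for any master database; it is not the vendor's actual API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical service interface that downstream systems would code against.
interface SecurityDataService {
    Optional<String> getDescription(String isin);
}

// Stand-in for a master data store (a GoldenSource- or Asset Control-style
// database, say) exposed through the service interface. The backing map
// is illustrative only.
class MasterDatabaseService implements SecurityDataService {
    private final Map<String, String> store = new HashMap<>();

    void load(String isin, String description) {
        store.put(isin, description);
    }

    @Override
    public Optional<String> getDescription(String isin) {
        return Optional.ofNullable(store.get(isin));
    }
}

public class ServiceFacadeDemo {
    public static void main(String[] args) {
        MasterDatabaseService db = new MasterDatabaseService();
        db.load("US0378331005", "APPLE INC");

        // Downstream code sees only the interface, not the store behind it.
        SecurityDataService service = db;
        System.out.println(service.getDescription("US0378331005").orElse("unknown"));
    }
}
```

Because consumers depend only on the interface, the backing store can be replaced, or fronted by a rules engine, without changes downstream, which is the integration point the article describes.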
