About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ReferenceDataFactory Unveils Bloomberg Adaptive Client


Data integration vendor ReferenceDataFactory, founded by a team of ex-FTI/GoldenSource people, has launched RDF Adaptive Client for Bloomberg Back Office. The service oriented architecture (SOA)-based solution integrates across Bloomberg Back Office and Per Security, and is designed to enable standards-based distribution of the data enterprise-wide. This Adaptive Client joins similar offerings for Reuters and Interactive Data sources within the ReferenceDataFactory stable. Because the technology is separated from the configuration of the data, the vendor can also create Adaptive Clients for new sources quickly on demand, according to its managing director Andy Dilkes.
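The article does not describe the product's internals, but the separation Dilkes cites between technology and data configuration can be illustrated with a minimal sketch: a single generic adapter engine driven by a per-source field mapping, so that supporting a new vendor feed means supplying a new configuration rather than writing new code. All class and field names here are invented for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: one adapter engine, many source configurations.
// The mapping translates a source's native field names into a canonical
// internal model; the engine itself never changes per source.
public class ConfigDrivenAdapter {
    private final Map<String, String> fieldMap; // source field -> canonical field

    public ConfigDrivenAdapter(Map<String, String> fieldMap) {
        this.fieldMap = fieldMap;
    }

    // Translate one raw source record into the canonical model,
    // dropping any fields the configuration does not map.
    public Map<String, String> normalize(Map<String, String> raw) {
        Map<String, String> canonical = new HashMap<>();
        for (Map.Entry<String, String> e : raw.entrySet()) {
            String target = fieldMap.get(e.getKey());
            if (target != null) {
                canonical.put(target, e.getValue());
            }
        }
        return canonical;
    }

    public static void main(String[] args) {
        // A "Bloomberg-style" mapping, invented for this sketch.
        Map<String, String> bloombergMap = new HashMap<>();
        bloombergMap.put("ID_ISIN", "isin");
        bloombergMap.put("SECURITY_DES", "description");

        ConfigDrivenAdapter adapter = new ConfigDrivenAdapter(bloombergMap);

        Map<String, String> raw = new HashMap<>();
        raw.put("ID_ISIN", "US0378331005");
        raw.put("SECURITY_DES", "APPLE INC");

        Map<String, String> canonical = adapter.normalize(raw);
        System.out.println(canonical.get("isin"));        // US0378331005
        System.out.println(canonical.get("description")); // APPLE INC
    }
}
```

Supporting a Reuters or Interactive Data feed under this scheme would mean supplying a different mapping to the same engine, which is consistent with the on-demand adapter creation the vendor claims.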

Dilkes says the ReferenceDataFactory technology is used by Accenture within its Managed Reference Data Service. ReferenceDataFactory is also partnering with LakeFront Data Ventures, the consultancy founded by Dale Richards, formerly of SunGard, and recently bolstered by the hire of other ex-SunGard men Marc Odho and Rob Ord (Reference Data Review, February 2007). ReferenceDataFactory hopes to work with the large data vendors as well as financial institutions.

ReferenceDataFactory’s aim is “to enable the adaptive enterprise”, says Dilkes. “Our solution is a J2EE-based container, enabling you to plug in anything you like. Our intention is not to replace solutions like GoldenSource and Asset Control. We offer configurable adapters for existing databases – our technology could be used to get data into and out of databases like those. We can make an Asset Control or a GoldenSource behave like a service – or we could be implemented in conjunction with JRules from Ilog, for example.” The data management systems vendors are more focused on rules, internal data management and data centralisation than on integration with downstream systems, Dilkes adds. “Often they are built on proprietary technology bases, rather than on open platforms for data integration. The need for integration is obvious: the value comes with getting market and securities data out to downstream systems. Service oriented implementations are inevitable, and this space is ideal for them.”
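Making an existing data management system "behave like a service", as Dilkes puts it, is essentially a service facade over the store: downstream systems depend only on an interface, not on the vendor product behind it. The following is a rough sketch of that idea under invented names; the in-memory store stands in for a GoldenSource- or Asset Control-style installation, whose real integration would of course be far more involved.

```java
import java.util.HashMap;
import java.util.Map;

// The service contract downstream consumers code against.
// Name and method are invented for illustration.
interface SecurityMasterService {
    String issuerFor(String isin);
}

// Stand-in for an existing reference data store; here just a map.
class InMemoryStore {
    private final Map<String, String> issuers = new HashMap<>();
    void put(String isin, String issuer) { issuers.put(isin, issuer); }
    String get(String isin) { return issuers.get(isin); }
}

// The adapter that makes the store "behave like a service":
// consumers see only SecurityMasterService and can be repointed
// at a different backend without change.
class StoreBackedService implements SecurityMasterService {
    private final InMemoryStore store;
    StoreBackedService(InMemoryStore store) { this.store = store; }
    public String issuerFor(String isin) { return store.get(isin); }
}

public class ServiceFacadeDemo {
    public static void main(String[] args) {
        InMemoryStore store = new InMemoryStore();
        store.put("US0378331005", "Apple Inc");

        SecurityMasterService svc = new StoreBackedService(store);
        System.out.println(svc.issuerFor("US0378331005")); // Apple Inc
    }
}
```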
