About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ReferenceDataFactory Unveils Bloomberg Adaptive Client


Data integration vendor ReferenceDataFactory (founded by a team of ex-FTI/GoldenSource people) has launched RDF Adaptive Client for Bloomberg Back Office. The service oriented architecture (SOA)-based solution for integration across Bloomberg Back Office and Per Security is designed to enable standards-based distribution of the data enterprise-wide. This Adaptive Client joins similar offerings for Reuters and Interactive Data sources within the ReferenceDataFactory stable. The vendor can also quickly create Adaptive Clients for new sources on demand because its technology is separated from the configuration of the data, according to managing director Andy Dilkes.
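The separation Dilkes describes – a single adapter engine driven by per-source configuration – can be sketched roughly as follows. This is an illustrative example only: the class and field names (AdaptiveClient, ID_ISIN, PX_LAST and so on) are hypothetical and not ReferenceDataFactory's actual API; the point is that supporting a new source means supplying a new mapping, not new code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a configuration-driven adapter: the engine is
// generic, and each data source is described purely by a field mapping.
public class AdaptiveClient {
    // Maps source-specific field names to the canonical enterprise model.
    private final Map<String, String> fieldMapping;

    public AdaptiveClient(Map<String, String> fieldMapping) {
        this.fieldMapping = fieldMapping;
    }

    // Translate a raw source record into canonical form, dropping any
    // fields the configuration does not know about.
    public Map<String, String> normalize(Map<String, String> sourceRecord) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : sourceRecord.entrySet()) {
            String canonical = fieldMapping.get(e.getKey());
            if (canonical != null) {
                out.put(canonical, e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Configuration for an imagined Bloomberg-style feed.
        Map<String, String> bbgMapping = new HashMap<>();
        bbgMapping.put("ID_ISIN", "isin");
        bbgMapping.put("PX_LAST", "lastPrice");
        AdaptiveClient bbg = new AdaptiveClient(bbgMapping);

        Map<String, String> raw = new HashMap<>();
        raw.put("ID_ISIN", "US0378331005");
        raw.put("PX_LAST", "172.50");
        System.out.println(bbg.normalize(raw)); // canonical record
    }
}
```

On this reading, "creating an Adaptive Client on demand" amounts to authoring a new configuration for the same engine.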

Dilkes says the ReferenceDataFactory technology is used by Accenture within its Managed Reference Data Service. ReferenceDataFactory is also partnering with LakeFront Data Ventures, the consultancy founded by Dale Richards, formerly of SunGard, and recently bolstered by the hire of other ex-SunGard men Marc Odho and Rob Ord (Reference Data Review, February 2007). ReferenceDataFactory hopes to work with the large data vendors as well as financial institutions.

ReferenceDataFactory’s aim is “to enable the adaptive enterprise”, says Dilkes. “Our solution is a J2EE-based container, enabling you to plug in anything you like. Our intention is not to replace solutions like GoldenSource and Asset Control. We offer configurable adapters for existing databases – our technology could be used to get data into and out of databases like those. We can make an Asset Control or a GoldenSource behave like a service – or we could be implemented in conjunction with JRules from Ilog, for example.” The data management systems vendors are more focused on rules, internal data management and data centralisation than on integration with downstream systems, Dilkes adds. “Often they are built on proprietary technology bases, rather than on open platforms for data integration. The need for integration is obvious: the value comes with getting market and securities data out to downstream systems. Service oriented implementations are inevitable, and this space is ideal for them.”
