About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Delta RDF Implementation on Track to Deliver Data Vendor Feed Plug and Play Functionality, Says Algorithmics’ Orde


Risk management solution vendor Algorithmics’ decision to roll out First Derivatives’ Delta Reference Data Factory as its new core reference data engine was driven by a desire to reduce new client implementation times and to allow a more plug and play approach to changing existing customers’ data feeds, according to Roger Orde, senior director of Algo Risk Service at Algorithmics. The vendors had been discussing a possible rollout for around three years, but those discussions only became serious within the last eight to 12 months, as Algorithmics felt the pressure of growing client numbers on its in-house data management system.

“We started to talk to Reference Data Factory about three years ago, before they were purchased by First Derivatives,” explains Orde. “At the time it was decided that we would build our own in-house solution because Algo Risk Service was still developing at that point in time.”

The interim period has seen on and off discussions between the two vendors, but it is only within the last eight to 12 months that Algorithmics has got “very serious” about a possible implementation, according to Orde. “The reason we decided to go with this kind of solution at this point was simply because Algo Risk Service had got to the point where many of the tools that we built in-house, although they served us very well in the past, were not the right tools to allow us to double or triple our client base,” he says.

Algorithmics also considered other solutions on the market and spoke to two other vendors in the space (one of which is fairly well known in the EDM community), but opted for the First Derivatives offering because it was the best fit. The other two vendors had issues with their technical infrastructure for data storage and the location of their storage facilities (in the US only, rather than Europe), and both provided much more functionality than Algorithmics needed, according to Orde.

He explains that the vendor felt that what First Derivatives did particularly well was the acquisition, normalisation, matching and storage of both reference and pricing data. “We need to be in a position where we can cleanse any data that is coming into us from any provider. That was another key thing for us – their ability to hook up to Reuters or Bloomberg or Interactive Data or Telekurs or directly to custodians – they had a lot of the pipes already built. A lot of their products are available out of the box with minimal customisation. That, combined with their track record on shorter implementation times, sealed the deal for them,” he elaborates.
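To illustrate the kind of acquire, normalise and match flow described above, here is a minimal sketch. All field names, vendor layouts and the match key are illustrative assumptions for the purpose of the example, not details of the Reference Data Factory product.

```python
# Illustrative sketch of an acquire -> normalise -> match flow for vendor
# data feeds. Field names and records are hypothetical, not taken from any
# real vendor feed or from the Reference Data Factory product.

def normalise(record, mapping):
    """Rename vendor-specific fields to a common internal schema."""
    return {internal: record[external] for internal, external in mapping.items()}

# Two hypothetical vendor feeds delivering the same security in different layouts.
feed_a = [{"ISIN": "US0378331005", "LastPx": "170.10"}]
feed_b = [{"isin_code": "US0378331005", "close_price": "170.12"}]

# Per-vendor mappings from internal field names to each feed's field names.
MAPPINGS = {
    "vendor_a": {"isin": "ISIN", "price": "LastPx"},
    "vendor_b": {"isin": "isin_code", "price": "close_price"},
}

def match(*feeds):
    """Normalise records from every feed and group them by ISIN."""
    matched = {}
    for vendor, records in feeds:
        for rec in records:
            row = normalise(rec, MAPPINGS[vendor])
            matched.setdefault(row["isin"], {})[vendor] = float(row["price"])
    return matched

# One consolidated view per instrument, keyed by ISIN, with each vendor's price.
golden = match(("vendor_a", feed_a), ("vendor_b", feed_b))
```

Once feeds are normalised into a shared schema like this, swapping one provider for another becomes a matter of supplying a new mapping rather than rewriting downstream logic, which is the "pipes already built" benefit Orde describes.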

The discussions between the vendors obviously started before the acquisition of Reference Data Factory by First Derivatives, which happened back in October last year. The vendors were therefore well down the road in their contract talks before the First Derivatives deal went through. Orde explains that the deal had very little impact on the implementation itself, although it slowed the discussion down a little bit as Algorithmics was keen to see what First Derivatives would bring to the table and the future of the Reference Data Factory solution.

The implementation has been kicked off and Algorithmics has broken the rollout down into two main phases, according to Orde. He explains that the plan is for the first phase to be implemented by September. “We have contracted two of the consultants from First Derivatives to come in and help us document our current workflows with our existing in-house solution. We are in the architecture stage with Reference Data Factory, so right now we are gathering input to decide where to build out the architecture for implementation,” he explains.

Phase two, which involves the addition of more data management and distribution services, is scheduled to be completed in February or March 2011. Orde reckons this phase will have the biggest impact on the vendor’s clients, simplifying their processes and the integration of new data feeds.

“With any software implementation the challenges are going to be related to validation and testing,” he adds. “We have gone through enough due diligence with Reference Data Factory to know that their implementations tend to go relatively smoothly.”

However, one of Orde’s key concerns was that he didn’t want the data management solution vendor to come in, implement the solution for Algorithmics and then leave the firm tied to Reference Data Factory indefinitely. “So we slowed down the process so that I can use a reduced number of consultants from Reference Data Factory, but those are the top tier consultants and part of the main deliverable is knowledge transfer to my own team,” he explains.

The immediate benefits of the implementation are unlikely to be obvious, but what Algorithmics’ clients probably will notice is better validation and cleansing of data at the front end, says Orde. “Down the road, as clients add additional data feeds, they will notice that we are now able to turn that around very quickly and with a high degree of confidence that the data is correct. Currently, changing a data provider or picking up a data feed for a custodian is quite a task, but Reference Data Factory has these pipes built already. It will be a matter of plugging Reference Data Factory in and doing the data testing and rolling it out. Time to meet enhanced customer requirements will be the biggest thing that our current clients will notice,” he continues.

For new clients coming on board, the hope is that it will significantly reduce implementation times: “Probably between 40-50% of the current implementation time is directly related to acquiring, mapping, validating and cleansing data. If we can make that a standard out of the box process, we can significantly reduce the implementation time.”

The key metrics for the implementation’s success are going to be turnaround times for existing clients adding new feeds and total implementation times for new clients; much the same as for many other financial institutions’ implementations of EDM solutions. “We will also be looking for improved levels of client satisfaction,” adds Orde.

This implementation, however, is far from the end of the road for Algorithmics’ data journey, he explains. “One of the key things for us in the future is moving towards a service oriented architecture (SOA), where our solutions are plug and play. Probably the second most important thing on the list would be to provide a multi-tenancy architecture. The way the service is set up right now is that each client has its own instance of the application, which makes us truly a managed service. I really want to move to a multi-tenancy architecture where we can have multiple clients sharing the same database of reference and pricing data, but obviously with security in place to ensure there is no cross pollination between them,” he concludes.
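The multi-tenancy goal Orde describes, multiple clients sharing one reference and pricing data store without any cross-pollination, can be sketched as a mandatory tenant filter on every data access. This is a minimal sketch of the general pattern under assumed schema and names, not Algorithmics’ actual design.

```python
# Minimal sketch of row-level multi-tenant isolation: clients share one
# database, but every query is forced through a gate that filters on a
# tenant id. The schema and tenant names are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE positions (tenant_id TEXT, isin TEXT, qty INTEGER)")
db.executemany("INSERT INTO positions VALUES (?, ?, ?)", [
    ("client_a", "US0378331005", 100),
    ("client_b", "US0378331005", 250),
])

def positions_for(tenant_id):
    """All reads go through this gate; the tenant filter is non-optional."""
    return db.execute(
        "SELECT isin, qty FROM positions WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()

# Each client sees only its own rows despite the shared table.
client_a_view = positions_for("client_a")  # excludes client_b's holdings
```

Contrast this with the single-tenant setup Orde describes as the status quo, where each client runs its own application instance: sharing one store cuts duplication, but shifts the isolation burden onto access controls like the gate above.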
