Delta RDF Implementation on Track to Deliver Data Vendor Feed Plug and Play Functionality, Says Algorithmics’ Orde

Risk management solution vendor Algorithmics’ decision to roll out First Derivatives’ Delta Reference Data Factory as its new core reference data engine was driven by a desire to reduce new client implementation times and to allow a more plug-and-play approach to changing existing customers’ data feeds, according to Roger Orde, senior director of Algo Risk Service at Algorithmics. The two vendors have discussed a possible rollout for around three years, but the talks only became serious within the last eight to 12 months, as Algorithmics felt the pressure of a growing client base on its in-house data management system.

“We started to talk to Reference Data Factory about three years ago, before they were purchased by First Derivatives,” explains Orde. “At the time it was decided that we would build our own in-house solution because Algo Risk Service was still developing at that point in time.”

The interim period has seen on-and-off discussions between the two vendors, but it is only within the last eight to 12 months that Algorithmics has got “very serious” about a possible implementation, according to Orde. “The reason we decided to go with this kind of solution at this point was simply because Algo Risk Service had got to the point where many of the tools that we built in-house, although they served us very well in the past, were not the right tools to allow us to double or triple our client base,” he says.

Algorithmics also considered other solutions on the market and spoke to two other vendors in the space (one of which is fairly well known in the EDM community), but opted for the First Derivatives offering because it was the best fit. The other two vendors had issues with their technical infrastructure for data storage and the location of their storage facilities (US only, rather than Europe), and both provided far more functionality than Algorithmics needed, according to Orde.

He explains that, in Algorithmics’ view, what First Derivatives did particularly well was the acquisition, normalisation, matching and storage of both reference and pricing data. “We need to be in a position where we can cleanse any data that is coming into us from any provider. That was another key thing for us – their ability to hook up to Reuters or Bloomberg or Interactive Data or Telekurs or directly to custodians – they had a lot of the pipes already built. A lot of their products are available out of the box with minimal customisation. That, combined with their track record on shorter implementation times, sealed the deal for them,” he elaborates.
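Neither vendor has published the internals of these “pipes”, but the plug-and-play feed model Orde describes is typically built as a set of per-vendor adapters that each normalise raw records into one canonical format, so that swapping one provider for another means swapping an adapter rather than rewriting downstream logic. A minimal sketch in Python, with all class and field names hypothetical:

```python
from dataclasses import dataclass
from typing import Iterable, Protocol

@dataclass
class InstrumentRecord:
    """Canonical record that every vendor feed is normalised into.
    Field names are illustrative, not First Derivatives' actual schema."""
    isin: str
    name: str
    currency: str
    price: float
    source: str

class FeedAdapter(Protocol):
    """One 'pipe' per data provider: fetch raw records, map them to canonical form."""
    def fetch(self) -> Iterable[dict]: ...
    def normalise(self, raw: dict) -> InstrumentRecord: ...

class ExampleVendorAdapter:
    """Hypothetical adapter for one vendor's feed format."""
    def fetch(self) -> Iterable[dict]:
        # In practice this would call the vendor's API or read its file drop.
        yield {"id_isin": "US0378331005", "sec_name": "Apple Inc", "ccy": "USD", "px_last": 170.5}

    def normalise(self, raw: dict) -> InstrumentRecord:
        # Map this vendor's field names onto the shared canonical schema.
        return InstrumentRecord(
            isin=raw["id_isin"], name=raw["sec_name"],
            currency=raw["ccy"], price=raw["px_last"], source="example_vendor",
        )

def load_feed(adapter: FeedAdapter) -> list[InstrumentRecord]:
    # Swapping providers means swapping adapters; downstream code never changes.
    return [adapter.normalise(raw) for raw in adapter.fetch()]
```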

The discussions between the vendors began well before First Derivatives’ acquisition of Reference Data Factory, which closed back in October last year, and contract talks were already well advanced when the deal went through. Orde explains that the acquisition had very little impact on the implementation itself, although it slowed the discussions a little as Algorithmics waited to see what First Derivatives would bring to the table and what the future held for the Reference Data Factory solution.

The implementation has been kicked off and Algorithmics has broken the rollout down into two main phases, according to Orde. He explains that the plan is for the first phase to be implemented by September. “We have contracted two of the consultants from First Derivatives to come in and help us document our current workflows with our existing in-house solution. We are in the architecture stage with Reference Data Factory, so right now we are gathering input to decide where to build out the architecture for implementation,” he explains.

Phase two, which involves the addition of more data management and distribution services, is scheduled to be completed in February or March 2011. Orde reckons this phase will have the greatest impact on the vendor’s clients, simplifying their processes and the integration of new data feeds.

“With any software implementation the challenges are going to be related to validation and testing,” he adds. “We have gone through enough due diligence with Reference Data Factory to know that their implementations tend to go relatively smoothly.”

However, one of Orde’s key concerns was that he didn’t want the data management solution vendor to come in, implement the solution for Algorithmics and leave the firm tied to Reference Data Factory forever. “So we slowed down the process so that I can use a reduced number of consultants from Reference Data Factory, but those are the top tier consultants and part of the main deliverable is knowledge transfer to my own team,” he explains.

The immediate benefits of the implementation are unlikely to be obvious, but what Algorithmics’ clients probably will notice is better validation and cleansing of data at the front end, says Orde. “Down the road, as clients add additional data feeds, they will notice that we are now able to turn that around very quickly and with a high degree of confidence that the data is correct. Currently, changing a data provider or picking up a data feed for a custodian is quite a task, but Reference Data Factory has these pipes built already. It will be a matter of plugging Reference Data Factory in and doing the data testing and rolling it out. Time to meet enhanced customer requirements will be the biggest thing that our current clients will notice,” he continues.
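The article does not detail Reference Data Factory’s cleansing rules; as a hedged illustration of the kind of front-end validation Orde is referring to, and reusing the hypothetical InstrumentRecord type from the sketch above, a cleansing pass might flag records that fail simple completeness and sanity checks before they reach downstream risk calculations:

```python
def validate(record: InstrumentRecord) -> list[str]:
    """Return human-readable issues; an empty list means the record passed.
    These rules are illustrative, not the vendor's actual checks."""
    issues = []
    if len(record.isin) != 12 or not record.isin[:2].isalpha():
        issues.append(f"malformed ISIN: {record.isin!r}")
    if not record.currency or len(record.currency) != 3:
        issues.append(f"unexpected currency code: {record.currency!r}")
    if record.price <= 0:
        issues.append(f"non-positive price: {record.price}")
    return issues

# Only clean records flow through; the rest would be routed to an exceptions queue.
records = load_feed(ExampleVendorAdapter())
clean = [r for r in records if not validate(r)]
```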

For new clients coming on board, the hope is that the implementation should significantly reduce onboarding time: “Probably between 40% and 50% of the current implementation time is directly related to acquiring, mapping, validating and cleansing data. If we can make that a standard out of the box process, we can significantly reduce the implementation time.”
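To put hypothetical numbers on that claim: if a new-client implementation currently takes 20 weeks and 40-50% of it (eight to ten weeks) is spent acquiring, mapping, validating and cleansing data, then reducing the data work to an out-of-the-box step of, say, two weeks would cut the total to 12-14 weeks, an overall saving of roughly 30-40%. The figures are illustrative; Orde gives no absolute timescales.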

The key metrics for the implementation’s success are going to be turnaround times for existing clients adding new feeds and total implementation times for new clients; much the same as for many other financial institutions’ implementations of EDM solutions. “We will also be looking for improved levels of client satisfaction,” adds Orde.

This implementation, however, is far from the end of the road for Algorithmics’ data journey, he explains. “One of the key things for us in the future is moving towards a service-oriented architecture (SOA), where our solutions are plug and play. Probably the second most important thing on the list would be to provide a multi-tenancy architecture. The way the service is set up right now is that each client has its own instance of the application, which makes us truly a managed service. I really want to move to a multi-tenancy architecture where we can have multiple clients sharing the same database of reference and pricing data, but obviously with security in place to ensure there is no cross-pollination between them,” he concludes.
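Orde gives no implementation detail for the multi-tenancy target, but a common pattern for sharing a single reference and pricing database across clients is to tag every row with a tenant identifier and force all reads through a tenant-scoped accessor, so one client can never see another’s data. A minimal sketch, with all names hypothetical:

```python
import sqlite3

# One shared store; every row is tagged with the client (tenant) that owns it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (tenant_id TEXT, isin TEXT, price REAL)")
conn.executemany(
    "INSERT INTO prices VALUES (?, ?, ?)",
    [("client_a", "US0378331005", 170.5), ("client_b", "US0378331005", 170.5)],
)

def prices_for(tenant_id: str) -> list[tuple]:
    # The tenant filter is applied once, here, rather than trusted to every caller,
    # so 'cross-pollination' between clients is prevented by construction.
    return conn.execute(
        "SELECT isin, price FROM prices WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()

print(prices_for("client_a"))  # returns only client_a's rows
```

Production systems typically enforce the same rule at the database layer as well, via separate schemas or row-level security, rather than relying on application code alone.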
