The leading knowledge platform for the financial technology industry


Delta RDF Implementation on Track to Deliver Data Vendor Feed Plug and Play Functionality, Says Algorithmics’ Orde


Risk management solution vendor Algorithmics’ decision to roll out First Derivatives’ Delta Reference Data Factory as its new core reference data engine was driven by a desire to reduce new client implementation times and to allow a more plug and play approach to changing existing customers’ data feeds, according to Roger Orde, senior director of Algo Risk Service at Algorithmics. The vendors have been discussing a possible rollout for around three years, but the talks only became serious within the last eight to 12 months, as Algorithmics felt the pressure of a growing client base on its in-house data management system.

“We started to talk to Reference Data Factory about three years ago, before they were purchased by First Derivatives,” explains Orde. “At the time it was decided that we would build our own in-house solution because Algo Risk Service was still developing at that point in time.”

The interim period has seen on-and-off discussions between the two vendors, but it is only within the last eight to 12 months that Algorithmics has got “very serious” about a possible implementation, according to Orde. “The reason we decided to go with this kind of solution at this point was simply because Algo Risk Service had got to the point where many of the tools that we built in-house, although they served us very well in the past, were not the right tools to allow us to double or triple our client base,” he says.

Algorithmics also considered other solutions on the market and spoke to two other vendors in the space (one of which is fairly well known in the EDM community), but opted for the First Derivatives offering because it was the best fit. The other two vendors had issues with their technical infrastructure for data storage and the location of their storage facilities (in the US only, rather than Europe), and both provided much more functionality than Algorithmics needed, according to Orde.

He explains that the vendor felt that what First Derivatives did particularly well was the acquisition, normalisation, matching and storing of both reference and pricing data. “We need to be in a position where we can cleanse any data that is coming into us from any provider. That was another key thing for us – their ability to hook up to Reuters or Bloomberg or Interactive Data or Telekurs or directly to custodians – they had a lot of the pipes already built. A lot of their products are available out of the box with minimal customisation. That, combined with their track record on shorter implementation times, sealed the deal for them,” he elaborates.
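The “pipes” Orde describes are, in essence, per-vendor adapters that translate each provider’s raw feed format into one common record, with validation and cleansing rules applied centrally. The sketch below illustrates that pattern only; the record fields, adapter names and validation rules are hypothetical, not Reference Data Factory’s actual schema or API.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical normalised security record; field names are illustrative.
@dataclass
class SecurityRecord:
    isin: str
    name: str
    price: float
    currency: str

# Each vendor "pipe" is an adapter from that vendor's raw row layout
# into the common record. Adding a feed means adding one adapter.
def parse_vendor_a(row: dict) -> SecurityRecord:
    return SecurityRecord(row["ISIN"], row["NAME"], float(row["PX_LAST"]), row["CCY"])

def parse_vendor_b(row: dict) -> SecurityRecord:
    return SecurityRecord(row["isin_code"], row["issuer"], float(row["close"]), row["curr"])

ADAPTERS: dict[str, Callable[[dict], SecurityRecord]] = {
    "vendor_a": parse_vendor_a,
    "vendor_b": parse_vendor_b,
}

def validate(rec: SecurityRecord) -> bool:
    # Minimal cleansing rule: a 12-character ISIN and a positive price.
    return len(rec.isin) == 12 and rec.price > 0

def ingest(feed: str, rows: list[dict]) -> list[SecurityRecord]:
    # Swapping a data provider only means selecting a different adapter;
    # validation and everything downstream stay unchanged.
    parse = ADAPTERS[feed]
    return [rec for rec in map(parse, rows) if validate(rec)]
```

Because normalisation and cleansing sit behind a single interface, changing a client from one provider to another becomes the plug-and-play swap Orde describes, rather than a bespoke integration project.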

The discussions between the vendors obviously started before the acquisition of Reference Data Factory by First Derivatives, which happened back in October last year. The vendors were therefore well down the road in their contract talks before the First Derivatives deal went through. Orde explains that the deal had very little impact on the implementation itself, although it slowed the discussion down a little bit as Algorithmics was keen to see what First Derivatives would bring to the table and the future of the Reference Data Factory solution.

The implementation has been kicked off and Algorithmics has broken the rollout down into two main phases, according to Orde. He explains that the plan is for the first phase to be implemented by September. “We have contracted two of the consultants from First Derivatives to come in and help us document our current workflows with our existing in-house solution. We are in the architecture stage with Reference Data Factory, so right now we are gathering input to decide where to build out the architecture for implementation,” he explains.

Phase two, which involves the addition of more data management and distribution services, is scheduled to be completed in February or March 2011. Orde reckons this phase will have the greatest impact on the vendor’s clients, simplifying their processes and the integration of new data feeds.

“With any software implementation the challenges are going to be related to validation and testing,” he adds. “We have gone through enough due diligence with Reference Data Factory to know that their implementations tend to go relatively smoothly.”

However, one of Orde’s key concerns was that he didn’t want the data management solution vendor to come in, implement the solution for Algorithmics and then leave it tied to Reference Data Factory forever. “So we slowed down the process so that I can use a reduced number of consultants from Reference Data Factory, but those are the top tier consultants and part of the main deliverable is knowledge transfer to my own team,” he explains.

The immediate benefits of the implementation are unlikely to be obvious, but what Algorithmics’ clients probably will notice is better validation and cleansing of data at the front end, says Orde. “Down the road, as clients add additional data feeds, they will notice that we are now able to turn that around very quickly and with a high degree of confidence that the data is correct. Currently, changing a data provider or picking up a data feed for a custodian is quite a task, but Reference Data Factory has these pipes built already. It will be a matter of plugging Reference Data Factory in and doing the data testing and rolling it out. Time to meet enhanced customer requirements will be the biggest thing that our current clients will notice,” he continues.

For new clients coming on board, the hope is that it should significantly reduce their implementation time: “Probably between 40-50% of the current implementation time is directly related to acquiring, mapping, validating and cleansing data. If we can make that a standard out of the box process, we can significantly reduce the implementation time.”

The key metrics for the implementation’s success are going to be turnaround times for existing clients adding new feeds and total implementation times for new clients; much the same as for many other financial institutions’ implementations of EDM solutions. “We will also be looking for improved levels of client satisfaction,” adds Orde.

This implementation, however, is far from the end of the road for Algorithmics’ data journey, he explains. “One of the key things for us in the future is moving towards a service oriented architecture (SOA), where our solutions are plug and play. Probably the second most important thing on the list would be to provide a multi-tenancy architecture. The way the service is set up right now is that each client has its own instance of the application, which makes us truly a managed service. I really want to move to a multi-tenancy architecture where we can have multiple clients sharing the same database of reference and pricing data, but obviously with security in place to ensure there is no cross pollination between them,” he concludes.
