About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

PolarLake Launches Standalone Reference Data Reconciliation Solution


Provider of reference data distribution products PolarLake has launched a standalone reference data reconciliation solution aimed at fixing the problem of poor reference data quality in downstream systems. The vendor has created the new solution as part of its plans to develop modular solutions for the market and extend its client reach.

According to PolarLake, the new solution allows organisations to reconcile centralised golden copy with the distributed copies of reference data in downstream systems.
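The core idea can be illustrated with a minimal sketch: compare a golden copy record field by field against the copy held in a downstream system and report any values that have drifted. The record fields and values below are illustrative assumptions, not PolarLake's actual schema or implementation, which is not public.

```python
# Reconcile a centralised golden copy record against the copy held in a
# downstream system, reporting any fields whose values have drifted.
# Field names and values are illustrative, not PolarLake's schema.

def reconcile(golden: dict, downstream: dict) -> list:
    """Return (field, golden_value, downstream_value) for each mismatch."""
    breaks = []
    for field, golden_value in golden.items():
        downstream_value = downstream.get(field)
        if downstream_value != golden_value:
            breaks.append((field, golden_value, downstream_value))
    return breaks

golden = {"isin": "US0378331005", "currency": "USD", "coupon": 0.0}
fund_accounting = {"isin": "US0378331005", "currency": "US$", "coupon": 0.0}

print(reconcile(golden, fund_accounting))
# → [('currency', 'USD', 'US$')]
```

In practice a production reconciliation would also handle records missing entirely from one side and tolerances for numeric fields, but the break report above captures the essential control.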

“The key driver for this solution is really the question of data quality once you start distributing reference data. A lot of the focus up until quite recently has been on creating golden copy and building data warehouses, but I think that as the industry has matured, people are realising that is only part of the problem,” explains John Randles, CEO of PolarLake.

Once that reference data is integrated into downstream applications – fund accounting systems, order management systems and portfolio management systems – the integrity of the golden copy can be destroyed in the process, he says. There is also a significant risk of intervention by other users and other processes, which means data that was thought to be clean ends up as dirty data.

“Our reconciliation solution is really targeting the control process around that data. Whether they use PolarLake for distribution or they use any other mechanism, our solution gives the business management the confidence that what they are distributing still retains its data integrity. This is critical to ensuring the success of enterprise data management,” claims Randles.

PolarLake has been working for over a year on this project and has evolved this offering from the RDD solution into a standalone solution. “We engaged with a number of customers with particular problems around guaranteeing the quality of distributed data and we worked with them on the specifications. We went through a number of versions of the solution before we announced it as a standalone offering. We defined and implemented the solution and we built enhancements on top of that after feedback from these clients. We then defined a set of business processes around that to help clients validate their distribution and ensure they have good quality during the use of data in the trade lifecycle.”

One of the key reasons the vendor chose to produce a standalone solution was to meet the needs of firms that have distribution problems but, having already invested heavily in their data infrastructure, are looking for specific control around data distribution rather than a replacement of the whole system, explains Randles.

“The key challenge around producing a solution like this is the ability to deal with multiple formats and multiple ways of describing what may be the same piece of data and putting them all into a common format where you can reconcile them. Reconciliation is a well known process and the industry has been doing it for years but the challenge around reference data is dealing with these irregular data structures and formats,” he says.
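The normalisation step Randles describes – mapping multiple ways of describing the same piece of data into one common format before reconciling – can be sketched as follows. The field mappings and cleaning rules here are illustrative assumptions for the sake of the example, not PolarLake's actual approach.

```python
# Normalise the same security record arriving in different downstream
# formats into one common shape so the copies can be reconciled.
# The field-name mappings and cleaning rules are illustrative assumptions.

FIELD_MAP = {
    "ISIN": "isin", "isin_cd": "isin",
    "Ccy": "currency", "currency_code": "currency",
}

def normalise(record: dict) -> dict:
    """Map aliased field names to canonical names and tidy string values."""
    out = {}
    for key, value in record.items():
        canonical = FIELD_MAP.get(key, key.lower())
        out[canonical] = value.strip().upper() if isinstance(value, str) else value
    return out

# The same security as described by two hypothetical downstream systems:
oms_record = {"ISIN": "us0378331005", "Ccy": "usd"}
accounting_record = {"isin_cd": "US0378331005 ", "currency_code": "USD"}

# Once normalised, the two irregular records reconcile cleanly.
assert normalise(oms_record) == normalise(accounting_record)
```

The hard part in the real world, as Randles notes, is that the mapping table above must cover many systems and irregular structures, which is why generic reconciliation tools require so much reconfiguration.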

“There are 101 reconciliation systems out there but none of them are built in particular for the reference data problem. What we found was that a lot of firms were struggling with the more traditional reconciliation systems to get them to deal with the complexities of reference data. They were spending a long time reconfiguring non-purpose built solutions.”

PolarLake has two customers currently using the system and, according to Randles, both are very happy with it. They are getting visibility reports on data quality and data effectiveness and how data becomes corrupted in downstream systems much more readily than they were previously, he says.

“The problems that were caught before were caught late in the trade lifecycle but now they can be proactively caught early in the cycle. This differentiates us from a generic integration solution by providing a proactive solution to catch problems before they occur,” Randles contends.

He believes that the standalone nature of the solution gives customers the ability to take components of PolarLake’s data offering without having to replace their entire system. It is compatible with any other data distribution mechanism, so it can be used with home-grown solutions on an existing middleware platform, he adds.

“It depends on where people are in their EDM lifecycle. If they are at the start of the process they may look for an end-to-end data distribution mechanism, but if they have put in an infrastructure and finished the integration, they can use our modular solution. There is a huge appetite for a modular approach to different value-add components that people can use around data distribution,” he continues.

To this end, PolarLake has a pipeline of different announcements that will address various challenges around distribution and integration of reference data in a modular fashion, which it will be releasing throughout the rest of the year, says Randles.
