
PolarLake Launches Standalone Reference Data Reconciliation Solution

Provider of reference data distribution products PolarLake has launched a standalone reference data reconciliation solution aimed at fixing the problem of poor reference data quality in downstream systems. The vendor has created the new solution as part of its plans to develop modular solutions for the market and extend its client reach.

According to PolarLake, the new solution allows organisations to reconcile centralised golden copy with the distributed copies of reference data in downstream systems.
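
In essence, such a reconciliation keys each downstream record against the golden copy on a common identifier and flags field-level differences as breaks. The article does not describe PolarLake’s internals, so the following Python sketch is purely illustrative: the record shapes, field names and the use of an ISIN as the key are assumptions for demonstration only.

```python
# Minimal sketch of golden-copy vs downstream reconciliation.
# Field names and record shapes are illustrative, not PolarLake's actual model.

def reconcile(golden, downstream, key="isin"):
    """Compare downstream records against the golden copy, keyed on `key`.

    Returns a list of breaks: records missing from the golden copy and
    field-level mismatches between the two copies.
    """
    golden_by_key = {rec[key]: rec for rec in golden}
    breaks = []
    for rec in downstream:
        master = golden_by_key.get(rec[key])
        if master is None:
            breaks.append({"key": rec[key], "issue": "not in golden copy"})
            continue
        for field, value in rec.items():
            if field != key and master.get(field) != value:
                breaks.append({
                    "key": rec[key],
                    "field": field,
                    "golden": master.get(field),
                    "downstream": value,
                })
    return breaks


golden_copy = [{"isin": "US0378331005", "currency": "USD", "issuer": "Apple Inc"}]
fund_accounting_copy = [{"isin": "US0378331005", "currency": "USD", "issuer": "Apple"}]
print(reconcile(golden_copy, fund_accounting_copy))
# -> [{'key': 'US0378331005', 'field': 'issuer', 'golden': 'Apple Inc', 'downstream': 'Apple'}]
```

The output of a comparison like this is what would feed the kind of data quality and exception reporting described later in the article.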

“The key driver for this solution is really the question of data quality once you start distributing reference data. A lot of the focus up until quite recently has been on creating golden copy and building data warehouses, but I think that as the industry has matured, people are realising that is only a part of the problem,” explains John Randles, CEO of PolarLake.

Once that reference data is integrated into downstream applications – fund accounting systems, order management systems and portfolio management systems – the golden copy of reference data can be destroyed in the process, he says. There is also a huge risk that intervention by other users and other processes means data you thought was clean ends up as dirty data.

“Our reconciliation solution is really targeting the control process around that data. Whether they use PolarLake for distribution or they use any other mechanism, our solution gives the business management the confidence that what they are distributing still retains its data integrity. This is critical to ensuring the success of enterprise data management,” claims Randles.

PolarLake has been working for over a year on this project and has evolved this offering from the RDD solution into a standalone solution. “We engaged with a number of customers with particular problems around guaranteeing the quality of distributed data and we worked with them on the specifications. We went through a number of versions of the solution before we announced it as a standalone offering. We defined and implemented the solution and we built enhancements on top of that after feedback from these clients. We then defined a set of business processes around that to help clients validate their distribution and ensure they have good quality during the use of data in the trade lifecycle.”

One of the key reasons the vendor chose to produce a standalone solution was to meet the needs of firms that have distribution problems but, having already invested heavily in their data infrastructure, are looking for specific controls around data distribution rather than a replacement of the entire system, explains Randles.

“The key challenge around producing a solution like this is the ability to deal with multiple formats and multiple ways of describing what may be the same piece of data, and putting them all into a common format where you can reconcile them. Reconciliation is a well-known process and the industry has been doing it for years, but the challenge around reference data is dealing with these irregular data structures and formats,” he says.
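
The normalisation step Randles describes – putting many ways of expressing the same attribute into one comparable form – can be pictured as mapping each source system’s field names and value conventions onto a canonical record before any comparison happens. The sketch below is an assumption-laden illustration, not PolarLake’s actual mapping model; the source system names, field maps and date convention are invented.

```python
# Sketch of normalising differently-described records into a common format
# before reconciliation. Source names, mappings and value conversions are hypothetical.

FIELD_MAPS = {
    "order_mgmt": {"Isin": "isin", "Ccy": "currency", "MatDate": "maturity"},
    "fund_acct":  {"isin_code": "isin", "currency_iso": "currency", "maturity_dt": "maturity"},
}

VALUE_NORMALISERS = {
    # e.g. one system stores dates as DD/MM/YYYY; the canonical form here is ISO 8601
    "maturity": lambda v: "-".join(reversed(v.split("/"))) if "/" in v else v,
}

def to_canonical(record, source):
    """Rename fields and normalise values so records from any source line up."""
    mapping = FIELD_MAPS[source]
    canonical = {}
    for src_field, value in record.items():
        field = mapping.get(src_field)
        if field is None:
            continue  # ignore fields not in the canonical model
        normalise = VALUE_NORMALISERS.get(field, lambda v: v)
        canonical[field] = normalise(value)
    return canonical


oms_rec = {"Isin": "DE0001102580", "Ccy": "EUR", "MatDate": "15/08/2031"}
fa_rec = {"isin_code": "DE0001102580", "currency_iso": "EUR", "maturity_dt": "2031-08-15"}
assert to_canonical(oms_rec, "order_mgmt") == to_canonical(fa_rec, "fund_acct")
```

Once every copy has been projected into the same canonical shape, the reconciliation itself becomes the straightforward field-by-field comparison that the industry has long been comfortable with.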

“There are 101 reconciliation systems out there but none of them is built specifically for the reference data problem. What we found was that a lot of firms were struggling to get the more traditional reconciliation systems to deal with the complexities of reference data. They were spending a long time reconfiguring non-purpose-built solutions.”

PolarLake has two customers currently using the system and, according to Randles, both are very happy with it. They are getting reports giving visibility of data quality and data effectiveness, and of how data becomes corrupted in downstream systems, much more readily than they were previously, he says.

“The problems that were caught before were caught late in the trade lifecycle but now they can be proactively caught early in the cycle. This differentiates us from a generic integration solution by providing a proactive solution to catch problems before they occur,” Randles contends.

He believes that the standalone nature of the solution means customers can take components of PolarLake’s data offering without having to replace their entire system. It is compatible with any other data distribution mechanism, so it can be used with home-grown solutions on an existing middleware platform, he adds.

“It depends on where people are in their EDM lifecycle. If they are at the start of the process they may look for an end-to-end data distribution mechanism, but if they have put in an infrastructure and finished the integration, they can use our modular solution. There is a huge appetite for a modular approach to different value-add components that people can use around data distribution,” he continues.

To this end, PolarLake has a pipeline of different announcements that will address various challenges around distribution and integration of reference data in a modular fashion, which it will be releasing throughout the rest of the year, says Randles.
