The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

PolarLake has Netted Eight New Clients This Year, Launches Reference Data Policy Engine

In line with this year’s uptick in investment in data management solutions, PolarLake CEO John Randles indicates that eight new clients have signed on the dotted line so far in 2010. The vendor has also just launched its PolarLake Reference Data Policy Engine, which Randles says is aimed at giving end users more control over how they interact with version 3.4 of PolarLake’s Reference Data Distribution (RDD) product.

To this end, the policy engine uses semantic technologies embedded within a business application to enable business operations and technical staff to define reference data usage control policies. This is all part of the vendor’s recognition that downstream users are getting much more involved in the data management process, notes Randles. He is convinced that enterprise data management (EDM) is evolving into what he calls “reference data supply chain management”.

“We’ve seen huge interest from firms not just wanting integration in the traditional form of middleware but more to do with implementing policy around who gets access to what data,” he explains. “This is about distribution priorities and the business uses of data. It is a much more sophisticated look at distribution from a business perspective rather than just as a technology issue.”

The new policy engine therefore features data access control policies based on downstream user and application profiles and data usage quotas aimed at ensuring appropriate and controlled use of data. The system is better able to adapt to end user requirements as it has the ability to learn and alter the data model based on observation of change in data patterns over time, says the vendor. It also provides more data on expected delivery schedules and exception escalation paths.
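The combination described above, per-profile data access lists plus usage quotas, can be sketched in a few lines of code. This is a purely illustrative Python sketch of the general technique; it is not PolarLake’s implementation, and all names and structures here are hypothetical.

```python
# Illustrative sketch of profile-based access control with usage quotas.
# All names are hypothetical; this does not reflect PolarLake's actual API.
from dataclasses import dataclass


@dataclass
class Policy:
    allowed_datasets: set   # datasets this consumer profile may receive
    daily_quota: int        # maximum records deliverable per day
    used_today: int = 0     # running usage counter


class PolicyEngine:
    def __init__(self):
        self.policies = {}  # profile name -> Policy

    def register(self, profile, policy):
        self.policies[profile] = policy

    def request(self, profile, dataset, records):
        """Return True if the profile may receive `records` rows of `dataset`."""
        policy = self.policies.get(profile)
        if policy is None or dataset not in policy.allowed_datasets:
            return False    # no policy defined, or dataset not permitted
        if policy.used_today + records > policy.daily_quota:
            return False    # usage quota would be exceeded
        policy.used_today += records
        return True


engine = PolicyEngine()
engine.register("risk_app", Policy({"pricing", "ratings"}, daily_quota=1000))
print(engine.request("risk_app", "pricing", 400))            # True: within quota
print(engine.request("risk_app", "pricing", 700))            # False: quota exceeded
print(engine.request("risk_app", "corporate_actions", 10))   # False: not permitted
```

A production system would also cover the learning and scheduling behaviour the vendor describes, but the core control point, deciding per request who gets which data and how much, reduces to checks of this shape.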

Randles reckons people have now got their heads around the concept of data warehousing and are looking to extend this focus further downstream to fully realise the benefits of centralisation. “If you are building a data warehouse and it is difficult to use from the consuming systems, then this misses the whole point of data centralisation. It becomes another repository to be maintained rather than something of value,” he says.

The industry has moved on to look at the demands of consuming applications, and end users are getting much more specific about how they want to interact with a centralised service, according to Randles. This includes the schedules and formats in which they receive the data, as well as putting filters into the process so that they receive only the data they want rather than everything the system can provide. The focus of a reference data platform is therefore on better servicing a firm’s internal customers and making sure data distribution is conducted in a timely manner. This is all about the “last mile” of the reference data challenge: ensuring the data is fit for purpose.

In the second half of this year, PolarLake will release version four of its RDD, which was first released on the market back in 2006. Randles notes that this will probably happen at the start of the third quarter and should include a lot of “interesting new features”. The vendor has been working on the development of the solution with its customers, which include six of the top 10 investment banks, two of the top five prime brokers and two of the top 10 asset managers, says Randles. “We think the product is fairly unique in solving the data supply chain challenge, rather than as a golden copy focused solution,” he adds.

The challenge for PolarLake going forward is around “education” and explaining how the vendor’s approach differs from a generic technology approach to data challenges, notes Randles. “We work in conjunction with a lot of the golden copy providers to fill the gap between generic technology and accessing the end data that they provide,” he says.

The vendor is focusing on gaining traction this year within the top 100 asset managers and the tier one and two sell side firms, adds Randles.
