
PolarLake Nets Eight New Clients This Year, Launches Reference Data Policy Engine

In keeping with the current uptick in investment in data management solutions, PolarLake CEO John Randles indicates that eight new clients have signed on the dotted line so far in 2010. The vendor has also just launched its PolarLake Reference Data Policy Engine, which Randles says is aimed at affording end users more control over how they interact with version 3.4 of PolarLake’s Reference Data Distribution (RDD) product.

To this end, the policy engine uses semantic technologies embedded within a business application to enable business operations and technical staff to define reference data usage control policies. This is all part of the vendor’s recognition that downstream users are getting much more involved in the data management process, notes Randles. He is convinced that enterprise data management (EDM) is evolving into what he calls “reference data supply chain management”.

“We’ve seen huge interest from firms not just wanting integration in the traditional form of middleware but more to do with implementing policy around who gets access to what data,” he explains. “This is about distribution priorities and the business uses of data. It is a much more sophisticated look at distribution from a business perspective rather than just as a technology issue.”

The new policy engine therefore features data access control policies based on downstream user and application profiles, as well as data usage quotas aimed at ensuring appropriate and controlled use of data. The system is better able to adapt to end user requirements because it can learn and alter the data model based on observed changes in data patterns over time, says the vendor. It also provides more data on expected delivery schedules and exception escalation paths.
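To make the description more concrete, the snippet below is a minimal, hypothetical sketch of how profile-based access rules and usage quotas of this kind might be expressed; the class names, fields and quota logic are illustrative assumptions rather than PolarLake’s actual interface.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each downstream consumer has a profile listing the
# datasets it may access and a daily usage quota; requests are checked
# against both before any data is released.

@dataclass
class ConsumerProfile:
    name: str
    allowed_datasets: set = field(default_factory=set)
    daily_quota: int = 10_000  # maximum records per day

class UsageTracker:
    def __init__(self):
        self.counts = {}  # records delivered so far, per consumer

    def request(self, profile: ConsumerProfile, dataset: str, n_records: int) -> bool:
        """Return True if the request is permitted under the policy."""
        if dataset not in profile.allowed_datasets:
            return False  # access control: consumer not entitled to this dataset
        used = self.counts.get(profile.name, 0)
        if used + n_records > profile.daily_quota:
            return False  # usage quota exceeded
        self.counts[profile.name] = used + n_records
        return True

# Example: a settlements application entitled to instrument and counterparty data
settlements = ConsumerProfile("settlements-app",
                              allowed_datasets={"instruments", "counterparties"},
                              daily_quota=50_000)
tracker = UsageTracker()
print(tracker.request(settlements, "instruments", 1_000))  # True
print(tracker.request(settlements, "prices", 10))          # False: not entitled
```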

Randles reckons people have now got their heads around the concept of data warehousing and are looking to extend this focus further downstream to fully realise the benefits of centralisation. “If you are building a data warehouse and it is difficult to use from the consuming systems, then this misses the whole point of data centralisation. It becomes another repository to be maintained rather than something of value,” he says.

The industry has moved on to look at the demands of consuming applications, and end users are getting much more specific about how they want to interact with a centralised service, according to Randles. This includes the schedules and formats in which they receive the data, as well as putting filters into the process so they receive the data they want rather than everything the system can provide. The focus of a reference data platform is therefore on better servicing a firm’s internal customers and making sure data distribution is conducted in a timely manner. This is all about the “last mile” of the reference data challenge: ensuring the data is fit for purpose.
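As a purely illustrative sketch of that “last mile”, the example below applies per-consumer format and filter preferences to a centralised feed; the subscription fields and sample records are assumptions made for the illustration only.

```python
import csv
import io
import json

# Hypothetical per-consumer delivery preferences for a centralised reference
# data service: each subscriber chooses a schedule, an output format and a
# filter, rather than receiving the full feed.

SUBSCRIPTIONS = [
    {"consumer": "risk-engine", "schedule": "06:00", "format": "json",
     "filter": lambda rec: rec["asset_class"] == "equity"},
    {"consumer": "settlements", "schedule": "hourly", "format": "csv",
     "filter": lambda rec: rec["region"] == "EMEA"},
]

def deliver(records, subscription):
    """Apply the subscriber's filter and render only the data they asked for."""
    wanted = [r for r in records if subscription["filter"](r)]
    if subscription["format"] == "json":
        return json.dumps(wanted)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(wanted)
    return buf.getvalue()

feed = [
    {"id": "XS123", "asset_class": "bond", "region": "EMEA"},
    {"id": "US456", "asset_class": "equity", "region": "AMER"},
]
for sub in SUBSCRIPTIONS:
    print(sub["consumer"], "->", deliver(feed, sub))
```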

In the second half of this year, PolarLake will release version four of its RDD, which was first released back in 2006. Randles notes that this will probably happen at the start of the third quarter and should include a lot of “interesting new features”. The vendor has been working on the development of the solution with its customers, which include six of the top 10 investment banks, two of the top five prime brokers and two of the top 10 asset managers, says Randles. “We think the product is fairly unique in solving the data supply chain challenge, rather than as a golden copy focused solution,” he adds.

The challenge for PolarLake going forward is around “education” and explaining how the vendor’s approach differs from a generic technology approach to data challenges, notes Randles. “We work in conjunction with a lot of the golden copy providers to fill the gap between generic technology and accessing the end data that they provide,” he says.

The vendor is focusing on gaining traction this year within the top 100 asset managers and the tier one and two sell side firms, adds Randles.
