
New PolarLake Apps Offer Insight Into Downstream Data Quality

Dublin-based integration vendor PolarLake is touting a suite of business intelligence applications it says are complementary to its reference data distribution application, PolarLake RDD (Reference Data Review, March 2007). According to the vendor’s CEO John Randles, these applications are designed “to give the business confidence in the integrity of data to downstream systems”.

PolarLake’s Data Quality Intelligence application reconciles the centralised golden copy against the data as it actually resides in downstream systems, the vendor says, allowing firms to monitor trends in reference data quality over time. The Metering and Metrics Intelligence application provides a set of reports and dashboards detailing what data was distributed to which downstream systems, and in what volume, over a given period. The Statistical Intelligence application, which has embedded OLAP support, provides statistical reports on system performance, rule usage and scenario analysis, as well as the ability to create custom reports using three-dimensional queries.
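
PolarLake has not published how its reconciliation works internally, but the basic idea the article describes – comparing the golden copy field by field against each downstream system’s view and counting the breaks – can be sketched as follows. Everything here (the record shapes, the `reconcile` helper, the sample identifiers) is a hypothetical illustration, not PolarLake’s API.

```python
"""Illustrative sketch: reconciling a golden copy against a downstream system.
All names and data shapes are hypothetical; PolarLake's internals are not public."""
from datetime import date

# Hypothetical golden-copy and downstream extracts, keyed by instrument ID.
golden_copy = {
    "US0378331005": {"issuer": "Apple Inc", "currency": "USD"},
    "GB0002634946": {"issuer": "BAE Systems", "currency": "GBP"},
}
downstream = {
    "US0378331005": {"issuer": "Apple Inc", "currency": "USD"},
    "GB0002634946": {"issuer": "BAE Systems plc", "currency": "GBP"},  # drifted
}

def reconcile(golden: dict, actual: dict) -> list[dict]:
    """Return one break record per missing key or field-level mismatch."""
    breaks = []
    for key, gold_rec in golden.items():
        act_rec = actual.get(key)
        if act_rec is None:
            breaks.append({"id": key, "field": None, "issue": "missing downstream"})
            continue
        for field, gold_val in gold_rec.items():
            if act_rec.get(field) != gold_val:
                breaks.append({
                    "id": key, "field": field,
                    "golden": gold_val, "downstream": act_rec.get(field),
                })
    return breaks

if __name__ == "__main__":
    todays_breaks = reconcile(golden_copy, downstream)
    # Persisting the daily break count per downstream system is what would
    # allow the trend monitoring the vendor describes.
    print(date.today(), "breaks:", len(todays_breaks))
    for b in todays_breaks:
        print(b)
```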

Says Randles: “If you are distributing data to 50 downstream systems, the ability to run a report to check the data quality, and determine whether it is as high quality as expected, gives the business confidence that the system is actually improving data throughout downstream systems. As the recent Aite Group report suggests, firms are spending a lot of money on distribution of data to downstream systems. These applications give insight into whether it has been well-spent.” According to the Aite report, the industry is currently spending $2.5 billion on the integration of reference data into downstream systems, with that figure set to grow to $3.5 billion by 2010.

The ability to track what data goes where is useful for firms in proving data consumption to data vendors, Randles continues, adding: “If you realise you are not actually distributing a given feed, you can stop subscribing to it. We are enabling firms to correlate the consumption with the inputs.”
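
As a rough illustration of the consumption-to-input correlation Randles describes, the sketch below takes a list of hypothetical metering events and flags subscribed feeds that were never distributed downstream. The feed names, the event shape and `distribution_log` are invented for the example; PolarLake’s actual data model is not public.

```python
"""Illustrative sketch: correlating vendor feed subscriptions with actual
downstream distribution, to spot feeds nobody consumes. All names invented."""
from collections import Counter

# Feeds the firm pays for.
subscribed_feeds = {"vendorA/equities", "vendorA/fx", "vendorB/bonds"}

# Hypothetical metering events captured by the distribution layer:
# (feed, downstream_system, records_delivered)
distribution_log = [
    ("vendorA/equities", "risk_engine", 12_400),
    ("vendorA/equities", "settlement", 12_400),
    ("vendorB/bonds", "risk_engine", 3_100),
]

records_per_feed = Counter()
for feed, system, count in distribution_log:
    records_per_feed[feed] += count

unused = subscribed_feeds - set(records_per_feed)
print("Feeds with no downstream consumption:", sorted(unused))
# -> ['vendorA/fx']  # a candidate subscription to cancel
```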

Randles says PolarLake had always planned to create the business intelligence applications to run alongside RDD. “We knew that once we had laid the foundation and the distribution system was in place, we would have a foundation on which to define actionable intelligence,” he says. “The shape of the reports has been determined by our experience in the market, but the reports themselves were always a given.” The point of including OLAP support is “to enable people to define their own reports and queries, in order to measure the metrics of distribution that are important to them”, he adds.
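
To make the three-dimensional query idea concrete, the sketch below uses pandas as a stand-in for an embedded OLAP engine, slicing a small cube of distribution volumes along feed, downstream system and day. The column names and figures are invented; this is not PolarLake’s reporting interface.

```python
"""Illustrative sketch: a three-dimensional query over a
(feed, downstream system, day) cube of distribution volumes.
pandas stands in for an OLAP engine; all data is made up."""
import pandas as pd

events = pd.DataFrame([
    {"feed": "vendorA/equities", "system": "risk_engine", "day": "2007-06-01", "records": 12_400},
    {"feed": "vendorA/equities", "system": "settlement",  "day": "2007-06-01", "records": 12_400},
    {"feed": "vendorB/bonds",    "system": "risk_engine", "day": "2007-06-02", "records": 3_100},
])

# A user-defined "report": total records by feed, broken out by system and day.
cube = events.pivot_table(
    index="feed", columns=["system", "day"],
    values="records", aggfunc="sum", fill_value=0,
)
print(cube)
```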

While the new applications could be deployed in conjunction with other data distribution systems, Randles says, in practice “the kind of data we capture isn’t typically available when you hand-craft your own EAI”. “We have instrumented nodes on our system to capture the relevant data,” he continues.
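
The article does not detail what an “instrumented node” looks like, but one common pattern is wrapping each distribution step so that every delivery emits a metering event as a side effect. The decorator below is a hypothetical Python sketch of that pattern, not PolarLake’s implementation; the events it produces are the kind of raw material the metering and consumption analyses above would consume.

```python
"""Illustrative sketch of node instrumentation: a decorator that emits a
metering event each time a distribution step runs. Purely hypothetical."""
import functools
import time

metering_events = []  # in practice this would go to a metrics store

def instrumented(node_name: str):
    """Wrap a distribution step so every call is metered."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(records, destination):
            start = time.time()
            result = func(records, destination)
            metering_events.append({
                "node": node_name,
                "destination": destination,
                "records": len(records),
                "elapsed_s": round(time.time() - start, 4),
            })
            return result
        return wrapper
    return decorator

@instrumented("equities_distributor")
def deliver(records, destination):
    # Stand-in for the real delivery transport (queue, file drop, etc.).
    return f"sent {len(records)} records to {destination}"

deliver([{"id": 1}, {"id": 2}], "risk_engine")
print(metering_events)
```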

PolarLake’s clients on the reference data side include JPMorgan, Société Générale, Pioneer Investments and Bank of Ireland. Some of the business intelligence applications have already been deployed at customer sites, Randles says. “The reconciliation and data quality analytics have been extensively used and it is interesting how valuable they have become to some of our customers, not just in their day-to-day operations but in their testing environments – as testing tools for when they put in a new golden copy database, for example.”
