About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

New PolarLake Apps Offer Insight Into Downstream Data Quality


Dublin-based integration vendor PolarLake is touting a suite of business intelligence applications it says are complementary to its reference data distribution application, PolarLake RDD (Reference Data Review, March 2007). According to the vendor’s CEO John Randles, these applications are designed “to give the business confidence in the integrity of data to downstream systems”.

PolarLake’s Data Quality Intelligence application enables reconciliation of the centralised golden copy with the actual state of data in downstream systems, the vendor says, allowing the monitoring of trends in reference data quality over time. The Metering and Metrics Intelligence application provides a set of reports and dashboards detailing what data was distributed to which downstream systems, and in what volume, over a given period. The Statistical Intelligence application, which has embedded OLAP support, provides statistical reports on system performance, rule usage and scenario analysis, and the ability to create custom reports using three-dimensional queries.
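The article does not describe how PolarLake implements this reconciliation, but the general idea of checking a golden copy against a downstream system's copy can be sketched in a few lines. Everything below is a hypothetical illustration: the record layout, field names and `reconcile` function are assumptions, not PolarLake's actual product or API.

```python
# Hypothetical sketch of golden-copy reconciliation: compare the central
# record set against a downstream system's copy and summarise the breaks.
# Record layout and function names are illustrative assumptions only.

def reconcile(golden: dict, downstream: dict) -> dict:
    """Compare two {instrument_id: record} maps and summarise breaks."""
    missing = sorted(set(golden) - set(downstream))        # never arrived downstream
    unexpected = sorted(set(downstream) - set(golden))     # not in the golden copy
    mismatched = {}
    for key in set(golden) & set(downstream):
        diffs = {
            field: (golden[key][field], downstream[key].get(field))
            for field in golden[key]
            if golden[key][field] != downstream[key].get(field)
        }
        if diffs:
            mismatched[key] = diffs
    total = len(golden) or 1
    return {
        "missing": missing,
        "unexpected": unexpected,
        "mismatched": mismatched,
        "match_rate": 1 - (len(missing) + len(mismatched)) / total,
    }

golden = {
    "XS123": {"issuer": "ACME", "coupon": 4.5},
    "XS456": {"issuer": "GLOBEX", "coupon": 3.0},
}
downstream = {
    "XS123": {"issuer": "ACME", "coupon": 4.25},  # stale coupon downstream
}
report = reconcile(golden, downstream)
```

Run periodically against each of the downstream systems, a report like this is what would let a firm track the quality trend over time that the vendor describes.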

Says Randles: “If you are distributing data to 50 downstream systems, the ability to run a report to check the data quality, and determine whether it is as high quality as expected, gives the business confidence that the system is actually improving data throughout downstream systems. As the recent Aite Group report suggests, firms are spending a lot of money on distribution of data to downstream systems. These applications give insight into whether it has been well-spent.” According to the Aite report, the industry is currently spending $2.5 billion on the integration of reference data into downstream systems, with that figure set to grow to $3.5 billion by 2010.

The ability to track what data goes where is useful for firms in proving data consumption to data vendors, Randles continues, adding: “If you realise you are not actually distributing a given feed, you can stop subscribing to it. We are enabling firms to correlate the consumption with the inputs.”
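The metering idea Randles describes, correlating consumption with inputs to spot feeds nobody uses, can also be sketched simply. Again, the event-log shape and names below are assumptions made for illustration, not a description of PolarLake's system.

```python
# Hypothetical sketch: tally which vendor feeds were actually distributed
# to downstream systems, to spot subscribed feeds with zero consumption.
# The (feed, system, record_count) event format is an assumption.

from collections import defaultdict

def feed_usage(events, subscribed_feeds):
    """events: iterable of (feed, downstream_system, record_count) tuples."""
    volume = defaultdict(int)
    for feed, system, count in events:
        volume[feed] += count
    # Feeds the firm pays for but never actually distributes downstream
    unused = sorted(f for f in subscribed_feeds if volume[f] == 0)
    return dict(volume), unused

events = [
    ("vendor_a_equities", "risk", 1200),
    ("vendor_a_equities", "settlement", 1200),
    ("vendor_b_bonds", "risk", 300),
]
subscribed = ["vendor_a_equities", "vendor_b_bonds", "vendor_c_fx"]
volume, unused = feed_usage(events, subscribed)
# Unused feeds are candidates for cancelling the subscription
```

This is the correlation Randles points at: if a subscribed feed shows up with zero distributed volume, the firm has evidence it can stop paying for it.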

Randles says PolarLake had always planned to create the business intelligence applications to run alongside RDD. “We knew that once we had laid the foundation and the distribution system was in place, we would have a foundation on which to define actionable intelligence,” he says. “The shape of the reports has been determined by our experience in the market, but the reports themselves were always a given.” The point of including OLAP support is “to enable people to define their own reports and queries, in order to measure the metrics of distribution that are important to them”, he adds.

While the new applications could be deployed in conjunction with other data distribution systems, Randles says, in practice “the kind of data we capture isn’t typically available when you hand-craft your own EAI”. “We have instrumented nodes on our system to capture the relevant data,” he continues.

Clients with which PolarLake has done work on the reference data side include JPMorgan, Société Générale, Pioneer Investments and Bank of Ireland. Some of the business intelligence applications have already been deployed at customer sites, Randles says. “The reconciliation and data quality analytics have been extensively used and it is interesting how valuable they have become to some of our customers, not just in their day-to-day operations but in their testing environments – as testing tools for when they put in a new golden copy database, for example.”
