PolarLake Proposes SOA and Semantics for Big Data Management

PolarLake has responded to market – and regulator – demand for a real-time consolidated view of trade, position, reference and client data with a virtualised data warehouse solution based on service-oriented architecture (SOA) and semantic technologies. The consolidated view is designed to inform operational efficiency, risk management and compliance.

The virtualised data warehouse speaks not only to risk and compliance issues, but also to the Big Data debate, as financial firms look beyond relational databases for the best way to access, manage and view vast quantities of data.

“We are talking to investment banks and large buy-side firms about the virtualised data warehouse,” says John Randles, CEO of PolarLake. “We don’t have to explain the problem, they know it. The pressure from business, risk managers and regulators is to get a better handle on data and understand how it links together. Organisations need to be confident about the reliability of data they are using in operations and reporting.”

The software came to market early this week, but it has been running in a pilot project at a large investment bank in North America since the last quarter of 2011 and is expected to go live at the bank later this year. The pilot consolidated 10 silos of trade, position, reference and client data in five weeks.

Randles explains: “The bank was facing a situation where it had multiple systems and wanted a better consolidated view for operational purposes, risk management and compliance. It could have looked at a traditional data warehouse solution – what we call ‘yet another data warehouse strategy’ – but that would have meant a long development programme to build a new system. An alternative strategy was to use PolarLake technology that leaves data where it is and queries it, or depending on user requirements, loads data into an element of the product called PolarLake Data Store, where it can be queried. Real-time queries can be run against both data in silos and data in the data store.”
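
To make the contrast concrete, the sketch below shows the general pattern being described: a single query fanned out across source silos left in place, optionally alongside a local store, with the results merged into one view. It is an illustrative Python sketch of federated querying in general; the class and field names are invented here and are not PolarLake's API.

```python
# Hypothetical sketch of a federated ("virtualised") query layer, not PolarLake's API.
# Each silo exposes the same query interface; a coordinator fans a query out to the
# live source systems and to an optional local store, then merges the results.

from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Record:
    source: str      # which silo or store the record came from
    entity_id: str   # e.g. a trade, position or client identifier
    payload: dict    # the raw attributes held by that system


class Silo:
    """A source system queried in place, without copying its data."""

    def __init__(self, name: str, rows: list[dict]):
        self.name = name
        self.rows = rows

    def query(self, predicate: Callable[[dict], bool]) -> Iterable[Record]:
        for row in self.rows:
            if predicate(row):
                yield Record(self.name, row["id"], row)


class VirtualWarehouse:
    """Runs one query across many silos plus an optional local data store."""

    def __init__(self, silos: list[Silo], data_store: Silo | None = None):
        self.silos = silos
        self.data_store = data_store

    def query(self, predicate: Callable[[dict], bool]) -> list[Record]:
        targets = self.silos + ([self.data_store] if self.data_store else [])
        results: list[Record] = []
        for target in targets:
            results.extend(target.query(predicate))
        return results


if __name__ == "__main__":
    trades = Silo("trades", [{"id": "T1", "counterparty": "ACME", "notional": 5_000_000}])
    positions = Silo("positions", [{"id": "P9", "counterparty": "ACME", "quantity": 1200}])
    vw = VirtualWarehouse([trades, positions])
    for rec in vw.query(lambda r: r.get("counterparty") == "ACME"):
        print(rec.source, rec.entity_id, rec.payload)
```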

The virtualised data warehouse has four components: a data search application that allows users to query data across all repositories; a semantic policy and rules engine that supports the creation of business rules to build consolidated records, as well as the creation of virtual identifiers across all repositories; a data store for source data and index data used in virtual queries; and a connectivity subsystem that allows communications across multiple protocols and formats in batch, real time, request reply and ad hoc distributions.
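
As a rough illustration of how the second component might work, the following Python sketch applies simple precedence rules to assemble a consolidated record from several repositories and mints a virtual identifier spanning them. The rule format, attribute names and hashing scheme are assumptions made for illustration, not a description of PolarLake's semantic policy engine.

```python
# Illustrative sketch only: one way a rules engine could build a consolidated
# record and a "virtual identifier" spanning repositories.

import hashlib

# Precedence rules: for each attribute, which repository wins when several hold a value.
PRECEDENCE = {
    "legal_name": ["client_master", "trading_system"],
    "lei":        ["client_master"],
    "rating":     ["risk_system", "client_master"],
}


def virtual_id(*source_keys: str) -> str:
    """Mint a stable identifier from the keys the entity carries in each silo."""
    return hashlib.sha1("|".join(sorted(source_keys)).encode()).hexdigest()[:12]


def consolidate(records_by_repo: dict[str, dict]) -> dict:
    """Apply precedence rules to assemble one consolidated record from many silos."""
    golden = {"virtual_id": virtual_id(*(r["id"] for r in records_by_repo.values()))}
    for attribute, repo_order in PRECEDENCE.items():
        for repo in repo_order:
            value = records_by_repo.get(repo, {}).get(attribute)
            if value is not None:
                golden[attribute] = value
                break
    return golden


if __name__ == "__main__":
    print(consolidate({
        "client_master":  {"id": "CM-77", "legal_name": "Acme Capital LLC", "lei": "LEI0000TEST"},
        "trading_system": {"id": "TS-123", "legal_name": "ACME CAP"},
        "risk_system":    {"id": "RK-9", "rating": "A-"},
    }))
```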

The decision to query data in source repositories, build a temporary data warehouse style store in the data store, or combine these options, depends on operational considerations. For example, if the requirement is to run a large query across many systems, it may be best to load the necessary data into the data store and take it offline to run the query.
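
A minimal sketch of that routing decision, under assumed thresholds, might look like the following; the limits and strategy names are invented for illustration rather than taken from the product.

```python
# Assumed trade-off, not product behaviour: query in place when a request touches
# few systems and modest volumes; stage data into a local store when it spans many
# systems or would tie source systems up for too long.

def plan_query(systems_touched: int, est_rows: int,
               max_live_systems: int = 3, max_live_rows: int = 100_000) -> str:
    """Return a coarse execution strategy for a consolidated query."""
    if systems_touched <= max_live_systems and est_rows <= max_live_rows:
        return "query-in-place"        # hit the source silos directly, in real time
    if systems_touched <= max_live_systems:
        return "hybrid"                # query small silos live, stage the large data set
    return "stage-into-data-store"     # load the needed data, run the query offline


if __name__ == "__main__":
    print(plan_query(systems_touched=2, est_rows=10_000))       # query-in-place
    print(plan_query(systems_touched=2, est_rows=2_000_000))    # hybrid
    print(plan_query(systems_touched=10, est_rows=5_000_000))   # stage-into-data-store
```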

The technologies supporting the virtual nature and performance of the data warehouse are an SOA layer and semantics. Randles explains: “We are at the point where the old approach of massive multi-year data warehousing projects is no longer tenable. The PolarLake approach of a search-based view of data with an integrated semantic policy engine has proved to deliver business requirements in weeks rather than years.”

The software's search functionality is based on data tags and on linkages between data created using semantics. Data integration is based on XML pipeline technology that PolarLake patented in 2004; it treats all data, whatever its type, format or source, as XML without converting it into XML. PolarLake says that when working with low-latency streaming data, these technologies allow its solution to outperform relational data models by a factor of 11.
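
One way to picture "treating data as XML without converting it" is a thin view that answers XPath-style path queries over whatever structure a source already has, as in the Python sketch below. This is a general illustration built on my own assumptions, not PolarLake's patented pipeline, and implies nothing about the quoted performance figures.

```python
# Rough illustration of exposing heterogeneous records through an XML-like view
# without materialising XML documents: simple /a/b/c paths over native structures.

import csv
import io
import json


class XmlLikeView:
    """Answer /a/b/c style paths over a nested dict without building XML."""

    def __init__(self, data: dict):
        self.data = data

    def select(self, path: str):
        node = self.data
        for step in path.strip("/").split("/"):
            if not isinstance(node, dict) or step not in node:
                return None
            node = node[step]
        return node


# Two very different sources: a JSON message and a CSV row ...
json_trade = json.loads('{"trade": {"id": "T42", "counterparty": {"name": "ACME"}}}')
csv_row = next(csv.DictReader(io.StringIO("id,counterparty\nT43,ACME")))

# ... both answer the same kind of path query through the uniform view.
print(XmlLikeView(json_trade).select("/trade/counterparty/name"))           # ACME
print(XmlLikeView({"trade": dict(csv_row)}).select("/trade/counterparty"))  # ACME
```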

“We are all about innovation, our DNA is in integrating all types of data. As our data management platform has evolved, we have moved beyond integration, to link, manage, distribute and search financial and reference data with speed and control,” says Randles. “Other companies have tried to build data management solutions with SOA and messaging technologies, but this is not enough. The need is to understand the data and provide intelligence for searching. We are trying to give people the best of both worlds, SOA and semantics for meaningful searches.”
