
PolarLake Proposes SOA and Semantics for Big Data Management

PolarLake has responded to market and regulator demand for a real-time consolidated view of trade, position, reference and client data with a virtualised data warehouse solution based on service-oriented architecture (SOA) and semantic technologies. The consolidated view is designed to inform operational efficiency, risk management and compliance.

The virtualised data warehouse plays into not only risk and compliance issues, but also the Big Data debate as financial firms begin to look beyond relational databases at how best to access, manage and view vast quantities of data.

“We are talking to investment banks and large buy side firms about the virtualised data warehouse,” says John Randles, CEO of PolarLake. “We don’t have to explain the problem, they know it. The pressure from business, risk managers and regulators is to get a better handle on data and understand how it links together. Organisations need to be confident about the reliability of data they are using in operations and reporting.”

The software came to market early this week, but it has been running in a pilot project at a large investment bank in North America since the last quarter of 2011 and is expected to go live at the bank later this year. The pilot consolidated 10 silos of trade, position, reference and client data in five weeks.

Randles explains: “The bank was facing a situation where it had multiple systems and wanted a better consolidated view for operational purposes, risk management and compliance. It could have looked at a traditional data warehouse solution – what we call ‘yet another data warehouse strategy’ – but that would have meant a long development programme to build a new system. An alternative strategy was to use PolarLake technology that leaves data where it is and queries it, or depending on user requirements, loads data into an element of the product called PolarLake Data Store, where it can be queried. Real-time queries can be run against both data in silos and data in the data store.”

The virtualised data warehouse has four components: a data search application that allows users to query data across all repositories; a semantic policy and rules engine that supports the creation of business rules to build consolidated records, as well as the creation of virtual identifiers across all repositories; a data store for source data and index data used in virtual queries; and a connectivity subsystem that allows communications across multiple protocols and formats in batch, real time, request reply and ad hoc distributions.
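PolarLake has not published the syntax of its semantic policy and rules engine, but the linkage idea can be sketched in a few lines. In the toy Python example below, every name (virtual_id, consolidate, the silo records themselves) is invented for illustration; it simply shows how records from separate silos that share a linking field might be collapsed into one consolidated record under a deterministic virtual identifier:

```python
# Illustrative only: a toy linkage rule that consolidates records from
# separate silos under a minted virtual identifier. Not PolarLake's API.
import hashlib

# Records as they might arrive from three silos, each with its own local keys.
trades = [{"trade_id": "T-9", "isin": "US0378331005", "qty": 500}]
positions = [{"pos_id": "P-3", "isin": "US0378331005", "book": "EQ-NY"}]
clients = [{"client_id": "C-1", "isin": "US0378331005", "name": "Acme Fund"}]

def virtual_id(*linking_values):
    """Mint a deterministic virtual identifier from the linking values."""
    digest = hashlib.sha256("|".join(linking_values).encode()).hexdigest()
    return f"VID-{digest[:12]}"

def consolidate(silos, link_key):
    """Apply a simple linkage rule: records sharing `link_key` collapse
    into one consolidated record tagged with a virtual identifier."""
    merged = {}
    for silo in silos:
        for record in silo:
            key = record[link_key]
            merged.setdefault(key, {"virtual_id": virtual_id(link_key, key)})
            merged[key].update(record)
    return list(merged.values())

for rec in consolidate([trades, positions, clients], link_key="isin"):
    print(rec)
```

Making the identifier a deterministic hash of the linking values, rather than a stored surrogate key, is one way such an identifier could remain "virtual": any repository can recompute it on demand without a central registry.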

The decision to query data in source repositories, build a temporary data warehouse style store in the data store, or combine these options, depends on operational considerations. For example, if the requirement is to run a large query across many systems, it may be best to load the necessary data into the data store and take it offline to run the query.
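How that choice might be automated can be illustrated with a simple routing rule. The sketch below is an assumption-laden illustration rather than PolarLake's design: the Query fields, function names and threshold values are all hypothetical, standing in for whatever estimates a real query planner would supply:

```python
# Hypothetical sketch of the routing choice described above: small, targeted
# queries are federated out to the source silos; large cross-system queries
# are first staged into a local data store and run offline. The class names
# and cut-off values are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Query:
    systems: list[str]    # source repositories the query touches
    expected_rows: int    # rough size estimate from a query planner

STAGE_THRESHOLD_SYSTEMS = 5        # assumed cut-offs, not vendor figures
STAGE_THRESHOLD_ROWS = 1_000_000

def route(query: Query) -> str:
    """Decide whether to query sources in place or stage into the store."""
    if (len(query.systems) > STAGE_THRESHOLD_SYSTEMS
            or query.expected_rows > STAGE_THRESHOLD_ROWS):
        return "stage"     # load into the data store, query offline
    return "federate"      # run live against the source repositories

print(route(Query(systems=["trades", "positions"], expected_rows=10_000)))
# -> federate
print(route(Query(systems=["s1", "s2", "s3", "s4", "s5", "s6"],
                  expected_rows=10)))
# -> stage
```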

Technologies supporting the virtual nature and performance of the data warehouse are an SOA layer and semantics. Randles explains: “We are at the point where the old approach of massive multi-year data warehousing projects is no longer tenable. The PolarLake approach of a search-based view of data with an integrated semantic policy engine has proved to deliver business requirements in weeks rather than years.”

The search functionality of the software is based on data tags and semantic linkages between data. The data integration is based on XML pipeline technology that PolarLake patented in 2004, which treats all data, whatever its type, format or source, as XML without converting it into XML. For low-latency streaming data, PolarLake says these technologies allow its solution to outperform relational data models by a factor of 11.
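The mechanics of the patented pipeline are not public, but the underlying idea, presenting heterogeneous records through a uniform element-style interface without materialising XML, can be sketched as follows. The adapter classes here (NodeView, JsonView, CsvRowView) are hypothetical illustrations of the concept, not PolarLake's implementation:

```python
# A minimal sketch of "treat everything as XML without converting it":
# each source keeps its native form, and a thin adapter answers tag
# lookups on demand, so a query layer can address CSV rows and JSON
# objects the same way it would address XML elements.
import json

class NodeView:
    """Present any record as if it were an XML element tree, lazily."""
    def find(self, tag: str):
        raise NotImplementedError

class JsonView(NodeView):
    def __init__(self, obj):
        self.obj = obj
    def find(self, tag):
        return self.obj.get(tag)

class CsvRowView(NodeView):
    def __init__(self, header, row):
        self.fields = dict(zip(header, row))
    def find(self, tag):
        return self.fields.get(tag)

# Two sources in different native formats...
json_rec = JsonView(json.loads('{"isin": "US0378331005", "qty": 500}'))
csv_rec = CsvRowView(["isin", "book"], ["US0378331005", "EQ-NY"])

# ...queried uniformly, as though both carried an <isin> element.
for view in (json_rec, csv_rec):
    print(view.find("isin"))
```

Because the adapters resolve lookups lazily, nothing is copied or transformed until a query actually asks for a field, which is one plausible reason a virtual approach could outpace loading everything into a relational model first.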

“We are all about innovation, our DNA is in integrating all types of data. As our data management platform has evolved, we have moved beyond integration, to link, manage, distribute and search financial and reference data with speed and control,” says Randles. “Other companies have tried to build data management solutions with SOA and messaging technologies, but this is not enough. The need is to understand the data and provide intelligence for searching. We are trying to give people the best of both worlds, SOA and semantics for meaningful searches.”
