The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Integration and Emerging Technologies Offer Beneficial Opportunities


Data integration and the adoption of technologies including data location-aware computation, in-database analytics and event stream processing will improve analytics, reduce decision-making time and, in turn, help cut the costs associated with duplicated data.

Addressing the issue of integrating data for high-compute enterprise analytics, a panel at A-Team Group’s Data Management for Risk, Analytics and Valuations Conference, held in London today, discussed the move from traditional silos of data to an integrated approach embracing new technologies that could benefit not only front office decision making, but also middle office functions such as risk management.

Under the auspices of panel chair Andrew Delaney, president and editor-in-chief at A-Team Group, panel members discussed elements of change to improve the data environment. All agreed that development would be a continuum, but as Colin Rickard, business development director, Europe, DataFlux, warned: “First, we must go down the route of understanding data better; if we don’t, we will be in a bigger mess quicker. As data has moved from batch to near real time, there are issues of data governance and understanding that need to be challenged before moving on to analytics.”

Acknowledging that silos of data exist in many firms and that it takes too much time to move all the data to a data warehouse using ETL tools, the panellists favoured data management solutions that allow applications to access siloed data as required.

Stuart Grant, EMEA business development manager in the financial services group at Sybase, explained: “It is not necessarily important to move data from silos, but it is necessary to be able to do analytics on data where it is, so the breakdown of the data needs to be more virtual than physical. The aim is to be able to query any data, for any information, at any time.”
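Grant’s “query any data, where it is” idea is essentially data virtualisation. As a hedged sketch of the principle, not of any vendor’s product, the snippet below uses Python’s sqlite3 module with two in-memory databases standing in for departmental silos; a third session ATTACHes both and joins across them on demand, so no data is copied into a central warehouse. All silo, table and symbol names are invented for illustration.

```python
import sqlite3

# Two in-memory databases stand in for independent departmental silos.
# Shared-cache URIs keep them addressable by name while their owning
# connections stay open; every name here is illustrative.
ref = sqlite3.connect("file:ref_silo?mode=memory&cache=shared", uri=True)
pos = sqlite3.connect("file:pos_silo?mode=memory&cache=shared", uri=True)
ref.execute("CREATE TABLE instruments (symbol TEXT PRIMARY KEY, sector TEXT)")
ref.executemany("INSERT INTO instruments VALUES (?, ?)",
                [("VOD", "Telecoms"), ("BP", "Energy")])
pos.execute("CREATE TABLE positions (symbol TEXT, qty INTEGER)")
pos.executemany("INSERT INTO positions VALUES (?, ?)",
                [("VOD", 1000), ("BP", 250), ("VOD", 500)])
ref.commit()
pos.commit()

# A "virtual" layer: one session ATTACHes both silos and joins across
# them on demand -- no ETL copy into a central warehouse.
hub = sqlite3.connect("file::memory:", uri=True)
hub.execute("ATTACH 'file:ref_silo?mode=memory&cache=shared' AS ref")
hub.execute("ATTACH 'file:pos_silo?mode=memory&cache=shared' AS pos")
rows = hub.execute(
    "SELECT i.sector, SUM(p.qty) FROM pos.positions p "
    "JOIN ref.instruments i ON i.symbol = p.symbol "
    "GROUP BY i.sector ORDER BY i.sector").fetchall()
print(rows)  # [('Energy', 250), ('Telecoms', 1500)]
```

Production data virtualisation layers handle query federation, pushdown and security; the sketch only shows the shape of the idea: data stays put and the query travels.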

Looking at data types, particularly unstructured data, and again considering the option of not moving data, Amir Halfon, senior director of technology, capital markets, in Oracle’s global financial services business, said: “What is new is bringing unstructured data into the structure of analytics, not by moving the data back and forth, but by pointing to the unstructured data from the structure and, for example, undertaking sentiment analysis. We have the technology and there are big opportunities to use it to solve data challenges, but we need support from the organisation and we need the people resources to make it work.”
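Halfon’s point about pointing to unstructured data from a structured schema can be sketched crudely: the structured side stores only references, and a toy lexicon-based sentiment score dereferences them at analysis time. The paths, lexicon and scores below are all invented for illustration; real sentiment analysis would use far more sophisticated models.

```python
# The "documents" mapping stands in for files or objects left where they
# live (a filesystem, object store, etc.); the text is never copied into
# the structured store.
documents = {
    "news/001.txt": "Strong profit growth beats estimates",
    "news/002.txt": "Regulator warns of losses and weak controls",
}

# The structured side holds only references: (symbol, document URI).
structured = [("VOD", "news/001.txt"), ("BP", "news/002.txt")]

POSITIVE = {"strong", "growth", "beats", "profit"}
NEGATIVE = {"warns", "losses", "weak"}

def sentiment(uri: str) -> int:
    # Dereference the pointer at analysis time and score the text in place.
    words = documents[uri].lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

scores = {sym: sentiment(uri) for sym, uri in structured}
print(scores)  # {'VOD': 4, 'BP': -3}
```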

Grant agreed that there are opportunities to use existing technologies for new data types, but pointed out that the biggest missing data set in most firms is metadata. Taking a step back from the cutting edge, Rickard commented: “There is room for future discussion on unstructured data, but at the moment there is still headroom to sort out structured data. I have yet to find firms that have a grip on issues like metadata.”

Picking up this point, Halfon argued: “This is not about either one thing or another, but about a continuum from dealing with structured data to unstructured data and so on.” The continuum will take in new technologies and some are already being deployed, he said, highlighting data location-aware computation. This is relatively easy to deploy and requires few application changes, while increasing analytics performance and reducing calculation times.
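The principle behind data location-aware computation, moving the calculation to where the data sits and shipping back only small results, can be sketched as follows. The “nodes” here are simulated in-process with a thread pool, and all names and figures are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

# Each partition represents data held locally on a node; in a real
# deployment the summary function would execute on that node.
partitions = {
    "node-a": [100.0, 250.0, 75.0],
    "node-b": [900.0, 50.0],
    "node-c": [300.0, 300.0, 400.0],
}

def local_summary(values):
    # Runs "next to" the data: ships back a tiny (count, sum) pair
    # instead of the raw rows.
    return len(values), sum(values)

with ThreadPoolExecutor() as pool:
    summaries = list(pool.map(local_summary, partitions.values()))

# Only the per-node summaries travel; the coordinator combines them.
count = sum(n for n, _ in summaries)
total = sum(s for _, s in summaries)
print(count, total / count)  # prints: 8 296.875
```

The pay-off Halfon described follows from the pattern: network transfer shrinks from all rows to a few numbers, and each node computes in parallel over its own slice.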

Halfon also described in-database analytics, which puts analytics in the database engine to take advantage of its multi-tasking capabilities and allows analysis of all the data in the database, rather than of a sample extracted from the database for analysis on a desktop computer.
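As a rough illustration of the in-database idea, using SQLite rather than any engine discussed on the panel, the sketch below registers a custom aggregate with the database engine: the computation runs inside the engine over every row, and only the finished per-group result crosses back out. The pnl_range aggregate and its table are invented for the example.

```python
import sqlite3

# A custom aggregate the engine will run itself: it tracks the min and
# max it has seen and finally returns their spread.
class PnLRange:
    def __init__(self):
        self.lo = self.hi = None

    def step(self, value):  # called by the engine once per row
        self.lo = value if self.lo is None else min(self.lo, value)
        self.hi = value if self.hi is None else max(self.hi, value)

    def finalize(self):  # called once per group; this is all that leaves
        return self.hi - self.lo

conn = sqlite3.connect(":memory:")
conn.create_aggregate("pnl_range", 1, PnLRange)
conn.execute("CREATE TABLE pnl (desk TEXT, value REAL)")
conn.executemany("INSERT INTO pnl VALUES (?, ?)",
                 [("rates", 1.5), ("rates", -0.5), ("fx", 2.0), ("fx", 4.0)])

# The aggregation happens in the engine; no row-by-row extract to a desktop.
rows = conn.execute(
    "SELECT desk, pnl_range(value) FROM pnl GROUP BY desk ORDER BY desk"
).fetchall()
print(rows)  # [('fx', 2.0), ('rates', 2.0)]
```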

Grant said embedding in-memory analytics in a database could deliver execution speeds, perhaps around the pricing function, 100 times faster than traditional systems. He also described event streaming, which streams data through a model rather than stopping the data to analyse it, delivering extremely fast results.
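Event streaming of the kind Grant described can be caricatured in a few lines: data flows through a model and a result emerges per event, instead of the data being landed first and queried later. The exponentially weighted price below is a toy model chosen for the sketch, not anything from the panel.

```python
# Ticks flow *through* the model; each event yields a fresh result
# immediately, rather than being stored and analysed afterwards.
def ewma_stream(ticks, alpha=0.5):
    level = None
    for price in ticks:
        level = price if level is None else alpha * price + (1 - alpha) * level
        yield level  # result available the moment the event arrives

ticks = [100.0, 102.0, 101.0, 104.0]
print([round(x, 3) for x in ewma_stream(ticks)])
# [100.0, 101.0, 101.0, 102.5]
```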

Complex event processing using streaming data was also on the agenda as a means of delivering decisions in microseconds or milliseconds, either in the front office or, for the first time, in the middle office for applications such as risk management.
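A minimal sketch of the complex event processing pattern, with invented symbols and thresholds: raw (symbol, price) events are correlated over a sliding window, and only a derived composite event, here a price spike, is emitted downstream.

```python
from collections import deque

# Watch a stream of (symbol, price) events and emit a composite "spike"
# event when a price moves more than 2% relative to the oldest price in
# a sliding window of the last 3 observations for that symbol.
def detect_spikes(events, window=3, threshold=0.02):
    history = {}
    for symbol, price in events:
        win = history.setdefault(symbol, deque(maxlen=window))
        if win and abs(price - win[0]) / win[0] > threshold:
            yield (symbol, win[0], price)  # derived event, not raw data
        win.append(price)

events = [("VOD", 100.0), ("BP", 50.0), ("VOD", 100.5),
          ("VOD", 103.0), ("BP", 50.2)]
print(list(detect_spikes(events)))  # [('VOD', 100.0, 103.0)]
```

A risk application would subscribe only to the derived spike events, which is what makes micro- to millisecond decisions feasible: the raw firehose never has to be stored or re-queried.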

“Ultimately this is about getting technologies to work together in an optimal architecture, getting data to flow smoothly and ensuring only a single version of data that is pointed to rather than moved,” concluded Halfon.

