
Data Integration and Emerging Technologies Offer Beneficial Opportunities


Data integration and the adoption of technologies including data location-aware computation, in-database analytics and event stream processing will improve analytics, reduce decision-making time and, in turn, help cut the costs associated with duplicated data.

Addressing the issue of integrating data for high-compute enterprise analytics, a panel at A-Team Group’s Data Management for Risk, Analytics and Valuations Conference, held in London today, discussed the move from traditional data silos to an integrated approach, including new technologies that could deliver benefits not only to front office decision making, but also to middle office functions such as risk management.

Under the auspices of panel chair Andrew Delaney, president and editor-in-chief at A-Team Group, panel members discussed elements of change to improve the data environment. All agreed that development would be a continuum, but as Colin Rickard, business development director, Europe, DataFlux, warned: “First, we must go down the route of understanding data better; if we don’t, we will be in a bigger mess quicker. As data has moved from batch to near real time, there are issues of data governance and understanding that need to be challenged before moving on to analytics.”

Acknowledging that silos of data exist in many firms and that it takes too much time to move all the data to a data warehouse using ETL tools, the panellists favoured data management solutions that allow applications to access siloed data as required.

Stuart Grant, EMEA business development manager in the financial services group at Sybase, explained: “It is not necessarily important to move data from silos, but it is necessary to be able to do analytics on data where it is, so the breakdown of the data needs to be more virtual than physical. The aim is to be able to query any data, for any information, at any time.”
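
To illustrate the general idea, and not any panellist’s implementation, the sketch below represents two hypothetical silos as separate in-memory SQLite databases and runs the same query against each silo in place, merging only the results rather than copying the data into a central warehouse first. The table names and figures are invented for illustration.

```python
# Minimal sketch of querying data "where it is": two hypothetical silos are
# represented as separate in-memory SQLite databases, and a federated query
# helper runs the same analytic against each silo and merges the results,
# rather than copying everything into a warehouse first.
import sqlite3

def make_silo(rows):
    """Create an illustrative silo holding (instrument, price) rows."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE positions (instrument TEXT, price REAL)")
    conn.executemany("INSERT INTO positions VALUES (?, ?)", rows)
    return conn

front_office = make_silo([("ABC", 101.5), ("XYZ", 55.2)])
middle_office = make_silo([("ABC", 101.7), ("DEF", 12.9)])

def federated_query(sql, silos):
    """Run the same query against each silo in place and merge the rows."""
    merged = []
    for silo in silos:
        merged.extend(silo.execute(sql).fetchall())
    return merged

# Query any data, for any information, at any time, without moving it first.
print(federated_query("SELECT instrument, price FROM positions",
                      [front_office, middle_office]))
```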

Looking at data types, particularly unstructured data, and again considering the option of not moving data, Amir Halfon, senior director of technology, capital markets, in Oracle’s global financial services business, said: “What is new is bringing unstructured data into the structure of analytics, not by moving the data back and forth, but by pointing to the unstructured data from the structure and, for example, undertaking sentiment analysis. We have the technology and there are big opportunities to use it to solve data challenges, but we need support from the organisation and we need the people resources to make it work.”
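
A minimal sketch of the pattern Halfon describes, under purely illustrative assumptions: the structured record carries only a reference to a news document, and a deliberately crude word-count sentiment score is computed by following that pointer, not by copying the text into the structured store. The file path, lexicon and record below are hypothetical.

```python
# Minimal sketch of "pointing to" unstructured data from a structured record:
# the structured row holds only a reference (here, a hypothetical file path),
# and a naive lexicon-based sentiment score is computed by following the
# pointer rather than loading the document into the structured store.
POSITIVE = {"strong", "beat", "upgrade", "growth"}
NEGATIVE = {"weak", "miss", "downgrade", "loss"}

structured_record = {
    "instrument": "ABC",
    "news_ref": "news/abc_earnings.txt",  # pointer to unstructured content
}

def sentiment_score(text):
    """Very crude word-count sentiment: positive hits minus negative hits."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def analyse(record, fetch):
    """Follow the record's pointer with `fetch` and score the text in place."""
    return {**record, "sentiment": sentiment_score(fetch(record["news_ref"]))}

# Stand-in fetch function; in practice this would read from wherever the
# unstructured data already lives.
fake_store = {"news/abc_earnings.txt": "ABC reports strong growth and beats estimates"}
print(analyse(structured_record, fake_store.get))
```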

Grant agreed that there are opportunities to use existing technologies for new data types, but pointed out that the biggest missing data set in most firms is metadata. Taking a step back from the cutting edge, Rickard commented: “There is room for future discussion on unstructured data, but at the moment there is still headroom to sort out structured data. I have yet to find firms that have a grip on issues like metadata.”

Picking up this point, Halfon argued: “This is not about either one thing or another, but about a continuum from dealing with structured data to unstructured data and so on.” The continuum will take in new technologies and some are being deployed, he said, highlighting data location-aware computation. This is relatively easy to deploy and requires few application changes, while delivering increases in analytics performance and a reduction in calculation times.
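
As a rough illustration of the principle, rather than of any vendor’s product, the sketch below ships a small analytic to each hypothetical data partition and moves only compact partial results back to be combined, instead of moving the raw data to the calculation.

```python
# Minimal sketch of the idea behind data location-aware computation: the
# analytic runs next to each partition where the data already sits, and only
# small partial results travel back to be combined. The node layout and
# prices are hypothetical.
partitions = {
    "node_a": [100.1, 100.4, 99.8],
    "node_b": [55.0, 55.3],
    "node_c": [12.7, 12.9, 13.1, 12.8],
}

def local_stats(prices):
    """Runs 'next to' the data: returns only a compact partial result."""
    return sum(prices), len(prices)

# Combine the partial results centrally to get the global average price.
partials = [local_stats(prices) for prices in partitions.values()]
total, count = map(sum, zip(*partials))
print(f"global average across {count} prices: {total / count:.3f}")
```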

Halfon also described in-database analytics, which puts analytics in the database engine to take advantage of its multi-tasking capabilities and allows analytics to run on the data held in the database, rather than on a sample extracted for analysis on a desktop computer.
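
The sketch below illustrates the general idea using Python’s standard sqlite3 module rather than any product discussed on the panel: a user-defined aggregate is registered inside the database engine so the calculation runs where the data sits, and only the final answer leaves the database. The trades table and figures are invented.

```python
# Minimal sketch of in-database analytics: a user-defined aggregate (a simple
# volume-weighted average price) is registered inside the database engine, so
# the analytic runs over the data in place rather than over rows extracted to
# a desktop.
import sqlite3

class VWAP:
    """Aggregate computed inside the database engine."""
    def __init__(self):
        self.notional = 0.0
        self.volume = 0.0
    def step(self, price, size):
        self.notional += price * size
        self.volume += size
    def finalize(self):
        return self.notional / self.volume if self.volume else None

conn = sqlite3.connect(":memory:")
conn.create_aggregate("vwap", 2, VWAP)
conn.execute("CREATE TABLE trades (instrument TEXT, price REAL, size REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("ABC", 101.5, 200), ("ABC", 101.7, 100), ("ABC", 101.4, 300)])

# Only the final answer leaves the database, not the underlying trades.
print(conn.execute(
    "SELECT instrument, vwap(price, size) FROM trades GROUP BY instrument"
).fetchall())
```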

Grant said embedding in-memory analytics in a database could deliver execution speeds, for example around the pricing function, 100 times faster than traditional systems. He also described event streaming, which streams data through a model, rather than stopping the data to analyse it, and delivers extremely fast results.
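
By way of illustration only, the following sketch streams price ticks through an incremental model, here a simple exponentially weighted moving average, so an updated result is available after every event instead of the data being stopped for batch analysis. The tick values are invented.

```python
# Minimal sketch of "streaming data through a model": each event updates an
# incremental calculation (an exponentially weighted moving average of prices)
# as it arrives, rather than the data being stopped, stored and analysed in a
# batch.
def ewma_stream(events, alpha=0.2):
    """Yield an updated running estimate for every incoming price event."""
    estimate = None
    for price in events:
        estimate = price if estimate is None else alpha * price + (1 - alpha) * estimate
        yield estimate  # a fresh result is available after every event

ticks = [101.5, 101.7, 101.4, 101.9, 102.2]
for price, est in zip(ticks, ewma_stream(ticks)):
    print(f"tick {price:.1f} -> running estimate {est:.3f}")
```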

Complex event processing using streaming data was also on the agenda as a means of delivering decisions in microseconds or milliseconds, either in the front office or, for the first time, in the middle office for applications such as risk management.
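
A minimal, illustrative sketch of the middle office use case, not a production risk model: trade events stream through a short sliding window per desk and an alert fires when a hypothetical exposure limit is breached. The limit, window size and trades are assumptions for the example.

```python
# Minimal sketch of a complex event processing style check for the middle
# office: events stream through a per-desk sliding window and an alert fires
# when the windowed net exposure breaches a hypothetical risk limit.
from collections import deque

LIMIT = 1_000_000   # illustrative risk limit
WINDOW = 3          # number of most recent trades considered per desk

def exposure_alerts(trades):
    """Yield (desk, exposure) whenever the windowed net exposure breaches LIMIT."""
    windows = {}
    for desk, notional in trades:
        window = windows.setdefault(desk, deque(maxlen=WINDOW))
        window.append(notional)
        exposure = sum(window)
        if abs(exposure) > LIMIT:
            yield desk, exposure

trades = [("rates", 400_000), ("rates", 350_000), ("fx", 200_000), ("rates", 500_000)]
for desk, exposure in exposure_alerts(trades):
    print(f"ALERT: {desk} windowed exposure {exposure:,} exceeds limit")
```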

“Ultimately this is about getting technologies to work together in an optimal architecture, getting data to flow smoothly and ensuring there is only a single version of data that is pointed to rather than moved,” concluded Halfon.

