Data Integration and Emerging Technologies Offer Beneficial Opportunities

Data integration and the adoption of technologies including data-location-aware computation, in-database analytics and event stream processing will improve analytics, reduce decision-making time and, in turn, help cut the costs associated with duplicated data.

Addressing the issue of integrating data for high-compute enterprise analytics, a panel at A-Team Group’s Data Management for Risk, Analytics and Valuations Conference, held in London today, discussed the move from traditional silos of data to an integrated approach, including new technologies that could deliver benefits not only to front office decision making, but also to middle office functions such as risk management.

Under the auspices of panel chair Andrew Delaney, president and editor-in-chief at A-Team Group, panel members discussed elements of change that could improve the data environment. All agreed that development would be a continuum, but as Colin Rickard, business development director, Europe, at DataFlux, warned: “First, we must go down the route of understanding data better; if we don’t, we will be in a bigger mess quicker. As data has moved from batch to near real time, there are issues of data governance and understanding that need to be tackled before moving on to analytics.”

Acknowledging that silos of data exist in many firms and that it takes too much time to move all the data to a data warehouse using ETL tools, the panellists favoured data management solutions that allow applications to access siloed data as required.

Stuart Grant, EMEA business development manager in the financial services group at Sybase, explained: “It is not necessarily important to move data from silos, but it is necessary to be able to do analytics on data where it is, so the breakdown of the data needs to be more virtual than physical. The aim is to be able to query any data, for any information, at any time.”
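
To make the virtual rather than physical breakdown Grant describes concrete, here is a minimal sketch in Python using SQLite’s ATTACH mechanism as a stand-in for a data virtualisation layer; the silo files, table names and columns are hypothetical and do not refer to any vendor product.

import sqlite3

# Hypothetical silo 1: front office trade store.
front = sqlite3.connect("front_office.db")
front.execute("CREATE TABLE IF NOT EXISTS trades (isin TEXT, qty INTEGER)")
front.execute("INSERT INTO trades VALUES ('XS0001', 100)")
front.commit()
front.close()

# Hypothetical silo 2: risk limits store.
risk = sqlite3.connect("risk.db")
risk.execute("CREATE TABLE IF NOT EXISTS limits (isin TEXT, max_qty INTEGER)")
risk.execute("INSERT INTO limits VALUES ('XS0001', 500)")
risk.commit()

# One query spans both silos where they sit, with no copy into a warehouse.
risk.execute("ATTACH DATABASE 'front_office.db' AS front")
rows = risk.execute("""
    SELECT t.isin, t.qty, l.max_qty
    FROM front.trades AS t
    JOIN limits AS l ON l.isin = t.isin
""").fetchall()
print(rows)
risk.close()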

Looking at data types, particularly unstructured data, and again considering the option of not moving data, Amir Halfon, senior director of technology, capital markets, in Oracle’s global financial services business, said: “What is new is bringing unstructured data into the structure of analytics, not by moving the data back and forth, but by pointing to the unstructured data from the structure and, for example, undertaking sentiment analysis. We have the technology and there are big opportunities to use it to solve data challenges, but we need support from the organisation and we need the people resources to make it work.”
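
Halfon’s pattern of pointing at unstructured data from a structured record, rather than moving it, can be sketched as follows; the file path and the toy lexicon-based sentiment scorer are illustrative assumptions, not a production model.

from pathlib import Path

# Toy lexicon scorer standing in for a real sentiment model.
POSITIVE = {"beat", "upgrade", "strong"}
NEGATIVE = {"miss", "downgrade", "weak"}

def sentiment(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# The structured record points at the document instead of embedding it.
news_index = [
    {"isin": "XS0001", "doc": Path("news/xs0001_earnings.txt")},  # hypothetical path
]

for record in news_index:
    if record["doc"].exists():  # dereference the pointer in place...
        record["score"] = sentiment(record["doc"].read_text())  # ...and analyse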

Grant agreed that there are opportunities to use existing technologies for new data types, but pointed out that the biggest missing data set in most firms is metadata. Taking a step back from the cutting edge, Rickard commented: “There is room for future discussion on unstructured data, but at the moment there is still headroom to sort out structured data. I have yet to find firms that have a grip on issues like metadata.”

Picking up this point, Halfon argued: “This is not about either one thing or another, but about a continuum from dealing with structured data to unstructured data and so on.” The continuum will take in new technologies, some of which are already being deployed, he said, highlighting data-location-aware computation. This is relatively easy to deploy and requires few application changes, while delivering increases in analytics performance and a reduction in calculation times.
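
The principle is easy to sketch: a scheduler routes the computation to the node that already holds a data partition, so only small partial results cross the network. The node names, partitions and averaging task below are hypothetical.

# Hypothetical placement map: which prices each node holds locally.
partitions = {
    "node-a": [101.2, 99.8, 100.5],
    "node-b": [250.1, 249.7],
}

def run_on_node(node: str, func):
    # Stand-in for remote execution; in a cluster this would be an RPC.
    return func(partitions[node])

# Each owning node computes a local mean; only the small results move.
partials = [(len(partitions[n]), run_on_node(n, lambda xs: sum(xs) / len(xs)))
            for n in partitions]

# Combine the partials into the global mean without shipping raw data.
total = sum(n * avg for n, avg in partials) / sum(n for n, _ in partials)
print(round(total, 2))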

Halfon also described in-database analytics, which puts analytic processing inside the database engine to take advantage of its multi-tasking capabilities and to run calculations across all the data in the database, rather than on a sample extracted for analysis on a desktop computer.
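
As a rough illustration of the approach, the sketch below registers a variance aggregate with SQLite so the calculation runs inside the engine over every row, instead of extracting data for analysis elsewhere; SQLite and the aggregate are illustrative stand-ins, not the products discussed on the panel.

import sqlite3

class Variance:
    """Streaming (Welford) variance; the engine calls step once per row."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def step(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
    def finalize(self):
        return self.m2 / self.n if self.n else None

db = sqlite3.connect(":memory:")
db.create_aggregate("variance", 1, Variance)
db.execute("CREATE TABLE returns (r REAL)")
db.executemany("INSERT INTO returns VALUES (?)", [(0.01,), (-0.02,), (0.005,)])

# The aggregate executes inside the database over all rows, not a sample.
print(db.execute("SELECT variance(r) FROM returns").fetchone()[0])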

Grant said embedding in-memory analytics in a database could provide execution speeds, perhaps for functions such as pricing, 100 times faster than traditional systems. He also described event streaming, which streams data through a model rather than stopping the data to analyse it, delivering extremely fast results.
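
The streaming idea can be sketched as a model that updates with every event flowing past, rather than stopping the data for a batch pass; the tick values and the exponentially weighted moving average standing in for a pricing model are illustrative.

def ewma_stream(events, alpha=0.2):
    """Each arriving event updates the running estimate and emits a result."""
    estimate = None
    for price in events:
        estimate = price if estimate is None else alpha * price + (1 - alpha) * estimate
        yield estimate

ticks = [100.0, 100.4, 99.9, 101.2]   # hypothetical price events
for out in ewma_stream(ticks):
    print(round(out, 3))              # a fresh result per event, no batch step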

Complex event processing using streaming data was also on the agenda as a means of delivering decisions in microseconds or milliseconds, either in the front office or, for the first time, in the middle office for applications such as risk management.
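
A complex event processing rule amounts to pattern matching over a time window. The sketch below raises an alert when three limit breaches for the same book arrive within two seconds; the event shape, book name and thresholds are hypothetical.

from collections import defaultdict, deque

WINDOW_SECS, PATTERN_COUNT, LIMIT = 2.0, 3, 1_000_000

recent = defaultdict(deque)   # book -> timestamps of recent breaches

def on_event(ts: float, book: str, exposure: float):
    if exposure <= LIMIT:
        return None
    q = recent[book]
    q.append(ts)
    while q and ts - q[0] > WINDOW_SECS:
        q.popleft()           # expire events outside the window
    if len(q) >= PATTERN_COUNT:
        return f"ALERT {book}: {len(q)} breaches in {WINDOW_SECS}s"
    return None

for e in [(0.1, "rates", 1_200_000), (0.9, "rates", 1_500_000),
          (1.6, "rates", 1_100_000), (5.0, "rates", 1_050_000)]:
    alert = on_event(*e)
    if alert:
        print(alert)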

“Ultimately this is about getting technologies to work together in an optimal architecture, getting data to flow smoothly and ensuring there is only a single version of data, one that is pointed to rather than moved,” concluded Halfon.
