The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Integration and Emerging Technologies Offer Beneficial Opportunities

Data integration and the adoption of technologies including data location-aware computation, in-database analytics and event stream processing will improve analytics, reduce decision-making time and, in turn, help cut the costs associated with duplicated data.

Addressing the issue of integrating data for high-compute enterprise analytics, a panel at A-Team Group’s Data Management for Risk, Analytics and Valuations Conference, held in London today, discussed the move from traditional silos of data to an integrated approach including new technologies that could deliver benefits not only to front office decision making, but also to middle office functions such as risk management.

Under the auspices of panel chair Andrew Delaney, president and editor-in-chief at A-Team Group, panel members discussed elements of change to improve the data environment. All agreed that development would be a continuum, but as Colin Rickard, business development director, Europe, DataFlux, warned: “First, we must go down the route of understanding data better; if we don’t, we will be in a bigger mess quicker. As data has moved from batch to near real time, there are issues of data governance and understanding that need to be challenged before moving on to analytics.”

Acknowledging that silos of data exist in many firms and that it takes too much time to move all the data to a data warehouse using ETL tools, the panellists favoured data management solutions that allow applications to access siloed data as required.

Stuart Grant, EMEA business development manager in the financial services group at Sybase, explained: “It is not necessarily important to move data from silos, but it is necessary to be able to do analytics on data where it is, so the breakdown of the data needs to be more virtual than physical. The aim is to be able to query any data, for any information, at any time.”
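
Grant’s aim of querying any data where it sits is, in essence, query federation: each silo answers its own aggregation in place and only the small results travel, rather than bulk-loading everything into a warehouse first. A minimal sketch of the idea, using SQLite in-memory databases as stand-in silos with invented positions:

```python
import sqlite3

def make_silo(rows):
    # Each silo is an independent database that stays where it is.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE positions (desk TEXT, notional REAL)")
    db.executemany("INSERT INTO positions VALUES (?, ?)", rows)
    return db

equities = make_silo([("cash", 1_000_000.0), ("program", 250_000.0)])
rates = make_silo([("swaps", 4_000_000.0), ("futures", 750_000.0)])

def federated_exposure(silos):
    # Push the aggregation down into each silo; only one number per
    # silo crosses the boundary -- the data itself never moves.
    return sum(
        db.execute("SELECT SUM(notional) FROM positions").fetchone()[0]
        for db in silos
    )

print(federated_exposure([equities, rates]))  # 6000000.0
```

A production federation layer would of course handle heterogeneous schemas and remote connections; the point here is only that the breakdown of silos is virtual, not physical.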

Looking at data types, particularly unstructured data, and again considering the option of not moving data, Amir Halfon, senior director of technology, capital markets, in Oracle’s global financial services business, said: “What is new is bringing unstructured data into the structure of analytics, not by moving the data back and forth, but by pointing to the unstructured data from the structure and, for example, undertaking sentiment analysis. We have the technology and there are big opportunities to use it to solve data challenges, but we need support from the organisation and we need the people resources to make it work.”
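
Halfon’s pattern of pointing to unstructured data from the structured side, rather than copying it back and forth, might look like the following sketch. The sentiment lexicon, documents and trade records are all invented for illustration:

```python
# Toy sentiment lexicon (invented for illustration).
POSITIVE = {"beat", "upgrade", "strong", "growth"}
NEGATIVE = {"miss", "downgrade", "weak", "loss"}

# Unstructured store, addressed by document id -- the documents stay here.
documents = {
    "doc-1": "Q3 results beat estimates on strong growth",
    "doc-2": "Guidance miss prompts downgrade amid weak demand",
}

# The structured side holds only a pointer into the unstructured store.
trades = [
    {"ticker": "ABC", "doc_ref": "doc-1"},
    {"ticker": "XYZ", "doc_ref": "doc-2"},
]

def sentiment(doc_id):
    # Score the referenced text in place, without ingesting it.
    words = documents[doc_id].lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

scored = {t["ticker"]: sentiment(t["doc_ref"]) for t in trades}
print(scored)  # {'ABC': 3, 'XYZ': -3}
```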

Grant agreed that there are opportunities to use existing technologies for new data types, but pointed out that the biggest missing data set in most firms is metadata. Taking a step back from the cutting edge, Rickard commented: “There is room for future discussion on unstructured data, but at the moment there is still headroom to sort out structured data. I have yet to find firms that have a grip on issues like metadata.”

Picking up this point, Halfon argued: “This is not about either one thing or another, but about a continuum from dealing with structured data to unstructured data and so on.” The continuum will take in new technologies and some are being deployed, he said, highlighting data location-aware computation. This is relatively easy to deploy and requires few application changes, while delivering increases in analytics performance and a reduction in calculation times.
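
The principle behind location-aware computation is to dispatch the calculation to wherever each data partition lives, so that only small partial results travel back. A hedged sketch, with the "nodes" simulated as in-process partitions and the figures invented:

```python
# Hypothetical sketch: each partition lives on its own node; the grid
# would normally execute the function remotely, co-located with the data.
partitions = {
    "node-ldn": [101.5, 99.8, 102.1],
    "node-nyc": [98.7, 100.2],
}

def run_where_data_lives(fn):
    # Stand-in for remote execution: fn runs "next to" each partition
    # and returns only a tiny aggregate, never the raw values.
    return {node: fn(values) for node, values in partitions.items()}

partial = run_where_data_lives(lambda xs: (sum(xs), len(xs)))
total, count = map(sum, zip(*partial.values()))
print(round(total / count, 2))  # global mean assembled from partial sums
```

The application change is small, as Halfon suggests: the analytic is expressed as a per-partition function plus a cheap combine step, and the raw data never moves.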

Halfon also described in-database analytics, which puts analytics in the database engine to take advantage of its multi-tasking capabilities and allows analysis of the full database rather than a sample of data extracted from the database for analysis on a desktop computer.
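
The idea can be illustrated with SQLite’s standard `create_aggregate` hook, which registers a custom analytic inside the database engine so it runs over the full table in place. A minimal sketch with an invented returns series (a production deployment would use a vendor engine’s in-database libraries, not SQLite):

```python
import sqlite3

class Variance:
    # A user-defined aggregate: the analytic executes inside the
    # database engine, over every row, instead of on an extracted sample.
    def __init__(self):
        self.values = []

    def step(self, value):
        self.values.append(value)

    def finalize(self):
        n = len(self.values)
        mean = sum(self.values) / n
        return sum((v - mean) ** 2 for v in self.values) / n

db = sqlite3.connect(":memory:")
db.create_aggregate("variance", 1, Variance)
db.execute("CREATE TABLE returns (r REAL)")
db.executemany("INSERT INTO returns VALUES (?)", [(0.01,), (0.03,), (-0.02,)])
(var,) = db.execute("SELECT variance(r) FROM returns").fetchone()
print(var)
```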

Grant said embedding in-memory analytics in a database could provide execution speeds, perhaps around the pricing function, 100 times faster than traditional systems. He also described event streaming, which streams data through a model rather than stopping the data to analyse it, and delivers extremely fast results.
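
Streaming data through a model means the model updates incrementally on every event, so a result is available continuously and the data never has to be parked and re-queried. A sketch of a running VWAP computed this way, with invented ticks:

```python
def streaming_vwap(ticks):
    # The model state is two running sums; each event updates them and
    # immediately yields a fresh result -- the stream never stops.
    notional = volume = 0.0
    for price, qty in ticks:
        notional += price * qty
        volume += qty
        yield notional / volume

ticks = [(100.0, 10), (101.0, 20), (99.0, 30)]
for vwap in streaming_vwap(ticks):
    print(round(vwap, 3))
```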

Complex event processing using streaming data was also on the agenda as a means of delivering decisions in microseconds or milliseconds, either in the front office or, for the first time, in the middle office for applications such as risk management.
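
Where plain stream processing computes a rolling figure, complex event processing looks for composite patterns across events. A hedged middle-office sketch: a risk alert fires when three exposure breaches fall inside a five-event window. The limit, window and events are all invented for illustration:

```python
from collections import deque

def detect_breach_clusters(events, limit=1_000_000, window=5, needed=3):
    # Sliding window of breach flags; the composite "alert" event fires
    # whenever enough breaches cluster inside the window.
    recent = deque(maxlen=window)
    alerts = []
    for i, exposure in enumerate(events):
        recent.append(exposure > limit)
        if sum(recent) >= needed:
            alerts.append(i)  # index of the event completing the pattern
    return alerts

events = [900_000, 1_200_000, 800_000, 1_500_000, 1_100_000, 700_000]
print(detect_breach_clusters(events))  # [4, 5]
```

A CEP engine evaluates such patterns continuously as events arrive, which is what makes micro- to millisecond decisions feasible for risk as well as trading.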

“Ultimately this is about getting technologies to work together in an optimal architecture, getting data to flow smoothly and ensuring a single version of data that is pointed to rather than moved,” concluded Halfon.
