Data Management Summit – Emerging Technologies Deliver Business Benefits

Big data, cloud computing, the semantic web, big meta data, logical data models and in-memory analytics featured in a debate about emerging technologies for enterprise architecture at this week’s A-Team Group Data Management Summit.

Colin Gibson, head of data architecture, markets and international banking at the Royal Bank of Scotland, set the scene by describing data management development work at the bank, before A-Team Group editor-in-chief Andrew Delaney stepped up to moderate a panel discussion including Gibson; Rupert Brown, IB CTO lead architect at UBS Investment Bank; Amir Halfon, chief technologist, financial services at MarkLogic; and Eyal Gutkind, senior manager, enterprise market development at Mellanox Technologies.

Gibson presented under the title Understanding Data – Analysis, Not Archaeology. He highlighted the need to understand data if maximum value is to be extracted from it and described the development of a data knowledge base at Royal Bank of Scotland. To avoid the data inconsistencies of running numerous data silos and the difficulties of spaghetti-style application architecture, Gibson based the data knowledge base on a meta data model, which was then populated with content. The development was not without challenges, such as mapping legacy data to the logical model and sustaining stamina throughout the build, but the outcome is a data management solution that meets business needs.
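
By way of illustration, a meta data driven knowledge base of this kind typically amounts to a catalogue that maps fields held in legacy silos onto a logical model. The sketch below is purely hypothetical: the entity names, source systems and attributes are invented for illustration and do not describe the bank’s actual implementation.

```python
# Hypothetical sketch of a meta data catalogue that maps legacy physical
# fields onto a logical data model. All names are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class LogicalAttribute:
    """An attribute in the logical model, e.g. a counterparty identifier."""
    name: str
    description: str
    data_type: str


@dataclass
class LegacyFieldMapping:
    """Records where a logical attribute lives in a legacy silo."""
    source_system: str                 # e.g. "settlements_db"
    source_field: str                  # e.g. "CPTY_CD"
    transformation: str = "identity"   # e.g. "trim and uppercase"


@dataclass
class KnowledgeBaseEntry:
    attribute: LogicalAttribute
    mappings: list[LegacyFieldMapping] = field(default_factory=list)


# Populating the knowledge base with content, one logical attribute at a time.
knowledge_base = {
    "counterparty_id": KnowledgeBaseEntry(
        attribute=LogicalAttribute(
            name="counterparty_id",
            description="Unique identifier of the trading counterparty",
            data_type="string",
        ),
        mappings=[
            LegacyFieldMapping("settlements_db", "CPTY_CD"),
            LegacyFieldMapping("risk_warehouse", "COUNTERPARTY", "trim and uppercase"),
        ],
    ),
}

# A consumer can ask where a logical attribute is sourced from, rather than
# trawling each silo's schema by hand.
for mapping in knowledge_base["counterparty_id"].mappings:
    print(mapping.source_system, "->", mapping.source_field)
```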

Panellists agreed that an enterprise approach to data management is essential in an increasingly regulated market where firms must report on both structured and unstructured data, manage internal risk and deliver agile solutions to demanding customers. Halfon commented: “If data can be made available without building a data warehouse, it is possible to be more agile and get to trading more quickly. Adding types of information other than trade data, perhaps political data or news analysis, can deliver better returns.”

It is these types of business benefits that win C-suite buy-in for data management programmes, but the benefits cannot be delivered without technology innovation. Halfon noted an industry move towards logical data warehouses with data schemas dictated by consumers rather than domain experts, as well as the emerging power of semantics in systems development. Brown, like Gibson, advocated the use of meta data models and described the criticality of data sequencing in the ‘forensic pathology’ of finding out how something has happened.

In terms of specific technologies, big data got a thumbs up from the US contingent in the conference room and a thumbs down from the Europeans. The panellists agreed that, whether or not you like the term, and despite the industry having dealt with big data’s three Vs of volume, velocity and variety for many years, its elements still have a part to play in data management.

Gutkind explained: “We see people wanting to do analysis on data as it flies, so velocity is important. It is not just about putting data into a system for analysis, but analysing it as it goes in.” This is the role of in-memory processing, a technology that is gaining ground but one that needs to be part of a wider velocity and volume solution that could also encompass local solid-state drives and cloud storage.
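
As a simple illustration of analysing data “as it goes in”, an in-memory rolling statistic can be updated on every arriving record rather than after a bulk load. The snippet below is a generic sketch with invented values and a stand-in price feed; it does not refer to any product discussed on the panel.

```python
# Hypothetical sketch: updating an in-memory rolling statistic as each record
# arrives, instead of loading everything first and analysing afterwards.
from collections import deque


class RollingMean:
    """Maintains the mean of the last `window` observations in memory."""

    def __init__(self, window: int):
        self.values: deque[float] = deque(maxlen=window)

    def update(self, value: float) -> float:
        self.values.append(value)              # oldest value drops off automatically
        return sum(self.values) / len(self.values)


def on_tick(price: float, stats: RollingMean) -> None:
    """Called for every incoming record: the analysis happens on ingestion."""
    mean = stats.update(price)
    if price > 1.05 * mean:                    # simple threshold check on the fly
        print(f"price {price:.2f} is 5% above the rolling mean of {mean:.2f}")


if __name__ == "__main__":
    stats = RollingMean(window=100)
    for price in (100.0, 101.5, 99.8, 110.0):  # stand-in for a live feed
        on_tick(price, stats)
```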

On variety, Halfon said: “Variety is where the challenge and opportunity lies. If all data can be brought together in a way that has not been done before, it is possible to manage risk and regulation, and deliver revenue and returns.” He cited tools such as Hadoop and MarkLogic’s NoSQL and search capability as a means of managing and benefitting from big data.

Although cloud computing is some years down the development road, the panellists voted in its favour. Gibson acknowledged its elasticity to deal with data spikes, while Brown suggested it can support better data management as the location of data, and where it is moving to and from, can be tracked. He also proposed that clouds could include models of system topology and simulations to discover where data is best placed.
