Flexible data architecture has become essential to financial institutions seeking to respond quickly to both market opportunities and challenges, but it can be difficult to build. Experts on the subject convened at A-Team Group’s Data Management Summit for a panel discussion covering the problems inherent in developing flexible enterprise data infrastructure, as well as some solutions.
Andrew Delaney, editor-in-chief at A-Team Group, opened the discussion by asking why flexibility in data architecture is important and how it can be achieved. Neill Vanlint, managing director, EMEA and Asia at GoldenSource, said: “Flexibility is important because firms have to start development from where they are and most have legacy systems. Getting applications to talk to each other is like putting lipstick on a pig, but we do have tools such as application programming interfaces that can help by plugging applications into an enterprise data platform.”
Amir Halfon, chief technologist, financial services at MarkLogic, added: “We need flexible data architecture to be able to respond to business drivers, such as the need to trade more complex instruments, and to support the changing needs of reference data consumers in areas such as compliance and risk.”
Building a flexible data architecture that transcends structured relational data models can be difficult and often requires change. Rupert Brown, lead architect in the office of the CTO at UBS Investment Bank, explained: “Change is driven by circumstance, fashion and what people believe in, but in a regulated world we need to think about process and the evidence of process that is required by auditors and regulators. We also need to think about function, but first we need to sort out how data will support process and function in a flexible and efficient way.”
Responding to a question from Delaney about what is driving the take-up of more flexible data infrastructure, Peter Glerup Ericson, an IT analyst at Nordea, said: “Regulation demanding risk data aggregation and the principles of data quality are driving the need for flexible data infrastructure. In turn, this is driving banks to get together on standards such as Fibo – Financial Industry Business Ontology – but these standards need to gain more traction if they are to satisfy regulators and drive internal development.”
In terms of tools for building flexible data architecture, Ericson suggested the Unified Modelling Language. He also favoured semantic technologies that encode the meaning of words separately from data, explaining: “The Fibo initiative is trying to define one meaning for terms used across the financial industry. If the resulting ontology was used in house and by regulators, both would understand data in the same way.” Interest in semantics, particularly semantic web, was noted by a number of panellists, but they agreed that the technology has yet to become mainstream and needs drivers, perhaps a regulatory requirement, to accelerate adoption.
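To make the idea concrete, the sketch below shows, in simplified form, how semantic technologies separate meaning from data: facts are stored as subject-predicate-object triples whose predicates point at terms in a shared ontology, so any party holding the same ontology, whether a bank or a regulator, interprets the data identically. The term names and identifiers here are illustrative placeholders, not actual FIBO terms.

```python
# Illustrative sketch only: a shared ontology maps term identifiers to agreed
# definitions, while the data itself is stored as subject-predicate-object
# triples that reference those identifiers. The "ex:" and "lei:" identifiers
# below are hypothetical, not real FIBO or LEI values.

ONTOLOGY = {
    "ex:hasNotionalAmount": "The total face value of a derivative contract",
    "ex:hasCounterparty": "The legal entity on the other side of a trade",
}

triples = [
    ("trade:1001", "ex:hasNotionalAmount", 5_000_000),
    ("trade:1001", "ex:hasCounterparty", "lei:EXAMPLE0000000001"),
]

def describe(triples, ontology):
    """Resolve each predicate against the shared ontology, so every
    consumer of the data reads the same meaning for each term."""
    return [
        (subject, ontology.get(predicate, "unknown term"), obj)
        for subject, predicate, obj in triples
    ]

for subject, meaning, obj in describe(triples, ONTOLOGY):
    print(f"{subject}: {meaning} -> {obj}")
```

Because the definitions live in the ontology rather than in each application's schema, a new attribute can be added by publishing a new term, without changing the structure of existing records, which is the flexibility the panellists describe.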
Questioning the characteristics of a flexible data architecture, Delaney turned to Brown, who answered: “Flexible architecture should stand the test of time and support whatever focus there is in the market. The architecture needs an underlying architecture that describes it and the data model must be sufficiently semantic to make it flexible.”
On the issue of trade-offs between data management flexibility and performance, Halfon commented: “Flexibility does have performance implications as moving data into a less structured environment becomes more taxing for systems. But a much bigger impediment to performance and success is the lack of a champion who understands new generation data management technologies.”