The knowledge platform for the financial technology industry

A-Team Insight Blogs

The Next Frontier in Data Management: Responses to Regulation


No capital markets technology conference goes by without frequent mention of regulation, so here at A-Team Group we have put together a distinguished panel of speakers who will review the regulatory landscape and suggest how firms can develop data strategies compatible with a volatile environment at this week’s Data Management Summit in London.

The panel will be moderated by regulatory financial specialist Selwyn Blair-Ford and joined by Francis Gross, senior advisor to the directorate of general statistics at the European Central Bank; James Phillips, global head of regulatory strategy at Lombard Risk; Peter O’Keefe, an independent data management expert; Brian Sentance, CEO at Xenomorph; and Alessandro Sanos, market development manager, risk and enterprise, at Thomson Reuters.

To give you a little insight into how the discussion might unfold, we caught up with some of the speakers ahead of the event. Brexit in Europe and the Trump administration in the US are expected to cause significant change and will add to regulation already coming down the track, while Markets in Financial Instruments Directive II (MiFID II), the Fundamental Review of the Trading Book (FRTB) and AnaCredit are cited as the toughest regulations in sight.

Xenomorph’s Sentance suggests that in a volatile market one key action for the industry is to put data in the hands of the business and encourage a strategic and flexible, rather than tactical, approach to data management that will make it easy for business users to access and manipulate data as events occur. In terms of regulation, he adds: “The challenge is to put a data architecture in place that can cope with multiple regulations. It needs to recognise datasets that overlap across regulations and be flexible enough to accommodate future regulation.”
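Sentance’s point about recognising datasets that overlap across regulations can be sketched in code. The field lists below are invented for illustration (they are not drawn from this article or from the regulatory texts); the idea is simply that mapping each regulation to the data elements it requires lets a firm identify elements worth sourcing once, strategically, rather than once per regulation.

```python
from collections import Counter

# Hypothetical field requirements per regulation -- illustrative only,
# NOT taken from the actual regulatory texts.
REG_FIELDS = {
    "MiFID II":  {"isin", "lei", "trade_timestamp", "price", "venue"},
    "FRTB":      {"isin", "price", "risk_factor", "liquidity_horizon"},
    "AnaCredit": {"lei", "loan_id", "counterparty_country"},
}

def overlapping_fields(regs):
    """Return the data elements required by more than one regulation,
    i.e. candidates for a single shared data source."""
    counts = Counter(field for reg in regs for field in REG_FIELDS[reg])
    return {field for field, n in counts.items() if n > 1}
```

Under these made-up mappings, `overlapping_fields(REG_FIELDS)` would flag `isin`, `lei` and `price` as elements shared across regimes, which is the kind of overlap a flexible data architecture should exploit.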

Lombard Risk’s Phillips notes that more granular regulatory reporting requirements are driving the need to improve data quality and suggests the benefit of this will be the ability to send accurate data to regulators that will then look for hot spots. Like Sentance, he promotes a strategic approach to data management, saying: “The approach has to identify destinations for granular reporting, implement agility for the future and delete inefficiencies. Firms need granular data and data quality to be always on so that they are ready for reporting at any time and can gain competitive advantage.”

Looking at the financial landscape from a regulatory perspective, Francis Gross at the European Central Bank notes that regulators are building ever larger data systems and feeding them with more granular data in near real time. The reasons for this are lessons learnt from the 2008 financial crisis, when data could not be aggregated fast enough to present a clear picture of risk.

While this is an improvement on the regulatory front, Gross points to two persistent problems: regulators are still typically economists and lawyers rather than technologists, and they habitually set out policy while leaving the market to find solutions. The result, he says, is that firms throw money at solving data problems individually when the industry should be taking collective action.

He says: “We need to see regulators being more courageous and leading in fields such as standardising the digital representation of entities. Contracts also need to make progress. As reporting requirements increase, the only way to reduce the burden on industry is to build infrastructure that will allow organisations to automate the delivery of more data, faster and at a lower cost. The Legal Entity Identifier (LEI) supports this in providing a standard entity identifier. We need to do the same for contracts, then mobilise legislation and reduce the cost of regulatory compliance to a minimum.”
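The LEI that Gross cites is a concrete example of the kind of standard that enables automation: it is a 20-character alphanumeric code defined by ISO 17442 whose last two characters are check digits computed with the ISO 7064 MOD 97-10 scheme (the same family of checksum used for IBANs), so well-formedness can be verified mechanically before data ever reaches a regulator. A minimal sketch:

```python
import re

def lei_is_valid(lei: str) -> bool:
    """Check the structure of an ISO 17442 Legal Entity Identifier:
    18 alphanumeric characters plus two decimal check digits that
    satisfy the ISO 7064 MOD 97-10 checksum."""
    lei = lei.strip().upper()
    if not re.fullmatch(r"[0-9A-Z]{18}[0-9]{2}", lei):
        return False
    # Map A->10 ... Z->35, concatenate the digits, apply the mod-97 test.
    as_digits = "".join(str(int(ch, 36)) for ch in lei)
    return int(as_digits) % 97 == 1
```

Note this checks only structural validity; whether an LEI is actually registered and current is a separate question answered by the GLEIF database, not by the checksum.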

