

Making Data Valuable Requires Quality and Structure


Building the ability to recognise the value in data and to derive that value for the enterprise is difficult to embed in data management models, according to experts who spoke at the Data Management Summit hosted by A-Team Group in New York earlier this month.

First, firms should establish an “accountability structure through a model that cascades from the data owner at the highest level, to the people on the line, the operations, the people who originate accounts, and the IT people who fix accounts with business system owners,” said Tom Mavroudis, head of enterprise data governance at MUFG Union Bank.
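
To make the idea concrete, the sketch below shows one way such a cascading accountability structure could be represented in code, running from the data owner at the top down through business system owners to operations and IT. It is a minimal illustration only; the role names, people and responsibilities are hypothetical rather than MUFG's actual model.

```python
# A minimal sketch of a cascading accountability structure, from the data
# owner at the top down to operational and IT roles. Role names, people and
# responsibilities are hypothetical, not MUFG's actual model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataRole:
    """A node in the accountability cascade for a dataset."""
    name: str
    person: str
    responsibilities: List[str] = field(default_factory=list)
    delegates: List["DataRole"] = field(default_factory=list)

    def chain(self, indent: int = 0) -> str:
        """Render the cascade from the data owner down to the people on the line."""
        lines = [f"{'  ' * indent}{self.name}: {self.person}"]
        lines += [d.chain(indent + 1) for d in self.delegates]
        return "\n".join(lines)

# Hypothetical cascade: data owner -> business system owner -> operations / IT
owner = DataRole("Data Owner", "Head of Client Reference Data",
                 ["approve definitions", "sign off on quality thresholds"])
bso = DataRole("Business System Owner", "Account Origination Lead",
               ["originate accounts to standard"])
it_fix = DataRole("IT Remediation", "Platform Support",
                  ["fix accounts flagged by quality checks"])
bso.delegates.append(it_fix)
owner.delegates.append(bso)

print(owner.chain())
```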

The data management model needs a prioritisation capability, or a mechanism to recognise value, according to Jennifer Ippoliti, firmwide data governance lead at JP Morgan Chase. “Knowing which problems to go after, in what order and what to expect from them, means when you close out that issue, you know it was worth the time and effort,” she said. “It’s specific to the technology and business partnership – and really clear about roles, responsibilities and division of labour. You cannot ever stop iterating that model and making it more precise.”
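
A prioritisation capability of this kind is often reduced to a simple scoring model. The sketch below is an illustration only, not JP Morgan Chase's method: each data issue is scored on business impact, reach and remediation effort, so teams know which problems to go after, in what order, and whether closing one was worth the time.

```python
# An illustrative scoring model for a prioritisation mechanism; the fields
# and weights are assumptions for this sketch, not JP Morgan Chase's approach.

def priority_score(issue: dict,
                   w_impact: float = 0.5,
                   w_reach: float = 0.3,
                   w_effort: float = 0.2) -> float:
    """Higher score = go after it sooner; remediation effort counts against it."""
    return (w_impact * issue["business_impact"]
            + w_reach * issue["records_affected_pct"]
            - w_effort * issue["estimated_effort_days"] / 10)

issues = [
    {"id": "DQ-101", "business_impact": 9, "records_affected_pct": 4.0,
     "estimated_effort_days": 20},
    {"id": "DQ-102", "business_impact": 6, "records_affected_pct": 8.5,
     "estimated_effort_days": 5},
]

# Work the queue in priority order, and record the score so that, when an
# issue is closed out, you can check it was worth the time and effort.
for issue in sorted(issues, key=priority_score, reverse=True):
    print(issue["id"], round(priority_score(issue), 2))
```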

JP Morgan Chase addressed data management by reconfiguring its leadership, establishing a chief data officer (CDO) role and naming CDOs in different business lines and departments. “It’s about how we apply governance to unstructured datasets and the cloud, to make sure the foundational tools we build have the right datasets to begin with,” said Ippoliti.

Evaluating the precision of a data model can be done in many ways, but one should not discount sheer confidence in the data being used, stated Peter Serenita, group CDO, HSBC. “We can come up with a lot of different algorithms and metrics,” he said. “Some of them will mean something to someone, but ultimately, the answer is the firm’s confidence in the data it’s using. Call it a ‘confidence index’, which can indicate if the data previously did not seem to be fit-for-purpose and you didn’t want to use it, or if now, you have strong confidence in that data – that it has been checked and double-checked.”
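
As one possible reading of such a 'confidence index', and not HSBC's actual metric, the sketch below rolls a few routine checks into a single score, so that data which has been checked and double-checked scores visibly higher than stale, lightly validated data. The dimensions and weights are assumptions.

```python
# One illustrative way to roll routine checks into a single 0-100
# "confidence index"; the dimensions and weights are assumptions, not
# HSBC's method.

def confidence_index(completeness: float,
                     checks_passed: int,
                     checks_total: int,
                     days_since_validation: int) -> float:
    """Blend completeness, check pass rate and freshness into one score."""
    check_rate = checks_passed / checks_total if checks_total else 0.0
    freshness = max(0.0, 1.0 - days_since_validation / 90)  # decays over ~90 days
    return round(100 * (0.4 * completeness + 0.4 * check_rate + 0.2 * freshness), 1)

# Data that has been checked and double-checked scores high ...
print(confidence_index(completeness=0.99, checks_passed=48,
                       checks_total=50, days_since_validation=3))
# ... while stale, lightly validated data does not look fit-for-purpose.
print(confidence_index(completeness=0.80, checks_passed=10,
                       checks_total=25, days_since_validation=120))
```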

Trust is key to such confidence in data, particularly the precision of calculations, according to David Saul, senior vice president and chief scientist at State Street. “For anyone making a decision based on the data, their level of confidence in that decision is going to be about how well they trust it, do they know where it came from and what it means,” he said. “That’s true internally and for delivering data to our clients, that they trust we’ve done all the calculations. Also, regulators need to trust that what we deliver is an accurate representation of what is needed to reduce the risk in the market. You must have hard data to prove whether it’s the legacy of the data or it’s how you did the calculations, the algorithm you used and ultimately whether it means what you say it means.”

The origin of data can be tracked successfully by creating data ontologies over transactional and operational systems, which in turn determine how the data should be used for risk management, Saul added.
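
As a rough sketch of that idea, and not State Street's implementation, an ontology can be boiled down to subject-predicate-object triples that let you walk from a risk report back to the operational systems where its inputs originate. The entity and system names below are hypothetical.

```python
# A toy ontology as subject-predicate-object triples; entity and system
# names are hypothetical, and a real implementation would typically use
# RDF/OWL vocabularies rather than plain tuples.
triples = {
    ("risk_exposure_report", "derivedFrom", "position_feed"),
    ("risk_exposure_report", "derivedFrom", "counterparty_master"),
    ("position_feed", "originatesIn", "trading_system_A"),
    ("counterparty_master", "originatesIn", "onboarding_system"),
}

def origins(entity: str) -> set:
    """Walk 'derivedFrom' edges back to the operational systems of record."""
    sources = set()
    for subject, predicate, obj in triples:
        if subject != entity:
            continue
        if predicate == "derivedFrom":
            sources |= origins(obj)
        elif predicate == "originatesIn":
            sources.add(obj)
    return sources

print(sorted(origins("risk_exposure_report")))
# ['onboarding_system', 'trading_system_A']
```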

Building the accountability structures and prioritisation capabilities that Mavroudis and Ippoliti advocated requires the confidence in data quality and sourcing that Serenita and Saul described. To derive value from data, as the experts illustrated, accountability and priorities must be well defined and applied to trustworthy data.

