
A-Team Insight Blogs

Making Data Valuable Requires Quality and Structure


Building the ability to recognise the value in data and derive that value for the enterprise is a complicated proposition to include in data management models, according to experts who spoke at the Data Management Summit hosted by A-Team Group in New York earlier this month.

First, firms should establish an “accountability structure through a model that cascades from the data owner at the highest level, to the people on the line, the operations, the people who originate accounts, and the IT people who fix accounts with business system owners,” said Tom Mavroudis, head of enterprise data governance at MUFG Union Bank.
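The cascading structure Mavroudis describes can be pictured as a chain of ownership running from the data owner at the top down to operational staff. The roles and helper below are hypothetical, a minimal sketch of how an escalation path might be modelled rather than any bank's actual governance model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Role:
    """A node in a hypothetical accountability cascade."""
    title: str
    reports_to: Optional["Role"] = None

def escalation_path(role: Role) -> list[str]:
    """Walk upward from an operational role to the accountable data owner."""
    path = []
    current = role
    while current is not None:
        path.append(current.title)
        current = current.reports_to
    return path

# Illustrative cascade: data owner -> business system owner -> operations
owner = Role("Data Owner")
system_owner = Role("Business System Owner", reports_to=owner)
ops = Role("Account Operations", reports_to=system_owner)

print(escalation_path(ops))
# ['Account Operations', 'Business System Owner', 'Data Owner']
```

In a real programme each node would also carry responsibilities and contact points, but the point of the cascade is simply that every data issue has an unambiguous upward path to an accountable owner.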

The data management model needs a prioritisation capability, or a mechanism to recognise value, according to Jennifer Ippoliti, firmwide data governance lead at JP Morgan Chase. “Knowing which problems to go after, in what order and what to expect from them, means when you close out that issue, you know it was worth the time and effort,” she said. “It’s specific to the technology and business partnership – and really clear about roles, responsibilities and division of labour. You cannot ever stop iterating that model and making it more precise.”

JP Morgan Chase addressed data management by reconfiguring its leadership, establishing a chief data officer (CDO) role and naming CDOs in different business lines and departments. “It’s about how we apply governance to unstructured datasets and the cloud, to make sure the foundational tools we build have the right datasets to begin with,” said Ippoliti.

Evaluating the precision of a data model can be done in many ways, but one should not discount sheer confidence in the data being used, stated Peter Serenita, group CDO, HSBC. “We can come up with a lot of different algorithms and metrics,” he said. “Some of them will mean something to someone, but ultimately, the answer is the firm’s confidence in the data it’s using. Call it a ‘confidence index’, which can indicate if the data previously did not seem to be fit-for-purpose and you didn’t want to use it, or if now, you have strong confidence in that data – that it has been checked and double-checked.”
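One way to read Serenita's 'confidence index' is as a weighted aggregation of quality checks into a single score. The check names and weights below are invented for illustration; HSBC's actual metrics are not described in the source:

```python
def confidence_index(checks: dict[str, bool], weights: dict[str, float]) -> float:
    """Aggregate pass/fail data-quality checks into a single 0-1 confidence score.

    Each check name maps to whether it passed; weights express how much
    each check contributes to overall confidence in the data.
    """
    total = sum(weights.values())
    passed = sum(weights[name] for name, ok in checks.items() if ok)
    return passed / total

# Hypothetical checks on a dataset that has been checked but not double-checked
checks = {"source_verified": True, "values_validated": True, "double_checked": False}
weights = {"source_verified": 0.5, "values_validated": 0.3, "double_checked": 0.2}

print(round(confidence_index(checks, weights), 2))  # 0.8
```

A low score would flag data that, in Serenita's terms, does not yet seem fit-for-purpose; a score near 1.0 reflects data that has been checked and double-checked.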

Trust is key to such confidence in data, particularly the precision of calculations, according to David Saul, senior vice president and chief scientist at State Street. “For anyone making a decision based on the data, their level of confidence in that decision is going to be about how well they trust it, do they know where it came from and what it means,” he said. “That’s true internally and for delivering data to our clients, that they trust we’ve done all the calculations. Also, regulators need to trust that what we deliver is an accurate representation of what is needed to reduce the risk in the market. You must have hard data to prove whether it’s the legacy of the data or it’s how you did the calculations, the algorithm you used and ultimately whether it means what you say it means.”

Saul added that the origin of data can be tracked successfully by creating data ontologies in transactional or operational systems, which in turn determine how the data should be used for risk management.
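The kind of origin tracking Saul describes can be sketched as an ordered trail of lineage steps attached to a data item. The structures and system names below are hypothetical, purely to show the shape of a provenance record, not any particular ontology standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LineageStep:
    """One step in a data item's provenance trail."""
    system: str      # e.g. a transactional or operational system
    operation: str   # what happened to the data at this step

def trace(lineage: list[LineageStep]) -> str:
    """Render a data item's origin trail, oldest step first."""
    return " -> ".join(f"{step.system}:{step.operation}" for step in lineage)

# Illustrative trail from origination through enrichment to risk use
trail = [
    LineageStep("trade-capture", "created"),
    LineageStep("enrichment", "reference data joined"),
    LineageStep("risk-engine", "consumed"),
]

print(trace(trail))
# trade-capture:created -> enrichment:reference data joined -> risk-engine:consumed
```

A consumer deciding whether data is fit for risk management can then inspect the trail rather than trusting the data blindly, which is the link Saul draws between lineage and confidence.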

The threads of the panel connect: the accountability structures and prioritisation capabilities that Mavroudis and Ippoliti advocated depend on confidence in the quality and sourcing of data, as Serenita and Saul described. To derive value from data, accountability and priorities must be well defined, and they must be applied to data that can be trusted.

