
Making Data Valuable Requires Quality and Structure


Recognising the value in data and deriving that value for the enterprise is a difficult capability to build into data management models, according to experts speaking at the Data Management Summit hosted by A-Team Group in New York earlier this month.

First, firms should establish an “accountability structure through a model that cascades from the data owner at the highest level, to the people on the line, the operations, the people who originate accounts, and the IT people who fix accounts with business system owners,” said Tom Mavroudis, head of enterprise data governance at MUFG Union Bank.

The data management model needs a prioritisation capability, or a mechanism to recognise value, according to Jennifer Ippoliti, firmwide data governance lead at JP Morgan Chase. “Knowing which problems to go after, in what order and what to expect from them, means when you close out that issue, you know it was worth the time and effort,” she said. “It’s specific to the technology and business partnership – and really clear about roles, responsibilities and division of labour. You cannot ever stop iterating that model and making it more precise.”
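As a rough illustration of the kind of prioritisation mechanism Ippoliti describes, the sketch below ranks a backlog of data issues by estimated value against remediation effort. The fields, weights and figures are hypothetical examples, not JP Morgan Chase's actual model.

```python
from dataclasses import dataclass

@dataclass
class DataIssue:
    name: str
    business_value: float  # estimated value of remediation, e.g. in $k
    effort: float          # estimated remediation effort, e.g. person-days
    regulatory: bool       # regulatory exposure moves an issue up the queue

def priority_score(issue: DataIssue) -> float:
    """Rank by value per unit of effort, boosting regulatory items."""
    score = issue.business_value / max(issue.effort, 1.0)
    return score * 2.0 if issue.regulatory else score

backlog = [
    DataIssue("Duplicate counterparty records", 250.0, 30.0, False),
    DataIssue("Missing identifiers on new accounts", 120.0, 10.0, True),
    DataIssue("Stale pricing feed mappings", 80.0, 40.0, False),
]

# Work the backlog highest score first, so closing an issue is
# demonstrably worth the time and effort spent on it.
for issue in sorted(backlog, key=priority_score, reverse=True):
    print(f"{priority_score(issue):6.1f}  {issue.name}")
```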

JP Morgan Chase addressed data management by reconfiguring its leadership, establishing a chief data officer (CDO) role and naming CDOs in different business lines and departments. “It’s about how we apply governance to unstructured datasets and the cloud, to make sure the foundational tools we build have the right datasets to begin with,” said Ippoliti.

Evaluating the precision of a data model can be done in many ways, but one should not discount sheer confidence in the data being used, stated Peter Serenita, group CDO, HSBC. “We can come up with a lot of different algorithms and metrics,” he said. “Some of them will mean something to someone, but ultimately, the answer is the firm’s confidence in the data it’s using. Call it a ‘confidence index’, which can indicate if the data previously did not seem to be fit-for-purpose and you didn’t want to use it, or if now, you have strong confidence in that data – that it has been checked and double-checked.”
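A confidence index of the kind Serenita describes could be computed in many ways. One minimal sketch, assuming a weighted average of common data-quality dimensions, is shown below; the dimensions, weights and threshold are illustrative, not HSBC's.

```python
# An illustrative confidence index: a weighted average of data-quality
# dimension scores, each expressed on a 0-1 scale.
WEIGHTS = {
    "completeness": 0.3,  # share of mandatory fields populated
    "accuracy": 0.3,      # share of records matching a trusted source
    "timeliness": 0.2,    # share of records updated within SLA
    "consistency": 0.2,   # share of records agreeing across systems
}

def confidence_index(scores: dict[str, float]) -> float:
    """Aggregate dimension scores into a single 0-1 confidence figure."""
    return sum(WEIGHTS[dim] * scores.get(dim, 0.0) for dim in WEIGHTS)

scores = {"completeness": 0.98, "accuracy": 0.92,
          "timeliness": 0.85, "consistency": 0.90}
index = confidence_index(scores)
# Treating, say, >= 0.9 as "fit for purpose" is a policy choice, not a rule.
print(f"confidence index: {index:.2f}")
```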

Trust is key to such confidence in data, particularly the precision of calculations, according to David Saul, senior vice president and chief scientist at State Street. “For anyone making a decision based on the data, their level of confidence in that decision is going to be about how well they trust it, do they know where it came from and what it means,” he said. “That’s true internally and for delivering data to our clients, that they trust we’ve done all the calculations. Also, regulators need to trust that what we deliver is an accurate representation of what is needed to reduce the risk in the market. You must have hard data to prove whether it’s the legacy of the data or it’s how you did the calculations, the algorithm you used and ultimately whether it means what you say it means.”

Data origin can be tracked successfully by creating data ontologies in transactional or operational systems, which in turn determine how the data should be used for risk management, Saul added.
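As a hedged sketch of what ontology-based origin tracking can look like, the example below records lineage as provenance triples using the open-source rdflib library and the W3C PROV vocabulary. The namespace and entity names are assumptions for illustration, not State Street's implementation.

```python
# Record, as ontology triples, that a risk dataset was derived from a
# transactional system, so consumers can query where the data came from.
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.com/data#")           # hypothetical namespace
PROV = Namespace("http://www.w3.org/ns/prov#")       # W3C provenance ontology

g = Graph()
g.bind("prov", PROV)

# The risk dataset and the operational system it originates from.
g.add((EX.CounterpartyExposures, RDF.type, PROV.Entity))
g.add((EX.TradeCaptureSystem, RDF.type, PROV.Agent))
g.add((EX.CounterpartyExposures, PROV.wasAttributedTo, EX.TradeCaptureSystem))
g.add((EX.CounterpartyExposures, PROV.wasDerivedFrom, EX.RawTradeFeed))

# A risk manager (or regulator) can now ask: where did this dataset come from?
for origin in g.objects(EX.CounterpartyExposures, PROV.wasDerivedFrom):
    print("derived from:", origin)
```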

Building the more complex accountability structures and prioritisation capabilities that Mavroudis and Ippoliti advocated depends on confidence in the quality and sourcing of data, as Serenita and Saul described. To derive value from data, accountability and priorities must be well defined and applied to trustworthy data.
