A-Team Insight Blogs

Making Data Valuable Requires Quality and Structure

Recognising the value in data, and deriving that value for the enterprise, is a capability that is complicated to build into data management models, according to experts who spoke at the Data Management Summit hosted by A-Team Group in New York earlier this month.

First, firms should establish an “accountability structure through a model that cascades from the data owner at the highest level, to the people on the line, the operations, the people who originate accounts, and the IT people who fix accounts with business system owners,” said Tom Mavroudis, head of enterprise data governance at MUFG Union Bank.

The data management model needs a prioritisation capability, or a mechanism to recognise value, according to Jennifer Ippoliti, firmwide data governance lead at JP Morgan Chase. “Knowing which problems to go after, in what order and what to expect from them, means when you close out that issue, you know it was worth the time and effort,” she said. “It’s specific to the technology and business partnership – and really clear about roles, responsibilities and division of labour. You cannot ever stop iterating that model and making it more precise.”
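
A prioritisation mechanism of the kind Ippoliti describes can be sketched as a simple scoring model that ranks data issues by expected value per unit of remediation effort. The fields, weights and figures below are illustrative assumptions, not JP Morgan Chase's actual model.

    from dataclasses import dataclass

    @dataclass
    class DataIssue:
        """A candidate data quality problem competing for remediation effort."""
        name: str
        estimated_value: float  # business value unlocked if fixed, e.g. in USD
        effort_days: float      # estimated person-days to remediate
        risk_reduction: float   # 0..1 relative reduction in downstream risk

    def priority_score(issue: DataIssue, risk_weight: float = 0.5) -> float:
        """Rank by value per unit of effort, boosted for risk reduction."""
        value_per_day = issue.estimated_value / max(issue.effort_days, 1.0)
        return value_per_day * (1.0 + risk_weight * issue.risk_reduction)

    backlog = [
        DataIssue("stale counterparty hierarchies", 250_000, 40, 0.6),
        DataIssue("duplicate account records", 90_000, 10, 0.2),
        DataIssue("missing legal entity identifiers", 120_000, 25, 0.8),
    ]

    # Work the highest-scoring issues first, so that closing out an issue
    # demonstrably justifies the time and effort spent on it.
    for issue in sorted(backlog, key=priority_score, reverse=True):
        print(f"{issue.name}: {priority_score(issue):,.0f}")

Making the weights explicit is what allows such a model to be iterated and made more precise over time, as Ippoliti suggests.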

JP Morgan Chase addressed data management by reconfiguring its leadership, establishing a chief data officer (CDO) role and naming CDOs in different business lines and departments. “It’s about how we apply governance to unstructured datasets and the cloud, to make sure the foundational tools we build have the right datasets to begin with,” said Ippoliti.

Evaluating the precision of a data model can be done in many ways, but one should not discount sheer confidence in the data being used, said Peter Serenita, group CDO at HSBC. “We can come up with a lot of different algorithms and metrics,” he said. “Some of them will mean something to someone, but ultimately, the answer is the firm’s confidence in the data it’s using. Call it a ‘confidence index’, which can indicate if the data previously did not seem to be fit-for-purpose and you didn’t want to use it, or if now, you have strong confidence in that data – that it has been checked and double-checked.”
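
Serenita’s ‘confidence index’ is a concept rather than a published formula, but one minimal way to sketch it is as a weighted roll-up of the quality checks a dataset has passed. The check names, weights and threshold below are assumptions for illustration, not HSBC’s actual metric.

    # Illustrative confidence index: a weighted roll-up of quality checks.
    # Check names and weights are assumptions, not HSBC's actual metric.
    CHECK_WEIGHTS = {
        "completeness": 0.25,            # all mandatory fields populated
        "accuracy": 0.30,                # verified against an authoritative source
        "timeliness": 0.20,              # refreshed within the agreed window
        "lineage_documented": 0.15,      # origin and transformations recorded
        "independently_reviewed": 0.10,  # checked and double-checked
    }

    def confidence_index(check_results: dict) -> float:
        """Return a 0..1 score; 1.0 means every weighted check passed."""
        return sum(weight for check, weight in CHECK_WEIGHTS.items()
                   if check_results.get(check, False))

    score = confidence_index({
        "completeness": True,
        "accuracy": True,
        "timeliness": False,
        "lineage_documented": True,
        "independently_reviewed": True,
    })
    print(f"confidence index: {score:.2f}")  # 0.80; the fit-for-purpose cut-off is a policy choice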

Trust is key to such confidence in data, particularly the precision of calculations, according to David Saul, senior vice president and chief scientist at State Street. “For anyone making a decision based on the data, their level of confidence in that decision is going to be about how well they trust it, do they know where it came from and what it means,” he said. “That’s true internally and for delivering data to our clients, that they trust we’ve done all the calculations. Also, regulators need to trust that what we deliver is an accurate representation of what is needed to reduce the risk in the market. You must have hard data to prove whether it’s the lineage of the data or it’s how you did the calculations, the algorithm you used and ultimately whether it means what you say it means.”

Saul added that the origin of data can be successfully tracked through the creation of data ontologies in transactional and operational systems, which determine how the data should be used for risk management.
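
A toy way to picture ontology-based lineage is as subject-predicate-object triples of the kind used in semantic ontologies, walked backwards from a reported figure to its originating system. The entities and predicate names below are invented for illustration.

    # Toy ontology-based lineage: facts recorded as (subject, predicate,
    # object) triples, walked backwards from a risk figure to its origin.
    # Entities and predicate names are invented for illustration.
    TRIPLES = [
        ("risk_report.exposure", "derivedFrom", "positions.notional"),
        ("positions.notional", "derivedFrom", "trade_capture.trade_amount"),
        ("trade_capture.trade_amount", "originatedIn", "front_office_booking_system"),
    ]

    def trace_origin(element: str) -> list:
        """Follow lineage links back until no further source is recorded."""
        path = [element]
        while True:
            source = next((o for s, _, o in TRIPLES if s == path[-1]), None)
            if source is None:
                return path
            path.append(source)

    print(" <- ".join(trace_origin("risk_report.exposure")))

Once such a graph exists across transactional and operational systems, determining whether a data element is fit for a given risk calculation reduces to a query over its recorded lineage.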

Building the accountability structures and prioritisation capabilities that Mavroudis and Ippoliti advocated depends on the confidence in data quality and sourcing that Serenita and Saul described. As the experts illustrated, deriving value from data requires accountability and priorities that are well defined and applied to trustworthy data.
