A-Team Insight Blogs

HSBC Group CDO Details the Requirements of Data Management Strategy and Practice

The chief data officer (CDO) needs to be a change agent with the aim of making data part of the business-as-usual process. Speaking at last week’s FIMA conference, Peter Serenita, group chief data officer at HSBC, described the role of the CDO and how best to develop data management strategy and practice.

On the role of the CDO, Serenita, who was appointed group CDO at HSBC earlier this year, said the requirement is for an executive who can create a data organisation, build a governance model that includes data management sponsors, develop a data management process, define a business data architecture and identify critical enterprise-wide data elements. The CDO must also be able to develop a data vision, policies and standards, and a measurement framework for data quality. Ultimately, however, the need is to communicate across the business and build data into the business culture.

Considering how best to develop data management strategy and practice for risk management, Serenita pointed to the Basel Committee on Banking Supervision (BCBS) regulation that sets down principles for effective risk data aggregation and risk reporting, commonly known as BCBS 239. He noted that it departs from typical regulations, which specify particular reporting requirements and in essence tell practitioners what to do; instead, it tells practitioners how to achieve compliance. By way of example, Serenita noted the data management disciplines described by the BCBS, including data governance, frequency, accuracy, completeness, timeliness and adaptability.
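
By way of illustration (this sketch is not from Serenita's presentation), several of those disciplines can be expressed as simple, automatable checks. A minimal Python example, assuming a pandas DataFrame of exposures with hypothetical column names such as counterparty_id and country_code:

```python
# Illustrative only: hypothetical column names and thresholds,
# not HSBC's or the BCBS's actual implementation.
import pandas as pd

def completeness(df: pd.DataFrame, required: list[str]) -> float:
    """Share of rows with no nulls in the required columns."""
    return float(df[required].notna().all(axis=1).mean())

def accuracy(df: pd.DataFrame, column: str, valid_values: set) -> float:
    """Share of values drawn from an agreed reference domain."""
    return float(df[column].isin(valid_values).mean())

def timeliness(df: pd.DataFrame, ts_col: str, max_age_days: int) -> float:
    """Share of records refreshed within the agreed window."""
    age = pd.Timestamp.now() - pd.to_datetime(df[ts_col])
    return float((age <= pd.Timedelta(days=max_age_days)).mean())

exposures = pd.DataFrame({
    "counterparty_id": ["C1", "C2", None],
    "country_code": ["GB", "UK", "FR"],  # "UK" is not an ISO 3166 code
    "last_updated": ["2024-05-01", "2024-05-02", "2023-01-01"],
})

print(completeness(exposures, ["counterparty_id", "country_code"]))
print(accuracy(exposures, "country_code", {"GB", "FR", "US"}))
print(timeliness(exposures, "last_updated", max_age_days=30))
```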

If these disciplines are employed as the basis for data management, the challenges that remain can include a lack of common definitions in reference data, such as country codes, which makes it difficult to roll up data for risk management; a lack of common sourcing, which requires an operational process to identify authoritative sources and define criteria for them; and a lack of a data ownership model.
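
The country-code problem is a useful concrete case. The sketch below, built around a hypothetical alias table, shows why roll-ups fail without common definitions: positions booked under legacy codes such as "UK" and ISO codes such as "GB" land in separate buckets unless they are normalised first.

```python
# Hypothetical alias table: in practice this mapping would come from
# an agreed authoritative reference-data source.
ISO_ALIASES = {
    "UK": "GB",   # legacy code for the United Kingdom
    "EN": "GB",
    "GER": "DE",
    "HOL": "NL",
}

def normalise_country(code: str) -> str:
    """Map a local system's country code to ISO 3166-1 alpha-2."""
    code = code.strip().upper()
    return ISO_ALIASES.get(code, code)

# Roll up exposure by country only after normalisation, so "UK" and
# "GB" positions aggregate into a single line.
positions = [("UK", 100.0), ("GB", 250.0), ("GER", 75.0)]
totals: dict[str, float] = {}
for code, exposure in positions:
    iso = normalise_country(code)
    totals[iso] = totals.get(iso, 0.0) + exposure
print(totals)  # {'GB': 350.0, 'DE': 75.0}
```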

Serenita also mentioned difficulties caused by a lack of common identifiers, and suggested the Legal Entity Identifier (LEI) may help in the case of entity data. Beyond simple identifiers such as the LEI, he noted the need for relationship data, which could require the application of big data techniques to uncover implied or predictive relationships.
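
On the identifier point, one useful property of the LEI is that it can be validated mechanically: it is a 20-character ISO 17442 code whose final two digits are ISO 7064 MOD 97-10 check digits. A minimal sketch of that checksum test follows (format and check digits only; it does not confirm the code is actually registered with a GLEIF-accredited issuer):

```python
import re

def is_valid_lei(lei: str) -> bool:
    """Check LEI format and ISO 7064 MOD 97-10 check digits.

    Verifies internal consistency only; confirming that an LEI is
    actually registered requires a lookup against the GLEIF database.
    """
    lei = lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # Map letters to numbers (A=10 ... Z=35), concatenate, take mod 97.
    digits = "".join(str(int(c, 36)) for c in lei)
    return int(digits) % 97 == 1

# Toy values chosen to exercise the checksum; they are not registered LEIs.
print(is_valid_lei("00000000000000000098"))  # True: 98 % 97 == 1
print(is_valid_lei("00000000000000000099"))  # False: check digits don't match
```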

As well as bringing together the practices outlined by the BCBS and the challenges of data management, Serenita discussed the need to define the scope of any data management programme, be it business or enterprise-wide, and to put controls around it. He also reiterated the need for a good governance model, a data vision that supports business strategy, data policies that are non-negotiable, a data management process that is aligned to the change management process, a well-defined roadmap, and the ability to deliver and optimise the roadmap while measuring adoption of data management.

He acknowledged that making the business case for data management programmes can be difficult, but said it becomes easier over time as results are measured and business value is proven, perhaps by associating data quality with risk-weighted assets.
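
One way to make that association concrete (a toy illustration, not a method described in the presentation): where counterparty data is missing or unverified, banks typically have to apply conservative risk weights, which inflates risk-weighted assets and the capital held against them. The hypothetical numbers below show the capital that better data could free up:

```python
# Toy illustration of linking data quality to risk-weighted assets (RWA).
# Risk weights and exposure are hypothetical, not Basel rules applied to
# any real portfolio.
CAPITAL_RATIO = 0.08  # Basel minimum capital as a share of RWA

def capital_required(exposure: float, risk_weight: float) -> float:
    return exposure * risk_weight * CAPITAL_RATIO

exposure = 100_000_000  # a single 100m exposure
well_documented = capital_required(exposure, risk_weight=0.20)
data_missing = capital_required(exposure, risk_weight=1.00)

print(f"capital with clean counterparty data:  {well_documented:,.0f}")  # 1,600,000
print(f"capital with missing/unverified data:  {data_missing:,.0f}")     # 8,000,000
print(f"capital freed by fixing the data:      {data_missing - well_documented:,.0f}")
```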

Concluding his presentation, Serenita noted that data management is not for the faint-hearted, but that it can deliver significant benefits to the business when technology is seen as an enabler rather than a starting point and an emphasis is placed on measuring results.
