About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

HSBC Group CDO Details the Requirements of Data Management Strategy and Practice

The chief data officer (CDO) needs to be a change agent with the aim of making data part of the business-as-usual process. Speaking at last week’s FIMA conference, Peter Serenita, group chief data officer at HSBC, described the role of the CDO and how best to develop data management strategy and practice.

On the role of the CDO, Serenita, who was appointed group CDO at HSBC earlier this year, said the requirement is for an executive who can create a data organisation, build a governance model that includes data management sponsors, develop a data management process, define a business data architecture and identify critical enterprise-wide data elements. The CDO must also be able to develop a data vision, policies and standards, and a measurement framework for data quality. Ultimately, however, the need is to communicate across the business and build data into the business culture.

Considering how best to develop data management strategy and practice for risk management, Serenita pointed to Basel Committee on Banking Supervision (BCBS) regulation that sets down practices for effective risk data aggregation and management. He noted that it departs from typical regulations, which specify particular reporting requirements and in essence tell practitioners what to do; instead, it tells practitioners how to achieve compliance. By way of example, he cited the data management disciplines described by BCBS, including data governance, frequency, accuracy, completeness, timeliness and adaptability.

If these disciplines are employed as the basis for data management, the challenges that remain can include: a lack of common definitions in reference data, such as country codes, which makes it difficult to roll up data for risk management; a lack of common sourcing, which requires an operational process to identify authoritative sources and define criteria for them; and a lack of a data ownership model.
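The country-code point can be illustrated with a minimal, hypothetical sketch (the system codes, mapping table and figures below are illustrative, not from the presentation): before exposures from different source systems can be rolled up for risk management, each system's local country code must first be mapped to a single canonical code.

```python
# Illustrative sketch: rolling up risk exposures when source systems
# use inconsistent country codes. Mapping and data are hypothetical.

# Map each system's local code to a canonical ISO 3166-1 alpha-2 code.
CANONICAL = {
    "UK": "GB",    # one system uses "UK" for the United Kingdom
    "GBR": "GB",   # another uses ISO alpha-3 codes
    "GB": "GB",
    "USA": "US",
    "US": "US",
}

def roll_up(exposures):
    """Aggregate (country_code, amount) pairs by canonical country."""
    totals = {}
    for code, amount in exposures:
        canonical = CANONICAL.get(code.upper())
        if canonical is None:
            # An unmapped code is exactly the data quality gap described:
            # without a common definition, the exposure cannot be rolled up.
            raise ValueError(f"No canonical mapping for code {code!r}")
        totals[canonical] = totals.get(canonical, 0) + amount
    return totals

exposures = [("UK", 100), ("GBR", 50), ("US", 75), ("USA", 25)]
print(roll_up(exposures))  # {'GB': 150, 'US': 100}
```

In practice the mapping table itself is the governed reference data asset: it needs an authoritative source and an owner, which is precisely the sourcing and ownership challenge noted above.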

Serenita also mentioned difficulties caused by a lack of common identifiers, and suggested the Legal Entity Identifier (LEI) may help in the case of entity data. Beyond simple identifiers such as the LEI, Serenita noted the need for implied relationship data that could require the application of big data techniques to understand implied or predictive relationships.

As well as bringing together the practices outlined by the BCBS and the challenges of data management, Serenita discussed the need to define the scope of any data management programme, be it business or enterprise wide, and put controls around this. He also reiterated the need for a good governance model, a data vision that supports business strategy, data policies that are non-negotiable, a data management process that is aligned to the change management process, a well-defined roadmap, and the ability to deliver and optimise the roadmap while measuring adoption of data management.

He acknowledged that making the business case for data management programmes can be difficult, but said this can be eased going forward by measuring results and proving business value, perhaps associating data quality with risk-weighted assets.

Concluding his presentation, Serenita noted that data management is not for the faint-hearted, but said it can deliver significant benefits to the business when technology is seen as an enabler rather than a starting point and emphasis is placed on measuring results.
