The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

JWG-IT’s Di Giammarino Elaborates on Five Elements of Successful Data Strategy


In a follow-up to his comments at last week’s FIMA 2009 conference, PJ Di Giammarino, CEO of think tank JWG-IT, speaks to Reference Data Review about his perspectives on where the data management community should be focusing its attention.

“Financial institutions and the data professionals working in them do not live in a black and white world. Trade-offs need to be made on a daily basis between remediating audit points, prioritising mountains of paperwork for scarce resources and updating policies and procedures to reflect new business requirements. The cost of getting reference data 100% right is prohibitive. However, there are increasingly steep penalties for getting it wrong – both explicit (audit points, fines, trade breaks) and implicit (missed sales opportunities, cost inefficiencies),” says Di Giammarino. Many FIMA 2009 speakers highlighted the need to engage top business stakeholders in conversations about what changes are required to meet today’s quality standards, he says.

Many of these challenges are not new, but the relative importance of overcoming them is. “We noted, however, that too often the discussion tended towards the general and high level, rather than the specific and practical,” he continues.

“In our work with regulators, trade bodies, banks and members of the supply chain, we have found that there are five elements of successful reference data strategy conversation:

1. Business requirements. Productive discussions about data are grounded in solid definitions of the problems that need to be solved with good data. A thorough understanding of the business’ rules, current and future regulations and the information supply chain is required to define what data is needed, by whom, how and when. The holistic nature of the business and regulatory demands has made these conversations both more strategic and difficult to own.

2. Process. Meaningful reference data conversations need to encompass ‘what does good (wholesale) data (xyz) maintenance look like?’ A dynamic cast of actors who create, update, modify and purge the data needs to be mapped and the key decision points in the information flow defined. Only then can a group of firms discuss quality and cost trade-offs.

3. Data. Too often, general and vague terms are used to describe information that is used in multiple ways by many actors. Understandably, each firm has its own policies and procedures across the sales, middle-office, operations, compliance, credit and audit departments. Often, what one perceives as a single ‘thing’ will vary with legal and tax regimes, languages and business practices. Alignment of data models requires a commonly shared definition of the data required at a given point in a process and in the context of the decisions for which it is required.

4. Operating models. More and more, firms are realising the potential for shared services within their company, through offshoring and the use of third parties. Conversations about utilities are now entering the mainstream. To align understanding, reference and target operating models need to be understood in the context of points 1-3 above. Managed correctly, the strengths and weaknesses of the different operating models can be quickly evidenced.

5. Business case. Banking boardrooms are not accustomed to discussions of altruism. To enjoy positive boardroom experiences, data conversations need to be contextualised in terms known to the audience. Ideally, conversations should be grounded in the currency of the bank and denominated on a per-transaction basis (for example, cost per account per counterparty).

For an example of an industry level discussion on data maintenance, see the case study of our Customer Data Management Group. The same approach has been applied to liquidity risk and is currently being scoped for macro prudential oversight. We encourage other trade bodies, firms and members to think through how they can best apply the learning from our last four years of research. Regulators have raised the bar and there are tens of thousands of actors in our industry that need to respond quickly.”
