The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

JWG-IT’s Di Giammarino Elaborates on Five Elements of Successful Data Strategy

In a follow-up to his comments at last week's FIMA 2009 conference, PJ Di Giammarino, CEO of think tank JWG-IT, speaks to Reference Data Review about his perspective on where the data management community should be focusing its attention.

“Financial institutions and the data professionals working in them do not live in a black and white world. Trade-offs need to be made on a daily basis between remediating audit points, prioritising mountains of paperwork for scarce resources and updating policies and procedures to reflect new business requirements. The cost of getting reference data 100% right is prohibitive. However, there are increasingly steep penalties for getting it wrong – both explicit (audit points, fines, trade breaks) and implicit (missed sales opportunities, cost inefficiencies),” says Di Giammarino. Many FIMA 2009 speakers highlighted the need to engage top business stakeholders in conversations about what changes are required to meet today’s quality standards, he says.

Many of these challenges are not new, but the relative importance of overcoming them is. “We noted, however, that too often the discussion tended towards the general and high level, rather than the specific and practical,” he continues.

“In our work with regulators, trade bodies, banks and members of the supply chain, we have found that there are five elements of successful reference data strategy conversation:

1. Business requirements. Productive discussions about data are grounded in solid definitions of the problems that need to be solved with good data. A thorough understanding of the business’ rules, current and future regulations and the information supply chain is required to define what data is needed, by whom, how and when. The holistic nature of the business and regulatory demands has made these conversations both more strategic and difficult to own.

2. Process. Meaningful reference data conversations need to encompass ‘what does good (wholesale) data (xyz) maintenance look like?’ A dynamic cast of actors who create, update, modify and purge the data needs to be mapped and the key decision points in the information flow defined. Only then can a group of firms discuss quality and cost trade-offs.

3. Data. Too often, general and vague terms are used to describe information that is used in multiple ways by many actors. Understandably, each firm has its own policies and procedures across the sales, middle-office, operations, compliance, credit and audit departments. Often, what one perceives as a single ‘thing’ will vary with legal and tax regimes, languages and business practices. Alignment of data models requires a commonly shared definition of the data required at a given point in a process and in the context of the decisions for which it is required.

4. Operating models. More and more, firms are realising the potential for shared services within their company, through offshoring and the use of third parties. Conversations about utilities are now entering the mainstream. To align understanding, reference and target operating models need to be understood in the context of points 1-3 above. Managed correctly, the strengths and weaknesses of the different operating models can be quickly evidenced.

5. Business case. Banking boardrooms are not accustomed to discussions of altruism. To enjoy positive boardroom experiences, data conversations need to be contextualised in terms known to the audience. Ideally, conversations need to be grounded in the currency of the bank and denominated on a per transaction basis (for example, cost per account per counterparty).

For an example of an industry level discussion on data maintenance, see the case study of our Customer Data Management Group. The same approach has been applied to liquidity risk and is currently being scoped for macro prudential oversight. We encourage other trade bodies, firms and members to think through how they can best apply the learning from our last four years of research. Regulators have raised the bar and there are tens of thousands of actors in our industry that need to respond quickly.”
