The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

How can you manage what you can’t measure?


John Bottega, chief data officer at Citigroup Corporate and Investment Bank, got FIMA 2006 in London off to a lively start earlier this month with a rousing speech congratulating the assembled reference data managers on how far they’ve come. Indeed, the invention of chief data officers is in itself a testament to the fact that the business focus of reference data management projects is better understood, and that these firms appreciate the major driver for data improvement is risk as opposed to merely cost. “It’s no longer seen as a ‘data geek’ problem,” he said. “Data management has experienced a paradigm shift. It’s not just about technology, it’s about the content: how it’s sourced and maintained. More and more, the maintenance of data is moving out of technology and into the business. It’s not just about cost – it’s about risk, including reputational risk.”

He also outlined another important shift, though – away from the “monolithic”, big bang approach, towards an approach based around “small, incremental deliverables”. “I tell our folks, the 2007 objective is to get the 2008 budget approved,” he said. Although it is clear major reference data initiatives will be multi-year projects, it is vital to focus on what is important to the business and to bring benefits to the business in an “immediate timeframe”, he added.

Bottega’s message that firms must find a balance between the strategic and the tactical was one delivered again and again by speakers during the event – and was brought home especially forcefully by Valerie Malaval of AXA Investment Managers in her presentation later the same day. She described how at every stage of AXA IM’s multi-year data project she is required to calculate and then demonstrate the ROI on its investments in data and services – and shared her policy of delivering “quick wins” to the business on an ongoing basis while working towards the ultimate goal.

Malaval also highlighted the importance of having metrics in place to help prove the benefits reference data improvement projects are bringing. The need for metrics was another consistent theme of the event – no doubt music to the ears of the much-in-evidence Mike Atkin, since establishing metrics is one raison d’être of the EDM Council, of which he is head. As Bottega said, though, “metrics are elusive”.

While it’s certainly true you can’t manage what you can’t measure, there is also a need to ensure you’re measuring the right thing, as a cautionary tale from Peter Serenita, chief data officer at JPMorgan Worldwide Securities Services, ably demonstrated. JPMorgan has set up two reference data operational hubs, one in Delaware and one in Mumbai, and in the early stages of their operation one of the metrics the bank applied was around volume. The only problem was that one hub interpreted a request from the business as one piece of work, while the other interpreted a request affecting two systems as two pieces of work. Until this discrepancy was understood, it seemed, quite erroneously, that one centre was handling double the volume of the other.

The reference data industry has undoubtedly matured – thanks in no small measure (as Bottega said) to conferences and newsletters dedicated to covering it – and among the many other challenges it still faces, as outlined by the great and the good at FIMA, is that of developing effective benchmarks by which to measure its increasing successes.
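The JPMorgan anecdote illustrates a general pitfall: a volume metric is only comparable if every reporting site counts the same unit of work. A minimal sketch of the effect (purely illustrative – the request data and counting rules here are invented, not JPMorgan’s actual systems) might look like this:

```python
# Illustrative only: two hubs report "volume" for the same workload,
# but define a unit of work differently.

# Each business request may touch one or more downstream systems.
requests = [
    {"id": 1, "systems": ["A"]},
    {"id": 2, "systems": ["A", "B"]},
    {"id": 3, "systems": ["B", "C"]},
]

# Hub 1 counts one unit of work per business request.
hub1_volume = len(requests)

# Hub 2 counts one unit per affected system, inflating its apparent volume.
hub2_volume = sum(len(r["systems"]) for r in requests)

print(hub1_volume, hub2_volume)  # 3 vs 5 for an identical workload
```

Until the counting convention is reconciled, the second hub appears markedly busier than the first even though both handled exactly the same requests – which is why agreeing the unit of measurement comes before comparing the numbers.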
