
Reference Data Getting its Day in the Sun, Marcus Evans Delegates Told

Reference data management is no longer seen by senior management as a function at the fringes of an organisation in terms of its strategic importance, Sean Taylor, director at Deutsche Wealth Management, told delegates at Marcus Evans’ recent Reference Data conference on 12 February.

Taylor, who was chairing the event, indicated that increased regulatory scrutiny over the next 12 months would raise data management’s importance to firms even further.

“Data needs to be of a standard that doesn’t allow deep holes to be dug,” explained Taylor. “There will be winds of change throughout the financial markets over the next 12 months as the regulators begin turning over stones to find out what happened, why and how.”

He predicted that the US would see a regulatory crackdown on financial data by the end of the first quarter of 2009 and that there would be a real requirement for golden copy as a result.

“It will no longer be acceptable to have silos where the right hand does not know what the left is doing,” he added. “But data management will still have to fight to keep its share of the IT budget.”

Taylor warned that there will be “fighting at the money pit” for data management project funding this year and suggested that delegates learn from each other’s experiences to help them in their quest.

This perception of the increasing importance of data management to senior management was echoed throughout the conference. Claus Thorball, head of global market data at Saxo Bank, discussed his practical experience of achieving what he called “hands on” data quality and suggested that as long as the business case is sound, management will listen.

Thorball said that although the financial crisis has placed emphasis on getting data right, it has also put a lot of pressure on institutions to drastically cut overheads. “There is not a lot of fat on budgets any more and one of the main KPIs for projects seems to be to keep it on budget,” he said. “However, the crisis has also afforded us a unique opportunity to be able to review our data providers and re-evaluate and renegotiate the services they provide.”

He warned delegates not to try to “sell” the benefits of improving data quality to their senior management; “money talks,” he recommended instead. In order to get backing in such a climate, project teams must explain the potential savings and potential earnings offered by the improvement in data quality. “You need to decide on the baseline for the project before you begin it and document the creation of value at every step of the process. In order to be successful, you need the backing of the business and to engage the stakeholders in the process,” said Thorball.

The measurements to highlight fall into three key areas, according to Thorball: cost reduction, revenue growth and risk reduction. He also recommended providing clarity around the governance of the project with regard to its individual stages, including implementation and maintenance.

To this end, Saxo Bank undertook a review of its market data providers and decided to look at “hands on quality assessment tools” as the basis for renegotiation. The bank had been using three main vendors for its exchange pricing data and, following a review of their provision, decided to designate one as its primary provider, another as a secondary provider and the last as a backup.

“With the conscious choice of a primary provider, we got a better level of service from them in terms of quality, service level agreements (SLAs) and a significant level of cost reduction. The secondary provider is also improving its service because it wants to become our primary provider,” he explained.
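To illustrate the primary/secondary/backup arrangement Thorball described, the following is a minimal sketch in Python. It assumes hypothetical provider objects with a get_price() method and is not a representation of Saxo Bank’s actual systems; it simply shows how a pricing request can fall back down a provider hierarchy.

```python
# Illustrative sketch only: hypothetical providers, not Saxo Bank's systems.
# Each request tries the primary provider first and falls back down the chain.

class PriceUnavailable(Exception):
    """Raised by a provider when it cannot supply a price."""


def fetch_price(instrument_id, providers):
    """Try each pricing provider in priority order and return the first
    successful quote, together with the name of the provider that supplied it."""
    errors = []
    for provider in providers:  # e.g. [primary, secondary, backup]
        try:
            return provider.name, provider.get_price(instrument_id)
        except PriceUnavailable as exc:
            errors.append(f"{provider.name}: {exc}")
    raise PriceUnavailable(
        f"No provider could price {instrument_id}: {'; '.join(errors)}"
    )
```

In such a setup the secondary and backup providers are only consulted when the primary fails, which is consistent with the service and cost dynamics Thorball went on to describe.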

Deutsche’s Taylor agreed that there has been a “flight to quality” in the industry’s adoption of a qualitative approach to data. He recommended that delegates apply the lessons learnt by Saxo in its renegotiation of vendor contracts more generally, in order to get more out of their vendors for less.

Maurizio Garro, head of the group pricing division of UniCredit Group, explained his bank’s experience of normalising group-wide pricing practices. He indicated that Basel II and International Accounting Standards (IAS) have had a significant impact on the pricing requirements of financial institutions. “IAS has meant we need sophisticated estimations and a level of transparency around these quickly,” he said.

UniCredit faced a significant challenge in dealing with the many inputs from its various pricing sources and took the decision to create a central library for this data. “The Bulldozer system was the end result of four years of work and it provides all end users within the bank with the opportunity to simulate prices,” Garro explained. “The main challenge was to deliver all these outputs in the time required.”

Timeliness has become a key factor in data management as the once distinct lines between market data and static reference data become blurred. This was another recurring theme throughout the presentations, as speakers described the new pressures on their data management departments. Being able to pinpoint data inaccuracies and resolve them in real time is becoming a priority in such risk-averse times, they indicated.

Chris Johnson, head of data management for Institutional Fund Services Europe at HSBC Securities Services, gave the third party administrator viewpoint on the data management challenge. He agreed that since the fall of Lehman last year, pricing has become a key challenge for the industry. “Pricing has become my life,” he joked.

However, rather than focus on the issues surrounding pricing alone, Johnson discussed the challenges of the practical implementation of straight-through processing (STP) within the sphere of reference data. He described the complexities surrounding the various segments of the trade lifecycle and highlighted the potential cost and risk involved in data bottlenecks.

He identified the execution cycle as the riskiest area with regard to data inaccuracies: it requires an “extensive amount of data” and when things go wrong it becomes “very expensive”. Issues such as incorrect instrument data, identification of the wrong settlement location, incorrectly structured funds, inaccuracies in FX data and incorrect calendars are just a few of the horrors awaiting trade data at this point, he explained.
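To give a flavour of the kind of checks that catch such problems before they become expensive, the following is a minimal Python sketch. The field names and lookup tables are hypothetical and the example is illustrative only; it is not drawn from HSBC’s processes.

```python
# Illustrative sketch only: hypothetical trade fields and reference data tables.

def validate_trade(trade, instrument_master, settlement_locations,
                   fx_rates, holiday_calendar):
    """Return a list of reference data problems found on a single trade record."""
    issues = []
    # Incorrect instrument data: identifier must exist in the securities master.
    if trade["isin"] not in instrument_master:
        issues.append("unknown instrument identifier")
    # Wrong settlement location: must be a recognised settlement venue.
    if trade["settlement_location"] not in settlement_locations:
        issues.append("unrecognised settlement location")
    # Inaccurate FX data: a rate must exist for cross-currency settlement.
    pair = (trade["trade_ccy"], trade["settle_ccy"])
    if pair[0] != pair[1] and pair not in fx_rates:
        issues.append(f"no FX rate for {pair[0]}/{pair[1]}")
    # Incorrect calendars: settlement date must be a business day.
    settle_date = trade["settlement_date"]  # assumed to be a datetime.date
    if settle_date.weekday() >= 5 or settle_date in holiday_calendar:
        issues.append("settlement date falls on a non-business day")
    return issues
```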

Johnson’s main recommendation to delegates was to explain to their downstream users the perils of changing the data. “The perception within firms is that it is easier to change the data than the processes, but if you do this, it will result in a very twisted securities master file. Tinker with the golden copy at your peril,” he warned.

Overall, speakers were positive about the future of data management in the current environment and had practical advice for those just now beginning to dip a toe into the data management project pool. The main advice centred around improving communication with both senior management and downstream users about the real, tangible benefits of these projects right from the start.
