About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Ref Data Needs to go “Back to Basics” to Consider End User Requirements, Says Deutsche Bank’s Sean Taylor

Last year, Deutsche Bank’s director of financial intermediaries Sean Taylor predicted that the “winds of change” were due to blow through the reference data industry and this year’s FIMA saw him elaborate on what has changed thus far as a result of a regulatory shakeup and the profound shift in the way firms view risk management. Taylor noted the increased importance being placed on the data management function that has resulted from his predicted “crackdown” on data, but warned that data managers need to go “back to basics” in order to make sure end user requirements are being met and the “brand” does not suffer.

Data management practices can be key to survival in the current market, but only if they are carried out with the requirements of the business in mind, warned Taylor. “The data must feature useful characteristics for the end user and data items need to be relevant and on time,” he said. “The establishment of rigorous data processes is the key to positive business growth, as long as the data is clean, clear and compatible.”

Taylor indicated that the cost of getting data right is much more accepted across the industry as a result of the desire to restore trust in the financial system by providing increased transparency. He referred to the building of “brand data” that firms can stand behind and use as a strategic resource in order to improve their businesses. Rather than being considered solely as an item in the minus column of a firm’s P&L, reference data can actually help to restore a firm’s brand in a market where transparency is increasingly important (for how not to do it, see the regulatory fines imposed on a whole range of firms this year for their data failures).

The process required to turn this data into a strategic resource, however, is far from simple, conceded Taylor. He likened the process of getting departments to agree on the data quality basics to herding cats and highlighted the need to deal with multiple sources of data and the inheritance of numerous legacy systems. But he noted that time is of the essence, as the “window is shrinking rapidly” on the opportunity to partner with the business and drive through change. After all, bankers may soon forget the post-crisis chaos and the importance of data quality along with it.

“It takes two to tango, so engage with your business partners and get them on board as soon as you can,” he told delegates. “Don’t waste time trying to make bananas look like apples, work with the data formats that you have from your multiple data silos. Work with the apples, oranges and bananas, but make sure they are cross referenced and make sense to the end user.”
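Taylor’s “apples, oranges and bananas” point can be sketched in code: rather than converting every silo into one format, keep each silo’s native records and maintain a cross-reference keyed on a common identifier. The example below is a minimal, hypothetical illustration (all silo names, fields and identifiers are invented, not Deutsche Bank’s actual systems), using ISIN as the shared key.

```python
# Illustrative sketch: instead of "making bananas look like apples",
# leave each silo's records in their native shape and build a
# cross-reference table that links them via a common key (here, ISIN).
# All names, fields and records are hypothetical.

equities_silo = [  # keyed by ISIN
    {"isin": "US0378331005", "desc": "APPLE INC ORD"},
]
fixed_income_silo = [  # keyed by an internal code, but carrying the ISIN
    {"internal_id": "FI-0042", "isin": "US0378331005", "desc": "AAPL 3.85% 2043"},
]

# Cross-reference: common key -> the untouched records from each silo.
xref = {}
for rec in equities_silo:
    xref.setdefault(rec["isin"], {}).setdefault("equities", []).append(rec)
for rec in fixed_income_silo:
    xref.setdefault(rec["isin"], {}).setdefault("fixed_income", []).append(rec)

# An end user can now pull everything linked to one security without
# any silo having changed its data format.
print(sorted(xref["US0378331005"].keys()))  # ['equities', 'fixed_income']
```

The design choice mirrors Taylor’s advice: the silos stay as they are, and the cross-reference layer is what makes the combined view “make sense to the end user”.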

Taylor indicated that his part of Deutsche Bank is “blessed” with a CEO who appreciates the problems that bad data causes, but noted that careful elaboration of the impact of data management can get business users engaged. He pointed to risk management requirements as an obvious area of relevance when it comes to getting senior-level buy-in for a data management project.
