

Getting Riled up Over Data


As you know, in the run-up to Swift’s Sibos conference next month, Reference Data Review has been endeavouring to find out what readers think of the European Central Bank’s proposed reference data utility. And it’s reassuring to know that rather than existing in a vacuum, data managers are ready and waiting to provide feedback.

As well as voicing concerns over the introduction of a potentially bureaucratic approach to a business challenge, one of our readers was inspired to ask some serious questions of the ECB. “Having heard Francis Gross speak at the Xtrakter conference earlier this year, my understanding is that the ECB initiative is based on automobile industry best practice, where quality is instilled at the earliest point possible. Hence the logic to create a new utility data creation source that ensures consistent standards for all securities.

“However, given the wide range of instruments and vendor value added fields ‘baked in’ to the business process, wouldn’t it perhaps be more feasible to use the ECB’s clout to define and enforce industry standards for core data attributes that must be supported by all sources and vendors? In that way the industry could gravitate towards standards over time as part of existing change activity,” said the reader, who wished to remain anonymous.

These comments are indicative of the concern in the market that the ECB will be adding some level of confusion and duplication to what vendors already provide in the reference data space. PJ Di Giammarino, CEO of think tank JWG-IT, reckons the body that takes on the endeavour will have its work cut out for it. “Whoever takes the leadership in this area had better have the skin of a rhinoceros, the budget of King Midas and Yoda’s ability to manipulate the Force,” he told Reference Data Review earlier this month. The publication of these comments on our website, in turn, prompted a call from Per Nymand-Andersen, head of section in the ECB’s statistics division and a colleague of Gross’, who sought to clarify some points that he felt had been misunderstood about the proposals.

Nymand-Andersen explained that the proposals are still on the drawing board and, in fact, the ECB is as yet unsure itself about how far it should extend its ambitions and is looking to the industry to provide feedback on this subject. Gross has also recently confirmed that the utility will adopt a gradual approach to standardisation rather than bite off more than it can chew to begin with. Meanwhile, as the industry discussions continue around the ECB’s proposals, across the pond another similar campaign is attempting to drum up support for its own data standardisation initiative.

The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data. The petition has 44 signatures so far, and most individuals on the list seem rather reluctant to cite their job titles and institution names.

Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and a lot of the campaign work is still to come. The US initiative goes one step further than the ECB’s ambitions because the NIF also seeks to provide the market with the analytical capabilities to deal with the data it supplies. But if some corners of the market are as yet unsure about the introduction of one utility in the reference data space, surely two are likely to provoke even more of a backlash?

