

Getting Riled up Over Data


As you know, in the run-up to Swift’s Sibos conference next month, Reference Data Review has been endeavouring to find out what readers think of the European Central Bank’s proposed reference data utility. And it’s reassuring to know that, rather than existing in a vacuum, data managers are ready and waiting to provide feedback.

As well as voicing concerns over the introduction of a potentially bureaucratic approach to a business challenge, one of our readers was inspired to ask some serious questions of the ECB. “Having heard Francis Gross speak at the Xtrakter conference earlier this year, my understanding is that the ECB initiative is based on automobile industry best practice, where quality is instilled at the earliest point possible. Hence the logic of creating a new utility as a data creation source that ensures consistent standards for all securities.

However, given the wide range of instruments and vendor value-added fields ‘baked in’ to the business process, wouldn’t it perhaps be more feasible to use the ECB’s clout to define and enforce industry standards for core data attributes that must be supported by all sources and vendors? In that way, the industry could gravitate towards standards over time as part of existing change activity,” said the reader, who wished to remain anonymous.

These comments are indicative of the concern in the market that the ECB will be adding some level of confusion and duplication to what vendors already provide in the reference data space. PJ Di Giammarino, CEO of think tank JWG-IT, reckons the body that takes on the endeavour will have its work cut out for it. “Whoever takes the leadership in this area had better have the skin of a rhinoceros, the budget of King Midas and Yoda’s ability to manipulate the Force,” he told Reference Data Review earlier this month. The publication of these comments on our website, in turn, prompted a call from Per Nymand-Andersen, head of section in the ECB’s statistics division and a colleague of Gross, who sought to clarify some points that he felt may have been misunderstood about the proposals.

Nymand-Andersen explained that the proposals are still on the drawing board and that, in fact, the ECB is as yet unsure how far it should extend its ambitions and is looking to the industry for feedback on this point. Gross has also recently confirmed that the utility will adopt a gradual approach to standardisation rather than bite off more than it can chew at the outset. Meanwhile, as industry discussions continue around the ECB’s proposals, across the pond a similar campaign is attempting to drum up support for its own data standardisation initiative.

The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data. The petition has 44 signatures so far, and most individuals on the list seem rather reluctant to cite their job titles and institution names.

Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and from the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and a lot of the campaign work is still to come. The US initiative goes one step further than the ECB’s ambitions because the NIF is also seeking to provide the market with the analytical capabilities to deal with the data it would hold. But if some corners of the market are as yet unsure about the introduction of one utility in the reference data space, surely two are likely to provoke even more of a backlash?

