As you know, in the run-up to Swift’s Sibos conference next month, Reference Data Review has been endeavouring to find out what readers think of the European Central Bank’s proposed reference data utility. And it is reassuring to find that the debate is not happening in a vacuum: data managers are ready and waiting to provide feedback.
As well as voicing concerns over the introduction of a potentially bureaucratic approach to a business challenge, one of our readers was inspired to ask some serious questions of the ECB. “Having heard Francis Gross speak at the Xtrakter conference earlier this year my understanding is that the ECB initiative is based on automobile industry best practice where quality is instilled at the earliest point possible. Hence the logic to create a new utility data creation source that ensures consistent standards for all securities.
However, given the wide range of instruments and vendor value added fields ‘baked in’ to the business process, wouldn’t it perhaps be more feasible to use ECB’s clout to define and enforce industry standards for core data attributes that must be supported by all sources and vendors? In that way the industry could gravitate towards standards over time as part of existing change activity,” said the reader, who wished to remain anonymous.
These comments are indicative of the concern in the market that the ECB will be adding some level of confusion and duplication to what vendors already provide in the reference data space. PJ Di Giammarino, CEO of think tank JWG-IT, reckons the body that takes on the endeavour will have its work cut out for it. “Whoever takes the leadership in this area had better have the skin of a rhinoceros, the budget of King Midas and Yoda’s ability to manipulate the Force,” he told Reference Data Review earlier this month. The publication of these comments on our website, in turn, prompted a call from Per Nymand-Andersen, head of section in the ECB’s statistics division and a colleague of Gross, who sought to clarify some points that he felt may have been misunderstood about the proposals.
Nymand-Andersen explained that the proposals are still on the drawing board and that, in fact, the ECB is as yet unsure how far it should extend its ambitions; it is looking to the industry to provide feedback on this subject. Gross has also recently confirmed that the utility will adopt a gradual approach to standardisation rather than bite off more than it can chew to begin with. Meanwhile, as the industry discussions continue around the ECB’s proposals, across the pond another similar campaign is attempting to drum up support for its own data standardisation initiative.
The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data. The petition has 44 signatures so far, and most individuals on the list seem rather reluctant to cite their job titles and institution names.
Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and a lot of the campaign work is still to come. The US initiative goes one step further than the ECB’s ambitions because the NIF is also seeking to provide the market with the analytical capabilities needed to make use of the data it would maintain. But if some corners of the market are as yet unsure about the introduction of one utility in the reference data space, surely two are likely to provoke even more of a backlash?