If nothing else, the reference data utility panel at last week’s Xtrakter conference again proved how difficult it is to agree on the best way to proceed with data standardisation, even though the conversation has been ongoing for several years. Francis Gross, head of the external statistics division at the European Central Bank (ECB), reiterated the points he has made many times before in favour of introducing a “thin” reference data utility, while industry participants voiced their concerns about the idea of regulatory-led change in the area of data standardisation. Plus ça change…
Of course, not everyone was sceptical of the idea. David Berry, executive in charge of market data sourcing and strategy at UBS and a member of the Information Providers User Group (IPUG), noted that a utility might help to force vendors to bring down their costs, especially those embedded in the data acquisition process. In his role at IPUG, Berry has been a key lobbyist for ending vendor pricing practices based on charging for end user licences, and he is seemingly in favour of any kind of leverage that might help in this process.
Berry elaborated on the challenges his own institution faces in dealing with the complexities of the current lack of standardisation in the market, which means the firm has to buy in 16 different vendor feeds. He noted, for example, that vendor messages contain 58 fields but only 47 of these are usually populated: “Surely there is a simpler way to do all of this?”
UBS has recently been engaged in a data quality assessment exercise to determine the total cost of ownership of its data and to remove duplication and redundancy in order to bring those costs down. Berry therefore pointed to the need for a ranking or filtering system to support such exercises and to identify the highest data accuracy at the lowest price point. He contended that a data utility might be useful in establishing a jumping-off point for this process. “We need a regulator to force standards through,” he added.
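As a purely illustrative aside (not something Berry described in any detail), the kind of ranking he alludes to can be sketched as scoring each vendor feed on accuracy and field coverage relative to its cost. The sketch below is hypothetical: the feed names, costs and accuracy figures are invented, and only the 58-fields/47-populated ratio echoes the numbers quoted above.

```python
# Hypothetical sketch only: ranking vendor feeds by accuracy and field coverage per unit cost.
# All names and figures below are invented for illustration.
from dataclasses import dataclass


@dataclass
class VendorFeed:
    name: str
    annual_cost: float      # cost of ownership attributed to this feed
    fields_supplied: int    # fields present in the feed's messages
    fields_populated: int   # fields that actually carry data
    accuracy: float         # share of sampled records matching a golden source (0-1)

    def coverage(self) -> float:
        """Fraction of supplied fields that are actually populated."""
        return self.fields_populated / self.fields_supplied

    def value_score(self) -> float:
        """Accuracy weighted by coverage, per thousand units of annual cost."""
        return (self.accuracy * self.coverage()) / (self.annual_cost / 1000)


feeds = [
    VendorFeed("FeedA", annual_cost=250_000, fields_supplied=58, fields_populated=47, accuracy=0.985),
    VendorFeed("FeedB", annual_cost=180_000, fields_supplied=58, fields_populated=40, accuracy=0.990),
    VendorFeed("FeedC", annual_cost=320_000, fields_supplied=58, fields_populated=55, accuracy=0.970),
]

# Rank feeds so the best accuracy-per-spend surfaces first, flagging sparsely
# populated or expensive duplicates as candidates for removal.
for feed in sorted(feeds, key=lambda f: f.value_score(), reverse=True):
    print(f"{feed.name}: coverage={feed.coverage():.0%}, "
          f"accuracy={feed.accuracy:.1%}, value={feed.value_score():.3f}")
```

Sorting on such a score would surface the feeds that deliver the most usable, accurate data per unit of spend, which is broadly the sort of filtering exercise Berry suggests a utility could provide a baseline for.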
But is this a good enough reason to impose a utility approach on the market? Many of the audience members from industry that I spoke to weren’t sure it was, and were concerned about the progress seemingly being made on the Office of Financial Research in the US (at this stage it seems the bill may be passed without altering this section, in spite of Shelby’s concerns).
One Reference Data Review reader from the securities services practitioner world noted that imposing common standards for securities reference data will certainly deliver long-term efficiencies, but the sheer extent of the IT costs involved means that payback will likely take 10 to 20 years, with very significant investment required from the industry in the short term. He felt that the creation of a new utility would add to those costs and delays, thus adding to the barrage of regulatory requirements already on the table.
He indicated that the imposition of standards could be achieved far more quickly and effectively using existing mechanisms, by mandating issuers and vendors to carry standard fields (for example, ISO fields), if necessary in addition to those they already support. “The ECB argument seems to be that the industry is incapable of agreeing common reference data standards, so a new system must be built that imposes such standards,” he noted.
As noted by other industry participants, the utility is seen by some as failing to tackle the real industry challenge: how far reference data is “hard wired” and “baked in” to the investment process. “Most of these institutions will have more than one securities master that would need to be upgraded. Each securities attribute is used to drive downstream business processes, many of which would need to be re-engineered. Adding a new utility would create yet another link in this chain,” noted our reader.
He also feels that there is no public evidence that the lack of securities reference data standards has caused issues with transaction reporting or has led to poor investment decisions. “It is a myth that securities data, as delivered by existing methods, is of poor quality. The ECB has not delivered any specific, compelling evidence of such issues. The imposition of a new warehouse could create delays in the set-up of new securities that would reduce trade processing STP and increase failed trades,” he contended.
Regardless of these concerns, however, the US reform process is plodding on and the concept of a data utility is coming ever closer to market. It is likely that the politicians have no clue as to the underlying can of worms they are beginning to open, the impact it could potentially have on the market (for good or bad), or the scale of the challenge they are taking on. I’d wager they will find out soon.