As noted in my previous blog, a regulatory-driven reference data utility is on its way for the US market, but it seems that not many of you out there are convinced this is the right step to be taking. My fellow panellists at a recent corporate actions-focused event in London indicated that they are wary of regulators producing a utility that meets their own needs but not those of the industry.
Panellists were all in favour of more standardisation in the reference data space in general. Richard Newbury, market development manager at data vendor SIX Telekurs, for example, indicated that the complexity of the market is such that transparency can only be achieved with some level of basic standardisation, so that everyone is able to identify individual instruments and accurately track risk. “We all need to be able to identify instruments in the same manner and be able to communicate that data with our counterparties,” he told delegates to CorpActions 2010. That, after all, is a vital part of the financial services business.
PJ Di Giammarino, CEO of think tank JWG, added that the 92 point action plan driving the barrage of regulation headed the industry’s way is also reason enough for more standardisation to be achieved. Firms getting their data in order is step one towards meeting the new regulatory requirements. Moreover, regulators are closely monitoring progress and expect action to be taken by the industry, or they will make a move themselves (the UK Financial Services Authority’s “Dear CEO” letters and recent fines are a case in point).
Andrew Lewin, market data analyst at Credit Suisse in Europe, cautioned against letting the regulators lead the standardisation effort because the industry may end up with a set of standards that don’t suit its requirements. In general, however, Lewin was keen to see more standardisation and the stripping out of embedded identifier code charges on the part of the vendor community.
To this end, he reiterated a lot of the other recent complaints I have heard from the industry about vendor charging practices around data. For example, earlier this year, concerns such as these sparked off a spate of industry lobbying directed at Bloomberg with regards to its planned introduction of a new pricing structure. The industry appears to have won that battle, given that Bloomberg has declared itself open and recently partnered with NYSE Euronext to prove its Bloomberg Open Symbology (Bsym) initiative is living up to its claims.
Paul Kennedy, reference data business manager for Interactive Data’s European arm, added that the financial services industry is the only industry to charge users for individual parts of a whole.
Kennedy has also long been a sceptic about the idea of a reference data utility. He noted that different countries have different data standards due to historical, infrastructure-led developments; the UK uses Sedols, whereas in Europe ISINs are more prevalent, for example. He believes that the European Central Bank’s (ECB) utility plan is misguided because, although it sounds good on paper, it is not really solving an industry problem. He asked delegates to ponder: “Where is the value added? Data is a much more complex business than people would suspect and you really have to ask yourself what you are trying to achieve.”
Newbury added that if the utility is aimed at helping regulators track systemic risk, then perhaps the regulatory community just needs better cross-referencing capabilities and a more efficient data processing centre. Kennedy agreed that linkages, rather than raw data itself, are where the value lies in being able to track systemic risk. “It is being able to roll that data back to the issuer and the counterparties affected by something like Lehman’s failure,” he said.
Kennedy therefore feels the solutions being proposed by the regulatory community are on the “simplistic side” because they do not take into account these linkages. He indicated that ISO has also failed to deliver on what the marketplace needs in terms of legal entity identifiers and that some regulatory involvement is inevitable but it needs to be better directed. “It should be about solving the business problems of the industry as well as for the regulators,” he said.
Lewin reckons that the issue of scale is also a challenge that will face the utility: “Realistically, how many instruments can the regulator look at?” He elaborated on Credit Suisse’s recent endeavours to adapt its systems to meet the Options Price Reporting Authority’s (Opra) symbology changes and the tremendous amount of effort involved. “We were looking at millions of different instruments and having to go through the process of figuring out what each referred to. Even the vendors, up until two months before the changes, were unable to identify every single instrument,” he said. “If even the marketplace doesn’t understand everything that is going on, how can a regulator with limited resources tackle that challenge?”
These are concerns that are being voiced time and time again with regards to the utility. Is it solving the right problems? Is the regulator the best choice to determine what these problems are? Is it even feasible to establish such a potentially unwieldy piece of infrastructure? As ever, the answers to these questions seem a long way off. For now, the Financial Services Bill in the US that contains the proposals for a utility has stalled, along with the proposals themselves.