US NIF and ECB’s Reference Data Utility Need Clarity and Strong Leadership, Says JWG-IT’s Di Giammarino

To have any hope of achieving their goals of introducing greater data standardisation in the market, the US National Institute of Finance (NIF) and the European Central Bank (ECB) will need to provide a great deal more clarity around the benefits of their approaches, says PJ Di Giammarino, CEO of think tank JWG-IT. “Whoever takes the leadership in this area had better have the skin of a rhinoceros, the budget of King Midas and Yoda’s ability to manipulate the force,” he explains to Reference Data Review.

In an interview with the ECB’s head of section, Per Nymand-Andersen, earlier this week (check the next issue for the full story), Reference Data Review learnt that the plans are very much still at the drawing board stage. The ECB is currently attempting to get as much industry and regulatory feedback as possible on its proposals to create a market utility for securities reference data. However, the idea seems to be causing a great deal of confusion in some corners of the market, largely because it is so vague at this stage.

The NIF seems to be taking a more prescriptive approach to the space with its plan to create a Federal Financial Data Centre (FFDC) and a Federal Financial Research and Analysis Centre (FFRAC). The FFDC would collect, clean, maintain and secure data including financial transactions data, positions, holdings, obligations and any other data deemed important for systemic analysis. The FFRAC would provide independent analytical capabilities and computing resources to the regulatory community in order to facilitate turning the data into something useful.

Both plans stem from the same source, namely the work going on in the regulatory community to better track the financial services markets. “The G20 action plan asks policy makers and regulators to ‘lead the charge’ to a brave new world where they control macro-prudential risk by asking banks better questions, more frequently, with more precision than ever before,” says Di Giammarino. “What’s more, they are being asked to share the information and interpret the meaning of more granular data across borders in new ways.”

The central bank is in a position to understand the data challenge, according to Di Giammarino. The ECB has, probably more than any other central bank or regulator, had to assimilate masses of poor quality reference information to interpret the impact of events, such as the collapse of Lehman Brothers, across the European area. “They have learnt from this experience that there is no quick fix to a distributed supply chain of financial services data ‘factories’ that belch variable quality outputs at ever increasing rates,” he adds.

The ECB plays an important role in this debate and Di Giammarino reckons it is rightly concerned that it does not have the quality of information required to do its job with any level of accuracy. “We see their efforts as an attempt to promote thought leadership about the role of the centre which, to date, has lacked a significant sense of accountability for data,” he explains.

The theory behind it may be laudable, but practical implementation realities are another kettle of fish entirely. The ECB’s Strategic Reporting and Delivery Unit (SRDU), the European System of Financial Supervisors’ central database and the US-based NIF’s data centre all suffer from the problem of all orphans: nobody owns them and, perhaps more importantly, it is not clear what happens if they are not adopted, says Di Giammarino.

Data has often been downplayed and, even now, regulators have made no direct mention of it in proposed legislation. “Anyone familiar with banking operations – from risk management to profitability measurement or customer relationship management – quickly appreciates the scale of the industry’s data issues. Reliable and accessible information is at the heart of any operation in financial services, yet it doesn’t get a mention in the G20 or Financial Stability Board (FSB) plans. This sort of practical omission, a common flaw in regulatory schemes, has been a traditional frustration for banks,” he explains.

Di Giammarino reckons the ECB will play an important role in this debate, but says it needs to work with whoever is granted the authority to own this effort to ensure the right resources do a proper job. However, the rhinoceros with Midas’ touch and access to the ‘force’ that actually takes on this task will have its work cut out.

Niccolò Machiavelli is famous for encapsulating the essence of this kind of change effort whilst he was in exile at the hands of the Medici family in 1512, says Di Giammarino. Machiavelli realised that “… there is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things.” This is in part due to the lukewarm reception that the change manager will initially receive from future converts, but also to the resistance from those who stand to lose from change, contends Di Giammarino.

“The leader of this effort must have a well-articulated business case, a realistic implementation plan and an open and transparent platform/standards selection process. Perhaps most importantly, the owner will need strong commitment and backing from many different constituents, and a strong governance process to overcome the inevitable bumps in the road that will come with a project of this magnitude and length,” he continues.

There are benefits to be had from such a solution, but the challenge lies in articulating them and in convincing those who are profiting from non-standardisation to get on board. “We believe that there are plenty of real benefits in getting the reference data plumbing right, as it is fundamental to achieving the objective of macro-prudential risk management. However, these benefits are secondary to the arbitrage opportunities that exist thanks to the lack of transparent reference data today. On both sides of the case for change there are hundreds of billions at stake,” says Di Giammarino.

The question is: will a US$15 trillion crisis be enough to force an upgrade to one of the most complex information systems on the face of the planet? Perhaps, says Di Giammarino: “However, unless the benefits are explained in a manner which senior management in the institutions, regulators, suppliers and trade associations understand, the case for this massive change will fall on deaf ears. The bottom line is that this industry is averse to throwing money at things that are not perceived to be problems. It will be impossible to get the commitment and funding if the many players do not see the practical, meaningful benefits.”

Although the two main endeavours in the space appear to be regional in approach, Di Giammarino is of the mind that any such work should be global in scope. “Far from an oligopoly, the financial services industry functions with thousands of moving parts held together across the globe via alignment of interest rather than prescription. The very nature of the systemic risks we are attempting to control calls upon the world’s finance ministers, central banks and regulators to define and articulate a broad and deep set of data requirements from first principles. Investment cases for point solutions are insufficient without a robust roadmap – that’s like planning a new guest bathroom without upgrading the entire antiquated plumbing system,” he explains.

There are some real barriers to upgrading a system this vast and complex. For one, the industry requires clarity on what data is to be collected, how it will be controlled, why, and how much the industry is willing to spend, he says. The market also needs a common target data model that can support decision making and provide regulators with the information they need, when they need it. Moreover, this multi-year change programme will need top level sponsorship and a neutral party to create open standards with industry input.

Di Giammarino reckons that whilst some are studying pieces of the puzzle (for example, the SIIA/FISD’s data model or the EDM Council’s semantics repository) and others are paying lip service to the debates (for example, at the various upcoming conferences on the subject), the industry has not yet seen the appropriate level of commitment to outlining the problems that must be solved.

Exactly which market player should be in charge of the utility remains a political hot potato. “Unlike the World Trade Organisation, none of the global financial services efforts are bound by treaty,” says Di Giammarino. “Therefore, there is no one body that can issue a change order. This means that moving forward with a programme of this magnitude will become an exercise in consensus building.”

He says JWG-IT is neutral on whom the G20 and/or the BIS/FSB will nominate to lead the effort. “If the ECB have the funding, expertise and desire to take this massive challenge on, and their colleagues across the globe are willing to follow, and they find the right person to lead it, then they can succeed,” he suggests.

Given the enormous political, legal, tax, language and business practice gamut that the ECB spans today, he reckons it may well have the scope to understand and frame the debate. “It is difficult to believe that a single vendor – or group of vendors – would be able to take charge of this programme, but it might be possible. The principles built into the Kyoto accord – profit-driven models to solve global problems – could work well. Whoever takes the lead, there’s no easy way to achieve the ambitious data integration targets that have been set for them without the banks’ help,” he adds.

The vendor community are also likely to be affected significantly by such an endeavour. “The implementation of a global reference data solution would certainly step on the toes of those that provide plumbing services today. The list of affected participants includes standards bodies, numbering agencies, registrars, lawyers, trade associations, exchanges, multilateral trading facilities (MTFs), market/static data vendors, regulators and back office staff in the banks. Essentially, anyone that collects, stores and transmits financial data (issues, legal entities, issuers) would need to re-examine whether their business models remain valid,” warns Di Giammarino.

“All financial institutions and their data vendors would need to align their individual reference data to a common central view. This is not a ‘one off’ activity. It will take years of effort to define the new operating model, align historical views and manage the operational challenges of the transition period,” he continues.

Clearly, independence is fundamental to getting it done right. Most of the industry’s infrastructure is commercially motivated and the strategy, design and build out of new solutions require careful negotiation, he adds.

“When speaking to banks about their data war stories, one thing is clear: anything that helps fix the plumbing will be welcome. However, there is a big difference between using another piece of reference data plumbing and rushing towards a wholesale replacement of every tube in the house. The speed and scale at which this will be adopted will depend on the consequences of getting data wrong (few today), the comprehensiveness of the new solution (rarely to scale), the cost of making the change (inevitably high) and the business benefits of putting it in (unknown at this point in time),” he concludes.

As ever, it’s a case of wait and see…
