The regulatory community is hell-bent on increasing transparency in the derivatives market by any means necessary, including forcing OTC instruments to be cleared via central counterparties (CCPs) and requiring firms to report more data to new trade repositories. Reference Data Review speaks to Steve Ingle, derivatives product manager at BNY Mellon Asset Servicing, to find out how all of these changes are impacting the way firms deal with data.
Ingle has been with BNY Mellon for more than seven years and is focused on derivatives middle and back office management, as well as process re-engineering. Prior to his appointment at the asset servicing firm, Ingle was derivatives and securities lending manager at F&C.
How has the recent regulatory focus on the area of derivatives impacted firms’ spending on automating the processing of these instruments? What about new risk reporting requirements?
Whilst there has undoubtedly been a dramatic increase in focus on the OTC derivatives industry, at this point there is a lack of clear direction on the part of any regulatory body. Accordingly, there has not been any increase in spending on projects related to regulatory changes. However, due to our focus on risk management, there has been a significant increase in expenditure by BNY Mellon over the course of the current year to reduce the risk associated with handling transactions, to improve automation and STP rates, and to continue to enhance the reporting available to clients. Clients that have outsourced their back and middle office derivative operations need to obtain as much information as possible to perform their oversight duties, and clients to whom we provide accounting services need ever-greater transparency on the valuation of the instruments that make up the total net asset value (NAV) of their portfolios.
How will the introduction of CCPs in the credit derivatives markets impact firms’ data management systems?
Central counterparties will not be able to cover every credit default swap (CDS) contract that is traded in the market. As a result, only a portion of contracts will end up being novated onto the central counterparty mechanism, while the remainder will continue to be valued, margined and managed in the same bilateral manner as today. From an administrator’s perspective this will result in an additional data management burden, with the ability to access central counterparty data for daily valuations and margin calls becoming part and parcel of supporting these instruments. However, from our current understanding of the process flows, this should be similar to clearing exchange-traded derivative instruments – provided, of course, that the required data is easily accessible.
What about a trade repository for derivatives data and new transaction reporting requirements?
The trade repository for derivatives data is a good mechanism for managing the ongoing lifecycle events of standardised products. Again, the move towards wider use of the DTCC’s Trade Information Warehouse (TIW) will capture a good number of instruments – but not the entire population of instruments. Therefore an additional operating model will be needed to support clients’ instruments that reside within the trade repository. This could require significant builds to allow cash movement instructions to be received from the repository, verify the clients’ credit status, instruct the cash movements and report on these movements at the instrument level across the clients’ sets of accounts.
Are the data challenges of dealing with derivatives better understood in the post-crisis world? What have firms done differently to rectify any of the problems underlying the financial crisis with regards to managing data related to complex instruments?
Following the Lehman default, the biggest area of concern for our derivatives-using customers was the availability of exposure figures. During the months following that event, the spotlight was firmly on obtaining accurate exposure figures for counterparties and receiving daily reporting on the changes to these counterparty exposures. Whilst most firms were protected by robust collateral management practices, the components involved in the exposure valuation have come under intense scrutiny – for example, how instruments are valued, what reconciliation takes place and how the collateral is held and managed.
More recently, the Greek sovereign debt crisis has led to clients questioning their country and currency exposure figures, and so these are becoming important components of the risk and exposure reporting that we deliver to our clients.
What has happened to these projects given the current climate? Are firms still investing in improving data management for derivatives despite budget cuts?
BNY Mellon is committed to investing in further improving the processing of derivative products, automation and links to industry utilities, as well as helping our clients mitigate their risks when dealing with complex instruments.
How does the buy side compare to the sell side in terms of the state of the management of derivatives data? Is the trend towards centralisation stronger for the sell side, for example?
The processing, storage and ongoing management of data is a process that benefits significantly from economies of scale. As such, the large sell side institutions can obtain a greater benefit from industry centralisation than the smaller buy side players, and this leads to the increasing demand for viable outsourcing partners in the derivatives back and middle office space.
The massive growth in derivatives has led to an explosion in the number of technology providers in the complex instrument space – is there room for all of these to coexist? Why or why not?
There has certainly been explosive growth in the number of technology providers, but that doesn’t necessarily translate into an increased number of capable solution providers in this space. Ultimately a small number of players will gain sufficient traction within the market to make a sustainable business out of offering platforms and databases to this growing area. The key reasons for this consolidation trend are the expense involved in keeping abreast of regulatory reforms and changes, ever-changing accounting practices and the difficulty of employing skilled people who are already familiar with the requisite platforms and processes. In essence, the environment is not open to individual interpretations of acceptable practice, so only the larger, more credible technology providers – those capable of committing to ongoing, large-scale investment – will be able to compete in the years to come.
What will drive investment in data management projects in the longer term? Will the increasing complexity of financial products continue to be a challenge or will the new regulation on the horizon drive business out of the market altogether?
New regulations may change the shape of business, or shift the location of certain business transactions, but they will not alter the fundamental drivers behind the ongoing development of ever more complex instruments. I firmly believe, therefore, that the ability to cope with increasingly complex financial products has become – and will continue to be – a key differentiator in the broader asset servicing and administration space.