The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Technology Capabilities Could Enable Algorithmic Descriptions for Derivatives, But Standardisation is Lacking

Following their request for industry input back in December, the Commodity Futures Trading Commission (CFTC) and the Securities and Exchange Commission (SEC) have published a paper outlining the findings of their joint study into the feasibility of adopting new algorithmic codes to identify complex and standardised derivatives. The study indicates that the regulators believe current technology is capable of representing derivatives using a common set of computer readable descriptions, and that these descriptions are precise enough to identify “at least a broad cross section of derivatives”, but a few other items, such as standardised entity identification, must be tackled first.

The study, which has been conducted over the course of the last few months, has centred on determining the feasibility of requiring the derivatives industry to adopt standardised computer readable algorithmic descriptions that may be used to describe complex and standardised derivatives and to calculate net exposures. In effect, these descriptions would become the de facto standard for the detailed identification of derivatives within the market. This is all in line with section 719(b) of the Dodd-Frank Act, which established the interagency working group to conduct the study back in December.

The two main questions the group sought to answer (as referenced in the study) were: “First, is computer technology capable of representing derivatives with sufficient precision and detail to facilitate collection, reporting, and analysis of risk exposures, including calculation of net exposures, as well as to function as part or all of a binding legal contract? Second, if the technological capability exists, in consideration of the logistics of possible implementation, should these standardised, computer readable descriptions be required for all derivatives?”

The focus therefore has been on determining whether the collection, reporting, and management of risk exposures can be aided by the introduction of these new computer readable descriptions. The conclusion is evidently that the technology available in the market can facilitate the introduction of these new requirements.
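As an illustration of the concept the study describes, a derivative position expressed in a structured, computer readable form can be aggregated mechanically to produce net exposures per counterparty. The sketch below is a hypothetical Python example; the field names and identifiers are invented for illustration and do not follow any actual standard or the study's own format.

```python
from dataclasses import dataclass
from collections import defaultdict

# Hypothetical machine-readable description of a derivative position.
# Field names are illustrative only and follow no real standard.
@dataclass(frozen=True)
class DerivativePosition:
    product_id: str        # standardised instrument identifier (assumed)
    counterparty_id: str   # standardised legal entity identifier (assumed)
    notional: float        # signed notional: positive = long, negative = short

def net_exposures(positions):
    """Sum signed notionals per counterparty to produce net exposures."""
    totals = defaultdict(float)
    for p in positions:
        totals[p.counterparty_id] += p.notional
    return dict(totals)

positions = [
    DerivativePosition("IRS-5Y-USD", "LEI-BANK-A", 10_000_000.0),
    DerivativePosition("IRS-5Y-USD", "LEI-BANK-A", -4_000_000.0),
    DerivativePosition("CDS-IG-5Y", "LEI-BANK-B", 2_500_000.0),
]
print(net_exposures(positions))
# {'LEI-BANK-A': 6000000.0, 'LEI-BANK-B': 2500000.0}
```

The point of the sketch is that once positions carry standardised entity and instrument identifiers, netting becomes a trivial aggregation; without those identifiers, the same calculation requires fragile manual mapping.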

However, the SEC and CFTC working group concludes in the study that before they can mandate the use of standardised descriptions for all derivatives, the industry will first need to adopt a universal entity identifier and product or instrument identifiers. Moreover, the regulatory community will need to conduct a further analysis of the costs and benefits of having all aspects of legal documents related to derivatives represented electronically. They will also need to define a uniform way to represent financial terms not covered by existing definitions.

These requirements have, no doubt, sprung from the 17 responses the group received from the industry during the comment period. These responses emanated from a broad range of industry participants and associations including: data vendor Bloomberg; the EDM Council; technology and consulting firm Investance; Goldman Sachs; exchange BM&F Bovespa; the Managed Funds Association (MFA); vendor DocGenix; derivatives specialist Markit; and Sifma and the International Swaps and Derivatives Association (ISDA).

The CFTC and SEC note in the study that, currently, non-standardised derivatives have a long way to go in terms of moving into the computer readable realm: “To the extent that non-standardised derivatives transactions are computer readable, the derivative transaction and contract information is available only in internal, proprietary systems developed and used by dealers, clearing houses, trade repositories, and large money managers.”

In terms of standards, the study references the FIX Protocol and ISDA’s Financial products Markup Language (FpML) as potential building blocks for change. It notes that the former is focused on speed of transmission, whereas the latter is focused on capturing complexity; both are important to regulators. It also adds that these standards are not comprehensive in their coverage of the derivatives industry overall: “Much remains in the form of written contracts or proprietary data formats and would be expensive for firms to convert to machine readable data, to store the data, and to use the data in analysis.”
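FpML, for instance, expresses trade terms as XML, which is what makes them mechanically extractable rather than locked in written contracts. The snippet below parses a heavily simplified, FpML-style fragment in Python; it is not schema-valid FpML, and the element names are illustrative assumptions rather than the real FpML vocabulary.

```python
import xml.etree.ElementTree as ET

# A simplified, FpML-style trade fragment. Illustrative only: real FpML
# documents follow the published schema and are far more detailed.
FRAGMENT = """
<trade>
  <product>InterestRateSwap</product>
  <notional currency="USD">10000000</notional>
  <counterparty>LEI-BANK-A</counterparty>
</trade>
"""

root = ET.fromstring(FRAGMENT)
product = root.findtext("product")            # instrument type
notional = float(root.findtext("notional"))   # trade size
currency = root.find("notional").get("currency")
print(product, notional, currency)
# InterestRateSwap 10000000.0 USD
```

The contrast the study draws is between data in this form, which a regulator can query directly, and terms held only in written contracts or proprietary formats, which cannot be analysed without costly conversion.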

This data also lacks standardised “computer languages and reference data”, hence these are next on the regulatory hit list. Accordingly, the study references the work going on within the Office of Financial Research (OFR) to establish a new legal entity identifier and to define instrument identifiers for systemic risk monitoring purposes. It also references other related “public-private initiatives” that can be leveraged in order to take the next steps with regard to algorithmic descriptions. It therefore charges the industry with the responsibility of ensuring the regulatory community is going down the right track.
