The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Technology Capabilities Could Enable Algorithmic Descriptions for Derivatives, But Standardisation is Lacking

Following their December request for input from the industry, the Commodity Futures Trading Commission (CFTC) and the Securities and Exchange Commission (SEC) have published a paper outlining the findings of their joint study into the feasibility of adopting new algorithmic codes to identify complex and standardised derivatives. The study indicates that the regulators believe current technology is capable of representing derivatives using a common set of computer readable descriptions, and that these descriptions are precise enough to identify “at least a broad cross section of derivatives,” but a few other items, such as standardised entity identification, must be tackled first.

The study, which has been conducted over the course of the last few months, has centred on determining the feasibility of requiring the derivatives industry to adopt standardised computer readable algorithmic descriptions that may be used to describe complex and standardised derivatives and calculate net exposures. The intention is that such descriptions would become the de facto standard for the detailed identification of derivatives within the market. This is all in line with section 719(b) of the Dodd-Frank Act, which established the interagency working group to conduct the study back in December.

The two main questions the group sought to answer (as referenced in the study) were: “First, is computer technology capable of representing derivatives with sufficient precision and detail to facilitate collection, reporting, and analysis of risk exposures, including calculation of net exposures, as well as to function as part or all of a binding legal contract? Second, if the technological capability exists, in consideration of the logistics of possible implementation, should these standardised, computer readable descriptions be required for all derivatives?”

The focus therefore has been on determining whether the collection, reporting, and management of risk exposures can be aided by the introduction of these new computer readable descriptions. The conclusion has evidently been that the technology available in the market today is sufficient to support the introduction of these new requirements.
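To make the idea concrete, the sketch below shows what a minimal computer readable derivative description and a net exposure calculation might look like. The field names, identifiers, and netting logic here are illustrative assumptions for the sake of example, not anything specified by the CFTC/SEC study; a real scheme would rest on the standardised entity and product identifiers the study calls for.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class SwapDescription:
    """Toy machine-readable description of a derivatives position.

    In practice, entity_id and product_id would be the standardised
    legal entity and instrument identifiers the study says are
    prerequisites for any mandate.
    """
    entity_id: str        # reporting entity (hypothetical identifier)
    counterparty_id: str  # counterparty (hypothetical identifier)
    product_id: str       # product/instrument (hypothetical identifier)
    notional: float       # signed notional: positive = receive, negative = pay

def net_exposures(trades):
    """Aggregate signed notionals per counterparty into net exposures."""
    exposures = defaultdict(float)
    for trade in trades:
        exposures[trade.counterparty_id] += trade.notional
    return dict(exposures)

trades = [
    SwapDescription("BANK-A", "FUND-X", "IRS-USD-5Y", 10_000_000.0),
    SwapDescription("BANK-A", "FUND-X", "IRS-USD-5Y", -4_000_000.0),
    SwapDescription("BANK-A", "FUND-Y", "IRS-EUR-2Y", 2_500_000.0),
]
print(net_exposures(trades))
# {'FUND-X': 6000000.0, 'FUND-Y': 2500000.0}
```

The point of the exercise is that once every trade carries the same structured fields, netting becomes a mechanical aggregation rather than a reconciliation of written contracts and proprietary formats.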

However, the SEC and CFTC working group concludes in the study that before they can mandate the use of standardised descriptions for all derivatives, the industry will first need to adopt a universal entity identifier and product or instrument identifiers. Moreover, the regulatory community will need to conduct a further analysis of the costs and benefits of having all aspects of legal documents related to derivatives represented electronically. They will also need to define a uniform way to represent financial terms not covered by existing definitions.

These requirements have, no doubt, sprung from the 17 responses the group received from the industry during the comment period. These responses emanated from a broad range of industry participants and associations including: data vendor Bloomberg; the EDM Council; technology and consulting firm Investance; Goldman Sachs; exchange BM&F Bovespa; the Managed Funds Association (MFA); vendor DocGenix; derivatives specialist Markit; and Sifma and the International Swaps and Derivatives Association (ISDA).

The CFTC and SEC note in the study that, currently, non-standardised derivatives have a long way to go in terms of moving into the computer readable realm: “To the extent that non-standardised derivatives transactions are computer readable, the derivative transaction and contract information is available only in internal, proprietary systems developed and used by dealers, clearing houses, trade repositories, and large money managers.”

In terms of standards, the study references the FIX Protocol and ISDA’s Financial products Markup Language (FpML) as potential building blocks for change. It notes that the former is focused on speed of transmission, whereas the latter is focused on capturing complexity; both are important to the regulator. It also adds that these standards are not comprehensive in their coverage of the derivatives industry overall: “Much remains in the form of written contracts or proprietary data formats and would be expensive for firms to convert to machine readable data, to store the data, and to use the data in analysis.”
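To illustrate what “machine readable” buys the regulator, the snippet below parses a simplified XML fragment loosely modelled on the flavour of FpML and pulls out the key economic terms. The fragment and element names are this author’s simplification for illustration; real FpML documents are far richer and are validated against the official ISDA schema.

```python
import xml.etree.ElementTree as ET

# Simplified fragment loosely modelled on FpML-style markup;
# NOT schema-valid FpML, purely illustrative.
FRAGMENT = """
<trade>
  <swap>
    <notional currency="USD">10000000</notional>
    <fixedRate>0.0375</fixedRate>
    <terminationDate>2016-06-15</terminationDate>
  </swap>
</trade>
"""

def extract_terms(xml_text):
    """Read key economic terms straight out of the structured description."""
    root = ET.fromstring(xml_text)
    notional = root.find("./swap/notional")
    return {
        "currency": notional.get("currency"),
        "notional": float(notional.text),
        "fixed_rate": float(root.findtext("./swap/fixedRate")),
        "termination": root.findtext("./swap/terminationDate"),
    }

print(extract_terms(FRAGMENT))
```

Because the terms are tagged rather than buried in contract prose, a repository or regulator can extract and aggregate them programmatically, which is precisely the gap the study identifies for non-standardised derivatives held in written contracts or proprietary formats.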

This data also lacks standardised “computer languages and reference data”, hence these are next on the regulatory hit list. Accordingly, the study references the work under way within the Office of Financial Research (OFR) to establish a new legal entity identifier and define instrument identifiers for systemic risk monitoring purposes. It also references other related “public-private initiatives” that can be leveraged in order to take the next steps with regards to algorithmic descriptions. It therefore charges the industry with the responsibility of ensuring the regulatory community is going down the right track.
