

Is the OFR Biting Off More Than It Can Chew?

Serious concerns have been raised about the capacity of the US Office of Financial Research (OFR) to effectively process and crunch the high volumes of reference and economic data it will receive from the industry in order to monitor systemic risk. The influx of data inputs could overwhelm the new Treasury agency and thus impede the tracking of systemic risk at a more granular level. The OFR (as well as the industry) could therefore be faced with death by data drowning.

Speakers from Sifma’s Operations and Technology Committee at the association’s annual operations conference in Florida this week highlighted, as a particular point of concern, the unstructured way in which reference data is currently provided to and assessed by the regulatory community. Morgan Stanley’s global head of operations, technology and data, Stephen Daffron, noted that the current state of the raw data is not conducive to being fed into an analytics engine to track granular detail such as parent/child hierarchies.
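To make the point concrete, consider what tracking parent/child hierarchies implies once the raw records have actually been cleansed. The sketch below is purely illustrative: the record layout (entity_id, parent_id, name) and the sample entities are invented for the example, and it is not a description of any OFR, Sifma or Morgan Stanley system.

```python
# Illustrative sketch only: rolling cleansed reference data records up into a
# parent/child entity hierarchy. The record layout and entities are invented
# assumptions, not an OFR or industry standard.
from collections import defaultdict

records = [
    {"entity_id": "E1", "parent_id": None, "name": "Global Holdco"},
    {"entity_id": "E2", "parent_id": "E1", "name": "US Broker-Dealer"},
    {"entity_id": "E3", "parent_id": "E1", "name": "UK Bank Subsidiary"},
    {"entity_id": "E4", "parent_id": "E3", "name": "UK Special Purpose Vehicle"},
]

# Index each record under its parent so the tree can be walked top-down.
children = defaultdict(list)
for rec in records:
    children[rec["parent_id"]].append(rec)

def print_tree(parent_id=None, depth=0):
    """Walk the hierarchy from the ultimate parents down, so exposures
    could in principle be aggregated at each level of ownership."""
    for rec in sorted(children[parent_id], key=lambda r: r["entity_id"]):
        print("  " * depth + f"{rec['entity_id']}: {rec['name']}")
        print_tree(rec["entity_id"], depth + 1)

print_tree()
```

The hard part, of course, is not the traversal but getting clean, consistent parent_id links in the first place, which is exactly the state of affairs Daffron was questioning.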

As the vendor community that has sprung up around the cleansing of reference data will testify, tracking this data is neither a cheap nor a simple process at a firm level, let alone on an industry-wide basis. Daffron identified three main points of concern: the sheer volume of unstructured data out there; the lack of consistency (in the data itself and in the OFR’s approach to that data); and the OFR’s intended approach to the challenge overall.
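As a rough illustration of the consistency problem, the hypothetical check below compares the same entity’s attributes across two source feeds and flags any disagreement. The feed names, fields and values are invented for the example; real reconciliation engines are vastly more involved, which is rather the point.

```python
# Hypothetical consistency check across two reference data feeds.
# Feed names, fields and values are invented for illustration only.
feed_a = {"E2": {"lei": None, "country": "US", "sector": "Broker-Dealer"}}
feed_b = {"E2": {"lei": None, "country": "USA", "sector": "Broker Dealer"}}

def reconcile(a, b):
    """Yield (entity, field, value_a, value_b) wherever the two feeds disagree."""
    for entity_id in set(a) & set(b):
        for field in set(a[entity_id]) | set(b[entity_id]):
            va, vb = a[entity_id].get(field), b[entity_id].get(field)
            if va != vb:
                yield entity_id, field, va, vb

for mismatch in reconcile(feed_a, feed_b):
    print("Mismatch:", mismatch)
```

Multiply these trivial mismatches across millions of instruments and counterparties, and the scale of what the OFR is proposing to take on becomes clearer.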

One of the fears raised frequently by industry participants over recent months is that the OFR will bite off more than it can chew at the outset. Daffron cautioned that a rush to collect vast quantities of data was not the right track to go down, yet this seemed to be the way the OFR was progressing: doing “too much” in a space that is “too complex”.

His fellow panellists, JPMorgan managing director Jeffrey Bernstein, Raymond James Financial chief administrative officer Angela Biever and Depository Trust & Clearing Corporation (DTCC) chief operating officer Michael Bodson, all noted that the agency is at the start of a long journey and that a lot of lessons need to be learned first with regard to dealing with reference data. Good job, then, that a practitioner and economist is at hand to assist.

It has been a few weeks since my last blog on developments surrounding the OFR, during which time a new supervisor has been appointed in the form of ex-Morgan Stanley economist Richard Berner. It will now be up to Berner and his team to convince the industry that the agency will not go down the wrong path in dealing with this data.

For now, I’d recommend small bites and regular checkups (with the industry) to ensure the OFR isn’t choked by too much data at the outset.

It is also interesting to note that Sifma, which is leading the charge on providing feedback to the OFR on the legal entity identifier and other data standardisation concerns, chose a DTCC spokesperson to participate in its conference session on operations challenges and the OFR. Given that DTCC is pitching to be chosen as the technology provider to the new Treasury agency (alongside Swift as the registration authority for the new legal entity ID), is Bodson’s presence on the panel tacit confirmation of the association’s support for the DTCC bid? After all, DTCC is hotly tipped to be the frontrunner in the race…

Expect these topics and many more to be up for discussion at our upcoming Data Management for Risk, Analytics and Valuations conference (#DMRAV, for you twitterers out there) in NYC in a couple of weeks’ time.
