About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Is the OFR Biting Off More Than it Can Chew?


Serious concerns have been raised about the capacity of the US Office of Financial Research (OFR) to effectively process and crunch the high volumes of reference and economic data it will receive from the industry in order to monitor systemic risk. The influx of data could overwhelm the new Treasury agency and impede the tracking of systemic risk at a more granular level. The OFR (as well as the industry) could therefore face death by data drowning.

Speakers from Sifma’s Operations and Technology Committee at the association’s annual operations conference in Florida this week highlighted as a particular concern the unstructured way in which reference data is currently provided to and assessed by the regulatory community. Morgan Stanley’s global head of operations, technology and data, Stephen Daffron, noted that the current state of the raw data is not conducive to being fed into an analytics engine to track granular details such as parent/child hierarchies.
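To make the parent/child hierarchy point concrete, here is a minimal sketch of the kind of roll-up an analytics engine would need to perform — resolving each legal entity to its ultimate parent. The entity IDs, names and the `parent` field are purely hypothetical illustrations, not a real reference data schema, and real industry data is far messier than this clean example suggests:

```python
# Hypothetical legal-entity records: each entity points to its direct parent
# (None for a top-level holding company).
entities = {
    "LEI-001": {"name": "Global Bank Holdings",   "parent": None},
    "LEI-002": {"name": "Global Bank NA",         "parent": "LEI-001"},
    "LEI-003": {"name": "Global Bank Securities", "parent": "LEI-002"},
}

def ultimate_parent(entity_id, entities):
    """Walk the parent chain up to the top-level entity."""
    seen = set()
    while entities[entity_id]["parent"] is not None:
        if entity_id in seen:  # guard against cycles from bad reference data
            raise ValueError("cycle in hierarchy at " + entity_id)
        seen.add(entity_id)
        entity_id = entities[entity_id]["parent"]
    return entity_id

print(ultimate_parent("LEI-003", entities))  # LEI-001
```

Even this trivial traversal only works once every record carries a clean, consistent parent link — which is precisely what unstructured, inconsistently-sourced reference data lacks, and why the cleansing step is neither cheap nor simple.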

As the vendor community that has sprung up around the cleansing of reference data will testify, it is not a cheap or simple process to track this data at a firm level, let alone on an industry-wide basis. Daffron indicated three main points of concern: the sheer volume of unstructured data out there; the lack of consistency (of data itself and the OFR’s approach to that data); and the OFR’s intended approach to the challenge overall.

One of the fears raised frequently by industry participants over recent months is that the OFR will bite off more than it can chew at the outset. Daffron cautioned that a rushed approach towards collecting vast quantities of data was not the right track to go down, but that this seemed to be the way the OFR was progressing: doing “too much” in a space that is “too complex”.

His fellow panellists, JPMorgan managing director Jeffrey Bernstein, Raymond James Financial’s chief administrative officer Angela Biever and Depository Trust and Clearing Corporation (DTCC) COO Michael Bodson, all noted that the agency is at the start of a long journey, with many lessons still to be learned about dealing with reference data. Good job, then, that a practitioner and economist is at hand to assist.

It has been a few weeks since my last blog on developments surrounding the OFR, during which time a new supervisor has been appointed in the form of ex-Morgan Stanley economist Richard Berner. It will now be up to Berner and his team to convince the industry that the agency will not go down the wrong path in dealing with this data.

For now, I’d recommend small bites and regular checkups (with the industry) to ensure the OFR isn’t choked by too much data at the outset.

It is also interesting to note that Sifma, which is leading the charge in providing feedback to the OFR on the legal entity identifier and other data standardisation concerns, chose a DTCC spokesperson to participate in its conference session on operations challenges and the OFR. Given that DTCC is pitching to be chosen as the technology provider to the new Treasury agency (alongside Swift as the registration authority for the new legal entity ID), is Bodson’s presence on the panel tacit confirmation of the association’s support for DTCC’s bid? After all, DTCC is hotly tipped to be the frontrunner in the race…

Expect these topics and many more to be up for discussion at our upcoming Data Management for Risk, Analytics and Valuations conference (#DMRAV – for you twitterers out there) in NYC in a couple of weeks’ time.

