

A Recurring Refrain


Pattern recognition is obviously a handy skill to have when dabbling in the financial markets, but it’s not something often applied to regulatory developments. However, the eagle-eyed among you may have spotted a recurring refrain within a whole host of recent regulatory papers, including the MiFID Review, UCITS IV, Basel III, Solvency II and any number of UK Financial Services Authority (FSA) papers: references to the accuracy, consistency, timeliness, reliability and completeness of data are present in all of them.

Last week’s JWG event on the risk data quality imperatives underlying current regulatory requirements highlighted this perfectly, as panellists including Mark Davies, head of reference data for SSF Risk Services at RBS, discussed the impact that this barrage of new requirements is having on the lives of data managers. It seems that these requirements are proving to be both a blessing and a curse to those in the data management function, given the number of “unknown unknowns” that exist in the market with regard to “what good looks like” (a common refrain over recent months from industry participants and from JWG’s own CEO, PJ Di Giammarino).

To put it all into context, Di Giammarino referenced a number of recently published regulatory papers that propose the introduction of new data quality checks, including the FSA’s recent guidance on operational risk standards. This paper in particular highlights the importance of the “clarity, quality and accuracy” of the data inputs to risk management systems and suggests regular data quality testing.
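
To make “regular data quality testing” a little more concrete, the sketch below shows the sort of automated check a firm might schedule against a feed of risk system inputs, scoring completeness, accuracy and timeliness. It is a minimal illustration only: the field names, the thresholds and the use of Python and pandas are assumptions made for the purposes of the example, not anything prescribed in the FSA guidance.

```python
# Illustrative only: a minimal data quality test over a hypothetical extract of
# risk system inputs. Field names and thresholds are assumptions, not anything
# prescribed by the FSA's operational risk guidance.
import pandas as pd

REQUIRED_FIELDS = ["instrument_id", "counterparty_id", "notional", "as_of_date"]

def run_quality_checks(df: pd.DataFrame, max_staleness_days: int = 1) -> dict:
    """Return simple completeness, accuracy and timeliness metrics for one extract."""
    results = {}

    # Completeness: share of rows with every required field populated.
    results["completeness"] = df[REQUIRED_FIELDS].notna().all(axis=1).mean()

    # Accuracy (proxy): notionals must be positive numbers and dates must parse.
    notional_ok = pd.to_numeric(df["notional"], errors="coerce") > 0
    dates = pd.to_datetime(df["as_of_date"], errors="coerce")
    results["accuracy"] = (notional_ok & dates.notna()).mean()

    # Timeliness: the most recent record should be no older than the agreed cut-off.
    staleness_days = (pd.Timestamp.today().normalize() - dates.max()).days
    results["timely"] = staleness_days <= max_staleness_days

    return results

if __name__ == "__main__":
    sample = pd.DataFrame({
        "instrument_id": ["XS0123456789", "XS0987654321", None],
        "counterparty_id": ["CP001", "CP002", "CP003"],
        "notional": [1_000_000, -5_000, 250_000],
        "as_of_date": ["2011-05-03", "2011-05-03", "not a date"],
    })
    print(run_quality_checks(sample))
```

The specific thresholds and proxy definitions would, of course, need to be agreed per data set; the value lies in running checks like these routinely and recording the results, so there is something measurable to report against.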

Di Giammarino also discussed a recent survey conducted by JWG in which all of the 16 participant firms said data quality was of paramount importance. The industry and the regulatory community have both woken up to the issue of data quality, but there are significant gaps around risk-related regulation that could prove to be pitfalls in the near future.

For another example of data quality specific regulation aimed squarely at the buy side, look at the Committee of European Securities Regulators’ (CESR, now succeeded by ESMA) paper on counterparty risk data requirements for UCITS from August last year. These level three guidelines, which are to accompany the level two implementing measures for UCITS, are much more prescriptive in a data context than ever before and refer directly to the “accuracy”, “completeness” and “consistency, timeliness and reliability” of data. The European regulator is asking firms to be able to produce historical data on demand and to back up their calculations of counterparty risk for these instruments with much more granular data than previously required.
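
What “producing historical data on demand” implies in practice is a point-in-time record of the granular inputs behind each counterparty risk figure, not just the end result. The sketch below illustrates the idea with a hypothetical append-only SQLite store keyed by as-of date and counterparty; the schema and field names are assumptions for illustration, not anything specified in the CESR/ESMA guidelines.

```python
# Illustrative only: an append-only, point-in-time store for the granular inputs
# behind a UCITS counterparty risk calculation, so the figures reported for any
# past date can be reproduced on demand. The schema is a hypothetical example,
# not anything specified in the CESR/ESMA level three guidelines.
import sqlite3

conn = sqlite3.connect("counterparty_exposure.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS exposure_snapshot (
        as_of_date      TEXT NOT NULL,   -- valuation date of the snapshot
        counterparty_id TEXT NOT NULL,   -- internal counterparty identifier
        instrument_id   TEXT NOT NULL,   -- OTC derivative or EPM instrument
        mark_to_market  REAL NOT NULL,   -- exposure before collateral offsets
        collateral_held REAL NOT NULL,   -- collateral offsetting the exposure
        loaded_at       TEXT NOT NULL,   -- audit timestamp; rows are never updated
        PRIMARY KEY (as_of_date, counterparty_id, instrument_id)
    )
""")

def historical_exposure(as_of_date: str, counterparty_id: str) -> float:
    """Reproduce the net exposure to one counterparty as it stood on a past date."""
    row = conn.execute(
        """
        SELECT SUM(MAX(mark_to_market - collateral_held, 0))
        FROM exposure_snapshot
        WHERE as_of_date = ? AND counterparty_id = ?
        """,
        (as_of_date, counterparty_id),
    ).fetchone()
    return row[0] or 0.0
```

Because each day’s rows are written once and never updated, a request for the exposure reported on any past date can be answered from the same records that produced the original figure, rather than reconstructed from whatever the current systems happen to hold.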

Recent discussions during the MiFID Forum regarding updated transaction reporting requirements under the second iteration of the directive have also strayed into data quality territory. Dario Crispini, manager of the Transaction Reporting Unit of the FSA, indicated that the regulator is now contemplating mandating that firms appoint a data governance officer and introduce a data assurance programme to ensure that standards of data quality are being maintained.

Solvency II is asking the buy side and insurance industries to prove their data quality on the basis of three criteria: “accuracy”, “completeness” and “appropriateness”. The onus is on firms’ boards to be able to establish a data governance framework that ensures data quality is sufficient to “meet their management information needs,” according to the Committee of European Insurance and Occupational Pensions Supervisors (CEIOPS).

In the US, Dodd-Frank contains numerous references to data standardisation and data quality – if you’ve read any of my recent blogs on the Office of Financial Research (OFR), you’ll be well aware of many of the details. The Commodity Futures Trading Commission (CFTC) has been particularly active in raising the data standardisation issue.

CFTC commissioner Scott O’Malia’s recent speech to the FIA Clearing conference, for example, illustrates this focus on quality and standards: “Data is the foundation of the Commission’s surveillance programs and the new swap data repositories are where all of the data will come together. Under Dodd-Frank, the Commission is charged with collecting, aggregating, monitoring, screening, and analysing data across swaps and futures markets for many purposes including enhanced surveillance capabilities aimed at identifying potential market disruptions and violations of the Commodity Exchange Act. Our current information systems are largely dependent upon the analytical capabilities and advanced computer driven surveillance technology currently provided by the exchanges and self-regulatory organisations. This is unacceptable. We must develop our own analytical capabilities and ensure that the data that comes in to the Commission is of the highest quality. This is especially critical as this data may ultimately be disseminated to the public.”

I could go on…

As for the JWG event, Davies, who has previously been vocal about the importance of data quality for risk management purposes, noted that RBS is continuing to focus on the space over the course of this year with its One Risk programme. The overall aim of the project is to enhance the data underlying the risk management, finance and treasury functions across RBS, so that all are reading from the same data hymnbook.

One can expect to see many more of these projects kicked off this year, given the common regulatory refrain calling for a tuning up of data governance programmes across the industry.

