A Recurring Refrain

Pattern recognition is obviously a handy skill to have when dabbling in the financial markets, but it’s not one often applied to regulatory developments. However, the eagle-eyed among you may have spotted a theme, a recurring refrain, running through a whole host of recent regulatory papers, including the MiFID Review, UCITS IV, Basel III, Solvency II and any number of UK Financial Services Authority (FSA) papers: references to the accuracy, consistency, timeliness, reliability and completeness of data are present in all of them.

Last week’s JWG event on the risk data quality imperatives underlying current regulatory requirements highlighted this perfectly, as panellists including Mark Davies, head of reference data for SSF Risk Services at RBS, discussed the impact that this barrage of new requirements is having on the lives of data managers. It seems these requirements are proving both a blessing and a curse to those in the data management function, given the number of “unknown unknowns” that exist in the market with regard to “what good looks like” (a common refrain over recent months from industry participants and from JWG’s own CEO, PJ Di Giammarino).

To put it all into context, Di Giammarino referenced a number of recently published regulatory papers that propose new data quality checks, including the FSA’s recent guidance on operational risk standards. This paper in particular highlights the importance of the “clarity, quality and accuracy” of the data inputs to risk management systems and suggests regular data quality testing.
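For the technically minded, the sort of regular data quality testing the guidance gestures at might look something like the sketch below. It is purely illustrative (the counterparty_id, exposure and as_of_date columns and the one-day staleness threshold are my own assumptions, not anything prescribed by the FSA), but it shows how completeness, plausibility and staleness checks over risk-system inputs can be scripted and run on a schedule.

    # Illustrative only: hypothetical columns and thresholds, not an FSA-prescribed test.
    import pandas as pd

    def run_quality_checks(risk_inputs: pd.DataFrame) -> dict:
        """Return simple completeness, plausibility and staleness metrics for risk-system inputs."""
        results = {}
        # Completeness: share of populated values per column.
        results["completeness"] = (1 - risk_inputs.isna().mean()).to_dict()
        # Plausibility (a crude accuracy proxy): exposures should not be negative.
        results["negative_exposures"] = int((risk_inputs["exposure"] < 0).sum())
        # Timeliness: flag records more than one day old as stale.
        age = pd.Timestamp.now() - pd.to_datetime(risk_inputs["as_of_date"])
        results["stale_records"] = int((age > pd.Timedelta(days=1)).sum())
        return results

    # Example run on a toy extract.
    sample = pd.DataFrame({
        "counterparty_id": ["CP001", "CP002", None],
        "exposure": [1_250_000.0, -5_000.0, 310_000.0],
        "as_of_date": ["2011-05-20", "2011-05-20", "2011-05-10"],
    })
    print(run_quality_checks(sample))

The output is a handful of numbers a data manager could track over time, which is arguably closer to “what good looks like” than a one-off audit.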

Di Giammarino also discussed a recent survey conducted by JWG in which all 16 participant firms said data quality was of paramount importance. The industry and the regulatory community have both woken up to the issue of data quality, but there are significant gaps around risk-related regulation that could prove to be pitfalls in the near future.

For another example of buy side focused, data quality specific regulation in the pipeline, look at the Committee of European Securities Regulators’ (CESR, now succeeded by ESMA) paper on counterparty risk data requirements for UCITS from August last year. These level three guidelines, which are to accompany the level two implementing measures for UCITS, are much more prescriptive in a data context than ever before and refer directly to the “accuracy”, “completeness” and “consistency, timeliness and reliability” of data. The European regulator is asking firms to be able to produce historical data on demand and to back up their calculations of counterparty risk for these instruments with much more granular data than previously required.
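As a purely illustrative aside, producing historical data on demand implies keeping a dated history of counterparty exposures and being able to reconstruct a snapshot for any requested point in time. The sketch below assumes a simple history table (the counterparty_id, as_of_date and derivative_exposure columns are hypothetical) and returns the latest record per counterparty on or before a given date; it is one minimal way of answering that sort of request, not a description of what the regulator mandates.

    # Illustrative only: a hypothetical history table and a point-in-time lookup.
    import pandas as pd

    def snapshot_as_of(history: pd.DataFrame, as_of: str) -> pd.DataFrame:
        """Return the latest record per counterparty on or before the requested date."""
        cutoff = pd.Timestamp(as_of)
        eligible = history[pd.to_datetime(history["as_of_date"]) <= cutoff]
        # Consistency: one row per counterparty, taking the most recent record.
        return (eligible.sort_values("as_of_date")
                        .groupby("counterparty_id", as_index=False)
                        .last())

    history = pd.DataFrame({
        "counterparty_id": ["CP001", "CP001", "CP002"],
        "as_of_date": ["2011-03-31", "2011-04-29", "2011-04-29"],
        "derivative_exposure": [2_400_000.0, 2_150_000.0, 880_000.0],
    })
    print(snapshot_as_of(history, "2011-04-30"))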

Recent discussions during the MiFID Forum regarding updated transaction reporting requirements under the second iteration of the directive have also strayed into data quality territory. Dario Crispini, manager of the Transaction Reporting Unit of the FSA, indicated that the regulator is now contemplating mandating that firms appoint a data governance officer and introduce a data assurance programme to ensure that standards of data quality are being maintained.

Solvency II is asking the buy side and insurance industries to prove their data quality against three criteria: “accuracy”, “completeness” and “appropriateness”. The onus is on firms’ boards to establish a data governance framework that ensures data quality is sufficient to “meet their management information needs,” according to the Committee of European Insurance and Occupational Pensions Supervisors (CEIOPS).

In the US, Dodd-Frank contains numerous references to data standardisation and data quality – if you’ve read any of my recent blogs on the Office of Financial Research (OFR), you’ll be well aware of many of the details. The Commodity Futures Trading Commission (CFTC) has been particularly active in raising the data standardisation issue.

CFTC commissioner Scott O’Malia’s recent speech to the FIA Clearing conference, for example, illustrates this focus on quality and standards: “Data is the foundation of the Commission’s surveillance programs and the new swap data repositories are where all of the data will come together. Under Dodd Frank, the Commission is charged with collecting, aggregating, monitoring, screening, and analysing data across swaps and futures markets for many purposes including enhanced surveillance capabilities aimed at identifying potential market disruptions and violations of the Commodity Exchange Act. Our current information systems are largely dependent upon the analytical capabilities and advanced computer driven surveillance technology currently provided by the exchanges and self-regulatory organisations. This is unacceptable. We must develop our own analytical capabilities and ensure that the data that comes into the Commission is of the highest quality. This is especially critical as this data may ultimately be disseminated to the public.”

I could go on…

As for the JWG event, Davies, who has previously been vocal about the importance of data quality for risk management purposes, noted that RBS is continuing to focus on the space over the course of this year with its One Risk programme. The overall aim of the project is to enhance the data underlying the risk management, finance and treasury functions across RBS, so that all are reading from the same data hymnbook.

One can expect to see many more of these projects being kicked off this year, given the common regulatory refrain that is calling for a tuning up of data governance programmes across the industry.
