
Machine-Executable Regulations Promise Improved Data Quality and Efficiency in Reporting

Ensuring data quality for regulatory reporting is an ongoing headache for many financial institutions, but solutions that could ease the pain are emerging. We talk to Sassan Danesh, a member of the Derivatives Service Bureau (DSB) management team, about approaches to regulatory data quality issues using machine-executable validation rules ahead of his keynote at next week’s A-Team Group Data Management Summit London.

A-Team: First, what are the problems of deciphering regulatory requirements?

Sassan: Moving from human-readable to machine-executable regulations involves a complex transition from natural language to a structured set of rules. Where paper-based text can use the full breadth of language and allow for interpretation, a rules-based definition provides less room for elaboration. Of course, existing regulations have to be very carefully constructed, but the machine-readable medium is less forgiving of data variations, so the process of rule building demands more precision.

Another factor is that regulations cannot be boiled down to just a message layout and a set of validations. Regulations have a scope and a hierarchy, along with triggers, conditions and timings. All these elements need to be captured in a machine-readable definition in a way that is clear and unambiguous.
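
To make this concrete, the sketch below shows one way scope, triggers, timings and validations could be captured together in a single machine-readable rule definition. It is a minimal Python illustration only; the field names and structure are assumptions made for this article, not the DSB’s or any regulator’s actual schema.

```python
# Illustrative sketch only: the field names and structure below are
# assumptions, not any regulator's or the DSB's actual schema.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class ReportingRule:
    """A single machine-readable reporting rule."""
    rule_id: str                  # unique identifier for the rule
    scope: str                    # which reports/instruments it applies to
    trigger: str                  # the event that makes reporting obligatory
    deadline: str                 # timing requirement, e.g. "T+1"
    validations: List[Callable[[Dict], bool]] = field(default_factory=list)


# Example: a hypothetical rule for OTC derivative reports.
otc_rule = ReportingRule(
    rule_id="RTS-EXAMPLE-001",
    scope="OTC derivatives",
    trigger="trade execution",
    deadline="T+1",
    validations=[
        lambda report: len(report.get("isin", "")) == 12,  # ISIN is 12 characters
        lambda report: report.get("notional", 0) > 0,      # notional must be positive
    ],
)
```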

The final consideration is flexibility. Regulations change in line with market conditions and it is important that the structure of the rules is not so strict as to preclude the evolution of those rules over time.

A-Team: Moving on, what are the challenges of data quality in regulatory reporting?

Sassan: The subject of data quality is very familiar to the DSB through its maintenance of OTC ISIN product definitions, and is driven by the need to ensure that received data is accurate, consistent and comprehensive. To ensure regulatory conformance, any machine-executable solution needs to be built on a solid foundation of standards-based data elements (e.g. ISO) with a clearly defined source and meaning.

A-Team: How can machine-executable validation rules help solve the data quality problem?

Sassan: The validation process is the key to maintaining data quality – measuring any regulatory report against the defined requirements and rejecting any message that does not satisfy those rules. This is one of the most significant advantages of machine-executable rules since bad data will not pass rigorous and well-defined validation rules and cannot be included in any analysis of the resultant dataset.
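
As a simple illustration of that validate-and-reject step, the following sketch applies a set of machine-executable checks to an incoming report and accepts it only if every check passes. The report fields and the checks themselves are illustrative assumptions, not a published rule set.

```python
# A minimal sketch of the validate-and-reject step; the report fields and
# checks are illustrative assumptions, not a published rule set.
from typing import Callable, Dict, List

Validation = Callable[[Dict], bool]

VALIDATIONS: List[Validation] = [
    lambda r: len(r.get("isin", "")) == 12,   # ISIN must be 12 characters
    lambda r: r.get("notional", 0) > 0,       # notional must be positive
    lambda r: r.get("currency") is not None,  # currency code must be present
]


def validate_report(report: Dict) -> bool:
    """Accept the report only if it passes every machine-executable check."""
    return all(check(report) for check in VALIDATIONS)


report = {"isin": "EZ1234567890", "notional": 5_000_000, "currency": "EUR"}
print("accepted" if validate_report(report) else "rejected")
```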

In addition, by imposing a degree of standardisation on the required data elements, it is possible to provide a common view across different datasets (e.g. different asset classes), which should lead to improvements in matching the same data reported by independent parties.
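
A brief sketch of how that matching might work in practice: once both counterparties report against the same standardised identifiers, their submissions can be paired on a shared key and compared field by field. The field names below are assumptions made for illustration.

```python
# Illustrative only: matching two counterparties' reports on a shared,
# standardised identifier. The field names here are assumptions.
from collections import defaultdict

reports = [
    {"reporter": "Bank A", "trade_id": "UTI-001", "isin": "EZ1234567890", "notional": 5_000_000},
    {"reporter": "Bank B", "trade_id": "UTI-001", "isin": "EZ1234567890", "notional": 5_000_000},
]

# Group the submissions by the shared trade identifier...
by_trade = defaultdict(list)
for r in reports:
    by_trade[r["trade_id"]].append(r)

# ...and flag any trade where the two sides disagree on the standardised fields.
for trade_id, sides in by_trade.items():
    matched = len({(s["isin"], s["notional"]) for s in sides}) == 1
    print(trade_id, "matched" if matched else "mismatched")
```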

Perhaps most importantly, the migration to machine-executable rules removes subjectivity from the process and, by imposing a precise structure, ensures an objective response with no room for misinterpretation.

A-Team: What are the data requirements of such validation rules?

Sassan: The specific requirements for the construction of regulatory validation rules depend on the individual report, but there are some general principles. The first is that it is important to base validation rules on standard values that are readily available to, and adopted by, institutions and reporting parties. As above, ISO provides a good starting point and covers most aspects of the financial industry, including currency codes, classifications, instrument identifiers and more.
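
As an illustration of validating against such standard code lists, the sketch below checks a report’s currency, classification and instrument identifier against small sample subsets of ISO 4217 (currency codes), ISO 10962 (CFI classifications) and ISO 6166 (ISINs). The subsets are deliberately tiny samples, not the complete lists.

```python
# A sketch of validating against standard code lists; the subsets below are
# tiny samples for illustration, not complete ISO lists.
ISO_4217_CURRENCIES = {"EUR", "GBP", "USD", "JPY"}   # ISO 4217 sample
ISO_10962_CFI_PREFIXES = {"S", "D", "E"}             # ISO 10962 CFI category sample


def uses_standard_values(report: dict) -> bool:
    """Check that the reported values come from the standard code lists."""
    return (
        report.get("currency") in ISO_4217_CURRENCIES
        and report.get("cfi", " ")[0] in ISO_10962_CFI_PREFIXES
        and len(report.get("isin", "")) == 12        # ISO 6166 ISIN length
    )


print(uses_standard_values({"currency": "EUR", "cfi": "SRCCSP", "isin": "EZ1234567890"}))
```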

To promote accuracy and avoid data duplication, it is also recommended that identifiers are requested in place of their constituent attributes. For example, if a bond’s ISIN reflects its issuer, coupon and maturity date, then the regulation needs only to request the ISIN. The ISIN acts as a shorthand for the bond’s primary attributes and forms the basis of data aggregation.
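
The sketch below illustrates the point: the report carries only the ISIN, and the constituent attributes are resolved from reference data when needed. The reference data mapping here is a made-up example.

```python
# Illustrative sketch of the "request the identifier, not its attributes"
# point: the reference data mapping below is a made-up example.
REFERENCE_DATA = {
    # ISIN -> attributes the identifier already implies (issuer, coupon, maturity)
    "XS0000000001": {"issuer": "Example Issuer", "coupon": 1.25, "maturity": "2030-06-15"},
}

# The report carries only the ISIN...
report = {"isin": "XS0000000001", "quantity": 10_000}

# ...and the constituent attributes are resolved from reference data when needed.
bond = REFERENCE_DATA[report["isin"]]
print(f"{report['isin']}: {bond['issuer']} {bond['coupon']}% due {bond['maturity']}")
```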

A-Team: What technologies are helpful here?

Sassan: Common standards for data and data management are helpful, while open standards for data ensure that distribution and consumption are simple for all parties involved. With regard to data management, the storage, editing and transformation of machine-readable validation rules is a key underlying technology. Any tool that is used should be built on an open standard that can be developed and enhanced to support the needs of the industry.
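
By way of illustration, one open, machine-readable format in which such rules could be expressed is JSON Schema; the example below uses the third-party jsonschema Python package to apply a schema to a report. This is our own example of the general idea, not a format the DSB or any regulator has specified.

```python
# One way rules could be carried in an open, machine-readable format: this
# example uses JSON Schema (via the third-party jsonschema package) purely
# as an illustration, not as a format mandated by the DSB or any regulator.
from jsonschema import validate, ValidationError

REPORT_SCHEMA = {
    "type": "object",
    "required": ["isin", "notional", "currency"],
    "properties": {
        "isin": {"type": "string", "minLength": 12, "maxLength": 12},
        "notional": {"type": "number", "exclusiveMinimum": 0},
        "currency": {"type": "string", "enum": ["EUR", "GBP", "USD"]},  # sample list
    },
}

try:
    validate(
        instance={"isin": "EZ1234567890", "notional": 5_000_000, "currency": "EUR"},
        schema=REPORT_SCHEMA,
    )
    print("report conforms to the published schema")
except ValidationError as err:
    print(f"report rejected: {err.message}")
```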

A-Team: Will these types of validation rules be adopted across capital markets?

Sassan: Yes, we believe the industry would be quick to adopt regulatory rules in a machine-readable format. Today, many capital markets organisations are transforming regulatory text into ‘code’ to support machine enforcement of validation rules. Receiving these rules in a machine-readable format from the outset would simplify that work and drive cost efficiencies for all participants.

At present, it appears that regulators are actively exploring the benefits of defining machine-executable rules and examining the way in which they can best be implemented. Given this, we would expect to see a multi-phase migration towards the new paradigm across a number of jurisdictions within the next few years.

A-Team: Finally, please sum up the key benefits of implementing machine-executable validation rules.

Sassan: The definition of a structured reporting mechanism means users know precisely what they are expected to report, and the validation ensures strict compliance with the regulation. This leads to a more efficient reporting process, improved data quality, and a better foundation for aggregating data, spotting trends and identifying risk.

It should also be noted that with a more structured approach, the regulator can change the rules relatively easily without going through the lengthy process of rewriting/republishing screeds of text. In effect, any change would be a technical update. Additionally, it becomes quite simple to test and refine the rules before they are implemented.
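
As a small illustration of that last point, machine-executable rules can be exercised with ordinary unit tests before they go live. The rule and test cases below are illustrative assumptions.

```python
# A sketch of how machine-executable rules could be tested before rollout;
# the rule and the test cases are illustrative assumptions.
def notional_is_positive(report: dict) -> bool:
    return report.get("notional", 0) > 0


def test_rule_before_rollout():
    assert notional_is_positive({"notional": 1_000_000})
    assert not notional_is_positive({"notional": -5})
    assert not notional_is_positive({})  # missing field should fail


if __name__ == "__main__":
    test_rule_before_rollout()
    print("all rule tests passed")
```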

Find out more about machine-executable regulations at A-Team Group’s 11 May 2022 Data Management Summit London.
