
Machine-Executable Regulations Promise Improved Data Quality and Efficiency in Reporting

Ensuring data quality for regulatory reporting is an ongoing headache for many financial institutions, but solutions that could ease the pain are emerging. We talk to Sassan Danesh, a member of the Derivatives Service Bureau (DSB) management team, about approaches to regulatory data quality issues using machine-executable validation rules ahead of his keynote at next week’s A-Team Group Data Management Summit London.

A-Team: First, what are the problems of deciphering regulatory requirements?

Sassan: Moving from human-readable to machine-executable regulations involves a complex transition from natural language to a structured set of rules. Where paper-based text can use the full breadth of language and allow for interpretation, a rules-based definition provides less room for elaboration. Of course, existing regulations have to be very carefully constructed, but the machine-readable medium is less forgiving of data variations, so the process of rule building demands more precision.

Another factor is that regulations cannot be boiled down to just a message layout and a set of validations. Regulations have a scope and a hierarchy, along with triggers, conditions and timings. All these elements need to be captured in a machine-readable definition in a way that is clear and unambiguous.
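
By way of illustration, the sketch below shows one way these elements (scope, trigger, conditions and timing) might be captured as data. The schema, field names and values are hypothetical, not the DSB’s or any regulator’s actual format.

```python
# Hypothetical sketch of a machine-readable reporting rule: scope, trigger,
# conditions and timing captured as structured data rather than prose.
from dataclasses import dataclass


@dataclass
class ReportingRule:
    rule_id: str
    scope: dict            # which entities/instruments the rule applies to
    trigger: str           # the event that makes a report due
    conditions: list       # field-level checks the report must satisfy
    deadline_minutes: int  # timing: how soon after the trigger the report is due


# Illustrative example: report OTC rates trades within 15 minutes of execution.
otc_trade_rule = ReportingRule(
    rule_id="EXAMPLE-001",
    scope={"asset_class": "Rates", "venue": "OTC"},
    trigger="trade_executed",
    conditions=[
        {"field": "notional_currency", "check": "iso_4217"},
        {"field": "isin", "check": "present"},
    ],
    deadline_minutes=15,
)
```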

The final consideration is flexibility. Regulations change in line with market conditions and it is important that the structure of the rules is not so strict as to preclude the evolution of those rules over time.

A-Team: Moving on, what are the challenges of data quality in regulatory reporting?

Sassan: Data quality is a very familiar subject to the DSB through its maintenance of OTC ISIN product definitions, where the priority is to ensure that received data is accurate, consistent and comprehensive. To ensure regulatory conformance, any machine-executable solution needs to be built on a solid foundation of standards-based data elements (e.g. ISO) with a clearly defined source and meaning.

A-Team: How can machine-executable validation rules help solve the data quality problem?

Sassan: The validation process is the key to maintaining data quality – measuring any regulatory report against the defined requirements and rejecting any message that does not satisfy those rules. This is one of the most significant advantages of machine-executable rules: bad data will not pass rigorous, well-defined validation and therefore cannot be included in any analysis of the resultant dataset.
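
The accept/reject step can be illustrated with a minimal sketch, assuming a simple, hypothetical rule format rather than any real regulator’s schema.

```python
# Minimal sketch of the validation step: each incoming report is checked against
# machine-executable rules and rejected if any check fails. Field names, checks
# and sample values are illustrative only.

def validate(report: dict, rules: list) -> list:
    """Return a list of error messages; an empty list means the report passes."""
    errors = []
    for rule in rules:
        field = rule["field"]
        value = report.get(field)
        if rule["check"] == "present" and not value:
            errors.append(f"{field} is missing")
        elif rule["check"] == "one_of" and value not in rule["values"]:
            errors.append(f"{field}={value!r} is not an allowed value")
    return errors


rules = [
    {"field": "isin", "check": "present"},
    {"field": "notional_currency", "check": "one_of", "values": {"EUR", "USD", "GBP"}},
]

report = {"isin": "EZ0000000000", "notional_currency": "JPY"}  # dummy data
problems = validate(report, rules)
print("REJECTED" if problems else "ACCEPTED", problems)
```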

In addition, by imposing a degree of standardisation on the required data elements, it is possible to provide a common view across different datasets (e.g. different asset classes), which should also lead to improvements in matching the same data reported by independent parties.
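
As a toy illustration, if both counterparties report the same trade using the same standardised elements, pairing the two sides reduces to a key comparison. The field names (UTI, ISIN, currency) and values below are hypothetical.

```python
# Illustrative sketch: matching the two reported sides of a trade on
# standardised keys. All identifiers and values are dummy data.

def match_key(report: dict) -> tuple:
    return (report["uti"], report["isin"], report["notional_currency"])


party_a_reports = [{"uti": "UTI123", "isin": "EZ0000000000", "notional_currency": "EUR"}]
party_b_reports = [{"uti": "UTI123", "isin": "EZ0000000000", "notional_currency": "EUR"}]

matched = {match_key(r) for r in party_a_reports} & {match_key(r) for r in party_b_reports}
print(f"{len(matched)} trade(s) matched on standardised keys")
```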

Perhaps most importantly, the migration to machine-executable rules removes subjectivity from the process and, by imposing a precise structure, ensures an objective response with no room for misinterpretation.

A-Team: What are the data requirements of such validation rules?

Sassan: The specific requirements for the construction of regulatory validation rules are dependent on the individual report, but there are some general principles. The first is that it is important to base validation rules on standard values that are available and adopted by institutions or reporting parties. As above, ISO provides a good starting point and covers most aspects of the financial industry including currency codes, classifications, instrument identifiers and more.
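
A rule built on ISO reference data rather than free text might look like the sketch below; the code lists are tiny excerpts for illustration and the field names are hypothetical.

```python
# Illustrative rule using ISO reference data: the currency must be an ISO 4217
# code and the classification must match the ISO 10962 CFI format (six letters).
# The currency set is a small excerpt, not the full standard.
import re

ISO_4217_CURRENCIES = {"EUR", "USD", "GBP", "JPY"}   # excerpt only
CFI_PATTERN = re.compile(r"^[A-Z]{6}$")              # CFI codes are six letters


def check_standard_values(report: dict) -> list:
    errors = []
    if report.get("notional_currency") not in ISO_4217_CURRENCIES:
        errors.append("notional_currency is not a recognised ISO 4217 code")
    if not CFI_PATTERN.match(report.get("cfi_code", "")):
        errors.append("cfi_code does not match the ISO 10962 CFI format")
    return errors


print(check_standard_values({"notional_currency": "EUR", "cfi_code": "SRCCSP"}))  # []
```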

To promote accuracy and avoid data duplication, it is also recommended that identifiers are requested in place of their constituent attributes. For example, if a bond’s ISIN reflects its issuer, coupon and maturity date, then the regulation need only request the ID. The ISIN acts as a shorthand for the bond’s primary attributes and forms the basis of data aggregation.
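
One structural check that can be run on the identifier itself is the ISO 6166 format and check digit; the sketch below is a minimal illustration rather than production validation code.

```python
# Minimal sketch of an ISIN structural check: ISO 6166 format plus the Luhn
# check digit computed over the letter-expanded digits.
import re


def isin_is_valid(isin: str) -> bool:
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}\d", isin):
        return False
    digits = "".join(str(int(c, 36)) for c in isin)  # letters expand to 10..35
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:               # double every second digit from the right
            d = d * 2 - 9 if d * 2 > 9 else d * 2
        total += d
    return total % 10 == 0


print(isin_is_valid("US0378331005"))  # True: a well-known, valid equity ISIN
print(isin_is_valid("US0378331006"))  # False: wrong check digit
```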

A-Team: What technologies are helpful here?

Sassan: Common standards for data and data management are helpful, while open standards for data ensure that distribution and consumption are simple for all parties involved. With regard to data management, the storage, editing and transformation of machine-readable validation rules are key underlying technologies. Any tool that is used should be based on open standards that can be developed and enhanced to support the needs of the industry.
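
To illustrate the point about open formats, the rules themselves can be treated as data that any party can store, edit and consume; the JSON schema below is hypothetical rather than an existing industry standard.

```python
# Illustrative sketch: validation rules distributed as JSON text (an open
# format) and consumed as structured data. The schema is hypothetical.
import json

rules_json = """
[
  {"field": "isin", "check": "present"},
  {"field": "notional_currency", "check": "one_of", "values": ["EUR", "USD", "GBP"]}
]
"""

rules = json.loads(rules_json)   # distributed as text, consumed as structure
for rule in rules:
    print(rule["field"], "->", rule["check"])
```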

A-Team: Will this type of validation rule be adopted across capital markets?

Sassan: Yes, we believe the industry would be quick to adopt regulatory rules in a machine-readable format. Today, many capital markets organisations are transforming regulatory text into ‘code’ to support machine enforcement of validation rules. Receiving these rules in a machine-readable format would simplify the process and drive cost efficiencies for all participants.

At present, it appears that regulators are actively exploring the benefits of defining machine-executable rules and examining the way in which they can best be implemented. Given this, we would expect to see a multi-phase migration towards the new paradigm across a number of jurisdictions within the next few years.

A-Team: Finally, please sum up the key benefits of implementing machine-executable validation rules.

Sassan: The definition of a structured reporting mechanism means users know precisely what they are expected to report, while validation ensures strict compliance with the regulation. This leads to a more efficient reporting process, improved data quality, and a better foundation for aggregating data, spotting trends and identifying risk.

It should also be noted that with a more structured approach, the regulator can change the rules relatively easily without going through the lengthy process of rewriting/republishing screeds of text. In effect, any change would be a technical update. Additionally, it becomes quite simple to test and refine the rules before they are implemented.
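
As a simple illustration of that testing step, a candidate rule change can be run against sample reports before it goes live; the checker, field names and sample data below are all hypothetical.

```python
# Illustrative sketch: testing a candidate rule change against sample reports
# before implementation. All names and data are dummy values.

def passes(report: dict, rules: list) -> bool:
    return all(report.get(r["field"]) for r in rules if r["check"] == "present")


candidate_rules = [
    {"field": "isin", "check": "present"},
    {"field": "uti", "check": "present"},   # newly added requirement under test
]

sample_reports = [
    {"isin": "EZ0000000000", "uti": "UTI123"},   # expected to pass
    {"isin": "EZ0000000000"},                    # expected to fail: no UTI
]

results = [passes(r, candidate_rules) for r in sample_reports]
assert results == [True, False], "candidate rule set behaves unexpectedly"
print("candidate rule set behaves as expected on the sample reports")
```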

Find out more about machine-executable regulations at A-Team Group’s 11 May 2022 Data Management Summit London.
