About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Datactics Extends FlowDesigner to Deliver DQ Metrics for Regulatory Data Compliance


Datactics has extended its FlowDesigner data quality platform to offer DQ Metrics, a monitoring and measurement application that allows financial institutions to measure their data compliance against regulations and industry standards on a continuous basis.

The company’s initial release of DQ Metrics covers entity identifiers, particularly LEIs required by MiFID II and EMIR, and GIIN numbers required by FATCA. It also incorporates data quality dimensions such as accuracy, timeliness and completeness that are specified in the Enterprise Data Management Council’s Data Capability Assessment Model, although as a rules- and data-agnostic solution it can import rules from any rules repository, data dictionary or data governance system.
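To make the entity-identifier checks concrete, here is a minimal sketch in Python of the kind of format rules involved. The function names are illustrative, not Datactics APIs; the LEI structure and its ISO 7064 MOD 97-10 check digits are defined in ISO 17442, while the GIIN check covers layout only (a production system would also verify identifiers against the GLEIF and IRS registries).

```python
import re

def lei_is_valid(lei: str) -> bool:
    """Check an LEI against the ISO 17442 layout and its MOD 97-10 check digits."""
    # 18 alphanumeric characters followed by 2 numeric check digits
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # ISO 7064 MOD 97-10: map A..Z to 10..35, then the whole number mod 97 must equal 1
    digits = "".join(str(int(c, 36)) for c in lei)
    return int(digits) % 97 == 1

def giin_matches_format(giin: str) -> bool:
    """Check a GIIN against the published XXXXXX.XXXXX.XX.XXX layout (format only)."""
    return re.fullmatch(r"[A-Z0-9]{6}\.[A-Z0-9]{5}\.[A-Z]{2}\.[0-9]{3}", giin) is not None
```

Rules of this kind are cheap enough to run across an entire reference data set rather than a sample, which is the distinction the article draws below.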

Similarly, data quality and compliance rules can be built by data managers and applied to compliance processes and operational practices, allowing firms to add rules to DQ Metrics as they are issued by regulators.
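As a rough illustration of rules held as data rather than code, so that data managers can add new rules as regulators issue them, a rule repository might look like the following. The field names, check types and record structure are assumptions for illustration, not the DQ Metrics rule format.

```python
import re

# Hypothetical rule repository: each rule is plain data tagged with the
# data quality dimension it measures, so new rules need no code changes.
RULES = [
    {"field": "lei", "check": "not_null",
     "dimension": "completeness"},
    {"field": "lei", "check": ("matches", r"[A-Z0-9]{18}[0-9]{2}"),
     "dimension": "accuracy"},
]

def apply_rules(record: dict, rules: list) -> list:
    """Evaluate every rule against one record, returning (rule, passed) pairs."""
    results = []
    for rule in rules:
        value = record.get(rule["field"])
        check = rule["check"]
        if check == "not_null":
            passed = value not in (None, "")
        else:  # ("matches", pattern)
            passed = value is not None and re.fullmatch(check[1], value) is not None
        results.append((rule, passed))
    return results
```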

The solution’s ability to handle large volumes of reference data in real time, providing continuous data quality measurement rather than relying on traditional data sampling, is based on the in-memory processing and streaming data technologies inherent to FlowDesigner. Data input to DQ Metrics can come from many data silos, while the output of data quality measurements is presented graphically on the DQ Metrics dashboard. This allows users to view an overall data quality score, data quality relevant to particular regulations, and the quality of individual data dimensions.
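The dashboard roll-up described above, in which per-rule results are aggregated into dimension scores and an overall score, can be sketched as follows. The unweighted mean is an assumption for illustration; the article does not specify how DQ Metrics weights dimensions.

```python
from collections import defaultdict

def dimension_scores(results: list) -> dict:
    """Aggregate (rule, passed) pairs into a 0-100 score per data quality dimension."""
    passed = defaultdict(int)
    total = defaultdict(int)
    for rule, ok in results:
        total[rule["dimension"]] += 1
        passed[rule["dimension"]] += bool(ok)
    return {dim: 100.0 * passed[dim] / total[dim] for dim in total}

def overall_score(scores: dict) -> float:
    """Unweighted mean across dimensions; a real dashboard might weight them."""
    return sum(scores.values()) / len(scores)
```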

Stuart Harvey, Datactics CEO, explains: “Recent regulations require banks to be able to profile and measure the quality of underlying data, yet current methods and technologies used to measure compliance are inadequate in dealing with large volumes of data and rely on sampling data. Datactics’ breakthrough in performance and matching will allow our clients to see exactly where data mistakes are impacting operations and rapidly identify the most serious problems.”

The initial release of DQ Metrics has been trialled by two European financial institutions and is available immediately. It can be deployed onsite, which Datactics expects to be the favoured option in the near to mid-term, or used as a web service. In the latter case, clients submit data to DQ Metrics, which profiles the data according to specified rules and presents its quality metrics on an online dashboard.

Looking forward, Datactics is working to extend DQ Metrics to cover the BCBS 239 regulation by creating rules that monitor and measure the quality of data at rest and data in motion. It is also discussing the addition of rules to DQ Metrics with service providers that create and maintain rules databases relating to financial entity, trade and risk data, and is considering the solution’s potential for data issuers.

