The leading knowledge platform for the financial technology industry


Broadridge Enhances Investigo Solution with Customized Data Scrubbing Capabilities

Broadridge Financial Solutions announced today the development of a flexible, two-tier Data Scrubbing Service as an enhancement to its Investigo insurance and investment data aggregation solution.

This enhancement adds a flexible, rules-based data scrubbing engine to the already robust aggregation capabilities of the Investigo solution. The new Investigo data scrubbing engine identifies unreconciled positions and cash balance conditions and validates specific elements of the custodial data.

Failed validations can be held from posting to the Investigo database until they have been analyzed by a dedicated team of operations personnel who identify the root cause of the failure. Appropriate action is then taken to address the anomalies in the data. Once corrected, the data is reprocessed for posting to the Investigo database.
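Broadridge does not publish the internal design of the engine, but the hold-and-reprocess cycle described above maps naturally onto a validation pass over incoming custodial records, with failures diverted to a review queue instead of the production database. The Python sketch below is a hypothetical illustration of that pattern only; the record fields, tolerance thresholds, and the post_to_database / hold_for_review helpers are assumptions, not Investigo APIs.

```python
from dataclasses import dataclass

@dataclass
class CustodialRecord:
    """Hypothetical aggregated custodial position record."""
    account_id: str
    symbol: str
    reported_quantity: float
    reconciled_quantity: float
    cash_balance: float
    expected_cash_balance: float

def validate(record: CustodialRecord) -> list[str]:
    """Return the list of validation failures; an empty list means the record is clean."""
    failures = []
    if abs(record.reported_quantity - record.reconciled_quantity) > 1e-6:
        failures.append("unreconciled position")
    if abs(record.cash_balance - record.expected_cash_balance) > 0.01:
        failures.append("cash balance does not tie out")
    if not record.symbol:
        failures.append("missing security identifier")
    return failures

def process(records, post_to_database, hold_for_review):
    """Post clean records; hold failing records from posting until they are analyzed."""
    for record in records:
        failures = validate(record)
        if failures:
            # Held for an operations team to find the root cause, then
            # reprocessed through this same function once corrected.
            hold_for_review(record, failures)
        else:
            post_to_database(record)
```

In this arrangement, reprocessing a corrected record simply means running it back through process(), mirroring the article's description of held data being reposted once the anomaly is addressed.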

“Quality data is the key to successful back-office operations and advisor client interactions. Enhancing data quality encompasses more than just finding and fixing missing or inaccurate data elements. It means delivering comprehensive, consistent, relevant, and timely data for fulfilling mission-critical needs,” said Kevin Normoyle, President – Securities Processing Solutions U.S., Broadridge. He continued, “Poor data quality costs financial services firms and their advisors vast amounts of money, leading to poor decisions and inferior customer service. The Investigo Data Scrubbing Service addresses the most pervasive and challenging data quality issues confronting data aggregation solution providers and the firms they serve.”

The Investigo Data Scrubbing Service comprises two tiers:

Tier One: A flexible, rules-based, automated data scrubbing engine. It automatically identifies unreconciled transactions and missing data elements and applies fixes, when possible, based on a configurable set of rules.

Tier Two: A manual data scrubbing service performed by a team of dedicated data stewards. Failed validations that cannot be corrected automatically by the data scrubbing engine are held from posting to the database until the failure is analyzed. Once the appropriate action is taken and the failure is corrected, the data is reprocessed for posting to the database.
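The announcement does not specify how the two tiers hand records to one another. As a rough illustration under that caveat, the sketch below routes each record through a set of configurable tier-one fix rules and queues anything still failing validation for a tier-two data steward; the rule format, the fill_missing_currency example, and the manual queue are illustrative assumptions rather than Investigo internals.

```python
from typing import Callable, Optional

# A tier-one rule: given a record (a dict of field values), return a corrected
# copy if it can repair something, or None if it does not apply.
FixRule = Callable[[dict], Optional[dict]]

def fill_missing_currency(record: dict) -> Optional[dict]:
    """Example of a configurable rule (hypothetical): default a missing currency code."""
    if not record.get("currency"):
        return {**record, "currency": "USD"}
    return None

def tier_one(record: dict, rules: list[FixRule]) -> dict:
    """Apply every automated fix rule that matches; return the possibly corrected record."""
    for rule in rules:
        fixed = rule(record)
        if fixed is not None:
            record = fixed
    return record

def scrub(records, rules, validate, post, manual_queue):
    """Two-tier scrub: auto-fix where possible, otherwise hold for manual review."""
    for record in records:
        if validate(record):                  # failures found: try automated fixes first
            record = tier_one(record, rules)
        if validate(record):                  # still failing: tier two takes over
            manual_queue.append(record)       # held from posting until a steward corrects it
        else:
            post(record)

# Records corrected by data stewards are later re-run through scrub() and posted.
```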

“The highest level of data quality is obtained when both tiers of the Investigo Data Scrubbing Service are used. Broadridge continues to make significant financial investments to upgrade the Investigo architecture and infrastructure to improve the responsiveness and scalability of our application. We recognize that our customers are highly dependent upon accurate data delivered in a timely manner. Our goal is for Investigo to be recognized as the ‘Gold Standard’ of data,” concluded Mr. Normoyle.
