About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Broadridge Enhances Investigo Solution with Customized Data Scrubbing Capabilities


Broadridge Financial Solutions announced today the development of a flexible two-tier Data Scrubbing Service as an enhancement to its Investigo insurance and investment data aggregation solution.

This enhancement adds a flexible, rules-based data scrubbing engine to the already robust aggregation capabilities of the Investigo solution. The new engine identifies unreconciled positions and cash balance conditions and validates specific elements of the custodial data.

Failed validations can be held from posting to the Investigo database until they have been analyzed by a dedicated team of operations personnel, who identify the root cause of each failure. Appropriate action is then taken to address the anomalies in the data. Once corrected, the data is reprocessed for posting to the Investigo database.

“Quality data is the key to successful back-office operations and advisor client interactions. Enhancing data quality encompasses more than just finding and fixing missing or inaccurate data elements. It means delivering comprehensive, consistent, relevant, and timely data for fulfilling mission-critical needs,” said Kevin Normoyle, President – Securities Processing Solutions U.S., Broadridge. He continued, “Poor data quality costs financial services firms and their advisors vast amounts of money, leading to poor decisions and inferior customer service. The Investigo Data Scrubbing Service addresses the most pervasive and challenging data quality issues confronting data aggregation solution providers and the firms they serve.”

The Investigo Data Scrubbing Service:

Tier One: A flexible, rules-based, automated data scrubbing engine. It automatically identifies unreconciled transactions and missing data elements and applies fixes, where possible, based on a configurable set of rules.

Tier Two: A manual data scrubbing service performed by a team of dedicated data stewards. Any failed validations that cannot be corrected by the automated engine can be held from posting to the database until the failure is analyzed. Once the appropriate action is taken and the failure is corrected, the data is reprocessed for posting to the database.
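Broadridge has not published Investigo's internals, but the two-tier workflow described above, run records through a configurable rule set, post the clean ones, and hold the failures for a data steward, can be illustrated with a minimal sketch. All names here (Position, RULES, scrub) are hypothetical and for illustration only:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Position:
    """A simplified custodial data record (hypothetical schema)."""
    account: str
    symbol: str
    quantity: Optional[float]
    cash_balance: Optional[float]

# Tier One: a configurable set of validation rules.
# Each rule returns an error message, or None if the record passes.
RULES = [
    lambda p: "missing quantity" if p.quantity is None else None,
    lambda p: "negative cash balance" if (p.cash_balance or 0) < 0 else None,
]

def scrub(positions):
    """Split records into those safe to post and those held for review."""
    posted, held = [], []
    for p in positions:
        errors = [msg for rule in RULES if (msg := rule(p)) is not None]
        if errors:
            held.append((p, errors))  # Tier Two: queue for a data steward
        else:
            posted.append(p)
    return posted, held

posted, held = scrub([
    Position("A1", "IBM", 100.0, 5000.0),
    Position("A2", "MSFT", None, -25.0),  # fails both rules; held from posting
])
```

Once a steward corrects a held record, it would simply be passed back through `scrub` for reprocessing, mirroring the repost step the article describes.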

“The highest level of data quality is obtained when both tiers of the Investigo Data Scrubbing Service are used. Broadridge continues to make significant financial investments to upgrade the Investigo architecture and infrastructure to improve the responsiveness and scalability of our application. We recognize that our customers are highly dependent upon accurate data delivered in a timely manner. Our goal is for Investigo to be recognized as the ‘Gold Standard’ of data,” concluded Mr. Normoyle.

