The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Broadridge Enhances Investigo Solution with Customized Data Scrubbing Capabilities

Broadridge Financial Solutions announced today the development of a flexible, two-tier Data Scrubbing Service as an enhancement to its Investigo insurance and investment data aggregation solution.

This enhancement adds a flexible, rules-based data scrubbing engine to the already robust aggregation capabilities of the Investigo solution. The new Investigo data scrubbing engine identifies unreconciled positions and cash balance conditions and validates specific elements of the custodial data.

Failed validations can be held from posting to the Investigo database until they are analyzed by a dedicated team of operations personnel who identify the root cause of the failure. Appropriate action is then taken to address the anomalies in the data. Once corrected, the data is reprocessed for posting to the Investigo database.

“Quality data is the key to successful back-office operations and advisor client interactions. Enhancing data quality encompasses more than just finding and fixing missing or inaccurate data elements. It means delivering comprehensive, consistent, relevant, and timely data for fulfilling mission-critical needs,” said Kevin Normoyle, President – Securities Processing Solutions U.S., Broadridge. He continued, “Poor data quality costs financial services firms and their advisors vast amounts of money, leading to poor decisions and inferior customer service. The Investigo Data Scrubbing Service addresses the most pervasive and challenging data quality issues confronting data aggregation solution providers and the firms they serve.”

The Investigo Data Scrubbing Service comprises two tiers:

Tier One: A flexible, rules-based, automated data scrubbing engine that automatically identifies unreconciled transactions and missing data elements and applies fixes, when possible, based on a configurable set of rules.

Tier Two: A manual data scrubbing service performed by a team of dedicated data stewards. Failed validations that cannot be corrected by the automated data scrubbing engine are held from posting to the database until the failure is analyzed. Once the appropriate action has been taken and the failure is corrected, the data is reprocessed for posting to the database.
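The two-tier flow described above can be sketched in general terms: a configurable rule set validates each incoming record, failed records are held rather than posted, and held records re-enter the pipeline once a data steward applies a correction. The sketch below is purely illustrative under those assumptions; the record fields, rules, and function names are hypothetical and are not Broadridge's actual Investigo API.

```python
# Illustrative sketch of a two-tier, rules-based data scrubbing flow.
# All names and rules here are assumptions, not the Investigo implementation.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Record:
    account: str
    position_qty: Optional[float]
    cash_balance: Optional[float]
    errors: list = field(default_factory=list)

# Tier One: a configurable set of validation rules.
# Each rule returns an error message, or None if the record passes.
RULES = [
    lambda r: "missing position quantity" if r.position_qty is None else None,
    lambda r: "negative cash balance" if (r.cash_balance or 0) < 0 else None,
]

def scrub(records):
    """Apply Tier One rules: valid records post; failures are held."""
    posted, held = [], []
    for rec in records:
        rec.errors = [e for rule in RULES if (e := rule(rec)) is not None]
        (held if rec.errors else posted).append(rec)
    return posted, held

def reprocess(held, corrections):
    """Tier Two: after a data steward supplies corrections for a held
    record, apply them and re-run the Tier One rules before posting."""
    for rec in held:
        for attr, value in corrections.get(rec.account, {}).items():
            setattr(rec, attr, value)
    return scrub(held)
```

A record such as one with a missing position quantity would be held with its error list populated; after a correction keyed by account is applied, `reprocess` sends it back through the same rule set, mirroring the hold-analyze-correct-repost cycle the release describes.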

“The highest level of data quality is obtained when both tiers of the Investigo Data Scrubbing Service are used. Broadridge continues to make significant financial investments to upgrade the Investigo architecture and infrastructure to improve the responsiveness and scalability of our application. We recognize that our customers are highly dependent upon accurate data delivered in a timely manner. Our goal is for Investigo to be recognized as the ‘Gold Standard’ of data,” concluded Mr. Normoyle.
