About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Legal and General Investment Management Picks Sun Street’s Curium for Data Quality Management


Legal and General Investment Management (LGIM) has completed its initial implementation of Curium DQM across its data management and investment operations teams in London and Chicago, as part of its drive to meet key data governance objectives and improve visibility of data across the enterprise.

At LGIM the first phase of Curium DQM adoption provides a process control platform to manage data quality over operational data sets including reference and market data. The data management team is making use of Curium DQM’s business process management tooling around data management, specifically exception and ticket management, complementing its existing capabilities.

The Curium DQM product aims to 'bring a new approach' to the data management challenge, enabling buy-side and sell-side firms to get a handle on their data quality quickly, with exception-management workflow for business users as well as data visualisation and comprehensive management information and reporting capabilities.
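Curium DQM's internals are not public, so the following is only a minimal sketch of the general exception-and-ticket pattern described above: quality rules are applied to operational records, and each failure raises a ticket that business users triage through a simple workflow. All rule names, fields, and thresholds here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    """Exception ticket raised when a record fails a quality rule."""
    record_id: str
    rule: str
    status: str = "open"  # workflow: open -> resolved

def run_checks(records, rules):
    """Apply each named rule to each record; failures become tickets."""
    tickets = []
    for rec in records:
        for name, check in rules.items():
            if not check(rec):
                tickets.append(Ticket(record_id=rec["id"], rule=name))
    return tickets

# Hypothetical reference-data rules; names and values are illustrative.
rules = {
    "price_positive": lambda r: r["price"] > 0,
    "currency_known": lambda r: r["ccy"] in {"GBP", "USD", "EUR"},
}

records = [
    {"id": "ISIN001", "price": 101.5, "ccy": "GBP"},
    {"id": "ISIN002", "price": -4.0, "ccy": "XXX"},  # fails both rules
]

tickets = run_checks(records, rules)

# Business users then work the exception queue ticket by ticket.
for t in tickets:
    t.status = "resolved"
```

In practice a platform like this would add assignment, audit trails, and management reporting on top of the queue; the sketch shows only the core rule-to-ticket flow.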

Andrew Sexton, sales and marketing director at Sun Street says, “Recent surveys have highlighted the pressing need for firms to invest in data governance and within that, data quality is often the most cited factor of concern. If firms can quickly improve their oversight of data, both in terms of mastering data and how it is consumed across the architecture, then a whole range of regulatory pressures and business risks can be minimised.”

Indeed, at our own Data Management Summit in New York last November, the Data Governance panel discussed the need for ‘operationalisation’ of data quality – and importantly, accountability of data quality – in order to improve quality and meet the various regulatory requirements.

Curium from Sun Street is an up-and-coming data management product with two core modules: DQM (Data Quality Management), which provides a data quality layer on top of any data management platform, and the newer MDM module, which delivers master data management, including the ability to run sophisticated 'what-if' data construction scenarios and data provenance analysis.
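The 'what-if' and provenance capabilities mentioned above can be illustrated with a minimal sketch, assuming the common master-data pattern of composing a golden record field by field from ranked sources: provenance records which source supplied each value, and a what-if scenario simply re-ranks the sources and rebuilds the record for comparison. The vendor names and fields are hypothetical, not Curium's actual API.

```python
def build_master(field_sources, source_ranking):
    """Compose a golden record field by field from ranked sources,
    recording which source supplied each value (provenance)."""
    golden, provenance = {}, {}
    for field, candidates in field_sources.items():
        # Take the value from the highest-ranked source that has one.
        for src in source_ranking:
            if src in candidates:
                golden[field] = candidates[src]
                provenance[field] = src
                break
    return golden, provenance

# Hypothetical candidate values per field, keyed by source.
field_sources = {
    "price": {"vendor_a": 100.2, "vendor_b": 100.5},
    "sector": {"vendor_b": "Financials"},
}

# Baseline construction under the current source ranking.
base, prov = build_master(field_sources, ["vendor_a", "vendor_b"])

# What-if scenario: prefer vendor_b and compare the resulting record.
scenario, prov2 = build_master(field_sources, ["vendor_b", "vendor_a"])
```

Diffing `base` against `scenario` (and `prov` against `prov2`) shows exactly which fields a ranking change would alter and why, which is the essence of what-if construction with provenance.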

