Regulations Pushing Basic Reference Data Management Issues up the Agenda for Asset Managers, Says MoneyMate’s Brennan

The increased media attention given to regulation and the punitive action taken by regulators as a result of data errors have flagged the issue of data management for the buy side at large, from hedge fund managers to traditional institutional investment firms, according to MoneyMate’s chief technology officer Ronan Brennan. During a recent webinar, Brennan noted that pressure to better understand market and counterparty risk, and regulators’ focus on assessing the consistency, timeliness and accuracy of basic reference data, have led many firms to invest in improving their data quality.

Last month, the data quality management solution vendor spoke to Reference Data Review about its recent survey of the investment management community, which indicated that many of these firms are now considering outsourcing as a viable option for meeting their data quality needs. However, outsourcing is only one of a whole host of options firms are considering to meet the often onerous new regulatory requirements and the focus on market and counterparty risk in the post-crisis environment. This risk focus has necessarily meant that regulators are looking for additional reporting from the buy side, with the initial focus on counterparty data, Brennan contended.

The US regulatory environment, for example, is changing dramatically within the sphere of alternatives, with hedge funds now being required to register and report more data to the Securities and Exchange Commission (SEC). “These firms need to keep their data in order and collect data from multiple siloed systems into one place for SEC inspection,” he said during the webinar.
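
How that consolidation might work is sketched below as a minimal example; the source-system names, field names and Position structure are illustrative assumptions rather than any particular firm’s or vendor’s schema.

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass
class Position:
    """One normalised record, tagged with the silo it came from."""
    source_system: str   # e.g. "order_management", "fund_admin" (hypothetical names)
    instrument_id: str
    counterparty: str
    quantity: float
    as_of_date: str      # ISO-8601 date string


def consolidate(sources: dict[str, Iterable[dict]]) -> list[Position]:
    """Collect records from multiple siloed systems into one place,
    keeping the system of origin so the data remains auditable."""
    consolidated: list[Position] = []
    for system_name, records in sources.items():
        for rec in records:
            consolidated.append(Position(
                source_system=system_name,
                instrument_id=rec["instrument_id"],
                counterparty=rec["counterparty"],
                quantity=float(rec["quantity"]),
                as_of_date=rec["as_of_date"],
            ))
    return consolidated
```

In practice a firm would also reconcile conflicting values between silos, but even the simple tagging step above makes it possible to answer the basic inspection question of where each figure came from.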

Moreover, the Dodd-Frank bill will give regulators much more power than before over the markets as a whole, especially the SEC with regard to market risk, and this will result in firms being required to provide more reports and greater data disclosure.

Europe is also moving towards the North American rules-based model of regulation and Brennan points to the French-led demands for heavy touch regulation as a case in point. He reckons one of the biggest data management challenges currently facing the buy side in Europe is UCITS IV, with the move from the simplified prospectus to the new key investor information document.

Regulators are also asking for a much broader range of data sets related to areas such as liquidity risk and counterparty exposures, which necessarily involves data quality and collection challenges. Brennan believes this will mean that security and reference master data will receive much more attention, as firms seek to ensure the data underlying these calculations is correct. He notes recent regulatory investigations into classifications under product and securities master data as a precursor of things to come, along with much more scrutiny of client-facing data.

In terms of what regulators don’t want to see “under the bonnet”, Brennan highlights manual processes, the use of Excel, temporary staff in charge of data management, undocumented or ad hoc processes and an absence of governance. “Regulators will be looking for a much more strategic approach to the management of data and will be asking how often this approach has been reassessed and how well it is governed,” he said. “This regulatory focus is in turn making firms scrutinise their operational models and then redesign them to ensure data quality.”

The impact of getting this process wrong is severe, as financial penalties and the associated loss of business due to reputational damage are an ever present threat, according to Brennan. Getting it right, however, means an increase in the accuracy of data and confidence in a firm’s client service capabilities, as well as cost savings that allow a reduction in headcount or the reallocation of resources. On a basic level, it also means less reprinting of key documents and gives the buy side more flexibility in its vendor relationships by providing the comparability of a single source.

Brennan recommends that these firms first establish clear ownership of and accountability for data quality, right back to the data source. “Data management is a function of the middle office and should not reside in the front office,” he said. “However, there should be centralised oversight and strategic management of data quality with C level ownership, which can be achieved via a chief data officer (CDO) or committee approach.”

Quality needs to be measured and objectives need to be set for continuous improvement in data quality, according to Brennan. Staff must also have access to a common data dictionary, so they are all using the same data language. “Technology should be used to empower people but it is not a solution on its own,” he warned.
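
A minimal sketch of what measuring data quality against a common data dictionary might look like follows; the dictionary fields, staleness window and metric definitions are illustrative assumptions, not MoneyMate’s methodology.

```python
from datetime import date

# Hypothetical common data dictionary: the fields every team refers to
# by the same name, with the type each is expected to hold.
DATA_DICTIONARY = {
    "isin": str,
    "issuer": str,
    "asset_class": str,
    "last_updated": date,
}


def completeness(records: list[dict]) -> float:
    """Share of records in which every dictionary field is populated."""
    if not records:
        return 0.0
    complete = sum(
        all(rec.get(field) not in (None, "") for field in DATA_DICTIONARY)
        for rec in records
    )
    return complete / len(records)


def timeliness(records: list[dict], as_of: date, max_age_days: int = 5) -> float:
    """Share of records refreshed within an agreed staleness window."""
    if not records:
        return 0.0
    fresh = sum(
        rec.get("last_updated") is not None
        and (as_of - rec["last_updated"]).days <= max_age_days
        for rec in records
    )
    return fresh / len(records)
```

Scores like these, published regularly, give the continuous improvement objectives Brennan describes something concrete to be measured against.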

“Data for asset managers is the equivalent of raw materials for manufacturing and it should be treated in a similar manner,” added Brennan. “Had the same level of importance been attached to financial services data, then there would be less need for regulatory steps to be taken.”
