
A-Team Insight Blogs

Regulations Pushing Basic Reference Data Management Issues up the Agenda for Asset Managers, Says MoneyMate’s Brennan


The increased media attention being given to regulation and the punitive action taken by regulators as a result of data errors have flagged the issue of data management for the buy side at large, from hedge fund managers to traditional institutional investment firms, according to MoneyMate’s chief technology officer Ronan Brennan. During a recent webinar, Brennan noted that pressure to better understand market and counterparty risk, together with a regulatory focus on assessing the consistency, timeliness and accuracy of basic reference data, has led many firms to invest in improving their data quality.

Last month, the data quality management solution vendor spoke to Reference Data Review about its recent survey of the investment management community, which indicated that many of these firms now consider outsourcing a viable option for meeting their data quality needs. This is, however, only one of a host of options firms are weighing as they face onerous new regulatory requirements and the focus on market and counterparty risk in the post-crisis environment. This risk focus has necessarily meant that regulators are looking for additional reporting from the buy side, with the initial focus on counterparty data, Brennan contended.

The US regulatory environment, for example, is changing dramatically within the sphere of alternatives, with hedge funds now being required to register and report more data to the Securities and Exchange Commission (SEC). “These firms need to keep their data in order and collect data from multiple siloed systems into one place for SEC inspection,” he said during the webinar.
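The consolidation Brennan describes, pulling records out of multiple siloed systems into one place for inspection, can be sketched in a few lines. This is a minimal illustration, not MoneyMate’s product or any firm’s actual process: the silo names, identifiers and fields below are hypothetical, and the point is simply that merging silos surfaces the conflicts between them that would otherwise reach a regulator unresolved.

```python
# Hypothetical sketch: merge per-silo records into one consolidated view,
# flagging any fields on which the silos disagree. All names are invented.
from collections import defaultdict

def consolidate(silos):
    """silos maps silo name -> {instrument_id: {field: value}}.
    Returns (merged view, list of conflicting field values)."""
    merged = defaultdict(dict)
    conflicts = []
    for silo_name, records in silos.items():
        for instrument_id, fields in records.items():
            for field, value in fields.items():
                existing = merged[instrument_id].get(field)
                if existing is not None and existing != value:
                    # Silos disagree: keep the first value seen, record the clash.
                    conflicts.append((instrument_id, field, existing, value))
                else:
                    merged[instrument_id][field] = value
    return dict(merged), conflicts

# Example: two silos hold the same instrument with differing currency codes.
silos = {
    "order_management": {"XS1234": {"issuer": "ACME", "currency": "USD"}},
    "risk_system":      {"XS1234": {"issuer": "ACME", "currency": "EUR"}},
}
view, conflicts = consolidate(silos)
```

In practice the conflict list, rather than the merged view, is where the data quality work begins: each entry is a discrepancy to trace back to its source system.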

Moreover, the Dodd-Frank Act will give regulators much more power than before over the markets as a whole, especially the SEC with regard to market risk, and this will result in firms being required to provide more reports and greater data disclosure.

Europe is also moving towards the North American rules-based model of regulation, and Brennan points to the French-led demands for heavy-touch regulation as a case in point. He reckons one of the biggest data management challenges currently facing the buy side in Europe comes from UCITS IV, with the move from the simplified prospectus to the new key investor information document (KIID).

Regulators are also asking for a much broader range of data sets related to areas such as liquidity risk and counterparty exposures, which necessarily brings data quality and collection challenges. Brennan believes this will mean that security and reference master data receive much more attention, as firms seek to ensure the data underlying these calculations is correct. He notes recent regulatory investigations into classifications under product and securities master data as a precursor of things to come, along with much greater scrutiny of client-facing data.

In terms of what regulators don’t want to see “under the bonnet”, Brennan highlights manual processes, the use of Excel, temporary staff in charge of data management, undocumented or ad hoc processes and an absence of governance. “Regulators will be looking for a much more strategic approach to the management of data and will be asking how often this approach has been reassessed and how well it is governed,” he said. “This regulatory focus is in turn making firms scrutinise their operational models and then redesign them to ensure data quality.”

The impact of getting this process wrong is severe, as financial penalties and the associated loss of business due to reputational damage are an ever-present threat, according to Brennan. Getting it right, however, means greater accuracy of data and confidence in a firm’s client service capabilities, as well as cost savings that allow a reduction in headcount or the reallocation of resources. On a basic level, it also means less reprinting of key documents and gives the buy side more flexibility in its vendor relationships by providing the comparability of a single source.

Brennan recommends that these firms first establish clear ownership of and accountability for data quality right back to the data source. “Data management is a function of the middle office and should not reside in the front office,” he said. “However, there should be centralised oversight and strategic management of data quality with C-level ownership, which can be achieved via a chief data officer (CDO) or a committee approach.”

Quality needs to be measured and objectives need to be set for continuous improvement in data quality, according to Brennan. Staff must also have access to a common data dictionary, so they are all using the same data language. “Technology should be used to empower people but it is not a solution on its own,” he warned.
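Measuring quality against a common data dictionary, as Brennan suggests, can be as simple as scoring each record for completeness against the fields the dictionary defines. The sketch below is a hypothetical illustration under assumed field names (isin, price, currency), not a description of any vendor’s tooling; a real dictionary would also carry definitions, formats and validation rules per field.

```python
# Hypothetical sketch: score records for completeness against a shared
# data dictionary, so data quality can be measured and tracked over time.
DATA_DICTIONARY = {
    "isin":     str,    # instrument identifier
    "price":    float,  # latest valuation price
    "currency": str,    # ISO 4217 currency code
}

def completeness_score(record):
    """Fraction of dictionary fields present in the record with the expected type."""
    ok = sum(
        1 for field, expected in DATA_DICTIONARY.items()
        if isinstance(record.get(field), expected)
    )
    return ok / len(DATA_DICTIONARY)

records = [
    {"isin": "XS1234", "price": 101.5, "currency": "USD"},  # complete
    {"isin": "XS5678", "price": None},  # null price, missing currency
]
scores = [completeness_score(r) for r in records]
```

Tracking such scores over time, per feed or per system, is one way to set the measurable, continuously improving objectives Brennan describes.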

“Data for asset managers is the equivalent of raw materials for manufacturing and it should be treated in a similar manner,” added Brennan. “Had the same level of importance been attached to financial services data, then there would be less need for regulatory steps to be taken.”

