The knowledge platform for the financial technology industry

A-Team Insight Blogs

Regulations Pushing Basic Reference Data Management Issues up the Agenda for Asset Managers, Says MoneyMate’s Brennan


The increased media attention given to regulation, and the punitive action taken by regulators in response to data errors, have flagged the issue of data management for the buy side at large, from hedge fund managers to traditional institutional investment firms, according to MoneyMate’s chief technology officer Ronan Brennan. During a recent webinar, Brennan noted that pressure to better understand market and counterparty risk, and a regulatory focus on assessing the consistency, timeliness and accuracy of basic reference data, have led many firms to invest in improving their data quality.

Last month, the data quality management solution vendor spoke to Reference Data Review about its recent survey of the investment management community, which indicated that many of these firms now consider outsourcing a viable option for meeting their data quality needs. However, outsourcing is only one of a host of options firms are considering to meet the often onerous new regulatory requirements and the post-crisis focus on market and counterparty risk. This risk focus has necessarily meant that regulators are looking for additional reporting from the buy side, with an initial focus on counterparty data, Brennan contended.

The US regulatory environment, for example, is changing dramatically within the sphere of alternatives, with hedge funds now being required to register and report more data to the Securities and Exchange Commission (SEC). “These firms need to keep their data in order and collect data from multiple siloed systems into one place for SEC inspection,” he said during the webinar.

Moreover, the Dodd-Frank bill will mean regulators will have much more power than before over the markets as a whole, especially the SEC with regards to market risk, and this will result in firms being required to provide more reports and data disclosure.

Europe is also moving towards the North American rules-based model of regulation, and Brennan points to French-led demands for heavy-touch regulation as a case in point. He reckons one of the biggest data management challenges currently facing the buy side in Europe is UCITS IV, with its move from the simplified prospectus to the new key investor information document.

Regulators are also asking for a much broader range of data sets related to areas such as liquidity risk and counterparty exposures, which necessarily raises data quality and collection challenges. Brennan believes this means security and reference master data will receive much more attention, as firms seek to ensure the data underlying these calculations is correct. He notes the recent regulatory investigations into classifications under product and securities master data as a precursor of things to come, along with much more scrutiny of client facing data.

In terms of what regulators don’t want to see “under the bonnet”, Brennan highlights manual processes, the use of Excel, temporary staff in charge of data management, undocumented or ad hoc processes and an absence of governance. “Regulators will be looking for a much more strategic approach to the management of data and will be asking how often this approach has been reassessed and how well it is governed,” he said. “This regulatory focus is in turn making firms scrutinise their operational models and then redesign them to ensure data quality.”

The impact of getting this process wrong is severe, as financial penalties and the associated loss of business due to reputational damage are an ever present threat, according to Brennan. Getting it right, however, means an increase in the accuracy of data and confidence in a firm’s client service capabilities, as well as cost savings that allow a reduction in headcount or the reallocation of resources. On a basic level, it also means less reprinting of key documents and gives the buy side more flexibility in its vendor relationships by providing the comparability of a single source.

Brennan recommends that these firms first establish clear ownership of, and accountability for, data quality, right back to the data source. “Data management is a function of the middle office and should not reside in the front office,” he said. “However, there should be centralised oversight and strategic management of data quality with C level ownership, which can be achieved via a chief data officer (CDO) or committee approach.”

Quality needs to be measured and objectives need to be set for continuous improvement in data quality, according to Brennan. Staff must also have access to a common data dictionary, so they are all using the same data language. “Technology should be used to empower people but it is not a solution on its own,” he warned.
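To make the measurement point concrete, here is a minimal illustrative sketch of how quality might be scored against a shared data dictionary. The field names, validation rules and scoring approach are hypothetical examples for this article, not MoneyMate’s methodology:

```python
# Illustrative only: a toy "data dictionary" mapping each agreed field name
# to a validation rule, so every team measures quality the same way.
from datetime import date

DATA_DICTIONARY = {
    "isin": lambda v: isinstance(v, str) and len(v) == 12,
    "currency": lambda v: v in {"EUR", "USD", "GBP"},
    "maturity_date": lambda v: isinstance(v, date),
}

def quality_score(records):
    """Return per-field completeness and validity rates (0.0 to 1.0)."""
    scores = {}
    for field, rule in DATA_DICTIONARY.items():
        # Completeness: how many records have the field populated at all.
        present = [r[field] for r in records if r.get(field) is not None]
        # Validity: how many populated values pass the dictionary's rule.
        valid = [v for v in present if rule(v)]
        scores[field] = {
            "completeness": len(present) / len(records),
            "validity": len(valid) / len(records),
        }
    return scores

records = [
    {"isin": "IE00B4BNMY34", "currency": "EUR", "maturity_date": date(2030, 1, 1)},
    {"isin": "BAD", "currency": "JPY", "maturity_date": None},  # fails checks
]
print(quality_score(records))
```

Tracking scores like these over time is one way to set and monitor the continuous-improvement objectives Brennan describes, rather than relying on ad hoc spot checks.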

“Data for asset managers is the equivalent of raw materials for manufacturing and it should be treated in a similar manner,” added Brennan. “Had the same level of importance been attached to financial services data, then there would be less need for regulatory steps to be taken.”

