The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

ING Investment Management Has Banned the Term ‘Golden Copy’, Says Murphy

Proving that for some institutions the approach to data management is changing dramatically, Nick Murphy, data specialist for pricing and evaluations at ING Investment Management, told delegates to last week’s FIMA that his own firm has banned the term “golden copy” from use. The focus is on maintaining centralised and standardised data for instruments, but it is also about meeting the requirements of the internal business users, he said.

The number one priority for data management is to gain recognition that reference data is an asset to the organisation, said Murphy, reprising a theme that many other speakers elaborated upon at this year’s FIMA. It is often undervalued by the business, and quality is frequently sacrificed to lower costs, he said.

Other priorities at the moment include coping with the globalisation and standardisation of data by tackling the issue of data governance (a point also raised by Ian Webster, a fellow buy side representative, from UBS) and the continual challenge of dealing with a siloed environment. To help it deal with this environment, the firm is currently implementing a new data management solution from Cadis for its static and market data, which Murphy hopes will allow it to better judge data completeness and accuracy.

“Everywhere I have worked I have seen the difficulties caused by letting end users change or create classifications downstream,” he said. “Changing data fields and attributes at the individual business level causes a lot of complexity down the line.” By having a centralised structure to monitor data quality across the institution, Murphy anticipates that some of these problems can be picked up more easily.

Firms also need to draw up quality indicators on data to check whether their requirements are being met by vendor systems and internally built solutions, he suggested. To this end, Murphy is keen to draw up internal client service level agreements (SLAs) in order to ensure data quality is being maintained.

Murphy noted that vendors have been slow to properly service the needs of the buy side; however, many firms are also unsure about the requirements of their own internal end users. “The industry has to be brave but cautious about what we opt to do with vendors in a managed services environment, for example,” he said. “You don’t want to give away control over your data quality.”

So, vendors may have their faults, but they can be useful partners in bringing technology expertise to the table, according to Murphy. “Don’t bash your vendors too much,” he joked. “They are people too.”
