
A-Team Insight Blogs

Current Solutions Lacking, Despite Foundational Nature of Reference Data


Reference data management (RDM) is a foundational element of financial enterprises, yet the collection of solutions used to manage reference data in most firms is not satisfactory, according to a report published this week.

The report – Reference Data Management: Unlocking Operational Efficiencies, published by Tabb Group in conjunction with data integration specialist Informatica – describes current sentiment around RDM. It tracks the discipline's development through four generations of solutions, details the obstacles to RDM success and sets out how firms at different levels of RDM adoption can move towards the holy grail of centralised RDM coupled with consistent reference data processing.

Setting the scene for the report, author Paul Rowady, a senior analyst at Tabb, notes that the notion of explosive interest in reference data and RDM caused by the credit crisis and onslaught of regulatory mandates is a myth. Instead, he describes a general upswing in interest since the introduction of first-generation RDM solutions – often systems in which reference data resided naturally, such as trading systems – followed by a significant rise in interest in the wake of 9/11 and the introduction of data transparency initiatives such as Anti-Money Laundering (AML), Know Your Client (KYC) and the US Patriot Act.

These initiatives posed the problem of searching for new patterns among more complex combinations of data sets, and the answer was sought in second-generation, dedicated RDM solutions from third-party vendors. These emerged in the market between 2002 and 2006 with highly structured data models and the promise of golden copies of data sets, but were usurped in 2009 by third-generation RDM products that were more open and offered the flexibility to customise data models.

The report suggests this development continuum – from proprietary RDM solutions, to functional packaged data models, to flexible data model platforms – is not sufficient to meet today's market needs, and argues for a transition to a fourth generation of RDM solutions that takes prior generations' best attributes, fuses them in a hybrid solution and wraps them in a more rigorous data governance process.

If this is the ultimate aim, second-generation solutions still represent, for the time being at least, the majority of installations at the largest sell-side firms – and they are far from perfect.

Despite huge investments in RDM over the past decade, research carried out among 20 firms – 25% in Europe, 75% in the US, 50% on the buy side and 50% on the sell side – in April 2012 found 86% of respondents dissatisfied with their RDM capabilities. Of these, 48% are being driven to improvement for reasons related to resource optimisation and outcomes, while 35% are responding to specific catalysts such as compliance.

This highlights RDM as a strategic rather than tactical imperative. Just 14% of respondents are satisfied with their RDM and the majority of these are deploying proprietary solutions. As the report points out: “Demands placed upon most RDM solutions have grown more quickly than the solutions’ ability to keep up.”

Reflecting this point, the survey shows 79% of respondents are unlikely to switch security master systems, the primary domain of second-generation RDM solutions, while a similar number, 71%, plan to continue investing in RDM improvements.

Rowady comments: “From this survey, it is clear that current generation RDM solutions are not keeping up with the challenges and opportunities around reference data as a strategic tool. The time has come for a sea change and those firms that are first to embrace emerging fourth generation RDM solutions will be the first to grab their share of the treasure chest.”

Building on existing attributes of RDM systems, the report says fourth-generation solutions will have increased and combined functionality and flexibility, and will have a dramatic impact on both the fragmented vendor landscape and Tier 1 firms’ overall data fluency. They will bring together often-separate security and entity reference data systems into a cross-reference data hub, a need driven in part by the forthcoming introduction of a global legal entity identifier system. They will also provide improved data lifecycle governance by being viewed, both conceptually and operationally, as part of a broader enterprise data management (EDM) platform that can provide a holistic view of all data management.

If this development of RDM solutions is easy to appreciate, adoption will be more difficult. Rowady says: “Through our analysis, we’ve learned that primary buyers are hesitant to switch horses in the middle of the race, so to speak. Even so, Tabb believes the potential uptick in enterprise-scale performance is compelling. We know that many of the largest capital markets firms have recently invested heavily—in many cases, to the tune of tens of millions of dollars—in their reference data capabilities. Battles for sponsors and budgets are undoubtedly fresh in the minds of data managers at those shops. But if fourth-generation solution vendors can properly frame the functionality/flexibility boost, and simultaneously compress the pain of switching and onboarding, they might have a shot at convincing these Tier 1 players to do what they need to do: in many cases, to go back to the drawing board.”

While this could be an optimal approach to change, the report recognises that most firms must overcome existing barriers to improve their RDM capabilities. It suggests legacy fragmentation of data, solutions and processes can be addressed using virtual centralisation of reference data management and infrastructure-on-demand or cloud solutions, while downstream applications that are dependent on consistent RDM must be streamlined and mapped to centralised reference data management.

Recommending how to navigate the road ahead, the study suggests firms committed to bolstering existing suites of RDM solutions should focus on wrapping current solutions with technology that enables a consistent enterprise data governance process, while those yet to make a significant commitment to an RDM solution should seek solutions that manage multiple reference data domains in a consistent and integrated enterprise framework.

The report concludes: “There can be no glory without doing the hard work first. Data fluency, a critical precursor to data consumability, simply means that data flows more easily, which in turn means that end users must be able to find it. And, finding data requires meticulous attention to standards, labels and other metadata, however imperfect they may be now or in the future. That way, no matter how big or complex the data gets, end users will have a much better shot at harvesting value from it.”

