
A-Team Insight Blogs

What’s new about the importance of reference data? Not so much as is touted. By Maryann Houglet, Director of Strategic Consulting, A-Team Group


A recent industry periodical spoke of data as “ubiquitous… touching all corners of an enterprise and becoming the lifeblood of an organization.” Having spent my 25-year career at various securities data vendors, I can say there’s nothing new here.

Data has always driven investment decisions, transaction processing, back office operations, securities administration, accounting, compliance, reporting and so on. Securities identification has always been an essential factor, impacted by corporate affiliations and compounded by new instruments and cross-border trading. Difficult-to-find information on fixed income instruments and corporate actions has entailed ongoing risk. And, yes, the people dedicated to getting data right have long been around.

Many of us recognize how evolution in financial services coupled with technology has taught us a lot about the importance and complexity of maintaining accurate, timely securities data.
High interest rates in the 1980s increased the need to access bond call and sinking fund schedules; the explosion of mortgage-backed securities necessitated easy access to pool factors. Both are still key issues.
Rapid expansion in mutual fund investment necessitated more timely pricing and ongoing, growing data demands.
The long-term drive to perfect security master files across the industry has given rise to ever-growing data and software requirements. STP was a great concept, but a bit of the “cart without the data horse”, underestimating the role of automated data links.
With increasing focus on risk, attention returns to the issuer-issue link.
Lest we forget: the never-ending drive to determine a “correct” bond price or that of any complex instrument.
Now electronic trading – ECNs and Alternative Trading Systems – is upping the ante on timeliness.

The same problem areas in data remain on management’s top 10 lists¹. The “reference data” umbrella ties these issues together, but technology and infrastructure are still catching up.

So, why is the industry still struggling to cope with these data issues? Because they’re very hard to solve!
Complex instruments, global investing, legacy systems, and multiple locations make it even harder to be right, timely and consistent.
Data production is complicated; back offices are still pulling data off various terminals manually and faxing it across locations. Lack of automation introduces error and inconsistency.
In spite of the industry’s drive to reduce redundant data costs and become operationally efficient, there is still significant redundancy across firms, which does little to help a firm add value in its services.
Regulatory compliance and risk management are big and expanding, increasing the focus on data.
The confluence of all of the above makes it hard to determine data improvement priorities.

What is new?
A significant factor – beyond automation and technology – is data awareness at all levels across the firm and the industry. Yes, senior management is giving data managers deserved recognition, but just as important, senior managers are encouraging and supporting out-of-the-box thinking and the search for solutions beyond the firm.

Data centralization across the enterprise could be just a beginning. With new technology options and a focus on integration and efficiency, manual intervention could diminish. A shift from intra-firm to inter-firm data sharing – especially for the consolidation, validation and cleansing of commodity-like data – could decrease industry redundancy in an operationally efficient way and be a big next step towards a “paradigm shift” in data management.
Time will tell where the shift takes us. In the meantime, it’s good to see the industry empowering the people dedicated to getting data right.
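To make the consolidation, validation and cleansing idea above a little more concrete, here is a minimal sketch in Python. The vendor names, fields and values are entirely hypothetical and illustrative – this is not any real feed or firm’s process – but it shows the basic pattern: records for the same instrument from several sources are compared field by field, consensus values form a “golden copy”, and disagreements are flagged for a data steward to resolve.

```python
# Minimal sketch of cross-source consolidation and validation of a
# reference data record. Vendor names and data are hypothetical.
from collections import Counter

# Hypothetical snapshots of the same bond from three sources, keyed by source name.
vendor_records = {
    "VendorA": {"isin": "US0000000001", "coupon": 5.25, "maturity": "2030-06-15", "issuer": "Acme Corp"},
    "VendorB": {"isin": "US0000000001", "coupon": 5.25, "maturity": "2030-06-15", "issuer": "ACME Corporation"},
    "VendorC": {"isin": "US0000000001", "coupon": 5.25, "maturity": "2030-06-01", "issuer": "Acme Corp"},
}

def consolidate(records):
    """Build a 'golden copy' by majority vote per field and flag any disagreement."""
    golden, exceptions = {}, {}
    fields = {f for rec in records.values() for f in rec}
    for field in sorted(fields):
        # Count each distinct value reported for this field across sources.
        values = Counter(str(rec[field]) for rec in records.values() if field in rec)
        value, _ = values.most_common(1)[0]
        golden[field] = value
        if len(values) > 1:
            # Sources disagree: route the field to manual review rather than guessing.
            exceptions[field] = dict(values)
    return golden, exceptions

golden_copy, review_queue = consolidate(vendor_records)
print("Golden copy:", golden_copy)
print("Needs review:", review_queue)  # e.g. maturity and issuer name differ across sources
```

The same simple pattern – consolidate, compare, flag exceptions – is what scales from an intra-firm security master towards the shared, inter-firm treatment of commodity-like reference data described above.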
