Tackling a New Era of Financial Data Management

By Martijn Groot, Vice President of Product Management, Asset Control

Business users across financial services are more data-hungry than ever before. They want to interact with data directly, quickly collecting, manipulating and analysing it in order to streamline operations.

There are two drivers for this. First, users increasingly expect instant access to data. Second, jobs in risk, finance, control and operational roles are becoming more data-intensive. Workers often need to aggregate content to draft a regulatory report, for example, or to sign off on a portfolio valuation.

To meet this growing user demand, organisations need to move to a new style of data management that handles large data volumes and supports ease of access. Unfortunately, the data provision processes and technology infrastructure within financial institutions today lag behind this goal.

In addressing the challenge, the first step is to ‘acquire’ the necessary data sources and, if required, different perspectives on the data. Organisations may therefore need to assimilate and assess different prices and the differing opinions of brokers and market makers.

The next step is data mastering, which allows businesses to cross-compare, cross-reference and tie all the different data collection threads together. This helps them enrich their datasets or, for example, calculate average prices. The third element is making the data accessible to users, a process that includes ensuring it can be easily embedded into workflows.
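By way of illustration, the short Python sketch below shows a simplified mastering step: quotes from two hypothetical vendors are cross-referenced on a common identifier (ISIN here) and a composite average price is derived per instrument. The vendor names, identifiers and prices are invented for the example, not taken from any real feed.

```python
import pandas as pd

# Hypothetical quotes from two vendors (names and prices are illustrative).
vendor_a = pd.DataFrame({
    "isin":  ["XS0001", "XS0002", "XS0003"],
    "price": [101.25, 99.80, 100.10],
})
vendor_b = pd.DataFrame({
    "isin":  ["XS0001", "XS0002"],
    "price": [101.40, 99.70],
})

# Pool the sources, tagging each quote with where it came from; instruments
# covered by only one vendor are kept as well.
quotes = pd.concat(
    [vendor_a.assign(source="vendor_a"), vendor_b.assign(source="vendor_b")]
)

# Master record: average price per instrument plus the number of
# contributing sources, a simple measure of corroboration.
master = quotes.groupby("isin")["price"].agg(["mean", "count"]).rename(
    columns={"mean": "avg_price", "count": "n_sources"}
)
print(master)
```

A real mastering process would also resolve conflicting identifiers, apply business rules and record lineage, but the cross-reference-then-derive pattern is the same.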

In the past, businesses have tended to concentrate on sourcing as much data as possible and placing it in large data warehouses. Unfortunately, they have focused less on how to operationalise the data and make it accessible.

To address these issues, businesses need to look closely at the needs of the users they serve. The first group, operational users, needs an overview of the whole data collection process. This should include insight into where data comes from, how much has been collected, and what gaps there are. Monitoring this gives the organisation an early warning if something goes wrong.
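A minimal sketch of the kind of completeness monitoring operational users rely on might look like the Python below; the feed names, expected universes and received sets are hypothetical placeholders for the platform's own inventory.

```python
from datetime import date

# Hypothetical feed inventory: what each source is expected to deliver
# today versus what has actually arrived.
expected = {
    "vendor_a": {"XS0001", "XS0002", "XS0003"},
    "vendor_b": {"XS0001", "XS0002"},
}
received = {
    "vendor_a": {"XS0001", "XS0003"},
    "vendor_b": {"XS0001", "XS0002"},
}

# Completeness report per source: coverage ratio and the identifiers still
# missing, giving operations an early warning of gaps.
for source, wanted in expected.items():
    got = received.get(source, set())
    missing = wanted - got
    coverage = len(got & wanted) / len(wanted)
    status = "OK" if not missing else "GAP"
    print(f"{date.today()} {source}: {coverage:.0%} complete "
          f"[{status}] missing={sorted(missing)}")
```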

The second category consists of users who need to interact with the data. They might want to back-test a model or price a new complex security, and they need to be able to easily interrogate the data. The third group, data scientists, expect easy integration via languages like Python or R, or just enterprise search capabilities that enable them to quickly assess available datasets.
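As a hedged illustration of the programmatic access data scientists expect, the Python sketch below pulls a price history straight into a pandas DataFrame from a REST endpoint; the URL, query parameters and response shape are assumptions for the example, not any specific vendor's API.

```python
import requests
import pandas as pd

# Hypothetical REST endpoint exposed by the data management platform.
BASE_URL = "https://data-platform.example.com/api/v1"

resp = requests.get(
    f"{BASE_URL}/prices",
    params={"isin": "XS0001", "from": "2018-01-01", "to": "2018-06-30"},
    timeout=30,
)
resp.raise_for_status()

# Load the returned price history into a DataFrame so it can be used for
# back-testing or pricing work without manual extracts.
history = pd.DataFrame(resp.json())
print(history.head())
```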

To address the needs of these groups, businesses need to deliver:

  • Visibility of the approved data production process, to ease the operational burden and satisfy regulatory requirements
  • Easier programmatic integration for data scientists, so they can access data quickly and cheaply
  • A Google-style enterprise search across the available datasets (see the sketch after this list).
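The search element could be as simple in spirit as the keyword match sketched below, in Python over a hypothetical dataset catalogue; a production enterprise search would of course index far richer metadata and rank results.

```python
# Hypothetical dataset catalogue; in practice this metadata would come from
# the data management platform rather than being hard-coded.
catalogue = [
    {"name": "eod_prices",    "description": "End-of-day equity and bond prices"},
    {"name": "fx_rates",      "description": "Spot and forward FX rates"},
    {"name": "curve_history", "description": "Interest rate curve history for risk"},
]

def search(query: str):
    """Very simple keyword match over dataset names and descriptions."""
    terms = query.lower().split()
    return [
        d for d in catalogue
        if all(t in (d["name"] + " " + d["description"]).lower() for t in terms)
    ]

print(search("rate history"))  # matches the curve_history dataset
```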

Providing this level of business user enablement depends on having the right technological infrastructure in place. Many firms still carry complex legacy applications and, since the financial crisis, have also faced significant cost pressures and a need to get more from their existing infrastructure. There will therefore be a need to rationalise the landscape, but also to bring in new technologies that better deal with the data intensity of current risk and valuation processes.

The current industry focus on the prudent evaluation of risk and the emergence of regulations such as the Fundamental Review of the Trading Book (FRTB) are putting even greater pressure on financial services organisations. In line with the changes FRTB brings, market data and risk systems need to support more complex market data workflows, drawing on new sources to meet real-price requirements and regulator-prescribed classifications. To manage all this, organisations need a way to smoothly source and integrate market data, track risk factor histories and proactively manage data quality, all through one integrated and scalable platform.
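To make the real-price point concrete, the Python sketch below applies a simplified, FRTB-style eligibility check to a risk factor's observation history. The 24-observation and roughly one-month-gap thresholds are illustrative only; the actual regulatory criteria are more detailed and have since been revised.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of):
    """Simplified FRTB-style risk factor eligibility check.

    Illustrative criterion only: at least 24 real-price observations in the
    12 months before `as_of`, with no gap between consecutive observations
    longer than roughly one month.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < 24:
        return False
    gaps = [(later - earlier).days for earlier, later in zip(obs, obs[1:])]
    return max(gaps, default=0) <= 31

# Hypothetical observation history for one risk factor: one trade every
# two weeks throughout the year.
history = [date(2018, 1, 2) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(history, date(2018, 12, 31)))  # True for this dense series
```

Running this kind of check across thousands of risk factors is exactly where tracked histories and proactive data quality management become essential.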

Cloud and database technology

Often, organisations across this sector will need new capabilities to cater for the sheer volume of data they need to process. That typically means technologies that support new deployment models in the cloud, but also deliver ease of integration for data scientists and effective enterprise search for more general users.

From a database perspective, we see a trend for businesses to adopt new technologies such as NoSQL. Traditional database technologies are struggling to cope with the growing volumes of data these organisations collect, via mobile banking apps and for regulatory filings, for example. NoSQL databases are also typically cheaper to run than traditional systems: they scale more easily and give more flexible control over infrastructure costs.
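As a minimal sketch of the document-store style many NoSQL engines use, the Python snippet below stores and queries price observations in MongoDB via pymongo; the connection string, database name and document schema are illustrative assumptions.

```python
from pymongo import MongoClient

# Illustrative connection; in practice this would point at the firm's own
# managed or cloud-hosted cluster.
client = MongoClient("mongodb://localhost:27017")
prices = client["marketdata"]["prices"]

# Observations are stored as self-describing documents, so new vendors or
# attributes can be added without a schema migration.
prices.insert_many([
    {"isin": "XS0001", "date": "2018-06-29", "source": "vendor_a", "price": 101.25},
    {"isin": "XS0001", "date": "2018-06-29", "source": "vendor_b", "price": 101.40},
])

# Query the full price history for one instrument, newest first.
for doc in prices.find({"isin": "XS0001"}).sort("date", -1):
    print(doc["date"], doc["source"], doc["price"])
```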

Finding a way forward

Today, organisations across the financial services sector are having to manage increasingly data-intensive processes in areas like operations, valuation and risk. At the same time, they are challenged by users who have different expectations of the data management systems they engage with and who increasingly look for a self-service approach.

In this new era of financial data management, they need to put in place new processes that focus on the needs of the user, and leverage technologies that are open and flexible and deliver high performance, ease of access and control.
