
Tackling a New Era of Financial Data Management


By Martijn Groot, Vice President of Product Management, Asset Control

Business users across financial services are more data-hungry than ever before. They want to interact with data directly, collecting, manipulating and analysing it quickly to streamline operations.

There are two drivers for this. First, users increasingly expect instant access to data. Second, the jobs of workers in risk, finance, control and operational roles are becoming more data-intensive. Workers often need to aggregate content to draft a regulatory report, for example, or to sign off on a portfolio valuation.

To meet this growing demand, organisations need to move to a new style of data management that handles large data volumes and supports ease of access. Unfortunately, the data provision processes and technology infrastructure within financial institutions today lag behind this goal.

In addressing the challenge, the first step is to ‘acquire’ the necessary data sources and, if required, different perspectives on the data. Organisations may therefore need to assimilate and assess different prices and the differing opinions of brokers and market makers.

The next step is data mastering, which allows businesses to cross-compare, cross-reference and tie all the different data collection threads together. This helps them enrich their datasets or, for example, calculate average prices. The third element is making the data accessible to users, a process that includes ensuring it is easily embedded into workflows.
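As a simple illustration of the mastering step, the sketch below cross-references hypothetical quotes for one instrument from several sources and derives a consensus price, dropping quotes that stray too far from the median. The source names, fields and the one-percent tolerance are illustrative assumptions, not a description of any particular platform.

```python
# Minimal sketch of a mastering step: cross-reference quotes by a common
# identifier and derive a consensus price. Sources and prices are invented.
from statistics import mean, median

# Hypothetical quotes for one ISIN, collected from several sources
quotes = [
    {"isin": "XS0104780536", "source": "broker_a", "price": 101.42},
    {"isin": "XS0104780536", "source": "broker_b", "price": 101.38},
    {"isin": "XS0104780536", "source": "market_maker_c", "price": 103.95},
]

def consensus_price(quotes, max_spread_pct=1.0):
    """Average the quotes that sit within max_spread_pct of the median."""
    mid = median(q["price"] for q in quotes)
    kept = [q["price"] for q in quotes
            if abs(q["price"] - mid) / mid * 100 <= max_spread_pct]
    return mean(kept)

print(consensus_price(quotes))  # 101.40: the outlying quote is dropped
```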

In the past, businesses have tended to concentrate on sourcing as much data as possible and placing it in large data warehouses. Unfortunately, they have focused less on how to operationalise the data and make it accessible.

To address these issues, businesses need to look closely at the needs of the users they serve. The first group, operational users, need an overview of the whole data collection process. This should include insight into where data comes from, how much has been collected, and what gaps there are. Monitoring this gives the organisation an early warning if something goes wrong.
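A minimal sketch of what such operational monitoring might look like, assuming made-up feed names and instrument lists: it compares what each feed was expected to deliver against what actually arrived, and reports coverage and gaps.

```python
# Illustrative completeness check for an operational overview: compare what
# each feed should deliver with what arrived. Feeds and identifiers are
# invented for the example.
expected = {
    "vendor_feed_eod": {"AAPL", "MSFT", "SAP.DE", "7203.T"},
    "broker_quotes":   {"XS0104780536", "US912828U816"},
}
received = {
    "vendor_feed_eod": {"AAPL", "MSFT", "SAP.DE"},
    "broker_quotes":   {"XS0104780536", "US912828U816"},
}

for feed, want in expected.items():
    got = received.get(feed, set())
    missing = sorted(want - got)
    coverage = 100 * len(want & got) / len(want)
    status = "OK" if not missing else f"GAPS: {', '.join(missing)}"
    print(f"{feed}: {coverage:.0f}% complete - {status}")
```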

The second category consists of users who need to interact with the data. They might want to back-test a model or price a new complex security, and they need to be able to easily interrogate the data. The third group, data scientists, expect easy integration via languages like Python or R, or just enterprise search capabilities that enable them to quickly assess available datasets.
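By way of example, the snippet below shows the kind of interrogation these users expect to run in Python: a price history pulled into a pandas DataFrame, with the daily returns and volatility a back-test would consume. The inline series stands in for whatever API or database extract a platform actually exposes.

```python
# Sketch of programmatic data access for a back-test: the hard-coded price
# series is a stand-in for a real extract from the data platform.
import pandas as pd

history = pd.DataFrame(
    {"close": [100.0, 101.5, 100.8, 102.2, 103.0]},
    index=pd.date_range("2024-01-01", periods=5, freq="B"),
)

history["return"] = history["close"].pct_change()
print(history)
print("annualised vol: %.1f%%" % (history["return"].std() * 252 ** 0.5 * 100))
```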

To address the needs of these groups, businesses need to deliver:

  • Visibility of the approved data production process to ease the operational burden and satisfy regulatory requirements
  • Easier programmatic integration for data scientists to enable them to access data easily and cheaply
  • A Google-style enterprise search across the data set, as sketched below
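The search deliverable can be illustrated with a toy example: rank datasets in a catalogue by how many query terms appear in their descriptions. A production deployment would use a proper search engine; the catalogue entries here are invented.

```python
# Toy 'Google-style' search over a data catalogue: score each dataset by the
# number of query terms found in its description.
catalogue = {
    "eod_equity_prices": "end of day equity close prices, global coverage",
    "fx_spot_rates": "intraday fx spot rates for major currency pairs",
    "credit_curves": "issuer credit spread curves for bond valuation",
}

def search(query):
    terms = query.lower().split()
    scored = [(sum(t in desc for t in terms), name)
              for name, desc in catalogue.items()]
    return [name for score, name in sorted(scored, reverse=True) if score]

print(search("equity prices"))  # -> ['eod_equity_prices']
```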

Providing this level of business user enablement depends on having the right technology infrastructure in place. Many firms still carry complex legacy applications and, since the financial crisis, face significant cost pressures, so they need to get more from their existing infrastructure. There is therefore a need to rationalise the landscape, but also to bring in new technologies that better handle the data intensity of current risk and valuation processes.

The current industry focus on prudent valuation and the emergence of regulations such as the Fundamental Review of the Trading Book (FRTB) are putting even greater pressure on financial services organisations. In line with the changes FRTB brings, market data and risk systems need to support complex market data workflows that draw on new sources to meet the real-price requirements and regulatory-prescribed classifications. To manage all this, organisations need a way to smoothly source and integrate market data, track risk factor histories, and proactively manage data quality, all through one integrated and scalable platform.
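To make the real-price requirement concrete, the sketch below applies one simplified reading of the FRTB modellability test: the original Basel text asks for at least 24 real-price observations in a year, with no more than one month between consecutive observations. The observation dates here are illustrative only.

```python
# Simplified FRTB-style modellability test on a risk factor history:
# count observations and check the largest gap between them.
import pandas as pd

# Biweekly observations -> 26 in the year, with a maximum gap of 14 days
observations = pd.date_range("2024-01-05", periods=26, freq="14D")

def modellable(dates, min_obs=24, max_gap_days=31):
    gaps = pd.Series(dates).diff().dt.days.dropna()
    return len(dates) >= min_obs and (gaps <= max_gap_days).all()

print(modellable(observations))  # True for this series
```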

Cloud and database technology

Often, organisations across this sector will need new capabilities to cater for the sheer volume of data they have to process. That typically means technologies that support new deployment models in the cloud, but also deliver ease of integration for data scientists and effective enterprise search for more general users.

From the database perspective, we see a trend for businesses to adopt new technologies such as NoSQL. Traditional database technologies are struggling to cope with the growing volumes of data these organisations collect via mobile banking apps and for regulatory filings, for example. NoSQL databases are also typically cheaper to run: they scale more easily and give more flexible control over infrastructure costs.
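One reason document-oriented NoSQL stores fit this use case is that instruments with very different terms can live in one collection without schema migrations. The sketch below assumes a MongoDB server on localhost and uses pymongo as one of several possible clients; nothing in it is specific to any vendor platform.

```python
# Sketch of schemaless storage for heterogeneous market data.
# Assumes a MongoDB instance is running on localhost:27017.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
instruments = client["market_data"]["instruments"]

# Two instrument types with different attributes share the same collection
instruments.insert_one({"isin": "XS0104780536", "type": "bond",
                        "coupon": 5.75, "maturity": "2034-01-15"})
instruments.insert_one({"ric": "AAPL.O", "type": "equity",
                        "exchange": "NASDAQ"})

for doc in instruments.find({"type": "bond"}):
    print(doc["isin"], doc["coupon"])
```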

Finding a way forward

Today, organisations across the financial services sector are having to manage increasingly data-intensive processes in areas like operations, valuation and risk. At the same time, they are challenged by users who have different expectations of the data management systems they engage with and who increasingly look for a self-service approach.

In this new era of financial data management, firms need to put in place new processes that focus on the needs of the user, and leverage technologies that are open, flexible and high-performing, with ease of access and control.
