
Tackling a New Era of Financial Data Management

By Martijn Groot, Vice President of Product Management, Asset Control

Business users across financial services are more data-hungry than ever before. They want to interact with data directly, and to collect, manipulate and analyse it quickly in order to streamline operations.

There are two drivers for this. First, users increasingly expect instant access to data. Second, jobs in risk, finance, control and operational roles are becoming more data-intensive. Workers often need to aggregate content to draft a regulatory report, for example, or to sign off on a portfolio valuation.

To meet this growing user demand, organisations need to move to a new style of data management that handles large data volumes and supports ease of access. Unfortunately, the data provision processes and technology infrastructure within financial institutions today lag behind this goal.

In addressing the challenge, the first step is to ‘acquire’ the necessary data sources and, if required, different perspectives on the data. Organisations may therefore need to assimilate and assess different prices and the differing opinions of brokers and market makers.

The next step is data mastering, which allows businesses to cross-compare, cross-reference and tie all the different data collection threads together. This helps them enrich their datasets or, for example, calculate average prices. The third element is making the data accessible to users, a process that includes ensuring it is easily embedded into workflows.
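As a concrete illustration of the mastering step, the sketch below groups contributor quotes by a common identifier and derives a composite price. It is a minimal example: the identifiers, sources and prices are invented, and a production mastering process would also handle cross-referencing between symbologies and outlier rules.

```python
from collections import defaultdict
from statistics import mean, median

# Illustrative quotes from different contributors, keyed here by ISIN
# for simplicity (real mastering maps between multiple symbologies).
quotes = [
    {"isin": "US0378331005", "source": "broker_a", "bid": 189.10, "ask": 189.30},
    {"isin": "US0378331005", "source": "broker_b", "bid": 189.05, "ask": 189.35},
    {"isin": "US0378331005", "source": "market_maker_c", "bid": 189.12, "ask": 189.28},
]

def master_quotes(quotes):
    """Group contributor quotes per instrument and derive composite prices."""
    by_instrument = defaultdict(list)
    for q in quotes:
        by_instrument[q["isin"]].append(q)

    mastered = {}
    for isin, qs in by_instrument.items():
        mids = [(q["bid"] + q["ask"]) / 2 for q in qs]
        mastered[isin] = {
            "contributors": len(qs),
            "composite_mid": mean(mids),  # simple average across sources
            "median_mid": median(mids),   # more robust to a single outlier
        }
    return mastered

print(master_quotes(quotes))
```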

In the past, businesses have tended to concentrate on sourcing as much data as possible and placing it in large data warehouses. Unfortunately, they have focused less on how to operationalise the data and make it accessible.

To address these issues, businesses need to look closely at the needs of the users they serve. The first group, operational users, needs an overview of the whole data collection process. This should include insight into where data comes from, how much has been collected, and what gaps there are. Monitoring this gives the organisation an early warning if something goes wrong.
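A minimal sketch of such an early-warning check might compare expected against received record counts per feed and flag shortfalls; the feed names, counts and alert threshold below are purely illustrative.

```python
# Hypothetical expected vs. received record counts per feed for one business day.
expected = {"vendor_prices": 50_000, "broker_quotes": 12_000, "fx_rates": 300}
received = {"vendor_prices": 49_100, "broker_quotes": 12_000, "fx_rates": 240}

def completeness_report(expected, received, alert_threshold=0.98):
    """Flag feeds whose delivery ratio falls below the threshold."""
    report = []
    for feed, want in expected.items():
        got = received.get(feed, 0)
        ratio = got / want if want else 0.0
        report.append({
            "feed": feed,
            "received": got,
            "expected": want,
            "ratio": round(ratio, 3),
            "alert": ratio < alert_threshold,  # early warning for the ops team
        })
    return report

for row in completeness_report(expected, received):
    print(row)
```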

The second category consists of users who need to interact with the data. They might want to back-test a model or price a new complex security, and they need to be able to easily interrogate the data. The third group, data scientists, expect easy integration via languages like Python or R, or just enterprise search capabilities that enable them to quickly assess available datasets.

To address the needs of these groups, businesses need to deliver:

  • Visibility of the approved data production process to ease the operational burden and satisfy regulatory requirements
  • Easier programmatic integration for data scientists, enabling them to access data easily and cheaply (see the sketch after this list)
  • A Google-style enterprise search across the dataset.
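To make the programmatic-integration point concrete, the sketch below shows how a data scientist might pull a mastered price history into Python for a back-test. The REST endpoint, URL and field names are assumptions for illustration, not any specific vendor's API.

```python
import pandas as pd
import requests

# Hypothetical REST endpoint of an internal data-mastering platform.
BASE_URL = "https://datahub.example.com/api/v1"

def load_price_history(identifier: str, start: str, end: str) -> pd.DataFrame:
    """Pull a mastered price history into a DataFrame for back-testing."""
    resp = requests.get(
        f"{BASE_URL}/prices/{identifier}",
        params={"start": start, "end": end},
        timeout=30,
    )
    resp.raise_for_status()
    df = pd.DataFrame(resp.json()["observations"])  # assumed response shape
    df["date"] = pd.to_datetime(df["date"])
    return df.set_index("date").sort_index()

# Example: fetch a year of history and compute daily returns for a back-test.
prices = load_price_history("US0378331005", "2018-01-01", "2018-12-31")
returns = prices["close"].pct_change().dropna()
print(returns.head())
```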

Providing this level of business user enablement depends on having the right technological infrastructure in place. Many firms still carry complex legacy applications and, since the financial crisis, also face significant cost pressures and need to get more from their existing infrastructure. There will therefore be a need to rationalise the landscape, but also to bring in new technologies that better deal with the data intensity of current risk and valuation processes.

The current industry focus on prudent evaluation of risk and the emergence of regulations such as the Fundamental Review of the Trading Book (FRTB) are putting even greater pressure on financial services organisations. In line with the changes FRTB brings, market data and risk systems need to support complex market data workflows that draw on new sources to meet the real-price requirements and regulatory prescribed classifications. To manage all this, organisations need a way to smoothly source and integrate market data, track risk factor histories, and proactively manage data quality, all through one integrated and scalable platform.
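By way of illustration, the FRTB text as originally published (BCBS, January 2016) ties risk factor modellability to a minimum of 24 real price observations over the preceding 12 months, with no more than one month between consecutive observations. The sketch below implements that eligibility test under those assumptions; the observation history is invented.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of, min_obs=24, max_gap_days=31):
    """Rough FRTB real-price eligibility test (per the January 2016 text):
    at least `min_obs` real price observations in the trailing 12 months,
    with no gap between consecutive observations longer than ~one month."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < min_obs:
        return False
    gaps = [(b - a).days for a, b in zip(obs, obs[1:])]
    return all(g <= max_gap_days for g in gaps)

# Illustrative history: one real price observation roughly every 12 days.
history = [date(2018, 1, 2) + timedelta(days=12 * i) for i in range(30)]
print(is_modellable(history, as_of=date(2018, 12, 31)))  # True
```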

Cloud and database technology

Often, organisations across this sector will need new capabilities to cater to the sheer volume of data they need to process. That typically means technologies that can manage new deployment models in the cloud, but also deliver ease of integration for data scientists and effective enterprise search for more general users.

From the database perspective, we see a trend for businesses to adopt new technologies such as NoSQL. Traditional relational databases are struggling to cope with the growing volumes of data these organisations are collecting via mobile banking apps and for regulatory filings, for example. NoSQL is also typically cheaper to run: it scales more easily and delivers more flexible infrastructure cost control.
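One way to picture that flexibility: a document store such as MongoDB can hold heterogeneous market data records side by side without a fixed schema. The sketch below, using the pymongo driver, is a minimal illustration; the connection string, database and field names are assumptions.

```python
from datetime import datetime
from pymongo import ASCENDING, MongoClient

# Connect to a local MongoDB instance (connection details are illustrative).
client = MongoClient("mongodb://localhost:27017")
prices = client["marketdata"]["prices"]
prices.create_index([("isin", ASCENDING), ("date", ASCENDING)])

# Documents need not share a fixed schema: a vendor close price and a
# broker quote with different fields can live in the same collection.
prices.insert_many([
    {"isin": "US0378331005", "date": datetime(2018, 6, 1), "close": 190.24,
     "source": "vendor_x"},
    {"isin": "US0378331005", "date": datetime(2018, 6, 1), "bid": 190.10,
     "ask": 190.40, "source": "broker_a"},
])

# Query all observations for one instrument from June 2018 onwards.
for doc in prices.find({"isin": "US0378331005",
                        "date": {"$gte": datetime(2018, 6, 1)}}):
    print(doc)
```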

Finding a way forward

Today, organisations across the financial services sector are having to manage increasingly data-intensive processes in areas like operations, valuation and risk. At the same time, they are challenged by users who have different expectations of the data management systems they engage with and who increasingly look for a self-service approach.

In this new era of financial data management, they need to put new processes in place that focus on the needs of the user, and leverage technologies that are open and flexible, and that deliver high performance along with ease of access and control.
