The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Tackling a New Era of Financial Data Management

By Martijn Groot, Vice President of Product Management, Asset Control

Business users across financial services are more data-hungry than ever before. They want to interact with data directly, quickly collecting, manipulating and analysing it in order to streamline operations.

There are two drivers for this. First, users increasingly expect instant access to data. Second, jobs in risk, finance, control and operational roles are becoming more data intensive. Workers often need to aggregate content to draft a regulatory report, for example, or to sign off on a portfolio valuation.

To meet this growing user demand, organisations need to move to a new style of data management that handles large data volumes and supports ease of access. Unfortunately, the data provision processes and technology infrastructure within financial institutions today lag behind this goal.

In addressing the challenge, the first step is to acquire the necessary data sources and, if required, different perspectives on the data. Organisations may therefore need to assimilate and assess different prices and the differing opinions of brokers and market makers.

The next step is data mastering, which allows businesses to cross-compare, cross-reference and tie all the different data collection threads together. This helps them enrich their datasets or, for example, calculate average prices. The third element is making the data accessible to users, a process that includes ensuring it is easily embedded into workflows.
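The mastering step described above can be sketched in a few lines. This is a minimal illustration only: the source names, identifiers and the simple mean used as the composite price are assumptions for the example, not a description of any particular vendor feed or platform.

```python
# Sketch of data mastering: cross-referencing quotes from several
# hypothetical sources and deriving a composite (average) price.
from statistics import mean

# Quotes for the same instrument, received under different identifier schemes
quotes = [
    {"source": "broker_a", "id_type": "ISIN", "id": "XS0000000001", "price": 101.25},
    {"source": "broker_b", "id_type": "ISIN", "id": "XS0000000001", "price": 101.40},
    {"source": "market_maker_c", "id_type": "CUSIP", "id": "000000AA1", "price": 101.10},
]

# Cross-reference: map every vendor identifier to one internal master ID
id_map = {
    ("ISIN", "XS0000000001"): "MASTER-1",
    ("CUSIP", "000000AA1"): "MASTER-1",
}

def master(quotes, id_map):
    """Group quotes by master ID and derive a composite (mean) price."""
    grouped = {}
    for q in quotes:
        master_id = id_map[(q["id_type"], q["id"])]
        grouped.setdefault(master_id, []).append(q["price"])
    return {mid: round(mean(prices), 4) for mid, prices in grouped.items()}

composite = master(quotes, id_map)
print(composite)  # {'MASTER-1': 101.25}
```

In practice the cross-reference table and the derivation rule (mean, median, vendor hierarchy) would be configurable, but the shape of the step is the same: resolve identifiers to a master record, then derive.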

In the past, businesses have tended to concentrate on sourcing as much data as possible and placing it in large data warehouses. Unfortunately, they have focused less on how to operationalise the data and make it accessible.

To address these issues, businesses need to look closely at the needs of the users they serve. The first group, operational users, need an overview of the whole data collection process. This should include insight into where data comes from, how much has been collected, and what gaps there are. Monitoring this gives the organisation an early warning if something goes wrong.

The second category consists of users who need to interact with the data. They might want to back-test a model or price a new complex security, and they need to be able to easily interrogate the data. The third group, data scientists, expect easy integration via languages like Python or R, or just enterprise search capabilities that enable them to quickly assess available datasets.
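As a rough sketch of the programmatic access the second and third groups expect, the snippet below interrogates a price history through a simple Python call. The API surface (`get_history`) and the in-memory store are hypothetical; a real platform would expose similar calls through a REST endpoint or client library.

```python
# Hypothetical sketch of programmatic data access for back-testing
# or pricing work. The store and API names are illustrative only.
from datetime import date

# Mastered price history, keyed by internal instrument ID
store = {
    "MASTER-1": [
        (date(2021, 1, 4), 101.25),
        (date(2021, 1, 5), 101.40),
        (date(2021, 1, 6), 101.10),
    ],
}

def get_history(instrument_id, start, end):
    """Return (date, price) observations within [start, end], inclusive."""
    return [(d, p) for d, p in store.get(instrument_id, []) if start <= d <= end]

hist = get_history("MASTER-1", date(2021, 1, 4), date(2021, 1, 5))
print(len(hist))  # 2
```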

To address the needs of these groups, businesses need to deliver:

  • Visibility of the approved data production process to ease the operational burden and satisfy regulatory requirements
  • Easier programmatic integration for data scientists to enable them to access data easily and cheaply
  • A Google-style enterprise search across the dataset.
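A toy illustration of the Google-style search in the last bullet: a keyword inverted index over dataset descriptions. The dataset names and descriptions are invented for the example; production enterprise search would add ranking, stemming and access control on top of this basic idea.

```python
# Minimal keyword search over dataset descriptions via an inverted index.
# Dataset names and descriptions are illustrative assumptions.
datasets = {
    "eod_prices": "end of day equity prices from consolidated vendor feeds",
    "fx_rates": "daily spot foreign exchange rates",
    "curve_data": "interest rate curves for valuation and risk",
}

# Build the inverted index: word -> set of dataset names containing it
index = {}
for name, description in datasets.items():
    for word in description.lower().split():
        index.setdefault(word, set()).add(name)

def search(query):
    """Return datasets whose descriptions contain every query word."""
    words = query.lower().split()
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

print(search("valuation risk"))  # {'curve_data'}
```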

Providing this level of business user enablement depends on having the right technology infrastructure in place. Many firms still carry complex legacy applications and, since the financial crisis, they also face significant cost pressures and need to get more from their existing infrastructure. There will therefore be a need to rationalise the landscape, but also a requirement to bring in new technologies to better handle the data intensity of current risk and valuation processes.

The current industry focus on prudent valuation and the emergence of regulations such as the Fundamental Review of the Trading Book (FRTB) are putting even greater pressure on financial services organisations. In line with the changes FRTB brings, market data and risk systems need to support complex market data workflows, drawing on new sources to meet the real-price requirements and regulatory prescribed classifications. To manage all this, organisations need a way to smoothly source and integrate market data, track risk factor histories, and proactively manage data quality – all through one integrated and scalable platform.

Cloud and database technology

Often, organisations across this sector will need new capabilities to cater to the sheer volume of data they need to process. That typically means technologies that can manage new deployment models in the cloud, but also deliver ease of integration for data scientists and effective enterprise search for more general users.

From the database perspective, we see a trend for businesses to adopt new technologies such as NoSQL. Traditional relational databases are struggling to cope with the growing volumes of data these organisations are collecting, for example via mobile banking apps and for regulatory filings. NoSQL is also typically cheaper to run than these traditional technologies: it scales more easily and gives more flexible control over infrastructure costs.
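One reason a document-oriented (NoSQL-style) model suits market data is that records for different instrument types can carry different fields without schema migrations. The sketch below shows the idea with plain Python dictionaries and JSON serialisation; the field names are illustrative assumptions, not any vendor's actual schema.

```python
# Sketch of schema-flexible documents for heterogeneous instruments.
# Each record carries only the fields relevant to its type.
import json

documents = [
    {"master_id": "MASTER-1", "type": "bond", "coupon": 2.5, "maturity": "2030-06-01"},
    {"master_id": "MASTER-2", "type": "fx_option", "strike": 1.10, "expiry": "2022-03-15"},
]

# Documents serialise directly to JSON, a typical storage/wire format
payloads = [json.dumps(doc, sort_keys=True) for doc in documents]

# A simple 'query': filter on a field only some documents carry
bonds = [d for d in documents if d.get("type") == "bond"]
print(len(bonds))  # 1
```

A relational design would need either sparse nullable columns or one table per instrument type; the document model sidesteps both, which is part of why it scales more cheaply for this kind of data.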

Finding a way forward

Today, organisations across the financial services sector have to manage increasingly data intensive processes in areas like operations, valuation and risk. At the same time, they are challenged by users who have different expectations of the data management systems they engage with and who increasingly look for a self-service approach.

In this new era of financial data management, they need to put new processes in place that focus on the needs of the user, and to leverage technologies that are open and flexible and that deliver high performance, ease of access and control.
