
Improving Operational Efficiency in the Data Management Process


By: Hugo Boer, Senior Product Manager, Asset Control

As the financial services industry starts to harness a raft of new data sources for fast, effective and usable insights, the bottleneck for financial institutions becomes how well they really understand their data management processes.

How many firms, for example, can answer the following questions: Do we understand the impact of bad data quality? Can we measure this quality, and do we have full oversight over steps in the data management process? Can we pre-empt data issues? When data issues arise, can we take restorative action quickly and track adjustments along the way without losing oversight of those changes?

If your firm can’t answer all of these questions, the priority is to develop transparent, end-to-end financial data management: processes that deliver real-time insight into daily data sourcing, mastering and distribution, improve workflows and increase operational efficiency. That foundation is what allows firms to unlock the value of new data sources.

Data scrutiny

New regulatory drivers and business pressures have led to increased scrutiny of the data management process. For example, the European Central Bank’s Targeted Review of Internal Models (TRIM) was introduced to assess whether the internal models banks use to calculate risk-weighted assets produce reliable and comparable results. The TRIM guide contained a specific Data Quality Framework that treats data accuracy, consistency, completeness, validity, availability and traceability as preconditions for these models.
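
To make the six dimensions concrete, the sketch below shows one way they might be expressed as automated checks on a single price record. This is a minimal illustration in Python: the record fields, thresholds and proxy logic (for example, treating a large jump versus the prior observation as a potential accuracy issue) are assumptions for the sketch, not requirements taken from the TRIM guide.

```python
from dataclasses import dataclass
from datetime import date, datetime
from typing import Optional

# Illustrative record; the fields are assumptions for this sketch,
# not a schema mandated by the TRIM guide.
@dataclass
class PriceRecord:
    instrument_id: str
    price: Optional[float]
    currency: Optional[str]
    as_of: Optional[date]
    source: Optional[str]            # traceability: where the value came from
    received_at: Optional[datetime]  # availability: when it arrived

def quality_checks(rec: PriceRecord, prior_price: Optional[float]) -> dict:
    """Evaluate one record against the six TRIM quality dimensions."""
    return {
        # completeness: all mandatory fields are populated
        "completeness": (rec.price is not None and bool(rec.currency)
                         and rec.as_of is not None),
        # validity: values fall inside a plausible domain
        "validity": rec.price is not None and rec.price > 0,
        # accuracy (proxy): no implausible jump versus the prior observation
        "accuracy": (rec.price is None or not prior_price
                     or abs(rec.price - prior_price) / abs(prior_price) < 0.25),
        # consistency: currency code uses the expected three-letter ISO form
        "consistency": rec.currency is not None and len(rec.currency) == 3,
        # availability: the record actually arrived
        "availability": rec.received_at is not None,
        # traceability: the originating source is recorded
        "traceability": bool(rec.source),
    }
```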

This regulatory focus is, however, just one aspect of the growing recognition among financial institutions of the need to improve insight into data management processes. There is huge business pressure on data management teams not only to manage increasing numbers of data sources, but also to deliver accurate and consistent datasets in ever-decreasing time windows.

Despite overlap between the data used by different departments, many teams, from finance to risk, still operate in functional silos. In an increasingly joined-up and overlapping corporate data environment, these dispersed data management activities are inherently inefficient: parallel sourcing teams buy the same data multiple times and duplicate effort on data preparation. The result is not only high data sourcing and preparation costs, but also unnecessary data storage and, critically, unacceptable operational risk.

Transparent process

What is required is a single overview of the data management process: the ability to track data collection and verification progress and gain rapid insight into any problem that could jeopardise Service Level Agreements (SLAs). Companies have attempted to deliver point oversight via existing management information tools, but these fail to provide an intuitive single view of the entire data management process across the business. Data management teams need transparency across the diverse data silos and deliveries to data consumers, with insight into the status of every step from data sourcing, cleansing and verification through to delivery to downstream systems. Essentially, they need a single view of the health of corporate data.
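
As a rough illustration of what such a single view could look like in practice, the sketch below tracks each dataset through a handful of hypothetical pipeline stages and flags deliveries at risk of missing their SLA. The stage names, the one-hour buffer and the data structures are all assumptions for the example, not a description of any particular product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical stage names for a sourcing-to-distribution pipeline.
STAGES = ("sourced", "cleansed", "verified", "delivered")

@dataclass
class DatasetStatus:
    name: str
    sla_deadline: datetime
    completed: dict = field(default_factory=dict)  # stage -> completion time

    def mark(self, stage: str, when: datetime) -> None:
        self.completed[stage] = when

    def current_stage(self) -> str:
        # Report the furthest stage the dataset has reached so far.
        for stage in reversed(STAGES):
            if stage in self.completed:
                return stage
        return "pending"

    def at_risk(self, now: datetime, buffer: timedelta) -> bool:
        # Flag anything not yet delivered that is within `buffer` of its SLA.
        return ("delivered" not in self.completed
                and now + buffer >= self.sla_deadline)

def health_view(datasets: list, now: datetime) -> list:
    """One consolidated status line per dataset, whichever team owns it."""
    return [(d.name, d.current_stage(), d.at_risk(now, timedelta(hours=1)))
            for d in datasets]
```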

The implications of enhanced data transparency are significant. Beyond meeting the regulatory requirements associated with increased data scrutiny, including data quality, visibility and completeness, a single view of the entire data management process allows organisations to drive significant operational change and create a culture of continuous data improvement.

For example, a complete view of any overlap in data resources enables firms to streamline data acquisition, reducing purchase costs as well as data cleansing and delivery costs. It also reduces the risk that different areas understand the same data differently, a gap that can create significant federation issues affecting both operational performance and regulatory compliance. Simple steps, such as calibrating consistently applied rules per dataset or asset class and ensuring every change to a data cleansing rule is documented, further reinforce the value of acquired data to the business.
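
The sketch below illustrates the second of those steps: a cleansing rule that cannot be updated without the change being documented. The rule, its scope and the change-log format are hypothetical; the point is simply that the audit trail lives alongside the rule itself rather than in a separate document.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class CleansingRule:
    """A cleansing rule whose changes are documented alongside the logic."""
    name: str
    scope: str                       # e.g. the asset class the rule applies to
    apply: Callable[[float], float]
    change_log: list = field(default_factory=list)

    def update(self, new_apply: Callable[[float], float],
               author: str, reason: str) -> None:
        # Record who changed the rule, when and why, before swapping it in.
        self.change_log.append({
            "at": datetime.now(timezone.utc).isoformat(),
            "author": author,
            "reason": reason,
        })
        self.apply = new_apply

# Example: one outlier cap applied consistently across an asset class,
# then tightened with the change documented rather than made silently.
cap_daily_move = CleansingRule(
    name="cap_daily_move",
    scope="us_corporate_bonds",
    apply=lambda pct: max(min(pct, 0.25), -0.25),
)
cap_daily_move.update(lambda pct: max(min(pct, 0.20), -0.20),
                      author="j.smith", reason="tighten cap after QA review")
```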

Extended data understanding

Transparency into the status of data sourcing, processing and delivery should not be limited to data management experts; transparency of the data supply chain should be shared with everyone in the company, providing end users with insight into the quality of the data used for risk, finance, post-trade reporting and so on. Data confidence is a fundamental requirement in post-financial-crisis trading, and providing end users with a simplified view of the data acquisition, cleansing and provisioning process for each data source will play a key role in fostering a common, companywide understanding of the data and how it is used.

For example, showing users that Bloomberg data is the primary source for US corporate bonds, Thomson Reuters data for foreign exchange and Six Financial data for corporate actions; capturing comments from data analysts whenever this hierarchy changes; noting which data cleansing rules have been applied; and recording when manual intervention took place are all valuable pieces of information. This transparency supports better data knowledge and confidence, and can also resolve some of the data misalignment that has built up over the past couple of decades.
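
A simple way to make such a hierarchy inspectable is to store it as data with its own change history, as in the sketch below. The primary sources follow the example above; the secondary sources, analyst name and comment are invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SourceHierarchy:
    """Ordered source preference for one data domain, with an audit trail."""
    domain: str
    sources: list                   # preference order; the first is primary
    history: list = field(default_factory=list)

    def reorder(self, new_sources: list, analyst: str, comment: str) -> None:
        # Capture the analyst's comment whenever the hierarchy changes.
        self.history.append({
            "at": datetime.now(timezone.utc).isoformat(),
            "analyst": analyst,
            "comment": comment,
            "before": list(self.sources),
            "after": list(new_sources),
        })
        self.sources = list(new_sources)

# The hierarchy described above, expressed as data end users can inspect.
hierarchies = [
    SourceHierarchy("us_corporate_bonds", ["Bloomberg", "Thomson Reuters"]),
    SourceHierarchy("foreign_exchange", ["Thomson Reuters", "Bloomberg"]),
    SourceHierarchy("corporate_actions", ["Six Financial"]),
]
hierarchies[0].reorder(["Thomson Reuters", "Bloomberg"], analyst="a.jones",
                       comment="primary feed gaps on illiquid issues this week")
```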

With better understanding of the end-to-end process for each data source, firms can begin to spot trends in the relative quality of different sources per market and asset class. Are there repeat errors in a data source? Is there an alternative data source already being used somewhere else in the business? Or is it time to onboard a new provider? End-to-end data management visibility will enable firms to drive a culture of continual improvement, addressing data quality issues and seeking out the most effective data sources for the business.
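
Spotting such repeat errors can be as simple as counting recurring source, asset class and error type combinations in the exception log, as the illustrative sketch below does. The log schema and the threshold of three occurrences are assumptions for the example, not a specific product's behaviour.

```python
from collections import Counter

def repeat_error_report(error_log: list, threshold: int = 3) -> list:
    """Surface (source, asset_class, error_type) combinations that recur.

    error_log is a list of dicts such as
    {"source": "VendorA", "asset_class": "fx", "error_type": "stale_price"};
    the field names are illustrative.
    """
    counts = Counter((e["source"], e["asset_class"], e["error_type"])
                     for e in error_log)
    # Recurring issues suggest reviewing the source hierarchy, switching to an
    # alternative source already in use elsewhere, or onboarding a new one.
    return [(key, n) for key, n in counts.most_common() if n >= threshold]

# Example usage:
log = [
    {"source": "VendorA", "asset_class": "fx", "error_type": "stale_price"},
    {"source": "VendorA", "asset_class": "fx", "error_type": "stale_price"},
    {"source": "VendorA", "asset_class": "fx", "error_type": "stale_price"},
    {"source": "VendorB", "asset_class": "bonds", "error_type": "missing_field"},
]
print(repeat_error_report(log))  # [(('VendorA', 'fx', 'stale_price'), 3)]
```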

Conclusion

The total cost of end-to-end data management is becoming far more apparent, especially given the growing overlap in data usage across the business and the rising number of new data sources available. Add in escalating regulatory expectations for robust processes and the operational risk created by siloed data management teams, and the implications of a lack of transparency become clear.

To maximise the value of new data sources, financial institutions need to evolve beyond departmental data silos and achieve end-to-end transparency of the data management process. While this will significantly improve the data management operation itself, it is also essential to push data responsibility and knowledge out to end users. Data quality is a business issue, and providing data transparency to business teams will be key to creating a strong culture of continuous improvement and to leveraging feedback to drive up data quality and confidence across the organisation.
