
IT Rationalisation: Mobilising Data Assets

By: Martijn Groot, Vice President of Product Strategy, Asset Control

IT rationalisation has become a major focus for financial services firms over the past couple of years – from Deutsche Bank’s Strategy 2020, which includes modernising outdated and fragmented IT architecture, to HSBC’s Simplify the Bank plan, which includes an architecture-led strategy to halve the number of applications across the whole group over a 10-year period.

This emphasis on streamlining complex infrastructure is being driven by the competitive and regulatory landscape. Over the past decade it has become very clear that maintaining line-of-business data silos carries significant risk, given not only the cost of regulatory compliance, with its demands for cross-sectional reporting, but also the implications for the speed of business change.

As a result, a key part of this rationalisation process has been investment in APIs (application programming interfaces) to enable interoperability between applications and, ideally, eradicate duplication. However, while many organisations have appointed data stewards with a remit to determine data and application requirements across specific business functions, a siloed mentality persists wherever data governance lacks maturity. From cost reduction to business agility, any successful application rationalisation or data supply chain improvement project will require significantly improved models for data governance.

Demand for Openness

At the same time, the business focus is turning increasingly outward as organisations recognise the importance of the new financial ecosystem. IT is tasked not only with rationalisation but also with moving from the automation of individual processes to the automation of an end-to-end supply chain involving different service providers.

With a need to expose data to new fintech partners as well as customers, many banks are putting in place their own API marketplaces through which they expose their data to selected third parties. While such changes in the retail market are being driven in the EU by the revised Payment Services Directive (PSD2), corporate products in cash, foreign exchange, liquidity and financing will also demand new APIs for their data.

Given this demand for openness both internally and externally, a common, cross-application taxonomy of products and services and a uniform data dictionary are clearly important. But this model has to go further. Creating a common data model is a great start, but business users have to be empowered to explore and exploit this consistent information resource, not only to meet regulatory demands but also to support business change.
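To make this concrete, a single data dictionary entry might record a canonical attribute name, a business-language definition, an accountable owner and the legacy synonyms it supersedes. The Python sketch below is illustrative only; the field names and the example attribute are assumptions, not a prescribed standard.

```python
from dataclasses import dataclass, field

# A minimal sketch of a uniform data dictionary entry. Field names and the
# example attribute are illustrative assumptions, not a prescribed standard.
@dataclass
class DataDictionaryEntry:
    name: str          # canonical attribute name used across applications
    definition: str    # business-language definition
    data_type: str     # e.g. "decimal", "date", "enum"
    owner: str         # accountable data steward or business function
    synonyms: list = field(default_factory=list)  # labels used in legacy silos

# The same attribute, however it is labelled in each silo, maps to one entry.
coupon_rate = DataDictionaryEntry(
    name="coupon_rate",
    definition="Annual interest rate paid by a fixed income instrument",
    data_type="decimal",
    owner="Reference Data Management",
    synonyms=["CPN_RT", "couponPct", "interest_rate"],
)
```

Mapping each silo's local labels to one canonical entry is what allows applications, and the people using them, to agree on meaning before any data is exchanged.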

Opening up a single, consistent data source to business users via standardised, self-service technologies – such as Representational State Transfer (REST) APIs – is transformative. A simple browser-based interface that enables business users to select required data on demand, with the addition of formatting and frequency tools, effectively opens up the data asset to drive new value. Data can be accessed, integrated into other systems and/or explored via standard data discovery tools – all without reliance on IT.
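As a hedged illustration of what such self-service access could look like, the sketch below requests price data from a hypothetical REST endpoint, with query parameters standing in for the formatting and frequency controls described above. The base URL, resource path, parameter names and token are all assumptions made for illustration; any real data platform will define its own.

```python
import requests

# A sketch of self-service data access over a REST API. The endpoint,
# parameters and token are hypothetical placeholders.
BASE_URL = "https://data-platform.example.com/api/v1"

response = requests.get(
    f"{BASE_URL}/instruments/prices",
    params={
        "identifier": "US0378331005",  # an ISIN, as an example
        "from": "2021-01-01",
        "to": "2021-03-31",
        "frequency": "daily",          # illustrative frequency control
        "format": "json",              # illustrative formatting control
    },
    headers={"Authorization": "Bearer <access-token>"},  # placeholder token
    timeout=30,
)
response.raise_for_status()
prices = response.json()

# From here the data can be loaded into a spreadsheet, BI tool or analytics
# environment without bespoke IT integration work.
```

Because the interface is plain HTTP, the same request could equally be issued from a browser, a spreadsheet plug-in or a standard data discovery tool.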

Maintaining Control

Obviously, this model has to be controlled – from avoiding a data deluge to ensuring confidentiality is maintained, the data cannot be left open to everyone. The ability to manage permissions for service providers, internal users and customers is essential if the organisation is to ensure compliance with data privacy laws, adherence to content licence agreements and protection of commercially sensitive information. A REST API should therefore enforce fine-grained access control, so that data is never exposed to users who lack permission due to licence constraints or data sensitivity.
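A minimal sketch of this kind of entitlement check follows. The permission model, a simple mapping of users to the datasets they are licensed to see, is a deliberately simplified assumption; in practice this would typically be delegated to a dedicated entitlements or identity service and evaluated on every API request.

```python
# A simplified, assumed entitlement model: users mapped to licensed datasets.
ENTITLEMENTS = {
    "analyst_emea": {"eod_prices", "fx_rates"},
    "fintech_partner": {"fx_rates"},  # third party holds a narrower licence
}

def authorise(user: str, dataset: str) -> None:
    """Raise if the user is not licensed or permitted to see the dataset."""
    if dataset not in ENTITLEMENTS.get(user, set()):
        raise PermissionError(f"{user} is not entitled to {dataset}")

authorise("fintech_partner", "fx_rates")      # permitted
# authorise("fintech_partner", "eod_prices")  # would raise PermissionError
```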

With the right security measures in place, information that would have taken business users weeks to obtain while waiting for IT can now be discovered and reported on in days. Given the increasing need for reports – both for regulatory purposes and for data discovery to support business change – this self-service access to trusted, standardised data is key.

Conclusion

The regulatory reporting requirements that have evolved over the past decade may have put the spotlight on the endemic, silo-based infrastructure model, but it has also become very clear to the financial services industry that if operational costs are to be reduced, IT rationalisation is imperative. At the same time, an integrated financial ecosystem is becoming vital in both retail and corporate markets. Without a mature data governance model that leverages new enablers, including APIs and standard data dictionaries, organisations will struggle to realise both rationalisation and extension goals.

To realise the vision of agile, simplified financial services business models that are competitive in new digital markets, organisations need to not only create a centralised data source, but also explore new standardised technologies to mobilise data and empower users throughout the business and beyond.
