
Taylor Discusses Why LCH.Clearnet Has Invested in its Data Architecture


UK-based clearer LCH.Clearnet is midway through a significant project to revamp its internal data architecture, according to Martin Taylor, the group's chief information officer. Speaking at last week's Thomson Reuters-organised data management roundtable, Taylor told attendees that the clearer is six months into a project that will deliver a more standardised approach to data management through the move to a master data management (MDM) structure and the implementation of a new data warehouse.
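As general background, and not a description of LCH.Clearnet's actual systems, the sketch below shows what the core of an MDM consolidation step can look like: duplicate counterparty records held in separate source systems are matched on a shared identifier and merged into a single "golden" record. All field names, source systems and survivorship rules here are assumptions made purely for illustration.

```python
# Illustrative only: a minimal master data management (MDM) consolidation step.
# Field names, matching rules and source systems are assumptions, not a
# description of LCH.Clearnet's architecture.
from collections import defaultdict

# Duplicate counterparty records held in separate source systems
source_records = [
    {"system": "clearing", "lei": "LEI123", "name": "Acme Bank PLC", "country": "GB"},
    {"system": "risk",     "lei": "LEI123", "name": "Acme Bank",     "country": None},
    {"system": "clearing", "lei": "LEI456", "name": "Beta Securities", "country": "FR"},
]

def consolidate(records):
    """Group records by a shared identifier (here the LEI) and merge each
    group into one 'golden' record, preferring non-empty values."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec["lei"]].append(rec)

    golden = {}
    for lei, recs in grouped.items():
        merged = {"lei": lei}
        for field in ("name", "country"):
            # Simple survivorship rule: the first non-empty value wins
            merged[field] = next((r[field] for r in recs if r.get(field)), None)
        merged["sources"] = sorted({r["system"] for r in recs})
        golden[lei] = merged
    return golden

if __name__ == "__main__":
    for lei, record in consolidate(source_records).items():
        print(lei, record)
```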

Taylor discussed the impact that the Lehman default had on the clearing house in September 2008, at a time when it was in the throes of a significant platform upgrade within its energy derivatives business. The events around the largest default it had ever handled forced the clearing house to re-evaluate its approach to data management, he explained. “It provided us with a deeper understanding of the role of clearing in the market and highlighted that we need earlier access to data,” he continued.

One of the biggest problems caused in the immediate aftermath of Lehman's collapse was the lack of available staff who understood where the data was kept and could therefore help unwind the investment bank in an orderly manner. Ordinarily there are people on hand to assist in the unwinding process by extracting vital counterparty data from the relevant systems, but in Lehman's case everyone had literally left the building, he explained. Taylor stressed that the upcoming regulation around living wills would further highlight these issues, to the benefit of data management overall.

For its part, LCH.Clearnet is improving the security, integrity and auditability of its internal data management systems. The data management function sits within the clearer's risk management department, which also places emphasis on the speed and accuracy of the data for risk management purposes.

Taylor emphasised that the people involved in the data management process are as important as the systems. He highlighted public and government concerns about data security as an indicator of how important the issue has become for the industry as a whole. “We need to make sure that we understand who is handling the data and that there is an audit trail and checks and balances in place to regulate this,” he said.
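To illustrate the kind of audit trail Taylor describes, and again not as a depiction of any LCH.Clearnet system, the minimal sketch below appends an entry for every data-handling event so that who touched a record, and when, can be reconstructed later. All function names and fields are hypothetical.

```python
# Illustrative only: a minimal append-only audit trail recording who handled
# which data record and when. Names and fields are assumptions, not a
# description of any LCH.Clearnet system.
import datetime
import json

AUDIT_LOG = []  # in practice this would be durable, tamper-evident storage

def record_access(user: str, action: str, record_id: str) -> None:
    """Append an audit entry for every data-handling event."""
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "action": action,       # e.g. "read", "update", "export"
        "record_id": record_id,
    })

def audit_trail(record_id: str) -> list:
    """Return every recorded event for a given data record, oldest first."""
    return [entry for entry in AUDIT_LOG if entry["record_id"] == record_id]

if __name__ == "__main__":
    record_access("alice", "update", "counterparty-LEI123")
    record_access("bob", "read", "counterparty-LEI123")
    print(json.dumps(audit_trail("counterparty-LEI123"), indent=2))
```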

To this end, he feels a single group should be put in charge of managing data across an enterprise, but noted: “the bigger the enterprise, the harder this is to achieve”.

He concluded by suggesting, like a number of the other panellists, that the financial services industry could learn data standardisation tips from other industries such as pharmaceuticals. “They are much more ruthless in their application of data management and we don’t need to reinvent the wheel. After all, they need to be able to track down components very quickly just in case one of them kills somebody,” he said.

