Taylor Discusses Why LCH.Clearnet Has Invested in its Data Architecture

UK-based clearer LCH.Clearnet is mid-way through a significant project to revamp its internal data architecture, according to Martin Taylor, the firm’s group chief information officer. Taylor told attendees at last week’s Thomson Reuters-organised data management roundtable that the clearer is six months into a project that will deliver a more standardised approach to data management via a move to a master data management (MDM) structure and the implementation of a new data warehouse.

Taylor discussed the impact that the Lehman default had on the clearing house in September 2008, at a time when it was in the throes of a significant platform upgrade within its energy derivatives business. The events around the largest default it had ever handled forced the clearing house to re-evaluate its approach to data management, he explained. “It provided us with a deeper understanding of the role of clearing in the market and highlighted that we need earlier access to data,” he continued.

One of the biggest problems caused immediately by Lehman’s collapse was the lack of available staff who understood where the data was kept and could therefore help unwind the investment bank in an orderly manner. Ordinarily there are people on hand to assist in the unwinding process by extracting vital counterparty data from the relevant systems, but in Lehman’s case, everyone had literally left the building, he explained. Taylor stressed that upcoming regulation around living wills would further highlight these issues, to the benefit of data management overall.

For its part, LCH.Clearnet is improving the security, integrity and auditability of its internal data management systems. The data management function sits within the clearer’s risk management department, which places additional emphasis on the speed and accuracy of the data for risk management purposes.

Taylor emphasised that the people involved in the data management process are as important as the systems. He pointed to public and government concerns about data security as an indicator of how important the issue has become for the industry as a whole. “We need to make sure that we understand who is handling the data and that there is an audit trail and checks and balances in place to regulate this,” he said.

To this end, he feels a single group should be put in charge of managing data across an enterprise, but noted: “the bigger the enterprise, the harder this is to achieve”.

He concluded by suggesting, like a number of the other panellists, that the financial services industry could learn data standardisation tips from other industries such as pharmaceuticals. “They are much more ruthless in their application of data management and we don’t need to reinvent the wheel. After all, they need to be able to track down components very quickly just in case one of them kills somebody,” he said.
