Taylor Discusses Why LCH.Clearnet Has Invested in its Data Architecture

UK-based clearer LCH.Clearnet is mid-way through a significant project to revamp its internal data architecture, according to group chief information officer Martin Taylor. Speaking at last week's Thomson Reuters-organised data management roundtable, Taylor told attendees that the clearer is six months into a project that will deliver a more standardised approach to data management via a move to a master data management (MDM) structure and the implementation of a new data warehouse.
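
To make the MDM concept concrete: the core of an MDM structure is typically a "golden record" that consolidates conflicting copies of the same entity held across different systems. The sketch below is purely illustrative and not a description of LCH.Clearnet's implementation; the field names and the "most recently updated value wins" survivorship rule are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical duplicate records for one counterparty, as they might
# arrive from separate trading, risk and settlement systems.
@dataclass
class CounterpartyRecord:
    source: str
    legal_name: Optional[str]
    country: Optional[str]
    updated: date

def build_golden_record(records: list[CounterpartyRecord]) -> dict:
    """Consolidate duplicate source records into a single master record.

    Survivorship rule (assumed for illustration): for each attribute,
    keep the most recently updated non-null value across all sources.
    """
    golden: dict = {}
    for field in ("legal_name", "country"):
        candidates = [r for r in records if getattr(r, field) is not None]
        if candidates:
            newest = max(candidates, key=lambda r: r.updated)
            golden[field] = getattr(newest, field)
    return golden

records = [
    CounterpartyRecord("trading", "Lehman Bros", None, date(2008, 6, 1)),
    CounterpartyRecord("risk", "Lehman Brothers Inc.", "US", date(2008, 9, 1)),
]
print(build_golden_record(records))
# {'legal_name': 'Lehman Brothers Inc.', 'country': 'US'}
```

Downstream consumers, such as a new data warehouse, would then read from the golden record rather than from any individual source system.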

Taylor discussed the impact that the Lehman default had on the clearing house in September 2008, at a time when it was in the throes of a significant platform upgrade within its energy derivatives business. The events around the largest default it had ever handled forced the clearing house to re-evaluate its approach to data management, he explained. “It provided us with a deeper understanding of the role of clearing in the market and highlighted that we need earlier access to data,” he continued.

One of the biggest problems caused in the immediate aftermath of Lehman's collapse was the lack of available staff who understood where the data was kept, knowledge that was needed to unwind the investment bank in an orderly manner. Ordinarily there are people around to assist in the unwinding process by extracting vital counterparty data from the relevant systems, but in Lehman's case everyone had literally left the building, he explained. Taylor stressed that the upcoming regulation around living wills would further highlight these issues, to the benefit of data management overall.

For its part, LCH.Clearnet is improving the security, integrity and auditability of its internal data management systems. The data management function sits within the clearer's risk management department, which places additional emphasis on the speed and accuracy of data for risk management purposes.

Taylor emphasised that the people involved in the data management process are as important as the systems. He pointed to public and government concerns about data security as an indicator of how important the issue has become for the industry as a whole. “We need to make sure that we understand who is handling the data and that there is an audit trail and checks and balances in place to regulate this,” he said.
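
As a purely illustrative sketch of the kind of audit trail Taylor describes (not LCH.Clearnet's actual controls), the example below hash-chains each access record to the previous one, so that any retrospective tampering with the who-did-what history is detectable. All names and fields are assumptions for the example.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Minimal tamper-evident log of who handled which data item."""

    def __init__(self) -> None:
        self.entries: list[dict] = []

    def record(self, user: str, action: str, item: str) -> None:
        # Chain each entry to the previous entry's hash, so editing an
        # earlier entry invalidates every entry that follows it.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "user": user,
            "action": action,
            "item": item,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        """Re-derive every hash; returns False if any entry was altered."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

trail = AuditTrail()
trail.record("alice", "read", "counterparty/LB-001")
trail.record("bob", "update", "counterparty/LB-001")
assert trail.verify()  # editing any earlier entry would make this fail
```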

To this end, he feels a single group should be put in charge of managing data across an enterprise, but noted: “the bigger the enterprise, the harder this is to achieve”.

He concluded by suggesting, like a number of the other panellists, that the financial services industry could learn data standardisation tips from other industries such as pharmaceuticals. “They are much more ruthless in their application of data management and we don’t need to reinvent the wheel. After all, they need to be able to track down components very quickly just in case one of them kills somebody,” he said.
