
Taylor Discusses Why LCH.Clearnet Has Invested in its Data Architecture

UK-based clearer LCH.Clearnet is midway through a significant project to revamp its internal data architecture, according to Martin Taylor, group chief information officer. Taylor told attendees at last week's Thomson Reuters-organised data management roundtable that the clearer is six months into a project that will deliver a more standardised approach to data management via a move to a master data management (MDM) structure and the implementation of a new data warehouse.
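
Taylor did not describe the design in detail, but the essence of an MDM structure is a single "golden record" per entity, consolidated from competing source systems under an agreed survivorship rule. Below is a minimal illustrative sketch in Python; the record fields, the LEI match key and the "last write wins" rule are assumptions for illustration, not LCH.Clearnet's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CounterpartyRecord:
    """A counterparty as seen by one source system (fields are hypothetical)."""
    source_system: str
    lei: str                  # Legal Entity Identifier, used as the match key
    name: str
    country: Optional[str] = None
    last_updated: str = ""    # ISO-8601 timestamp from the source system

def consolidate(records: list[CounterpartyRecord]) -> dict[str, CounterpartyRecord]:
    """Build one golden record per LEI by keeping, for each entity, the most
    recently updated record among competing sources ('last write wins')."""
    golden: dict[str, CounterpartyRecord] = {}
    for rec in sorted(records, key=lambda r: r.last_updated):
        golden[rec.lei] = rec  # later (fresher) records overwrite earlier ones
    return golden
```

In practice survivorship rules are usually applied per attribute rather than per record, but the principle is the same: every downstream consumer reads the one mastered version rather than its own copy.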

Taylor discussed the impact that the Lehman default had on the clearing house in September 2008, at a time when it was in the throes of a significant platform upgrade within its energy derivatives business. The events around the largest default it had ever handled forced the clearing house to re-evaluate its approach to data management, he explained. “It provided us with a deeper understanding of the role of clearing in the market and highlighted that we need earlier access to data,” he continued.

One of the biggest problems caused in the immediate aftermath of Lehman's collapse was the lack of available staff who understood where the data was kept, knowledge that was needed to unwind the investment bank in an orderly manner. Ordinarily, people are on hand to assist in the unwinding process by extracting vital counterparty data from the relevant systems, but in Lehman's case everyone had literally left the building, he explained. Taylor stressed that the upcoming regulation around living wills would further highlight these issues, to the benefit of data management overall.
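
One way firms reduce this dependence on particular individuals is a machine-readable data catalogue recording where each dataset lives and who stewards it, so that locating counterparty data survives the departure of any one person. The sketch below is purely illustrative; the dataset names, systems and contacts are hypothetical and not drawn from the article.

```python
# A minimal, hypothetical data catalogue: the goal is that finding
# counterparty data does not depend on any individual being available.
CATALOGUE = {
    "counterparty_positions": {
        "system": "energy_derivatives_platform",
        "table": "positions_eod",
        "steward": "risk-data-team@example.com",
    },
    "collateral_balances": {
        "system": "treasury_db",
        "table": "collateral_live",
        "steward": "treasury-data@example.com",
    },
}

def locate(dataset: str) -> str:
    """Return where a dataset lives and who is accountable for it."""
    entry = CATALOGUE[dataset]
    return f"{entry['system']}.{entry['table']} (steward: {entry['steward']})"

print(locate("counterparty_positions"))
```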

For its part, LCH.Clearnet is improving the security, integrity and auditability of its internal data management systems. The data management function sits within the clearer's risk management department, which places additional emphasis on the speed and accuracy of the data for risk management purposes.

Taylor emphasised that the people involved in the data management process are as important as the systems. He highlighted public and government concerns about data security as an indicator of how important the issue has become for the industry as a whole. “We need to make sure that we understand who is handling the data and that there is an audit trail and checks and balances in place to regulate this,” he said.
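
An audit trail of the kind Taylor describes can be as simple as recording who accessed which dataset, and when, at the point of access. The following sketch assumes a hypothetical Python data-access function; it shows one possible pattern, not LCH.Clearnet's mechanism.

```python
import functools
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("data-audit")

def audited(fn):
    """Record who called a data-access function, and when, before running it."""
    @functools.wraps(fn)
    def wrapper(*args, user: str, **kwargs):
        audit_log.info("%s accessed %s at %s", user, fn.__name__,
                       datetime.now(timezone.utc).isoformat())
        return fn(*args, user=user, **kwargs)
    return wrapper

@audited
def read_counterparty_exposure(lei: str, *, user: str) -> float:
    # Stub: a real version would query the warehouse under the caller's entitlements.
    return 0.0

read_counterparty_exposure("EXAMPLE-LEI", user="analyst_01")
```

Forcing every read through an audited access path is what makes the "checks and balances" enforceable rather than a matter of policy alone.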

To this end, he feels a single group should be put in charge of managing data across an enterprise, but noted: “the bigger the enterprise, the harder this is to achieve”.

He concluded by suggesting, like a number of the other panellists, that the financial services industry could learn data standardisation tips from other industries such as pharmaceuticals. “They are much more ruthless in their application of data management and we don’t need to reinvent the wheel. After all, they need to be able to track down components very quickly just in case one of them kills somebody,” he said.
