Algorithmics Partners with Banks for Credit Risk Data Initiative

Algorithmics has entered into an agreement with the Pan-European Credit Data Consortium (PECDC), a credit risk data pooling initiative by European banks, to provide products and services that help satisfy pending Basel II IRB requirements for the staging, structuring and validation of historical default, loss and recovery data.

Under the agreement, each PECDC member bank will pay Algorithmics a standard annual fee, including set-up charges for each data pool. Algorithmics will design the database mapping to central definitions and provide members with access to send and receive the aggregated bank data. Reporting and tools will be provided to PECDC members via a secure, dedicated web portal, and data can be extracted through an XML or CSV feed.

The PECDC, comprising a number of Europe’s top 15 banks and several global top 10 banks, was established in response to the scarcity of the timely and accurate historical corporate loss and recovery data required to satisfy Basel II. The consortium’s objective is to collect this data on a Europe-wide, or in some cases global, basis. Each member bank will contribute its own data, which Algorithmics collects and delivers back as a scrubbed, normalized, anonymized time-series database in which neither the source bank nor the borrower/lender can be identified.

The data mostly spans 1998 through 2005, although some goes back further. While the initiative is currently European, there is interest from outside the region, according to Craig Van Ness, senior vice president of credit data services at Algorithmics.

According to Van Ness, the PECDC initiative was originally started between the member banks and Fitch, prior to Fitch’s acquisition of Algorithmics in January this year. The data warehouse that Algorithmics is building is similar to the Loan Loss Database originally started in the U.S. by the Loan Pricing Corp. (LPC).

LPC and Fitch entered into a relationship in 2001 to provide risk management solutions bringing together LPC’s historical databases and analytics with Fitch Risk Management’s credit expertise.

Building on the European relationship between the banks and Fitch, Algorithmics worked with PECDC members for about one year to help establish the consortium. Van Ness now expects to export the new service offering’s technology and operating model back to the U.S.

The new infrastructure builds on Algorithmics’ Limit Manager and Credit Administrator components, supplemented by bespoke development. “The service has gone from a hand-crafted solution to an industrial-strength solution providing high volume, high availability and high throughput,” says Van Ness.

The first deliverable will be time-series data pools covering all Basel corporate exposure categories. In this initial phase, loss and recovery observation data will be pooled. Algorithmics’ tools can be used to calculate Exposure at Default (EAD), recovery rate and Loss Given Default (LGD) values for each exposure. Algorithmics will also produce aggregate statistics and analytical reporting by industry sector/geography, in accordance with guidelines developed with PECDC member banks.
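To make the arithmetic concrete, here is a minimal sketch of how EAD, recovery rate and LGD values might be derived from a single pooled loss/recovery observation. The record layout, field names and credit conversion factor are illustrative assumptions, not the consortium’s actual data model.

```python
# Minimal sketch only: field names, the flat-record layout and the credit
# conversion factor are assumptions, not Algorithmics' actual schema.

from dataclasses import dataclass


@dataclass
class LossObservation:
    """One pooled loss/recovery record for a defaulted exposure."""
    outstanding_at_default: float    # drawn amount owed when default occurred
    undrawn_commitment: float        # undrawn portion of the facility
    credit_conversion_factor: float  # share of the undrawn line assumed drawn
    recovered_amount: float          # discounted post-default recoveries
    workout_costs: float             # costs incurred collecting recoveries


def exposure_at_default(obs: LossObservation) -> float:
    # EAD: drawn amount plus the expected draw-down of the undrawn line
    return obs.outstanding_at_default + obs.credit_conversion_factor * obs.undrawn_commitment


def recovery_rate(obs: LossObservation) -> float:
    # Net recoveries as a fraction of the exposure at default
    return (obs.recovered_amount - obs.workout_costs) / exposure_at_default(obs)


def loss_given_default(obs: LossObservation) -> float:
    # LGD is the complement of the recovery rate, as a fraction of EAD
    return 1.0 - recovery_rate(obs)


obs = LossObservation(
    outstanding_at_default=8_000_000.0,
    undrawn_commitment=2_000_000.0,
    credit_conversion_factor=0.75,
    recovered_amount=5_700_000.0,
    workout_costs=200_000.0,
)
print(f"EAD:           {exposure_at_default(obs):,.0f}")  # 9,500,000
print(f"Recovery rate: {recovery_rate(obs):.1%}")         # 57.9%
print(f"LGD:           {loss_given_default(obs):.1%}")    # 42.1%
```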

The next stage of the project will focus on default observation data from which Probability of Default (PD) benchmark values will be calculated.
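As a rough illustration of how PD benchmark values can be derived from pooled default observations, the sketch below computes an observed one-year default frequency per cohort. The cohort grouping and record format are hypothetical, and the toy sample only shows the mechanics.

```python
# Toy sketch: the cohort grouping and record format are hypothetical; the
# sample is far too small to be meaningful and only shows the mechanics.

from collections import defaultdict

# (sector, year, defaulted) tuples standing in for pooled default observations
observations = [
    ("telecoms", 2003, True),   ("telecoms", 2003, False),
    ("telecoms", 2003, False),  ("telecoms", 2003, False),
    ("utilities", 2003, False), ("utilities", 2003, False),
    ("utilities", 2003, True),
]

# Observed one-year default frequency per (sector, year) cohort:
# defaults divided by obligors observed at the start of the year.
counts = defaultdict(lambda: [0, 0])  # cohort -> [defaults, obligors]
for sector, year, defaulted in observations:
    counts[(sector, year)][0] += int(defaulted)
    counts[(sector, year)][1] += 1

for cohort, (defaults, obligors) in sorted(counts.items()):
    print(f"{cohort}: observed default frequency = {defaults / obligors:.1%}")
```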

Members will pay an incremental fee at this stage.

Algorithmics expects that phase one will be completed by the end of this year, with subsequent phases delivered in the first quarter of next year and then semi-annually thereafter. Algorithmics will present findings at a general meeting of the PECDC on December 16, to be held at NIB Capital Bank in The Hague.

The initial findings will provide feedback on the LGD pool data that has been collected, as well as findings on what derived data is possible and recommendations going forward. Banks that have contributed data will receive the first deliverable of their time-series database and analytics.

Under the Basel II IRB approach to credit risk, banks provide their own estimates of LGD (the magnitude of the likely loss on the exposure expressed as a percentage of the exposure), EAD (the amount expressed in relevant currency to which the bank is exposed at the time of default) and PD (the probability in percentage terms that an exposure will fall into default).
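Taken together, these three estimates determine the expected loss on an exposure, EL = PD × LGD × EAD. A minimal worked illustration with invented figures:

```python
# Worked illustration with invented figures; not bank data.
pd_ = 0.02        # PD: 2% one-year probability of default
lgd = 0.45        # LGD: 45% of the exposure is lost on default
ead = 10_000_000  # EAD: exposure at default, in the relevant currency

# Under the IRB framework, expected loss is the product of the three
expected_loss = pd_ * lgd * ead
print(f"Expected loss: {expected_loss:,.0f}")  # 90,000
```
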
“With both the PECDC and our relationship with Algorithmics firmly established, we can now get the hard work really started!” commented Dr. Scott D. Aguais, director and head of credit risk methodology at Barclays Capital, and member of the PECDC management committee.
