The knowledge platform for the financial technology industry

A-Team Insight Blogs

Counterparty Data Has Become a Driver for Growth, Says A-Team Group, GoldenSource and CounterpartyLink


Senior managers have finally got a handle on counterparty data’s importance with regard to risk and entity exposure, and they are now attempting to leverage it as an asset to fuel growth, according to recent research carried out by A-Team Group and sponsored by GoldenSource and CounterpartyLink. The research, entitled Counterparty Data Emerges as a Business Asset to Fuel Growth, indicates that counterparty data is rising up the priority list within institutions.

Real advances have been made over the last year in ensuring counterparty data quality, reducing redundancy and putting new data management practices in place. Institutions have progressed from a focus on compliance and risk in counterparty relationships and, where possible, have moved on to leveraging the data itself. “Once firms understand risk and can identify exposure to entities, counterparty information becomes a business enabler,” said one of the respondents in the study, identified as a UK data manager at a leading bank.

A-Team Group conducted an extensive survey into counterparty data management in September last year, and the advances made in the space of a year have been striking, according to this year’s report. “It’s now about how we leverage counterparty information into making or saving money. As you show business intelligence, more managers get on board, and you can use it (counterparty data) in profitability analytics,” another respondent explained.

Counterparty data management has assumed its place in the wider picture of central data management strategies, the report explains. However, there are ongoing discussions about how much of this data should be managed centrally. “In 2008, the movement appears to be toward centrally holding and governing common data, such as identifiers or legal entity names, needed to aggregate and monitor firm-wide exposure,” the report states.

Some data related to specific business areas or asset classes may be best maintained in the business units closest to it. This includes data such as standing settlement instructions, which can be managed in sub-master files maintained by these downstream businesses, industry experts suggest.

In 2007, risk management and compliance ranked highest among the areas driving customer data aggregation projects, and this was again the case in this year’s survey. “Now, management appears to be focused on depth of understanding – who are the real counterparties – and validating assumptions on exposure to counterparties because of complex ownership issues,” the report states. This year’s survey indicates that improving operations also plays a key role.

The increase in investment in alternative asset classes has reinforced the need to integrate issue-to-issuer links into the securities master, the report adds. Furthermore, from a data quality perspective, validating data against both internal and external sources is now common practice in the market, according to survey respondents. This shows progress, as firms recognise the need to check internal data sources against external, independent ones.

Also on the data front, the lack of a globally accepted business entity identifier is once again acknowledged as a problem. Counterparty data solutions in the market leave a gap in this space, and it seems unlikely to be closed any time soon.

