Lack of Standardisation Remains a Sticking Point for Counterparty Data Management, Says Panel at A-Team Group, Asset Control and Avox Event

The topic of counterparty risk is no stranger to the headlines at the moment but, despite its high profile, counterparty data management still has a long way to go, according to speakers at a recent event organised by A-Team Group, Asset Control and Avox. Martijn Groot, director of market strategy at EDM vendor Asset Control, told attendees at last week’s “Counterparty Risk in the Spotlight” event in London that recent market events have acted as the “ultimate stress test” for this sector. He highlighted the many gaps in counterparty data management capabilities that exist within financial institutions.

He criticised the over-emphasis on value at risk (VaR) within financial institutions at the expense of the overall soundness of systems and processes. “The fact that even the president of the European Central Bank (ECB), Jean-Claude Trichet, mentioned the problem of data in a speech demonstrates how important the subject has become,” said Groot.

He explained that the need for data consistency in regulatory reporting will increase scrutiny of data management systems in the short term. “Financial institutions’ product and risk silos prevent a complete view of risk across an organisation,” added Groot. “The artificial segregation of risk types in particular means that there are sets of discrete functions for risk management that do not allow a bigger picture to be seen.”

Entity data is critical across all business functions, and recent years have seen increased complexity in the hierarchical structures of these entities. Mergers and acquisitions have made it ever harder to keep track of whom you are dealing with. At the same time, know your customer (KYC) and anti-money laundering (AML) legislation has increased, so it is more important than ever to keep track of entities, said Groot.
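To make the hierarchy point more concrete, the minimal sketch below (not taken from the panel; all entity names, identifiers and fields are illustrative assumptions) shows one way a counterparty master might record parent links between legal entities and roll a subsidiary up to its ultimate parent once an acquisition is recorded.

```python
# Illustrative sketch only: a toy legal-entity hierarchy with parent links.
# All identifiers, names and fields are hypothetical, not from the article.
from dataclasses import dataclass
from typing import Optional, Dict


@dataclass
class LegalEntity:
    entity_id: str                   # internal identifier (no global standard assumed)
    name: str
    parent_id: Optional[str] = None  # immediate parent; None for a group head


def ultimate_parent(entity_id: str, registry: Dict[str, LegalEntity]) -> LegalEntity:
    """Walk the parent chain to find the group-level counterparty."""
    entity = registry[entity_id]
    seen = {entity_id}
    while entity.parent_id is not None:
        if entity.parent_id in seen:       # guard against bad data (cycles)
            raise ValueError(f"Cyclic hierarchy at {entity.parent_id}")
        seen.add(entity.parent_id)
        entity = registry[entity.parent_id]
    return entity


# Example: when "Bank B Group" acquires "Broker C Ltd", re-pointing one parent
# link changes the group-level roll-up for every trade booked against C.
registry = {
    "A": LegalEntity("A", "Bank A Holdings"),
    "B": LegalEntity("B", "Bank B Group"),
    "C": LegalEntity("C", "Broker C Ltd", parent_id="A"),
}
print(ultimate_parent("C", registry).name)   # Bank A Holdings

registry["C"].parent_id = "B"                # acquisition recorded
print(ultimate_parent("C", registry).name)   # Bank B Group
```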

He referred to a recent speech by Francis Gross, head of the external statistics division at the ECB, who suggested that regulators should become the “catalysts” for change in the market. “Regulators will help to set metrics and provide more clarity around the requirements for data management projects,” said Groot.

Ken Price, CEO of Avox, disagreed that regulators would take the lead in driving forward data management. “It is regulatory inconsistencies across regions and globally that are causing many of the data management problems,” he argued. “It is these inconsistencies that will keep many vendors in business for some time to come.”

As well as attention from the regulatory community, the industry is facing increased scrutiny from clients. “We need to live up to increasing client requirements for more, faster data in areas such as valuations,” he said. This has resulted in a significant integration challenge for both internal and external data requirements.

Moreover, all these problems are exacerbated by the fact that there are no globally adopted standards in the entity data space. “Entity data is not just another securities master file issue,” said Groot. “The data comes from lots of disparate channels and it can also be highly sensitive in nature. Entity data is reference data on steroids.”

“News and events have shaken the financial world and brought the issue of entity data to the fore,” agreed Angela Wilbraham, CEO of A-Team Group. “In the scramble to understand their exposures, financial institutions have found many flaws in their data management practices.”

A move towards enterprise risk management is a likely outcome of the current financial turmoil, suggested Wilbraham. “This will be tricky to achieve across traditionally siloed departments and it will be interesting to see the impact of budget cutting in the short term.”

Avox’s Price cautioned that firms must also remember that entity data is not static. “This is a significant failure at the moment, as entity data is spread across different systems and not kept up to date with changes as they happen, which means it is not fit for purpose.”

Price concurred that current events have raised senior management’s awareness of the problem; however, there are now fewer people and resources available to fix it. “There is a slow process of maturity going on in the industry and moves are not being made at lightning speed,” he said. “But should you wait until you receive a speeding ticket before you act?”

Frank Lemieux, senior managing consultant at Capgemini, elaborated on the work that his consulting firm has been engaged in to this end. “We are currently looking to reach agreement with financial institutions about the business terminology that is being used in the counterparty data space. This is the first step and then we put together a conceptual model for an entity and metrics to put in place for these projects,” he explained.

The initial assessment is the longest and most challenging phase, according to Lemieux, and usually takes between six and 12 weeks. “Agreeing on the business terms is the hardest part but once that has been achieved, we can usually produce a deliverable every four to six weeks after that,” he said.

The current market environment has proved challenging for these projects because of the intense scrutiny of return on investment (ROI). “Most firms are using better traceability and transparency of data as the primary argument to get these projects off the ground,” said Lemieux.

The horizon for ROI has shortened, added Groot. “The metrics being used are a combination of the streamlining of the supply chain for data, the reduction of duplicative or redundant data and the reduction in direct costs from errors,” he explained.

The panel discussed the failure of the industry to agree on the standards for the international business entity identifier (IBEI) and came to the conclusion that the issues were largely political in nature. The inertia caused by these considerations has meant that the initiative is unlikely to move forward any time soon, the panel agreed.
