Members of the JWG Customer Data Management Group speak to Reference Data Review about the upcoming hot topics in the area of entity and counterparty data.
Challenging the Customer Data Conundrum
PJ Di Giammarino, CEO of JWG
A key theme within the widely discussed G20 93-point plan set out in spring 2009 was the call to action for national and regional authorities to review business conduct rules to protect both markets and investors. Following two years of consultation on protecting UK depositors, this time last year the UK Financial Services Authority (FSA) finalised its Financial Services Compensation Scheme policy (PS09/11).
Over the last year, a number of implementation challenges have been identified in the creation of a Single Customer View (SCV), in turn affecting the industry’s readiness to meet this policy. Firstly, the definition of ‘what good customer information looks like’ is fundamental to getting the supervision of the financial sector right. Secondly, large firms have been particularly hard hit by new, and often imprecise, information requirements drafted by different authorities across the globe. And finally, it has become evident that current data infrastructures across the industry’s information supply chain are not designed for the new regulatory requirements.
Ahead of the European Commission’s Deposit Guarantee Scheme consultation, expected in early July 2010, three members of JWG’s Customer Data Management Group (CDMG) delve into the current issues associated with the SCV, including business and regulatory identifiers, internal business drivers and transaction reporting.
Business and Regulatory Identifiers
Darren Marsh, European business manager of Risk Management and Compliance Services, Interactive Data
As the regulatory push towards imposing an SCV continues unabated, it is clear that some market participants view implementation purely as an additional cost with no incremental business benefit. However, forward-thinking firms have recognised the benefits of a holistic view of client data across business lines and are looking to leverage SCV-based regulation as part of a wider strategy to centralise client data management.
These ‘early adopters’ recognise the pitfalls of reinventing the wheel for each new regulatory implementation and are looking to champion initiatives that can help ensure their customer data is in good shape for the slew of customer-focused regulation in the pipeline.
Many firms are now aware that they need to implement a centralised data management process for entity data that services downstream business functions and provides a consistent view of pre- and post-trade risk and of compliance with industry regulations. By establishing a centralised, holistic view, firms can ensure consistency of data and support the in-house distribution frequencies required across business functions.
Combining ‘business entity’ and security data into a single consolidated master file can provide cross-maps of securities identifiers to entities’ full legal names and, with linkages to the relevant company hierarchies, full disclosure of the details behind a firm’s instruments and the records of its issuing entities, counterparties and customers – plus the ability to track the high-risk overlap between them.
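To make the idea concrete, the sketch below shows one way such a consolidated master could be structured: a cross-map from security identifiers to issuing entities, with parent links forming the company hierarchy. The class and field names (Entity, Security, ConsolidatedMaster, issuer_of, ultimate_parent) are illustrative assumptions, not a vendor schema.

```python
# A minimal sketch of a consolidated entity/security master.
# Identifiers and fields are hypothetical, not any vendor's data model.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Entity:
    entity_id: str                    # internal or LEI-style identifier (assumed)
    full_legal_name: str
    parent_id: Optional[str] = None   # link into the company hierarchy


@dataclass
class Security:
    isin: str
    issuer_id: str                    # cross-map: security identifier -> issuing entity


class ConsolidatedMaster:
    """Single master file cross-mapping securities to the entities behind them."""

    def __init__(self) -> None:
        self.entities: dict[str, Entity] = {}
        self.securities: dict[str, Security] = {}

    def add_entity(self, e: Entity) -> None:
        self.entities[e.entity_id] = e

    def add_security(self, s: Security) -> None:
        self.securities[s.isin] = s

    def issuer_of(self, isin: str) -> Entity:
        """Resolve a security identifier to its issuing entity's full legal name record."""
        return self.entities[self.securities[isin].issuer_id]

    def ultimate_parent(self, entity_id: str) -> Entity:
        """Walk the hierarchy links up to the top of the corporate tree."""
        e = self.entities[entity_id]
        while e.parent_id is not None:
            e = self.entities[e.parent_id]
        return e
```

With issuer, counterparty and customer records resolving to the same ultimate parent, exposures can be aggregated across them, which is the ‘high-risk overlap’ referred to above.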
Internal Business Drivers
Donald Roll, managing director in Europe, Alacra
Heightened awareness of the potential fallout from unwieldy risk exposure, coupled with ever-increasing scrutiny from financial services industry regulators, necessitates a deeper and more accurate understanding of customer relationships than ever before.
In a world where one institution can conduct business with hundreds of entities tied to a single parent company, and changes in corporate structure are rarely reflected simultaneously, calculating risk is like attempting to hit a moving target.
Institutions can begin to account for counterparty exposure by tying their golden copy to a wide range of external information. While talk centres on the possibility of one entity identifier for each legal entity, this reality is a long way off.
How best, then, to take action now, when keeping data sets clean is, at best, a headache? With these mounting regulatory reporting challenges, firms are increasingly seeking help from external providers to ensure they keep an accurate, up-to-date, centralised set of reference data on counterparties to drive their risk and regulatory reporting.
Transaction Reporting
John Neasham, Financial Services Risk Management senior manager, Ernst & Young
The recent round of transaction reporting fines issued by the FSA has highlighted a weakness in the control infrastructure for reference data quality, particularly around customer information and associated unique identifiers.
As an industry we recognise the materiality of, and costs associated with, testing software and regression-testing upgrades, but data is often treated as the poor relation that lacks a repeatable test strategy.
A common finding right now – and one reflecting recent cost cutting measures – is that teams responsible for maintaining customer data are often understaffed and struggling to keep up with the existing volume of updates to customer records. Problems here can be further exacerbated by a lack of compliance monitoring, or by monitoring that is misguided.
Senior IT and operations management should demand metrics about the quality of underlying referential information and the results from repeatable end-to-end data quality tests. If these results are out of date or not available, it becomes difficult to satisfactorily demonstrate regulatory compliance. A regulatory penalty may be the least of your worries if you don’t understand who you are transacting business with.
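By way of illustration, the sketch below shows the shape such a repeatable test over customer records might take, producing per-rule failure metrics that management could monitor. The rule names and record fields are hypothetical assumptions; real controls would be far broader.

```python
# A minimal sketch of a repeatable customer-data quality check.
# Rule names and record fields are hypothetical, for illustration only.
import re
from datetime import date


def check_record(rec: dict) -> list[str]:
    """Return the list of quality rules this customer record fails."""
    failures = []
    if not rec.get("full_legal_name"):
        failures.append("missing_legal_name")
    if not rec.get("unique_id"):
        failures.append("missing_unique_identifier")
    if not re.fullmatch(r"[A-Z]{2}", rec.get("country_code", "")):
        failures.append("invalid_country_code")
    last_reviewed = rec.get("last_reviewed")
    if last_reviewed is None or (date.today() - last_reviewed).days > 365:
        failures.append("stale_record")
    return failures


def quality_metrics(records: list[dict]) -> dict:
    """Aggregate failure rates per rule - the kind of metric management can demand."""
    totals: dict[str, int] = {}
    for rec in records:
        for rule in check_record(rec):
            totals[rule] = totals.get(rule, 0) + 1
    n = len(records) or 1
    return {rule: count / n for rule, count in totals.items()}
```

Run regularly against the full customer master, the output of quality_metrics gives a trend line that can be evidenced to a supervisor, rather than a one-off clean-up exercise.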