What to Expect in 2011

What a year it has been for reference data management. Data quality has been such a high-profile topic that it was discussed in one of the US House-Senate conference committees and, across the pond, was championed by European Central Bank (ECB) president Jean-Claude Trichet in a number of speeches over the course of the year. And we can expect much more attention to be directed this way over the next year.

First up on the agenda in the US (but with relevance to the rest of the world too) will be the discussions around the introduction of a mandatory entity identifier, as part of the regulatory push to better monitor systemic risk and the establishment of the Office of Financial Research. The industry has until the end of January to respond to the proposals around new standards for the identification of legal entities across the globe, which were published in the Federal Register at the end of last month. The Office of Financial Research has stated its “preference to adopt through rulemaking a universal standard for identifying parties to financial contracts that is established and implemented by private industry and other relevant stakeholders through a consensus process”. Exactly who “private industry” and “relevant stakeholders” are will, no doubt, be the first point of discussion, or contention, within this debate.

And the US regulators will not be hanging about: they want a legal entity identifier to be established by 15 July 2011, at which point a regulation will be put in place to compel firms to use it in their reporting to the Office of Financial Research. This means the industry has six months from the close of the comment period in which to agree upon a standard that it will then have to adopt for counterparty identification purposes.

Swift will also need to declare its intentions in the entity identification space with regard to the future development of the Bank Identifier Code (BIC). Not much has been said publicly about the work going on around the identifier this year, but if Swift hopes to be in with a chance in the context of the US developments, this will need to change, and fast. There is some degree of debate at the moment about how, or even if, the BIC should be developed to act as a more general entity identifier for the market, rather than retaining its current function as a bank identifier within the framework of the messages carried across the Swift network. Look out for more debate on this subject in January.
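
For context, the BIC as it stands today is defined purely structurally under ISO 9362: a four-letter institution code, a two-letter ISO country code, a two-character location code and an optional three-character branch code. The short sketch below (a hypothetical bic_is_valid helper in Python, for illustration only) shows just how thin that structural check is; it checks format only, not whether a code is actually registered with Swift.

```python
import re

# ISO 9362 structure: 4-letter institution code, 2-letter ISO country code,
# 2-character location code, optional 3-character branch code.
BIC_PATTERN = re.compile(r"^[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}(?:[A-Z0-9]{3})?$")

def bic_is_valid(bic: str) -> bool:
    """Structural check only: says nothing about whether the code is
    actually registered with Swift."""
    return bool(BIC_PATTERN.match(bic.strip().upper()))

# Example: an 8-character institution-level BIC and an 11-character branch BIC
assert bic_is_valid("DEUTDEFF")
assert bic_is_valid("DEUTDEFF500")
```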

Within the wider context of the establishment of the Office of Financial Research, it will be interesting to see who is appointed to run the data collection and utility aspects of the new body. The Depository Trust & Clearing Corporation (DTCC) seems to be in the frame at the moment, but what will Europe’s reaction be? Will it decide to push ahead with its own European version? What impact will the decisions made in the US have on the rest of the world?

Moreover, who will be appointed to head the Office of Financial Research itself? That will have a significant impact on the approach and the direction the new agency, which sits under the auspices of the US Treasury, will take in the future. Hopefully, President Obama will be forthcoming with a name soon.

The instrument identification challenges inherent in moving OTC derivatives onto central clearing counterparties and establishing trade data repositories in Europe and the US will also likely prove a popular topic of discussion in 2011. The European Council’s recent regulatory proposals on the subject include direct references to potential fines that may be handed out to those failing to report “correct and complete” data to repositories and regulators, and the need for these repositories to use “standardised” reference data. The proposals also indicate that the EU Council believes it is important that a “uniform OTC derivatives data reporting requirement is established at Union level”. The US is talking about similar standards for reporting to its own repositories. Making sure the regulatory community opts for the right standards for the business is key to ensuring that more duplicative and costly cross-referencing is not required.

In terms of regulatory developments, the review of MiFID is rumbling on across Europe and the transaction reporting proposals in particular will have a significant impact on the management of reference data. Dario Crispini, manager of the Transaction Reporting Unit of the UK Financial Services Authority (FSA), recently indicated that the regulator is planning to tighten its scrutiny of data quality and of instrument and entity identification, and that firms may also face a direct mandate for data quality assurance. The next transaction reporting subject group meeting in the UK, as part of the MiFID Forum, will be in late January or early February.

Incoming risk management reporting changes, as part of the revisions to the Capital Requirements Directive (CRD) and the gradual move to Basel III, include an update to the internal risk modelling requirements that specifies that “minimum data standards” must be met. The introduction of regular stress testing in particular will force firms to invest in the data architectures supporting their risk function. In order to meet the new weekly stress testing reporting requirements, firms will need to ensure that their data is accurate and readily available for reporting purposes. Firms such as Royal Bank of Scotland (RBS) are therefore already working to better support their risk functions, and one can expect many more of these projects to be launched in the coming months.

Similarly, data support for the pricing and valuations function will also require a retooling of data architectures as a result of increasing transparency requirements. Providing more data around a price in a timely fashion will require a more robust system that can access quality-checked reference data with lower latency.

Risk management and regulatory data reporting requirements in their entire spectrum of jurisdictional colours will therefore force investment in technology. One can expect to hear much more about a “near-time” (rather than “real-time”) approach to reference data, including data caching and even field programmable gate arrays (FPGAs) in this context. Smart use of front office technology in a back office context could potentially result in significant time and cost savings.
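
To make the “near-time” idea a little more concrete, the sketch below shows one very simple shape a reference data cache might take: entries are served from memory while they are fresh and are refreshed from the authoritative store once a time-to-live expires. It is purely illustrative; the ReferenceDataCache class and its loader callback are hypothetical names rather than any particular vendor's API.

```python
import time
from typing import Any, Callable, Dict, Tuple

class ReferenceDataCache:
    """Minimal TTL cache: serve recently loaded reference data from memory,
    refreshing from the authoritative store only once an entry goes stale."""

    def __init__(self, loader: Callable[[str], Any], ttl_seconds: float = 300.0):
        self._loader = loader                      # e.g. a security master lookup
        self._ttl = ttl_seconds
        self._entries: Dict[str, Tuple[float, Any]] = {}

    def get(self, key: str) -> Any:
        now = time.monotonic()
        entry = self._entries.get(key)
        if entry is not None and entry[0] > now:
            return entry[1]                        # fresh enough for near-time use
        value = self._loader(key)                  # fall back to the golden source
        self._entries[key] = (now + self._ttl, value)
        return value
```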

Furthermore, all of this work will need to go on as the industry prepares for the gradual migration to more global accounting standards as a result of the US Financial Accounting Standards Board’s (FASB) convergence initiative with the International Accounting Standards Board (IASB). Fair value will, no doubt, be a topic of debate for some time to come.

Charging for proprietary vendor instrument identifiers has also been a contentious topic over the course of 2010, reflecting the general desire within the user community for lower costs and easier access to identifiers. The regulatory community has also continued to take an active interest, although we’re still waiting for the European Commission’s final ruling on Standard & Poor’s pricing practices concerning ISINs. Hopefully this ruling will be released towards the start of 2011 and we can certainly expect to see more declarations of “openness” from the other vendors on the block to compete with Bloomberg’s Open Symbology initiative, as they continue to feel the heat from their customers.
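
As a reminder of what is actually being charged for, an ISIN under ISO 6166 is a twelve-character code: a two-letter prefix, a nine-character national securities identifier and a single check digit computed by expanding the letters to numbers and applying the Luhn algorithm. The sketch below (a hypothetical isin_is_valid helper, for illustration only) validates that structure and check digit.

```python
def isin_is_valid(isin: str) -> bool:
    """Check an ISIN's basic structure and its Luhn check digit (ISO 6166)."""
    isin = isin.strip().upper()
    # Two-letter prefix, nine alphanumeric characters, one numeric check digit
    if len(isin) != 12 or not isin[:2].isalpha() or not isin[-1].isdigit():
        return False
    if not all("0" <= c <= "9" or "A" <= c <= "Z" for c in isin):
        return False
    # Expand letters to two digits (A=10 ... Z=35); digits pass through unchanged
    expanded = "".join(str(ord(c) - 55) if c.isalpha() else c for c in isin)
    # Luhn: double every second digit from the right and sum the digit sums
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
        total += d // 10 + d % 10
    return total % 10 == 0

# Example: Apple's US ISIN passes; the same string with a wrong check digit fails
assert isin_is_valid("US0378331005")
assert not isin_is_valid("US0378331004")
```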

Swift will also continue to develop its new five-year strategy over 2011, including its push in the corporate actions space with XBRL US and DTCC, as well as its more general focus on standards. The market can certainly expect to see some results from the current pilot programme to prove the benefits of XBRL tagging and ISO 20022 messaging in the corporate actions space. Let’s hope Sibos in Toronto will see a good representation from the securities industry and the right topics on the programme.

Speaking of which, don’t forget to check out A-Team Group’s new programme of events for 2011 too. It’s certainly going to be a busy year!
