The Dodd Frank Act introduces a host of new data requirements covering reporting formats and potential standards, but how will all of this directly affect the practitioner and vendor communities? Reference Data Review speaks to Paul Filanowski, US product development at SIX Telekurs, about the changing US regulatory scene and its practical impact on the data management function.
Filanowski is a product support specialist at data vendor SIX Telekurs in the US, a role that he has held for around two years. Prior to this, he was a senior business analyst at fund management advisory firm Commonfund for around four and a half years, an assistant vice president at Citigroup Asset Management for two years and a product manager at Multex.com. He began his career at Telekurs Financial as an assistant product manager back in 1996.
Q: Where do the biggest data management challenges lie within Dodd Frank?
Dodd Frank calls for the creation of new data, and also issues guidelines for reporting, maintaining and analysing existing data from a variety of sources. This is an overwhelming task for any institution, let alone one that hasn’t been able to commit resources or establish protocols for receiving and maintaining such data.
We can see the challenges ahead by taking a look at the complexity of implementing just one new piece of data – the legal entity identifier (LEI). In response to regulatory mandate, the financial services industry, through the EDM Council, Sifma and other industry organisations, is working to develop the global standard for unique entity identifiers that will form the basis for systemic oversight, as well as bring transparency to the OTC derivatives market.
Organisations such as Swift and the Association of National Numbering Agencies (ANNA) have expressed interest in becoming the registration agency for a new standard, but those organisations prefer a standard agreed on by ISO rather than one developed by the financial service industry itself. Nevertheless, maintenance of the LEI will be complex and issues such as parent-child hierarchies, majority ownership, corporate actions and re-domiciliation need to be carefully considered and implemented worldwide to ensure a uniform standard.
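The hierarchy and maintenance issues described above can be illustrated with a minimal sketch. All record layouts, field names and identifiers here are invented for illustration — at the time of writing the LEI standard itself had not been finalised:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LegalEntity:
    """Hypothetical record in an entity identifier registry."""
    entity_id: str                     # placeholder identifier, not a real LEI
    name: str
    country: str                       # matters for re-domiciliation events
    parent_id: Optional[str] = None    # immediate majority owner, if any

def ultimate_parent(registry: dict, entity_id: str) -> str:
    """Walk the parent chain to find the top of the ownership hierarchy."""
    seen = set()
    while registry[entity_id].parent_id is not None:
        if entity_id in seen:          # guard against cyclic data errors
            raise ValueError("cycle in ownership chain")
        seen.add(entity_id)
        entity_id = registry[entity_id].parent_id
    return entity_id

# A corporate action (e.g. an acquisition) is just a parent_id update here,
# but every downstream aggregation keyed on the old hierarchy must follow it.
registry = {
    "E1": LegalEntity("E1", "Holding Co", "CH"),
    "E2": LegalEntity("E2", "Broker Sub", "US", parent_id="E1"),
    "E3": LegalEntity("E3", "Desk Ltd", "GB", parent_id="E2"),
}
print(ultimate_parent(registry, "E3"))  # E1
```

Even in this toy form, keeping parent-child links, ownership changes and re-domiciliations consistent across every registry copy worldwide is the hard part.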
Creation of the standard itself is proving to be daunting enough, but once the industry has agreed to a standard, each institution and those companies that support the industry (such as data vendors) will have the challenge of implementing that new standard. Data repositories may not be readily extensible, and downstream systems, which allow a comprehensive view of enterprise risk and counterparty exposure, may require significant investment in order to process the new identifier instead of whatever proprietary symbology or methodology is currently in place. The goal of facilitating a systemic view of risk is obviously laudable, but the devil is in the details and wholesale replacement of current data sets will cause much disruption throughout the financial services industry before it is achieved.
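The cross-referencing work implied by replacing proprietary symbology can be sketched as follows. The mapping table and symbols are hypothetical; in practice each firm would hold such tables for every downstream system:

```python
# Hypothetical cross-reference from in-house counterparty symbols to a
# newly mandated identifier (all values invented for illustration).
proprietary_to_new = {
    "CPTY-00017": "NEWID-AAAA",
    "CPTY-00042": "NEWID-BBBB",
}

def translate(record: dict) -> dict:
    """Re-key a downstream record onto the new identifier, flagging gaps."""
    out = dict(record)
    old = record["counterparty"]
    out["counterparty"] = proprietary_to_new.get(old, old)
    out["mapped"] = old in proprietary_to_new  # unmapped rows need remediation
    return out

print(translate({"counterparty": "CPTY-00017", "notional": 1_000_000}))
```

The unmapped-row flag hints at the real cost: the migration is less about the lookup itself than about finding and remediating every record the mapping cannot cover.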
But an even more basic problem is that Dodd Frank requires the intake and analysis of a large volume of disparate data from across the financial services spectrum. Anyone in the data management industry and most in the data management departments at financial institutions can appreciate the challenges of the collecting, scrubbing, maintaining, warehousing and securing of enormous volumes of data. At this point in time, the government has yet to provide specific guidance on how certain data will be reported, in which format, and at what frequency.
As an example, Title VII of Dodd Frank allows for multiple swap data repositories (SDRs) in the marketplace, which is good for choice and competition, but without specific guidance on the how/what/when, how are the regulators going to collect, aggregate and monitor this data on a timely basis? Remember, Dodd Frank was put into place to avoid another financial crisis. That means that allocating the necessary tools, analytics and resources to sift through and analyse the data in a timely manner to understand the systemic risk in the markets is critical.
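The aggregation problem that multiple SDRs create for regulators can be sketched in a few lines. The feed layout and field names below are invented for illustration — without mandated formats, each repository could report something different:

```python
from collections import defaultdict

# Hypothetical position feeds from two swap data repositories.
sdr_feeds = {
    "SDR-A": [{"entity": "E1", "asset": "IRS", "notional": 5.0}],
    "SDR-B": [{"entity": "E1", "asset": "CDS", "notional": 2.5},
              {"entity": "E2", "asset": "IRS", "notional": 1.0}],
}

def aggregate_exposure(feeds: dict) -> dict:
    """Roll up total notional per entity across all repositories."""
    totals = defaultdict(float)
    for records in feeds.values():
        for record in records:
            totals[record["entity"]] += record["notional"]
    return dict(totals)

print(aggregate_exposure(sdr_feeds))  # {'E1': 7.5, 'E2': 1.0}
```

The roll-up only works because both feeds happen to share an entity key and a notional convention; absent common standards, the regulator's first job is reconciling the feeds before any aggregation is possible.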
With deadlines quickly approaching, coupled with the US federal government’s delay in committing to a 2011 budget, oversight of some activities has been pushed to the state government level, but without the allocation of funds to support these activities. This issue is further compounded by a large number of state governments significantly cutting budgets. In order for this law to achieve its goal, which is to provide transparency for the protection of our economy, we need to make sure that this doesn’t become purely a data collection and publication exercise, but one that will yield a truly timely and insightful picture of our financial health.
Q: Does the industry appreciate these challenges and how should it be anticipating these changes – where should it focus its energies to cope with these new requirements?
The industry seems to agree that significant challenges lie ahead, but it is still too soon to tell what the full implications will be, as the details of the implementation are yet to be determined. It is especially important to remember that these regulations are US-centric, and therefore provide only one part of the global solution.
There seems to be concern across all sectors regarding the potential financial impact of these changes, but the cost concerns are also being voiced on the other side of the fence, as the Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) have been very vocal that budget constraints could prevent them from implementing the Dodd Frank provisions as anticipated. Therefore, the focus will need to be on the financial impact due to infrastructures that will need to be added or reengineered, workflow changes and security of the data.
Within this context, several key issues will need to be addressed:
• Investment firms will need to reach consensus on data symbology standards as opposed to having the government set the rules.
• The Office of Financial Research (OFR) mentions development of a reference database that would be easily accessible to the public. A key component of this will be safeguarding privacy and confidentiality, which is certainly a concern to all.
• Many hedge funds are required to report algorithms and other proprietary and sensitive information. This raises serious potential security issues, since there is a lack of guidance on who would be able to review and comprehend this data.
• Market participants will certainly need to make sure that they can obtain the data in a structured and encoded feed; one which can present the different regulations from around the world in as easily digestible a format as possible. As always, data vendors will look to make all this information available as easily as possible to customers and we think that Dodd Frank related information will be an area which customers will be specifying in future requests for proposals.
Q: Will the OFR have a profound impact on the industry’s data management practices or is it doomed to failure?
The mandate of the OFR is twofold:
• Firstly, the OFR will create and oversee data standards which will allow regulators to better understand an individual firm’s fiscal health as well as the entire industry’s risk exposure through aggregation of data from individual firms.
• Secondly, the OFR needs to draw upon this aggregate data to conduct research and analyse the nation’s financial system; the “health” of which it will report to Congress on a regular basis.
To achieve these goals, a standard symbology for trades is obviously necessary, and the OFR is pushing the industry to develop one before the government feels compelled to step in and do it for them. This is a tremendous opportunity for the industry to have a profound impact on data and data management practices, and the fact is that many industry organisations are already working collectively to leverage best practices and apply real life experiences to arrive at a solution that will work within the existing framework.
Similarly, the OFR has an opportunity to make its impact by conducting thorough and timely analysis that will provide insight into the nation’s fiscal health and allow Congress time to react to that information. But again, without a commitment to providing funding for tools to collect the data and resources to interpret it, the office will not succeed. One can only hope that the vigour with which the bill was enacted will translate into the full commitment needed to make the vision a reality.
Q: What potential dangers lie ahead for the US regulatory community with regards to mandating new data standards?
The only ‘data standard’ mandated by Dodd Frank thus far is the LEI, and the fact that the industry is working cooperatively to create a workable solution seems to indicate that mandated standards can be implemented.
The real danger lies in the reporting requirements for market participants. For example, Title IV requires that many hedge funds and asset managers report proprietary and very sensitive information around trading and investment positions, counterparty credit risk exposure and use of leverage. This raises serious concerns about confidentiality and security, particularly the fear that other market participants could get access to this information and be able to reverse engineer a firm’s trading strategies. If this happened, markets could be manipulated and market participants could find it too difficult to conduct business in the US. As such, the US regulatory community needs to engage market participants in ways that assure both parties that the data necessary for transparent analysis of market conditions can be provided without compromising the ability to conduct business.
Q: What should be taken into account by the industry and the regulatory community as these various requirements come into play? What industry best practices and benchmarks should be used or are needed?
Financial firms are at different stages in their responses to Dodd Frank, but a number of best practices and priorities are starting to unfold, as some of the larger financial services institutions are establishing robust governance programmes to manage their responses to the Act. Some of the key areas that should be taken into account when establishing a programme include:
• Programme management – firms must establish and implement processes and governance functions that address the immediate effect of the Act across their businesses and promptly react to emerging regulatory changes.
• Firms must quickly evaluate implementation options and long term business strategies in the area of derivatives and swaps clearing.
• Based on the mandate of the OFR, institutions may need to enhance their data and standards in order to be able to produce granular transaction level information and do so rather quickly.
• Firms designated as systemically important financial institutions (SIFIs) must fully understand their new requirements, which include expanded reporting, additional capital standards, development of resolution and recovery plans and, finally, improvements to their overall risk management and governance structures.
• IT and data management – since data management is one of the most important factors for Dodd Frank, it is imperative that institutions focus on the quality of data on transactions, counterparties, customers and legal entities. Another key requirement would be to align IT projects and ‘business as usual activities’ with the emerging requirements.
Q: How mature is the US market in terms of data management and do you expect this to change dramatically over the next few years? What other than regulation will compel change?
There have been a number of data management projects that have become priorities across the industry and are gaining support and funding. Unless significant operational issues arose, legacy security masters and client information weren’t actively managed in the past. However, after the credit crisis, these systems came under increased scrutiny, as they are key components to providing the larger picture of enterprise risk. Projects that achieve risk reduction and cost savings have also gained priority – in fact, data management projects are in sync with risk management initiatives more often than not. We are also seeing an increased focus on ensuring the processing of high quality pricing and reference data, and on the integration of normalised data to be distributed in real or near real time and fed into risk models.
Regulations will continue to be a significant driving force behind data management initiatives, as regulators respond to increased political pressure and try to quickly address issues highlighted during the recent credit crisis. In particular, the ‘newer’ risks associated with the crisis – such as liquidity and systemic risk – have emphasised the need for a solid data foundation across all areas of the business.
Q: In the short term, what should firms be putting in place before the year is out with regards to data management solutions?
In addition to close monitoring of how the details of Dodd Frank are fleshed out over the next few months, market participants should also be setting up exploratory committees to see just how – and how much – their firms will be impacted, especially with regard to how the operational impact of implementing regulatory changes might ripple across the various functions/departments within the firm.
Just as importantly, firms should be getting involved and making their voices heard, because it is vital that those in operations – who understand the day-to-day challenges – be involved in shaping the realisation of this regulation.
Similarly, as discussed earlier, it is imperative that industry participants assist in setting standards, such as the LEI, because the experience and firsthand working knowledge of best practices can only come from industry participants themselves, not from regulators.