Earlier this year, the Depository Trust & Clearing Corporation (DTCC) appointed ex-NYSE Euronext market data guru Ron Jordan to the position of chief data officer (CDO) and gave him the responsibility of guiding DTCC’s commercial and regulatory data business activities. Reference Data Review speaks to Jordan about his career highlights in the exchange sector and how these could help shape his role at DTCC, and the small matter of his new firm’s expanding data empire.
DTCC has recently drawn a line in the sand and stated that it is aspiring to be a key partner to the global regulatory community by acting in a technology and data support role to many of the current oversight endeavours. Two such moves have been the announcement with partner Swift that it will be establishing a new global FX trade data repository, as well as working on the development of a new legal entity identification (LEI) system for the financial markets.
This adds to other endeavours such as its credit default swap (CDS) regulatory portal launch back in February and its recent teaming with EFETnet to develop a trade repository for the commodities market. Jordan certainly has his work cut out for him. His new role entails setting the strategy behind all of these efforts and ensuring that the rapidly expanding team at DTCC is on the same page across business lines.
To this end, he reports to Andrew Gray, managing director of core product strategy and management. Thankfully, Jordan has 26 years of experience in the market data space at NYSE Euronext and has also worked for two other exchanges, the American Stock Exchange and the Philadelphia Stock Exchange, experience that should stand him in good stead when developing data infrastructure.
How have your early career experiences shaped your view of the financial information industry? Tell me a bit more about how you got involved in this part of the market – what attracted you to this side of the financial services industry?
Prior to my move to DTCC, my entire career was spent at stock exchanges, the vast majority with the New York Stock Exchange (NYSE). I was fortunate to be able to hold a variety of positions in a number of disciplines, including operations, technology, regulation, strategy, new business development, project management, and market data. Through these positions, I acquired a solid understanding of how stock exchanges operate and interoperate with their members, regulators, and key constituents, both pre-trade and post-trade.
In addition, the fact that stock exchanges are highly regulated and were, until recently, member-owned, gave me experience in the unique challenges faced by organisations with this profile. The pace, the complexity, the opportunity and the importance of being part of the financial services industry are what continue to attract me.
What were some of the challenges you faced at NYSE Euronext in positioning the market data business, and what successes did you achieve?
I moved to the market data business at the NYSE in 1996 and remained there for 14 years, by far the most time I spent in any one division. The data challenge for the NYSE was to continue to lead and grow its “regulated” data business, also known as the Consolidated Tape Association, or CTA, which is a consortium of all of the regulated equity exchanges in the United States, while developing a new, proprietary data business specifically for the NYSE.
I believe we were successful on both fronts. With CTA, we were able to dramatically improve the speed at which trades and quotes from the participant exchanges were delivered to consumers from seconds to milliseconds, and were able to grow revenues largely through attracting and servicing a whole new customer base including internet portals, online brokers and individual investors. In the proprietary space, we were able to dramatically grow revenues through introducing high speed data products like NYSE OpenBook, and establishing a reference data and pricing utility now at nyxdata.com.
Between 2002 and 2010, proprietary data product revenues increased substantially. Finally, integrating the data businesses of the various NYSE acquisitions and mergers between 2005 and 2010, including Arca, Amex, Euronext and Wombat, was a challenge that was successfully managed and resulted in a large reduction in overall costs and significant margin improvements.
What attracted you to the role at DTCC and how will you use your experience in the market data world to the DTCC’s advantage?
Sometimes timing is everything. When I first started to discuss the data opportunities at DTCC in mid-2010, DTCC was just purchasing Avox and the Dodd-Frank legislation was just being enacted. Both have an enormous impact on the potential for DTCC to establish a solid data business, from both the utility perspective and the commercial perspective.
My experience at the NYSE in managing a regulated data business and building a commercial data business in a highly regulated, member-owned organisation gave me the appropriate skill set for DTCC. In addition, the corporate culture of DTCC was appealing to me, and the quality and knowledge of the DTCC staff is of the highest calibre.
How do you see the future of the financial information market developing over the next five years? What part will regulation and standardisation have to play in this environment?
I believe the regulatory environment will play a significant role in the financial information market over the next five years, and already has begun to do so. The need for financial institutions to better understand their risks and to “know their customers” will continue to be in the spotlight, and increased oversight by regulators will require new levels of information reporting and sharing.
Financial institutions will be looking to information solution providers to help them respond to these challenges in the most cost-effective way possible. To a large extent, the new regulatory and reporting requirements will simply represent an incremental cost that the firms must manage. Regulators, for their part, will need to collect and process vast amounts of financial information in order to conduct systemic risk analysis, and will be looking to standardise as much of that information as possible to support it.
Many in the industry have expressed concern about the right standards being mandated by the OFR and the regulatory community in terms of reference data – how do you think the DTCC can assist in this endeavour?
DTCC can play two critical roles to help set standards. First, as a natural “hub” within the financial services community, DTCC is in a unique position to foster communication and debate among its participants and its regulators. Ensuring the right debate between the right parties will result in the right standards being adopted.
Second, DTCC can be a thought leader and help guide the recommendations in certain instances. This is certainly the case regarding the LEI discussions that have occurred over the last year. The DTCC purchase of Avox enabled us to share with firms and regulators the realities and the challenges associated with validating legal entity data, and helped the industry and regulators to shape their current LEI recommendation.
What will be your priorities over the next year or so in terms of furthering the DTCC’s chances in the context of the OFR?
My top priority will be to serve the DTCC participants, regulators, and others in the financial industry by developing a utility data business at DTCC, a niche in the financial information business that does not exist today and which DTCC is well positioned to create. In essence, the business is centred on mutualising costs for the industry and providing ready access to selected data elements and processes, such as the validation of legal entities, that all firms must support and that those firms view as costs rather than value-added components.
The OFR and other regulatory bodies will continue to put increased pressure on firms to collect and report information, and will need information to be standardised in order to conduct systemic risk analysis. The industry’s recent recommendation of DTCC as facilities manager for the LEI utility is a very important first step in establishing a utility data business. Our success in delivering a solution that both the industry and regulators find valuable will increase our chances to expand the utility business to cover other types of similar information.
What keeps you up at night – what concerns you in the industry most at the moment?
My perception is that there is currently a tension between the financial industry and the regulators as a result of the recent financial crisis. If a mutual cycle of trust is established, I believe real change can occur. If not, I worry that we will simply be heading for the next crisis.
What is the most innovative and groundbreaking change that you have seen during your time working within the financial information services industry?
Because I spent the majority of my career at the NYSE, by far the most innovative and groundbreaking change has been how technology and automated trading have completely changed the economics of the equity trading business, and how quickly that happened. From a historical perspective, the economics of the equity trading business changed very little for about 200 years.
A few major exchanges like the NYSE and Amex dominated trading, and they were comprised of specialist and broker members who executed trades physically on the trading floor. Within a period of about four years, beginning around 2004, electronic trading came to dominate, the barriers to entry in the equity trading business largely disappeared, competition exploded, margins collapsed, and most physical trading floors disappeared.
This seismic shift created winners and losers – firms that had existed for 100 years went out of business almost overnight, while firms that had been in business for less than two years became major contributors of order flow. Exchanges themselves reacted by consolidating and going public to secure investment capital, but needed to reinvent themselves to survive. From an information perspective, exchanges now compete on the latency of their feeds and how they integrate their depth of book into the top-line quote. Latency is now measured in microseconds, and trading volume usually goes to those with the fastest feed. The lesson learned is that things can change on a dime. Complacency is the real enemy.