After President Obama signed the Dodd-Frank Act into law last year, in an effort to avoid another financial crisis like that of 2008, Europe is following suit with MiFID 2. While both regulations aim to increase transparency in financial markets and, specifically, in over-the-counter (OTC) derivatives, MiFID 2 is unlikely to take effect until 2014, as it is still in the negotiation and adoption stage. Its pending implementation is nonetheless extremely important: the 2008 crisis clearly illustrated the high levels of correlation between financial instruments and the immediate impact of new developments and defaults. This has been the driving force behind the new regulations and is forcing firms to manage their risk in real time.
In turn, this legislation puts a significant strain on financial institutions' IT infrastructure, which must evaluate and cope with exponential increases in real-time data, volumes that many institutions aren't equipped to handle. And with MiFID 2 set to go live in the near future, further real-time requirements will only amplify the amount of data that financial firms need to process.
Together, the Dodd-Frank Act and MiFID 2 will place a variety of new requirements, including new derivative transparency measures, on financial services companies, increasing the amount of data those companies are responsible for. With these measures in place, both pre- and post-trade transaction information will need to be made available for contracts that today are traded over the counter, and those same contracts will need to be cleared through central counterparties (CCPs).
Adding to this increase in data, risk management requirements soon to be imposed by MiFID 2 and the Dodd-Frank Act will demand that banks manage and record every transaction in real time, in order to provide a constantly updated picture of how exposed the institution is to risk at any given point in time. For complex OTC derivative products, such risk calculations will drastically increase trading firms' compute and storage requirements.
The combination of all of these requirements is creating an environment that financial institutions are simply not equipped to handle. It puts a significant strain on firms' technical infrastructure, which often causes extreme latency or even business-crippling downtime. As a result, companies need more powerful systems and infrastructure to sustain such high volumes of reports and to crunch numbers and perform calculations in real time. The problem is that the underlying infrastructure required to support such systems is not readily available, or even accessible, to most firms.
To cope with such increasing levels of data and keep pace with real-time transaction records, it's in financial firms' best interest to use a virtualised IT environment. With virtualised server farms, companies can pool additional computing resources to complete tasks without overburdening their IT systems and limited physical capacity. In a virtualised environment, since additional virtual machines (VMs) can be spun up or down at will, firms can adapt to high data volumes more easily than by continually adding more physical boxes.
In order to power a suitable virtual environment, financial institutions need to sustain high-density power throughout their IT facility. The problem, however, is that legacy data centres offer only about one kilowatt of power per square metre, whereas the virtual environment required by today's financial firms under these new regulations demands upwards of two kilowatts.
For this reason, many firms are realising the benefit of co-locating to a third-party data centre provider, like Interxion, that is already equipped for high-density deployments and suited to current and future demands. In fact, Interxion has recently announced the opening of its second data centre in London, which offers 2.5 kilowatts of power per square metre, more than enough to handle any financial firm's workload and increasing real-time data volumes, and will soon follow with a 4,700 square metre expansion project planned for its Paris facility in Q2 of 2012.
Beyond high-density power, co-locating to a third-party data centre provider presents many benefits for financial firms looking for ways to cope with steadily rising data volumes. At least 50% of a trading firm's IT installation, in terms of footprint, compute power and cost, is typically devoted to processing market data, and trading firms' bandwidth requirements are almost entirely consumed by the same purpose. As a result of this growth and the corresponding market data requirements, financial firms need a massively scalable infrastructure in order to keep pace with increasing demands.
Such scalability is readily available in co-location facilities like Interxion's that employ a modular approach with easy build-outs, eliminating wasteful over-provisioning. The availability of scalable resources like connectivity and power within a co-location facility is also extremely cost effective, since having multiple participants under one roof allows companies to leverage economies of scale. Finally, taking into account the facilities' close proximity to all major liquidity venues, co-locating to a third-party data centre provider becomes a definite win-win, especially in carrier-neutral facilities that provide redundant, low-latency fibre routes to any major site.
Co-location data centre facilities also allow for significant improvements in the performance of financial firms' data acquisition and data distribution. Since co-location allows data processing to be done much closer to the source, firms are able to filter and optimise acquired data much earlier in the process. For instance, if a firm sources a data feed from an exchange, the total throughput of the subscription might peak at around 100,000 messages per second, with an average message size of around 50 bytes. Processing that data remotely from the exchange, the firm's network would need to handle 100,000 messages per second at 50 bytes each, even if it doesn't actually need all of those messages. If the firm processes the data in proximity to the exchange, however, it can apply a filter to source only the instruments it is interested in, and reformat the data into proprietary messages that are shorter than the exchange's. In our example, the required throughput would then drop to 50,000 messages per second at 30 bytes each, meaning far less bandwidth is required. Being located close to the data source helps financial firms save substantial amounts of time and money by focusing only on the data that matters and by reducing latency through proximity and network optimisation.
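The bandwidth saving in the example above can be checked with a quick back-of-the-envelope calculation. The figures below (message rates and sizes) are the illustrative numbers from the example, not real exchange data:

```python
def feed_bandwidth(messages_per_sec: int, avg_message_bytes: int) -> int:
    """Bandwidth (bytes/s) needed to carry a feed at the given rate and size."""
    return messages_per_sec * avg_message_bytes

# Remote processing: the full exchange feed crosses the firm's network.
remote = feed_bandwidth(100_000, 50)      # 5,000,000 bytes/s (~4.8 MB/s)

# Co-located processing: filter to instruments of interest and repack into
# shorter proprietary messages before anything leaves the data centre.
colocated = feed_bandwidth(50_000, 30)    # 1,500,000 bytes/s (~1.4 MB/s)

print(f"remote feed:  {remote:,} bytes/s")
print(f"co-located:   {colocated:,} bytes/s")
print(f"reduction:    {1 - colocated / remote:.0%}")
```

Under these assumed numbers, filtering and reformatting at the source cuts the wide-area bandwidth requirement by 70%, before accounting for the latency gained through proximity.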
Similarly, for distribution, financial institutions typically first receive data and then redistribute it to their own customers down the line. In a co-location facility, however, customers can locate within the same data centre as the distributing firm and cross connect, presenting opportunities for instant distribution and significant time savings. Furthermore, sharing a facility with other market participants can help firms increase their reach and reduce the time to market for acquiring new customers.
Increasing Infrastructure Externalisation
With unprecedented levels of data being generated in real time, driven by the need to manage the risks of increasing market volatility and the immediacy of financial impacts, firms face a real challenge in handling the increase in data cost effectively. What's more, the transparency requirements of the Dodd-Frank Act and the upcoming MiFID 2 regulation will simply add to this growth. Turning to a data centre co-location provider and externalising infrastructure to meet these demands and data volumes has therefore become extremely popular.
Indeed, with co-location facilities' high-density power capabilities, scalability and proximity to major liquidity venues, financial institutions are finding major advantages in moving away from their legacy systems and inadequate infrastructures. While the first wave of financial regulations popularised co-located data centres for high-frequency trading performance, new requirements for risk management and analytics are showcasing the long-term benefits that financial firms gain by locating their infrastructure externally with a provider such as Interxion.