The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Winning the Big Data Race for Trading Communication

By Paul Metcalfe, Orange Business Services

Big data is increasingly a key concern for technologists, especially following the recent introduction of recording requirements for voice and SMS on mobile phones used for trading activities in the UK.  As additional voice recording requirements add to the repository of data firms must hold to comply, the pressure to manage and house ever-larger data volumes is greater than ever.

As well as meeting regulatory requirements for greater transparency and a more comprehensive record of past trading activities, this data can also serve a purpose in other areas such as dispute resolution.

Data Increase

The rapidly growing need to hold more data is a major concern for the financial services industry. The challenge is twofold: volume and diversity.  Trading communication now takes many forms: voice, IM and SMS, as well as social networks, email and other systems-related data.  All of this data has to be stored, with extra volume coming from additional sources such as media files from video conferencing, along with information that may have influenced or could contextualise the communication being recorded (e.g. news feeds or streaming). Technologists must be prepared.

The expectation is that regulators will push for this broader range of data types to be held for far longer than before, with current averages of three to six months moving closer to three to five years (as seen in the Dodd-Frank and MiFID II proposals).  Planning ahead for storage requirements is, however, a challenge: the volume and diversity of data are often driven by increased market activity, which is hard to foresee, along with the seasonality of specific market activities.  The main concern is that existing data technologies and infrastructures could soon reach the limits of their scalability and performance.
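To make the scaling pressure concrete, a back-of-the-envelope capacity estimate shows how extending retention from six months to five years multiplies storage needs. All per-channel volumes below are illustrative assumptions, not figures from this article:

```python
# Illustrative capacity estimate: all daily volumes are assumed figures
# for a single trading desk, not data from the article.

DAILY_VOLUMES_GB = {
    "voice": 20.0,               # compressed call recordings
    "video_conferencing": 50.0,  # media files
    "email_im_sms": 5.0,         # text-based channels
    "market_context": 10.0,      # news feeds / streaming snapshots
}

TRADING_DAYS_PER_YEAR = 252

def retention_storage_tb(retention_years: float) -> float:
    """Total storage (TB) needed to hold all channels for the retention period."""
    daily_gb = sum(DAILY_VOLUMES_GB.values())
    return daily_gb * TRADING_DAYS_PER_YEAR * retention_years / 1024

six_months = retention_storage_tb(0.5)
five_years = retention_storage_tb(5.0)
print(f"6-month retention: {six_months:.1f} TB")
print(f"5-year retention:  {five_years:.1f} TB ({five_years / six_months:.0f}x)")
```

Whatever the actual per-channel volumes, the multiplier is the point: a tenfold longer retention window means roughly tenfold more storage, before allowing for growth in trading activity itself.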

Processing Pressure

Above and beyond the issues of volume and diversity is the subsequent challenge of processing: the analysis and interrogation the data must support.  The turnaround of requests (speed of processing) will be key, and performance should be transparent to the geographic locations of both the data and the requesting user.  We are now starting to see voice, along with a wider set of communication data, included in deployments of data analytics engines.  The most advanced of these engines must be able to ‘slice and dice’ this diverse and generally unstructured data by specific search criteria (e.g. all activities of a trading desk within a specific time window) or key values (e.g. a specific counterparty, a particular word or a trigger event).
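The ‘slice and dice’ queries described above can be sketched as a filter over communication records. The record schema, field names and sample data here are assumptions for illustration only; a real analytics engine would run such queries over indexed, distributed storage rather than an in-memory list:

```python
# Illustrative sketch of 'slice and dice' queries over communication records.
# The CommRecord schema and sample data are hypothetical, not a real system's.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CommRecord:
    timestamp: datetime
    channel: str        # "voice", "sms", "im", "email"
    desk: str
    counterparty: str
    transcript: str     # message text or speech-to-text output

def slice_records(records, desk=None, start=None, end=None,
                  counterparty=None, keyword=None):
    """Filter records by any combination of desk, time window, counterparty, keyword."""
    out = []
    for r in records:
        if desk and r.desk != desk:
            continue
        if start and r.timestamp < start:
            continue
        if end and r.timestamp > end:
            continue
        if counterparty and r.counterparty != counterparty:
            continue
        if keyword and keyword.lower() not in r.transcript.lower():
            continue
        out.append(r)
    return out

records = [
    CommRecord(datetime(2013, 5, 1, 9, 30), "voice", "fx-desk", "Bank A",
               "confirming the EUR/USD trade"),
    CommRecord(datetime(2013, 5, 1, 14, 0), "sms", "fx-desk", "Bank B",
               "cancel the order"),
    CommRecord(datetime(2013, 5, 2, 10, 0), "im", "rates-desk", "Bank A",
               "rolling the swap position"),
]

# e.g. all fx-desk communications mentioning "trade"
hits = slice_records(records, desk="fx-desk", keyword="trade")
print([r.counterparty for r in hits])
```

Each optional parameter corresponds to one of the search criteria named in the text: the desk and time window are the ‘slice’, while counterparty and keyword are the ‘key values’.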

Data and Analytics Hurdles

In today’s cloud-based, ‘as a service’ world, institutions are seriously considering specialist service suppliers to provide flexible, fully scalable data storage.  This strategy was not immediately accepted for critical applications, owing to uncertainty about security and reliability risks, but the general concept of outsourcing key infrastructure (data storage) along with highly proprietary and confidential data is now gaining acceptance.  With a growing base of customers in the trading space, we would expect this to continue to grow in popularity. Larger firms may also look at deploying their own private cloud as an alternative.

Many large firms already have programs and proofs of concept for advanced data analytics tools in place.  However, many of these will need to be expanded to cover a wider selection of compliance data (e.g. voice, SMS), which will in turn require further investment.  For most small to medium-sized firms, the cost of entry and the tangible benefits must justify such a move.  This is particularly difficult when the need to conduct such analysis is irregular and could probably be handled manually on a case-by-case basis; with a rapidly growing data repository, however, it is uncertain how long manual handling would remain feasible with a quick enough turnaround.

Once again, the concept of analytics ‘as a service’ could be a way forward, and some tools are already available on this basis.  Although we have not reached the point where a firm can easily buy data analytics for specific investigations or analysis on demand (i.e. ‘pay as you go’), the concept is easy to understand and appears accessible to a wider range of budgets.  An infrequent need to perform such analysis, however, means the format and availability of data would need to follow some general standards, which may not be straightforward, especially with legacy data.

Crossing the Line

The big data debate is still developing, but for larger firms the need to manage data effectively will quickly become a reality.  Outsourcing and ‘as a service’ options could serve as alternatives to internal technology infrastructure reviews and investment.  However, the solutions available to market participants today are still relatively new, and firms remain cautious about risk.  For traders, the challenges around recording and compliance now reach further than before.  The impact of technology convergence in this sector is already clear, as the lines blur between the technologies deployed across trading floors.  Telecommunications specialists are building expertise in this area, but there is more work to be done to win the big data race.
