The knowledge platform for the financial technology industry

A-Team Insight Blogs

Winning the Big Data Race for Trading Communication


By Paul Metcalfe, Orange Business Services

Big data is increasingly becoming a key concern for technologists, especially with the recent introduction of recording requirements for voice and SMS on mobile phones used for trading activities in the UK. As additional voice recording requirements add to the growing repository of data firms must hold to comply, the pressure to manage and store ever larger data volumes is greater than ever.

As well as meeting regulatory requirements for greater transparency and a more comprehensive record of past trading activities, this data can also serve a purpose in other areas such as dispute resolution.

Data Increase

The rapidly increasing need to hold more data is a major concern for the financial services industry. The challenge is twofold: volume and diversity. Trading communication now takes many forms: voice, IM and SMS, as well as social networks, email and other systems-related data. All of this data has to be stored, with extra volume coming from additional data sources, such as media files from video conferencing, along with information that may have influenced or could contextualise the communication being recorded (e.g. news feeds or streaming). Technologists must be prepared.

The expectation is that regulators will push for this broader range of data types to be held for far longer periods than before, with current averages in the range of three to six months moving closer to three to five years (as seen in Dodd-Frank and MiFID II proposals). Planning ahead for storage requirements is, however, a challenge: the volume and diversity of data are often affected by increased market activity, which is hard to foresee, along with the seasonality of specific market activities. The main concern is that existing data technologies and infrastructures could soon reach the limits of their scalability and performance.
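The scale of the problem is easy to see with simple arithmetic. The sketch below shows how extending retention from six months to five years multiplies storage requirements; the daily intake figure is a hypothetical assumption for illustration, not a figure from the article.

```python
# Illustrative sketch: how extending retention from months to years
# multiplies storage requirements. All figures are hypothetical.

def retention_storage_gb(daily_gb: float, retention_days: int) -> float:
    """Total storage needed to retain daily_gb of new data for retention_days."""
    return daily_gb * retention_days

daily_volume_gb = 50.0  # assumed daily intake across voice, IM, SMS, email, media

six_months = retention_storage_gb(daily_volume_gb, 180)
five_years = retention_storage_gb(daily_volume_gb, 5 * 365)

print(f"6-month retention: {six_months:,.0f} GB")
print(f"5-year retention:  {five_years:,.0f} GB")
print(f"Growth factor:     {five_years / six_months:.1f}x")
```

Whatever the actual daily intake, moving from a six-month to a five-year window is roughly a tenfold increase before any growth in market activity is factored in.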

Processing Pressure

Above and beyond the issues of volume and diversity is the subsequent challenge of processing the required analysis and interrogation of the data. The turnaround of requests (speed of processing) will be key, and transparency across the geographic locations of the data and the requesting user will also be important. We are now starting to see voice, along with a wider set of communication data, included in the deployment of data analytics engines. The most advanced of these engines must be able to 'slice and dice' this diverse and generally unstructured data by specific search criteria (e.g. all activities of a trade desk within a specific time window) or key values (e.g. a specific counterparty, a particular word or a trigger event).
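The kinds of 'slice and dice' queries described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's engine; the record fields and sample data are assumptions.

```python
# Minimal sketch of 'slice and dice' queries over communication records.
# The record schema and sample data below are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class CommRecord:
    timestamp: datetime
    channel: str        # e.g. 'voice', 'IM', 'SMS', 'email'
    desk: str
    counterparty: str
    transcript: str     # transcribed or raw text content

def by_desk_and_window(records, desk, start, end):
    """All activities of a trade desk within a specific time window."""
    return [r for r in records if r.desk == desk and start <= r.timestamp < end]

def by_keyword(records, word):
    """Records mentioning a particular word or trigger event."""
    return [r for r in records if word.lower() in r.transcript.lower()]

records = [
    CommRecord(datetime(2013, 3, 1, 9, 30), "voice", "FX", "BankA",
               "confirming the EURUSD fill"),
    CommRecord(datetime(2013, 3, 1, 14, 5), "IM", "Rates", "BankB",
               "swap reset query"),
    CommRecord(datetime(2013, 3, 2, 10, 0), "SMS", "FX", "BankC",
               "fill confirmed, thanks"),
]

fx_day_one = by_desk_and_window(records, "FX",
                                datetime(2013, 3, 1), datetime(2013, 3, 2))
print(len(fx_day_one))               # FX desk activity on 1 March
print(len(by_keyword(records, "fill")))
```

At scale, the hard part is not the query logic but running it quickly over large volumes of unstructured data (voice needs transcription before keyword search is even possible), which is what distinguishes the advanced analytics engines.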

Data and Analytics Hurdles

In today's cloud-based, 'as a service' technology world, institutions are seriously considering specialist service suppliers to provide flexible, fully scalable data storage solutions. This strategy was not immediately accepted for critical applications, due to uncertainty over security and reliability risks, but the general concept of outsourcing key infrastructure (data storage) along with highly proprietary and confidential data is now gaining acceptance. With a growing base of customers in the trading space, we would expect this to continue to grow in popularity. Larger firms may also look at deploying their own private cloud as an alternative.

Many large firms already have programmes and proofs of concept for advanced data analytics tools in place. However, many of these will need to be expanded to a wider selection of compliance data (e.g. voice, SMS), which will in turn require further investment. For most small and medium-sized firms, the cost of entry and the subsequent understanding of the tangible benefits must justify such a move. This is particularly difficult when the need for these firms to conduct such analysis is not a regular requirement and could probably be dealt with manually on a case-by-case basis; with a rapidly growing data repository, however, it is uncertain how long a manual approach would remain feasible with a quick enough turnaround.

Once again, the concept of analytics 'as a service' could be a way forward, and some of the tools are available on this basis today. Although we have not reached a point where a firm can easily buy data analytics for specific investigations or analysis on demand (i.e. 'pay as you go'), this concept is reasonably easy to understand and appears accessible to a wider range of budgets. An infrequent need to perform such analysis, however, means the format and availability of data would need to follow some general standards, which may not be straightforward, especially with legacy data.
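The standardisation point above is the crux of 'pay as you go' analytics: an external engine can only query what it can parse. A minimal sketch of normalising records from heterogeneous channels into one common format follows; the source field names are assumptions for illustration, not an established standard.

```python
# Sketch of normalising heterogeneous communication records into a single
# common format, a prerequisite for on-demand 'as a service' analytics.
# The source field names below are illustrative assumptions.

import json

def normalise_sms(raw: dict) -> dict:
    return {"channel": "sms", "timestamp": raw["sent_at"],
            "participants": [raw["from"], raw["to"]], "content": raw["body"]}

def normalise_voice(raw: dict) -> dict:
    return {"channel": "voice", "timestamp": raw["call_start"],
            "participants": raw["parties"], "content": raw["transcript"]}

raw_sms = {"sent_at": "2013-03-01T09:30:00Z", "from": "traderA",
           "to": "brokerB", "body": "confirm the fill"}
raw_call = {"call_start": "2013-03-01T10:15:00Z",
            "parties": ["traderA", "brokerC"],
            "transcript": "discussing the swap reset"}

unified = [normalise_sms(raw_sms), normalise_voice(raw_call)]
print(json.dumps(unified, indent=2))
```

Writing such adapters for live systems is tractable; the legacy-data problem the article mentions arises because old archives often lack the fields (or the machine-readable transcripts) that a common schema requires.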

Crossing the Line

The big data debate is still developing, but for larger firms the need to manage data effectively will quickly become a reality. Outsourcing and 'as a service' options could serve as alternatives to internal technology infrastructure reviews and potential investment. However, the solutions available to market participants today are still relatively new, and firms remain cautious when it comes to risk. For traders, the challenges around recording and compliance now reach further than ever before. The impact technology convergence has already had in this sector is clear, as the lines blur between the technologies deployed across trading floors. Telecommunication specialists are building expertise in this area, but there is more work to be done to win the big data race.

