
GFT Blue Paper Presents the Potential of Big Data Technologies


As the technology infrastructure of financial services firms begins to buckle under the strain of huge and increasing volumes of data, many firms are investing in big data projects to improve data management for the purposes of regulatory compliance, risk reduction, cost efficiency and business benefit. The extent and intent of investment varies, however, with some firms taking an evolutionary approach that imposes big data technologies on existing processes and others taking a revolutionary approach that rethinks processes using big data technologies.

A Blue Paper by GFT, a specialist in designing and implementing IT solutions for the financial services industry, looks at the potential of big data solutions for investment banks, retail banks and insurance companies. Entitled Big Data – Uncovering the Hidden Business Value in the Financial Services Industry, the paper also considers use cases and benefits of big data, and provides recommendations to help firms successfully implement big data technologies.

Focussing on the investment banking sector, Karl Rieder, executive consultant at GFT and co-author of the Blue Paper, notes the problems of data silos in investment banks and the difficulty of capturing a complete view of activities and generating reports, but says banks are beginning to look at big data technologies that have the capacity to store huge volumes of data and the power to process it.

He explains: “Investment banks taking a revolutionary approach are restructuring IT systems and centralising data storage. This takes some doing and is often driven by regulations that require, for example, a complete view of the activity of clients or comprehensive risk calculations.”

The report cites MiFID, Basel III, FATCA and Dodd-Frank as some of the regulations driving change, as they require banks to report across asset classes and necessitate enterprise-wide views of operations and activities. Rieder adds BCBS 239, a Basel Committee regulation that forces change in how data is managed, controlled and governed. He also notes the Volcker Rule, part of the Dodd-Frank Act, which relates to banks’ levels of proprietary trading and requires them to report on the inventory aging of all their positions. To achieve this, explains Rieder, a bank needs to hold a view of positions for today and the past 365 days, apply a complex algorithm and calculate how long positions have been held. Without big data technologies that can store, access and process both current and historic data, the calculation is difficult and can take days rather than the minutes achieved by big data solutions.
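
As a rough illustration of the inventory aging calculation Rieder describes, the sketch below ages positions from a 365-day history of daily snapshots. It is a minimal Python example: the table layout, field names and the simple “days since first seen” rule are assumptions made for illustration, not GFT’s or any bank’s actual algorithm.

```python
# Minimal sketch of the inventory-aging idea described above.
# The snapshot table, field names and ageing rule are illustrative assumptions.
from datetime import date, timedelta

import pandas as pd

# One row per position per day over the trailing year.
history = pd.DataFrame({
    "position_id": ["P1", "P1", "P2"],
    "snapshot_date": [date(2015, 1, 2), date(2015, 1, 3), date(2015, 1, 3)],
    "quantity": [100, 100, 50],
})

as_of = date(2015, 1, 3)
window_start = as_of - timedelta(days=365)

# Keep the trailing 365-day window, then age each position as the number of
# days since it first appeared in that window with a non-zero quantity.
window = history[(history["snapshot_date"] >= window_start) & (history["quantity"] != 0)]
days_held = (
    window.groupby("position_id")["snapshot_date"]
    .min()
    .apply(lambda first_seen: (as_of - first_seen).days)
    .rename("days_held")
)

print(days_held)  # P1 -> 1, P2 -> 0
```

On real volumes the same logic would run over distributed storage and processing rather than an in-memory frame, which is where the technologies discussed below come in.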

Rieder warns that technology alone cannot solve banks’ big data problems. He says firms may need to upskill to implement the technology successfully, and that it is essential to address data quality, policies and procedures before any technology is deployed.

Once these issues are resolved, the technologies GFT proposes to support big data are distributed storage and distributed processing, which can together hold and process larger volumes of data than could previously be managed. When selecting specific solutions to support distributed storage and computing, GFT names Hadoop, an open source framework that includes distributed storage and processing, as its platform of choice; NoSQL databases as a means of storing and searching both structured and unstructured data; and event management platforms to support real-time processing of big data. Focussing on NoSQL, Rieder says the open source MongoDB and Cassandra databases are favourites at investment banks, but also notes the popularity of MarkLogic, a commercial, enterprise-ready NoSQL platform that includes access control and security policies.
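
To make the document-store option more concrete, here is a small sketch of holding cross-asset trades in MongoDB, one of the NoSQL databases Rieder mentions. The connection string, collection name and document fields are assumptions for the example, not details from the GFT paper.

```python
# Illustrative sketch: trades from different asset classes stored in one
# MongoDB collection and queried together. Names and fields are assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
trades = client["bank"]["trades"]

# The document model does not force a fixed schema, so equity and FX trades
# can sit side by side with different fields.
trades.insert_many([
    {"trade_id": "T1", "asset_class": "equity", "client": "ACME", "notional": 1_000_000},
    {"trade_id": "T2", "asset_class": "fx", "client": "ACME", "notional": 250_000,
     "currency_pair": "EURUSD"},
])

# A single query then gives a cross-asset view of one client's activity.
for trade in trades.find({"client": "ACME"}):
    print(trade["trade_id"], trade["asset_class"], trade["notional"])
```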

With big data technologies in place, GFT says investment banks should be able to support a consolidated view of trades, trade analytics, market and credit risk calculations, rogue trade detection, counterparty risk monitoring and regulatory reporting. Improved IT efficiency should drive better operations, reduced IT costs and, ultimately, increased business profitability.
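
As a final sketch of one of those use cases, the snippet below aggregates counterparty exposure across trade records held in distributed storage, using PySpark as a representative Hadoop-ecosystem processing engine. The file path and column names are assumptions made for illustration.

```python
# Rough sketch: enterprise-wide counterparty exposure computed over trade
# records landed in distributed storage. Path and column names are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("counterparty-exposure").getOrCreate()

# Trade records consolidated from several source systems.
trades = spark.read.parquet("hdfs:///data/trades/")

# Total mark-to-market exposure per counterparty, across all asset classes.
exposure = (
    trades.groupBy("counterparty")
    .agg(F.sum("mark_to_market").alias("total_exposure"))
    .orderBy(F.desc("total_exposure"))
)

exposure.show()
```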

