A-Team Insight Blogs

GFT Blue Paper Presents the Potential of Big Data Technologies

As the technology infrastructure of financial services firms begins to buckle under the strain of huge and increasing volumes of data, many firms are investing in big data projects to improve data management for the purposes of regulatory compliance, risk reduction, cost efficiency and business benefit. The extent and intent of investment vary, however, with some firms taking an evolutionary approach that imposes big data technologies on existing processes, and others taking a revolutionary approach that rethinks processes around big data technologies.

A Blue Paper by GFT, a specialist in designing and implementing IT solutions for the financial services industry, looks at the potential of big data solutions for investment banks, retail banks and insurance companies. Entitled Big Data – Uncovering the Hidden Business Value in the Financial Services Industry, the paper also considers use cases and benefits of big data, and provides recommendations to help firms successfully implement big data technologies.

Focussing on the investment banking sector, Karl Rieder, executive consultant at GFT and co-author of the Blue Paper, notes the problem of data silos, which make it difficult to capture a complete view of activities and generate reports, but says banks are beginning to look at big data technologies with the capacity to store huge volumes of data and the power to process them.

He explains: “Investment banks taking a revolutionary approach are restructuring IT systems and centralising data storage. This takes some doing and is often driven by regulations that require, for example, a complete view of the activity of clients or comprehensive risk calculations.”

The report cites MiFID, Basel III, FATCA and Dodd-Frank as some of the regulations driving change, as they require banks to report across asset classes and necessitate enterprise-wide views of operations and activities. Rieder adds BCBS 239, a Basel Committee regulation that forces change in how data is managed, controlled and governed. He also notes the Volcker Rule, the part of the Dodd-Frank Act that restricts banks’ proprietary trading and requires them to report on the inventory aging of all their positions. To achieve this, explains Rieder, a bank needs to hold a view of positions for today and the past 365 days, apply a complex algorithm and calculate how long positions have been held. Without big data technologies that can store, access and process both current and historic data, the calculation is difficult and can take days rather than the minutes achieved by big data solutions.
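
To make the inventory aging requirement concrete, here is a minimal sketch of what such a holding-period calculation could look like, assuming daily position snapshots held in a pandas DataFrame. The column names and the simplified consecutive-day counting rule are illustrative assumptions, not GFT’s algorithm; at the scale Rieder describes, the same logic would run over distributed storage and processing rather than a single in-memory frame.

```python
# Illustrative only: a greatly simplified inventory-aging calculation over
# a year of daily position snapshots. Column names are assumptions.
import pandas as pd

def inventory_age(snapshots: pd.DataFrame, as_of: str) -> pd.Series:
    """Days each instrument has been held as of `as_of`, counting back
    through consecutive daily snapshots with a non-zero position."""
    window = snapshots[snapshots["date"] <= as_of]
    held = window[window["quantity"] != 0]
    ages = {}
    for instrument, grp in held.groupby("instrument"):
        dates = pd.to_datetime(grp["date"]).sort_values(ascending=False)
        age = 0
        expected = pd.Timestamp(as_of)
        for d in dates:  # count back while the position appears every day
            if d == expected:
                age += 1
                expected -= pd.Timedelta(days=1)
            else:
                break
        ages[instrument] = age
    return pd.Series(ages, name="days_held")

# Example: 365 daily snapshots for one instrument, 30 for another
dates = pd.date_range(end="2015-12-31", periods=365).strftime("%Y-%m-%d")
snaps = pd.DataFrame(
    [{"date": d, "instrument": "XYZ", "quantity": 100} for d in dates]
    + [{"date": d, "instrument": "ABC", "quantity": 50} for d in dates[-30:]]
)
print(inventory_age(snaps, "2015-12-31"))  # XYZ: 365, ABC: 30
```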

Rieder warns that technology alone cannot solve banks’ big data problems. Firms may need to upskill to implement the technology successfully, he says, and must address data quality, policies and procedures before deployment.

Once these issues are resolved, the technologies GFT proposes to support big data are distributed storage and distributed processing, which together can hold and process larger volumes of data than could previously be managed. When selecting specific solutions to support distributed storage and computing, GFT names Hadoop, an open source framework that includes distributed storage and processing, as its platform of choice; NoSQL databases as a means of storing and searching both structured and unstructured data; and event management platforms to support real-time processing of big data. Focussing on NoSQL, Rieder says the open source MongoDB and Cassandra databases are favourites at investment banks, but also notes the popularity of MarkLogic, a commercial, enterprise-ready NoSQL platform that includes access control and security policies.
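
As a rough illustration of the schema flexibility the paper attributes to NoSQL databases, the sketch below stores a structured trade record and a semi-structured one, with attached free text, side by side in MongoDB and queries across both. The connection string, database, collection and field names are placeholder assumptions, not details from the GFT paper.

```python
# A hedged sketch of schema-flexible storage in MongoDB via pymongo.
# Names and the local connection string are placeholders for illustration.
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumes a local instance
trades = client["bank"]["trades"]

# Structured and semi-structured records can live in the same collection:
trades.insert_many([
    {"trade_id": "T1", "asset_class": "equity", "counterparty": "ACME",
     "notional": 1_000_000},
    {"trade_id": "T2", "asset_class": "fx", "counterparty": "ACME",
     "legs": [{"ccy": "EUR"}, {"ccy": "USD"}],
     "chat_transcript": "free-text broker chat attached to the trade"},
])

# Index and query across both shapes for a counterparty-wide view:
trades.create_index([("counterparty", ASCENDING)])
for doc in trades.find({"counterparty": "ACME"}):
    print(doc["trade_id"], doc["asset_class"])
```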

With big data technologies in place, GFT says investment banks should be able to support a consolidated view of trades, trade analytics, market and credit risk calculations, rogue trade detection, counterparty risk monitoring and regulatory reporting. Improved IT efficiency should drive better operations, reduced IT costs and, ultimately, increased business profitability.
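
Those consolidated views rest on exactly this kind of distributed processing. As a hedged sketch, the snippet below uses PySpark, a common engine in Hadoop-based stacks, to aggregate trades booked on separate desks into a firm-wide counterparty exposure view; the tiny in-memory dataset and column names are assumptions for illustration, and a real deployment would read from distributed storage such as HDFS.

```python
# Illustrative sketch of a consolidated, cross-desk trade view in PySpark.
# Dataset and column names are assumptions, not GFT's implementation.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("consolidated-trades").getOrCreate()

trades = spark.createDataFrame(
    [("equities-desk", "ACME", 1_000_000.0),
     ("fx-desk", "ACME", 250_000.0),
     ("rates-desk", "GLOBEX", 5_000_000.0)],
    ["desk", "counterparty", "notional"],
)

# Firm-wide counterparty exposure, aggregated across desk silos:
exposure = trades.groupBy("counterparty").agg(
    F.sum("notional").alias("total_notional"),
    F.count("*").alias("trade_count"),
)
exposure.show()
```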

Related content

WEBINAR

Recorded Webinar: FRTB: What still needs to be done before the global deadline of January 2023?

While implementation of the Fundamental Review of the Trading Book (FRTB) regulation has been delayed twice, first because of its complexity and then because of the coronavirus pandemic, the final deadline of January 1, 2023 is less than a year away. For banks in scope of the regulation, the time to put necessary risk infrastructure and data...

BLOG

Options Partners with Code Willing to Offer New Quantitative Trading Service

Trading infrastructure provider Options Technology has partnered with data management company Code Willing to offer a new quantitative trading service intended to help clients better control and reduce their data analysis costs. The solution combines Code Willing’s data cleansing, organising and cross-referencing capabilities with Options’ global network, infrastructure and managed services. Aimed at quantitative investment...

EVENT

RegTech Summit Virtual

The RegTech Summit Virtual is a global online event that brings together an exceptional guest speaker line-up of RegTech practitioners, regulators, start-ups and solution providers to collaborate and discuss innovative and effective approaches for building a better regulatory environment.

GUIDE

ESG Data Handbook 2022

The ESG landscape is changing faster than anyone could have imagined even five years ago. With tens of trillions of dollars expected to have been committed to sustainable assets by the end of the decade, it’s never been more important for financial institutions of all sizes to stay abreast of changes in the ESG data...