The knowledge platform for the financial technology industry

A-Team Insight Blogs

GFT Blue Paper Presents the Potential of Big Data Technologies


As the technology infrastructure of financial services firms begins to buckle under the strain of huge and increasing volumes of data, many firms are investing in big data projects to improve data management for the purposes of regulatory compliance, risk reduction, cost efficiency and business benefit. The extent and intent of investment varies, however, with some firms taking an evolutionary approach that imposes big data technologies on existing processes and others taking a revolutionary approach that rethinks processes using big data technologies.

A Blue Paper by GFT, a specialist in designing and implementing IT solutions for the financial services industry, looks at the potential of big data solutions for investment banks, retail banks and insurance companies. Entitled Big Data – Uncovering the Hidden Business Value in the Financial Services Industry, the paper also considers use cases and benefits of big data, and provides recommendations to help firms successfully implement big data technologies.

Focussing on the investment banking sector, Karl Rieder, executive consultant at GFT and co-author of the Blue Paper, notes the problems caused by data silos in investment banks, which make it difficult to capture a complete view of activities and generate reports. He says banks are beginning to look at big data technologies with the capacity to store huge volumes of data and the power to process it.

He explains: “Investment banks taking a revolutionary approach are restructuring IT systems and centralising data storage. This takes some doing and is often driven by regulations that require, for example, a complete view of the activity of clients or comprehensive risk calculations.”

The report cites MiFID, Basel III, FATCA and Dodd-Frank as some of the regulations driving change, as they require banks to report across asset classes and necessitate enterprise-wide views of operations and activities. Rieder adds BCBS 239, a Basel Committee regulation that forces change in how data is managed, controlled and governed. He also notes the Volcker Rule, part of the Dodd-Frank Act, which limits banks’ proprietary trading and requires them to report on the inventory aging of all their positions. To achieve this, explains Rieder, a bank needs to hold a view of positions for today and the past 365 days, apply a complex algorithm and calculate how long positions have been held. Without big data technologies that can store, access and process both current and historic data, the calculation is difficult and can take days rather than the minutes achieved by big data solutions.
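The inventory aging calculation Rieder describes can be sketched in miniature. This is a hypothetical illustration only, assuming a simplified data model of daily position snapshots; in practice a bank would run such logic over a year of historic positions held in a distributed store, not an in-memory dictionary.

```python
from datetime import date, timedelta

def inventory_aging(snapshots, today):
    """For each instrument held today, count the consecutive days
    (up to 365) it has appeared in the daily position snapshots.

    snapshots: dict mapping date -> set of instrument IDs held that day.
    Returns: dict mapping instrument ID -> age in days (1 = opened today).
    """
    ages = {}
    for instrument in snapshots.get(today, set()):
        age = 1
        # Walk backwards through up to a year of history while the
        # position was continuously held.
        for back in range(1, 366):
            day = today - timedelta(days=back)
            if instrument in snapshots.get(day, set()):
                age += 1
            else:
                break
        ages[instrument] = age
    return ages

# Toy example: one bond held for three days, one equity opened today.
today = date(2015, 6, 3)
snapshots = {
    date(2015, 6, 1): {"BOND-123"},
    date(2015, 6, 2): {"BOND-123"},
    date(2015, 6, 3): {"BOND-123", "EQ-456"},
}
print(inventory_aging(snapshots, today))
```

With a year of history this is a large scan per position, which is why, as the paper argues, holding current and historic data in a platform built for parallel access turns a days-long batch job into minutes.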

Rieder warns that technology alone cannot solve banks’ big data problems. Firms may need to upskill staff to implement the technology successfully, and it is essential to address data quality, policies and procedures before deployment.

Once these issues are resolved, the technologies GFT proposes to support big data are distributed storage and distributed processing, which together can hold and process larger volumes of data than could previously be managed. When selecting specific solutions to support distributed storage and computing, GFT names Hadoop, an open source framework that includes distributed storage and processing, as its platform of choice; NoSQL databases as a means of storing and searching both structured and unstructured data; and event management platforms to support real-time processing of big data. Focussing on NoSQL, Rieder says the open source MongoDB and Cassandra databases are favourites at investment banks, but also notes the popularity of MarkLogic, a commercial, enterprise-ready NoSQL platform that includes access control and security policies.
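The distributed processing model Hadoop popularised, MapReduce, can be illustrated in plain Python. This is a toy sketch of the programming model, not Hadoop itself: the trade records and counterparty names are invented, and the point is only to show how an aggregate such as total exposure per counterparty splits into a map phase and a reduce phase that a cluster can run in parallel over sharded data.

```python
from collections import defaultdict

# Toy trade records; in a real deployment these would be billions of
# rows spread across a cluster's distributed file system.
trades = [
    {"counterparty": "ACME", "notional": 1_000_000},
    {"counterparty": "GLOBEX", "notional": 250_000},
    {"counterparty": "ACME", "notional": 500_000},
]

def map_phase(records):
    # Map: emit (key, value) pairs; runs independently on each shard.
    for t in records:
        yield t["counterparty"], t["notional"]

def reduce_phase(pairs):
    # Reduce: aggregate all values sharing a key. The framework's
    # shuffle step delivers each key's values to a single reducer.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

exposure = reduce_phase(map_phase(trades))
print(exposure)  # {'ACME': 1500000, 'GLOBEX': 250000}
```

Because the map phase touches each record independently, adding machines scales the computation near-linearly, which is what makes enterprise-wide views such as counterparty risk monitoring tractable.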

With big data technologies in place, GFT says investment banks should be able to support a consolidated view of trades, trade analytics, market and credit risk calculations, rogue trade detection, counterparty risk monitoring and regulatory reporting. Improved IT efficiency should drive better operations, reduced IT costs and, ultimately, increased business profitability.

