The knowledge platform for the financial technology industry

A-Team Insight Blogs

Big Data in Financial Services – Opportunity or Cost?


By John Bantleman, RainStor
www.rainstor.com

We recently hosted a dinner in New York City with 20 technology executives focused on big data in banking and financial services. The event was insightful, so I thought it would be worth sharing some of the perspectives from those who attended.

The first (and closest to my heart) is the separation of the big data business problem from the available technology and solution architectures. I am not trying to detract from the applicability and relevance of the Hadoop technology stack, or from its attractive, low-cost scalability as a platform central to solving big data challenges for today’s enterprise. Yes, there is an elephant charging straight towards us, but I do feel that as an industry we spend far more time focused on the technology itself than on the business problems it solves.

Financial services and banking firms see themselves as having lived big data for many years, with very high SLAs and stringent availability and security requirements. Indeed, they were doing so long before the big data market took off, with start-ups now forming almost every day on millions of dollars of venture funding, in addition to serious investment pouring in from public technology providers. Look at a classic Wall Street financial services organisation or bank and there is a big problem: when you have to manage on the order of 50-200 petabytes of data, growing at 40% to 100%, under increasing pressure and scrutiny from outside regulators while trying to reduce IT spend, the problem is not just big, it’s staggering!

The days of consistent double-digit margins, even for large investment banks, are gone, and with little to no power to increase volume at the top end, you have to figure out a way to take a third of your embedded cost out. As one executive put it: “If you are not focused on reducing IT costs right now, you are just not paying attention.”

So is big data viewed as opportunity or cost? An informal mini-poll of attendees came out 80% cost versus 20% opportunity. The regulatory environment on Wall Street is such that it now costs the industry $30 billion a year, and there is simply no avoiding it. By contrast, when we discussed this with retail banks, where the desire is to better understand customer behaviour, the balance shifted: the business opportunity to understand the customer across a broad range of products and services (i.e. data sources) is a very compelling proposition for executives and line-of-business owners. The technology question then becomes: “How can I keep all the history of customer transactions and clicks for longer without increasing infrastructure spend?” In other words, “cheap and deep,” which I covered in a previous blog a few weeks back.

Interestingly, the respective big data solution architectures varied too. Most banks have a Hadoop cluster in a sand pit where a few technical resources are playing with it; some are investigating broader usage, and nearly all are interested in it for its low-cost storage (even in the cloud), where you can easily provision virtual servers and, in some cases, Content Addressable Stores (to comply with WORM requirements). By contrast, the classic financial services or investment firm is certainly paying attention to new technology innovation but does not appear to be taking Hadoop as seriously right now, at least in terms of rolling out production clusters the business will come to rely upon.

There is no question that big data is a big deal in financial services. The problem already exists, as opposed to being “anticipated” or “coming”. The requirements are mostly set and are very high in terms of enterprise-grade capabilities. For a market sector known as an early technology adopter, it will be very interesting to see how big data technology ecosystems play out in the coming months and years.
