About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Big Data in Financial Services – Opportunity or Cost?


By John Bantleman, RainStor
www.rainstor.com

We recently hosted a dinner in New York City with 20 technology executives focused on big data in banking and financial services. I found the event insightful, so I thought it would be worth sharing some of the perspectives from those who attended.

The first (and close to my heart) is the separation of the big data business problem from the available technology and solution architectures. I am not trying to detract from the applicability and relevance of the Hadoop technology stack, or from the attractive, low-cost scalability that makes it central to solving big data challenges for today's enterprise. Yes, there is an elephant charging straight towards us, but I do feel that as an industry we spend far more time focused on the technology itself and far less on the business problems it solves.

Financial services and banking firms see themselves as having lived big data for many years, with very high SLAs and stringent availability and security requirements, long before the big data market took off. Today, start-ups are forming almost daily with millions of dollars in venture funding, in addition to serious investment poured in by public technology providers. Yet look at a classic Wall Street financial services organisation or bank and there is a big problem. When you have to manage on the order of 50-200 petabytes of data, growing at 40% to 100% a year, under increasing pressure and scrutiny from outside regulators while trying to reduce IT spend, the problem is not just big, it's staggering!

The days of consistent double-digit margins, even for large investment banks, are gone, and with little to no power to increase volume on the top end, you have to figure out a way to take a third of your embedded cost out. As one executive put it: "If you are not focused on reducing IT costs right now, you are just not paying attention."

So is big data viewed as opportunity or cost? An informal mini-poll of attendees came out 80% cost vs. 20% opportunity. The regulatory environment on Wall Street is such that compliance now costs the industry $30 billion a year, and there is simply no avoiding it. By contrast, when we discussed this with retail banks, where the desire is to better understand customer behavior, we saw a shift in the balance: the business opportunity to better understand the customer across a broad range of products and services (i.e. data sources) is a very compelling proposition for executives and line-of-business owners. The technology enabler then becomes "how can I keep all the history of customer transactions and clicks for longer without increasing infrastructure spend?" Or, in other words, "cheap and deep," which I covered in a previous blog a few weeks back.

Interestingly, the respective big data solution architectures varied too. Most banks have a Hadoop cluster in a sand pit where a few technical resources are playing with it; some are investigating broader usage, and nearly all are interested in its low-cost storage (even cloud), where you can easily provision virtual servers and, in some cases, Content Addressable Stores (to comply with WORM requirements). By contrast, the classic financial services or investment firm is certainly paying attention to new technology innovation but does not appear to be taking Hadoop as seriously right now, at least in terms of rolling out production clusters that the business will come to rely upon.

There is no question that big data is a big deal in financial services. The problem already exists rather than being "anticipated" or "coming." The requirements are largely set and are very high in terms of enterprise-grade capabilities. For a market sector known to be an early technology adopter, it will be very interesting to see how big data technology ecosystems play out in the coming months and years.
