
A-Team Insight Blogs

Managing Big Data Woes: A How-to Guide for Hedge Funds


By Mark Kenney, Abacus Group
www.abacusgroupllc.com

The explosion of data over the last several years, coupled with dynamic compliance requirements, has put excessive stress on many firms’ storage capabilities. With the need to juggle multiple strategies, multiple funds and multiple prime brokers, it’s not uncommon for today’s hedge funds to deal with many terabytes of data.

In the past, most firms didn’t have the infrastructure or capital to properly store, back up and monitor “big data.” Their only real option was to keep it on site by implementing or adding capacity to an in-house storage solution – absorbing the cost not only of the high-ticket hardware and software, but also of the staff with the knowledge to keep things running.

But with “the cloud” coming of age, firms now have access to the substantial capacity needed to handle big data tasks, such as back-testing large quantitative models, seamlessly managing multiple market data feeds, and storing and backing up tick-level data to meet regulatory requirements.

We have identified the following four essential qualities that firms require from technology providers to alleviate their big data challenges:

1. Scalability

Firms should be concerned with running their business, not trying to determine how much storage they need now – let alone forecasting future data requirements.

Providers should be able to supply large-capacity, burstable storage that allows funds to add or decrease capacity instantly, based on their immediate requirements. In addition, providers should be able to supply access to required market data feeds on the fly.
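As a rough illustration of what “burstable” capacity can look like in practice, the sketch below grows a block-storage volume on demand using AWS’s boto3 SDK. The volume ID, region and target size are hypothetical placeholders, and any provider with an equivalent volume-resize API could be substituted.

```python
# Illustrative only: growing block-storage capacity on demand via AWS EBS.
# The volume ID, region and target size are placeholders.
import time
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def grow_volume(volume_id: str, new_size_gib: int) -> None:
    """Request an in-place capacity increase and wait for it to take effect."""
    ec2.modify_volume(VolumeId=volume_id, Size=new_size_gib)
    while True:
        mods = ec2.describe_volumes_modifications(VolumeIds=[volume_id])
        state = mods["VolumesModifications"][0]["ModificationState"]
        if state in ("optimizing", "completed"):
            break  # extra capacity is usable; the filesystem can now be extended
        time.sleep(15)

# e.g. burst to 500 GiB ahead of a large back-testing run
grow_volume("vol-0123456789abcdef0", 500)
```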

2. Efficiency

Firms generate different types of data that inherently have different requirements. Managing the quantitative models on which investment decisions are made requires more immediate access than archiving messages or documents for compliance purposes. Providers should be able to supply multiple types of scalable, enterprise-level storage architecture, such as serial ATA (SATA) and serial-attached SCSI (SAS). Further, providers should offer tiered pricing models that take into account how much and what type(s) of storage a firm uses.
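To make the tiering idea concrete, here is a minimal sketch that routes data to different storage tiers according to its access profile. The bucket name and the mapping of data types to tiers are assumptions, with Amazon S3 storage classes standing in for the hot-versus-cold (SAS-versus-SATA) split described above.

```python
# Illustrative only: routing data to different storage tiers by access profile.
# The bucket name and tier mapping are placeholders.
import boto3

s3 = boto3.client("s3")

TIER_BY_DATA_TYPE = {
    "model_inputs": "STANDARD",       # frequently read by back-tests
    "compliance_archive": "GLACIER",  # rarely read, retained for regulators
}

def store(bucket: str, key: str, body: bytes, data_type: str) -> None:
    """Write an object to the storage class appropriate for its data type."""
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=body,
        StorageClass=TIER_BY_DATA_TYPE[data_type],
    )

store("example-fund-data", "archives/2013/emails.tar.gz", b"...", "compliance_archive")
```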

3. Convenience

Firms that address their big data challenges by implementing an on-site solution also need to invest in on-site staff and tools to provide assistance, monitor and address any issues, and ensure the data is properly backed up.

Likewise, a provider must employ an experienced support team that can offer 24×7 assistance, storage management and monitoring, as well as flexible back-up scheduling. Firms that use third-party providers benefit from economies of scale that allow the provider to deploy enterprise-level solutions and services whose costs would be prohibitive for a typical single hedge fund.
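A firm running storage in-house typically ends up writing and babysitting jobs like the hedged sketch below: a nightly sync to off-site storage with a hook to alert support staff on failure. The source path, bucket and alerting mechanism are all hypothetical; with a provider, this plumbing and the 24×7 monitoring around it are part of the service.

```python
# Illustrative only: a nightly backup job with a basic monitoring hook.
# Paths and the bucket name are placeholders.
import logging
import pathlib
import boto3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("nightly-backup")
s3 = boto3.client("s3")

def backup_directory(source: str, bucket: str, prefix: str) -> None:
    """Upload every file under `source` to the backup bucket, logging failures."""
    failures = 0
    for path in pathlib.Path(source).rglob("*"):
        if not path.is_file():
            continue
        key = f"{prefix}/{path.relative_to(source)}"
        try:
            s3.upload_file(str(path), bucket, key)
        except Exception:
            failures += 1
            log.exception("Upload failed for %s", path)
    if failures:
        log.error("Backup finished with %d failures - alert the support team", failures)
    else:
        log.info("Backup completed cleanly")

backup_directory("/data/tick", "example-fund-backups", "tick/nightly")
```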

4. Security

When it comes to data, firms need to know that it is protected and will be available whenever it is needed.

To accomplish these “simple” goals, providers need enterprise-level solutions that both segment data across multiple drives and deliver comprehensive disaster recovery by replicating the data to geographically diverse data centers. To give firms peace of mind, providers need not only to enact policies and procedures that ensure the security of the data at all levels, but also to engage third-party auditors to certify those policies and procedures on a regular basis against industry standards such as SAS 70 and SSAE 16.
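By way of illustration, geographic replication can be as simple as a one-time configuration that copies every object written to a primary bucket into a bucket in a second region. The bucket names, IAM role and regions below are placeholders (and, in the S3 case assumed here, versioning must already be enabled on both buckets).

```python
# Illustrative only: cross-region replication for disaster recovery.
# Bucket names and the role ARN are placeholders; versioning must be
# enabled on both the source and destination buckets beforehand.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_replication(
    Bucket="example-fund-primary",  # source bucket in the primary region
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/replication-role",
        "Rules": [
            {
                "ID": "dr-copy",
                "Status": "Enabled",
                "Prefix": "",  # replicate everything
                "Destination": {"Bucket": "arn:aws:s3:::example-fund-dr"},
            }
        ],
    },
)
```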

Data requirements will always outgrow the initial equipment deployed to manage storage on site. Cloud providers offer solutions that address this challenge effectively: the right cloud provider will supply the needed “on-demand” capacity in an economical manner, on a highly available, secure and redundant platform maintained by experienced staff. Such solutions can easily grow with firms’ needs, ultimately meeting their critical storage demands while allowing them to focus on their core business.

