
A-Team Insight Blogs

Managing Big Data Woes: A How-to Guide for Hedge Funds


By Mark Kenney, Abacus Group

The explosion of data over the last several years, coupled with dynamic compliance requirements, has put excessive stress on many firms’ storage capabilities. With the need to juggle multiple strategies, multiple funds and multiple prime brokers, it’s not uncommon for today’s hedge funds to deal with many terabytes of data.

In the past, most firms didn’t have the infrastructure or capital to properly store, back up and monitor “big data.” Their only option was to keep it on site by implementing, or adding capacity to, an in-house storage solution – absorbing the cost of not only the high-ticket hardware and software, but also the staff with the knowledge to keep everything running.

But with “the cloud” coming of age, firms now have access to the substantial capacity needed for big data tasks, such as back-testing large quantitative models, seamlessly managing multiple market data feeds, and storing and backing up tick-level data to meet regulatory requirements.

We have identified the following four essential qualities that firms require from technology providers to alleviate their big data challenges:

1. Scalability

Firms should be concerned with running their business, not trying to determine how much storage they need now – let alone forecasting future data requirements.

Providers should be able to supply large-capacity, burstable storage that lets funds scale capacity up or down instantly, based on their immediate requirements. In addition, providers should be able to supply access to required market data feeds on the fly.
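To make the capacity-planning point concrete, here is a minimal sketch of the kind of headroom calculation a fund might run before deciding when to burst to additional storage. All figures (provisioned capacity, usage, growth rate) are illustrative assumptions, not real fund data or vendor limits.

```python
# Hypothetical sketch: estimating how long a fund's provisioned storage
# will last, assuming a steady daily growth in tick data. Illustrative
# numbers only -- not vendor pricing or real fund data.

def days_until_capacity(provisioned_tb: float,
                        used_tb: float,
                        daily_growth_gb: float) -> int:
    """Return the whole days of headroom left before storage is full."""
    headroom_gb = (provisioned_tb - used_tb) * 1024
    if daily_growth_gb <= 0:
        raise ValueError("daily growth must be positive")
    return int(headroom_gb // daily_growth_gb)

# A fund with 50 TB provisioned, 42 TB used, adding ~40 GB of tick data a day:
print(days_until_capacity(50, 42, 40))  # → 204
```

With burstable cloud storage, the answer to “what happens on day 205?” becomes an API call rather than a hardware procurement cycle.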

2. Efficiency

Firms generate different types of data that, inherently, have different requirements. Managing the quantitative models on which investment decisions are made requires more immediate access than archiving the messages or documents retained for compliance reasons. Providers should be able to supply multiple types of scalable, enterprise-level storage architectures, such as serial ATA (SATA) and serial attached SCSI (SAS). Further, providers should offer tiered pricing models that take into account how much, and what type(s), of storage a firm uses.
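A tiered pricing model of this kind can be sketched in a few lines. The tier names and per-terabyte rates below are illustrative assumptions for the sake of the example, not actual provider pricing.

```python
# Hypothetical sketch of tiered storage pricing: fast (SAS-backed) storage
# for live quantitative models costs more per TB than slower (SATA-backed)
# archive storage for compliance data. Rates are illustrative assumptions.

TIER_RATES_PER_TB = {
    "hot": 120.0,   # SAS-backed storage for live quantitative models
    "cold": 30.0,   # SATA-backed archive for compliance messages/documents
}

def monthly_cost(usage_tb: dict) -> float:
    """Sum monthly cost across tiers; an unknown tier raises a KeyError."""
    return sum(TIER_RATES_PER_TB[tier] * tb for tier, tb in usage_tb.items())

# 5 TB of hot model data plus 40 TB of cold compliance archives:
print(monthly_cost({"hot": 5, "cold": 40}))  # → 1800.0
```

The design point is simply that matching each data type to the cheapest tier that meets its access requirements keeps the bill proportional to what the firm actually needs.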

3. Convenience

Firms that handle their big data challenges by implementing an on-site solution must also invest in the on-site staff and tooling needed to provide support, monitor and address any issues, and ensure the data is properly backed up.

Likewise, a provider must employ an experienced support team that can offer 24×7 assistance, storage management and monitoring, and flexible back-up scheduling. Firms that use third-party providers benefit from economies of scale that allow the provider to deploy enterprise-level solutions and services whose costs would be prohibitive for a typical single hedge fund.

4. Security

When it comes to data, firms need to know that it is protected and will be available whenever it is needed.

To accomplish these “simple” goals, providers need enterprise-level solutions that both segment data across multiple drives and deliver comprehensive disaster recovery by replicating the data to geographically diverse data centres. To give firms peace of mind, providers must not only enact policies and procedures that ensure the security of the data at all levels, but also engage third-party auditors to certify those policies and procedures on a regular basis against industry standards such as SAS 70 and SSAE 16.
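One routine check behind such disaster-recovery replication is verifying that the replica actually matches the primary copy. A minimal sketch, using standard cryptographic digests (the sample payload and the idea of a second data-centre copy are illustrative assumptions):

```python
# Hypothetical sketch: confirming that a backup replica held in a second
# data centre is byte-identical to the primary copy by comparing SHA-256
# digests. The payload below stands in for a fetched backup object.

import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a data blob."""
    return hashlib.sha256(data).hexdigest()

primary_copy = b"tick-level trade data, 2013-06-01"
replica_copy = b"tick-level trade data, 2013-06-01"  # from the DR site

print(digest(primary_copy) == digest(replica_copy))  # → True
```

Running this kind of integrity check routinely, rather than only at restore time, is what turns replication from a checkbox into a dependable recovery guarantee.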

Data requirements will always outgrow the equipment initially deployed to manage them on site. Cloud providers now offer solutions that address this challenge effectively. The right cloud provider will deliver the needed on-demand capacity economically, on a highly available, secure and redundant platform maintained by experienced staff. Such solutions can easily grow with a firm’s needs, meeting its critical storage demands while allowing it to focus on its core business.
