
A-Team Webinar to Discuss Data Transformation in Quant Research and Trading


Quantitative workflows rely on sourcing, aggregating, normalising and managing data. This can be highly resource-intensive, leaving quant teams at some financial institutions more focused on data management than on data science and modelling.

So how can the data ingestion, management and pipeline processes of quant workflows be streamlined, allowing quants to concentrate on their primary responsibilities: building and testing models, exploring alternative data sources, accessing available data models and libraries, and ultimately generating alpha?

This will be the subject of A-Team Group’s upcoming webinar on 14th March 2023, ‘Transforming Data Experiences in Quantitative Research and Trading’, featuring Snowflake’s James McGeehan, Industry Principal, Financial Services, and Bryan Lenker, Industry Field CTO, Financial Services.

“The data wrangling challenge in data science is significant,” says McGeehan. “With the increasing volume, variety, velocity, and veracity of data, having the ability to process, test and run transactional and analytical workloads and to speed up time to value is crucial, particularly with the real-time data sets needed in the financial markets space. Being able to simplify infrastructure, break down data silos, and enable quick insights is becoming increasingly essential.”

The webinar will look at how leading firms are transforming their data architectures and leveraging native application frameworks to access more data, power quantitative models, uncover unique insights and ultimately add value to their end users and customers.

“One of the key challenges is accessing and utilising multiple sources of data,” says Lenker. “Legacy technology platforms are unable to manage this end-to-end. Thus, organisations need to adopt a technology stack that enables users to access, build models with, and share data efficiently. Instead of relying on multiple tools and channels, organisations should approach their enterprise data strategy as a holistic entity and consider how and where the data output will be shared, internally or externally.”

“Multiple taxonomies and physical data movement cause redundant copies, stale data, and incomplete analysis,” adds McGeehan. “Modern application frameworks that unify data intelligence without physical movement, bringing intelligence to data in a secure environment, are key to achieving faster investment research, reducing data management, promoting security and governance, and enabling quicker monetisation.”

Please join us for what we expect will be an informative and enlightening discussion.

