About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Webinar to Discuss Data Transformation in Quant Research and Trading

Subscribe to our newsletter

Quantitative workflows rely on sourcing, aggregating, normalising and managing data. This can be highly resource-intensive, leaving some financial institutions with quant shops more focused on data management than on data science and modelling.

So how can the data ingestion, management, and pipeline processes of quant workflows be streamlined, allowing quants to concentrate on their primary responsibilities such as building models, testing them, exploring alternative data sources, accessing available data models and libraries, and ultimately generating alpha?

This will be the subject of A-Team Group’s upcoming webinar on 14th March 2023, ‘Transforming Data Experiences in Quantitative Research and Trading’, featuring James McGeehan, Industry Principal, Financial Services, and Bryan Lenker, Industry Field CTO, Financial Services, at Snowflake.

“The data wrangling challenge in data science is significant,” says McGeehan. “With the increasing volume, variety, velocity, and veracity of data, having the ability to process, test and run transactional and analytical workloads and to speed up time to value is crucial, particularly with the real-time data sets needed in the financial markets space. Being able to simplify infrastructure, break down data silos, and enable quick insights is becoming increasingly essential.”

The webinar will look at how leading firms are transforming their data architectures and leveraging native application frameworks to access more data, power quantitative models, uncover unique insights and ultimately add value to their end users and customers.

“One of the key challenges is accessing and utilising multiple sources of data,” says Lenker. “Legacy technology platforms are unable to manage this end-to-end. Thus, organisations need to adopt a technology stack that enables users to access, build models with, and share data efficiently. Instead of relying on multiple tools and channels, organisations should approach their enterprise data strategy as a holistic entity and consider how and where the data output will be shared, internally or externally.”

“Multiple taxonomies and physical data movement cause redundant copies, stale data, and incomplete analysis,” adds McGeehan. “Modern application frameworks that unify data intelligence without physical movement, bringing intelligence to data in a secure environment, are key to achieving faster investment research, reducing data management, promoting security and governance, and enabling quicker monetisation.”

Please join us for what we expect will be an informative and enlightening discussion.

