A-Team Webinar to Discuss Data Transformation in Quant Research and Trading

Quantitative workflows rely on sourcing, aggregating, normalising and managing data. This can be highly resource-intensive, to the point where some financial institutions’ quant teams spend more time on data management than on data science and modelling.

So how can the data ingestion, management and pipeline processes of quant workflows be streamlined, allowing quants to concentrate on their primary responsibilities: building and testing models, exploring alternative data sources, accessing available data models and libraries, and ultimately generating alpha?

This will be the subject of A-Team Group’s upcoming webinar on 14th March 2023, ‘Transforming Data Experiences in Quantitative Research and Trading’, featuring James McGeehan, Industry Principal, Financial Services, and Bryan Lenker, Industry Field CTO, Financial Services, both at Snowflake.

“The data wrangling challenge in data science is significant,” says McGeehan. “With the increasing volume, variety, velocity, and veracity of data, having the ability to process, test and run transactional and analytical workloads and to speed up time to value is crucial, particularly with the real-time data sets needed in the financial markets space. Being able to simplify infrastructure, break down data silos, and enable quick insights is becoming increasingly essential.”

The webinar will look at how leading firms are transforming their data architectures and leveraging native application frameworks to access more data, power quantitative models, uncover unique insights and ultimately add value to their end users and customers.

“One of the key challenges is accessing and utilising multiple sources of data,” says Lenker. “Legacy technology platforms are unable to manage this end-to-end. Thus, organisations need to adopt a technology stack that enables users to access, build models with, and share data efficiently. Instead of relying on multiple tools and channels, organisations should approach their enterprise data strategy as a holistic entity and consider how and where the data output will be shared, internally or externally.”

“Multiple taxonomies and physical data movement cause redundant copies, stale data, and incomplete analysis,” adds McGeehan. “Modern application frameworks that unify data intelligence without physical movement, bringing intelligence to data in a secure environment, are key to achieving faster investment research, reducing data management, promoting security and governance, and enabling quicker monetisation.”

Please join us for what we expect will be an informative and enlightening discussion.

Related content

WEBINAR

Upcoming Webinar: Leveraging interoperability: Laying the foundations for unique best-of-breed trading solutions

Date: 23 May 2024 | Time: 10:00am ET / 3:00pm London / 4:00pm CET | Duration: 50 minutes

Interoperability on the trading desk promises more actionable insights, real-time decision making, faster workflows and reduced errors by ensuring data consistency across frequently used applications. But how can these promises be kept in an environment characterised by multiple applications...

BLOG

big xyt Partners with Baillie Gifford to Launch Portfolio Liquidity Analysis Solution for Dilution Levy Calculation

big xyt, the independent data and analytics solutions provider, has launched a new tool to automate the calculation of dilution levies. The Portfolio Liquidity Analysis solution, developed in collaboration with Baillie Gifford, is designed to enhance buy-side firms’ understanding of equity portfolio liquidity and to address the forthcoming industry guidance on the application of dilution levies....

EVENT

ESG Data & Tech Summit London

The ESG Data & Tech Summit will explore challenges around assembling and evaluating ESG data for reporting and the impact of regulatory measures and industry collaboration on transparency and standardisation efforts. Expert speakers will address how the evolving market infrastructure is developing and the role of new technologies and alternative data in improving insight and filling data gaps.

GUIDE

Enterprise Data Management

The current financial crisis has highlighted that financial institutions do not have a sufficient handle on their data and has prompted many of these institutions to re-evaluate their approaches to data management. Moreover, the increased regulatory scrutiny of the financial services community during the past year has meant that data management has become a key...