Thomson Sets Central Role for QAI Platform In Reference Data Delivery Strategy

Thomson Financial’s acquisition of Quantitative Analytics Inc. (QAI), which closed earlier this year, gave the company a product aimed squarely at the quantitative analyst marketplace. But the QAI platform is destined to play a more significant role as the company crystallizes its strategy for distribution of reference data.

Thomson Financial has spent the past couple of years building on its various acquired content sets to offer a reference data solution, an initiative spearheaded by Thomas Aubrey, investment management director, who was drafted in from Interactive Data Corp. a couple of years ago. While it has long been acknowledged that Thomson has some valuable content, it is only the recent development of its Thomson Data Feed technology, and a sharpened focus on the area, that has enabled it to play in the reference data space.

Says Aubrey: “A couple of years ago, we pushed the feed and got traction, and realised this was a big business that we had to be in.”

Although Thomson is not often cited by the user community as a reference data supplier, Aubrey maintains, “We are very much in the game.” His view is that reference data is much broader than content aimed solely at the back office, covering any data outside of real-time streaming prices. Many institutions, of course, recognize the need for, and are implementing, centralized infrastructures to manage data supporting functions across the enterprise – front- and middle-office as well as back-office – but do not necessarily adopt such a broad definition of reference data.

Aubrey says that Thomson has a presence within the centralized ‘data warehouse’ space, which it hopes to expand with its more focused approach. But an essential part of Thomson’s strategy is to work with the “peripheral departments” whose data requirements aren’t necessarily met by the central data infrastructure. He cites quantitative analysis, performance attribution, portfolio evaluations and risk management as examples of departments needing more data than is often supplied centrally.

This focus on peripheral departments, says Aubrey, was one of the key rationales for acquiring QAI. Its database technology can be used to “help small teams, who aren’t getting the data they need from centralized warehouses, to manage reference data”.

Longer term, it is envisaged that the QAI platform will supersede Thomson’s existing feed technology. “This put us in a position where we were able to develop the next generation of integrated data feeds,” says Aubrey. “A lot of our data was already available on the platform through our (prior) distribution arrangement, but we have lots of things on the horizon.”

The platform is SQL-based with an embedded data model, and handles Thomson, third-party vendor-sourced and proprietary data.
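
To make that concrete, the sketch below shows what a simple SQL-backed reference data store of this general kind might look like, using Python’s built-in sqlite3 module. The table layout, field names and source labels are illustrative assumptions, not Thomson’s actual schema; the point is simply how vendor, third-party and proprietary records can coexist in one relational model and be queried together by a consuming department.

```python
import sqlite3

# In-memory database standing in for the SQL-based platform (schema is hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE reference_data (
        instrument_id TEXT NOT NULL,   -- e.g. an ISIN or internal identifier
        field         TEXT NOT NULL,   -- attribute name, e.g. 'coupon', 'sector'
        value         TEXT NOT NULL,
        source        TEXT NOT NULL,   -- 'thomson', 'third_party' or 'proprietary'
        as_of         TEXT NOT NULL,   -- date the value applies to
        PRIMARY KEY (instrument_id, field, source, as_of)
    )
""")

# Load records from the three kinds of source the article mentions.
rows = [
    ("US0378331005", "sector",     "Technology", "thomson",     "2007-06-01"),
    ("US0378331005", "coupon",     "0.00",       "third_party", "2007-06-01"),
    ("US0378331005", "int_rating", "A1",         "proprietary", "2007-06-01"),
]
conn.executemany("INSERT INTO reference_data VALUES (?, ?, ?, ?, ?)", rows)

# A consuming department (e.g. risk or performance attribution) queries across all sources.
for field, value, source in conn.execute(
    "SELECT field, value, source FROM reference_data WHERE instrument_id = ?",
    ("US0378331005",),
):
    print(field, value, source)
```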

Aubrey says: “There are lots of areas we are moving into within the reference data space.” Among other deals coming up, he says, is a tick-by-tick database offering with history. “Tick data is expensive, but the acquisition of technology (through QAI) will enable us to make the data more broadly available to the market.” Thomson plans to provide ongoing and historical tick data.
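
At its simplest, a tick-by-tick history service of the kind described stores timestamped ticks and lets users pull back a time window for replay or analysis. The sketch below is a hypothetical illustration of that pattern in Python with sqlite3; the table structure and field names are assumptions for illustration, not a description of Thomson’s planned product.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ticks (
        symbol TEXT NOT NULL,
        ts     TEXT NOT NULL,     -- ISO-8601 timestamp of the tick
        price  REAL NOT NULL,
        size   INTEGER NOT NULL
    )
""")
conn.execute("CREATE INDEX idx_ticks ON ticks (symbol, ts)")

# A few sample ticks; a real history would hold every trade and quote.
conn.executemany(
    "INSERT INTO ticks VALUES (?, ?, ?, ?)",
    [
        ("XYZ", "2007-06-01T09:30:00.120", 101.25, 300),
        ("XYZ", "2007-06-01T09:30:01.450", 101.27, 150),
        ("XYZ", "2007-06-01T09:30:02.010", 101.26, 500),
    ],
)

# Historical query: all ticks for a symbol within a window, oldest first.
start, end = "2007-06-01T09:30:00", "2007-06-01T09:31:00"
for symbol, ts, price, size in conn.execute(
    "SELECT symbol, ts, price, size FROM ticks "
    "WHERE symbol = ? AND ts BETWEEN ? AND ? ORDER BY ts",
    ("XYZ", start, end),
):
    print(ts, price, size)
```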

Thomson’s view of reference data is that the world has moved on since the 12 points of data were defined back in the days of the regulatory drive for straight-through processing. As such, says Aubrey, Thomson includes in the definition all non-real-time referential data such as economics, fundamental data, historical data and quantitative analytics, typically used by front-office and middle-office functions. “It’s no longer just about coded corporate actions, it’s about workflow across the enterprise.”

Thomson is also taking a different approach to its client implementations. “We are not just putting in a big pipe of data and leaving the client to it,” says Aubrey. “We are taking a consultative approach, which although it takes longer, improves our credibility with clients. No-one knows the data like the vendor does and so we can enable the client to get the best value out of our data and make them aware of the possibilities through a consultative approach. It is essential that we understand the end-user requirements and how the data is powering different applications.”

The vendor has been making more noise about its reference data offering recently. Earlier this year it launched the Intraday Snapshot Service (ISS). Here, Aubrey says, Thomson wanted to improve its intraday data and help with consistency across the organization. Clients of Linedata Services’ Icon portfolio management offering – which was formerly owned by Thomson – are large consumers of the data.

And last month it sealed a deal – as exclusively reported in last month’s issue of Reference Data Review – with Standard & Poor’s to carry the latter’s evaluations pricing data. Thomson recognized a need among its customers for more coverage of illiquid fixed income securities, which Aubrey reckons is being driven by the growing diversity of instruments being traded, particularly structured debt. The company considered building a solution itself, but time to market ruled out that option, Aubrey says. Having evaluated a number of third-party solutions, Thomson considered Standard & Poor’s to have the right coverage of very illiquid securities, including new areas of structured finance, and to be the right strategic fit, he says.
