Thomson Sets Central Role for QAI Platform In Reference Data Delivery Strategy

Thomson Financial’s acquisition of Quantitative Analytics Inc. (QAI), which closed earlier this year, gave the company a product aimed squarely at the quantitative analyst marketplace. But the QAI platform is destined to play a more significant role as the company crystallizes its strategy for distribution of reference data.

Thomson Financial has spent the past couple of years building on its various acquired content sets to offer a reference data solution, an initiative spearheaded by Thomas Aubrey, investment management director, who was drafted in from Interactive Data Corp. a couple of years ago. While it has long been acknowledged that Thomson holds valuable content, it is only the recent development of its Thomson Data Feed technology, and a sharper focus on the segment, that has enabled it to play in the reference data space.

Says Aubrey: “A couple of years ago, we pushed the feed and got traction, and realised this was a big business that we had to be in.”

Although Thomson is not often cited by the user community as a reference data supplier, Aubrey maintains, “We are very much in the game.” His view is that reference data is much broader than content aimed solely at the back office, covering any data outside of real-time streaming prices. Many institutions, of course, recognize the need for and are implementing centralized infrastructures to manage data supporting functions across the enterprise – front- and middle-office as well as back-office – but do not necessarily adopt such a broad definition of reference data.

Aubrey says that Thomson has a presence within the centralized ‘data warehouse’ space, which it hopes to expand with its more focused approach. But an essential part of Thomson’s strategy is to work with the “peripheral departments” whose data requirements aren’t necessarily met by the central data infrastructure. He cites quantitative analysis, performance attribution, portfolio evaluations and risk management as examples of departments needing more data than is often supplied centrally.

This focus on peripheral departments, says Aubrey, was one of the key rationales for acquiring QAI. Its database technology can be used to “help small teams, who aren’t getting the data they need from centralized warehouses, to manage reference data”.

Longer term, it is envisaged that the QAI platform will supersede Thomson’s existing feed technology. “This put us in a position where we were able to develop the next generation of integrated data feeds,” says Aubrey. “A lot of our data was already available on the platform through our (prior) distribution arrangement, but we have lots of things on the horizon.”

The platform is SQL-based and built around an internal data model, and it handles Thomson content, third-party vendor-sourced data and proprietary data.
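Thomson has not published the underlying schema, but a minimal sketch – assuming a hypothetical two-table layout in which each reference field is tagged with its source and an as-of date – illustrates how an SQL-based store can hold vendor and proprietary reference data side by side for a small analytics team:

```python
# Hypothetical sketch only: table and column names are illustrative,
# not Thomson/QAI's actual data model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE instrument (
    instrument_id INTEGER PRIMARY KEY,
    identifier    TEXT NOT NULL,   -- e.g. an ISIN or internal code
    description   TEXT
);
CREATE TABLE reference_item (
    instrument_id INTEGER REFERENCES instrument(instrument_id),
    field         TEXT NOT NULL,   -- e.g. 'coupon', 'maturity_date'
    value         TEXT,
    source        TEXT NOT NULL,   -- 'thomson', 'third_party' or 'proprietary'
    as_of         TEXT NOT NULL    -- point-in-time stamp for history
);
""")

# Load a small mixed sample: vendor-sourced fields alongside an in-house field.
conn.execute("INSERT INTO instrument VALUES (1, 'XS0000000000', 'Sample bond')")
conn.executemany(
    "INSERT INTO reference_item VALUES (?, ?, ?, ?, ?)",
    [
        (1, "coupon", "5.25", "thomson", "2005-06-01"),
        (1, "maturity_date", "2015-06-01", "third_party", "2005-06-01"),
        (1, "internal_rating", "A2", "proprietary", "2005-06-01"),
    ],
)

# A quant, performance or risk team can then query across sources in plain SQL.
for row in conn.execute(
    "SELECT field, value, source FROM reference_item WHERE instrument_id = 1"
):
    print(row)
```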

Aubrey says: “There are lots of areas we are moving into within the reference data space.” Among the offerings in the pipeline, he says, is a tick-by-tick database with history. “Tick data is expensive, but the acquisition of technology (through QAI) will enable us to make the data more broadly available to the market.” Thomson plans to provide both ongoing and historical tick data.

Thomson’s view of reference data is that the world has moved on since the 12 points of data were defined back in the days of the regulatory drive for straight-through processing. As such, says Aubrey, Thomson includes in the definition all non-real-time referential data – economics, fundamental data, historical data and quantitative analytics – typically used by front-office and middle-office functions. “It’s no longer just about coded corporate actions, it’s about workflow across the enterprise.”

Thomson is also taking a different approach to its client implementations. “We are not just putting in a big pipe of data and leaving the client to it,” says Aubrey. “We are taking a consultative approach, which although it takes longer, improves our credibility with clients. No-one knows the data like the vendor does and so we can enable the client to get the best value out of our data and make them aware of the possibilities through a consultative approach. It is essential that we understand the end-user requirements and how the data is powering different applications.”

The vendor has been making more noise about its reference data offering recently. Earlier this year it launched the Intraday Snapshot Service (ISS). Here, Aubrey says, Thomson wanted to improve its intraday data and help with consistency across the organization. Clients of Linedata Services’ Icon portfolio management offering – which was formerly owned by Thomson – are large consumers of the data.

And last month it sealed a deal – as exclusively reported in last month’s issue of Reference Data Review – with Standard & Poor’s to carry the latter’s evaluations pricing data. Thomson recognized demand from its customers for more coverage of illiquid fixed income securities, which Aubrey reckons is being driven by the growing diversity of instruments being traded, particularly structured debt. The company considered building a solution itself, but time to market ruled out that option, Aubrey says. Having evaluated a number of third-party solutions, Thomson considered Standard & Poor’s to have the right coverage of very illiquid securities, including new areas of structured finance, and the right strategic fit, he says.
