The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Interactive Data’s Plans for Continuous Evaluated Pricing

Interactive Data, which has historically had strong roots in the back office with its end-of-day pricing services, has been ramping up to provide streaming evaluated bond prices in the form of its Continuous Evaluated Pricing (CEP) service and is seeking to expand its reach into core middle and front office functions within the financial enterprise. The pricing service, which was introduced in the US in late 2014, has now been introduced into the European market.

CEP is aimed at extending the reach of fixed income end-of-day evaluations into intraday applications for things like pre-trade transparency, price discovery, enhanced trading workflow, best execution analysis and more. It covers EMEA corporates and sovereigns, EMEA money markets, US corporate bonds, US treasuries, US agency debentures and TBA MBS & MBS Pass-throughs.

Use Cases for Continuous Evaluated Pricing

There are many potential uses for CEP across the financial enterprise, according to Bill Gartland, Interactive Data’s Senior Director, Evaluated Services.

Bill says, “We have seen strong demand for this service and have a significant number of financial organisations testing and implementing it.”

One of the strongest propositions of Continuous Evaluated Pricing, he says, is in exchange traded funds, where users can monitor pricing changes in fixed income assets just as they do for the underlying equities. “We’re also getting interest from the fund sponsors themselves, who feel that having more accurate intraday NAV values removes uncertainty among market makers and encourages them to keep tighter bid/ask spreads, which in turn can attract more assets.”
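The intraday NAV calculation Gartland alludes to is, at its simplest, the portfolio value at current evaluated prices divided by shares outstanding. A minimal sketch (illustrative only, not Interactive Data’s methodology; all names and figures are invented):

```python
def intraday_nav(holdings, prices, shares_outstanding, cash=0.0):
    """Indicative NAV per share from the latest evaluated prices.

    holdings: {bond_id: quantity held}
    prices:   {bond_id: latest evaluated price per unit}
    """
    portfolio_value = cash + sum(qty * prices[b] for b, qty in holdings.items())
    return portfolio_value / shares_outstanding

# Hypothetical two-bond fund revalued as evaluated prices update intraday.
holdings = {"BOND_A": 1_000, "BOND_B": 2_000}
prices = {"BOND_A": 101.50, "BOND_B": 99.25}
print(round(intraday_nav(holdings, prices, shares_outstanding=10_000), 4))  # → 30.0
```

Each time a new evaluated price arrives for a holding, the fund sponsor can republish this indicative NAV rather than waiting for the end-of-day mark.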

He says, “We’re now starting to engage more with the buy-side and more generally with fixed income operations and we’re working to develop the tools we need to build around CEP to make it operationally effective for them. We see significant potential here.”

Interactive Data is also exploring post-trade Transaction Cost Analysis (TCA) for fixed income, particularly in corporate bonds. Bill says, “Here, the analysis tries to capture how much slippage there is in the implementation of a trade – the difference between the arrival price of an asset, when the order was received, and the price at which it executed. It’s hard to measure what is happening to the price over this window of 15 to 20 minutes, and we can help.”
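As a rough illustration of the measurement Gartland describes (a sketch under generic TCA conventions, not Interactive Data’s actual methodology), slippage over the arrival-to-execution window is typically expressed in basis points against the arrival-time reference price:

```python
def slippage_bps(arrival_price, execution_price, side):
    """Implementation shortfall in basis points.

    arrival_price:   reference price when the order was received
    execution_price: price actually achieved
    side:            +1 for a buy (paying up hurts), -1 for a sell
    """
    return side * (execution_price - arrival_price) / arrival_price * 10_000

# A buy order with a 101.25 arrival reference, filled 20 minutes
# later at 101.40 -> roughly 14.8 bps of slippage.
print(round(slippage_bps(101.25, 101.40, side=+1), 1))  # → 14.8
```

A continuously updating evaluated price gives the arrival-time reference that this calculation needs, which is exactly what is hard to obtain for infrequently traded bonds.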

He also suggests CEP can be used as a best execution tool. “They might now send out an RFQ to five or six dealers when they want to buy or sell something, but this is not in the best interest of the fund. If they take our price as one of the reference prices, they can reduce the number of dealers they reach out to. If those prices are consistent with our price, we can be the tie-breaker that justifies their obligation for best execution.”
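The tie-breaker check Gartland describes amounts to testing dealer quotes against an independent reference price within some tolerance band. A hypothetical sketch (the function name and the 25 bps tolerance are invented for illustration):

```python
def consistent_quotes(dealer_quotes, reference_price, tolerance_bps=25):
    """Return the dealer quotes that sit within `tolerance_bps`
    of an independent reference (evaluated) price."""
    band = reference_price * tolerance_bps / 10_000
    return [q for q in dealer_quotes if abs(q - reference_price) <= band]

# Three dealer quotes checked against an evaluated price of 98.50:
# the 99.10 outlier falls outside the 25 bps band.
print(consistent_quotes([98.45, 98.52, 99.10], 98.50))  # → [98.45, 98.52]
```

If most quotes land inside the band, the desk has evidence for its best execution file that the price achieved was in line with the independent evaluation, even with fewer dealers in competition.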

Another area is risk management, where a regularly updating stream of bond prices can make mark-to-market valuations more accurate.

Better Equipping the Evaluators

To deliver the broader range and increased frequency of pricing, Interactive Data has shifted the emphasis of experience in its team of evaluators – just over 200 globally, based in the US, London, Frankfurt, Hong Kong and Sydney – and equipped them with better tools to get the job done.

Bill says, “The knowledge and experience of the evaluators is key. Rather than relying on models to create synthetic prices, we believe in our approach, which relies on the largest team of evaluators in our industry. They have backgrounds in trading, portfolio management and fixed income sales, and they use that knowledge to update their opinion about market values and to inform us about how investors are valuing the different bonds and issuers. We take that knowledge, make adjustments to our evaluated prices and then publish them through the day, reporting changes as they happen.

“We’ve built a new dashboard tool that guides evaluators through the process of coming up with a value. It has a rules engine that triggers reviews based on set tolerances – at sector, issuer and CUSIP levels. When evaluators get an alert they can look in more depth and react accordingly, checking things like whether a bond is moving as you’d expect in line with other bonds from that issuer. We look at news stories and we leverage our network of contacts with people who are actively trading in that space.”
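The tolerance-based review trigger described above can be sketched as a simple rules check over observed price moves. This is a hypothetical illustration in the spirit of the dashboard – the level names, thresholds and identifiers are all invented:

```python
# Invented tolerance thresholds (bps) per aggregation level.
TOLERANCES_BPS = {"sector": 50, "issuer": 30, "cusip": 15}

def review_alerts(price_moves_bps):
    """price_moves_bps maps (level, key) -> observed move in bps.
    Returns the entries whose move breaches that level's tolerance,
    i.e. the items an evaluator should review."""
    return [
        (level, key, move)
        for (level, key), move in price_moves_bps.items()
        if abs(move) > TOLERANCES_BPS[level]
    ]

moves = {
    ("sector", "US Industrials"): 20,  # within the 50 bps sector tolerance
    ("issuer", "ACME Corp"): 45,       # breaches the 30 bps issuer tolerance
    ("cusip", "000000AA1"): 12,        # within the 15 bps CUSIP tolerance
}
print(review_alerts(moves))  # → [('issuer', 'ACME Corp', 45)]
```

Only the breach surfaces as an alert, so evaluators spend their time on the names that are genuinely moving out of line.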

“The new system now acts more like a modern market data terminal receiving data in real-time and then allowing us to publish data directly onto the message bus and into downstream systems,” says Bill.

Interactive Data has also hired a small team of sales specialists for CEP who have industry contacts and are working side-by-side with the existing sales people.

Leveraging the Private Backers’ Investment

The continuous pricing capability is a critical output of the re-architected systems at Interactive Data (read more about that here), in which its private investor backers, Silver Lake and Warburg Pincus, have invested heavily.

Gartland says, “We’re now equipped with a modern technology framework that ties together all Interactive Data’s content properties [acquired over the years] that enables us to meet the new demands of the marketplace.”

Gartland previously worked at Tradeweb with its then CEO Jim Toffey, and later started Benchmark Solutions, which focused on a models-based approach. However, he says, “It’s very difficult to write formulas to capture all the nuances that people accumulate over the years” – hence the focus on his team of evaluators.
