Bloomberg has broken new ground with the release of its Real-time Events Data solution, which it says will help financial institutions make faster, better-informed decisions based on accurate and timely information.
The US financial data and technology behemoth has leveraged its real-time streaming API connectivity to provide subscribing clients with data from earnings reports, news bulletins and other sources the moment they are released.
As well as alerting users to those events and their details, the company’s automated data collection technology extracts the information most pertinent to each user, which can then trigger pre-programmed responses such as initiating trades or adjusting risk models.
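Bloomberg has not published the mechanics of these triggers, but the pattern described resembles a conventional event-handler pipeline. The Python sketch below is purely illustrative: the EventMessage fields, the handler names and the event types are assumptions, not Bloomberg’s schema or API.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical event message; field names are illustrative, not Bloomberg's.
@dataclass
class EventMessage:
    ticker: str
    event_type: str   # e.g. "EARNINGS", "NEWS"
    field: str        # e.g. "EPS_SURPRISE_PCT"
    value: float

def adjust_risk_model(event: EventMessage) -> None:
    print(f"recalculating risk: {event.ticker} {event.field}={event.value}")

def initiate_trade(event: EventMessage) -> None:
    print(f"submitting order: {event.ticker}")

# Pre-programmed responses keyed by event type: the article's "trigger" idea.
HANDLERS: dict[str, list[Callable[[EventMessage], None]]] = {
    "EARNINGS": [adjust_risk_model, initiate_trade],
    "NEWS": [adjust_risk_model],
}

def on_event(event: EventMessage) -> None:
    for handler in HANDLERS.get(event.event_type, []):
        handler(event)

on_event(EventMessage("XYZ US Equity", "EARNINGS", "EPS_SURPRISE_PCT", 4.2))
```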
Colette Garcia, global head of enterprise data real-time content, said the project was a “big job” for Bloomberg and required a rethinking of how it presents real-time events data to clients.
“The ability to present a full message in real time and ensure that it has integrity across all of the fields that clients need to present an accurate picture of that message is a methodology we refined and rigorously applied for this product,” Garcia told Data Management Insight.
“It required accuracy, completeness and timeliness all to be brought together to create a valuable product.”
New Frontier
Bloomberg has complemented the traditional request-response model of event data delivery with one that’s more akin to a trade request or quote feed, a move that comes in reaction to the growing expectations on market participants to act more quickly and programmatically as external events unfold.
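The practical difference between the two delivery models can be sketched in a few lines. In the illustration below, with all names invented, a request-response client only sees what exists at the moment it asks, while a streaming subscriber has each message pushed to a callback as it arrives.

```python
from typing import Callable, Iterator

def request_events(store: list) -> list:
    # Request-response: the client asks and gets a snapshot; anything
    # released after the request waits for the next poll.
    return list(store)

def subscribe(feed: Iterator[dict], on_message: Callable[[dict], None]) -> None:
    # Streaming: each message is pushed to the callback the moment it
    # arrives, so reaction time is not bounded by a polling interval.
    for message in feed:
        on_message(message)

feed = iter([
    {"event_type": "EARNINGS", "ticker": "XYZ US Equity"},
    {"event_type": "NEWS", "ticker": "ABC US Equity"},
])
print("polled:", request_events([]))   # a poll between releases sees nothing
subscribe(feed, lambda msg: print("pushed:", msg))
```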
As technology and high-speed communications accelerate the pace at which trades are executed and markets move, institutions need tools that help them source, integrate and use data so they do not fall behind their competitors. Traditional sell-side firms in particular are under pressure from newer, nimbler alternative liquidity providers that, unrestrained by the regulatory burdens placed on banks, are investing in the latest technology to win market share.
Additionally, regulators are moving towards real-time reporting regimes that place further technology investment obligations on institutions.
“The ability for clients to be able to respond to the volatility and the fluctuations in the market has become essential, and the data that’s needed to do that is a constant driver of what they’re investing in,” Garcia said.
“It allows for them to be more confident with decisions being made, and also to navigate market fluctuations as they happen.”
Garcia added that real-time data also ensures that all parts of an institution work from consistent and timely information. That is especially important for risk management, where there is often a material lag between when the front office receives important information and when middle-office analysts can incorporate it into company models.
Use Cases
Bloomberg’s Real-time Events Data solution is expected to aid a multitude of use cases, especially within fixed-income markets, said Garcia. Event alerts can be programmed to trigger a move from low-touch to high-touch trading if, say, the data breaches a pre-set threshold, she said. Events data can also be harnessed to adjust portfolios, recalculate risk metrics and identify liquidity, especially in fast-moving markets.
The company has also made it easier for clients to use events data flexibly, allowing them to build rules that trigger specific responses to the information, as sketched below.
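Neither the rule syntax nor the routing hooks are public, but a minimal sketch of the kind of threshold rule described here might look like the following, with invented field names and an invented escalation action.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    field: str                      # hypothetical event field name
    threshold: float
    action: Callable[[dict], None]  # response to run when breached

def escalate_to_high_touch(event: dict) -> None:
    # Invented action: hand the order from automated (low-touch) execution
    # to a trader-managed (high-touch) workflow.
    print(f"escalating {event['ticker']} to the high-touch desk")

RULES = [Rule("EPS_SURPRISE_PCT", 3.0, escalate_to_high_touch)]

def apply_rules(event: dict) -> None:
    for rule in RULES:
        if abs(event.get(rule.field, 0.0)) > rule.threshold:
            rule.action(event)

apply_rules({"ticker": "XYZ US Equity", "EPS_SURPRISE_PCT": 4.2})
```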
The data is fed to client dashboards and can be immediately integrated with other data sets and within other applications, including Bloomberg’s proprietary tools and third-party products.
Key to the solution’s efficacy is Bloomberg’s analytical prowess in processing data so that the right information arrives in a standardised, consistent format that clients can use with ease. Garcia said the company “thought deeply” about how to apply Bloomberg’s expertise to condense data from a broad array of often complex sources, verify it and incorporate it into real-time workflows through streamlined ingestion pipelines.
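One way to picture that standardisation step is as a set of per-source adapters that map heterogeneous raw payloads onto a single canonical schema. The schema and field names below are assumptions for illustration; Bloomberg’s actual field set is not public.

```python
from dataclasses import dataclass

@dataclass
class NormalisedEvent:
    ticker: str       # assumed canonical schema, for illustration only
    event_type: str
    timestamp: str
    payload: dict

def from_earnings_feed(raw: dict) -> NormalisedEvent:
    # Adapter for one hypothetical source shape.
    return NormalisedEvent(raw["id"], "EARNINGS", raw["ts"], {"eps": raw["eps"]})

def from_news_feed(raw: dict) -> NormalisedEvent:
    # A second source publishes different field names; the adapter hides that.
    return NormalisedEvent(raw["symbol"], "NEWS", raw["published"],
                           {"headline": raw["headline"]})

print(from_earnings_feed({"id": "XYZ US Equity",
                          "ts": "2024-05-01T12:00:00Z", "eps": 1.32}))
print(from_news_feed({"symbol": "ABC US Equity",
                      "published": "2024-05-01T12:01:00Z",
                      "headline": "ABC raises guidance"}))
```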
While artificial intelligence has always been part of the mix, Garcia stressed that it is Bloomberg’s data processing expertise that does most of the heavy lifting.
“It is actually quite a feat and a natural evolution of how you launch and release a data point as timely as possible and as accurate as possible,” she said.
“We have everything from editors and data check teams to automated verification workflows that happen. Depending on the data type, we have built a full system to consume the data sources and also verify the accuracy.”
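A heavily simplified version of such an automated verification gate, with invented fields and checks rather than Bloomberg’s actual workflow, might look like this:

```python
# Invented required-field set; real verification rules are not public.
REQUIRED_FIELDS = {"ticker", "event_type", "timestamp", "value"}

def verify(event: dict) -> list:
    """Return a list of failures; an empty list means the event may publish."""
    failures = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - event.keys())]
    if "value" in event and not isinstance(event["value"], (int, float)):
        failures.append("value is not numeric")
    return failures

event = {"ticker": "XYZ US Equity", "event_type": "EARNINGS",
         "timestamp": "2024-05-01T12:00:00Z", "value": 1.32}
problems = verify(event)
print("publish" if not problems else f"hold for manual review: {problems}")
```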