About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Zema Global Chief Girds for Soaring Demand for Energy Data


Since its acquisition of Morningstar’s commodity information business late last year, energy industry intelligence provider Zema Global has become an important data feed for financial institutions that invest in the net-zero transition and for those trading in renewables, biofuels and fossil fuels.

The transaction was a prescient one. While the Colorado, US-headquartered company has been providing enterprise data and data management services tailored to energy providers and traders for a quarter of a century, it is only now that Zema Global is experiencing deeper engagement from financial institutions, in part as a result of emerging investment opportunities in the global net-zero transition.

“We’re starting to see financial services get back into commodity trading,” Zema Global chief executive Andrea Remyn Stone told Data Management Insight during a visit to London. “Trading desks are popping up everywhere. When you have a lot more activity in both the physical trading and the derivatives trading side you’ve got this huge, almost mythical quest to find a unique dataset that gives you an edge.

“So it’s our job to make sure that customers get that breadth of data.”

New Clients

The Morningstar data acquisition came with 100 financial services customers, including seven of the world’s 10 largest banks, Remyn Stone said. They joined the thousands of subscribers that buy Zema Global’s market data, which forms the price master for 95 per cent of contracts on the Chicago Mercantile Exchange (CME) – about 3 billion contracts. As well as institutional investors and traders, the customers that feed Zema Global’s pricing data into their ERPs, trading and treasury systems include retail gas and power delivery companies.

Zema provides them with risk insights and analytics based on 14,000 data sets that offer visibility across the power and energy commodities industries. That data also includes information on anything that could influence the price of those commodities, including shipping costs, weather, wind-turbine productivity and geopolitical risks. Within its platforms, clients can interrogate their own data, third-party feeds such as those from exchanges and pricing agencies, as well as publicly available information.

Within Zema Global’s end-to-end data, analytics, and integration platforms and services, it offers two data management services.

Zema Marketplace is an API-based data warehouse that sits within AWS and provides data feed subscriptions. Zema Enterprise is a software-based data management pipeline service through which clients can consume and distribute their proprietary and third-party data. It’s within this platform that clients can analyse their data and automate the creation of crucial forward curves – key metrics that model pricing for future deliveries of energy commodities.

“It’s the basis for making seven-, eight-figure decisions,” Remyn Stone said. “Our technologists and data professionals work tirelessly to build these curves, while business users – traders, research analysts, and others – rely on them to forecast supply and demand and drive pre-trade activity.”
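As an illustration of the basic mechanics behind a forward curve – not Zema Global’s actual methodology, and with entirely hypothetical quotes – the simplest construction interpolates between the futures prices that are observable in the market to estimate prices at tenors in between:

```python
from bisect import bisect_left

def forward_curve(quotes, tenors):
    """Build a piecewise-linear forward curve from sparse futures quotes.

    quotes: list of (tenor_in_months, price) pairs, sorted by tenor.
    tenors: tenor points (in months) at which to evaluate the curve.
    Uses flat extrapolation outside the quoted range.
    """
    xs = [t for t, _ in quotes]
    ys = [p for _, p in quotes]
    out = []
    for t in tenors:
        if t <= xs[0]:
            out.append(ys[0])            # before first quote: flat
        elif t >= xs[-1]:
            out.append(ys[-1])           # beyond last quote: flat
        else:
            i = bisect_left(xs, t)       # first quoted tenor >= t
            x0, x1 = xs[i - 1], xs[i]
            y0, y1 = ys[i - 1], ys[i]
            out.append(y0 + (y1 - y0) * (t - x0) / (x1 - x0))
    return out

# Hypothetical power futures quotes: (months ahead, $/MWh)
quotes = [(1, 42.0), (3, 45.5), (6, 50.0), (12, 47.0)]
curve = forward_curve(quotes, tenors=[1, 2, 3, 4, 6, 9, 12])
```

Production curves are far richer – they blend broker quotes, settlement prices and fundamental models, and handle seasonality – but the interpolation step above is the common starting point.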

Huge Task

Gathering, cleaning and mastering the commodity data behind those numbers is a huge task. By its nature commodity data is, in Remyn Stone’s words, idiosyncratic.

Commodities are physical assets and so are subject to a multitude of external influences in ways that financial assets aren’t – they can be affected by weather, by politics, by shipping patterns and so on. The methods used to collect the underlying data – often sensors and visual monitors – are also prone to mechanical faults and interruptions, all of which can degrade data quality.

Those challenges are becoming magnified by market demands for faster data and datasets that cover a wider range of assets. Zema’s clients want more intel delivered faster to better inform their decisions on a widening array of assets, such as solar and wind energy.

The speed of data delivery is made more challenging by the “asynchronicity” of energy market-price transmission; electricity prices are reported at different intervals to those of raw materials, for instance. While pricing engines are refreshed at five-minute intervals, Remyn Stone expects those increments to narrow as technology enables faster data collection and distribution.
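One common way to handle that asynchronicity – a generic technique, not a description of Zema Global’s pipeline – is to carry each feed’s last observed price forward onto a shared grid, so that series reported at different intervals can be compared at the same timestamps:

```python
def align_to_grid(ticks, grid):
    """Forward-fill the last observed price onto a common time grid.

    ticks: list of (timestamp, price) pairs, sorted by timestamp
    (timestamps in seconds). Grid points before the first tick get None.
    """
    out = []
    i, last = 0, None
    for t in grid:
        # Advance through all observations at or before this grid point.
        while i < len(ticks) and ticks[i][0] <= t:
            last = ticks[i][1]
            i += 1
        out.append(last)
    return out

# Hypothetical feeds: power prices every five minutes, a fuel
# benchmark reported hourly (timestamps in seconds).
power = [(0, 40.0), (300, 41.0), (600, 39.5)]
fuel = [(0, 71.0), (3600, 72.5)]
grid = [0, 300, 600]  # common five-minute grid
aligned_power = align_to_grid(power, grid)
aligned_fuel = align_to_grid(fuel, grid)
```

The hourly series simply repeats its last print at each five-minute point until a fresh observation arrives, which is why shorter reporting intervals upstream translate directly into fresher aligned data downstream.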

“Power markets are accelerating, and regulations are also requiring price reporting on a much more frequent basis, so you can’t just rely on end of day prices,” she said.

As those intervals get shorter, the need for streamlined data quality processes increases: two-thirds of the time it takes to prepare to trade a new commodity is spent obtaining, setting up and assessing the quality of the required data, said Remyn Stone.
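In practice, much of that quality work reduces to automated checks on each incoming observation before it is accepted into a price master. The rules below are illustrative only – hypothetical thresholds, not Zema Global’s actual validation logic:

```python
def validate_tick(tick, prev_price, now, max_age=600, max_jump=0.2):
    """Flag common feed problems before a price enters the master.

    Illustrative checks: missing fields, non-positive prices, stale
    timestamps (older than max_age seconds), and implausible jumps
    versus the previous accepted price (more than max_jump fraction).
    Returns a list of issue labels; an empty list means the tick passes.
    """
    issues = []
    price, ts = tick.get("price"), tick.get("ts")
    if price is None or ts is None:
        issues.append("missing_field")
        return issues
    if price <= 0:
        issues.append("non_positive_price")
    if now - ts > max_age:
        issues.append("stale")
    if prev_price and abs(price - prev_price) / prev_price > max_jump:
        issues.append("outlier_jump")
    return issues

# A 37.5% jump against the last accepted price of 40.0 gets flagged.
flags = validate_tick({"price": 55.0, "ts": 950}, prev_price=40.0, now=1000)
```

Codifying such rules is what lets the setup time for a new commodity shrink: the same checks run unchanged as reporting intervals tighten.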

“Getting the data right is essential,” she said, adding that at no other time in her three-decade career in market data have innovation and demand evolved so rapidly.

Net-Zero

Providing transparency into the energy industry and markets will be of great consequence to the net-zero ambitions of nations and businesses. Today’s energy providers can be expected to provide the backbone of tomorrow’s power infrastructure and so visibility into their operations will be critical in understanding how the transition can be achieved, Remyn Stone argued.

She is in no doubt that will still be the case even as the political tailwind that initially drove the net-zero project has turned into a headwind. She has argued that common sense will drive the energy transformation as severe weather events and other climate-change related phenomena make it clear that finance has a role to play in mitigating global warming and its impacts.

There’s “a refocusing of risk and valuation on the fundamentals, but there’s still investment merit in projects that move us towards a more sustainable energy transition”.

Key to maintaining the momentum of transition will be granular data and the deep insights that it offers. The tide of headlines may be shifting against the global ESG project, but the need for good-quality data will remain, she said, as investors continue committing capital to derisking measures.

One of Remyn Stone’s forecasts for the future of data is that organisations will increasingly demand the highest data “fidelity and authenticity”, and will call on a new wave of validation services to bring trust to the data they use in all of their operations, including ESG integration.

“The first generation of ESG data was basically rating methodologies that would scrape public data and give people comfort on a spectrum of morality – the spectrum of what is perceived as environmentally friendly or socially friendly or has good governance,” she said. “That’s not going to be acceptable. People are going to dive down below those ratings, and they’re going to look at those factors, including risk factors, and then they’ll look at the investment merit holistically… because we know without a doubt it makes a difference.

“I think that people are beginning to realise that no matter what your opinion is on climate change, we need to understand that intersection between the natural world, the financial world, and all of the physical assets need to be considered,” she said.
