The knowledge platform for the financial technology industry

A-Team Insight Blogs

Unlocking the Value of New Data Sources


By: Martijn Groot, Vice President, Product Management, Asset Control

A growing range of innovative data sources, from sentiment analysis to satellite imagery, is adding depth to traditional data resources and unlocking opportunities to gain new insights through machine learning, correlation and pattern spotting – insights that can improve time to market and drive down costs.

The key to finding value in this new data diversity is a fundamental change to traditional data management. Tomorrow’s financial data models have to be able to support fast and effective onboarding of new data sources, as well as efficient data exploration and easy access and distribution of data across the business.

Beyond regulatory demands

Financial institutions’ decision-making processes are set to undergo a fundamental change as organisations begin to onboard and explore a new raft of data sources. From web crawling that mines news and spots corporate events, to sentiment analysis, satellite and geospatial information, traffic and travel patterns, and property listings, these sources are transforming the way organisations analyse investment opportunities, track Politically Exposed Persons, and ingest company news.

While the agenda for the new data model has been driven by regulatory demands, the sheer depth of information now created and collected globally is extraordinary and is set to take the industry far beyond the traditional catalogue of price and reference data sources.

No longer will organisations be limited to published financial statements and earnings calls; instead, investment decisions can be based on a much broader and deeper – but potentially also murkier – set of data. For example, adding social media sentiment analysis can deliver a new level of understanding of a retailer’s performance. Indeed, with transcripts of all earnings calls now available, it is possible to see who is asking specific questions and how CEOs and CFOs respond – insight that can be tracked and analysed to deliver fast, actionable investment decisions. Similarly, in Know Your Customer (KYC) processes, the ability to dive rapidly into multiple diverse data sources offers a chance to address the escalating overhead of customer onboarding and reduce the cost of doing business.
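To make the transcript idea concrete, here is a minimal, lexicon-based sketch of scoring sentiment in an executive’s remarks. The word lists and sample text are invented for illustration; a production system would use a trained model or a commercial sentiment feed rather than hand-picked keywords.

```python
# Minimal lexicon-based sentiment scoring for an earnings-call transcript.
# Word lists and the sample remark are illustrative only.

POSITIVE = {"growth", "strong", "record", "improved", "confident"}
NEGATIVE = {"decline", "weak", "headwinds", "loss", "uncertain"}

def sentiment_score(text: str) -> float:
    """Return (positive - negative) / matched words, in the range [-1, 1]."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    return (pos - neg) / matched if matched else 0.0

cfo_remarks = "Revenue showed strong growth despite headwinds; we remain confident."
print(sentiment_score(cfo_remarks))  # 3 positive vs 1 negative -> 0.5
```

Scores like this, tracked per speaker and per call over time, are the kind of derived signal that can sit alongside traditional price and reference data.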

Effective data discovery

The challenge is to find a way to harness these new data sources; to onboard this new insight in a way that is fast, effective and usable. The mastering process must still provide the traditional 360-degree version of the truth that can be used across the organisation – from valuations to risk and financial reporting – and the additional data sources reinforce the need for excellent structured processes that compare sources to find discrepancies and deliver that golden source. But this process must now also deliver excellent integration, with organisations looking for robust Application Programming Interfaces (APIs) to enable the fast stitching together and exploration of new data sources.
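The comparison step of that mastering process can be sketched as follows. The vendor names, instruments and prices are invented, and the median-plus-tolerance rule is just one simple policy; real golden-copy derivation would apply richer, source-aware rules.

```python
# Compare prices for the same instrument across vendor feeds, flag
# discrepancies beyond a tolerance, and derive a golden value (the median).
# Feed names, ISINs and prices are invented for illustration.
from statistics import median

feeds = {
    "vendor_a": {"XS1234567890": 101.25, "US0378331005": 189.40},
    "vendor_b": {"XS1234567890": 101.27, "US0378331005": 191.10},
    "vendor_c": {"XS1234567890": 101.26, "US0378331005": 189.55},
}

TOLERANCE = 0.005  # flag if the spread exceeds 0.5% of the median

def master(feeds):
    golden, exceptions = {}, {}
    instruments = set().union(*(f.keys() for f in feeds.values()))
    for isin in instruments:
        prices = [f[isin] for f in feeds.values() if isin in f]
        mid = median(prices)
        golden[isin] = mid
        if (max(prices) - min(prices)) / mid > TOLERANCE:
            exceptions[isin] = prices  # route to a data steward for review
    return golden, exceptions

golden, exceptions = master(feeds)
```

The same structured comparison that produces the golden source also produces the exception queue, which is why robust mastering and easy exploration belong in one process.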

Moreover, this fundamentally changes the emphasis of the mastering process. Rather than focusing solely on error detection to achieve consistency and accuracy, organisations can use these sources for pattern discovery, applying new techniques such as machine learning to spot fresh correlations or, in market surveillance for example, to reveal unusual activity.
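As a stand-in for that pattern-discovery step, the sketch below flags unusual trading activity with a simple z-score test on daily volumes. The figures are invented, and real surveillance systems would use richer features and learned models rather than a single statistical threshold.

```python
# Flag days whose traded volume deviates sharply from the norm.
# All volume figures are invented for illustration.
from statistics import mean, stdev

def flag_unusual(volumes, threshold=3.0):
    """Return indices of observations more than `threshold` sigmas from the mean."""
    mu, sigma = mean(volumes), stdev(volumes)
    return [i for i, v in enumerate(volumes) if abs(v - mu) / sigma > threshold]

daily_volumes = [1020, 980, 1005, 995, 1010, 4800, 1000, 990]

# A single large spike inflates the sample stdev, so a lower threshold
# is used here; robust statistics (median/MAD) would handle this better.
print(flag_unusual(daily_volumes, threshold=2.0))  # -> [5]
```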

Speed is critical; fast, effective data discovery will be essential to drive down the cost of change and give organisations a chance to differentiate on time to market. Intelligent data mastering is at the heart of this new model. Combining APIs that enable integration with an easy process for testing new data sources and onboarding them into production will be essential – and that will require APIs that support popular data science languages, including R and Python. In addition, the use of NoSQL technology, combined with the ability to deploy new models close to the data, will be key to supporting the significant associated data processing demand.

A new data foundation

This ability to combine robust data mastering processes with excellent integration will build a new data foundation. It will enable an organisation to pull together these diverse data sets and create new insight that combines sentiment from social media and earnings call transcripts with the traditional measures of price history and published financials.

Maximising value also requires organisations to reconsider access and utilisation. Making these new data sets easily accessible – not only to new algorithms and data scientists, but also to end users within risk, investment, operations or compliance – will mark a significant step change in data exploitation.

Ensuring the data easily integrates with the languages adopted by data scientists is fundamental, but to deliver the immense potential value to end users, data analysis must evolve beyond the traditional technical requirements of SQL queries. Offering end users self-service access via enterprise search, a browser, Excel and easy-to-understand interaction models – rather than via proprietary APIs and custom symbologies – will open up these new data sources to deliver even greater corporate value.

Unlocking value

These new data sources are radically different to traditional data resources and their potential value to an organisation is untapped. Pattern matching, in particular, can be oriented not only towards improving operations or reducing risk, but also towards improved pricing and new revenue opportunities. Matching of data items will not only take place through common keys, but also through spotting the same behaviour in hitherto unrelated data or otherwise finding repeating patterns in time, space and across different data sets.
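One way to link series that share no common key is to test for co-movement. The sketch below uses Pearson correlation on two invented daily series – imagine satellite-derived foot traffic and a retailer’s sales – purely to illustrate matching by behaviour rather than by identifier.

```python
# Match data sets without a shared key by testing for co-movement.
# Both series are invented; real inputs might be satellite-derived
# foot traffic and a retailer's reported daily sales.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

foot_traffic = [120, 135, 150, 110, 160, 175, 140]
daily_sales  = [2400, 2700, 3000, 2250, 3150, 3500, 2800]

r = pearson(foot_traffic, daily_sales)
# A high |r| suggests the two series describe the same underlying activity.
print(round(r, 3))
```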

Especially in active investment management, the use of non-traditional data sources can help firms compete with, and differentiate from, passive investment strategies, while in compliance and risk management, access to a broader range of sources can help trigger early warnings on suspect transactions, relationships or price movements.

The potential is incredibly exciting – and first mover advantage cannot be overstated. The key for financial institutions over the next year or so is to move beyond traditional EDM models and embrace the new mastering and distribution services that will enable essential exploitation of data across the business.
