
A-Team Insight Blogs

Talking Reference Data with Andrew Delaney: A New Kind of Single Source of the Truth


I’ve been struck in recent months by the changing attitudes within our marketplace to the concept of a golden copy, or single, accepted source of the truth as far as reference data is concerned. The industry seems to be moving away from established principles of on-site enterprise data management, most visibly – noisily, perhaps? – toward the utility model of outsourced, mutualised reference data management.

But there seems to be another, less tangible theme emerging as firms seek that Holy Grail. This occurred to me as I chatted this week with friends at 1View, whose Paul Kennedy recently appeared on our Hot Topic webinar on utility approaches to data management. 1View is one of several companies I’ve come across in my travels that seem to be taking a different approach to golden copy, one that doesn’t require major infrastructure investment but instead focuses on control of existing data sources and a fair degree of cross-referencing.

1View’s approach is simple – which can be dangerous. In short, it allows clients to upload data extracts from their key systems to 1View, which then ranks them into an aggregated superset. All exceptions are flagged, and mappings are made to deal with differing data formats and other discrepancies. The result, according to 1View chief Nigel Pickering, is that all common data across source systems is synchronised over time, without the need for major infrastructure or process change.
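To make that a little more concrete, here is a minimal sketch, in Python, of the kind of rank-and-flag aggregation described above. It is my own illustration rather than anything 1View has published: the source names, the trust ranking, the ISIN key and the field names are all assumptions made for the example.

```python
# Illustrative only: aggregate records extracted from several source systems,
# let the highest-ranked source win each field, and flag disagreements as
# exceptions for review. The ranking and field names are assumptions.
from collections import defaultdict

SOURCE_RANK = {"custody": 1, "risk": 2, "front_office": 3}  # lower = more trusted (assumed)

def aggregate(extracts):
    """extracts: list of (source_name, {isin: {field: value}}) tuples."""
    superset = defaultdict(dict)   # isin -> field -> (value, winning source)
    exceptions = []                # disagreements to be reviewed and mapped

    for source, records in sorted(extracts, key=lambda e: SOURCE_RANK[e[0]]):
        for isin, fields in records.items():
            for field, value in fields.items():
                if field not in superset[isin]:
                    superset[isin][field] = (value, source)      # first (most trusted) value wins
                elif superset[isin][field][0] != value:
                    exceptions.append((isin, field, superset[isin][field], (value, source)))

    golden = {isin: {f: v for f, (v, _) in flds.items()} for isin, flds in superset.items()}
    return golden, exceptions

# Toy run with two extracts of the same instrument
extracts = [
    ("front_office", {"US0378331005": {"name": "Apple Inc", "currency": "USD"}}),
    ("custody",      {"US0378331005": {"name": "APPLE INC", "currency": "USD"}}),
]
golden, exceptions = aggregate(extracts)
print(golden)       # custody's values win under the assumed ranking
print(exceptions)   # the name mismatch is flagged for review
```

The point, as with 1View’s pitch, is that nothing about the source systems themselves has to change; the synchronisation sits on top of the extracts they already produce.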

1View’s approach seems to resonate with those of other suppliers who appear to be taking a bottom-up rather than top-down approach to data standardisation or synchronisation. Thomson Reuters’ response to the EU ruling on its RIC code, the efforts of companies like Simplified Financial Information, and Skyler with its LiveCache platform all spring to mind.

No, they’re not addressing the exact same problem, but they are taking a similar approach by attacking data control from the ground up. Putting aside Thomson Reuters for now, SFI has teamed with performance measurement specialist TS Associates to monitor where data is being sourced and used across the financial enterprise, and Skyler’s LiveCache can be used to create a kind of Universal Bank ID that clients can use to access data pertaining to a given instrument or entity from any internal or external source used by the bank.
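Again purely as an illustration, and emphatically not Skyler’s actual LiveCache API, a cross-reference of that sort boils down to one bank-wide key mapped to each source’s local identifier. The identifier names and sources below are assumptions for the sketch.

```python
# Toy cross-reference in the spirit of a "Universal Bank ID": one internal key
# that resolves to the identifier each internal or external source expects.
# Source names and identifiers are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class InstrumentXref:
    universal_id: str                                           # hypothetical bank-wide key
    identifiers: Dict[str, str] = field(default_factory=dict)   # source -> local identifier

xref_store: Dict[str, InstrumentXref] = {}

def register(universal_id: str, source: str, local_id: str) -> None:
    """Record how a given source identifies the instrument."""
    entry = xref_store.setdefault(universal_id, InstrumentXref(universal_id))
    entry.identifiers[source] = local_id

def resolve(universal_id: str, source: str) -> str:
    """Translate the universal key into a source-specific identifier."""
    return xref_store[universal_id].identifiers[source]

# Usage: one key, many source-specific identifiers
register("BANK-000123", "reuters", "AAPL.O")                 # RIC
register("BANK-000123", "isin_master", "US0378331005")
register("BANK-000123", "settlement_system", "APPL-US-EQ")
print(resolve("BANK-000123", "reuters"))                     # -> AAPL.O
```

The appeal, again, is that none of the underlying sources has to agree on a common symbology; the cross-reference absorbs the differences.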

You may think the link here is tenuous, and I’ll concede that my thinking on this is far from fully baked. I do think, however, that the topic warrants further exploration, and may make for a good panel discussion at one of our future Data Management Summits or even a webinar. What do you reckon?

