
Talking Reference Data with Andrew Delaney: A New Kind of Single Source of the Truth


I’ve been struck in recent months by the changing attitudes within our marketplace to the concept of a golden copy, or single, accepted source of the truth as far as reference data is concerned. The industry seems to be moving away from established principles of on-site enterprise data management, most visibly – noisily, perhaps? – toward the utility model of outsourced, mutualised reference data management.

But there seems to be another, less tangible theme emerging as firms seek that Holy Grail. This occurred to me as I chatted this week with friends at 1View, whose Paul Kennedy recently appeared on our Hot Topic webinar on utility approaches to data management. 1View is one of several companies I’ve come across in my travels that seem to be taking a different approach to golden copy, one that doesn’t require major infrastructure investment but instead focuses on control of existing data sources and a fair degree of cross-referencing.

1View’s approach is simple – which can be dangerous. In short, it allows clients to upload data extracts from their key systems to 1View, which then ranks them into an aggregated superset. All exceptions are flagged, and mappings made to deal with data formats and other discrepancies. The result, according to 1View chief Nigel Pickering, is that all common data across source systems is synchronised over time, without the need for major infrastructure or process change.
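For those who like to see the shape of the thing, here is a minimal Python sketch of that kind of bottom-up reconciliation as I understand it: records from several source extracts are merged into a superset according to a source ranking, and any disagreement between sources is flagged as an exception rather than silently overwritten. The system names, fields and rankings are my own illustrative assumptions, not 1View's actual design.

```python
from collections import defaultdict

# Hypothetical source ranking: lower number means more trusted.
SOURCE_RANK = {"security_master": 1, "trading_system": 2, "risk_system": 3}

def build_superset(extracts):
    """Merge per-instrument records from several source extracts.

    `extracts` maps a source-system name to a list of records, each record
    a dict keyed by 'isin' plus attribute fields. Returns the aggregated
    superset and a list of exceptions flagged where sources disagree.
    """
    by_instrument = defaultdict(list)
    for source, records in extracts.items():
        for record in records:
            by_instrument[record["isin"]].append(
                (SOURCE_RANK.get(source, 99), source, record)
            )

    superset, exceptions = {}, []
    for isin, candidates in by_instrument.items():
        # Most trusted source first.
        candidates.sort(key=lambda c: (c[0], c[1]))
        merged = {}
        for rank, source, record in candidates:
            for field, value in record.items():
                if field == "isin":
                    continue
                if field not in merged:
                    merged[field] = value
                elif merged[field] != value:
                    # Sources disagree: keep the higher-ranked value, flag the clash.
                    exceptions.append({
                        "isin": isin, "field": field,
                        "kept": merged[field], "rejected": value, "source": source,
                    })
        superset[isin] = merged

    return superset, exceptions


# Usage with dummy data: the superset keeps the security_master value,
# and the GBP/GBp clash is recorded as an exception for review.
golden, flags = build_superset({
    "security_master": [{"isin": "XX0000000001", "currency": "GBP"}],
    "trading_system": [{"isin": "XX0000000001", "currency": "GBp"}],
})
```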

1View’s approach seems to resonate with those of other suppliers who appear to be taking a bottom-up rather than top-down approach to data standardisation or synchronisation. Thomson Reuters’ response to the EU ruling on its RIC code, the efforts of companies like Simplified Financial Information, and Skyler with its LiveCache platform all spring to mind.

No, they’re not addressing the exact same problem, but they are taking a similar approach by attacking data control from the ground up. Putting aside Thomson Reuters for now, SFI has teamed with performance measurement specialist TS Associates to monitor where data is being sourced and used across the financial enterprise, and Skyler’s LiveCache can be used to create a kind of Universal Bank ID that clients can use to access data pertaining to a given instrument or entity from any internal or external source used by the bank.
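Again purely as an illustration of the idea, and emphatically not Skyler's actual API, a cross-reference of that sort might look something like the following in Python, with every source system's own key resolving to the same bank-wide identifier:

```python
class UniversalIdMap:
    """Toy cross-reference map in the spirit of a bank-wide universal ID.

    Each identifier scheme (RIC, ISIN, in-house code and so on) is
    registered against a single universal ID, so any system can resolve
    its own key to the same instrument. Illustrative only.
    """

    def __init__(self):
        self._xref = {}      # (scheme, identifier) -> universal ID
        self._next_id = 1

    def new_universal_id(self):
        uid = "UBI-%08d" % self._next_id
        self._next_id += 1
        return uid

    def register(self, universal_id, scheme, identifier):
        self._xref[(scheme, identifier)] = universal_id

    def resolve(self, scheme, identifier):
        """Return the universal ID for a source-system key, or None if unmapped."""
        return self._xref.get((scheme, identifier))


# Usage: two systems' keys (dummy identifiers) resolve to the same instrument.
xref = UniversalIdMap()
uid = xref.new_universal_id()
xref.register(uid, "ISIN", "XX0000000001")
xref.register(uid, "RIC", "EXMP.L")
assert xref.resolve("RIC", "EXMP.L") == xref.resolve("ISIN", "XX0000000001") == uid
```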

You may think the link here is tenuous, and I’ll concede that my thinking on this is far from fully baked. I do think, however, that the topic warrants further exploration, and may make for a good panel discussion at one of our future Data Management Summits or even a webinar. What do you reckon?

