
A-Team Insight Blogs

Data Management: Time for a New Best Practice Model


By Nigel Pickering, Founder, www.ref-data.com

Firms have been attempting the ‘data centralisation’ path for many years – yet few, if any, have truly succeeded in delivering a solution that addresses the underlying drivers and objectives.

Many will have concluded that integrating their core data is the route to managing reference and transactional data across the many life-cycle events that pass through their systems and organisation. By consolidating that data, they could move tactically towards the significant operational efficiencies and integrated control that are increasingly necessary and that would deliver substantial benefits.

Experience has taught us that, by using a new non-intrusive technology approach, it is now viable (and arguably compelling) to achieve the significant improvements that many firms have already identified but discounted because of earlier cost estimates and risks.

The federated model provides the best route forward

In the social networking arena, the major providers have proven that empowering each end-user and delivering results to them within a shared, centrally run process (i.e. the ‘federated’ model) can deliver data integration on a massive scale. None of these network models would exist if they had needed a centralised warehouse before becoming useful to their users.

If, for example, you need better control of entity or counterparty data across many locations and applications, don’t waste time trying to define a ‘golden copy’ repository or to reconcile every source. It will take too long, involve too many compromises, and each local team will have many valid objections that will compound the overall delivery delay. Instead, provide the means for each local manager to view and manage any exceptions in his or her local data compared against all other sources.

As this is likely to improve local processes, it is more likely to gain local support. If you incentivise each local site, it is possible to envisage all sites adopting it in parallel and globally integrated results emerging within weeks.
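As a purely illustrative sketch (not taken from the article), this kind of local exception view can be expressed as a comparison of each site’s records against those held by every other source; the record layout, field names and matching rules below are assumptions.

```python
# Hypothetical sketch: flag exceptions in a local site's counterparty records
# by comparing them against records for the same entities held by other sources.

from typing import Dict, List

Record = Dict[str, str]          # e.g. {"lei": "...", "country": "..."}
SourceData = Dict[str, Record]   # records keyed by entity_id for one source


def local_exceptions(local: SourceData,
                     other_sources: Dict[str, SourceData],
                     fields: List[str]) -> List[dict]:
    """Return one exception per disagreement between the local record and any
    other source, plus entities that other sources hold but the local site lacks."""
    exceptions = []

    # Field-level mismatches for entities the local site holds.
    for entity_id, local_rec in local.items():
        for source_name, records in other_sources.items():
            remote_rec = records.get(entity_id)
            if remote_rec is None:
                continue
            for field in fields:
                if local_rec.get(field) != remote_rec.get(field):
                    exceptions.append({
                        "entity_id": entity_id,
                        "field": field,
                        "local_value": local_rec.get(field),
                        "other_source": source_name,
                        "other_value": remote_rec.get(field),
                    })

    # Entities known elsewhere but absent from the local store.
    local_ids = set(local)
    for source_name, records in other_sources.items():
        for entity_id in set(records) - local_ids:
            exceptions.append({
                "entity_id": entity_id,
                "field": None,
                "local_value": None,
                "other_source": source_name,
                "other_value": "present",
            })

    return exceptions
```

Each local manager works only his or her own exception list, yet running the same comparison at every site produces the globally integrated picture in parallel.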

The federated model works for both Ops and IT

By incorporating your existing core operational data into a single, dynamic store you will be working with factual data, which will encourage ‘straight line’ thinking from your Ops and IT teams. It will not put your production processes at risk, and your IT team will no longer need to spend time producing the contractual-level design specifications that were necessary in the past.

Your Ops teams will be able to keep working within existing silos while operating in a more effective community model. One large bank that applied this federated approach delivered a single online warehouse containing the consolidation of its instrument and counterparty reference data, together with all the different trade types from dozens of internal and external data sources, within a few weeks.

By simply adding further data sources, the bank can establish full lifecycle monitoring of all trades within a fully integrated data reconciliation process, at a fraction of the cost it would previously have incurred.
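To make the lifecycle-monitoring idea concrete (a minimal sketch under assumed event names, not the bank’s actual implementation), trade events arriving from any number of sources can be grouped by trade identifier and checked against an expected sequence of lifecycle stages:

```python
# Hypothetical sketch: monitor the lifecycle of each trade by collecting its
# events from every contributing source and flagging missing stages.

from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

# Assumed lifecycle for illustration; real flows vary by trade type.
EXPECTED_LIFECYCLE = ["executed", "confirmed", "settled"]


def lifecycle_breaks(events: Iterable[Tuple[str, str, str]]) -> Dict[str, List[str]]:
    """events: (trade_id, source, event_type) tuples from any number of sources.
    Returns trade_id -> list of lifecycle stages not yet observed."""
    seen: Dict[str, set] = defaultdict(set)
    for trade_id, _source, event_type in events:
        seen[trade_id].add(event_type)

    breaks = {}
    for trade_id, observed in seen.items():
        missing = [stage for stage in EXPECTED_LIFECYCLE if stage not in observed]
        if missing:
            breaks[trade_id] = missing
    return breaks


# Example: two sources reporting overlapping views of the same trades.
feed = [
    ("T1", "front_office", "executed"),
    ("T1", "middle_office", "confirmed"),
    ("T1", "custodian", "settled"),
    ("T2", "front_office", "executed"),   # never confirmed or settled -> break
]
print(lifecycle_breaks(feed))             # {'T2': ['confirmed', 'settled']}
```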

There is no longer any reason NOT to start

The federated model is no longer constrained by IT processing capacity. Current network processing power and scalability far exceed what is required to record over 500,000 traded transactions per day, each with many events, plus thousands of client and reference data updates per day globally. Indeed, with the network processing power available, it is viable to consider this model serving many trading partners.
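A rough back-of-envelope calculation illustrates why. Only the 500,000 trades-per-day figure comes from the text; the events-per-trade, update-volume and record-size numbers below are illustrative assumptions:

```python
# Back-of-envelope capacity check (assumptions marked; only the 500,000
# trades/day figure comes from the article).

trades_per_day = 500_000          # from the article
events_per_trade = 10             # assumption: lifecycle events per trade
ref_updates_per_day = 50_000      # assumption: client/reference data updates
bytes_per_record = 1_000          # assumption: ~1 KB per stored event/update

records_per_day = trades_per_day * events_per_trade + ref_updates_per_day
avg_records_per_second = records_per_day / (24 * 60 * 60)
gigabytes_per_day = records_per_day * bytes_per_record / 1e9

print(f"{records_per_day:,} records/day")          # 5,050,000 records/day
print(f"{avg_records_per_second:,.0f} records/s")  # ~58 records/s on average
print(f"{gigabytes_per_day:.1f} GB/day")           # ~5.1 GB/day
```

At those rates the average throughput and daily data volume sit comfortably within the capacity of commodity infrastructure.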

Such a model is also not constrained by data storage: most smartphones today could hold many days of data from massive transaction volumes. It is not constrained by the specific data elements either, provided that the platform or tool used is generic and independent of the type of data gathered.
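The data-type independence point can be illustrated with a generic record envelope: whether the record is a counterparty, an instrument or a trade event, the store only needs a source, a type tag, a key and a timestamp, with the type-specific detail carried as an opaque payload. The sketch below is illustrative and does not describe any particular product:

```python
# Hypothetical sketch of a type-agnostic record envelope: the store only cares
# about identity, source, type tag and time; the payload can be any shape.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict, List


@dataclass
class GenericRecord:
    source: str                 # which system or feed supplied the record
    record_type: str            # e.g. "counterparty", "instrument", "trade_event"
    key: str                    # business identifier within that type
    payload: Dict[str, Any]     # type-specific content, stored as-is
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class GenericStore:
    """Append-only store that is indifferent to what kind of data it holds."""

    def __init__(self) -> None:
        self._records: List[GenericRecord] = []

    def add(self, record: GenericRecord) -> None:
        self._records.append(record)

    def by_key(self, record_type: str, key: str) -> List[GenericRecord]:
        """All versions of one logical item, across every contributing source."""
        return [r for r in self._records
                if r.record_type == record_type and r.key == key]


store = GenericStore()
store.add(GenericRecord("crm", "counterparty", "CPTY-42", {"name": "Acme Ltd"}))
store.add(GenericRecord("booking", "trade_event", "T1", {"status": "executed"}))
print(len(store.by_key("counterparty", "CPTY-42")))  # 1
```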

If our model is constrained at all in the near term, it is only by a lack of company vision and by your IT team’s historically valid reluctance to empower the user. Today, with the right mindset, you can begin to move towards radically improved data processes without any high-risk investment.

