
Data Management: Time for a New Best Practice Model

By Nigel Pickering, Founder, www.ref-data.com

Firms have been attempting the ‘data centralisation’ path for many years – yet few, if any, have truly succeeded in delivering a solution to the underlying drivers and objectives.

Many will have concluded that integrating their core data is the route to managing reference and transactional data across the many lifecycle events that pass through their systems and organisation. By consolidating that data they could move forward tactically towards the significant operational efficiencies and integrated control that are increasingly necessary, and which would deliver substantial benefits.

Experience has taught us that, with a new non-intrusive technology approach, it is now viable (and arguably compelling) to achieve the significant improvements that many firms have already identified but discounted because of earlier cost estimates and risks.

The federated model provides the best route forward

In the social networking arena, the major providers have proven that empowering each end-user and delivering results to them within a centralised process (i.e. the ‘federated’ model) can achieve data integration on a massive scale. None of the new network models would exist if they had needed a centralised warehouse before they were useful to their users.

If, for example, you need better control with entity or counterparty data across many locations and applications, don’t waste time trying to define a ‘golden copy’ repository or reconciling them all. It will take too long, involve too many compromises, and each local team will have many valid reasons that will compound the overall delivery delay. Instead, provide the means for each local manager to view and manage any exceptions in his/her local data compared against all other sources.
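To make this concrete, here is a minimal sketch, assuming simple dictionary-shaped counterparty records and purely illustrative field and source names, of how a local site might surface exceptions against other sources without first agreeing a golden copy:

```python
# A minimal sketch of federated exception management (illustrative only):
# each site keeps its own data and simply compares it against other sources,
# surfacing mismatches for the local manager to review.

from typing import Any, Dict, List

Record = Dict[str, Any]          # one counterparty record, keyed by field name
Source = Dict[str, Record]       # records keyed by a shared counterparty id


def local_exceptions(local: Source, others: Dict[str, Source]) -> List[dict]:
    """List every field where the local copy disagrees with another source."""
    exceptions = []
    for cpty_id, local_rec in local.items():
        for source_name, remote in others.items():
            remote_rec = remote.get(cpty_id)
            if remote_rec is None:
                exceptions.append({"id": cpty_id, "source": source_name,
                                   "issue": "not present in remote source"})
                continue
            for field, local_value in local_rec.items():
                if remote_rec.get(field) != local_value:
                    exceptions.append({"id": cpty_id, "source": source_name,
                                       "field": field, "local": local_value,
                                       "remote": remote_rec.get(field)})
    return exceptions


# Example: the local site checks itself against one other source; no golden
# copy is agreed first, the mismatch is simply shown to the local manager.
local_site = {"CPTY-001": {"lei": "LEI-EXAMPLE", "country": "GB"}}
other_site = {"CPTY-001": {"lei": "LEI-EXAMPLE", "country": "FR"}}
print(local_exceptions(local_site, {"other-source": other_site}))
```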

As this is likely to improve local processes, it is more likely to gain local support. If you incentivise each local site, it is possible to envisage all sites adopting it in parallel and globally integrated results emerging within weeks.

The federated model works for both Ops and IT

By incorporating your existing core operational data into a single and dynamic store you will be working with factual data, which will encourage ‘straight line’ thinking from your Ops and IT teams. It will not put your production processes at risk, and your IT team will no longer need to spend time producing the contractual-level design specifications that were necessary in the past.

Your Ops teams will be able to keep working within their existing silos while operating in the more effective community model. One large bank that applied this federated approach delivered, within a few weeks, a single online warehouse consolidating its instrument and counterparty reference data and all the different trade types from dozens of internal and external data sources.

By simply adding further data sources, the bank can establish full lifecycle monitoring of all trades within a fully integrated data reconciliation process, at a fraction of the cost it would previously have incurred.
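For illustration, the sketch below (class and field names are assumptions, not the bank’s actual system) shows the kind of generic, source-agnostic store this implies: every record is a payload tagged with its source, data type and business key, so new feeds can be added without redesign and the full lifecycle of any trade can be replayed for reconciliation:

```python
# A hedged sketch of a generic, source-independent event store. Nothing here
# assumes a particular schema: trades, instruments and counterparties all go
# in as tagged payloads, so adding a new data source is just another add() call.

from collections import defaultdict
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict, List, Tuple


@dataclass
class Event:
    source: str                  # e.g. "front-office", "custodian-feed"
    data_type: str               # e.g. "trade", "instrument", "counterparty"
    key: str                     # shared business key, e.g. a trade id
    payload: Dict[str, Any]
    received: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class FederatedStore:
    """Single dynamic store; a lifecycle is every event seen for one key."""

    def __init__(self) -> None:
        self._events: Dict[Tuple[str, str], List[Event]] = defaultdict(list)

    def add(self, event: Event) -> None:
        self._events[(event.data_type, event.key)].append(event)

    def lifecycle(self, data_type: str, key: str) -> List[Event]:
        # All events for one trade or instrument, across every contributing
        # source, in arrival order: the raw material for reconciliation.
        return sorted(self._events[(data_type, key)], key=lambda e: e.received)
```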

There is no longer any reason NOT to start

The federated model is no longer constrained by IT processing capacity. Current network processing power and scalability far exceed what is required to record over 500,000 traded transactions per day, each with many events, plus thousands of client and reference data updates each global business day. Indeed, with the processing power available, it is viable to consider this model serving many trading partners at once.

Such a model is also not constrained by data storage: most smartphones today could hold many days of data from massive transaction volumes. It is not constrained by the specific data elements either, provided that the platform or tool used is generic and independent of the type of data gathered.
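A rough back-of-the-envelope calculation illustrates the point; the per-event size and events-per-trade figures below are assumptions chosen only to show the order of magnitude:

```python
# Back-of-envelope check of the capacity claim above (all figures assumed).
trades_per_day = 500_000        # "over 500,000 traded transactions per day"
events_per_trade = 10           # "each with many events"
reference_updates = 50_000      # "thousands of client and reference data updates"
bytes_per_event = 1_024         # assume roughly 1 KB per stored event

daily_bytes = (trades_per_day * events_per_trade + reference_updates) * bytes_per_event
print(f"{daily_bytes / 1e9:.1f} GB per day")  # ~5.2 GB - a few days fits on a phone
```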

If our model is constrained at all in the near term, it is only by a lack of company vision and your IT team’s historically valid reluctance to empower the user. With the right mindset, you can begin moving towards radically improved data processes today, without any high-risk investment.
