Data Management: Time for a New Best Practice Model

By Nigel Pickering, Founder, www.ref-data.com

Firms have been pursuing the ‘data centralisation’ path for many years, yet few, if any, have truly succeeded in addressing the underlying drivers and objectives.

Many have concluded that integrating their core data is the route to managing reference and transactional data across the many lifecycle events that pass through their systems and organisation. By consolidating that data, they could move towards the significant operational efficiencies and integrated control that are increasingly necessary and that would deliver substantial benefits.

Experience has taught us that, by using a new non-intrusive technology approach, it is now viable (and arguably compelling) to achieve the significant improvements that many firms have already identified but discounted because of earlier cost estimates and perceived risks.

The federated model provides the best route forward

In the social networking arena, the major providers have proven that empowering each end-user, and delivering results to them, within a centralised process (i.e. the ‘federated’ model) can produce massive data integration. None of these network models would exist if they had needed a centralised warehouse before becoming useful to their users.

If, for example, you need better control of entity or counterparty data across many locations and applications, don’t waste time trying to define a ‘golden copy’ repository or reconcile them all. It will take too long, involve too many compromises, and each local team will have valid objections that compound the overall delivery delay. Instead, provide the means for each local manager to view and manage any exceptions in his or her local data compared against all other sources.
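
To make this concrete, the sketch below shows one way such a local exception view could be built: the local site’s counterparty records are compared against whatever other sources it can see, and any disagreements or gaps are surfaced as exceptions for the local manager to work. This is purely illustrative; the record layouts, field names and source names are hypothetical, not a description of any particular product.

```python
from collections import defaultdict

def find_exceptions(local_records, other_sources,
                    key_field="lei", compare_fields=("name", "country")):
    """Return, per counterparty key, the fields where the local view disagrees
    with any other source, plus keys the local site is missing entirely."""
    exceptions = defaultdict(list)

    # Index every remote record by its key so lookups are cheap.
    remote_index = defaultdict(list)
    for source_name, records in other_sources.items():
        for record in records:
            remote_index[record[key_field]].append((source_name, record))

    local_index = {r[key_field]: r for r in local_records}

    # Field-level mismatches between the local record and each remote view.
    for key, local in local_index.items():
        for source_name, remote in remote_index.get(key, []):
            for fld in compare_fields:
                if local.get(fld) != remote.get(fld):
                    exceptions[key].append({
                        "field": fld, "local": local.get(fld),
                        "source": source_name, "remote": remote.get(fld),
                    })

    # Counterparties known elsewhere but absent locally.
    for key in remote_index:
        if key not in local_index:
            exceptions[key].append({"issue": "missing locally"})

    return dict(exceptions)


# Example run with two other sources (all data invented).
local = [{"lei": "5493001KJTIIGC8Y1R12", "name": "Acme Ltd", "country": "GB"}]
others = {
    "paris_books": [{"lei": "5493001KJTIIGC8Y1R12", "name": "ACME Limited", "country": "GB"}],
    "ny_crm": [{"lei": "213800MBWEIJDM5CU638", "name": "Globex Inc", "country": "US"}],
}
for key, issues in find_exceptions(local, others).items():
    print(key, issues)
```

No golden copy is defined up front: each site runs the same comparison against the sources it cares about, and integration emerges from many local clean-ups rather than one central project.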

Because it is likely to improve local processes, this approach is more likely to gain local support. If you incentivise each local site, all sites could adopt it in parallel, with globally integrated results emerging within weeks.

The federated model works for both Ops and IT

By incorporating your existing core operational data into a single, dynamic store, you will be working with factual data, which encourages ‘straight line’ thinking from your Ops and IT teams. It does not put your production processes at risk, and your IT team no longer needs to spend time producing the contract-level design specifications that were necessary in the past.

Your Ops teams will be able to continue working within their existing silos while operating in the more effective community model. One large bank that applied this federated approach delivered, within a few weeks, a single online warehouse consolidating its instrument and counterparty reference data and all of its different trade types from dozens of internal and external data sources.
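
As an illustration of what such a consolidation might look like under the hood, the sketch below wraps records of any type (instruments, counterparties, trades) from any source in a common envelope and lands them in a single store, without forcing a per-type schema. The envelope fields, class names and source names are assumptions made for the example, not the bank’s actual design.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class Envelope:
    source: str                  # e.g. "murex_fo", "bloomberg_feed" (hypothetical names)
    record_type: str             # "instrument", "counterparty", "trade", ...
    business_key: str            # ISIN, LEI, trade id, ...
    payload: dict[str, Any]      # the original record, untouched
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class GenericStore:
    """Single dynamic store: appends every envelope and retrieves by (type, key)."""
    def __init__(self):
        self._records: list[Envelope] = []

    def ingest(self, envelope: Envelope) -> None:
        self._records.append(envelope)

    def history(self, record_type: str, business_key: str) -> list[Envelope]:
        return [e for e in self._records
                if e.record_type == record_type and e.business_key == business_key]

# Usage: feeds from many sources all call store.ingest(...) with their own payload
# shape; consolidation happens by business key, not by forcing a common schema.
store = GenericStore()
store.ingest(Envelope("bloomberg_feed", "instrument", "US0378331005",
                      {"ticker": "AAPL", "name": "Apple Inc"}))
store.ingest(Envelope("murex_fo", "trade", "TRD-000123",
                      {"isin": "US0378331005", "qty": 100, "status": "NEW"}))
print(len(store.history("trade", "TRD-000123")))   # 1
```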

By simply adding further data sources, the bank can establish full lifecycle monitoring of all trades within a fully integrated reconciliation process, at a fraction of the cost it would previously have incurred.
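
The sketch below illustrates, again with hypothetical source and field names, how lifecycle monitoring could fall out of that consolidation: once the same trade arrives from more than one source, the latest status seen by each source can be compared and any disagreement flagged as a reconciliation break.

```python
def latest_status_by_source(events):
    """events: iterable of dicts like
    {"source": ..., "trade_id": ..., "status": ..., "seq": ...}."""
    latest = {}
    for e in sorted(events, key=lambda e: e["seq"]):
        latest[e["source"]] = e["status"]   # later events overwrite earlier ones
    return latest

def lifecycle_breaks(events_by_trade):
    """Return trade ids whose sources disagree on the current lifecycle state."""
    breaks = {}
    for trade_id, events in events_by_trade.items():
        statuses = latest_status_by_source(events)
        if len(set(statuses.values())) > 1:
            breaks[trade_id] = statuses
    return breaks

# Invented example: front office says CONFIRMED, back office still says PENDING.
events = {
    "TRD-000123": [
        {"source": "murex_fo", "trade_id": "TRD-000123", "status": "CONFIRMED", "seq": 2},
        {"source": "back_office", "trade_id": "TRD-000123", "status": "PENDING", "seq": 3},
    ]
}
print(lifecycle_breaks(events))
# {'TRD-000123': {'murex_fo': 'CONFIRMED', 'back_office': 'PENDING'}}
```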

There is no longer any reason NOT to start

The federated model is no longer constrained by IT processing capacity. Current network processing power and scalability far exceed what is required to record over 500,000 traded transactions per day, each with many events, plus thousands of client and reference data updates per global trading day. Indeed, with the processing power now available, it is viable to consider this model serving many trading partners at once.

Such a model is also not constrained by data storage: most smartphones today could hold many days of data from massive transaction volumes. It is not constrained by the specific data elements either, provided that the platform or tool used is generic and independent of the type of data gathered.
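
A rough back-of-envelope calculation supports the storage point. Using the 500,000 transactions per day cited above, and assuming (purely for illustration) ten lifecycle events per transaction and around 2 KB per stored event:

```python
# Back-of-envelope check of the storage claim. The transaction count comes from
# the article; events per transaction and bytes per event are assumptions.
transactions_per_day = 500_000
events_per_transaction = 10        # assumption
bytes_per_event = 2_000            # assumption: ~2 KB per stored event

daily_bytes = transactions_per_day * events_per_transaction * bytes_per_event
print(f"{daily_bytes / 1e9:.1f} GB per day")   # 10.0 GB per day

# Even at these generous assumptions, a full week of activity is roughly 70 GB,
# comfortably within the flash storage of a current high-end smartphone,
# let alone a commodity server.
```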

If the model is constrained at all in the near term, it is only by a lack of company vision and by your IT team’s historically valid reluctance to empower the user. With the right mindset, you can begin to move towards radically improved data processes today, without any high-risk investment.
