It is hard to develop good data governance practices with a federated approach to data management, according to Hans Lux, head of data architecture, and Jagtar Bachra, lead data analyst, both from UBS Global Asset Management. Speaking at this week’s TSAM in London, Lux and Bachra argued that a more traditional, centralised EDM approach would be more successful in tackling firms’ data management challenges, especially with regard to securities master data.
“The more moving parts you have within your data infrastructure, the harder it is to keep a close handle on data governance and data quality,” contended Lux. He promoted instead the development of a common repository for reference data in which cross references and mapping between formats are facilitated. “An intuitive user interface to this centralised system, a data dictionary and a well-defined data workflow are also important for success,” he added.
This approach is obviously the polar opposite of the one that has been adopted by large financial institutions such as Bank of America Merrill Lynch with its data fabric initiative. The larger sell side institution is no longer focusing on establishing a single gold copy; rather, it is working to build a robust data fabric to support all of its downstream users’ requirements in a federated manner.
The disparity between these approaches likely stems from the fact that one is a tier one sell side firm in which many silos persist, whereas smaller buy side institutions can more easily centralise and consolidate functions in an enterprise-wide fashion. Hence a federated approach is more appropriate within the larger firm in a practical sense, while the buy side firm is more easily able to implement a traditional EDM system (although this is far from a simple procedure).
However, both firms agree that regulation can be put to good use. To this end, Lux and Bachra also discussed the practicalities of putting data governance standards in place and how firms can use regulatory developments to get support from key stakeholders in the business. Lux spoke about the subject of data governance during last year’s TSAM event and recommended a similar tack. He suggested then that data managers should use the regulatory climate and the audit trail to their advantage by getting senior level support for investment in EDM projects. This year, he referenced Basel III and the UK Financial Services Authority’s (FSA) remuneration code proposals as potential regulatory developments that data managers could turn to their advantage.
Bachra also discussed the challenges that lie ahead for any institution going down the EDM route, including the perception of such an approach as “cumbersome” and the difficulty of dealing with different stakeholders and silos from across the business. “The fact that there is no end goal or final end state and that you are always going towards a moving target is also a challenge,” he added. From a project management perspective, the success of EDM projects can be difficult to quantify and new items are constantly being added as they proceed.
In order to tackle these challenges, UBS Global Asset Management has adopted a “building block” approach to its EDM project and “iterative win driven planning” to prove return on investment as various milestones are reached, explained Bachra. Like many other data management teams, UBS’s initially focused on quick wins to gain executive support at the start of the process. “The cleaning up of instrument data quality is one such quick win because it leads to a demonstrable reduction in risk and costs,” he said.
The building block approach involves ensuring that all of the different tactical projects that are kicked off adopt a common approach and sit within a wider, holistic framework across the enterprise, explained Lux. “Consistency of approach and a company-wide vision are key,” he said.
Technology is also only one part of the solution, he added: “Tools are key enablers but the cultural barriers and changes are just as important to deal with. Make it as simple as you can but no simpler.” Essentially, this means making the new data management tools simple enough for downstream users to adopt, while also ensuring that the tools do their job appropriately.