
UBS Asset Management’s Webster Elaborates on its Constitutional Approach to Data Management


UBS Asset Management has a relatively unusual approach to its data management strategy: it has a constitution that lays out the policies and principles related to data ownership and accountability. Ian Webster, global head of data management for the buy-side arm of UBS, showed the document to FIMA delegates last week and explained that the data management function is jointly categorised under operations and IT, and is sponsored by the firm’s chief operating officer.

Similar to the set-up at Royal Bank of Scotland (RBS), UBS Asset Management’s data owners come from the business in order to ensure that they take accountability for the quality of the data, explained Webster. “We need to educate them about their responsibilities, as the data management function operates like HR across a business,” he said. In a comparable manner to HR, data management carries out the administrative functions around data by setting procedures and policies and educating business users about these, whereas the business users own the data and have responsibility for it.

Of course, implementation of this structure is the hard part, conceded Webster. “There are areas of the business that ‘get’ the importance of data management and there are others that are more of a challenge,” he contended. Earlier in the conference, Webster noted that the business benefits of investment in data management projects should be obvious in practical terms; otherwise data managers should not go ahead with such projects.

In terms of UBS’ own progress, Webster indicated that the firm began by tackling its instrument data, then its client data, and is now looking at servicing the requirements of the pre-trade environment.

A data management team’s credibility is only as good as the last project it has completed, warned Webster. “As an industry we may have matured but we need to make sure that we do not become complacent and let data management go stale,” he said. The industry has focused on data aggregation and validation for the last two or three years but it needs to move beyond this area, as it is only a small component of the overall data management function, he added.

Governance is the topic at the top of the list for UBS at the moment, said Webster, as it is a key component in solving the data management challenge. The governance aspect hinges on a focus on the downstream users of data and their requirements as internal clients. This has resulted in the need for a more “entrepreneurial” outlook among staff members in the data management team, according to Webster. “They should not be hidden away in the basement, rather they should be out there proving how they can positively impact the business. For example, measuring how quickly new asset types can be added to the system,” he contended. “The quicker you can do this, the easier it is to monetise the strategy.”

Other important aspects of the overall data management function are data sourcing, transformation, distribution, persistence and cross-referencing, the last of which ties them all together. Sourcing includes a location strategy, taking into account regional presence and geographic coverage, and persistence relates to maintaining one version of the truth in terms of data storage.

Distribution is the “Cinderella bit” of data management at the moment, he suggested, as it is often hard to determine how well downstream clients are being serviced. “We need to measure what they have to do to get data into the system and how difficult a process this is,” he said.

Webster believes that the vision of a centralised, pure enterprise data management system that sits in the middle of the business and is relatively untouched by business users is the wrong route to take. “Everything changes and feeds off each other – it is a data cycle, not a linear process,” he explained. The tension between the consumers and the producers of data should also be taken into account in the design of data distribution channels.

“I’m concerned that we over-conceptualise data quality,” he concluded. “We cannot treat the financial services industry in the same manner as other industries, as we are an information processing industry.” Taking one model of data management that works in one industry and trying to apply it to financial services is therefore, in Webster’s eyes, a foolhardy effort. Food for thought for those working on the utility model, for example.

