Managed services and utilities can cut the cost of reference data, but to be truly effective managed services must be more flexible and utilities must address issues of data access and security.
A panel session led by A-Team Group editor-in-chief Andrew Delaney at the A-Team Group Data Management Summit in London set out to discover the advantages and challenges of managed services and utilities, starting with a definition of these data models.
Martijn Groot, director at Euroclear, said: “A managed service lifts out existing technology and hands it over to the managed service provider, while a utility provides common services for many users.” Tom Dalglish, CTO, group data at UBS, added: “Managed services run data solutions for us and utilities manage data for themselves.”
Based on these definitions, the panellists considered how and why managed services and utilities are developing. Dalglish commented: “We need to move away from all doing the same things with data. Managed business process outsourcing services are well understood, but utilities present more challenges – will they be run as monopolies and make data difficult to access, what is the vendor interest?” Steve Cheng, global head of data management at Rimes Technologies, added: “The market has moved on from lift outs. New technologies mean managed services can be more flexible than outsourcing.”
It is not only the nature of available services that is driving financial firms to third-party providers, but also cost and regulation, both of which are high on the agenda. Jonathan Clark, group head of financial services at Tech Mahindra, explained: “Cost is significant, but regulation is the number one issue. Regulations require more holistic and high quality data and that is high cost for firms, so they are trying to get data quality at a reasonable price point.”
Dalglish focussed on cost, saying: “The business case is about money. Large companies have lost the ability to change; a utility can help to reduce costs. Banks are looking at these data models to regain efficiencies they have lost internally and find difficult to rebuild.”
Cheng described the reference data utility model as being closer to the satellite television model than to water or electricity models, and noted that Rimes’ experience is that customers want to innovate without allowing their cost base to increase.
While the consensus among the panellists was that managed services and utilities can provide cost savings, they also agreed that it is not the cost of data itself, but the infrastructure, sources, services and people around the data that rack up costs to an extent that is leading firms to seek lower cost solutions. Firms that opt to use a data utility can convert capital costs to operating expenditure and chip away at elements such as multiple data sources.
Dalglish commented: “If you can achieve savings of 30% to 35% that is good, but this is a conservative estimate and it should be possible to save more going forward.” Cheng added: “The rule of thumb is that for every £1 spent on data licences, £2 or £3 is spent on infrastructure and staff. The need is to identify those hidden costs so that the use of a managed service or utility can be justified.”
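The panel’s rule of thumb and conservative saving estimate can be sketched as a back-of-the-envelope calculation. The figures below are illustrative assumptions only: the 2.5x hidden-cost multiplier sits in the middle of Cheng’s £2–£3 range, the 30% saving rate is Dalglish’s conservative figure, and the £1m licence spend is an invented example.

```python
def total_data_cost(licence_spend, hidden_multiplier=2.5):
    """Estimate all-in data cost from licence spend alone.

    The panel's rule of thumb: for every £1 on data licences,
    £2-£3 goes on infrastructure and staff. The 2.5x default is
    an assumed midpoint of that range.
    """
    hidden_costs = licence_spend * hidden_multiplier
    return licence_spend + hidden_costs


def projected_saving(total_cost, saving_rate=0.30):
    """Apply the panel's conservative 30% saving estimate."""
    return total_cost * saving_rate


# Hypothetical firm spending £1m a year on data licences
licences = 1_000_000
total = total_data_cost(licences)
saving = projected_saving(total)
print(f"All-in cost: £{total:,.0f}, potential saving: £{saving:,.0f}")
```

The point the panellists make is visible in the arithmetic: the hidden costs dwarf the licence fees, so identifying them is what justifies the move to a managed service or utility.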
Returning to the pressure of regulation, Delaney asked the panel whether managed reference data services and utilities would be regulated in the same way as banks. While this is not happening at the moment, some panel members expect it to happen and warn that utilities may find a way around regulation by using disclaimers. Cheng said: “Forthcoming regulations are very prescriptive about data models and regulators may look at the whole data chain. This means utilities and managed services may in future be subject to the same regulatory requirements as other market participants.”
The concept of managed services and utilities is not new. Dalglish recalled an effort to set up a utility back in 2005 that did not take off, and said the moment has now come for utilities as the technology stack has improved, data is better understood and this is a good time for competition and collaboration in the market. Groot added: “Data delivery mechanisms have changed, the bar has been raised on projects and the business case for an internal service is difficult, making external services attractive.” Panellists also noted that technologies such as the internet and cloud facilitate mass customisation, and pointed to the benefit of utilities built for a single purpose.
With so much to offer, Delaney questioned the panel on what type of organisations will benefit from third-party utilities. Panel members said both large and small firms could benefit, with large companies reducing today’s massive data costs and small firms being able to hand off non-core reference data services. Clark added: “Firms that can benefit most are those that find it difficult to discover the cost of data, perhaps because it is managed in different departments or geographic regions. But these firms are also the hardest to convert because they don’t know their costs.”
A question from the audience about defining reference data, making it open and putting it in a utility for all to use met a consensus response from panel members, who said it is a great idea but will not happen because there are too many vendors with vested interests in the market.
Closing with a blue skies scenario, Delaney asked how far the utility concept could go. Groot concluded: “There is a need for operational procedures and recovery planning, but utilities could go a long way as there is a lot of data in scope.”