New commercial models for enterprise and reference data management differ significantly from those already installed in financial institutions and could address ongoing problems by reducing costs, improving data quality and providing the scale to support increasing data volumes – but as yet they have not been widely adopted and proven in the market.
Leading a session at A-Team Group’s New York Data Management Summit, Predrag Dizdarevic, a partner at data management advisory firm Element22, noted the need to manage not just reference data, but also enterprise data, and invited three providers of new data management models to discuss how their platforms could change the data management landscape and resolve the industry’s problems.
Joseph Turso, a vice president at SmartStream, detailed the utility model supporting legal entity, securities and pricing data that SmartStream has been developing over the past five years. He said: “The utility model is very different to business process outsourcing. It takes in data, normalises it to a single common model and applies rules to the data. This process is done once for everyone’s benefit. The utility doesn’t restrict clients’ data sources, clients still manage data vendor relationships and they can customise the outputs they want. This is a cost-effective way to manage data as costs are shared by all those that use the utility.”
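In concrete terms, the “normalise once, apply rules once” flow Turso describes might look something like the toy Python sketch below. The vendor names, field mappings and rules are invented for illustration and are not drawn from SmartStream’s actual model.

```python
# Illustrative only: take in vendor data, normalise it to a single common model,
# then apply a shared rule set once for everyone's benefit.

RULES = [
    ("currency present", lambda rec: bool(rec.get("currency"))),
    ("price non-negative", lambda rec: rec.get("price", 0.0) >= 0),
]

def normalise(vendor, raw):
    """Map a vendor-specific record onto the single common model (hypothetical formats)."""
    if vendor == "vendor_a":
        return {"isin": raw["ISIN"], "price": float(raw["PX_LAST"]), "currency": raw["CRNCY"]}
    if vendor == "vendor_b":
        return {"isin": raw["id_isin"], "price": float(raw["close"]), "currency": raw["ccy"]}
    raise ValueError(f"unknown vendor: {vendor}")

def validate(record):
    """Apply the shared rule set; return the names of any failed rules."""
    return [name for name, check in RULES if not check(record)]

# The utility does this once for all clients; each client then takes a customised
# slice of the cleansed output rather than repeating the work in-house.
feeds = [("vendor_a", {"ISIN": "US0378331005", "PX_LAST": "172.5", "CRNCY": "USD"}),
         ("vendor_b", {"id_isin": "US0378331005", "close": "172.4", "ccy": "USD"})]

golden_copy = []
for vendor, raw in feeds:
    record = normalise(vendor, raw)
    record["exceptions"] = validate(record)
    golden_copy.append(record)

print(golden_copy)
```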
Joseph Ohayon, CEO of OTCFin, provider of a managed service offering risk, performance and analytics data, described how OTCFin works closely with clients to define architectures and requirements, and focuses on providing clean and actionable datasets that can plug into a risk management system. He said: “More so than in reference data, the issue is how to aggregate risk and analytics data. The data is more complex and it needs to be supported by people with financial engineering expertise as well as by those with technology expertise. We look at data and make sense of it, this is our added value.”
Max Yankelevich, CEO of WorkFusion, a firm spun out of the Massachusetts Institute of Technology that runs a software-as-a-service platform including machine learning and crowdsourced staff for financial data management, said: “In 2009, we were thinking about data management and decided that, as it was, it was too complex and couldn’t handle more volume just by tweaking data or throwing people at it. We applied artificial intelligence to the problem and could see that a data analyst achieving 98% data quality was doing a good job. By taking that human quality and automating it over time using machine learning, we could build a virtuous loop in which the machine takes over from the human when it is confident and passes any difficulties to the human, who is crowdsourced. The machine then learns again from the human and so the cycle continues, automating more and more data management tasks.”
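A minimal sketch of that confidence-based loop is shown below, assuming a scikit-learn text classifier and an arbitrary confidence threshold. The data, labels, threshold and routing logic are illustrative assumptions, not WorkFusion’s implementation: the model handles items it is confident about, escalates the rest to a human, and retrains on the human’s answers.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

CONFIDENCE_THRESHOLD = 0.98  # loosely echoing the "98% data quality" bar mentioned above

# Seed training data: records already labelled by analysts (toy examples).
texts = ["Apple Inc common stock", "US Treasury bond 2030",
         "Tesla Inc equity", "German Bund 2035"]
labels = ["equity", "bond", "equity", "bond"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def ask_human(text):
    """Placeholder for routing the item to a crowdsourced worker."""
    return input(f"Classify '{text}' (equity/bond): ")

def process(incoming):
    labelled = []
    for text in incoming:
        confidence = model.predict_proba([text])[0].max()
        if confidence >= CONFIDENCE_THRESHOLD:
            labelled.append((text, model.predict([text])[0]))   # machine is confident
        else:
            labelled.append((text, ask_human(text)))            # escalate to a human
    # Retrain on the newly labelled items so the machine takes over more work next time.
    texts.extend(t for t, _ in labelled)
    labels.extend(l for _, l in labelled)
    model.fit(texts, labels)
    return labelled
```

Early on, most items fall below the threshold and go to humans; as the model is retrained on their answers, a growing share is handled automatically, which is the “virtuous loop” described above.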
Acknowledging data quality and reduced data costs as the drivers behind these emerging data management solutions, Dizdarevic questioned how exactly each provider’s solution met these needs. He turned first to Turso, who had said earlier that the SmartStream reference data utility could provide 30% cost savings.
Turso noted the dichotomy between improved data quality and reduced costs, but said utilities offer a viable solution for both by taking the complexity and cost out of a firm’s data environment and handing tasks such as data normalisation, cleansing, scrubbing, cross-referencing and corporate actions management to a utility. He said: “Over the past 10 to 15 years, firms have optimised their costs and outsourced where they can. But we have reached an impasse on reducing costs and need an alternative solution. Banks’ cost structures traditionally put operational costs first, followed by technology and processing costs, and then data costs. They now need to reverse the structure and spend more on data, enriching data and managing complex data, and less on operational and technology costs. Utilities help as they take away complexity and simplify the on-boarding and integration of data, lowering integration costs and improving data quality.”
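Cross-referencing is one of the tasks listed above; the short sketch below shows, in simplified form, how records arriving under different vendor identifiers might be linked to a single master entry. The identifiers, mappings and field names are hypothetical.

```python
# Toy cross-referencing: records for the same security arrive from different vendors
# under different identifiers, and the utility folds them into one master record.

master = {}                                            # consolidated copy, keyed by ISIN
alias_to_isin = {"037833100": "US0378331005",          # CUSIP -> ISIN (illustrative)
                 "2046251":   "US0378331005"}          # SEDOL -> ISIN (illustrative)

def cross_reference(record):
    """Link a vendor record to the master entry for its security."""
    isin = record.get("isin") or alias_to_isin.get(record.get("cusip") or record.get("sedol"))
    if isin is None:
        raise LookupError(f"unmapped identifiers: {record}")
    master.setdefault(isin, {}).update(record)

cross_reference({"cusip": "037833100", "issuer": "Apple Inc"})
cross_reference({"isin": "US0378331005", "price": 172.5})
print(master)  # one consolidated record instead of two fragments
```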
In the risk data space, OTCFin works with firms to separate commodity data from strategy data and can then manage whatever data is neither proprietary nor competitive. Ohayon commented: “Firms need to make better sense of data and the needs of their organisations with a view to reducing the cost of data ownership. Managed service providers must demonstrate that they can reduce a firm’s costs, but the main barriers to outsourcing remain reluctance to change and concerns around confidentiality and information security. Outsourcing needs both management and IT buy-in.”
In terms of OTCFin’s products and services, Ohayon pointed to cost reductions resulting from implementing systems that already include data connectors, noting that for every dollar spent on a purchase, four dollars are spent on implementation. Similarly, he described OTCFin’s managed service platform that reduces costs by providing pre-built integration with risk systems and best-of-breed pricing data.
Yankelevich said WorkFusion can help firms reduce costs and add agility by doing more with less. He explained: “Most firms are staffed to handle a certain amount of data, but are working with more data. They can tap into existing operations teams and automate some of the workload, but automation can be more costly than running operations teams, leaving firms caught between the two. Our solution plays into this, automating data management as time goes on and providing crowdsourced freelancers that are paid only for their output.” WorkFusion guarantees 50% savings in reference data management and has achieved 90% for some customers.
After a lively discussion on the potential of new models for enterprise and reference data management, Dizdarevic concluded: “These are interesting times, with changes coming from many directions and disruptive approaches to data management coming from several providers.”