Budget cutting has hit the data vendor community hard, and nowhere more so than at the enterprise data management (EDM) end of the spectrum. Given the limited appetite for all-encompassing projects to restructure a firm’s entire approach to its reference data, and the cost, time and complexity involved in such an endeavour, it is unsurprising that few deals have been signed in recent months.
This may also go some way to explaining the results of this month’s Reference Data Review reader poll. It seems that although EDM is still a relevant concept, distributed data management (DDM) is rising in importance, likely a result of the downward pressure on costs caused by the tough economic climate and the growth of electronic trading. According to 56% of respondents to our reader poll, centralised data management, or EDM, is still top of the list for data management projects (in theory if not in practice). But for the remaining 44%, distributed data models are the way forward.
Last year, analyst firm Aite Group produced a report claiming DDM was the next big thing in data management, and it seems that a significant proportion of Reference Data Review readers agree. DDM grows out of the technologies associated with electronic trading, such as in-memory data caches, complex event processing engines, data fabrics and grid computing. The ethos behind a distributed data architecture is the creation of multiple sets of ‘truth’, where each version is unique to the subscriber and their needs.
“You don’t need to house the data universe in a single instance. You can break out by geography, product type, data type, however you want to manage ‘truth’,” explained Adam Honoré, senior analyst with Aite Group and author of the report, on its release. Aite Group claimed that EDM was rarely realised in large firms due to flaws in the execution of such centralised data models. It accused such models of contributing to latency, creating a single point of failure, experiencing significant integration pain and requiring that like data be used on disparate systems.
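Honoré’s point about breaking data out “by geography, product type, data type” can be illustrated with a minimal sketch of subscriber-specific caches, where each subscriber holds only its own filtered ‘version of truth’ rather than querying a single golden copy. All class and field names below are hypothetical illustrations, not any vendor’s API:

```python
# Hypothetical sketch of a distributed data model: instead of one central
# golden copy, each subscriber keeps an in-memory cache scoped to the slice
# of reference data it actually needs (by region, product type, etc.).
from dataclasses import dataclass

@dataclass(frozen=True)
class Instrument:
    identifier: str      # e.g. an ISIN
    region: str          # e.g. "EU", "US"
    product_type: str    # e.g. "equity", "bond"
    price: float

class SubscriberCache:
    """An in-memory cache holding only the records one subscriber cares about."""
    def __init__(self, name, predicate):
        self.name = name
        self.predicate = predicate   # filter defining this subscriber's 'truth'
        self.records = {}

    def on_update(self, inst):
        # Each cache applies its own filter, so an update fans out into
        # multiple deliberately different local views, not one central store.
        if self.predicate(inst):
            self.records[inst.identifier] = inst

def publish(update, subscribers):
    """Fan an update out to every subscriber cache; there is no golden copy."""
    for sub in subscribers:
        sub.on_update(update)

# Two subscribers partitioned by geography and product type respectively.
eu_desk = SubscriberCache("eu-desk", lambda i: i.region == "EU")
bond_risk = SubscriberCache("bond-risk", lambda i: i.product_type == "bond")
subs = [eu_desk, bond_risk]

publish(Instrument("DE0001102309", "EU", "bond", 101.2), subs)
publish(Instrument("US0378331005", "US", "equity", 172.5), subs)
```

In this toy model the European desk and the bond risk team end up with overlapping but distinct data sets, which is precisely the ‘multiple truths’ trade-off DDM makes against the single point of failure and integration pain Aite Group attributes to centralised models.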
EDM projects have also frequently been criticised for their high costs and lengthy implementation times. In an environment such as today’s, where sign-off is granted only to projects that can be completed within short timeframes and budgets have been slashed to the bare minimum, EDM may be suffering from this negative perception among senior management. Although risk management and regulation have both raised the profile of data management within institutions, DDM may be becoming the more attractive proposition because it is perceived as a more targeted and faster approach to data management.
Financial institutions are also spending their limited budgets in targeted areas, such as entity data management systems and valuations data. Last month saw the valuations vendor community come together to discuss trends and opportunities in the market at the Valuations & Risk 2009 conference in London. Panellists agreed that firms are taking a greater number of valuations data feeds than ever before to ensure transparency, and are asking their vendors for greater depth of data (see our lead story for details).
The partnership between Avox and Standard & Poor’s Cusip Services Bureau is also indicative of the appetite for greater standardisation of entity data. The development of a new universal identification system for global business entities has almost certainly been prompted by financial institutions’ intense focus on counterparty risk, following the troubles experienced by so many large firms last year. Whether the vendors succeed in getting the market to adopt these new identifiers remains to be seen (the system has not even launched yet), but there is certainly a need for someone to assume the mantle of business entity standards champion.