About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

A rose by any other name…


Budget cutting has hit the data vendor community hard, and nowhere more so than at the enterprise data management (EDM) end of the spectrum. Given the limited appetite for all-encompassing projects to restructure a firm’s entire approach to its reference data, and the cost, time and complexity involved in such an endeavour, it is unsurprising that few deals have been signed in recent months.

This may also go some way to explaining the results of this month’s Reference Data Review reader poll. It seems that although EDM is still a relevant concept, distributed data management (DDM) is rising in importance, likely as a result of the downward pressure on costs caused by the tough economic climate and the growth of electronic trading. According to 56% of the respondents to our reader poll, centralised data management or EDM is still top of the list for data management projects (in theory if not in practice). But for 44% of respondents, distributed data models are the way forward.

Last year, analyst firm Aite Group produced a report claiming DDM was the next big thing for data management, and it seems that a significant proportion of Reference Data Review readers agree. DDM grows out of the technologies associated with electronic trading, such as in-memory data caches, complex event processing engines, data fabrics and grid computing. The ethos behind a distributed data architecture is the creation of multiple sets of ‘truth’, where each version is unique to the subscriber and their needs.

“You don’t need to house the data universe in a single instance. You can break out by geography, product type, data type, however you want to manage ‘truth’,” explained Adam Honoré, senior analyst with Aite Group and author of the report, on its release. Aite Group claimed that EDM was rarely realised in large firms due to flaws in the execution of such centralised data models. It accused such models of contributing to latency, creating a single point of failure, experiencing significant integration pain and requiring that like data be used on disparate systems.
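The idea Honoré describes can be pictured in a few lines of code. The sketch below is purely illustrative, not any vendor’s implementation: each subscriber receives its own filtered ‘set of truth’, broken out by geography or product type rather than served from a single centralised copy. All names (`SecurityRecord`, `subscriber_view`, the sample identifiers) are hypothetical.

```python
# Illustrative sketch of the DDM idea: per-subscriber 'sets of truth'
# filtered from a shared universe, rather than one centralised golden copy.
# In a real deployment the universe would live in in-memory caches or a
# data fabric, not a Python list.

from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityRecord:
    identifier: str
    geography: str      # e.g. "EMEA", "APAC"
    product_type: str   # e.g. "equity", "bond"
    price: float

# Hypothetical record universe shared across the firm.
UNIVERSE = [
    SecurityRecord("XS0001", "EMEA", "bond", 101.25),
    SecurityRecord("US0002", "AMER", "equity", 54.10),
    SecurityRecord("HK0003", "APAC", "equity", 12.80),
]

def subscriber_view(geography=None, product_type=None):
    """Return a subscriber-specific slice of 'truth', filtered to its needs."""
    return [
        r for r in UNIVERSE
        if (geography is None or r.geography == geography)
        and (product_type is None or r.product_type == product_type)
    ]

# An APAC equities desk sees only the records relevant to it.
apac_equities = subscriber_view(geography="APAC", product_type="equity")
```

The point of the sketch is Honoré’s observation: ‘truth’ can be partitioned however the subscriber needs it, avoiding the single point of failure and integration pain attributed to fully centralised models.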

EDM projects have also frequently been criticised for their high costs and lengthy implementation times. In an environment such as today’s, where project sign-off is predicated on completion within short timeframes and budgets have been slashed to the bare minimum, EDM may be suffering from this negative perception among senior management. Although risk management and regulation have both raised the profile of data management within institutions, it could be that DDM is becoming the more attractive proposition due to its perception as a more targeted and faster approach to data management.

Financial institutions are also spending their limited budgets in targeted areas, such as entity data management systems and valuations data. Last month saw the valuations vendor community come together to discuss the trends and opportunities in the market at the Valuations & Risk 2009 conference in London. Panellists agreed that there is a trend towards firms taking a greater number of valuations data feeds than ever before to ensure transparency and asking for a greater depth of data from their vendors (see our lead story for details).

The partnership between Avox and Standard & Poor’s Cusip Services Bureau is also indicative of the appetite for greater standardisation of entity data. The development of a new universal identification system for global business entities has most certainly been prompted by financial institutions’ intense focus on counterparty risk, following the troubles experienced by so many large firms last year. Whether the vendors succeed in getting the market to adopt these new identifiers remains to be seen (the system hasn’t even been launched yet), but there is definitely a need for someone to assume the mantle of a business entity standards champion.

