The knowledge platform for the financial technology industry

A-Team Insight Blogs

A rose by any other name…


Budget cutting has hit the data vendor community hard, and nowhere more so than at the enterprise data management (EDM) end of the spectrum. Given the limited appetite for all-encompassing projects to restructure a firm’s entire approach to its reference data, and the cost, time and complexity involved in such an endeavour, it is unsurprising that few deals have been signed in recent months.

This may also go some way to explaining the results of this month’s Reference Data Review reader poll. It seems that although EDM is still a relevant concept, distributed data management (DDM) is rising in importance, likely as a result of the downward pressure on costs caused by the tough economic climate and the growth of electronic trading. According to 56% of respondents to our reader poll, centralised data management, or EDM, is still top of the list for data management projects (in theory if not in practice). But for 44% of respondents, distributed data models are the way forward.

Last year, analyst firm Aite Group produced a report claiming that DDM was the next big thing for data management, and it seems a significant proportion of Reference Data Review readers agree. DDM grows out of the technologies associated with electronic trading, such as in-memory data caches, complex event processing engines, data fabrics and grid computing. The ethos behind a distributed data architecture is the creation of multiple sets of ‘truth’, where each version is unique to the subscriber and their needs.

“You don’t need to house the data universe in a single instance. You can break out by geography, product type, data type, however you want to manage ‘truth’,” explained Adam Honoré, senior analyst with Aite Group and author of the report, on its release. Aite Group claimed that EDM was rarely realised in large firms because of flaws in the execution of such centralised data models, accusing them of contributing to latency, creating a single point of failure, causing significant integration pain and forcing disparate systems to consume identical data.
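To make the contrast concrete, the distributed model Honoré describes can be sketched in a few lines of code. This is a purely illustrative toy, not any vendor’s product or API: every class and name below is hypothetical, and it simply shows the idea of each domain (geography, product type, desk) holding its own scoped version of ‘truth’ rather than subscribing to one enterprise-wide golden copy.

```python
from dataclasses import dataclass, field

# Illustrative sketch only (hypothetical names, no vendor API implied):
# each subscriber domain maintains its own slice of reference data,
# rather than all systems depending on a single centralised store.

@dataclass
class DomainStore:
    """Reference data scoped to one subscriber domain."""
    domain: str
    records: dict = field(default_factory=dict)

    def publish(self, key, value):
        # Each domain writes its own view of 'truth'.
        self.records[key] = value

    def lookup(self, key):
        # Lookups never cross domains, so failures and latency stay local.
        return self.records.get(key)

# Two independent domains, each caching only the data it actually needs.
emea_equities = DomainStore("EMEA-equities")
us_fixed_income = DomainStore("US-fixed-income")

emea_equities.publish("VOD.L", {"name": "Vodafone Group", "currency": "GBP"})
us_fixed_income.publish("912828", {"issuer": "US Treasury", "currency": "USD"})

print(emea_equities.lookup("VOD.L"))
# The EMEA store knows nothing about the US instrument:
print(emea_equities.lookup("912828"))  # None
```

The design choice this illustrates is exactly the trade-off in the Aite Group argument: there is no single point of failure and no forced enterprise-wide integration, but there are now multiple versions of ‘truth’ to reconcile if domains ever need to agree.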

EDM projects have also frequently been criticised for high costs and lengthy implementation times. In an environment such as today’s, where project sign-off is predicated on completion within short timeframes and budgets have been slashed to the bare minimum, EDM may be suffering from this negative perception among senior management. Although risk management and regulation have both raised the profile of data management within institutions, DDM may be becoming the more attractive proposition because it is perceived as a more targeted and faster approach to data management.

Financial institutions are also spending their limited budgets in targeted areas, such as entity data management systems and valuations data. Last month saw the valuations vendor community come together to discuss the trends and opportunities in the market at the Valuations & Risk 2009 conference in London. Panellists agreed that there is a trend towards firms taking a greater number of valuations data feeds than ever before to ensure transparency and asking for a greater depth of data from their vendors (see our lead story for details).

The partnership between Avox and Standard & Poor’s Cusip Services Bureau is also indicative of the appetite for greater standardisation of entity data. The development of a new universal identification system for global business entities has almost certainly been prompted by financial institutions’ intense focus on counterparty risk, following the troubles experienced by so many large firms last year. Whether the vendors succeed in getting the market to adopt the new identifiers remains to be seen (the system hasn’t even been launched yet), but there is definitely a need for someone to assume the mantle of a business entity standards champion.
