The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

A rose by any other name…

Budget cutting has hit the data vendor community hard, and nowhere more so than at the enterprise data management (EDM) end of the spectrum. Given the limited appetite for all-encompassing projects to restructure a firm’s entire approach to its reference data, and the cost, time and complexity involved in such an endeavour, it is unsurprising that few deals have been signed in recent months.

This may also go some way to explaining the results of this month’s Reference Data Review reader poll. It seems that although EDM is still a relevant concept, distributed data management (DDM) is rising in importance. This is likely as a result of the downward pressure on costs caused by the tough economic climate and the growth of electronic trading. According to 56% of the respondents to our reader poll, centralised data management or EDM is still top of the list for data management projects (in theory if not in practice). But for 44% of respondents, distributed data models are the way forward.

Last year, analyst firm Aite Group produced a report claiming that DDM was the next big thing for data management, and it seems that a significant proportion of Reference Data Review readers agree. DDM grows out of the technology associated with electronic trading, such as in-memory data caches, complex event processing engines, data fabrics and grid computing. The ethos behind a distributed data architecture is the creation of multiple sets of ‘truth’, where each version is unique to the subscriber and their needs.

“You don’t need to house the data universe in a single instance. You can break out by geography, product type, data type, however you want to manage ‘truth’,” explained Adam Honoré, senior analyst with Aite Group and author of the report, on its release. Aite Group claimed that EDM was rarely realised in large firms due to flaws in the execution of such centralised data models. It accused such models of contributing to latency, creating a single point of failure, experiencing significant integration pain and requiring that like data be used on disparate systems.
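The idea of breaking ‘truth’ out by geography, product type or data type can be illustrated with a small sketch. This is purely hypothetical code (the record fields, regions and desk filters are invented for illustration, not drawn from the Aite report): a shared data universe is exposed to each subscriber as its own filtered view, so no single monolithic store has to serve every consumer identically.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InstrumentRecord:
    identifier: str    # hypothetical instrument code
    region: str        # e.g. 'EU', 'US'
    product_type: str  # e.g. 'equity', 'bond'
    price: float

# The full data universe; in a real DDM setup this might sit in an
# in-memory cache or data grid rather than a plain list.
universe = [
    InstrumentRecord("AAA", "US", "equity", 101.5),
    InstrumentRecord("BBB", "EU", "equity", 55.0),
    InstrumentRecord("CCC", "EU", "bond", 99.8),
]

def view_for(subscriber_filter):
    """Return a subscriber-specific 'version of truth':
    the subset of the universe that matches the subscriber's needs."""
    return [rec for rec in universe if subscriber_filter(rec)]

# A European equities desk and a bond desk each get their own view,
# partitioned by geography and product type respectively.
eu_equities = view_for(lambda r: r.region == "EU" and r.product_type == "equity")
bonds = view_for(lambda r: r.product_type == "bond")
```

Each desk works against its own view rather than a single centralised copy, which is the contrast the DDM proponents draw with the EDM model.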

EDM projects have also frequently been criticised for their high costs and lengthy implementation times. In an environment such as today’s, where sign off for projects is predicated on their completion within short timeframes and where budgets have been slashed to the bare minimum, EDM may be suffering from this negative view among senior management. Although risk management and regulation have both raised the profile of data management within institutions, it could be that DDM is becoming the more attractive proposition because it is perceived as a more targeted and faster approach to data management.

Financial institutions are also spending their limited budgets in targeted areas, such as entity data management systems and valuations data. Last month saw the valuations vendor community come together to discuss the trends and opportunities in the market at the Valuations & Risk 2009 conference in London. Panellists agreed that there is a trend towards firms taking a greater number of valuations data feeds than ever before to ensure transparency and asking for a greater depth of data from their vendors (see our lead story for details).

The partnership between Avox and Standard & Poor’s Cusip Services Bureau is also indicative of the appetite for greater standardisation of entity data. The development of a new universal identification system for global business entities has most certainly been prompted by the intense focus by financial institutions on counterparty risk, following the troubles experienced by so many large firms last year. Whether the vendors are successful in getting the market to adopt these new identifiers is yet to be ascertained (it hasn’t even been launched yet), but there is definitely a need for someone to assume the mantle of a business entity standards champion.
