
A-Team Insight Blogs

A rose by any other name…

Budget cutting has hit the data vendor community hard, and nowhere more so than at the enterprise data management (EDM) end of the spectrum. Given the limited appetite for all-encompassing projects to restructure a firm’s entire approach to its reference data, and the cost, time and complexity involved in such an endeavour, it is unsurprising that not many deals have been signed in recent months.

This may also go some way to explaining the results of this month’s Reference Data Review reader poll. It seems that although EDM is still a relevant concept, distributed data management (DDM) is rising in importance, likely as a result of the downward pressure on costs caused by the tough economic climate and the growth of electronic trading. According to 56% of the respondents to our reader poll, centralised data management or EDM is still top of the list for data management projects (in theory if not in practice). But for 44% of respondents, distributed data models are the way forward.

Last year, analyst firm Aite Group produced a report claiming DDM was the next big thing for data management, and it seems that a significant proportion of Reference Data Review readers agree. DDM grows out of the technology associated with electronic trading, such as in-memory data caches, complex event processing engines, data fabrics and grid computing. The ethos behind a distributed data architecture is the creation of multiple sets of ‘truth’, where each version is unique to the subscriber and their needs.

“You don’t need to house the data universe in a single instance. You can break out by geography, product type, data type, however you want to manage ‘truth’,” explained Adam Honoré, senior analyst with Aite Group and author of the report, on its release. Aite Group claimed that EDM was rarely realised in large firms due to flaws in the execution of such centralised data models. It accused such models of contributing to latency, creating a single point of failure, experiencing significant integration pain and requiring that like data be used on disparate systems.
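To make the contrast concrete, here is a minimal, hypothetical sketch of how a distributed model might hand each subscriber its own filtered ‘version of truth’ by geography and product type, rather than pointing everyone at a single central golden copy. The record fields, regions and vendor names below are invented for illustration only.

```python
from dataclasses import dataclass

# Hypothetical reference data record; the fields are illustrative only.
@dataclass(frozen=True)
class InstrumentRecord:
    identifier: str
    region: str        # e.g. "EMEA", "AMER", "APAC"
    product_type: str  # e.g. "equity", "fixed_income"
    price_source: str

def build_view(universe, *, region=None, product_type=None):
    """Return the slice of the data universe a given subscriber cares about."""
    return [
        rec for rec in universe
        if (region is None or rec.region == region)
        and (product_type is None or rec.product_type == product_type)
    ]

# A toy 'universe' of reference data.
universe = [
    InstrumentRecord("XS0001", "EMEA", "fixed_income", "vendor_a"),
    InstrumentRecord("US0002", "AMER", "equity", "vendor_b"),
]

# The EMEA fixed income desk subscribes only to its own version of 'truth'.
emea_credit_view = build_view(universe, region="EMEA", product_type="fixed_income")
print(emea_credit_view)
```

Each desk or region holds only the slice it needs, which is the flexibility Honoré describes; the trade-off is the effort required to keep those multiple versions of ‘truth’ consistent with one another.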

EDM projects have also frequently been criticised for their high costs and lengthy implementation times. In an environment such as today’s, where sign-off for projects is predicated on their completion within short timeframes and budgets have been slashed to the bare minimum, EDM may be suffering from this negative view among senior management. Although risk management and regulation have both raised the profile of data management within institutions, it could be that DDM is becoming the more attractive proposition because it is perceived as a more targeted and faster approach to data management.

Financial institutions are also spending their limited budgets in targeted areas, such as entity data management systems and valuations data. Last month saw the valuations vendor community come together to discuss the trends and opportunities in the market at the Valuations & Risk 2009 conference in London. Panellists agreed that there is a trend towards firms taking a greater number of valuations data feeds than ever before to ensure transparency and asking for a greater depth of data from their vendors (see our lead story for details).

The partnership between Avox and Standard & Poor’s Cusip Services Bureau is also indicative of the appetite for greater standardisation of entity data. The development of a new universal identification system for global business entities has most certainly been prompted by the intense focus of financial institutions on counterparty risk, following the troubles experienced by so many large firms last year. Whether the vendors are successful in getting the market to adopt the new identifiers remains to be seen (the system hasn’t even been launched yet), but there is definitely a need for someone to assume the mantle of a business entity standards champion.
