The knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Analysis: EDM Progress Could be Faster Than You Think


The great and the good of the global enterprise data management industry turned out in force for FIMA USA 2007 in New York earlier this month, to participate in a conference agenda that, in one sense, delivered very little that was actually new.

The line-up of presentations and panel discussions explored a number of now-familiar themes. Anyone who doesn’t work in data within financial institutions has no idea how difficult it is, and simply expects the data they need to be there when they need it, like electricity. It’s very difficult to build a business case for an enterprise data management project because the ROI is hard to prove, so the way to progress is to start with a limited implementation – “no data management project is too small, but many are too large”, as one speaker put it – get some quick wins under the belt, deliver visible benefits to the business, and use those to justify the next stage of activity. An effective governance model is crucial, or you’ll never get anywhere. And work on metrics, both internally and with your data vendors, or you’ll never be able to prove you’ve achieved anything, either within the data group or to external business sponsors. So far, so same-as-last-year.

The somewhat depressing tenor of much of the FIMA discussion was reinforced by the fact that, despite ever-more pressing drivers for improved data management – a growing regulatory burden and the need to gain competitive advantage over rival firms – the data management industry itself doesn’t really believe it has done very well against the goals above: demonstrating ROI, establishing metrics and putting governance in place. Indeed, in a lively session led by Citigroup chief data officer John Bottega, during which he invited the audience to grade itself in a number of areas, the attending delegates gave themselves even lower scores in a range of key categories than Bottega himself had anticipated.

However, on closer inspection, a number of individual presentations did reveal real progress being made, despite the limitations the data management function undoubtedly labours under, offering rather more positive news than the overall mood of the event might have suggested. One highlight on the technology side was insight from a major investment bank into functionality it currently has in beta for “attribute arbitration” – a systematised way to automatically populate vital fields in its security master, choosing values according to pre-determined weightings of its data vendors based on the typical quality of their data in a given area. This has the potential to eliminate manual effort even for types of data where accuracy matters more than timeliness. Another was a presentation on the application of sophisticated artificial intelligence-based business process management (BPM) techniques to data management – again offering hope that the still-heavy reliance on manual processes can be further alleviated, increasing the efficiency of the function going forward.
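
To make the concept concrete, here is a minimal sketch of how such vendor-weighted arbitration might work, written in Python with hypothetical vendor names, field names and weightings; it illustrates the general idea only, not the bank’s actual implementation.

```python
# Minimal sketch of vendor-weighted attribute arbitration for a security master.
# Vendor names, field names and weightings are hypothetical illustrations only.
from collections import defaultdict

# Pre-determined weightings reflecting the typical quality of each vendor's
# data in a given area (all values assumed for the example).
VENDOR_WEIGHTS = {
    "issuer_name":   {"VendorA": 0.9, "VendorB": 0.6, "VendorC": 0.4},
    "coupon_rate":   {"VendorA": 0.5, "VendorB": 0.8, "VendorC": 0.7},
    "maturity_date": {"VendorA": 0.6, "VendorB": 0.7, "VendorC": 0.9},
}

def arbitrate(field, candidates):
    """Pick a value for a security-master field from competing vendor feeds.

    candidates maps vendor -> value supplied by that vendor. The value backed
    by the greatest total vendor weight wins, so agreement between two
    lower-weighted vendors can outrank a single highly weighted one.
    """
    weights = VENDOR_WEIGHTS.get(field, {})
    support = defaultdict(float)
    for vendor, value in candidates.items():
        if value is not None:
            support[value] += weights.get(vendor, 0.0)
    if not support:
        return None  # no usable candidates: fall back to manual review
    return max(support, key=support.get)

# Vendors disagree on a coupon rate; VendorB and VendorC together outweigh VendorA.
print(arbitrate("coupon_rate", {"VendorA": 5.25, "VendorB": 5.0, "VendorC": 5.0}))  # -> 5.0
```

In practice, tie-breaking rules, data freshness checks and an exceptions queue for manual review would presumably sit around a core like this.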

On the business side, a case study from one investment manager demonstrated how data management goals can be achieved if projects are taken in manageable chunks, the right governance is in place, communication with the business lines is strong and the right vendor solution is chosen. The speaker also offered hope for firms struggling with an under-resourced data function, describing a methodology for involving key personnel within each business line in the data management strategy – enabling effective two-way communication of mutual goals and creating a vastly extended “virtual” data management team.

There was even some good news on the progress of the somewhat tricky relationship between vendors and users of market and reference data, with a number of presentations and panels exploring ways to collaborate to ensure the goals of each side are met. As one speaker said, all data vendors want their clients to be convinced they have the most accurate, most timely data, and even if the vendors remain reluctant to enter into service level agreements on data content specifically, they will almost always come to the table to discuss any issues that exist, particularly if a culture of collaboration has been established.

While there is no doubt that FIMA delegates were frustrated with the pace at which they’re able to effect change within their organisations, it is also true that there are at least pockets of successful activity. The data management community that so harshly graded its own achievements almost certainly has more to shout about than it might think.
Look out for more coverage of FIMA 2007 in the next issue of Reference Data Review.

