JJ Gets Down to “Nitty-Gritty” of “Grimy” Data Business

Reference data projects are “huge, scary and often not perceived as adding measurable value”. To avoid being derailed by the complexity and long payback time of such projects, firms should break the problem down into manageable chunks and accept that great value can be added to the business by addressing just part of the problem, either as a discrete project or within the context of an overall programme. This is the view of Neil Edelstein, a 30-year veteran of the data business who recently left Accenture to join New York-based professional services firm Jordan & Jordan as director and reference data specialist.

Prior to his time as senior manager of data management services at Accenture, Edelstein counts stints at S&P, Muller Data, FT Interactive Data, DTCC and SunGard among his previous positions, and has, he says, “seen the data business evolve from a focus mainly on market data to full-blown reference data services”.

During that time, the drivers for data projects have changed, he believes. “Historically, cost reduction, staff reallocation and outsourcing opportunities have been the key drivers,” he says. “Clearly, we have learned that there is no one key driver for all firms and that data management projects necessitate a high level of customization.” The real driver these days for data projects is risk, he contends: “Increasingly, risk systems play an important part in determining the breadth and depth of a firm’s data management strategy. Are present coverage and accuracy levels compliant with industry and firms’ specific analytics?”

Based on his work with clients, Edelstein believes that enterprise-wide data management projects can be decomposed, as they comprise “a spectrum of highly-definable sub-projects”. The optimum approach, he says – which he looks forward to applying with Jordan & Jordan clients – is to get down to the “nitty-gritty of data content”, to look at the “existing data infrastructure, data quality and coverage down to the data attribute level in order to tailor project scope accordingly”. “I believe esoteric data management projects don’t work,” he says. “Frankly, data is a grimy business, and one must dig down deeply to understand and clearly establish a firm-wide definition of reference data. I have seen confusion regarding the categorisation of third party reference data, internal data, analytics data and accounting data, causing issues related to project scope and timelines.”
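
To make the “attribute level” idea concrete, the sketch below shows what a simple coverage check across a handful of reference data attributes might look like. It is an illustration only: the attribute names, record structure and the attribute_coverage helper are hypothetical assumptions, not drawn from Edelstein’s or Jordan & Jordan’s methodology.

```python
# Hypothetical sketch: measuring how completely each reference data attribute
# is populated across a set of instrument records. Attribute names are
# illustrative assumptions, not a prescribed data model.
from typing import Iterable

REQUIRED_ATTRIBUTES = ["isin", "issuer", "maturity_date", "coupon", "currency"]

def attribute_coverage(records: Iterable[dict]) -> dict:
    """Return the fraction of records populating each required attribute."""
    records = list(records)
    total = len(records)
    coverage = {}
    for attr in REQUIRED_ATTRIBUTES:
        populated = sum(1 for r in records if r.get(attr) not in (None, ""))
        coverage[attr] = populated / total if total else 0.0
    return coverage

if __name__ == "__main__":
    sample = [
        {"isin": "XS0000000001", "issuer": "Acme Corp", "currency": "USD"},
        {"isin": "XS0000000002", "issuer": "Beta plc", "maturity_date": "2030-06-15",
         "coupon": 4.25, "currency": "EUR"},
    ]
    for attr, score in attribute_coverage(sample).items():
        print(f"{attr}: {score:.0%} populated")
```

Reporting coverage attribute by attribute, rather than as a single score, is one way a firm could scope a discrete sub-project around only the fields that matter to a given downstream consumer.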

While many firms have considered or initiated data management projects, it often proves difficult to structure a solid business case, he says. By viewing data management as a series of independent projects, each providing incremental value on completion, firms can mitigate project risk, Edelstein contends. “In short, successful data management strategies carve out the logical pieces of the lifecycle of a data management project, in order to incrementally add value by breaking the overall strategy into discrete stages that can stand on their own.”

Upfront work may be required, he says – “introspection and self-assessment of internal preparedness and end user expectations, identification of data ownership, and understanding of the use of data in downstream applications.” As part of the internal preparedness check, Jordan & Jordan focuses on the firm-wide pain points being experienced – such as data consistency, coverage, timeliness or downstream systems integration. “This detailed investigation clearly helps define an enterprise data strategy as well as overall project definition, but at the same time provides value to a business unit whether or not a more comprehensive firm-wide data management project is undertaken,” Edelstein says.
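
By way of illustration of the “data consistency” pain point, the following sketch compares the same instruments across two vendor feeds and flags mismatched fields. The feed structures, field names and the consistency_breaks helper are assumptions introduced for this example, not part of the Jordan & Jordan approach described above.

```python
# Hypothetical sketch: flagging consistency breaks between two vendor feeds
# keyed on a shared instrument identifier. Feed layouts and field names are
# illustrative assumptions only.
def consistency_breaks(feed_a: dict, feed_b: dict, fields: list) -> list:
    """List (instrument, field, value_a, value_b) tuples where the feeds disagree."""
    breaks = []
    for instrument_id in sorted(feed_a.keys() & feed_b.keys()):
        for field in fields:
            a_val = feed_a[instrument_id].get(field)
            b_val = feed_b[instrument_id].get(field)
            if a_val != b_val:
                breaks.append((instrument_id, field, a_val, b_val))
    return breaks

feed_a = {"XS0000000001": {"coupon": 4.25, "currency": "USD"}}
feed_b = {"XS0000000001": {"coupon": 4.50, "currency": "USD"}}
print(consistency_breaks(feed_a, feed_b, ["coupon", "currency"]))
# -> [('XS0000000001', 'coupon', 4.25, 4.5)]
```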

Not until these upfront aspects of the project have been fully understood, he says, can evaluation of data vendors, approaches to data management, platforms and technology strategies be considered.

While data content is key, Edelstein says, he also encourages a “very practical focus” on technology. “Decisions regarding internal or third party warehousing tools must be analysed with respect to complexity of data content,” he says. “In addition, workflow tools must be scaled accordingly, providing logical operational efficiencies without adding to overhead unnecessarily.”
