The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

JJ Gets Down to “Nitty-Gritty” of “Grimy” Data Business


Reference data projects are “huge, scary and often not perceived as adding measurable value”. To avoid being derailed by the complexity and long payback time of such projects, firms should break the problem down into manageable chunks and accept that great value can be added to the business by addressing just part of the problem, either as a discrete project or within the context of an overall programme. This is the view of Neil Edelstein, a 30-year veteran of the data business who recently left Accenture to join New York-based professional services firm Jordan & Jordan as director and reference data specialist.

Prior to his time as senior manager of data management services at Accenture, Edelstein held positions at S&P, Muller Data, FT Interactive Data, DTCC and SunGard, and has, he says, “seen the data business evolve, from a focus mainly on market data to full-blown reference data services”.

During that time, the drivers for data projects have changed, he believes. “Historically, cost reduction, staff reallocation and outsourcing opportunities have been the key drivers,” he says. “Clearly, we have learned that there is no one key driver for all firms and that data management projects necessitate a high level of customization.” The real driver these days for data projects is risk, he contends: “Increasingly, risk systems play an important part in determining the breadth and depth of a firm’s data management strategy. Are present coverage and accuracy levels compliant with industry and firms’ specific analytics?”

Based on his work with clients, Edelstein believes that enterprise-wide data management projects can be decomposed, as they comprise “a spectrum of highly-definable sub-projects”. The optimum approach, he says – which he looks forward to applying with Jordan & Jordan clients – is to get down to the “nitty-gritty of data content”, to look at the “existing data infrastructure, data quality and coverage down to the data attribute level in order to tailor project scope accordingly”. “I believe esoteric data management projects don’t work,” he says. “Frankly, data is a grimy business, and one must dig down deeply to understand and clearly establish a firm-wide definition of reference data. I have seen confusion regarding the categorisation of third party reference data, internal data, analytics data and accounting data, causing issues related to project scope and timelines.”

While many firms have considered or initiated data management projects, it often proves difficult to structure a solid business case, he says. By viewing data management as a series of independent projects, each delivering incremental value on completion, project risk is mitigated, Edelstein contends. “In short, successful data management strategies carve out the logical pieces of the lifecycle of a data management project, in order to incrementally add value by breaking the overall strategy into discrete stages that can stand on their own.”

Upfront work may be required, he says – “introspection and self-assessment of internal preparedness and end user expectations, identification of data ownership, and understanding of the use of data in downstream applications.” As part of the internal preparedness check, Jordan & Jordan focuses on firm-wide pain points being experienced – such as data consistency, coverage, timeliness or achieving downstream systems integration. “This detailed investigation clearly helps define an enterprise data strategy as well as overall project definition, but at the same time provides value to a business unit whether or not a more comprehensive firm-wide data management project is undertaken,” Edelstein says.

Not until these upfront aspects of the project have been fully understood, he says, can evaluation of data vendors, approaches to data management, platforms and technology strategies be considered.

While data content is key, Edelstein says, he also encourages a “very practical focus” on technology. “Decisions regarding internal or third party warehousing tools must be analysed with respect to complexity of data content,” he says. “In addition, workflow tools must be scaled accordingly, providing logical operational efficiencies without adding to overhead unnecessarily.”
