
A-Team Insight Blogs

FIMA Speakers Preach Pragmatic Approach to Data Quality

One of the most memorable pronouncements at FIMA 2007 in New York last month came from a speaker from a major investment management firm who, when asked from the floor how big a problem it is for the data management industry that there are still no global standards for security and entity identification, replied: “Unnecessary standardisation is the hobgoblin of small minds.” His message was that getting hung up on the lack of available standards, to the detriment of getting on with addressing data management problems, was a recipe for getting nowhere fast. This focus on knuckling down, tackling the issues at hand and being pragmatic about data management, in spite of high-level, industry-wide challenges that may or may not ever be solved, was an ethos echoed throughout the conference sessions.

A key preoccupation of speakers and audience was how firms can maximise the effectiveness of their relationships with data vendors in seeking to achieve their data quality goals, as the practice of tying vendors into service level agreements (SLAs) becomes more prevalent. The consensus among speakers was that in most cases the vendors are more than willing to come to the table to try to resolve any issues their clients have with their data. Brian Buzzelli, product manager, Global Securities Services at Mellon Financial Corporation, explained that Mellon always attempts to source data from more than one provider. “We’ve found that this improves coverage across our securities and facilitates achieving consensus on data accuracy using data tolerance and matching comparison techniques,” he said. “There is a cost challenge inherent in accessing two or more sources, and while we don’t seek to have a relationship with every market data vendor, we do work very closely with our strategic vendors on data coverage, quality benchmarks and metrics. We and they share a common interest in improving data coverage and quality, and thereby creating greater value in the data.”
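
By way of illustration, the cross-vendor comparison Buzzelli describes can be sketched as a simple tolerance check between two feeds keyed on a common identifier. The vendor data, identifiers and 0.5% threshold below are hypothetical assumptions for the sketch, not Mellon's actual process or parameters.

```python
# Minimal sketch of a tolerance-based comparison between two hypothetical
# vendor price feeds, keyed on a common security identifier.

PRICE_TOLERANCE = 0.005  # treat quotes within 0.5% of each other as agreeing

vendor_a = {"US0378331005": 172.34, "US5949181045": 311.20, "US88160R1014": 248.50}
vendor_b = {"US0378331005": 172.31, "US5949181045": 309.05, "US38259P5089": 2805.12}

def compare_feeds(feed_a, feed_b, tolerance):
    """Classify each identifier as consensus, mismatch or single-sourced."""
    results = {"consensus": [], "mismatch": [], "single_source": []}
    for isin in sorted(set(feed_a) | set(feed_b)):
        if isin in feed_a and isin in feed_b:
            a, b = feed_a[isin], feed_b[isin]
            # relative difference measured against the mean of the two quotes
            if abs(a - b) / ((a + b) / 2) <= tolerance:
                results["consensus"].append(isin)
            else:
                results["mismatch"].append(isin)   # route to an analyst for review
        else:
            results["single_source"].append(isin)  # coverage gap at one vendor
    return results

print(compare_feeds(vendor_a, vendor_b, PRICE_TOLERANCE))
```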

It was agreed, however, that while data vendors are willing to enter into SLAs that cover issues such as timeliness of delivery, they are not yet prepared to commit to SLAs on data quality itself. “The industry struggles with metrics keyed to data content from both vendor and consumer perspectives,” said Buzzelli. “While most vendors are willing to work with us, we and the industry cannot yet apply quantitative metrics to vendor data content SLAs. Our lack of consensus about what the metrics should be extends to how they should be captured and measured.”
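
For a sense of what such quantitative content metrics might look like, the sketch below computes a basic field-level completeness figure; the record layout and fields are assumptions for illustration, and, as Buzzelli notes, the industry has not agreed which measures matter or how to capture them.

```python
# Illustrative data-content metric: field-level completeness across a small
# sample of security records. Fields and values are assumptions for the sketch.

records = [
    {"isin": "US0378331005", "currency": "USD", "sector": "Tech", "country": None},
    {"isin": "US912828YK04", "currency": "USD", "sector": None,   "country": "US"},
    {"isin": "US5949181045", "currency": None,  "sector": "Tech", "country": "US"},
]

def completeness(rows, field):
    """Share of records in which the field is populated."""
    populated = sum(1 for row in rows if row.get(field) is not None)
    return populated / len(rows)

for field in ("currency", "sector", "country"):
    print(f"{field}: {completeness(records, field):.0%} populated")
```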

The scope for applying innovative technology solutions to the data quality and data management challenge was another theme explored by FIMA speakers. For John Fleming, formerly of Morgan Stanley, business process management (BPM) frameworks have a key role to play. There are three types, he explained: task software-based solutions which execute the steps of a process and represent a fairly primitive approach to BPM; federated solutions, which are a combination of software and human intervention, with the software kicking out the “hard stuff” to be handled by human operators, and which “will never enable you to get to the level of data quality that you want”; and the far more advanced services-based BPM solutions. “This approach exploits SOA and involves encoding the underlying business processes as computer language-based services,” Fleming said. “In the future there will be BPM solutions that can focus on intuition and judgement, using aspects of artificial intelligence – software that can carry out deductive reasoning. BPM will become the foundation for enterprise content management.”
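
A rough sketch of the “federated” pattern Fleming describes, in which software processes what it can and kicks the “hard stuff” out to human operators, might look like the following; the validation rules and record layout are illustrative assumptions, not any firm's production logic.

```python
# Minimal sketch of a federated BPM pattern: software validates what it can
# and hands exceptions to a human work queue. Rules and data are assumptions.

from queue import Queue

human_review = Queue()   # exceptions handed off to operators

def validate(record):
    """Return a list of problems the software cannot resolve on its own."""
    problems = []
    if record.get("isin") is None:
        problems.append("missing identifier")
    if record.get("price") is not None and record["price"] <= 0:
        problems.append("non-positive price")
    return problems

def process(records):
    clean = []
    for record in records:
        problems = validate(record)
        if problems:
            human_review.put((record, problems))  # the "hard stuff" goes to people
        else:
            clean.append(record)                  # straight-through processing
    return clean

passed = process([
    {"isin": "US0378331005", "price": 172.34},
    {"isin": None, "price": 48.10},
])
print(f"{len(passed)} straight-through, {human_review.qsize()} for human review")
```
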
With the data management industry's continued focus on the challenge of building a business case for data management projects, the presentation on automating data relationship discovery from Todd Goldman, vice president, marketing at data integration technology provider Exeros, promised some timely insight. “Data relationship discovery is a very manual and error-prone process, and that kills ROI,” he said. “The obvious answer is to make discovery faster, but that is easier said than done.” Reference data masters, ETL, EAI and cleansing tools are not discovery tools, and assume you already know the rules that relate the data in your legacy systems to your golden master, he suggested, while metadata matching and profiling tools do not find transformations, which means there is still a need for manual data analysis. There is hope, however. “Automated data-driven discovery… involves automating the discovery of relationships between systems,” he explained. “It speeds up reference data management deployment by five times, and is repeatable, because the analysis is done by machine.”
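
As a toy illustration of the data-driven approach Goldman describes, the sketch below proposes mappings between legacy columns and golden-master fields by comparing the values themselves rather than relying on hand-written rules; the column names, sample values and overlap threshold are assumptions, not a representation of Exeros's product.

```python
# Toy illustration of automated, data-driven relationship discovery: infer
# which legacy columns correspond to which golden-master fields by measuring
# how much their value sets overlap.

legacy = {
    "SEC_ID":  ["US0378331005", "US5949181045", "US88160R1014"],
    "PX_LAST": ["172.34", "311.20", "248.50"],
}
golden_master = {
    "isin":  ["US5949181045", "US0378331005", "US38259P5089"],
    "price": ["311.20", "172.34", "195.00"],
}

def discover_mappings(source, target, min_overlap=0.5):
    """Propose source->target column pairs whose value sets overlap enough."""
    mappings = []
    for src_col, src_vals in source.items():
        src_set = set(src_vals)
        for tgt_col, tgt_vals in target.items():
            overlap = len(src_set & set(tgt_vals)) / len(src_set)
            if overlap >= min_overlap:
                mappings.append((src_col, tgt_col, overlap))
    return mappings

for src, tgt, score in discover_mappings(legacy, golden_master):
    print(f"{src} -> {tgt} (value overlap {score:.0%})")
```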
