
FIMA Speakers Preach Pragmatic Approach to Data Quality


One of the most memorable pronouncements at FIMA 2007 in New York last month came from a speaker from a major investment management firm who, when asked from the floor how big a problem it is for the data management industry that there are still no global standards for security and entity identification, replied: “Unnecessary standardisation is the hobgoblin of small minds.” His message was that getting hung up on the lack of available standards, rather than getting on with addressing data management problems, is a recipe for getting nowhere fast. This focus on knuckling down, tackling the issues faced now and being pragmatic about data management in spite of high level, industry-wide challenges that may or may not ever be solved was an ethos echoed throughout the conference sessions.

A key preoccupation of speakers and audience was how firms can maximise the effectiveness of their relationships with data vendors in seeking to achieve their data quality goals, as the practice of tying vendors into service level agreements (SLAs) becomes more prevalent. The consensus among speakers was that in most cases the vendors are more than willing to come to the table to try to resolve any issues their clients have with their data. Brian Buzzelli, product manager, Global Securities Services at Mellon Financial Corporation, explained that Mellon always attempts to source data from more than one provider. “We’ve found that this improves coverage across our securities and facilitates achieving consensus on data accuracy using data tolerance and matching comparison techniques,” he said. “There is a cost challenge inherent in accessing two or more sources, and while we don’t seek to have a relationship with every market data vendor, we do work very closely with our strategic vendors on data coverage, quality benchmarks and metrics. We and they share a common interest in improving data coverage and quality, and thereby creating greater value in the data.”
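Buzzelli did not spell out the comparison logic, but the idea of data tolerance and matching comparison across two vendor feeds can be illustrated with a minimal sketch. The field names, tolerance level and sample figures below are purely hypothetical assumptions for illustration, not a description of Mellon's actual process: values from the two vendors are treated as consensus when they agree within a relative tolerance, and flagged as exceptions otherwise.

```python
from typing import Dict, List

# Hypothetical relative tolerance for numeric fields such as prices (0.1%).
RELATIVE_TOLERANCE = 0.001

def compare_vendor_records(vendor_a: Dict[str, float],
                           vendor_b: Dict[str, float]) -> List[str]:
    """Compare two vendors' values for the same security, field by field.

    Returns the fields where the vendors disagree beyond tolerance, i.e.
    where no consensus can be reached and an exception should be raised.
    """
    exceptions = []
    for field in vendor_a.keys() & vendor_b.keys():
        a, b = vendor_a[field], vendor_b[field]
        # Consensus check: the two values must agree within the relative tolerance.
        if abs(a - b) > RELATIVE_TOLERANCE * max(abs(a), abs(b), 1e-12):
            exceptions.append(field)
    return exceptions

# Example: two feeds for the same security, with hypothetical field names.
feed_a = {"close_price": 101.25, "coupon": 4.50}
feed_b = {"close_price": 102.50, "coupon": 4.50}
print(compare_vendor_records(feed_a, feed_b))  # ['close_price'] - beyond tolerance
```

In practice the exception list would feed the kind of quality benchmarks and metrics Buzzelli describes agreeing with strategic vendors, rather than being resolved ad hoc.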

It was agreed however that while data vendors are willing to enter into SLAs that cover issues such as timeliness of delivery, they are not yet prepared to commit to SLAs on data quality itself. “The industry struggles with metrics keyed to data content from both vendor and consumer perspectives,” said Buzzelli. “While most vendors are willing to work with us, we and the industry cannot yet apply quantitative metrics to vendor data content SLAs. Our lack of consensus about what the metrics should be extends to how they should be captured and measured.”

The scope for applying innovative technology solutions to the data quality and data management challenge was another theme explored by FIMA speakers. For John Fleming, formerly of Morgan Stanley, business process management (BPM) frameworks have a key role to play. There are three types, he explained: task software-based solutions which execute the steps of a process and represent a fairly primitive approach to BPM; federated solutions, which are a combination of software and human intervention, with the software kicking out the “hard stuff” to be handled by human operators, and which “will never enable you to get to the level of data quality that you want”; and the far more advanced services-based BPM solutions. “This approach exploits SOA and involves encoding the underlying business processes as computer language-based services,” Fleming said. “In the future there will be BPM solutions that can focus on intuition and judgement, using aspects of artificial intelligence – software that can carry out deductive reasoning. BPM will become the foundation for enterprise content management.”
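Fleming's federated model, in which software clears the routine cases and routes the “hard stuff” to human operators, can be sketched roughly as below. The validation rule, queue and publishing function are illustrative assumptions, not a description of any particular BPM product.

```python
from queue import Queue

# Hypothetical automated check; a real deployment would encode the firm's own
# business rules or, in the services-based model, invoke them as SOA services.
def passes_automated_checks(record: dict) -> bool:
    price = record.get("price")
    return price is not None and price > 0

def publish_to_golden_copy(record: dict) -> None:
    # Stub for straight-through publication to the golden copy.
    print(f"published: {record}")

manual_review_queue: Queue = Queue()

def process(record: dict) -> None:
    if passes_automated_checks(record):
        publish_to_golden_copy(record)      # routine case handled by software
    else:
        manual_review_queue.put(record)     # "hard stuff" routed to a human operator

process({"isin": "US0378331005", "price": 101.25})  # published automatically
process({"isin": "XS0000000000", "price": None})    # queued for manual review
```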
With the data management industry’s continued focus on the challenge of building a business case for data management projects, the presentation on automating data relationship discovery from Todd Goldman, vice president, marketing at data integration technology provider Exeros, promised some timely insight. “Data relationship discovery is a very manual and error-prone process, and that kills ROI,” he said. “The obvious answer is to make discovery faster, but that is easier said than done.” Reference data masters, ETL, EAI and cleansing tools are not discovery tools, and assume you already know the rules that relate the data in your legacy systems to your golden master, he suggested, while metadata matching and profiling tools do not find transformations, which means there is still a need for manual data analysis. There is hope, however. “Automated data-driven discovery… involves automating the discovery of relationships between systems,” he explained. “It speeds up reference data management deployment by five times, and is repeatable, because the analysis is done by machine.”
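Goldman did not describe Exeros’s algorithm in the session, but the general idea of data-driven (rather than metadata-driven) discovery can be sketched as follows: candidate mappings between a legacy system and the golden master are proposed by comparing the actual values in each column, with an analyst confirming the results. The column names, threshold and sample data below are hypothetical.

```python
from typing import Dict, List, Tuple

def discover_column_matches(legacy: Dict[str, List[str]],
                            master: Dict[str, List[str]],
                            threshold: float = 0.8) -> List[Tuple[str, str, float]]:
    """Propose legacy-to-master column mappings by comparing actual values.

    A pair is proposed when the share of legacy values also found in the
    master column meets the threshold; a human analyst confirms the mapping.
    """
    matches = []
    for l_col, l_vals in legacy.items():
        l_set = set(l_vals)
        if not l_set:
            continue
        for m_col, m_vals in master.items():
            overlap = len(l_set & set(m_vals)) / len(l_set)
            if overlap >= threshold:
                matches.append((l_col, m_col, overlap))
    return matches

# Hypothetical sample: the legacy field "sec_id" lines up with the master's "isin".
legacy_system = {"sec_id": ["US0378331005", "GB0002634946"], "ccy": ["USD", "GBP"]}
golden_master = {"isin": ["US0378331005", "GB0002634946", "DE0007164600"],
                 "currency": ["USD", "GBP", "EUR"]}
print(discover_column_matches(legacy_system, golden_master))
```

Value-level comparison of this kind is what distinguishes the approach from metadata matching and profiling tools, which, as Goldman noted, stop short of finding the transformations between systems.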
