The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

FIMA Speakers Preach Pragmatic Approach to Data Quality

One of the most memorable pronouncements at FIMA 2007 in New York last month came from a speaker from a major investment management firm who, when asked from the floor how big a problem it is for the data management industry that there are still no global standards for security and entity identification, replied: “Unnecessary standardisation is the hobgoblin of small minds.” His message was that getting hung up on the absence of standards, rather than getting on with addressing data management problems, was a recipe for getting nowhere fast. This focus on knuckling down, tackling the issues at hand and staying pragmatic about data management, despite high-level, industry-wide challenges that may never be solved, was an ethos echoed throughout the conference sessions.

A key preoccupation of speakers and audience was how firms can maximise the effectiveness of their relationships with data vendors in seeking to achieve their data quality goals, as the practice of tying vendors into service level agreements (SLAs) becomes more prevalent. The consensus among speakers was that in most cases the vendors are more than willing to come to the table to try to resolve any issues their clients have with their data. Brian Buzzelli, product manager, Global Securities Services at Mellon Financial Corporation, explained that Mellon always attempts to source data from more than one provider. “We’ve found that this improves coverage across our securities and facilitates achieving consensus on data accuracy using data tolerance and matching comparison techniques,” he said. “There is a cost challenge inherent in accessing two or more sources, and while we don’t seek to have a relationship with every market data vendor, we do work very closely with our strategic vendors on data coverage, quality benchmarks and metrics. We and they share a common interest in improving data coverage and quality, and thereby creating greater value in the data.”
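The “data tolerance and matching comparison” approach Buzzelli describes can be sketched as follows. This is a hypothetical illustration, not Mellon’s actual process: the field names, vendors and tolerance values are assumptions chosen for the example.

```python
# Hypothetical sketch of multi-vendor consensus: accept a value only
# when two vendor feeds agree within a per-field tolerance; otherwise
# flag the field for manual review. Tolerances are illustrative.

TOLERANCE = {"coupon_rate": 0.0001, "price": 0.01}

def consensus(field, vendor_a, vendor_b):
    """Return the agreed value, or None to flag for manual review."""
    a, b = vendor_a.get(field), vendor_b.get(field)
    if a is None or b is None:
        return a if b is None else b   # fall back to single-source coverage
    if isinstance(a, float):
        return a if abs(a - b) <= TOLERANCE.get(field, 0.0) else None
    return a if a == b else None       # exact match for non-numeric fields

vendor_a = {"coupon_rate": 5.25, "issuer": "ACME Corp"}
vendor_b = {"coupon_rate": 5.2501, "issuer": "Acme Corporation"}

print(consensus("coupon_rate", vendor_a, vendor_b))  # within tolerance -> 5.25
print(consensus("issuer", vendor_a, vendor_b))       # mismatch -> None (review)
```

Sourcing from two or more providers also improves coverage, as Buzzelli notes: where one vendor lacks a field, the other can fill the gap.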

It was agreed however that while data vendors are willing to enter into SLAs that cover issues such as timeliness of delivery, they are not yet prepared to commit to SLAs on data quality itself. “The industry struggles with metrics keyed to data content from both vendor and consumer perspectives,” said Buzzelli. “While most vendors are willing to work with us, we and the industry cannot yet apply quantitative metrics to vendor data content SLAs. Our lack of consensus about what the metrics should be extends to how they should be captured and measured.”
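To make the missing-metrics point concrete, here is a minimal sketch of the kind of quantitative content metric a data quality SLA might key on, such as a completeness rate over a batch of vendor records. The required fields and threshold are assumptions for illustration; as Buzzelli notes, there is no industry consensus on what such metrics should be.

```python
# Illustrative completeness metric: the fraction of records in a
# vendor batch that carry every required field, non-empty. The field
# list is an assumption, not an industry standard.

REQUIRED_FIELDS = ("isin", "issuer", "maturity_date")

def completeness_rate(records):
    """Fraction of records with all required fields populated."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in REQUIRED_FIELDS)
    )
    return complete / len(records)

batch = [
    {"isin": "US0000000001", "issuer": "ACME", "maturity_date": "2030-01-15"},
    {"isin": "US0000000002", "issuer": "", "maturity_date": "2031-06-30"},
]
print(completeness_rate(batch))  # -> 0.5
```

The hard part, per the panel, is not computing such a number but agreeing across vendors and consumers on which metrics matter and how they should be captured and measured.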

The scope for applying innovative technology solutions to the data quality and data management challenge was another theme explored by FIMA speakers. For John Fleming, formerly of Morgan Stanley, business process management (BPM) frameworks have a key role to play. There are three types, he explained: task software-based solutions which execute the steps of a process and represent a fairly primitive approach to BPM; federated solutions, which are a combination of software and human intervention, with the software kicking out the “hard stuff” to be handled by human operators, and which “will never enable you to get to the level of data quality that you want”; and the far more advanced services-based BPM solutions. “This approach exploits SOA and involves encoding the underlying business processes as computer language-based services,” Fleming said. “In the future there will be BPM solutions that can focus on intuition and judgement, using aspects of artificial intelligence – software that can carry out deductive reasoning. BPM will become the foundation for enterprise content management.”
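
The “federated” pattern Fleming describes can be sketched as a simple routing rule: software handles the confident, straight-through cases and kicks the “hard stuff” out to a human work queue. The confidence score and threshold below are stand-ins for whatever rules a real BPM engine would apply.

```python
# Minimal sketch of federated BPM routing: auto-process records the
# software is confident about, escalate the rest to a human queue.
# The 0.95 threshold is an illustrative assumption.

from collections import deque

human_queue = deque()

def process(record, confidence):
    """Auto-process confident cases; escalate the rest for manual review."""
    if confidence >= 0.95:
        return f"auto-processed {record}"
    human_queue.append(record)
    return f"escalated {record} for manual review"

print(process("trade-001", 0.99))  # handled by software
print(process("trade-002", 0.60))  # routed to a human operator
```

Fleming’s point is precisely that this hybrid will never reach the desired data quality level on its own, which is why he sees services-based, SOA-driven BPM as the more advanced path.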

With the data management industry’s continued focus on the challenge of building a business case for data management projects, the presentation on automating data relationship discovery from Todd Goldman, vice president, marketing at data integration technology provider Exeros, promised some timely insight. “Data relationship discovery is a very manual and error-prone process, and that kills ROI,” he said. “The obvious answer is to make discovery faster, but that is easier said than done.” Reference data masters, ETL, EAI and cleansing tools are not discovery tools, and assume you already know the rules that relate the data in your legacy systems to your golden master, he suggested, while metadata matching and profiling tools do not find transformations, which means there is still a need for manual data analysis. There is hope, however. “Automated data-driven discovery… involves automating the discovery of relationships between systems,” he explained. “It speeds up reference data management deployment by five times, and is repeatable, because the analysis is done by machine.”
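The data-driven discovery idea Goldman outlines can be illustrated with a deliberately naive sketch: instead of hand-writing mapping rules, compare the actual values held in two systems’ columns and infer which legacy column feeds which golden-master column. The column names, data and overlap threshold are assumptions for the example, and real tools like Exeros’ also discover transformations, which this toy version does not.

```python
# Toy sketch of data-driven relationship discovery: propose
# legacy -> master column mappings by measuring value overlap.
# Threshold and scoring are deliberately naive.

def discover_mappings(legacy, master, threshold=0.8):
    """Propose legacy->master column mappings by value overlap."""
    mappings = {}
    for lcol, lvals in legacy.items():
        for mcol, mvals in master.items():
            overlap = len(set(lvals) & set(mvals)) / max(len(set(lvals)), 1)
            if overlap >= threshold:
                mappings[lcol] = mcol
    return mappings

legacy = {"sec_id": ["US1", "US2", "US3"], "cpn": ["5.0", "4.5", "6.0"]}
master = {"isin": ["US1", "US2", "US3", "US4"], "coupon": ["5.0", "4.5", "6.0"]}
print(discover_mappings(legacy, master))  # -> {'sec_id': 'isin', 'cpn': 'coupon'}
```

Because the analysis is done by machine, as Goldman notes, the process is repeatable each time a new legacy system is brought under the golden master.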
