
FIMA Speakers Preach Pragmatic Approach to Data Quality

One of the most memorable pronouncements at FIMA 2007 in New York last month came from a speaker from a major investment management firm who, when asked from the floor how big a problem it is for the data management industry that there are still no global standards for security and entity identification, replied: “Unnecessary standardisation is the hobgoblin of small minds.” His message was that getting hung up on the lack of available standards to the detriment of getting on and addressing data management problems was a recipe for getting nowhere fast. This focus on knuckling down, tackling the issues faced now and being pragmatic about data management in spite of high level, industry-wide challenges that may or may not ever be solved was an ethos echoed throughout the conference sessions.

A key preoccupation of speakers and audience was how firms can maximise the effectiveness of their relationships with data vendors in seeking to achieve their data quality goals, as the practice of tying vendors into service level agreements (SLAs) becomes more prevalent. The consensus among speakers was that in most cases the vendors are more than willing to come to the table to try to resolve any issues their clients have with their data. Brian Buzzelli, product manager, Global Securities Services at Mellon Financial Corporation, explained that Mellon always attempts to source data from more than one provider. “We’ve found that this improves coverage across our securities and facilitates achieving consensus on data accuracy using data tolerance and matching comparison techniques,” he said. “There is a cost challenge inherent in accessing two or more sources, and while we don’t seek to have a relationship with every market data vendor, we do work very closely with our strategic vendors on data coverage, quality benchmarks and metrics. We and they share a common interest in improving data coverage and quality, and thereby creating greater value in the data.”
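
Buzzelli did not spell out the tolerance and matching comparison techniques in detail, but the general idea can be sketched in a few lines: take the same field from two vendor feeds, treat values that agree within an agreed tolerance as consensus, and flag the rest for investigation. The vendor feeds, field, identifiers and 0.01 tolerance below are illustrative assumptions, not a description of Mellon's actual rules.

    # Illustrative sketch only: cross-vendor comparison of a numeric reference
    # data field (e.g. a coupon rate) keyed by security identifier, with a
    # simple tolerance check. Vendor feeds, values and the tolerance are assumed.
    TOLERANCE = 0.01

    vendor_a = {"US0378331005": 3.25, "US5949181045": 2.50, "US0231351067": 1.10}
    vendor_b = {"US0378331005": 3.25, "US5949181045": 2.55, "US0231351067": 1.10}

    def compare_feeds(feed_a, feed_b, tolerance=TOLERANCE):
        """Return per-security status: consensus, mismatch, or coverage gap."""
        results = {}
        for isin in set(feed_a) | set(feed_b):
            a, b = feed_a.get(isin), feed_b.get(isin)
            if a is None or b is None:
                results[isin] = "coverage gap"       # only one vendor carries it
            elif abs(a - b) <= tolerance:
                results[isin] = "consensus"          # values agree within tolerance
            else:
                results[isin] = "mismatch"           # route to data stewards
        return results

    for isin, status in compare_feeds(vendor_a, vendor_b).items():
        print(isin, status)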

It was agreed, however, that while data vendors are willing to enter into SLAs that cover issues such as timeliness of delivery, they are not yet prepared to commit to SLAs on data quality itself. “The industry struggles with metrics keyed to data content from both vendor and consumer perspectives,” said Buzzelli. “While most vendors are willing to work with us, we and the industry cannot yet apply quantitative metrics to vendor data content SLAs. Our lack of consensus about what the metrics should be extends to how they should be captured and measured.”
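
As a rough illustration of the kind of quantitative content metric the industry has yet to agree on, a consuming firm might measure field-level completeness of each vendor delivery and compare it with a threshold written into an SLA. The record layout, required fields and 99 per cent threshold below are hypothetical, offered only to show what such a measure could look like.

    # Hypothetical example of a quantitative content metric that might back a
    # vendor data SLA: field-level completeness per delivery. The records,
    # required fields and 99% threshold are illustrative assumptions.
    delivery = [
        {"isin": "US0378331005", "issuer": "Apple Inc", "coupon": None},
        {"isin": "US5949181045", "issuer": "Microsoft Corp", "coupon": 2.50},
        {"isin": "US0231351067", "issuer": None, "coupon": 1.10},
    ]
    REQUIRED_FIELDS = ("isin", "issuer", "coupon")
    SLA_THRESHOLD = 0.99  # 99% of required fields populated

    def completeness(records, fields):
        """Share of required fields actually populated across the delivery."""
        populated = sum(1 for r in records for f in fields if r.get(f) is not None)
        return populated / (len(records) * len(fields))

    score = completeness(delivery, REQUIRED_FIELDS)
    print(f"Completeness {score:.1%}:",
          "within SLA" if score >= SLA_THRESHOLD else "breach")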

The scope for applying innovative technology solutions to the data quality and data management challenge was another theme explored by FIMA speakers. For John Fleming, formerly of Morgan Stanley, business process management (BPM) frameworks have a key role to play. There are three types, he explained: task software-based solutions, which execute the steps of a process and represent a fairly primitive approach to BPM; federated solutions, which are a combination of software and human intervention, with the software kicking out the “hard stuff” to be handled by human operators, and which “will never enable you to get to the level of data quality that you want”; and the far more advanced services-based BPM solutions. “This approach exploits SOA and involves encoding the underlying business processes as computer language-based services,” Fleming said. “In the future there will be BPM solutions that can focus on intuition and judgement, using aspects of artificial intelligence – software that can carry out deductive reasoning. BPM will become the foundation for enterprise content management.”
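
The federated model Fleming describes, software straight-through-processing the records it can validate while kicking the “hard stuff” out to human operators, can be sketched roughly as follows. The validation rule, record layout and review queue are illustrative assumptions rather than a description of any particular BPM product.

    # Rough sketch of a federated BPM pattern: automated validation handles the
    # straightforward records, exceptions are routed to a human review queue.
    # The validation rule and record layout are illustrative assumptions.
    from queue import Queue

    human_review_queue = Queue()

    def validate(record):
        """Toy rule: a record is 'easy' if both identifier and price are present."""
        return record.get("id") is not None and record.get("price") is not None

    def process(records):
        processed = []
        for record in records:
            if validate(record):
                processed.append(record)        # straight-through processing
            else:
                human_review_queue.put(record)  # the "hard stuff" goes to people
        return processed

    batch = [{"id": "ABC123", "price": 101.5}, {"id": "XYZ789", "price": None}]
    clean = process(batch)
    print(len(clean), "processed automatically,",
          human_review_queue.qsize(), "sent for human review")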

With the data management industry’s continued focus on the challenge of building a business case for data management projects, the presentation on automating data relationship discovery from Todd Goldman, vice president, marketing at data integration technology provider Exeros, promised some timely insight. “Data relationship discovery is a very manual and error-prone process, and that kills ROI,” he said. “The obvious answer is to make discovery faster, but that is easier said than done.” Reference data masters, ETL, EAI and cleansing tools are not discovery tools, he suggested: they assume you already know the rules that relate the data in your legacy systems to your golden master. Metadata matching and profiling tools, meanwhile, do not find transformations, which means there is still a need for manual data analysis. There is hope, however. “Automated data-driven discovery… involves automating the discovery of relationships between systems,” he explained. “It speeds up reference data management deployment by five times, and is repeatable, because the analysis is done by machine.”
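
Goldman did not describe Exeros’s technology in implementation terms, but the underlying idea of data-driven discovery, inferring which legacy column probably maps to which golden-master attribute by comparing the values themselves rather than relying on metadata or manual analysis, can be illustrated with a toy sketch. The column names and the value-overlap heuristic below are assumptions for illustration only.

    # Illustrative sketch of data-driven relationship discovery: propose mappings
    # between legacy columns and golden-master attributes by measuring how much
    # their value sets overlap. Column names and the 0.5 threshold are assumptions.
    legacy = {
        "SEC_ID": ["US0378331005", "US5949181045", "US0231351067"],
        "CPN_RT": ["3.25", "2.50", "1.10"],
    }
    golden_master = {
        "isin":   ["US0378331005", "US5949181045", "US0231351067", "US4581401001"],
        "coupon": ["3.25", "2.50", "1.10", "4.00"],
    }

    def overlap(a, b):
        """Jaccard-style overlap between two columns' value sets."""
        sa, sb = set(a), set(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    def discover(source, target, threshold=0.5):
        """Suggest source-to-target column mappings where value overlap is high."""
        return [(s_col, t_col, round(overlap(s_vals, t_vals), 2))
                for s_col, s_vals in source.items()
                for t_col, t_vals in target.items()
                if overlap(s_vals, t_vals) >= threshold]

    print(discover(legacy, golden_master))

A production discovery tool would apply far richer statistical and transformation analysis than this simple overlap measure, but the principle Goldman points to, letting the machine propose the relationships for humans to confirm, is the same.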
