
A-Team Insight Blogs

FIMA Speakers Preach Pragmatic Approach to Data Quality


One of the most memorable pronouncements at FIMA 2007 in New York last month came from a speaker from a major investment management firm who, when asked from the floor how big a problem the continued absence of global standards for security and entity identification is for the data management industry, replied: “Unnecessary standardisation is the hobgoblin of small minds.” His message was that getting hung up on the lack of available standards, to the detriment of getting on and addressing data management problems, is a recipe for getting nowhere fast. This focus on knuckling down, tackling the issues at hand and being pragmatic about data management, in spite of high-level, industry-wide challenges that may or may not ever be solved, was an ethos echoed throughout the conference sessions.

A key preoccupation of speakers and audience was how firms can maximise the effectiveness of their relationships with data vendors in seeking to achieve their data quality goals, as the practice of tying vendors into service level agreements (SLAs) becomes more prevalent. The consensus among speakers was that in most cases the vendors are more than willing to come to the table to try to resolve any issues their clients have with their data. Brian Buzzelli, product manager, Global Securities Services at Mellon Financial Corporation, explained that Mellon always attempts to source data from more than one provider. “We’ve found that this improves coverage across our securities and facilitates achieving consensus on data accuracy using data tolerance and matching comparison techniques,” he said. “There is a cost challenge inherent in accessing two or more sources, and while we don’t seek to have a relationship with every market data vendor, we do work very closely with our strategic vendors on data coverage, quality benchmarks and metrics. We and they share a common interest in improving data coverage and quality, and thereby creating greater value in the data.”
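
To make the multi-sourcing approach concrete, the sketch below shows one way tolerance-based matching across two vendor feeds might work: values that agree within a tolerance band form the consensus, and anything outside it becomes a break to raise with the vendors. It is illustrative only; the article does not describe Mellon's implementation, and the ISINs, field names, sample values and tolerance level are invented.

```python
# Illustrative sketch only: the article does not describe Mellon's implementation.
# The ISINs, field names, sample values and tolerance level are hypothetical.

PRICE_TOLERANCE = 0.005  # 0.5% relative tolerance for numeric fields (assumed)

vendor_a = {
    "US0378331005": {"price": 171.25, "coupon": None, "currency": "USD"},
    "US912828U816": {"price": 99.42, "coupon": 2.0, "currency": "USD"},
}
vendor_b = {
    "US0378331005": {"price": 171.30, "coupon": None, "currency": "USD"},
    "US912828U816": {"price": 101.12, "coupon": 2.0, "currency": "USD"},
}

def within_tolerance(a, b, tol=PRICE_TOLERANCE):
    """Two numeric values 'agree' if their relative difference sits inside the tolerance band."""
    if a is None or b is None:
        return a == b
    return abs(a - b) <= tol * max(abs(a), abs(b))

def compare_feeds(feed_a, feed_b):
    """Compare two vendor feeds field by field and collect breaks for investigation."""
    breaks = []
    for isin in feed_a.keys() & feed_b.keys():
        for field in feed_a[isin].keys() & feed_b[isin].keys():
            a, b = feed_a[isin][field], feed_b[isin][field]
            numeric = isinstance(a, (int, float)) and isinstance(b, (int, float))
            if not (within_tolerance(a, b) if numeric else a == b):
                breaks.append((isin, field, a, b))
    return breaks

for isin, field, a, b in compare_feeds(vendor_a, vendor_b):
    print(f"Break on {isin}.{field}: vendor A={a}, vendor B={b}")
```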

It was agreed, however, that while data vendors are willing to enter into SLAs that cover issues such as timeliness of delivery, they are not yet prepared to commit to SLAs on data quality itself. “The industry struggles with metrics keyed to data content from both vendor and consumer perspectives,” said Buzzelli. “While most vendors are willing to work with us, we and the industry cannot yet apply quantitative metrics to vendor data content SLAs. Our lack of consensus about what the metrics should be extends to how they should be captured and measured.”
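
For a sense of what such quantitative content metrics could look like if the industry did settle on them, the hypothetical sketch below measures a vendor feed against an internal benchmark for coverage, completeness and accuracy. The speakers proposed no specific metric set; the field names, “golden” records and rates here are invented.

```python
# Hypothetical illustration of the kind of quantitative content metrics the panel said
# the industry has yet to standardise; the fields and benchmark records are invented.

golden = {
    "XS0123456789": {"maturity": "2027-06-15", "coupon": 4.25},
    "XS0987654321": {"maturity": "2030-01-31", "coupon": 1.75},
}
vendor_feed = {
    "XS0123456789": {"maturity": "2027-06-15", "coupon": 4.25},
    "XS0987654321": {"maturity": None, "coupon": 1.80},
}

def content_metrics(feed, benchmark):
    """Coverage, completeness and accuracy rates measured against a benchmark data set."""
    covered = [isin for isin in benchmark if isin in feed]
    total_fields = populated = accurate = 0
    for isin in covered:
        for field, expected in benchmark[isin].items():
            total_fields += 1
            value = feed[isin].get(field)
            if value is not None:
                populated += 1
                if value == expected:
                    accurate += 1
    return {
        "coverage": len(covered) / len(benchmark),
        "completeness": populated / total_fields,
        "accuracy": accurate / populated if populated else 0.0,
    }

print(content_metrics(vendor_feed, golden))
```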

The scope for applying innovative technology solutions to the data quality and data management challenge was another theme explored by FIMA speakers. For John Fleming, formerly of Morgan Stanley, business process management (BPM) frameworks have a key role to play. There are three types, he explained: task software-based solutions, which execute the steps of a process and represent a fairly primitive approach to BPM; federated solutions, which combine software with human intervention, with the software kicking out the “hard stuff” to be handled by human operators, and which “will never enable you to get to the level of data quality that you want”; and the far more advanced services-based BPM solutions. “This approach exploits SOA and involves encoding the underlying business processes as computer language-based services,” Fleming said. “In the future there will be BPM solutions that can focus on intuition and judgement, using aspects of artificial intelligence – software that can carry out deductive reasoning. BPM will become the foundation for enterprise content management.”
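
As a rough illustration of the federated pattern Fleming describes, the toy sketch below lets the software apply the simple rules it can and pushes the “hard stuff” onto a human work queue. The record shape, repair rules and queue are hypothetical and are not drawn from the talk.

```python
# A toy sketch of the "federated" BPM pattern described above: software applies the
# rules it can, and exceptions it cannot resolve are queued for human operators.
# The record shape, repair rules and queue are hypothetical, not drawn from the talk.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SecurityRecord:
    isin: str
    price: Optional[float]
    currency: Optional[str]

human_review_queue: List[SecurityRecord] = []

def auto_repair(record: SecurityRecord) -> bool:
    """Apply automated rules; return False when the case needs human judgement."""
    if record.currency is None:
        record.currency = "USD"   # a trivial default the software can apply itself
    if record.price is None:
        return False              # a missing price is "hard stuff" for an analyst
    return True

def process(records: List[SecurityRecord]) -> None:
    for record in records:
        if not auto_repair(record):
            human_review_queue.append(record)

process([
    SecurityRecord("DE0001102309", 98.7, None),
    SecurityRecord("FR0000120271", None, "EUR"),
])
print(f"{len(human_review_queue)} record(s) routed to human review")
```
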
With the data management industry’s continued focus on the challenge of building a business case for data management projects, the presentation on automating data relationship discovery from Todd Goldman, vice president, marketing at data integration technology provider Exeros, promised some timely insight. “Data relationship discovery is a very manual and error-prone process, and that kills ROI,” he said. “The obvious answer is to make discovery faster, but that is easier said than done.” Reference data masters, ETL, EAI and cleansing tools are not discovery tools, and assume you already know the rules that relate the data in your legacy systems to your golden master, he suggested, while metadata matching and profiling tools do not find transformations, which means there is still a need for manual data analysis. There is hope, however. “Automated data-driven discovery… involves automating the discovery of relationships between systems,” he explained. “It speeds up reference data management deployment by five times, and is repeatable, because the analysis is done by machine.”
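
The following sketch illustrates the general idea of data-driven discovery rather than any vendor’s product: candidate legacy-to-master column mappings are scored by how well the actual values overlap, so an analyst reviews machine-generated proposals instead of building the map by hand. The sample data, scoring rule and threshold are invented for the example.

```python
# A minimal, data-driven discovery sketch: score candidate legacy-to-master column
# mappings by how well their actual values overlap. The sample data, scoring rule
# and threshold are invented for illustration and do not describe Exeros' product.

legacy = {
    "SEC_ID":   ["US0378331005", "US912828U816", "XS0123456789"],
    "PX_LAST":  [171.25, 99.42, 100.10],
    "CCY_CODE": ["USD", "USD", "EUR"],
}
master = {
    "isin":     ["US0378331005", "XS0123456789", "US912828U816"],
    "currency": ["USD", "EUR", "USD"],
    "price":    [171.25, 100.10, 99.42],
}

def overlap_score(legacy_values, master_values):
    """Fraction of master values that also appear in the legacy column (compared as strings)."""
    a = {str(v) for v in legacy_values}
    b = {str(v) for v in master_values}
    return len(a & b) / len(b) if b else 0.0

def discover(legacy_cols, master_cols, threshold=0.8):
    """Propose a legacy column for each master field where the value overlap clears the threshold."""
    proposals = []
    for master_field, master_values in master_cols.items():
        best = max(legacy_cols, key=lambda name: overlap_score(legacy_cols[name], master_values))
        score = overlap_score(legacy_cols[best], master_values)
        if score >= threshold:
            proposals.append((best, master_field, round(score, 2)))
    return proposals

for legacy_col, master_field, score in discover(legacy, master):
    print(f"{legacy_col} -> {master_field} (value overlap {score})")
```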

