
Investment Data Utility Plans Pilot of its Data Quality Benchmark Service


The Investment Data Utility is planning a pilot of its peer review service, which allows investment managers to benchmark the quality of their reference data, improve data input to business applications and reduce operational errors. While the utility initially supports investment reference data, particularly around securities and issuers, the company’s longer-term plan is to repurpose the software to support bank, hedge fund and insurance reference data.

The utility has been developed by Robin Strong, a buy-side front-office technology specialist who has worked with vendors including Linedata, Fidessa and Asset Control. He explains: “The idea behind the utility goes back many years and is about challenging the unresolved problem of sourcing clean data for line of business applications, particularly hungry ones like compliance.”

The utility is designed to work with underlying reference data that is touched infrequently, such as shares outstanding or credit ratings, but is essential to applications such as risk and compliance, rather than data that is touched frequently, such as pricing, and is quickly corrected if incorrect. It works across asset classes and is cloud based, constructing a virtual peer universe of client data and analysing each client’s data against it using algorithms developed by Strong.
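The article does not disclose how Strong’s algorithms work, but the peer-universe idea can be illustrated with a minimal sketch: pool each field’s values across anonymised client submissions, derive a consensus, and flag a client’s value when it disagrees with a strong majority. All function names, data shapes and thresholds below are illustrative assumptions, not the utility’s actual design.

```python
from collections import Counter

def peer_consensus(peer_values):
    """Return the most common value across the peer universe and its support.

    Illustrative only: the utility's real algorithms are not public.
    """
    value, votes = Counter(peer_values).most_common(1)[0]
    return value, votes / len(peer_values)

def flag_outliers(client_data, peer_universe, min_support=0.75):
    """Flag client fields that disagree with a strong peer consensus.

    client_data:   {(security_id, field): value} for one client (assumed shape)
    peer_universe: {(security_id, field): [values pooled from all clients]}
    """
    exceptions = []
    for key, client_value in client_data.items():
        peers = peer_universe.get(key, [])
        if len(peers) < 3:  # too few peers to form a meaningful benchmark
            continue
        consensus, support = peer_consensus(peers)
        if support >= min_support and client_value != consensus:
            exceptions.append((key, client_value, consensus, support))
    return exceptions
```

A simple majority vote suits slow-moving fields like shares outstanding or credit ratings, where peers should largely agree; continuously changing numeric fields such as prices would need tolerance bands instead, which is consistent with the utility’s focus on infrequently touched data.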

Daily benchmark reports, including numeric scores, provide each client with an analysis of its data quality against that of its peers, while the social network aspect of the utility allows firms to collaborate on data updates and corrections. Data submissions are anonymous and data comparison is achieved without breaching vendor data distribution restrictions.
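The scoring method behind the daily reports is likewise not described. As a hedged sketch, one plausible numeric score is the percentage of benchmarkable fields on which a client matches the peer consensus; again, every name and threshold here is an assumption.

```python
from collections import Counter

def benchmark_score(client_data, peer_universe, min_support=0.75):
    """Illustrative daily score: % of benchmarkable fields matching peer consensus.

    Fields are skipped when there are too few peers or no clear consensus,
    so a client is only scored where a meaningful benchmark exists.
    """
    checked = agreed = 0
    for key, client_value in client_data.items():
        peers = peer_universe.get(key, [])
        if len(peers) < 3:
            continue  # too few peers to benchmark this field
        consensus, votes = Counter(peers).most_common(1)[0]
        if votes / len(peers) < min_support:
            continue  # peers disagree; no consensus to score against
        checked += 1
        agreed += int(client_value == consensus)
    return round(100.0 * agreed / checked, 1) if checked else None
```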

The utility has been developed to work with existing enterprise data management or data warehouse platforms, using a control layer on top of these solutions to create files that are quality benchmarked against those of peer users. Strong says the value proposition is in allowing investment managers to spot data problems quickly and prevent operational errors that can lead to incorrect trades and, potentially, large fines. The community model should also reduce the manual intervention required to obtain correct data, thereby reducing costs.

Strong says: “The utility model can provide great value to users, but it needs to be very lightweight from an IT perspective if a group of firms is going to implement it. That is why the utility is in the cloud and has been developed as a highly automated service that takes feeds from users, analyses the data, manages exceptions and provides reports through a secure portal.”
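Strong’s description maps to a simple daily cycle: ingest client feeds, pool them into the anonymised peer universe, benchmark each client and publish reports. The sketch below ties together the illustrative helpers above; it is an assumption about the shape of such a pipeline, not a description of the actual implementation.

```python
def run_daily_cycle(client_feeds):
    """Hypothetical daily cycle: pool feeds, benchmark each client, emit reports.

    client_feeds: {client_id: {(security_id, field): value}} -- an assumed
    feed format; reuses flag_outliers() and benchmark_score() sketched above.
    """
    # Pool all submissions into the anonymised virtual peer universe.
    peer_universe = {}
    for data in client_feeds.values():
        for key, value in data.items():
            peer_universe.setdefault(key, []).append(value)

    # Benchmark each client against the universe and collect exceptions.
    reports = {}
    for client_id, data in client_feeds.items():
        reports[client_id] = {
            "score": benchmark_score(data, peer_universe),
            "exceptions": flag_outliers(data, peer_universe),
        }
    return reports  # in practice, delivered through a secure portal
```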

Strong started to build the reference data utility model late last summer, before incorporating the company in September and completing version one of the software with feedback from market participants. He is planning to start a pilot project processing live data with asset managers towards the end of March and is looking for a few managers to join those already committed to the pilot. When the pilot is complete, Strong hopes to secure funding to expand the company and build out the utility to support reference data management at banks, hedge funds and insurance firms.

