About A-Team Marketing Services
The knowledge platform for the financial technology industry


Formal Working Groups Needed to Tackle Downstream Data Issues, Say Majority of Readers


Interoperability and integration between downstream systems and centralised data management systems have been an issue for some time, and last year’s events have brought the problem into the spotlight. According to the results of our latest Reference Data Review reader poll, it is not just the regulators that are concerned about this area: many of you are also keen to tackle the downstream impact of data management projects.

Regulators are focused on this area in particular, especially on how counterparty and entity data, for example, is used throughout an organisation. Disconnects between centralised data repositories and the downstream systems that consume this data are an obvious source of significant risk for financial institutions. It is in this light that many respondents to our latest reader poll are likely keen to address the area in a formal manner.

A total of 71% of respondents indicated that they believe formal working groups should be established in order to better tackle the issue of downstream data. The idea of informal communication on the subject was not popular, however, with only 7% of respondents wishing to tackle the area in this manner.

The trend towards formal groups established to tackle issues such as these is likely both a reaction to regulatory scrutiny (after all, it would provide proof that the industry is taking the area seriously) and recognition that dealing with such a complex issue requires all parties to be involved. Only with the establishment of a formal working group can a project such as this hope to get off the ground in a climate of great economic turmoil. Management needs to understand and be on board with any such endeavour to make sure it gets the right level of support.

Not everyone, however, believes that downstream systems are worthy of new working groups: 21% of respondents stated that enough attention is already being paid to the issue. Perhaps their organisations are ahead of the curve with regard to integration, or they simply have other things on their minds.

Regardless of this minority, interoperability and integration between downstream systems and centralised data management systems is likely to recur as a theme over the course of the year, as regulators clamp down on financial institutions’ risk management practices. Hopefully this will lead to funding for these projects, and some of the longstanding back office silos will be broken down, but as with anything in such a climate, progress is likely to be slow.

