

Finding a Balance Between Standards and Flexibility in Data Architecture

The integration of standards and flexibility into data architecture is an ongoing challenge for financial firms that must not only improve operational efficiency, but also sustain adaptability to support business and regulatory change. Both are important, but potentially opposed to each other, raising the question of how best to strike an optimal balance between standards and flexibility. This question will be discussed at A-Team’s forthcoming Data Management Summit in New York, but before the event we caught up with Brian Buzzelli, senior vice president, head of governance at Acadian Asset Management, and John Yelle, vice president at DTCC, to canvass their opinions on how to balance standards and flexibility.

Buzzelli describes a continuum that runs from complete flexibility with no standards to an abundance of standards, and suggests the optimal balance is just enough standards to ensure operational efficiency, combined with a framework that provides an understanding of data and business processes, allows flexibility, and can position a firm for change.

He balances flexibility and standards to deliver what he calls Service Level Expectations (SLEs), which ensure that data aspects such as quality, accuracy and timeliness are built into business processes. When a business function governed by an SLE sends data to the next function in the chain, that function also has an SLE. He explains: “This integrates standardisation between business functions at the business level. It’s like a manufacturing process for finance data that includes metrics such as data quality, completeness and packaging.”
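As a rough sketch of how such an SLE might be expressed in code, the example below defines quality, completeness and timeliness thresholds for a data handoff and checks a batch against them before it is passed to the downstream function. The dataclass, field names and thresholds are hypothetical illustrations, not Acadian’s implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class ServiceLevelExpectation:
    """Hypothetical SLE for a data handoff between two business functions."""
    min_quality_score: float   # share of records passing validation rules
    min_completeness: float    # share of records with all mandatory fields populated
    max_staleness: timedelta   # how old the data may be on arrival

def meets_sle(batch: list[dict], as_of: datetime, sle: ServiceLevelExpectation) -> bool:
    """Check a batch of records against the receiving function's SLE."""
    if not batch:
        return False
    quality = sum(1 for r in batch if r.get("validated", False)) / len(batch)
    mandatory = ("instrument_id", "price", "currency")  # illustrative field names
    completeness = sum(
        all(r.get(f) is not None for f in mandatory) for r in batch
    ) / len(batch)
    staleness = datetime.now(timezone.utc) - as_of
    return (
        quality >= sle.min_quality_score
        and completeness >= sle.min_completeness
        and staleness <= sle.max_staleness
    )

# Usage: the downstream function only accepts the batch if its SLE is met.
pricing_sle = ServiceLevelExpectation(0.99, 0.98, timedelta(hours=1))
batch = [{"instrument_id": "XS1234567890", "price": 101.2, "currency": "USD", "validated": True}]
print(meets_sle(batch, datetime.now(timezone.utc), pricing_sle))
```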

Yelle agrees on the importance of integrating both standards and flexibility, and distinguishes between different types of standards: external standards, such as ISO standards and messaging specifications, that benefit interoperability, and internal standards that are designed to support policies and influence behaviours.

He sees a move away from standards that have historically addressed technical issues and toward standards for the business side, such as standards for data governance processes. Most firms will use a mix of external and internal standards, but more external standards are expected to be implemented to improve interoperability between financial institutions and give regulators better insight into risk.
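As one concrete illustration of an external standard that supports interoperability, many firms validate Legal Entity Identifiers defined under ISO 17442, whose final two characters are check digits computed with the ISO 7064 MOD 97-10 scheme. The sketch below shows such a check; it is an illustrative example rather than a description of any firm’s systems.

```python
def lei_is_valid(lei: str) -> bool:
    """Validate an ISO 17442 Legal Entity Identifier (LEI).

    The last two characters are ISO 7064 MOD 97-10 check digits: letters are
    mapped to 10-35, digits kept as-is, and the resulting number mod 97 must be 1.
    """
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1

# Usage: lei_is_valid("<20-character LEI from a counterparty record>")
```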

On flexibility, Yelle says: “It could be argued that standards improve flexibility as solid foundations built on standards can be adapted without the need for reinvention. This is ideal, but not easy to get to as most firms are dealing with legacy systems.”

Related content

WEBINAR

Recorded Webinar: Managing unstructured data and extracting value

Unstructured data offers untapped potential but the platforms, tools and technologies to support it are nascent, often deployed for a specific problem with little reuse of common technologies from application to application. What are the challenges of managing and analysing this data and what are the considerations when making investments in this area? Data quality, consistency...

BLOG

Alveo and AquaQ Partner to Integrate Alveo Prime with AquaQ kdb+

Alveo and AquaQ Analytics have partnered to offer advanced data management and analytics for financial services firms. An early deliverable is the integration of Alveo’s Prime data mastering and data quality management solution with AquaQ’s kdb+ data capture solution. The bi-directional integration allows users to take mastered pricing and reference data from Prime into kdb+...

EVENT

Data Management Summit USA Virtual

Now in its 11th year, the Data Management Summit USA Virtual explores the shift to the new world where data is redefining the operating model and firms are seeking to unlock value via data transformation projects for enterprise gain and competitive edge.

GUIDE

Entity Data Management Handbook – Seventh Edition

Sourcing entity data and ensuring efficient and effective entity data management is a challenge for many financial institutions as volumes of data rise, more regulations require entity data in reporting, and the fight against financial crime is escalated by bad actors using increasingly sophisticated techniques to attack processes and systems. That said, based on best...