The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit: Keynote Questions the Use of Sticks or Carrots in Data Architecture


Putting data architecture before data management, Colin Gibson, head of data architecture in the markets division of Royal Bank of Scotland, presented the opening keynote at this week’s A-Team Group Data Management Summit in London. As well as a deep dive into data architecture, the summit considered the role of the chief data officer, drivers of data management, the problems it presents and some cultural and technology solutions.

Gibson titled his keynote ‘Data Architecture – Sticks or Carrots?’ and set out to describe how to build a successful and sustainable data architecture that will be used and updated consistently across an organisation. He identified a blueprint and a vision of the end state as the starting point, with the blueprint taking into account issues such as what the business needs and how those needs will be delivered in terms of data sources, storage, access, movement and management. Added to the blueprint are disciplines such as data policies, rules and standards, governance, best practice, modelling and tools. Together, the blueprint and disciplines make it possible to maintain the underlying data in an organisation.

Gibson explained: “For data management to be successful, it must follow the disciplines, but also keep updating the blueprint. This will maximise the effectiveness, efficiency, control and agility of the business.”

By using the blueprint to maintain underlying conceptual order in data, Gibson said it is possible to eliminate data inconsistencies, errors and confusion; improve the ability to combine data produced by different business departments; respond faster to change; rely less on subject matter experts; improve knowledge sharing; and improve the quality of business analysis for activities such as investment.

Demonstrating the economic benefits of a blueprint and disciplines, Gibson said: “Every £1 spent on documenting data architecture saves £5 on subsequent projects in similar areas; every £1 spent getting things right first time avoids £3 of remediation; every £1 spent creating a centralised source of reference data saves £20 on future investment and provides an 80% saving in running costs; and every £1 spent avoiding data replication saves £1 in data reconciliation.”

The bottom line impact of this approach is better data quality, business agility, timely project delivery and improved risk management. Gibson acknowledged that while this is the theory, live projects have deadlines and problems in practice, but he advocated against a short-term project focus, arguing that not doing things properly first time around stores up trouble for the future.

Detailing how to achieve sustainable benefits from data architecture, Gibson said: “The challenge for data architects is to plant seeds that will lead to success. The person planting the seeds will not reap the benefits, but both work for the same farmer, who says they must get things right.” Rather than a traditional approach to data architecture that uses a proverbial stick when people do not get things right, Gibson proposes a more positive approach that uses a carrot to encourage people to do the right thing.

Noting reference data as the lifeblood of Royal Bank of Scotland, he described GoldRush, the bank’s central source of reference data. “The bank understands its sources of reference data, but there is a problem when other organisations in the group do their own thing. We heard it was too difficult to take data from our golden sources, so we suggested and built a library of elements such as feed handlers, messaging tools and default database schemas that can be used in any project. This has been very successful, with 80 projects having used the library.”

Gibson cited Project Samurai, a project driven by regulation in Japan, as a great example of a small development team that tapped into the bank’s knowledge, saw that similar work had been done before, used a schema from the library and fed information back into the blueprint to deliver a perfectly documented project. He commented: “With the foundations already built, the team could focus on solving business requirements.”

Looking forward, Gibson said he is considering how business analysts could use the blueprint, how the bank could automate transformations and whether metadata could be used to automate reconciliations.
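Gibson did not describe how such metadata-driven reconciliation might be implemented, but a minimal sketch can illustrate the idea: if the mapping between a canonical data model and each source’s field names is captured as metadata, a single generic routine can reconcile any pair of feeds, and onboarding a new reconciliation becomes a metadata change rather than new code. The `FIELD_MAP` structure, field names and sample records below are hypothetical, invented purely for illustration.

```python
# Illustrative sketch only (not RBS code): field-mapping metadata drives a
# generic reconciliation between two data sources.

# Hypothetical metadata: canonical field name -> its name in each source.
FIELD_MAP = {
    "isin":     {"source_a": "ISIN",      "source_b": "isin_code"},
    "currency": {"source_a": "Ccy",       "source_b": "currency"},
    "price":    {"source_a": "LastPrice", "source_b": "px_last"},
}

def normalise(record: dict, source: str) -> dict:
    """Rename a source record's fields to canonical names using the metadata."""
    return {canon: record[names[source]] for canon, names in FIELD_MAP.items()}

def reconcile(rec_a: dict, rec_b: dict) -> list:
    """Return the canonical fields on which the two records disagree."""
    a = normalise(rec_a, "source_a")
    b = normalise(rec_b, "source_b")
    return [field for field in FIELD_MAP if a[field] != b[field]]

breaks = reconcile(
    {"ISIN": "GB0007547838", "Ccy": "GBP", "LastPrice": 250.5},
    {"isin_code": "GB0007547838", "currency": "GBP", "px_last": 251.0},
)
# breaks == ["price"]
```

The design choice worth noting is that the reconciliation logic never mentions a specific feed; all source-specific knowledge lives in the metadata, which is what would allow it to be generated or maintained alongside the blueprint rather than buried in per-project code.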

