
Data Management Summit: Keynote Questions the Use of Sticks or Carrots in Data Architecture

Putting data architecture before data management, Colin Gibson, head of data architecture in the markets division of Royal Bank of Scotland, presented the opening keynote at this week’s A-Team Group Data Management Summit in London. As well as a deep dive into data architecture, the summit considered the role of the chief data officer, drivers of data management, the problems it presents and some cultural and technology solutions.

Gibson titled his keynote ‘Data Architecture – Sticks or Carrots?’, and set out to describe how to build a successful and sustainable data architecture that will be used and updated consistently across an organisation. He pointed to a blueprint and a vision of the end state as the starting point, with the blueprint taking into account issues such as what the business needs and how it will be delivered in terms of data sources, storage, access, movement and management. Added to the blueprint are disciplines, such as data policies, rules and standards, governance, best practice, modelling and tools. Together, the blueprint and disciplines allow an organisation to maintain order in its underlying data.
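Gibson did not go into implementation detail, but the idea of a blueprint with disciplines layered on top lends itself to a machine-readable form. The sketch below is purely illustrative; none of the class or field names come from the keynote, and a real blueprint would cover far more than this.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal model of the "blueprint plus disciplines" idea.
# All names here are invented for illustration.

@dataclass
class DataFlow:
    """One blueprint entry: where data originates and how it moves."""
    source: str   # originating system or vendor feed
    storage: str  # where the data is persisted
    access: str   # how consumers read it (API, query, extract)
    owner: str    # accountable business owner

@dataclass
class Blueprint:
    """A living picture of the organisation's data landscape."""
    flows: list[DataFlow] = field(default_factory=list)
    # Disciplines layered on top of the blueprint
    policies: list[str] = field(default_factory=list)
    standards: list[str] = field(default_factory=list)
    governance: list[str] = field(default_factory=list)

    def register_flow(self, flow: DataFlow) -> None:
        """Projects write back as they deliver, keeping the blueprint current."""
        self.flows.append(flow)
```

Captured this way, the blueprint is data in its own right: a project can query it before building and update it after delivering, which is exactly the ‘keep updating the blueprint’ discipline Gibson goes on to describe.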

Gibson explained: “For data management to be successful, it must follow the disciplines, but also keep updating the blueprint. This will maximise the effectiveness, efficiency, control and agility of the business.”

By using the blueprint to maintain underlying conceptual order in data, Gibson said it is possible to eliminate data inconsistencies, errors and confusion; improve the ability to combine data produced by different business departments; respond faster to change; rely less on subject matter experts; improve knowledge sharing; and improve the quality of business analysis for activities such as investment.

Demonstrating the economic benefits of a blueprint and disciplines, Gibson said: “Every £1 spent on documenting data architecture saves £5 on subsequent projects in similar areas; every £1 spent getting things right first time avoids £3 of remediation; every £1 spent creating a centralised source of reference data saves £20 on future investment and delivers an 80% saving in running costs; and every £1 spent avoiding data replication saves £1 in data reconciliation.”
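Taken at face value, those multipliers compound quickly. The short calculation below applies them to a hypothetical £100,000 programme; the split of spend across the four activities is invented for illustration, and only the ratios come from Gibson’s figures.

```python
# Gibson's quoted savings ratios applied to an invented spend profile.
# Only the multipliers come from the keynote; the amounts are hypothetical.

spend = {
    "documenting_architecture": 40_000,  # £1 saves £5 on later projects
    "right_first_time":         30_000,  # £1 avoids £3 of remediation
    "central_reference_source": 20_000,  # £1 saves £20 on future investment
    "avoiding_replication":     10_000,  # £1 saves £1 in reconciliation
}

multipliers = {
    "documenting_architecture": 5,
    "right_first_time":         3,
    "central_reference_source": 20,
    "avoiding_replication":     1,
}

total_spend = sum(spend.values())
total_saved = sum(spend[k] * multipliers[k] for k in spend)
print(f"Spend £{total_spend:,} -> save or avoid £{total_saved:,}")
# Spend £100,000 -> save or avoid £700,000
```

On this invented split, £100,000 of up-front architecture work heads off £700,000 of later cost, which is the economic case Gibson was making.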

The bottom-line impact of this approach is data quality, business agility, timely project delivery and better risk management. Gibson acknowledged that while this is the theory, live projects have deadlines and problems, but he advocated against a short-term project focus, saying that not doing things properly first time around stores up trouble for the future.

Detailing how to achieve sustainable benefits from data architecture, Gibson said: “The challenge for data architects is to plant seeds that will lead to success. The person planting the seeds will not reap the benefits, but both work for the same farmer, who says they must get things right.” Rather than a traditional approach to data architecture that uses a proverbial stick when people do not get things right, Gibson proposes a more positive approach that uses a carrot to encourage people to do the right thing.

Noting reference data as the lifeblood of Royal Bank of Scotland, he described GoldRush, the bank’s central source of reference data. “The bank understands its sources of reference data, but there is a problem when other organisations in the group do their own thing. We heard it was too difficult to take data from our golden sources, so we suggested and built a library of elements such as feed handlers, messaging tools and default database schemas that can be used in any project. This has been very successful, with 80 projects having used the library.”
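The library’s value is that a project consumes a tested connector to the golden source rather than building its own feed. The fragment below sketches that pattern; GoldRush is named above, but the registry interface and every identifier here are invented for illustration.

```python
from typing import Callable, Dict

# Hypothetical sketch of a "library of elements": projects look up a
# ready-made feed handler for a golden source instead of writing one.
# GoldRush is real; this interface and all names are invented.

class GoldenSourceLibrary:
    """Registry of reusable connectors to central reference data."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], dict]] = {}

    def register(self, name: str, handler: Callable[[], dict]) -> None:
        self._handlers[name] = handler

    def feed_handler(self, name: str) -> Callable[[], dict]:
        # Reuse a tested handler rather than re-implementing the feed.
        return self._handlers[name]

library = GoldenSourceLibrary()
library.register("goldrush_instruments", lambda: {"isin": "GB00B03MLX29"})

# A new project taps the golden source with one lookup:
fetch = library.feed_handler("goldrush_instruments")
record = fetch()
```

The same registry idea extends naturally to the messaging tools and default database schemas Gibson mentions.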

Gibson cited Project Samurai, a project driven by regulation in Japan, as a great example of a small development team that tapped into the bank’s knowledge, saw that similar work had been done before, used a schema from the library and fed information back into the blueprint to deliver a perfectly documented project. He commented: “With the foundations already built, the team could focus on solving business requirements.”

Looking forward, Gibson said he is considering how business analysts could use the blueprint, how the bank could automate transformations and whether metadata could be used to automate reconciliations.
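To illustrate the last of those ideas, the sketch below shows one way metadata could drive a reconciliation: if each system’s field names are mapped to blueprint terms, records can be normalised and compared without hand-written mapping code. Everything here, from the system names to the field maps, is hypothetical.

```python
# Hypothetical metadata-driven reconciliation. If metadata maps each
# system's local field names onto blueprint terms, the comparison can be
# generated rather than hand-coded. All names below are invented.

field_map = {
    "system_a": {"instr_id": "instrument", "qty": "quantity"},
    "system_b": {"instrument_code": "instrument", "position": "quantity"},
}

def normalise(system: str, record: dict) -> dict:
    """Translate a system-local record into blueprint terminology."""
    mapping = field_map[system]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = normalise("system_a", {"instr_id": "GB00B03MLX29", "qty": 100})
b = normalise("system_b", {"instrument_code": "GB00B03MLX29", "position": 90})

# Any field where the two systems disagree is a reconciliation break.
breaks = {k: (a[k], b[k]) for k in a if a[k] != b[k]}
print(breaks)  # {'quantity': (100, 90)}
```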

Related content

WEBINAR

Recorded Webinar: End-to-End Lineage for Financial Services: The Missing Link for Both Compliance and AI Readiness

The importance of complete, robust end-to-end data lineage in financial services and capital markets cannot be overstated. Without the ability to trace and verify data across its lifecycle, many critical workflows – from trade reconciliation to risk management – cannot be executed effectively. At the top of the list is regulatory compliance. Regulators demand a...

BLOG

Why Outsourcing is Shifting from Cost Centre to Being a Catalyst for Transformation

By Sarva Srinivasan, Managing Director, NeoXam Americas. For decades, outsourcing across all industries has been synonymous with trimming the back office, streamlining headcount, and delegating so-called non-core processes to third parties. But in the world of finance, the ground is well and truly shifting. As the asset management and servicing industries face mounting multi-asset...

EVENT

TradingTech Summit London

Now in its 15th year, the TradingTech Summit London brings together the European capital markets trading technology industry to examine the latest changes and innovations in trading technology and explore how technology is being deployed to create an edge for sell-side and buy-side financial institutions.

GUIDE

Dealing with Reality – How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again. But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I...