The knowledge platform for the financial technology industry

A-Team Insight Blogs

Mining the Value of Data through Agile Data Governance


Agile data governance frameworks are presenting financial institutions with valuable new ways to use their data and respond swiftly to the changing demands placed upon it. They are also offering deeper insights into firms’ own operations, helping them to identify and eliminate emerging risks, shortening time to market and streamlining decision making.

While the common principles of agile data governance are still crystallising, experts are hailing its importance in helping firms mine value from data that might otherwise have lingered unused on corporate drives.

“The benefit of agile data governance is really to align to the rapid change that we’re seeing across digital transformation and being able to react to it in a much more rapid way,” says Marc Gilman, general counsel and vice president of compliance at data security specialist Theta Lake.

How agile data governance is helping firms is the subject of next week’s Data Management Insight webinar entitled “How to optimise the business value of your data using agile data governance”. The event will gather a panel of experts who will examine the fundamentals of this rapidly developing space.

David Masters, former Chief Data Officer of SGIL (UK) and Prime Services (Global) at Société Générale, will be joined by Lynn Watts, Head of Data Governance at Royal London Asset Management; Peter Baumann, Chief Executive and founder of ActiveNav; and Philip Dutton, Co-Chief Executive and co-founder of Solidatus.

Data Surge

Agile governance methodologies have emerged largely as a consequence of the rapid increase in the volume, types and complexity of data that firms are using and generating.

Governance structures are necessary to help ensure that managers of the data get it to the right people, in the right format, at the right time. Too often, methodologies have focused narrowly on protection and less so on data access. These less dynamic methodologies are being eschewed for a decentralised and responsive approach, Dutton told Data Management Insight.

“You’ve now got exponential data, exponential regulation and exponential willingness and desire to go faster – to change faster,” he says. “And because you combine all those three things together at the same time, you get a ball of complexity, which is filled with risk and whose impact is really difficult to understand and assess. And so, the whole governance space is having to shift.”

Agile principles in data governance were carried across from the software development sector, in which designers and engineers collaborate across functions to bring products to market faster. Along the way, so the theory enshrined in 2001’s “Manifesto for Agile Software Development” goes, by working in an iterative manner, teams continually improve and are able to nimbly adapt to change and challenges.

In practice, the application of these principles to data governance has taken the form of individuals in a firm becoming stakeholders in the management of its data. That involves widening access to data in order for them to review processes more regularly and adapt swiftly to changes to operations, markets and regulations, while also keeping data secure.

“That business stakeholder involvement is super, super important because the data governance process is lacklustre without it,” says Gilman.

Simplifying Complexity

Dutton argues that, because of their novelty, agile governance methodologies have been implemented differently across companies. But they all share a common aim of tackling the growing complexity – and consequent emergence of new risks – within enterprise data.

“It’s largely about embracing the fact that today things are changing so quickly, and that governance programmes and all the policies in place need to be looked at and reviewed and revisited at least monthly, to make sure that they’re mitigating against all the threats that they’re running into on a daily basis,” says Kyle McNabb, Solution and Product Marketing Growth Leader at data manager Rocket Software.

Commentators have argued that the transition to distributed data “ownership” was inevitable as the complexity of data increased. In this situation, an agile framework isn’t just useful, it is probably the only possible means of governance, says Suemee Shin, who works on enterprise data, architecture and strategy at Northern Trust.

“You just have to come up with some way to develop that in a distributed fashion and I think that dovetails with an agile software development methodology,” says Shin. “We don’t want a Wild West of data governance, but having a controlled environment where there’s some flexibility in the way that, say, the glossary is captured, allows for quicker implementations.”
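The "controlled environment with some flexibility" Shin describes can be illustrated with a small sketch: a glossary entry whose core fields are centrally mandated while each domain team attaches its own free-form attributes. The schema, field names and validation rule below are hypothetical, not drawn from any particular governance tool.

```python
from dataclasses import dataclass, field

@dataclass
class GlossaryTerm:
    # Centrally controlled: every team must populate these three fields.
    name: str            # canonical term name
    definition: str      # agreed business definition
    owner: str           # accountable data steward
    # Flexible: each domain team may capture extras its own way.
    domain_attributes: dict = field(default_factory=dict)

def is_valid(term: GlossaryTerm) -> bool:
    """Central control: the three core fields must be non-empty."""
    return all(v.strip() for v in (term.name, term.definition, term.owner))

# A domain team adds its own attributes without changing the shared schema.
term = GlossaryTerm(
    name="Net Asset Value",
    definition="Total asset value minus liabilities, per share.",
    owner="fund-accounting",
    domain_attributes={"refresh": "daily", "source_system": "ABOR"},
)
print(is_valid(term))  # True: core fields present, extras are free-form
```

The point of the sketch is the split of responsibilities: a thin mandatory core keeps the environment controlled, while the open attribute map lets distributed teams move quickly.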

Agile governance methodologies have many benefits, according to experts, starting with their implementation, which requires little or no investment in infrastructure, just the introduction of new ways of working. They also enable the quick and nimble response and adjustments necessary to navigate fast-changing regulatory oversight of data.

Theta Lake’s Gilman says the capture of internal communications data is of particular interest in this regard because it’s a relatively recent obligation placed on companies and one that touches on sensitive issues, including employee privacy. This alone makes agile governance necessary.

“That meeting of compliance and data governance is something that’s happening now and is likely to only increase as people leverage data tools,” he says.

No Silos

Further cost savings are to be had in the way decision making and other processes can be streamlined through the cross-referencing of data and the removal of data silos.

Solidatus’ Dutton illustrates this with reference to a client that implemented agile principles and found, among other things, that it substantially slashed the time it took to comply with regulations surrounding the transfer of data across the company’s international entities. A process that had typically taken up to 18 months and required input from hundreds of people was reduced to a matter of hours involving just 10 people.

“Global cross border data sharing never used to be a thing and now it is and it is strangling a lot of organisations because determining if you can share data between one jurisdiction and another is no longer a one-to-one matrix,” Dutton says. “From our platform we were able to link their datasets together. That brought savings of $60 million to $70 million.”
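Dutton's point that cross-border sharing "is no longer a one-to-one matrix" can be made concrete: whether a transfer is permitted may depend on the data category as well as the source and destination jurisdictions. The rules, jurisdictions and categories below are invented purely for illustration; real determinations are far richer.

```python
# Hypothetical rule table keyed by (source, destination) and data category.
# None of these rulings reflect actual regulation.
RULES = {
    ("UK", "EU"): {"client_pii": "allowed_with_safeguards", "trade_data": "allowed"},
    ("UK", "US"): {"client_pii": "blocked", "trade_data": "allowed"},
}

def can_share(source: str, destination: str, category: str) -> str:
    """Return the recorded ruling, or flag the transfer for review."""
    return RULES.get((source, destination), {}).get(category, "review_required")

print(can_share("UK", "US", "client_pii"))   # blocked
print(can_share("UK", "EU", "client_pii"))   # allowed_with_safeguards
print(can_share("US", "UK", "trade_data"))   # review_required (no rule recorded)
```

Encoding determinations this way, rather than in an 18-month manual exercise, is the kind of shift the linked-dataset approach Dutton describes makes possible.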

But not everyone is convinced that agile is a cure for all ills. Shin has argued that one of the aspects that proponents often tout – that governance processes are integrated into workflows, meaning rules don’t need to be written – can be a handicap.

The notion of “storyboarding” governance approaches without documenting them can lead to erroneous application. “If I asked for all that was actually implemented ‘can you give me all the data and where it’s coming from, where it’s going and how it’s transformed?’, the documentation doesn’t exist,” Shin says. “I am not a fan of the lesser documentation approach, because I do believe that target mapping for any data project requires time, it requires resources.”
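The documentation Shin says is missing – where data comes from, where it goes and how it is transformed – amounts to lineage records. A minimal sketch, with invented system and dataset names, shows why keeping such records makes the question answerable on demand:

```python
from dataclasses import dataclass

@dataclass
class LineageRecord:
    source: str          # upstream system or dataset
    target: str          # downstream system or dataset
    transformation: str  # plain-language description of the change applied

# Illustrative records for a hypothetical reporting flow.
lineage = [
    LineageRecord("trading_system.fills", "risk_mart.positions",
                  "aggregate fills to end-of-day positions per book"),
    LineageRecord("risk_mart.positions", "regulatory_report.exposures",
                  "map internal books to regulatory entity codes"),
]

def trace(target: str, records: list) -> list:
    """Walk upstream from a target to list every contributing source."""
    sources = []
    for rec in records:
        if rec.target == target:
            sources.append(rec.source)
            sources.extend(trace(rec.source, records))
    return sources

print(trace("regulatory_report.exposures", lineage))
# ['risk_mart.positions', 'trading_system.fills']
```

Without records like these, the answer to "where is this data coming from?" exists only in people's heads – which is precisely the lesser-documentation risk Shin warns about.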

Royal London Asset Management’s Watts is also doubtful that agile can be applied seamlessly, and questions how proactive data quality controls can be built into a solution. While she believes projects can build in controls during sprint cycles, she argues it is nevertheless vital for the integrity of the data that it sits with strategic “foundational” owners from the outset. “It’s really difficult to do that in an agile way,” she says.

Even Dutton cautions that agile may not be suitable for every organisation because it would require a fundamental cultural and organisational change. “The criticality of connected complexity cannot be underestimated – data does not exist in a vacuum, it is impacted and impacts policy, people and process,” he says. “Siloed thinking and a siloed approach will ultimately lead to the same outcomes we see today.”

Covid Impact

Proponents argue that the necessity of an agile approach was made apparent in the past two years when the Covid pandemic forced companies to reassess not only the way they manage their data, but the way they do business. Importantly, it underlined how quickly risks can emerge.

In that period, companies learned that being prepared means more than having a plan – it means being able to change and implement that plan, possibly multiple times. For McNabb, the pandemic has been a salutary warning.

“There’s going to be other episodes and events that take place over the next few years where organisations are going to be constantly trying to respond,” he says. “It’s Covid today, but there’s going to be a new administration coming in someplace and new laws, new regulations. There’s also going to be new cybersecurity threats. We’ve got to be prepared for that and that means we’ve got to constantly look and relook at what are we managing.”

