Talking Reference Data with Andrew Delaney: Spreadsheet Wars Reprised

In an interview with the excellent Robert Elms on BBC Radio London the other day, author and autistic savant Daniel Tammet said something along the lines of: “Mathematics is more about how you came up with the answer than the answer itself.” Tammet was promoting his forthcoming book, Thinking in Numbers: How Maths Illuminates Our Lives.

His remark reminded me of a first-year maths lesson at the lovely Westcliff High School for Boys, wherein I was actually able to come up with the answer to a maths problem. When the teacher asked how I had arrived at said answer, I pointed to the ink-stained back of my hand, where I had done all the calculations. Unintelligible to all but myself, my lines of calculation failed to pass muster and my moment of glory – I wasn’t a maths champion, so this was a rarity – quickly passed; a lesson learned.

And this vignette is how I’m starting to think about the way regulators are looking at risk and other reports from financial institutions. It’s not just about the answer anymore; the regulators want to understand the models underlying the reports and – more pertinently to us – whether the data used in the models is both accurate and consistent.

This new rigour of approach is apparent from a reading of the latest report from the Basel Committee on Banking Supervision, published in June. The report – Principles for Effective Risk Data Aggregation and Risk Reporting – concludes “that banks’ information technology and data architectures were inadequate to support the broad management of financial risks”.

I was turned on to the report by Ralph Baxter, CEO of ClusterSeven. He reckons the new regulatory attitude brings more immediacy to firms’ compliance efforts, which in turn means they’ll need to tackle data management problems now rather than present a three-year plan for tackling them, as they have in the past. And that means getting a handle on the huge number of spreadsheets being used throughout their organisations to generate the data that feeds risk, analytics and regulatory reports.

Ralph says his clients may have between 500 and 5,000 spreadsheets, many of which are linked, creating a monumental data management headache, particularly when regulators want to see where mission-critical data is being sourced. It ain’t enough to point to an inky back of hand.
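
To make that concrete, here is a minimal sketch – an illustration rather than ClusterSeven’s method – of how one might start mapping where spreadsheet data is sourced. It relies only on the Office Open XML layout, in which each external workbook reference is recorded under xl/externalLinks/_rels/ inside the .xlsx container; the folder path is hypothetical.

```python
# A minimal lineage sketch, assuming the spreadsheets sit in a scannable folder.
# Each external workbook reference in an .xlsx file is recorded as a
# Relationship entry under xl/externalLinks/_rels/ inside the zip container.
import pathlib
import zipfile
import xml.etree.ElementTree as ET

REL_TAG = "{http://schemas.openxmlformats.org/package/2006/relationships}Relationship"

def external_links(xlsx_path: pathlib.Path) -> list[str]:
    """Return the target paths of any external-workbook links in one .xlsx file."""
    targets: list[str] = []
    with zipfile.ZipFile(xlsx_path) as zf:
        for name in zf.namelist():
            if name.startswith("xl/externalLinks/_rels/"):
                root = ET.fromstring(zf.read(name))
                targets += [rel.get("Target", "") for rel in root.iter(REL_TAG)]
    return targets

def lineage(folder: str) -> dict[str, list[str]]:
    """Map each spreadsheet in the folder to the workbooks it pulls data from."""
    graph: dict[str, list[str]] = {}
    for path in pathlib.Path(folder).rglob("*.xlsx"):
        links = external_links(path)
        if links:
            graph[str(path)] = links
    return graph

if __name__ == "__main__":
    for sheet, sources in lineage("//shared-drive/risk-models").items():  # hypothetical path
        print(f"{sheet} depends on {sources}")
```

Feed the resulting graph into a diagramming or lineage tool and the inky back of hand starts to look rather more auditable.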

ClusterSeven has been pitching spreadsheet management as an issue for years. But Baxter may have a point: that spreadsheets’ number is finally up. Here’s a selection of his highlights from the Basel report that bear out his argument:

• Principle 1.23 “A bank’s board and senior management should be fully aware of any limitations that prevent full risk data aggregation in terms of … reliance on manual processes.”

• Principle 3.28(b) “Where a bank relies on manual processes and desktop applications (e.g. spreadsheets, databases) and has specific risk units that use these applications for software development, it should have effective mitigants in place (e.g. end-user computing policies and procedures) and other effective controls that are consistently applied across the bank’s processes.”

• Principle 3.30 “There should be an appropriate balance between automated and manual systems … a higher degree of automation is desirable to reduce the risk of errors”

• Principle 3.31 “… banks to document and explain all of their risk data aggregation processes, whether automated or manual. Documentation should include an explanation of the appropriateness of any manual workarounds”

• Principle 6.39 “A bank’s risk data aggregation capabilities should be flexible and adaptable to meet ad hoc data requests as needed.”

• Principle 7.44 “To ensure the accuracy of the [risk] reports a bank should maintain, at a minimum, the following: (a) Defined requirements and processes to reconcile reports to risk data; (b) Automated and manual edit and reasonableness checks … (c) procedures for identifying, reporting and explaining data errors in data integrity via exception reports.”
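
Paragraph 7.44’s call for reconciliation, reasonableness checks and exception reports is concrete enough to sketch. The snippet below is a minimal illustration under assumed field names (“desk”, “exposure”) and an assumed tolerance – nothing here is prescribed by the Basel text – that re-aggregates source exposures, compares them with reported totals and writes any breaks to an exception report.

```python
# Sketch of an automated reconciliation and reasonableness check with an
# exception report. Field names, tolerance and the CSV output format are
# illustrative assumptions, not taken from the Basel report.
import csv
from dataclasses import dataclass

TOLERANCE = 0.01  # assumed 1% relative tolerance for the reasonableness check

@dataclass
class Break:
    desk: str
    reported: float
    recalculated: float

def reconcile(reported: dict[str, float], source_rows: list[dict]) -> list[Break]:
    """Re-aggregate source exposures by desk and compare with reported totals."""
    recalculated: dict[str, float] = {}
    for row in source_rows:
        recalculated[row["desk"]] = recalculated.get(row["desk"], 0.0) + float(row["exposure"])
    breaks = []
    for desk, reported_value in reported.items():
        actual = recalculated.get(desk, 0.0)
        if abs(reported_value - actual) > TOLERANCE * max(abs(actual), 1.0):
            breaks.append(Break(desk, reported_value, actual))
    return breaks

def write_exception_report(breaks: list[Break], path: str) -> None:
    """Persist the breaks so data errors can be identified, reported and explained."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["desk", "reported", "recalculated"])
        for b in breaks:
            writer.writerow([b.desk, b.reported, b.recalculated])
```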

And Ralph’s not alone in thinking that spreadsheets are presenting financial firms of all shades with some of their greatest data management challenges. At our Data Management Summit back in May, UBS’s Rupert Brown expounded on the issues around spreadsheet management on a panel on Emerging Technology Approaches to Big Data, Cloud, Grid and In-Memory.

According to Rupert, back in the 1980s he and a colleague from Midland Montagu built an early CEP engine using distributed Excel on a network. “Excel has been with us for 25 years; we have unstructured data as a result of the spreadsheet; the spreadsheet remains the killer app, and actually there is no replacement for Excel. So we have this scratchpad that allows us to create arbitrary structures and link them together, and that’s why we have the varying silos of data …

So we have this lineage, where our founding stone of data is now unstructured. Thirty years ago, our founding stone of data was the mainframe, because that was all we had; now our founding stone is the spreadsheet.”

He added: “There are a whole bunch of CIOs and COOs out there in the business that are terrified about their spreadsheets. The good news is there are tools out there that now allow us to harness and measure them, and we should not be afraid of doing that; the risk people are terrified of it, but actually it is something you can control and manage, and that’s where architects have to earn their corn.”

Indeed, Rupert’s remarks echoed research we conducted earlier in the year for a report on Big Data for IBM’s Platform unit.

As part of the research, we interviewed an enterprise architect from a major US-based fund management firm, who pinpointed spreadsheets as one of his biggest concerns when it came to managing data across the enterprise. According to this executive, “We’re looking at traditional user files, but also we have a lot of code: executables. These reside outside of the mainstream applications, perhaps in people’s shared drives. These are applications that people are using. But what are they? And are they covered by our governance policies? Do they look like they could expose the company in any way? Many of them are homegrown applications that have bypassed the IT department. That’s unstructured data as it’s just an executable, or an interim data repository. Are we making decisions on that? Does it go through proper channels of compliance and governance?”

And the most common form of programming?

“Excel. People can develop sophisticated applications and formulas; you don’t need to be a coding genius. You can code in the cloud these days, bring it down and execute.”
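
Getting even a crude inventory of that shadow estate is straightforward to sketch, if not to act on. The following illustration – with an assumed shared-drive path and an assumed list of suspect file types – walks a directory tree and flags macro-enabled workbooks and stray executables that may be sitting outside governance policies.

```python
# A minimal discovery sketch, assuming the shadow estate lives on a shared
# drive as the interviewee describes. The root path and extension list are
# assumptions, not anyone's prescribed approach.
import pathlib
import zipfile

FLAGGED_SUFFIXES = {".xlsm", ".xlsb", ".exe", ".bat", ".ps1"}

def has_vba(workbook: pathlib.Path) -> bool:
    """True if an Open XML workbook carries an embedded VBA project."""
    try:
        with zipfile.ZipFile(workbook) as zf:
            return any(name.endswith("vbaProject.bin") for name in zf.namelist())
    except (zipfile.BadZipFile, OSError):
        return False

def find_ungoverned(root: str) -> list[pathlib.Path]:
    """Collect files on the shared drive that look like end-user applications."""
    hits = []
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix.lower() in FLAGGED_SUFFIXES:
            hits.append(path)
        elif path.suffix.lower() == ".xlsx" and has_vba(path):
            hits.append(path)  # macros hiding in a renamed .xlsx container
    return hits

if __name__ == "__main__":
    for hit in find_ungoverned("//shared-drive/front-office"):  # hypothetical path
        print(hit)
```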

For ClusterSeven’s Baxter, current thinking as exemplified by the Basel report will force institutions to address the thorny problem of Excel spreadsheets operating outside of firms’ governance policies. Is it time to get our spreadsheet houses in order?
