
Case Study: Leveraging Data for Operational Risk Monitoring


Given the rise of high-profile incidents of rogue trading, a major US bank has worked with business and technology consultant Detica to pull together its data and monitor operational risk across a number of controls, providing an effective tool for combating these incidents. Through the process, however, other business benefits emerged, said Roger Braybrooks, head of research for global financial markets at Detica.

Rogue trading, such as the recent Caisse d’Epargne equity derivatives incident or the Jerome Kerviel affair at Societe Generale, is generally facilitated by control failures, but no two failures are the same, said Braybrooks. “When you’re not looking at it from a holistic point of view, you won’t catch it.”

He cited a recent statement from the FSA suggesting the need for a series of yellow flags to be applied within areas of concern, which could then be aggregated to produce a red flag on control concerns across those areas. This is essentially the approach Detica has taken with its networked operational risk model.
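To make the idea concrete, here is a minimal Python sketch of how yellow flags raised independently by several control areas might be rolled up into a single red flag. The thresholds, area names and observations are illustrative assumptions, not details of Detica’s actual model.

```python
# Hypothetical sketch of the flag-aggregation idea: each control area raises
# "yellow" flags independently; a "red" flag is raised only when concerns
# appear across several areas at once. Thresholds are assumptions.
YELLOW_THRESHOLD = 1   # yellow flags needed for an area to count as "of concern"
RED_THRESHOLD = 3      # number of concerned areas that triggers a red flag

def aggregate_flags(yellow_flags):
    """yellow_flags maps a control area (e.g. 'settlements') to the list of
    flagged observations raised by that area's own monitoring."""
    concerned_areas = [
        area for area, flags in yellow_flags.items()
        if len(flags) >= YELLOW_THRESHOLD
    ]
    red_flag = len(concerned_areas) >= RED_THRESHOLD
    return red_flag, concerned_areas

# Example: three separate areas each raise a low-level concern about the
# same trading desk -- individually tolerable, collectively a red flag.
flags = {
    "market_risk": ["limit breach on desk X"],
    "settlements": ["unusual volume of cancelled trades on desk X"],
    "compliance": ["missed mandatory leave for trader on desk X"],
    "credit_risk": [],
}
print(aggregate_flags(flags))  # (True, ['market_risk', 'settlements', 'compliance'])
```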

The first step with its pilot client was to clearly identify the controls that needed to be monitored, based on a defined risk model. From there, data was contributed to a shared file by various groups across the organisation, such as credit risk, market risk, compliance, finance, settlements, operations and others. Braybrooks acknowledged that getting participation across these business units was challenging, which is why sponsorship from the top of the organisation is necessary to drive through this kind of project.
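As an illustration of the data-pooling step, the sketch below shows control observations from several business units being recorded against a common key (a trading desk) so they can later be viewed side by side. The unit names, fields and helper function are hypothetical; the case study does not describe the structure of the shared file.

```python
from collections import defaultdict

# Shared store keyed by trading desk, then by contributing business unit.
shared_file = defaultdict(lambda: defaultdict(list))

def contribute(unit, desk, observation):
    """Record one control observation from a business unit against a desk."""
    shared_file[desk][unit].append(observation)

# Contributions from different units about the same desk.
contribute("market_risk", "desk_X", "VaR limit breached twice this week")
contribute("settlements", "desk_X", "spike in late trade confirmations")
contribute("compliance", "desk_X", "trader has not taken mandatory leave")

# The cross-unit view that no single unit would see on its own.
for desk, by_unit in shared_file.items():
    print(desk, dict(by_unit))
```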


Related content

WEBINAR

Recorded Webinar: Unlocking value: Harnessing modern data platforms for data integration, advanced investment analytics, visualisation and reporting

Modern data platforms are bringing efficiencies, scalability and powerful new capabilities to institutions and their data pipelines. They are enabling the use of new automation and analytical technologies that are also helping firms to derive more value from their data and reduce costs. Use cases of specific importance to the finance sector, such as data...

BLOG

Data Quality Still Troubling Private Market Investors: Webinar Review

Obtaining and managing data remains a sticking point for investors in private and alternative assets as financial institutions sink more of their capital into the markets. In a poll of viewers during a recent A-Team LIVE Data Management Insight webinar, respondents said the single-biggest challenge to managing private markets data was a lack of transparency...

EVENT

AI in Capital Markets Summit London

Now in its 3rd year, the AI in Capital Markets Summit returns with a focus on the practicalities of onboarding AI enterprise-wide for business value creation. While AI offers huge potential to revolutionise capital markets operations, many firms are struggling to move beyond the pilot phase and generate substantial value from AI.

GUIDE

Dealing with Reality – How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again. But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I...