Risk and Regulation Has Absolutely Raised the Profile of Data Management, Says BarCap’s Nigam

It is all very well reading high-level reports about the raised profile of risk management and its importance to the data management endeavour, but how is this trend actually affecting the day-to-day lives of data management professionals? In an exclusive interview with Reference Data Review, Rakesh Nigam, managing director and global head of strategic architecture and core services at Barclays Capital, gives his perspective on the challenges data managers face today.

“Risk and regulation has absolutely raised the profile of data management,” says Nigam. “Firms have become very conscious of operational errors and are trying to find out the reasons behind these errors – are they human errors or are they to do with the system? Firms need to see the root causes of these problems.”

Nigam, who has been with BarCap for four years and previously worked in a similar role in the Credit Markets division of JPMorgan, reckons this search for the root causes of errors has been prompted by increased regulatory oversight that is only going to get tougher. “Whereas previously firms would have seen and reported problems, they would have been unlikely to look any further into the root causes of them. But people now need to look down to the nth degree at what the problems are and create an action plan for how to fix them,” he explains.

At BarCap, these action plans are implemented as projects, says Nigam. “If we have got that kind of driver, then auditors want to see those projects go through to completion,” he adds. “When we have seen issues with static data being wrong or reference identifiers not being correct, we can go back in and fix those problems at source. Importantly, this involves the ability to follow that chain of data around the organisation. It is easier to fix high-level problems, but issues crop up when you start to translate that data downstream via the different systems of a firm and the data is changed at each step of the process. You start to lose the flow and the ability to track where that error has occurred.”

Data quality measurement is not an easy endeavour, but BarCap has metrics per asset class that track data downstream, giving Nigam’s team visibility of many different aspects, including trade dates. One of the most difficult things about getting metrics set up is actually deciding on those metrics, he says. “You need to make sure that you are finding the root cause of the problem.” However, Nigam believes some of that problem has been taken away from firms: regulators such as the Federal Reserve came up with a series of metrics that they wanted to measure in credit a couple of years ago and have since applied them to other derivatives. “This has allowed senior management to be given a health check on an area or an asset class from front office to back,” he says.

Regulatory scrutiny has also prompted a more collaborative approach within firms, according to Nigam, extending the collaboration between risk managers and data managers. “Whether it was a reaction to what has happened in the market or just the adoption of good practice, we definitely have a strong partnership between what goes on in terms of risk and the data being sent downstream,” he explains. “The quality of that data across any product has improved as a result. You’re not going to get anywhere in terms of improvement unless you have strong collaboration between the data managers and the downstream users of that data.”

The regulatory threat of stress testing in areas such as liquidity risk, however, poses a significant challenge to data managers. “Stress testing has implications for data management in terms of keeping the data and the granularity of the data. The work required for this probably isn’t as well understood as it could be at the moment,” says Nigam.
“People are still getting to grips with understanding the regulation that has come out and its implications on a broad level, before they begin to implement projects to meet those new requirements. The implementation of those changes would be tough even in a bull market and it is even tougher in this environment.”

Nigam feels the approach a data manager takes to pushing through these regulatory compliance projects is key to getting senior management buy-in. “One of the things that businesses don’t like to hear is that they have no choice, that things are mandatory. One of the main things to do with such a project is getting the business to appreciate that the work is being done not because it has to be done due to external pressure, but because it is good business practice,” he elaborates.

The effort required to get data management projects green-lit has obviously increased in the current economic environment, but Nigam has words of wisdom for those engaged in such an endeavour. Times are tight, strong prioritisation is critical, and spending is largely driven by an understanding of the business benefits and the likely ROI: everyone is spending their scarce dollars wisely.

“We really try to drive things front to back, it is part of our mantra, and so we take strong ownership of the data across the board. This allows you to get a portfolio effect of what projects need to be done and what they will involve. When you tie in project A with project X downstream and make clear that those projects have a dependency on each other, it makes a difference to the business benefits. We try to look at these projects in a total capacity and then ensure that we target the right projects to get completed,” says Nigam. “We have a group of projects around making sure that we capture our product data correctly and distribute it among different systems, and those projects are still going ahead. We are trying to make sure those stay alive, even though they could easily have been scrapped in the short term – but that would have been a false economy,” he continues.

It is not a lengthy process to get funding at BarCap, according to Nigam. “One of the good things about BarCap is that we don’t start projects for the sake of it – there is always a clear business sponsor and the projects are all fairly short term; even the larger projects are broken up into chunks. We ensure that those projects are seen in the same way as the introduction of a new trading system or a new product going live. Therefore we don’t necessarily have these projects that sit in the background for long periods of time.”

The firm is currently looking at de-duplication and the retirement of some of its legacy systems and, to do so, is working on a pilot approach. “We are proving that there are four business benefits and tying it to a big business initiative that is taking place in one of the asset classes,” Nigam elaborates. “It’s a bit like chopping greens up so small that they don’t even know they’re eating them. It is about tying it back to the business deliverable and making sure that management understands that the project is not just a reference data project for the sake of it; it is being done for the sake of the business. This means that you have a real deadline, a real focus and business people sponsoring the project,” he says.
These projects are around six to nine months long, and in February the data management team was drawing up plans to complete projects in 2009. Any project longer than this is unlikely to get buy-in, says Nigam: “It is too difficult to predict too far into the future and we have not yet set the 2010 budget.”

BarCap is, of course, engaged in a significant integration project at the moment, following last year’s acquisition of Lehman Brothers. Nigam is unable to go into the specifics, although he can say that he is being kept very busy.

Keeping data management projects alive is something many data managers are struggling with at the moment, given serious budget cuts. “I can imagine that for many firms in many areas data management projects have fallen by the wayside because people haven’t necessarily tied those business aspects with these core infrastructural projects,” says Nigam. “They haven’t been able to demonstrate the real business benefit or they haven’t been able to tie it back to the operational issues they have been experiencing day to day.” Those that have managed to keep projects alive will have proven their relevance and shown that they can carve them into smaller pieces they can get their arms around, he continues.

“The key thing here is that the industry is so lemming-like, in that everyone does exactly what everyone else is doing, and with the downturn and the negativity in the market there is a level of retrenchment going on. Some firms are just waiting to see what happens in the next year.”

This passivity is not the right way to cope with change, according to Nigam. “There is a different game to be played here – you prioritise what you spend wisely and continue to think about what will happen in the long term. One of the successes within BarCap is that we have been strong about what we are doing by focusing on the future,” he concludes.

Nigam’s main recommendation for his data management contemporaries is therefore to stand their ground. In such a restrictive climate, the only way to get budgets approved is to provide tangible metrics, and risk is a key consideration within this. Moreover, regulation should act as a catalyst for funding once it has been implemented. “I think the initial reaction to regulatory scrutiny will be concern, as it will be tough, but it is not just a hindrance, it is also a help. In some cases it will provide some consistency about what firms need to do – metrics setting, for example. It will also be useful in getting confidence back into the market,” he explains.

Although the market will take some time to get back on its feet, the crisis should be seen as an opportunity as well as a challenge. “We therefore need to be smart about where we spend our money and not just focus on the short term,” he suggests.
