Tony Kirby may no longer work at Reuters, but he’s still secretary of the Reference Data User Group. In a month of turmoil, he spoke to Reference Data Review about the group’s activities and ongoing relations with its counterparts.
Tony Kirby is in danger of becoming a professional financial technology evangelist. A few years ago, he was highly visible as the public face of the Global Straight-Through Processing Association. The GSTPA is now defunct, but Kirby had already moved to head Reuters’ Global STP Programme.
Last year he was back in the news with the foundation of the Reference Data User Group of which he is secretary. Following Reuters’ termination of its STP programme and Kirby’s departure from the vendor at the beginning of this month, he has continued at RDUG, and his secretarial role there is now the main call on his time.
Or you could look at it the other way around: four years ago STP was the hot-button topic, and Kirby was in the thick of it. By the end of 2001 much of that work had been done and T+1 was on the back burner. There was a new hot-button topic: reference data. Reuters was clambering into bed with Capco, with TowerGroup numbers thrown around like confetti to celebrate a marriage that turned out to be surprisingly short-lived. Then along came RDUG, and it might be argued that Kirby’s involvement with the group took up more of his time than any employer would have been entirely happy with – take a look at the timeline slides in the RDUG presentations, such as the vendor presentation, or the session Kirby presented at the FinExpo event in London last month. (You’ll find them at www.isitc.org and www.finexpo.org respectively.)
So, a serial committee member, or an industry guru who has his finger on the pulse and his seat at the table when the grown-ups are talking turkey? Take your pick; most people have, as is the way when someone has anything approaching a profile in any business.
Whatever your choice, be ready to be battered by statistics. “Sixty-one per cent of respondents in the latest TowerGroup survey on Outsourcing Data Management cited reference data projects as high or top priority,” he says in the RDUG Industry Issues White Paper presented this month (of which he is a co-author).
But these numbers increasingly have currency symbols in front of them. “What is the impact of failed trades in a form that a trader can understand? I’ve taken some Omgeo Fulcrum information and Plexus Group numbers and said, what is the actual cost of a failed trade? Well, for a domestic trade it turns out to be in the region of EUR188, and cross-border was more like double that,” he says. “Take that further and look at average bargain size and so on, and you can see that in the front office things like order routing and order management systems are shaving fractions of that off the cost. It’s penny wise and pound foolish.”
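To see how those per-fail figures translate into an annual bill, here is a rough sketch. Only the EUR188 domestic cost and the “more like double” cross-border multiple come from the article; the fail rate, volume and cross-border share below are invented placeholders.

```python
# Figures from the article (Omgeo Fulcrum / Plexus Group data as cited by Kirby):
DOMESTIC_FAIL_COST_EUR = 188.0
CROSS_BORDER_MULTIPLE = 2.0  # cross-border "more like double" the domestic cost

def annual_fail_cost(trades_per_year, fail_rate, cross_border_share):
    """Rough annual cost of failed trades for a mixed book, in EUR."""
    fails = trades_per_year * fail_rate
    # Blended cost of one failed trade across domestic and cross-border business
    per_fail = (DOMESTIC_FAIL_COST_EUR * (1 - cross_border_share)
                + DOMESTIC_FAIL_COST_EUR * CROSS_BORDER_MULTIPLE * cross_border_share)
    return fails * per_fail

# Hypothetical example: 100,000 trades a year, 5% fail rate, 30% cross-border.
print(round(annual_fail_cost(100_000, 0.05, 0.30)))  # 1222000
```

Even with modest placeholder assumptions, the failed-trade bill runs to seven figures a year, which is the point of Kirby’s “penny wise and pound foolish” jibe.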
This issue of costs is a recurring theme in Kirby’s arguments for addressing reference data management, and for doing it across the industry. A central plank of his current case rests on return on investment. “We require quick wins that show early RoI,” says Kirby. More figures: at the end of 2002, surveys showed that 65 per cent of companies needed to provide RoI analysis to justify IT projects, up from 23 per cent the previous year. Among financial services companies, 80 per cent now require the RoI performance of IT investments to improve on the previous year’s.
The good news for those involved in the reference data field is that there is evidence that users can make significant returns on their investments. Kirby cites figures, based on Capco research from last year, showing that the introduction of “a comprehensive reference data solution” could realise potential annual savings of $2.25 million for a firm making 600,000 global equities and fixed income transactions a year. For firms doing 10 million transactions a year, the savings are in the $19.1 million area. A large portion of this comes from the reduction of costs associated with exception handling.
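A quick back-of-the-envelope check puts those two cited savings figures on a per-trade basis. Only the two (volume, saving) pairs come from the article; the per-trade breakdown is our own arithmetic.

```python
# Capco-derived figures as cited in the article: annual trade volume -> USD saved per year
cited_savings = {600_000: 2_250_000, 10_000_000: 19_100_000}

for trades, saving in cited_savings.items():
    # Divide the annual saving by the annual volume to get a per-trade figure
    print(f"{trades:>10,} trades/yr: ${saving / trades:.2f} saved per trade")
# Works out to $3.75 per trade at 600,000 trades and $1.91 at 10 million
```

The per-trade saving falls as volume rises, but at either end of the scale it is of the same order as the front-office fractions-of-a-cost savings Kirby contrasts it with.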
Those are numbers that are going to get you attention in any sector. When you factor in the qualitative benefits specific to the financial services sector – improved settlement and STP rates, fewer exceptions, and greater accuracy and transparency of both internal and external reporting – you’re starting to get a compelling case.
And then there is the regulatory question. “Any approach to operational risk management has to consider reference data: 40 per cent of your trade records might be reference data,” he says. “You have to identify what you’re trading, the matching, accruals and so on, and you’ve got to make sure that the instructions are passed to the right parties. It’s becoming a key regulatory issue – not just reporting but trading cost analysis. The Financial Services Authority has just issued a consultation document on best execution with the intention of having draft rules in place by the middle of the year. The Investment Management Association is also on the case, and the Group of 30 has come out with a specific recommendation.”
Kirby is sceptical about the role of the regulators. “They’d probably like to be more prescriptive, but they are realising that they can’t do that any more,” he says. “It’s one thing for the regulators to jump up and down about T+1, quite another to get the industry participants to actually do it. What they are good at is setting the goals, like a kind of Kennedy mission statement – more the what than the how, in other words.”
Why is this all happening? “There are more and more folks out there realising that as a trade advances in time it becomes more complex, and we should be focusing on quality issues, not just standards, though they are important. If the data elements sitting in the standard container are wrong, then that’s where you get problems. If you don’t have the right information at the right part of the value chain, what you end up with is expensive reconciliation.”
Understanding the nature of that information and where it fits in the value chain was part of the reason for setting up RDUG in the first place, says Kirby. “If 40 per cent of your trade record is reference data, then you ought to know a heck of a lot about reference data to really know about your processes. I just thought that we’d got to get an education process going.”
That education process has now evolved into a global – or, at least, transatlantic – group that numbers most of the major players among its members. At the end of this month, it will be meeting with representatives of the Reference Data Coalition in the US to explore further collaboration. “What RDUG does is focus on pain with a lot of buy-side practitioners, and REDAC will be going out to build consensus between different bodies like ANNA and Swift,” says Kirby. “That’s not something that we have the time, effort or even the inclination to do, because there is already a mechanism for doing that in the form of the FISD, which has staffing and processes. Our model is: if there is something there, use it; if there isn’t, fill the gap.”