
Transformative Projects are Unlikely to Happen this Year in the Risk Data Space, Says Kinetic Partners’ Collins


“In 2010 it will be difficult to do anything transformative,” John Collins, a member of risk management platform provider Kinetic Partners’ consulting business, told delegates at the A-Team Insight Exchange in London this week. The cost pressures of the current market climate and an unwillingness to change core technology platforms will continue to hold back many firms from investing in new data architectures, agreed Collins and his fellow panellists.

The logic behind the benefits of a single pool of consistent data on which the front, middle and back office functions can rely is understood by most, said Collins: “But who pays for that single platform?” A single accessible source of data may be the desire in the market, but getting such an endeavour off the ground is far from easy, agreed the panel.

Collins, who recently joined Kinetic Partners from Rule Financial, drew on his earlier experience as an investment banker at firms including Morgan Stanley and Citi, highlighting how hard it is to get the range of divisions across a firm to take on the cost of a platform change. The P&L mindset of these teams is likely to prove the biggest obstacle to implementing an enterprise-wide platform, as no one team is willing to pay for the changes, he said.

Simon Tweddle, Mizuho International’s risk management chief operating officer, contended that the desire to drive down the total cost of ownership (TCO) might win out in the end, however. His firm has invested in its own data architecture, for example, but he conceded that Mizuho International is the smaller, London-based offshoot of its Japanese parent and therefore faces less of a challenge in terms of scale.

“At Mizuho the business heads are encouraged to challenge each other and have risk-based discussions outside of P&L arguments. Everybody in the business needs certain data items to be the same, so everybody must pay for them,” he explained.

Moreover, Tweddle noted that not every reference data item needs to be standardised in an enterprise-wide context. “Some data items such as legal entity data hierarchy and a single version of a trade need to be the same, but others don’t as they may have destination dependent data attributes,” he said. “Firms therefore need to be much smarter about what they invest in by targeting the right areas with the right combination of solutions.”
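To make Tweddle’s distinction concrete, here is a minimal Python sketch separating the attributes that must be identical firm-wide from destination-dependent overlays. The class and field names are purely hypothetical illustrations, not any firm’s or vendor’s actual data model.

```python
# Illustrative sketch only: hypothetical names, not any vendor's data model.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class TradeCore:
    """Attributes that must be identical everywhere in the firm."""
    trade_id: str
    legal_entity: str        # resolved against the firm-wide entity hierarchy
    instrument_id: str
    notional: float
    currency: str


@dataclass
class TradeView:
    """A destination-specific view layered over the shared golden core."""
    core: TradeCore
    overrides: dict = field(default_factory=dict)  # e.g. risk vs. settlement attributes

    def attribute(self, name):
        # Destination-dependent values win; everything else comes from the core.
        return self.overrides.get(name, getattr(self.core, name, None))


core = TradeCore("T-1001", "MIZUHO-INTL-LDN", "XS0123456789", 10_000_000, "USD")
risk_view = TradeView(core, overrides={"pv_model": "curve_v2"})
settlement_view = TradeView(core, overrides={"settle_date": "2010-06-30"})

# The shared items are the same however the trade is consumed.
assert risk_view.attribute("trade_id") == settlement_view.attribute("trade_id")
```

The point of the sketch is simply that only the core needs enterprise-wide governance; the overlays can stay with the consuming function.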

Amir Halfon, senior director of technology at Oracle Financial Services, added that another key driver for investment in data management is around performance improvement for the business. Collins agreed: “Time to market, for example, is a key determiner. To be able to respond quickly to a client is a competitive advantage.”

The process of changing systems is not an easy one, however, agreed the panel. Xavier Bellouard, managing director of Quartet FS, indicated that in his experience, clients often do not want to change user front ends and want to hold on to old technology. “This is why we have to be agnostic about data formats because all these different systems produce different formats. We can’t be too rigid and therefore have to take the data as it comes to us and translate it at the very last moment,” he elaborated. “We have to reuse as much as we can.”
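One way to read Bellouard’s “translate it at the very last moment” is lazy, per-format normalisation at the point of use rather than up-front conversion. The sketch below is an assumption-laden illustration of that idea and says nothing about Quartet FS’s actual product; all names are hypothetical.

```python
# Minimal sketch of format-agnostic ingestion with last-moment translation.
# Hypothetical names; not drawn from any vendor's API.
import csv
import io
import json

PARSERS = {}


def parser(fmt):
    """Register a translator for one upstream format."""
    def register(fn):
        PARSERS[fmt] = fn
        return fn
    return register


@parser("json")
def from_json(raw):
    return json.loads(raw)


@parser("csv")
def from_csv(raw):
    return next(csv.DictReader(io.StringIO(raw)))


class LazyRecord:
    """Holds the raw payload as delivered; translates only when a field is read."""
    def __init__(self, raw, fmt):
        self._raw, self._fmt, self._parsed = raw, fmt, None

    def __getitem__(self, key):
        if self._parsed is None:                 # translate at the very last moment
            self._parsed = PARSERS[self._fmt](self._raw)
        return self._parsed[key]


legacy_feed = LazyRecord("trade_id,notional\nT-1001,10000000\n", "csv")
new_feed = LazyRecord('{"trade_id": "T-1002", "notional": 5000000}', "json")
print(legacy_feed["trade_id"], new_feed["trade_id"])
```

The existing upstream systems keep producing whatever they produce; only the small registry of translators grows as new formats appear.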

Tweddle suggested that a gradual phasing out of legacy applications might be the most sensible way to approach the challenge, by allowing legacy applications to coexist with new technology for a period of time.
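One common pattern for this kind of coexistence is a routing layer that sends each capability to the legacy application until its replacement is switched on. The sketch below is a hypothetical illustration of that approach, not a description of Mizuho’s architecture.

```python
# Hedged sketch of legacy/new coexistence via per-capability routing.
# All functions and names are hypothetical.

def legacy_position_report(book):
    return f"legacy positions for {book}"


def new_position_report(book):
    return f"new-platform positions for {book}"


MIGRATED = {"position_report": False}   # flipped per capability as migration completes

ROUTES = {
    "position_report": (legacy_position_report, new_position_report),
}


def dispatch(capability, *args):
    old, new = ROUTES[capability]
    return (new if MIGRATED[capability] else old)(*args)


print(dispatch("position_report", "rates-book"))   # served by the legacy system
MIGRATED["position_report"] = True
print(dispatch("position_report", "rates-book"))   # now served by the new platform
```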

Collins disagreed with the notion that legacy applications should be left in situ in the long term. “The market will force technology people to make tough choices. There are interesting dynamics at play in the current market: regulatory drivers, the profitability and cost equation and the changes in the risk and business paradigms will force change. Not touching any of this technology is not a choice,” he argued.

