The introduction of 70 capital markets regulations in Europe over the next 18 months, and of about 300 regulations in the US, is driving the need for big data solutions. These must deliver on-demand risk and business analytics reports, support more granular risk management, and provide the ability to combine the unstructured data behind products such as OTC derivatives with structured data.
Opening A-Team Group’s Data Management for Risk, Analytics and Valuations Conference in London this morning, A-Team Group president and editor-in-chief Andrew Delaney outlined the scenario of increasing regulation and the crucial role of data management in the financial markets enterprise. Joining Delaney, Amir Halfon, senior director of technology for capital markets in Oracle’s global financial services business, offered potential solutions to the technology issues of big data management and, in light of regulatory pressure, emphasised the need to make data quality and management ‘front and centre’.
While Delaney noted the downside difficulty of keeping on top of what forthcoming regulations mean to firms and the industry, he also described the upside opportunities of operational efficiencies presented by change. He said: “STP will be back, but operating at higher rates, and the need to comply with more regulations will have implications for both operational risk and systemic risk.”
Regulatory compliance will, undoubtedly, mean stringent risk reporting, on-demand risk and business analytics, and improved risk management. From an industry point of view this means pulling together multiple risk and analytics platforms, as well as disparate data sources, to secure an enterprise-wide view of risk, broaden distribution of risk information and analytics, and report in multiple timeframes to deliver on-demand information. “This is not just about trading and risk management, it is about risk reporting across other elements of the business such as compliance, credit and the board,” Delaney commented.
Tackling the challenges posed by these requirements is work in progress, but Delaney’s research suggests an industry that has an appetite for change and a desire to build a holistic view of risk. The expansion of data management that is part of this work may lean on big data technologies such as grid and cloud computing, but the endgame must be to reduce risk.
Halfon agreed with Delaney’s needs assessment, highlighting the requirement for real-time data management with examples such as the US Office of Financial Research’s entitlement to ask companies for data at any time.
Acknowledging that many firms are moving to on-demand data platforms and touching on the emergence of the legal entity identifier standard, Halfon discussed concepts and technologies behind big data management. “This is about extreme scales and volumes,” he said. “It is the perfect storm. On one side are the regulators and on the other the need to access high-throughput, fast-moving data very quickly.”
Halfon described the need to incorporate the four ‘Vs’ of volume, variety, velocity and value into big data solutions, as well as the capability to push unstructured data into the structured world.
Volume can be delivered using technologies such as engineered machines and data grids, while the requirement for variety, typically integrating unstructured and structured data, can be met by XML, semantics, entity extraction and developments such as the Hadoop open source framework and, potentially, NewSQL.
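To illustrate the entity extraction approach mentioned above, the sketch below pulls structured fields out of a free-text OTC trade confirmation. It is a minimal, hypothetical example using simple regular expressions; the trade text, field names and extraction rules are assumptions for illustration, not any vendor’s actual implementation.

```python
import re
from dataclasses import dataclass

@dataclass
class TradeRecord:
    """Structured record distilled from unstructured confirmation text."""
    counterparty: str
    notional: float
    currency: str

# Hypothetical free-text confirmation of an OTC derivatives trade
CONFIRMATION = "Interest rate swap with ACME Bank, notional 25,000,000 USD, 5y tenor."

def extract(text: str) -> TradeRecord:
    """Use regex-based entity extraction to structure the unstructured text."""
    cpty = re.search(r"with ([A-Z][\w ]+?),", text).group(1)
    m = re.search(r"notional ([\d,]+) ([A-Z]{3})", text)
    return TradeRecord(cpty, float(m.group(1).replace(",", "")), m.group(2))

record = extract(CONFIRMATION)
print(record)  # TradeRecord(counterparty='ACME Bank', notional=25000000.0, currency='USD')
```

In practice, firms would lean on the richer tooling Halfon names (XML schemas, semantic models, Hadoop pipelines) rather than hand-written patterns, but the principle of promoting text into queryable records is the same.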
On velocity and value, Halfon said high-velocity but low-value data could be managed by engineered machines transferring data to analytical platforms, while real-time data warehouses would support a more accurate, up-to-the-moment view of the world and real-time analytics.
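The velocity-versus-value trade-off Halfon describes can be sketched as a condensing step: a high-velocity, individually low-value stream is reduced to compact aggregates before it reaches an analytical platform. The tick data and aggregate shape below are illustrative assumptions, not a real feed.

```python
from collections import defaultdict

# Hypothetical high-velocity tick stream: (symbol, price) tuples
ticks = [("EURUSD", 1.0801), ("EURUSD", 1.0803), ("GBPUSD", 1.2701),
         ("EURUSD", 1.0802), ("GBPUSD", 1.2699)]

def condense(stream):
    """Reduce a high-velocity, low-value stream to per-symbol aggregates
    (tick count and last price) suitable for an analytical store."""
    agg = defaultdict(lambda: {"count": 0, "last": None})
    for symbol, price in stream:
        agg[symbol]["count"] += 1
        agg[symbol]["last"] = price
    return dict(agg)

summary = condense(ticks)
print(summary["EURUSD"])  # {'count': 3, 'last': 1.0802}
```

The design point is that the raw ticks need never land in the warehouse at all; only the condensed, higher-value summaries do.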
“These are technologies that bring the compute layer closer to the data. The ultimate need is for transformation and analytics that use data living inside a dense compute fabric rather than outside,” he concluded.