About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Big Data Solutions Tackle the Challenges of Increasing Regulation

The introduction of 70 capital markets regulations in Europe over the next 18 months and the addition of about 300 regulations in the US are driving the need for big data solutions that can deliver on-demand risk and business analytics reports, more granular risk management and the ability to combine unstructured data behind products such as OTC derivatives with structured data.

Opening A-Team Group’s Data Management for Risk, Analytics and Valuations Conference in London this morning, A-Team Group president and editor-in-chief Andrew Delaney outlined the scenario of increasing regulation and the crucial role of data management in the financial markets enterprise. Joining Delaney, Amir Halfon, senior director of technology for capital markets in Oracle’s global financial services business, offered potential solutions to the technology issues of big data management and, in light of regulatory pressure, emphasised the need to make data quality and management ‘front and centre’.

While Delaney noted the downside difficulty of keeping on top of what forthcoming regulations mean to firms and the industry, he also described the upside opportunities of operational efficiencies presented by change. He said: “STP will be back, but operating at higher rates, and the need to comply with more regulations will have implications for both operational risk and systemic risk.”

Regulatory compliance will, undoubtedly, mean stringent risk reporting, on-demand risk and business analytics, and improved risk management. From an industry point of view this means pulling together multiple risk and analytics platforms, as well as disparate data sources, to secure an enterprise-wide view of risk, broaden distribution of risk information and analytics, and report in multiple timeframes to deliver on-demand information. “This is not just about trading and risk management, it is about risk reporting across other elements of the business such as compliance, credit and the board,” Delaney commented.

Tackling the challenges posed by these requirements is work in progress, but Delaney’s research suggests an industry that has an appetite for change and a desire to build a holistic view of risk. The expansion of data management that is part of this work may lean on big data technologies such as grid and cloud computing, but the endgame must be to reduce risk.

Halfon agreed with Delaney’s needs assessment, highlighting the requirement for real-time data management with examples such as the US Office of Financial Research’s entitlement to request data from companies at any time.

Acknowledging that many firms are moving to on-demand data platforms and touching on the emergence of the legal entity identifier standard, Halfon discussed concepts and technologies behind big data management. “This is about extreme scales and volumes,” he said. “It is the perfect storm. On one side are the regulators and on the other the need to access high throughput, fast data very quickly.”

Halfon described the need to incorporate the four ‘Vs’ of volume, variety, velocity and value into big data solutions, as well as the capability to push unstructured data into the structured world.

The volume requirement can be met using technologies such as engineered machines and data grids, while the requirement for variety, typically the integration of unstructured and structured data, can be addressed with XML, semantics, entity extraction and developments such as the Hadoop open source framework and, potentially, NewSQL.

On velocity and value, Halfon said high-velocity but low-value data could be managed by engineered machines that transfer it to analytical platforms, while real-time data warehouses would support a more accurate, up-to-the-moment view of the world and real-time analytics.
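One common way to handle a high-velocity, low-value stream before it reaches an analytical platform is to reduce it to compact per-instrument summaries. A sketch of that reduction step, with an invented tick format (symbol and price pairs) standing in for a real market data feed:

```python
from collections import defaultdict

def summarise_ticks(ticks):
    """Collapse (symbol, price) ticks into min/max/last/count per symbol,
    so only the low-volume summary travels to the analytical platform."""
    summary = defaultdict(lambda: {"min": float("inf"), "max": float("-inf"),
                                   "last": None, "count": 0})
    for symbol, price in ticks:
        s = summary[symbol]
        s["min"] = min(s["min"], price)
        s["max"] = max(s["max"], price)
        s["last"] = price
        s["count"] += 1
    return dict(summary)

# Illustrative tick stream: many low-value observations per symbol.
ticks = [("EURUSD", 1.0850), ("EURUSD", 1.0852), ("GBPUSD", 1.2701),
         ("EURUSD", 1.0849), ("GBPUSD", 1.2705)]
print(summarise_ticks(ticks))
```

The design choice is the trade-off Halfon describes: the raw stream is too fast and too low-value to warehouse tick by tick, so an intermediate layer condenses it and ships only the aggregates downstream.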

“These are technologies that bring the compute layer closer to the data. The ultimate need is for transformation and analytics that use data living inside a dense compute fabric rather than outside,” he concluded.
