The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Big Data Solutions Tackle the Challenges of Increasing Regulation

The introduction of 70 capital markets regulations in Europe over the next 18 months, and around 300 more in the US, is driving the need for big data solutions that can deliver on-demand risk and business analytics reports, more granular risk management, and the ability to combine the unstructured data behind products such as OTC derivatives with structured data.

Opening A-Team Group’s Data Management for Risk, Analytics and Valuations Conference in London this morning, A-Team Group president and editor-in-chief Andrew Delaney outlined the scenario of increasing regulation and the crucial role of data management in the financial markets enterprise. Joining Delaney, Amir Halfon, senior director of technology for capital markets in Oracle’s global financial services business, offered potential solutions to the technology issues of big data management and, in light of regulatory pressure, emphasised the need to make data quality and management ‘front and central’.

While Delaney noted the downside difficulty of keeping on top of what forthcoming regulations mean to firms and the industry, he also described the upside opportunities of operational efficiencies presented by change. He said: “STP will be back, but operating at higher rates, and the need to comply with more regulations will have implications for both operational risk and systemic risk.”

Regulatory compliance will, undoubtedly, mean stringent risk reporting, on-demand risk and business analytics, and improved risk management. From an industry point of view this means pulling together multiple risk and analytics platforms, as well as disparate data sources, to secure an enterprise-wide view of risk, broaden distribution of risk information and analytics, and report in multiple timeframes to deliver on-demand information. “This is not just about trading and risk management, it is about risk reporting across other elements of the business such as compliance, credit and the board,” Delaney commented.

Tackling the challenges posed by these requirements is work in progress, but Delaney’s research suggests an industry that has an appetite for change and a desire to build a holistic view of risk. The expansion of data management that is part of this work may lean on big data technologies such as grid and cloud computing, but the endgame must be to reduce risk.

Halfon agreed with Delaney’s needs assessment, highlighting the requirement for real-time data management, citing examples such as the US Office of Financial Research’s entitlement to request data from companies at any time.

Acknowledging that many firms are moving to on-demand data platforms and touching on the emergence of the legal entity identifier standard, Halfon discussed concepts and technologies behind big data management. “This is about extreme scales and volumes,” he said. “It is the perfect storm. On one side are the regulators and on the other the need to access high throughput, fast data very quickly.”

Halfon described the need to incorporate the four ‘Vs’ of volume, variety, velocity and value into big data solutions, as well as the capability to push unstructured data into the structured world.

Volume can be delivered using technologies such as engineered machines and data grids, while the requirement for variety, typically integrating unstructured and structured data, can be met by XML, semantics, entity extraction and developments such as the Hadoop open source framework and, potentially, NewSQL.
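The entity extraction Halfon refers to is essentially the process of pulling structured fields out of free text so they can sit alongside conventional reference data. A minimal, hypothetical sketch in Python illustrates the idea; the confirmation text, field names and regular expressions here are invented for illustration and do not reflect any specific vendor's approach:

```python
import re

# A hypothetical, simplified OTC trade confirmation in free-text form.
confirmation = (
    "Interest rate swap between Bank A and Bank B, "
    "notional USD 50,000,000, fixed rate 2.75%, maturity 2030-06-15."
)

# Simple regex-based entity extraction: map unstructured text
# onto a structured record that a relational store could hold.
patterns = {
    "notional_currency": r"notional\s+([A-Z]{3})",
    "notional_amount":   r"notional\s+[A-Z]{3}\s+([\d,]+)",
    "fixed_rate_pct":    r"fixed rate\s+([\d.]+)%",
    "maturity":          r"maturity\s+(\d{4}-\d{2}-\d{2})",
}

record = {}
for field, pattern in patterns.items():
    match = re.search(pattern, confirmation)
    record[field] = match.group(1) if match else None

# Normalise numeric fields for downstream analytics.
record["notional_amount"] = int(record["notional_amount"].replace(",", ""))
record["fixed_rate_pct"] = float(record["fixed_rate_pct"])

print(record)
```

In production this step would typically lean on the frameworks Halfon names, such as Hadoop-based pipelines or semantic tooling, rather than hand-written patterns, but the end result is the same: unstructured input reduced to queryable rows.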

On velocity and value, Halfon said high-velocity but low-value data could be managed by engineered machines transferring data to analytical platforms, while real-time data warehouses would support a more accurate view of the world and real-time analytics.
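One way to read this velocity/value trade-off is as a routing decision: high-velocity, low-value events are streamed to an analytical sink, while higher-value events are persisted in the warehouse for reporting. The following sketch is purely illustrative; the threshold, event shapes and sink names are assumptions, not part of any platform described above:

```python
# Hypothetical routing of incoming events by business value.
# The threshold and event categories are illustrative assumptions.
analytics_sink = []   # high-velocity, low-value data: aggregated in flight
warehouse = []        # high-value data: persisted for risk reporting

def route(event, value_threshold=1_000_000):
    """Send high-value events to the warehouse, the rest to analytics."""
    if event["value"] >= value_threshold:
        warehouse.append(event)
    else:
        analytics_sink.append(event)

events = [
    {"type": "quote", "value": 0},          # market data tick
    {"type": "trade", "value": 5_000_000},  # large trade: persist it
    {"type": "quote", "value": 0},
    {"type": "trade", "value": 250_000},    # small trade: analytics only
]

for event in events:
    route(event)

print(len(warehouse), len(analytics_sink))  # 1 3
```

The design point this illustrates is Halfon's closing one: the cheaper it is to classify and aggregate data close to where it arrives, the less of it has to cross the network to the compute layer.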

“These are technologies that bring the compute layer closer to the data. The ultimate need is for transformation and analytics that use data living inside a dense compute fabric rather than outside it,” he concluded.
