The leading knowledge platform for the financial technology industry
A-Team Insight Blogs

SEC Names RiskMetrics Co-Founder Gregg Berman as Head of Office of Analytics and Research

The US Securities and Exchange Commission (SEC) has named risk expert Gregg Berman as head of the Office of Analytics and Research in its Division of Trading and Markets. The office was set up last year to conduct research and analysis that will help inform the Commission’s policies on markets and market structure.

Its focus is on high-frequency trading, initially in equity markets, and its remit includes providing expertise on risk management, quantitative data analysis, trading and portfolio management. Issues that will be investigated range from market structure to new products and rule filings by exchanges.

Berman honed his risk expertise at RiskMetrics Group, the New York-based risk management, corporate governance and financial research and analysis company he co-founded and worked at for 11 years before it was acquired by MSCI in March 2010.

Berman joined the SEC in October 2009 as a senior advisor in the Division of Risk, Strategy and Financial Innovation, moving in June 2010 to become senior advisor to the director of the Division of Trading and Markets. Since joining the SEC, Berman has worked on complex issues including analysis of the causes of the May 6, 2010, so-called Flash Crash, rulemaking to create a consolidated audit trail, and rules for derivatives trading required by the Dodd-Frank Act. He comments: “Though the markets may be complex, they are not impenetrable and I am confident in our abilities to continue developing data driven analyses to inform policy.”

Among the tools Berman and his team will use is MIDAS, the Market Information Data Analytics System the SEC acquired from Tradeworx last year, which allows staff to analyse trading using the same exchange trading data that is used by market participants.
