The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

US Data Transparency Coalition Focuses on Federal Data in Capital Markets

The US Data Transparency Coalition (DTC), introduced last week to advocate for the standardisation of federal data published online, warns that without significant changes in electronic data creation and management, capital markets remain open to a crisis on the scale of the 2008 collapse of Lehman Brothers.

If that is the worst-case scenario of inconsistent, unaggregated electronic federal data, the DTC also points to the benefits of potential data improvements in capital markets regulation, including the forthcoming legal entity identifier (LEI).

The coalition is a private sector group with 14 members – Teradata, MarkLogic, Level One Technologies, BrightScope, Elder Research, Maryland Association of CPAs, Microsoft, Synteractive, RR Donnelley, Symplicity, Rivet Software, IPHIX, Invoke and IRIS Business Services – and is actively recruiting more members to lobby government for change. It has a formidable advisory board including: Earl Devaney, former chairman, Recovery Accountability and Transparency Board; Beth Noveck, former deputy chief technology officer of the US; and Campbell Pryde, CEO of XBRL US. The executive director is former Securities and Exchange Commission (SEC) executive Hudson Hollister and he is supported by treasurer John Runyan.

To date, the coalition has made most progress in pushing for an open, single platform for federal spending data, but it is also throwing its weight behind broader federal data transparency, including easily searched and downloaded data in regulatory filings and legislative information.

Says Hollister: “The Data Transparency Coalition is advocating for common sense initiatives that encourage the productivity and transparency necessary for government reform. Too often, the federal government doesn’t publish crucial spending details, regulatory filings, corporate disclosures or legislative actions online. Even when such data is electronically published, the government often fails to adopt consistent machine-readable identifiers or uniform mark-up languages.

“Without data standardisation, citizens, members of the media, watchdog groups and even federal agencies themselves have no means of searching the information to identify spending patterns or waste, fraud and abuse.”

In capital markets, Hollister notes a near miss in data improvement following the exclusion of the proposed Financial Industry Transparency Act as an amendment to the Dodd-Frank Act. The bill would require the adoption of consistent mark-up languages – XBRL is the most likely contender at the moment – by financial regulators for the information reported to them. While the proposed Act failed to make it into Dodd-Frank, the coalition will urge members of the US House of Representatives to introduce it again in the next Congress.

“At the moment, there could be another financial crisis as there is no true view of systemic risk across markets,” says Hollister. Noting over 600 types of regulatory filings required by the SEC, he adds: “These are all in different formats and use different types of identifiers to identify entities. It’s not possible to search all this data together and get an enterprise view of the information.”

At a basic level, Hollister explains that check boxes on the cover of the US 10-K financial reporting form are not connected to any databases within the SEC. Instead, the SEC collects much of its regulatory information as plain text and does not use a standardised mark-up language. This, he points out, means there is little hope of US regulators working together to identify risk as the information cannot be aggregated.
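Hollister's contrast between plain text and a standardised mark-up language can be illustrated with a toy sketch. The fragment below is a simplified, hypothetical XBRL-style filing (the element name and taxonomy namespace are invented for illustration, not taken from the actual US GAAP taxonomy); once a fact is tagged this way, any agency can extract and aggregate it mechanically, which a prose filing does not allow:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XBRL-style fragment: the figure a plain-text
# filing would bury in prose appears here as a tagged, machine-readable fact.
FILING = """<xbrl xmlns:gaap="http://example.org/gaap/2012">
  <gaap:Revenues contextRef="FY2012" unitRef="USD" decimals="0">1500000</gaap:Revenues>
</xbrl>"""

root = ET.fromstring(FILING)
ns = {"gaap": "http://example.org/gaap/2012"}
fact = root.find("gaap:Revenues", ns)
print(fact.text)  # the tagged value is directly extractable for aggregation
```

The same extraction run across thousands of filings is what makes an "enterprise view" of regulatory data possible; with untagged text, each figure would need to be located and re-keyed by hand.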

The Financial Industry Transparency Act aimed to resolve problems like these, proposing that each financial regulator use a consistent mark-up language for regulatory data and that the SEC expand its XBRL reporting requirement to all data from regulated entities, not just those filing financial statements.

The DTC expects federal data transparency will take many years to achieve, but it does see some positives, such as moves towards data standardisation in many federal agencies and initiatives such as the Office of Financial Research, set up under Dodd-Frank as part of the US Treasury and proposer of the LEI. The LEI is a standard identification code for companies that will be used by financial regulatory agencies across markets and jurisdictions, allowing data filed under financial regulations to be aggregated and analysed.

“The sooner the better for the LEI,” says Hollister. “There will never be a perfect way to identify all regulated entities, but the LEI represents a way to tie together different regulatory agencies and apply big data analytics to data collected by many agencies. While achieving the coalition goal of standard business reporting is many years away, there are quick pay-offs such as the LEI. It will make regulatory compliance easier and while there will be some costs for firms, typically they won’t be high as most firms already have electronic data for regulatory reporting.”
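The LEI referenced here was subsequently standardised as ISO 17442: a 20-character alphanumeric code whose last two characters are check digits computed under ISO 7064 MOD 97-10 (letters map to 10–35, and the whole code, read as one large number, must be congruent to 1 mod 97). A minimal validation sketch, assuming that format (the example prefix below is made up, not a real issued LEI):

```python
def lei_checksum_valid(lei: str) -> bool:
    """Validate an LEI's ISO 7064 MOD 97-10 check digits (last two chars)."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map 0-9 -> 0-9 and A-Z -> 10-35, then read the result as one integer.
    digits = "".join(str(int(ch, 36)) for ch in lei.upper())
    return int(digits) % 97 == 1

def lei_append_check_digits(prefix18: str) -> str:
    """Compute and append the two check digits for an 18-character prefix."""
    digits = "".join(str(int(ch, 36)) for ch in prefix18.upper()) + "00"
    return prefix18 + f"{98 - int(digits) % 97:02d}"

# "529900ABCDEFGHIJKL" is an invented example prefix for illustration only.
example = lei_append_check_digits("529900ABCDEFGHIJKL")
print(example, lei_checksum_valid(example))
```

A self-verifying identifier like this is one reason the LEI offers the "quick pay-off" Hollister describes: any agency can catch mistyped entity codes at the point of filing, without consulting a central registry.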

Members of the coalition have both public and commercial interests at heart, but as a whole they are working through the coalition to encourage federal data standardisation and to be ready to develop software solutions that support it.

The LEI is expected to be a global standard, but for the time being the coalition will be focusing its efforts on Capitol Hill. As Hollister concludes: “The US government is the largest and most expensive organisation in the world. If we succeed in passing some mandates for federal data reform, we expect the next step, especially in capital markets, will be to encourage regulators around the world to coordinate with the US, but at the moment we don’t have enough bandwidth for a global campaign. Technically it is possible to absorb regulatory information seamlessly, the problem is governance.”
