A-Team Insight Blogs


In an era of regulation, traders may not love compliance, but they have learned to live with it. That said, the looming US Consolidated Audit Trail (CAT) has stirred up infighting between the Securities and Exchange Commission (SEC) and the individual broker-dealer firms and exchanges affected.

There have been complaints of an ‘incredibly aggressive’ timeframe, accusations that the CAT’s funding model ‘troublingly’ favours the commercial interests of exchanges at the expense of broker-dealers, and last-minute calls for the CAT to be delayed because of ‘serious concerns’ that the customer data it requires firms to hand over will become fodder for hackers.

From a compliance perspective, the challenges are equally significant, with firms concerned not only about the timing of the CAT taking effect, but also about its asset coverage, timestamp requirements, error correction regime, data quality implications and, put simply, how much work must be done to get it right.

Compared to the Order Audit Trail System (OATS) regulation it will replace, the CAT is tougher and wider-ranging. It covers equities and listed options, where OATS covers only equities. It demands customer information, and includes all broker-dealers with no exclusions for smaller firms. Electronic events must be captured within 10 milliseconds; for manual events and allocations, timestamps must be captured to the second; and all business clocks must be synchronised to within 50 milliseconds of the NIST national clock.
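As a rough illustration of the clock-synchronisation requirement, a firm might periodically compare each business clock against an NIST-traceable reference and flag any drift beyond the 50-millisecond tolerance. This is a hypothetical sketch; the function and constant names are illustrative and not part of the CAT specification:

```python
from datetime import datetime, timedelta

# Tolerance taken from the CAT requirement described above:
# business clocks must stay within 50 ms of the NIST national clock.
CLOCK_SYNC_TOLERANCE = timedelta(milliseconds=50)

def clock_within_tolerance(business_clock: datetime, nist_reference: datetime) -> bool:
    """Return True if a business clock is within 50 ms of the NIST reference."""
    return abs(business_clock - nist_reference) <= CLOCK_SYNC_TOLERANCE

# Example: a clock 30 ms ahead of the reference passes; 80 ms ahead fails.
ref = datetime(2018, 11, 15, 9, 30, 0)
assert clock_within_tolerance(ref + timedelta(milliseconds=30), ref)
assert not clock_within_tolerance(ref + timedelta(milliseconds=80), ref)
```

In practice such a check would run continuously against an NTP or PTP time source, with drift results logged for audit purposes.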

The technology response to these requirements includes the need for firms already reporting into OATS to make changes to their reporting infrastructure to support the CAT, although what exactly must be done remains uncertain as firms are still waiting for final specifications. The requirement for customer and account data to be reported is likely to mean new reporting systems aligned with transaction reporting systems. The first draft of specifications for this reporting is due to be published in May.

Timing issues will require monitoring, timestamping and clock synchronisation tools, and firms may need to improve internal surveillance systems to fully understand trading activities that the regulator also has powers to surveil under the CAT. Considering the volume of data that must be managed, big data platforms and analytics, as well as cloud solutions, are likely to be among the options for compliance.

Concerns about the CAT

Concerns about the CAT result from its sheer scale and significance. Prompted by the 2010 Flash Crash that wiped $1 trillion off the stock market in minutes, the CAT (aka SEC Rule 613) is intended to provide overarching market surveillance. The CAT National Market System (NMS) plan, approved by the SEC in November 2016, involves creating a single, massive database – the CAT Processor, which is being built and will be run by Thesys Technologies – that captures every detail of all equities and options trades in the US, even cancelled bids and offers.

It’s estimated that US stock exchanges and 1,800 or so broker-dealers affected will have to feed in around 58 billion records daily, making the CAT the world’s biggest financial database, and one of the largest data stores of any kind. Controversially, it will ingest data on over 100 million institutional and retail accounts, including unique personally identifiable information (PII).

Tremendous risk

This unprecedented demand by regulators for access to client data, along with other issues, provoked the protests. These crystallised last November, when the Securities Industry and Financial Markets Association (SIFMA) – the ‘voice’ of Wall Street dealers, banks and asset managers – testified at a House Financial Services Subcommittee on Capital Markets hearing that: “Collecting this information in the CAT creates tremendous risk in the event of a breach.”

Represented by Lisa Dolly, CEO of BNY Mellon Pershing, SIFMA pointed out that at least 3,000 individuals from the SEC and the self-regulatory organisations (SROs) involved in running the CAT – namely, the national securities exchanges and FINRA, the Financial Industry Regulatory Authority – will have downloadable access to all CAT data. “The current CAT plan raises serious concerns around data protection and the ability to confidently secure the critical information it will contain,” SIFMA said.

But it’s not just data security. Other SIFMA worries include ‘an overly aggressive implementation timeline for CAT’ and the SROs’ funding model, which imposes 75% of all costs on broker-dealers. SIFMA states: “[This is] particularly troublesome given the SROs include the for-profit exchanges which have built the funding model to benefit their own commercial interests at the expense of the broker-dealers they regulate and with which they compete.” As a result: “SIFMA is requesting a delay in the CAT compliance deadline.”

And it’s not just SIFMA. Sharing its security concerns, last November the key US stock and options exchanges affected by the CAT – including the New York Stock Exchange, Nasdaq and CBOE Global Markets – called for a one-year postponement in its rollout, according to the Financial Times. The exchanges reportedly want a chief information security officer (CISO) to be named for the system before data collection begins.

This call has received a response, with Thesys appointing a CISO, Vas Rajan, earlier this month. Rajan previously held a similar position at CLS Bank and will have his work cut out reassuring those concerned about the CAT’s data security that all will be well when it goes live.

Huge questions

While the arguments rage, the clock is ticking. Large broker-dealers and SROs are due to start sending data to the CAT Processor in November this year, with smaller firms following a year later in November 2019.

Which leaves the trading community with huge questions. Is a delay to the CAT likely? Is sensitive client data really going to be exposed to online fraudsters? How much compliance work does the CAT require? What are the data management challenges and potential solutions? And are there opportunities to benefit commercially from the CAT to mitigate the costs and apparent threats it presents?

First, there seems little hope that the CAT will be delayed. Just one day after the exchanges’ appeal for more time, SEC chairman Jay Clayton publicly turned them down. But he did extend an olive branch on data protection, saying: “It’s of paramount importance and I am open to various paths for addressing cybersecurity matters. Commission staff are currently conducting an evaluation of our needs for personally identifiable information and other sensitive data.”

When contacted for this article, the SEC said there is no update on this position. So traders must wait and see what, if any, new security safeguards emerge. In the meantime, what should their priorities be?

Compliance challenges

One particular problem with the CAT is that it is incredibly aggressive in the time allowed to correct data errors, says CAT expert David Emero, vice president of the Americas Regulatory Operations Group at Goldman Sachs. This stems from the fact that the CAT is intended to replace the OATS regulation, while also incorporating elements of regulations such as Electronic Blue Sheets (EBS) and Large Trader Reporting (LTR).

Participating in an A-Team Group webinar on the CAT, Emero also said OATS gives firms five days to fix rejected or mismatched data whereas: “Under the CAT, essentially you get a day-and-a-half to research why mismatches or other errors or rejects occurred, identify the fixes and resubmit the data. Those fixes could involve you having to deal with a counterparty or clarify something with a customer, and getting that resubmitted in time in my opinion is incredibly aggressive. The industry has been lobbying since the beginning that the error correction regime is frankly unrealistic. But despite our best efforts it has remained.”

To deal with this, Emero advises firms to focus on ensuring data quality upfront, principally by having robust pre-submission validation capabilities. Then, when exceptions do occur, firms need fast, sophisticated processes to assess, fix and resubmit data, which is a big hurdle compared with the current regulatory reporting regime.
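The pre-submission validation Emero describes can be sketched as a simple gate: check each record before it is sent, and route failures to an exception queue for rapid repair inside the short correction window. The field names and rules below are purely illustrative assumptions, not taken from the CAT specification:

```python
# Minimal sketch of pre-submission validation. Field names and the set of
# event types are hypothetical, not drawn from the CAT technical spec.
REQUIRED_FIELDS = {"event_type", "timestamp", "symbol", "firm_order_id"}
KNOWN_EVENT_TYPES = {"NEW", "CANCEL", "ROUTE", "EXECUTE"}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors (empty means the record passes)."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if record.get("event_type") not in KNOWN_EVENT_TYPES:
        errors.append(f"unknown event_type: {record.get('event_type')}")
    return errors

def split_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Partition a batch into submittable records and exceptions to repair."""
    clean, exceptions = [], []
    for record in records:
        (clean if not validate_record(record) else exceptions).append(record)
    return clean, exceptions
```

The point of the design is that errors are caught before submission, so the day-and-a-half correction clock never starts for records that would otherwise be rejected.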

Data quality

There are a series of other data quality and management challenges presented by the CAT. Tony Brownlee, a partner at risk, compliance and financial data solutions provider Kingland Systems, warns that firms will need to put a lot of effort into updating their legacy customer and account data to comply with CAT demands.

He told us: “For many firms, the data available on accounts in systems wasn’t set up and maintained with regulatory reporting such as the CAT in mind. We think many firms will have significant remediation efforts to unwind years and years of legacy practices for setting up these accounts. Better data may be available in other systems or even in documents collected during onboarding or KYC (Know Your Customer), but connecting that data with the transaction data required by the CAT is a big challenge. Firms should be putting plans in place around customer and account data as the specifications begin rolling out this year.”

To help with this, Brownlee advises engaging with peer companies and industry groups, such as the Financial Information Forum (FIF) or SIFMA, to decide on best practices for CAT reporting. As he put it in the A-Team Group webinar on the CAT: “The herd is valuable. Collaborate with your peers, with experts in the industry. There are a number of changes and hidden pitfalls in this process.”

Emero says: “The one part of the CAT that is most revolutionary is the reporting of customer information.” He suggests this may mean firms are tripped up by inconsistencies in their account data, because the CAT demands information that is typically held in different front-office order and execution systems, and middle-office order allocation systems.

He explains: “Ensuring integrity and consistency among that data is a really important and key data quality challenge of the CAT. Essentially, your entire account database will need to be transmitted to the CAT on an ongoing basis. The regulators will now have an unfettered view of key PII information for every account that your firm has, and the chances of that exposing inconsistencies across your account information is very likely. The CAT Processor will have much more extensive access than regulators have today and CAT may well expose data quality issues that you have.”
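The consistency check Emero describes could take a shape like the following: gather each account's fields from every system that holds it, and flag any field whose values disagree. This is a hypothetical sketch with illustrative names, not a description of any firm's actual tooling:

```python
from collections import defaultdict

def find_inconsistencies(systems: dict[str, dict[str, dict]]) -> dict:
    """Map account_id -> {field: differing values} across source systems.

    `systems` maps a system name (e.g. a front-office order system or a
    middle-office allocation system) to its accounts, each a dict of fields.
    """
    merged = defaultdict(lambda: defaultdict(set))
    for system_name, accounts in systems.items():
        for account_id, fields in accounts.items():
            for field, value in fields.items():
                merged[account_id][field].add(value)
    # Keep only accounts with at least one field holding conflicting values.
    return {
        acct: {f: sorted(vals) for f, vals in fields.items() if len(vals) > 1}
        for acct, fields in merged.items()
        if any(len(vals) > 1 for vals in fields.values())
    }
```

Run against front-office and middle-office extracts, a report like this surfaces the inconsistencies before the CAT Processor does.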

In line with this, Brownlee highlights the likely problem of inconsistencies between different broker-dealers’ data on the same customer. He says: “The CAT exists so regulators can have more enhanced surveillance across exchanges and broker-dealers, and there’s going to be more scrutiny of broker-dealer A and broker-dealer B maybe having conflicting information. We may find in the CAT scenario that there are a number of cross-broker-dealer inconsistencies that need to be resolved.”

He adds: “There will literally be hundreds of scenarios related to customer account challenges – issues between your data and another broker-dealer. There are a whole host of customer account problems that firms have to be prepared for.”

Brownlee suggests a range of technologies can help firms pull together data from multiple systems, allowing users to implement data links and views, alongside data diagnostic systems that identify errors and data issues ‘so you have a clean bill of health when you report’.

Entity identification

Another core data management problem relates to the fact that the CAT, for flexibility, allows companies to identify customers using their own identifiers, such as an account number. These identifiers are then mapped back by the CAT Processor to the real-world identities of the entities concerned. This differs from regulations like Markets in Financial Instruments Directive II (MiFID II), which uses Legal Entity Identifiers (LEIs) to identify customers.

The problem comes in the inconsistency between the regimes, and the fact that while firms complying with the CAT don’t have to have an LEI, they have to include the LEI if one is available. “It gets very complicated very fast,” Emero says.
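The identification approach described above can be sketched in a few lines: the firm reports its own identifier for each customer, and attaches an LEI only when one exists. The class and field names here are hypothetical assumptions for illustration, not taken from the CAT reporting specification:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CustomerIdentifier:
    """Illustrative customer identification record (names are hypothetical)."""
    firm_designated_id: str       # the firm's own identifier, e.g. an account number
    lei: Optional[str] = None     # attached only if the customer has an LEI

    def to_report_fields(self) -> dict:
        fields = {"firmDesignatedID": self.firm_designated_id}
        if self.lei:              # per the article: include the LEI when available
            fields["LEI"] = self.lei
        return fields

# A customer with an LEI reports both identifiers; one without reports
# only the firm-designated ID, which the CAT Processor maps to identity.
with_lei = CustomerIdentifier("ACCT-001", lei="EXAMPLELEI0000000001")
without_lei = CustomerIdentifier("ACCT-002")
```

The complication Emero flags lives in the conditional: a firm must know, per customer and per business unit, whether an LEI exists at all, which is exactly where cross-system inconsistencies bite.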

His advice is to get started on the CAT as soon as possible: “The implementation schedule is incredibly aggressive. Given that client identification and reporting is one of the main new things here, start to understand your firm’s client identification methodologies and processes used across different business units, different systems and across the order lifecycle. I think the exception handling regime here is really revolutionary and it will drive organisations to have a much more robust workflow in order to research, identify and fix exceptions – so make sure you don’t put off your exception handling to the end.”

Business advantage?

The promise of more robust workflow raises the question of whether firms could benefit in other ways from the work involved in CAT compliance. Nate Call, director of regtech solutions at financial technology systems provider FIS, believes so.

He says that in a January webinar poll, FIS found 13% of large broker-dealers had already appointed a CAT compliance officer and were well on their way to installing a reporting solution. The reason? “The early adopters of CAT technology are not waiting for the SEC to issue a final decree on the compliance dates because they see value in the technology beyond reporting.”

He explains how business advantage could accrue: “To manage the vast amounts of information necessary, the need is to create a massive, incredibly secure data lake, with a powerful, front-end processing engine to conveniently integrate, aggregate, analyse, prepare, submit, correct and archive the information. This means embracing Big Data as a way of life. It means a single version of truth for a trading entity unlike anything ever seen before. The engine behind the CAT can be retooled to perform incredible data mining forensics, run what-if scenarios, facilitate predictive analytics and drive business value.”
