
Institutional culture has always been the defining constraint on effective compliance across capital markets and treasury. Boards may approve ever-larger budgets for surveillance technology and artificial intelligence, and regulators may intensify scrutiny of governance, risk, and compliance frameworks, yet the hardest variables to control remain behavioural norms and internal incentives. For many organisations, the question is no longer whether misconduct can be detected, but whether firms are willing to confront the cultural patterns that allow smaller breaches to persist.
Against that backdrop, RegTech Insight spoke with VoxSmart founder and CEO Oliver Blower to discuss why he believes communications surveillance must be reframed – away from fear-driven detection and towards a model grounded in product discipline, explainable AI, and cultural reform. “Unlike a lot of our peers in the space whose messaging is one of fear… our view was, firms consist of fundamentally good actors who occasionally make bad mistakes.”
That distinction informs a broader critique of how compliance is applied. Blower, who began his career as a lawyer and later served as a fixed income managing director at Bank of America and Barclays, frames the culture challenge through a due process lens: as a bank employee, you are presumed guilty until you prove yourself innocent. He describes surveillance as “an internal enforcement lever, capable of being used selectively or politically, rather than as an objective mechanism for establishing context.”
The alternative proposition is that surveillance should not simply detect misconduct but preserve fairness. Technology, in this model, becomes a means of evidencing behaviour in context – demonstrating where a conversation was benign, where intent was misread, or where a minor lapse can be addressed before it escalates. The challenge, therefore, is not merely one of better filters or more advanced AI, but of repositioning compliance from a presumption-of-guilt framework to a product-led discipline that supports governance, accountability, and equitable treatment within the firm.
Product Thinking for Compliance
Over the past decade, the role of compliance and surveillance has shifted materially. Historically, compliance teams often sat low in the organisational hierarchy, perceived as cost centres, and sometimes drawn into internal politics. Surveillance standards were limited; tooling was basic and lexicon-based monitoring predominated.
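As a loose illustration of why that early tooling was limited (a generic sketch, not any specific vendor's implementation, with hypothetical watch-list terms), lexicon-based monitoring amounts to keyword matching: it flags a benign sentence as readily as a genuinely concerning one, because it has no notion of context.

```python
# Minimal sketch of lexicon-based surveillance (illustrative only):
# flag any message containing a watch-list term, regardless of context.
LEXICON = {"guarantee", "off the record", "delete this"}  # hypothetical terms

def flag_message(text: str) -> bool:
    """Return True if any lexicon term appears anywhere in the message."""
    lowered = text.lower()
    return any(term in lowered for term in LEXICON)

messages = [
    "I can guarantee the settlement desk closes at 5pm",  # benign context
    "Keep this off the record and delete this chat",      # genuine concern
    "Trade booked, confirm by EOD",                       # clean
]
flags = [flag_message(m) for m in messages]
# Both of the first two are flagged, even though only one is a real concern.
```

The benign and the concerning message are indistinguishable to the rule, which is precisely the false-positive burden that later NLP- and AI-driven approaches set out to reduce.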
Today, that structure looks different. Blower sees surveillance heads increasingly operating as product owners. Teams are staffed by software engineers and data specialists, building and stress-testing monitoring systems internally. “I actually view the role of surveillance and compliance now as a product role,” he says. “It’s gone from being an IT role to a political role to a product role.”
This shift has direct implications for artificial intelligence adoption. Product-oriented teams, he argues, are better placed to understand what models are doing, to test them rigorously and to deploy them in a controlled manner. Where AI has faltered, it has often been because governance and model understanding lagged behind enthusiasm.
AI Adoption: Trust but Verify
VoxSmart’s own trajectory mirrors broader market developments. The firm began by addressing off-channel communications – capturing mobile voice, WhatsApp, and WeChat messages at a time when many banks were struggling to reconcile policy with real-world behaviour. From there, it moved into communications surveillance, evolving from lexicon-based rules to more sophisticated natural language processing and AI-driven pattern recognition.
Yet AI adoption remains cautious across the industry. Blower compares its trajectory to cloud computing adoption a decade ago, suggesting the industry is still at an early stage of maturity. “We are still very much in the ‘I do not trust this’ mode,” he says, characterising the current environment as “probably 20% trust, 80% verify.”
The challenge is not capability but explainability. Surveillance platforms operate as large-scale filters, ingesting communications and trade data to isolate a small percentage of true positives. When AI models begin closing large volumes of alerts, firms can become uneasy. “When you start seeing no true positives, you start worrying,” he observes.
Several institutions, he suggests, have experienced false starts – reporting clean outcomes to regulators, only for internal audit to later question model decisions. In such cases, confidence erodes and deployments are scaled back. The lesson, he argues, is that explainability and governance must keep pace with model sophistication.
Culture and Micro Infractions
Beyond technology, Blower questions whether banks are genuinely motivated to change behaviour. Large rogue trading events remain statistically rare. What persists, he argues, are the “micro infractions” – late trade bookings, off-channel conversations, minor policy breaches – that accumulate operational and regulatory risk.
In one example, a bank fined $90 million for failing to book trades within 15 minutes of execution sought reporting tools to identify repeat offenders. The technology could surface the issue consistently. Behaviour, however, remained unchanged. As Blower puts it, “the change is not the technology. The change is the culture.”
As long as profitability remains strong, firms may accept a degree of operational friction provided they can demonstrate oversight. Surveillance can evidence awareness. It cannot compel accountability. The harder question is whether institutions are prepared to challenge high-performing revenue generators when cultural norms conflict with policy.
Contextual Risk and Democratising Surveillance
Blower’s longer-term vision extends beyond reactive monitoring towards contextual risk identification. He describes integrating traditional KYC and related signals (for example, challenges in people’s non-work lives) to better understand behavioural pressure points – not to weaponise them, but to intervene before “good people making bad mistakes” escalate into reportable breaches. The emphasis is preventative rather than punitive.
He also challenges the asymmetry inherent in surveillance. Employees are monitored continuously yet have no visibility into their own data. “Why am I not allowed access to my own data?” he asks. VoxSmart offers read-only access, but the idea remains institutionally sensitive, and firms block it.
Granting limited transparency, he argues, could rebalance the relationship between firm and employee – enabling individuals to provide context and reinforce standards rather than simply respond to investigation. “You democratize compliance… it’s not just a function of the firm to assert on the people, it’s a function of the people to reinforce within the firm.”
In Blower’s formulation, the future of surveillance lies not in expanding detection alone, but in aligning technology with cultural reform and shared accountability.
A Call to Action
Investment in AI, expanded monitoring coverage and enhanced reporting may satisfy regulatory expectations, but unless firms confront the incentives and behaviours that drive “micro infractions,” surveillance risks becoming a sophisticated diagnostic tool applied to an untreated condition. Recognising culture as a governance issue – not an abstract aspiration – is the first step towards meaningful change.
By treating compliance as a product discipline, prioritising explainability, and reframing surveillance as a mechanism for context and fairness, VoxSmart’s philosophy seeks to align governance with human behaviour rather than police it in isolation. If firms are prepared to engage openly with the cultural challenge, the next phase of surveillance may be less about catching wrongdoing and more about reinforcing responsible conduct – a model in which compliance is shared, transparent and embedded within the fabric of the organisation.