The knowledge platform for the financial technology industry

A-Team Insight Blogs

London’s Big Bang in 1986 … The Beginning of the End


Note: This is a slightly updated version of an article I wrote five years ago for A-Team Insight, on the 20th anniversary of London’s Big Bang. I thought it was worth another airing … and while the latency of the time was more ‘slow’ than ‘low’, some lessons in capacity planning can still be learned today … so here goes …

“Big Bang Blows A Fuse!” read the headline of the Evening Standard on Monday, October 27th back in 1986. It referred to the failure, just minutes after opening, of the London Stock Exchange’s new electronic market and represented a sudden and comprehensive deflation of a bubble of excitement that had been building in the London market for more than three years. And it was a failure that I foresaw, and witnessed close up, since at the time I was working for the exchange’s IT group.

Big Bang was the name given to the deregulation of London’s securities market. At the insistence of the UK government, the exchange was to transform itself from a sleepy gentleman’s club into a securities market that could more readily compete with the emerging threat from overseas exchanges, such as the New York Stock Exchange and Nasdaq.

Since its formal creation in 1801, the London Stock Exchange had operated a floor market where “my word is my bond” was the core operating principle. Another was the concept of fixed minimum commissions for transactions. Quaint and cosy though this market structure was (and convenient for doing business during long lunch hours in the pubs and restaurants that surrounded the exchange’s Threadneedle Street location in the City), it wasn’t equipped to meet the looming competitive threat. Concerned that London’s leading position in the global financial markets was in jeopardy, the government pushed through sweeping reforms.

In the new market, securities firms could act as both brokers and market makers, commissions would be negotiated (and hence would be lower), and capital infusion from outside companies was allowed. As a result, banking firms acquired brokers and jobbers, and foreign firms began to play alongside the traditional names whose culture descended from the coffee houses of the 17th century.

Thus, the run-up to Big Bang saw frenzied M&A activity among the City’s own firms. Barclays Bank acquired top broker De Zoete & Bevan and leading jobber Wedd Durlacher to form Barclays De Zoete Wedd, or BZW as it was known (it has since been renamed Barclays Capital). HSBC snapped up broker James Capel. Security Pacific purchased broker Hoare Govett. Credit Suisse acquired broker Buckmaster and Moore. Citicorp took over brokers Scrimgeour Kemp-Gee and Vickers Da Costa to form Citicorp Scrimgeour Vickers (which, incidentally, contributed its Dogfox market data system to the Citicorp-owned Quotron to form the basis of the Quotron Securities Trader product). Those were just a few of many such deals.

Naturally, these new firms wanted to attract the best and the brightest talent to compete in what was going to be a much more competitive market. Salaries and packages for brokers, traders and even IT executives rocketed skyward, and recruitment of junior brokers, analysts and trading support staff accelerated. Times were heady, Thatcherism was the ethos and champagne became the beverage of choice for many in the Square Mile.

Along with the change in market structure and participants, came a massive investment in infrastructure and IT. Construction of new office complexes was to be seen everywhere. And firms invested millions of pounds in new trading floors, some experimenting for the first time with receiving market data in digital form, as opposed to the more established video services and related video switching technology.

It was, in fact, my experience of writing software to handle digital feeds that landed me a job working at the exchange itself. Peter Bennett, an IT “visionary” (I held that opinion then, and still do today) at the exchange, had begun to leverage technologies such as X.25 networks and satellite broadcast systems. Now, he was looking for people for Project Orbit, an advanced market data distribution system, which used emerging technologies such as the IBM PC and local area networks (it was later released as a product via a joint venture with IBM, dubbed Radix). And while that technology was new to me, I understood the data feed offerings from Reuters, Telerate and the exchange itself. The job interview was light-hearted and perfunctory, the salary was double what I was then making, and the cool factor was extreme. So, in April 1985, I joined Bennett’s Advanced Systems Group – which numbered only around 20 within an IT operation that was approaching 2,000 strong.

At the time I joined, much of the IT focus of the exchange was directed towards Seaq – Stock Exchange Automated Quotation – an electronic system based on Nasdaq’s competing market maker model, which was to augment (and, as it happens, quickly obsolete) its trading floor.

Under the leadership of project director Mick Newman, Seaq was designed by the team of Ian McLelland and Peter Buck to capture quote updates (via IBM PC-based workstations) from securities firms and aggregate them into competing quote pages (formatted according to strict market trading rules). Actual execution was still conducted via the telephone, with subsequent trade reporting also flowing through Seaq.

Based on a loose cluster of Digital Equipment Corp. VAX 8600 computers – the top of the range at the time – Seaq mostly relied on an existing information distribution architecture as its link to the outside world of securities firms and quote vendors. And while the existing infrastructure was upgraded to cope with Seaq, the decision to press it into action would prove to be the Achilles’ heel of the new market system.

Already in place were Epic – Exchange Price Information Computer – a VAX-based price database and the related CRS (Computer Readable Services) system, which supported information distribution via record-based data feeds. All of the leading quote vendors tapped into these feeds, though their take-up by securities firms was limited, mostly due to the complexity in handling them.

Also, the ‘official’ market prices for Seaq were deemed to be those displayed via the exchange’s Topic system, which supported a network of page-based terminals, as well as a page-based data feed. Thus, delivering Topic to the vast new trading floors was an imperative.

Like Epic, Topic was one of Bennett’s inventions. And like Epic, it was one he pushed through against objections from the exchange’s council (board of directors), which wasn’t particularly competent in IT matters. Years later, the council saw the benefits, as Topic and data feed sales brought in millions of pounds in revenue.

Developed on a shoe-string budget, Topic – Teletext Output of Price Information by Computer – was cutting edge at the time. It could display continuously updated prices and text, in colour on TV-size screens (from Bishopsgate Terminals and Barco), and with graphics, due to its use of videotext protocols. It had vastly superior functionality and was priced way below the only real competition, from Reuters. And so it was an instant hit. In the build-up to Big Bang, orders flowed in hourly for new installations.

At its heart, Topic used specialised real-time computers originally developed by Modcomp Computers of Fort Lauderdale, FL for use by NASA for tracking space probes. The central cluster of Modcomps (a database system connected to several information retrieval slaves) was connected to customer sites via 9.6kbps circuits, where specialised controllers built by Gandalf of Canada handled data compression and multiplexing, page caching and automatic refreshing. These Zilog Z80-based devices were programmed by the infamous Chan, one of Bennett’s team, who later quit to work for James Capel (now HSBC).

While Topic’s architecture was well ahead of its time, it wasn’t designed to support a fast-paced electronic market and traders’ need for second-by-second price updates. Prices on Topic screens were originally updated every 50 seconds (later reduced to 30 seconds for Seaq). That was adequate for the trading floor era, but the demands of trading electronically required something faster.

Amazingly, this need seemingly wasn’t foreseen or recognised by the exchange. While Topic was upgraded to handle an increased terminal base and a vastly increased update rate (from Seaq and also from a feed of North American prices supplied by Monchik-Webber), it was unable to provide dynamically updated real-time prices. A few of us did recognise the shortcoming, and since I was working on feed handling for Orbit, the issue was obvious to me.

I discussed it with McLelland and Buck, who agreed, but who couldn’t raise the issue as they were confined to Seaq development. I raised it with Richard Yau, then head of Topic development, who shared my concerns (but who wasn’t too bothered as he was about to quit the exchange for a highly paid job at Salomon Brothers in New York City).

Eventually, supported by Bennett, I wrote a memo on the subject and distributed it widely, including to Newman and his boss, George Hayter, who was the exchange’s IT director. The memo went nowhere and I was basically told to mind my own business, though I think the actual language included the “off” word.

So to Big Bang Day. That Monday morning, the new marketplace started up as planned. Market maker quotes were input, flowed through the perfectly operating Seaq and into Topic. Eager to see the most recent prices and not wanting to wait for the 30-second refresh, traders continuously hit ‘enter’ to refresh pages. The workload on the Modcomps soared, and due to a software bug not picked up in testing, they failed. Topic had been brought to its knees. The systems were restarted only to fail again later in the day. Emergency measures were implemented, fixing the software bug and reducing service features and levels to keep the system running until a more permanent fix could be implemented. But the damage was done.
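The capacity-planning lesson is the gap between the workload a system is designed for and the workload users actually generate when they can drive it themselves. A minimal back-of-envelope sketch makes the point – the 30-second refresh interval comes from the story above, but the terminal count and the traders’ manual refresh rate are purely illustrative assumptions, not historical figures:

```python
# Back-of-envelope model of a page-refresh storm.
# Only the 30-second auto-refresh interval is from the article;
# terminal count and manual refresh rate are assumed for illustration.

def page_requests_per_second(terminals: int, refresh_interval_s: float) -> float:
    """Aggregate page-request rate if every terminal refreshes on a fixed cycle."""
    return terminals / refresh_interval_s

TERMINALS = 10_000  # hypothetical Topic terminal base

# Planned load: terminals passively waiting for the 30-second auto-refresh.
planned = page_requests_per_second(TERMINALS, 30.0)

# Storm load: traders hammering 'enter' every couple of seconds.
storm = page_requests_per_second(TERMINALS, 2.0)

print(f"planned: {planned:.0f} req/s, storm: {storm:.0f} req/s, "
      f"multiplier: {storm / planned:.0f}x")
```

Under these assumptions the manual refresh storm is an order of magnitude above the designed load – exactly the kind of user-driven amplification that sizing against the nominal update cycle fails to anticipate.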

The failure on the morning of Big Bang was more than just a software glitch that got corrected. To the City – and especially to the new players that now operated within it – the episode was a vote of no confidence in an institution to which they looked for IT leadership. In years to come, the exchange sold off much of its lucrative information business because it couldn’t cope with the increased competition, and outsourced IT because the people at the top thought technology was not a core competency of a marketplace – 20/20 hindsight would suggest that was a big strategic error.

Fast forward to today and the LSE is now operating in a whole different world, one where it is really just another liquidity venue in a pretty fragmented and competitive space, where technology – and low latency – is key. Sure, the institution has heritage, but I doubt whether the systems that run algo trading and HFT care much about it.

To some extent it has pulled IT back in house, through the 2009 acquisition of MillenniumIT – which has provided a world class matching engine that has been sold to other markets. And it’s expanded through acquisition – Borsa Italiana, Oslo Børs and the Turquoise MTF – but the recently failed merger with Canada’s TMX Group has highlighted how difficult a path M&A is to global leadership.

While the future of global marketplaces is difficult to predict … especially with new regulations looming, other failed mergers and one big one still to happen – to be honest, it’s probably nationalistic pride that will keep the LSE from one day running out of a computer suite in Basildon, or in Stockholm. Or Slough or Brick Lane, for that matter.
